If you’ve ever dreamed of talking with your car, Apple may have good news for you. Within the latest CarPlay developer guide is support for “voice-based conversational apps,” a sign that Apple could be about to open the doors to AI chatbot apps like ChatGPT, Gemini and Claude right on your dashboard.
The guidelines indicate that AI companies like Google or OpenAI will need to create an interface that shows the conversational AI is listening in CarPlay, and then “appropriately respond to questions or requests and perform actions.”
Support is expected to arrive in March with the release of iOS 26.4, which is currently in beta. Companies that want to participate will have to jump through all the usual Apple hoops to qualify for CarPlay.
An Apple representative didn’t immediately respond to a request for comment.
How will talking to AI while driving work?
Conversational AI will have limitations in CarPlay.
Apple has long limited which apps work with CarPlay, partly to keep drivers focused on the road. Siri commands were enabled under certain circumstances, but that was the extent of voice interaction.
With iOS 26.4 and the new conversational AI support, drivers could potentially have more in-depth conversations, but with a few significant limitations. First, Apple won’t be enabling wake words, meaning drivers will have to use their dashboard controls to open the AI app before they start talking.
CarPlay apps must also be designed for “voice interaction in the driving environment,” and can’t show text or images in response to your questions, unlike your usual use of AI chatbots.
Also, Apple makes it clear that these apps won't be able to control your vehicle, your iPhone or related devices. So you're limited to basic chatbot conversation, which could let you brainstorm ideas for dinner, vent about your work day or ponder the great questions of the universe. Just don't ask for home security advice, and never use them for therapy, medical diagnoses, financial advice or tax planning.
And always double-check that an AI chatbot hasn't hallucinated when presenting information as fact.