OnStar and Nuance: Better Understanding

One of the things that voice-activated command systems in vehicles today do to people is make them more like, well, automated machines. Consider that there tends to be a set series of commands that must be enunciated in a particular order, or you’re going to get something along the lines of “Say command again” repeatedly, or “I don’t understand.” Which is one part frustrating and two parts annoying.
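
To make the frustration concrete, here is a minimal sketch, in Python, of how a rigid, order-dependent command matcher behaves. The command table and action names are entirely made up for illustration; this is not OnStar’s or any supplier’s actual grammar.

```python
# Hypothetical illustration: a rigid matcher only accepts commands spoken in
# one exact form, so any natural rewording falls through to a reprompt.
RIGID_COMMANDS = {
    "navigate to home": "start_navigation_home",
    "call contact john": "dial_john",
    "tune radio 101.1": "set_station_101_1",
}

def rigid_match(utterance: str) -> str:
    """Return the mapped action, or the dreaded reprompt."""
    return RIGID_COMMANDS.get(utterance.lower().strip(), "Say command again")

print(rigid_match("Navigate to home"))   # start_navigation_home
print(rigid_match("Take me home"))       # Say command again
print(rigid_match("Home, navigate to"))  # Say command again
```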

OnStar, which is well known for its human advisors (i.e., you push the blue button and you can actually talk to a real person who is more than likely able to understand words in a non-specific order unless they happen to be completely random, which is probably a sign that you shouldn’t be driving), also has a virtual advisor. Which means that a digital voice talks to you, and virtual ears connected to a processor in the cloud somewhere await your response.

While this could lead to the same lack of understanding described above, OnStar has announced that it is deploying new software from Nuance Communications that allows far more straightforward understanding, as demonstrated here:

[Video: OnStar & Nuance]
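
What the Nuance software brings is natural-language understanding: many phrasings map onto the same intent rather than one exact command string. The keyword-scoring sketch below is only a toy stand-in for that idea, with made-up intent names; the real system relies on trained language models in the cloud, not hand-written rules.

```python
# Hypothetical illustration: loose, order-independent keyword scoring stands in
# for a trained natural-language-understanding model running in the cloud.
INTENT_KEYWORDS = {
    "get_traffic": {"traffic", "congestion", "jam"},
    "start_navigation": {"navigate", "directions", "take", "home"},
}

def loose_match(utterance: str) -> str:
    """Pick whichever intent shares the most words with the utterance."""
    words = set(utterance.lower().replace("?", "").replace(",", "").split())
    best_intent, best_overlap = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        overlap = len(words & keywords)
        if overlap > best_overlap:
            best_intent, best_overlap = intent, overlap
    return best_intent

print(loose_match("Any traffic today?"))    # get_traffic
print(loose_match("Take me home, please"))  # start_navigation
```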

Engineers at OnStar and Nuance have validated the system so that it is capable of understanding not only regional U.S. dialects but also French, Spanish and Mandarin, thanks to language models for each.

We must say that one of the questions asked in the video, “Any traffic today?”, is one of the great understatements of all time for those who are using their OnStar system while commuting.