11/15/2019 | 5 MINUTE READ

How Mitsubishi Electric Automotive Wants to Improve Your Autonomous Driving Experience


Tech developments aren’t just about sensors and processors, but about what people will actually do in an autonomous vehicle.



Yes, that’s a 48-inch long screen running through the middle of the Lincoln Navigator. It is part of Mitsubishi Electric Automotive America’s research into how people will interact within autonomous vehicles. (Images: Mitsubishi Electric Automotive America)

According to a recently published study by the Center for Automotive Research, Technology Roadmap: Intelligent Mobility Technologies by Zahra Bahrani Fard and Valerie Sathe Brugeman, by 2030:

  • Shared automated electric vehicle fleets will account for nearly 25% of all auto passenger miles traveled.
  • Shared automated electric vehicle fleets will account for nearly 20% of all vehicles sold.
  • The majority of shared cars on the road will have utilization rates above 50%.

All of which is to say that there is a strong potential that, before long, plenty of people will be transported in automated vehicles (that utilization rate above 50% is perhaps the most important number to note, as it underlines how much time these vehicles will spend on the road).

One thing that probably doesn’t get as much attention as developments in sensors and processors, and which deserves a whole lot more, is just what the riders in these vehicles are going to be doing with their time.

But someone who is doing a considerable amount of thinking about this is Jacek Spiewla, senior manager, User Experience, Mitsubishi Electric Automotive America.

“This is really a space where you can do experimentation,” Spiewla says.

The Big Screen

And one of the experimental testbeds that Spiewla and his colleagues are working on is a Lincoln Navigator that has a 48-inch, custom-made color touchscreen running down the middle of the SUV where a center console would be—and well beyond. Forty-eight inches. Four feet. A screen that offers a variety of functions: “Each person in the vehicle can get their own zone or they can interact.” Maybe someone wants to read a newsfeed. Maybe everyone wants to play an interactive game—a game that might include passengers in other autonomous vehicles that might be in the vicinity.
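The per-person “zone” idea can be sketched simply: divide the screen’s width into one slice per occupant. This is a hypothetical illustration of the concept, not Mitsubishi’s actual display software; the function name and equal-width layout are assumptions.

```python
# Hypothetical sketch: dividing a shared in-car display into per-passenger
# zones. Names and the equal-width layout are illustrative assumptions.

def split_zones(screen_width_in: float, passengers: list) -> dict:
    """Give each passenger an equal horizontal slice of the screen.

    Returns a mapping of passenger -> (start_inch, end_inch).
    """
    if not passengers:
        return {}
    slice_width = screen_width_in / len(passengers)
    return {
        name: (i * slice_width, (i + 1) * slice_width)
        for i, name in enumerate(passengers)
    }

# Example: the 48-inch screen shared by three riders.
zones = split_zones(48.0, ["driver", "front passenger", "rear passenger"])
```

In practice zones would presumably resize dynamically when riders choose to interact together (say, for a shared game) rather than staying fixed.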

“In general,” Spiewla says, “the idea of this concept is: how will we interact socially inside of an autonomous vehicle?”

While there are those who talk about how much more “productive” people will be when they don’t have to pilot their vehicles, this overlooks the fact that not all traveling from one point to another is predicated on working.

While the 48-inch screen is easily the most visible thing going on in the vehicle’s interior, Spiewla points to a couple more: a 12.3-inch unit in the center cluster and a 4K display in the instrument panel (IP), which, he says, run on a Qualcomm Snapdragon 820 system-on-chip.

Who’s There?

There are various functions that they’re trying out, not all of which are based on an automated driving scenario; some are almost imminent. For example, there is an integrated camera that is able to determine who the driver and front-seat passenger are (when I’m detected I’m “Guest,” while Spiewla, who has been registered in the vehicle’s database, is identified as himself). This allows, he explains, the individual to gain various levels of access to the vehicle’s functionality. (And going forward in time, to when there is, perhaps, Level 3 or 4 autonomous capability, the camera could observe the designated driver to help determine whether that person is ready and able to take control if necessary.)
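The “Guest versus registered user” behavior can be sketched as matching a face embedding against a database of registered occupants, with unknown faces falling back to a limited-access guest profile. The embeddings, similarity threshold, and access levels below are toy assumptions, not the production system.

```python
# Hypothetical sketch of camera-based occupant identification: compare a
# face embedding against registered users; unknown faces fall back to
# "Guest" with limited access. All values here are illustrative.
import math

REGISTERED = {
    "Jacek": [0.9, 0.1, 0.3],  # stored face embedding (toy 3-D vector)
}
ACCESS = {"Jacek": "full", "Guest": "limited"}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify(embedding, threshold=0.95):
    """Return (name, access level); below-threshold matches become Guest."""
    best_name, best_score = "Guest", 0.0
    for name, ref in REGISTERED.items():
        score = cosine(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    if best_score < threshold:
        best_name = "Guest"
    return best_name, ACCESS[best_name]
```

A real system would of course use learned embeddings from a face-recognition model rather than hand-set vectors; the shape of the lookup is the point here.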

One of the areas they’re working on is improved voice interaction, whether this is having texts read aloud in the voice of the person who sent them (this requires an AI system that learns a person’s intonation) or having virtual assistants that can respond to multiple commands.

Another thing that the camera-based system could do, he says, is to determine where the driver spends time looking. This could serve as a coach for the driver (e.g., “You have spent x% of time looking downward”), to encourage looking straight ahead through the windshield, as well as provide companies like Mitsubishi with information regarding where various elements of information should be located within the driver’s view.
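The coaching idea amounts to tallying where the driver was looking over some window of time and flagging excessive downward glances. A minimal sketch, assuming per-time-slice gaze labels and a 20% warning threshold (both illustrative, not Mitsubishi’s actual parameters):

```python
# Hypothetical sketch of the gaze-coaching idea: tally gaze samples and
# flag excessive downward glances. The label set and 20% threshold are
# illustrative assumptions.

def gaze_report(samples, warn_threshold=0.20):
    """samples: one gaze label per time slice, e.g. 'ahead', 'down', 'mirror'."""
    if not samples:
        return "No gaze data."
    down_pct = samples.count("down") / len(samples)
    if down_pct > warn_threshold:
        return f"You spent {down_pct:.0%} of the time looking downward."
    return f"Good scanning: only {down_pct:.0%} of the time looking downward."
```

The same tallies, aggregated across many drivers, are what would tell a supplier where in the driver’s field of view a given piece of information belongs.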

Digital assistants are becoming more prevalent in people’s day-to-day activities, including in their vehicles. Spiewla and his colleagues are working with the company’s FLEXconnect system to develop a digital assistant array for the vehicle. He explains that rather than just having one on-board assistant (e.g., Alexa, Siri, Cortana), they would offer the selection of more than one system that the user could select, depending on the specific query.
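Selecting among several assistants depending on the query can be sketched as a simple keyword router. The assistant names and keyword table below are illustrative assumptions; this is not how FLEXconnect actually dispatches requests.

```python
# Hypothetical sketch: route a spoken query to one of several on-board
# assistants by topic keyword. Names and routes are illustrative.

ROUTES = {
    "navigation": "assistant_a",
    "traffic": "assistant_a",
    "music": "assistant_b",
    "weather": "assistant_c",
}

def route(query, default="assistant_a"):
    """Pick the assistant whose topic keyword appears in the query."""
    q = query.lower()
    for keyword, assistant in ROUTES.items():
        if keyword in q:
            return assistant
    return default
```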

He says they’re working with Houndify on the development of intelligent speech recognition.

Talk, Talk

One of the things they’re working on is the ability to handle multiple queries in a single sentence. He points out that the typical virtual assistant generally handles single requests, rather than something like “What is the weather in Detroit and what was the score of the Red Wings game?”
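A crude version of multi-query handling is splitting a compound utterance at conjunctions when both sides look like complete requests. Real multi-intent parsing (as in Houndify-style systems) is far more sophisticated; this sketch only shows why naive splitting needs a guard against phrases like “salt and pepper.”

```python
# Hypothetical sketch of splitting a compound utterance into separate
# requests. Illustrative only; production multi-intent parsers use full
# natural-language understanding, not a conjunction split.
import re

QUESTION_WORDS = ("what", "who", "when", "where", "how", "which")

def split_requests(utterance):
    """Split on the word 'and' only when every resulting part looks like
    a full question, to avoid breaking noun phrases apart."""
    parts = [p.strip(" ?") for p in re.split(r"\band\b", utterance)]
    requests = [p for p in parts if p.lower().startswith(QUESTION_WORDS)]
    return requests if len(requests) > 1 else [utterance.strip(" ?")]
```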

“Everyone claims their virtual assistant is conversational,” Spiewla says, “but it is really command-and-answer.”

They are also working with Acapela Group of Mons, Belgium, on providing the means for personalized text-to-speech. That is, based on recordings of an individual’s voice (either a collection of recordings or a set series of sentences), an AI-based model in the cloud makes it possible that if Sally sends you a text, the voice heard through the speakers is Sally’s, or that of whoever else is in the system.
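From the vehicle’s side, the lookup amounts to mapping a message sender to a trained cloud voice model, with a fallback to a default voice for unknown senders. The model identifiers and function names here are hypothetical; Acapela’s actual API is not shown.

```python
# Hypothetical sketch: pick a sender-specific cloud voice model so a text
# is read aloud in the sender's voice. Model IDs are illustrative.

VOICE_MODELS = {
    "Sally": "voice-model-sally-v2",  # trained from Sally's recordings
}

def voice_for(sender, default="voice-model-neutral"):
    """Return the voice model ID for a sender, or a default voice."""
    return VOICE_MODELS.get(sender, default)

def read_text(sender, message):
    """Return (voice model ID, message) to hand to the TTS engine."""
    return voice_for(sender), message
```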

And closer still to today, Spiewla says they’re working with Chamberlain Group, the company well known for its garage door openers. Here, rather than using a remote or a button near the rearview mirror, the garage door can be opened or closed through the touchscreen. Spiewla says this is an IoT-based system, and they’re looking at geofencing solutions, such as automatically opening the garage when the vehicle comes within a set proximity.
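The geofencing trigger reduces to a distance check: open the door when the vehicle’s position comes within some radius of home. The 200-meter radius below is an illustrative assumption, and the Chamberlain integration itself is not shown.

```python
# Hypothetical sketch of the geofencing idea: compute the great-circle
# distance from vehicle to home and trigger inside a set radius. The
# radius is an illustrative assumption.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_open_garage(vehicle, home, radius_m=200.0):
    """vehicle and home are (lat, lon) tuples."""
    return haversine_m(*vehicle, *home) <= radius_m
```

A production system would also need debouncing (so the door doesn’t reopen as the vehicle idles near the boundary) and confirmation that the vehicle is approaching rather than departing.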

Screen sizes, resolutions, numbers, content. They’re looking at all of these things, not only how they will fit into the vehicle of tomorrow (and today, for that matter) but how users will interact with them, which is an important part of what the autonomous experience could be.
