Mitsubishi Electric Automotive America’s Approach to Building Trust in Autonomy

There is one thing that needs to be taken into account when it comes to automated driving, one critical thing: trust. The trust of the person who takes her hands from the wheel (assuming there is one), and the trust of the person sitting in the passenger's seat (assuming that all the seats aren't passengers' seats).

And to address that, as Jacek Spiewla, advanced development engineer at Mitsubishi Electric Automotive America (MEAA; meaa-mea.com), demonstrates, the company has developed what it calls FLEXConnect.AI, an in-vehicle infotainment system featuring a multiscreen interface. Three Mitsubishi-built displays are integrated into an instrument panel: two 12.3-inch 1920 x 720 displays, with a 10.2-inch 768 x 1280 capacitive touch display between them. The first 12.3-inch display carries the driver's information. The central 10.2-inch display is the infotainment unit, where swipes and touches change screens and select options. Then there is the third screen, located directly in front of the front-seat passenger.

The system runs on a Snapdragon 820Am processor from Qualcomm. Spiewla notes that this quad-core processor is capable of handling all three screens, whereas "with traditional systems, you would need three separate processors to run the cluster, the center display and the passenger display." He adds of the approach they're taking: "We're able to reduce both complexity and time to market."

For the operating system, they’re using Android. “One of the nice things about Android,” he says, “is that you can customize each screen independently.”
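
For a sense of how that single-processor, per-screen independence might look in practice, here is a minimal Kotlin sketch using Android's standard DisplayManager and Presentation APIs, which let one application drive each physical display with its own content. The class names and placeholder content are illustrative assumptions, not MEAA's actual code.

```kotlin
import android.app.Presentation
import android.content.Context
import android.hardware.display.DisplayManager
import android.os.Bundle
import android.view.Display
import android.widget.TextView

// One Presentation per secondary display; each can render entirely different content.
class OccupantScreen(
    context: Context,
    display: Display,
    private val content: String
) : Presentation(context, display) {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // A real system would inflate a full layout here; a TextView keeps the sketch small.
        setContentView(TextView(context).apply { text = content })
    }
}

fun attachScreens(context: Context) {
    val dm = context.getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
    // The default display hosts the driver's cluster; every additional display
    // (center touch screen, passenger screen) gets its own independently rendered UI.
    dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION)
        .forEachIndexed { i, display ->
            OccupantScreen(context, display, "Screen ${i + 1} content").show()
        }
}
```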

So what does any of this have to do with trust?

One of the things they've done, as Spiewla puts it, is to bring the vehicle occupants "into the loop of what the vehicle is doing."

When the car goes into automated driving mode, the screens can show what the vehicle's sensors are "seeing" in real time. "We have a real-time visualization of the automated driving task based on cameras and radar. It shows lane markings, pedestrians, parked vehicles, cross-street traffic, and other cars." Then a "threat assessment" is made of the various items categorized. This allows the occupants not only to better understand what the vehicle is "seeing," but to gain confidence that what they can see by looking through the windshield is also being monitored by the car.
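
As a rough illustration of what categorizing detections by threat might look like, here is a minimal Kotlin sketch. The data model, threshold values, and scoring rule are hypothetical assumptions for clarity, not MEAA's implementation.

```kotlin
// Illustrative only: a toy data model for detected objects and a simple
// threat ranking based on distance and closing speed.
enum class ThreatLevel { NONE, LOW, HIGH }

data class Detection(
    val label: String,           // e.g., "pedestrian", "parked vehicle"
    val distanceMeters: Double,
    val closingSpeedMps: Double  // positive when the object is approaching
)

// Toy assessment: objects that are close and closing fast rank higher.
fun assess(d: Detection): ThreatLevel = when {
    d.distanceMeters < 20 && d.closingSpeedMps > 5 -> ThreatLevel.HIGH
    d.distanceMeters < 50 && d.closingSpeedMps > 0 -> ThreatLevel.LOW
    else -> ThreatLevel.NONE
}

fun main() {
    val seen = listOf(
        Detection("pedestrian", 15.0, 1.2),
        Detection("parked vehicle", 40.0, 0.0)
    )
    // Each labeled object and its threat level could then be drawn
    // on the cluster and passenger displays.
    seen.forEach { println("${it.label}: ${assess(it)}") }
}
```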

Another feature of the Snapdragon 820Am is its Qualcomm Snapdragon X12 LTE modem, which provides up to 600 Mbps downlink and 150 Mbps uplink speeds. Spiewla shows how they're using it in a couple of ways, one of which can add a measure of confidence whether or not autonomous driving is involved.

MEAA is partnering with AccuWeather, which facilitates providing information on the conditions a vehicle will encounter along a route. Spiewla programs a route from Atlanta to Memphis that the car "thinks" it is traveling. (We're actually in a static Audi in a garage in Plymouth, Michigan.) At the time, there are actual thunderstorms 60 miles ahead on the route, and they are displayed on the screens of the FLEXConnect.AI system. This gives the driver and the other occupants a sense of what's ahead.
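
The shape of such a feature might look like the following Kotlin sketch: sample waypoints along the route, look up a forecast for each, and surface anything severe. The fetchForecast() function is a hypothetical stand-in for a call to a weather service such as AccuWeather's API; none of this reflects MEAA's actual integration.

```kotlin
// Illustrative route-weather sketch; all names and logic are hypothetical.
data class Waypoint(val lat: Double, val lon: Double, val milesFromStart: Double)
data class Forecast(val summary: String, val severe: Boolean)

// Stand-in for a real HTTP call to a weather service keyed by location.
fun fetchForecast(w: Waypoint): Forecast =
    if (w.milesFromStart in 50.0..70.0) Forecast("Thunderstorms", true)
    else Forecast("Clear", false)

fun weatherAlertsAlongRoute(route: List<Waypoint>): List<String> =
    route.mapNotNull { w ->
        val f = fetchForecast(w)
        if (f.severe) "${f.summary} ${w.milesFromStart.toInt()} miles ahead" else null
    }

fun main() {
    val atlantaToMemphis = listOf(
        Waypoint(33.75, -84.39, 0.0),
        Waypoint(34.35, -85.20, 60.0),   // storms reported here in the demo
        Waypoint(35.15, -90.05, 380.0)
    )
    weatherAlertsAlongRoute(atlantaToMemphis).forEach(::println)
}
```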

Another company MEAA is working with is Movimento (movimentogroup.com), which provides over-the-air updates. Think of it as updating your smartphone, but in this case you're updating various systems in your vehicle. Connectivity can occur via the embedded modem in the car or over a Wi-Fi connection (e.g., when the car is parked in your garage, it can connect to the home Wi-Fi network, or a smartphone can be set up to serve as a hotspot). The vehicle makes a secure connection to the cloud, a check is made of the vehicle's VIN to confirm that it is the vehicle in question, and the necessary downloads are made to the required components in the vehicle (or the updates can be scheduled for convenience). These updates could be for a variety of things, including infotainment, navigation and the powertrain. Or for how the information of what the car is "seeing" is presented to the vehicle occupants, as in adding labels to the objects (e.g., "pedestrian").
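
Here is a minimal Kotlin sketch of that sequence, connect, verify the VIN, then install or schedule per-component updates. The interfaces and names are assumptions made for illustration, not Movimento's actual API.

```kotlin
// Illustrative OTA flow, assuming a TLS-secured channel (over the embedded
// LTE modem or Wi-Fi) has already been established before this point.
data class UpdatePackage(val component: String, val version: String, val sizeBytes: Int)

interface UpdateServer {
    fun verifyVehicle(vin: String): Boolean             // matches the VIN to a vehicle record
    fun pendingUpdates(vin: String): List<UpdatePackage>
}

class OtaClient(private val server: UpdateServer, private val vin: String) {
    fun run(installNow: Boolean) {
        // 1. Confirm this is the vehicle in question before downloading anything.
        require(server.verifyVehicle(vin)) { "VIN check failed; aborting update" }

        // 2. Fetch updates for any component: infotainment, navigation, powertrain...
        val updates = server.pendingUpdates(vin)

        // 3. Apply immediately, or defer to a convenient scheduled window.
        for (u in updates) {
            if (installNow) install(u) else schedule(u)
        }
    }

    private fun install(u: UpdatePackage) =
        println("Flashing ${u.component} to ${u.version} (${u.sizeBytes} bytes)")

    private fun schedule(u: UpdatePackage) =
        println("Scheduled ${u.component} ${u.version} for next parked session")
}
```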

Getting data onto those screens is something MEAA is working on, too, says Mark Rakowski, executive director of sales and engineering for the organization. "We have a variety of ADAS [advanced driver assistance systems] products, such as for automated parking, lane-keeping assistance and lane departure warning. Another division is working on mobile mapping." The company produces radar and ultrasonic sensors, and it is developing ECUs for ADAS applications (working with processor suppliers like Qualcomm, Renesas, NVIDIA and Intel). While it doesn't make LiDAR sensors, Rakowski says MEAA is looking to partner with companies that do.

Rakowski says that system development is occurring at an unprecedented speed. "People are designing systems around chips that aren't even available yet. From an automotive standpoint, we never did that. The computer industry did." He explains that a chip supplier will tell MEAA what its next chip will do, with samples to follow in six months. "We used to wait for the samples," Rakowski says. "Now we're designing around something that's just on paper."

While he thinks it will take some time for there to be a significant number of fully autonomous vehicles ("A lot of people are talking about 2021, but how much of the fleet will that be?"), he believes that ADAS, with the safety, comfort and convenience it can provide, will continue to make big strides.

“Costs will keep coming down as you have more sensor fusion and deep learning takes over,” he says, pointing out, “This is not going to be technology just for luxury cars, but it will be in trim lines for every vehicle.”
