Ever-Evolving Automotive Electronics

Revolution or evolution? Whether in terms of electrical architecture or electronic capability, it all depends on your perspective.

Why would NVIDIA (nvidia.com) — a company founded in 1993 to create graphics chips for the budding computer gaming industry — get involved in autonomous vehicles? The simple answer is that the companies specializing in automotive silicon chips don’t make anything powerful enough because the market isn’t big enough to support the necessary investment. “Current suppliers make it powerful enough for Level 2 functionality, but not much more than that at this point,” says Aaron Jefferson, vice president, Product Planning, Global Electronics — ZF Active and Passive Safety Systems Division (zf.com). There is, however, another reason, one that is surprising in its simple logic.

“When you look at the problem of automated driving,” says Andy Whydell, vice president, Product Planning, Strategy and Systems, ZF Group, “you are essentially looking around, interpreting the complex 3D environment, converting that into mathematical models, and — from that — figuring out how you want to interact with it. A gaming system does exactly the same thing, but the other way around. That’s why NVIDIA thought it could, in essence, turn its chip around and, instead of creating the wireframe and rendering the exterior of the starship and its environs, it would take the 3D information and turn it into the mathematical models used to determine how you want to control your vehicle. Companies like NVIDIA and Intel have that technology developed, and have been doing it for more than 20 years.”


Currently, most automotive sensors are self-contained, with the processing and functionality residing within the sensor itself. Add adaptive cruise control to a radar-equipped vehicle, for example, and the software for that function is typically inside the radar unit. Similarly, the lane-keeping function is resident in the camera, and the vehicle's blind spot system looks to the rear and is not linked to anything else. The OEMs, for their part, offer these systems independently or in different combinations and packages, but as the industry moves toward higher levels of driver assistance, the benefit of bringing all of that sensor data together in a central ECU increases. The challenge lies not only in getting the units to talk to one another, but also in determining what the information means.

“The different sensors have different strengths and weaknesses,” says Whydell. “Radars are good at measuring distances and relative speed, but not at measuring angles. Cameras are very good at measuring angles, seeing exactly where targets are side-to-side, and recognizing objects. When you combine the two, you get the strengths of both.” An advanced driver assistance system (ADAS) would therefore use the rear-facing blind spot radar to see an object moving up from behind, calculate when it will enter the field of view of the front radar and camera units, relay this information to the ECU so those sensors can be alerted, and track any changes in the data. The system then acquires the target as it appears in the forward-facing radar, uses the camera to determine whether or not it has moved into the same lane, and reacts if necessary. This takes significant processing power.
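The complementary-strengths idea can be sketched in a few lines. This is a deliberately simplified illustration, not ZF's algorithm: it assumes a radar report (accurate range and range rate) and a camera report (accurate bearing) for the same target, and combines them into one position estimate.

```python
import math

def fuse_detection(radar_range_m, radar_range_rate_mps, camera_azimuth_rad):
    """Hypothetical, simplified sensor fusion: take range and relative
    speed from the radar (its strengths) and bearing from the camera
    (its strength), and return a single target estimate in vehicle
    coordinates."""
    x = radar_range_m * math.cos(camera_azimuth_rad)  # longitudinal offset
    y = radar_range_m * math.sin(camera_azimuth_rad)  # lateral offset
    return {"x_m": x, "y_m": y, "closing_mps": -radar_range_rate_mps}

# A target 50 m ahead, 5 degrees to one side, closing at 5 m/s:
track = fuse_detection(50.0, -5.0, math.radians(5.0))
```

A production tracker would add time alignment, association of detections to tracks, and filtering (e.g., a Kalman filter), which is exactly the workload that gets pushed to the central ECU.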

Lee Bauer, vice president, Mobility Architecture Group at Aptiv (aptiv.com), says the data needs are “forcing a change in vehicle architecture, not the least of which is how the car manages the data.” Currently, a vehicle may have anywhere from 30 to 100+ devices that provide and use both data and power. Depending on the system, that data can move at anywhere from 1 MB/sec. to 50 MB/sec. “Going forward,” says Bauer, “the speed will need to increase substantially to meet not only the diagnostic parameters, but also the increasing sensor and other functional requirements.” Bauer expects required data speeds to exceed 1 GB/sec., a massive increase needed to provide bandwidth for both today's sensor loads and ever-evolving features.

“Critical electrical architecture decisions will need to be made around the bandwidth and data speed, as well as the network of controllers and their interconnects. This means intelligent compute nodes coupled with appropriate high-speed data transmission and power distribution to enable a safe, future-sensitive, high-fidelity end-user experience that functions from the sensor to the cloud.” Thus, an autonomous Level 4/Level 5 ADAS unit, Bauer says, will take a revolutionary, not evolutionary, architecture that provides significant advantages like fewer control units, full power supply redundancy, and scalability for lower automation levels.

It’s not, however, a jump straight from “here” to “there,” as ZF’s Jefferson explains. “You might go from CAN communication to a more Ethernet-based system for sending data around the vehicle. An eight-megapixel camera, for example, sends eight gigabytes of data repeatedly over Ethernet cable, and you’re going to have maybe 20 of those sensors around the vehicle sending data to a set of central computers in a distributed architecture.” Unlike Aptiv’s Bauer, he doesn’t see the architecture change as revolutionary. “I think you can expect to see an evolution from hardwired CAN to hardwired Ethernet that offers very high data rates for a very low cost, and will be brought in from the PC and IT industries.”
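The scale of the CAN-to-Ethernet step can be made concrete with nominal link rates. These are textbook figures (classic high-speed CAN at 500 kbit/s, gigabit automotive Ethernet at 1 Gbit/s), not numbers from the article, and protocol overhead is ignored:

```python
# Nominal shared-bus vs. point-to-point link capacity (bits per second):
CAN_HIGH_SPEED = 500_000            # classic high-speed CAN bus
AUTOMOTIVE_ETHERNET = 1_000_000_000  # e.g. 1000BASE-T1 gigabit link

speedup = AUTOMOTIVE_ETHERNET / CAN_HIGH_SPEED
# One gigabit Ethernet link carries roughly 2,000x a classic CAN bus,
# which is why high-bandwidth sensors bypass CAN entirely.
```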

Cybersecurity is the next hurdle, and one that must be handled at the system level as safety-critical and other systems are networked together. The first line of defense is vehicle-level protection, which focuses on physical access through the OBD-II port and on wireless access through the infotainment or telematics system. Since physical hacking is restricted to one vehicle at a time, firewalls against wireless tampering take on greater importance. The second level is at the ECU, and is designed to protect not only the ECUs themselves from intrusion but also the communication network between them. The third level supplements this on the individual chip, including the communication between the chip and the ECU.
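The second, ECU-level layer typically means authenticating inter-ECU traffic. The sketch below is a generic illustration of that idea, not any automotive standard: sender and receiver share a secret key, and each frame carries a message counter (against replay) plus a truncated HMAC tag (against tampering). The key, counter width, and tag length are all illustrative assumptions.

```python
import hmac
import hashlib

SHARED_KEY = b"demo-key-provisioned-at-build"  # hypothetical shared secret

def protect(counter: int, payload: bytes) -> bytes:
    """Append a freshness counter and a truncated MAC to a payload."""
    msg = counter.to_bytes(4, "big") + payload
    tag = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()[:8]
    return msg + tag

def verify(frame: bytes, last_counter: int):
    """Return the payload if the frame is fresh and authentic, else None."""
    msg, tag = frame[:-8], frame[-8:]
    counter = int.from_bytes(msg[:4], "big")
    expected = hmac.new(SHARED_KEY, msg, hashlib.sha256).digest()[:8]
    if counter <= last_counter or not hmac.compare_digest(tag, expected):
        return None  # replayed or tampered frame is dropped
    return msg[4:]

frame = protect(1, b"brake-request")
```

A replayed frame (counter not advancing) or a frame with a flipped bit fails verification and is dropped, which is the behavior the ECU-level layer is meant to guarantee.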

According to Whydell, “Some of the security methods have been around for a long time, and date back to the days when every vehicle had interchangeable DIN-slot audio head units. Before OEMs started to customize their head units to make them more difficult to steal and use, it was all about protecting the microprocessor by making PIN codes unreadable so they couldn’t be overwritten.” This was followed by using specific communication protocols so that — even if you bought a CD unit from the same company that made the head unit — it couldn’t communicate with it. “The OEMs wanted to make sure you could only buy a CD changer from them in order to protect their revenue stream,” he says.

“At the circuit board level,” says Jefferson, “there are requirements for the number of access points you can have to a microprocessor, which limits the opportunity for hacking into the system.” It wasn’t always this way. “Early on, GM was leading the cybersecurity effort, and no one else had a firm requirement. Now everybody seems to be converging on a consensus specification based around experience from the automotive, banking, investment and other industries. As a result, there’s much more confidence today in being able to thwart a threat than there was even five years ago.”

As Whydell sees it: “We’re really seeing the fruits of a continuing evolution, the roots of which are probably 20 years old. Any new technology typically gets launched by Mercedes on the S-Class, and within 10 years has migrated to every luxury car and the top of the mainstream. Ten years after that, it’s standard. If you look where Audi, BMW and Mercedes are today with automated driving, add another 15 years and it probably will have trickled down to everything. The technology is there. The rest is down to implementation and getting the public on board.”

Fig. 1a and 1b (092018ADP-Feature-Electronics1a.jpg, 092018ADP-Feature-Electronics1b.jpg)

The massive increase in electronic complexity is illustrated here. In 30 years the industry has moved from increasingly complex copper wire harnesses to high-speed data cables as the number of safety, infotainment and telematics systems has mushroomed.


Fig. 2 (092018ADP-Feature-Electronics2.jpg)

ZF's ProAI unit uses NVIDIA's scalable DRIVE PX 2 AI platform to process inputs from LiDAR, radar, ultrasonic sensors, and multiple cameras. Capable of over-the-air updates, it can communicate with other vehicles and the surrounding infrastructure. In fully autonomous vehicles, it will be part of a controller about eight inches square.


Fig. 3 (092018ADP-Feature-Electronics3.jpg)

Automatic Emergency Braking regulations are driving the adoption of updated single-lens camera systems capable of recognizing and reacting to pedestrians and crossing bicycles. Semi-autonomous driving functions will likely require a three-lens unit that adds a telephoto lens for long-range sensing, and a fish-eye lens with a wider field of view for improved short-range sensing.