While you watch the road, your car may be watching you back. The automotive industry’s transition toward self-driving technology means cars increasingly are equipped with features that measure driver alertness and engagement, among many other data points. Executives say such features save lives and spur innovation, while simultaneously raising significant technical, legal, and ethical questions.
The discussion comes at a time when governments are exploring how driver monitoring systems can make roads safer. The European Union’s new vehicle safety rules, which went into effect in July, require all new vehicles from July 2024 to be equipped with multiple features including systems that monitor driver drowsiness. Earlier this summer, the National Highway Traffic Safety Administration expanded its investigation into whether Tesla’s Autopilot system exacerbates “human factors or behavioral safety risks by undermining effectiveness of the driver’s supervision.” The agency’s initial investigation, opened last year, included an assessment of the “technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”
Experts agree that driver-monitoring systems are a critical bridge between today’s high-tech autos and the fully self-driving cars of the future. Most commercial cars contain autonomous technology at Level 2 or lower and require the driver to remain alert at all times. Only a few modern commercial vehicles reach Level 3, which still requires the driver to take control when the vehicle encounters a situation it is ill-equipped to handle. Level 5 autonomous cars, which do not exist, would require no human assistance. That means the cars of the foreseeable future will depend heavily on monitoring systems to ensure the driver remains alert and ready to operate the vehicle when necessary.
The financial considerations for automotive OEMs go beyond liability for safety-related lawsuits and regulatory actions. Modern monitoring systems generate a massive amount of data, some of which is used by car companies to improve and advance autonomous technology. But the extent to which drivers are aware of this usage, and are willing to swap their privacy for the promise of safer roads, remains to be seen.
National conversations about these topics are in their early stages, but some of the hardware involved, like pressure sensors and cameras, is well-established. Sensors in seats determine when and if airbags should deploy, while those in steering wheels measure whether or not the driver is “hands-on.” Driver-facing cameras are included in most new cars today that sell for more than $40,000 or so, though the sophistication varies widely between auto models and makers. LEDs illuminate the driver’s face and allow the camera system to make its measurements. The cameras that monitor driver fatigue generally do so by measuring eyelid droop, alerting the driver to pull over when the droop exceeds normal levels.
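The eyelid-droop approach is often formalized as a PERCLOS-style metric: the percentage of time over a sliding window that the eyes are mostly closed. Below is a minimal sketch of that idea; the per-frame `eye_openness` input and all threshold values are illustrative assumptions, not taken from any production system.

```python
from collections import deque

class DrowsinessMonitor:
    """PERCLOS-style drowsiness estimate over a sliding window of frames.

    eye_openness: 0.0 (fully closed) to 1.0 (fully open), one value per
    camera frame. All thresholds here are illustrative only.
    """

    def __init__(self, window_frames=900, closed_below=0.2, alert_above=0.3):
        self.frames = deque(maxlen=window_frames)  # e.g. 30 s at 30 fps
        self.closed_below = closed_below           # "mostly closed" cutoff
        self.alert_above = alert_above             # PERCLOS alarm level

    def update(self, eye_openness):
        """Add one frame; return True if the driver should be alerted."""
        self.frames.append(eye_openness < self.closed_below)
        if len(self.frames) < self.frames.maxlen:
            return False                           # window not yet full
        perclos = sum(self.frames) / len(self.frames)
        return perclos > self.alert_above
```

A real system would add debouncing and sensor-dropout handling, but the core signal is just this ratio of closed-eye frames.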
Far more complex is the hardware that enables artificial intelligence to process these relatively straightforward inputs and allow the car to make decisions based on that data. “As soon as you have an AI, then we need to talk about different levels of compute complexity,” said David Fritz, vice president of hybrid-physical and virtual systems at Siemens Digital Industries Software. “A simple monitor is probably a 32-bit microcontroller doing some simple things. With AI, it can take six or eight cores, and it’s got an AI accelerator unit (also called NSP or NPU). You need all that to recognize what you’re seeing and make some intelligent decisions based on the context of what you’re seeing. The compute that’s required to do that is pretty significant.”
Vasanth Waran, senior director of business development for automotive at Synopsys, says automotive companies think about the technology in terms of driver monitoring systems (DMS) and occupant monitoring systems (OMS). While a DMS is focused solely on the person operating the car, an OMS could be used to monitor distracting activity from other passengers or whether a baby is left in a car alone.
Waran says it will take several years for the industry to figure out the standards that will allow for transmitting driver monitoring data beyond the vehicle. “It’s not just vehicle-to-vehicle,” he said. “It’s also through the infrastructure. Then it doesn’t have to go through a network if it can be transmitted back and forth through infrastructure. Another possibility is satellite communication. The standards are evolving to address a situation where you don’t have infrastructure or connection to a 5G network, but you will still have communication.”
Driver monitoring data likely will be stored both in the cloud and locally to accommodate situations where the car is unable to connect to a network. “The first level of storage will always be some sort of local drive, whether it’s a flash drive or a system with some sort of embedded storage, like a solid state drive,” he said. “And once you park your car, you have the last 10 minutes of data going back to the cloud.”
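The two-tier storage Waran describes, an embedded local drive plus a cloud sync of the final minutes after parking, can be sketched as a timestamp-pruned buffer. Everything here (the record shape, the `upload` callback, the 10-minute window) is a hypothetical illustration of the pattern, not any OEM's actual pipeline.

```python
import time
from collections import deque

class TelemetryBuffer:
    """Local-first storage for monitoring data, with a cloud flush on park."""

    def __init__(self, retain_seconds=600):        # keep the last 10 minutes
        self.retain_seconds = retain_seconds
        self.records = deque()                     # (timestamp, payload) pairs

    def record(self, payload, now=None):
        """Append one sample and drop anything older than the window."""
        now = time.time() if now is None else now
        self.records.append((now, payload))
        cutoff = now - self.retain_seconds
        while self.records and self.records[0][0] < cutoff:
            self.records.popleft()

    def flush_on_park(self, upload, now=None):
        """On park, push the retained window to the cloud via `upload`."""
        now = time.time() if now is None else now
        window = [(t, p) for t, p in self.records
                  if t >= now - self.retain_seconds]
        upload(window)                             # e.g. over a 5G link
        self.records.clear()
        return len(window)
```

In practice the local tier would be flash or an embedded SSD rather than RAM, but the retention-and-flush logic is the same shape.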
Also required is lightning-fast communication between the monitoring systems, the driver, the systems within the car, other cars, and the manufacturer’s computing infrastructure. Amol Borkar, director of product management, marketing and business development for Tensilica Vision and AI DSPs at Cadence, says the evolution is similar to what was seen with the internet over the past 20 years.
“Initially, everything was local on our personal machines, but now we are always connected through our machines and phones,” Borkar said. “At the moment, this data is mostly local for vehicles, as well. For the small percentage of vehicles that have built-in 3G (or higher) data communicators, it can be streamed out and used to improve the software, ADAS applications, and much more.”
Fig. 1: An overview of processing between sensors and a central compute unit in an automobile. Source: Cadence
Some critical tasks are likely to remain local due to uncertainties in network latency and performance. “As this segment evolves, V2X will also evolve significantly, allowing for much more dense communication and more connected vehicles,” Borkar said. “The main motivations for V2X are currently road safety, traffic efficiency and energy savings, amongst others. As you can expect, there will be multiple building blocks, but many will revolve around high-speed communication (WiFi, cellular, automotive Ethernet), sensor and AI processing for analytics and understanding of audio, vision, lidar etc., and a central gateway or vehicle controller to manage all this data. All of this combined results in vehicles that could communicate with each other or the infrastructure to avoid accidents, reroute to reduce congestion in areas, and have a more deterministic flow of traffic.”
Additional complexity is found in the processes, both established and not yet invented, through which the car will use monitoring data to train an AI. “There are so many different possibilities, even when it comes to just watching your eyes,” said Paul Graykowski, senior technical marketing manager at Arteris IP. “What if you are wearing sunglasses or a hat? Or encounter a road sign you’ve never seen before? You’re going to need some kind of pathway to go up to the cloud and back from the cloud with the unique data sets that have been encountered. Your local SoC is processing all this and has to decide what’s unique and interesting.”
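The “decide what’s unique and interesting” step Graykowski describes amounts to a novelty filter running on the local SoC: only samples the on-board model cannot handle confidently are worth round-tripping to the cloud. A toy sketch of that filtering, using classifier confidence as the novelty signal; the `classify` interface and the 0.8 threshold are assumptions for illustration.

```python
def select_for_upload(samples, classify, confidence_floor=0.8):
    """Keep only samples the on-board model cannot classify confidently.

    classify(sample) -> (label, confidence). Low-confidence samples are
    the "unique and interesting" cases (sunglasses, unfamiliar road
    signs) worth sending to the cloud for retraining. The 0.8 floor is
    an illustrative choice.
    """
    novel = []
    for sample in samples:
        label, confidence = classify(sample)
        if confidence < confidence_floor:
            novel.append((sample, label, confidence))
    return novel
```

This keeps uplink bandwidth proportional to how often the model is surprised, rather than to raw sensor throughput.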
So, too, are the implications of that decision-making. Gaize is a Montana-based company behind a product that measures and records eye movements, mimicking the eye-tracking test law enforcement uses to gauge driver impairment. AI-powered software then takes those inputs and correlates them to data sets of sober and impaired eyeballs. The software is focused on cannabis-related driver impairment, though Gaize CEO Ken Fichtler says that scope could broaden in the future.
He says integrating the technology, or something similar to it, directly into a vehicle is part of the company’s long-term roadmap. “The tests we use are the most studied out there for eye movement and impairment,” said Fichtler. “We believe we’re going to be able to use these tests and the data we capture to ultimately design a sort of continuous monitoring system, similar to the fatigue monitoring systems that exist today.”
The product in its current form consists of a virtual reality headset manufactured by Pico, which captures video of the eye movements and generates several types of data. The chip is from Qualcomm, and the eye-tracking sensors are from Tobii. Law enforcement places the headset on the driver and the device runs through multiple tests. The first test measures the extent to which both eyes track a stimulus equally. Then the driver is asked to look toward the periphery of their vision on a horizontal plane to detect any “pronounced twitching” of the eye.
A similar test detects twitching as the eye travels from a horizontal plane to 45 degrees, while another measures the ability of the eyes to smoothly track a stimulus without jerking. Other tests monitor vertical twitching of the eye, whether the eyes can cross and maintain focus, and how the pupil dilates and constricts in response to light stimulus. The data is stored on the device and also uploaded to the cloud. “Vector data, eye data, accelerometer and gyroscope data are all recorded at 90 times a second,” said Fichtler. “It’s very high resolution. Using that, we can then get a very clear understanding of what’s happening in the eye and, by extension, what’s happening in the body.”
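At 90 samples per second, a test like smooth pursuit reduces to signal analysis on a gaze-angle time series: a smoothly tracking eye has near-constant frame-to-frame angular velocity, while nystagmus-style twitching shows up as velocity spikes. Below is a simplified sketch of that idea. Gaize's actual analysis is proprietary; the 90 Hz rate comes from the article, and everything else here is assumed.

```python
def pursuit_jerkiness(gaze_angles_deg, sample_rate_hz=90.0):
    """Score how jerky an eye's tracking is from gaze angles over time.

    Returns the standard deviation of angular velocity (deg/s): near
    zero for smooth pursuit, large when there are saccadic twitches.
    """
    if len(gaze_angles_deg) < 3:
        raise ValueError("need at least 3 samples")
    dt = 1.0 / sample_rate_hz
    velocities = [(b - a) / dt
                  for a, b in zip(gaze_angles_deg, gaze_angles_deg[1:])]
    mean_v = sum(velocities) / len(velocities)
    variance = sum((v - mean_v) ** 2 for v in velocities) / len(velocities)
    return variance ** 0.5
```

A classifier comparing scores like this against labeled sober and impaired recordings is one plausible way to correlate raw eye data with impairment, as described above.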
Fichtler says the process is aimed at eliminating human error from sobriety field tests. “It’s a great benefit not only to law enforcement as they go to prosecute these cases, but also the people being accused of these crimes,” he said. “No one wants an officer that has a bias going into it or didn’t perform the test properly or something like that. But these things happen.”
If technology like Gaize’s is one day integrated into commercial vehicles, it will provide a new avenue for keeping impaired drivers off the roads. It also will generate concerns about how eye movement data is stored and shared, and the circumstances under which it can be accessed by law enforcement and other parties. Such issues speak to a legal and philosophical question that arises as driver monitoring technology becomes more sophisticated — to what extent is one’s car a private place?
The answer, according to Paul Karazuba, vice president of marketing at Expedera, depends to some extent on who is asking and where that person lives. “A car is obviously a private place in the sense that strangers are not allowed to be in it, but a car is also a giant glass-encased cabin where you can look in and see what’s happening,” said Karazuba. “In Europe, car manufacturers have pretty much determined that the inside of your car is a private place. I don’t know if that’s necessarily true of the rest of the world.” He noted that auto OEMs are likely to err on the side of keeping monitoring data more private rather than less, but legal precedents set in the U.S. and other major markets will be the deciding factor.
A corollary issue is how the data generated from driver-monitoring systems will be used by insurers. Present-day owners of cars with self-driving features tend to have higher insurance premiums because high-tech cars are more expensive than their counterparts. That could soon change as countries seek to reduce driver error-related road deaths. Last month, the British government said manufacturers, not drivers, will be held liable for incidents that take place when a vehicle is controlling itself. Karazuba says he wonders about what will happen when driver monitoring systems allow a manufacturer to “know” when a particular driver is consistently distracted or making poor decisions. “Would a car manufacturer send a message to that person’s insurance company saying, ‘This person is a potentially dangerous driver’?” said Karazuba. “What’s the ethical obligation of the car company to notify someone about that?”
As recent Tesla headlines have shown, much of the public discussion as of late has been focused on the extent to which drivers fully understand what their car is and is not capable of, and the extent to which monitoring systems can ensure the driver is engaged with the car when necessary. Expedera’s Karazuba says the process is essentially about education. “It’s not about teaching people new skills, it’s about telling them, ‘Sit back and enjoy the ride, but at the same time, be ready to take over,’” he said.
Ultimately, Siemens’ Fritz says it will be critical for the AI to understand not only what it is encountering as it monitors a driver, but also the meaning of that data, which could vary from driver to driver. An experienced driver, for example, may not make the same facial expressions as a less experienced driver when encountering a challenging situation, because the experienced driver is more confident in their ability to drive appropriately. Sorting that out will require at least some amount of in-vehicle learning, he says, to allow the AI to understand the driver’s idiosyncrasies and process them without latency or network connection issues.
“A lot of people are going to try to do it in the cloud only to find that the cloud is cluttered with other things like pictures of people’s breakfast,” said Fritz.