Distracted driving leads to thousands of deaths every year. Even as cars grow more intelligent, autonomous vehicles still rely on human intervention to ensure proper functionality and safety. In fact, level 0 through level 4 autonomous vehicles all require some degree of human interaction; only level 5 operates entirely without intervention.
To put this in context, Tesla vehicles are currently rated at level 2, a far cry from level 5.
The levels of driving automation. Image used courtesy of Synopsys
Clearly, it is imperative that drivers maintain focus, no matter how smart their cars are. Yet distractions are more prevalent today than ever before. The solution many engineers have converged on is the driver monitoring system (DMS).
Innovation for these systems often occurs at the chip level. OmniVision, a company focused on digital imaging, has produced a new ASIC to rise to the occasion: the first of its kind to integrate a neural processing unit (NPU) and an image signal processor (ISP) for eye-gaze and eye-tracking algorithms.
How Driver Monitoring Systems Track Attention
As the name implies, a DMS is a system that provides a real-time evaluation of the driver, determining their focus and intervening as needed.
DMS tracks head position, eye direction, and eyelid activity. Image used courtesy of Valeo
As AAC contributor Nicholas St. John writes, these systems normally use a charge-coupled device (CCD) camera, paired with infrared illumination, physically mounted on the steering wheel. The camera tracks the driver’s behavior, including eye movement, head position, and eyelid activity, to determine whether the driver is paying attention. If inattention is detected, the system issues an alert to regain the driver’s attention.
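The alerting logic described above can be sketched as a simple sliding-window check on eyelid state, in the style of a PERCLOS (percentage of eyelid closure) metric. This is an illustrative sketch only; the window size, threshold, and function names are assumptions, not the actual algorithm used by any vendor mentioned here.

```python
from collections import deque

def make_perclos_monitor(window: int = 90, threshold: float = 0.4):
    """Illustrative PERCLOS-style drowsiness check: alert when the fraction
    of recent frames with closed eyes exceeds a threshold. Both parameter
    defaults are hypothetical values, not published DMS settings."""
    frames = deque(maxlen=window)  # 1 = eyes closed in this frame, 0 = open

    def update(eyes_closed: bool) -> bool:
        frames.append(1 if eyes_closed else 0)
        # Withhold judgment until a full window of observations exists.
        if len(frames) < window:
            return False
        return sum(frames) / window > threshold

    return update

# Feed a toy sequence: 5 closed frames, 5 open frames, 5 closed frames.
monitor = make_perclos_monitor(window=10, threshold=0.4)
alerts = [monitor(closed) for closed in [True] * 5 + [False] * 5 + [True] * 5]
```

In a real system the per-frame open/closed decision would come from the camera pipeline's facial-landmark model; this sketch only shows how per-frame observations might be aggregated into an alert decision.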
The Hardware Blocks of a DMS
A driver monitoring system generally consists of several key hardware blocks: the camera module, LED drivers, a dedicated processor (normally an SoC or ASIC), power management ICs (PMICs), and memory.
A typical DMS block diagram. Image used courtesy of NXP
The system may seem straightforward, but the design presents many challenges. For starters, these systems need a small footprint to preserve space in the car. They must also be insensitive to noise, since an automobile is rife with sources of interference, such as the radio and the engine.
They also need to be low power to avoid draining the car battery. This is particularly challenging because these systems must run AI applications on the processor to perform the facial tracking that actually monitors the driver. Meeting this power budget has been the focus of many design efforts in this space.
OmniVision’s ASIC Brings More Intelligence to DMS
This week, OmniVision released news of its newest product, an ASIC designed explicitly for DMS. Looking to develop a low-power, high-performance AI processor, OmniVision introduced many industry firsts into the ASIC.
Notably, the company claims that its new product, the OAX8000, is the first dedicated DMS processor to integrate a neural processing unit and an image signal processor to support the facial tracking a DMS must perform. The ASIC is said to achieve a peak of 1.1 TOPS with a typical use-case power consumption of 1 W.
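To put the 1.1 TOPS figure in rough perspective, a back-of-the-envelope throughput budget can be computed. The per-frame workload below is a purely hypothetical example, not a published OmniVision number:

```python
# Back-of-the-envelope throughput budget for a 1.1 TOPS NPU.
# The 5 GOPs-per-frame workload is a hypothetical example model,
# not a published figure for the OAX8000.
peak_ops_per_s = 1.1e12   # 1.1 TOPS peak, per OmniVision's claim
ops_per_frame = 5e9       # assumption: a model costing 5 GOPs per inference
max_fps = peak_ops_per_s / ops_per_frame
print(max_fps)  # 220.0 frames/s at (unrealistic) 100% utilization
```

Real sustained throughput would be well below this ceiling, since utilization, memory bandwidth, and the 1 W thermal envelope all constrain the NPU, but the headroom suggests why a dedicated accelerator suits continuous facial tracking.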
Functional block diagram of the OAX8000. Image used courtesy of OmniVision
The company also claims that the ASIC is the industry’s first to include on-chip DDR3 SDRAM, offering 1 GB. Other dedicated hardware in the device, such as the quad-core Arm Cortex-A5 CPU, manages image processing, video encoding, and RGB/IR processing. Arm’s Neon technology supports on-chip video analytics algorithms, and TensorFlow, Caffe, MXNet, and ONNX toolchains support the embedded NPU.
Market Growth in Driver Monitoring Systems
Market analysts predict that demand for DMS will drive a 56% CAGR between 2020 and 2025. The new release from OmniVision continues the trend of companies targeting low-power, AI-specific processors to meet this DMS demand and (perhaps) level 5 autonomy in the future.
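A 56% CAGR compounds quickly. Assuming five compounding years across the 2020–2025 span, the implied overall growth works out to roughly a ninefold market expansion:

```python
# Overall growth implied by a 56% CAGR, assuming five compounding
# years between 2020 and 2025.
cagr = 0.56
years = 5
multiplier = (1 + cagr) ** years
print(round(multiplier, 1))  # ~9.2x the 2020 market size
```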