While autonomous vehicles, or self-driving cars, are still far from being available to the average driver, most cars on the market now offer advanced driver-assistance systems (ADAS). These semi-autonomous features recognize driving threats and react accordingly, helping to reduce traffic fatalities and enhance vehicle reliability and safety. ADAS not only help vehicles avoid hazards outside of human control, but they also help those behind the wheel become better drivers so that they can learn to recognize danger and avert imminent threats on their own.
ADAS are able to diminish human error by notifying the driver of risks and responding to those hazards efficiently and effectively. But just like drivers, the cameras on cars can only react to what they can see. Consequently, in the absence of “super” vision that can deliver crucial image data, the safety and reliability of these systems are frequently compromised.
What’s the Low-Visibility Challenge?
Today, ADAS don’t function in many common adverse weather and low-light conditions. Despite only 25% of travel occurring at night, nearly 49% of accidents occur during that time, meaning drivers are roughly twice as likely to crash when driving at night. A major obstacle in creating a safe ADAS solution has been integrating a sensor that can see in low-visibility scenarios, such as fog, haze, dust, glare, rain, and darkness. As a result, driver-assistance systems lack critical information about the immediate environment that’s required to make smart and safe decisions.
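As a quick sanity check on those figures (taking the 25% travel-share and 49% accident-share numbers as given, and assuming crash risk is proportional to accidents per unit of travel):

```python
# Shares of travel and accidents occurring at night, as cited above.
night_travel_share = 0.25
night_accident_share = 0.49

# Accident rate per unit of travel, relative to the overall average.
night_relative_rate = night_accident_share / night_travel_share        # 1.96
day_relative_rate = (1 - night_accident_share) / (1 - night_travel_share)  # 0.68

# Night driving carries roughly twice the average risk, and close to
# three times the daytime risk under this simple proportional model.
print(round(night_relative_rate, 2))
print(round(night_relative_rate / day_relative_rate, 2))
```

This back-of-the-envelope model ignores confounders such as who drives at night and on what roads; it only shows that the quoted percentages are consistent with the "nearly twice as likely" claim.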
Moreover, even under optimal visibility conditions, ADAS hazard-detection capabilities are extremely deficient and pose a serious threat for drivers who depend on the technology’s accuracy. For example, current sensing technologies struggle to detect pedestrians wearing dark clothes or dark-furred animals crossing the road.
A recent AAA study tested pedestrian-detection automatic emergency-braking systems and found that none of them could detect an adult pedestrian crossing in front of a vehicle at night. This deficiency leaves room for many unnecessary accidents to occur. For instance, more than 1.3 million deer-related accidents occur in the United States every year. In addition, invisible hazards such as oil slicks and black ice can’t be detected from a safe distance. Thus, there’s an urgent need for the ability to recognize hazards that are nearly impossible to detect with current ADAS technology.
Seeing Beyond the Visible
ADAS found in most new vehicles today rely mainly on cameras to “see,” with the support of radar. Autonomous-vehicle systems also include light detection and ranging (LiDAR) sensors, which currently come at an extremely high cost.
It’s undeniable that the ADAS technology on the market today has reduced the likelihood of car accidents. However, until now, the industry’s standard sensor-fusion approach hasn’t delivered a consistently reliable solution. Most of today’s systems fall back on radar in low-visibility scenarios, but radar’s low resolution and high false-positive rate make it unreliable on its own. The ADAS sensor-fusion solution still lacks an affordable sensor modality that can provide high resolution in common low-visibility conditions for the automotive market.
Fortunately, a viable solution to the problem of low visibility is possible. Short-wave infrared (SWIR) sensing has the potential to enhance driver capabilities and enable precise hazard detection. Because SWIR operates at longer wavelengths than visible light, it’s scattered significantly less by atmospheric obscurants such as fog, haze, and dust. Existing SWIR cameras are based on an exotic compound semiconductor, indium gallium arsenide (InGaAs). They’re currently being used in industries such as defense and aerospace, but up to now, they haven’t found their way into mass-market applications due to high cost and long lead times.
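To illustrate why longer wavelengths help, here is a back-of-the-envelope sketch using the Rayleigh small-particle approximation, in which scattered intensity scales as 1/wavelength⁴. (This approximation holds for particles much smaller than the wavelength, such as fine haze; large fog droplets fall into the Mie regime, where SWIR’s advantage is smaller but still real.)

```python
# Rayleigh scattering intensity scales as 1 / wavelength**4.
visible_nm = 550.0   # green light, near the middle of the visible band
swir_nm = 1550.0     # a wavelength commonly used in SWIR imaging

# How much more strongly visible light scatters than SWIR, per particle.
scattering_ratio = (swir_nm / visible_nm) ** 4

# ~63: under this approximation, 550-nm light scatters about 63x more
# than 1550-nm SWIR light off the same small particles.
print(round(scattering_ratio))
```

The specific wavelengths here are illustrative choices, not figures from the article; the point is only the strong wavelength dependence of scattering.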
Based on almost a decade of academic research, TriEye was able to fabricate the industry’s first CMOS-based SWIR sensing solution that can be mass-produced. As a result, SWIR, which is able to “see beyond the visible,” can be applied to ADAS to help cars perceive what standard visible cameras are unable to see. It also allows for the detection of obscured and unseen objects at longer ranges so that the ADAS can alert the driver and react to hazards before it’s too late. The combination of SWIR’s vision capabilities and the manufacturability of CMOS holds the promise of a considerably safer driver experience in the near future.
Ziv Livne is the VP of Product and Business Development at TriEye.