While driving is something many people do every day, it always carries a certain level of risk. That risk is amplified when drivers face adverse conditions such as darkness or low-visibility weather (e.g., rain, snow, or fog). Even a momentary loss of visibility, such as glare from oncoming traffic, can lead to casualties. This is where sensing technology comes in. According to a 2019 NHTSA report, most pedestrian fatalities take place at night, away from intersections, and over the past 10 years nighttime crashes have accounted for more than 90% of the total increase in pedestrian deaths.
Sensing technology in the automotive industry is designed to help drivers (through advanced driver-assistance systems, or ADAS) and driverless vehicles detect hazards, road users, and pedestrians on the road. These technologies become even more critical as autonomous vehicles approach commercial availability, with more and more major companies throwing their hats into the AV ring. None of this can happen until AVs are proven safe and reliable in all visibility conditions, enabling 24/7, year-round operation.
In this article, we’ll look at some of the most common imaging and sensing technologies used today, the situations they are ideal for, and the situations where they simply don’t work well enough. We will address some current industry myths about sensors and sensing tech: what must be included in the AV sensor suite to optimize cost-effectiveness, which technologies are market-ready, and where today’s sensors simply fail. We will also evaluate the automotive gated imaging camera, which may just resolve the suite-composition dilemma.
Every car needs a high-quality visible-spectrum camera that can see at least in the same conditions as the human eye, i.e., during the day until it gets dark. The standard visible camera performs poorly at night and in adverse weather, and even HDR cameras (High Dynamic Range cameras, commonly used as front-facing cameras) suffer severe performance reductions. The advantages of standard visible cameras are their low cost and their reliance on robust, validated CMOS technology. They are a must-have, but not sufficient as a standalone solution in the sensor suite: they are sensitive to contrast and effectively blind much of the time, i.e., in darkness, rain, snow, fog, and direct glare.
Thermal imaging works by sensing the long-wave infrared (LWIR) radiation emitted by all real-world objects. It can identify pedestrians even in congested areas; however, the sensor’s output is affected by target temperature, emissivity, and the environment.
Thermal imaging is a popular solution for low visibility problems often encountered when driving. It can perform in darkness and dust and is hardly affected by sunshine or direct glare.
That said, thermal cameras are limited: they can distinguish between objects only if those objects differ in temperature. They have great difficulty detecting objects on the road such as fallen cargo, or even motorcycle riders wearing highly insulated clothing. While thermal imaging is excellent for use in the dark, it can be on the pricier side and has critical limitations: it can’t see through rain or fog, and it can’t be mounted behind the front windshield (since LWIR radiation doesn’t penetrate glass), so keeping the lens clean and free of debris is much more difficult.
LiDAR stands for Light Detection and Ranging and uses light in the form of a pulsed laser to build a 3D perception of the surrounding world, providing highly accurate information about the area around the vehicle and the positions of objects in it. LiDARs come in many shapes and forms, most notably the mechanical spinning LiDAR, though the market has recently been turning towards solid-state designs. The problems with all current LiDARs are low spatial resolution compared to cameras and poor performance in adverse weather, like rain or fog, where the laser is scattered by water droplets in the air. Also, LiDAR is not an imaging sensor: it cannot tell exactly what it is seeing unless the object is close enough to return a sufficient number of detection points. In addition, since this technology is rather new, it is very expensive, and it will take time to mature and come down in cost.
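Both LiDAR and radar rest on the same time-of-flight idea: measure how long a pulse takes to bounce back, and halve the round trip. A minimal sketch of that principle (illustrative only; the function name and the example timing are assumptions, and real sensors must handle noise, multiple returns, and scattering):

```python
# Time-of-flight ranging: the core principle behind LiDAR (and radar).
# Illustrative sketch only, not a real sensor pipeline.

C = 299_792_458.0  # speed of light in m/s

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to a target, given the pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half the total path length.
    """
    return C * round_trip_seconds / 2.0

# A pulse that returns after ~1 microsecond came from roughly 150 m away.
print(f"{range_from_echo(1e-6):.1f} m")
```

The sketch also hints at why rain and fog hurt: a droplet close to the sensor returns an early echo that looks like a nearby object, corrupting the measured round-trip time.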
RADAR, an abbreviation of Radio Detection and Ranging, is quite similar to LiDAR in that it relies on electromagnetic waves (radio waves) to determine the distance and position of objects relative to the radar source. Energy is emitted into the surrounding space, and the echoes reflected by objects are detected. While radar is exceptionally effective at detecting very large objects even far away, its low spatial resolution is inadequate for detecting small objects at vehicle stopping distances. In addition, spurious returns can cause false detections, making radar an add-on sensor rather than one that can be relied on as the main sensor in low-visibility weather conditions.
GatedVision is a patented imaging technology developed by Bright Way Vision. The company’s product, VISDOM, is a market-ready enhanced-vision solution for the automotive, transportation, and smart-city industries, enabling clear vision in all weather and lighting conditions. VISDOM is an automotive-qualified, eye-safe (Class 1) camera system with high-speed gated illumination. It can be mounted behind the front windshield or integrated into the vehicle headlights, and it comes in three configurations covering multiple transportation sectors, including cars, trucks, light rail, buses, and robotaxis, offering a uniformly lit range of up to 300 m, a horizontal FOV of 16 to 60 degrees, and resolution up to 2 megapixels. VISDOM delivers the contrast, detection, and recognition required by leading automotive manufacturers and AI perception technologies.
High-contrast images are produced using thousands of micro-exposures per frame across versatile range slices. The slices collected from varying depths are then compiled into a single clear frame, and the resulting high-resolution image is detailed enough to reveal small objects even at night and at high speed. The system consists of two parts, a CMOS imager and an illuminator: the illuminator sends out pulses of NIR light (invisible to the naked eye), and the camera’s exposure is timed to their return. By varying the pulse count and gate timing, the system creates slices at multiple ranges, each providing new information. This control over the shape and range of each slice yields a uniform, clear image with exceptional contrast while rejecting the backscattered light that would otherwise add noise. In changing weather conditions, this ability to vary the pulses is critical.
The high resolution of the images created by GatedVision sets it apart from a standard camera, which struggles to collect usable images in more complex conditions such as fog and rain, or at night with low ambient light. GatedVision expands the visibility range and presents clear images even at long distances and in low-visibility weather.
By delivering a continuously clear image in all weather and lighting conditions, GatedVision enables reliable object detection (including detection of shadowed and indistinct objects) and road visibility, improving the reliability of any sensor suite and restoring road safety where vehicles today simply can’t see.
While many of these technologies have advantages in some areas, GatedVision stands out as the most effective and adaptable to various conditions.
An AAA report published in October 2019 concluded that while imaging technologies may be reliable in ideal conditions, they are often ineffective at night or in conditions of reduced visibility. The report states: "drivers must not rely on assistance from current pedestrian detection systems during nighttime driving or other environments with reduced visibility."
Independent tests performed under the DENSE European Research project evaluated the performance of various optical sensing technologies, including visual cameras, thermal imagers, LiDARs, and SWIR, under adverse weather conditions. Their "Benchmarking Image Sensors Under Adverse Weather Conditions for Autonomous Driving" report concluded: "The main disadvantage of standard cameras is the loss of contrast due to air-light and attenuation …in contrast, the gated camera shows much better contrast and higher viewing distances can be perceived."
GatedVision is the only imaging technology that is reliable in any weather and lighting condition, from darkness and glare to rain, fog, and snow. It delivers a continuous clear image at up to 120 fps and enables object detection and road visibility, making it a key component in any sensor suite.
The same AAA report, published in October 2019, examined automatic emergency braking with pedestrian detection and found that "all evaluated systems were ineffective during nighttime conditions."
Level 5 autonomy can never be fully implemented in fleets or taxis until vehicles are prepared to drive in any and all weather, especially at night.