Here are the major insights and takeaways we’d like to share from our talks with smart mobility innovators, researchers, safety and automotive experts, and more.
Each event included an active exchange of ideas through private meetings and presentations, and our team gained valuable insights. A main topic of discussion was the future of mobility: many organizations shared their expectations for the smart mobility market, and a challenge mentioned repeatedly was the need for more sophisticated and efficient image sensors.
In their presentation at this year’s AutoSens event, Woodside Capital revealed that 63% of cars sold in the US currently include lane-keep assist, but only 10% of cars in use around the globe have any form of ADAS. Although the use of automation in vehicles is increasing, manufacturers are hesitant to make the leap to full automation until they can fully guarantee safety. Current visibility solutions (e.g., LiDAR, radar, thermal) and even deep learning computer vision approaches can’t guarantee user safety in all situations (e.g., nighttime, snow, rain, fog), causing the industry to plateau.
Attendees broadly agreed that most of today’s visibility systems have limited capabilities and hold the industry back. Current camera solutions, as well as sensors like LiDAR, thermal, and radar, become partially ineffective in compromised visibility conditions, such as limited lighting or poor weather (e.g., rain, snow, fog), as well as in some daytime use cases. The consensus is that a fusion of sensors will prevail, either via a camera-only approach (as we have been advocating for some time with our two-camera sensor suite) or by adding non-visual sensors into the mix as redundancies and backups, alongside use-case-specific sensors.
For a driverless future to become a reality, a solution for limited visibility conditions is a must. This month, we were visited by some of the greatest automotive companies in the world, new customers, and previous connections, all of whom recognize that now is the right time to re-address and solve the weather and night challenges. We’re excited and humbled by all of the interest and partnership invitations.
Many event attendees concluded that the ideal solution would combine camera technology that performs in all day, night, and weather conditions, a redundancy sensor, and advanced visual perception software.
Truck brands and Tier 1 suppliers shared their immediate need (on stage and with us) for L2 vision at night and in limited visibility conditions, as fleets with thousands of trucks drive in challenging conditions around the world. These fleets could greatly benefit from a retrofit camera system.
New trucks and L4 vehicles present an almost “low-hanging fruit” opportunity and a solid economic case for driverless, 24/7 trucking, day and night, year-round.
Some event attendees were not yet familiar with our technology and its uniqueness. As we explained and demonstrated, it is the only visibility solution currently on the market capable of providing a clear image through darkness, glare, adverse weather conditions, and their combinations (e.g., rain at night).
Powered by GatedVision technology, VISDOM is a camera system that fuses high-contrast range slices, captured at varying depths, into a single clear frame in which both small and large objects can be detected in all lighting conditions, even at high speeds.
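To make the idea of slice fusion concrete, here is a minimal sketch in NumPy. It assumes the simplest possible fusion rule, a per-pixel maximum over range-gated slices, so that each pixel keeps the strongest response from any depth band. The actual GatedVision fusion algorithm is proprietary and certainly more sophisticated; the function name and toy data below are illustrative only.

```python
import numpy as np

def fuse_gated_slices(slices):
    """Fuse range-gated slices (each exposing a different depth band)
    into a single frame by keeping the brightest response per pixel.
    Illustrative sketch only -- not the real GatedVision algorithm."""
    stack = np.stack(slices, axis=0)  # shape: (n_slices, H, W)
    return stack.max(axis=0)          # per-pixel maximum across slices

# Three toy 2x2 "slices", each lighting up a different depth region
near = np.array([[9, 0], [0, 0]], dtype=np.uint8)
mid  = np.array([[0, 7], [0, 0]], dtype=np.uint8)
far  = np.array([[0, 0], [5, 3]], dtype=np.uint8)

fused = fuse_gated_slices([near, mid, far])
print(fused)  # each pixel holds the strongest slice response
```

A per-pixel maximum is just one plausible fusion rule; a real system might instead weight slices by depth, exposure, or confidence before combining them.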
VISDOM provides the ideal solution to many of the industry’s current challenges.
A recent study by the Insurance Institute for Highway Safety (IIHS) found that AEB (automatic emergency braking) systems prevent crashes and fatalities only in the daytime.
We are proud to join 20 of the world’s top organizations in the AI-SEE project, using our innovative tech to solve automated driving in variable traffic and weather conditions.