The automotive industry is facing one of the most demanding challenges in its history: how to make automated travel safe in all conditions. With advanced and autonomous vehicles entering the market, solving problems linked to illumination and weather conditions such as rain, fog and snow is key to ensuring a safe environment for drivers, passengers and pedestrians. FIFTY2 is a proud partner of the international AI-SEE project (Artificial Intelligence enhancing vehicle vision in low visibility conditions), helping to tackle these challenges with PreonLab.
Key application areas: Transport and Smart Mobility; Safety, Security and Reliability
Countries involved: Germany, Canada, Austria, Sweden, Finland, Israel
The automotive industry is facing one of the most demanding challenges in its history: how to make automated travel safe in all conditions. There have been great advances towards automation, with new vehicles increasingly equipped with advanced driver assistance systems (ADAS). The biggest barrier now remaining to full automation is safe driving in poor weather and low visibility. The AI-SEE project aims to build a novel, robust sensing system supported by Artificial Intelligence (AI) that will enable automated travel in varied traffic, lighting and weather conditions. It will extend the Operational Design Domain (ODD) of automated vehicles (i.e. the scope of what they can do), taking the technology from SAE level 3 (conditional automation) to level 4 (high automation), where vehicles drive themselves without human intervention in most circumstances.
However, moving from level 3 to level 4 requires solutions to four key challenges:
AI-SEE is focusing primarily on the second challenge by increasing the environmental and situational awareness of vehicles.
Humans ‘see’ by combining stored memories and sensory input to interpret events and anticipate upcoming scenarios. Today’s automated vehicles cannot yet perform this kind of inferential thinking, nor communicate in real time with their environment. For automated vehicles to drive without human intervention, the information content from current sensors needs to be enhanced significantly. But this will create an increasingly large amount of data transmitted at very high data rates which, together with all the additional sensors, will quickly exceed the limits of in-vehicle storage space and of the vehicle’s computational and energy resources.
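To make the scale of the problem concrete, the following back-of-envelope sketch estimates the raw data rate of a hypothetical sensor suite; all sensor counts, resolutions and bit depths are illustrative assumptions, not AI-SEE specifications.

```python
# Back-of-envelope estimate of raw sensor data rates for an automated
# vehicle. All sensor counts and parameters are illustrative assumptions,
# not AI-SEE specifications.

def camera_rate_mbps(width, height, bits_per_pixel, fps):
    """Uncompressed video data rate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

def lidar_rate_mbps(points_per_second, bits_per_point):
    """Raw LiDAR point-stream data rate in megabits per second."""
    return points_per_second * bits_per_point / 1e6

# Hypothetical suite: 6 cameras, 4 LiDARs, 4 radars.
cameras = 6 * camera_rate_mbps(1920, 1080, 24, 30)  # roughly 9 Gbit/s in total
lidars = 4 * lidar_rate_mbps(1_200_000, 128)        # roughly 0.6 Gbit/s in total
radars = 4 * 200.0                                  # assume ~200 Mbit/s per radar

total_mbps = cameras + lidars + radars
print(f"Approximate raw sensor data rate: {total_mbps / 1000:.1f} Gbit/s")
```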
Together, the high number of sensors needed for 360-degree environment perception and situation awareness, and the high cost of the LiDAR (Light Detection & Ranging) sensors used for measuring distances to objects, represent significant barriers to the wider roll-out of automated driving (prices for individual LiDAR sensors can reach up to €10,000; Garmin, 2019, LIDAR-Lite v3HP: A Low-Cost Solution to Autonomous Building-Interior Mapping).
Partners:
Algolux (Germany) GmbH
Algolux Inc.
ams AG
ANSYS Germany GmbH
AstaZero AB
AVL List
Basemark
BWV
FIFTY2 Technology GmbH
Ibeo Automotive Systems
Meluta Oy
Mercedes-Benz AG (Project Leader)
OQmented GmbH
Patria Land Oy
Robert Bosch GmbH
Technical University of Ingolstadt
UNIKIE Oy
University of Stuttgart
University of Ulm
Veoneer Sweden AB
VTT Technical Research Centre of Finland Ltd
Start: 2021-06
End: 2024-06
AI-SEE will address these challenges by combining complex hardware and software development, creating automotive perception systems that go beyond today’s state of the art. Its goal is to introduce reliable, secure and trustworthy sensors and software by implementing self-diagnosis, adaptation and robustness.
The AI-SEE concept is built on four main blocks:
The project will deliver the first high-resolution adaptive multi-sensor suite, building on a novel AI perception-processing scheme for low visibility conditions.
Specifically, AI-SEE will create novel sensor hardware comprising an active polarimetric imager and congruent LiDAR data; a short-wave infrared (SWIR) LiDAR with a novel SPAD receiver architecture; a high-resolution 4D MIMO radar; and a gated SWIR camera. To support the novel sensing system and improve localization performance in poor weather, the project will also take high-definition (HD) dynamic mapping to a new level. In addition, to handle the multi-sensor data fusion, an AI platform will be built to advance early signal enhancement for robust perception.
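As a purely illustrative sketch of what multi-sensor fusion can look like in its simplest form (the sensor names, reliability weights and numbers below are assumptions made for the example, not part of the AI-SEE design), measurements of one object from several sensors can be combined by weighting each modality according to how trustworthy it is in the current weather:

```python
# Minimal late-fusion sketch: combine range estimates for one tracked
# object from several sensors, weighting each sensor by an assumed
# reliability under the current visibility condition. This is an
# illustrative toy model, not the AI-SEE perception stack.
from dataclasses import dataclass

@dataclass
class Measurement:
    sensor: str         # e.g. "lidar", "radar", "gated_swir"
    range_m: float      # measured distance to the object in metres
    reliability: float  # assumed weight in [0, 1] for the current weather

def fuse_range(measurements):
    """Reliability-weighted average of the per-sensor range estimates."""
    total_weight = sum(m.reliability for m in measurements)
    if total_weight == 0:
        raise ValueError("No reliable measurement available")
    return sum(m.range_m * m.reliability for m in measurements) / total_weight

# In dense fog, radar keeps most of its weight while LiDAR and the
# gated SWIR camera are down-weighted (all numbers are made up).
fog_measurements = [
    Measurement("lidar", 41.8, 0.3),
    Measurement("radar", 43.1, 0.9),
    Measurement("gated_swir", 42.2, 0.6),
]
print(f"Fused range estimate: {fuse_range(fog_measurements):.1f} m")
```

AI-SEE itself targets fusion much earlier in the processing chain, on the raw sensor signals, but the toy example conveys the basic idea of weighting modalities by their reliability under the conditions at hand.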
Importantly, the project will develop sensor-near simulation models for all active sensors, enabling the artificial generation of synthetic inclement-weather datasets. This is expected to revolutionise simulation by allowing good-weather neural network datasets to be converted into inclement-weather ones, saving large amounts of money and time in testing and validating sensor performance in bad weather. Moreover, a large outdoor weather data bank for testing, modelling and validation will also be created. Together, these advances will lead to a paradigm shift in signal enhancement techniques and a competitive advantage for the European automotive industry.
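As an illustration of how good-weather camera data can be turned into synthetic foggy data, the sketch below applies the widely used atmospheric scattering model I(x) = J(x)·t(x) + A·(1 − t(x)) with transmission t(x) = exp(−β·d(x)). This is a generic textbook model, shown here only to convey the idea; it stands in for the far more detailed sensor-near models the project will develop, and the image, depth values and fog density are made-up example inputs.

```python
# Sketch of synthetic fog augmentation for a clear-weather camera image,
# based on the standard atmospheric scattering model
#     I(x) = J(x) * t(x) + A * (1 - t(x)),   t(x) = exp(-beta * d(x)).
# This is a generic illustration, not one of the sensor-near models
# developed in AI-SEE (which also cover active sensors such as LiDAR and radar).
import numpy as np

def add_synthetic_fog(image, depth_m, beta=0.05, airlight=0.9):
    """Blend a clear image with airlight according to per-pixel depth.

    image:    float array in [0, 1], shape (H, W, 3)
    depth_m:  per-pixel distance to the scene in metres, shape (H, W)
    beta:     fog density (attenuation coefficient per metre)
    airlight: brightness of the atmospheric light
    """
    transmission = np.exp(-beta * depth_m)[..., None]  # t(x) in (0, 1]
    return image * transmission + airlight * (1.0 - transmission)

# Toy example: a random 4x4 "image" with scene points 10-80 m away.
rng = np.random.default_rng(0)
clear = rng.random((4, 4, 3))
depth = np.linspace(10.0, 80.0, 16).reshape(4, 4)
foggy = add_synthetic_fog(clear, depth, beta=0.05)
print(foggy.shape, float(foggy.min()), float(foggy.max()))
```

Sensor-near models for active sensors additionally have to capture effects such as backscatter from fog, rain and snow on the emitted signal itself, which is what makes the project's simulation work substantially more demanding than this passive-camera sketch.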