Link: https://www.sae.org/news/2024/07/adas-sensor-update
Are today’s sensors ready for next-level automated driving?
SAE Level 3 automated driving marks a clear break from lower-level driver assistance: it is the dividing line at which the driver can be freed to focus on things other than driving. While the driver may still be required to take control at times, responsibility in an accident can shift from the driver to the automaker and its suppliers. Only a few cars have won regulatory approval for Level 3 operation. Thus far, only Honda's Legend (in Japan), the Mercedes-Benz S-Class and EQS sedans with Drive Pilot, and BMW's recently introduced 7 Series offer Level 3 autonomy.
With more vehicles getting L3 technology and further automated-driving capabilities in development, we checked in with some of the key players in this space to hear the latest industry thinking on best practices for ADAS and AV sensors.
Towards More Accurate 3D Object Detection
Researchers from Japan's Ritsumeikan University have developed DPPFA-Net, an innovative network that combines 3D LiDAR and 2D image data to improve 3D object detection for robots and self-driving cars. Led by Professor Hiroyuki Tomiyama, the team addressed challenges in accurately detecting small objects and aligning 2D and 3D data, especially in adverse weather conditions.
DPPFA-Net incorporates three key modules, illustrated by the minimal fusion sketch after the list:
- Memory-based Point-Pixel Fusion (MPPF): Enhances robustness against 3D point cloud noise by using 2D images as a memory bank.
- Deformable Point-Pixel Fusion (DPPF): Focuses on key pixel positions for efficient high-resolution feature fusion.
- Semantic Alignment Evaluator (SAE): Ensures semantic alignment between data representations during fusion.
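The core idea behind point-pixel fusion can be shown in a few lines: project each 3D LiDAR point into the camera image and gather the 2D features at that location, so every point carries both geometric and visual information. The sketch below is a minimal illustration of that projection-and-gather step, not the DPPFA-Net implementation; the camera intrinsics, point cloud, and feature map are all hypothetical.

```python
import numpy as np

def project_points(points_3d, K):
    """Project LiDAR points (camera frame, Nx3) to pixel coordinates via intrinsics K."""
    uvw = points_3d @ K.T                 # (N, 3): rows are [u*z, v*z, z]
    return uvw[:, :2] / uvw[:, 2:3]       # perspective divide -> (N, 2) pixel coords

def gather_pixel_features(uv, feat_map):
    """Look up the 2D feature vector at each projected point (nearest pixel)."""
    h, w, _ = feat_map.shape
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    return feat_map[v, u]                 # (N, C) image features, one per point

# Hypothetical inputs: assumed intrinsics, random points in front of the camera,
# and a random array standing in for a CNN feature map.
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
points = np.random.rand(100, 3) * [10.0, 5.0, 30.0] + [0.0, 0.0, 5.0]
image_features = np.random.rand(480, 640, 16)

uv = project_points(points, K)
fused = np.concatenate([points, gather_pixel_features(uv, image_features)], axis=1)
print(fused.shape)  # (100, 19): each point now carries geometry plus appearance
```

In DPPFA-Net's terms, the MPPF and DPPF modules learn where and how to perform this lookup rather than simply taking the nearest pixel, which is what makes the fusion robust to point-cloud noise and misalignment.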
The network outperformed existing models on the KITTI Vision Benchmark, achieving up to a 7.18% improvement in average precision under various noise conditions. It also performed well on a new dataset with simulated rainfall.
Ritsumeikan University researchers said the advance has significant implications for self-driving cars and robotics: it could reduce accidents, improve traffic flow and safety, and enhance robot capabilities across a range of applications, contributing to safer transportation and accelerating the development of autonomous systems.
Aeva
Aeva has introduced Atlas, the first 4D lidar sensor designed for mass-production automotive applications. Atlas aims to enhance advanced driver assistance systems (ADAS) and autonomous driving, meeting automotive-grade requirements.
The company's sensor is powered by two key innovations:
- Aeva CoreVision: a fourth-generation lidar-on-chip module that incorporates all key lidar elements in a smaller package using silicon photonics technology.
- Aeva X1: a new system-on-chip (SoC) lidar processor that integrates data acquisition, point-cloud processing, the scanning system, and application software.
These innovations make Atlas 70% smaller and four times more power-efficient than Aeva's previous generation, enabling various integration options without active cooling. Atlas uses Frequency Modulated Continuous Wave (FMCW) 4D lidar technology, which offers improved object detection range and immunity to interference. It also provides a 25% greater detection range for low-reflectivity targets and a maximum range of 500 meters.
Atlas is accompanied by Aeva's perception software, which harnesses advanced machine-learning-based classification, detection, and tracking algorithms. By incorporating the additional dimension of velocity data, the software provides advantages over conventional time-of-flight 3D lidar sensors.
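The "instant velocity" that FMCW sensors report comes from the Doppler shift of the returned light: measuring the beat frequency on both an up-chirp and a down-chirp lets the sensor separate range from radial velocity in a single look. The sketch below is a back-of-envelope illustration assuming an idealized 1550 nm FMCW lidar with a hypothetical chirp slope; the numbers are illustrative, not Aeva's specifications.

```python
# Minimal FMCW range/velocity recovery from up- and down-chirp beat frequencies.
C = 299_792_458.0        # speed of light, m/s
WAVELENGTH = 1.55e-6     # assumed laser wavelength, m
CHIRP_SLOPE = 1e14       # assumed chirp slope, Hz/s (1 GHz swept over 10 us)

def range_and_velocity(f_beat_up, f_beat_down):
    """Separate range and radial velocity from the two chirp directions.

    For an approaching target, Doppler lowers the up-chirp beat frequency
    and raises the down-chirp beat frequency by the same amount.
    """
    f_range = 0.5 * (f_beat_up + f_beat_down)    # range-induced component
    f_doppler = 0.5 * (f_beat_down - f_beat_up)  # Doppler component
    rng = C * f_range / (2.0 * CHIRP_SLOPE)      # beat frequency -> distance
    vel = WAVELENGTH * f_doppler / 2.0           # Doppler shift -> radial speed
    return rng, vel

# Illustrative target: 150 m away, closing at 25 m/s.
f_r = 2 * CHIRP_SLOPE * 150.0 / C   # ~100 MHz range beat
f_d = 2 * 25.0 / WAVELENGTH         # ~32 MHz Doppler shift
print(range_and_velocity(f_r - f_d, f_r + f_d))  # -> (150.0, 25.0)
```

A time-of-flight sensor would have to difference positions across frames to estimate that 25 m/s, which is why per-point velocity is the feature FMCW advocates emphasize.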
Atlas is expected to be available for production vehicles starting in 2025, with earlier sample availability for select customers. Aeva co-founder and CTO Mina Rezk said that Atlas will enable OEMs to equip vehicles with advanced safety and automated driving features at highway speeds, addressing previously unsolvable challenges. Rezk believes Atlas will accelerate the industry's transition to FMCW 4D lidar, which is increasingly considered the end state for lidar due to its enhanced perception capabilities and unique instant-velocity data.
Luminar
Following several rocky financial months and five years of development, global automotive technology company Luminar is launching Sentinel, its full-stack software suite. Sentinel enables automakers to accelerate advanced safety and autonomous functionality, including 3D mapping, simulation, and dynamic lidar features. A study by the Swiss Re Institute showed that cars equipped with Luminar lidar and Sentinel software demonstrated up to a 40% reduction in accident severity.
Developed primarily in-house with support from partners, including Scale AI, Applied Intuition, and Civil Maps (which Luminar acquired in 2022), Sentinel leverages Luminar's lidar hardware and AI-based software technologies.
CEO and founder Austin Russell said Luminar has been building next-generation AI-based safety and autonomy software since 2017. “The majority of major automakers don't currently have a software solution for next-generation assisted and autonomous driving systems,” he said. “Our launch couldn't be more timely with the new NHTSA mandate for next-generation safety in all U.S.-production vehicles by 2029, and as of today, we're the only solution we know of that meets all of these requirements.”
Mobileye
Mobileye has secured design wins with a major Western automaker for 17 vehicle models launching in 2026 and beyond. The deal covers Mobileye's SuperVision, Chauffeur, and Drive platforms, offering varying levels of autonomous capabilities from hands-off, eyes-on driving to fully autonomous robotaxis.
All systems will use Mobileye's EyeQ 6H chip, integrating sensing, mapping, and driving policy. The agreement includes customizable software to maintain brand-specific experiences.
CEO Amnon Shashua called this an "historic milestone" in automated driving, emphasizing the scalability of Mobileye's technology. He highlighted SuperVision's role as a bridge to eyes-off systems for both consumer vehicles and mobility services.
Initial driverless deployments are targeted for 2026.
BMW
BMW's new 7 Series received the world's first approval for combining Level 2 and Level 3 driving-assistance systems in the same vehicle. This milestone offers drivers unique benefits from both systems.
The Level 2 BMW Highway Assistant enhances comfort on long journeys, operating at speeds up to 81 mph (130 km/h) on motorways with separated carriageways. It allows drivers to take their hands off the steering wheel for extended periods while remaining attentive. The system can also perform lane changes autonomously or upon the driver's confirmation.
The Level 3 BMW Personal Pilot L3 enables highly automated driving at speeds up to 37 mph (60 km/h) in specific conditions, such as motorway traffic jams. Drivers can temporarily divert their attention from the road but must retake control when prompted.
Together, the two systems offer a comprehensive set of functions for a more comfortable and relaxing driving experience on both long and short journeys. The BMW Personal Pilot L3 option is available exclusively in Germany for €6,000 (around $6,500). Current BMW owners whose vehicles support it can add the L2 Highway Assistant free of charge starting August 24.
Mercedes-Benz
Mercedes-Benz's groundbreaking Drive Pilot Level 3 autonomous driving system is available for the S-Class and EQS Sedan. It allows drivers to disengage from driving in specific conditions, such as heavy traffic under 40 mph (64 km/h) on approved freeways. The system uses advanced sensors – including radar, lidar, ultrasound, and cameras – to navigate and make decisions.
While active, Drive Pilot enables drivers to use in-car entertainment features on the central display. However, drivers must remain alert and take control when requested. Drive Pilot functions under the following conditions:
- Clear lane markings on approved freeways
- Moderate-to-heavy traffic at speeds under 40 mph
- Daytime lighting and clear weather
- Driver visible to the camera above the driver's display
- No active construction zones
Drive Pilot relies on a high-definition 3D map of the road and surroundings. It's currently certified for use on major freeways in California and parts of Nevada.
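Taken together, these conditions define the system's operational design domain (ODD), which the vehicle must verify continuously before and during Level 3 operation. A hypothetical availability gate might look like the sketch below; the signal names and thresholds are illustrative, not Mercedes-Benz's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    """Hypothetical snapshot of the signals an ODD gate would consume."""
    on_approved_freeway: bool
    lane_markings_clear: bool
    speed_mph: float
    heavy_traffic: bool        # moderate-to-heavy traffic detected
    daytime: bool
    weather_clear: bool
    driver_visible: bool       # driver camera has the driver in view
    construction_zone: bool

def drive_pilot_available(s: VehicleState) -> bool:
    """Return True only when every published Drive Pilot condition holds."""
    return (
        s.on_approved_freeway
        and s.lane_markings_clear
        and s.speed_mph < 40.0
        and s.heavy_traffic
        and s.daytime
        and s.weather_clear
        and s.driver_visible
        and not s.construction_zone
    )

# Example: stop-and-go traffic on a mapped freeway at midday.
state = VehicleState(True, True, 22.0, True, True, True, True, False)
print(drive_pilot_available(state))  # True
```

In a production system each of these booleans is itself the output of a sensing pipeline, and losing any one of them triggers the takeover request described above.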
NPS
At CES 2024, Neural Propulsion Systems (NPS) demonstrated its ultra-resolution imaging radar software for automotive vision sensing. The technology significantly improves radar precision without expensive lidar sensors or weather-related limitations.
NPS CEO Behrooz Rezvani likens the improvement to enhancing automotive imaging from 20/20 to better than 20/10 vision. The software enables existing sensors to resolve targets down to one-third of the radar beam-width, creating a point cloud ten times denser and cutting false positives by more than ten times, the company said.
The demonstration compared performance using Texas Instruments 77 GHz chipsets with and without NPS technology. Former GM R&D vice president and Waymo advisor Lawrence Burns noted that automakers can use NPS to enhance safety, performance, and cost-effectiveness of driver-assistance features using existing hardware.
NPS' algorithms are based on the Atomic Norm framework, rooted in magnetic resonance imaging technology. The software can be deployed on various sensing platforms and implemented on processors with neural network capability. Advanced applications of NPS software with wide aperture multi-band radar enable seeing through physical barriers like shrubs, trees, and buildings — and even around corners. The technology is poised to help automakers meet NHTSA's proposed stricter standards for automatic emergency braking, aiming to reduce pedestrian and bicycle fatalities on U.S. roads.
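To make the beam-width claim concrete: the cross-range distance a radar can separate at a given range is roughly the range multiplied by the beam-width in radians, so resolving to one-third of the beam-width shrinks that spacing threefold. The short calculation below assumes a 1.4-degree azimuth beam-width, a plausible figure for a 77 GHz automotive imaging radar but not an NPS specification.

```python
import math

BEAMWIDTH_DEG = 1.4  # assumed azimuth beam-width of a 77 GHz radar (illustrative)
beamwidth_rad = math.radians(BEAMWIDTH_DEG)

for range_m in (50, 100, 200):
    conventional = range_m * beamwidth_rad  # cross-range cell at full beam-width
    enhanced = conventional / 3.0           # resolving to one-third beam-width
    print(f"{range_m:>4} m: {conventional:.2f} m -> {enhanced:.2f} m cross-range")
```

At 100 m the full beam spans about 2.4 m of cross-range, while one-third of that is about 0.8 m, narrow enough to separate a pedestrian from an adjacent vehicle rather than merging them into a single detection.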