FleetDrive Issue 49 - October 2024 | Page 27

• Lane Departure Warning (LDW): This system alerts the driver if the vehicle drifts out of its lane without activating the turn signal.
• Automatic Emergency Braking (AEB): This safety feature applies the brakes automatically to help prevent or reduce the severity of collisions with other vehicles or pedestrians.
• Blind Spot Monitoring (BSM): This technology notifies the driver when there are objects present in the vehicle's blind spots.
Studies have shown that vehicles equipped with advanced ADAS features experience fewer accidents than those without these systems. For instance, a report by the European Commission noted that Forward Collision Warning (FCW) systems can reduce rear-end crashes and injuries by 20 per cent. The report also indicates that AEB systems can reduce such crashes by about 45 per cent. Combined, FCW and AEB systems can reduce these crashes by about 55 per cent.
Manufacturers are continuously enhancing their ADAS capabilities with AI-based sensors , improving safety for everyone on the road . AI can learn from data collected during drives , helping systems adapt to different driving conditions . This means that the more a vehicle uses its ADAS features , the better it becomes at recognising potential hazards and responding effectively .
Enhancing Autonomous Driving
Autonomous vehicles use a range of AI-based sensors, including LiDAR, radar, and cameras, to perceive the world. This sensor suite mimics human sensory capabilities but with greater accuracy and speed. Here's how it works:
1. Sensing the Environment: The vehicle's sensors work in tandem to gather data about its surroundings. LiDAR creates 3D maps using laser beams to measure distances, while radar detects objects and their speeds, even in poor visibility conditions. Cameras capture visual data for interpreting road signs and identifying other vehicles.
2. Data Fusion and Processing: Once the sensors collect the raw data, the system uses AI to process this information. This process, called sensor fusion, combines data from different sensors to create a complete and accurate understanding of the environment. Drawing on multiple sources reduces errors and ensures reliable perception.
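To make the fusion step concrete, here is a deliberately simplified sketch in Python. Production systems use far richer techniques (Kalman filters, learned models); this example only illustrates the core idea that estimates from multiple sensors can be weighted by their reliability. The sensor readings and variance figures below are invented for illustration, not taken from any real vehicle.

```python
# Hypothetical sensor fusion for one detected object: each sensor reports
# a distance estimate (metres) and a variance expressing its uncertainty.
# The fused estimate is the inverse-variance weighted average, so more
# reliable sensors contribute more to the final answer.

def fuse_estimates(readings):
    """readings: list of (distance_m, variance) tuples, one per sensor."""
    weights = [1.0 / var for _, var in readings]
    total = sum(d * w for (d, _), w in zip(readings, weights))
    return total / sum(weights)

# Invented readings for the same obstacle ahead:
lidar = (25.2, 0.1)   # LiDAR: precise ranging, low uncertainty
radar = (24.8, 0.5)   # radar: works in poor visibility, moderate precision
camera = (26.0, 2.0)  # camera: strong at classification, rougher on range

distance = fuse_estimates([lidar, radar, camera])
print(round(distance, 2))  # → 25.17
```

Note how the fused value sits closest to the LiDAR reading: its low variance gives it the largest weight, which is exactly the behaviour the article describes when it says multi-source input reduces errors.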