5 Challenges in the Adoption of Autonomous Vehicles

Automotive OEMs and suppliers are concentrating their efforts on developing comprehensive ADAS platforms that combine features and capabilities from multiple platforms.

The increase in architectural complexity and computational requirements will necessitate solutions that strike an appropriate balance between performance and cost-effectiveness: high-performance, precise sensors and an accurate model of the vehicle's surroundings. In addition, autonomous driving will rely on cameras, radar sensors, and infrared sensors.

The majority of OEMs in developed countries have adopted L3 autonomy, with a human in the driver's seat ready to take control when necessary. However, L3 autonomy is not entirely safe: it is difficult to hold someone accountable for an accident, because the driver may not have enough reaction time when the car requests that control be taken over. Here are five major obstacles that could impede the adoption of self-driving cars.

1. AI is still learning ‘common sense.’

The artificial intelligence software in a self-driving car uses deep neural networks. Machine learning algorithms track the movement of objects on the road, as well as road signs and traffic signals, and the control system then makes decisions accordingly. For example, at a red light the autonomous car stops, and at a diversion sign for ongoing construction work it adjusts its path accordingly.
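The perception-to-control flow described above can be sketched as a minimal rule-based decision step. All names and thresholds here are illustrative assumptions, not any vendor's actual API; a real control system is far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "red_light", "diversion_sign", "pedestrian"
    distance_m: float  # estimated distance to the object in meters

def plan_action(detections: list[Detection]) -> str:
    """Map perception outputs to a high-level driving action."""
    for d in detections:
        if d.label == "red_light" and d.distance_m < 50:
            return "stop"              # stop for a nearby red light
        if d.label == "diversion_sign":
            return "follow_diversion"  # reroute around construction
        if d.label == "pedestrian" and d.distance_m < 20:
            return "brake"             # brake for a close pedestrian
    return "proceed"                   # nothing actionable detected

print(plan_action([Detection("red_light", 30.0)]))  # → stop
```

The deep neural network fills the role of producing the `Detection` list; the hard part, as the next paragraphs explain, is that real scenes contain objects these simple rules cannot classify sensibly.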

However, the AI can fail to interpret real-world scenarios: it may stop unnecessarily when it sees a plastic bag flying in front of it, or slam the brakes when the sensors detect a flock of birds sitting on the road. Unlike a human driver, the AI may not understand that the birds will fly away as the car moves forward.

Drivers in the real world also deal with many complex social interactions. For example, a cyclist's hand signal indicating the direction of an upcoming turn, or eye contact from a pedestrian waving the car ahead, are cues that robots may find difficult to detect.

Engineers are training these systems with ever more data to increase accuracy, but the AI needs more common sense before fully autonomous cars can operate appropriately on the road. As a result, the transition from manual to fully autonomous vehicles will be slow, as the AI will need time to develop common sense and apply it in real-life situations involving pedestrians, cyclists, fellow drivers, and animals.

2. Infrastructure and technology supplement

A fully autonomous vehicle must determine the allowable speed limit by reading traffic signs. However, even in developed countries, traffic signs may be absent in some locations, as may clear traffic lane markings.

Additionally, as 5G is deployed, a more connected vehicle-infrastructure combination will support the proper operation of fully autonomous vehicles. Even in the absence of roadside traffic signs, driverless cars can behave safely if traffic signals or nearby vehicles transmit the relevant information, making camera readings redundant.
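The fallback between infrastructure broadcasts and camera sign recognition might look like the sketch below. The function, its parameters, and the priority order are hypothetical assumptions for illustration; real V2X stacks use standardized, authenticated message sets.

```python
from typing import Optional

def effective_speed_limit(camera_reading: Optional[int],
                          v2i_broadcast: Optional[int],
                          default_kmh: int = 30) -> int:
    """Choose a speed limit (km/h), preferring an infrastructure
    broadcast (V2I) over camera-based sign recognition, with a
    conservative default when neither source is available."""
    if v2i_broadcast is not None:
        return v2i_broadcast   # trusted infrastructure message
    if camera_reading is not None:
        return camera_reading  # fall back to sign recognition
    return default_kmh         # no information: drive conservatively

print(effective_speed_limit(camera_reading=None, v2i_broadcast=80))  # → 80
```

The conservative default reflects the article's point: where neither signs nor connectivity are reliable, the vehicle must still behave safely.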

However, the question of whether such communication will function adequately in all locations must be addressed. OEMs and connectivity providers must collaborate to resolve connectivity issues, and massive investments in infrastructure and technology are required to bring the dream of self-driving cars to the road.

3. Complex 3D route map creation

Would you purchase a car that could not operate with maximum accuracy in every location in your country? OEMs and Tier 1s must bear this in mind as they develop and support self-driving cars. First, they conduct trial runs on the road to collect and feed map data to the system via sophisticated machine learning algorithms. However, if a passenger wishes to visit a location not included in the map system, the self-driving car may become disoriented.

The three-dimensional map guides the car as it looks for other vehicles and objects on the road so that it can drive appropriately. However, creating this map is a time-consuming process in terms of achieving both coverage and accuracy.

Additionally, between the time the map is created and the time the fully autonomous car is tested on the same road, numerous changes may occur due to traffic signal changes or recent construction work. Therefore, OEMs must ensure that the car's self-learning process is efficient enough to incorporate new objects that were not present during training.
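One way to think about detecting such map drift is to compare the landmarks the car observes against those stored in the map tile. The heuristic below is a deliberately simple illustration, not a production change-detection algorithm; the landmark identifiers are invented.

```python
def map_is_stale(map_landmarks: set[str],
                 observed_landmarks: set[str],
                 threshold: float = 0.2) -> bool:
    """Flag the map as stale when the fraction of mismatched landmarks
    (present in only one of the two sets) exceeds a threshold."""
    if not map_landmarks and not observed_landmarks:
        return False  # nothing to compare
    mismatched = map_landmarks.symmetric_difference(observed_landmarks)
    total = map_landmarks | observed_landmarks
    return len(mismatched) / len(total) > threshold

# A construction zone replaces the expected signage with traffic cones:
print(map_is_stale({"signal_12", "sign_a"}, {"cone_1", "cone_2"}))  # → True
```

When the mismatch ratio is high, a real system would fall back to cautious sensor-only driving and queue the area for a map update.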

4. Cybersecurity – the flip side of advanced connectivity 

Data privacy and cybersecurity are significant concerns in this new age of connected mobility. OEMs must ensure they do not infringe on consumers' data privacy, and they must also protect that data from hackers. Robust security protocols should therefore be developed to safeguard the data processed inside the vehicle and transmitted via cloud-based communication platforms.

With the rollout of 5G and a highly connected transport system, stricter cybersecurity must be ensured for fully autonomous cars. There have already been several incidents: in 2015, for example, Fiat Chrysler recalled 1.4 million vehicles to fix a software vulnerability that allowed a car to be hacked and controlled remotely. Such scenarios are dangerous for public safety on the road, so companies need to take extra care to protect self-driving cars from misuse of the technology by carjackers.

5. Sensors tricked by lousy weather

Bad weather is one of the most significant challenges for driverless cars. Self-driving cars use a broad range of sensors, including cameras, radars, and lidars, to detect objects along the way. Camera sensors help the car view and identify an object, whether a pedestrian, a cyclist, or another car. Lidar uses lasers to measure the distance to an object, while radar measures the object's speed and its direction of movement.

The sensors capture data and feed it back to the self-driving car's control system, which then decides whether to stop, turn to either side, move ahead, or reverse if required. But snow, fog, or heavy rain makes it difficult for the sensors to function correctly, and as sensing accuracy degrades in adverse weather, passenger safety may be compromised.

Technology will probably overcome these challenges, and fully autonomous cars may eventually be seen running in all weather conditions, from Alaska to Zanzibar or even the Colorado Rockies. Autonomous driving is expected to change human life by improving efficiency on the road, reducing accidents, increasing productivity, and reducing environmental impact in the process.

Authored by: Praveen Kumar SL and Sayan Chakraborty.

Note: The opinions presented in this article are those of the author(s).