Cloud-based virtual development and validation of ADAS

Dr. Jihas Khan, Dr. Sudha Natarajan, Chethana K, Karishma Anna Koshy, Abhiram R
Introduction
Advanced Driver Assistance Systems (ADAS) play a significant role in enhancing road safety and driving convenience. These systems rely on sophisticated sensors, algorithms, and control mechanisms to assist drivers in real-time decision-making. However, the development and validation of ADAS functionalities is a complex and resource-intensive process. Software-Defined Vehicles (SDVs) are driving a shift from decentralized architectures with multiple Electronic Control Units (ECUs) to centralized High-Performance Computers (HPCs) that host applications such as ADAS, infotainment, body control, and more. HPCs typically use advanced System-on-Chip (SoC) technology to enable the concurrent execution of multiple applications. However, the physical availability of such complex SoCs often takes significant time, which can delay ADAS application development, the SoC porting process, and validation efforts. The safety-critical nature of ADAS demands rigorous testing to meet stringent regulatory requirements. Achieving this necessitates testing over millions of kilometres across diverse driving conditions. Relying solely on vehicle-based testing is not practical due to the associated time, cost, and logistical challenges.
To address these challenges, hardware virtualization and virtual validation techniques have become essential. Virtualizing HPC hardware through software models enables accelerated application development by left-shifting the development cycle. Hardware virtualization allows early development and testing of ADAS applications by simulating HPCs in the absence of physical SoC hardware. In the cloud, this is achieved using hypervisors, containerization, and SoC emulation tools, enabling developers to build, test, and validate software in virtual environments [1]. This approach accelerates SoC porting, supports the parallel development of hardware and software, and reduces time-to-market, making it essential for SDV programs.
Virtual validation [2] enables a vast number of scenarios to be tested quickly and repeatedly, ensuring thorough validation without the constraints of physical testing. Replay-based methods, where recorded real-world scenarios are replayed to the ADAS algorithm under test, support open-loop feature validation. In contrast, simulation platforms like CARLA facilitate closed-loop validation, where virtual environments replicate dynamic driving situations. Modern software development cycles require the virtual validation of ADAS algorithms over millions of kilometres of driving data within just weeks. Local, on-premise test execution often struggles to meet this demand economically, as the necessary hardware investments can be substantial. Cloud-based solutions offer the scalability and cost-efficiency needed for ADAS Software-in-the-Loop (SIL) validation. Multiple cloud instances can run in parallel, completing virtual validation of millions of kilometres of driving data within weeks. Users only pay for the compute instances they use.
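The replay-based open-loop flow described above can be sketched in a few lines: recorded frames are fed to the algorithm under test in their recorded order, and the outputs are only logged, never fed back into the scene. The `Frame` type and the stand-in detector below are illustrative assumptions, not the paper's actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Frame:
    """One recorded sensor sample (e.g. an image path) with its timestamp."""
    timestamp: float
    data: object

def replay_open_loop(frames: Iterable[Frame],
                     algorithm: Callable[[Frame], list]) -> List[tuple]:
    """Feed recorded frames to the algorithm under test in recorded order.

    Open loop: the algorithm's outputs are only logged, never fed back
    into the scenario, so the same recording can be replayed repeatedly.
    """
    log = []
    for frame in frames:
        detections = algorithm(frame)        # e.g. list of (label, bbox)
        log.append((frame.timestamp, detections))
    return log

# Usage with a dummy detector standing in for the real ADAS module:
frames = [Frame(t * 0.1, f"frame_{t}.png") for t in range(3)]
result = replay_open_loop(frames, lambda f: [("car", (0, 0, 10, 10))])
```

Because the loop is open, each replay is deterministic, which is what makes large recorded datasets suitable for repeatable regression testing.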
Several frameworks for the virtual calibration and validation of driver assistance systems have been proposed [3–5]. Synthetic data has proven particularly effective in the development of deep learning-based ADAS algorithms [6]. To address the challenges of large-scale validation, cloud-based simulation and verification solutions offer a practical and scalable approach [7]. These solutions enhance test coverage by leveraging both real-world and synthetic data, supporting SIL simulation.
This white paper presents innovative methods for cloud-based virtual development and validation of ADAS, using both real-world and synthetic driving data. The following section introduces approaches to cloud-based virtualization, while the subsequent two sections describe the architecture for cloud-based SIL development and validation of ADAS, as well as its implementation.
Virtualization on Cloud
There are two primary approaches to virtualizing an ECU: host-compiled and target-compiled virtualization. In the host-compiled approach, the ECU software is cross-compiled to run on a simulation host—typically a workstation or a cloud instance—instead of the actual ECU hardware. This method eliminates the dependency on physical hardware during early software development and validation. In contrast, target-compiled virtualization involves executing the target-compiled binary within a virtual ECU software tool running on a workstation or cloud instance. This virtual ECU tool includes simulation models of the SoC, controllers, peripherals, and their interactions.
A host-compiled virtual ECU does not include a model of the ECU hardware itself. Instead, various Application Programming Interfaces (APIs) are simulated at specific points within the software stack to bypass the lower, hardware-dependent software layers. Host-compiled virtual ECUs can be implemented at different levels of fidelity. This white paper adopts Level-2 virtualization, where production-grade application software along with production-grade middleware is executed on top of a simulated operating system. This enables software development teams to focus on application and middleware development without waiting for physical ECU availability.
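The API-stubbing idea can be illustrated with a minimal sketch. Real virtual ECU stacks are typically C/C++, and the class and signal names here are hypothetical; the point is only that the hardware-dependent driver is replaced by a simulation while the layers above it run unchanged.

```python
class CanDriverStub:
    """Replaces the real CAN controller driver on the simulation host."""
    def __init__(self):
        self.bus = []                        # in-memory stand-in for the bus

    def transmit(self, can_id: int, payload: bytes) -> None:
        self.bus.append((can_id, payload))   # no hardware access needed

    def receive(self):
        return self.bus.pop(0) if self.bus else None

class Middleware:
    """Stands in for production middleware logic: it is unchanged, and only
    the driver underneath it is swapped for the stub."""
    def __init__(self, driver):
        self.driver = driver

    def send_signal(self, can_id: int, value: int) -> None:
        self.driver.transmit(can_id, value.to_bytes(2, "big"))

# The same middleware code would run against the real driver on target.
mw = Middleware(CanDriverStub())
mw.send_signal(0x123, 500)
```

Because the stub sits below the middleware, application and middleware code exercised this way is the same code that later runs on the physical ECU.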
Hosting virtual environments in the cloud offers several advantages, including scalability, flexibility, cost efficiency, and faster provisioning compared to traditional on-premise setups. Cloud-based virtualization also enables easy access to high-performance computing resources without the need for physical infrastructure. Multiple Cloud Service Providers (CSPs) and hyperscalers offer these virtualization capabilities, each providing diverse services and infrastructure options. This white paper uses the cloud services from Amazon, namely Amazon Web Services (AWS). AWS leverages Amazon Machine Images (AMIs) [8] to create and provision virtual machines (VMs) within the Amazon Elastic Compute Cloud (EC2). An AMI is a pre-configured template that includes the operating system, software stack, libraries, and processor architecture needed for a virtual instance. Operating systems used in automotive development, such as Ubuntu, Red Hat, QNX, and Android, are supported through AMIs. As a result, automotive OS–compiled applications and middleware can be hosted and executed on cloud-based virtual instances with ease. By combining host-compiled ECU software with AWS’s cloud virtualization infrastructure, automotive software development teams can accelerate ADAS application development, integration testing, and validation, even before physical SoCs and ECUs become available.
Cloud-based ADAS development and validation
The architecture of the cloud-based ADAS development and validation environment is illustrated in Figure 1. The on-premise computer connects to the AWS Virtual Private Cloud (VPC) via a secure connection. The cloud implementation comprises a public subnet and a private subnet.
Figure 1: Architecture of cloud-based ADAS development and validation environment.
The public subnet hosts a Bastion instance, which acts as a secure entry point for access to resources in the private subnet.
In the private subnet, two instances are deployed:
- The data generation instance produces the data needed for cloud-based development and validation of the ADAS algorithm. This can be recorded real-world sensor data or data generated by a scenario simulator. The instance also stores the corresponding annotation files, which contain scene-specific information. The raw data, along with the ground-truth annotations, is transmitted to the ADAS instances for development and validation.
- The ADAS instance runs the automotive-OS-compiled ADAS algorithm along with the middleware; in this work, an object detection algorithm serves as the example. The module can be validated on multiple instances simultaneously for different driving scenarios. The object detection results, including object labels and bounding boxes, are forwarded to the performance metric computation module, where they are compared with the ground truth to derive Key Performance Indicators (KPIs).
The architecture can be easily extended to develop and validate other computationally intensive ADAS tasks, including sensor fusion, traffic sign recognition, collision avoidance, and path planning.
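The performance metric computation step can be sketched as a simple matching of detections against ground-truth annotations. This is an illustrative sketch, not the paper's implementation: the matching predicate is left pluggable (e.g. same label plus sufficient box overlap), and the KPIs shown are generic precision/recall counts.

```python
def match_detections(detections, ground_truth, same_object) -> dict:
    """Greedy one-to-one matching of detections to ground-truth objects.

    `same_object(det, gt)` decides whether a detection corresponds to a
    ground-truth object. Returns simple KPIs: true positives, false
    positives, false negatives, precision and recall.
    """
    unmatched_gt = list(ground_truth)
    tp = 0
    for det in detections:
        for gt in unmatched_gt:
            if same_object(det, gt):
                tp += 1
                unmatched_gt.remove(gt)  # each gt object matched at most once
                break
    fp = len(detections) - tp            # detections with no gt counterpart
    fn = len(unmatched_gt)               # gt objects the algorithm missed
    return {
        "tp": tp, "fp": fp, "fn": fn,
        "precision": tp / max(tp + fp, 1),
        "recall": tp / max(tp + fn, 1),
    }

# Two detections against three annotated objects, matched by label only:
kpi = match_detections(["car", "bus"], ["car", "bus", "truck"],
                       lambda d, g: d == g)
```

In a real pipeline the predicate would compare bounding boxes as well as labels, and the per-scene KPIs from each ADAS instance would be aggregated across the fleet.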
Implementation and results
Two separate studies were conducted to evaluate the proposed architecture for ADAS development and validation: one using recorded real-world data and the other using simulated data. The implementation specifications of the private subnet are summarized in Table 1.
| Data type & source | Data generation instance: AMI | Data generation instance: EC2 | ADAS instance: AMI | ADAS instance: EC2 |
|---|---|---|---|---|
| Real-world road scene, KITTI recorded video | Ubuntu 22.04 LTS | t3 | Red Hat | t3 |
| Synthetic road scene, CARLA v0.9.15 simulator | Ubuntu 22.04 LTS | g5.xlarge | Red Hat | t3 |

Table 1: Implementation specifications
In the first study, the real-world road scene images were extracted from the driving videos of the KITTI dataset [9]. A deep learning algorithm for object detection (YOLO [10]) was deployed in the ADAS instance. To validate the algorithm across different driving scenes simultaneously, the object detection process was executed in parallel on multiple EC2 instances. To ensure a consistent runtime environment, an AMI was created from a single configured instance, which included the necessary software tools and dependencies. The remaining instances were launched using this AMI, guaranteeing an identical environment setup across all instances. The object detection algorithm was then compiled and executed on each instance, each processing a different driving scene. Figure 2 shows the development environment setup, algorithm execution, and the terminal output of one instance. Various stages of algorithm execution can be seen in the terminal output. Developers could debug and update various stages of algorithm execution in the cloud. Figure 3 shows the cloud-based virtual validation process, showcasing simultaneous object detection algorithm testing across four different road scenes executed in four different cloud instances in parallel.
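The AMI-based fan-out described above can be sketched with boto3 under stated assumptions: valid AWS credentials, a configured source instance, and placeholder identifiers (`source_instance_id`, the `ami-…` ID, the image name, and the `t3.medium` instance type are all hypothetical).

```python
def fleet_launch_params(ami_id: str, count: int,
                        instance_type: str = "t3.medium") -> dict:
    """Build the run_instances arguments for launching `count` identical
    validation instances from one pre-configured AMI."""
    return {
        "ImageId": ami_id,
        "MinCount": count,
        "MaxCount": count,
        "InstanceType": instance_type,
    }

def launch_validation_fleet(source_instance_id: str, count: int):
    """Create an AMI from the configured instance, then launch a fleet
    from it so every instance has an identical environment."""
    import boto3                       # deferred: requires AWS credentials
    ec2 = boto3.client("ec2")
    image = ec2.create_image(InstanceId=source_instance_id,
                             Name="adas-validation-env")
    ec2.get_waiter("image_available").wait(ImageIds=[image["ImageId"]])
    return ec2.run_instances(**fleet_launch_params(image["ImageId"], count))

# Parameters for a four-instance fleet from a placeholder AMI ID:
params = fleet_launch_params("ami-0123456789abcdef0", 4)
```

Setting `MinCount` equal to `MaxCount` makes the launch all-or-nothing, so a partial fleet is never left running if capacity is unavailable.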
Figure 2: Cloud-based development of object detection module. (a) Environment setup, (b) Compilation and running of object detection algorithm, (c) Terminal output with debug steps.
Figure 3: Cloud-based validation of object detection module on real road scenes.
For the study using synthetic data, the CARLA v0.9.15 simulator [11] was utilized. CARLA is an open-source simulator developed for autonomous driving research, providing flexible configurations for sensor setups, environmental conditions, and complete control over both static and dynamic road objects. The simulator generated road scene video images along with corresponding object bounding boxes, which were then transmitted to the ADAS instance for processing. In the ADAS instance, the object detection algorithm was deployed and validated. To assess the object detection accuracy, the Intersection over Union (IoU) metric was computed, quantifying the overlap between the predicted and ground-truth bounding boxes as the ratio of their intersection area to their union area. Figure 4 shows the cloud-based object detection executed on a simulated road scene. The green bounding box shows the ground truth, while the red bounding box shows the algorithm's detection. The detection results for various vehicle types across different simulated road scenes are summarized in Table 2.
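The IoU computation just described is straightforward; a minimal sketch for axis-aligned boxes in `(x1, y1, x2, y2)` corner format:

```python
def iou(box_a, box_b) -> float:
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle (empty if boxes are disjoint).
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two 10x10 boxes overlapping by half in x share 50 of 150 units of area:
score = iou((0, 0, 10, 10), (5, 0, 15, 10))  # → 1/3
```

IoU is 1.0 for identical boxes and 0.0 for disjoint ones; a threshold such as 0.5 is commonly used to count a detection as a true positive.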
Figure 4: Cloud-based validation of object detection algorithm on simulated road scene.
| Object class | Centre (ground truth) | Centre (detected) | IoU |
|---|---|---|---|
| Motorcycle | (75.5, 251.0) | (80.0, 250.0) | 0.91 |
| Bus | (233.0, 214.0) | (235.0, 215.0) | 0.85 |
| Truck | (351.5, 207.0) | (355.0, 210.0) | 0.82 |
| Car 1 | (446.0, 197.0) | (447.5, 197.5) | 0.88 |
| Car 2 | (469.5, 193.5) | (472.5, 195.0) | 0.80 |

Table 2: Detection results for different vehicles on simulated road scenes
Conclusion
The study highlights the effectiveness of utilizing a cloud platform for development and validation of computationally intensive tasks associated with ADAS. Specifically, the object detection process has been successfully developed and validated on the AWS cloud, enabling the selection of appropriate virtual hardware resources to evaluate the performance. The system’s practicality across diverse driving scenarios can also be simultaneously tested on different instances using the cloud platform. The proposed cloud-based validation approach is scalable and can be readily extended to develop and validate additional ADAS modules.
Tata Elxsi is actively exploring the migration of development and validation for data-intensive ADAS tasks to the cloud to enhance scalability, reduce timelines, and achieve cost-effective, high-fidelity development and testing in virtual environments.
References
- A. Bhardwaj and C. Rama Krishna, "Virtualization in cloud computing: Moving from hypervisor to containerization – A survey", Arabian Journal for Science and Engineering, Vol. 46, pp. 8585-8601, 2021.
- R. Dona and B. Ciuffo, "Virtual testing of automated driving systems: A survey on validation methods", IEEE Access, Vol. 10, pp. 24349-24367, Feb 2022.
- M. Markofsky and D. Schramm, "A modular framework for virtual calibration and validation of driver assistance systems", in P. Pfeffer (ed.), 13th International Munich Chassis Symposium 2022 (IMCS 2022) Proceedings, Springer Vieweg, Berlin, Heidelberg, 2022.
- H. Scharke, H. Schneider and D. Kinalzyk, "Virtual validation of ADAS/AD functions", ATZ Electronics Worldwide, Vol. 17, pp. 8-13, 2022.
- M. Won and S. Kim, "Verification and validation utilizing CARLA simulator for autonomous driving development", in G. Wagner, F. Werner and F. De Rango (eds), Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2022), Lecture Notes in Networks and Systems, Vol. 780, Springer, 2023.
- B. Jelić, R. Grbić, M. Vranješ and D. Mijić, "Can we replace real-world with synthetic data in deep learning-based ADAS algorithm development?", IEEE Consumer Electronics Magazine, Vol. 12, No. 5, pp. 32-38, Sept 2023.
- Simulation & Verification | AWS Solutions for Automotive | AWS Solutions Library
- Amazon Machine Image
- The KITTI Vision Benchmark Suite
- YOLO object detection model
- CARLA Simulator