RoboK, a spinoff from the University of Cambridge Department of Computer Science and Technology, has developed computer vision and deep learning-based 3D sensing and perception software that enables cars to detect, process and react to changes in their environment. The software is designed to run on low-power computing platforms, reducing the resources and hardware footprint needed to build advanced driver assistance systems (ADAS) around these technologies.
According to RoboK, most traffic accidents are caused by human error, but the cost and complexity of developing next-generation ADAS that can lower the risk of these accidents have traditionally kept such systems out of lower-cost vehicles. To reduce the development costs of these safety systems, RoboK has partnered with Siemens Digital Industries Software to create a virtual environment for developing, testing and validating advanced driving systems.
The virtual environment will be based on a closed-loop simulation capable of testing an entire vehicle against an unlimited number of complex driving scenarios. To demonstrate the capabilities of the system, which is built on the PAVE360 pre-silicon autonomous validation environment, RoboK’s 3D perception module has been used to develop a digital twin demonstration of an autonomous emergency braking (AEB) system.
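The announcement does not describe the simulation interfaces, so the closed-loop idea can only be sketched in general terms: a perception stage estimates the gap to an obstacle, an AEB controller decides whether to brake, and the vehicle model feeds the updated state back to perception on the next tick. The minimal sketch below is a hypothetical illustration of that loop; none of the names, thresholds or dynamics come from PAVE360 or RoboK’s software.

```python
import random

# Hypothetical closed-loop AEB digital-twin sketch (illustrative only).
# An ego vehicle approaches a stationary obstacle; a noisy "perception"
# reading of the gap feeds an AEB controller, whose braking command updates
# the vehicle model, which in turn produces the next perception reading.

DT = 0.05            # simulation step, seconds (assumed)
BRAKE_DECEL = 6.0    # assumed maximum braking deceleration, m/s^2
TTC_THRESHOLD = 2.0  # brake when time-to-collision drops below this, seconds

def perceive(true_gap_m: float) -> float:
    """Stand-in for a 3D perception module: a noisy range estimate."""
    return max(0.0, true_gap_m + random.gauss(0.0, 0.2))

def aeb_decision(gap_m: float, speed_mps: float) -> bool:
    """Brake if the estimated time-to-collision falls below the threshold."""
    return speed_mps > 0.0 and (gap_m / speed_mps) < TTC_THRESHOLD

def run_scenario(initial_gap_m: float = 60.0, initial_speed_mps: float = 20.0) -> float:
    gap, speed, braking = initial_gap_m, initial_speed_mps, False
    while gap > 0.0 and speed > 0.0:
        estimated_gap = perceive(gap)                             # perception
        braking = braking or aeb_decision(estimated_gap, speed)   # decision (latched)
        accel = -BRAKE_DECEL if braking else 0.0                  # action
        speed = max(0.0, speed + accel * DT)                      # vehicle model
        gap -= speed * DT                                         # close the loop
    return gap  # a positive residual gap means the vehicle stopped in time

if __name__ == "__main__":
    print(f"Residual gap when the vehicle stopped: {run_scenario():.1f} m")
```

In a real pre-silicon validation flow this loop would run against detailed sensor, vehicle and environment models rather than the toy dynamics shown here, but the feedback structure is the same.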
RoboK’s perception module gives the virtual platform additional speed and efficiency, enabling developers of advanced driving functions to test and update software and hardware designs for the planning and action modules within a holistic environment. The resulting reductions in development time, cost and power consumption should make these solutions more accessible for mass-market vehicles.
Hao Zheng, co-founder and CEO of RoboK, commented: “Advanced driving features, which range from collision avoidance and automatic lane-keeping through to fully automated driving, require miles of road test driving to ensure their safety. Although simulation provides a resource-efficient alternative, it can be time-consuming: when every element of the entire system is simulated accurately and realistically, it can take many hours to run and process even a single driving scenario.”
“With the proprietary software developed by our team, which can run on general-purpose and low-power computing platforms, we can shorten the processing time from hours to seconds, drastically improving the efficiency of system-level validation and testing. We have reduced the computation time by developing a significant new method for fusing raw data directly from a range of sensors, such as cameras, radars, GPS and IMU, as well as for performing depth estimation to gain 3D information, all running on low-power computing platforms. This significantly reduces the memory and computing requirements. When this is combined with our novel and highly optimised AI-based perception modules, intelligent insights can be gained rapidly and efficiently, which is vital for fast decision-making.”
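RoboK has not published the details of its fusion or depth-estimation methods, but the general idea of combining range information from complementary sensors can be sketched with a standard technique: inverse-variance weighting of a camera-derived depth estimate and a radar range for the same detected object, followed by conversion into a 3D point. Everything in the snippet below is an illustrative assumption rather than RoboK’s algorithm.

```python
from dataclasses import dataclass
import math

# Illustrative early-fusion sketch (not RoboK's proprietary method).
# A camera depth estimate and a radar range for the same object are combined
# by inverse-variance weighting, then converted into vehicle-frame coordinates.

@dataclass
class Measurement:
    value: float      # range to the object, metres
    variance: float   # measurement variance, metres^2

def fuse_ranges(camera: Measurement, radar: Measurement) -> Measurement:
    """Minimum-variance fusion of two independent range estimates."""
    w_cam = 1.0 / camera.variance
    w_rad = 1.0 / radar.variance
    fused_value = (w_cam * camera.value + w_rad * radar.value) / (w_cam + w_rad)
    return Measurement(fused_value, 1.0 / (w_cam + w_rad))

def to_vehicle_frame(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert a fused range plus bearing angles into x/y/z coordinates."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)  # forward
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)  # left
    z = range_m * math.sin(elevation_rad)                          # up
    return x, y, z

if __name__ == "__main__":
    camera_depth = Measurement(value=24.3, variance=4.0)   # mono-depth: noisy
    radar_range = Measurement(value=25.1, variance=0.25)   # radar: precise range
    fused = fuse_ranges(camera_depth, radar_range)
    print(to_vehicle_frame(fused.value, azimuth_rad=0.05, elevation_rad=0.0))
```

Fusing at this level is cheap enough to run on modest hardware, which is the kind of trade-off the quote alludes to, although the company’s actual pipeline operates on raw sensor data and learned perception models.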
David Fritz, Senior Director for Autonomous and ADAS at Siemens Digital Industries Software, said: “RoboK’s novel 3D perception algorithms perform sensor fusion, enabling the virtual vehicle to ‘see’ its environment. The algorithms can process data in real time on the virtual platform. With this validation platform, AV and ADAS designers can make sure that every software or hardware design iteration is tested and validated virtually, quickly and, most importantly, before any hardware is produced.”