Kudan, a developer of SLAM (Simultaneous Localization and Mapping) technology, has announced camera-based real-time 3D mapping and position-tracking capabilities aimed at markets such as autonomous cars, drones, and other robotics applications.
SLAM technology gives computers the “computer vision” ability to acquire, process, analyse and understand digital images, together with the ability to build a 3D map of an environment and its objects while simultaneously determining the camera's current location within that map.
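The two halves of that loop are mapping (projecting observed landmarks into world coordinates once the pose is known) and localization (recovering the pose from landmarks whose world positions are already in the map). The toy numpy sketch below illustrates both steps in 2D; it is not Kudan's algorithm, and the function names and the least-squares rigid alignment (Kabsch) used for localization are illustrative assumptions.

```python
import numpy as np

def to_world(pose, obs_local):
    # Mapping step: transform a landmark observed in the robot frame
    # into world coordinates, given the robot pose (x, y, theta).
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return R @ np.asarray(obs_local, float) + np.array([x, y])

def localize(landmarks_world, obs_local):
    # Localization step: estimate pose (x, y, theta) from >= 2 landmark
    # correspondences via a 2D least-squares rigid alignment (Kabsch).
    A = np.asarray(obs_local, float)        # landmarks in robot frame
    B = np.asarray(landmarks_world, float)  # same landmarks in world frame
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cb - R @ ca
    return np.array([t[0], t[1], np.arctan2(R[1, 0], R[0, 0])])

# Noise-free demo: observe three mapped landmarks from a hidden pose,
# then recover that pose from the observations alone.
true_pose = np.array([2.0, 1.0, np.pi / 4])
landmarks = np.array([[5.0, 5.0], [1.0, 4.0], [6.0, 0.0]])
c, s = np.cos(true_pose[2]), np.sin(true_pose[2])
Rw = np.array([[c, -s], [s, c]])
obs = (landmarks - true_pose[:2]) @ Rw  # each row is Rw.T @ (p - t)
est = localize(landmarks, obs)
```

A real visual SLAM system interleaves these steps with feature extraction and loop closure, and must handle noisy observations; here the recovered `est` matches `true_pose` exactly because the data are noise-free.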
Kudan has also been developing space- and object-tracking technology through AR (augmented reality). Building on that work, Kudan claims to have developed a next-generation algorithm to replace existing SLAM algorithms such as ORB-SLAM and PTAM.
Key features of KudanSLAM include:
- Hardware friendly: flexible camera setup, supporting monocular cameras, rolling-shutter cameras, and other sensors; ready to be embedded on processors and other hardware architectures
- High speed / low power consumption: uses less than 5% of a mobile CPU
- High tracking accuracy: 1 mm to 1 cm
- Robustness: capable of working under severe lighting conditions and with unpredictable movement
Potential uses of KudanSLAM include:
- Autonomous cars – KudanSLAM can be combined with internal sensors and LiDAR for greater robustness and more precise position tracking. It can monitor both the front and rear of a vehicle without being affected by environmental noise, making it useful for automated parking, which requires position tracking accurate to a few centimeters.
- Drones – even with low-end drone cameras, KudanSLAM enables precise object recognition and position tracking with 1 mm to 1 cm accuracy.
- Robotics – even without external sensors, KudanSLAM enables robots to work independently of any specific facility or environment.