SeeByte has been awarded a contract by the UK’s Defence Science and Technology Laboratory (Dstl) to create an advanced DNN (deep neural network) framework that will provide enhanced situational awareness capabilities for passive sensor suites such as those employed on UAVs (unmanned aerial vehicles) and USVs (unmanned surface vessels).
Future Active Protection Systems (APS), and specifically Modular Integrated Protection Systems (MIPS), are likely to incorporate passive sensor subsystems as a crucial element of their sensing suite. With advanced image processing techniques, these passive imaging sensors could provide capabilities such as object detection, identification and tracking, and image segmentation and range estimation, whilst also carrying out their core APS function.
SeeByte’s multi-task DNN framework, developed under Phase 2 of Dstl’s “The Advanced Vision 2020 and Beyond” competition, will provide semantic image segmentation, object detection, and depth estimation (bearing and range) outputs for monocular electro-optic/infrared (EO/IR) sensors. The company’s previous experience with multi-task DNN architectures has demonstrated that it is possible to substantially compress DNN model size and complexity without a drop in performance.
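To illustrate the general idea of a multi-task network of this kind, the sketch below shows a shared convolutional backbone feeding three task heads: segmentation, depth, and detection. This is a minimal hypothetical example, not SeeByte's actual architecture; all layer sizes, head designs, and names (`MultiTaskNet`, `seg_head`, etc.) are illustrative assumptions. Sharing the backbone across tasks is what allows the combined model to be far smaller than three separate networks.

```python
# Hypothetical multi-task DNN sketch: one shared encoder, three task heads.
# Illustrative only -- not SeeByte's architecture.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, num_classes=5, num_anchors=3):
        super().__init__()
        # Shared convolutional backbone: most parameters are reused by all
        # three tasks, which is the source of the model-size savings.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        # Per-pixel class scores for semantic segmentation.
        self.seg_head = nn.Conv2d(32, num_classes, 1)
        # Per-pixel range estimate from a single (monocular) frame.
        self.depth_head = nn.Conv2d(32, 1, 1)
        # Dense detection head: objectness + 4 box offsets per anchor.
        self.det_head = nn.Conv2d(32, num_anchors * 5, 1)

    def forward(self, x):
        feats = self.backbone(x)
        return {
            "segmentation": self.seg_head(feats),
            "depth": self.depth_head(feats),
            "detection": self.det_head(feats),
        }

net = MultiTaskNet()
out = net(torch.randn(1, 3, 64, 64))
print({k: tuple(v.shape) for k, v in out.items()})
```

A single forward pass produces all three outputs at once, so the per-task cost on an embedded UAV or USV payload is close to that of one network rather than three.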
In later phases SeeByte will also address the scarcity of imagery datasets containing relevant target objects by using Generative Adversarial Networks (GANs) to inject synthetic objects into real imagery.
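The compositing step of such an augmentation pipeline can be sketched as follows. Here a stand-in function takes the place of a trained GAN generator, producing an object patch plus a soft alpha mask, which is then blended into a real frame. Everything in this sketch (the function names, shapes, and the use of a simple alpha blend) is an illustrative assumption, not the method Dstl or SeeByte will use.

```python
# Minimal sketch of injecting a synthetic object into real imagery.
# The "generator" below is a placeholder for a trained GAN; all names
# and shapes are illustrative assumptions.
import numpy as np

def fake_generator(rng, size=16):
    """Stand-in for a GAN generator: returns an RGB patch (size, size, 3)
    and a soft alpha mask (size, size, 1), values in [0, 1]."""
    patch = rng.random((size, size, 3))
    # Soft circular mask so the synthetic object blends into the scene.
    yy, xx = np.mgrid[:size, :size]
    r = np.hypot(yy - size / 2, xx - size / 2)
    alpha = np.clip(1.0 - r / (size / 2), 0.0, 1.0)[..., None]
    return patch, alpha

def inject(image, patch, alpha, top, left):
    """Alpha-blend a synthetic patch into a real image at (top, left)."""
    out = image.copy()
    h, w = patch.shape[:2]
    roi = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = alpha * patch + (1 - alpha) * roi
    return out

rng = np.random.default_rng(0)
real = rng.random((64, 64, 3))          # placeholder "real" EO/IR frame
patch, alpha = fake_generator(rng)
augmented = inject(real, patch, alpha, top=10, left=20)
print(augmented.shape)
```

Because the object's position and mask are known at injection time, the augmented frame comes with free, pixel-accurate labels for training the detection and segmentation heads.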