The objective of this paper is to highlight the importance of detecting, tracking, and labeling objects at distances beyond 300 m in order to support high-speed driving, monitor Regions of Interest (RoI) early, allocate minimal sensor resources to save time and energy, and, most importantly, operate in nearly all weather and operating conditions. By offloading radar digital and MIMO complexity to the analog beam-steering front end, high resolution and accuracy at longer ranges are achieved over a wider Field of View (FoV) and at faster frame rates. The integration of such radars with cameras and lidars offers an ADAS capability unmatched by today's architectures. Augmented with robust AI capabilities for real-time object classification at the edge, this next generation of radars can deliver 5D imaging capabilities without resorting to optical frequencies. Such augmented intelligence at the edge, working closely with sensor-fusion systems, will make roads safer and vehicle operation more secure.