The expansion of the NXP Semiconductors automotive radar one-chip family has enabled new applications for vehicle radar sensing.
The SAF86xx monolithically integrates a high-performance radar transceiver, multi-core radar processor and MACsec hardware engine for secure data communication over Automotive Ethernet. Combined with NXP’s S32R high-performance radar processors, vehicle network connectivity and power management, the full system solution paves the way for advanced, software-defined radar, according to NXP.
The highly integrated radar SoC (System-on-Chip) is intended for streaming rich low-level radar sensor data at up to 1 Gbit/s. It helps carmakers optimize next-generation ADAS partitioning for software-defined vehicles, while providing for a smooth transition to new architectures.
It also shares a common architecture with the SAF85xx introduced in 2023 and leverages 28 nm RFCMOS performance for significantly improved radar sensor capabilities, compared to prior-generation 40 nm or 45 nm products.
This enables Tier-1 suppliers to build more compact and power-efficient radar sensors. Drivers and other road users will benefit from extended detection range beyond 300 meters, along with more reliable detection of small objects like curb stones as well as vulnerable road users including cyclists and pedestrians.
Automotive Industries (AI) asked Matthias Feulner, Senior Director for NXP’s Product Line ADAS, what the benefits are of the SAF86xx.
Feulner: Using our new SAF86xx radar one-chip family, OEMs can quickly and easily migrate their current radar platforms to new software-defined vehicle architectures.
It shares a common architecture with the SAF85xx, which, when we introduced it in 2023, was the industry’s first fully integrated single chip for automotive radar in 28 nanometer (nm) RFCMOS.
With that, we had a complete platform of automotive radar devices, covering long-range front radar as well as high-resolution imaging radar, based on common hardware and software IPs.
With the introduction of the SAF86xx SoC, we are extending our radar platform to cover new vehicle architectures. Classical architectures are based on edge processing using smart sensors: most of the processing, and with it the intelligence and the sensor function, resides on the sensor itself.
The new generation of software-defined vehicle architectures, which will enter the market around 2027/2028, distributes the intelligence and processing between the sensor and a more powerful ADAS compute controller.
It is important to understand that these future architectures require a repartitioning of the processing and a repartitioning of the intelligence.
Basic NCAP safety-critical functions, such as blind spot detection and emergency braking, may have to remain on the sensor for cases when the connection to the high-performance compute is not available.
But for these new architectures, which are more software based, advanced functions are being moved off the sensor into the centralized controller.
In order to do that, we need the ability to stream wideband sensor data. One of the benefits of the SAF86xx is that it supports wideband data streaming better than previous-generation sensors, which are limited to 10 or 100 megabit per second Ethernet connections.
We are now able to stream at up to a gigabit Ethernet speed. This allows us to use richer sensor data, which makes for better data fusion in our centralized controller.
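As a rough illustration of why the jump to gigabit Ethernet matters, the sketch below computes how long one low-level radar frame takes to stream at each link speed. The frame dimensions are a hypothetical example for illustration, not an NXP specification:

```python
# Illustrative sketch: the frame size below is a hypothetical example,
# not an NXP specification.
def frame_transfer_ms(frame_bytes: int, link_bits_per_s: float) -> float:
    """Milliseconds needed to move one radar frame over the link."""
    return frame_bytes * 8 / link_bits_per_s * 1000.0

# Hypothetical low-level radar frame: 256 range bins x 128 Doppler bins
# x 4 RX channels x 2 bytes per sample = 262,144 bytes.
FRAME_BYTES = 256 * 128 * 4 * 2

for name, rate in [("100 Mbit/s", 100e6), ("1 Gbit/s", 1e9)]:
    print(f"{name}: {frame_transfer_ms(FRAME_BYTES, rate):.2f} ms per frame")
```

Even with these modest assumed dimensions, a 100 Mbit/s link spends roughly 21 ms per frame, which leaves little headroom for typical radar frame rates; a gigabit link cuts that by a factor of ten.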
We are able to establish synchronization between individual sensors to form a network to enable advanced comfort functions in Level 2 and Level 3 autonomous driving.
AI: What additional functions does it support?
Feulner: In software-defined vehicles, you would expect software-defined radar to enhance physical radar.
Consequently, now that we are hosting radar functions on a centralized controller with a high-performance processor rather than the sensor, we can keep adding new functions and upgrading existing functions using over-the-air updates throughout the lifetime of the vehicle.
In order to be able to do that, we are equipping the chip with a state-of-the-art security engine that allows us to establish a secure connection and perform those upgrades. Some of our customers want upgrades on a monthly basis to create a user experience that is much closer to that of a mobile phone.
AI: What new sensing capabilities is the NXP automotive radar platform offering for software-defined radar?
Feulner: The platform covers both today’s and tomorrow’s architectures. The network of connected radar sensors with software-defined functions on a dedicated S32R radar processor in a distributed architecture can enhance radar-based perception to support advancements in autonomous driving.
That includes 360-degree sensing, more powerful AI-based algorithms and secure over-the-air (OTA) software updates.
This network of connected sensors is adaptable for different use cases. And of course, it is upgradeable through over-the-air updates, which are expected to be delivered on a quarterly or monthly basis and will keep adding new capabilities to the sensors.
While most cars will have five sensors, premium vehicles will have more than 10, including side-facing sensors to capture additional detail about the vehicle’s surroundings.
As an example, I would like to highlight 360-degree sensor fusion.
Now that we have sensors mounted all around the vehicle, they can give us very detailed mapping and image-like perception of the vehicle’s environment, which is what enables the Level 2+ and Level 3 autonomous driving modes.
AI: How can you help carmakers achieve the next level in automotive radar sensor performance?
Feulner: In 2023, we introduced the first 28 nanometer RFCMOS radar, allowing us to double the RF performance with conventional PCB-based patch antennas, a well-proven, mature technology. Recent testing of such a sensor showed detection of smaller objects, like a motorcycle, at up to 200 meters.
This has been extended now to up to 400 meters using innovative Launcher-in-Package (LiP) technology, which allows us to radiate the signal directly through the bottom of the unit into a 3D antenna, such as a HUBER+SUHNER 3D waveguide antenna. By not having to route the signals through the PCB, we save substantially on routing and coupling losses.
This combination of 28 nanometer RFCMOS plus launcher in package and 3D waveguide antenna allows us to detect and classify even smaller structures like curbstones more reliably.
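The leap from 200 to 400 meters is consistent with the standard radar range equation, under which received power falls off with the fourth power of range, so maximum range scales with the fourth root of the link power budget. The sketch below (illustrative arithmetic only, not NXP figures) shows that doubling the range corresponds to roughly 12 dB of total recovered budget, to which losses saved on both the transmit and receive paths contribute:

```python
import math

def range_factor(link_budget_gain_db: float) -> float:
    """Received power falls as 1/R^4, so maximum range scales with the
    fourth root of the power budget: factor = 10 ** (dB / 40)."""
    return 10 ** (link_budget_gain_db / 40.0)

def db_needed(range_ratio: float) -> float:
    """Total link-budget improvement (dB) needed for a given range ratio."""
    return 40.0 * math.log10(range_ratio)

# Going from 200 m to 400 m detection range:
print(f"Doubling range needs ~{db_needed(2.0):.1f} dB of recovered budget")
# Routing and coupling losses saved on both TX and RX paths count
# toward that total.
```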
AI: How is NXP enabling better sensor resolution?
Feulner: What we have in production today is an edge-based sensor, where the processing is done exclusively on the sensor. This proven technology is used in many autonomous and semi-autonomous driving applications.
With a distributed architecture, we have the option to implement high resolution collaborative sensing, also known as distributed coherent radar, where we take the signal from two physically separated sensors and coherently combine the data in our radar processor.
The fusing together of the data from the two sensors creates a larger virtual aperture. In radar terms, the angular resolution of a sensor improves in proportion to the size of the aperture, that is, the size of the antenna.
There are practical limits to the size of the antenna because car manufacturers have to integrate the sensor into the vehicle.
There are also industrial design constraints and, consequently, the ability to use two smaller, physically separated sensors to form a larger virtual aperture is another way to implement high resolution sensing in these new distributed architectures.
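The aperture-resolution relationship can be made concrete with the standard diffraction-limit estimate, theta ≈ lambda / D. The sketch below uses assumed aperture sizes (not product figures) at the 77 GHz automotive band to show how a virtual aperture formed by two separated sensors sharpens angular resolution:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def angular_resolution_deg(freq_hz: float, aperture_m: float) -> float:
    """Diffraction-limit estimate: theta ~ lambda / D radians,
    converted to degrees."""
    wavelength = C / freq_hz
    return math.degrees(wavelength / aperture_m)

F = 77e9  # 77 GHz automotive radar band
# Assumed sizes for illustration, not product figures:
print(f"single 5 cm antenna:    {angular_resolution_deg(F, 0.05):.2f} deg")
print(f"50 cm virtual aperture: {angular_resolution_deg(F, 0.50):.2f} deg")
```

A tenfold larger aperture yields a tenfold finer angular resolution, which is exactly why combining two small, separated sensors can substitute for one impractically large antenna.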
AI: What is next for NXP?
Feulner: Software-defined features are a definite focus.
A 28-nanometer sensor gives us capabilities that we have not had before. Using different antennas, the sensor can detect smaller objects and see further.
With software-defined radar we are able to keep refining the capabilities and adding new features, which is what car manufacturers and their customers expect.
Manufacturers want to create additional revenue streams throughout the lifetime of the vehicle after it has been delivered to the customer. An example of a software-defined feature hosted on a radar processor is object classification, which we demonstrated at CES 2024.
When we compare the point-cloud data delivered by the radar sensor with camera data, it is difficult to recognize the point-cloud objects with the naked eye, while the camera image is clear.
That is why an object classifier is used. It uses artificial intelligence to run the point cloud data through a neural network, which is trained with sample data to determine the type of object.
The training of the neural network continues after the vehicle has been delivered to the customer. This means that, over time, the accuracy of the neural network improves.
Using OTA updates, manufacturers will be able to use this data to enhance the object classification based on radar sensors. This is exciting as you are moving beyond the capabilities provided by the hardware itself to a system approach where the interplay of software and hardware opens up a whole lot of new possibilities.
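As a heavily simplified, hypothetical sketch of the classifier idea described above: synthetic data, two hand-picked features, and a logistic-regression stand-in take the place of the production deep network, which would be trained on far richer point-cloud input. The shape of the workflow, features in, trained weights out, better weights after more training, is what OTA updates would ship:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hand-crafted features per detected cluster:
# [spatial extent (m), mean Doppler speed (m/s)]. Synthetic data only.
def make_samples(n):
    ped = np.column_stack([rng.normal(0.6, 0.2, n), rng.normal(1.5, 0.5, n)])
    veh = np.column_stack([rng.normal(4.0, 1.0, n), rng.normal(15.0, 5.0, n)])
    X = np.vstack([ped, veh])
    y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = pedestrian, 1 = vehicle
    return X, y

def train(X, y, epochs=300, lr=0.5):
    """Logistic-regression stand-in for the neural-network classifier."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        z = np.clip(X @ w + b, -30.0, 30.0)
        p = 1.0 / (1.0 + np.exp(-z))        # predicted P(vehicle)
        grad = p - y                        # gradient of the log loss
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

X, y = make_samples(200)
X = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize features
w, b = train(X, y)
acc = (((X @ w + b) > 0) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Retraining with more labeled samples and redeploying the weights over the air is the software-defined loop the interview describes, decoupled from the sensor hardware itself.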
HELLA to use NXP SoC family in new radar portfolio
HELLA, a leading supplier of automotive electronics, will leverage NXP’s SoC family as the foundation for its 7th generation radar portfolio, including variants for front, rear, corner, and side radar.
Dr. Dietmar Stapel, Vice President, Radar Program Management at HELLA, stated: “After initiating our collaboration with NXP on the development of RFCMOS-based radar sensors approximately 10 years ago, we can now look back and confidently say that choosing this collaboration was the right decision.
“Leveraging NXP’s RFCMOS TEF81xx and TEF82xx automotive radar transceivers, we have become a leading provider of automotive 77/79 GHz corner radar sensors. With NXP’s radar SoC family, we are now prepared to expand our market position and offer radar solutions for all relevant radar-supported functions, up to level 4 automated driving.”