Cutting through the data overload with customized information

People are being bombarded with information in the modern world. When we can’t digest or process it all, our senses simply start ignoring it. In a car or truck, that could be fatal.

There is a real danger of information overload in a vehicle because of what is happening outside and inside the passenger compartment, as well as under the hood. A growing array of advanced driver assistance systems (ADAS) collects data, but this information has to be processed and presented in a way that cuts through the clutter.

Automotive Industries (AI) asked Robert Kempf, Vice President of Sales and Business Development, ADAS/Autonomous Driving at HARMAN, how advanced driver assistance systems will help to cut through the clutter. Kempf: Starting with user expectations: the vehicle as we know it is transforming from a simple means of transport into an immersive living space as we move through the different levels of autonomy towards the robo-taxi that can operate in any environment. To navigate this route, you will need new technologies like digital mirrors that display the outside world within the car, other displays, and augmented reality. You will need to fully understand not only the vehicle environment (which means using sensors outside of the vehicle, like cameras, radar and LiDAR), but also what is happening inside the vehicle.

To do this, HARMAN is working on cabin monitoring systems that combine sensors such as infrared and normal-light cameras, micro-radar, and audio analysis. Mastering the processing of all these different sensors really holds the key to monitoring the environment and the state and intentions of the vehicle user, in order to tailor the information provided to the driver and make it easy to understand. HARMAN has a big advantage through our knowledge of the infotainment space, in that we understand what users want to see and how they want to consume content.
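To make the idea concrete, here is a minimal Python sketch of how such in-cabin cues might be fused into one driver-state estimate. This is an illustration only, not HARMAN's implementation; all field names and weights are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class CabinReadings:
    """One synchronized frame of in-cabin sensor outputs (fields hypothetical)."""
    eyes_closed_score: float  # camera-based eyelid/gaze tracking, 0..1
    head_nod_score: float     # micro-radar motion signature, 0..1
    speech_slur_score: float  # audio analysis, 0..1

def driver_drowsiness(r: CabinReadings) -> float:
    """Fuse per-sensor cues into one drowsiness estimate.
    A plain weighted average stands in for whatever model a production
    cabin-monitoring stack would really use."""
    return 0.5 * r.eyes_closed_score + 0.3 * r.head_nod_score + 0.2 * r.speech_slur_score

frame = CabinReadings(eyes_closed_score=0.8, head_nod_score=0.6, speech_slur_score=0.1)
if driver_drowsiness(frame) > 0.5:  # 0.6 here: tailor the HMI accordingly
    print("Driver fatigued: reduce on-screen clutter, surface a rest-stop prompt.")
```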

AI: How do you see the evolution of ADAS unfolding? Kempf: When we talk about autonomous driving, ADAS is the first step, in that it provides warnings to drivers and allows for partial automation. There are three key factors you need to consider. First is increasingly stringent safety regulation, led by the European Union, which will reinforce the need for more and more ADAS features in the vehicle. Second, and even more important from a HARMAN perspective, is the impact of changing user expectations – which brings us to the third: comfort and mobility. These ADAS drivers will significantly change the automotive industry.

AI: How does your sensor technology combine camera, radar and LiDAR? Kempf: In order to meet more stringent safety regulations, you are going to have to automate the processing of information from a number of different sensors. Basically, the key challenge is to extract and collate the relevant information from all the sensor data. Talking about sensors, we are also developing forward-facing cameras, as well as systems to monitor other systems. The key is to be able to migrate to new electrical/electronic (E/E) architectures. The distributed architectures you see today, where each and every sensor has a small ECU doing the data processing and giving you object outputs, need to migrate to more of a domain controller or central compute architecture that enables you to optimize the utilization of different sensor inputs. It must also give you the scalability and flexibility to handle different sensor topologies, because there will be different sets of sensors to provide different features as dictated by the cost level. This migration of E/E architecture will help not just with the rollout of the technology, but also make it affordable in the long term.
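To make the architectural shift tangible, here is a minimal sketch (hypothetical names, not HARMAN code) of a domain controller that accepts whatever sensor topology a given vehicle trim registers, instead of relying on per-sensor ECUs to emit pre-digested object lists.

```python
from typing import Callable, Dict, List

Detection = dict  # e.g. {"kind": "vehicle", "range_m": 42.0, "source": "front_radar"}

class DomainController:
    """Central compute node: raw sensor payloads come in, one shared
    detection stream goes out to fusion/planning."""

    def __init__(self) -> None:
        self._decoders: Dict[str, Callable[[bytes], List[Detection]]] = {}

    def register_sensor(self, name: str,
                        decoder: Callable[[bytes], List[Detection]]) -> None:
        # Scalability: a cost-reduced trim simply registers fewer sensors.
        self._decoders[name] = decoder

    def process_frame(self, raw: Dict[str, bytes]) -> List[Detection]:
        detections: List[Detection] = []
        for name, payload in raw.items():
            if name in self._decoders:
                detections.extend(self._decoders[name](payload))
        return detections

# Toy decoder standing in for real signal processing:
def radar_decoder(payload: bytes) -> List[Detection]:
    return [{"kind": "vehicle", "range_m": float(len(payload)), "source": "front_radar"}]

dc = DomainController()
dc.register_sensor("front_radar", radar_decoder)
print(dc.process_frame({"front_radar": b"\x01\x02\x03"}))
```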

AI: Is there a “one for all” sensor solution? Kempf: For us, the sensor fusion topic holds the key to future success. There won’t be a “one for all” sensor solution. There are pros and cons for each and every sensor, whether it is radar, cameras or LiDAR. What is needed is fusion of the information collected by the cameras, radars and LiDARs – as well as other sensors that may play a role. Right now, each and every sensor provider is moving to higher and higher resolution and increased performance, but nobody except HARMAN is really looking at the bigger picture: the optimum sensor mix to fulfil the necessary KPIs in terms of performance, robustness and cost. When it comes to automated vehicles, the sensors need to be able to keep the vehicle moving without any connectivity.
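A toy example of why fusion beats any single sensor: a Python sketch (association between the sensor reports is assumed to have happened already; all names are illustrative) that takes each attribute of a tracked object from the sensor best placed to measure it.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FusedTrack:
    label: Optional[str]       # classification: cameras do this best
    range_m: Optional[float]   # distance: radar is robust in all weather
    extent_m: Optional[float]  # physical size/shape: LiDAR excels here

def fuse(camera: dict, radar: dict, lidar: dict) -> FusedTrack:
    """Late fusion of one matched object across three sensor reports."""
    return FusedTrack(label=camera.get("label"),
                      range_m=radar.get("range_m"),
                      extent_m=lidar.get("extent_m"))

print(fuse({"label": "truck"}, {"range_m": 37.5}, {"extent_m": 2.4}))
```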

AI: Will V2X and mesh networking enable vehicles to become part of the urban infrastructure? Kempf: Ultimately, each and every vehicle and every building will be part of a larger IoT environment, which requires backend solutions that are able to orchestrate all the different inputs. The vehicle in that sense will be just a part of a broader IoT network. If you look at what 5G is bringing to the game, you can talk about adding a seventh sense to the vehicle. Connectivity will help reduce the environmental footprint of transport by maximizing traffic flow – which, in turn, will strengthen the case for automated driving.
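One small example of connectivity smoothing traffic flow: a green-wave speed advisory computed from a simplified signal-phase (SPaT-style) V2X message. The function, fields and numbers are invented for illustration.

```python
def advisory_speed_kmh(distance_to_signal_m: float,
                       seconds_until_green: float,
                       speed_limit_kmh: float = 50.0) -> float:
    """Pick a speed that reaches the intersection just as it turns green,
    so the vehicle glides through instead of stopping and restarting."""
    if seconds_until_green <= 0:
        return speed_limit_kmh  # already green: proceed at the limit
    required_ms = distance_to_signal_m / seconds_until_green
    return min(speed_limit_kmh, required_ms * 3.6)

# 200 m from a light turning green in 20 s -> cruise at 36 km/h, never stop.
print(f"{advisory_speed_kmh(200.0, 20.0):.1f} km/h")
```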

AI: Why do we need to monitor what’s happening inside the vehicle? Kempf: While automated driving technology is evolving pretty quickly, the consumer is not being asked whether they want or need automated driving, or what benefits they expect from it. There is also a lack of trust in the technology. By understanding the user’s intentions in more detail, you can adapt and personalize the system. Internal sensors help identify the user’s needs, while the backend stores the user profile. Our expectation is that if you enable this personalization, you will see a steep increase in acceptance of, and trust in, autonomous driving technology. Everyone drives differently and has their own preferences. If you can combine ADAS technology with digitalized lifestyle personalization, there will be greater demand for autonomous vehicles.
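As an illustration of profile-driven personalization (field names and defaults are hypothetical, not a HARMAN API):

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """Preferences a backend might persist per driver."""
    following_gap_s: float = 2.0        # preferred adaptive-cruise headway
    lane_change_style: str = "relaxed"  # "relaxed" or "assertive"

def acc_headway(profile: UserProfile, driver_identified: bool) -> float:
    # Interior sensors identify who is driving; the stored profile then
    # tunes the automation so it feels familiar rather than foreign.
    return profile.following_gap_s if driver_identified else 2.0  # safe default

alice = UserProfile(following_gap_s=1.6, lane_change_style="assertive")
print(acc_headway(alice, driver_identified=True))   # 1.6 s
print(acc_headway(alice, driver_identified=False))  # 2.0 s
```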

AI: How would augmented reality lead to new user experiences? Kempf: AR is a different way of presenting information to the user. The nice thing about AR is that it gives you a natural but immersive experience when you consume information, by overlaying the data on the real-world view. It will be like reading your phone, but you will have the information at your fingertips on the windscreen. Navigation will be much more immersive than we are used to. While at present the information is displayed in the center stack and cluster, with AR each and every window can interactively display information.
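The core step of any AR overlay is projecting a 3D point in the world onto the display surface. Here is a minimal pinhole-camera sketch; the intrinsics are placeholder values, not real HUD calibration.

```python
def project_to_hud(x_m: float, y_m: float, z_m: float,
                   focal_px: float = 1000.0,
                   cx: float = 960.0, cy: float = 540.0) -> tuple:
    """Project a point in the vehicle frame (x forward, y left, z up,
    metres) onto a 1920x1080 head-up display via a pinhole model."""
    u = cx - focal_px * (y_m / x_m)  # left in the world -> left on screen
    v = cy - focal_px * (z_m / x_m)  # up in the world -> up on screen
    return (u, v)

# A navigation arrow 30 m ahead, 2 m left, at road level (1.2 m below eyes):
print(project_to_hud(30.0, 2.0, -1.2))  # -> (~893.3, 580.0)
```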

AI: What’s next for HARMAN? Kempf: For us it is really about creating this immersive living space, which will be the future of automated vehicles. When we refer to “experiences per mile”, what we are focusing on is not just a personalized experience with ADAS and automated driving (which requires that you know in depth what is going on in the ADAS domain), but also in the telematics and infotainment spaces. We have most of the enabling technologies in-house, and we have demonstrated a number of the components at shows like CES. Within the next 20 years we will see everything coming together, presenting mobility to the world in a way that leverages the overlay of ADAS, telematics and infotainment.
