
Vision System: How Do Autonomous Vehicles See?

From: Nexdata  Date: 2024-04-02

One of the key enablers of self-driving cars is the suite of on-board sensors that allows the car to "see" the road and understand what is going on around it; in many cases, this sensing capability exceeds that of any human driver. Self-driving cars need to be able to tell who or what is in their path and recognize features of the road system, all while constantly dealing with traffic and the other challenges we face on the road every day. To overcome these hurdles, self-driving cars rely on a range of technologies, including cameras, radar, lidar and infrared sensors.

As we progress from L1 to L5 autonomous driving, the number of sensors will increase massively. To handle the growing data bandwidth these sensors produce, processing power must increase substantially, ensuring real-time extraction of information and continuous updates on what is happening around the vehicle. Processing power is a vital component of self-driving cars; its importance cannot be overstated, and it should be considered whenever we discuss sensors and their application in self-driving cars.

When a new car is introduced, many people wonder what’s under the hood. For self-driving cars, it’s even more important to know what’s in the trunk, because processing power is often built into the trunk these days. OEMs must equip their vehicles with the necessary technology to ensure that all ADAS and autonomous driving functions perform as they were designed. Graphics processing units (GPUs) and neural network accelerators (NNAs) will play key roles in this development.

GPUs already allow automakers to stitch together data streams from multiple cameras to create a 360-degree surround view of the car. NNAs enable multi-core artificial intelligence (AI) chips to perform large-scale parallel computing for functions such as road-sign recognition, pedestrian detection and automated driving. With the integration of a large number of sensors, mobility will move rapidly towards autonomous driving over the next 10 to 20 years. Let's take a look at some of the sensor systems that will make fully autonomous vehicles a reality.

RGB Camera
Cameras have been a staple in cars for years, their primary function being to provide a better view when reversing. In addition to the reversing cameras now mandated in various regions, cameras are becoming a basic requirement for strengthening ADAS functions and a key element of self-driving cars. The camera acts as the eyes of the car, observing details on the road much as human eyes do. Higher resolution means objects can be detected from greater distances and their state understood. Since cameras are the only sensors capable of detecting color, they are essential for traffic light detection, road sign reading, and more. They can assist adaptive cruise control and emergency braking functions, and can also serve as primary sensors in automated solutions.
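The link between resolution and detection distance can be sketched with a simple pinhole-camera model: the fewer pixels an object covers, the harder it is to classify. The sensor width, field of view, and object size below are illustrative assumptions, not figures from the article or any vendor.

```python
import math

def pixels_on_target(sensor_px: int, hfov_deg: float,
                     object_width_m: float, distance_m: float) -> float:
    """Approximate horizontal pixel count covering an object at a given
    distance, assuming a simple pinhole camera with uniform pixel pitch."""
    # Angular width of the object as seen from the camera, in degrees.
    angular_width = 2 * math.degrees(math.atan(object_width_m / (2 * distance_m)))
    px_per_degree = sensor_px / hfov_deg
    return angular_width * px_per_degree

# A 0.6 m-wide traffic sign at 100 m, seen by a camera with 1920 horizontal
# pixels and a 60-degree horizontal field of view (illustrative numbers):
print(round(pixels_on_target(1920, 60.0, 0.6, 100.0), 1))  # → 11.0
```

With only about 11 pixels on the sign, classification is marginal; doubling the resolution or halving the field of view doubles the pixel count, which is why higher-resolution cameras see farther.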

Lidar
Lidar can be found in almost every self-driving car tested. Designed to provide a full 360-degree panoramic view, lidar uses laser pulses to visualize a vehicle’s surroundings in the form of a three-dimensional “point cloud.” The technology has already been extremely successful in helping many OEMs bring their self-driving car ambitions to life. It’s powerful enough to work on its own; it’s also versatile enough to work with other sensors. Multiple lasers could also be used to enable more robust self-driving car functions. Currently, lidar is an expensive option in self-driving car systems, and many companies are working to reduce its cost so that automakers can use it in mid-range and premium models.
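The "point cloud" the article mentions is built one laser return at a time: each pulse reports a range plus the beam's azimuth and elevation, which convert to an (x, y, z) point via spherical-to-Cartesian geometry. This is a minimal sketch of that step; the axis convention (x forward, y left, z up) is an assumption.

```python
import math

def lidar_return_to_point(range_m: float, azimuth_deg: float,
                          elevation_deg: float) -> tuple:
    """Convert one laser return (range plus beam angles) into an (x, y, z)
    point in the sensor frame -- the basic step behind a lidar point cloud."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return (x, y, z)

# A return at 20 m, dead ahead and level with the sensor:
print(lidar_return_to_point(20.0, 0.0, 0.0))  # → (20.0, 0.0, 0.0)
```

A spinning lidar repeats this conversion for every pulse across a full 360-degree sweep, producing hundreds of thousands of such points per second.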


Radar
Radar has been used in aircraft for decades, and it is fast becoming an essential technology for the future of mobility. Cameras and lidar are great for seeing the car’s surroundings, while radar is especially useful for detecting moving objects blocked by natural or man-made obstacles. For example, when a deer is hidden behind a tree, it is invisible to the naked eye, but radar can detect its presence and display a warning or automatically slow the car to prevent a collision. Radar typically uses a narrowly focused long-range beam or a wide-area low-frequency beam to detect objects ahead of and around the car. The advantage of radar is the early detection of objects and knowledge of their speed and direction. This is crucial for predicting the movement paths of other traffic participants in the vehicle’s field of view.
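Radar's knowledge of a target's speed comes from the Doppler effect: a reflection off a moving object comes back frequency-shifted, and the radial speed follows from v = f_d · c / (2 · f_c). The sketch below shows the relation; the 77 GHz carrier is a common automotive radar band, and the measured shift is an illustrative figure.

```python
C = 299_792_458.0  # speed of light, m/s

def radial_speed(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial speed of a target from the Doppler shift of the reflected
    radar signal: v = f_d * c / (2 * f_c). The factor of 2 accounts for
    the shift being applied on both the outbound and return trips."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# A 77 GHz automotive radar measuring a 5.13 kHz Doppler shift
# (illustrative numbers) sees a target closing at roughly 10 m/s:
v = radial_speed(5130.0, 77e9)
print(round(v, 2), "m/s")
```

The sign of the shift also tells the radar whether the target is approaching or receding, which is what makes radar so useful for predicting the paths of other traffic participants.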

Ultrasonic Sensor
Submarines rely on sonar technology to probe the ocean and avoid collisions with ships, animals and other submarines. Now, the same principle is finding its way into mobility in the form of ultrasonic sensors that detect reflected sound waves to determine the position of objects relative to the vehicle. This is useful for locating pedestrians on crosswalks and even spotting critters lurking nearby. Like radar, the technology can also detect moving objects under various environmental conditions.
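Determining an object's position from reflected sound waves is a time-of-flight calculation: the pulse travels out and back, so the distance is half the round-trip time multiplied by the speed of sound. A minimal sketch, assuming sound travels at 343 m/s (air at roughly 20 °C):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed value)

def echo_distance(round_trip_s: float) -> float:
    """Distance to an obstacle from the round-trip time of an ultrasonic
    pulse. The pulse travels out and back, so halve the total path."""
    return SPEED_OF_SOUND * round_trip_s / 2

# An echo returning after about 11.66 ms puts the obstacle roughly 2 m away:
print(round(echo_distance(0.01166), 2))  # → 2.0
```

Because the speed of sound varies with temperature, production parking sensors typically compensate for ambient conditions; the constant here is a simplification.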

Infrared Sensor
Infrared sensors are common in cell phones and other electronics, where they improve image quality, facial recognition, and a host of other features. Self-driving cars could also use infrared sensors, likely in combination with other technologies. For example, a thermal imaging camera built from infrared sensors can help a vehicle see better in conditions such as rain, fog, dust and smoke. This capability could benefit nearly every car on the road in rain and fog, and it also opens up remarkable opportunities to keep self-driving cars operating reliably in dust or smog.