For Self-Driving Cars, the Hot New Technology Is… Radar

One of the hottest new technologies for next-generation auto-safety systems and planned “autonomous” vehicles has roots going back a century.

Yes, we’re talking about radar, the same technology that began as a curiosity in the early 20th century, helped Britain to repel the Luftwaffe in World War II, and has long enabled weather forecasting and allowed air-traffic controllers to keep our skies safe.

Today, radar is no longer just for airplanes and military installations. A number of new companies on both the hardware and software side are making radar an integral part of safety systems to detect cyclists and pedestrians.

The need for such technology is urgent: Pedestrian deaths in the U.S. have surged in the past few years, even as Americans logged fewer miles on the road. And increasing levels of autonomy in new vehicles that enable features like collision warning, automatic braking, and blind-spot detection—not to mention the driverless cars of the future—are entirely dependent on advanced sensory systems. Such systems are also essential for auto makers to meet their promises to incorporate automatic braking systems into all vehicles by 2022.

To reduce and prevent carnage on our roads, companies like Mobileye, a subsidiary of Intel, are working on chips bristling with tiny radar antennas. General Motors recently invested in Oculii, a math-and-software startup that uses machine learning to shape the signals automotive radar systems transmit. The software company MathWorks is developing algorithms that allow auto makers to integrate data from radar and other sensors into a trustworthy picture of the world around a vehicle.

For engineers who work on vehicle sensors, now is a time of rapid change, says Erez Dagan, executive vice president of products and strategy at Mobileye. Cameras used in automobiles continue to gain resolution, and are able to sense a wider range of natural light than they used to. Lidar, which bounces lasers off surrounding objects to “see” the world in 3-D, is steadily becoming less expensive. (Lidar is common in robot taxi prototypes, such as those from Waymo, the Google sister company, GM’s Cruise and Amazon’s Zoox.)

Radar, which bounces radio waves off objects—the term was born as an acronym for “radio detection and ranging”—has been used on some first-generation safety systems in vehicles since the 1990s. Automotive radar systems have a number of advantages. They’re tough enough to survive years of jostling and temperature swings when mounted on cars. They’re much, much less expensive than lidar, good at instantaneously measuring the velocity of objects, and able to peer through the kinds of inclement weather, like fog and rain, that can foil both cameras and lidar systems. But they have until recently had one major drawback: They have only a fraction of the resolution of those other systems, which means in essence that the images they produce are much blurrier.
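Radar’s knack for instantaneously measuring velocity comes from the Doppler effect: a reflection off a moving object comes back frequency-shifted, and that shift maps directly to the object’s speed toward or away from the sensor. A minimal sketch of the relationship, assuming the 77 GHz band common in automotive radar (the numbers are illustrative, not from any particular product):

```python
# Radial velocity from Doppler shift: v = f_d * c / (2 * f_c).
# The factor of 2 is there because the wave travels out and back.
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Speed of a target toward the radar, in m/s."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A 5 kHz Doppler shift at 77 GHz corresponds to roughly 9.7 m/s (~35 km/h).
print(round(radial_velocity(5e3), 1))
```

A single reflection is enough to read off speed this way, which is why radar gets velocity “for free” while cameras and lidar must infer it by comparing frames over time.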

Oculii’s engineers work to install imaging radar sensors on a car.

Photo: Oculii

Oculii’s technology works by changing the shape—also known as the waveform—of the signal that a car’s radar sensors send out. The physics are complicated, but by adapting the waveform to the sorts of objects it’s bouncing off of, the system can resolve objects whose shapes would otherwise be impossible to “see.” The result, says Chief Executive Steven Hong, is that existing automotive radar sensors, which cost around $50 apiece, can generate three-dimensional images of a car’s surroundings at much higher resolution. The company’s software is set to make its debut in radar-based safety systems in the forthcoming Lotus Lambda SUV, to be released in 2023.
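Oculii’s specific waveform techniques are proprietary, but one textbook relationship illustrates why the waveform matters at all: for the chirped (FMCW) signals automotive radars typically use, range resolution is set by the bandwidth the chirp sweeps, so changing the transmitted waveform directly changes what the sensor can distinguish. A hedged sketch of that standard formula (the example bandwidths are illustrative):

```python
# FMCW radar range resolution: delta_R = c / (2 * B),
# where B is the swept bandwidth of the chirp.
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest range separation an FMCW radar can distinguish, in meters."""
    return C / (2.0 * bandwidth_hz)

# A chirp sweeping 1 GHz resolves targets about 15 cm apart;
# a narrower 150 MHz sweep can only separate targets about 1 m apart.
print(round(range_resolution_m(1e9), 3))
print(round(range_resolution_m(150e6), 2))
```

The takeaway: a “better” waveform is not magic, but a trade among resolution, range and interference, which is why software that picks waveforms adaptively can squeeze more out of cheap hardware.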

Leveraging the chip manufacturing abilities of parent company Intel, Mobileye is working on individual microchips covered with nearly 100 tiny antennas. By using artificial intelligence software to process the noisy signals they receive, Mobileye says its systems can do things like identify pedestrians, at least in the lab. That’s something that previously could only be achieved with cameras and lidar.

There is no unanimity among automobile technologists about what configuration of cameras, lidar and radar will become the standard way to achieve various safety systems or autonomous driving, but nearly all agree that the best solution will be some combination of them.

The resolution that even the best automotive radar can achieve is only as good as the worst lidar systems available, says Matthew Weed, an engineer and senior director of product management at Luminar, which makes lidar systems for automobiles. Luminar’s system, which Mr. Weed says is superior to radar for most applications, costs $1,000, however.

Mr. Weed says that Luminar’s lidar-based systems could justify their cost by being so good that they could lower drivers’ insurance costs by preventing accidents and pedestrian deaths. Even with such a system on a car, radar would be a good backup for when it fails or can’t handle severe weather, he adds.

Mobileye uses lidar, cameras and radar in its most advanced systems. CEO Amnon Shashua has said that while lidar systems have come down in price, they are still 10 times the cost of radar, and likely to remain so for the foreseeable future, on account of the complexity of the hardware involved.

A vehicle outfitted with Mobileye technology moves through traffic in Jerusalem.

Photo: Mobileye

Elon Musk’s Tesla has gone all-in on its bet that the company can achieve true autonomous driving in its vehicles using only cameras.

Cameras have the advantage of extremely high resolution, and they’re affordable and compact thanks to years of advances in smartphone cameras. But for a system that can achieve the highest safety standards, and even eventually full autonomy, cameras need backup sensors that fail under different sets of conditions than they do, Mr. Dagan adds.

Take fog, which looks like an obstacle to both camera-based and lidar-based systems, potentially causing vehicles to stop when they shouldn’t. In research published in 2020, radar-based automotive sensors had no trouble penetrating fog and correctly identifying stopped vehicles hidden within it, says Dinesh Bharadia, an assistant professor of engineering at the University of California San Diego who contributed to the work.

Dr. Bharadia says his team found that one key is using multiple radars, spaced at least five feet apart on a vehicle. It’s the same principle at work in the ever-expanding number of cameras on the backs of our smartphones, he adds: just as a phone combines pictures from several small, inexpensive cameras into something much sharper, it’s possible to build up an “image” of a car’s surroundings from multiple low-cost radar sensors.
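Why does spacing the radars apart help? A standard rule of thumb in radar is that angular resolution scales with wavelength divided by the width of the antenna aperture, so two sensors spanning the front of a car act like one much larger antenna. A rough sketch under that assumption (the chip and spacing sizes are illustrative, not from Dr. Bharadia’s paper):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def angular_resolution_deg(aperture_m: float, carrier_hz: float = 77e9) -> float:
    """Approximate angular resolution (degrees) of an array: wavelength / aperture."""
    wavelength = C / carrier_hz  # ~3.9 mm at 77 GHz
    return math.degrees(wavelength / aperture_m)

# A single ~4 cm radar chip vs. two radars spaced ~5 feet (1.5 m) apart:
print(round(angular_resolution_deg(0.04), 2))  # a few degrees: blurry
print(round(angular_resolution_deg(1.5), 2))   # a fraction of a degree: sharp
```

The roughly 40-fold jump in angular sharpness is the payoff for mounting cheap sensors far apart rather than buying one expensive one.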

Bringing all the sensors on a car together into a single, coherent view of the reality outside a vehicle requires fusing all that data together, says Rick Gentile, an engineer who used to work on radar systems for defense applications and is now a product manager at MathWorks, a software company that builds tools to help process data. For example, while radar might be able to detect that there’s a sign up ahead, it can’t see its color, which is critical to quickly identifying what kind of sign it is.
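The fusion Mr. Gentile describes can be pictured as merging the complementary fields each sensor measures well into a single object track. A toy illustration of that idea—every name and structure here is invented for the example and is not MathWorks’ API:

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    range_m: float        # radar measures distance and speed well...
    velocity_mps: float

@dataclass
class CameraDetection:
    label: str            # ...while the camera supplies class and color
    color: str

@dataclass
class FusedObject:
    label: str
    color: str
    range_m: float
    velocity_mps: float

def fuse(radar: RadarDetection, camera: CameraDetection) -> FusedObject:
    """Combine complementary measurements of one object into a single track."""
    return FusedObject(camera.label, camera.color,
                       radar.range_m, radar.velocity_mps)

obj = fuse(RadarDetection(42.0, -1.2), CameraDetection("stop sign", "red"))
print(obj.label, obj.range_m)  # prints: stop sign 42.0
```

Real fusion stacks must also decide which detections from different sensors belong to the same object and weigh conflicting measurements, but the division of labor is the same: each sensor contributes what it sees best.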

For so-called robot taxis, the way to make up for the gaps in the capabilities of each kind of sensor is to use all of them. The goal is “full redundancy,” says Mr. Dagan, so that even if one sensor has an error, others perceive the world correctly. This, he argues, is the fastest way to give vehicles senses that are at least as good as a human’s. (Whether those vehicles will have judgment sufficient to actually drive themselves around safely is a separate matter.)

Until we get real autonomous vehicles—something that could be years, if not decades away—auto makers will have to choose among radar, lidar and cameras, or some combination of the three, to create safety systems that can meet their promises to make automatic braking systems standard by 2022, and to continue to improve those systems. All three sensor types continue to get better, but the difference in cost among them has led auto makers to favor one technology or another, depending on how well they think they can make up for its deficiencies with software and AI.


This has led to healthy competition among makers of safety systems, sensors and supporting software: whoever you talk to will argue that their systems are the best.

As these companies jostle for a place on your car, the technologists behind them share a goal: to profit by significantly reducing road fatalities of every kind while a human is still behind the wheel. It’s a goal they all agree is much closer at hand than fully autonomous vehicles.

Write to Christopher Mims at

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.