The idea of the driverless car has captivated the public imagination for over a century. Long before modern autonomous vehicle and ADAS development, remote-controlled “phantom autos” attracted vast crowds across the US. But they were far from being a realistic reflection of the grim reality on the roads in the early 1900s.
In 1917, Detroit and its suburbs were home to 65,000 cars but witnessed a horrific total of 7,171 road traffic accidents, with pedestrians accounting for three-quarters of the victims.
Over the course of the 20th century, automakers took strides to improve safety with automated systems such as cruise control, anti-lock braking (ABS), Blind Spot Detection (BSD), Lane Change Assist (LCA), traction control and Electronic Stability Program (ESP).
These applications all shared the same goal – improving road safety by overcoming human error. And they’ve proven their worth. Cars equipped with ESP, for example, have historically been 25% less likely to be involved in a fatal crash. And while developments like these were never the end goal, they were key stepping stones toward vehicle autonomy.
When was ADAS implemented in cars?
The modern momentum toward driverless cars began in 2004 when the US Defense Advanced Research Projects Agency (DARPA) held its first Grand Challenge.
Teams drawn from academia and industry developed prototypes of autonomous vehicles that would have to negotiate a grueling 240 km route through the Mojave Desert. Within a few hours, all 15 AVs had crashed out of the contest, with the frontrunner completing less than 5% of the course.
One year later, however, five vehicles finished the route, instilling hope that vehicle autonomy could eventually become commercially feasible. But major questions remained. Could the technology used by most of the teams ensure safety in real-world conditions? Would autonomous platforms be reliable enough to actively reduce the number of road deaths and injuries?
ADAS technology development: accelerating safety
Road Traffic Accidents (RTA) cause 1.35 million fatalities and around 50 million serious injuries every year, equally divided between vehicle occupants and Vulnerable Road Users (VRU). RTAs claim the lives of more people aged 5-29 years than any other cause and cost the global economy $1.8 trillion annually. Part of the problem is the exponential rise in the use of scooters and other two-wheelers.
That’s where Advanced Driver-Assistance Systems (ADAS) come in. Sophisticated sensors, far more advanced than those deployed in the Grand Challenge, are predicted to prevent 37 percent of injuries and 29 percent of deaths from passenger vehicle crashes.
Which technology has what it takes?
To deliver the required performance, ADAS sensors must work reliably in all conditions, especially in darkness and severe weather, when the risk of an RTA is substantially higher. They must also cover a wide field of view, both horizontally and vertically, to eliminate any blind spots around the vehicle. The elevation component is vital for depth perception and for height-obstacle detection.
Another requirement is high resolution for spotting VRUs as they approach the vehicle, especially in urban neighborhoods and crowded parking lots where pedestrians suddenly emerge from between vehicles and scooters lurk in the vehicle’s blind spots.
Optimal ADAS functionality also demands differentiation between static obstacles such as dividers, curbs and parked vehicles, as well as between different types of VRUs, moving cars and other hazards.
Issues with current automotive safety systems
Standard 2D radar is inadequate for these complicated scenarios, which require high angular resolution. High-resolution LIDAR sensors degrade in rain, fog and snow and aren’t always scalable due to their high price point, making LIDAR-based safety unrealistic for economy vehicles. Cameras, meanwhile, raise significant privacy concerns – an increasingly important consideration – and can be rendered inoperative by something as simple as dirt on a lens.
There’s another crucial consideration: the overall complexity and cost of a vehicle’s sensor ecosystem. The legacy strategy of fitting more single-function sensors to meet each new safety standard is overloading the modern car with sensors, wiring and ECUs. With up to 200 sensors per car, electronics make up over 35% of overall vehicle cost, a percentage expected to rise to 50% by 2030. The single-function sensor approach also demands more software, integration efforts and power consumption, while increasing development risks.
Automakers are realizing the need to embrace a more holistic approach that enhances safety from every angle. One technology is enabling them to do exactly that.
4D imaging radar: ADAS development
4D imaging radar is the next generation of radar technology that equips vehicles with the instincts required to make lifesaving decisions.
It leverages a Multiple Input Multiple Output (MIMO) antenna array for high-resolution detection and tracking of people and objects around the vehicle in real time. The technology combines 3D imaging with Doppler analysis of wave distortion caused by movement to create the additional dimension – velocity.
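The velocity dimension rests on the Doppler effect: a target moving radially toward or away from the radar shifts the reflected frequency, and that shift maps directly to speed. A minimal sketch of the textbook relation (generic radar physics, not Vayyar's proprietary processing; the 79 GHz carrier is an illustrative value from the automotive band):

```python
# Recovering radial velocity from a measured Doppler shift.
# Textbook relation v_r = f_d * lambda / 2 -- illustrative only,
# not vendor-specific signal processing.
C = 299_792_458  # speed of light, m/s

def radial_velocity(doppler_shift_hz: float, carrier_hz: float = 79e9) -> float:
    """Radial velocity (m/s) from Doppler shift, with lambda = c / f_carrier."""
    wavelength = C / carrier_hz
    return doppler_shift_hz * wavelength / 2

# A pedestrian walking toward a 79 GHz radar at ~1.4 m/s produces
# a Doppler shift of roughly 740 Hz.
print(radial_velocity(740))  # ≈ 1.4 m/s
```

Because each detected point carries its own Doppler measurement, a moving pedestrian stands out from static clutter even when both occupy the same position in the 3D image.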
With no optics involved, 4D imaging radar protects privacy and is robust in all lighting and weather conditions. It does not require line of sight and can even detect targets through walls and other objects, enabling it to sense people and objects around street corners.
As leading automakers weigh up the various solutions on the market, they must keep in mind that many 3D/4D imaging radar solutions rely on chip concatenation, which adds hardware and depends on a heavy external processor. Although this approach delivers the required resolution for advanced use cases, it demands a large form factor, offers a limited field of view, and carries a hefty price tag.
One vendor has developed a solution that overcomes resolution, complexity and cost challenges with a compact, single-chip platform. It’s miles ahead in terms of performance, price, and the ability to address numerous safety requirements with fewer single-chip sensors.
Vayyar: leader in ADAS development
Vayyar, the global leader in 4D imaging radar, provides an integrated 76-81GHz software-hardware platform that is the optimal ADAS solution, both in the here and now and for the AVs of the near future.
This ADAS platform delivers unprecedented resolution thanks to rich point cloud imaging, with far greater robustness than alternatives at a lower cost. It also supports a minimum detection distance of zero, no dead zones, and a wide azimuth-elevation field of view for rich elevation perception.
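As a rough illustration of why the wide 76-81GHz band matters for resolution (generic FMCW radar theory, not Vayyar specifications), the theoretical range resolution of a radar sweep is c/(2B), where B is the swept bandwidth:

```python
# Back-of-the-envelope FMCW radar range resolution: delta_R = c / (2 * B).
# Generic radar theory -- not vendor specifications.
C = 299_792_458  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Theoretical range resolution (m) for a swept bandwidth in Hz."""
    return C / (2 * bandwidth_hz)

# A sweep across the full 76-81 GHz band gives 5 GHz of bandwidth.
print(f"{range_resolution(5e9) * 100:.1f} cm")  # roughly 3 cm
```

In other words, the wider the available bandwidth, the finer the radar can separate two targets at different ranges.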
Multifunctionality on a single-chip platform
Vayyar’s solution also provides the industry’s first multi-range 4D imaging radar that covers 20cm-300m on a single RFIC. What’s more, it’s the only automotive solution that provides multifunctionality on a single-chip platform. This groundbreaking approach, supported by exceptional resolution and an ultra-wide azimuth-elevation field of view, enables OEMs to ensure 360° protection with far fewer sensors, dramatically reducing both complexity and cost.
With a range of up to 300m, the multi-range ADAS solution differentiates between static obstacles such as dividers, curbs and parked vehicles, as well as between different types of VRUs, moving vehicles and other hazards. In low-speed environments such as parking lots, the XRR chip’s uSRR and SRR sensing supports advanced parking assistance, scanning the vehicle’s surroundings for pedestrians and obstacles.
On the open road, MRR and LRR capabilities facilitate a variety of ADAS and autonomy applications such as Lane Change Assist (LCA), Adaptive Cruise Control (ACC), Blind Spot Detection (BSD), Collision Warnings (fCW/rCW), Cross Traffic Alerting (CTA) and Autonomous Emergency Braking (AEB).
ADAS sensor features
Vayyar’s ADAS sensors provide a range of data outputs tailored to various ADAS and AV architectures:
Edge computing: processing-on-chip enables smart object clustering and tracking, significantly reducing ECU costs.
Raw 4D point cloud data streaming: for native AV sensor fusion and synergy with centralized computing systems.
Hybrid configurations: powerful signal processing embedded on chip at the sensor level, enabling transmission of compressed high-resolution point clouds over high-speed vehicle networks.
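To make the edge-computing mode more concrete, object clustering typically means grouping nearby point cloud detections into candidate objects before tracking. A minimal, generic sketch of proximity-based clustering (illustrative only; the names and threshold are assumptions, and Vayyar's actual on-chip algorithms are not public):

```python
# Naive single-linkage clustering of radar point cloud detections.
# Generic illustration of "object clustering" at the edge -- the
# function name and 0.5 m gap threshold are hypothetical choices.
from math import dist

def cluster_points(points, max_gap=0.5):
    """Group (x, y, z) detections whose chain distance is under max_gap metres."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if any(dist(p, q) <= max_gap for q in c):
                if merged is None:
                    c.append(p)       # p joins the first matching cluster
                    merged = c
                else:                 # p bridges two clusters: merge them
                    merged.extend(c)
                    c.clear()
        clusters = [c for c in clusters if c]  # drop emptied clusters
        if merged is None:
            clusters.append([p])      # p starts a new cluster
    return clusters

# Two pedestrians ~3 m apart: the detections form two clusters.
cloud = [(0.0, 5.0, 1.0), (0.1, 5.1, 1.2), (3.0, 5.0, 1.1), (3.1, 4.9, 0.9)]
print(len(cluster_points(cloud)))  # 2
```

Doing this grouping on the sensor means only a handful of object reports, rather than thousands of raw points, need to cross the vehicle network to the ECU.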
With Vayyar’s multifunctionality on a single-chip platform, vehicles can earn up to 33 Euro NCAP safety points, including 24 points in the VRU protocol, and nine points in the Safety Assist protocol. What’s more, Vayyar’s sensors are ideally suited to Advanced Rider Assistance Systems (ARAS) for two-wheel vehicles, good news for motorcyclists who represent a uniquely high-risk group of road users.
As the auto industry gets closer to achieving its long-term goals of autonomy and maximum safety for all road users, Vayyar’s affordable 4D imaging radar has emerged as a key enabling technology for ADAS growth, with unparalleled robustness and multifunctionality.
Read about Vayyar’s production-ready Radar-on-Chip platform for ADAS at www.vayyar.com/auto.