Your Car's Driving Assistance Tech Isn't Meant to Be Used Alone—Here's Why

Keep your eyes on the road

Don't trust your car to do the driving for you, experts say. 

Some automakers like Tesla say that assistance technologies like Autopilot aren't meant to be used without close human supervision. But Americans may not be getting the message. A new study finds that drivers using driver assistance features often treat their vehicles as fully self-driving. 

"These applications still require the human to keep their eyes on the road and hands ready to take over the wheel, just as we have been doing with traditional cruise control for decades," Stan Caldwell, a professor of transportation and public policy at Carnegie Mellon University told Lifewire in an email interview. 

Not Yet Full Auto

Some drivers seem to think they are riding a bus when using assistive technologies. According to a study by the Insurance Institute for Highway Safety (IIHS), an industry research group, drivers of vehicles with Cadillac Super Cruise, Nissan/Infiniti ProPILOT Assist, and Tesla Autopilot said they were more likely to perform non-driving-related activities like eating or texting while using their partial automation systems than while driving unassisted. And 53 percent of Super Cruise users, 42 percent of Autopilot users, and 12 percent of ProPILOT Assist users said they were comfortable treating their vehicles as fully self-driving.

"The big-picture message here is that the early adopters of these systems still have a poor understanding of the technology's limits," said IIHS President David Harkey in a news release. "But we also see clear differences among the three owner populations. It's possible that system design and marketing are adding to these misconceptions."

Vehicles currently on the market offer Level 1 and Level 2 automation, with features such as automated lane keeping, adaptive cruise control, and automated emergency braking, Caldwell said. 

"Level 3 automation is coming, and already on some roads in Germany with Mercedes, where eyes can be off the road, but the human driver still may have to take over," he added. "My concern is that if people are already over-relying on Level 2 automation, the situation may get worse."

Chris Piche, the CEO of Smarter AI, a company that makes systems that use AI and cameras for car navigation and other areas, said in an email interview that today's autonomous driving systems, like Tesla's Autopilot and Full Self-Driving, rely on vehicle cameras and computer vision systems to perform "line of sight" detection.

"This has proven effective with moving obstacles like bicycles, pedestrians, and other vehicles. But there have been many examples of fatal collisions with stationary obstacles—like traffic barriers and parked trailer trucks—caused by autonomous vehicles' inability to identify these stationary objects that have blended with their backgrounds," Piche added. 

The inherent dangers in relying on automation in autonomous vehicles come down to trust, Bob Rogers, data scientist and CEO of Oii.ai, said via email. Can drivers trust that the car can handle the situations it faces? 


"Autonomous vehicles depend on a slew of sensors to react. So if the sensor for 'speed' malfunctions, the speed-assist can't adjust for speed limit changes," Rogers said. "If the sensor that triggers automatic emergency braking malfunctions, a driver who trusted that system is obviously going to end up in a crash."

Rogers noted that car manufacturers have been touting more and more safety features in their autonomous vehicles. Blind spot technology is one advancement that helps alert drivers, especially during lane changes. He said that researchers are also fine-tuning forward collision warning systems, which alert drivers to swerve, slow down, or correct their course depending on what's ahead.

"Automakers want to safely transport people in those cars while they sit in the back seat or work on their laptops," he added. "So researchers will continue training AI to handle countless hazards, and drivers can expect to be less hands-on during a road-trip experience."

The Future May Be Safer

While fully self-driving cars are not yet safe enough for human transportation, AI could help. Companies including Smarter AI are developing V2X (vehicle-to-everything) systems that enable autonomous vehicles to see beyond the "line of sight": around corners, through traffic, and without blind spots. These systems integrate cameras and computer vision with wireless sensors and networks in vehicles and in transportation infrastructure such as roads, intersections, and crosswalks. 

"Vehicle cameras, computer vision, and V2X will eliminate traffic collisions and fatalities," Piche predicted. "And by orchestrating navigation and speeds, V2X will eliminate traffic jams in even our largest cities."

https://www.lifewire.com/your-cars-driving-assistance-tech-isnt-meant-to-be-used-alone-heres-why-6751861
