Autonomous vehicles learning to drive by mimicking others

Self-driving cars are powered by machine learning algorithms that require vast amounts of driving data to function safely. But if self-driving cars could learn to drive the way babies learn to walk, by watching and mimicking others around them, they would need far less compiled driving data. That idea is pushing Boston University engineer Eshed Ohn-Bar to develop a new way for autonomous vehicles to learn safe driving techniques: by watching other cars on the road, predicting how they will respond to their environment, and using that information to make their own driving decisions.

Ohn-Bar, a BU College of Engineering assistant professor of electrical and computer engineering and a junior faculty fellow at BU’s Rafik B. Hariri Institute for Computing and Computational Science & Engineering, and Jimuyang Zhang, a BU PhD student in electrical and computer engineering, recently presented their research at the 2021 Conference on Computer Vision and Pattern Recognition (CVPR). Their idea for the training paradigm grew from a desire to increase data sharing and cooperation among researchers in the field: autonomous vehicles currently require many hours of driving data to learn how to drive safely, but some of the world’s largest car companies keep their vast stores of data private to protect their competitive advantage.

“Each company goes through the same process of taking cars, putting sensors on them, paying drivers to drive the vehicles, collecting data, and teaching the cars to drive,” Ohn-Bar says. Sharing that driving data could help companies create safe autonomous vehicles faster, allowing everyone in society to benefit from the cooperation. Artificially intelligent driving systems require so much data to work well, Ohn-Bar says, that no single company will be able to solve this problem on its own.

“Billions of miles [of data collected on the road] are just a drop in an ocean of real-world events and diversity,” Ohn-Bar says. “Yet, a missing data sample could lead to unsafe behavior and a potential crash.”

The researchers’ proposed machine learning algorithm works by estimating the viewpoints and blind spots of other nearby cars to create a bird’s-eye-view map of the surrounding environment. These maps help self-driving cars detect obstacles, like other cars or pedestrians, and understand how other cars turn, negotiate, and yield without crashing into anything.
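One way to picture the mapping step is sketched below in Python. It covers only the rasterization of observed vehicle positions into an ego-centered top-down grid, not the viewpoint or blind-spot estimation itself, and the grid size, cell resolution, and position format are assumptions made for illustration rather than details of the team’s implementation.

```python
import numpy as np

def birds_eye_view(ego_xy, others, grid_size=64, cell_m=0.5):
    """Rasterize the ego vehicle and nearby cars into a top-down
    occupancy grid centered on the ego vehicle.

    ego_xy : (x, y) position of the ego vehicle in meters
    others : list of (x, y) positions of observed vehicles
    returns: (grid_size, grid_size) array with 1.0 where a vehicle sits
    """
    bev = np.zeros((grid_size, grid_size), dtype=np.float32)
    half = grid_size // 2
    for x, y in [tuple(ego_xy)] + [tuple(o) for o in others]:
        # Shift into the ego-centered frame, then scale meters to grid cells.
        col = int((x - ego_xy[0]) / cell_m) + half
        row = int((y - ego_xy[1]) / cell_m) + half
        if 0 <= row < grid_size and 0 <= col < grid_size:
            bev[row, col] = 1.0  # mark the cell as occupied
    return bev

# Example: ego at the origin, two observed cars nearby -> three occupied cells.
print(birds_eye_view((0.0, 0.0), [(5.0, 2.0), (-3.0, 8.0)]).sum())  # 3.0
```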

Through this method, self-driving cars learn by translating the actions of surrounding vehicles into their own frame of reference, which feeds the neural networks that power their machine learning algorithms. These other cars may be human-driven vehicles without any sensors, or another company’s auto-piloted vehicles. Since observations from all of the surrounding cars in a scene are central to the algorithm’s training, this “learning by watching” paradigm encourages data sharing, and consequently safer autonomous vehicles.
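To make the frame-of-reference translation concrete, the short sketch below re-expresses a waypoint observed in world coordinates in a watched vehicle’s own coordinate frame, so it could serve as a driving label for the learner. The pose format and function name are illustrative assumptions, not code from the paper.

```python
import numpy as np

def to_local_frame(pose_xy, heading_rad, waypoint_xy):
    """Express a world-frame waypoint in a vehicle's local frame
    (x pointing forward, y to the left), given that vehicle's position
    and heading."""
    dx = waypoint_xy[0] - pose_xy[0]
    dy = waypoint_xy[1] - pose_xy[1]
    cos_h, sin_h = np.cos(heading_rad), np.sin(heading_rad)
    # Rotate the displacement by the inverse of the vehicle's heading.
    return np.array([cos_h * dx + sin_h * dy,
                     -sin_h * dx + cos_h * dy])

# A watched car at (10, 5) facing "north" (90 degrees); its next waypoint at
# (10, 8) becomes roughly "3 meters straight ahead" in its own frame.
print(to_local_frame((10.0, 5.0), np.pi / 2, (10.0, 8.0)))  # ~[3. 0.]
```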

Ohn-Bar and Zhang tested their “watch and learn” algorithm by having autonomous cars controlled by it navigate two virtual towns: one with straightforward turns and obstacles similar to its training environment, and another with unexpected twists, like five-way intersections. In both scenarios, the researchers found that their self-driving neural network got into very few accidents. With just one hour of driving data to train the machine learning algorithm, the autonomous vehicles arrived safely at their destinations 92 percent of the time.
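For context, a success rate like this is simply the fraction of evaluation routes completed without a collision or other failure; the tiny sketch below shows the arithmetic, with the outcome encoding and route count chosen purely for illustration.

```python
def success_rate(outcomes):
    """Fraction of evaluation routes completed safely; `outcomes` is a
    list of booleans, one per route (True = reached the destination)."""
    return sum(outcomes) / len(outcomes)

# For example, 23 safe arrivals out of 25 routes works out to 0.92.
print(success_rate([True] * 23 + [False] * 2))  # 0.92
```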

“While previous best methods required hours, we were surprised that our method could learn to drive safely with just 10 minutes of driving data,” Ohn-Bar says.

These results are promising, he says, but there are still several open challenges in dealing with intricate urban settings. “Accounting for drastically varying perspectives across the watched vehicles, noise and occlusion in sensor measurements, and various drivers is very difficult,” he says.

Looking ahead, the team says their method for teaching autonomous vehicles to self-drive could be used in other technologies, as well. “Delivery robots or even drones could all learn by watching other AI systems in their environment,” Ohn-Bar says.


More information:
Jimuyang Zhang et al., “Learning by Watching,” arXiv:2106.05966 [cs.CV], arxiv.org/abs/2106.05966

Provided by
Boston University

Citation:
Autonomous vehicles learning to drive by mimicking others (2021, July 30)
retrieved 30 July 2021
from https://techxplore.com/news/2021-07-autonomous-vehicles-mimicking.html

