
Providing real-time mapping for autonomous vehicles

Originally posted on The Horizons Tracker.

Autonomous vehicles are becoming ever more capable, but if they are to operate successfully and safely on our roads, they need extremely effective vehicle-to-vehicle communication.

Researchers from New York University have partnered with HERE HD Live Map to ensure vehicles have accurate, real-time information on the status of lanes, obstacles, hazards and speed limits.

The team is developing a deep-learning-based system that allows autonomous vehicles to navigate and respond to changing road conditions by pairing the data collected from onboard sensors with data from HERE HD Live Map.

High-definition mapping

The kind of maps used by autonomous vehicles are accurate to within 10–20 cm, and they must be updated in real time to remain accurate. The goal of the research partnership is to improve car-to-map precision to around 10 cm.

“Essentially, we want to be able to precisely match what the car sees with what’s in the cloud database. An incredibly precise ruler isn’t of much use if your vision is blurry,” the team say.

“Our work involves employing computer vision techniques to refine the vehicle’s ability to continually locate itself with respect to HERE’s cloud-based service,” they continue. “That requires real-time images of the street and surrounding objects derived from cameras, LiDAR [a laser-based range-finding technology], and other on-board sensors.”

This level of precision is crucial because the cars connected to the Live Map service are sending a constant stream of data on things such as road conditions, weather, traffic levels, speed limits and any obstacles they encounter on the road. This stream of data will allow the maps to be updated in near real-time.
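To make the localization idea concrete: continually matching what the car's sensors see against the cloud map can be framed as rigid registration of matched landmarks. The sketch below is purely illustrative, not NYU's or HERE's actual method; the function name and example data are hypothetical. It uses the classic Kabsch algorithm to recover the rotation and translation that best align observed landmark positions with their map counterparts in 2D:

```python
import numpy as np

def estimate_pose_offset(map_points, observed_points):
    """Estimate the rigid transform (rotation R, translation t) that best
    aligns observed landmark positions with their map counterparts, via
    the Kabsch algorithm (least-squares rigid registration).

    Both inputs are (N, 2) arrays of pre-matched 2D landmark coordinates.
    Returns (R, t) such that R @ observed_i + t ~= map_i.
    """
    # Center both point sets on their centroids.
    map_c = map_points.mean(axis=0)
    obs_c = observed_points.mean(axis=0)
    P = observed_points - obs_c
    Q = map_points - map_c

    # Cross-covariance and its SVD give the optimal rotation.
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)

    # Guard against a reflection (det = -1) in degenerate cases.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T

    # Translation maps the observed centroid onto the map centroid.
    t = map_c - R @ obs_c
    return R, t
```

In practice a system like the one described would match thousands of LiDAR and camera features per frame and iterate (as in ICP), but the core step, solving for the pose offset between "what the car sees" and "what's in the cloud database", has this least-squares form.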

It’s a fascinating partnership that is well worth keeping an eye on.

Article source: Providing real-time mapping for autonomous vehicles.

Header image caption: Edward K. Wong, an associate professor in the NYU Tandon Department of Computer Science and Engineering, and Yi Fang, a research assistant professor in the Department of Electrical and Computer Engineering and a faculty member at NYU Abu Dhabi, are developing a deep learning system that will allow self-driving cars to navigate, maneuver, and respond to changing road conditions by pairing data from onboard sensors with information on HERE HD Live Map, a cloud-based service for automated driving.

Header image source: Wong/Fang (open access).


Adi Gaskell

I'm an old school liberal with a love of self organizing systems. I hold a masters degree in IT, specializing in artificial intelligence and enjoy exploring the edge of organizational behavior. I specialize in finding the many great things that are happening in the world, and helping organizations apply these changes to their own environments. I also blog for some of the biggest sites in the industry, including Forbes, Social Business News, Social Media Today and Work.com, whilst also covering the latest trends in the social business world on my own website. I have also delivered talks on the subject for the likes of the NUJ, the Guardian, Stevenage Bioscience and CMI, whilst also appearing on shows such as BBC Radio 5 Live and Calgary Today.

