Uber and GM's Cruise Open Up Their Autonomous Car Visualization Software, a Rare Move in an Extremely Competitive Autonomous Driving World

By now, you have probably seen those top-down graphical renderings of a self-driving car gliding through a neon-colored world of yellow and purple boxes that represent other cars and pedestrians. These visuals translate the raw data generated by an autonomous vehicle's hardware and software stack into something more approachable, helping operators better understand how their cars "see" and interact with the world around them.

Now, two big players in autonomous cars, Uber and GM's Cruise, are putting their visualization software on the web and making it free for anyone to use. It is a rare step in the famously secretive world of autonomous driving, but one that could encourage developers to build a variety of interesting applications and ultimately help the entire industry.

Last week, in a Medium post, Cruise presented Worldview, its graphics library for building 2D and 3D scenes. "We hope Worldview will lower the barrier to entry into the powerful world of WebGL, giving web developers a simple, self-contained foundation to create more complex visualizations," the company said.

The library, Cruise says, "provides 2D and 3D cameras, mouse and keyboard movement controls, click interaction, and a suite of built-in drawing commands. Our engineers can now easily create custom visualizations without having to learn complex graphics APIs or write wrappers to make them work with React."
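For a sense of what that looks like in practice, here is a minimal sketch of a scene built with a Worldview-style React API, modeled on the examples Cruise has published for the library. The component names, import path, and marker shape below are assumptions and may differ from the current API.

```tsx
// Minimal sketch of a Worldview-style scene: one colored box plus coordinate axes.
// Names follow Cruise's published regl-worldview examples; treat them as
// assumptions rather than a definitive API reference.
import React from "react";
import Worldview, { Cubes, Axes } from "regl-worldview";

export function ObstacleScene() {
  // A single "detected object" rendered as a box, the way AV visualizers
  // typically represent other cars and pedestrians.
  const markers = [
    {
      pose: {
        position: { x: 5, y: 0, z: 0 },
        orientation: { x: 0, y: 0, z: 0, w: 1 },
      },
      scale: { x: 2, y: 1, z: 1 },
      color: { r: 1, g: 0.8, b: 0, a: 1 }, // yellow, like the boxes described above
    },
  ];

  return (
    <Worldview>
      <Cubes>{markers}</Cubes>
      <Axes />
    </Worldview>
  );
}
```

The appeal, as Cruise describes it, is that an engineer writes ordinary React components like this instead of raw WebGL, while the library handles cameras and mouse and keyboard interaction.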

Uber's new tool seems aimed more squarely at AV operators. The company's Autonomous Visualization System (AVS) is a "customizable web platform that allows developers of autonomous technology, big or small, to turn their vehicle data into an easily understandable visual representation of what the vehicle sees in the world," says Uber.
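Conceptually, a platform like AVS takes raw per-frame vehicle output (pose, detected objects, planned path) and converts it into a declarative, timestamped scene description that a browser-based viewer can render. The sketch below illustrates that idea only; the types and the `toSceneFrame` helper are hypothetical and are not Uber's actual AVS or XVIZ API.

```ts
// Hypothetical sketch: converting one frame of raw vehicle data into a
// declarative scene description a web viewer could render.
// All type and function names here are illustrative.

interface RawFrame {
  timestamp: number;                                  // seconds since start of the drive
  vehiclePose: { x: number; y: number; heading: number };
  detections: { id: string; kind: "car" | "pedestrian"; x: number; y: number }[];
  plannedPath: { x: number; y: number }[];
}

interface SceneFrame {
  timestamp: number;
  objects: { id: string; label: string; position: [number, number]; color: string }[];
  trajectory: [number, number][];
}

function toSceneFrame(raw: RawFrame): SceneFrame {
  return {
    timestamp: raw.timestamp,
    objects: raw.detections.map((d) => ({
      id: d.id,
      label: d.kind,
      position: [d.x, d.y] as [number, number],
      // Color-code object classes so operators can read the scene at a glance.
      color: d.kind === "pedestrian" ? "purple" : "yellow",
    })),
    trajectory: raw.plannedPath.map((p): [number, number] => [p.x, p.y]),
  };
}
```

The point of such a schema is that any vehicle stack, "big or small," can emit it, and a single web frontend can then render the result.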

"Customizable Web Platform"

As autonomous vehicles log more and more kilometers on public roads, it becomes increasingly important to isolate edge cases so operators can understand why their cars made certain decisions. A visualization system lets engineers extract and replay specific trigger intervals for closer inspection.
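As a rough illustration of that workflow, the snippet below pulls the frames surrounding a trigger event, such as a hard-braking timestamp, out of a recorded log so they can be replayed in a viewer. The function and types are hypothetical, not any vendor's real API.

```ts
// Hypothetical sketch: extract the frames around a trigger event
// (e.g. a hard brake or a disengagement) for closer replay and inspection.

interface LoggedFrame {
  timestamp: number; // seconds since start of the drive
  // ...whatever per-frame payload the visualizer consumes
}

function extractTriggerWindow<T extends LoggedFrame>(
  log: T[],
  triggerTime: number,
  secondsBefore = 5,
  secondsAfter = 5
): T[] {
  const start = triggerTime - secondsBefore;
  const end = triggerTime + secondsAfter;
  return log.filter((f) => f.timestamp >= start && f.timestamp <= end);
}

// Example: replay the 10 seconds around a hard-braking event at t = 312.4 s.
// const clip = extractTriggerWindow(driveLog, 312.4);
```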

Today, many AV operators rely on off-the-shelf visualization systems that were never designed for autonomous cars. They are often confined to bulky desktop machines and are difficult to navigate. Uber is now letting rival AV operators use its web-based visualization platform so they can avoid "learning complex graphics and data visualization techniques to deliver effective tooling solutions," the company says in a blog post.

"Being able to visually explore sensor data, predicted trajectories, tracked objects and status information such as acceleration and speed is invaluable for the triage process," said Drew Gray, head of technology at Travel, an autonomous company, in a statement provided by Uber. "At Travel, we use this information to make data-driven decisions about engineering priorities."

The move comes less than two months after Uber returned to public roads for the first time since one of its vehicles struck and killed a pedestrian in Tempe, Arizona, in March 2018. Uber's autonomous vehicles are back on the road in Pittsburgh, albeit in a much more limited fashion.