Sunday, December 22, 2024

Waabi’s genAI promises to do so much more than power self-driving trucks

For the last two decades, Raquel Urtasun, founder and CEO of autonomous trucking startup Waabi, has been developing AI systems that can reason as a human would. 

The AI pioneer served as chief scientist at Uber ATG before founding Waabi in 2021. The startup launched with an “AI-first approach” to speed up the commercial deployment of autonomous vehicles, starting with long-haul trucks.

“If you can build systems that can actually do that, then suddenly you need much less data,” Urtasun told TechCrunch. “You need much less computation. If you’re able to do the reasoning in an efficient manner, you don’t need to have fleets of vehicles deployed everywhere in the world.” 

Building an AV stack with AI that perceives the world as a human might and reacts in real time is something Tesla has been attempting to do with its vision-first approach to self-driving. The difference, aside from Waabi’s comfort with using lidar sensors, is that Tesla’s Full Self-Driving system uses “imitation learning” to learn how to drive. This requires Tesla to collect and analyze millions of videos of real-world driving situations that it uses to train its AI model. 
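For readers unfamiliar with the term, imitation learning at its simplest is behavior cloning: show a network what a human driver did in a given moment and train it to reproduce that output. The sketch below is only meant to illustrate the idea, with made-up data and a deliberately tiny network; it is not Tesla’s or Waabi’s code.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-ins for logged human driving: 64x64 RGB frames paired with (steer, throttle) labels.
frames = torch.rand(256, 3, 64, 64)
controls = torch.rand(256, 2)
loader = DataLoader(TensorDataset(frames, controls), batch_size=32, shuffle=True)

# A deliberately tiny convolutional policy that maps raw pixels straight to control outputs.
policy = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 13 * 13, 2),   # 13x13 is the feature map left after the two conv layers
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Behavior cloning: nudge the policy's output toward whatever the human driver actually did.
for epoch in range(3):
    for x, y in loader:
        loss = nn.functional.mse_loss(policy(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

The catch Urtasun points to is scale: this approach only gets better by feeding it ever more real-world examples.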

The Waabi Driver, on the other hand, has done most of its training, testing and validation using a closed-loop simulator called Waabi World that automatically builds digital twins of the world from data; performs real-time sensor simulation; manufactures scenarios to stress test the Waabi Driver; and teaches the Driver to learn from its mistakes without human intervention. 
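Waabi has not published how Waabi World works under the hood, but the closed-loop idea it describes can be pictured with a deliberately simple toy: generate a scenario, simulate a noisy sensor, let a “driver” act, and let failures alone drive the next update. Every name and number below is invented for illustration and is not Waabi’s code.

```python
import random

class ToyDriver:
    """A stand-in 'driver' with a single learnable behavior: when to start braking."""
    def __init__(self):
        self.brake_distance = 5.0          # start braking when an obstacle looks this close (m)

    def act(self, sensed_range):
        return "brake" if sensed_range < self.brake_distance else "cruise"

    def learn_from_failure(self):
        self.brake_distance += 1.0         # after a crash, brake earlier next time

driver = ToyDriver()
for episode in range(300):
    obstacle = random.uniform(30.0, 80.0)  # scenario generation: obstacle at a random distance
    position, speed = 0.0, 20.0            # the truck starts 0 m in, moving at 20 m/s
    while position < obstacle and speed > 0:
        sensed = (obstacle - position) + random.gauss(0.0, 0.5)  # simulated, noisy range sensor
        if driver.act(sensed) == "brake":
            speed -= 8.0 * 0.1             # brake at 8 m/s^2 over a 0.1 s timestep
        position += max(speed, 0.0) * 0.1
    if position >= obstacle:               # the scenario ended in a collision
        driver.learn_from_failure()        # the failure itself drives the update, no human labels

print(f"learned brake distance: {driver.brake_distance:.1f} m")
```

In this toy, the driver keeps crashing in simulation until it has taught itself a safe braking distance, which is the closed-loop pattern the company describes: no fleet of real trucks, and no human labeling the mistakes.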

In just four years, that simulator has helped Waabi launch commercial pilots (with a human driver in the front seat) in Texas, many of which are happening through a partnership with Uber Freight. Waabi World is also key to the startup reaching its planned fully driverless commercial launch in 2025.

But Waabi’s long-term mission is much grander than just trucks.

“This technology is extremely, extremely powerful,” said Urtasun, who spoke to TechCrunch via video interview, a whiteboard full of hieroglyphic-looking formulas behind her. “It has this amazing ability to generalize, it’s very flexible, and it’s very fast to develop. And it’s something that we can expand to do much more than trucking in the future … This could be robotaxis. This could be humanoids or warehouse robotics. This technology can solve any of those use cases.”

The promise of Waabi’s technology — which will first be used to scale autonomous trucking — has allowed the startup to close a $200 million Series B round led by existing investors Uber and Khosla Ventures. The round also drew strategic investors Nvidia, Volvo Group Venture Capital, Porsche Automobil Holding SE, Scania Invest and Ingka Investments, and it brings Waabi’s total funding to $283.5 million.

The size of the round, and the strength of its participants, is particularly noteworthy given the hits the AV industry has taken in recent years. In the trucking space alone, Embark Trucks shut down, Waymo decided to pause its autonomous freight business, and TuSimple closed its U.S. operations. Meanwhile, in the robotaxi space, Argo AI faced its own shutdown, Cruise lost its permits to operate in California following a major safety incident, Motional slashed nearly half its workforce, and regulators are actively investigating Waymo and Zoox.

“You build the strongest companies when you fundraise in moments that are actually difficult, and the AV industry in particular has seen a lot of setbacks,” Urtasun said. 

That said, AI-focused players in this second wave of autonomous vehicle startups have secured impressive capital raises this year. U.K.-based Wayve is also developing a self-learning, rather than rule-based, system for autonomous driving, and in May it closed a $1.05 billion Series C led by SoftBank Group. And Applied Intuition in March raised a $250 million round at a $6 billion valuation to bring AI to automotive, defense, construction and agriculture.

“In the context of AV 1.0, it’s very clear today that it’s very capital intensive and really slow to make progress,” Urtasun said, noting that the robotics and self-driving industry has been held back by complex and brittle AI systems. “And investors are, I would say, not very excited about that approach.”

What investors are excited about today, though, is the promise of generative AI, a term that wasn’t exactly in vogue when Waabi launched but nonetheless describes the system Urtasun and her team created. Urtasun says Waabi’s is a next-generation genAI, one that can be deployed in the physical world. And unlike the popular language-based genAI models of today, like OpenAI’s ChatGPT, Waabi has figured out how to create such systems without relying on huge datasets, large language models and all the compute power that comes with them.

The Waabi Driver, Urtasun says, has the remarkable ability to generalize. So rather than trying to train a system on every single possible data point that has ever or could ever exist, the system can learn from a few examples and handle the unknown in a safe manner.

“That was in the design. We built these systems that can perceive the world, create abstractions of the world, and then take those abstractions and reason about, ‘What might happen if I do this?’” Urtasun said.
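One way to picture that kind of reasoning is model-based action selection: imagine the outcome of a few candidate actions with a simple forward model of the world and keep only the ones predicted to stay safe. The sketch below is a generic illustration of that idea rather than Waabi’s architecture; all names, numbers and thresholds are made up.

```python
def predict_gap(gap, own_speed, lead_speed, accel, horizon=3.0, dt=0.1):
    """Roll an abstract state (the gap to the vehicle ahead) forward under one action."""
    for _ in range(int(round(horizon / dt))):
        own_speed = max(own_speed + accel * dt, 0.0)
        gap += (lead_speed - own_speed) * dt
    return gap

def choose_action(gap, own_speed, lead_speed):
    candidates = {"accelerate": 1.5, "hold": 0.0, "brake": -3.0}  # acceleration in m/s^2
    predicted = {name: predict_gap(gap, own_speed, lead_speed, a)
                 for name, a in candidates.items()}
    # Prefer the most assertive action whose imagined future still keeps at least a 10 m gap.
    for name in ("accelerate", "hold", "brake"):
        if predicted[name] >= 10.0:
            return name
    return "brake"

print(choose_action(gap=30.0, own_speed=25.0, lead_speed=20.0))  # prints "hold"
```

Here the “abstraction” is just a gap and two speeds, and the “reasoning” is a short imagined rollout of each option; the appeal of that pattern is that the same loop works whatever the abstraction happens to describe.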

This more human-like, reasoning-based approach is far more scalable and more capital efficient, Urtasun says. It’s also vital for validating safety-critical systems that run on the edge; you don’t want a system that takes a couple of seconds to react, she said, or you’ll crash the vehicle. Waabi announced a partnership to bring Nvidia’s Drive Thor to its self-driving trucks, which will give the startup access to automotive-grade compute power at scale.

On the road, this looks like the Waabi Driver understanding that there is something solid in front of it and that it should drive cautiously. It might not know what that something is, but it’ll know to avoid it. Urtasun also said the Driver has been able to predict how other road users will behave without needing to be trained in various specific instances. 

“It understands things without us telling the system about the concept of objects, how they move in the world, that different things move differently, that there is occlusion, there is uncertainty, how to behave when it’s raining a lot,” Urtasun said. “All these things, it learns automatically. And because it’s exposed right now to driving scenarios, it learns all those capabilities.”

Waabi’s foundational genAI model also doesn’t fall prey to the black-box or hallucination effect that’s prevalent among LLM-based genAI models today, Urtasun says. This is because the model that runs on the Waabi Driver is an interpretable end-to-end trainable system that can be validated and verified, and whose decisions can be traced.

Packing this ability into a single, streamlined architecture means it can be applied to other autonomy use cases, Urtasun says.

“If you expose it to interactions in a warehouse, picking up and dropping things, it can learn that, no problem,” she said. “You can expose it to multiple use cases, and it can learn to do all those skills together. There is no limit in terms of what it can do.”

This article was updated with information from Waabi on how its AI model avoids hallucination.
