Las Vegas: Nvidia chief executive Jensen Huang used his keynote at the CES technology conference to unveil Alpamayo, a new technology platform designed to help self-driving cars reason and behave more like human drivers.
According to Nvidia, Alpamayo allows autonomous vehicles to think through unusual and unpredictable situations, navigate crowded and complex streets, and clearly explain the decisions they make on the road. The platform is aimed at tackling the hardest problems in self-driving technology, especially rare scenarios that traditional systems struggle to handle.
During his keynote appearance, Huang also confirmed that Nvidia has begun producing a driverless version of the Mercedes-Benz CLA, developed in close partnership with the German carmaker. The vehicle is expected to launch in the United States in the coming months, followed by a wider rollout in Europe and Asia.
$NVDA CEO Jensen Huang just announced Alpamayo which he calls the world’s first thinking and reasoning model built for autonomous vehicles.
By open sourcing the Alpamayo stack, Nvidia is pushing self driving forward as a category after years of work by thousands of engineers.
— Shay Boloor (@StockSavvyShay) January 5, 2026
Huang told the audience that working directly on a production vehicle has given Nvidia deep insight into how advanced robotic systems should be built. The experience, he said, has strengthened the company’s ability to support carmakers and other partners moving into automation.
Industry analysts said the announcement underlined Nvidia’s growing influence beyond chipmaking. Paolo Pescatore of PP Foresight described the move as a major step in Nvidia’s evolution from a hardware supplier to a full platform provider for physical systems, helping it stay well ahead of competitors.
Nvidia showcased a video of the AI-powered Mercedes-Benz driving smoothly through the streets of San Francisco, with a passenger sitting calmly behind the wheel, hands resting in their lap. The company said the car drives naturally because it learned from human driving behaviour and continuously reasons through each action before taking it.
NVIDIA Alpamayo is the first open ecosystem for developing reasoning vision-language-action (VLA) models for autonomous vehicles.
Alpamayo 1, AlpaSim, and Physical AI Open Datasets help AVs perceive, reason, and act with human-like judgment, paving the way for level 4 autonomy.
— NVIDIA Newsroom (@nvidianewsroom) January 5, 2026
Alpamayo has been released as an open-source model, with its core code available on the Hugging Face platform. Nvidia said this will allow researchers and developers worldwide to freely access and retrain the system, accelerating progress across the autonomous vehicle industry.
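For developers curious to experiment, the short Python sketch below shows one generic way to pull an openly published model from Hugging Face for local inspection and later retraining. It is a sketch under assumptions: the repository ID "nvidia/alpamayo" is a placeholder, since the announcement does not name the exact listing, and the actual release may document a different loading path.

```python
# Minimal sketch of fetching an openly released model from Hugging Face for
# local experimentation. The repository ID below is a placeholder, not the
# confirmed name of the Alpamayo release; substitute the ID listed on
# Nvidia's Hugging Face organisation page.
from pathlib import Path

from huggingface_hub import snapshot_download

REPO_ID = "nvidia/alpamayo"  # hypothetical repository ID

# Download the full repository (config, weights, tokenizer/processor files)
# into the local Hugging Face cache and return its directory.
local_dir = snapshot_download(repo_id=REPO_ID)

# List the downloaded artefacts so they can be inspected before loading them
# with whatever framework the release documents.
for path in sorted(Path(local_dir).rglob("*")):
    if path.is_file():
        print(path.relative_to(local_dir))
```

Retraining or fine-tuning from that starting point would then follow whatever recipes and tooling Nvidia ships alongside the open weights.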
The announcement drew a swift reaction from Tesla chief Elon Musk, who said the hardest challenge in self-driving remains mastering rare edge cases. Nvidia, like Tesla, plans to launch a robotaxi service next year, although details about its partner and launch location have not yet been disclosed.
Nvidia also revealed that its next-generation Rubin AI chips are already in production and scheduled for release later this year. The new chips promise higher performance with lower energy use, potentially reducing the cost of developing advanced computing systems.