At CES 2026, NVIDIA once again demonstrated its technological prowess in the autonomous driving arena, unveiling a new suite of open-source AI models and simulation tools under the name “Alpamayo.” The initiative is designed to tackle one of the most stubborn challenges in autonomous vehicle development: the “long tail” of rare and complex driving scenarios. NVIDIA also confirmed that its full-stack autonomous driving software, DRIVE AV, will be officially deployed later this year in the all-new Mercedes-Benz CLA, while the DRIVE Hyperion ecosystem continues to expand through partnerships with leading suppliers to accelerate the real-world rollout of Level 4 autonomy.
The core objective of the Alpamayo open-source model family is to strengthen autonomous systems’ ability to handle infrequent yet highly complex situations—the essence of the long-tail problem. By endowing vehicles with enhanced perception, reasoning, and human-like decision-making capabilities, NVIDIA aims to deliver a substantial leap forward in safety.
Alpamayo consists of three key components:
- Alpamayo 1 Model: The industry’s first open-source reasoning-based Vision-Language-Action (VLA) model. Built on a 10-billion-parameter architecture, it can generate driving trajectories from visual inputs while simultaneously producing “reasoning traces” that explain the logic behind each driving decision.
- AlpaSim Simulator: A fully open-source, high-fidelity simulation framework supporting realistic sensor modeling and configurable traffic dynamics, enabling developers to rapidly validate and stress-test their models.
- Physical AI Open Dataset: A large-scale dataset comprising more than 1,700 hours of driving data across diverse geographies and road conditions, specifically curated to train and enhance models’ reasoning capabilities.
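The “reasoning trace” idea behind Alpamayo 1 can be illustrated with a toy sketch. The code below is not NVIDIA’s actual API — every name here is hypothetical — it only shows the general shape of a planner that outputs a trajectory together with a human-readable explanation of each decision:

```python
from dataclasses import dataclass, field

# Hypothetical types illustrating a VLA-style output: a planned
# trajectory paired with a reasoning trace. None of these names
# come from NVIDIA's Alpamayo release.

@dataclass
class Waypoint:
    x: float      # meters ahead of the ego vehicle
    y: float      # lateral offset in meters
    speed: float  # target speed in m/s

@dataclass
class DrivingDecision:
    trajectory: list                                      # planned waypoints
    reasoning_trace: list = field(default_factory=list)   # why each choice was made

def plan_around_obstacle(obstacle_ahead: bool) -> DrivingDecision:
    """Toy planner: shift left and slow down if an obstacle is detected."""
    decision = DrivingDecision(trajectory=[])
    if obstacle_ahead:
        decision.reasoning_trace.append(
            "Obstacle detected in ego lane; shifting left and reducing speed."
        )
        offsets = [(5.0, 0.0, 8.0), (10.0, 1.5, 6.0), (15.0, 1.5, 6.0)]
    else:
        decision.reasoning_trace.append("Lane clear; maintaining speed.")
        offsets = [(5.0, 0.0, 10.0), (10.0, 0.0, 10.0), (15.0, 0.0, 10.0)]
    decision.trajectory = [Waypoint(x, y, v) for x, y, v in offsets]
    return decision

decision = plan_around_obstacle(obstacle_ahead=True)
print(decision.reasoning_trace[0])
print(len(decision.trajectory))  # 3 waypoints
```

The point of pairing the two outputs is auditability: when the vehicle swerves, engineers (and regulators) can inspect *why*, not just *what*, the model decided.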
A growing number of industry players—including Jaguar Land Rover, Lucid, Uber, and academic research institutions such as Berkeley DeepDrive—have already announced plans to leverage Alpamayo to advance their Level 4 autonomous driving roadmaps. On the production vehicle front, NVIDIA revealed that its DRIVE AV software will debut on U.S. roads later this year, with the new Mercedes-Benz CLA serving as the launch vehicle.
The software will deliver enhanced Level 2 point-to-point driver assistance features, including urban navigation in complex environments, active collision avoidance, and automated parking. Its architecture adopts a “dual-stack” design: a core end-to-end AI stack responsible for primary driving tasks operates in parallel with a traditional safety stack built on NVIDIA’s Halos safety system, ensuring redundancy and robust fail-safe behavior.
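The dual-stack pattern — a learned planner whose output is checked by an independent rule-based safety layer — can be sketched in a few lines. This is a simplified illustration under assumed names, not NVIDIA’s DRIVE AV or Halos implementation:

```python
# Simplified dual-stack arbitration: the AI stack proposes a command,
# and an independent rule-based safety stack can clamp or veto it.
# Purely illustrative; names do not reflect NVIDIA's actual software.

MAX_SAFE_SPEED = 13.9  # m/s (~50 km/h), an assumed urban limit

def ai_stack_propose(lane_clear: bool) -> dict:
    """Stand-in for the end-to-end AI stack's proposed command."""
    return {"speed": 16.0 if lane_clear else 4.0, "brake": not lane_clear}

def safety_stack_check(command: dict) -> dict:
    """Independent safety layer: enforce hard limits on the proposal."""
    checked = dict(command)
    if checked["speed"] > MAX_SAFE_SPEED:
        # The safety stack overrides an unsafe proposal rather than
        # trusting the AI stack's output unconditionally.
        checked["speed"] = MAX_SAFE_SPEED
    return checked

final = safety_stack_check(ai_stack_propose(lane_clear=True))
print(final["speed"])  # clamped to 13.9
```

The value of the design is that the two stacks fail independently: even if the learned planner emits an out-of-bounds command, the deterministic layer bounds the worst-case behavior.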
To address more advanced autonomous driving requirements, NVIDIA also announced a significant expansion of the global DRIVE Hyperion ecosystem. Newly added partners include Sony, Bosch, Hesai, Magna, and ZF Group—spanning sensor manufacturers and top-tier automotive suppliers.
The DRIVE Hyperion platform is powered by dual DRIVE AGX Thor system-on-chips (SoCs) based on the Blackwell architecture, delivering over 2,000 TFLOPS (FP4) of real-time compute performance. All deployments are anchored in NVIDIA Halos, a comprehensive safety and cybersecurity framework designed to ensure that autonomous driving systems undergo rigorous validation and certification.