At CES 2026, Nvidia unveiled a sweeping robotics stack that makes its ambitions clear.
The company doesn’t just want to sell chips.
It wants to power how robots think, learn, and move.
Much like Android became the default system for smartphones, Nvidia is positioning itself as the default platform for general-purpose robotics.
## Why Robotics Is Shifting Out of the Cloud
For years, AI lived mostly in the cloud. That’s changing fast.
Robots now need to make decisions in real time. They need to see. Move. Adapt. All in the physical world. That shift is happening because sensors are cheaper, simulations are better, and AI models can now generalize across tasks.

Nvidia is building for that exact moment.
## A Full Stack for Physical AI
What Nvidia revealed isn’t a single product.
It’s an ecosystem.
At the center are open foundation models, all available on Hugging Face, designed to help robots reason and act across many environments.
### The New Models at a Glance
| Model | What It Does |
|---|---|
| Cosmos Transfer 2.5 | Generates synthetic data for robot training |
| Cosmos Predict 2.5 | Tests robot behavior in simulation |
| Cosmos Reason 2 | Helps robots see, understand, and plan |
| Isaac GR00T N1.6 | Controls humanoid robots’ full-body actions |
GR00T relies on Cosmos Reason as its “brain.”
That combo allows humanoid robots to move, balance, and manipulate objects at the same time.
In simple terms: robots can finally coordinate their whole body, not just one task at a time.
## Testing Robots Without Breaking Them
Training robots in the real world is expensive. It’s also risky. Drop a robot once, and the bill hurts.
To fix this, Nvidia introduced Isaac Lab-Arena, an open-source simulation framework hosted on GitHub. It lets teams test robots safely in virtual environments before touching real hardware.

Isaac Lab-Arena brings together tools, tasks, and shared benchmarks like LIBERO and RoboCasa. Before this, the industry lacked a common testing standard. Now, there’s one place to start.
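The workflow this enables — run a policy against a simulated task many times, score it against a benchmark, and only then touch real hardware — can be sketched in a few lines. To be clear, this is a hypothetical toy loop, not the Isaac Lab-Arena API: the environment, policy, and success criterion here are invented purely for illustration.

```python
import random

class ToyPickEnv:
    """Stand-in for a simulated manipulation task (hypothetical, not Isaac Lab-Arena)."""
    def __init__(self, seed: int):
        self.rng = random.Random(seed)

    def rollout(self, policy) -> bool:
        # One simulated episode: the policy reacts to a random object pose,
        # and we report whether the (toy) grasp succeeded.
        object_pose = self.rng.uniform(-1.0, 1.0)
        action = policy(object_pose)
        return abs(action - object_pose) < 0.1  # success = gripper close to object

def benchmark(policy, episodes: int = 100) -> float:
    """Success rate over many simulated episodes, as a shared benchmark would report."""
    env = ToyPickEnv(seed=0)
    return sum(env.rollout(policy) for _ in range(episodes)) / episodes

# A policy that tracks the object clears the gate; only then would you
# consider deploying it to a physical robot.
score = benchmark(lambda pose: pose)
print(f"sim success rate: {score:.0%}")
```

The point of the pattern is the cheap iteration loop: a failed episode in simulation costs nothing, while the article notes that dropping a real robot once makes the bill hurt.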
## The Glue Holding It Together
Nvidia also rolled out OSMO, an open-source command center that ties everything together. It manages data, training, and deployment across both desktop and cloud setups.
Think of it as mission control for robot development.
## Hardware Built for the Edge
All this software needs power. Nvidia delivered that too.
The company introduced the Jetson T4000, part of its Thor family. Built on the Blackwell architecture, it delivers:
- 1,200 teraflops of AI compute
- 64GB of memory
- Just 40–70 watts of power use
That makes it practical for robots operating outside data centers.
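Taking the quoted figures at face value, the efficiency envelope is easy to check. A quick back-of-envelope sketch, using only the spec numbers listed above (vendor-quoted peak AI TFLOPS, so treat it as an upper bound):

```python
# Back-of-envelope compute efficiency from the Jetson T4000 figures above.
TFLOPS = 1200       # peak AI compute, as quoted
POWER_W = (40, 70)  # stated power envelope, low and high end

for watts in POWER_W:
    print(f"{watts} W -> {TFLOPS / watts:.1f} TFLOPS per watt")
# 40 W -> 30.0 TFLOPS per watt
# 70 W -> 17.1 TFLOPS per watt
```

Roughly 17–30 peak TFLOPS per watt — a power budget a battery-driven mobile robot can actually carry, which is the whole argument for running inference outside the data center.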
## Making Robotics Easier to Enter
Nvidia is also betting on openness.
Its partnership with Hugging Face connects millions of developers across both platforms. Nvidia’s Isaac and GR00T tools now work inside Hugging Face’s LeRobot framework.
Even better, the open-source Reachy 2 humanoid robot now runs directly on Nvidia’s Jetson Thor chip. Developers can swap models without getting locked into one system.
Lower barriers mean more builders. More builders mean faster progress.
## Early Signs the Strategy Is Working
This isn’t theoretical. It’s already happening.
Robotics is now the fastest-growing category on Hugging Face. Nvidia’s models lead in downloads. And real companies are using the stack today.
That includes Boston Dynamics, Caterpillar, Franka Robotics, and NEURA Robotics.
## The Bigger Bet
Nvidia isn’t just chasing robotics growth. It’s shaping the foundation.
If robots become as common as smartphones, someone needs to power them underneath. Nvidia wants that role.
Not flashy.
Not consumer-facing.
But everywhere.
And if this strategy works, robotics may soon have its Android moment.

