NVIDIA GTC 2025 Report: Humanoid Robots on the Frontlines — The Futures Envisioned by 1X, Agility, Boston Dynamics, and Disney
NVIDIA's GPU Technology Conference (GTC), held recently in the heart of Silicon Valley, transcended the typical tech conference — it became a place to witness the future of technology becoming real. The hall buzzed with intense energy, and the evolution of humanoid robots — now approaching genuine integration into human society — drew the most attention. The human-shaped machines once confined to science fiction are now beginning to step into our living spaces and workplaces.
This article delivers a detailed on-site report from GTC, covering the leading companies racing to develop humanoid robots — 1X Technologies, Agility Robotics, Boston Dynamics, and entertainment giant Disney — including their latest developments, their visions for the future, and the technical challenges they face. From in-home assistance to warehouse efficiency to human-like emotional expression, humanoid robots are generating enormous expectations across diverse fields. The development race is intensifying, with each company blazing its own path to commercialization. We hope this article helps you grasp the contours of a society where robots coexist with humans, and the technology trends business leaders should be tracking.
- The Path to "Home Robots" — 1X Neo Gamma's Case for Mixed Autonomy and Staged Evolution
- Agility Robotics' Digit and Boston Dynamics' Atlas: Leaps in Safety and Learning Ability for Practical Use
- Disney's Pursuit of Human-Robot Co-Creation: The Optimal Balance of Autonomy and Control in Storytelling
- Summary
The Path to "Home Robots" — 1X Neo Gamma's Case for Mixed Autonomy and Staged Evolution
At the GTC exhibition hall, the booth that exuded the most distinctly domestic atmosphere was that of Norwegian robotics company 1X Technologies. In a space designed to resemble a living room, the company's humanoid robot "Neo Gamma" performed everyday tasks with natural-looking movements, drawing crowds of curious onlookers. Neo Gamma picked up a watering can to water plants, posed with visitors for photos, and vacuumed and cleaned the floor — presenting itself as if it were already a member of the household of the future. This demonstration went beyond a simple technology exhibit; it was a concrete invitation to imagine a future in which humanoid robots are woven seamlessly into our daily lives.
1X co-founder and CEO Bernt Børnich explained that the demonstration was realized not through full autonomy but through "mixed autonomy" — a combination of autonomous operation and remote control by human operators. According to Børnich, in the early stages when a robot like Neo Gamma is first introduced into a home, teleoperation — remote control — will play an important role. "In the first few days, a lot of the operation will be remote," he said. "What's important is getting the robot to a level where it succeeds at specific tasks enough times that the user recognizes it as genuinely useful." This reflects a pragmatic approach: rather than demanding perfect autonomy from day one, establish the robot's usefulness through human assistance, then gradually increase its autonomy from there.
The concept of "mixed autonomy" plays a central role in the robot's learning process as well. As a robot repeatedly executes specific new tasks and accumulates successful experiences, an autonomous improvement process begins. According to Børnich, the robot has the capacity to learn from its own failures and optimize its movements — meaning that after an initial "bootstrap" period through teleoperation, the robot grows smarter through experience in real environments. This is made possible through AI techniques like reinforcement learning and imitation learning, and is an essential element for robots to adapt to diverse home environments and unexpected situations.
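This bootstrap-then-learn loop can be sketched in a few lines of Python. Everything below is an illustrative assumption: the task names, the success threshold, and the majority-vote "policy" are hypothetical stand-ins, not 1X's actual stack.

```python
class MixedAutonomyAgent:
    """Toy sketch of a 'mixed autonomy' loop: bootstrap from
    teleoperated demonstrations, then act autonomously once the
    policy has enough successful experience on a given task.
    All names and thresholds here are illustrative assumptions."""

    def __init__(self, confidence_threshold=3):
        self.demos = {}       # task -> list of demonstrated actions
        self.successes = {}   # task -> count of successful episodes
        self.confidence_threshold = confidence_threshold

    def record_teleop(self, task, action, success):
        # A human operator drives the robot and labels the outcome.
        self.demos.setdefault(task, []).append(action)
        if success:
            self.successes[task] = self.successes.get(task, 0) + 1

    def is_autonomous(self, task):
        # Switch to autonomy only after enough demonstrated successes.
        return self.successes.get(task, 0) >= self.confidence_threshold

    def act(self, task):
        if self.is_autonomous(task):
            # Imitation-style policy: replay the most common demo action.
            actions = self.demos[task]
            return max(set(actions), key=actions.count), "autonomous"
        # Not confident yet: fall back to the human operator.
        return None, "request_teleop"


agent = MixedAutonomyAgent()
for _ in range(3):
    agent.record_teleop("water_plants", "tilt_can", success=True)
action, mode = agent.act("water_plants")   # now autonomous for this task
```

The key design point mirrored here is that autonomy is granted per task, not globally: a robot can be fully autonomous at watering plants while still requesting teleoperation for a task it has never succeeded at.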
Børnich also has a clear vision for 1X's ultimate goal. He emphasizes that what they are selling in 2025 is not the all-purpose home robot "Rosie" that many people dream of — the robotic maid from the animated series "The Jetsons." "What we're selling in 2025 is not Rosie herself. It's the journey to Rosie. If you want to be part of that journey — as I have since I was a kid — this is for you." This framing makes clear that realizing humanoid robots is not an overnight achievement but a long-term process of evolving alongside users.
On pricing, 1X has set an ambitious target: the Neo robot is expected to be priced "less than a car," representing a potential breakthrough on cost — one of the biggest hurdles to widespread adoption of home robots. The rollout to the first paying customers is scheduled to begin in the second half of 2026, signaling concrete momentum toward commercialization. 1X's approach is based on a steady yet ambitious strategy: advance through realistic steps while working toward a future home robot with high autonomy. Their model of "mixed autonomy" and staged evolution may offer important lessons for other humanoid robot developers as well.
Agility Robotics' Digit and Boston Dynamics' Atlas: Leaps in Safety and Learning Ability for Practical Use
Right next to 1X's booth, Agility Robotics' humanoid robot "Digit" was quietly carrying out its assigned work. Digit's task was a looping instruction: "Pick any three items from the shelf, place them in a basket, and return them to where they belong." Interestingly, the groceries on the shelf were items purchased by staff on-site during the GTC event itself. "The items on the shelves — we literally went to a local grocery store and bought some products we'd never seen before," a company representative explained. This suggests Digit has the capacity to respond flexibly to objects it has never encountered.
During the demonstration, Digit occasionally emitted distinctive "beep" and "boop" sounds. These were the robot narrating the processes and steps it was executing — and via a dedicated app, visitors could confirm the content in real time: "Moving to standoff position," "Placing item," and so on. Particularly memorable was the message "I missed, sad face," emitted when Digit failed to grip a basket — an indication that the robot is designed to communicate its status in ways humans can intuitively understand.
Digit is being deployed not just in demonstration settings like GTC, but in actual warehouse environments. Some facilities already run Digit for full shifts — it is, in a meaningful sense, a working robot. At this stage, however, it is limited for safety reasons to operation within enclosed, fenced-off spaces separate from human employees. This reflects the fact that the technology enabling humans and robots to collaborate safely in the same space is still developing. Agility Robotics aims to overcome this challenge and eventually allow humans and robots to work in much closer proximity.
A company spokesperson explained: "There are two parallel systems right now — one thinking about what to do to solve the task, and one thinking about what to do right now to be safe. And the safety-side system is definitely leveraging some of the AI techniques we're talking about here." Specifically, the company is developing "safe human detection" capabilities, which it plans to incorporate into its next-generation robot — potentially arriving within the next 18 months. Great anticipation surrounds the evolution of safety technology designed with human collaboration in mind.
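The "two parallel systems" idea can be sketched as a control loop in which a safety monitor always gets the last word over the task policy. The function names, the 1-meter threshold, and the "slow_stop" action below are hypothetical illustrations, not Agility's actual architecture.

```python
def task_policy(state):
    """What to do to solve the task (illustrative stand-in)."""
    return "move_arm_toward_shelf"

def safety_monitor(state):
    """What to do *right now* to be safe: override the task action
    when a human is detected too close (hypothetical 1 m threshold)."""
    if state["human_distance_m"] < 1.0:
        return "slow_stop"
    return None   # no override needed

def control_step(state):
    # The two systems run in parallel; safety always wins.
    override = safety_monitor(state)
    return override if override is not None else task_policy(state)


far = control_step({"human_distance_m": 3.0})   # task action proceeds
near = control_step({"human_distance_m": 0.5})  # safety overrides
```

The design choice worth noting is the strict priority ordering: the task policy never needs to reason about safety, because any action it proposes can be vetoed by the monitor before reaching the actuators.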
Boston Dynamics' Atlas, meanwhile, generated enormous buzz at the conference without any physical exhibit at all. On the morning of GTC, the company released a video showing its new electric Atlas performing stunning acrobatic movements, sparking widespread attention. Boston Dynamics CTO Aaron Saunders, speaking after a panel discussion, revealed that the movements displayed in the video were learned through a new method quite different from conventional approaches. The process works as follows:
- Human motion capture: Trained staff wearing motion capture suits perform the target motion (in the video, movements resembling handstands with body rotations).
- Input of reference data into a software pipeline: The motion capture data is fed as reference information into a dedicated software pipeline.
- Generation of a control policy via reinforcement learning: Within this pipeline, reinforcement learning (RL) algorithms use the motion capture data to generate a new control policy — a set of control rules — enabling the robot to stably execute the motion. This process typically completes overnight.
- Deployment and execution on the robot: The generated control policy is implemented on the robot and tested on the actual hardware. Remarkably, using this method, "from the first time you try it on the robot, it works at a fairly stable level."
Saunders explained: "What you saw in the video is an example of human motion capture. You take that motion capture reference data, put it into a software pipeline, use reinforcement learning to generate a new control policy, and deploy it to the robot. From a single motion capture, through overnight training, you can get to the point where the first time you try it on the robot, it's pretty stable."
He also acknowledged current limitations: "The caveat is these are still single policies — specialized policies. You saw policies for running, walking, doing a cartwheel." In other words, while specialized control policies for specific motions can be created, the next major step is acquiring "generalization" — the ability to handle diverse actions through a single flexible policy. "The next step is to make a generalist policy. So generalization will be a theme for us, and I think for the community, for some time." This challenge of generalization — enabling robots to respond flexibly to unpredictable real-world environments in a human-like way — will be the central theme in humanoid robot R&D going forward. Agility Robotics' focus on safety and Boston Dynamics' pursuit of advanced locomotion and learning represent distinct but complementary steps toward practical humanoid deployment.
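The mocap-to-policy pipeline described above can be caricatured with a toy loop that optimizes a tracking reward against a reference trajectory. This is a sketch under heavy assumptions: real training runs in physics simulation with modern RL algorithms, and Boston Dynamics' actual pipeline is not public. Here a simple random-search optimizer stands in for the RL step.

```python
import random

def mocap_reference():
    """Reference trajectory from a (hypothetical) motion capture take:
    one joint target per timestep for a single motion."""
    return [0.0, 0.5, 1.0, 0.5, 0.0]

def tracking_reward(pose, target):
    # Reward the policy for staying close to the mocap reference.
    return -abs(pose - target)

def train_policy(reference, iterations=2000, seed=0):
    """Toy stand-in for the 'overnight training' step: random search
    over per-timestep actions, keeping changes that improve tracking."""
    rng = random.Random(seed)
    policy = [0.0] * len(reference)   # one action per timestep
    best = sum(tracking_reward(p, t) for p, t in zip(policy, reference))
    for _ in range(iterations):
        i = rng.randrange(len(policy))
        candidate = policy[:]
        candidate[i] += rng.uniform(-0.2, 0.2)
        score = sum(tracking_reward(p, t) for p, t in zip(candidate, reference))
        if score > best:                 # keep only improvements
            policy, best = candidate, score
    return policy

# "Deployment": the learned policy now closely tracks the reference.
policy = train_policy(mocap_reference())
```

Note what the sketch preserves from Saunders' description: the mocap data serves only as a reward reference, and the output is a single specialized policy for that one motion — which is exactly the generalization limitation he points to.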
Disney's Pursuit of Human-Robot Co-Creation: The Optimal Balance of Autonomy and Control in Storytelling
At the NVIDIA GTC exhibition hall, a robot developed for an entirely different purpose — not industrial applications or home assistance, but "storytelling" — captured considerable attention. The "BDX Droid" exhibited by the Walt Disney Company charmed large crowds with its endearing appearance and lively movements. The droid is fundamentally operated by human operators via remote control. The operator creates the character's personality and performance, and through interaction with the audience, imbues the droid with a presence that feels genuinely alive.
Moritz Bächer, Associate Lab Director at Disney's Zurich robotics team, explained the operation process: "The operator is usually a creative director, but with more complex characters, it gets hard to control with just two joysticks and buttons." In other words, the richer the character's expression, the more pure manual control reaches its limits — and some degree of autonomy becomes necessary. "And clearly, more autonomy is needed," Bächer added.
What makes Disney's robotics development unique is that complete autonomy is not necessarily the final goal. What they're pursuing is finding the optimal balance between autonomy and human control — maximizing human creative expression while enhancing the character's authenticity and believability. Bächer used the BDX Droid's self-balancing capability as an example: "Whatever I do with the controls, they don't fall over. This provides a very nice interface — a level of autonomy where you don't have to worry about the robot's performance. In practice, you can concentrate on the creative aspects." By delegating fundamental movement stability to the robot's autonomous functions, operators are freed to focus on higher-order creative aspects like the character's expression and emotional delivery. This is an excellent example of technology functioning not to impede human creativity, but to expand it.
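The division of labor described here, autonomy for stability and the operator for performance, can be sketched as two control layers whose outputs are summed. The gains and signal names below are illustrative assumptions, not Disney's actual controller.

```python
def balance_controller(tilt):
    """Autonomous layer: corrective torque that keeps the droid
    upright no matter what the operator commands (illustrative gain)."""
    return -2.0 * tilt

def operator_input(joystick):
    """Creative layer: the operator shapes expression and motion."""
    return 0.5 * joystick

def droid_command(tilt, joystick):
    # Autonomy handles stability; the operator handles the performance.
    # The operator never needs to compensate for balance themselves.
    return balance_controller(tilt) + operator_input(joystick)


# Leaning slightly forward while the operator pushes the stick:
command = droid_command(tilt=0.1, joystick=1.0)
```

Because the balance term is always active, the operator's joystick input is free to be purely expressive: setting the stick to zero still yields a stabilizing command whenever the droid tilts.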
Bächer briefly touched on Disney's future robotics research in a subsequent talk, including references to humanoid robot development — though details were reserved for future announcements. Nevertheless, Disney's case powerfully demonstrates that robotic technology holds enormous potential not only for efficiency and physical task assistance, but also for entertainment and artistic expression.
As noted later in the panel discussion: "You might be tempted to think that more autonomy is always the goal, but Disney is a very interesting example of a use case where finding the right combination of autonomy and control to maximize human creative expression and character believability is more important." Humanoid robot development requires very different levels of autonomy and human involvement depending on the use case. Disney's approach suggests a future where technology, art, and human-machine collaboration converge to create new value — and reminds us that human-centered design thinking is critical across all domains of robot development. How Disney will use the humanoid robot platform to create compelling characters and experiences in the future is something to watch closely.
Summary
The humanoid robots on display at NVIDIA GTC 2025 made clear that robotic technology has entered a new phase. 1X Technologies' staged approach to home introduction through "mixed autonomy." Agility Robotics' progress toward practical deployment in warehouse settings and its work on safety. Boston Dynamics' demonstration of extraordinary motion capability acquired through reinforcement learning. And Disney's pursuit of human-robot co-creation in storytelling. These diverse approaches collectively suggest that humanoid robots are not single-purpose machines — they hold the potential to be applied across many different contexts of human society, adapted to each specific need.
Two common themes emerged from GTC: "acquiring generalization ability" and "safe coexistence and collaboration with humans." How to move from specialized performance on specific tasks to the general-purpose intelligence and motor skills needed to respond flexibly to novel situations and diverse instructions. And how to establish the safety and smooth interaction capabilities necessary for robots operating in human spaces. These are the critical keys to humanoid robots moving beyond labs and demonstrations and achieving true social deployment. Advances in AI — particularly reinforcement learning and simulation technology — are unquestionably accelerating progress on these challenges.
The vectors of robot development extend beyond purely functional pursuits as well. The references in the panel discussions to "a beautiful future where humans and machines can communicate better through abstract expressions of form and color," the focus on "human-scale technology," and the framing of robots as "truly augmenting and upskilling what can be done at a facility" rather than directly replacing workers — these perspectives suggest the need for deeper reflection on the impact robots will have on human society. The question of what happens when robots that mimic human size, shape, and strength acquire even greater capabilities — such as the ability to lift a ton — points to the need for discussion that encompasses ethical dimensions alongside technical possibilities.
NVIDIA GTC 2025 was a precious opportunity to glimpse the present state of humanoid robot development and the future it opens up. As each company continues pushing through technical challenges in pursuit of its own vision, robots will grow smarter, safer, and more attuned to human society. We must continue to watch the arc of that evolution and the societal changes it brings. The day when humanoid robots become our neighbors and colleagues may not be as distant as we once imagined.
Reference: https://www.youtube.com/watch?v=Kbh-K6zrjtk
TIMEWELL AI Adoption Support
TIMEWELL is a professional team supporting business transformation in the AI agent era.
Services
- ZEROCK: High-security AI agent running on domestic servers
- TIMEWELL Base: AI-native event management platform
- WARP: AI utilization and workforce development program
In 2026, AI is evolving from a tool you use to a colleague you work with. Let's think through your company's AI strategy together.
