Robots of tomorrow will be bartenders, surgical assistants and puppies.
Humanoid robots are no longer just science fiction. Imagine a future where robots collaborate with us not only in factories but also in stores, during surgeries and even in caring for our loved ones. That age is closer than you think: Tesla plans to deploy thousands of Optimus robots by 2026, and the vision is becoming more tangible as companies demonstrate groundbreaking innovations. The 2025 Consumer Electronics Show showcased a number of examples of how far robotics has advanced in both functionality and design. ADAM, a robot bartender from Richtech Robotics, mixes more than 50 different drinks and interacts directly with customers. Tombot Inc. has created robotic puppies with wagging tails and soothing sounds designed for older adults living with dementia. While there may be a market for these and other robots on display at the show, it is still early days for broad deployment of this type of robotic technology.
Nevertheless, real technological progress is being made in the field, including a growing number of humanoid robots that use generative AI to develop human-like capabilities. They can now learn, sense and act within complex environments. From Tesla's Optimus to Realbotix's Aria, humanoid robots are on track to become a common sight in the coming decade.
A conversation with “Aria.” Source: CNET
Experts caution that, despite these impressive advances, human-like abilities are still far off. Citing shortcomings in current technology, Yann LeCun, one of the so-called godfathers of AI, argued recently that AI systems do not “have the capacity to plan, reason … or understand the physical world.” He added that the reason we do not yet have such robots is that “we can’t get them to be smart enough.”
LeCun may be right, but that does not mean we will not soon see many more humanoid robots. Elon Musk said recently that Tesla would produce several thousand Optimus units in 2025 and expected to ship 50,000 to 100,000 of them by 2026. That is a huge increase over the few units currently in operation, which perform circumscribed functions. Musk is known for overly optimistic timelines; in 2016, he predicted that autonomous driving would be possible within two years.
Nevertheless, it seems clear that significant advances are being made in humanoid robots, and Tesla is not the only company pursuing them. Agility Robotics and Boston Dynamics are also leaders in the field. Agility Robotics CEO Peggy Johnson recently told Business Insider that it will soon be “very common” for humanoid robots to work alongside humans in a wide range of workplaces. Last month, Figure announced in a LinkedIn post: “We delivered F.02 humanoid robots to our commercial client, and they’re currently hard at work.” With significant backing from major investors including Microsoft and Nvidia, Figure will provide fierce competition in the humanoid robot market.
Figure 02 humanoid robots at work in a BMW factory. Source: YouTube
Creating an overall view
LeCun has a point: Considerable progress is still needed before robots can match human capabilities. Moving parts around a factory is far simpler than navigating dynamic, complex environments.
The current generation of robots faces three key challenges: processing visual information quickly enough to react in real time; understanding the subtle cues in human behavior; and adapting to unexpected changes in the environment. Most humanoid robots also depend on cloud computing, and the resulting network latency can make even simple tasks, such as picking up an item, difficult. Fei-Fei Li, often called the “godmother of AI,” founded World Labs to help overcome these limitations. Li told Wired: “The computer brain and cameras are the eyes that see the physical world. Understanding the physical structure and dynamics of the world is necessary to turn that vision into reasoning, generation, and eventually interaction. This technology is called spatial intelligence.”
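To see why latency matters so much, consider a toy timing-budget calculation. The loop rate and round-trip figures below are illustrative assumptions, not measurements of any particular robot or cloud service; the point is simply how quickly a cloud round trip eats a reactive controller's budget.

```python
# Toy latency-budget sketch; all numbers are illustrative assumptions.

CONTROL_LOOP_HZ = 50          # assumed rate for a reactive grasp controller
CLOUD_ROUND_TRIP_MS = 120     # assumed network + remote inference round trip
ONBOARD_INFERENCE_MS = 8      # assumed on-device perception latency

cycle_ms = 1000 / CONTROL_LOOP_HZ   # 20 ms per control cycle

def stale_cycles(latency_ms: float) -> float:
    """How many control cycles old the perception is when a command lands."""
    return latency_ms / cycle_ms

print(f"Control cycle: {cycle_ms:.0f} ms")
print(f"Cloud perception is ~{stale_cycles(CLOUD_ROUND_TRIP_MS):.1f} cycles stale")
print(f"Onboard perception is ~{stale_cycles(ONBOARD_INFERENCE_MS):.1f} cycles stale")
# With these assumed numbers, a cloud-dependent robot acts on a scene that is
# roughly six control cycles out of date -- long enough for a small object to
# shift -- while an onboard model stays within a single cycle.
```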
GenAI powers spatial intelligence by helping robots map their environment in real time and predict how objects may move or change. Such advances are crucial for creating autonomous humanoid robots that can navigate complex, real-world scenarios with the adaptability and decision-making needed to succeed.
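As a loose illustration of the "predict how objects may move" idea, the sketch below extrapolates the position of a hypothetical tracked object under a constant-velocity assumption. This is not World Labs' method or any production system; real spatial-intelligence models learn far richer dynamics, but the sketch shows why prediction, not just mapping, is what lets a robot plan for where an object will be.

```python
# Minimal constant-velocity prediction over a tracked object (illustrative only).
from dataclasses import dataclass

@dataclass
class Track:
    x: float          # last observed position, meters
    y: float
    vx: float         # estimated velocity, meters/second
    vy: float

def predict(track: Track, dt: float) -> tuple[float, float]:
    """Where we expect the object to be dt seconds from now."""
    return track.x + track.vx * dt, track.y + track.vy * dt

# A hypothetical cup sliding across a table at 0.2 m/s.
cup = Track(x=1.0, y=0.5, vx=0.2, vy=0.0)
print(predict(cup, dt=0.5))   # -> (1.1, 0.5): plan the grasp for where it will be
```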
While spatial intelligence relies on real-time data to build mental maps of the environment, another approach is to help a humanoid robot infer the wider world from a single still image. In a recently published paper, researchers describe the Generative World Explorer (GenEx), which uses AI to create an entire virtual world from one still image, mimicking how humans infer information about their surroundings. This capability, still at the research stage, could allow robots to make split-second decisions and navigate unfamiliar environments using limited sensor data.

The ChatGPT moment is coming for robotics
World Labs and GenEx push the limits of AI reasoning, while Nvidia's Cosmos and GR00T aim to give humanoid robots real-world adaptability and interactivity. Cosmos is a set of AI models that helps robots learn physics and spatial relationships, and GR00T (Generalist Robot 00 Technology) allows robots to learn by observing humans. Together, these technologies help robots understand both what to do and how to do it naturally.
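Learning by observation is commonly framed as imitation learning. The sketch below is not GR00T's actual training pipeline; it is a minimal behavior-cloning example on synthetic data, showing the basic idea of fitting a policy to recorded (observation, action) pairs from a demonstrator and then reusing it on new observations.

```python
# Minimal behavior cloning: learn a policy from demonstrations (illustrative only).
# Real systems such as GR00T train large neural policies on video and
# teleoperation data, not a least-squares fit on toy vectors.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "demonstrations": observations (e.g., object pose relative to the
# hand) and the actions a human demonstrator took in response.
observations = rng.normal(size=(500, 4))
true_policy = np.array([[0.5, -0.2], [0.1, 0.3], [-0.4, 0.8], [0.2, 0.0]])
actions = observations @ true_policy + rng.normal(scale=0.01, size=(500, 2))

# "Training": least-squares fit of action = observation @ W.
W, *_ = np.linalg.lstsq(observations, actions, rcond=None)

# "Deployment": the learned policy maps a new observation to an action.
new_obs = rng.normal(size=(1, 4))
print("predicted action:", new_obs @ W)
```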
These innovations reflect a broader push in the robotics industry to equip humanoid robots with both cognitive and physical adaptability. GR00T, for example, could allow humanoid robots to assist in healthcare by observing medical professionals and mimicking them, while GenEx may enable robots to navigate disaster zones using limited visual input. Nvidia CEO Jensen Huang, as reported by Investor's Business Daily, said: “The ChatGPT for robotics moment is coming.” Timothy Brooks, a researcher at Google DeepMind, wrote this month about plans to build large-scale generative simulations of the physical world. These physical world models can be used to better plan, predict and learn from experience, all essential capabilities for future humanoid robots.

Google is creating world simulation models. Source: X.com
The robots are coming, and accelerating progress in AI and mechanical engineering is speeding their development. They will initially focus on tasks such as manufacturing, logistics and disaster response, where automation offers immediate benefit; as the technology matures, broader applications such as retail and caregiving will become possible.
Consulting firm Accenture recently took note of the developing full stack of robotics hardware, software and AI models purpose-built for creating machine autonomy in the human world. In its “2025 Technology Vision” report, the company states: “Over the next decade, we will start to see robots casually and commonly interacting with people, reasoning their way through unplanned tasks, and independently taking actions in any kind of environment.”
A timeline of past and estimated future robot adoption. Source: Accenture Technology Vision 2025
Wall Street firm Morgan Stanley has estimated that there will be eight million humanoid robots in the U.S. by 2040, rising to 63 million by 2050. In addition to technological advances, the firm said, long-term demographic shifts that create labor shortages may help drive their development and adoption.
Building trustworthy robots
Beyond the purely technical obstacles, potential societal objections must be overcome. Public skepticism may slow the adoption of robots that look like humans, even when they offer clear benefits. Humanoid robots must be seen as trustworthy, and the public must believe they are helping society. As MIT Technology Review noted, “few would feel warm or comfortable with such a robot if it walked right into their living room.”
To overcome trust challenges, researchers are exploring ways to make robots seem more relatable. Engineers in Japan, for example, have developed a mask made from human skin cells that can be attached to robots. A study reported in The New York Times last summer stated that “human-like expressions and faces improve communication and empathy between humans and robots, making robots better in roles such as health care, service, and companionship.”
To be accepted by humans, robots that look like people will also need to behave consistently, ethically and responsibly. Humanoid robots equipped with cameras in public places could unintentionally collect sensitive data such as faces or conversations, raising questions about surveillance.

The next decade
In the near term, humanoid robots will concentrate on specific tasks such as manufacturing, logistics and disaster response, where automation offers immediate benefit. As the technology advances, more specialized roles will emerge that play to the strengths of humanoid robots in structured environments. These machines will not only perform tasks but also integrate into society, requiring humans to learn new ways of interacting with technology. They could help ease labor shortages and increase efficiency in the service sector, but they may also spark debates over job displacement, privacy and human identity as we move into an increasingly automated society. Preparing for these shifts will demand not just technological progress but thoughtful societal adaptation.
By addressing these challenges and leveraging the efficiency and adaptability of humanoid robots, we can ensure these technologies serve as tools for progress. It is not just policymakers and technologists who are responsible for shaping this future; everyone has a role to play.