
OpenAI Accelerates Humanoid Robotics Effort, Hiring Experts to Train AI Through Teleoperation and Simulation

DATE: 9/15/2025 · STATUS: LIVE

OpenAI’s secretive robotics team is training humanoid controllers by blending simulation and teleoperation; the results could change how robots move naturally.


OpenAI, the company behind ChatGPT, is building a team to develop algorithms that control robots and appears to be hiring roboticists who focus on humanoid platforms. The moves point to a growing emphasis on bringing advanced AI into physical systems, with job listings and recent hires offering a glimpse of the effort.

People familiar with the work say the company has recruited multiple researchers with backgrounds in AI control for humanoid and other robot forms. The openings indicate the group aims to produce systems that can be trained through teleoperation and simulation, a common route for teaching robots to act in the real world.

Sources with knowledge of the project add that OpenAI is targeting hires who have worked on humanoid systems—machines with partial or full human form. One person active in cutting-edge robotics says the firm has started training algorithms that make better sense of physics and spatial relationships, which could give robots improved ability to move and carry out tasks.

Recent hires suggest the effort is accelerating. Chengshu Li joined OpenAI in June 2025 from Stanford University, where he worked on several robotics projects, including a benchmark meant to measure how well humanoid robots can handle a wide variety of household chores. Li’s dissertation centers on benchmark development and focuses on robots that are partly humanoid, with two arms but wheels instead of legs.

LinkedIn profiles show two other researchers from a different robotics lab have already moved to the company. A professor at a separate lab that does humanoid work says one of their students was recruited recently as well.

OpenAI declined to comment on its recruitment push or robot research plans, but the company has posted a number of revealing robotics job listings on its careers site. One role asks for expertise in teleoperation and simulation. Teleoperation is a critical tool for training partially or fully humanoid robots: a human operator performs chores while controlling the robot’s limbs, and an algorithm learns to mimic those actions. The position requests familiarity with simulation tools such as Nvidia Isaac, which is widely used to teach humanoids by running algorithms inside virtual physical environments.
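The learning-from-teleoperation recipe described above is commonly implemented as behavior cloning: logged (robot state, operator action) pairs become a supervised dataset, and a policy is fit to reproduce the operator's actions. The sketch below is a deliberately minimal, hypothetical illustration of that idea; the joint dimensions, synthetic "operator" data, and linear least-squares policy are all assumptions for clarity, as real systems use deep networks and far richer observations.

```python
import numpy as np

# Minimal behavior-cloning sketch (hypothetical): fit a policy that maps
# robot state to the action a teleoperator took in that state.
rng = np.random.default_rng(0)

# Simulated teleoperation log: 7 joint angles in, 7 joint torques out.
obs = rng.normal(size=(500, 7))              # states the operator saw
operator_map = rng.normal(size=(7, 7))       # stand-in for operator behavior
actions = obs @ operator_map + 0.01 * rng.normal(size=(500, 7))  # noisy demos

# "Training" = solve for policy weights W so that obs @ W approximates actions.
W, *_ = np.linalg.lstsq(obs, actions, rcond=None)

# The learned policy can now imitate the operator on a new state.
new_state = rng.normal(size=(1, 7))
predicted_action = new_state @ W
```

With enough demonstrations the fitted policy recovers the operator's mapping up to the noise in the log; a neural network plays the role of `W` in practice, and simulators like Nvidia Isaac supply the states cheaply.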

It remains unclear whether OpenAI plans to build its own robots, buy off-the-shelf hardware, or partner with an existing robotics firm. A recent posting for a mechanical engineer asks for skills in prototyping and building robot systems equipped with sensors for touch and motion. One roboticist says that could mean OpenAI intends to develop its own chassis or is building teleoperation hardware and interfaces for training. The job listing also asks for “experience designing mechanical systems intended for high volume (1M+), problem-solving on assembly lines,” language that points to designs that could be mass-produced or placed into industrial settings.

Every one of the robot-related listings repeats that the robotics team “is focused on unlocking general-purpose robotics and pushing towards AGI-level intelligence in dynamic, real-world settings.”

A shift toward robots would signal that OpenAI sees progress toward artificial general intelligence (AGI)—AI that can match or exceed human capabilities—as tied to systems that interact with the physical world. The argument is that models need to be trained against the messiness of sensors, dynamics, friction, and the unpredictable conditions of real environments to reach broader competence.

Stefanie Tellex, a roboticist at Brown University, says building more capable robots will require models that can handle sensory and motor demands at high temporal and dimensional scales. She puts it this way: “processing high-frame-rate, high-dimensional perceptual input, and producing high-frame-rate, high-dimensional physical outputs.” Tellex says she has not seen OpenAI’s internal plans.

Even with OpenAI’s industry-leading advances in conversation, reasoning, coding, and image and video generation, the company will compete with a field that is already investing heavily in humanoid hardware and control software. A handful of startups focused on humanoids—Figure, Agility, and Apptronik—have emerged in recent years, and larger companies such as Tesla and Google are running their own efforts. “I don’t see them having any magical advantage over anyone else,” says Tellex.

Humanoid designs have become more feasible as motors, power systems, and other components improve and as development platforms mature. Recent advances in actuation and modular components have lowered the cost of working prototypes, though humanoid robots remain expensive and complex to build. Software tools such as Nvidia’s Isaac robot development platform have lowered the barrier to writing control and training code for legged, bipedal, and arm-equipped systems.

Investor interest has followed. Venture capitalists have put more than $5 billion into humanoid startups since the start of 2024. Morgan Stanley projects that the broader humanoid sector could be worth $5 trillion by 2050.

Humanoids can already perform striking demonstrations, from choreographed motion to object manipulation in constrained setups, yet they lack the general intelligence to work in complex, unstructured environments. To reach that level, systems must move beyond language-model-style reasoning and gain direct control of limbs, grippers, and locomotion so they can walk, grasp, and manipulate a wide variety of items. Research groups are beginning to show progress on AI models that generalize more broadly across tasks for robots.

At the same time, awareness has grown that fresh research directions may be needed to advance AI toward humanlike capabilities. The disappointment around OpenAI’s GPT-5 release fed a wider reassessment of the limits of current approaches. “They've asymptoted on GPT-5,” says Tellex.
