
OpenAI-backed Eve humanoids make impressive progress in autonomous work

1X robots might be physically undergunned compared to the competition, but they're doing useful work fully autonomously
1X

"The video contains no teleoperation," says Norwegian humanoid robot maker 1X. "No computer graphics, no cuts, no video speedups, no scripted trajectory playback. It's all controlled via neural networks, all autonomous, all 1X speed."

This is the humanoid manufacturer that OpenAI put its chips behind last year, as part of a US$25-million Series A funding round. A subsequent $100-million Series B showed how much sway OpenAI's attention is worth – as well as the overall excitement around general-purpose humanoid robot workers, a concept that's always seemed far off in the future, but that's gone absolutely thermonuclear in the last two years.

1X's humanoids look oddly undergunned next to what, say, Tesla, Figure, Sanctuary or Agility are working on. The Eve humanoid doesn't even have feet at this point, or dextrous humanoid hands. It rolls about on a pair of powered wheels, balancing on a third little castor wheel at the back, and its hands are rudimentary claws. It looks like it's dressed for a spot of luge, and has a dinky, blinky LED smiley face that gives the impression it's going to start asking for food and cuddles like a Tamagotchi.

A companion cube, eh? Autonomous sorting.
1X

1X does have a bipedal version called Neo in the works, which also has nicely articulated-looking hands – but perhaps these bits aren't super important in these early frontier days of general-purpose robots. The vast majority of early use cases would appear to go like this: "pick that thing up, and put it over there" – you hardly need piano-capable fingers to do that. And the main place they'll be deployed is in flat, concrete-floored warehouses and factories, where they probably won't need to walk up stairs or step over anything.

What's more, plenty of groups have solved bipedal walking and beautiful hand hardware. That's not the main hurdle. The main hurdle is getting these machines to learn tasks quickly and then go and execute them autonomously, like Toyota is doing with desk-mounted robot arms. When the Figure 01 "figured" out how to work a coffee machine by itself, it was a big deal. When Tesla's Optimus folded a shirt on video, and it turned out to be under the control of a human teleoperator, it was far less impressive.

In that context, check out this video from 1X.

The above tasks aren't massively complex or sexy; there's no shirt-folding or coffee machine operating. But there's a whole stack of complete-looking robots, doing a whole stack of picking things up and putting things down. They grab 'em from ankle height and waist height. They stick 'em in boxes, bins and trays. They pick up toys off the floor and tidy 'em away.

They also open doors for themselves, and pop over to charging stations and plug themselves in, using what looks like a needlessly complex squatting maneuver to get the plug in down near their ankles.

In short, these jiggers are doing pretty much exactly what they need to do in early general-purpose humanoid use cases, trained, according to 1X, "purely end-to-end from data." Essentially, the company trained 30 Eve bots on a number of individual tasks each, apparently using imitation learning via video and teleoperation. It then used these learned behaviors to train a "base model" capable of a broad set of actions and behaviors. That base model was then fine-tuned toward environment-specific capabilities – warehouse tasks, general door manipulation, etc – and finally trained on the specific jobs each robot had to do.
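For a rough sense of what "end-to-end from data" training can look like in practice, here's a minimal behavioral-cloning sketch in PyTorch. It's purely illustrative – 1X hasn't published its architecture, so the model shape, input sizes and the VisuomotorPolicy / behavior_cloning_step names are all assumptions – but it captures the basic idea of regressing a teleoperator's recorded actions directly from camera and joint-state observations.

```python
import torch
import torch.nn as nn


class VisuomotorPolicy(nn.Module):
    """Hypothetical end-to-end policy: one RGB frame plus proprioception in, motor commands out."""

    def __init__(self, action_dim: int = 20, proprio_dim: int = 32):
        super().__init__()
        # Small convolutional encoder for a 3 x 96 x 96 camera frame (sizes are placeholders)
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fuse visual features with joint/base state and regress continuous actions
        self.head = nn.Sequential(
            nn.Linear(64 + proprio_dim, 256), nn.ReLU(),
            nn.Linear(256, action_dim),
        )

    def forward(self, image: torch.Tensor, proprio: torch.Tensor) -> torch.Tensor:
        features = self.encoder(image)
        return self.head(torch.cat([features, proprio], dim=-1))


def behavior_cloning_step(policy, optimizer, batch):
    """One supervised step: regress the teleoperator's recorded action from the observation."""
    predicted = policy(batch["image"], batch["proprio"])
    loss = nn.functional.mse_loss(predicted, batch["action"])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Train something like this across enough tasks, robots and demonstrations, and you end up with something resembling the broad "base model" 1X describes.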

This last step is presumably the one that'll happen on site at customer locations as the bots are given their daily tasks, and 1X says it takes "just a few minutes of data collection and training on a desktop GPU." Presumably, in an ideal world, this'll mean somebody stands there in a VR helmet and does the job for a bit, and then deep learning software will marry that task up with the bot's key abilities, run it through a few thousand times in simulation to test various random factors and outcomes, and then the bots will be good to go.
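Continuing the hypothetical sketch above, that on-site step might amount to a quick fine-tune of the base policy on a few minutes of fresh demonstrations – again, the frozen-encoder approach and the finetune_on_task helper are assumptions for illustration, not 1X's actual pipeline.

```python
# Hypothetical few-minute fine-tune, reusing the VisuomotorPolicy sketch above:
# keep the broadly trained visual features, adapt only the action head to the new task.
def finetune_on_task(policy, demos, epochs: int = 20, lr: float = 1e-4):
    for param in policy.encoder.parameters():
        param.requires_grad = False  # freeze the base model's encoder
    optimizer = torch.optim.Adam(policy.head.parameters(), lr=lr)
    for _ in range(epochs):
        for batch in demos:  # demos: small list of {"image", "proprio", "action"} tensor dicts
            behavior_cloning_step(policy, optimizer, batch)
    return policy
```

Freezing most of the network is one common way to make a fine-tune like this cheap enough for a single desktop GPU; whatever 1X actually does under the hood, the appeal is the same – minutes of demonstration rather than weeks of retraining.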

"Over the last year," writes Eric Jang, 1X's VP of AI, in a blog post, "we’ve built out a data engine for solving general-purpose mobile manipulation tasks in a completely end-to-end manner. We’ve convinced ourselves that it works, so now we're hiring AI researchers in the SF Bay Area to scale it up to 10x as many robots and teleoperators."

Pretty neat stuff. We wonder when these things will be ready for prime time.

Source: 1X

Comments
Babaghan
All the workers are happy, no one on Facebook, quietly beavering away at their tasks. My boss would be thrilled with this kind of office.
Daishi
I think they were smart to go with wheeled robots and focus their effort on interacting with the environment instead of doing mobility on hard mode. Biped robots are an expensive and time-consuming distraction if your intent is accomplishing more than just making YouTube videos.
Tommo
The tech is advancing at an amazing rate and this really is impressive.

HOWEVER .... This scares the bejeezus out of me though - I have vivid thoughts of being totally alone with about twenty of them in joined rooms similar to the first video, then the lights go off and their lit faces turn from nicey nicey smiley types to angry psychopathic human hunter types...
Ric
Au contraire, I think these guys ARE sexy. I especially like the gray one. Teach him to water the plants and take him ;)
veryken
Yeah but why "complete-looking"? Best to put autonomous-thinking robots to work on factory floors and trash-sorting conveyor lines where they don't even need heads or legs. Just plop some cameras and sensors onto flexible arms and let them figure it out. The jobs that people don't want also don't need human appearance.
CDE
I would like to see robots that can pick up trash alongside our roadways. If they were hit by a vehicle, it would be expensive, but it would keep someone from being killed or injured. Create them to clean up our planet since many people have unfortunately forgotten how to clean up after themselves.
epochdesign
Besides the creepy fake smiley faces, the one tiny detail that made me cringe regarding the stark reality of the non-humanity here is the dead houseplant in the office. They don't need air and they don't need life. Totally agree with Veryken, why aren't they designing robots to do the work that no one wants to do? Robots can work outside sorting recycling more effectively than people and they don't need legs, heads, batteries or fake smiley faces.
anthony88
Is a smiling robot happy?
Gregg Eshelman
Picking up kids' toys isn't a good thing for a robot to do. The kids should learn to pick up their own toys. If you want a robot to pick up the toys, have it put them in a bin out of reach of the child. Then the child must ask a parent for the toys and the parent can tell the kid the robot took the toys because they weren't put away when the kid was done playing. That should teach kids that if they want anytime access to their toys they have to keep their room neat.
cgroh
"No computer graphics, no cuts, no video speedups, no scripted trajectory playback".
It must have had some sort of "choreography" since there are too many "coincidences" in the actions.
But it's amazing, nonetheless.