Comment: Robots are coming, but with humans at the controls

Oct 21, 2023

For robots to learn tasks, people have to direct their movements, creating a new kind of job.

By Parmy Olson / Bloomberg Opinion

I don't know if you’ve heard, but the robots are coming.

Tesla Inc. has one with opposable thumbs called Optimus, and other startups like California-based Figure and Norway's 1X are building walking machines with torsos and arms that can stack warehouse goods. But to be truly useful in their first years of labor, many of these robots will need to be steered by humans, posing a unique challenge around privacy and marketing for their makers.

Take Alfie. It's the prototype of London startup Prosper Robotics and looks like a Minecraft character made real: slightly taller than a grown man, bulky and gliding slowly around on wheels. It is also steered by a team of gamers in the Philippines, who wear virtual reality headsets throughout the day to control its movements.

At Prosper's office, machines whir in the background of its brightly lit warehouse brimming with printed circuit boards, wires and plastic drawers. Five large robots in vivid colors move slowly around the room, some manipulating objects as part of their training or going through "obstacle courses" in a mock kitchen. An orange robot at the back takes an empty juice bottle and drops it over and over into a garbage can. A yellow bot nearby is tinkering with a plastic Tupperware box.

Do these things enough times and the robots will learn how to open a box or fold a towel. The paradox of AI is that while it can write humanlike essays, machines can still barely walk or pick up a cup. Why? Because while the models powering ChatGPT have been trained on billions of words on the public internet, there is no similar database to help them copy our biomechanical movements.

So is that orange robot really turning toward me to wave? A human tele-operator is doing that work. There are about six of these operators working in shifts at a small office in Dumaguete, Philippines. On any given day, an operator or "pilot" will be wearing a Quest 2 virtual reality headset, made by Meta Platforms Inc., moving their arms and using the Quest's controllers to pick things up.

Steering a robot in this way is much more intuitive than using a keyboard. When the operator raises their arm, the robot arm goes up. When they turn their head to the left, a camera on Alfie's head looks left. The staff are mostly gamers who were hired for their skills playing first-person shooter and strategy games like Counter-Strike. Now instead of firing guns, they're making beds. But somehow, for now at least, it still feels like a game.

"It's fun," says Lienelson Mark Pardo Samosa, who manages the team in the Philippines and is a skilled Call of Duty player. "I’d rather choose using the headset and do the laundry than doing the laundry at home." The first session he put a headset on, his eyes watered from overuse. Now Samosa makes sure to blink frequently and tilt his head back to avoid straining it. Sometimes he's wearing the headset for several hours at a time.

But for the most part he enjoys the work. He and his co-workers often spend their shifts joking and chatting, which can feel incongruous alongside the sensation of being in another country. "It's like bringing your colleagues to London," says Samosa, 37.

For the person on the other side of the world who's having their house cleaned, the human-robot dynamic is also hard to wrap one's head around. Lukas Kobis, a local startup entrepreneur, had an Alfie robot in his London apartment between February and March of this year. He recalls being unsure at first about having a person steer it around his house, even when he was out at work. But eventually he didn't mind at all. In fact, at some points he seemed to forget a person was involved.

"It was nice when it waved at me when I got back home," Kobis remembers. "I know someone was controlling it, but it felt like I was saying ‘hi’ to my robot."

Alfie stayed mostly in Kobis’ kitchen and living room, mopping the floor, cleaning surfaces and loading the dishwasher, even using its rubber grippers to put little detergent pods into the washer. Such a broad range of tasks would be extremely difficult for an autonomous robot to do properly, but Alfie can perform them with little trouble thanks to its human operators.

Prosper Robotics founder Shariq Hashme admits it may be challenging to convince people to allow human-driven robots into their homes, but he points out that his tele-operators won't be able to read text or see people's faces, which will be blurred out, and that customers can say "freeze" to make Alfie stand still.

Hashme, who previously did research work at OpenAI, had the robot work around his house for about two weeks, cleaning and making his bed when he was out at work. "In theory it was going to prepare breakfast for me every morning," he says. "We never got around to that."

Consumers seem to be slowly acclimating to "listening" machines like Alexa and Google Assistant in their homes, even though thousands of humans have been known to monitor their usage to make the devices better.

That may give Hashme license to focus more on making sure his tele-operators are comfortable. Even before the most recent hype-cycle sparked by OpenAI's ChatGPT, AI startups were notorious among venture capital investors for exaggerating their tech's capabilities and even using humans to do work that algorithms couldn't do, while keeping those people secret.

Some of the most sophisticated AI systems we use today, including ChatGPT, have also been trained by labelers in developing countries, often with little or no credit and with sometimes exploitative practices. A better approach is to highlight the work of these labelers and pay them well. Samosa says he and his team in the Philippines are satisfied with their pay. And working remotely for someone in London means they don't have to leave their families, he adds.

Hashme says that over time, his operators will be able to steer several machines at a time, just like today's delivery robots that zoom along some neighborhood sidewalks. "An operator can control 10 of these robots and get paid more, because the work is worth more," Hashme says.

Positioning robots as people-led might make some consumers uncomfortable, but it's probably the only way to gather the training data necessary to make such devices fully autonomous down the line. Consumers might even find greater faith in machines with people behind the wheel than in something automated and clunkier.

Hashme estimates he’d need to collect data from between 10,000 and 100,000 robots, initially steered by humans, to train them to a point where they could work autonomously. And even in the future, "you’ll always need a person overlooking, say, 100,000 robots," he says.

Here in London, people still watch a long-running sci-fi show called "Doctor Who," whose villainous Daleks look like giant pepper pots and roll around with a single eye and a frightening robotic voice. There may be something a little less terrifying about having humans behind similarly sized "automatons." It may not be to most people's tastes, but it's a start.

Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is author of "We Are Anonymous."
