
Amazon, OpenAI, and Microsoft Put ChatGPT into Robots

How will home robots and assistants change with LLMs and ChatGPT? It’s a question being asked by OpenAI, Microsoft, Amazon, and even Good Morning America.

Amazon is working on an upgraded version of its home robot Astro, powered by 'Burnham' technology. The robot has ChatGPT-like features, using large language models and other advanced AI.

"To put it simply: Our robot has a strong body. What we need next is a brain," one of the documents stated. "A robot with Burnham would understand – in the same way a human understands – the thousands of things that happen within a home every day without having to explicitly code for each one because that 'common-sense' knowledge is implicit in the data the language model was trained on." – Insider

The conversational interface is what makes it so intriguing. After years of speaking with Alexa, Siri, and Google Assistant, we can all recognize the vast difference when conversing with ChatGPT. An LLM offers a home assistant or home robot a nearly limitless range of functions.

Astro will be able to engage in dialogue based on what it has sensed around the house. And because it uses large language models, it could expand on whatever question you ask. Some useful examples would be:

  • Asking if you left any lights or appliances (like the stove) on after you’ve left the house.
  • Detecting falls or accidents and checking in on you.
  • Looking for dirty spots around the house that need cleaning.
  • Finding broken glass or sharp objects lying around.
  • “Where’d I leave my keys?”
  • Advanced babysitting functions, such as monitoring if kids did their homework before playing video games or seeing if they had friends over after school.

These are all tasks that are hard to explicitly code into a home robot. But with an LLM, the robot could infer what you mean and act on nearly anything you ask of it.
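To make that concrete, here’s a minimal sketch of the pattern: hand an LLM a log of what the robot has observed and let it answer free-form questions. This assumes nothing about Amazon’s actual Burnham design – the observation log, prompts, and model name are all invented for illustration, and the OpenAI chat completions API is used as a stand-in for whatever Amazon would run.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical observations a robot like Astro might log while roaming the house.
observations = [
    "10:02 kitchen: stove burner front-left is ON, no pot detected",
    "10:05 living room: keys on the coffee table next to the remote",
    "10:07 hallway: all lights OFF",
]

def ask_home_robot(question: str) -> str:
    """Answer a question about the home using only the robot's observation log."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model choice
        messages=[
            {"role": "system", "content": (
                "You are a home robot. Answer questions using only the "
                "observations provided. If they don't contain the answer, say so."
            )},
            {"role": "user", "content": (
                "Observations:\n" + "\n".join(observations)
                + f"\n\nQuestion: {question}"
            )},
        ],
    )
    return response.choices[0].message.content

print(ask_home_robot("Did I leave the stove on?"))
print(ask_home_robot("Where'd I leave my keys?"))
```

None of the example questions above had to be coded as a “skill” – the model just reads the observations and answers.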

One problem I foresee is that LLMs rarely say, “I don’t know.” Instead, they make up an answer – or “hallucinate,” as the behavior is commonly called. So what happens when it hallucinates that a plain-clothes Amazon delivery employee is an intruder?

Nonetheless, this would be a great next step for Alexa, and it would explain why you would disband the group that previously led Alexa’s development. Different skill sets are needed to ensure people trust AI and build healthy habits with it (or with talking computers in general).

Amazon has the right brand to make this happen. Google, as it does with most products, has screwed up its smart home offerings to the point where they’re almost unusable. And no other company has as many smart home devices already in people’s homes.

Still, that’s not stopping other behemoths from competing on “ChatGPT meets home robots” products.

WTF? Robots and ChatGPT

Unsurprisingly, OpenAI is working on integrating their LLMs with robots. They’ve invested in 1X, a Norway-based engineering and robotics company “producing androids capable of human-like movements and behaviors.” That said, the humanoid form factor has a long way to go before anyone puts that thing in their home.

Microsoft researchers are exploring how to extend ChatGPT’s functionality to existing robotics platforms such as robot arms, drones, and home assistants. They recognize that the current design loop for robotics requires an engineer or technical user to code the robot’s actions. But because ChatGPT can generate code, everyday users could prompt a ChatGPT-equipped robot to perform entirely new actions through natural language. Check out their paper on ChatGPT for Robotics here.
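The core pattern is simple enough to sketch: describe the robot’s available functions to ChatGPT, let a non-programmer state a task in plain English, and have the model write the code that calls those functions. The drone API below is entirely made up for illustration – the paper defines its own function libraries per robot – and the OpenAI chat completions API again stands in for the real setup.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A made-up high-level drone API described to the model; Microsoft's paper
# builds similar (but real) function libraries for each robot it controls.
SYSTEM_PROMPT = """You control a drone through these Python functions:
  fly_to(x, y, z)   # move to a coordinate, in meters
  take_photo()      # capture an image at the current position
  land()            # return home and land
Respond with Python code only, calling just these functions."""

def generate_robot_code(task: str) -> str:
    """Turn a plain-English task into robot code for a human to review."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": task},
        ],
    )
    return response.choices[0].message.content

code = generate_robot_code("Inspect the roof: photograph each corner from 10 meters up.")
print(code)  # review (or simulate) before it ever runs on hardware
```

Notably, the researchers keep a user on the loop to review the generated code before it runs on real hardware, which seems like the only sane default.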

Lastly, we can’t talk about modern robotics without mentioning Boston Dynamics – the company that has been frightening us for years with Spot, its advanced four-legged robot. In Everydays 19, I theorized about how these “Robot Service Dogs” could assist in everything from home security monitoring to serving as sight for the blind.

Now, they’re testing how to integrate ChatGPT into Spot’s functions:

ChatGPT lets us command and control these robodogs with our words. While I look at these robodogs in a consumer context, Boston Dynamics sees them in industrial settings.

In an email interview, Nielsen says these abilities have enabled large factory sites to use these robots in automated reconnaissance missions instead of installing thousands of expensive sensors everywhere. Each of these robodog missions has a long task list detailing what the robot needs to check out during every walk. These tasks result in large datasets of observations that get fed into a database. “Only technical people can handle these. At the end of each mission, the robots capture a ton of data. There’s no simple way to query all of it on demand,” he says.

That’s where ChatGPT comes in. Nielsen says his team has created a much more streamlined way for Spot and its human controllers to communicate using natural language. Humans can talk to Spot using normal language commands, but perhaps more useful is Spot’s new ability to instantly parse tons of information and use those insights to answer previously unanswerable questions. “For example, we can cross-reference information from different [reconnaissance] missions for the first time without pre-programming that capability,” Nielsen says. “We give ChatGPT the raw data and instructions on interpreting it, and it answers the customer’s requests.” – Fast Company
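Here’s a simplified sketch of what “give ChatGPT the raw data and instructions on interpreting it” could look like, assuming nothing about Boston Dynamics’ actual implementation: the mission records, field names, and prompts below are invented, and a real deployment would pull from Spot’s inspection database and worry about fitting everything into the model’s context window.

```python
import json

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Invented inspection records standing in for the data Spot collects on a mission.
mission_records = [
    {"mission": "2023-04-01-A", "asset": "pump-7", "thermal_c": 71, "gauge_psi": 118},
    {"mission": "2023-04-08-A", "asset": "pump-7", "thermal_c": 84, "gauge_psi": 117},
    {"mission": "2023-04-08-A", "asset": "valve-3", "thermal_c": 35, "gauge_psi": 62},
]

def ask_about_missions(question: str) -> str:
    """Cross-reference inspection data from multiple missions in plain English."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative model choice
        messages=[
            {"role": "system", "content": (
                "You answer questions about robot inspection data. Temperatures "
                "are in Celsius, pressures in PSI. Flag notable changes between missions."
            )},
            {"role": "user", "content": json.dumps(mission_records) + "\n\n" + question},
        ],
    )
    return response.choices[0].message.content

print(ask_about_missions("Did anything on pump-7 change between the last two missions?"))
```

The point isn’t the specific records – it’s that a non-technical operator can ask the question without anyone pre-programming that query.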

Like Microsoft, Boston Dynamics views ChatGPT as a means of “dumbing down” the technical process of designing and controlling advanced robotics. And that’s both empowering and frightening.