The Future is Here: Robots That Understand Natural Language and Navigate Our Homes

Imagine asking your robot assistant to “go grab my keys from the kitchen table and bring them to me on the couch.” For many years, this type of request only existed in science fiction. But rapid advances in artificial intelligence are bringing this vision closer to reality.

Researchers at New York University recently developed a robotic system called OK-Robot that can understand natural language commands and complete tasks like moving objects around typical home environments. The system demonstrates how the latest AI capabilities can be combined to create truly useful robot assistants.

Understanding Natural Language

A key innovation behind OK-Robot's abilities is the use of neural networks (AI systems loosely inspired by the human brain) that have been trained on huge datasets to understand language. Vision-Language Models built this way can now identify over 20,000 different object categories when shown images and can interpret written descriptions and questions about those images.

The researchers used these models to let OK-Robot interpret natural language commands that use everyday words to describe objects, the places they can be found, and where they should be moved. Untrained users can therefore give instructions without learning a rigid syntax or command structure.
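The core idea behind this kind of open-vocabulary matching is to embed the user's phrase and each detected object into a shared vector space and pick the closest match. The sketch below is a minimal, self-contained stand-in: it uses a toy bag-of-words "embedding" in place of a real vision-language model, and the `embed`/`best_match` helpers are illustrative names, not part of OK-Robot's actual code.

```python
from collections import Counter
import math

def embed(text):
    """Toy stand-in for a vision-language embedding: a bag-of-words
    vector. Real systems use learned models that map both images and
    text into a shared vector space."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_match(query, detected_objects):
    """Rank detected objects by similarity to the user's phrase."""
    scored = [(cosine(embed(query), embed(obj)), obj) for obj in detected_objects]
    return max(scored)[1]

objects = ["red coffee mug", "set of house keys", "blue water bottle"]
print(best_match("my keys", objects))  # matches the keys
```

Because the matching is similarity-based rather than exact, the user can say "my keys" without knowing the label the detector assigned to the object.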

Navigating Like a Human

But understanding language is only the first step to completing tasks - the robot also needs to be able to navigate environments and manipulate objects. Drawing inspiration from technologies self-driving cars use to "see" and move through spaces, the team gave OK-Robot the ability to build a 3D map of rooms using images captured from phone cameras.

This allows OK-Robot to create navigation plans to move around obstacles and get near requested items. It also uses perception and planning algorithms to identify navigable floor space, avoid collisions with clutter, and select efficient paths. The result is navigation that reflects the kind of common-sense judgments people make implicitly when moving through a home.

Manipulating Household Objects 

Finally, to pick up and move everyday items, OK-Robot employs AI recognition capabilities to identify graspable points on target objects. It considers shape, size, and physical properties learned from experience grasping thousands of objects to select a suitable gripper pose. This allows OK-Robot to handle items ranging from boxes and bottles to clothing and coffee mugs.
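A common pattern for this step is to generate many candidate grasp poses and score each one, then execute the best. The sketch below illustrates only the scoring-and-selection idea with hand-written heuristic terms; the field names, weights, and `best_grasp` helper are hypothetical, and real systems use learned models trained on large grasp datasets instead.

```python
def score_grasp(candidate, max_gripper_width=0.08):
    """Toy heuristic score for a candidate grasp (all values in meters
    or normalized [0, 1] units). Illustrative only: real graspers use
    learned models, not hand-tuned weights like these."""
    if candidate["width"] > max_gripper_width:
        return 0.0  # object too wide to fit between the fingers
    # Favor flat contact surfaces and clearance from nearby clutter.
    return 0.5 * candidate["flatness"] + 0.5 * candidate["clearance"]

def best_grasp(candidates):
    """Pick the highest-scoring candidate grasp."""
    return max(candidates, key=score_grasp)

candidates = [
    {"name": "mug handle", "width": 0.02, "flatness": 0.9, "clearance": 0.8},
    {"name": "mug rim",    "width": 0.10, "flatness": 0.7, "clearance": 0.9},
]
print(best_grasp(candidates)["name"])  # the rim is too wide, so: handle
```

Generating diverse candidates and filtering by feasibility is what lets such systems handle boxes, bottles, and mugs alike without per-object programming.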

The system combines its language interpretation, navigation system, and grasping abilities to fulfill requests like “Put my phone on the nightstand” or “Throw this soda can in the recycling”. It even handles specifying destinations using relationships like “on top of” or “next to”.
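Grounding a relation like "on top of" ultimately means turning a phrase plus a reference object's position into a concrete drop-off coordinate. The lookup table below is a purely illustrative sketch of that final step; the relation names, offsets, and `resolve_destination` helper are assumptions, not OK-Robot's actual method, which grounds such phrases with learned models.

```python
# Hypothetical relation-to-offset table: given a reference object's
# (x, y, z) position in meters, compute a placement point.
RELATIONS = {
    "on top of": lambda x, y, z: (x, y, z + 0.05),  # slightly above
    "next to":   lambda x, y, z: (x + 0.15, y, z),  # offset to the side
}

def resolve_destination(relation, reference_position):
    """Map a spatial-relation phrase to a target coordinate."""
    if relation not in RELATIONS:
        raise ValueError(f"unknown relation: {relation}")
    return RELATIONS[relation](*reference_position)

nightstand = (1.0, 2.0, 0.5)  # assumed position from the robot's map
print(resolve_destination("on top of", nightstand))
```

In practice the offsets would depend on the sizes of both objects, which is one reason relational placement remains harder than it sounds.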

Real-World Robot Challenges

Evaluating their new system across 10 real homes, the NYU team found OK-Robot could fulfill requests like moving common household items nearly 60% of the time with no prior training or exposure to the environment. This major leap towards capable home robots highlights the progress AI is making.

However, it also uncovered real-world challenges robots still face operating in human spaces. Items placed in difficult-to-reach locations, clutter blocking paths or grasps, and requests involving heavy, fragile, or transparent objects remain problematic. Quirks of language interpretation can also lead to confusion over which specific item is being indicated or where it should be moved.

Still, by integrating the latest AI in an adaptable framework, OK-Robot sets a new high bar for language-driven robot competency. And its failures help illustrate remaining gaps researchers must close to achieve fully capable assistants.

The Path to Robot Helpers

The natural language understanding and navigation capabilities demonstrated by OK-Robot suggest that AI and robotics are approaching the long-imagined era of useful automated helpers. Continued progress pairing leading-edge machine learning approaches with real-world robotic systems seems likely to make this a reality.

Key insights from this research illustrating existing strengths and limitations include:

  • Modern natural language AI allows untrained users to give robots useful instructions 
  • Advanced perception and planning algorithms enable feasible navigation of home spaces  
  • Data-driven grasping models generalize reasonably well to new household objects
  • Real-world clutter, occlusion, and ambiguity still frequently thwart capable robots
  • Careful system integration is crucial to maximize performance from imperfect components

So while the robot revolution still faces hurdles in reliably handling everyday situations, projects like OK-Robot continue pushing towards convenient and affordable automation in our homes, workplaces, and daily lives.

Reference Paper