Language Grounding for Robotics

Many long-term goals in robotics, such as using robots in household settings, will require robots that can interact with humans in an intuitive way. In this project, we explore how robots can learn to correlate natural language with the physical world they sense and manipulate, an area of research known as grounded language acquisition. In particular, we are exploring how robots can learn to recognize objects by their names and attributes, and how they can learn from a small number of trials in which humans provide verbal descriptions of an action being demonstrated.


Building Hierarchies of Concepts via Crowdsourcing
Y. Sun, A. Singla, D. Fox, and A. Krause. IJCAI 2015

Learning from Unscripted Deictic Gesture and Language for Human-Robot Interactions
C. Matuszek, L. Bo, L. Zettlemoyer, and D. Fox. AAAI 2015

Learning to Identify New Objects
Y. Sun, L. Bo, and D. Fox. ICRA 2014

Attribute-Based Object Identification
Y. Sun, L. Bo, and D. Fox. ICRA 2013

A Joint Model of Language and Perception for Grounded Attribute Learning
C. Matuszek, N. Fitzgerald, L. Bo, L. Zettlemoyer, and D. Fox. ICML 2012

Learning to Parse Natural Language Commands to a Robot Control System
C. Matuszek, E. Herbst, L. Zettlemoyer, and D. Fox. ISER 2012

Following Directions Using Statistical Machine Translation
C. Matuszek, K. Koscher, and D. Fox. HRI 2010