Thanks to our authors, speakers, and everyone who joined us in Toronto for a very successful workshop!
Natural language is a powerful and intuitive modality for enabling humans to interact with physical systems. Understanding language about physical systems requires the ability to extract a semantically meaningful representation from the language and map it to aspects of the external world. This problem, referred to as the language grounding problem, has received substantial attention recently, in part because advances in robotics and sensing technology have enabled more robust sensing, manipulation, and simulation of the physical world, and in part as a result of recent advances in natural language processing and formal representation systems.
The AAAI-12 Workshop on Grounding Language for Physical Systems
aims to bring together different communities considering the language grounding problem,
including natural language understanding, spatial cognition, knowledge representation,
computer vision, and robotics. The workshop will provide a venue to discuss
shared problems, describe key research problems and challenges,
and make progress towards formulating shared definitions.
The following topics are of particular interest:
- Definitions of and possible approaches to the language grounding problem.
- Methods and models for mapping between language and the external world.
- Interactive physical systems that can be used in exploring the grounding problem, including
robots, sensors, and physically interactive systems.
- Knowledge representations that support a range of semantic constructions, such as ambiguity.
- Algorithms for learning grounded meanings from gesture, language, and other input types.
- Interpreting instructions for physically-grounded perceptual or manipulative tasks.
- Vision, haptics, audio, and other sensing modalities for grounding linguistic elements such as
attributes, objects, tasks, and spatial relationships.
- Challenge problems in the grounding space.
The workshop will consist of six talks, a poster session in which accepted authors will present their work, and
a discussion period. The emphasis will be on discussion and interaction among the participants.
Abstracts for talks and accepted papers can be found below.
| 9:00 - 9:30 | Introductions |
| 9:30 - 9:50 | Stefanie Tellex for Seth Teller: Perceptual Apparatus for Sharing Semantic Mental Models |
| 9:50 - 10:30 | Luke Zettlemoyer |
| 10:30 - 11:00 | Break |
| 11:00 - 12:00 | Ray Mooney: Learning to Interpret Natural Language Navigation Instructions from Observation |
| 12:00 - 1:30 | Lunch |
| 1:30 - 2:00 | Henry Kautz: A Multimodal Corpus for Integrated Language and Action |
| 2:00 - 2:30 | Cynthia Matuszek: Learning Novel Attributes from Combined Language and Perception |
| 2:30 - 3:30 | Poster session |
| 3:30 - 4:00 | Break |
| 4:00 - 4:30 | Matthias Scheutz: Towards Robust Human-Robot Dialogues for Natural Language Instruction Tasks |
| 4:30 - 5:00 | Stefanie Tellex: Toward Information Theoretic Human-Robot Dialog |
| 5:00 - 5:30 | Discussion Session |
- March 30: Workshop submissions due.
- April 20: Author notifications.
- May 16: Camera-ready workshop paper submissions due to AAAI.
- July 23: Workshop held. (W4 at the AAAI 2012 Workshop Program, Toronto, Ontario, Canada)
Cynthia Matuszek, University of Washington
Stefanie Tellex, Massachusetts Institute of Technology
Dieter Fox, University of Washington
Luke Zettlemoyer, University of Washington