Interaction with the Physical World

Advances in low- and no-power sensing, communication and interaction technologies offer new possibilities for blending digital innovation with our physical environment. 

From gesture recognition that allows people to interact with objects in new ways, to low-power sensors that collect and transmit data about temperature, air quality, urban accessibility and more, our researchers are tapping into the potential of computation to transform how we experience the world around us.


Research Groups & Labs

UbiComp Lab

The Ubiquitous Computing (UbiComp) Lab develops innovative systems for health sensing, low-power sensing, energy sensing, activity recognition and novel user interface technology for real-world applications.

Makeability Lab

The Makeability Lab specializes in Human-Computer Interaction and applied machine learning for high-impact problems in accessibility, computational urban science, and augmented reality.


Centers & Initiatives

DFab is a network of researchers, educators, industry partners, and community members advancing the field of digital fabrication at UW and in the greater Seattle region.

Computing for the Environment (CS4Env) at the University of Washington supports novel collaborations across the broad fields of environmental sciences and computer science & engineering. The initiative engages environmental scientists and engineers, computer scientists and engineers, and data scientists in using advanced technologies, methodologies and computing resources to accelerate research that addresses pressing societal challenges related to climate change, pollution, biodiversity and more.

Highlights


UW News

New research from UW researchers and the Toyota Research Institute (TRI) examines how drivers divide their attention between the road and in-vehicle touch screens. The findings could help auto manufacturers design safer, more responsive touch screens and in-car interfaces.

Allen School News

In a Q&A, professor Kurtis Heimerl and postdoc Esther Han Beol Jang (Ph.D., ’24) discuss their work with residents of two Seattle tiny house villages on how smart technologies can improve living conditions, weighed against concerns such as cost and the continuity of deployments.

Allen School News

From a robotic arm that learns to pick up new objects in real time, to a model that converts 2D videos into 3D virtual reality, to a curious chatbot that adapts to users, to machine learning methods for decoding the brain, the 2025 Research Showcase and Open House had something for everyone.