CSE leads new Intel Center for Pervasive Computing
SNUPI (Sensor Nodes Utilizing Powerline Infrastructure) nodes are ultra-low-power, general-purpose 27 MHz wireless sensor nodes that transmit their data by coupling over the home's power line to a single receiver attached to that line.
The Intel Science and Technology Center for Pervasive Computing (ISTC-PC) launched in fall 2011. Led by CSE professor Dieter Fox and Intel principal engineer (and CSE PhD alum) Anthony LaMarca, UW serves as the hub, coordinating research among the five other top-tier universities involved in the collaboration.
The center's mission is to develop fundamental technologies needed for pervasive computing systems that are trustworthy, richly aware of their users and their activities, and continuously learn and adapt. The ISTC-PC brings together fifteen leading researchers whose focus is on pervasive computing, wireless communication and sensing, artificial intelligence and machine learning, computer vision, and human-computer interaction. Nine are from UW, two from Georgia Tech, and one each from Cornell, the University of Rochester, Stanford, and UCLA. From the UW, participating faculty are Jeff Bilmes, James Fogarty, Dieter Fox, Yoshi Kohno, Richard Ladner, Shwetak Patel, Josh Smith, David Wetherall, and Luke Zettlemoyer. In addition to faculty, the ISTC-PC will fund roughly 30 grad students and postdocs and provide research experiences for undergrads.
To enable continuous, unobtrusive awareness of people, the research team will develop embedded, power-harvesting sensors capable of running for long periods of time. Novel algorithms will combine data from a variety of sensors to recognize a user's context and interactions with people, objects, and environments. These algorithms will learn; they will continuously improve system performance while simultaneously growing their understanding of the user's preferences, goals, and activities. Three key application areas drive the center's research: smart task spaces that provide task assistance in the home and on the go; mobile devices to improve a user's health and well-being by reducing and mitigating stress in everyday situations; and technology to support family life management in the face of increasing schedule complexity and mobility. These applications were chosen because of the difficult requirements they present to a pervasive computing system, and because they demonstrate the game-changing impact that pervasive computing will have on future devices, applications, and services.
Low-Power Sensing and Communication Research
Pervasive computing systems must be continuously aware of the environment, the people nearby, and the activities in which they're engaged. Because such systems need to be "always on," saving power whenever possible is crucial. The researchers will develop "perpetual power" techniques that harvest energy from ambient sources and allow simple sensing and computing systems to run indefinitely. For larger devices, they will explore how to dynamically use the most energy-efficient 802.11 and cellular modes available in the current locale, based on RF conditions and competing network traffic.
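The kind of mode selection described above can be illustrated with a small sketch. The function, mode names, and power/throughput numbers below are entirely hypothetical, invented for illustration; the center's actual techniques would weigh many more factors (RF conditions, competing traffic, wake-up costs):

```python
def pick_radio_mode(modes):
    """Choose the radio mode with the lowest energy cost per bit.

    `modes` maps a mode name to a pair (active power in mW,
    achievable throughput in kbit/s under current conditions).
    mW divided by kbit/s gives energy in mJ per kilobit.
    """
    return min(modes, key=lambda m: modes[m][0] / modes[m][1])

# Made-up conditions: 802.11 draws more power but finishes the
# transfer so much faster that it costs less energy per bit.
conditions = {
    "802.11n": (800.0, 20000.0),  # 0.04 mJ/kbit
    "cellular_3g": (600.0, 1000.0),  # 0.60 mJ/kbit
}
print(pick_radio_mode(conditions))  # prints "802.11n"
```

The point of the sketch is that "most energy-efficient" is a property of the whole transfer, not of instantaneous power draw, which is why the choice must be revisited as conditions change.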
Because pervasive computing systems perform continuous sensing and inference about people, within their homes and on the go, preserving privacy and establishing trust are paramount. With that in mind, the researchers will investigate how applications, sensors, and data coding techniques can be modified to improve privacy.
Understanding Human State and Activities Research
Next-generation pervasive systems require fine-grained recognition of activities, objects, and social context. To achieve this, the ISTC-PC researchers will deploy dense, heterogeneous sensors in mobile environments and smart spaces, including audio and depth video sensors (via novel cameras that measure 3D shapes) and classic pervasive computing sensors such as GPS, accelerometers, and wireless signals (802.11, cellular, and RFID). The research will focus largely on developing new algorithms to extract complex context and activity information from sensor data far more accurately and robustly than the current state of the art. For instance, the algorithms might determine not just that someone is in the kitchen but that the person is slicing an onion, and that the slices are too thick for the recipe being used. To be most useful, pervasive computing systems must be able to assess the user's context in real time, a challenge for systems that must operate on low power. To address this challenge, the researchers will explore how to divide the computation between mobile devices and the cloud.
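The mobile-versus-cloud split mentioned above comes down to a trade-off that can be sketched in a few lines. Everything here is a hypothetical simplification (the function name, the parameters, and the idea of a single energy/deadline test are illustrative, not the center's actual method):

```python
def should_offload(input_bytes, local_joules, uplink_bps,
                   radio_watts, deadline_s, rtt_s):
    """Offload a recognition task to the cloud only if shipping the
    sensor data costs less energy than computing locally AND the
    round trip still meets the real-time deadline.

    A real system would also model cloud queueing, radio wake-up
    costs, and partial offloading; this is the bare skeleton.
    """
    tx_time_s = input_bytes * 8 / uplink_bps
    tx_energy_j = tx_time_s * radio_watts
    return tx_energy_j < local_joules and (tx_time_s + rtt_s) <= deadline_s
```

Even this toy version shows why the decision is dynamic: the same task flips between local and remote execution as link speed, radio power, and latency vary.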
CSE Major Research Centers
In addition to the Intel Science and Technology Center for Pervasive Computing, CSE has recently announced two other research centers. These three center-scale activities take CSE's work to a new level and bring new opportunities to our students and faculty.
NSF Engineering Research Center for Sensorimotor Neural Engineering (CSNE): Led by CSE’s Yoky Matsuoka, the CSNE is funded with an $18.5 million, five-year grant from the National Science Foundation, with potential for a five-year renewal. The Center is a global hub for delivering neural-inspired sensorimotor devices. Using devices that mine the rich data in neural signals available from implantable, wearable, and interactive interfaces, the CSNE builds end-to-end integrated systems. The fall issue of Trend highlighted the new center: here.
For more information about the CSNE, visit:
The Center for Game Science (CGS): Led by CSE's Zoran Popović, the CGS is funded at $15 million by DARPA and the Bill & Melinda Gates Foundation. The CGS focuses on solving hard problems facing humanity today in a game-based environment. The fall 2010 issue of MSB highlighted the CGS here. For more information about the CGS, visit:
Bill Gates, in his CSE Distinguished Lecture on October 27, 2011, briefly discussed the Center and two of its play-based games: Foldit and Refraction. His lecture may be viewed here.
Personalization and Adaptation Research
Successful pervasive computing systems must be able to interactively learn the environments, objects, schedules, and preferences of their users. It should be easy for a user to teach a device to recognize activities such as a regular jogging routine, places such as a favorite grocery store, or objects such as the user's car. The research in this area will focus on developing techniques for handling the complex estimation and learning problems required for lifelong learning, adaptation, and personalization of systems for individual users. Probabilistic graphical models that describe users and their context will continuously adapt, allowing the incorporation of new places, activities, personal objects, and social contexts over time.
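To make the "teach a device a favorite grocery store" idea concrete, here is a deliberately tiny sketch of incremental place learning. The class, its API, and the nearest-centroid approach are all hypothetical stand-ins; the center's probabilistic graphical models are far richer than this:

```python
import math

class PlaceModel:
    """Toy lifelong place learner: the user labels a location, and the
    model refines that place's centroid as new visits accumulate."""

    def __init__(self, radius_m=150.0):
        self.places = {}          # label -> (lat, lon, visit count)
        self.radius_m = radius_m  # match radius, in meters

    def teach(self, label, lat, lon):
        """Incorporate one labeled visit via an incremental mean."""
        old = self.places.get(label)
        if old is None:
            self.places[label] = (lat, lon, 1)
        else:
            olat, olon, n = old
            self.places[label] = ((olat * n + lat) / (n + 1),
                                  (olon * n + lon) / (n + 1), n + 1)

    def recognize(self, lat, lon):
        """Return the label of a known place near this fix, or None."""
        for label, (plat, plon, _) in self.places.items():
            if self._dist_m(lat, lon, plat, plon) <= self.radius_m:
                return label
        return None

    @staticmethod
    def _dist_m(lat1, lon1, lat2, lon2):
        # Equirectangular approximation; fine at neighborhood scale.
        x = math.radians(lon2 - lon1) * math.cos(
            math.radians((lat1 + lat2) / 2))
        y = math.radians(lat2 - lat1)
        return 6371000.0 * math.hypot(x, y)
```

The incremental mean in `teach` is the simplest possible instance of the lifelong-adaptation theme: every new observation nudges the model rather than requiring retraining from scratch.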
In addition to personalizing what systems know, the center plans to build systems that personalize how they interact with users. The goal is to enable interactions between users and systems that seamlessly blend multiple modalities (e.g., gestures and natural language), enabling users to focus on their goals rather than making the technology work.
The center will initially focus on three concept applications that will drive the research and demonstrate the technologies.
Mobile Health and Well-being
Improving physical and emotional well-being is a high-value application area. The center will explore this topic by developing technologies to help users recognize and reduce stress and anxiety in their daily lives. To achieve this, the researchers aim to develop mobile systems that can understand the rich context of their users' lives (both at home and on the go) and learn about their routines and interactions. The researchers will combine multiple sensing modalities to recognize common stressors and to learn which stressors most affect the user.
This application area will challenge the center to develop power-optimized mobile systems; fine-grained models of everyday activities, interactions, and environments; and personalized, adaptive feedback systems to help users manage stress.
Task Spaces: A Smart Cooking Assistant
Computers often assist with tasks, but most of the tasks are performed in a traditional computing context. The ISTC-PC aims to demonstrate a space that is capable of helping users with physical tasks that don't involve a computer. For example, a task space could help someone assemble a piece of furniture; teach a person to cook a soufflé; or assist someone who is visually impaired in configuring a new piece of audio equipment. This highlights an important promise of future pervasive computing apps: the ability to deliver expert knowledge to novices, providing hands-on training at home or on the go.
ISTC-PC UW team
Provided with a recipe, the Smart Cooking Assistant will guide the cook through the different steps of the recipe. With the help of sensors that watch the kitchen countertop and other surfaces, the system will recognize cooking tools and implements, and their use, and will detect when ingredients are measured and added. It will track the user's progress, providing audio and visually projected cues as needed. It will also monitor time and convey to the user the scheduling and timing associated with the recipe (e.g., timing the rising of dough, marinating, and baking time).
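At its core, the progress-tracking behavior described above is a state machine over sensed events. The sketch below is purely illustrative (the class, the event names such as "onion_sliced", and the cue strings are invented); the real system would fuse noisy perception rather than consume clean event labels:

```python
class RecipeTracker:
    """Toy step tracker: advances through a recipe as sensed events
    match the expected next step, and issues cues or reminders."""

    def __init__(self, steps):
        self.steps = steps  # ordered list of (event, cue) pairs
        self.index = 0      # position of the next expected step

    def observe(self, event):
        """Feed in one detected event; return the cue to show the cook."""
        if self.index >= len(self.steps):
            return "recipe complete"
        expected, cue = self.steps[self.index]
        if event == expected:
            self.index += 1
            if self.index < len(self.steps):
                return "next: " + self.steps[self.index][1]
            return "recipe complete"
        # Out-of-order event: remind the cook of the pending step.
        return "reminder: " + cue

tracker = RecipeTracker([
    ("onion_sliced", "slice the onion thinly"),
    ("pan_heated", "heat the pan to medium"),
    ("onion_added", "add the onion to the pan"),
])
```

Keeping the recipe as an explicit ordered structure is also what makes the timing features natural to add: each step can carry a duration, and the tracker knows exactly which clocks are running.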
Users will be able to develop new recipes for the Smart Cooking Assistant via interactive demonstration. The system can be taught to correctly interpret gesture and speech input, and trained to recognize the ingredients, their amounts and order, their physical manipulation, and so on. The resulting recipe will be captured by the system for later use.
Although the researchers will demonstrate cooking-related functionalities in this space, the techniques developed will be general enough to be applied to other task spaces, such as workbenches, wet labs, and classrooms.
Family Coordination System
Thanks to increasingly busy, mobile, over-scheduled lives, it's becoming harder for families to spend time together. The ISTC-PC will address this challenge by developing pervasive computing systems that help families coordinate their lives through monitoring, tracking, and reflecting on family activities. Activities will be tracked over the entire day at a relatively coarse level both inside and outside the home, and at a far more detailed level in sensor-rich areas of the home, such as the kitchen and dining room. The system will use activity information to assist families in planning their lives. For instance, one goal might be to help a family get ready to leave on a weekday morning with a minimum of fuss, or to plan a day's strategy for picking up the kids at school if there's a change in the normal daily routine.
To bootstrap learning of the models used for inferring activities, the researchers will develop techniques to motivate family members to provide data about their daily activities within and outside the home. The system will include engaging visualizations to encourage members to provide input. It will also use persuasive, unobtrusive, privacy-preserving methods of providing feedback and generating awareness.
For more information about the ISTC-PC, visit: