Various additional animations related to activity recognition and people tracking can be found on our MCL in action web site.

We develop techniques that extract high-level information about a person's activities from raw sensor data. Such information can be used in a variety of applications, such as helping elderly people or people suffering from brain injuries in their everyday lives (see also the assisted cognition project).

Project Contributors

Dieter Fox, Jeff Bilmes, Brian Ferris, Henry Kautz, Anthony LaMarca, Julie Letchner, Lin Liao, Don Patterson, Matthai Philipose, Alvin Raj, and Amarnag Subramanya


Example Scenario

In our GPS tracking work, for instance, a Rao-Blackwellised particle filter estimates a person's location and mode of transportation (bus, foot, or car). A hierarchical dynamic Bayesian network is additionally trained to learn and infer the person's goals and trip segments. Clicking on the left picture shows an animation of the filter as it observes a person getting on and off the bus; the color of the particles indicates the current mode-of-transportation estimate (foot = blue, bus = green, car = red). The middle animation shows the prediction of the person's current goal (black line), where the size of each blue circle indicates the probability of that location being the goal. The right animation shows the detection of an error: the person fails to get off the bus at the marked location.


Getting on and off a bus

Goal prediction

Error detection
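
To make the filtering step more concrete, the following is a minimal, illustrative sketch (not the project's actual implementation) of a Rao-Blackwellised particle filter for this setting: each particle samples a discrete transportation mode and maintains a small Kalman filter over the person's 2-D position, with motion noise that depends on the sampled mode. All names, parameters, and GPS readings below are hypothetical and chosen only to illustrate the idea.

```python
import copy
import numpy as np

# Hypothetical parameters -- chosen for illustration only, not the project's actual values.
MODES = ["foot", "bus", "car"]
TRANS = np.array([[0.90, 0.05, 0.05],     # mode transition probabilities
                  [0.05, 0.90, 0.05],     # (row = current mode, column = next mode)
                  [0.05, 0.05, 0.90]])
MOTION_STD = {"foot": 2.0, "bus": 15.0, "car": 25.0}   # displacement noise per step (m)
GPS_STD = 10.0                                          # GPS measurement noise (m)
N_PARTICLES = 200

rng = np.random.default_rng(0)

class Particle:
    """One particle: a sampled transportation mode plus a Kalman filter over 2-D position."""
    def __init__(self, pos):
        self.mode = int(rng.integers(len(MODES)))
        self.mean = np.array(pos, dtype=float)    # Kalman mean (x, y)
        self.cov = np.eye(2) * GPS_STD ** 2       # Kalman covariance

def rbpf_step(particles, gps_meas):
    """One Rao-Blackwellised update: sample modes, run per-particle Kalman filters, resample."""
    weights = np.empty(len(particles))
    for i, p in enumerate(particles):
        # 1. Sample the next transportation mode from the transition model.
        p.mode = int(rng.choice(len(MODES), p=TRANS[p.mode]))
        # 2. Kalman predict: random-walk motion with mode-dependent noise.
        p.cov = p.cov + np.eye(2) * MOTION_STD[MODES[p.mode]] ** 2
        # 3. Weight the particle by the marginal likelihood of the GPS reading.
        S = p.cov + np.eye(2) * GPS_STD ** 2               # innovation covariance
        innov = np.asarray(gps_meas, dtype=float) - p.mean
        weights[i] = (np.exp(-0.5 * innov @ np.linalg.solve(S, innov))
                      / (2 * np.pi * np.sqrt(np.linalg.det(S))))
        # 4. Kalman update of the position estimate with the GPS reading.
        K = p.cov @ np.linalg.inv(S)
        p.mean = p.mean + K @ innov
        p.cov = (np.eye(2) - K) @ p.cov
    weights /= weights.sum()
    # 5. Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return [copy.deepcopy(particles[j]) for j in idx]

# Example: posterior over the transportation mode after a few (fabricated) GPS readings.
particles = [Particle((0.0, 0.0)) for _ in range(N_PARTICLES)]
for z in [(5.0, 3.0), (22.0, 14.0), (41.0, 30.0)]:
    particles = rbpf_step(particles, z)
counts = np.bincount([p.mode for p in particles], minlength=len(MODES))
print(dict(zip(MODES, counts / len(particles))))
```

Because the continuous position is handled analytically inside each particle, the particles themselves only need to cover the discrete mode hypotheses, which is what lets a Rao-Blackwellised filter track the mode of transportation with relatively few samples.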