MobileAccessibility is a different approach to providing useful mobile functionality to blind, low-vision, and deaf-blind users: rather than building special-purpose hardware, it leverages the sensors that modern cell phones already have to keep devices inexpensive, and uses remote web services to process requests. Importantly, both fully automated and human-powered web services are used, balancing cost against capability: automated services are cheap but limited, while human-powered services can handle harder requests at greater cost.
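
As an illustration of this architecture, the sketch below shows how a phone client might send a captured photo to an automated recognition service and escalate to a human-powered service only when automation fails. The endpoint URLs, the plain HTTP POST protocol, and the response handling are assumptions for illustration, not the project's actual API.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/**
 * Minimal sketch of the client side of the MobileAccessibility idea:
 * the phone captures data with its built-in sensors (here, a camera
 * image) and ships it to a remote web service. Both endpoints below
 * are hypothetical placeholders.
 */
public class RemoteRecognitionClient {

    // Hypothetical endpoints: a fast automated recognizer and a
    // slower but more capable human-powered fallback.
    private static final String AUTOMATED_URL = "https://example.org/recognize/auto";
    private static final String HUMAN_URL     = "https://example.org/recognize/human";

    /** Try the cheap automated service first; fall back to humans. */
    public static String recognize(byte[] jpegImage) throws IOException {
        String result = post(AUTOMATED_URL, jpegImage);
        if (result == null || result.isEmpty()) {
            // Automated service could not answer; escalate to the
            // human-powered service, trading cost and latency for capability.
            result = post(HUMAN_URL, jpegImage);
        }
        return result;
    }

    private static String post(String endpoint, byte[] body) throws IOException {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "image/jpeg");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body);
        }
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        } finally {
            conn.disconnect();
        }
    }
}
```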

We are developing applications for mobile phones on Google's Android platform and on the iPhone. We are designing, implementing, and evaluating our prototypes with blind, low-vision, and deaf-blind participants through focus groups, interviews, lab studies, and field studies.

Current Projects:

  • V-Braille: Haptic Braille Perception using the Touch Screen and Vibration on Mobile Phones (see the sketch after this list)
  • Talking Barcode Reader
  • Color Namer
  • Talking Calculator
  • Talking Level
  • BrailLearn and BrailleBuddies: V-Braille Games for Teaching Children Braille
  • FocalEyes Camera Focalization Project: Camera Interaction with Blind Users
  • CameraEyes Photo Guide: Enabling Blind Users To Take Framed Photographs of People
  • Mobile OCR
  • Appliance Reader
  • Accessible Tactile Graphics with the Digital Pen
  • LocalEyes: Accessible GPS for Exploring Places and Businesses
  • ezTasker: Visual and Audio Task Aid for People with Cognitive Disabilities
  • Transportation Bus Guide for Deaf-Blind Commuters
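
The V-Braille sketch referenced above: on Android, the screen can be divided into six regions arranged like a braille cell (two columns, three rows), with the phone vibrating while the finger rests on a region whose dot is raised. The pulse duration and the exact region layout here are illustrative assumptions rather than the published V-Braille parameters.

```java
import android.content.Context;
import android.os.Vibrator;
import android.view.MotionEvent;
import android.view.View;

/**
 * Sketch of the V-Braille interaction: the view is split into six
 * touch regions laid out like a braille cell, and the phone vibrates
 * while the finger is over a raised dot.
 */
public class VBrailleView extends View {

    private final Vibrator vibrator;
    // Dot pattern for the current character, indexed
    // [row 0..2][column 0..1]; true = raised dot.
    private final boolean[][] dots;

    public VBrailleView(Context context, boolean[][] dots) {
        super(context);
        this.dots = dots;
        this.vibrator =
                (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN
                || event.getAction() == MotionEvent.ACTION_MOVE) {
            // Map the touch point to one of the six braille-cell regions.
            int col = Math.min((int) (event.getX() * 2 / getWidth()), 1);
            int row = Math.min((int) (event.getY() * 3 / getHeight()), 2);
            if (dots[row][col]) {
                vibrator.vibrate(50);  // brief pulse while on a raised dot
            }
        }
        return true;  // keep receiving move events as the finger explores
    }
}
```

A real implementation would likely distinguish dot regions from the rest of the cell with different vibration patterns so users can trace the cell's layout; this sketch only conveys the core touch-plus-vibration interaction.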

Projects are described in more detail in the Projects section.

MobileAccessibility is a joint project between the University of Washington and the University of Rochester.
We thank Google for their support of this project, as well as our blind and deaf-blind research participants.