The WebInSight Project is a collection of projects designed to make the web more accessible to blind web users.
The Tactile Graphics Assistant (TGA) is a program created at the University of Washington to aid in the tactile image translation process. The TGA separates text from an image so that the text can later be replaced by Braille and inserted back onto the image. To streamline the text selection process, the TGA employs machine learning to recognize text, so that large groups (possibly hundreds) of images can be translated at a time.
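The text-separation step described above can be pictured as classifying candidate image regions as text or graphics. The sketch below is illustrative only: the rule-based `looks_like_text` check is a toy stand-in for the trained classifier the TGA actually uses, and all names are hypothetical.

```python
# Illustrative sketch (not the TGA implementation): split image regions
# into "text" and "graphic" groups so the text regions can later be
# replaced by Braille. A simple bounding-box heuristic stands in for
# the TGA's learned text recognizer.
from dataclasses import dataclass

@dataclass
class Region:
    x: int  # bounding box, in pixels
    y: int
    w: int
    h: int

def looks_like_text(r: Region) -> bool:
    """Toy stand-in for a trained model: text labels on diagrams
    tend to be short and roughly letter-shaped."""
    aspect = r.w / r.h
    return r.h < 40 and 0.2 < aspect < 5.0

def separate(regions: list[Region]) -> tuple[list[Region], list[Region]]:
    """Partition regions so the text can be lifted off the image."""
    text = [r for r in regions if looks_like_text(r)]
    graphics = [r for r in regions if not looks_like_text(r)]
    return text, graphics
```

For example, `separate([Region(10, 10, 30, 12), Region(0, 0, 300, 200)])` would treat the small, wide region as text and the large one as graphics.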
MobileASL is a video compression project at the University of Washington and Cornell University with the goal of making wireless cell phone communication through sign language a reality in the U.S.
MobileAccessibility takes an entirely different approach to providing useful mobile functionality to blind, low-vision, and deaf-blind users. This approach leverages the sensors that modern cell phones already have to keep devices cheap, and uses remote web services to process requests. Importantly, both fully-automated and human-powered web services are used to balance the cost and capability of the services available.
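The cost/capability balance described above can be sketched as a simple dispatch policy: try the cheap automated service first, and escalate to the slower but more capable human-powered service when the automated answer is missing or low-confidence. This is a hedged illustration, not MobileAccessibility's actual API; the function and parameter names are hypothetical.

```python
# Hypothetical sketch of balancing automated and human-powered web
# services: prefer the cheap automated answer, fall back to a human.
from typing import Callable, Optional, Tuple

def answer_request(
    request: bytes,
    auto_service: Callable[[bytes], Tuple[Optional[str], float]],
    human_service: Callable[[bytes], str],
    min_confidence: float = 0.8,
) -> str:
    """Return an answer for the user's request (e.g. a photo to label).

    auto_service returns (answer, confidence); human_service always
    returns an answer but costs more and takes longer.
    """
    answer, confidence = auto_service(request)
    if answer is not None and confidence >= min_confidence:
        return answer            # automated result is good enough
    return human_service(request)  # escalate to a human worker
```

Plugging in stub services shows the behavior: a confident automated answer is returned directly, while an uncertain one is routed to the human-powered service.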
The ASL-STEM Forum is part of a research venture at the University of Washington which seeks to remove a fundamental obstacle currently in the way of deaf scholars, both students and professionals. Due to its relative youth and widely dispersed user base, American Sign Language (ASL) has never developed standardized vocabulary for the many terms that have arisen in advanced Science, Technology, Engineering, and Mathematics (STEM) fields.
WebAnywhere is a non-visual interface to the web that requires no new software to be downloaded or installed. It works right in the browser, which means you can access it from any computer, even locked-down public computer terminals. WebAnywhere enables you to interact with the web much as you would with other screen readers.
Modern smart phones like the iPhone, Windows phones, and Google Android phones can output speech and vibration, which makes them suitable multi-modal devices for blind children to learn Braille, along with other skills that will be useful throughout life. V-Braille is an infrastructure for smart phones that employs vibration as the means for reading Braille. Additionally, it uses both vibration and text-to-speech for entering (or "writing") Braille.
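One way to picture vibration-based Braille reading is to divide the touchscreen into six regions laid out like a Braille cell and vibrate only while the finger is over a raised dot. The sketch below is a minimal illustration under that assumption; the screen dimensions, dot numbering convention (dots 1-3 down the left column, 4-6 down the right), and function names are illustrative rather than V-Braille's actual implementation.

```python
# Minimal sketch of vibration-based Braille reading: the screen is a
# 2-column x 3-row grid shaped like a Braille cell, and the phone
# vibrates when the touch point falls on a raised dot.
def dot_under_finger(x: float, y: float, width: int, height: int) -> int:
    """Map a touch coordinate to a Braille dot number (1-6)."""
    col = 0 if x < width / 2 else 1       # left or right column
    row = min(int(y / (height / 3)), 2)   # one of three rows, top to bottom
    return row + 1 + 3 * col              # dots 1-3 left, dots 4-6 right

def should_vibrate(x: float, y: float, width: int, height: int,
                   raised_dots: set[int]) -> bool:
    """Vibrate only while the finger is over a raised dot of the cell."""
    return dot_under_finger(x, y, width, height) in raised_dots
```

For example, with the Braille letter "b" (dots 1 and 2 raised) on a 480x800 screen, a touch near the top-left vibrates, while a touch in the bottom-right corner does not.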