most significant bits
newsletter of uw computer science & engineering
volume 24, number 2, winter 2015

contents
Internet of Things
Research highlights
Age progression software
Brain-to-brain communication
Chair’s message
Alumni profile: Captricity
2014 faculty additions
Faculty awards and honors
TR35 winners
Anderson's USENIX awards
Domingos' KDD Award
Fox IEEE Fellow
News and events
Taskar Center launches
Upcoming events
Datagrams

Make an impact: support CSE

Supporting UW CSE is a left-brain decision. Your gift provides the department with resources for scholarships, fellowships, research support, and funds to build the CSE community we hope you value. Small gifts might not seem significant by themselves, but when pooled together they make great things possible. As you think about giving, please consider making a gift to CSE. Go to our website (www.cs.washington.edu) and select “Support CSE.” You will find a variety of funds that can benefit from your support.

About MSB

MSB is a twice-yearly publication of UW CSE supported by the Industry Affiliates Program.

Editor: Kay Beck-Benton.
Contributors: Ed Lazowska, Hank Levy, Sandy Marvinney, Kristin Osborne, S. Morris Rose
Photo credits: Bruce Hemingway, Mary Levin, Jose Mandojana

We want to hear from you!

Do you have news you’d like to share with the CSE community? Comments or suggestions for future issues of MSB? Let us know! Email the editors at msb@cs.washington.edu and be sure to visit us online at:

Sign up for MSB email

MSB is now available via email. To sign up, send an email to

Picture this: New software reveals how people will look as they age

If you have ever wondered what your toddler will look like when he or she grows up, there may soon be an app for that. New illumination-aware age progression software developed by Ira Kemelmacher-Shlizerman and her colleagues in CSE’s GRAIL Group can generate remarkably accurate images of an individual’s face at multiple ages from a single photograph.


The software, which runs on any standard computer, draws on thousands of random Internet photos to compute the average pixel arrangement of different parts of the face at various ages. It then applies the differences in shape and texture between age groups to the input photo. The software corrects for variations in lighting and facial expression and generates age-progressed images out to 80 years old -- all in about 30 seconds.
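To make the idea concrete, here is a minimal sketch in Python/NumPy of the "average difference between age brackets" step, assuming face photos have already been detected, aligned, and binned by age. It is an illustration only, not the GRAIL pipeline: it models just the texture shift and omits the shape warping and illumination correction described above, and all function and variable names are placeholders.

```python
# Minimal sketch of age progression via average-appearance differences.
# Assumes pre-aligned RGB face images (H x W x 3 float arrays) grouped by
# age bracket; not the GRAIL implementation, which also warps shape and
# corrects for illumination.
import numpy as np

def average_face(aligned_faces):
    # Mean appearance of an age bracket, pixel by pixel.
    return np.mean(np.stack(aligned_faces), axis=0)

def age_progress(input_face, faces_current_age, faces_target_age):
    # Shift the input toward the target bracket by adding the difference
    # between the two brackets' average appearances (texture only).
    delta = average_face(faces_target_age) - average_face(faces_current_age)
    return np.clip(input_face + delta, 0.0, 255.0)
```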

“Aging the faces of very young children from a single photo is considered the most difficult of all scenarios,” noted Kemelmacher-Shlizerman. “Our method generates results so convincing that the majority of people who participated in our user studies could not distinguish between the age-progressed photo and the real one.”

Her work will do more than satisfy people’s curiosity about their future selves. It could also change the face of missing child cases by providing families and law enforcement with a more accurate tool for determining what victims look like years after their disappearance. According to Kemelmacher-Shlizerman, aging is one of many factors that interest her team, which also includes Supasorn Suwajanakorn and Steven M. Seitz.

“I am intrigued by the prospect of finding a representation of everyone in the world,” she said. “The massive amount of facial photos captured digitally presents exciting possibilities for the future of computer vision research.”

To learn more, visit: grail.cs.washington.edu/aging/ and homes.cs.washington.edu/~kemelmi/.

UW CSE researchers successfully demonstrate brain-to-brain communication over the Web

Raj Rao

Cat got your tongue? Someday, it may not matter if you’re at a loss for words, because your brain may be able to communicate without them.

A team of UW researchers led by CSE professor Raj Rao recently replicated their groundbreaking 2013 experiment that established a direct brain-to-brain connection between two people. The results of the latest, more comprehensive demonstration, involving six people, were published this fall in the journal PLOS ONE.

In the demonstration, each pair of participants -- a sender and a receiver -- was placed in two different locations on the UW campus. The sender was connected to an electroencephalography (EEG) machine, which read his or her brain activity and sent a signal over the Web to the receiver, who wore a transcranial magnetic stimulation (TMS) coil placed near the part of the brain that controls hand movements.

Participants were asked to cooperate in playing a computer game using only the link between their brains. The sender could see the game but could not physically control the gameplay, while the receiver could not see the game but had control of the touchpad that operated it. The only way the sender could meet the game’s objective of firing a cannon was to think about moving his or her hand, which would in turn cause the receiver’s hand to twitch on the touchpad across campus. While accuracy varied among participants, one pair achieved an accuracy rate of 83 percent.
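For readers curious how such a link might be wired up, the sketch below shows a highly simplified sender-side loop in Python. It is not the team's experimental code: the URL, threshold, and EEG-reading function are hypothetical placeholders, and the real system relies on calibrated EEG signal processing and dedicated TMS hardware on the receiver's side.

```python
# Simplified, hypothetical sketch of the sender side of a brain-to-brain link.
# Not the published experimental code: the endpoint, threshold, and EEG reader
# are placeholders standing in for real hardware and signal processing.
import random
import time
import urllib.request

RECEIVER_URL = "http://receiver.example.edu/stimulate"  # hypothetical endpoint
THRESHOLD = 0.7                                          # assumed detection threshold

def read_motor_imagery_score():
    # Stand-in for a calibrated EEG feature (e.g., motor-cortex band power);
    # returns a random value only so the sketch runs without EEG hardware.
    return random.random()

def sender_loop(trials=100):
    for _ in range(trials):
        if read_motor_imagery_score() > THRESHOLD:
            # Signal the receiver's machine over the Web; in the experiment this
            # triggers a TMS pulse that makes the receiver's hand twitch and
            # press the touchpad to fire the cannon.
            urllib.request.urlopen(RECEIVER_URL, data=b"fire")
        time.sleep(0.1)
```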

In addition to assessing how often each pair successfully executed the “fire” command, the researchers were able to quantify the amount of information conveyed between the two brains. Next, the researchers want to move beyond quantity to focus on the quality of that information.

Brain-to-brain interface demonstration: UW students Darby Losey, top, and Jose Ceballos are positioned in two different buildings on campus as they would be during a brain-to-brain interface demonstration.

“We are still a long way from being able to directly communicate abstract knowledge, thoughts, feelings, or skills,” said Rao, who was lead author on the study. “But we hope our simple demonstration will serve as a stepping stone for achieving more complex types of brain-to-brain interaction in the future, and as an inspiration for using this new paradigm to better understand brain function.”

The study was coauthored by assistant professors Andrea Stocco and Chantel Prat of UW's Institute for Learning & Brain Sciences, CSE student Joseph Wu, Devapratim Sarma and Tiffany Youngquist of UW Bioengineering, and CSE alumnus Matthew Bryan. Initial funding was provided by grants from the UW Royalty Research Fund and the U.S. Army Research Office, which was recently followed by a new $1 million grant from the W.M. Keck Foundation.

Computer Science & Engineering, Box 352350, University of Washington, Seattle, WA 98195-2350