Title: Expanding Interface Design Capabilities through Semantic and Data-Driven Analyses

Advisors: James Fogarty and Amy Ko

Supervisory Committee: James Fogarty (Co-Chair), Amy Ko (Co-Chair), Audrey Desjardins (GSR, Art, Art History, and Design), and Rastislav Bodik

Abstract: The design of an interface can have a huge impact on human productivity, creativity, safety, and satisfaction. It is therefore crucial that we provide user interface designers with tools that make them more efficient, more creative, and better able to understand their users. However, designers face key challenges in their tools throughout the design process. When prototyping, designers explore alternative layouts for their interfaces, but they are limited to the layouts they can ideate and sketch, create in their prototyping tools, or find in examples that do not contain their interface elements. In usability testing, designers can conduct large-scale studies and deploy their interfaces to gather data from crowdworkers; however, such studies can be expensive, time-consuming, and difficult to conduct iteratively throughout the design process. Finally, designers often find that existing interfaces can be a platform for prototyping and enabling new forms of interaction, but existing interfaces are often rigid and difficult to modify at runtime. In this dissertation, I explore how we can use advanced techniques from program analysis, program synthesis, and machine learning to enable semantic and data-driven analyses of interfaces.

If we augment interface design tools with capabilities for understanding, transforming, augmenting, and analyzing an interface design, we can advance designers' capabilities. Through semantic analysis of interfaces, we can help designers ideate more rapidly, prototype more efficiently, and evaluate the usability of their interface designs more iteratively and cheaply. I demonstrate this through four systems that (1) let designers rapidly ideate alternative layouts through mixed-initiative interaction with high-level constraints and feedback, (2) help designers adapt examples more efficiently by inferring a semantic vector representation from an example screenshot, (3) enable designers to quickly and cheaply analyze a key aspect of interface usability through a machine learning approach for modeling mobile interface tappability, and (4) let designers prototype new modalities for existing web interfaces by applying program analysis to infer an abstract model of interface commands.

Place: CSE2 (Gates Center) 371
When: Wednesday, December 11, 2019 - 10:00 to 12:00