Title: Learning to Compile Learning

Advisor: Zach Tatlock

Supervisory Committee: Zach Tatlock (Chair), Eric Klavins (GSR, ECE), Luis Ceze, and Carlos Guestrin

Abstract: 

Frameworks for writing, compiling, and optimizing deep learning (DL) models have recently enabled progress in areas like computer vision and natural language processing. Extending these frameworks to accommodate the rapidly diversifying landscape of DL models and hardware platforms presents challenging tradeoffs between expressiveness, composability, and portability. I propose a design for an intelligent end-to-end deep learning compiler that can automatically adapt high-level programs to run on new hardware devices with human-level performance. I will demonstrate that a shared compiler and intermediate representation (IR) provide useful infrastructure for frameworks and tools, including tools built at the University of Washington (VTA) as well as at Amazon (MXNet), Facebook (PyTorch), and other companies.

Place: 
CSE2 271 (Gates Center)
When: 
Monday, July 22, 2019 - 10:00