Qingqing Cao's research interests include efficient NLP, mobile computing, and machine learning systems. His current focus is building efficient and practical NLP systems for diverse platforms, including resource-constrained edge devices and cloud servers. Previously, he worked on projects such as faster Transformer models for question answering (ACL 2020) and accurate, interpretable energy estimation of NLP models (ACL 2021).
Wenya is a research fellow at the Paul G. Allen School of Computer Science and Engineering at the University of Washington, advised by Noah Smith and Hanna Hajishirzi. Her research interests lie in deep learning, logical reasoning, and their applications in natural language processing, such as information extraction and natural language understanding.
Sean Welleck is a Postdoctoral Scholar at UW, working with Yejin Choi. His research interests include deep learning and structured prediction, with applications in natural language processing. He completed his Ph.D. at New York University, advised by Kyunghyun Cho and Zheng Zhang, and obtained B.S.E. and M.S.E. degrees from the University of Pennsylvania. His research has been published at ICML, NeurIPS, ICLR, and ACL, and has received two Nvidia AI Labs Pioneering Research Awards.
Tao Yu works with Professors Noah Smith and Mari Ostendorf in the UW Natural Language Processing group. His research focuses on designing and building conversational natural language interfaces (NLIs) that help humans explore and reason over data in any application (e.g., relational databases and mobile apps) in a robust and trusted manner. Tao completed his Ph.D. at Yale University.