Hello! I am a third-year PhD student in computer science at the University of Washington, advised by Yejin Choi and Noah Smith. My research area is natural language processing, and I am particularly interested in developing algorithms for dataset creation that lead to more robust and reliable models. I am supported by the NSF Graduate Research Fellowship.

I received my Bachelor’s degree from Northwestern University, majoring in computer science and math. There, I was super fortunate to get my first taste of research through projects in natural language processing and computer audition with Professor Doug Downey, Professor Bryan Pardo, and Dr. Prem Seetharaman.

Education
  • PhD student, 2020 - present

    University of Washington

  • BA in Computer Science and Mathematics, 2020

    Northwestern University

Publications

We're Afraid Language Models Aren't Modeling Ambiguity.
Preprint 2023.

PDF Cite Code Dataset

Self-Instruct: Aligning Language Models with Self-Generated Instructions.
ACL 2023.

PDF Cite Code

Detoxifying Text with MaRCo: Controllable Revision with Experts and Anti-Experts.
ACL 2023.

PDF Cite

Generated Knowledge Prompting for Commonsense Reasoning.
ACL 2022.

PDF Cite Code

DExperts: Decoding-Time Controlled Text Generation with Experts and Anti-Experts.
ACL 2021.

PDF Cite Code Slides Media

Model Selection for Deep Audio Source Separation via Clustering Analysis.
DCASE 2020 (Best Student Paper Award).

PDF Cite Slides Talk

Bach or Mock? A Grading Function for Chorales in the Style of J.S. Bach.
ML4MD Workshop at ICML 2020.

PDF Cite Code Poster

Incorporating Music Knowledge in Continual Dataset Augmentation for Music Generation.
ML4MD Workshop at ICML 2020.

PDF Cite Code Poster

CODAH: An Adversarially-Authored Question Answering Dataset for Common Sense.
RepEval Workshop at NAACL 2019.

PDF Cite Dataset