Lucy Lu Wang

Allen Library 199D
University of Washington
Seattle, WA 98105
Assistant Professor, Information School, University of Washington
Adjunct, Paul G. Allen School of Computer Science & Engineering
Adjunct, Biomedical Informatics & Medical Education
Adjunct, Human Centered Design & Engineering
Research Scientist, Allen Institute for AI (Ai2)
Hello and welcome! I am an NLP and health informatics researcher, studying how to design and build language technologies that improve access to and understanding of information, especially in high-expertise domains like science and healthcare. My work has resulted in techniques and tools that improve access to scholarly content, synthesize scientific evidence, and help people make better decisions about their health.
I am an Assistant Professor at the University of Washington Information School, where I lead the Language Accessibility Research (LARCH) lab. You can find a list of our publications here.
Curriculum vitae: PDF
news
02/06 | New paper acceptances! “Know Your Limits: A Survey of Abstention in Large Language Models” (led by Bingbing Wen) has been accepted to TACL and “Varying Shades of Wrong: Aligning LLMs with Wrong Answers Only” (led by Jihan Yao) has been accepted to ICLR!
12/20 | Our paper “Explainable AI for Clinical Outcome Prediction: A Survey of Clinician Perceptions and Preferences”, led by lab alum Jun Hou, has been accepted to the AMIA Informatics Summit!
11/05 | I received a Google Academic Research Award for “AI-Driven Accessibility Solutions for STEM Educational Materials: Bridging Accessibility Gaps for Blind and Visually Impaired Students”!
10/16 | New preprint: “Varying Shades of Wrong: Aligning LLMs with Wrong Answers Only”, showing the benefits of preference alignment using only wrong answers.
10/03 | Two 🌴EMNLP🌴 paper acceptances! “APPLS: Evaluating Evaluation Metrics for Plain Language Summarization” to the main conference and “Characterizing LLM Abstention Behavior in Science QA with Context Perturbations” to Findings. |
select recent publications