I specialize in designing intelligent interactive systems that leverage uncertain input technologies. A particular focus of my research is systems that enhance the capabilities of users with permanent or situationally induced disabilities. My broader interests include human-computer interaction (HCI), speech and language processing, mobile interfaces, and crowdsourcing.
What's new:
January 2023 – We had three papers accepted at IUI 2023! PhD student Dylan Gaines will present FlexType, his new ambiguous eyes-free input method. We have an open science paper on our new dataset of noisy QWERTY typing. Finally, we have a study about the dwell-free gaze keyboard we created for Tobii Dynavox.
December 2022 – Are you interested in how to design user interfaces or model users? Check out the new book Bayesian Methods for Interaction and Design. PhD student Dylan Gaines was the lead author of our chapter on Statistical Keyboard Decoding.
May 2022 – Our demo of Nomon was selected as runner-up for best demo at CHI 2022!
April 2022 – Our paper on Nomon, an interface for single-switch Augmentative and Alternative Communication (AAC) users, will appear at CHI 2022 as a full paper and as a demo paper. Come see us at CHI Interactivity on Monday night, or visit https://nomon.app/demo. Our talk is at 9:15 on Tuesday morning; see the CHI program. You can also watch a pre-recorded video. Congrats to graduate student Nicholas Bonaker (MIT) on his first CHI paper!
July 2021 – I'm giving a 3-hour master class on Language Modeling and Predictive AAC at ISAAC Connect. My session is Monday August 9th starting at 9am Eastern. Hope to (virtually) see you there! [Talk slides]
April 2021 – Do you use Augmentative and Alternative Communication (AAC)? If so, we are looking for volunteers to donate sentences from your AAC history. See our study details.