I specialize in designing intelligent interactive systems that leverage uncertain input technologies. A particular focus of my research is systems that enhance the capabilities of users with permanent or situationally induced disabilities. My broader interests include human-computer interaction (HCI), speech and language processing, mobile interfaces, and crowdsourcing.
What's new:
March 2024 – Our evaluation of the eyes-free text input method FlexType with users with visual impairments was accepted at PETRA 2024.
July 2023 – Our paper evaluating the Nomon interface with seven motor-impaired users was accepted for presentation at ASSETS 2023. Most users selected pictures faster with Nomon and reported that Nomon felt faster when writing text. Congrats to student first-author Nicholas Bonaker (MIT)!
July 2023 – I will present our work on Nomon, a single-switch Augmentative and Alternative Communication (AAC) interaction method, at a workshop at ISAAC 2023. How does Nomon work? Norman, our adorable Nomon mascot, will teach you all you need to know in this demo that runs in your browser.
June 2023 – I will present our work on Nomon, a single-switch Augmentative and Alternative Communication (AAC) interaction method, at a workshop at the 2023 BCI Society meeting.
May 2023 – Our paper on programming by voice was accepted for presentation at the ACM Conference on Conversational User Interfaces (CUI 2023). Congrats to PhD student Sadia Nowrin on her first full paper!
January 2023 – We had three papers accepted at IUI 2023! PhD student Dylan Gaines will present his new ambiguous eyes-free input method, FlexType. We have an open science paper on our new dataset of noisy QWERTY typing. Finally, we have a study of the dwell-free gaze keyboard we created for Tobii Dynavox.