Complementing Text Entry Evaluations with a Composition Task


Keith Vertanen, Per Ola Kristensson

ACM Transactions on Computer-Human Interaction, 2014.

A common methodology for evaluating text entry methods is to ask participants to transcribe a predefined set of memorable sentences or phrases. In this article, we explore whether we can complement the conventional transcription task with a more externally valid composition task. In a series of large-scale crowdsourced experiments, we found that participants could consistently and rapidly invent high-quality and creative compositions with only modest reductions in entry rates. Based on our series of experiments, we provide a best-practice procedure for using composition tasks in text entry evaluations. This includes a judging protocol that can be performed either by the experimenters or by crowdsourced workers on a microtask market. We evaluated our composition task procedure using a text entry method unfamiliar to participants. Our empirical results show that the composition task can serve as a valid complementary text entry evaluation method.
