A longitudinal evaluation of hands-free speech-based navigation during dictation

Jinjuan Feng, Andrew Sears, Clare-Marie Karat

Research output: Contribution to journal › Article › peer-review


Abstract

Despite a reported recognition accuracy rate of 98%, speech recognition technologies have yet to be widely adopted by computer users. When considering hands-free use of speech-based solutions, as is the case for individuals with physical impairments that interfere with the use of traditional solutions such as a mouse, the considerable time required to complete basic navigation tasks presents a significant barrier to adoption. Several solutions were proposed to improve navigation efficiency based on the results of a previous study. In the current study, a longitudinal experiment was conducted to investigate the process by which users learn to use hands-free speech-based navigation in the context of large-vocabulary, continuous dictation tasks, as well as the efficacy of the proposed solutions. Because initial interactions influence the adoption of speech-based solutions, the current study focused on these critical initial interactions of individuals with no prior experience using speech-based dictation solutions. Our results confirm the efficacy of the solutions proposed earlier while providing valuable insights into the strategies users employ when issuing speech-based navigation commands, as well as the design decisions that can influence these patterns.

Original language: English (US)
Pages (from-to): 553-569
Number of pages: 17
Journal: International Journal of Human-Computer Studies
Volume: 64
Issue number: 6
DOIs
State: Published - Jun 1 2006

All Science Journal Classification (ASJC) codes

  • Software
  • Human Factors and Ergonomics
  • Education
  • Engineering(all)
  • Human-Computer Interaction
  • Hardware and Architecture
