TY - GEN
T1 - Accessible gesture typing for non-visual text entry on smartphones
AU - Billah, Syed Masum
AU - Ko, Yu Jung
AU - Ashok, Vikas
AU - Bi, Xiaojun
AU - Ramakrishnan, I. V.
N1 - Funding Information:
We thank anonymous reviewers for their insightful feedback. This research was supported in part by NSF: 1806076, NEI/NIH: R01EY02662, NIDILRR: 90IF0117-01-00, and a Google Faculty Research Award (2018).
Publisher Copyright:
© 2019 Copyright held by the owner/author(s).
PY - 2019/5/2
Y1 - 2019/5/2
N2 - Gesture typing–entering a word by gliding the finger sequentially from letter to letter–has been widely supported on smartphones for sighted users. However, this input paradigm is currently inaccessible to blind users: it is difficult to draw shape gestures on a virtual keyboard without access to key visuals. This paper describes the design of accessible gesture typing, bringing this input paradigm to blind users. To help blind users locate keys, the design incorporates the familiar screen-reader-supported touch exploration, which narrates the keys as the user drags a finger across the keyboard. The design lets users seamlessly switch between exploration and gesture-typing modes by simply lifting the finger. Continuous audio feedback, similar to that of touch exploration, is provided during word-shape construction, helping the user glide toward the key locations constituting the word. Exploration mode resumes once the word shape is completed. Distinct earcons distinguish gesture-typing mode from touch-exploration mode, thereby avoiding unintended mix-ups. A user study with 14 blind people showed a 35% increase in typing speed, indicative of the promise and potential of gesture typing for non-visual text entry.
AB - Gesture typing–entering a word by gliding the finger sequentially from letter to letter–has been widely supported on smartphones for sighted users. However, this input paradigm is currently inaccessible to blind users: it is difficult to draw shape gestures on a virtual keyboard without access to key visuals. This paper describes the design of accessible gesture typing, bringing this input paradigm to blind users. To help blind users locate keys, the design incorporates the familiar screen-reader-supported touch exploration, which narrates the keys as the user drags a finger across the keyboard. The design lets users seamlessly switch between exploration and gesture-typing modes by simply lifting the finger. Continuous audio feedback, similar to that of touch exploration, is provided during word-shape construction, helping the user glide toward the key locations constituting the word. Exploration mode resumes once the word shape is completed. Distinct earcons distinguish gesture-typing mode from touch-exploration mode, thereby avoiding unintended mix-ups. A user study with 14 blind people showed a 35% increase in typing speed, indicative of the promise and potential of gesture typing for non-visual text entry.
UR - http://www.scopus.com/inward/record.url?scp=85067596223&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85067596223&partnerID=8YFLogxK
U2 - 10.1145/3290605.3300606
DO - 10.1145/3290605.3300606
M3 - Conference contribution
AN - SCOPUS:85067596223
T3 - Conference on Human Factors in Computing Systems - Proceedings
BT - CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
T2 - 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019
Y2 - 4 May 2019 through 9 May 2019
ER -