Spindex (speech index) improves auditory menu acceptance and navigation performance

Date of Publication: April 2011

Users interact with mobile devices through menus, which can include many items. Auditory menus have the potential to make those devices more accessible to a wide range of users. However, auditory menus are a relatively new concept, and few guidelines describe how to design them. In this paper, we detail how visual menu concepts may be applied to auditory menus in order to help develop design guidelines. Specifically, we examine how to optimize the design of a new auditory contextual cue, called the “spindex” (i.e., speech index). We developed and evaluated several design alternatives for the spindex and iteratively refined the design with both sighted and visually impaired users. The “attenuated” spindex proved best in terms of both preference and performance across user groups. Nevertheless, sighted and visually impaired participants gave slightly different responses and feedback. Results are discussed in terms of acoustical theory, practical display design, and assistive technology design.
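The abstract does not spell out the cue-generation rule, but a spindex is commonly described as the spoken initial letter of a menu item played just before the item itself, with the "attenuated" variant playing that cue at full volume only when the initial letter changes and quietly otherwise. As a rough sketch of that idea (the function name, data shape, and volume levels here are illustrative assumptions, not values from the paper):

```python
def spindex_cues(items, full=1.0, attenuated=0.3):
    """Pair each menu item with a spindex cue.

    The cue is the item's initial letter, to be spoken before the
    item itself. In the 'attenuated' design, the cue plays at full
    volume only when the initial letter changes from the previous
    item, and at a reduced volume for repeated letters.
    Volume values (1.0 / 0.3) are illustrative placeholders.
    """
    cues = []
    prev_letter = None
    for item in items:
        letter = item[0].upper()
        volume = full if letter != prev_letter else attenuated
        cues.append((item, letter, volume))
        prev_letter = letter
    return cues

# Example: an alphabetized contact list
for item, letter, vol in spindex_cues(["Alice", "Adam", "Bob", "Beth"]):
    print(f"cue {letter} @ {vol:.1f} -> {item}")
```

Scanning such a list quickly, a user hears mostly the short letter cues and can skip ahead until the target letter is reached, then attend to the full item names.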


The contents of this website were developed under a grant from the National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR grant number 90RE5025-01-00). NIDILRR is a Center within the Administration for Community Living (ACL), Department of Health and Human Services (HHS). The contents of this website do not necessarily represent the policy of NIDILRR, ACL, or HHS, and you should not assume endorsement by the Federal Government.