
Evaluating Non-Visual Interaction for Content Navigation and Targeting Tasks


dc.contributor.advisor Dr. Paul Janecek (Chairperson)
dc.contributor.author Charoenchaimonkon, Eakachai
dc.contributor.other Dr. Matthew Dailey (Member); Dr. Atiwong Suchato (Member)
dc.date.accessioned 2015-01-29T04:21:29Z
dc.date.available 2015-01-29T04:21:29Z
dc.date.issued 2013-12
dc.identifier.other AIT Diss. no. IM-13-03 en_US
dc.identifier.uri http://www.cs.ait.ac.th/xmlui/handle/123456789/716
dc.description.abstract Research on interaction techniques that entirely exclude visual perception is rare, but such techniques are a fundamental step toward reducing barriers to GUI access for visually impaired users. A limitation of current interaction techniques for the visually impaired, such as Automatic Speech Recognition (ASR) technology and keyboard input devices, is that they support only indirect feedback and discrete control rather than the immediate feedback and continuous control offered by a visual cursor and a pointing device such as a mouse. The goal of this thesis is to develop and characterize techniques that give non-visual users direct, continuous control in two basic tasks: content navigation and target acquisition. The first set of experiments focuses on auditory input and displays. The results indicate that non-speech features of the human voice, such as a continuous vowel sound, can enhance the operation of ASR technology and provide users with immediate feedback and continuous control. Participants were able to continuously navigate linear and hierarchical structures using only speech and non-speech sound for input and feedback, but preferred linear structures and conversational interaction, which reduce the conflict between the user's hearing and vocalizing capabilities. The second set of experiments focuses on pen- and mouse-based input with tactile feedback. The results indicate that Fitts' law models non-visual targeting tasks accurately when the target distance is small, but becomes less accurate as distance increases. Haptic feedback that marks a target with sharp contrast, such as continuous vibration over the target area or a bump at its borders, reduces acquisition time and enables users to select targets as small as 4 pixels wide; subtle changes, such as varying surface friction, are ineffective. Furthermore, haptic reference points significantly improve accuracy, but reduce overall task completion time. Based on these experimental results and a thorough review of the literature, this thesis proposes guidelines for the design of non-visual interfaces using auditory displays and tactile feedback. These results and design guidelines are an important step toward interfaces that offer visually impaired users not just accessibility but also immediate feedback and continuous control. en_US
dc.description.sponsorship Office of the Civil Service Commission, Thailand en_US
dc.publisher AIT en_US
dc.subject multimodal user interface, non-visual interaction, auditory display, tactile feedback, content navigation, target acquisition en_US
dc.title Evaluating Non-Visual Interaction for Content Navigation and Targeting Tasks en_US
dc.type Dissertation en_US
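The targeting experiments summarized in the abstract above evaluate Fitts' law, commonly written in its Shannon formulation as MT = a + b * log2(D/W + 1), where D is the distance to the target, W is its width, and a and b are empirically fitted coefficients. The minimal Python sketch below illustrates only the formulation itself; the coefficient values are placeholders, not the fitted parameters reported in the dissertation.

    import math

    def fitts_movement_time(distance_px: float, width_px: float,
                            a: float = 0.2, b: float = 0.1) -> float:
        """Predicted movement time (s) under the Shannon formulation of
        Fitts' law. a and b are device- and user-specific regression
        coefficients; the defaults here are illustrative placeholders."""
        index_of_difficulty = math.log2(distance_px / width_px + 1)  # bits
        return a + b * index_of_difficulty

    # A 4-pixel-wide target (the smallest size acquired in the haptic
    # experiments) carries a much higher index of difficulty than a
    # 64-pixel one at the same distance, hence a longer predicted time.
    print(fitts_movement_time(400, 4))   # ID ~ 6.66 bits
    print(fitts_movement_time(400, 64))  # ID ~ 2.86 bits

Note that the index of difficulty grows only logarithmically with distance; this is the relationship the abstract reports breaking down at larger distances in the non-visual setting.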

