DSpace Repository

Color Identification Mobile Application for Visually Impaired People: An Integration of Haptic and Auditory Sensories

dc.contributor.advisor Esichaikul, Vatcharaporn
dc.contributor.author Raksasakul, Punchaya
dc.contributor.other Dailey, Matthew N.
dc.contributor.other Janecek, Paul
dc.date.accessioned 2017-08-18T08:08:31Z
dc.date.available 2017-08-18T08:08:31Z
dc.date.issued 2017-08
dc.identifier.uri http://www.cs.ait.ac.th/xmlui/handle/123456789/876
dc.description.abstract One of the challenges faced by blind people is the inability to perceive color, which reduces their self-confidence in accomplishing daily activities: identifying and distinguishing objects such as clothes, bank notes, and consumer goods; perceiving artistic works or colored graphical images; or even participating in social activities. This research aims to enhance the color perception of visually impaired and blind people by integrating haptic and auditory feedback to present color information on a smartphone with an interactive user interface for blind users. The mobile application for color identification is designed based on the requirements of 22 blind people, purposively selected for this research, and is implemented on the Android platform. Sixteen basic colors used by blind people in daily life are encoded as sixteen unique vibration patterns designed according to the RYB color mixing theory of Johannes Itten (Itten, J., 1992). Haptic and auditory feedback are integrated synchronously: hand movement serves as the input stimulus to hold and move the mobile camera and acquire an image of the object. A vibration signal is then generated, and the name of the dominant color of the object in the focus area is verbalized through the mobile speaker. As an illustration, the blind participants used the developed application to identify the color of clothes in three tasks across four different environments: natural light, artificial light, low light, and a noisy environment. To evaluate the performance of the developed application, effectiveness, efficiency, and user satisfaction were measured. Effectiveness was evaluated as the accuracy of identifying the correct color. Efficiency was measured as time-based efficiency, calculated from the time the participants took to complete each task. User satisfaction was measured through a survey.
In addition, feedback from the participants was collected for future improvement. As a result, the proposed mobile application achieves high accuracy in the three lighting conditions (natural light, artificial light, and low light), with accuracies of 90.36%, 81.34%, and 80.09%, respectively. Moreover, the application also performs well in a noisy environment (84.49% accuracy). Finally, users are highly satisfied with the mobile application; in particular, the audible guide is considered the most important feature, as it helps them learn and use the application by themselves. en_US
dc.description.sponsorship HM King en_US
dc.language.iso en_US en_US
dc.publisher AIT en_US
dc.subject mobile application en_US
dc.subject haptic en_US
dc.title Color Identification Mobile Application for Visually Impaired People: An Integration of Haptic and Auditory Sensories en_US
dc.type Thesis en_US
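The color-identification step described in the abstract, mapping the dominant color in the camera's focus area to one of sixteen basic colors, can be sketched as a nearest-color lookup. The color names and RGB palette values below are illustrative assumptions for this sketch, not the thesis's actual palette or encoding:

```python
# Illustrative sketch (not the thesis's implementation): classify a dominant
# RGB value to the nearest of 16 assumed basic colors by squared Euclidean
# distance in RGB space.
BASIC_COLORS = {
    "black": (0, 0, 0), "white": (255, 255, 255), "gray": (128, 128, 128),
    "red": (255, 0, 0), "green": (0, 128, 0), "blue": (0, 0, 255),
    "yellow": (255, 255, 0), "orange": (255, 165, 0), "purple": (128, 0, 128),
    "pink": (255, 192, 203), "brown": (139, 69, 19), "cyan": (0, 255, 255),
    "magenta": (255, 0, 255), "lime": (0, 255, 0), "navy": (0, 0, 128),
    "beige": (245, 245, 220),
}

def nearest_basic_color(rgb):
    """Return the name of the basic color closest to the given (r, g, b)."""
    return min(
        BASIC_COLORS,
        key=lambda name: sum((c1 - c2) ** 2
                             for c1, c2 in zip(BASIC_COLORS[name], rgb)),
    )

print(nearest_basic_color((250, 10, 5)))  # a near-red pixel → "red"
```

In the application as described, the returned color name would drive both outputs: selecting the corresponding vibration pattern and being spoken through the phone's speaker.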

