BlindShell: Android Multi-Touch Gesture Interface for Blind Users
Abstract
Background
Nowadays, the majority of blind users still rely on traditional mobile phones with hardware keyboards, accompanied by special screen-reader software (e.g., MobileSpeak). Despite the fast penetration of touch-screen smartphones, there are few apps targeted at blind users (such as TalkBack), and none of them use exclusively multi-touch gesture control.
Objective
Our primary goal was to design an Android launcher app that covers basic functions (calls, SMS, contacts, calendar, alarm, reminders) solely through a multi-touch gesture approach.
Method
A user-centered design process was applied. First, a qualitative study was conducted (10 interviews of 60-90 minutes with a blind focus group). Next, quantitative research with 30 blind participants was carried out; 82% of the users did not possess any proficiency with PCs or smartphones (SM). An Android 4.0 prototype was then implemented with Acapela text-to-speech support. Finally, the prototype was iteratively tested and re-implemented with a group of 10 blind users (no PC/SM skills) using 2 scenarios (4 iterations in total).
Results
Two modes of GUI control were found to be the most appropriate: (i) left/right-sector tap to move between screens, with long press and two-finger long press serving as the enter and back keys, respectively; and (ii) left/right flick to move between screens, with double tap and long press serving as the enter and back keys, respectively. The final choice of control mode is left to the user via the settings menu. Character recognition (Google GestureLibrary) and alphanumeric text input were preferred by 90% of users. An audio help manual is included in the app and is also available on a supporting web portal.
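The two control modes described above can be sketched as a plain mapping from gestures to actions. This is an illustrative sketch only: the abstract does not publish BlindShell's source code, and the enum names below are our own assumptions, not identifiers from the app.

```java
import java.util.Map;

public class GestureModes {
    // Hypothetical names for the gestures and actions mentioned in the abstract.
    enum Gesture { LEFT_SECTOR_TAP, RIGHT_SECTOR_TAP, LONG_PRESS,
                   TWO_FINGER_LONG_PRESS, LEFT_FLICK, RIGHT_FLICK, DOUBLE_TAP }
    enum Action  { PREV_SCREEN, NEXT_SCREEN, ENTER, BACK }

    // Mode (i): sector taps navigate between screens;
    // long press = enter, two-finger long press = back.
    static final Map<Gesture, Action> MODE_SECTOR_TAP = Map.of(
        Gesture.LEFT_SECTOR_TAP,       Action.PREV_SCREEN,
        Gesture.RIGHT_SECTOR_TAP,      Action.NEXT_SCREEN,
        Gesture.LONG_PRESS,            Action.ENTER,
        Gesture.TWO_FINGER_LONG_PRESS, Action.BACK);

    // Mode (ii): flicks navigate between screens;
    // double tap = enter, long press = back.
    static final Map<Gesture, Action> MODE_FLICK = Map.of(
        Gesture.LEFT_FLICK,  Action.PREV_SCREEN,
        Gesture.RIGHT_FLICK, Action.NEXT_SCREEN,
        Gesture.DOUBLE_TAP,  Action.ENTER,
        Gesture.LONG_PRESS,  Action.BACK);

    public static void main(String[] args) {
        // In mode (ii), a double tap acts as the enter key.
        System.out.println(MODE_FLICK.get(Gesture.DOUBLE_TAP)); // prints ENTER
    }
}
```

Letting the user pick one of two complete mappings in the settings menu (rather than mixing gestures) keeps each mode internally consistent, which matters for non-visual interaction.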
Conclusions
Currently, tens of users are using the app in daily life. The app is regarded as very intuitive: 75% of users are able to make a call and send an SMS after 30 minutes of training. Users who were born blind need 2-3 days of training to use character recognition for text input. The app will be published on the Android Market to gather user feedback. We have contacted several worldwide NGOs to increase the app's penetration. Regarding future work, we are planning migration to Windows 8 and iOS. The authors wish to thank the Vodafone Foundation for its support.

This work is licensed under a Creative Commons Attribution 3.0 License.