Designing a learning environment for visually impaired and blind persons in order to develop touch access to digital content

Matthieu Tixier, Gaëlle Garibaldi, Dominique Aubert, Charles Lenay

University of Technology of Troyes, ICD (CNRS FRE 2019), Tech-CICO, 12 rue Marie Curie, CS 42060, 10004 Troyes, France

University of Technology of Compiègne, EA 2223 Costech, CRED, Centre de Recherche, 60200 Compiègne, France

Corresponding Author Email: firstname.lastname@utt.fr

Page: 201-205

DOI: https://doi.org/10.18280/mmc_c.790409

Received: 28 September 2018

Accepted: 31 October 2018

OPEN ACCESS

Abstract: 

Using screen readers and Braille displays, trained blind persons can nowadays carry out a wide range of activities on computers. However, graphical interfaces and content whose spatial dimension is essential for understanding, such as charts, pictures or most video games, remain largely inaccessible. The Tactos and Intertact.net technologies aim to overcome these limits by providing an efficient sensory supplementation technology that enables blind users to access the spatial dimension of content through touch. Following a participatory design approach, we have worked in cooperation with blind persons to develop a learning environment for touch access to digital content with Tactos. Adoption is crucial when developing technologies, and we report here on the research we conducted to enable blind persons to learn our system independently. From our perspective, this possibility is a cornerstone for the development of a user community.
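
The abstract describes perceptual supplementation only at a high level. As a purely illustrative aid, the sketch below shows one possible way such a sensorimotor coupling could be organised, assuming the digital content is a black-and-white raster image, the pointer position is polled by the host application, and the tactile device exposes a simple on/off stimulus. The names (field_touches_shape, PrintActuator) and the receptive-field parameters are hypothetical placeholders; this is not the actual Tactos or Intertact.net implementation.

```python
# Minimal sketch of a perceptual-supplementation loop in the spirit of Tactos.
# Assumptions (not taken from the published system): black-and-white raster
# content, a polled pointer position, and an all-or-nothing tactile stimulus.

import numpy as np

FIELD_RADIUS = 2  # half-size, in pixels, of the square receptive field around the cursor


def field_touches_shape(image: np.ndarray, x: int, y: int) -> bool:
    """Return True if the receptive field centred on (x, y) overlaps a drawn pixel (> 0)."""
    h, w = image.shape
    x0, x1 = max(0, x - FIELD_RADIUS), min(w, x + FIELD_RADIUS + 1)
    y0, y1 = max(0, y - FIELD_RADIUS), min(h, y + FIELD_RADIUS + 1)
    return bool(np.any(image[y0:y1, x0:x1] > 0))


class PrintActuator:
    """Stand-in for a tactile display driver: raises or lowers a single stimulus."""

    def __init__(self):
        self.active = False

    def set(self, active: bool) -> None:
        if active != self.active:
            self.active = active
            print("stimulus ON" if active else "stimulus OFF")


if __name__ == "__main__":
    # A 20x20 image containing a vertical line: exploring from left to right
    # switches the stimulus on only while the receptive field crosses the line.
    image = np.zeros((20, 20), dtype=np.uint8)
    image[:, 10] = 1

    actuator = PrintActuator()
    for x in range(20):  # simulated horizontal exploration by the user
        actuator.set(field_touches_shape(image, x, y=10))
```

The all-or-nothing stimulus in this sketch reflects the minimalist spirit of perceptual supplementation: spatial information about the shape is meant to emerge from the user's own exploratory movements rather than from a rich static rendering.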

Keywords: 

tactile interfaces, perceptual supplementation, technology learning, visual impairment and blindness

1. Introduction
2. Material and Methods
3. Results
4. Conclusion
Acknowledgement
References
