Universal Access in the Information Society. 2021 Aug 28;22(1):121–131. doi: 10.1007/s10209-021-00838-8

HELF (Haptic Encoded Language Framework): a digital script for deaf-blind and visually impaired

Simerneet Singh 1, Nishtha Jatana 1, Vasu Goel 2
PMCID: PMC8401351  PMID: 34483807

Abstract

Purpose

Digital media has brought a revolution, making the world a global village. For people who are visually impaired and people with visual and hearing impairment, navigating through the digital world can be as precarious as moving through the real world. To enable them to connect with the digital world, we propose a solution, Haptic Encoded Language Framework (HELF), that uses haptic technology to enable them to write digital text using swiping gestures and understand the text through vibrations.

Method

We developed an Android application to present the concept of HELF and evaluate its performance. We tested the application on 13 users (five visually impaired and eight sighted individuals).

Results

The preliminary exploratory analysis of the proposed framework, carried out using the Android application developed, reveals encouraging results. Overall, the reading accuracy was found to be approximately 91%, and the average typing speed was 25.7 characters per minute (CPM).

Conclusion

The volunteer users of the HELF Android application found it useful as a means of accessing digital media and recommended it as an assistive technology for the visually challenged. Their performance results motivate further research and development of the proposed work to make HELF more usable by people who are visually impaired and people with visual and hearing impairment.

Keywords: Haptics, Blind, Visually impaired, People with visual and hearing impairment, Vibrotactile, Assistive technology

Introduction

The use of touch as a mode of communication was proposed by Geldard in the twentieth century [1, 2]. Touch can be an effective channel for interaction and perception. People with eyesight often do not realize the extent to which the sense of touch helps them learn about their surroundings; they perceive their surroundings mostly with their eyes and ears. On the other hand, people who are visually impaired must rely on their non-visual senses in their daily activities. These non-visual senses include hearing, taste, smell, and touch. In the absence of visual ability, the sense of hearing is used primarily [3]; however, speech feedback may not be very useful in public places and other noisy environments [4]. The use of the tactile senses [5] can therefore be beneficial for people who are visually impaired and people with visual and hearing impairment. Over the years, humans have used the sense of touch to learn about their surroundings through non-visual stimuli such as pressure, pain, and changes in temperature, which guide movement through the surrounding space as well as the positioning and movement of the limbs [6]. Haptic technology combines the use of the tactile senses with the field of computer science and engineering [7].

Jones and Sarter [8] provide comprehensive guidance on the use of tactile displays for human–computer interaction. One of the seminal contributions in the use of haptics for people with visual impairment is the work of Bach-Y-Rita et al. [9].

Vibrotactile devices have been used mainly for providing directional guidance while walking and for other complex navigational tasks for the visually disabled [10]. People who are visually impaired develop sharper working memory [11] and have better tactile acuity than sighted people [12, 13].

People who are visually impaired can use screen readers and text-to-speech converters to access the web. Such accessible technologies can be helpful, but they occupy the user's sense of hearing.

In this work, we intend to provide accessibility of the digital world to people who are visually impaired and people with both visual and hearing impairment by presenting a framework (HELF) based on patterns of vibration. HELF enables people with visual impairment to understand and read digital text on their screens through different patterns of short and long vibration pulses, and simple swiping gestures can be used to write digital text. Reading and writing with HELF may initially require time and patience to learn, but can be highly beneficial once learned. HELF can potentially be an effective means of communication for our intended users and can give them convenient access to the digital world. We developed a new haptic language for the framework (named the HELF Script) instead of using braille or another language already available for people who are blind, because digital braille is difficult to recreate, in terms of both development and usage, without extra hardware. Braille typing is also prone to errors when used on the flat touch screens of smartphones. The HELF Script, on the other hand, can be more accessible, as it conveniently uses the haptic hardware already present in modern smartphones without requiring any external hardware or device apart from the user's smartphone.

We evaluated the performance of the HELF framework through the HELF-based application by assessing its usability and ease of use with both visually impaired and sighted volunteers.

The remainder of the paper is organized as follows. Section 2 highlights the related work. In Sect. 3, we explain the working of HELF. Section 4 further elaborates on the working of HELF through the Android application and explains the setup for testing the application. The results are presented in Sect. 5. Section 6 discusses the threats to the validity of the presented results. The last section concludes the paper and gives directions for future extensions of the presented work.

Related work

Haptic devices are highly relevant to the visually impaired: most assistive technologies designed for them use touch as a sensory substitute for vision. The book [14] emphasizes the tactile skills of people who are visually impaired and advocates the use of haptic devices for perception.

To help deaf-blind individuals, a variety of tactual communication methods have been employed as substitutes for vision and/or hearing. These include the Tadoma method of speech reading [15], tactual reception of sign language [16], and tactual reception of fingerspelling [17].

Geldard [2] developed a method of communication through touch named ‘vibratese.’ It comprises tactile equivalents of the 45 basic elements of the English language. However, vibratese fell out of use, and little literature on it is available.

Fritz et al. [18] demonstrated a haptic visualization system. They rendered mathematical data in a form that can be understood by people who are visually impaired using a haptic environment for data visualization.

Sjöström et al. [19] gave an insight into efforts to use haptics to provide new computer interaction techniques for assistive technologies for people with visual disabilities, elaborating on the PHANToM and other devices.

Zajicek and Powell [20] describe a web chat text-to-speech service that reads out the contents of a webpage to enable people who are visually impaired to access web content.

Avizzano et al. [21, 22] modified the GRAB system (a European project) by adding a haptic device and audio commands, enabling the visually challenged to explore a 3D world through touch with the help of audio commands.

Pascale et al. [23] proposed the use of haptic devices with Second Life (a virtual world created by Linden Lab of California) to potentially help the visually impaired access the virtual world.

Flores et al. [10] developed a vibrotactile belt that provides haptic directional information to blind walkers.

An idea to enable reading braille on mobile devices with touch screens was presented in [24]. The authors describe three interaction methods using temporal tactile feedback to enable reading of six-dot braille characters on touchscreen devices without any additional display attached.

Pawluk et al. [25] conducted a survey of behavioral research which can directly impact the design of assistive technology using haptics for the visually impaired.

Chang et al. [26] developed a handheld vibrotactile device that, when used as a sleeve on the back of a mobile phone, converts hand pressure into vibrational intensity to enable real-time interpersonal communication over a tactile channel. The device is bidirectional; both users can send and receive signals simultaneously.

Murphy and Darrah [27] created twenty computer applications (apps) for the visually impaired to help them better learn mathematics and science. These apps incorporate computer haptics, high-contrast visuals, and auditory prompts. The analysis of testing six of these applications in a classroom environment shows that significant learning gains can be achieved with such apps.

Reed et al. [28] developed a wearable tactile device to present phoneme-based tactile codes at the forearm. A collection of distinct tactile symbols was formed to represent 39 phonemes of the English language. Their results show that ten participants could learn the haptic symbols for these English phonemes after one to four hours of training, with a phoneme recognition accuracy of 86%. The authors recently extended this work [29] to test the acquisition of about 500 words with 51 participants; the results show the successful transmission of English words using the TActile Phonemic Sleeve (TAPS) with a reasonable amount of effort spent in learning the phonemic codes.

The HaptiComm project makes use of haptics to support linguistic communication within the community of people with both visual and hearing impairment [30]. It is a hardware and software platform that aims to serve this community by providing a way to use and explore their natural form(s) of language.

HELF overview

Since people with visual impairment and people with both visual and hearing impairment rely on their non-visual senses to interact with the world, HELF uses this ability to help them interact with the digital world. The proposed and implemented framework works in a manner similar to Morse code, which uses dots and dashes to denote letters and symbols, and is conceptually similar to the vibratese language developed by Geldard [2]. Because little literature on vibratese is available, we developed our own script and use it to enable tactile communication with smart devices. The framework uses the HELF script, built from two lengths of vibrations and two types of gaps. The HELF libraries have been designed for the English language. The HELF framework can also be used with laptops, smartphones, and smart kiosks. The subsequent subsections give details of the HELF script and libraries.

HELF script

The HELF script encodes the 26 English letters (A–Z) and the numerals (0–9). The script can be extended to include special characters and punctuation. A HELF code is a dynamic vibration pattern consisting of a sequence of vibrations and gaps, named HELFabets (shown in Table 1). We use two variants of vibrations (short and long) and two types of gaps (letter gap and word gap) to form each HELF code of the script. For ease, we provide a nomenclature for the HELF characters (the variations of vibrations and gaps used) in Table 1. The codes have been designed so that their length is approximately inversely related to the frequency of the letter in the English language. The letter frequencies are taken from the work of Jones and Mewhort [31]. The combined frequency (Com. Freq.) is obtained by summing the upper case and lower case frequencies of each letter (listed in Table 2). The most frequently occurring letters have the shortest and simplest HELF codes (the fewest vibrations and the least complex HELFabet combinations). These codes can be memorized by people who are visually impaired and people with visual and hearing impairment. Once the codes are memorized, they are interpreted easily and can thus be used effortlessly for reading and writing on smartphones.

Table 1.

Nomenclature of HELFabets

HELF character Name
Short Vibration Do
Long Vibration Da
Character/Letter Gap Di
Word Gap De

Table 2.

HELF Codes

C UC Freq LC Freq Com. Freq HELF code
E 138,443 7,741,842 7,880,285 Do
T 325,462 5,507,692 5,833,154 Da
A 280,937 5,263,779 5,544,716 DoDo
O 105,700 4,729,266 4,834,966 DaDa
I 223,312 4,527,332 4,750,644 DoDa
N 205,409 4,535,545 4,740,954 DaDo
S 304,971 4,186,210 4,491,181 DoDoDo
R 146,448 4,137,949 4,284,397 DaDaDa
H 123,632 2,955,858 3,079,490 DoDoDa
L 106,984 2,553,152 2,660,136 DoDaDo
D 129,632 2,369,820 2,499,452 DoDaDa
C 229,363 1,960,412 2,189,775 DaDoDo
M 259,474 1,467,376 1,726,850 DaDoDa
U 57,488 1,613,323 1,670,811 DaDaDo
P 144,239 1,255,579 1,399,818 DoDoDoDo
F 100,751 1,296,925 1,397,676 DaDaDaDa
G 93,212 1,206,747 1,299,959 DoDoDaDa
Y 94,297 1,062,040 1,156,337 DaDaDoDo
W 107,195 1,015,656 1,122,851 DoDaDaDa
B 169,474 866,156 1,035,630 DaDoDoDo
V 31,053 653,323 684,376 DoDaDaDo
K 46,580 460,788 507,368 DaDoDoDa
J 78,706 65,856 144,562 DoDoDoDa
X 7578 123,577 131,155 DoDaDoDo
Z 5610 66,423 72,033 DoDoDoDoDo
Q 11,659 54,221 65,880 DaDaDaDaDa
0 DaDoDaDo
1 DoDaDoDa
2 DoDoDaDo
3 DaDoDaDa
4 DaDaDoDa
5 DaDaDaDo
6 DaDaDoDaDa
7 DoDoDaDoDo
8 DoDaDoDaDo
9 DaDoDaDoDa

C: character; UC Freq.: upper case frequency; LC Freq.: lower case frequency; Com. Freq.: combined frequency. The table lists the codes and frequencies of the HELF characters

Table 2 lists the HELF codes for the alphabets and digits.
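To make the structure of the script concrete, the following is a minimal sketch of Table 2 as a lookup table in Kotlin. The map covers only a subset of the codes, and the names helfCodes and encodeToHelf are ours for illustration; the paper does not publish its implementation.

    // Partial HELF code table (see Table 2); "Do" = short vibration, "Da" = long vibration.
    val helfCodes: Map<Char, String> = mapOf(
        'E' to "Do",     'T' to "Da",
        'A' to "DoDo",   'O' to "DaDa",
        'I' to "DoDa",   'N' to "DaDo",
        'S' to "DoDoDo", 'R' to "DaDaDa"
        // ... remaining letters and digits follow Table 2
    )

    // Encode a word as a list of HELF codes, one entry per character.
    fun encodeToHelf(word: String): List<String> =
        word.uppercase().mapNotNull { helfCodes[it] }

    // Example: encodeToHelf("TEA") == listOf("Da", "Do", "DoDo")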

HELF architecture

This subsection elaborates on the architecture of HELF. The HELF typing part explains how HELF is used to write digital text, and the HELF reading part explains how digital text is read using HELF. To use HELF with smartphones, we provide an input system for typing English text using the HELF script; for reading, text is converted into HELF-encoded vibration patterns.

HELF typing

To use HELF, a typing system has been designed that recognizes swiping gestures, takes the desired input, and converts it into digital text. The gestures used to designate the HELFabets are shown in Table 3. Users simply need to memorize the HELF script and use swiping gestures to provide input in the form of digital text (a sketch of such a gesture decoder is given after Table 3).

Table 3.

Gestures for HELF codes

Gesture Meaning
Left swipe Do
Right swipe Da
Single tap Di
Up swipe (De) Word completion
Double tap Message completion
Down swipe Backspace
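The following is a minimal sketch, under our own naming assumptions (Gesture, HelfTypingDecoder), of how the gestures in Table 3 could be accumulated into digital text; the paper does not publish its source code, so this is illustrative only.

    // Gestures from Table 3, decoded into text; codeToChar is the inverse of the Table 2 mapping.
    enum class Gesture { LEFT_SWIPE, RIGHT_SWIPE, SINGLE_TAP, UP_SWIPE, DOUBLE_TAP, DOWN_SWIPE }

    class HelfTypingDecoder(private val codeToChar: Map<String, Char>) {
        private val pending = StringBuilder()   // HELFabets of the character being typed
        private val text = StringBuilder()      // digital text produced so far

        fun onGesture(g: Gesture): String {
            when (g) {
                Gesture.LEFT_SWIPE  -> pending.append("Do")    // short vibration
                Gesture.RIGHT_SWIPE -> pending.append("Da")    // long vibration
                Gesture.SINGLE_TAP  -> commitCharacter()       // letter gap (Di)
                Gesture.UP_SWIPE    -> { commitCharacter(); text.append(' ') }  // word gap (De)
                Gesture.DOWN_SWIPE  -> if (text.isNotEmpty()) text.deleteCharAt(text.length - 1)  // backspace
                Gesture.DOUBLE_TAP  -> commitCharacter()       // message completion; the app would also speak the text
            }
            return text.toString()
        }

        private fun commitCharacter() {
            codeToChar[pending.toString()]?.let { text.append(it) }
            pending.clear()
        }
    }

The decoder would be constructed from the inverse of the script table, e.g. HelfTypingDecoder(helfCodes.entries.associate { (ch, code) -> code to ch }).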

HELF reading

We make use of vibrotactile feedback to understand digital text. HELFabets can be recognized by the duration of the vibrations felt and the duration of the gaps between vibrations. A HELF code is recognized as a combination of HELFabets (Table 2). For instance, the HELF code for the letter “I” is “DoDa,” which means a short vibration followed by a long vibration. Users can therefore recognize the text once they have memorized the vibration patterns for the HELF codes. To make reading with HELF user-friendly, the duration of the HELFabets is kept variable through four options, where option 4 is the slowest (and easiest) and option 1 is the fastest. Users can start with option 4 and steadily move toward option 1 as they gain proficiency. The values for these options were decided by trial and error to find durations that are recognizable and distinguishable from one another. Subjects were asked to try all the speed options to check which allowed them to distinguish between Do and Da most comfortably; most subjects chose option 2. Table 4 gives the duration of the vibrations and the letter gap in milliseconds for the different speed options available in the HELF Android application for reading; a sketch of how such a pattern could be generated on Android follows the table.

Table 4.

Variable speed options for reading (G: gap between Do and Da)

Option 4 Option 3 Option 2 Option 1
Do-320 ms Do-240 ms Do-120 ms Do-80 ms
Da-520 ms Da-400 ms Da-260 ms Da-200 ms
G-400 ms G-350 ms G-300 ms G-250 ms
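On Android, a HELF code can be rendered with the platform's vibration APIs. Below is a minimal sketch assuming the standard Vibrator/VibrationEffect classes; the helper name playHelfCode, the default Option 2 timings, and the placement of the gap before each pulse are our simplifications, not the paper's implementation.

    import android.content.Context
    import android.os.Build
    import android.os.VibrationEffect
    import android.os.Vibrator

    // Play one HELF code (e.g. "DoDa" for 'I') using the Option 2 timings from Table 4.
    fun playHelfCode(context: Context, code: String,
                     doMs: Long = 120, daMs: Long = 260, gapMs: Long = 300) {
        val helfabets = code.chunked(2)            // "DoDaDo" -> ["Do", "Da", "Do"]

        // Waveform timings alternate off/on durations, starting with an off interval.
        val timings = mutableListOf<Long>()
        for (h in helfabets) {
            timings.add(gapMs)                         // gap (off) before each pulse
            timings.add(if (h == "Da") daMs else doMs) // short or long pulse (on)
        }

        val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            vibrator.vibrate(VibrationEffect.createWaveform(timings.toLongArray(), -1))
        } else {
            @Suppress("DEPRECATION")
            vibrator.vibrate(timings.toLongArray(), -1)
        }
    }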

Setup for user testing

We developed an Android application to present HELF as a proof of concept. We tested the application on 13 users (five visually impaired and eight sighted individuals), and collected and analyzed the results to estimate the performance of HELF.

Android application

The working of the Android application is shown in Fig. 1.

Fig. 1 Using the HELF Android application

As shown, the screen of the HELF testing app (Fig. 1A) is divided into three major sections:

  • HELF Reading Section: This part of the HELF application screen allows the trainer to enter text manually and play the corresponding vibration patterns. The text is converted into HELF vibration patterns, which the user perceives to understand the text.

  • HELF Typing Section: The middle part of the HELF application screen allows the user to type text using HELF codes by swiping on the screen.

  • HELFabet Speed Control: The bottom part of the screen allows the trainer to select the speed option at which the HELF output is displayed for each HELFabet from available speed options (shown in Table 4).

To write the letter ‘E,’ the user swipes left and then taps (Fig. 1B). The tap indicates the end of the HELF code for the character being typed. To type the letter ‘T,’ the user swipes right and then taps the screen (Fig. 1C). To type a space between words, the user swipes up (Fig. 1D). A swipe down is backspace (Fig. 1E). A double tap reads out the typed text using text-to-speech (intended for the visually impaired) and makes the screen ready for the next input (Fig. 1F).
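As an illustration of the gesture sequence for a complete word, the hypothetical decoder sketched after Table 3 would process the word “TEA” as follows (codes from Table 2, gestures from Table 3); the names are the illustrative ones introduced earlier, not part of the actual application.

    // T = Da -> right swipe, tap; E = Do -> left swipe, tap; A = DoDo -> two left swipes,
    // then an up swipe ends the word.
    fun main() {
        val decoder = HelfTypingDecoder(helfCodes.entries.associate { (ch, code) -> code to ch })
        listOf(
            Gesture.RIGHT_SWIPE, Gesture.SINGLE_TAP,   // 'T'
            Gesture.LEFT_SWIPE,  Gesture.SINGLE_TAP,   // 'E'
            Gesture.LEFT_SWIPE,  Gesture.LEFT_SWIPE,   // 'A' ...
            Gesture.UP_SWIPE                           // ...committed by the word gap
        ).forEach { decoder.onGesture(it) }
        // The typed text is now "TEA " (with a trailing space from the word gap).
    }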

Testing of the application

All authors tested the application thoroughly before the actual testing by the volunteering participants. Participants included eight sighted people and five visually impaired users. The sighted individuals are from a technical institute and voluntarily participated in testing the HELF application. The five visually impaired users who volunteered are members of the Blind Relief Association in New Delhi, India. Participants were required to be 15 years of age or older and to understand the basics of the English language.

Before the testing phase, participants were required to undergo basic training, in which they learned the HELFabets and HELF codes. The pre-training was conducted in the presence of a researcher, along with a team of three researchers to help with training and data collection. The visually impaired users were pre-trained and thereafter underwent the testing phase in a room at the Blind Relief Association, New Delhi. The sighted users are graduate students of a technical institute and hence underwent the training and testing phases at their institute. During the pre-training phase, the HELF codes for each character were recited to the visually impaired subjects, who then typed the HELFabets on their phones using the HELF mobile application; following this, they were asked to practice and memorize the HELF codes. The team made sure that the subjects were sufficiently trained for the actual testing by giving them a few easy word phrases to type. This process was conducted over two days so that the subjects understood the overall working of the application. After two days of pre-training, four of the five visually impaired users themselves volunteered to move to the testing phase; the fifth user was then convinced that they, too, were ready for the testing phase.

In Level 1 training, participants were required to learn the codes for E to L (shown in Table 2), as these are shorter and easier to learn, and to understand and memorize the gestures shown in Table 3. Level 2 training required learning all the codes shown in Table 2. Level 3 required learning the codes for the digits 0–9.

For testing the application after the training phase, three testing modules were designed to evaluate user performance. The writing module was administered first: subjects were asked to type the phrases from each module. The results were recorded as the time taken by each subject to type each phrase correctly, from which the CPM was calculated later. Subjects could tell whether a typed character was correct by listening to the character announced by the text-to-speech module in the test HELF Android application. The reading module was administered after a short break following the writing test. Only fully correct words were counted toward the CPM score, so if even one letter in the desired word was wrong, we waited for the user to correct it and included the correction time in the CPM. Appendix I shows the three modules used for assessing the usage of the proposed framework with the application. The testing data for the reading and writing modules were collected for content analysis. The reading part of the application could best be tested by people who have both visual and hearing impairment; this, however, requires training such users, which in turn requires skilled professionals. Also, owing to the COVID-19 pandemic, we were unable to test it on people with both visual and hearing impairment.
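The paper does not give an explicit CPM formula; a plausible reading of the procedure above is the number of characters in the correctly typed phrase divided by the elapsed time in minutes, with correction time included, as in this hedged sketch:

    // Sketch of the CPM metric as described: only fully correct phrases count,
    // and time spent correcting errors is included in the elapsed time.
    fun charactersPerMinute(correctChars: Int, elapsedSeconds: Double): Double =
        correctChars / (elapsedSeconds / 60.0)

    // e.g. a hypothetical 20-character module typed correctly in 48 s gives 20 / 0.8 = 25 CPM.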

Results and analysis

The participants’ performance during the testing of the HELF application was thoroughly recorded and analyzed. The performance analysis was done for both writing and reading the HELF codes. The test sheets used for evaluating the usage of HELF through the Android application are shown in Appendix 1.

Tables 5 and 6 show the recorded results of using the HELF application by the volunteers (Table 5 for sighted volunteers and Table 6 for visually impaired volunteers). Figure 2 presents the analysis of the recorded results graphically. The bar graphs present the performance of the participants: the horizontal axis represents the modules (the number of characters to be read and written using HELF), and the vertical axis represents the percentage reading accuracy and the average characters per minute (CPM) for writing for each module in the testing phase. Figures 2 and 3 show the graphs for the eight sighted volunteers, and Figs. 4 and 5 show the graphs for the five visually impaired users; a brief arithmetic check on the averages in Table 6 is given after the table.

Table 5.

Result sheet of sighted volunteers (CPM shows the characters typed in a minute for writing, and the score of each individual depicts the reading score)

Module Size 20 Characters 58 Characters 116 Characters
Test Cases Module 1 Module 2 Module 3
Person 1 CPM 11 12 12
Person 1 Score 20 54 111
Person 2 CPM 30 37 22
Person 2 Score 15 43 86
Person 3 CPM 12 18 10
Person 3 Score 19 46 106
Person 4 CPM 6 19 17
Person 4 Score 14 42 82
Person 5 CPM 26 35 30
Person 5 Score 19 58 113
Person 6 CPM 16 14 16
Person 6 Score 18 49 101
Person 7 CPM 24 26 28
Person 7 Score 20 56 99
Person 8 CPM 21 25 22
Person 8 Score 17 55 93

Table 6.

Result sheet of visually impaired volunteers (CPM shows the characters typed per minute for writing, and the score of each individual depicts the reading score)

Module size 20 characters 58 characters 116 characters
Test Cases Module 1 Module 2 Module 3
Person 1 CPM 24 34 32
Person 1 Score 20 55 112
Person 2 CPM 28 35 24
Person 2 Score 18 52 111
Person 3 CPM 27 31 30
Person 3 Score 19 54 109
Person 4 CPM 28 34 34
Person 4 Score 20 50 108
Person 5 CPM 30 38 39
Person 5 Score 18 56 113
Average Score 19 53.4 110.6
Average CPM 27.4 34.4 31.8
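As a quick arithmetic check on Table 6, the per-module reading accuracy of the visually impaired volunteers follows directly from the average scores and the module sizes: Module 1: 19/20 = 95%; Module 2: 53.4/58 ≈ 92.1%; Module 3: 110.6/116 ≈ 95.3%.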

Fig. 2 Results depicting average reading accuracy of sighted volunteers

Fig. 3 Results depicting average writing accuracy of sighted volunteers

Fig. 4 Results depicting average reading accuracy of visually impaired volunteers

Fig. 5 Results depicting writing accuracy of visually impaired volunteers

As is evident from Fig. 2, the average reading accuracy of sighted users decreases slightly from module to module. A plausible reason is that Module 1 consists of characters with the easiest and shortest HELF codes, whereas Module 2 consists of characters with more complex HELF codes. Similarly, Module 3 consists of numbers along with the least-used letters of the English language, which have the most complex HELF codes of the three modules. Hence, there is a slight, roughly linear drop in the reading accuracy graph. This is due to the increasing complexity of the HELF codes with each module, which takes time to learn and memorize, especially for sighted people who are not accustomed to using such a system.

The writing performance (CPM) of sighted users increases from Module 1 to Module 2 and then falls slightly (as depicted in Fig. 3). In Module 1, users gradually become accustomed to the HELF gesture typing system, and the practice from Module 1 helps them type faster in Module 2. In Module 3, the HELF codes are longer and require more swipes; hence, the graph in Fig. 3 shows a slight drop in CPM for Module 3.

As can be seen from Fig. 4, the average reading accuracy differs slightly for volunteers with visual impairment compared to sighted people. People with visual impairment usually develop sharper tactile senses with prolonged reliance on them, which helps them understand HELF codes and differentiate between short and long haptic pulses more easily; hence, their reading accuracy is better than that of sighted people. A closer look at the graph shows a slight drop in accuracy for Module 2: Module 1 has shorter HELF codes and smaller words, but in Module 2 the word length and HELF code length increase considerably. As depicted in Fig. 5, the writing performance (CPM) of visually impaired users initially increases from Module 1 to Module 2, as the users gradually get used to the HELF gesture typing system and the practice from Module 1 helps them type faster; in Module 3, the HELF codes are longer and require more swipes, and hence the graph shows a slight drop in CPM for Module 3.

Overall, the reading accuracy is approximately 91%, and the average CPM is 25.7. These results are comparable to, or even better than, braille-based frameworks found in the literature [32, 33]. If we consider the average CPM shown by only the visually impaired users (which is considerably higher than that of the sighted individuals), HELF shows a higher typing speed (CPM ~ 34) than Flight [32], a braille-based system (CPM ~ 33), and a much higher speed than other similar work (CPM ~ 12) [33]. The empirical analysis of HELF thus shows encouraging results: the reading accuracy is high, and the writing CPM is also satisfactory.

Threats to validity

The application was tested in a controlled environment with the participants. This section presents the threats to validity of the presented results.

The threats to internal validity:

The following factors related to the experimental condition may have affected the results of the research:

  1. Participants’ selection: The participants of the study volunteered to test the application and were limited in number. The application could be better tested with a random selection of participants from various organizations and with a larger number of participants.

  2. Testing and training: The training phase was constrained by a time limit, and the testing was performed using three modules. Extended training and further tests would strengthen the validity of the presented results. Testing the application on participants with both hearing and visual impairment could further demonstrate its effectiveness and usability.

  3. Different testing hardware: Participants were provided with different smartphones from different brands. Not all phones have the same haptic engine, which can lead to slight lag in the vibrations or noticeable differences in vibration intensity.

The threats to external validity:

The following factors may affect the generalizability of the results:

  1. Participants’ age group: The participants of the study were between 20 and 35 years of age and were already smartphone users, so it was easy for them to understand the application during the training phase. People in older age groups may not be as comfortable with smartphones.

  2. Participants’ experience with computer-based applications: The participants were already comfortable with assistive technology-based smart applications. Not all visually impaired users may be equally well-versed with them.

Conclusion and future work

In this paper, we presented HELF as an application for people with visual impairment and people with both visual and hearing impairment, to enable them to connect with the digital world. This framework could impact the lives of almost 30 million individuals around the world who are currently deprived of the expanse of resources that the World Wide Web, smartphones, and other such devices can provide. The use of HELF can empower the visually impaired to stay socially connected to their loved ones without impairing their capacity to listen and be vigilant in public. The application can also be a great support for people with both visual and hearing impairment (referred to as deaf-blind people).

The preliminary analysis of HELF as an Android application gave encouraging results. The users (both sighted and visually impaired volunteers) found the presented framework usable as a mode of communication.

Future work includes the following:

  • Conducting a longitudinal study of the proposed work, involving more visually impaired users as well as users who have both visual and hearing impairment, to consolidate the evidence for the capability and effectiveness of HELF for such users.

  • Developing hardware modules for deploying HELF on devices that do not have haptic engines and/or touchscreen interfaces.

  • Developing a library or application programming interface (API) module for converting text to HELF codes, to enable developers to use HELF with any mobile application.

  • Developing a third-party keyboard with HELF input gestures for Android and iOS users.

  • Developing a smartwatch application for Android Wear OS and Apple Watch OS to allow typing using gestures from a smartwatch.

Appendix

HELF Test sheet

[Image: HELF test sheet showing the three testing modules]

Declarations

Conflict of interest

All authors state that there is no conflict of interest to be declared.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Simerneet Singh, Email: mr.simerneet@gmail.com.

Nishtha Jatana, Email: nishtha.jatana@gmail.com.

Vasu Goel, Email: vasu18322@iiitd.ac.in.

References

  • 1.Geldard FA. Adventures in tactile literacy. Am. Psychol. 1957;12:115–124. doi: 10.1037/h0040416. [DOI] [Google Scholar]
  • 2.Geldard FA. Some neglected possibilities of communication. Science. 1960;131:1583–1588. doi: 10.1126/science.131.3413.1583. [DOI] [PubMed] [Google Scholar]
  • 3.Nicolau, H., Jorge, J., Guerreiro, T.: Blobby: how to guide a blind person. In: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems (2009)
  • 4.Rümelin, S., Rukzio, E., Hardy, R.N.: A novel tactile information display for pedestrian navigation. In: Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology (2011)
  • 5.Ross, D. A., Blasch, B. B.: Wearable interfaces for orientation and way finding. In: Proceedings of the fourth international ACM conference on Assistive technologies - Assets ’00 (2000)
  • 6.Lederman S, Klatzky R. Haptic perception: a tutorial. Atten. Percep. Psyschophy. 2009;71(7):1439–1459. doi: 10.3758/APP.71.7.1439. [DOI] [PubMed] [Google Scholar]
  • 7.Sreelakshmi, M., Subash, T.D.: Haptic technology: A comprehensive review on its applications and future prospects. In: Materials Today: Proceedings 4.2 (2017)
  • 8.Jones LA, Sarter NB. Tactile displays: guidance for their design and application. Hum. Factors. 2008;50(1):90–111. doi: 10.1518/001872008X250638. [DOI] [PubMed] [Google Scholar]
  • 9.Bach-Y-Rita P, Collins C, Saunders F, White B, Scadden L. Vision substitution by tactile image projection. Nature. 1969;221:963–964. doi: 10.1038/221963a0. [DOI] [PubMed] [Google Scholar]
  • 10.Flores G, Kurniawan S, Manduchi R, Martinson E, Morales LM, Sisbot EA. Vibrotactile guidance for wayfinding of blind walkers. IEEE Trans. Haptics. 2015;8(3):306–317. doi: 10.1109/TOH.2015.2409980. [DOI] [PubMed] [Google Scholar]
  • 11.Withagen A, Kappers AM, Vervloed MP, Knoors H, Verhoeven L. Short term memory and working memory in blind versus sighted children. Res. Dev. Disabil. 2013;34(7):2161–2172. doi: 10.1016/j.ridd.2013.03.028. [DOI] [PubMed] [Google Scholar]
  • 12.Postma A, Zuidhoek S, Noordzij ML, Kappers AM. Differences between early-blind, late-blind, and blindfolded-sighted people in haptic spatial-configuration learning and resulting memory traces. Perception. 2007;36(8):1253–1265. doi: 10.1068/p5441. [DOI] [PubMed] [Google Scholar]
  • 13.Wan CY, Wood AG, Reutens DC, Wilson SJ. Congenital blindness leads to enhanced vibrotactile perception. Neuropsychologia. 2010;48(2):631–635. doi: 10.1016/j.neuropsychologia.2009.10.001. [DOI] [PubMed] [Google Scholar]
  • 14.Heller, M. A., Gentaz, E.: Psychology of Touch and Blindness, 1st ed., Routledge (2013)
  • 15.Reed C, Rabinowitz W, Durlach N, Braida L, Conway-Fithian S, Schultz M. Research on the Tadoma method of speech communication. J. Acoust. Soc. Am. 1985;77(1):247–257. doi: 10.1121/1.392266. [DOI] [PubMed] [Google Scholar]
  • 16.Reed C, Delhorne L, Durlach NI, Fischer S. A study of the tactual reception of sign language. J. Speech Hear. Res. 1995;38(2):477–489. doi: 10.1044/jshr.3802.477. [DOI] [PubMed] [Google Scholar]
  • 17.Reed C, Delhorne L, Durlach N, Fischer S. A study of the tactual and visual reception of fingerspelling. J. Speech Hear. Res. 1990;33(4):786–797. doi: 10.1044/jshr.3304.786. [DOI] [PubMed] [Google Scholar]
  • 18.Fritz JP, Barner KE. Design of a haptic data visualization system for people with visual impairments. IEEE Trans. Rehabil. Eng. 1999;7(3):372–384. doi: 10.1109/86.788473. [DOI] [PubMed] [Google Scholar]
  • 19.Sjöström C, Rassmus-Gröhn K. The sense of touch provides new interaction techniques for disabled people. Technol. Disabil. 1999;10:1. doi: 10.3233/TAD-1999-10105. [DOI] [Google Scholar]
  • 20.Zajicek, M., Powell, C.: Enabling visually impaired people to use the internet. In: IEE Colloquium ‘computers helping people in the service of mankind (1997)
  • 21.Avizzano, C., Marcheschi, S., Angerilli, M., Fontana, M., Bergamasco, M., Gutierrez, T., Mannegeis, M.: A multi-finger haptic interface for visually impaired people. In: 12th IEEE International workshop on Robot and Human Interactive Communication (ROMAN 2003) (2003)
  • 22.Iglesias, R., Casado, S., Gutierrez, T., Barbero, J., Avizzano, C., Marcheschi, S., Bergamasco, M., Labein, F., Derio, S.: Computer graphics access for blind people through a haptic and audio virtual environment. Haptic. In: Haptic, Audio and Visual Environments and Their Applications (HAVE) (2004)
  • 23.Mulatto, S., Prattichizzo, D.: Bringing haptics to second life for visually impaired people. In: Human Haptic Sensing and Touch Enabled Computer Applications (2008)
  • 24.Rantala J, Raisamo R, Lylykangas J, Surakka V, Raisamo J, Salminen K, Pakkanen T, Hippula A. Methods for presenting braille characters on a mobile device with a touchscreen and tactile feedback. IEEE Trans. Haptics. 2009;2(1):28–39. doi: 10.1109/TOH.2009.3. [DOI] [PubMed] [Google Scholar]
  • 25.Pawluk DTV, Adams RJ, Kitada R. Designing haptic assistive technology for individuals who are blind or visually impaired. IEEE Trans. Haptics. 2015;8(3):258–278. doi: 10.1109/TOH.2015.2471300. [DOI] [PubMed] [Google Scholar]
  • 26.Chang, A., O'Modhrain, S., Jacob, R., Gunther, E. and Ishii, H.: ComTouch: design of a vibrotactile communication device. In: 4th conference on Designing interactive systems: processes, practices, methods, and techniques (2002)
  • 27.Murphy K, Darrah M. Haptics-based apps for middle school students with visual impairments. IEEE Trans. Haptics. 2015;8(3):318–326. doi: 10.1109/TOH.2015.2401832. [DOI] [PubMed] [Google Scholar]
  • 28.Reed CM, et al. A phonemic-based tactile display for speech communication. IEEE Trans. Haptics. 2018;12(1):2–17. doi: 10.1109/TOH.2018.2861010. [DOI] [PubMed] [Google Scholar]
  • 29.Tan HZ, et al. Acquisition of 500 English words through a TActile Phonemic Sleeve (TAPS). IEEE Trans. Haptics (2020) [DOI] [PubMed]
  • 30.Duvernoy, B., Topp, S., Hayward, V.: “HaptiComm”, a Haptic Communicator Device for Deafblind Communication. In: International AsiaHaptics conference, Singapore, (2018)
  • 31.Jones MN, Mewhort DJ. Case-sensitive letter and bigram frequency counts from large-scale English corpora. Behav. Res. Methods Instrum. Comput. 2004;36(3):388–396. doi: 10.3758/BF03195586. [DOI] [PubMed] [Google Scholar]
  • 32.Chakraborty, T., Khan, T. A., Al Islam, A.: Flight: a low-cost reading and writing system for economically less-privileged visually-impaired people exploiting ink-based braille system. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 531–540 (2017)
  • 33.Alnfiai M, Sampalli S. BrailleEnter: a touch screen braille text entry method for the blind. Proc. Comput. Sci. 2017;10:257–264. doi: 10.1016/j.procs.2017.05.349. [DOI] [Google Scholar]
