Disability and Rehabilitation: Assistive Technology

ISSN: 1748-3107 (Print) 1748-3115 (Online) Journal homepage: http://www.tandfonline.com/loi/iidt20

Haptic-assistive technologies for audition and vision sensory disabilities

Francesca Sorgini, Renato Caliò, Maria Chiara Carrozza & Calogero Maria Oddo

To cite this article: Francesca Sorgini, Renato Caliò, Maria Chiara Carrozza & Calogero Maria Oddo (2017): Haptic-assistive technologies for audition and vision sensory disabilities, Disability and Rehabilitation: Assistive Technology, DOI: 10.1080/17483107.2017.1385100

To link to this article: http://dx.doi.org/10.1080/17483107.2017.1385100

Published online: 10 Oct 2017.



REVIEW

Haptic-assistive technologies for audition and vision sensory disabilities

Francesca Sorgini, Renato Caliò, Maria Chiara Carrozza and Calogero Maria Oddo

The BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera, Pisa, Italy

ABSTRACT

Purpose: The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf–blind individuals.
Method: The literature search was performed in the Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The database search for scientific publications was accompanied by a web search for commercial devices. Results were classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses were also carried out on the websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory-disabled persons.
Results: The reviewed literature provides evidence that sensory substitution aids can partly mitigate the deficits in language learning, communication and navigation of deaf, blind and deaf–blind individuals, and that the tactile sense can be a means of communication for providing some kinds of information to sensory-disabled individuals.
Conclusions: A lack of acceptance emerged from the discussion of the capabilities and limitations of haptic assistive technologies. Future research should move towards miniaturized, custom-designed and low-cost haptic interfaces, and towards integration with personal devices such as smartphones, for a wider diffusion of sensory aids among disabled people.

ARTICLE HISTORY

Received 15 March 2017; Revised 14 September 2017; Accepted 24 September 2017

KEYWORDS

Sensory disability; sensory substitution; haptic feedback; vibrotactile stimulation; deaf; blind; tactile display

IMPLICATIONS FOR REHABILITATION

- Systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities.
- Sensory substitution systems for visual and hearing disabilities have a central role in the transmission of information to patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities.
- Visual and auditory inputs are converted into haptic feedback via different actuation technologies. The information is presented in the form of static or dynamic stimulation of the skin.
- Their effectiveness and ease of use make haptic sensory substitution systems suitable for patients with different levels of disability. They constitute a cheaper and less invasive alternative to implantable partial sensory restitution systems.
- Future research is oriented towards the optimization of stimulation parameters together with the development of miniaturized, custom-designed and low-cost aids operating in synergy within networks, aiming to increase patients' acceptance of these technologies.

Introduction

Individuals with disabilities of the physical senses may have auditory, visual or tactile impairments, which can also occur in combination. Those with tactile impairments are often upper- or lower-limb amputees as a consequence of vascular diseases, cancer or accidents [1,2]. More rarely, lack of touch sensitivity is not related to limb loss but is caused by rare pathologies such as neuro-degenerative disorders [3,4] or by the ageing process [5]. Disabilities related to the tactile sense are not addressed in this review paper. Instead, we consider the cases in which tactile afference is functional and can be used in combination with assistive technologies in order to mitigate the effects associated with deficits of the other physical senses. Hence, in the following, we address the cases of a lack or a significant deficit of audition, vision or both of these senses. Such sensory disabilities can be present from birth or appear later due

to pathologies or events that may occur during life. Causes include rare pathologies, malformations, accidents and the natural ageing process. According to the World Health Organization (WHO), sensory disabilities affect 5.3% of the world population in the case of audition impairments and 9.3% in the case of vision impairments [6–18]. The most frequent cases are those involving children and the elderly [7,13,19–21]. There are many degrees of sensory impairment, and most persons who experience these disabilities can make adaptations that partially reduce the impact of their disability on everyday life, including the workplace. Figure 1 shows the distribution of audition and vision disabilities across world regions according to WHO statistics, and also reports the degree of impairment. To the best of our knowledge, data on the world distribution of deaf-blindness cannot be precisely estimated because of the difficulty in collecting reliable information on combined sensory disabilities.


© 2017 Informa UK Limited, trading as Taylor & Francis Group


Figure 1. Distribution of audition (top) and vision (bottom) disabilities in world sub-regions, according to the World Health Organization (WHO).

Sensory substitution systems can in part bridge the communication and interfacing gap caused by sensory disabilities. A sensory substitution system converts information related to one sense into a language understandable by another sense. The tactile sense can be a means of communication for auditory or visual information, or both, via sensory substitution aids. As a matter of fact, the tactile system is a communication channel since it possesses a

relatively high density of mechano-tactile receptors that can act as parallel access points to transduce, convey and process information [22]. The density of tactile receptors depends on the body area. With particular reference to the human hand, about 17,000 receptors in total are estimated to reside there, with maximal densities in the fingertips and an average of about 140 units/cm² for Meissner corpuscles and about 70 units/cm² for Merkel receptors [23]. In this work, we review static and dynamic tactile displays used for communication, for relaying alerting messages, for navigation, and for the perception of environmental information such as sounds and the presence of objects in the surroundings.
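As a rough numerical illustration of how these densities translate into parallel "access points", the following sketch (using only the average figures cited above; the 1 cm² probe area is an illustrative assumption) estimates the type-I receptors reached by a tactile stimulator of a given contact area:

```python
# Rough estimate of how many fast- and slow-adapting type-I receptors a
# tactile stimulator of a given contact area reaches on the fingertip,
# using the average densities cited above (illustrative only).
MEISSNER_PER_CM2 = 140  # rapidly adapting type I (RA-I) units
MERKEL_PER_CM2 = 70     # slowly adapting type I (SA-I) units

def receptors_under_probe(contact_area_cm2: float) -> dict:
    """Return the expected number of Meissner and Merkel units under the probe."""
    return {
        "meissner": MEISSNER_PER_CM2 * contact_area_cm2,
        "merkel": MERKEL_PER_CM2 * contact_area_cm2,
    }

# A 1 cm^2 vibrotactile tactor on the fingertip reaches roughly 210 type-I
# units, i.e., ~210 parallel "access points" for conveying information.
print(receptors_under_probe(1.0))
```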


Figure 2. Taxonomy of reviewed haptic sensory substitution systems for audition and vision sensory disabilities.

Substitutive stimulation can be achieved by means of static indentations, vibrations or sliding on the surface of the skin. A schematic representation of existing sensory substitution aids is systematized in Figure 2, in which light blue rectangles represent the haptic technologies discussed in the present review. We classify sensory substitution systems based on the disability they address, and then discern by targeted functionality and by the actuation technology used. This taxonomy is reflected in the structure of the remainder of the present review paper.

Tactile communication methods

For visually and hearing impaired subjects, haptic displays target the substitution of missing senses with somatosensory stimuli, taking advantage of the tactile sense [24]. Haptic displays can be "static" or "dynamic". In the former class, the actuation is achieved with mechanical pins statically positioned to form Braille characters, while in the latter the skin is stimulated by a dynamic mechanical action. Complex information can be conveyed by "tactile icons", namely "tactons". Tactons are abstract messages used to communicate information in a non-visual manner via the tactile sense. They are short tactile messages delivered to the skin via haptic actuators, which encode information through the variation of the parameters of cutaneous perception, such as the frequency, amplitude, waveform and duration of the tactile pulse [25,26]. Tactons are structured to be easily memorized and intuitive, and to support abstraction for expert users. The selection of the body area to stimulate is important, since the discrimination capabilities of human skin are a consequence of the different mechanoreceptor densities. All the technologies which convey information to the user through the tactile sense are classified as "haptic interfaces". "Haptics" is defined as the science of applying tactile sensations to the hands or other body parts of a user, with the main purpose of transmitting useful information.
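As an illustration of the tacton encoding just described, the sketch below synthesizes a two-element tacton by varying frequency, amplitude, duration and waveform; the specific parameter values are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

FS = 8000  # sample rate (Hz) of the drive signal sent to a haptic actuator

def tacton(freq_hz: float, amplitude: float, duration_s: float,
           waveform: str = "sine") -> np.ndarray:
    """Synthesize one tacton primitive from the cutaneous parameters
    discussed above: frequency, amplitude, duration and waveform."""
    t = np.arange(int(FS * duration_s)) / FS
    carrier = np.sin(2 * np.pi * freq_hz * t)
    if waveform == "square":
        carrier = np.sign(carrier)
    return amplitude * carrier

# A two-element tacton: a short, strong 250 Hz burst, a pause, then a longer,
# weaker 80 Hz burst. 250 Hz sits near the peak of Pacinian sensitivity; the
# pause keeps the two elements discriminable as separate message units.
message = np.concatenate([
    tacton(250, 1.0, 0.10),
    np.zeros(int(FS * 0.05)),
    tacton(80, 0.4, 0.30),
])
```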

Haptic technologies are mainly adopted for virtual reality, gaming, surgical robotics and industrial applications. In this review, the main purpose of haptic technologies is to overcome the isolation of sensory-disabled persons. Haptic interfaces can assist impaired persons in navigation, but can also help in remote communication (instant messaging) and allow, to some extent, speech comprehension and writing/reading. To better understand the working principle of these devices, in the following we briefly introduce the communication languages most used by persons with audition or vision impairments. Persons with a single sensory disability, such as deaf or blind individuals, can communicate and interface with their surroundings by means of codified languages (sign language for the deaf and the Braille alphabet for blind individuals). Communication can be harder for deaf–blind persons, because of the lack of both the visual and the auditory sense. Codified communication methods involving the tactile sense are then required. In both the deaf/hard of hearing and the deaf–blind communities, the availability of a sign language as the only means of communication can lead to linguistic isolation from the non-impaired. In fact, most non-disabled people do not understand sign language, and interpreters are expensive and not always available [27]. Sign languages for deaf and hard of hearing persons are based on visual sign patterns instead of sounds to transmit information. Characters and concepts are communicated using hand shapes, the orientation of the hands, arms and body, and movements combined with facial expressions [28]. The Braille alphabet is the primary method used to access written information by the majority of visually impaired people, yet not all blind individuals are able to read Braille [29]. In this code, characters are expressed using a cell of six dots arranged in three parallel rows of two. With 64 combinations, these cells can be used to express letters, numbers, punctuation marks and complex concepts [30].
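The six-dot cell maps naturally onto a small binary encoding; Unicode's Braille Patterns block (base U+2800, with dot n carried on bit n−1) encodes exactly this scheme, so the 64-combination code can be illustrated directly:

```python
def braille_char(dots: set[int]) -> str:
    """Map a set of raised dots (numbered 1-6, two columns of three) to the
    corresponding character in Unicode's Braille Patterns block (U+2800+)."""
    code = 0x2800
    for d in dots:
        code |= 1 << (d - 1)  # dot n corresponds to bit n-1
    return chr(code)

# With six binary dots the cell yields 2**6 = 64 combinations, as noted above.
print(braille_char({1}))      # "a" (dot 1)
print(braille_char({1, 2}))   # "b" (dots 1 and 2)
print(braille_char({1, 4}))   # "c" (dots 1 and 4)
```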


Deaf or hard of hearing people who are also affected by visual impairments use sign languages within their communities. However, they are slower in signing or fingerspelling in comparison with people with a hearing disability only. Furthermore, some deaf–blind people with restricted peripheral vision may prefer the interlocutor to sign in a limited space at chest level, with the adaptation of some signs located at waist level. Nevertheless, the most common communication method among deaf–blind persons remains tactile sign language. In recent decades, several tactile sign languages have been codified across different countries. Using tactile sign language, deaf–blind individuals place one hand, or both, on the speaker's hands. This allows deaf–blind people to feel the shapes and movements that convey characters and concepts. However, deaf–blind individuals with residual vision sometimes prefer to place their hands on the speaker's wrists, leaving the interlocutor's hands free to move. In this way, they can use their residual vision alongside the tactile sense to grasp what the other person is saying. Fingerspelling, which is easier to learn than sign language, is instead preferred by visually impaired individuals who became deaf later in life, or by deaf persons who rely only on speech reading, i.e., lip-reading. This method consists in placing the interlocutor's hand over the fingerspelling one, on the signer's palm, or in closing the interlocutor's hand around the signer's hand. The "Print on Palm" method consists instead in hand-writing letters on the interlocutor's palm [31]. "Tadoma" is a form of tactile lip-reading in which the interlocutor's thumb is placed on the speaker's lips while the fingers lie along his or her jawline; it can also be performed via different hand-positioning methods [32]. "Tadoma" allows the deaf–blind person to catch the speaker's lip movements together with throat vibrations, and it is also used by persons with severe hearing impairments to supplement their remaining hearing. It has been proved that "Tadoma" facilitates speechreading and language learning. Other tactile languages, called manual alphabets, allow words to be spelled on the deaf–blind individual's palm. The most common are "Lorm" and "Malossi". In the former, each letter corresponds to a particular sign or location on the palm of the interlocutor [33]; in the latter, the palm is touched following sequences associated with characters [34]. Another communication method, particularly used in Japan, is called "Finger-Braille". In this case, the user's fingers are used as the keys of a Braille typewriter. The six Braille dots can be represented on the index, middle and ring fingers of both hands (two-handed Finger-Braille method) or using the distal and proximal interphalangeal joints of the index, middle and ring fingers of a single hand (one-handed Finger-Braille method) [35,36].
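The dot-to-finger assignment of two-handed Finger-Braille can be made concrete with a small lookup table; the particular assignment below (dots 1–3 on the left hand, 4–6 on the right) is one common convention, stated here as an assumption for illustration:

```python
# Two-handed Finger-Braille: each of the six Braille dots is assigned to one
# finger, so a character is "typed" by tapping the listener's fingers.
FINGER_FOR_DOT = {
    1: ("left", "index"), 2: ("left", "middle"), 3: ("left", "ring"),
    4: ("right", "index"), 5: ("right", "middle"), 6: ("right", "ring"),
}

def tap_pattern(dots: set[int]) -> list[tuple[str, str]]:
    """Return the fingers tapped simultaneously to convey one Braille cell."""
    return [FINGER_FOR_DOT[d] for d in sorted(dots)]

print(tap_pattern({1, 2}))  # "b" -> left index + left middle
```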
Haptic technologies for deaf individuals

Haptic aids for deaf persons comprise devices dedicated to "listening and speech assistance" and to "communication and interfacing" purposes (Figure 2). The former play the specific role of audition substitution, partly allowing deaf individuals to listen through the tactile sense and to learn how to speak and modulate their voice properly. The latter can enhance social interactions between deaf persons and non-impaired people, with the purpose of catching attention and feeling sounds through the skin. A list of hearing aids is presented in Table 1, with specific focus on actuation principles. In the next sections, the discussion of deaf-oriented technologies follows the outline presented in Figure 2, first analyzing "listening and speech assistance" aids and then moving to "communication and interfacing" devices, with aggregation based on actuation technologies.

Listening and speech assistance aids

The development of listening and speech assistance devices started in the second half of the twentieth century, when the research carried out by Geldard led to the formulation of a tactile-based language similar to a tactile Morse code, the "Vibratese". In those years, improvements in the prototyping of assistive devices consisted mainly in the development of wearable aids, often based on the transposition of the basic components of speech, such as signal frequency and energy, into haptic information. Recent research has shown that the human voice can be characterized by two key acoustic components: the fundamental frequency and the formants. The fundamental frequency is the rate of vibration of the vocal folds, and is controlled by varying the tension and length, or surface area, of the vocal folds. Formants are instead the resonant frequencies of the vocal tract, responsible for the variation of vocal timbre, and are altered depending on the vocal tract's length [79]. Thus, speech can be considered a wide-band variable-frequency signal generator (represented by the vocal folds) driving a certain number of variable-frequency resonant filters (constituted by the vocal tract).

Vibrating motors

Vibrating motor technology is the most used in haptic applications. The actuation principle is based on the rotation of an eccentric mass driven by an electrical winding. Vibrating motors are characterized by a wide range of physical dimensions, from millimetres to centimetres, so they can be integrated within wearable devices where non-invasiveness is essential. The variability of their actuation parameters, such as transmitted power, vibration frequency and supply voltage, makes them suitable for the transmission of haptic information via customizable waveforms. The first wearable devices based on the "Vibratese" language were actuated by means of vibrating motors. In the one experimented by Geldard, the tactile code was transmitted on the user's trunk via five vibrating motors and could present 45 basic elements (letters, numbers, short words). Elements were coded by amplitude, duration and location, resulting in reading rates of approximately 38 words per minute [80], which can be considered a good achievement: studies on deaf children who received multichannel implants demonstrated that the average speech comprehension rate without lipreading, after 5 years of training, was about 44 words per minute [81]. To date, the "Vibratese" system seems not to be in use any more. Other researchers, focusing on both vibratory [54] and electrical stimulation [73,74], were able to improve speech learning performance in deaf children. Vocoders are the most used aids for listening and speech assistance. These devices analyze speech frequencies using a bank of filters to extract spectral data, and convert them into haptic stimuli through a vibro- or electro-tactile display. The speech frequency spectrum is divided into filter channels, each covering a range of frequencies. When the detected signal energy falls between the minimum and maximum pre-set thresholds for a channel, the filter output activates one of the haptic transducers, which delivers tactile feedback with an intensity proportional to the signal energy. Vocoders are developed using a row of stimulators on the skin for each filter band [44].
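A minimal sketch of the channel-vocoder principle just described — a bank of band-pass filters whose per-band energies, when above a preset threshold, drive individual tactile transducers. The channel count, band edges and threshold below are illustrative assumptions, as these varied widely across the reviewed devices:

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 16000  # audio sample rate (Hz)

def tactile_vocoder(speech: np.ndarray, n_channels: int = 8,
                    f_lo: float = 100.0, f_hi: float = 4000.0,
                    threshold: float = 1e-4) -> np.ndarray:
    """Split speech into log-spaced bands and return one drive intensity per
    tactile transducer, proportional to the band energy above a threshold."""
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)
    intensities = np.zeros(n_channels)
    for ch in range(n_channels):
        sos = butter(4, [edges[ch], edges[ch + 1]], btype="bandpass",
                     fs=FS, output="sos")
        band = sosfilt(sos, speech)
        energy = np.mean(band ** 2)
        # Only bands whose energy exceeds the preset threshold activate their
        # transducer, as in the filter-bank vocoders described above.
        intensities[ch] = energy if energy > threshold else 0.0
    return intensities

# Example: a 440 Hz tone concentrates its energy in a single channel.
t = np.arange(FS) / FS
print(tactile_vocoder(0.5 * np.sin(2 * np.pi * 440 * t)))
```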
Among vocoder pioneers, it is worth mentioning the work of Gault [55], dating back to 1928, and of Wiener [38], who experimented with "Felix", a haptic glove connected to a vocoder that displayed different frequencies using vibrating motors placed on each finger.

Table 1. Haptic technologies for deaf people.

Device | Ref. | Functionality | Actuation | Description | Readiness level | Year
OSCAR | [37] | Conversion of speech into tactile feedback | Vibrating motors | Hand-held device equipped with a microphone and two vibrators providing a transversal stimulation of the thumb and a perpendicular stimulation of the fingers and hand | Prototype | 1996
Felix | [38] | Conversion of speech into tactile feedback | Vibrating motors | Haptic glove connected to a vocoder that displays different frequencies using vibrating motors placed on each finger | Prototype | 1951
Monofonator | [42,43] | Conversion of speech into tactile feedback (single-channel technology) | Audio-speakers and oscillators | Device that conveys sounds and speech as haptic signals on the skin, amplifying the speech waveform through a single electromechanical transducer | Commercial | 1985
Tactaid 7 | [39] | Conversion of speech into tactile feedback | Vibrating motors | Seven vibrating motors on the skin | Commercial | 1999
Hey yaa | [40] | Catching users' attention | Vibrating motors | Two waist belts activated with a button and actuated with a vibrating motor (Vibe Board) | Prototype | 2011
Sonic Bomb with Super Shaker | [41] | Haptic alarm clock | Vibrating motors | Vibrating unit with vibrating motor | Commercial | 2015
Multi-channel speech aids | [44–48] | Conversion of speech into tactile feedback (multi-channel technology) | Vibrating motors | Devices that convert changes of the fundamental speech frequency into a tactile pattern using an array of transducers, or bi-dimensional displays where frequency and energy are coded along two perpendicular spatial axes | Commercial and prototypes | 1970–today
Haptic glove | [91] | Transposition of music into vibrotactile feedback | Vibrating motors | Device that converts music into vibrotactile stimulation via vibrating motors placed on the back and on the palm of a glove; vibrotactile signals can help the wearer to feel different music moods | Prototype | 2016
Voice pitch control system | [50–53] | Speech learning | Piezoelectric | Voice pitch control system that operates through a tactile display of 64 piezoelectric-actuated pins | Prototype | 2010–2014
Tactuator | [49] | Conversion of speech into tactile feedback | Head-positioning actuators | Two-channel tactile aid in which three vibrating rods stimulate the fingers based on acquired speech signals | Prototype | 2005
Vibro-tactile vocoders | [38,54–60] | Conversion of speech frequencies into tactile feedback | Solenoids | Separation of the frequency spectrum of speech and transmission on the user's skin, e.g., through small plungers connected to solenoids | Commercial and prototypes | 1928–today
Stereo-phonic devices | [61–63] | Conversion of sounds into tactile feedback | Solenoids | Stereophonic devices that help deaf individuals localize environmental sounds through vibrators on the arms and index fingers | Prototype | 1955–1978
Wearable tactile sensory aid | [64,65] | Conversion of speech into tactile feedback | Solenoids | Transducer array on the skin: an array of eight miniature solenoids in a plastic box worn on the forearm | Patented | 1986
Vocoder | [66] | Conversion of speech into tactile feedback | Piezoelectric | Vocoder equipped with an 8 × 4 vibrator matrix of piezo-electrically actuated pins, allowing the distribution of the time axis of speech over the finger surface and a better discrimination of consonants | Prototype | 1996
Haptic chair | [67,68] | Conversion of music and speech into tactile feedback | Audio-speakers and oscillators | Contact speakers on the back and arm of the chair | Prototype | 2009
Sounzzz | [69] | Conversion of music into vibrations and light | Audio-speakers and oscillators | Two audio oscillators and LEDs | Prototype | 2009
Emoti-Chair | [70] | Conversion of music into tactile feedback | Voice coils | Haptic chair equipped with 16 voice coils distributed on the back and seat, and a microphone for voice inputs that allows music composition and experiencing voice through vibration | Prototype | 2012
Model Human Cochlea | [71] | Conversion of music and movies into tactile feedback | Voice coils | Canvas chair whose back is equipped with a 2 × 8 array of voice coils as vibrotactile actuators | Prototype | 2009
Sophono | [72] | Audition restoration | Rare-earth magnets | Device with an external receiver that converts sounds into vibrations and sends them to the skull bone via a transducer implanted under the skin | Commercial | 2015
SoundBite | [75,76] | Audition restoration | Rare-earth magnets | Device in which vibrations are generated by a piezoelectric transducer and transmitted to the skull bone by means of the teeth | Commercial | 2011
"Tacticon" vocoder | [74] | Conversion of speech into electrical stimulation of the skin | Electrodes | 32-channel electrotactile vocoder equipped with electrodes worn on the abdomen | Commercial | 1989
Audiotact | [73,74] | Conversion of speech into electrical stimulation of the skin | Electrodes | Separation of the frequency spectrum of speech and transmission to electrodes on the user's skin | Commercial | 1990
Tickle Talker | [77,78] | Conversion of speech into electrical stimulation of the skin | Electrodes | Electro-tactile device that stimulates the user's non-dominant hand with four rings equipped with eight electrodes | Prototype | 1995

Research on vocoders used as speech learning devices is presented by Guelke [56], Kringlebotn [57] and Pickett [58]. Their tests did not achieve successful results with deaf subjects, especially in the recognition of consonantal sounds. The experiments made in 1974 by Engelmann and his team at the Oregon Research Institute showed good results: they compared the effectiveness of their 32-channel vocoder prototype on hearing and deaf individuals, demonstrating that the performance of subjects was proportional to the amount of training they received. Furthermore, they found that deaf individuals can be trained to finely discriminate speech through the tactile sense, thus confirming the usefulness of vocoders for speech assistance [54,82]. Some tactile hearing aids amplify speech sound, transferring it to a single-channel electromechanical transducer (Gault et al. [55,42], Siemens "Monofonator" [43]), while others convert changes of the speech fundamental frequency into a tactile pattern using an array of transducers (multi-channel tactile aids) [44,45,46]. Multi-channel aids can code speech frequency by stimulating different areas of the skin, while the sound energy is represented by the vibration amplitude [59]. They can also be bi-dimensional displays where frequency and energy are coded along two perpendicular spatial axes [47,48]. Stimulation can be delivered discretely, via individual vibrators activated so as to be perceived separately, or continuously, via arrays of transducers transmitting stimulation patterns on the skin. Single-channel devices are simple, compact and lightweight, and can be easily worn; however, they only transmit speech rhythm, and the information is not as complete as in multi-channel aids [83]. Among multi-channel aids, "Tactaid 7" is a haptic amplification system made of seven channels, with seven vibrating motors placed on the user's neck, sternum, forearm and abdomen regions. Vibration parameters like frequency, amplitude, location and duration are proportional to the properties of the detected acoustic signals, and a filtering apparatus allows background noise suppression. The device is equipped with a T-coil, so it can be used with audio loop systems and plugged into personal frequency modulation (FM) systems for classroom use. The use of "Tactaid 7" by hearing-impaired individuals showed an enhancement of speech perception in comparison with the unaided auditory condition. Furthermore, different studies compared the capabilities of "Tactaid 7" to those of a previous device having only two channels, the "Tactaid II+", as assistive devices in speech production and perception, and in the detection of closed-set contrasts of environmental sounds. In this comparison, both "Tactaid II+" and "Tactaid 7" enabled subjects to discriminate phonetic contrasts, and impaired individuals reported a slight subjective preference for the "Tactaid 7" [39]. Detection of environmental sounds remains an open challenge for the development of hearing aids, both haptic hearing aids and partial audition restitution devices (see Figure 2 for the taxonomy). As a related example, initiatives like the "NSF/CISE Hearables Challenge" were established in recent years to support the development of hearing solutions which can overcome the issue of sustaining a clear conversation in noisy environments, i.e., to address the so-called "Cocktail Party Problem".
Another category of listening and speech assistance devices comprises those designed to support the haptic transposition of fricative sounds, reviewed in the work edited by Risberg [84]. Voiceless fricatives are consonantal sounds (e.g., sibilants) present in some languages, such as English. These sounds are generated when the expired breath passes frictionally through a narrow constriction of the vocal tract, and they correspond to high-frequency ranges. Some examples of fricative sounds are the dental "th" in "think", the


labio-dental "f" in "fine" or the post-alveolar "ss" in "mission". Risberg analyzes four types of systems: modulation systems [85,86], in which the sound signal is modulated with a carrier frequency of about 4 Hz; distortion systems [86,87], which distort sounds with low frequencies bypassing the detected sound; frequency-dividing systems [88], in which frequency transposition occurs by dividing the frequencies of a clipped speech signal; and vocoders [89,90]. A device representative of hand-held speech communication aids is "OSCAR": thanks to an embedded microphone, it detects environmental sounds and supports lip-reading. Transversal stimulation of the thumb and perpendicular stimulation of the fingers and hand are provided through two vibrators [37]. The use of wearable assistive devices, especially gloves, is quite common among researchers. An example of a haptic glove equipped with vibrating motors is the one by Mazzoni and Bryan-Kinns [91]. It is specifically designed to help the hearing-impaired experience the mood of music when watching movies, which are usually accessible only through the visual and hearing senses. The glove communicates emotions through vibrotactile stimulation delivered by five vibrating motors placed on the back and three on the palm of the hand. Modulation of the frequency and intensity of the vibrotactile patterns communicates different moods to the wearer. Within this category of haptic aids, we also introduce some devices in which the actuation is achieved via vibrating motors based on solenoid and voice coil technology. In 1986, Boothroyd [64,65] patented a wearable haptic sensory aid which transmits to the user's skin vibrational patterns proportional to the sounds and voice intonations detected by the device. The haptic interaction occurs by means of an array of eight miniaturized solenoids placed on the forearm. Intonations are important in speech because they can define the beginning and ending of sentences, and identify crucial words, questions and statements. Experiments demonstrated how this device could help deaf individuals improve their lip-reading and generate intelligible speech [64,65]. A 23-channel vocoder actuated by small plungers connected to solenoids was used for the stimulation of deaf subjects, delivering vibration to the fingers, forearm and legs (in the latter case with the best results). The impaired individuals were able to perform fine speech discrimination via the tactile sense. In the experiment, they were trained for 20–80 h to learn a vocabulary of 60 words and then understand sentences composed of these words. The authors also reported increases in memory and in the ability to discriminate, as well as in the number of words mastered [54]. Other solenoid-actuated technologies are the stereophonic devices by Gescheider [92,61], von Békésy [62] (awarded the Nobel Prize in 1961 for the characterization of the mechanics of the inner ear) and Richardson et al. [63], which help deaf persons in the localization of environmental sounds through vibrators on the arms and index fingers. Other technologies are actuated via voice coils, like haptic chairs. As an example, the "Emoti-chair" from the research of Baijal and colleagues [70] allows composing vibrotactile music and enhancing voice intonation thanks to 16 voice coils distributed on the back and the seat, and it also integrates functions related to communication and interfacing purposes.
Piezoelectric transducers

Among piezo-electrically actuated devices, we report an example of a vocoder based on this actuation principle, developed by Wada and colleagues [66]. They addressed an issue of conventional vocoders, in which the presentation of sequential tactile stimuli on the same point of the skin made it difficult to


discriminate consonants followed, and thus masked, by vowels. They developed an 8 × 4 vibrator matrix as an alternative to the conventional array of piezo-electrically actuated pins. This configuration allows the presentation of consecutive stimuli, representative of the speech content, from the left to the right of the finger. In this way, the delivery of tactile information on the same point is avoided, and the temporal sequence of speech is transformed into a spatiotemporal sequence of events displayed on the finger surface. This solution was demonstrated to improve the identification rate of monosyllables. The spatial distribution of the pins should be designed around skin sensitivity, considering that the spatial discrimination threshold on the fingertips varies from 0.5 mm [93] to 1.6 mm [94]; in this case, on the index fingertip, pins are spaced 1 mm apart in columns and 3 mm apart in rows. The vibration frequency of the pins provides proper sensitivity when it corresponds to the lowest absolute sensation threshold [66]. Another listening and speech assistance aid conceived for hearing-impaired individuals is the "voice pitch control" device. This system converts information from audio to vibrotactile format by means of a display endowed with 64 piezoelectric-actuated pins [50–53].

Audio speakers

Some devices actuated by audio speakers are described in this section. Haptic chair prototypes found in the literature allow deaf individuals to experience, via the tactile channel, sensations related to listening to music. The "Haptic Chair" from the National University of Singapore is designed to enable the experience of music for deaf persons: sounds registered from the environment are translated into vibrotactile feedback on the feet, back and hands of the subject [67,68] using audio speakers integrated in the chair. The chair by Karam et al. [71], called the "Model Human Cochlea" (MHC), is a canvas chair the back of which is equipped with a 2 × 8 array of voice coils as vibrotactile actuators. It is an interface for entertainment, allowing one to listen to music or watch a movie without using the hands or arms. "Sounzzz" is instead a visual and tactile MP3 player that can convert sound into vibrations. The vibration of the audio speakers embedded in the player allows the user to feel the music while hugging the device. It is made of soft urethane and is also equipped with LEDs [69].

Communication and interfacing aids

A variety of haptic hearing aids covers the main needs of a hearing-impaired person living in a community, like communication with other persons and interaction with the surrounding environment. Some of these devices are fully available on the market, while others are only prototypes. In both cases, experimental data show that these technologies are able to enhance, or partially substitute, the hearing sense. In this section we describe the technologies listed in Figure 2 under "Communication and interfacing".

Vibrating motors

Some haptic sensory substitution devices integrate vibrating motors as actuators. These haptic technologies integrate rotating-mass or hard-disk head-positioning actuators. "Hey yaa" is used to improve distance communication for deaf persons through tactile vibrations on the waist. Two waist belts equipped with vibrating motors are activated with a button that, when pressed, induces a vibration in the belt of the receiver, catching his or her attention [40]. Another category of devices dedicated to communication purposes, and mainly actuated by means of vibrating motors, is


represented by the so-called "alerting devices" ("Communication and interfacing", Figure 2). These are the most common haptic systems for deaf individuals on the market; they signal the occurrence of a particular event via vibrations or visual signals. A study by Harkins et al. investigates how to optimize the parameters of vibratory alerting signals; the authors underline the importance of the strength, length and patterns of vibration [95]. Other devices are conceived for daily living activities. Clocks and wake-up alarms wake the user with a soft vibration; as an example, the "Sonic Bomb with Super Shaker" (Sonic Alert) is an alarm clock with a bed shaker to be placed on the mattress [41]. Other technologies activate when the phone rings, while remote receivers situated in different rooms of the house can notify a person when an event occurs in another room. Portable vibrating pagers can alert deaf parents when their baby is crying [96,97]. Other prototyped devices convert sounds, like the ringing of telephones or doorbells, into tactile stimuli [44]. An example of a tactile aid actuated via hard-disk head-positioning motors is the two-channel display developed by Yuan et al. [49]. This display acts on the fingertips and was developed in order to give information about consonant voicing during lip-reading. The frequencies of the tactile signals employed for the bi-digital stimulation are proportional to the carrier frequencies of the two bands of speech; these frequencies are obtained from the modulation of the amplitude envelope of each band. The tactile device is actuated by means of three vibrating rods, each connected to a hard-disk head-positioning motor. Assistive technologies actuated via vibrating motors based on voice coils are also present. A voice coil is a winding of conductive wire immersed in a permanent magnetic field, which produces a force proportional to the current flowing in the conductor (F = B l i, with B the magnetic flux density and l the total length of wire in the field). This force can be used to produce linear or rotational motion, and thus for the fabrication of motors. As an example, the "Emoti-chair" from the research of Baijal et al. [70] is another haptic chair, this time developed to assist deaf persons in communicating and interfacing with other people. It allows composing vibrotactile music and enhancing voice intonation using 16 voice coils distributed on the back and the seat of the chair.

Haptic technologies for blind individuals

Haptic-assistive systems for blind people convert visual information into mechano-tactile stimulation, which can also involve multisensory feedback in devices that elicit auditory cues in addition to cutaneous ones. Users require long training to effectively understand this kind of information and to develop the capability to orient themselves and detect obstacles in the surrounding space. The haptic technologies for blind individuals discussed here focus on improving the navigation and communication of the visually impaired. Although the most widespread devices on the market are "smart" canes, hand-held devices and Braille displays, research is oriented towards the improvement and diffusion of easy-to-handle and wearable aids which can accomplish both of these purposes. In the next sections, we go through haptic aids dedicated to visual disabilities. We first describe devices specific to in- and out-door navigation, including wearable and non-wearable ones, and then those for communication and interfacing purposes, with aggregation based on actuation technologies as presented in Figure 2.
A list of aids for visually impaired people is presented in Table 2, with specific focus on actuation principles.

Navigation aids

With the expression "navigation aids", we refer to assistive devices used to help sensory-disabled individuals in the exploration of indoor and outdoor environments. This category comprises ordinary aids like guide dogs and white canes, on which blind people usually rely. However, these aids have some limitations and are not always affordable: white canes give limited information, while guide dogs are expensive, have a working life of between 6 and 8 years, and their speed is difficult for some individuals to manage [98]. Assistive devices for navigation also comprise "smart" devices capable of detecting obstacles around the user via different sensing technologies. Recent prototypes include GPS sensors and interact with environmental beacons. Examples of these systems are Electronic Travel Aids (ETAs), Electronic Orientation Aids (EOAs) and Position Locator Devices (PLDs) ("Navigation", Figure 2) [99]. Among these, only ETAs convert visual information into tactile stimuli, while EOAs and PLDs deliver audio information. ETAs are based on ultrasonic (short-range) and/or infrared (long-range) technologies for obstacle detection, and sometimes include video acquisition sensors. Haptic stimulation is achieved with tactile stimuli delivered to different body locations through vibro- or electro-tactile signals. More recently, thanks to the widespread use of smartphones, travel aids are interfaced with apps helping the visually impaired user to navigate both out- and in-doors.

Vibrating motors

Electronic Travel Aids with vibrating motors are probably the widest category of navigation devices for the visually impaired, due to the high stimulation intensity that such actuators can deliver. We list in the following some of the implementations that are characteristic of this group, starting with wearables. The "People Sensor" distinguishes between animate (humans) and inanimate obstacles, and communicates their nature and distance through vibrational cues on the wrist [100]. The "Sunu Band" (Sunu Inc.) is a wristband with sonar sensors and vibrating transducers. It gives vibratory feedback when approaching an obstacle and can also interact with beacon technology [101]. Kammoun et al. [102] experimented with a system composed of two wristbands transmitting information by means of vibrating motors. Head-mounted systems are equipped with stereo cameras for scene acquisition, vision algorithms and a tactile display to represent detected obstacles. The helmet developed by Mann et al. [103] is a head-mounted device equipped with a Kinect camera for object detection and a vibrotactile array on the forehead, actuated with vibrating motors. The "Tyflos" navigation system by Dakopoulos et al. consists of glasses with cameras and a four-by-four array of small vibrators attached to the user's chest for haptic stimulation [104]. Based on his study of computer vision systems, Adjouadi formulated an algorithm which identifies "safe" paths in the surroundings to assist navigation. Information acquired by a camera can be used to describe the environment to a visually impaired person and to identify safe paths, meaning paths clear of depressions, obstacles and drop-offs. Directions to take and obstacles are communicated through vocal advice, with the possibility of introducing a tactile module (not yet implemented) specifically dedicated to the representation of the clear route [105].
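A minimal sketch of the distance-to-vibration mapping underlying many of the ETAs just described; the range limits and the linear inverse mapping are illustrative assumptions rather than the behaviour of any specific device:

```python
def vibration_command(distance_m: float, d_min: float = 0.3,
                      d_max: float = 3.0) -> float:
    """Map an ultrasonic range reading to a motor drive level in [0, 1]:
    full vibration at or below d_min, off beyond d_max, linear in between."""
    if distance_m <= d_min:
        return 1.0
    if distance_m >= d_max:
        return 0.0
    return (d_max - distance_m) / (d_max - d_min)

# Closer obstacles produce stronger vibration, e.g.:
for d in (0.2, 1.0, 2.5, 4.0):
    print(f"{d:>4} m -> drive level {vibration_command(d):.2f}")
```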
Haptic vests are commonly actuated by a matrix of vibrating coin motors, allowing the transmission of spatiotemporal vibrational patterns. Arrays of vibrators are usually placed on the torso or on the upper-back region, which is particularly suitable for

Table 2. Haptic technologies for blind people.

Device | Ref. | Functionality | Actuation | Description | Readiness level | Year
Optacon | [151,152] | Conversion of text into tactile feedback | Piezoelectric | Array of vibrating tactile rods | Commercial | 1970
Mimizu | [161] | Drawing for the visually impaired | Piezoelectric | Pen interacting with a pin array that allows drawing for blind users | Prototype | 2002
Braille display | [155] | Information about text | Piezoelectric | Braille display equipped with an array of 16 × 2 vibratory pins that transmits displayed text information to blind computer users | — | —
Braille display | [156] | Lateral skin deformation display | Piezoelectric | Braille display able to communicate a single line of Braille dots through lateral skin deformation instead of normal indentation | Prototype | —
Braille display | [162] | Conversion of text into Braille characters | Piezoelectric | — | — | —
Tactile display | [160] | Lateral skin deformation display | Piezoelectric | Array of pins connected to a membrane that provides only lateral stretch stimulation of the skin | Prototype | —
Tactile display | — | Object detection | Piezoelectric | A video acquisition camera converts data about object localization into vibrotactile stimuli delivered directly on the skin through an array of piezo-electrically actuated raised pins | Prototype | —
Graphic Window Professional | [163] | Image representation | Piezoelectric | Haptic display representing images from a PC through lines of piezoelectric elements | Commercial | —
Dots View DV-2 | [164,165] | Image representation | Piezoelectric | Haptic display based on piezoelectric technology, characterized by a wide reading aperture | Commercial | —
HyperBraille | [159] | Image representation | Piezoelectric | Haptic display embedding a matrix of 7200 dots actuated with piezoelectric reeds | Prototype | 2010–2015
STReSS | [153] | Lateral skin deformation display for tactile movies | Piezoelectric | Matrix of piezo-electrically actuated pins that allows the representation of "tactile movies", refreshing tactile images at a frequency of 700 Hz | Prototype | 2003
BrailleNote | [29] | Computer interface | Piezoelectric | Equipped with 40–80 Braille cells, with raised dots activated by piezoelectric technology | Commercial | —
Graphic visualization on PC | [135] | Computer interface | Piezoelectric | Graphic tablet used to control inputs with the dominant hand, while the non-dominant hand receives tactile feedback through a pin-actuated mouse | Prototype | —
Navigation glove | [136] | Navigation | Piezoelectric | Glove that detects environmental information by means of a stereo camera, actuated with a piezoelectric buzzer on each finger | Prototype | —
BrailleBack | [138] | Interfacing app for Braille displays and Android devices | Piezoelectric (Braille cells) | Application for the connection of different refreshable Braille displays to an Android device via Bluetooth technology | Commercial | 2013
MoBraille | [137] | Interfacing app for Braille displays and Android devices | Piezoelectric (Braille cells) | Framework for the connection of a refreshable Braille display to an Android device | Prototype | 2011
BrailleType | [185] | Interfacing app for Braille text entry on Android devices | — | Application allowing Braille-based text entry on a touch-screen device | Prototype | 2011
TeslaTouch | [180] | Surface haptic display | Electro-vibration | Haptic display that can represent images, actuated through the application of a differential voltage | Prototype | 2010
Color detection glove | [169] | Conversion of colours into tactile stimuli | Electromagnetic | Glove equipped with color sensors on the thumb and index fingers, interacting with electromagnetic vibrotactile transducers (tactors) placed on armbands | Prototype | —
Tactile display | [172,173] | Tactile pixel pneumatically activated | Pneumatic | Tactile pixel made of a polysilicon membrane actuated by means of a MEMS pneumatic micro-valve | Prototype | —
Pneumatic system | [170] | CO2 injection needle | Pneumatic | Needle inflating a CO2 jet, moved by a tactile plotter in order to draw characters directly on the skin | Prototype | —
Tactile display | [171] | Tactile display pneumatically activated | Pneumatic | Display actuated by the hydraulic pressure of paraffin wax expansion, electro-thermally controlled | Prototype | —
Phorm | [164] | Tactile display | Pneumatic | Bubble display composed of a flexible surface that can be shaped into edges and bumps thanks to a fluid injection causing the expansion of the top polymer | Commercial | 2015
Visotoner | [174] | Conversion of visual information into tactile feedback | Vibrating motors | Device converting video inputs into haptic vibrations on the fingers | Prototype | 1963
VideoTact | [176] | Conversion of visual information into tactile feedback | Vibrating motors | Camera system electronically interfaced with a flexible pad of vibrators placed on the skin of the stomach or back | Prototype | —
Tactile graphic representation | [175] | Conversion of images into tactile feedback | Vibrating motors | Device that allows the representation of visual diagrams on a tactile display, combined with auditory information | Prototype | —
VIDET Glove | [181] | Conversion of colours into tactile stimuli | Headset loudspeakers | Glove equipped with tiny headset-loudspeaker stimulators on three forefingers, referring to the red, green and blue dimensions, representing colours by means of variations of vibrotactile frequency | Prototype | 1998
People Sensor | [100] | Vision aid to detect people around the blind user and communicate distances | Vibrating motors | ETA capable of differentiating between animate (human) and inanimate obstacles, detecting them through ultrasonic sensors and interacting with the user by means of vibrating motors on the wrist | Prototype | 1998
Kahru | — | Navigation | Vibrating motors | Six vibrating motors in contact with the skin and embedded in a vest | Prototype | —
Navigation vest | — | Navigation | Vibrating motors | 120 vibrators placed on a fleece vest on the torso | Prototype | —
Wearable navigation device | — | Navigation | Vibrating motors | Shoe-mounted device detecting obstacles thanks to IR sensors, conveying information through a vibrotactile display of vibrating motors on the arm | Prototype | —
Wearable navigation belt | — | Navigation | Vibrating motors | Belt equipped with eight vibrators around the waist giving obstacle information, interfacing with GPS and a host PC | Prototype | —
Mowat | — | Navigation | Vibrating motors | Hand-held device which detects obstacles with ultrasonic sensors, transmitting haptic vibration through vibrating motors | Commercial | 1977
ActiveBelt | [106] | Navigation | Vibrating motors | Waist belt equipped with eight pager motors interfacing with environmental waypoints | Prototype | 2004
Navigation belt | — | Navigation | Vibrating motors | Belt equipped with 14 vibrating motors that activate after the visual analysis from two web cameras mounted on another belt | Prototype | —
Tyflos | [104] | Navigation | Vibrating motors | Two cameras placed on dark glasses, with microphone and speaker for audio feedback, interacting with a 4 × 4 array of vibrators on the user's chest | Prototype | 2008
Vision environmental aid | — | Conversion of visual information into tactile feedback | Vibrating motors | Belt equipped with seven vibrating motors interacting with computer vision systems | Prototype | —
Insole for navigation | — | Navigation | Vibrating motors | Insole equipped with a vibrotactile array of four or sixteen vibrators | Prototype | —
Hand-free navigation tool | — | Navigation | Vibrating motors | Camera motion-capture system interacting with markers on the wearer's feet, transmitting vibration through an array of 32 vibrotactile actuators on the leg | Prototype | —
Navigation system | [102] | Navigation system, wristband-actuated | Vibrating motors | Two wristbands equipped with a vibrating motor | Prototype | 2012
Sunu Band | [101] | Navigation | Vibrating motors | Wristband equipped with sonar sensors and vibrating transducers giving vibratory feedback when approaching an obstacle; it can also interact with beacon technology | Commercial | 2015
Obstacle detection device | — | Haptic shoe and spectacles | Vibrating motors | Shoe equipped with three pairs of ultrasonic transducers and vibrating motors on the shoe collar for haptic output, with interacting spectacles for the detection of obstacles at head level | Prototype | —
Lechal | — | Navigation by means of haptic feedback in the insole of shoes | Vibrating motors | Insole electronic unit equipped with a vibrating motor | Commercial | 2014
UltraCane | — | Smart cane for navigation | Vibrating motors | The handle of the cane is equipped with ultrasonic sensors and actuated by two vibrating buttons | Commercial | —
LaserCane | — | Smart cane for navigation | Vibrating motors | Smart cane that detects obstacles by emitting light beams and gives audio and tactile feedback on the user's hand | Commercial | —
Haptic cane | — | Smart cane for navigation | Vibrating motors | Cane handle equipped with ultrasonic sensors giving feedback through a tactile display on the trunk or hand | Prototype | —
Haptic white cane | — | Smart cane for navigation | Vibrating motors | The handle of the cane is equipped with ultrasonic and infrared sensors and actuated by three coin vibrating motors | Prototype | —
Haptic white cane | — | Smart cane for navigation | Vibrating motors | The handle of the cane is equipped with ultrasonic sensors and actuated by vibrating motors | Prototype | —
Interactive cane | — | Smart cane for navigation | Vibrating motors | The handle of the cane interacts with environmental beacons and is actuated by one eccentric-mass motor | Prototype | —
Haptic cane handle | — | Haptic cane handle | Vibrating motors | Haptic cane handle equipped with four eccentric rotating-mass motors | Prototype | —
Navigation system | — | Navigation | Vibrating motors | Outdoor mobility aid system consisting of a smartphone, path detectors and GPS, interacting with a smart cane | Prototype | —
Waist belt | — | Navigation | Vibrating motors | Waist belt equipped with 13 vibrators around its circumference, giving a continuous stimulation | Prototype | —
Shape recognition | — | Conversion of visual to tactile stimuli | Vibrating motors | Array of vibrators placed on the user's back interacting with a camera module, translating image contours for the user | Prototype | —
Haptic vest | [140] | Conversion of visual to tactile stimuli | Vibrating motors or SMA | Vest of stretch fabric equipped with tactors of electric motors in a 3 × 3 matrix, or of SMA actuators in a 2 × 2 matrix | Prototype | —
Haptic vest | [122] | Haptic vest for navigation | Vibrating motors | Vest interacting with external localization systems, equipped with an 8 × 8 matrix of vibrating coin motors | Prototype | —
Haptic vest | [139] | Haptic vest for navigation | SMA | Vest equipped with arrays of SMA actuators | Prototype | —
Navigation system | [108] | Navigation system, belt-actuated | Vibrating motors | GPS, visual odometry, pedometry and radios interacting with a tactile belt equipped with vibrating motors | Prototype | —
Navigation system | [109] | Navigation system, belt-actuated | Vibrating motors | External localization systems interacting with a belt equipped with eight coin vibrating motors | Prototype | —
Navigation system | [110] | Navigation system, belt-actuated | Vibrating motors | GPS, sonar and wi-fi technologies interacting with a belt equipped with four vibrating motors | Prototype | —
Navigation system for museum exploration | [123] | Navigation | Vibrating motors | Mobile guide comprising a software interface to be integrated with a mobile guide and a two-channel haptic module acting on the index and thumb fingertips | Prototype | —
Haptic helmet | [103] | Haptic helmet for navigation | Vibrating motors | Helmet equipped with a Kinect camera for object detection and a vibrotactile array on the forehead | Prototype | 2011
Tacit | — | Sonar haptic glove to detect obstacles | Vibrating motors | Glove equipped with sonar sensors and two vibrating motors | Prototype | 2011
WalkyTalky | — | Android application for navigation | Vibrating motors | Application providing vibrations to the smartphone when the user walks the wrong way | Commercial | 2011
UbiBraille | — | Communication | Vibrating motors | System made of six rings on the central fingers of both hands, on which vibrating coin motors communicate Braille characters through vibration | Prototype | 2013
Direction indicator for dangerous areas | — | Navigation device | Stepper motors | Hand-held device providing haptic pulses and generating a pulling force in the desired direction through a stepper motor interacting with a swinging slider-crank mechanism | Prototype | —
Computer vision system | [105] | Navigation aid | — | Computer vision system able to identify safe paths in the surroundings, whose algorithms can be applied to a navigation aid for the blind communicating through tactile feedback or vocal advice | Prototype | —
PhanToM | — | Force display | — | Force display equipped with an end-effector which gives haptic feedback to the user's fingers and hands | Commercial | 1992
Tactile display | [168] | Dynamic tactile display | SMA | Dynamic Braille display equipped with an array of SMA actuators and a magnetic latch | Prototype | —
Tactile display | [167] | Soft tactile display | Electro-active polymers (EAP) | Soft polymer embedding a matrix of dielectric elastomer dots; when a voltage is applied, the dots undergo expansion or compression, transmitting information to the fingertips | Prototype | —
Tactel | [183] | Tactual element or array | Electro-rheological fluids (ERF) | Cylindrical tactual element made of a cell filled with an ERF and actuated by an electric field on the cell's walls that activates the fluid; tactual elements can also be organized in 4 × 4 arrays | Prototype | —
VIDET | [142] | Navigation system, wire-actuated | Wires | Wearable robotic system interacting with a vision system and a wire-actuated haptic interface transmitting the scene's shape to the user's fingers | Prototype | —
CyARM | [143] | Navigation | Wires | Hand-held device equipped with ultrasonic sensors, communicating obstacle distance through the length of a wire connected to the user's belt | Prototype | 2005
Navigation device interacting with visual inputs | [133,134] | Navigation | Solenoids | Glasses equipped with a camera converting visual signals into tactile stimuli through an array of pins actuated by solenoids | Prototype | —
Haptic dental chair | [258] | Conversion of visual information into tactile feedback | Solenoids | System including a TV camera for video input, converted into haptic stimuli through a matrix of polarized solenoid stimulators placed on the back of a dental chair | Prototype | —
Moose | [259] | Computer interface for graphics | Voice coils | Actuated by a manipulandum coupled to two linear voice coil motors that can change the workspace texture to transmit the graphical representation on the PC's display | Prototype | —
Novint Falcon and computer interface | [146] | Learning system | Robotic arms | PC-based application allowing the use of the haptic "Novint Falcon" system for scientific learning | Commercial | —
Navigation system | [144] | Navigation | Robotic arms | System involving a telepresence robot, the RGB-D sensor of a camera and a haptic interface for feedback | Prototype | —
Haptic system | [256] | System for appreciating the shapes of pieces of art in museums | Haptic exoskeleton | Haptic device for the upper limb allowing the visitor of a museum to feel the shapes of statues | Prototype | —
Hand-held device for navigation | [219,255] | Navigation | — | Laser range sensors detect information about obstacles and send it through a tactile interface (not developed) | Prototype | —
Tongue stimulator devices | [145] | Navigation | Electrodes | Device equipped with a video camera for environment analysis mounted on a pair of glasses, giving tactile feedback through an array of electrodes placed on an orthodontic retainer | Prototype | —
Navigation glove | [141] | Navigation | Electrodes | Glove equipped with electrodes giving navigation information by means of transcutaneous electrical nerve stimulation (TENS) | Prototype | —
Intelligent Glasses | [182] | Navigation | — | Eye-glasses equipped with stereo cameras for scene acquisition, interacting with vision algorithms and a tactile display for the representation of information | Prototype | —

System that includes a TV camera for video input which is converted in haptic stimuli through a matrix of polarized solenoid stimulators placed on the back of a dental chair Actuated by a manipulandum coupled to two linear voice coil motors, that can change the workspace texture to transmit graphical representation on the PC’s display PC-based application allowing the use of the haptic “Novint Falcon” system for scientific learning System involving a telepresence robot, a RGB-D sensor of a camera and a haptic interface for feedback A haptic device for the upper-limb allows the visitor of a museum to feel shapes of statues

Vest of stretch fabric equipped with tactors of electric motors in a matrix of 3  3 elements, or of SMA actuators in a 2  2 matrix Dynamic Braille display equipped with an array of SMA actuators and a magnetic latch Eye-glasses equipped with stereo cameras for scene acquisition interacts with vision algorithms and a tactile display for the representation of information Soft polymer in which a matrix of dielectric elastomer dots is embedded and, when a voltage is applied, dots undergo an expansion or compression, transmitting information to fingertips Cylindrical tactual element made of a cell filled of an ERF and actuated by an electric field on cell’s walls that activates the fluid. Tactual elements can be also organized in 4  4 arrays Wearable robotic system interacting with a vision-system and a wire-actuated haptic interface transmitting the scene’s shape to the users’ fingers Hand-held device equipped with ultrasonic sensors communicates their distance through the length of a wire connected with user’s belt Glasses equipped with a camera converts visual signals in tactile stimuli through an array of pins actuated by solenoids

Description

Downloaded by [Southern Cross University] at 18:39 13 October 2017

Year

Prototype

Prototype

Prototype

2005

1998–2006

2004

2004

2014

Prototype Prototype

2015

1997

1970

2009

Commercial

Prototype

Prototype

Prototype

2009

2000

Prototype Prototype

2004

2008

2006

2005

2004

Prototype

Prototype

Prototype

Prototype

Prototype

Readiness level

12 F. SORGINI ET AL.

Downloaded by [Southern Cross University] at 18:39 13 October 2017

HAPTIC AIDS FOR AUDITION AND VISION DISABILITIES

haptic stimulation since this body area has a relatively high density of mechano-tactile receptors [106]. Tactile vests are commonly interfaced with other navigation technologies for the wireless communication of the positions of objects in the surroundings. The vest by Akhter et al. integrates an array of 8 × 8 motors and is actuated with signals coming from smartphones and computer vision systems [107]. Other examples of haptic vests equipped with arrays of vibrating motors are the “Kahru” system by Gemperle et al. [108], the device designed by Jones [109] and the fleece vest by Van Veen et al., which integrates a high-density haptic array with 120 vibrators [110]. Haptic belts are wearables less cumbersome than vests, but in comparison they allow the stimulation of only a limited body area. These devices are generally equipped with a host PC, computer vision systems, GPS, sonar and Wi-Fi, like those by McDaniel et al. [111] and Van Erp [112]. Other devices interact with smartphone apps to provide navigation information, like the one from Auburn University [113], or include a buzzer for auditory feedback together with tactile stimulation, as in the prototypes by Flores [114] and Gilson et al. [115]. Vibrators embedded in belts can be arranged along different directions and with variable densities. “ActiveBelt” stimulates the skin using eight vibrators around the torso [116], the belt by Nagel et al. [117] gives a continuous stimulation around the waist via 13 vibrators distributed along its circumference, while the one by Wu transmits information about objects’ shapes to an array of vibrators on the user’s back [118]. The prototype by Johnson is distinguished from the previous ones by the presence of two web cameras for scene acquisition, while actuation is delivered via 14 vibrating motors [119]. Shoe-integrated navigation systems consist of smart insoles or footwear able to detect obstacles and to transmit haptic vibrations to the feet or other body locations. “Lechal” (Ducere Technologies) is a commercial haptic shoe, actuated by small vibrating motors and interacting via Bluetooth with a smartphone app [120]. A similar insole is equipped with a vibrotactile array of sixteen motors (reduced to four in later versions) [121–123]. Other shoe-mounted devices can send haptic information to arrays on different body areas; an example is a system equipped with infrared sensors interfaced with a vibrotactile display of two vibrating motors mounted on the arm [124]. The rehabilitative shoe by Abu-Faraj et al. adds to these technologies a pair of spectacles for the detection of head-level obstacles [125], while Lobo et al. proposed a camera-based motion capture system interacting with markers on the wearer’s feet and lower leg; in this case, haptic feedback is provided through an array of 32 vibrotactile actuators on the wearer’s leg [126]. Another group of wearable navigation aids are haptic gloves, enabling the detection of environmental information with range sensors and delivering haptic vibration over the hand’s skin. An example is “Tacit” (Grathio Labs), equipped with sonar sensors and actuated through two servo motors [127]. Hand-held devices are another category of haptic aids transmitting navigation information directly to the hand. Although these devices can limit the use of the user’s hands, since they must be held constantly, some of them are already on the market, like the “Mowat” sensor (Wormald International Sensory Aids) [128].
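The belt displays above share a simple direction-coding scheme: the vibrator closest to the target bearing is driven. A minimal sketch, assuming eight evenly spaced motors (as in “ActiveBelt”) indexed clockwise from the front of the torso; the set_motor() driver function is hypothetical:

NUM_MOTORS = 8  # e.g., one vibrator every 45 degrees around the torso

def bearing_to_motor(target_bearing_deg: float, heading_deg: float) -> int:
    """Pick the belt vibrator closest to the direction of the target.

    Both angles are in degrees; motor 0 is assumed to sit at the navel
    (straight ahead), with indices increasing clockwise.
    """
    relative = (target_bearing_deg - heading_deg) % 360.0
    return round(relative / (360.0 / NUM_MOTORS)) % NUM_MOTORS

def set_motor(index: int, intensity: float) -> None:
    # Hypothetical driver call; a real device would send a PWM command here.
    print(f"motor {index} -> intensity {intensity:.2f}")

# Example: target 30 degrees east of north while the user faces north.
set_motor(bearing_to_motor(30.0, 0.0), 1.0)  # activates motor 1 (front-right)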
Other hand-held devices are still prototypes, like the “mobile guide”, which allows blind individuals to orient themselves among tagged artworks in museums [129]. It comprises a software interface to be integrated with a mobile museum guide, and a two-channel haptic module stimulating the index and thumb fingertips via two vibrating rings. It proved successful in artwork localization and is effective with both tactile and auditory cues.


The most common hand-held devices are “smart” canes, which comprise a sensorized handle serving as a tactile interface and a cane fixed to it. Several studies focused on the identification of the best stimulation parameters for the transmission of information about obstacles. The studies by Kim [130] pointed out that four-finger stimulation through tactors, combined with a spatiotemporal variation of stimuli, can lead to correct identification of information. The smart cane by Gallo et al. is equipped, together with haptic stimulators, with a dual-axis gyroscope for orientation detection [131], while the one by Wang et al. [98] interacts with environmental beacons and is actuated by one eccentric mass motor. Fanucci et al. [132] implemented a mobility aid system for visually impaired persons in outdoor environments and tested it within Lucca’s city walls. The system comprises a smartphone, path detectors and GPS interacting with a smart cane, which conveys vibrotactile information to the hand, guiding the blind person along a safe path. The “interactive cane” (BlindMaps) instead gives real-time information on the surroundings, interesting places and crossroads; it interfaces with the user’s smartphone and interacts with environmental beacons [133]. “UltraCane” (Sound Foresight Technology Ltd) is already on the market and also detects obstacles at head level [134], while “LaserCane” (NurionRaycal Industries) detects obstacles by emitting light beams and gives audio and tactile feedback directly on the user’s hand [135]. The prototype by Calder can be used in hands-free or stand-alone mode, transmitting tactile feedback on the user’s trunk or on the cane’s handle, respectively [136]. Smartphones can also serve as substitution aids for navigation purposes thanks to the GPS, compasses and maps integrated on-board, with haptic feedback achieved via the embedded vibrators. An example is “WalkyTalky” [137], an accessibility application for Android phones which supports navigation for visually impaired users. The application pronounces the addresses of nearby locations as the user passes them; it works in combination with spoken walking directions from Google Maps and also provides vibrotactile feedback. The last device we present in this category is a prototype actuated via a stepper motor, designed by Amemiya to help blind persons escape from dangerous areas. It generates pulling forces in the desired direction using a stepper motor interacting with a swinging slider-crank mechanism [138]. Regarding haptic displays, it is worth mentioning an array of pins actuated by solenoid-based motors [139,140] and interacting with the user’s palm.

Piezoelectric transducers and Shape Memory Alloy technology

In this paragraph, we group sensory aids whose actuation technology is based on the variation of the geometrical shape of an element, i.e., piezoelectric and SMA actuators. Among piezoelectrically actuated navigation wearables, Zelek et al. [141] developed a navigation glove able to detect environmental information thanks to stereo cameras; it is actuated by piezoelectric buzzers, one per finger. Some recent applications enable visually impaired persons to interact with smartphone displays. “BrailleBack” allows users to connect different refreshable Braille displays to the smartphone; the displays are piezoelectrically actuated and controlled via Bluetooth [142].
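To make the underlying encoding concrete, here is a minimal sketch of how a character maps to the six-dot Braille cell such displays refresh (dots 1–3 down the left column, 4–6 down the right); the small letter table is an illustrative subset, not a full Braille implementation:

# Dots of a six-dot Braille cell, numbered 1-3 down the left column
# and 4-6 down the right column; each character maps to a set of raised dots.
BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
}

def cell_pattern(char: str) -> list[bool]:
    """Return the raised/lowered state of dots 1-6 for one character."""
    dots = BRAILLE_DOTS[char.lower()]
    return [d in dots for d in range(1, 7)]

print(cell_pattern("d"))  # [True, False, False, True, True, False]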
The “BrailleType” [143] and “MoBraille” [144] frameworks instead allow users to write Braille characters on the touchscreen. Examples of devices actuated through Shape Memory Alloy (SMA) technology are discussed in the following. One example is a haptic vest equipped with a matrix of SMA actuators for the transmission of spatiotemporal patterns [145,146]; another is a head-mounted system called “Intelligent Glasses” [147], in which a tactile display actuated through SMA elements delivers visual information from the environment to the user’s skin.

Wire-actuated devices and robotic arms and hands

In this paragraph, we discuss assistive technologies actuated by means of mechanical action involving relevant displacement of parts. First, we describe a couple of interesting prototypes actuated via wires that are mechanically extended or shortened according to information from the surroundings. The “VIDET” wearable robotic system described in the work of Arcara transmits information to a wire-actuated haptic interface that modifies the configuration of the user’s fingers according to the disposition of objects in the scene [148]. “CyARM” is a hand-held device for obstacle detection equipped with ultrasonic sensors. Distance from obstacles is communicated by varying the length of a wire connecting the device to the user’s belt: when approaching an obstacle the wire retracts, pulling the user’s arm back and giving the illusion of an imaginary arm extending from the obstacle [149]. We now report some technologies which involve the use of a robotic arm and are also used to support navigation. Worthy of particular attention is the prototype by Yuan et al. [150], in which a robotic arm transmits environmental information to the user. Navigation aids and interfaces also allow blind individuals to explore museums and exhibition sites, as reported in the work of Park et al. [151]. Their system involves a telepresence robot, an RGB-D camera sensor and a haptic interface (robotic arm) for feedback. Although this is not the only existing prototype dedicated to the haptic exploration of artworks in public exhibitions, it is the first that makes the user interact with a robot. In this field, it is worth pointing out the “Museum of pure form” from the research group led by Bergamasco [152]. In this project, a haptic device mounted on an upper-limb exoskeleton allows the visitor of a museum to interact with pieces of art: the pieces are scanned and inserted in a virtually reconstructed museum, where visitors can touch statues while viewing their shape in a virtual reality cave.

Communication and interfacing aids

The most common devices allowing blind people to communicate and interface with the external world are Braille displays (“Communication and interfacing”, Figure 2). Refreshable Braille displays (or Braille terminals) are electromechanical devices in which Braille characters are represented by means of pins piezoelectrically raised from holes on a flat surface. These systems allow blind users to read text output from computers, but they are quite expensive and can display only 40 or 80 Braille cells. A list of available typologies of Braille displays is presented by Russomanno et al. [153], with a focus on refreshable Braille displays that did not achieve a successful commercial diffusion or were developed for research purposes only. Experiments with these devices demonstrated that single-cell displays, which do not allow sliding exploration, decrease blind readers’ performance. Tactile Vision Substitution Systems (TVSS) (“Communication and interfacing”, Figure 2) [106,154,155] are a category of haptic devices which acquire visual information with cameras and deliver vibrotactile stimuli to the skin via a two-dimensional matrix of tactile stimulators.
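The processing chain of a TVSS can be summarized as downsampling each camera frame onto the coarse stimulator matrix. Below is a minimal sketch under simplifying assumptions (grayscale input, a hypothetical 20 × 20 vibrator grid, mean-brightness coding):

import numpy as np

def frame_to_tactor_matrix(gray_frame: np.ndarray, rows: int = 20, cols: int = 20) -> np.ndarray:
    """Downsample a grayscale camera frame to per-tactor drive intensities (0.0-1.0).

    Each tactor is driven by the mean brightness of the image block it covers,
    mimicking the camera-to-vibrotactile-matrix mapping of classic TVSS devices.
    """
    h, w = gray_frame.shape
    bh, bw = h // rows, w // cols
    # Crop so the frame divides evenly into rows x cols blocks.
    cropped = gray_frame[: bh * rows, : bw * cols].astype(float)
    blocks = cropped.reshape(rows, bh, cols, bw)
    return blocks.mean(axis=(1, 3)) / 255.0  # one drive value per tactor

# Example: a synthetic 240x320 frame with a bright diagonal line.
frame = np.zeros((240, 320), dtype=np.uint8)
for i in range(240):
    frame[i, i] = 255
drive = frame_to_tactor_matrix(frame)
print(drive.shape)  # (20, 20): intensities for the vibrator grid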
Studies from the 1970s demonstrated the skin’s ability to interpret visual information using these systems. The impaired individuals could recognize lines with vertical, horizontal and diagonal orientations, and the most experienced users could identify common objects and people’s faces [106]. Later studies by Bach-y-Rita confirmed the effectiveness of tactons [156]; the experimentation focused on the conversion of video information into vibrotactile stimuli on the abdomen, the back or the thigh, using arrays from 100 to 1032 points. Tablet and smartphone displays can be used to represent tactile images thanks to their embedded piezoelectric transducers or vibrating motors: when the finger touches the surface, the display vibrates in correspondence with a displayed object, allowing the detection of edges or interaction with maps. A drawback of these technologies, arising when the display is too big, is that vibration strength decreases in proportion to the distance from the interaction point. Representative devices of the aforementioned displays are presented in the following, according to the taxonomy in Figure 2, with aggregation based on actuation technologies.

Piezoelectric transducers, Electro-Active Polymers and Shape Memory Alloy technology

In this paragraph, we present some examples of haptic aids actuated via three technologies involving materials whose physical deformation, under different circumstances, generates a haptic effect. The first category we review is that of piezoelectric haptic aids. “Optacon” (OPtical to TActile CONverter) (Telesensory Systems Inc.) is one of the first communication aids developed for blind individuals for reading non-Braille text. An embedded camera module is moved across text lines with one hand, while the other hand explores a dynamic tactile array (24 by six metal rods) where the acquired information is displayed; the vibration intensity is regulated through an adjusting knob [157,158]. Looking at Braille displays more generally, some of them are enhanced with a text-to-speech functionality which complements the haptic feedback; an example is the “BrailleNote” (HumanWare) [159,160]. A particular category is represented by the “lateral skin deformation displays”, Braille displays able to communicate a single line of Braille dots through lateral skin deformation instead of normal indentation [161]. Hayward describes a similar display: an array of pins connected to a membrane selectively providing lateral stretch stimulation of the skin [162]. Such technology was also used to perform neuroscientific studies targeting a more profound understanding of somatosensory system mechanisms [163,164], which is a major requirement for the development of advanced sensory-substitution devices. Based on the same pinned approach, “STReSS” by Pasquero et al. allows the representation of “tactile movies” with a tactile image refresh rate of 700 Hz [165]. The object-detection device by Bliss is a piezoelectrically actuated TVSS aid equipped with piezoelectric raising pins [166]. Other devices allow visually impaired users to draw, like the one by Kobayashi and Watanabe, equipped with a tactile display and a stylus pen which can be pointed over the image, delivering tactile feedback to the user through the pins’ vibration [167]. Braille displays can also be used for the visualization of images and graphs, in most cases with the purpose of facilitating interaction with computers. In the research of Wall et al., a graphic tablet for graph visualization was developed; it is controlled with the user’s dominant hand, while the non-dominant one receives tactile feedback through a pin-actuated mouse [29]. Homma developed a tactile display equipped with an array of 16 by two vibratory pins which transmits information about text formatting to blind computer users; deaf individuals could also use it for speech communication [168]. The “Graphic Window Professional” (Handy Tech) can represent images from a PC through lines of piezoelectric elements [169]. A similar device is “Dots View DV-2” (KGS Electronics) [170], characterized by a wider reading aperture, while “HyperBraille” (Metec) [170,171] is a display that embeds a matrix of a high number of dots (up to 7200) and is actuated with piezoelectric reeds. The second category mentioned here is that of EAP, which were used in some prototypes for the development of haptic aids. The working principle of these devices is based on the EAP property of changing size or shape when stimulated by an electric field. A relevant example of these systems is the finger-tactile display developed by Koo et al. [172]. Initially designed to stimulate fingertips, its flexibility makes it adaptable to different body locations. It is made of a soft polymer embedding a matrix of dielectric elastomer dots: when voltage is applied, the dots expand or compress, transmitting information to the fingertips. The third technology mentioned here and also exploited for the development of tactile aids is the shape memory alloy. The property of SMA actuators of easily returning to their original shape when heated after being deformed makes them suitable for the development of haptic devices. An example of an SMA-based communication and interfacing aid is the display by Haga et al., consisting of a line of dynamic Braille characters and equipped with an array of SMA actuators and a magnetic latch [173].

Pneumatic actuators

Examples of refreshable Braille displays actuated by means of different pneumatic technologies are reported in the following. The Braille display by Yobas [174] consists of a polysilicon membrane actuated by means of a MEMS pneumatic micro-valve; the membrane inflates when pressure is applied to the pneumatic cell, allowing the user to perceive a deformation like a single Braille dot. The refreshable Braille cell by Lee [175] is instead actuated through melted paraffin wax, which increases its volume after a heating process. The wax is placed in containers sealed with a silicone rubber diaphragm; when the wax expands, the diaphragm deforms and raises the Braille dot. “Phorm” (Tactus Technologies) [176] is a commercial bubble display made with a flexible surface that can be shaped into edges and bumps thanks to microfluidics. In the tactile display made by Cunha [177,178], the mechanical stimulation of the skin occurs instead through CO2 jets expelled by an injection needle. The needle is moved by a tactile plotter that can draw characters on the user’s skin, and can be used for both blind and deaf–blind persons. A drawback of this system, with respect to the others described in this paragraph, is its size, which precludes wearable application. Portability is, in fact, an essential feature that can enlarge the application field of assistive devices.

Vibrating motors

The experiments carried out by Bach-y-Rita on TVSS led to the commercialization of “VideoTact” [179], a device which allows blind users to recognize the shape of simple objects and line orientations thanks to an array of vibrators acting on different areas of the skin.
Two other devices enabling vision substitution are the “Visotoner”, which converts video inputs into haptic signals through vibrators placed on the fingers, and a camera system electronically interfaced with a flexible pad of vibrators placed on the skin of the stomach or back [180]. Minagawa et al. instead proposed a device which represents visual diagrams on a tactile display, combined with auditory information [181]. “PhanToM”, already common on the market, is a force display equipped with an end-effector which gives haptic feedback to the user’s fingers and hands [182,183]. It is mainly used by non-impaired persons, but it can also be used in virtual reality by both visually impaired and deaf users [184]. Research on haptic substitution aids has addressed the problem of colour representation as well; some examples of devices enabling blind persons to appreciate colours are haptic gloves. The “Color detection glove” by Schwerdt et al. [185] is equipped with colour sensors on the thumb and index fingers, and interacts with electromagnetic vibrotactile motors (tactors) placed on armbands. A similar device is the “VIDET Glove” developed by Cappelletti et al. [186], but in this case the actuation is performed using tiny headset loudspeaker stimulators placed on three fingers, referring to the red, green and blue dimensions that code colours by vibrotactile frequency modulation. Although experimental results are satisfactory, these devices require a high cognitive load and long training. We mention here some devices actuated via vibrating motors based on solenoid technology. The dental chair in the research of Collins [187] is a TVSS developed in the shape of a haptic chair: polarized solenoid actuators stimulate the skin, depicting two-dimensional silhouettes of detected images. This actuation technology is also used for the development of displays enabling blind individuals to appreciate graphic elements and figures; subjects can follow the contours of tactile images with hand exploration and lateral motion of the fingers [170]. “Moose” is a bidirectional haptic computer interface which represents windows, menus and buttons as patches and lines. Basically a powered mouse, it gives a haptic representation of icons; the actuation is obtained with a manipulandum coupled to two linear voice coil motors through two perpendicular flexures [170].

Electro-Rheological Fluids and Electro-Vibration-based actuators

Two actuation technologies based on the variation of the properties of a system induced by an electric field are Electro-Rheological Fluids (ERF) and electro-vibration-based assistive devices. ERF are suspensions of non-conducting but electrically active particles in an electrically insulating fluid, whose viscosity changes reversibly in response to an electric field. In recent years, this technology has also been integrated into tactile displays. As an example, we mention “Tactel”, a display presented in the work of Klein et al. [188]. It is a cylindrical tactual element made of a piston that can move vertically in a gap filled with an ERF; metal electrodes on the walls of the cell generate an electric field activating the fluid. These tactual elements can also be organized in arrays, whose dimensions can vary according to the body area on which they are required, and can be used for the development of Braille displays or for the transmission of tactile sensations in virtual reality applications [189]. Other devices allow the representation of images thanks to the electro-vibration principle.
Haptic displays based on this technology transmit vibration to the user’s finger through the modulation of the friction created between the finger and the surface of a display: a time-varying voltage is applied between an electrode and an insulated ground plate, generating an electrostatic force between the plate and the finger. An example worth mentioning is “TeslaTouch” [190] which, for this reason, is considered a “surface haptic display”.
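In a commonly used first-order model of electro-vibration, the finger–display pair is treated as a parallel-plate capacitor, so the electrostatic component of the force depends on the square of the applied voltage:

\[
F_e(t) \approx \frac{\varepsilon_0 \varepsilon_r A \, V(t)^2}{2 d^2},
\qquad
F_{\mathrm{friction}} = \mu \bigl( F_n + F_e(t) \bigr),
\]

where $A$ is the finger–electrode overlap area, $d$ and $\varepsilon_r$ are the thickness and relative permittivity of the insulating layer, $F_n$ is the normal force exerted by the finger and $\mu$ is the friction coefficient. Because the force scales with $V(t)^2$, a sinusoidal drive is perceived at twice its frequency.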


Haptic technologies for deaf–blind individuals

“Communication and interfacing” technologies as well as alerting aids designed for deaf individuals can also address the deaf–blind population. The same is true for “navigation” technologies and “communication and interfacing” aids designed for blind persons; examples are haptic Braille displays for computer interaction and haptic ETAs. In the following paragraphs, we report some devices and prototypes specifically developed for deaf–blind people, according to the taxonomy presented in Figure 2. Their working principles are mostly inspired by the communication languages previously discussed. The discussion starts with navigation aids, some of which are also suitable for blind users; the section then concludes with communication and interfacing technologies and listening and speech assistance aids, which can also address deaf communities. In Table 3, haptic technologies for deaf–blind individuals are summarized and listed according to actuation principles.

Navigation aids

Navigation devices for deaf–blind individuals are based on various actuation principles and may stimulate several body locations (“Navigation”, Figure 2, listed with aggregation based on actuation technologies). In this section, we focus on the devices specifically conceived for deaf–blind individuals; assistive aids already presented for blind persons and also used by deaf–blind persons are not discussed again.

Vibrating motors

Here we treat tactile ETAs for deaf–blind persons actuated by means of vibrating motors. In order to communicate the user’s proximity to a street or a crossroad, Schmitz and Ertl designed a map display with a rumble gamepad [203]; it guides a cursor over the map, giving a vibration to communicate the user’s position. The “Tactile Handle” [201,202] consists of a 4 × 4 array of actuators placed on the fingers’ phalanxes and interfaced with proximity sensors and micro-controllers. A commercial hand-held device is the “Miniguide” [209], equipped with ultrasonic sensors to detect information about the environment. The experimental evaluation provided evidence that this device is reliable for obstacle detection and increases users’ functional mobility, free-time activities, community life and social interactions; however, it can be inefficient in crowded or snowy environments [209]. “Polaron” (Nurion Industries) [206] is another commercial device which can be mounted on a wheelchair, held in hand or worn around the neck; it detects obstacles with laser and ultrasound sensors and gives haptic feedback through vibrating motors. Concerning environmental systems, Max et al. developed a virtual interface consisting of beacons which interact with echo-sensors [205]. This system allows the exploration of a three-dimensional map and permits the appreciation of complex textures, providing feedback in the form of audio cues and vibrations. The field of wearables comprises the haptic system by Ogrinc et al. [210], conceived to help blind and deaf–blind individuals ride horses. The rider is informed about the directions the horse must take through haptic signals originating from two vibrating motors installed on the rider’s upper arms. These are wirelessly controlled by a smartphone held by the instructor, who can remotely guide the impaired rider; the rider can then pull the reins according to the vibration received on the left or right arm. An example of a shoulder-mounted system is instead the one developed by Cardin et al. [204], interfaced with sonar sensors and actuated by means of eight vibrators mounted along the shoulder line. “Animotus”, developed by Spiers and Dollar [211], is a cubic hand-held device composed of two halves. It is conceived to help sensory-impaired users in navigation, especially deaf–blind persons, but can also be addressed to blind and deaf persons. Information about the direction and distance of navigational targets is delivered to the user’s fingers via translation and rotation of the upper half; it is actuated via one servo motor and can optionally be equipped with a vibrating motor. An example of a haptic device whose embedded vibrating motors are based on solenoid technology is the “Finger-Braille” [199,200], a wearable ring-shaped device equipped with a vibrating motor or a solenoid motor on each ring; it is based on the Finger-Braille method and interacts with radio-frequency identification (RFID) tags. Worth mentioning among wearable haptic aids is the shoulder-tapping interface by Ross et al. [194]: a backpack where a 3 × 3 array of audio-speakers actuated by solenoids performs the same function as an array of vibrating motors, providing spatiotemporal vibration patterns to the user.

Communication and interfacing aids and listening and speech assistance aids

The devices for deaf–blind individuals that we discuss in this paragraph are those specified in Figure 2 under “Communication and interfacing aids” and “Listening and speech assistance aids”, listed with aggregation based on actuation technologies. These devices deliver information through haptic stimulation of the hand, mainly following the codification of manual sign languages. In this section, we focus on the devices specifically conceived for deaf–blind persons; assistive aids already presented for blind and deaf persons and also used by deaf–blind persons are not discussed again.

Robotic arms and hands

Based on a haptic robotic arm is the device developed by Murphy et al. [216], a PC-based application to improve math and science learning for students. They integrated the low-cost haptic interface “Novint Falcon” into the learning system; the interface is a commercial controller with three degrees of freedom. The student, holding the grip at the end of the controller, can interact with a virtual environment and receive haptic feedback when the cursor intercepts a virtual object on the screen. Results showed an improvement in math learning for students who used these applications in classrooms, confirming the effectiveness of haptic-based applications for educational purposes. “Parloma” is a system for remote communication between deaf–blind individuals using hand shapes. It consists of a low-cost depth sensor as an input device and a robotic hand to provide output to the receiver; communication is achieved through the web, similarly to instant messaging technologies [214,215].

Vibrating motors

Among vibrating-motor displays, we collected examples of devices characterized by various working principles. An example of a wearable device is the “Body-Braille” by Sarkar et al. [207]. This system allows long-distance communication between deaf–blind and blind users and both disabled and non-disabled persons; it consists of six armbands and converts Braille text into vibrations on the body. “V-Braille” [196] allows users to use the smartphone’s touch-screen for the representation of Braille characters, while the smartphone’s haptic vibrations deliver haptic cues. “MyVox” comprises a Braille display, a keyboard, a visual display and a speaker, and is actuated by a vibration motor [198]. “Autosem” is a bimanual communication aid: a set of semaphores is used to represent an alphabet, defined by different orientations of both hands. The hand orientation is detected by means of low-cost game controllers with integrated accelerometers; users can detect outputs by scanning the series of available orientations, where the correct one is communicated with vibrotactile feedback [197]. Since deaf–blind persons are used to receiving haptic speech information mainly on the hands, gloves are a good solution for wearable assistive devices dedicated to them. The “Mobile Lorm Glove” translates the “Lorm” alphabet into text: textile pressure sensors on the palm enable the user to compose messages that are then sent via Bluetooth in the form of SMS, while messages received by the user are conveyed to the back of the glove as tactile patterns according to the “Lorm” alphabet; actuation is achieved by small vibrating motors [33]. The “DBGlove” by Caporusso is based on the “Malossi” language: pressure sensors and vibrotactile actuators are located at the 16 locations assigned to the “Malossi” alphabet [34]. The glove by the Birla Institute of Technology & Science is instead based on the Braille code [208]: capacitive touch sensors on the palm receive input information, while miniaturized vibration motors on the dorsal side deliver information to the receiver in the form of vibrotactile messages; a PC or mobile phone manages the information conversion and transmission. The device developed by Nicolau et al. [217] is likewise based on Finger-Braille: six rings, each equipped with a coin motor, are placed on both of the user’s hands. Experiments showed that subjects could recognize characters in 40% to 100% of trials, although recognition is strongly character- and user-dependent. Haptic vests were also developed, like the one by Ohtsuka et al. [195], equipped with micro vibrators; it can transmit Body-Braille characters sent from a hand-held device via infrared technology. Finally, we mention “Omar”, a haptic device mainly conceived for listening and speech assistance based on the “Tadoma” method; it stimulates kinaesthetic and tactile receptors in one or two dimensions using head-positioning DC motor actuators [191].
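The Finger-Braille mapping used by the six-ring devices above can be sketched as follows, assuming dots 1–3 are assigned to the left hand and 4–6 to the right (the exact finger assignment varies between prototypes):

# Finger-Braille-style mapping: each of the six Braille dots is assigned to a
# vibrating ring on one finger; the hand/finger layout below is an assumption.
DOT_TO_RING = {
    1: "left-index", 2: "left-middle", 3: "left-ring",
    4: "right-index", 5: "right-middle", 6: "right-ring",
}

def rings_for_character(dots: set[int]) -> list[str]:
    """Return the rings to vibrate simultaneously for one Braille character."""
    return [DOT_TO_RING[d] for d in sorted(dots)]

# 'r' is dots {1, 2, 3, 5} in standard six-dot Braille.
print(rings_for_character({1, 2, 3, 5}))
# ['left-index', 'left-middle', 'left-ring', 'right-middle']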

Piezoelectric and pneumatic displays

Kramer et al. [212] developed a communication glove equipped with strain gauge sensors that flex with hand movements. An algorithm detects characters in the sign language, and the information is sent to a piezoelectrically actuated Braille display, a voice synthesizer or an LCD monitor, depending on the visual and hearing capabilities of the interlocutor. Another device involving piezo-actuated raising pins is “SPARSHA” [192], a haptic display interacting with a computer that converts text displayed on a PC into haptic Braille; actuation occurs via six piezoelectrically actuated pins. Among piezoelectrically actuated devices, worth mentioning is a vibrotactile glove equipped with piezoelectric transducers encapsulated in a soft polymeric matrix [213]. The haptic glove can be equipped with a variable number of actuators, addressing different areas of the hand from the fingertips to the palm. It delivers tactile information in a reliable manner and can be used for sensory substitution purposes like communication and navigation.
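The capability-based output routing described above for Kramer’s glove can be sketched as a simple dispatch rule; the interlocutor profile fields below are one plausible reading of that rule, not the original system’s data model:

def route_message(text: str, interlocutor: dict) -> str:
    """Dispatch recognized sign-language text to an output modality,
    following the capability-based routing described for Kramer's glove."""
    if not interlocutor.get("sighted", True):
        return f"braille_display <- {text}"   # blind or deaf-blind partner
    if not interlocutor.get("hearing", True):
        return f"lcd_monitor <- {text}"       # deaf partner reads the screen
    return f"speech_synthesizer <- {text}"    # hearing partner listens

print(route_message("hello", {"sighted": False, "hearing": False}))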


Haptic displays for deaf–blind individuals can also be pneumatically actuated. The example we mention here is the one developed by Caporusso et al. [193]: a tactile interface equipped with a pneumatic Braille display through which a blind or deaf–blind user can access computer games.

Discussion

This review analyzed sensory substitution aids conveying information through the tactile sense. Making use of these technologies, sensory disabled persons can overcome sources of isolation from the external world and have the possibility to interact with the non-disabled community. “Disability” is a physical, mental, cognitive or developmental condition which impairs, interferes with or limits a person’s ability to accomplish certain tasks or actions, or to participate in typical daily activities and interactions [218]. However, disability need not be permanent if appropriate technology is developed to overcome the limitations due to the impairment. This objective can be achieved by means of sensors and actuators which support the impaired individual in accomplishing tasks through the provision of sensory feedback. Thanks to technology, the sensory disabled would benefit from better integration into society, without feelings of alienation and inadequacy. Haptic sensory aids improve users’ independence in navigating unknown environments and their communication abilities. The role of these assistive devices becomes important also in the workplace, where sensory disabled users can operate technologies such as computers, otherwise accessible only to the non-impaired. Deaf individuals can benefit from haptic technologies mainly in the language-learning phase. Studies confirmed that tactile aids might facilitate the understanding of lip-reading and improve vocal production when used with pre-lingually deaf children [83]. Other research focused on the use of a wearable tactile aid for speech learning with three-year-old children: after proper training, their initial vocabulary of only five words grew to over 400 words [219]. Speech information coded via tactile stimuli can have great potential in assisting lip-reading [220]. With the increasing diffusion of high-performance implanted sensory neuroprostheses and the high variability of haptic sensory substitution aids, researchers focused on comparing the results achieved with tactile aids to those obtained with implants. An early study reported that, after consistent training, subjects using tactile aids showed abilities comparable to those of implanted subjects with the technology available at the time [221]. However, a comparison between tactile aids and implants depends on the case study and on the maturity of the technology and clinical practice, as the following findings explain. In the research of Miyamoto et al. [222], “Nucleus”, a 22-channel cochlear implant, and “Tactaid 7” were compared in speech perception tasks for pre-lingually deaf children: after one and a half years of training, performance in speech recognition without the aid of lip-reading was significantly higher for implanted subjects. Multichannel implants highly outperform multichannel tactile aids, while the performance of single-channel implants was found comparable to that of two-channel tactile aids [223]. Despite this evidence, a haptic aid can still be a good choice for deaf patients whose clinical or economic conditions discourage the installation of an implant [222]. The reviewed technologies refer to vibrotactile stimulation, with some reference to electro-tactile stimulation. Research was carried out to investigate which of these stimulation protocols is more effective: results showed that vibrotactile cues are preferred to electro-tactile ones, especially for the stimulation of fingertips, whose high electrical resistance makes them less sensitive to electro-tactile stimuli [224]. In order to transmit vibrotactile information in a reliable manner, the definition of proper actuator geometries and vibration parameters becomes crucial. Low noise, low energy requirements and an adequate dynamic range are key features. A fine tuning of the frequency, intensity and duration of the vibration is required, as well as of the timing between stimuli and the selection of the stimulation area. The identification of minimum and maximum levels of stimulation is important to match individual sensitivity thresholds and is strictly related to the involved skin area. Studies [225] show that the vibration frequency corresponding to maximal perception for the receptors in the hand is around 250 Hz, and that tactile stimulation at frequencies under 100 Hz improves spatial discrimination [226]. The duration of the vibrotactile stimulus shall be chosen according to the purpose of the transmitted information: the identification of a tactile pattern is easier when stimuli last between 80 and 320 ms, while for alerting purposes a stimulation between 50 and 200 ms is preferred to avoid annoyance [25].
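As a minimal sketch tying these parameters together, the following generates a single tactor drive burst using the cited values (a carrier near 250 Hz, durations within the 50–320 ms ranges); the sampling rate and amplitude bound are assumptions:

import numpy as np

SAMPLE_RATE = 8000  # Hz, drive-signal sampling rate (assumed)

def vibrotactile_burst(freq_hz: float = 250.0,
                       duration_s: float = 0.2,
                       amplitude: float = 0.8) -> np.ndarray:
    """Sinusoidal burst for one tactor.

    250 Hz targets the frequency of maximal perception cited above; the
    duration guard reflects the 50-320 ms ranges reported for alerting and
    pattern-identification stimuli.
    """
    if not 0.05 <= duration_s <= 0.32:
        raise ValueError("duration outside the recommended 50-320 ms range")
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

burst = vibrotactile_burst()
print(len(burst))  # 1600 samples = 200 ms at 8 kHz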
The positioning of the actuators must respect the criteria for two-point discrimination, i.e., thresholds and the size of the region of stimulation. The static two-point discrimination distance varies case by case, increasing from 0.5 mm [93] to 1.6 mm [94] on the fingertips to 4–5 mm on the hairy skin [227]. Designing arrays of actuators that respect the two-point discrimination minimum distance is difficult especially in children, whose small skin areas limit the dimensions of the transducers [44]. This is one of the reasons making actuator miniaturization mandatory. At present, the most diffused aids are based on piezoelectric transducers, whereas in the near future the integration of components based on other actuation principles will be a key point in scaling devices’ dimensions. Promising technologies are EAP displays: they can easily be encapsulated in polymers for the development of miniaturized soft displays, and their manufacturing process is highly automatable. The characteristics of such displays make them suitable for the stimulation of different body areas, especially if integrated in clothes; however, one of their main limitations is the need for very high driving voltages. Together with the definition of optimal geometrical specifications for actuators, researchers identified that a desirable weight for transducers to be inserted in arrays is around 15–20 g, especially for children’s applications [44]. The investigation of proper stimulation parameters is important in order to match the perceptual acuity of sensory disabled users. Recent analyses on fingertips showed that visually impaired persons can achieve a spatial discrimination threshold around 0.2 mm lower than sighted ones, for whom it ranges from 1.2 to 1.7 mm. This means that visually impaired persons demonstrate a 15% better tactile acuity in two-point discrimination tasks in comparison to the non-impaired group [228]. This fact is reflected at the cortical level, where the representation of the reading fingers in blind individuals is greater than in sighted ones. At the moment, a scientific explanation of this phenomenon is only hypothetical and may be related to a neural network remodelling experienced by blind Braille readers as an adaptive mechanism, which can lead to an increase of their tactile acuity [228]. However, other experiments demonstrated that the performance of the two groups in Braille reading can equalize with practice, and the superior performance of blind individuals was not found even with different tactile stimuli [229–232]. These facts lead to the conclusion that, at the moment, there is no consolidated evidence on whether blind individuals’ tactile acuity is better than that of sighted persons in an absolute way, whether blind persons outperform only untrained sighted subjects, or whether this superior tactile acuity is related to neural adaptation consequent to blind individuals’ tactile reading experience. Among the drawbacks shown by haptic sensory substitution aids is the high cognitive load they require [102], particularly in outdoor environments, where the interpretation of stimuli becomes challenging and may limit the usability of such devices [233]. Another critical point is their scarce success on the market, despite the current availability of a variety of sensory substitution technologies at a good readiness level. The reasons for the scarce diffusion of sensory substitution aids differ and are related to the disability they address. Concerning the blind community, many prefer to move around without electronic aids, choosing the assistance of caregivers instead. Affordability of sensory aids is another factor limiting their diffusion: most of the impaired population comes from underdeveloped countries, while the wealthy in many cases have little propensity for new technologies. Elderly users instead encounter problems in learning how to use the devices [234]. Moreover, sensory aids for blind persons are mainly conceived for adults and are not easily adaptable to be worn by children. In many cases, they are bulky devices which require a high cognitive load or long training, and children find it difficult to reach the level of attention needed to properly learn how to use them. This is an obstacle for early interventions, which are instead important, since blindness affects the development of social relationships, emotions and psychomotricity, and can lead to spatial and social deficits at the cognitive level [235]. Acceptability and diffusion of sensory aids in the deaf community is an issue for both children and elderly individuals. The former group is divided into those who readily accept hearing aids and those who do not accept them at all; clinicians must find ways, through gaming and long training, to make them more confident with the technology. People in the latter group often associate hearing impairment with a disability condition related to getting older [233] and find adapting to new technologies challenging. Impairment misperception is another important cause of non-acceptance of hearing aids [236], since some individuals do not realize that their impairment is increasing because it appears gradually. Nevertheless, experimental studies comparing the behaviours of the two age groups showed that acceptance of hearing impairment is higher for older individuals than for young ones. Elderly individuals report fewer difficulties in daily living activities compared with younger subjects [236], and they are more prone to accept this disabling condition, considered a natural consequence of the ageing process. Cosmetic aspects are also relevant, and the importance of social factors is a key component of the acceptability of sensory aids. This was already discussed for blind individuals but holds for all kinds of disabilities. A different attitude of non-impaired persons is common in encounters with impaired ones. More generally, making use of an assistive device implies wearing something that underlines a disabling condition and can marginalize the disabled person in society, even if assistive devices have been proven useful [237]. An example of this situation can be found in the research of Hersh [238]: after analysing several experiences of deaf–blind persons, this work found that the fear of being stigmatized by the non-impaired group prevents them from using assistive devices. In particular, some sensory aids proved more difficult to accept than others; this is the case, for example, of white canes, which are characterized by high visibility, generating a sense of embarrassment and non-independence in the user.


Conclusions

Haptic sensory substitution systems have been reviewed in the present study and show potential for transmitting information to subjects with partial or total sensory impairment. Information can be related to vision (in blind persons), to audition and speech (in deaf persons), or to both senses (in deaf–blind persons). Haptic-assistive devices approach visual and audition disabilities in a different manner in comparison with the partial vision and audition restitution technologies that we briefly mention in the following. Nowadays, the available solutions for partial audition restitution are represented by five prosthetic approaches (“Partial audition restitution”, Figure 2). “Hearing aids” are devices used for enhancing the volume of sounds: a microphone detects environmental sounds, which are conveyed to an amplifier communicating with a speaker placed in the outer ear [239]. “Bone conduction implants” consist of a sound processor which transmits sounds from the environment through the bone to a titanium implant reaching the inner ear, bypassing the outer and middle ear. The “middle ear implant” instead uses mechanical energy for inner ear stimulation: an external processor transmits sounds to a part surgically implanted on one of the bones of the middle ear, or near the membrane window of the cochlea; the implant directly moves the bones of the middle ear or makes the membrane window of the cochlea vibrate [240,241]. “Cochlear implants” replace the inner ear function, providing signals to the brain: an external sound processor detects sounds and converts them into digital signals, which are then sent to an internal implant, converted into electrical pulses and delivered along the electrode placed in the cochlea, stimulating the hearing nerve [242]. The last category of partial audition restitution solutions is “hybrid electric and acoustic prostheses”. These devices allow people who have experienced high-frequency hearing loss to hear high-frequency sounds, fundamental for the understanding of speech. They consist of an electrical pathway dedicated to the processing of high-frequency sounds and an acoustic pathway which processes low-frequency sounds; in both cases, an external sound processor transmits sounds to an amplifier which interacts with electrodes inside the cochlea [243]. Among the above-described aids, implants are a promising technology [72,244]. However, in some cases major and minor complications have been reported, including infections, facial paralysis, bacterial meningitis and postoperative vestibular symptoms [245]. Non-invasive haptic-assistive devices (“Listening and speech assistance” and “Communication and interfacing”, Figure 2) have the advantage of ease of use and are available without surgery and its drawbacks. Conversely, they provide hearing substitution in a non-homologous manner, i.e., they involve a different sensory modality, as do all the assistive devices reviewed in this paper. Technologies for the partial restoration of the vision sense are also available, such as visual neuroprostheses (“Partial vision restitution”, Figure 2) [246]. A successful example is the “Argus II Retinal Prosthesis System”, available on the market since 2013.
Studies by Rizzo and collaborators [247] confirmed that this device performed safely in implanted blind patients, who after one year of follow-up showed improved performance in square localization, direction-of-motion identification, grating visual acuity and Goldmann visual field tests, with few or no postoperative complications. However, there are cases in which implants can generate serious issues: damage at the electrode–nerve tissue interface, acute and chronic astrocytic and microglial encapsulation of the electrodes [248,249], and chronic inflammation leading to localized neurodegeneration [250,251].


Moreover, cortical implants can be responsible for effects related to the electrical stimulation, leading to neural degeneration [252]. They are also suspected of damaging peripheral vision in patients with macular degeneration [253]. Haptic devices could be a less invasive option than visual neuroprostheses, avoiding the possible drawbacks linked to surgery, and may represent the only solution when surgical intervention is not possible. Technologies already tested and potentially representing an alternative to the haptic-assistive aids presented in this review are those based on electro-tactile stimulation; these devices are available for both blind and deaf persons. We briefly mention here some examples of sensory substitution technologies for deaf individuals based on electro-tactile stimulation (“Listening and speech assistance” – “Electrodes”, Figure 2): devices like the “Tickle Talker” and the electro-tactile vocoders. The first is an electro-tactile device designed at the University of Melbourne, with speech-processing electronics similar to that of cochlear implants; it stimulates the user’s non-dominant hand with four rings equipped with eight electrodes, so that the stimulation takes place directly on the digital nerves [77,78]. The “Audiotact” [73,74] and the “Tacticon 1600” [75] are examples of electro-tactile vocoders: “Audiotact” is a 32-channel vocoder, while the “Tacticon 1600” separates the acoustic frequency spectrum into 16 channels, each connected to an electrode worn on a belt around the user’s abdomen. In the device presented in the research of Dodgson, electrical stimulation is transcutaneous, and actuation can also occur through vibrotactile stimulators [76]. Research comparing electro- and vibrotactile vocoders demonstrated that, although the former improve speech learning in deaf children, they perform worse than the latter, even in combination with lip-reading [54,59,254].
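As an illustration of the channel-vocoder principle behind devices such as the “Tacticon 1600”, the sketch below splits an audio frame into frequency bands and drives one stimulator per band; the band edges, frame length and normalization are illustrative choices, not the device’s actual design:

import numpy as np

def tactile_vocoder_frame(audio_frame: np.ndarray,
                          sample_rate: int = 16000,
                          n_channels: int = 16) -> np.ndarray:
    """Map one short audio frame to per-channel stimulator intensities.

    The magnitude spectrum is split into n_channels equal bands between
    ~100 Hz and 5 kHz; each band's mean energy drives one stimulator,
    mirroring the band-to-electrode mapping of electro-tactile vocoders.
    """
    spectrum = np.abs(np.fft.rfft(audio_frame * np.hanning(len(audio_frame))))
    freqs = np.fft.rfftfreq(len(audio_frame), d=1.0 / sample_rate)
    edges = np.linspace(100.0, 5000.0, n_channels + 1)
    levels = np.array([
        spectrum[(freqs >= lo) & (freqs < hi)].mean()
        for lo, hi in zip(edges[:-1], edges[1:])
    ])
    return levels / (levels.max() + 1e-9)  # normalized drive per channel

# Example: a 32 ms frame of a 1 kHz tone lights up the corresponding channel.
t = np.arange(512) / 16000
frame = np.sin(2 * np.pi * 1000 * t)
print(tactile_vocoder_frame(frame).round(2))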
Their near-complete sensory isolation is a condition far more compromising than a single sensory disability. Nevertheless, the literature analysis showed that individuals affected by a single sensory disability can also benefit from haptic aids, which have great potential for improving quality of life.
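As a rough illustration of the channel-vocoder principle used by devices such as the "Tacticon 1600" described above, the following minimal Python sketch splits an audio signal into 16 log-spaced bands and converts each band's energy into a normalized drive level for one tactile stimulator. The sample rate, band edges, filter order and normalization are illustrative assumptions, not parameters of the original device.

```python
# Minimal sketch of a 16-channel tactile vocoder: band-pass filter bank plus
# per-band energy, each band driving one stimulator. All parameter values are
# illustrative assumptions, not taken from the original devices.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 16000          # sample rate in Hz (assumed)
N_CHANNELS = 16     # the Tacticon 1600 used 16 channels

def make_filter_bank(n_channels=N_CHANNELS, f_lo=100.0, f_hi=6000.0, fs=FS):
    """Band-pass filters (second-order sections) on a log-spaced frequency grid."""
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)
    return [butter(4, (edges[i], edges[i + 1]), btype="bandpass",
                   fs=fs, output="sos") for i in range(n_channels)]

def tactile_intensities(audio, bank):
    """RMS energy per band, normalized to 0..1 drive levels for the stimulators."""
    levels = np.array([np.sqrt(np.mean(sosfilt(sos, audio) ** 2)) for sos in bank])
    peak = levels.max()
    return levels / peak if peak > 0 else levels

if __name__ == "__main__":
    t = np.arange(FS) / FS                   # one second of test audio
    audio = np.sin(2 * np.pi * 440 * t)      # a 440 Hz tone
    print(np.round(tactile_intensities(audio, make_filter_bank()), 2))
```

In a real aid, the per-band levels would be refreshed frame by frame and sent to the electrode or vibrator array rather than printed.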

Table 3. Haptic technologies for deaf–blind people.

Device | Ref. | Functionality | Actuation | Description | Readiness level | Year
Omar | [191] | Communication based on the Tadoma method | Hard-disk head-positioning actuators | System for haptic stimulation of the hand that delivers movements and vibrations to the fingers through hard-disk head-positioning actuators | Prototype | 1993
Tactile display | [174] | Tactile pixel pneumatically activated | Pneumatic | Tactile pixel made of a polysilicon membrane actuated by means of a MEMS pneumatic micro-valve | Prototype | 2003
Pneumatic system | [177,178] | Injection needle of CO2 | Pneumatic | Needle that delivers a CO2 jet and is moved by a tactile plotter in order to draw characters directly on the skin | Prototype | 2010–2015
SPARSHA | [192] | Communication | Pneumatic | Electronic device that converts printed text on a PC's display into haptic Braille characters through a six-pin display | Prototype | 2013
Computer tactile interface | [193] | Virtual navigation | Pneumatic | Tactile interface equipped with a Braille display allowing access to computer games | Prototype | 2010
Shoulder interface | [194] | Navigation device | Audio-speakers | A backpack worn on the user's shoulders provides spatiotemporal vibrational patterns through a 3 × 3 array of speakers | Prototype | 2000
Braille haptic vest | [195] | Communication | Vibrating motors | Haptic vest equipped with micro-vibrators that transmit Body-Braille characters sent via infrared technology from a hand-held device held by a non-disabled person | Prototype | 2010
V-Braille | [196] | Communication | Vibrating motors | Haptic representation of Braille characters on a smartphone display by means of the vibration motor integrated in the device | Prototype | 2010
Autosem | [197] | Long distance communication | Vibrating motors | Accelerometers on game controllers detect hand orientations corresponding to sign language, and vibrating motors on the same controllers communicate the correct orientation to the user | Prototype | 2014
MyVox | [198] | Communication | Vibrating motors | Braille refreshable display interacting with a keyboard, a visual display, a speaker and a vibrating motor | Prototype | 2014
Finger-Braille | [199,200] | Navigation device with haptic feedback on fingers | Vibrating motors | RFID (radio-frequency identification) tags interact with a wearable device equipped with vibrating motors or solenoids on the fingers | Prototype | 2003–2004
Tactile-Handle | [201,202] | Navigation device | Vibrating motors | Proximity sensors and micro-controllers integrated in a handle interact with a 4 × 4 array of vibrators placed on the fingers' phalanxes | Prototype | 2006
Map display | [203] | Navigation device | Vibrating motors | Map display with a rumble gamepad that gives vibration feedback when the user is near a street or is crossing it | Prototype | 2010
Shoulder-mounted system | [204] | Navigation device | Vibrating motors | A shoulder-mounted wearable navigation system equipped with sonar sensors that interact with eight vibrators | Prototype | 2007
Navigation device | [205] | Virtual navigation interface | Vibrating motors | Environmental beacons interact with echo-sensors, allowing the exploration of a three-dimensional map through audio and vibrational feedback | Prototype | 1996
Polaron | [206] | Navigation hand-held device or for wheelchairs | Vibrating motors | Laser and ultrasound sensors detect surrounding obstacles, giving haptic feedback to the user through vibrating motors | Commercial | 2010
Mobile Lorm Glove | [33] | Long distance communication | Vibrating motors | Glove equipped with tactile pressure sensors on the palmar side and actuated with vibrating motors on the dorsal side | Prototype | 2012
Body-Braille communication system | [207] | Long distance communication | Vibrating motors | Wearable system composed of six micro-vibrators placed on armbands interacting with a Braille writer | Prototype | 2013

Table 3. Continued.

Device | Ref. | Functionality | Actuation | Description | Readiness level | Year
Communication Glove | [208] | Long distance communication | Vibrating motors | Glove equipped with touch sensors on the palmar side and actuated with vibrating motors on the dorsal side | Prototype | 2015
DB-Glove | [34] | Long distance communication | Vibrating motors | Glove equipped with pressure sensors and vibrating motors placed following the Malossi alphabet | Prototype | 2008
Tactile maps | [203] | System for the haptic exploration of maps | Vibrating motors | System that displays maps by means of vibrations using a standard rumble gamepad | Prototype | 2010
Miniguide | [209] | Navigation and mobility aid | Vibrating motors | System that gives navigation information by means of vibration, detecting obstacle positions with ultrasonic sensors | Commercial | 2014
Navigation device | [210] | Navigation device for horse riding | Vibrating motors | Device made of two vibrating motors on the upper arms which delivers navigation information during horse riding; the motors are controlled by the instructor with a smartphone | Prototype | 2016
Navigation device | [211] | Navigation device with haptic feedback on fingers | Servo motor | Cubic hand-held device composed of two halves which delivers information about direction and distance of navigational targets via translations and rotations of its upper half | Prototype | 2016
Finger-Braille | [199,200] | Navigation device with haptic feedback on fingers | Solenoids | RFID (radio-frequency identification) tags interact with a wearable device equipped with vibrating motors or solenoids on the fingers | Prototype | 2003–2004
Communication Glove | [212] | Long distance communication | Piezoelectric | A glove equipped with strain gauge sensors detects hand movements and translates signs into text on a Braille display | Patent | 1991
Haptic glove | [213] | Long distance communication/navigation | Piezoelectric | Vibrotactile glove equipped with piezoelectric transducers encapsulated in a soft polymeric matrix, delivering tactile information in a reliable manner for sensory substitution purposes | Prototype | 2017
Parloma | [214,215] | Long distance communication | Robotic hands | A robotic hand felt by the user's hands interacts with a depth sensor for sign recognition | Prototype | 2015
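Several of the devices in Table 3 (SPARSHA, V-Braille, MyVox, the Body-Braille systems) ultimately reduce to driving a six-dot Braille cell. The minimal sketch below shows the character-to-dot mapping involved; the partial alphabet follows standard literary Braille (dots numbered 1-3 down the left column, 4-6 down the right), while the pin interface is a hypothetical placeholder rather than any specific device's API.

```python
# Minimal sketch: map characters to six-dot Braille patterns and pin states.
# The alphabet subset is standard literary Braille; the "render" stage is a
# placeholder for energizing real solenoid or piezoelectric pins.
BRAILLE_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4}, "j": {2, 4, 5},
}

def cell_state(char):
    """Return the six pin states (True = raised) for one character."""
    dots = BRAILLE_DOTS.get(char.lower(), set())
    return [dot in dots for dot in range(1, 7)]

def render(text):
    for ch in text:
        pins = cell_state(ch)
        # A real refreshable cell would drive its actuators here instead.
        print(ch, "".join("x" if p else "." for p in pins))

render("bad")
```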

At the same time, the miniaturization of actuators is mandatory for a faster diffusion of wearable haptic aids. Scaling down their size will limit bulkiness and visibility, improving the social acceptance of these aids. A variety of wearable aids are being launched or are already on the market, and their costs are becoming affordable, in the order of thousands of dollars, especially for smart wristbands and sensorized insoles. Future efforts should be spent on enhancing the synergies between different wearables. Visually impaired users can benefit from technologies allowing both communication and navigation, as can deaf–blind users, who need a combination of hearing and visual sensory aids. In addition to rehabilitation purposes, haptic sensory substitution technologies have a potential that can be exploited for the sensory augmentation of non-disabled individuals. A synergy between all these technologies can lead to the development of multisensory interactive environments with applications in virtual reality, gaming, long distance communication, manufacturing and rehabilitation. Video and audio acquisition devices, robots and controllers could interact, sending information via haptic feedback to the end users, thus enhancing their sensory awareness. If integrated into workwear, haptic sensory aids can also be used for security and rescue purposes, allowing the exchange of information in dangerous situations (e.g., for firefighters and police forces), or to communicate safety information to workers, such as proximity to robotic arms in manufacturing environments.
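Many of the navigation aids discussed in this review (e.g., the Miniguide, the Polaron, sensorized insoles) share a common pattern: a range-sensor reading is mapped onto vibration intensity and pulse rate, with closer obstacles producing stronger, faster bursts. The sketch below illustrates one such mapping; the distance range and the linear law are illustrative assumptions, not taken from any specific device.

```python
# Minimal sketch of an ETA-style distance-to-vibration mapping: closer
# obstacles yield a higher motor duty cycle and faster pulse rate.
def vibration_command(distance_m, d_min=0.3, d_max=3.0):
    """Map an obstacle distance (meters) to a motor duty cycle and pulse rate."""
    if distance_m >= d_max:
        return {"duty": 0.0, "pulse_hz": 0.0}     # nothing in range: stay silent
    closeness = (d_max - max(distance_m, d_min)) / (d_max - d_min)  # 0..1
    return {
        "duty": round(0.2 + 0.8 * closeness, 2),      # stronger when closer
        "pulse_hz": round(1.0 + 9.0 * closeness, 1),  # faster bursts when closer
    }

for d in (5.0, 2.5, 1.0, 0.4):
    print(d, vibration_command(d))
```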

Acknowledgements

This study was funded in part by the Italian Ministry of Education, Universities and Research within the "Smart Cities and Social Innovation Under 30" program through the PARLOMA Project (SIN_00132), and within the Technological Clusters Initiative through the project 1 of the Intelligent Factory Cluster (Grant no. CTN01 00163 148175), and by Scuola Superiore Sant'Anna, Pisa, Italy.

Disclosure statement

The authors report no conflicts of interest.

Funding

This study was funded in part by the Italian Ministry of Education, Universities and Research within the "Smart Cities and Social Innovation Under 30" program through the PARLOMA Project [SIN_00132], and within the Technological Clusters Initiative through the project 1 of the Intelligent Factory Cluster [grant number CTN01 00163 148175], and by Scuola Superiore Sant'Anna, Pisa, Italy.

ORCID

Francesca Sorgini http://orcid.org/0000-0002-4114-2166
Maria Chiara Carrozza http://orcid.org/0000-0002-7630-0865
Calogero Maria Oddo http://orcid.org/0000-0002-1489-5701

References

[1] Ziegler-Graham K, MacKenzie EJ, Ephraim PL, et al. Estimating the prevalence of limb loss in the United States: 2005 to 2050. Arch Phys Med Rehabil. 2008;89:422–429.
[2] Abu-Faraj ZO. Handbook of research on biomedical engineering education and advanced bioengineering learning: interdisciplinary concepts. Hershey, PA: IGI Global; 2012.
[3] Auer-Grumbach M, De Jonghe P, Verhoeven K, et al. Autosomal dominant inherited neuropathies with prominent sensory loss and mutilations: a review. Arch Neurol. 2003;60:329–334.
[4] Rasmussen P. The congenital insensitivity-to-pain syndrome (analgesia congenita): report of a case. Int J Paediatr Dent. 1996;6:117–122.
[5] Wickremaratchi M, Llewelyn J. Effects of ageing on touch. Postgrad Med J. 2006;82:301–304.
[6] World-Health-Organization. 2012. Prevention of blindness and deafness: grades of hearing impairment. Available from: http://www.who.int/pbd/deafness/hearing_impairment_grades/en/. Accessed 2016 June 10.
[7] World-Health-Organization. WHO global estimates on prevalence of hearing loss. Mortality and Burden of Diseases and Prevention of Blindness and Deafness. WHO; 2012.
[8] Celesia GG, Hickok G. The human auditory system: fundamental organization and clinical disorders. New York: Elsevier; 2015.
[9] American-Speech-Language-Hearing-Association. Causes of hearing loss in adults. Available from: http://www.asha.org/public/hearing/Causes-of-Hearing-Loss-in-Adults/. Accessed 2016 June 23.
[10] Ear-Science-Institute-Australia. Causes of hearing loss. Available from: http://www.earscience.org.au/Hearing-Discovery-Centre/Causes-of-HL.aspx. Accessed 2016 March 8.
[11] Canalis RF, Lambert PR. The ear: comprehensive otology. Philadelphia: Lippincott Williams & Wilkins; 2000.
[12] American-Speech-Language-Hearing-Association. Causes of hearing loss in children. Available from: http://www.asha.org/public/hearing/Causes-of-Hearing-Loss-in-Children/. Accessed 2016 June 28.
[13] World-Health-Organization. Global data on visual impairments 2010; 2012.
[14] Hartong DT, Berson EL, Dryja TP. Retinitis pigmentosa. Lancet. 2006;368:1795–1809.
[15] EBU-The-voice-of-blind-and-partially-sighted-people-in-Europe. Facts, figures and definitions concerning blindness and sight loss. Available from: http://www.euroblind.org/resources/information/. Accessed 2016 September 9.
[16] Schalock M, Bull R. The 2012 national child count of children and youth who are deaf–blind. Monmouth, OR: National Consortium on Deaf-Blindness, Teaching Research Institute, Western Oregon University; 2013.
[17] Blanchet C, Hamel C. 2009. Sindrome di Usher [Usher syndrome]. Available from: http://www.orpha.net/consor/cgi-bin/OC_Exp.php?Lng=IT&Expert=886.
[18] National-Center-of-Deaf-Blindness. Causes of deaf-blindness. Available from: https://nationaldb.org/library/list/6.
[19] Malloy P, Stremel-Thomas K, Scalock M, et al. Early identification of infants who are deaf–blind. Monmouth, OR: National Consortium on Deaf-Blindness; 2009.
[20] Robertson J, Emerson E. Estimating the number of people with co-occurring vision and hearing impairments in the UK. Lancaster: Centre for Disability Research, Lancaster University; 2010.
[21] Jarrold K. 2014. Mapping opportunities for deafblind people across Europe: final report. Available from: http://www.deafblindindicators.eu/images/PDF/1_1Final%20report%20-Mapping%20opportunities_0315.pdf.

[22] Kaczmarek KA. Sensory augmentation and substitution. In: Bronzino JD, editor. CRC handbook of biomedical engineering. 2nd ed. 2000. p. 2100–2109.
[23] Johansson RS, Vallbo ÅB. Tactile sensibility in the human hand: relative and absolute densities of four types of mechanoreceptive units in glabrous skin. J Physiol (Lond). 1979;286:283–300.
[24] Kaczmarek KA, Bach-y-Rita P. Tactile displays. Virtual Environ Adv Interface Des. 1995;349–414.
[25] Jones LA, Sarter NB. Tactile displays: guidance for their design and application. Hum Factors. 2008;50:90–111.
[26] Brewster S, Brown LM. Tactons: structured tactile messages for non-visual information display. Australia: Australian Computer Society, Inc; 2004. p. 15–23.
[27] Allen JM, Asselin PK, Foulds R. American Sign Language finger spelling recognition system. IEEE; 2003. p. 285–286.
[28] Kyle JG, Woll B, Pullen G. Sign language: the study of deaf people and their language. Cambridge: Cambridge University Press; 1988.
[29] Wall S, Brewster S. Feeling what you hear: tactile feedback for navigation of audio graphs. New York: ACM; 2006. p. 1123–1132.
[30] Moore C, Murray I. An electronic design of a low cost braille typewriter. IEEE; 2001. p. 153–157.
[31] American-Association-of-the-Deaf-Blind. 2009. How do deaf–blind people communicate? Available from: http://www.aadb.org/factsheets/db_communications.html. Accessed 2016 August 8.
[32] Reed CM, Rabinowitz WM, Durlach NI, et al. Research on the Tadoma method of speech communication. J Acoust Soc Am. 1985;77:247–257.
[33] Gollner U, Bieling T, Joost G. Mobile Lorm Glove: introducing a communication device for deaf–blind people. New York: ACM; 2012. p. 127–130.
[34] Caporusso N. A wearable Malossi alphabet interface for deafblind people. New York: ACM; 2008. p. 445–448.
[35] Miyagi M, Nishida M, Horiuchi Y, et al. Analysis of prosody in finger braille using electromyography. Conf Proc IEEE Eng Med Biol Soc. 2006;1:4901–4904.
[36] Matsuda Y, Isomura T. Finger Braille recognition system. INTECH Open Access Publisher; 2012.
[37] Spens K, Huss C, Dahlqvist M, et al. A hand held two-channel vibro-tactile speech communication aid for the deaf: characteristics and results. Scand Audiol Suppl. 1996;47:7–13.
[38] Wiener N, Wiesner J, David E Jr, et al. Operation "Felix." Quart. Progr. Repts, Cambridge; 1951.
[39] Galvin KL, Mavrias G, Moore A, et al. A comparison of Tactaid II and Tactaid 7 use by adults with a profound hearing impairment. Ear Hear. 1999;20:471.
[40] Saba MP, Filippo D, Pereira FRD, et al. Hey yaa: a haptic warning wearable to support deaf people communication. Collaboration and technology. New York: Springer; 2011. p. 215–223.
[41] Sonic-Alert. 2015. Sonic Bomb with Super Shaker. Available from: http://www.sonicalert.com/Sonic-Bomb-with-Super-Shaker-TM-p/sbb500ss.htm. Accessed 2016 September 23.
[42] Gault RH. Touch as a substitute for hearing in the interpretation and control of speech. Arch Otolaryngol Head Neck Surg. 1926;3:121.

[43] Schulte K, Fant G. Fonator system: speech stimulation and speech feed-back by technically amplified one-channel vibrations; 1972.
[44] Sherrick CE. Basic and applied research on tactile aids for deaf people: progress and prospects. J Acoust Soc Am. 1984;75:1325–1342.
[45] Willemain TR, Lee FF. Tactile pitch feedback for deaf speakers. The Volta Review. 1971.
[46] Stratton WD. Intonation feedback for the deaf through a tactile display. The Volta Review. 1974.
[47] Yeni-Komshian GH, Goldstein MH Jr. Identification of speech sounds displayed on a vibrotactile vocoder. J Acoust Soc Am. 1977;62:194–198.
[48] Sparks DW, Kuhl PK, Edmonds AE, et al. Investigating the MESA (Multipoint Electrotactile Speech Aid): the transmission of segmental features of speech. J Acoust Soc Am. 1978;63:246–257.
[49] Yuan H, Reed CM, Durlach NI. Tactual display of consonant voicing as a supplement to lipreading. J Acoust Soc Am. 2005;118:1003–1015.
[50] Sakajiri M, Miyoshi S, Nakamura K, et al. Voice pitch control using tactile feedback for the deafblind or the hearing impaired persons to assist their singing. New York: IEEE; 2010. p. 1483–1487.
[51] Sakajiri M, Miyoshi S, Onishi J, et al. Tactile pitch feedback system for deafblind or hearing impaired persons: singing accuracy of hearing persons under conditions of added noise. New York: IEEE; 2014. p. 31–35.
[52] Sakajiri M, Nakamura K, Fukushima S, et al. Effect of voice pitch control training using a two-dimensional tactile feedback display system. New York: IEEE; 2012. p. 2943–2947.
[53] Sakajiri M, Miyoshi S, Nakamura K, et al. Voice pitch control ability of hearing persons with or without tactile feedback using a two-dimensional tactile display system. New York: IEEE; 2011. p. 1069–1073.
[54] Engelmann S, Rosov R. Tactual hearing experiment with deaf and hearing subjects. Exceptional Children. 1975;41:243–253.
[55] Gault RH, Crane GW. Tactual patterns from certain vowel qualities instrumentally communicated from a speaker to a subject's fingers. J Gen Psychol. 1928;1:353–359.
[56] Guelke R, Huyssen R. Development of apparatus for the analysis of sound by the sense of touch. J Acoust Soc Am. 1959;31:799–809.
[57] Kringlebotn M. Experiments with some visual and vibrotactile aids for the deaf. Am Ann Deaf. 1968;113:311–317.
[58] Pickett J, Pickett BH. Communication of speech sounds by a tactual vocoder. J Speech Hear Res. 1963;6:207–222.
[59] Brooks P, Frost BJ. Evaluation of a tactile vocoder for word recognition. J Acoust Soc Am. 1983;74:34–39.
[60] Lynch MP, Eilers RE, Oller DK, et al. Speech perception by congenitally deaf subjects using an electrocutaneous vocoder. J Rehabil Res Dev. 1988;25:41–50.
[61] Gescheider G. Some comparisons between touch and hearing. IEEE Trans Man Mach Syst. 1970;11:28–35.
[62] Békésy Gv. Human skin perception of traveling waves similar to those on the cochlea. J Acoust Soc Am. 1955;27:830–841.
[63] Richardson B, Wuillemin D, Saunders F. Tactile discrimination of competing sounds. Percept Psychophys. 1978;24:546–550.

[64] Boothroyd A. Wearable tactile sensory aid providing information on voice pitch and intonation patterns. Google Patents; 1986.
[65] Boothroyd A. A wearable tactile intonation display for the deaf. IEEE Trans Acoust Speech Signal Process. 1985;33:111–117.
[66] Wada C, Ino S, Ifukube T. Proposal and evaluation of the sweeping display of speech spectrum for a tactile vocoder used by the profoundly hearing impaired. Electron Commun Jpn Pt III. 1996;79:56–66.
[67] Nanayakkara S, Wyse L, Taylor EA. The haptic chair as a speech training aid for the deaf. New York: ACM; 2012. p. 405–410.
[68] Nanayakkara S, Wyse L, Taylor EA. Effectiveness of the haptic chair in speech training. New York: ACM; 2012. p. 235–236.
[69] Park S. Sounzzz. Available from: http://www.yankodesign.com/2009/11/17/feel-the-music/. Accessed 2016 March 17.
[70] Baijal A, Kim J, Branje C, et al. Composing vibrotactile music: a multi-sensory experience with the Emoti-chair. New York: IEEE; 2012. p. 509–515.
[71] Karam M, Nespoli G, Russo F, et al. Modelling perceptual elements of music in a vibrotactile display for deaf users: a field study. New York: IEEE; 2009. p. 249–254.
[72] Reinfeldt S, Håkansson B, Taghavi H, et al. New developments in bone-conduction hearing implants: a review. Med Devices (Auckl). 2015;8:79.
[73] Cowan RSC, Galvin KL, Lu BD, et al. Electrotactile vocoder using handset with stimulating electrodes. Google Patents; 2002.
[74] Rakowski K, Brenner C, Weisenberger JM. Evaluation of a 32-channel electrotactile vocoder. J Acoust Soc Am. 1989;86:S83.
[75] Smith RV, Leslie JH Jr. Rehabilitation engineering. Boca Raton, FL: CRC Press; 1990. p. 65–68.
[76] Dodgson G, Brown B, Freeston I, et al. Electrical stimulation at the wrist as an aid for the profoundly deaf. Clin Phys Physiol Meas. 1983;4:403.
[77] Cowan R, Galvin K, Sarant J, et al. Improved electrotactile speech processor: Tickle Talker. Ann Otol Rhinol Laryngol. 1995;104(suppl. 166):454–456.
[78] Cowan R, Alcantara J, Blamey PJ, et al. Preliminary evaluation of a multichannel electrotactile speech processor. J Acoust Soc Am. 1988;83:2328–2338.
[79] Pisanski K, Cartei V, McGettigan C, et al. Voice modulation: a window into the origins of human vocal control? Trends Cogn Sci. 2016;20:304–318.
[80] Geldard FA. Adventures in tactile literacy. Am Psychol. 1957;12:115.
[81] O'Donoghue GM, Nikolopoulos TP, Archbold SM. Determinants of speech perception in children after cochlear implantation. Lancet. 2000;356:466–468.
[82] Engelmann S, Rosov RJ. Tactual hearing experiment with deaf and hearing subjects. Research Bulletin 14; 1974.
[83] Goldstein MH Jr, Proctor A. Tactile aids for profoundly deaf children. J Acoust Soc Am. 1985;77:258–265.
[84] Risberg A. A critical review of work on speech analyzing hearing aids. IEEE Trans Audio Electroacoust. 1969;17:290–297.
[85] Biondi E, Biondi L. The sampling of sounds as a new means of making speech intelligible to the profoundly deaf. Alta Frequenza. 1968;37:180–191.
[86] Johansson B. A new coding amplifier system for the severely hard of hearing; 1961. p. 655–657.

[87] Risberg A, Spens K. Teaching machine for training experiments in speech perception. Speech Technol Lab Quart Rep. 1967;2:72.
[88] Guttman N, Nelson JR. An instrument that creates some artificial speech spectra for the severely hard of hearing. Am Ann Deaf. 1968;113:295–302.
[89] Daniel L, Druz WS. Transposition of high frequency sounds by partial vocoding of the speech spectrum: its use by deaf children. J Auditory Res. 1967;7:133–144.
[90] Pimonow L. La parole synthétique et son application dans la correction auditive [Synthetic speech and its application in hearing correction]. New York: Springer; 1965. p. 151–171.
[91] Mazzoni A, Bryan-Kinns N. Mood Glove: a haptic wearable prototype system to enhance mood music in film. Osaka, Japan: Entertainment Computing; 2016.
[92] Gescheider GA. Cutaneous sound localization. J Exp Psychol. 1965;70:617.
[93] Johnson KO, Phillips JR. Tactile spatial resolution. I. Two-point discrimination, gap detection, grating resolution, and letter recognition. J Neurophysiol. 1981;46:1177–1192.
[94] Vallbo ÅB, Johansson R. Properties of cutaneous mechanoreceptors in the human hand related to touch sensation. Hum Neurobiol. 1984;3:3–14.
[95] Harkins J, Tucker PE, Williams N, et al. Vibration signaling in mobile devices for emergency alerting: a study with deaf evaluators. J Deaf Stud Deaf Educ. 2010;enq018.
[96] Kim JS, Kim CH. A review of assistive listening device and digital wireless technology for hearing instruments. Korean J Audiol. 2014;18:105–111.
[97] Damper R, Evans MD. A multifunction domestic alert system for the deaf–blind. IEEE Trans Rehab Eng. 1995;3:354–359.
[98] Wang Y, Kuchenbecker KJ. HALO: Haptic Alerts for Low-hanging Obstacles in white cane navigation. New York: IEEE; 2012. p. 527–532.
[99] Dakopoulos D, Bourbakis NG. Wearable obstacle avoidance electronic travel aids for blind: a survey. IEEE Trans Syst Man Cybern C. 2010;40:25–35.
[100] Ram S, Sharf J. The people sensor: a mobility aid for the visually impaired. New York: IEEE; 1998. p. 166–167.
[101] Sunu-Inc. 2015. Sunu Band. Available from: http://sunu.io/. Accessed 2016 June 18.
[102] Kammoun S, Jouffrais C, Guerreiro T, et al. Guiding blind people with haptic feedback. Frontiers in Accessibility for Pervasive Computing (Pervasive 2012). Vol. 3; 2012.
[103] Mann S, Huang J, Janzen R, et al. Blind navigation with a wearable range camera and vibrotactile helmet. New York: ACM; 2011. p. 1325–1328.
[104] Dakopoulos D, Boddhu SK, Bourbakis N. A 2D vibration array as an assistive device for visually impaired. New York: IEEE; 2007. p. 930–937.
[105] Adjouadi M. A man-machine vision interface for sensing the environment. J Rehabil Res Dev. 1992;29:57–76.
[106] Kaczmarek K, Webster JG, Bach-y-Rita P, et al. Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Trans Biomed Eng. 1991;38:1–16.
[107] Akhter S, Mirsalahuddin J, Marquina F, et al. A smartphone-based haptic vision substitution system for the blind. New York: IEEE; 2011. p. 1–2.
[108] Gemperle F, Ota N, Siewiorek D. Design of a wearable tactile display. New York: IEEE; 2001. p. 5–12.
[109] Jones LA, Lockyer B, Piateski E. Tactile display and vibrotactile pattern recognition on the torso. Adv Robot. 2006;20:1359–1374.

[110] Van Veen H, Van Erp JB. Providing directional information with tactile torso displays; 2003. p. 471–474.
[111] McDaniel T, Krishna S, Balasubramanian V, et al. Using a haptic belt to convey non-verbal communication cues during social interactions to individuals who are blind. New York: IEEE; 2008. p. 13–18.
[112] Van Erp JB, Van Veen HA, Jansen C, et al. Waypoint navigation with a vibrotactile waist belt. ACM Trans Appl Percept. 2005;2:106–117.
[113] Rose C, Pierce D, Sherman A. Interactive navigation system for the visually impaired with auditory and haptic cues in crosswalks, indoors and urban areas. HCI International 2015 – Posters' Extended Abstracts. New York: Springer; 2015. p. 539–545.
[114] Flores G, Kurniawan S, Manduchi R, et al. Vibrotactile guidance for wayfinding of blind walkers. IEEE Trans Haptics. 2015;8:306–317.
[115] Gilson S, Gohil S, Khan F, et al. A wireless navigation system for the visually impaired. Vol. 2015; 2015.
[116] Tsukada K, Yasumura M. ActiveBelt: belt-type wearable tactile display for directional navigation. UbiComp 2004: Ubiquitous Computing. New York: Springer; 2004. p. 384–399.
[117] Nagel SK, Carl C, Kringe T, et al. Beyond sensory substitution – learning the sixth sense. J Neural Eng. 2005;2:13–26.
[118] Wu J, Song Z, Wu W, et al. A vibro-tactile system for image contour display. New York: IEEE; 2011. p. 145–150.
[119] Johnson L, Higgins CM. A navigation aid for the blind using tactile-visual sensory substitution. New York: IEEE; 2006. p. 6289–6292.
[120] Ducere-Technologies. 2011. Lechal. Available from: http://lechal.com/. Accessed 2016 March 10.
[121] Velázquez R, Bazán O, Alonso C, et al. Vibrating insoles for tactile communication with the feet. New York: IEEE; 2011. p. 118–123.
[122] Velázquez R, Bazán O, Magaña M. A shoe-integrated tactile display for directional navigation. New York: IEEE; 2009. p. 1235–1240.
[123] Velázquez R, Bazán O. Preliminary evaluation of podotactile feedback in sighted and blind users. New York: IEEE; 2010. p. 2103–2106.
[124] Zhang J, Lip CW, Ong S-K, et al. A multiple sensor-based shoe-mounted user interface designed for navigation systems for the visually impaired. New York: IEEE; 2010. p. 1–8.
[125] Abu-Faraj Z, Jabbour E, Ibrahim P, et al. Design and development of a prototype rehabilitative shoes and spectacles for the blind. New York: IEEE; 2012. p. 795–799.
[126] Lobo L, Travieso D, Barrientos A, et al. Stepping on obstacles with a sensory substitution device on the lower leg: practice without vision is more beneficial than practice with vision. PLoS One. 2014;9:e98801.
[127] Grathio-Labs. 2011. Meet the Tacit project. It's sonar for the blind. Available from: http://grathio.com/2011/08/meet-the-tacit-project-its-sonar-for-the-blind/. Accessed 2016 July 30.
[128] Wormald International Sensory Aids, Horseshoe Bar Rd., Loomis, CA 95650. Mowat Sensor.
[129] Ghiani G, Leporini B, Paternò F. Vibrotactile feedback as an orientation aid for blind users of mobile guides. New York: ACM; 2008. p. 431–434.
[130] Kim Y, Harders M, Gassert R. Identification of vibrotactile patterns encoding obstacle distance information. IEEE Trans Haptics. 2015;8:298–305.

[131] Gallo S, Chapuis D, Santos-Carreras L, et al. Augmented white cane with multimodal haptic feedback. New York: IEEE; 2010. p. 149–155.
[132] Fanucci L, Roncella R, Iacopetti F, et al. Improving mobility of pedestrian visually-impaired users; 2011.
[133] BlindMaps.org. 2014. Haptic white cane. Available from: http://www.blindmaps.org/prototypes/. Accessed 2016 May 21.
[134] Sound-Foresight-Technology-Ltd. The UltraCane – an award winning mobility aid. Available from: http://www.ultracane.com/about_the_ultracane. Accessed 2016 September 16.
[135] Scalise L, Primiani VM, Shahu DD, et al. Electromagnetic aids for visually impaired users. ICEmB II Convegno Nazionale, "Interazioni fra campi elettromagnetici e biosistemi" [Interactions between electromagnetic fields and biosystems], Bologna; 2012.
[136] Calder DJ. Ecological solutions for the blind. New York: IEEE; 2010. p. 625–630.
[137] Eyes-Free-Project. 2011. WalkyTalky. Available from: https://play.google.com/store/apps/details?id=com.googlecode.eyesfree.walkytalky. Accessed 2016 May 3.
[138] Amemiya T, Sugiyama H. Design of a haptic direction indicator for visually impaired people in emergency situations. In: Computers helping people with special needs. Vol. 5105, Lecture Notes in Computer Science. New York: Springer; 2008. p. 1141–1144.
[139] Ortigoza-Ayala L, Ruiz-Huerta L, Caballero-Ruiz A, et al. Artificial vision for the human blind. Vol. 47. Mexico: Revista Médica del Instituto Mexicano del Seguro Social; 2008. p. 393–398.
[140] Ortigoza-Ayala L, Ruiz-Huerta L, Caballero-Ruiz A, et al. Prótesis de substitución sensorial visual para pacientes ciegos [Visual sensory substitution prosthesis for blind patients]. Rev Mex Oftalmol. 2009;83:235–238.
[141] Zelek J, Audette R, Balthazaar J, et al. A stereo-vision system for the visually impaired. Vol. 1999. Guelph: University of Guelph; 1999. p. 1–9.
[142] Google-Inc. 2013. BrailleBack. Available from: https://play.google.com/store/apps/details?id=com.googlecode.eyesfree.brailleback. Accessed 2016 September 27.
[143] Oliveira J, Guerreiro T, Nicolau H, et al. BrailleType: unleashing braille over touch screen mobile phones. Human-computer interaction – INTERACT 2011. New York: Springer; 2011. p. 100–107.
[144] Azenkot S, Fortuna E. Improving public transit usability for blind and deaf–blind people by connecting a braille display to a smartphone. New York: ACM; 2010. p. 317–318.
[145] Nakamura M, Jones L. An actuator for the tactile vest – a torso-based haptic device. New York: IEEE; 2003. p. 333–339.
[146] Jones L, Nakamura M, Lockyer B. Development of a tactile vest. New York: IEEE; 2004. p. 82–89.
[147] Velázquez R, Fontaine E, Pissaloux E. Coding the environment in tactile maps for real-time guidance of the visually impaired. New York: IEEE; 2006. p. 1–6.
[148] Arcara P, Di Stefano L, Mattoccia S, et al. Perception of depth information by means of a wire-actuated haptic interface. New York: IEEE; 2000. p. 3443–3448.
[149] Akita J, Komatsu T, Ito K, et al. CyARM: haptic sensing device for spatial localization on basis of exploration by arms. Adv Hum-Comput Interact. 2009;2009:6.
[150] Yuan D, Manduchi R. A tool for range sensing and environment discovery for the blind. New York: IEEE; 2004. p. 39.

[151] Park CH, Ryu E-S, Howard A. Telerobotic haptic exploration in art galleries and museums for individuals with visual impairments. IEEE Trans Haptics. 2014;8:327–338.
[152] Loscos C, Tecchia F, Frisoli A, et al. The Museum of Pure Form: touching real statues in an immersive virtual museum; 2004. p. 271–279.
[153] Russomanno A, O'Modhrain S, Gillespie R, et al. Refreshing refreshable Braille displays; 2015.
[154] Bach-y-Rita P. Neurophysiological basis of a tactile vision-substitution system. IEEE Trans Man Mach Syst. 1970;11:108–110.
[155] Bach-y-Rita P, Collins CC, Saunders FA, et al. Vision substitution by tactile image projection. Nature. 1969;221:963–964.
[156] Bach-y-Rita P, Tyler ME, Kaczmarek KA. Seeing with the brain. Int J Hum–Comput Interact. 2003;15:285–295.
[157] Gardner EP, Palmer CI. Simulation of motion on the skin. I. Receptive fields and temporal frequency coding by cutaneous mechanoreceptors of OPTACON pulses delivered to the hand. J Neurophysiol. 1989;62:1410–1436.
[158] Goldish LH, Taylor HE. The Optacon: a valuable device for blind persons. New Outlook Blind. 1974;68:49–56.
[159] Humanware. BrailleNote. Available from: http://store.humanware.com/europe/brailliant-bi-32-new-generation.html. Accessed 2016 April 25.
[160] Smithmaitrie P, Kanjantoe J, Tandayya P. Touching force response of the piezoelectric Braille cell. Disabil Rehabil: Assist Technol. 2008;3:360–365.
[161] Lévesque V, Pasquero J, Hayward V, et al. Display of virtual braille dots by lateral skin deformation: feasibility study. ACM Trans Appl Percept. 2005;2:132–149.
[162] Hayward V, Cruz-Hernandez M. Tactile display device using distributed lateral skin stretch; 2000. p. 1309–1314.
[163] Hayward V, Terekhov AV, Wong S-C, et al. Spatio-temporal skin strain distributions evoke low variability spike responses in cuneate neurons. J R Soc Interface. 2014;11.
[164] Jörntell H, Bengtsson F, Geborek P, et al. Segregation of tactile input features in neurons of the cuneate nucleus. Neuron. 2014;83:1444–1452.
[165] Pasquero J, Hayward V. STReSS: a practical tactile display system with one millimeter spatial resolution and 700 Hz refresh rate. Dublin, Ireland; 2003. p. 94–110.
[166] Bliss JC, Katcher MH, Rogers CH, et al. Optical-to-tactile image conversion for the blind. IEEE Trans Man Mach Syst. 1970;11:58–65.
[167] Kobayashi M, Watanabe T. A tactile display system equipped with a pointing device – MIMIZU. Computers helping people with special needs. New York: Springer; 2002. p. 527–534.
[168] Homma T, Ino S, Kuroki H, et al. Development of a piezoelectric actuator for presentation of various tactile stimulation patterns to fingerpad skin. New York: IEEE; 2004. p. 4960–4963.
[169] Chouvardas VG, Miliou AN, Hatalis MK. Tactile display applications: a state of the art survey. Citeseer; 2005. p. 290–303.
[170] O'Modhrain S, Giudice N, Gardner J, et al. Designing media for visually-impaired users of refreshable touch displays: possibilities and pitfalls. IEEE Trans Haptics. 2015;8:248–257.
[171] Metec. 2007. HyperBraille: the project. Available from: http://www.hyperbraille.de/project/?lang=en. Accessed 2016 July 10.
[172] Koo IM, Jung K, Koo JC, et al. Development of soft-actuator-based wearable tactile display. IEEE Trans Robot. 2008;24:549–558.
[173] Haga Y, Makishi W, Iwami K, et al. Dynamic Braille display using SMA coil actuator and magnetic latch. Sens Actuators A Phys. 2005;119:316–322.
[174] Yobas L, Durand DM, Skebe GG, et al. A novel integrable microvalve for refreshable Braille display system. J Microelectromech Syst. 2003;12:252–263.
[175] Lee JS, Lucyszyn S. A micromachined refreshable Braille cell. J Microelectromech Syst. 2005;14:673–682.
[176] Tactus-Technology-Inc. 2009. Phorm. Available from: https://www.getphorm.com/; http://tactustechnology.com/technology/. Accessed 2016 September 27.
[177] Cunha JC, Bordignon LA, Nohama P. Tactile communication using a CO2 flux stimulation for blind or deafblind people. New York: IEEE; 2010. p. 5871–5874.
[178] Cunha J, Nohama P. A novel instrumentation to investigate the alternative tactile communication through mechanical stimulation using CO2 jets. New York: Springer; 2015. p. 35–38.
[179] Visell Y. Tactile sensory substitution: models for enaction in HCI. Interact Comput. 2009;21:38–53.
[180] Garland C. Sensory devices for the blind. J Med Eng Technol. 1977;1:319–323.
[181] Minagawa H, Ohnishi N, Sugie N. Tactile-audio diagram for blind persons. IEEE Trans Rehabil Eng. 1996;4:431–437.
[182] Plimmer B, Reid P, Blagojevic R, et al. Signing on the tactile line: a multimodal system for teaching handwriting to blind children. ACM Trans Comput–Hum Interact. 2011;18:17.
[183] Schloerb DW, Lahav O, Desloge JG, et al. BlindAid: virtual environment system for self-reliant trip planning and orientation and mobility training. New York: IEEE; 2010. p. 363–370.
[184] Johnson KA, Semwal SK. Shapes: a multi-sensory environment for the B/VI and hearing impaired community. Minneapolis, MN: IEEE; 2014. p. 1–6.
[185] Schwerdt HN, Tapson J, Etienne-Cummings R. A color detection glove with haptic feedback for the visually disabled. New York: IEEE; 2009. p. 681–686.
[186] Cappelletti L, Ferri M, Nicoletti G. Vibrotactile color rendering for the visually impaired within the VIDET project. Bellingham: International Society for Optics and Photonics; 1998. p. 92–96.
[187] Collins CC. Tactile television – mechanical and electrical image projection. IEEE Trans Man Mach Syst. 1970;11:65–71.
[188] Klein D, Rensink D, Freimuth H, et al. Modelling the response of a tactile array using electrorheological fluids. J Phys D: Appl Phys. 2004;37:794.
[189] Chouvardas V, Miliou A, Hatalis M. Tactile displays: overview and recent advances. Displays. 2008;29:185–194.
[190] Xu C, Israr A, Poupyrev I, et al. Tactile display for the visually impaired using TeslaTouch. New York: ACM; 2011. p. 317–322.
[191] Eberhardt SP, Bernstein LE, Coulter DC, et al. A haptic display for speech perception by deaf and deaf–blind individuals. New York: IEEE; 1993. p. 195–201.
[192] Sarkar R, Das S, Roy S. SPARSHA: a low cost refreshable braille for deaf–blind people for communication with deaf–blind and non-disabled persons. Distributed computing and internet technology. New York: Springer; 2013. p. 465–475.

[193] Caporusso N, Mkrtchyan L, Badia L. A multimodal interface device for online board games designed for sight-impaired people. IEEE Trans Inform Technol Biomed. 2010;14:248–254.
[194] Ross DA, Blasch BB. Wearable interfaces for orientation and wayfinding. New York: ACM; 2000. p. 193–200.
[195] Ohtsuka S, Hasegawa S, Sasaki N, et al. Communication system between deaf–blind people and non-disabled people using body-braille and infrared communication. New York: IEEE; 2010. p. 1–2.
[196] Jayant C, Acuario C, Johnson W, et al. V-braille: haptic braille perception using a touch-screen and vibration on mobile phones. New York: ACM; 2010. p. 295–296.
[197] Khambadkar V, Folmer E. A tactile-proprioceptive communication aid for users who are deafblind. New York: IEEE; 2014. p. 239–245.
[198] Ramirez-Garibay F, Millan Olivarria C, Eufracio Aguilera AF, et al. MyVox – device for the communication between people: blind, deaf, deaf–blind and unimpaired. New York: IEEE; 2014. p. 506–509.
[199] Amemiya T, Yamashita J, Hirota K, et al. Virtual leading blocks for the deaf–blind: a real-time way-finder by verbal-nonverbal hybrid interface and high-density RFID tag space. New York: IEEE; 2004. p. 165–287.
[200] Hirose M, Amemiya T. Wearable finger-braille interface for navigation of deaf–blind in ubiquitous barrier-free space; 2003. p. 1417–1421.
[201] Shah C, Bouzit M, Youssef M, et al. Evaluation of RU-Netra – tactile feedback navigation system for the visually impaired. New York: IEEE; 2006. p. 72–77.
[202] Bouzit M, Chaibi AD, Laurentis K, et al. Tactile feedback navigation handle for the visually impaired. USA: American Society of Mechanical Engineers; 2004. p. 1171–1177.
[203] Schmitz B, Ertl T. Making digital maps accessible using vibrations. Computers helping people with special needs. New York: Springer; 2010. p. 100–107.
[204] Cardin S, Thalmann D, Vexo F. A wearable system for mobility improvement of visually impaired people. Vis Comput. 2007;23:109–118.
[205] Max ML, Gonzalez JR. Blind persons navigate in virtual reality (VR); hearing and feeling communicates 'reality'. Stud Health Technol Inform. 1996;39:54–59.
[206] Hersh M, Johnson MA. Mobility: an overview. Clear-path indicators. In: Hersh M, Johnson MA, editors. Assistive technology for visually impaired and blind people. New York: Springer Science & Business Media; 2010. p. 192–193.
[207] Sarkar R, Das S, Rudrapal D. A low cost microelectromechanical Braille for blind people to communicate with blind or deaf blind people through SMS subsystem. New York: IEEE; 2013. p. 1529–1532.
[208] Choudhary T, Kulkarni S, Reddy P. A Braille based mobile communication and translation glove for deaf–blind people. New York: IEEE; 2015. p. 1–4.
[209] Vincent C, Routhier F, Martel V, et al. Field testing of two electronic mobility aid devices for persons who are deaf–blind. Disabil Rehabil: Assist Technol. 2014;9:414–420.
[210] Ogrinc M, Farkhatdinov I, Walker R, et al. Deaf–blind can practise horse riding with the help of haptics. New York: Springer; 2016. p. 452–461.
[211] Spiers A, Dollar A. Design and evaluation of shape-changing haptic interfaces for pedestrian navigation assistance; 2016.
[212] Kramer JP, Lindener P, George WR. Communication system for deaf, deaf–blind, or non-vocal individuals using instrumented glove. Google Patents; 1991.
[213] Sorgini F, Mazzoni A, Massari L, et al. Encapsulation of piezoelectric transducers for sensory augmentation and substitution with wearable haptic devices. Micromachines. 2017;8:270.
[214] Russo LO, Airò Farulla G, Pianu D, et al. PARLOMA – a novel human–robot interaction system for deaf–blind remote communication. Int J Adv Robot Syst. 2015;12:57.
[215] Farulla GA, Russo LO, Pintor C, et al. Real-time single camera hand gesture recognition system for remote deaf–blind communication. Augmented and virtual reality. New York: Springer; 2014. p. 35–52.
[216] Murphy K, Darrah M. Haptics-based apps for middle school students with visual impairments. IEEE Trans Haptics. 2015;8:318–326.
[217] Nicolau H, Guerreiro J, Guerreiro T, et al. UbiBraille: designing and evaluating a vibrotactile Braille-reading device. New York: ACM; 2013. p. 23.
[218] Merriam-Webster Dictionary. Definition of disability. Available from: https://www.merriam-webster.com/dictionary/disability. Accessed 2017 September 04.
[219] Goldstein M, Proctor A, Bulle L, et al. Tactile stimulation in speech reception: experience with a nonauditory child. Speech of the hearing impaired: research training and personal preparation; 1983. p. 147–166.
[220] Pickett J. Speech communication for the deaf: visual, tactile, and cochlear-implant. J Rehabil Res Dev. 1986;23:95–99.
[221] Pickett J, McFarland W. Auditory implants and tactile aids for the profoundly deaf. J Speech Hear Res. 1985;28:134–150.
[222] Miyamoto RT, Robbins AM, Osberger MJ, et al. Comparison of multichannel tactile aids and multichannel cochlear implants in children with profound hearing impairments. Otol Neurotol. 1995;16:8–13.
[223] Osberger MJ, Robbins AM, Miyamoto RT, et al. Speech perception abilities of children with cochlear implants, tactile aids, or hearing aids. Otol Neurotol. 1991;12:105–115.
[224] Tang H, Beebe DJ. An oral tactile interface for blind navigation. IEEE Trans Neural Syst Rehabil Eng. 2006;14:116–123.
[225] Bolanowski SJ Jr, Gescheider GA, Verrillo RT, et al. Four channels mediate the mechanical aspects of touch. J Acoust Soc Am. 1988;84:1680–1694.
[226] Johansson RS. Tactile sensibility in the human hand: receptive field characteristics of mechanoreceptive units in the glabrous skin area. J Physiol (Lond). 1978;281:101–125.
[227] Weinstein S. Intensive and extensive aspects of tactile sensitivity as a function of body part, sex and laterality; 1968.
[228] Van Boven RW, Hamilton RH, Kauffman T, et al. Tactile spatial resolution in blind Braille readers. Neurology. 2000;54:2230–2236.
[229] Heinrichs R, Moorhouse J. Touch-perception thresholds in blind diabetic subjects in relation to the reading of Braille type. N Engl J Med. 1969;280:72–75.
[230] Stevens JC, Foulke E, Patterson MQ. Tactile acuity, aging, and braille reading in long-term blindness. J Exp Psychol: Appl. 1996;2:91.
[231] Grant AC, Thiagarajah MC, Sathian K. Tactile perception in blind Braille readers: a psychophysical study of acuity and hyperacuity using gratings and dot patterns. Percept Psychophys. 2000;62:301–312.

[232] Hollins M. Understanding blindness: an integrative approach. Mahwah, NJ: Lawrence Erlbaum Associates; 1989.
[233] Collins CC. On mobility aids for the blind. Electronic spatial sensing for the blind. New York: Springer; 1985. p. 35–64.
[234] Lévesque V. Blindness, technology and haptics. CIM technical report (CIM-TR-05.08). Canada: McGill University; 2005.
[235] Gori M, Cappagli G, Tonelli A, et al. Devices for visually impaired people: high technological devices with low user acceptance and no adaptability for children. Neurosci Biobehav Rev. 2016;69:79–88.
[236] Saunders GH, Echt KV. An overview of dual sensory impairment in older adults: perspectives for rehabilitation. Trends Amplif. 2007;11:243–258.
[237] Hollins M. Attitudes and emotional reactions to blindness. Understanding blindness: an integrative approach. Mahwah, NJ: Lawrence Erlbaum Associates; 1989.
[238] Hersh M. Deafblind people, stigma and the use of assistive communication and mobility devices. Technol Disabil. 2013;25:245–261.
[239] Cochlear. 2016. Hearing aids. Available from: http://www.cochlear.com/wps/wcm/connect/intl/home/understand/hearing-and-hl/hl-treatments/hearing-aids.
[240] Hearing Link. 2016. Middle ear implants. Available from: http://www.hearinglink.org/your-hearing/implants/middle-ear-implants/.
[241] MED-EL. 2016. Candidacy for middle ear implants. Available from: http://www.medel.com/us/candidacy-middle-ear-implants/.
[242] Cochlear. 2016. Cochlear implants. Available from: http://www.cochlear.com/wps/wcm/connect/intl/home/understand/hearing-and-hl/hl-treatments/cochlear-implant.
[243] Cochlear. 2016. Electro-acoustic implants. Available from: http://www.cochlear.com/wps/wcm/connect/intl/home/understand/hearing-and-hl/hl-treatments/electro-acoustic-implant.
[244] Hagr A. BAHA: bone-anchored hearing aid. Int J Health Sci (Qassim). 2007;1:265.
[245] Yawn R, Hunter JB, Sweeney AD, et al. Cochlear implantation: a biomechanical prosthesis for hearing loss. F1000Prime Rep. 2015;7.
[246] Lewis PM, Ackland HM, Lowery AJ, et al. Restoration of vision in blind individuals using bionic devices: a review with a focus on cortical visual prostheses. Brain Res. 2015;1595:51–73.
[247] Rizzo S, Belting C, Cinelli L, et al. The Argus II retinal prosthesis: 12-month outcomes from a single-study center. Am J Ophthalmol. 2014;157:1282–1290.
[248] Biran R, Martin DC, Tresco PA. Neuronal cell loss accompanies the brain tissue response to chronically implanted silicon microelectrode arrays. Exp Neurol. 2005;195:115–126.
[249] Edell DJ, Toi V, McNeil VM, et al. Factors influencing the biocompatibility of insertable silicon microshafts in cerebral cortex. IEEE Trans Biomed Eng. 1992;39:635–643.
[250] Azemi E, Lagenaur CF, Cui XT. The surface immobilization of the neural adhesion molecule L1 on neural probes and its effect on neuronal density and gliosis at the probe/tissue interface. Biomaterials. 2011;32:681–692.
[251] McConnell GC, Rees HD, Levey AI, et al. Implanted neural electrodes cause chronic, local inflammation that is correlated with local neurodegeneration. J Neural Eng. 2009;6:056003.
[252] McCreery D, Pikov V, Troyk PR. Neuronal loss due to prolonged controlled-current stimulation with chronically implanted microelectrodes in the cat cerebral cortex. J Neural Eng. 2010;7:036005.
[253] Zrenner E. Will retinal implants restore vision? Science. 2002;295:1022–1025.
[254] Weisenberger JM, Broadstone SM, Saunders FA. Evaluation of two multichannel tactile aids for the hearing impaired. J Acoust Soc Am. 1989;86:1764–1775.
[255] Bach-y-Rita P, Kaczmarek KA, Tyler ME, et al. Form perception with a 49-point electrotactile stimulus array on the tongue: a technical note. J Rehabil Res Dev. 1998;35:427–430.
[256] Meers S, Ward K. A substitute vision system for providing 3D perception and GPS navigation via electro-tactile stimulation. Proceedings of the International Conference on Sensing Technology, Palmerston North, New Zealand; 2005.
[257] O'Modhrain MS, Gillespie B. The moose: a haptic user interface for blind persons; 1997.
[258] Sensable. 2015. Phantom Omni Haptic Device. Available from: http://www.dentsable.com/haptic-phantom-omni.htm. Accessed 2016 April 12.
