Combining Sensorial Modalities in Haptic Navigation Tasks for Blind People

Iñaki DÍAZ a,b, Josune HERNANTES a,b and Jorge Juan GIL a,b,1

a Department of Applied Mechanics, CEIT, Spain
b TECNUN, University of Navarra, Spain

Abstract. Recent developments in haptic technology have made physical interaction with virtual environments possible. The combination of the senses of sight, hearing and touch provides these virtual environments with enhanced realism and interactivity to simulate multiple real-world tasks. Furthermore, the addition of the sense of touch can help in the design of specific virtual tasks for people with visual impairment. This paper analyses the advantages and drawbacks of combining tactile and sound stimuli to render virtual environments for blind people. Specifically, it studies the combination of both senses to provide navigation-related information in virtual tasks. A user study was carried out among several subjects to examine this issue. Results show that a combination of both senses provides better performance in navigation tasks than haptic feedback alone.

Keywords. Haptic systems, audio, multisensory interaction, visual impairment.

1. Introduction

The development of efficient orientation and mobility skills is essential for blind people in order to acquire a good mental map of spaces and the ability to navigate efficiently within them. Good navigation capability can undoubtedly improve quality of life and provide greater independence to blind people. However, most of the information required for these skills is visual. Therefore, it is essential to find other ways in which this information can be perceived by blind people.

Until now, virtual reality environments have only provided visual immersion to users, which prevented visually impaired people from using them. However, recent developments in haptic interfaces make it possible, to a certain extent, to build specific virtual applications for visually impaired people. Haptic devices allow users to interact with a given environment, real or virtual, through the sense of touch. In addition, most virtual reality systems can also render sound effects. The combination of the audio and haptic modalities offers alternative means of exploring virtual environments and virtual objects, which visually impaired people can use to improve their navigation and orientation skills. In the real world, visually impaired people already use the senses of touch and hearing as substitutes for sight (e.g. replacing written text with Braille). In the same way, virtual environments can be specifically designed for blind people by substituting the visual modality with the other modalities available in the system.

1 Corresponding Author: Paseo Manuel Lardizabal 15, E-20018 San Sebastián, Spain; E-mail: [email protected].

An important issue in navigation for visually impaired people is how to present navigation-related information to the user. In our daily lives we use most of our senses together to perceive the surrounding environment and to successfully accomplish everyday tasks. By contrast, virtual interfaces cannot (as yet) render all the stimuli perceived in the real world. Therefore, the way in which information about virtual events is presented to blind users depends on the system's rendering capabilities (the sensorial modalities available) and on the designer's criteria. As a result, it is necessary to analyse thoroughly how the different modalities (sound and haptics) interplay, and to study their appropriateness for rendering different information, in order to optimize both the overall perception of the environment and the successful accomplishment of the tasks.

2. Related Work

Virtual reality interfaces have become very popular in many fields, such as industry and medicine, for developing advanced training environments for students and operators. In addition, haptic technology is emerging as the perfect partner for these systems, providing them with the ability to interact physically with such environments. These multisensory interfaces combine the human senses of sight, hearing and touch to immerse users in simulated environments that display real-world scenarios and stimuli. In medicine, surgeons can simulate complex surgical procedures on virtual human bodies before operating on real patients [1]. In industry, maintenance and accessibility can be evaluated with specially designed virtual interfaces [2]. Some haptic applications have also been successfully developed for blind people, and a great deal of research is ongoing. In [3], a virtual haptic-audio traffic environment for exploring and learning real routes was developed. Lécuyer et al. [4] developed a multimodal system dedicated to visually impaired people for exploring and navigating inside virtual environments. In [5], the authors developed and tested a specially designed multisensory virtual environment to enable blind people to learn about different real spaces and navigate easily within them.

3. Goal of the Study

Nowadays, most haptic applications for blind people use sound as verbal commands for navigation purposes, or to provide complementary information about events or the presence of people in the environment. None of them, however, use collision or contact sounds for navigation purposes. Recent advances in automatic sound generation algorithms [6] enable the creation of sound as a direct result of the user's interaction with the environment. With these algorithms, sound is generated in real time and matches the sound the user would hear in the real world, with no need for previously recorded sounds. This significantly improves the interactivity of these virtual applications. However, it is not clear in advance how the brain perceives and processes the same information about an event in a virtual environment when it arrives through two different sensorial modalities (haptic and sound).
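As an illustration of this family of algorithms, a contact sound can be synthesized as a sum of exponentially damped sinusoids, the modal model underlying physically based approaches such as [6]. The mode frequencies, dampings and gains below are hypothetical placeholders for illustration, not measured material parameters:

```python
import numpy as np

def modal_impact_sound(freqs_hz, dampings, gains, impact_force,
                       duration_s=0.5, sample_rate=44100):
    """Synthesize an impact sound as a sum of damped sinusoids.

    Each vibration mode of the struck object contributes one
    exponentially decaying sine wave; the impact force scales the
    overall amplitude, so harder contacts sound louder. All modal
    parameters here are illustrative, not measured material data.
    """
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    signal = np.zeros_like(t)
    for f, d, g in zip(freqs_hz, dampings, gains):
        signal += g * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    return impact_force * signal

# A hypothetical "wooden" tap with three modes, synthesized in one call:
sound = modal_impact_sound([220.0, 540.0, 910.0], [8.0, 12.0, 20.0],
                           [1.0, 0.5, 0.25], impact_force=2.0)
```

Because the waveform is a direct function of the impact parameters, it can be regenerated at every contact in real time instead of replaying a stored recording.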

Our prior work [7] demonstrated that sound is an effective sensorial modality in combination with touch and vision for improving performance in virtually simulated accessibility tasks. That work concluded that a combination of the three sensorial modalities gave better performance than any combination of fewer modalities. However, the visual stimulus was always provided, and only the sound and haptic feedback changed from one experiment to another. This motivated us to analyse whether a combination of the haptic and sound modalities remains effective in haptic accessibility and navigation tasks when visual information is not displayed. The objective of this study is to analyse whether a combination of sound and haptic stimuli improves the accomplishment of virtual navigation tasks by blind people compared with haptic feedback alone. A user study was conducted among 13 subjects to address this objective. The results can be used as a guideline in the design of haptic applications for visually impaired individuals.

4. Test Design

Participants

Thirteen subjects took part in the experiments, 11 men and 2 women, aged between 25 and 35. Our first intention was to carry out the experiments with blind people. However, due to the difficulty of recruiting a statistically sufficient number of blind participants, we decided to conduct the study with sighted people keeping their eyes closed. All participants reported normal tactile function and were free of auditory impairments. Most subjects had no prior experience with haptic interfaces, and all were naïve to the details of the experiments.

Procedure

A virtual environment consisting of a labyrinth was designed for the experiments (Figure 1). A solid sphere was located at the entrance of the labyrinth, and its position was controlled with the end-effector of the LHIfAM haptic device [2]. The walls were deep enough to guarantee that subjects could not escape from the path.

Figure 1. Virtual labyrinth (left) and multisensory interface (right) used for the experiments.

Initially, participants were allowed to interact with other haptic applications for a short period of time in order to familiarize themselves with haptic and sound stimuli. Then, they were asked to cover the whole path from the entrance of the labyrinth (left) to the exit (right). They were also informed that they could move right, up and down, but not left (backwards). Subjects repeated the experiment twice, and in each experiment the collision information of the sphere with the walls of the labyrinth was displayed in a different manner: with haptic stimuli only (H), or with haptic and sound stimuli together (HS). The order of the experiments changed from one subject to another, and there was an interval of one day between the two experiments performed by each subject, to prevent participants from training on and becoming used to the path they had to follow. Subjects were told to complete the task at the speed at which they felt most comfortable. It was emphasized that the main goal was to avoid collisions between the sphere and the walls of the labyrinth on the way to the exit.
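Haptic collision feedback of this kind is commonly rendered with a penalty-based spring model: when the sphere penetrates a wall, a restoring force proportional to the penetration depth pushes it back out. The sketch below illustrates this idea for a single planar wall; the stiffness value and geometry are assumptions for illustration, not the actual LHIfAM configuration:

```python
def wall_force(sphere_pos, sphere_radius, wall_y, stiffness=1000.0):
    """Penalty-based contact force for a sphere pressing into a plane.

    penetration: how far the sphere surface passes the wall plane (m).
    The returned force is a spring pushing back against penetration.
    The stiffness (N/m) is an illustrative value, not the device's.
    """
    penetration = (sphere_pos + sphere_radius) - wall_y
    if penetration <= 0.0:
        return 0.0, 0.0          # no contact: no force, no penetration
    return -stiffness * penetration, penetration

# Sphere surface 2 mm past the wall -> restoring force of about -2 N
force, depth = wall_force(sphere_pos=0.052, sphere_radius=0.01, wall_y=0.06)
```

The same penetration value that drives the force can be logged at each sampling period, which is exactly the quantity the performance metrics in the next section are built from.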

5. Results and Discussion

We consider that a good response to a collision event should prevent participants from penetrating deep into the walls of the labyrinth and should end collisions as quickly as possible. Therefore, two parameters were measured to evaluate task performance: 1) the quotient between the total time in collision and the total time needed to complete the task (TC/TT), and 2) the total time in collision, weighting each sampling period with the penetration value (mm) measured in that period (WTC). This second parameter gives an idea of the penetrated "area" over time, penalizing both penetration and collision time. Figure 2 shows the values of these parameters for each experiment and each subject, as well as the corresponding box and whisker diagrams.

Time in Collision / Total Time (TC/TT):

Subject    1       2       3       4       5       6       7       8       9       10      11      12      13
H        0.1615  0.1785  0.2008  0.2306  0.2287  0.3506  0.1949  0.4234  0.1378  0.1502  0.5795  0.2690  0.5729
HS       0.0699  0.1268  0.1093  0.2626  0.1088  0.1229  0.1283  0.2007  0.1149  0.1406  0.3524  0.2466  0.1873

Weighted Total Time in Collision (WTC):

Subject    1      2      3      4      5      6      7      8      9      10     11     12     13
H         3.27  12.64   9.47   9.44  11.38  12.72  10.56  14.73   3.31   7.18  27.54   9.25  15.44
HS        3.03   4.17   5.84  15.07   5.29   4.20   4.23   5.50   4.12   6.45   8.30   6.22   7.79

Figure 2. Results of the experiments and box and whisker diagrams. H groups the experiments with only haptic feedback, and HS groups the experiments that combine haptic and sound feedback.
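The two metrics above can be computed directly from the penetration depth sampled at each control period. A minimal sketch, with a made-up six-sample trajectory for illustration (the sampling rate is an assumption, not stated in the paper):

```python
import numpy as np

def collision_metrics(penetration_mm, dt_s):
    """Compute TC/TT and WTC from sampled penetration data.

    penetration_mm: penetration depth per sampling period (0 = no contact).
    TC/TT = fraction of the total task time spent in collision.
    WTC   = collision time with each sample weighted by its penetration,
            i.e. the penetrated "area" over time (mm * s).
    """
    p = np.asarray(penetration_mm, dtype=float)
    in_collision = p > 0.0
    tc_tt = in_collision.sum() / len(p)   # dimensionless ratio
    wtc = float(np.sum(p * dt_s))         # mm * s
    return tc_tt, wtc

# Illustrative trajectory: 3 of 6 samples in collision, sampled at 1 kHz.
tc_tt, wtc = collision_metrics([0.0, 0.5, 1.2, 0.3, 0.0, 0.0], dt_s=0.001)
```

Both metrics penalize lingering inside a wall, but WTC additionally penalizes how deeply the sphere penetrated during the collision.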

Nearly all participants accomplished the task better when a combination of haptic and sound stimuli was provided. A paired-samples t-test was performed on the measured data. The results of this analysis confirmed that the addition of sound improved task performance: the improvement was 40.9% for TC/TT (P = 0.01) and 45.4% for WTC (P = 0.004).
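A paired test is appropriate here because every subject experienced both conditions, so the analysis compares each subject's H and HS scores directly. The test on the TC/TT data can be reproduced from the values in Figure 2 (a sketch using SciPy, which the authors may or may not have used):

```python
from scipy import stats

# TC/TT per subject, H vs. HS conditions (values from Figure 2).
h  = [0.1615, 0.1785, 0.2008, 0.2306, 0.2287, 0.3506, 0.1949,
      0.4234, 0.1378, 0.1502, 0.5795, 0.2690, 0.5729]
hs = [0.0699, 0.1268, 0.1093, 0.2626, 0.1088, 0.1229, 0.1283,
      0.2007, 0.1149, 0.1406, 0.3524, 0.2466, 0.1873]

# Paired-samples t-test: tests whether the per-subject differences
# (h[i] - hs[i]) differ significantly from zero.
t_stat, p_value = stats.ttest_rel(h, hs)

# Mean relative improvement of HS over H, as reported in the text.
improvement = (sum(h) - sum(hs)) / sum(h)   # about 0.41, i.e. ~41%
```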

Apart from the analytical data, the comments made by the participants and their behaviour while performing the experiments were also taken into account. Most participants stated that the sound was a great aid and that it considerably increased the perceived realism of the task. However, some participants complained about the lack of proper directionality in the sound. In general, the results of the study indicate that collision events are better perceived with a combination of the haptic and audio modalities than with haptic feedback alone. We believe this study should encourage the use of a combination of both modalities to display navigation-related information in haptic applications for blind people. The use of non-verbal sounds to warn about collision events in virtual training environments makes these applications more realistic and closer to the real world. Future work will focus on implementing sound directionality to improve the quality of the overall system. In addition, it is known that visually impaired and non-impaired people do not perceive non-visual stimuli in the same way. Therefore, experiments have to be carried out with visually impaired subjects in order to definitively validate these results.

Acknowledgements

The research work presented in this paper has been partially supported by the Basque Government under project SIMOP S-PE05TE01.

References

[1] Basdogan, C., De, S., Kim, J., Muniyandi, M., Kim, H. and Srinivasan, M. A., "Haptics in Minimally Invasive Surgical Simulation and Training", IEEE Computer Graphics and Applications, 24 (2004), 56-64, March/April.
[2] Savall, J., Borro, D., Gil, J. J. and Matey, L., "Description of a Haptic System for Virtual Maintainability in Aeronautics", in International Conference on Intelligent Robots and Systems, (2002) 2887-2892, Lausanne, Switzerland.
[3] Magnusson, C. and Rassmus-Gröhn, K., "A Virtual Traffic Environment for People with Visual Impairment", Visual Impairment Research, 7 (2005), 1-12.
[4] Lécuyer, A., Mobuchon, P., Mégard, C., Perret, J., Andriot, C. and Colinot, J. P., "HOMERE: a Multimodal System for Visually Impaired People to Explore Virtual Environments", in Virtual Reality, (2003) 251-258.
[5] Lahav, O. and Mioduser, D., "Multisensory Virtual Environment for Supporting Blind Persons' Acquisition of Spatial Cognitive Mapping – a Case Study", in World Conference on Educational Multimedia, Hypermedia and Telecommunications, (2001) 1046-1051, Norfolk, VA.
[6] van den Doel, K., Kry, P. G. and Pai, D. K., "FoleyAutomatic: Physically-based Sound Effects for Interactive Simulation and Animation", in 28th Annual Conference on Computer Graphics and Interactive Techniques, (2001) 537-544, Los Angeles, USA.
[7] Díaz, I., Hernantes, J., Mansa, I., Lozano, A., Borro, D., Gil, J. J. and Sánchez, E., "Influence of Multisensory Feedback on Haptic Accessibility Tasks", Virtual Reality, 10 (2006), 31-40.
