<?xml version="1.0" standalone="yes"?>
<Paper uid="W98-0214">
  <Title>Navigating maps with little or no sight: An audio-tactile approach</Title>
  <Section position="3" start_page="95" end_page="99" type="metho">
    <SectionTitle>
2 Preliminary GIS and web studies
</SectionTitle>
    <Paragraph position="0"> Research undertaken at the University of Wales, Aberystwyth, UK between 1994 and 1996 explored the potential of GIS and hypermedia for communicating spatial information to blind and visually impaired people.</Paragraph>
    <Paragraph position="1"> A GIS front end, ArcView, was used to construct a spatial system for visually impaired people (legally blind, but with some residual vision). The GUI was stripped down and many unnecessary component buttons and menus were removed. The final system worked in two modes. The first was a low vision zoom-and-pan query mode which displayed a map of the campus. With a single mouse click users could zoom in to the area selected. By re-clicking the mouse button the user continued zooming in until the area in question filled the display. With a further click an audio file was played, 'speaking' the name of the building. Finally a large photograph of the building was displayed. In the second mode the user typed in the name or function of the building (for example 'Llandinam Building' or 'Earth Sciences'); a map was then displayed of the campus, and subsequent maps, each displayed after a mouse click, zoomed the user in to the building requested (see Jacobson and Kitchin, in press). As such, ArcView was effectively reduced to a point-and-click hypermedia system. Users of the system expressed great interest and excitement, asking 'can you do this for the town centre?' and commenting 'Now I can experience places I would never visit'. Due to the 'simplicity' of the final slimmed-down version of ArcView, and to allow optimum access and usability, it was decided to continue the project using the World Wide Web environment.</Paragraph>
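The click-to-zoom interaction described above can be sketched as a small state machine. This is a hypothetical Python re-implementation for illustration only: the study itself used ArcView, and the level names here are invented.

```python
# Hypothetical sketch of the point-and-click zoom interaction: each click
# zooms one level deeper; at the innermost level the building name is
# 'spoken' (an audio file would be played) instead of zooming further.

class ZoomHypermap:
    def __init__(self, levels, building_name):
        self.levels = levels            # map extents, coarse to fine (invented names)
        self.depth = 0
        self.building_name = building_name

    def click(self):
        """Zoom in one level, or play the building's name at maximum zoom."""
        if self.depth < len(self.levels) - 1:
            self.depth += 1
            return ("map", self.levels[self.depth])
        return ("audio", self.building_name)

m = ZoomHypermap(["campus", "quarter", "building"], "Llandinam Building")
assert m.click() == ("map", "quarter")
assert m.click() == ("map", "building")
assert m.click() == ("audio", "Llandinam Building")
```

The sketch captures only the interaction flow; displaying the photograph and the type-a-building-name mode are omitted.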
    <Paragraph position="2"> A series of hypermap World Wide Web pages were built allowing the user to navigate between low vision maps and spoken textual screens. The interface utilised large-font hypertext mark-up language (HTML) and, at the bottom right of the screen, magnifying software. Large-scale abstracted and simplified maps were used to convey spatial information. An enhanced cursor was used to follow links; when a shape on the map was queried, an audio file was played describing the building. This interface enabled users to access the low vision and spoken maps remotely.</Paragraph>
    <Paragraph position="3">  3. Prototype and pilot study of audio-tactile system The prototype used a low specification PC (DX4-100 processor, 16 megabytes of RAM, and a 16-bit soundcard). The touch pad and associated software retail for around $300 and will work with any Windows-based PC. The touch pad can be attached to a monitor so a user with limited vision is able to view the screen, or used at table-top level where a totally blind individual is able to scan the pad with their fingertips. Spatial information is presented as an auditory map. Areas of the touch pad are overlain with sound; when the map user's finger enters a designated area, the corresponding sound is played. By touching areas of the pad users were able to determine the size and shape of a map feature by the change in sound. A collection of sounds is used to represent map information usually conveyed by visual symbols (text, color, line style, shape, etc.). An off-line World Wide Web site is being built which utilizes interlinking auditory maps that can be traversed solely by sound and touch. As the user's finger is dragged across the touch pad, the system 'talks', playing audio files which are triggered by the position of the user's finger. By the use of spoken audio, verbal landmarks, environmental audio (such as traffic noise for a road) and auditory icons (earcons - Blattner et al., 1989) to denote specific events like the edge of a map, a link to further maps, or a point the user can press for more information, an audio-tactile hypermedia is constructed conveying cartographic information.</Paragraph>
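The core auditory-map behaviour, playing a sound when the finger enters an overlaid region of the pad, can be sketched as follows. This is a minimal Python sketch; the rectangle coordinates and sound file names are invented for illustration and do not come from the pilot system.

```python
# Minimal sketch of an auditory map: rectangular regions of the touch pad
# are overlain with sounds, and a sound is (re)played only when the finger
# crosses into a new region, so sound changes mark feature boundaries.

class AudioMap:
    def __init__(self, regions):
        # regions: list of (x0, y0, x1, y1, sound) in pad coordinates (invented)
        self.regions = regions
        self.current = None            # sound of the region the finger is in

    def touch(self, x, y):
        """Return the sound to play at (x, y), or None if the finger
        has stayed inside the same region (or left all regions)."""
        for x0, y0, x1, y1, sound in self.regions:
            if x0 <= x < x1 and y0 <= y < y1:
                if sound != self.current:   # play only on region entry
                    self.current = sound
                    return sound
                return None
        self.current = None
        return None

pad = AudioMap([(0, 0, 50, 100, "traffic.wav"),    # a road, environmental audio
                (50, 0, 100, 100, "waves.wav")])   # an ocean
assert pad.touch(10, 40) == "traffic.wav"   # finger enters the road region
assert pad.touch(20, 60) is None            # still inside the road region
assert pad.touch(80, 50) == "waves.wav"     # crossing into the ocean region
```

A real implementation would use arbitrary polygons and layered earcons rather than axis-aligned rectangles, but the entry-triggered playback logic is the same.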
    <Paragraph position="4"> Rather than direct manipulation of a tactile surface, such as pressing on the tactile maps in NOMAD, this system uses a touch pad.</Paragraph>
    <Paragraph position="5"> Therefore the user has no direct cutaneous stimulus from tactile relief. The encoding from the audio-tactile stimulus meant that map information is built up from kinaesthetic sensing of movement across the pad, sensing of the distance traversed across the pad, proprioceptive sensing of the location of the fingers, and location information obtained by referencing with the hands to the outside frame of the touch pad. (b) Audio overlay on visual map (dark text indicates the playing of an environmental sound): 1 To the south is a large conurbation: an area of many cities; 2 To the north is an area of rolling farmland; 3 To the west is a windy ocean; 4 To the east is a hot dusty plain; 5 The safari park has many animals from East Africa; 6 The lake is a popular escape for the city people during the hot summer months; 7 Trains travel from the city into the wine country; 8 Open space and farmland around the city; 9 Many boats trawl the sea for shoals of cod; 10 North island is home to a large colony of seabirds; 11 South island is used for missile testing; 12 The marsh is an area once filled by the sea, now unsuitable for development; 13 The main map shows a city and its surroundings. To the west is an ocean. In the south west is the city. To the north a train line, to the east a motorway, and on the eastern fringe a marsh.</Paragraph>
    <Paragraph position="6"> (c) Links to verbal information from the main audio-tactile map (main map in pilot study). Linking enables a blind user to traverse from one auditory map to another. As each map loads, a verbal overview describing the map is played. From all maps there is direct access to a help screen that explains the system and the modes of interaction.</Paragraph>
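The map-to-map linking described above, together with the browser-style history used by the back and forward buttons, can be sketched as follows. This is a hypothetical Python sketch; the map names and the link layout are invented, and a real system would also trigger the verbal overview when each map loads.

```python
# Sketch of linked auditory maps with a navigation history, assuming a
# simple dict of {map: {direction: neighbouring map}} (invented layout).

class HypermapNavigator:
    def __init__(self, links, start):
        self.links = links
        self.history = [start]         # visited maps, like a browser history
        self.pos = 0

    @property
    def current(self):
        return self.history[self.pos]

    def go(self, direction):
        """Follow a directional link; returns the new map, or None
        (which the system would relay verbally) when no map lies that way."""
        target = self.links.get(self.current, {}).get(direction)
        if target is None:
            return None
        # truncate any forward history, as in a web browser
        self.history = self.history[: self.pos + 1] + [target]
        self.pos += 1
        return target

    def back(self):
        if self.pos > 0:
            self.pos -= 1
        return self.current

nav = HypermapNavigator({"city": {"south": "harbor"},
                         "harbor": {"north": "city"}}, "city")
assert nav.go("south") == "harbor"
assert nav.go("west") is None       # no map available in that direction
assert nav.back() == "city"
```

The home and help buttons would simply load fixed maps outside this history mechanism, with help remembering the map to return to.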
    <Paragraph position="7"> Figure 1(a) displays the simple user interface for the auditory hypermap system. As the map-reader's finger moves across the touch pad and over the &amp;quot;SOUTH&amp;quot; bar the audio message &amp;quot;Press to go south&amp;quot; is played. Once this part of the touchpad is pressed the central area is filled with an auditory map of the area to the south of the previous one. If no maps are available, this is verbally relayed to the user. North, west and east all work in a similar manner. Home returns the user to the main auditory map. The help button explains how to use the system. When exiting from help the user is returned to the correct map. The &amp;quot;i&amp;quot; button plays information about the map in view (e.g., 'this is the city area map. Downtown is in the north of the urban area, and the harbor to the west etc.'). The back and forward buttons allow the user to traverse through the 'history' of their links.</Paragraph>
    <Section position="1" start_page="98" end_page="98" type="sub_section">
      <SectionTitle>
3.1 Methodology
</SectionTitle>
      <Paragraph position="0"> Participants were given no information about the content, structure or links between the maps.</Paragraph>
      <Paragraph position="1"> During the evaluation phase individuals had 15 minutes to navigate through and explore the maps. They were told that they were free to go where they wished and to return to places previously visited. At the end of this 15 minute period, the computer was turned off and the participant gave a verbal description of the maps and map layout, imagining they had to explain the maps to somebody over a telephone. The participant then graphically reconstructed the maps using a tactile drawing pad, which enables a blind user to feel what they are drawing. The whole process was videotaped and a log made of people's paths through the audio-tactile maps.</Paragraph>
      <Paragraph position="2"> Semi-structured interviews were used to gather impressions of the system, feedback on how it could be improved, and ideas about where it may be beneficial (such as in schools or at tourist sites).</Paragraph>
    </Section>
    <Section position="2" start_page="98" end_page="99" type="sub_section">
      <SectionTitle>
3.2 Results
</SectionTitle>
      <Paragraph position="0"> All users were able to successfully interact with the system. This included people who had never used a computer before. Interview responses suggest that the system aroused great interest and that map access was 'simple, satisfying and fun' (totally blind participant). Users were able to both graphically and verbally reconstruct the maps with varying degrees of accuracy. Further evaluation is planned to directly compare these results to tactile map access of the same scenes.</Paragraph>
      <Paragraph position="1"> Figure 2 shows a graphical reconstruction by a visually impaired map user, and figure 3 a graphical reconstruction by a totally blind participant. Evaluation of the system involved 5 visually impaired people and 5 blind people. The system was evaluated individually. Initial training took place for 15 minutes using the help screen of the model. Users were familiarised with the touchpad and were shown how to follow a link and how to obtain more verbal information. The structure of the menu surrounding the map was explained (buttons north, south etc.) and the function of the buttons to go back and home shown. Questions were answered and people familiarised themselves with the system. The audio-tactile hypermap system was designed as a prototype to explore the possibilities for conveying spatial information in this 'touch-audio' manner. Ultimately it is intended that such a system could act as a front end to a more fully functional GIS, enabling the selection and presentation of map-like information to visually impaired people, for example, constructing a map of 'y' town showing roads, location of crossings and public conveniences, all at the request of the user.</Paragraph>
      <Paragraph position="2"> Figure 2: Graphic reconstruction by a visually impaired individual. Figure 3: Graphic reconstruction by a totally blind individual. Krygier (1994) outlined auditory cartography, and Blattner et al. (1994) have worked on the sonic enhancement of two-dimensional graphic displays. Clearly there is a need for visually impaired people to be active participants in the research process, and for the process to be user-led with frequent validation. Conclusion: This research is highly relevant and has</Paragraph>
      <Paragraph position="3"> implications beyond the blind community that it is targeted at. New internet developments offer great potential. The internet is widely used, commonplace, and rapidly expanding, and can potentially distribute information from anywhere to anywhere. Protocols such as VRML and HTML offer good approaches and techniques by which a non-expert can build information that is readily accessible to blind and visually impaired people. This novel audio-tactile approach offers a dynamic, flexible, low cost medium for the presentation of spatial information.</Paragraph>
    </Section>
  </Section>
  <Section position="4" start_page="99" end_page="99" type="metho">
    <SectionTitle>
4 Future research
</SectionTitle>
    <Paragraph position="0"> There is a need for future research to address the further development and use of new interface technologies such as voice recognition, touch screens and tactile displays. Probably the most pressing need is to improve the user interface, as this is the largest barrier to successful and meaningful interactions with representations of spatial information. There have been several novel and interesting approaches that require further investigation: a vibro-tactile mouse which registers the mouse's position over a desired spatial object on a map (Nissen, 1997), tonal interfaces for computer interaction (Alty, 1996), and 'The Voice', which can convert a two-dimensional picture, map or representation into a 'tonal soundscape' (Meijers, 1993, 1997). Much of this research could be directed at conveying representations of the real world (maps) to blind people in order to develop fully functional non-visual GIS systems. Further research is needed on the sonification of maps and graphics. Because this audio-tactile mapping system</Paragraph>
    <Paragraph position="1"> resides within the protocols of the World Wide Web, the maps can be accessed audio-visually by sighted people using a conventional mouse. With the addition of the touch pad a partially or totally blind user is able to remotely access the content of the auditory</Paragraph>
    <Paragraph position="2"> maps from any computer with an internet connection. By adopting a 'design for all' approach the spectrum of people able to access map and graphic information is increased to</Paragraph>
    <Paragraph position="3"> include not only people with limited vision, but also potentially children, the elderly and people with learning disabilities.</Paragraph>
    <Paragraph position="4"> Visually impaired people's need for spatial information is greater than that of their sighted</Paragraph>
    <Paragraph position="5"> counterparts, as they are unable to (fully) perceive the environment through vision. This lack of visual perception severely limits independent travel. The computing community is in a unique position to address this need and improve the quality of life for people with visual impairments by increasing the capacity for independent travel and education through mobility and learning aids.</Paragraph>
    <Paragraph position="6"> Acknowledgements: Our thanks go to all the participants in Belfast, Northern Ireland who offered their time to participate in the pilot study.</Paragraph>
  </Section>
</Paper>