Introduction

The visually impaired rely on non-visual senses, primarily hearing, to locate and identify objects in both their immediate and distant environment. Although all of the senses can convey information about an object (e.g. its texture, temperature, and size), only the auditory system provides significant distance cues, which are coded primarily through intensity. The Sonic Pathfinder [2] and SonicGuide [3] are examples of the many Electronic Travel Aids (ETAs) that sonify range information, allowing the visually impaired user to detect and avoid objects.

To determine the distance to an object, most ETAs emit an ultrasonic pulse and measure the time taken for the echo to return. Owing to the wide panoramic field of view, short range, and reflection artifacts inherent in sonar, sonar-based devices are limited to localising objects within an immediate and uncluttered environment [1]. They fail to provide the resolution and artifact-free range data necessary for the perception of 3D space. Unlike sonar, laser ranging offers a greater range, and its narrower beam and shorter wavelength combine to detect the finer detail necessary for shape and pattern perception. A laser ranging device has been used to provide range measurements for an autonomous robot [4].
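The time-of-flight principle described above can be sketched in a few lines. This is an illustrative example, not part of any ETA's actual firmware; the function name and the assumed speed of sound are our own.

```python
# Illustrative time-of-flight ranging: a sonar ETA emits a pulse and
# times the echo. The pulse travels to the object and back, so the
# one-way distance is half the total path length.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed constant)

def sonar_distance(round_trip_s: float) -> float:
    """Estimate the distance (in metres) to the reflecting object
    from the measured round-trip echo time (in seconds)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# Example: a 10 ms echo corresponds to roughly 1.7 m.
print(round(sonar_distance(0.010), 3))  # 1.715
```

Laser ranging applies the same round-trip principle with the speed of light, which is what permits the greater range and finer detail noted above.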

Rather than object detection, for which sonar is better suited, we chose to examine the sonification of infrared range measurements in order to detect shapes and patterns (e.g. corners, stairs, and depth discontinuities) in the user's environment.

The ultimate goal of our work is to develop a device that will enable the perception of 3D space by the visually impaired user.