Ideas for Future Continuation of the Project

Creating a Graphical User Interface

Currently, none of the executables contains any user interface (graphical, command-line arguments, key presses, etc.).  In order to alter any of the program parameters (e.g. the mode of operation, whether to alter velocity in conjunction with frequency, the type of mapping used, etc.), one must edit the source code and re-compile.  To avoid this dependence on the code, a Graphical User Interface (GUI) should be implemented that allows the user to change any adjustable parameter at run time.  For example, while scanning in the proportional mode, the user could easily switch to the derivative mode to obtain more information about a particular object or location, and then switch back to the proportional mode.
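One way the run-time parameters could be organized is sketched below. This is only an illustration under assumed names; the actual parameter names and modes would come from the existing source code, and a real GUI toolkit would drive the toggling:

```python
# Hypothetical run-time parameter store; all names are illustrative only.
params = {
    "mode": "proportional",   # or "derivative"
    "alter_velocity": True,   # whether velocity changes with frequency
    "mapping": "linear",      # type of distance-to-sound mapping
    "running": True,          # False = "frozen" (paused) execution
}

def toggle_mode(p):
    """Switch between proportional and derivative scanning at run time."""
    p["mode"] = "derivative" if p["mode"] == "proportional" else "proportional"
    return p["mode"]

def freeze(p):
    """Pause program execution without disconnecting the power source."""
    p["running"] = False
```

A GUI would simply call functions like these from its event handlers, so no recompilation is needed to change behaviour.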

Currently the program must be terminated by unplugging the LRF from its power source.  The interface should also allow the user to start, stop, or "freeze" program execution, eliminating the need to disconnect the power source.

A Few Possible Options the GUI May Contain

Experimentation

Although formal experimentation was planned for this summer, we once again did not have enough time.  We have a portable Mac (PowerBook) with working versions of all the software developed this summer installed, allowing the project to become "mobile" (e.g. visually impaired or blindfolded subjects may use the device in many different environmental settings).  Formal experiments with subjects must be designed and carried out to determine the overall effectiveness of the device.

Use the New Laser Range Finder (LRF)

The accuracy of the current LRF is limited to roughly +/- 10 cm, depending on the rate at which range measurements are taken.  As a result, the number of range measurements per second has been restricted to eight.  The new LRF we will be obtaining will allow for much greater accuracy (+/- 2 cm at a rate of 20 readings per second).  Experiments should be conducted that take advantage of this higher measurement rate and accuracy.

Echo as a Distance Cue

An idea for future development of this project would be to include echo in the notes output through QuickTime.

To do this, one must calculate the time it would take the sound wave to travel to the target and back, and output tones of decaying volume spaced apart by that amount of time.  The model of volume decay we have worked on so far is:

Volume = Initial_Volume * K^(-note * time)

where K is a constant controlling the rate of decay (1.7 was a fairly suitable value), note is the MIDI index of the note being played, and time is the time elapsed in seconds since the first note was played.  The model looks something like this:
 
[Figure 1: Plot of sound volume decay]
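The echo timing and decay calculation described above can be sketched as follows. This is a minimal illustration, not the project's code: the function names are invented, and the speed of sound (about 343 m/s in air at room temperature) is an assumed constant not stated in the original text:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed constant)

def echo_delay(distance_m):
    """Round-trip travel time of a sound wave to the target and back."""
    return 2.0 * distance_m / SPEED_OF_SOUND

def echo_volume(initial_volume, note, time_s, k=1.7):
    """Decaying echo volume: Volume = Initial_Volume * K^(-note * time).

    note is the MIDI index of the note being played; k = 1.7 was the
    fairly suitable decay constant mentioned above.
    """
    return initial_volume * k ** (-note * time_s)
```

Successive echoes would then be scheduled `echo_delay(d)` seconds apart, each played at the volume given by `echo_volume` for its elapsed time.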

Echo would be a very good aid in auditory distance perception, since humans (as well as other animals) use echo and reverberation for spatial perception.