Biologically-inspired visual guidance of robot locomotion

[See Simon K. Rushton, Jia Wen & Robert S. Allison. Egocentric direction and the visual guidance of robot locomotion: background, theory and implementation. BMCV (forthcoming).]
 

Overview

Drawing on the recent suggestion [1] that humans rely on perceived egocentric direction, rather than optic flow, to guide locomotion, we implemented a robot guidance system.

The system draws on ideas outlined elsewhere [2] to guide interception of static and moving targets.

For a brief overview of some of the experimental work that underpins this project, please see Simon's research page.

The first assumption of the egocentric model is that approach to a target is based upon maintaining a constant egocentric direction. Simply put, you can intercept a target if on each step you check whether the target is at its previous egocentric direction, and if it is not, turn so as to restore it. Provided that the direction at which you keep the target is less than 90 degrees (measured relative to your locomotor axis), you will reach your target. The path you take is an equiangular spiral. Illustrated below are a family of constant-eccentricity trajectories (plan views) that intercept (1) a static target; (2) a target moving at constant velocity; (3) an accelerating target. A simulation sketch follows the figures.
Image1.gif: interception of a static target (plan view)
Image2.gif: interception of a target moving at constant velocity (plan view)
Image3.gif: interception of an accelerating target (plan view)
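
As a concrete illustration, here is a minimal MATLAB sketch of the strategy (illustrative only, not our implementation; positions, step length and stopping radius are arbitrary):

    % Steer so the target stays at a fixed egocentric direction.
    target  = [10; 5];      % target position (m)
    pos     = [0; 0];       % robot position (m)
    heading = 0;            % robot heading (rad)
    speed   = 0.1;          % distance covered per step (m)
    % egocentric direction at which the target will be held
    beta = atan2(target(2)-pos(2), target(1)-pos(1)) - heading;
    path = pos;
    while norm(target - pos) > 0.2
        % current egocentric direction of the target
        bearing = atan2(target(2)-pos(2), target(1)-pos(1)) - heading;
        % turn so the target returns to its original egocentric direction
        heading = heading + (bearing - beta);
        % step forward along the new locomotor axis
        pos  = pos + speed * [cos(heading); sin(heading)];
        path = [path, pos];
    end
    plot(path(1,:), path(2,:), '-', target(1), target(2), '*');
    axis equal   % the plotted path is an equiangular spiral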

Sometimes a system will not be calibrated (e.g. it does not know whether the target is at 15 degrees or -20 degrees). One way to deal with this is to use target drift [3]. The image below shows the use of target drift and over-compensation for the guidance and calibration of an uncalibrated robot; a sketch of the idea follows the image.

Image4.gif: guidance and calibration of an uncalibrated robot using target drift
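
A minimal MATLAB sketch of drift-nulling guidance (illustrative only; the sensor bias, the gain k and the other values are arbitrary). Because the unknown bias is constant, it cancels in the frame-to-frame drift, so the strategy works uncalibrated; with k > 1 the robot ends up heading straight at the target, at which point the residual measured bearing estimates the bias:

    target  = [10; 5];
    pos     = [0; 0];
    heading = 0.3;           % initial heading (rad)
    bias    = -0.35;         % unknown calibration offset of the bearing sensor (rad)
    speed   = 0.1;
    k       = 2;             % drift gain; k > 1 gives over-compensation
    measure = @(p, h) atan2(target(2)-p(2), target(1)-p(1)) - h + bias;
    prev = measure(pos, heading);
    while norm(target - pos) > 0.2
        pos   = pos + speed * [cos(heading); sin(heading)];
        curr  = measure(pos, heading);
        drift = curr - prev;            % target drift since the last step (bias cancels)
        heading = heading + k * drift;  % turn to null (and over-compensate) the drift
        prev  = measure(pos, heading);
    end
    bias_estimate = prev   % ~ -0.35: measured bearing once the robot heads at the target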

We also added an obstacle avoidance system that uses basic visual variables such as time-to-contact (TTC) [4, 5] and trajectory [6, 7]. Selection and optimisation of the calculation of these variables is based upon recent experimental findings and computational models [8]. A sketch of the size-based TTC estimate is given below.
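
For reference, the classic monocular TTC estimate is tau = theta / (d theta / dt), the obstacle's angular size over its rate of expansion [4]. A minimal MATLAB sketch using two successive frames (our system may weight several cues along the lines of [8]; this shows only the size-based estimate):

    % Time-to-contact (tau) from angular expansion, small-angle form [4].
    % theta1, theta2: angular size of the obstacle (rad) in two frames dt apart (s).
    function ttc = tau_from_expansion(theta1, theta2, dt)
        theta_dot = (theta2 - theta1) / dt;   % rate of angular expansion (rad/s)
        ttc = theta2 / theta_dot;             % seconds to contact at current closing speed
    end

For example, an obstacle growing from 0.10 rad to 0.11 rad over 0.1 s gives tau_from_expansion(0.10, 0.11, 0.1) = 1.1 s.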

A simple control law, derived from work on human interception [9] and body-scaled parameters [10], can be used to produce successful avoidance of static and moving obstacles; one plausible form is sketched below.
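
One plausible form of such a law (a sketch under stated assumptions, not the published control law; the gains and the body-scaled margin are illustrative) turns toward the target's egocentric direction and adds a repulsive turn away from any obstacle whose TTC falls below a body-scaled threshold:

    % d_heading: commanded change of heading (rad) for one control step.
    % target_bearing: egocentric direction of the target (rad);
    % obs_bearing, obs_ttc: per-obstacle egocentric direction (rad) and TTC (s).
    function d_heading = steer(target_bearing, obs_bearing, obs_ttc, body_radius, speed)
        k_target   = 0.5;                        % attraction gain (illustrative)
        ttc_margin = 3 * body_radius / speed;    % body-scaled TTC threshold (assumed form)
        d_heading  = k_target * target_bearing;  % close in on the target
        for i = 1:numel(obs_ttc)
            if obs_ttc(i) > 0 && obs_ttc(i) < ttc_margin
                % turn away from the obstacle, harder the sooner the contact
                d_heading = d_heading - sign(obs_bearing(i)) / obs_ttc(i);
            end
        end
    end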
 

Examples

Note: the examples below are also available as .avi movie files.
Cylinder_Obs_15_planned_view.bmp: 15 cylindrical obstacles, plan view
Cylinder_Obs_15_camera_view.bmp: 15 cylindrical obstacles, camera view

Moving obstacles

Cylinder_Moving_Obs_8_planned_view.bmp: 8 moving cylindrical obstacles, plan view
Cylinder_Moving_Obs_8_camera_view.bmp: 8 moving cylindrical obstacles, camera view

Different obstacle shapes: triangular and planar

Triangle_Obs_15_planned_view.bmp: 15 triangular obstacles, plan view
Triangle_Obs_15_camera_view.bmp: 15 triangular obstacles, camera view

Planar_Obs_15_planned_view.bmp: 15 planar obstacles, plan view
Planar_Obs_15_camera_view.bmp: 15 planar obstacles, camera view

Different sizes

Obs_20_size_1_planned_view.bmp: 20 obstacles, size 1, plan view
Obs_20_size_1_camera_view.bmp: 20 obstacles, size 1, camera view

Obs_8_size_3_planned_view.bmp: 8 obstacles, size 3, plan view
Obs_8_size_3_camera_view.bmp: 8 obstacles, size 3, camera view

Obs_8_size_4_planned_view.bmp: 8 obstacles, size 4, plan view
Obs_8_size_4_camera_view.bmp: 8 obstacles, size 4, camera view

Interface

Our current system includes a simple-to-use testing interface, an object-oriented Matlab implementation, and support for batch processing of trials for performance evaluation; a sketch of such a batch run follows the screenshot below.
  

GUIinterface.TIF: screenshot of the GUI
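
A sketch of what a batch run looks like (run_trial is a hypothetical stand-in for the simulator entry point; the condition grid mirrors the examples above):

    % Evaluate interception success over a grid of obstacle conditions.
    n_obstacles = [8 15 20];
    obs_sizes   = [1 3 4];
    success     = zeros(numel(n_obstacles), numel(obs_sizes));
    for i = 1:numel(n_obstacles)
        for j = 1:numel(obs_sizes)
            % run_trial (hypothetical) returns 1 if the target was reached
            success(i, j) = run_trial(n_obstacles(i), obs_sizes(j));
        end
    end
    disp(success)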

We also have a partial implementation on a Nomad robot (lack of space to run the robot being the constraining factor).


Photograph of the Nomad robot

Image5.gif: constant-angle trajectory

This work is in development. Further information is available on request. The work will be presented at BMCV in November. Others [11, 12] have done work on the same problem using different approaches. We believe our model has significant advantages.
 

Future projects

Computational

Human trajectory and eye-movements


* Jia Wen, 4080 project, Winter term

If you are interested in any of these projects, please email us.
 
 

References

[1] Rushton, S.K., Harris, J.M., Lloyd, M.L. & Wann, J.P. (1998). Guidance of locomotion on foot uses perceived target location rather than optic flow. Current Biology, 8, 1191-1194.
[2] Rushton, S.K. & Harris, J.M. (submitted). The utility of not changing direction and the visual guidance of locomotion.
[3] Llewellyn, K.R. (1971). Visual guidance of locomotion. Journal of Experimental Psychology, 91, 245-261.
[4] Lee, D.N. (1976). A theory of visual control of braking based on information about time-to-collision. Perception, 5, 437-459.
[5] Regan, D. & Hamstra, S. (1993). Dissociation of discrimination thresholds for time to contact and for rate of angular expansion. Vision Research, 33, 447-462.
[6] Bootsma, R.J. (1991). Predictive information and the control of action: what you see is what you get. International Journal of Sports Psychology, 22, 271-278.
[7] Regan, D. (1993). Binocular correlates of the direction of motion in depth. Vision Research, 33, 2359-2360.
[8] Rushton, S.K. & Wann, J.P. (1999). Weighted combination of size and disparity: a computational model for timing a ball catch. Nature Neuroscience, 2, 186-190.
[9] Peper, L., Bootsma, R.J., Mestre, D.R. & Bakker, F.C. (1994). Catching balls: how to get the hand to the right place at the right time. Journal of Experimental Psychology: Human Perception and Performance, 20, 591-612.
[10] Warren, W.H. (1984). Perceiving affordances: visual guidance of stair climbing. Journal of Experimental Psychology: Human Perception and Performance, 10, 683-703.
[11] Fajen, B.R., Warren, W.H., Temizer, S. & Kaelbling, L.P. (in press). A dynamical model of visually-guided steering, obstacle avoidance, and route selection. International Journal of Computer Vision.
[12] Khatib, O. (1986). Real-time obstacle avoidance for manipulators and mobile robots. International Journal of Robotics Research, 5, 90-99.
 
 

Acknowledgements

This research was supported in part by funds from the Natural Sciences and Engineering Research Council of Canada and Nissan Technical Center North America Inc.