This paper presents the real-time vision system developed by Pavia University within the ENEA R.A.S. (Surface Antarctic Robot) project for the automatic driving of an intelligent snowcat that follows the traces left by preceding snowcats. A camera acquires images of the scene in real time; the image sequence is analyzed by a computer vision system that identifies the traces and produces a high-level description of the scene. An optional additional representation, in which black markers are superimposed onto the original acquired image, is transmitted to a human supervisor located off-board. The low-level stage of the image processing searches for the patches of snow disturbed by the motion of preceding snowcats; these patches exhibit high variance and are generally darker than the rest of the scene. The algorithm applies a modified variance measure and morphological operators for this low-level analysis. The result of this first stage is a set of disjoint clusters representing candidate traces, which are then labelled and selected by the subsequent high-level processing. Although the images used in this first phase come from a non-stabilized camera, the rate of correct detection is about 95%.
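The low-level detection step described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual algorithm: a plain sliding-window variance stands in for the paper's (unspecified) modified variance, a simple binary opening stands in for its morphological operators, and the threshold values are invented for the synthetic example.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_variance(img, win=5):
    """Per-pixel variance over a win x win neighbourhood
    (edge-padded so the output matches the input shape)."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    return sliding_window_view(padded, (win, win)).var(axis=(-2, -1))

def opening(mask, win=3):
    """Binary morphological opening (erosion then dilation) with a
    square structuring element, to remove isolated noise pixels."""
    pad = win // 2
    eroded = sliding_window_view(
        np.pad(mask, pad, constant_values=False), (win, win)
    ).all(axis=(-2, -1))
    return sliding_window_view(
        np.pad(eroded, pad, constant_values=False), (win, win)
    ).any(axis=(-2, -1))

def detect_traces(img, var_thresh=50.0, dark_thresh=150.0, win=5):
    """Mark pixels that are both highly textured and darker than the
    surrounding snow, then clean the mask with a morphological opening.
    Thresholds here are illustrative, not values from the paper."""
    candidate = (local_variance(img, win) > var_thresh) & (img < dark_thresh)
    return opening(candidate)

# Synthetic example: bright uniform snow with a dark, noisy trace strip.
rng = np.random.default_rng(0)
img = np.full((40, 40), 200.0)
img[15:25, :] = 80.0 + rng.uniform(-30, 30, size=(10, 40))
mask = detect_traces(img)          # True inside the strip, False on clean snow
```

Connected regions of the resulting mask would then form the disjoint clusters that the high-level stage labels and selects.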