Vision Res. 1993 Mar;33(4):553-69.

Theta motion: a paradoxical stimulus to explore higher order motion extraction.

Author information

Max-Planck-Institut für Biologische Kybernetik, Tübingen, Germany.


Apparent motion stimuli of increasing complexity have been applied to analyse the mechanisms underlying visual motion perception. The present paper investigates how motion detectors respond to three classes of stimuli realized as random-dot kinematograms. (i) In the most conventional stimuli, Fourier motion, a group of dots is displaced coherently within a random-dot pattern. (ii) In drift-balanced motion stimuli, a bar made of static random dots is shifted in front of another random-dot pattern. (iii) In the novel class of stimuli, theta motion, an object that is defined exclusively by dot motion in one direction itself moves in the opposite direction. Psychophysical experiments show that human observers perceive the direction of object motion in all three classes of stimuli. Simple motion detectors, however, extract the direction of object motion only for Fourier stimuli and, if a nonlinear preprocessing stage is assumed, for drift-balanced stimuli. None of the model alternatives discussed so far detects the object in a theta stimulus; they detect only the moving dots, as illustrated by a combinatorial analysis using a simplified correlation-type motion detector that operates on a discrete time scale and takes only discrete values. To account for the detection of theta motion, a model consisting of two hierarchical layers of motion detectors is developed and simulated under the conditions used in the psychophysical experiments. The perception of theta motion and the two-layer model are discussed in relation to psychophysical data and theoretical considerations from the literature, in an attempt to incorporate the proposed two-layer model into a general scheme of visual motion processing.
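The contrast between a single detector layer (which sees only the dots) and a hierarchical two-layer scheme (which recovers the object) can be illustrated with a toy simulation. The sketch below is a hedged illustration of the general idea, not the paper's actual model: it uses a 1-D theta stimulus on a blank background with a periodic dot texture, a first layer of simplified correlation-type detectors, and a second, identical detector layer applied to the rectified and spatially pooled first-layer output. All sizes, names, and the pooling step are illustrative assumptions.

```python
# Toy two-layer motion-detector sketch for a 1-D theta-motion stimulus.
# Illustrative only: parameters and the pooling stage are assumptions,
# not taken from the paper.

INNER = [1, 1, 0, 0]   # periodic dot texture inside the object (period 4)
WIDTH = 64             # stimulus width in pixels
OBJ_LEN = 20           # object length, a multiple of the texture period
P0 = 10                # object position in the first frame
T = 20                 # number of frames

def make_theta_frames():
    """Theta-motion kinematogram on a blank background: the object window
    moves one pixel RIGHT per frame while the dots inside it move one
    pixel LEFT per frame."""
    frames = []
    for t in range(T):
        left = P0 + t                       # object drifts rightward
        frame = [0] * WIDTH
        for x in range(left, left + OBJ_LEN):
            frame[x] = INNER[(x + t) % 4]   # texture drifts leftward
        frames.append(frame)
    return frames

def emd_map(prev, curr):
    """Simplified correlation-type detector at each position: correlate the
    delayed signal at x with the current signal at x+1, minus the mirror
    term.  Positive = rightward local motion, negative = leftward."""
    return [prev[x] * curr[x + 1] - prev[x + 1] * curr[x]
            for x in range(len(prev) - 1)]

def pooled_activity(motion, width=4):
    """Rectify the first-layer output and pool it over a small neighbourhood,
    so the second layer sees WHERE motion occurs, not in which direction."""
    act = [abs(v) for v in motion]
    return [sum(act[x:x + width]) for x in range(len(act) - width + 1)]

frames = make_theta_frames()
layer1 = [emd_map(frames[t], frames[t + 1]) for t in range(T - 1)]

# One detector layer sees only the dots: summed output is negative (leftward).
dot_signal = sum(sum(m) for m in layer1)

# The second layer applies the same detectors to successive pooled activity
# profiles and recovers the object direction: summed output is positive.
profiles = [pooled_activity(m) for m in layer1]
obj_signal = sum(sum(emd_map(profiles[k], profiles[k + 1]))
                 for k in range(len(profiles) - 1))

print(dot_signal, obj_signal)
```

Run on this stimulus, the first-layer sum is negative (the dots' leftward motion) while the second-layer sum is positive (the object's rightward motion), reproducing in miniature the dissociation the abstract describes; the pooling stage stands in for whatever spatial integration the full model uses between its two layers.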

[Indexed for MEDLINE]
