NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Ackerman S. Discovering the Brain. Washington (DC): National Academies Press (US); 1992.

7. From Perception to Attention

Although a fair amount is now known about the densely interconnected systems of information processing that reside inside our heads, the coordinated approach of neuroscience still represents a new arrival on the research front. According to David Hubel, professor of neurobiology at Harvard Medical School, the present state of knowledge about some parts of the brain is like that of a visitor from, say, another civilization who sets out to understand everything there is to know about a television set. Such a person may have learned all about transmitters, capacitors, conductors, and properties of resistance—but he is still at a loss, because not only is he unaware of large areas of the circuitry, he still does not know what the television set as a whole is used for. In like manner, basic research on the brain has provided an understanding of how signals are transmitted at the cellular level and identified several types of synapses, as well as an increasing number of neurotransmitters. Yet at a higher level of complexity, much is still unknown about the functions of the central nervous system.

The systems of perception—vision, hearing, taste, smell, and touch—stand as a piquant challenge to researchers in this regard, because they each start from such a well-defined physical basis but then follow exceedingly intricate pathways, bringing in new forms of information at each step, and arriving finally in the realm of subjective experience. Thus the workings of the senses pass at some point beyond the reach of the experimental scientist, because the results can never be reproduced exactly; once information from the brain's association cortex is brought in, the body is no longer simply taking part in some reaction predictable from the laws of physics. Rather, the mind is perceiving something, and the perception is uniquely shaped by that perceiving mind, at that moment.

Nevertheless, scientific research can do a great deal to unravel the pathways of perception, clarifying how and along what lines the information is organized. Vision, the best understood of the perceptive systems, can be explained with confidence well beyond the action of light on the rods and cones of the retina. And some of the mechanisms of this system turn out to be active in other aspects of our conscious behavior as well—namely, in spatial perception and the specific state of mind known as attention.

FIGURE 7.1. Explaining the workings of our five senses is a challenge that has been taken up from time to time by natural historians or philosophers. For example, in the seventeenth century, René Descartes hypothesized that we are able to see because …

Pathways Of Information In The Visual System

Within the visual system, researchers seek to explain our seamless perception of a three-dimensional surround that contains color, movement, and shape, all assembled from the action of light on our two eyes. What takes place in the rest of the brain, beyond the 125 million rods and cones of each retina, to transmit nerve impulses and organize them into useful messages, recognizable forms, and meaningful scenes?

A basic organizing principle of the visual system is that of a hierarchy of information: a relatively large number of specialized cells at each stage supply information to a smaller number of cells at the next stage, which in turn have their own specialized function. The retinal rods are most attuned to dim light, the cones to bright light (or color). Both rods and cones transmit impulses to another layer of the retina, which sends signals through the third layer to the many neuronal fibers that make up the optic nerve.

Each cell in the third layer that supplies the optic nerve already represents the confluence of signals from thousands of rods and cones over about 1 square millimeter of the retina. The square millimeter thus covered is called the receptive field of that cell. The optic nerve in turn supplies a large amount of pooled information to the lateral geniculate nucleus, which then relays signals to the primary visual cortex.

It is in the primary visual cortex, located in the occipital lobes at the back of the head, that the brain first begins to assemble something that looks like an image to our conscious awareness. At the same time, this area of the cerebral cortex sends many signals to neighboring areas, each of which projects to several others. Without this set of lateral associations, in addition to the hierarchical arrangement, we would not be able to name what we see, for example, or to know whether we had ever seen it before.

This pathway of information, from light beam to nerve impulse to vision, extends for several centimeters inside the human skull and is traversed in thousandths of a second. Along the way, it passes through eight to ten branching stages that consolidate information from the preceding layer into the next one. At the optic chiasm, it branches in another way: half the nerve fibers from each retina cross the brain's midline and lead to the visual cortex of the other hemisphere. The result is that information from the right visual field (gathered by the left half of each eye) is projected to the left visual cortex, and information from the left visual field (the right half of each retina) is projected to the right visual cortex. This arrangement may at first seem overly elaborate, but in fact it offers two advantages. Each half of the brain is responsible for vision on the same side of the body as the hand, arm, and leg for which it controls motor functions; and since each eye provides some signals for the visual cortex in each hemisphere, blindness in one eye, although a serious deprivation, does not mean the loss of the entire left or right visual field.

A curious fact, as yet unexplained, is that at many of these branching stages, a small number of cells project their axons backward, to the preceding stage. The function of this back projection, which allows cells to transmit signals back to the layer from which they came, has puzzled researchers for some time. Could this be some sort of cellular mechanism for checking the "accuracy" of signals as each stage feeds into the next? At present, there are no major hypotheses directing investigations of the matter. But in the other direction—the main flow of signals from the retina to the visual cortex—scientific understanding has improved steadily over the past few decades.

It is quite clear, for example, that the retinal ganglion cells, whose axons are the fibers that make up the optic nerve, sit at a crucial junction in the pathway of information. The retinal ganglion cells act as gatekeepers: their inhibition or excitation determines which signals are sent through the optic nerve, toward the cortex.

The patterns of responsiveness among retinal ganglion cells can vary strikingly, depending on which aspect of vision is being handled. One major pattern is that of center and surrounding area: the retinal cell is stimulated when a small central portion of its receptive field receives light but is inhibited from sending signals when all the area surrounding the center receives light. The reverse may also be true: the center of the receptive field may be inhibited by light, and the surrounding area stimulated by it. Either way, the center-surround pattern responds to contrast; the ganglion cell measures not how much light, in absolute terms, is striking the receptive field but how great the difference is between the light in the center and that in the surround. An average amount of light spread out evenly over the whole receptive field thus conveys very little information to this type of cell, which by its nature is much better suited to register detail and the edges of objects.
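The center-surround logic can be captured in a few lines of Python. This is only an illustrative sketch: the cell model and the light values are invented for the example, not physiological measurements.

```python
def on_center_response(center, surround):
    """Toy on-center/off-surround ganglion cell: its output tracks the
    contrast between the light on the receptive-field center and the
    light on the surround, not the absolute light level. Arbitrary units."""
    return center - surround

# Even illumination over the whole receptive field conveys almost no signal.
print(on_center_response(0.5, 0.5))    # 0.0
# A bright spot or edge on a darker surround produces a strong signal.
print(on_center_response(1.0, 0.25))   # 0.75
# The off-center variant is simply the same response with the sign reversed.
print(-on_center_response(1.0, 0.25))  # -0.75
```

Because the cell reports only the difference, raising both inputs by the same amount changes nothing, which is why this design favors detail and edges over overall brightness.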

Another major pattern is instrumental in explaining color vision. Some retinal ganglion cells receive their signals from a mixture of cones, a portion of which are sensitive to light of long wavelengths (that is, red), while others are more sensitive to short wavelengths (blue and green). If the synapses from the long-wavelength cones are excitatory and those from the short-wavelength cones are inhibitory, the result for a ganglion cell would be that reds excite it, blues and greens inhibit it, and white light (which contains a mixture of all the wavelengths) leaves it largely unaffected. This pathway is clearly specialized for one type of information—color—as distinct from the pathway described in the paragraph above, which could be called sensitive to "shape" or "outline."
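The same kind of sketch, with the sign of each cone group's contribution standing in for whether its synapses are excitatory or inhibitory, illustrates the color-opponent cell just described. Again, the numbers are invented, arbitrary units.

```python
def opponent_response(long_cones, short_cones):
    """Toy color-opponent ganglion cell: excitatory input from
    long-wavelength (red) cones, inhibitory input from
    short-wavelength (blue/green) cones."""
    return long_cones - short_cones

print(opponent_response(1.0, 0.0))  # 1.0  -- red light excites the cell
print(opponent_response(0.0, 1.0))  # -1.0 -- blue/green light inhibits it
print(opponent_response(1.0, 1.0))  # 0.0  -- white light leaves it unaffected
```

Note the structural similarity to the center-surround cell: in both cases a single subtraction makes the cell a detector of one specific difference, spatial contrast in one pathway, wavelength contrast in the other.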

But the pathway of information for "shape" is not all that straightforward. It includes two subsystems, which correspond to two distinct types of cell in the lateral geniculate (the relay center from the optic nerve to the visual cortex). In each square millimeter of the cortex itself are approximately 100,000 cells—some 80 percent of which do not work by the center-surround design but instead respond best to lines. Contrast is still important: dark lines on a light background, or light lines on a dark background, are the most easily perceived. But another element enters the picture: the orientation of the line, whether it is vertical, horizontal, or any number of degrees in between. Small groups of cells each respond best to their own favored orientation, which is mapped in great detail in the cortex: every 50 microns (millionths of a meter) or so, the orientation favored by the cells rotates through about 10 degrees. (In everyday terms, 10 degrees of rotation is equal to one-third the distance between the "1" and "2" on a clock face.) At that rate, the full 180-degree range of distinct line orientations—a line tilted by 190 degrees is indistinguishable from one tilted by 10—is covered within each millimeter or so of the visual cortex.
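Taking the stated figures at face value, a couple of lines of arithmetic show the scale of this orientation map. (The perfectly regular progression of 10 degrees per 50 microns is an idealization of what is, in the real cortex, a much patchier layout.)

```python
# Back-of-the-envelope check of the orientation map's scale.
shift_deg = 10   # degrees of preferred-orientation shift per step
step_um = 50     # cortical distance per step, in microns

deg_per_mm = shift_deg / step_um * 1000
print(deg_per_mm)        # 200.0 degrees of rotation per millimeter of cortex

# Distinct line orientations span only 180 degrees (a line tilted
# 190 degrees looks the same as one tilted 10 degrees), so one full
# cycle of orientations occupies:
print(180 / deg_per_mm)  # 0.9 mm of cortex
```
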

Seeing The Patterns In Visual Signals

How does the brain coordinate such a flood of information from the eyes? This is a job for the cortex—specifically, the intermediate layers of the primary visual cortex. (The cortex is made up of six layers of cells, as described in Chapter 2; those of most interest here are layers 2 and 4, the neuron-rich granular layers that make synaptic connections within the same localized region.) Signals from the optic nerve pass through the lateral geniculate to the intermediate layers of the cortex, where any given cell receives impulses from either the right or the left eye. Small groups of cells responsive to one eye or the other form a striped pattern in the cortex, which can be made visible by injecting one eye of an anesthetized animal with a radioactive amino acid, exposing it to light, and developing the emitted radiation as a photographic image. The stripes, like the cell groups that respond to a particular orientation of line, are about half a millimeter in diameter.

In the context of right-eye or left-eye responsiveness, the stripes in the cortex are known as ocular dominance columns—indicating a preference toward receiving signals from one eye or the other but not an absolute predetermination. In fact there is some flexibility in the development of this system, especially in the first few years of life. This "plasticity," as it is known in developmental terms, derives in part from the competition among excess numbers of synapses, dendrites, axons, and whole nerve cells described in Chapter 6. The net result for the visual system is that one eye may develop its circuitry more extensively than the other if it has received more stimulation during the crucial early period. David Hubel and his colleagues at the Harvard Medical School demonstrated this phenomenon in an experimental animal by suturing closed one of its eyes shortly after birth. When they opened the eye after several months and injected it with a radioactive substance, the stripes of cortex receiving signals from that eye appeared significantly narrower than normal; the stripes from the other eye were correspondingly wider.

The implications of this work have led to an important change in the treatment of babies born with strabismus (walleye or cross-eye). In such cases, the brain has difficulty coordinating information from the two eyes, and consequently the circuitry of one eye may compete ineffectively for synapses and cells in the visual cortex—even to the point of blindness through underdevelopment of its neural circuitry. In fact, strabismus is a common cause of childhood blindness in the United States. Treatment to prevent this—eye exercises, an eye patch, or corrective surgery to bring the eyes into alignment—is no longer delayed until the age of four, as had been traditional, but now takes place as early as possible, while the neural circuitry underlying vision is still developing.

The systems for ocular dominance and for preferred orientation are of course only two of the patterns of information processing that make up the visual system as a whole. Also embedded in the same portion of the cerebral cortex are systems for color vision, discussed briefly above, and for the perception of depth and movement. Moreover, as will be evident shortly, our sense of ourselves as three-dimensional beings and our orientation in space depend on nerve cells of the visual system. The compactness of information in this system is a marvel of biological engineering, and it overwhelms standard approaches to the investigation of physiology—even those that are extremely precise and that focus on highly localized functions. David Hubel, for example, feels that researchers eventually will need to observe the workings of single nerve cells and to identify the functions of each one, if neuroscience is ever to give a full account of any of the human sensory systems. The level of resolution required for such observations would be much higher than anything available today from the best PET scans or nuclear magnetic resonance imaging. For this area of research, the questions are already at least partly evident, and the technology is racing to catch up.

As mentioned earlier, the brain's system for vision is probably the best understood of the five sensory systems. The pathway of information can be traced from the external stimulus through the sensory organs and along the neural circuitry all the way to conscious perception, and most of the major steps can be sketched in at least lightly. Some features that have first been described for vision will probably be found in other systems as well; for instance, columns or stripes, similar to those of ocular dominance or of line orientation, clearly figure in other systems of the brain (as just one example, see Goldman-Rakic's account of cognition in Chapter 8). But the visual system also has unique properties—the occipital lobe, in which it is mainly located, even looks different from other tissues of the brain—and it continues to engage researchers in its own right. David Hubel, Margaret Livingstone, and their colleagues at Harvard Medical School are exploring the intricacies of an area of the visual cortex that adjoins the one on which they have worked for many years. What circuitry does it have in common with its neighbor? Why is it a distinct area—what does it do differently? Much remains to be discovered in the field of vision research.

The Brain's System For Spatial Perception

In addition to the neural circuitry that serves the five primary senses, the human brain has numerous other systems for making sense of external stimuli and regulating the body's ability to function in the world. Although they may produce no obvious perceptions, as does vision or the sense of smell, such systems are often highly sophisticated, drawing on several specialized areas of the brain. An example is the system for visual spatial perception, which most of us take for granted when it is working well. We walk or run, put out a hand to greet an approaching friend, even drive a car along a highway—all the while monitoring the spatial surround. No matter how much or how fast we change our position, we continue to perceive ourselves as being at the center of a space, and to gauge accordingly our spatial relations to other people and to objects in the surround. A measure of the system's sophistication is that this ongoing computation hardly ever requires conscious effort; only when illness or injury interferes with it may we become aware of this vital faculty.

As another measure of sophistication, the system integrates signals from the individual's own body structure and muscle groups, as well as from the sense of sight. The signals from these various sources do not all carry equal weight, however, as was learned from work done in the 1970s by A. Berthoz and his colleagues in Paris and by J. Lishman and D. Lee in Edinburgh. The experiment was simple: subjects walked on a treadmill while scenery was projected on the walls, in the field of their peripheral vision, as if it were moving forward faster than they were walking. Although the subjects' legs were indeed carrying them forward on the treadmill, they reported the sensation of walking backward. It seems that visual input overpowered the other signals, although the sensation from the muscle groups involved in walking did contribute to the brain's account of what was going on.

An experimental setting of this kind, in which different forms of information are deliberately set at odds, offers researchers a valuable glimpse of how the brain ordinarily coordinates them. Outside the laboratory, disorders of the spatial perception system take many forms; often they occur without any evident defect in vision or movement. They may appear to be disorders of attention or of "neglect," in a specialized medical sense: the patient ignores or disowns part of her own body, even going so far as to say something like, "Doctor, who put this arm in bed with me?" Striking as such behavior is, the body's ability to recover normal function is equally impressive; in many cases, the patient recovers near-normal functioning within a year.

The cause of such difficulties in the first place is usually an injury to the parietal lobe of the brain, the site of a variety of functions. In one hemisphere (usually, but not always, the left), the parietal lobe controls language. The other hemisphere is associated with such functions as the recognition of shapes and textures and other visual information that is hard to put into words; it also monitors the opposite side of the body and the external environment. A large assortment of disorders, from the severe case described above to more benign (though still distressing) forms, such as the inability to pour a liquid accurately into a glass, have traditionally been collected under the term "parietal lobe syndrome." But a more current view, held, for example, by Vernon Mountcastle and his research team at the Johns Hopkins Medical School, is that this is not a disorder of one discrete lobe of the brain but of a system. Mountcastle, who studies the brain's means of spatial perception, has gathered evidence from numerous research projects that the parietal lobe forms part of a complex distributed system by which the brain actually "constructs" reality. A person stores information about his or her own material dimensions and the three-dimensional surround in several regions of the brain, but the parietal lobe figures importantly in gaining access to that information and in bringing it together effectively.

That the parietal lobe is a major element in this system was demonstrated in another ingenious experiment, this time in the Milan laboratory of E. Bisiach. Here native Milanese citizens were presented with a thought experiment in which they were to imagine themselves standing in the Piazza del Duomo in Milan facing its venerable cathedral, the Duomo. When shown an overview of the buildings around the square, subjects with an injury of the right parietal lobe could not identify the buildings on the left side of the square—that part of their field of vision corresponding to the site of their injury—although these buildings had been familiar to them for years. But when Bisiach asked these subjects to imagine themselves on the opposite side of the square, so that right and left were reversed, they could now name the buildings they had previously not been able to identify—and they were unable to name those they had known before. An information system of such fluidity and sensitivity to differing conditions argues for complex circuitry between the parietal lobe and other areas of the brain. In fact, extraordinarily dense connections have been found between two areas of the parietal lobe and parts of the frontal lobe involved in mental representations, planning, and cognition (see Chapter 8).

Making Sense Of Movement

Studying this kind of perception is a subtle endeavor for the researcher, because although visual stimuli are involved, the point of interest is not the visual system itself but something more like peripheral vision, or spatial awareness. To investigate the "visual neurons of the parietal lobe," Vernon Mountcastle has worked with primates, whose brain structure and sense of sight are much like our own. One experimental approach is to seat the monkey in front of a screen, with an interesting sight projected in the center of the screen to fix the animal's attention. Meanwhile, lights or other displays are moved around in the monkey's peripheral vision to test for particular responses of the visual neurons in the parietal lobe. For more direct observations, it is possible to implant several microelectrodes in the parietal cortex so that each records the activity of a single nerve cell. (The two approaches complement each other because these precise, single-cell recordings are even more informative when examined in the context of observed behavior.)

With this combined approach, Mountcastle has identified several distinct classes of cells in the parietal lobe. One class is active when the animal fixes its attention on a visual target, although these cells do not respond indiscriminately to visual stimuli per se; another class is active when the animal "tracks" a moving target with its eyes; and the third class responds to visual stimuli as such. The neural circuitry between the visual system and the parietal lobe is quite complex and clearly indicates two-way exchange, rather than one area simply feeding into another. Visual neurons of the parietal lobe have their own unique properties: they are highly sensitive to motion, shifts in attention, and the angle of gaze. Unlike neurons of the visual system proper, however, they are rather insensitive to such details as the color, size, or orientation of a visual stimulus.

Electrical recordings from single cells have enabled Mountcastle and his colleagues to work out the pattern by which parietal visual neurons are organized—a pattern quite different from that of the visual cortex itself. In the parietal lobe, a visual neuron will respond, within its receptive field, to a light moving in any of several directions—depending on whether the light is moving toward or away from the center of the field. The center itself apparently is inhibitory for such a cell, giving an overall pattern that, in Mountcastle's words, resembles the outline of a volcano. As a light moves toward the center, the nerve cell increases its response, climbing a slope of excitation—until the light reaches the center and the cell's response drops steeply. Then, as the light continues to move and leaves the center behind, the cell's response starts at a high point again and descends a slope on the other side of the central pit.

This pattern of response appears to take account of the direction of motion in a very idiosyncratic way—it simply registers whether a stimulus is moving toward or away from the center of a particular cell's receptive field. How could such an arrangement possibly give rise to a general perception of the direction in which a visual stimulus is moving? Mountcastle proposes that an accurate signal can be derived even from these imprecise elements, through a simple summing up of the linear vectors. Although there is no way to test this model directly in experimental animals, it has been tested in computer simulations, so far with encouraging results. It appears likely that some form of linear vector summation in the parietal lobe is responsible for our ability to distinguish both the speed and the direction of a moving object—or of the space around us, if we are the moving object.
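Mountcastle's proposal of linear vector summation can be sketched as follows. The preferred directions and firing strengths here are invented for illustration; only the summation idea comes from the text.

```python
import math

def population_direction(cells):
    """Sum linear vectors across a population: each cell contributes a
    unit vector along its preferred direction, scaled by how strongly
    it is firing. The angle of the resultant vector is the population's
    estimate of the direction of motion."""
    x = sum(w * math.cos(math.radians(d)) for d, w in cells)
    y = sum(w * math.sin(math.radians(d)) for d, w in cells)
    return math.degrees(math.atan2(y, x)) % 360

# Four imprecise cells, individually scattered but roughly agreeing:
cells = [(30, 0.9), (60, 1.0), (45, 0.5), (120, 0.2)]
print(round(population_direction(cells)))  # 50 -- a sensible consensus
```

The point of the model is visible even in this toy version: no single cell reports the direction accurately, yet the sum of their imprecise contributions converges on a usable answer.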

Beyond the primary sensory areas of the cortex, research has begun to uncover powerful systems in the brain for organizing information in ways that make for an effective response to the environment. In "state control" systems, such as the angle of gaze or attention versus inattention, the "state" exerts an effect on all the information brought together by the system. For example, when we gaze alertly at the road in front of the steering wheel of a car, the parietal lobe's system for peripheral vision creates a "halo" of heightened sensitivity all around the center of our attention, which provides for safe driving.

Attention: A Subject Worth Looking At

Attention itself has long attracted the interest of those who seek to understand the workings of the brain. What are the essential elements of this hard-to-define state, and what is its underlying physical basis? The great advances in neural imaging of the past 20 years, together with new techniques for observing the living brain at work and refinements in the experimental use of the alert, behaving monkey, offer new ways of studying attention. These new methods have also enhanced the value of behavioral studies by revealing some of their physiological context.

In the view of Michael Posner, of the University of Oregon, psychologists (including Posner) have found it useful in recent years to apply some of the thinking and methods developed for exploration of the visual system to the study of attention.

Our current understanding of attention can be assessed under several headings, which are similar to the areas reviewed by David Hubel for visual perception: the anatomy of the brain structures that appear to be involved; the neural circuitry that makes possible the phenomenon of attention; various developments in the brain after birth that are required for attention; and pathologies, whether injury or disease, that interfere with attention.

The Anatomy Of Attention

Positron emission tomography has done much in recent years to change general ideas about the anatomy of mental functions. In particular, PET scans have shown rather distinct localization of the mental operations involved in a task such as "word processing." Under that general heading, many parts of the brain appear at first to be active, but depending on the specific kind of processing required, activity appears highly focused in one or two areas. PET images of changes in blood flow to discrete areas of the brain (indicative of changes in activity) make it clear that simply showing the experimental subject the written form of a word, and requiring no overt response, activates mainly the visual areas, in the occipital lobe. In PET imaging studies carried out by Posner and Marcus Raichle (discussed in part in Chapter 3), subjects were shown groups of letters that conformed to English rules of construction but did not form a word in English; these nonwords, as well as authentic English words, tended to activate a portion of the left occipital lobe that does not respond to mere strings of consonants or to strings of graphic forms that resemble letters.

This portion of the left occipital lobe is also known to be associated with the brain's system for attention. Patients with "parietal lobe syndrome," described earlier, show a particular effect of parietal injury in their attempts to process words or letter strings shown to them on a screen. If the lesion is in the right parietal lobe, they tend not to notice the first three or four letters of a nonword—that is, those at the left end of the string. But when the letters form a recognizable English word, patients can retrieve it, although they may have difficulty identifying particular letters on the side of the word opposite the site of their injury. Michael Posner, together with Steven Petersen of the Washington University School of Medicine in St. Louis, considers this evidence for a pathway from the parietal lobe (or what they call the "posterior attention system" because it is housed toward the back of the head) to the visual areas of the occipital lobe. (In some other forms of attention, such as planning or mental representation, a number of studies have shown that the frontal lobes play a more important role.) Computer models are being developed to explain how information may be exchanged between the visual system and the posterior attention system.

Despite all it can offer researchers in neuroscience, PET imaging is not ideal for studies of changeable mental states such as attention, because its temporal resolution is limited. It takes about 40 seconds to gather the information from which a PET image is made. Fluctuations of activity within that period will not register in the image; a shift of visual attention, which is accomplished in well under a second, would be invisible. Consequently, behavioral studies of patients with lesions in this area, as well as in a couple of other sites in the midbrain that appear specialized for visual attention, have helped to fill in the picture.

From this work has emerged the finding that a shift of visual attention entails at least two steps: first disengaging the attention from one spot and then bringing it to bear on another location. It appears that the parietal lobe is important in the first step and the midbrain is more active in the second.
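This two-step account can be sketched as a toy procedure. The division of labor between regions follows the text; everything else, including the location names, is invented for illustration.

```python
def shift_attention(focus, new_location):
    """Sketch of the two-step account of a shift of visual attention:
    the parietal lobe first disengages attention from its current
    focus; the midbrain then engages the new location."""
    steps = []
    if focus is not None:
        steps.append(f"parietal lobe: disengage from {focus}")
    steps.append(f"midbrain: engage {new_location}")
    return steps, new_location

steps, focus = shift_attention("left of the screen", "upper right")
for step in steps:
    print(step)
# parietal lobe: disengage from left of the screen
# midbrain: engage upper right
```

The decomposition matters clinically: a lesion can knock out the first step (disengaging) while leaving the second intact, which is one way the patient data described above distinguish the two regions' roles.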

Neural Circuitry Of An Attention System

The division of labor described above, with one region of the brain responsible for disengaging attention and another for refocusing it, suggests an important distinction between the source of an attentional effect and the site of its operation. To investigate this possibility, researchers have used electroencephalographic recordings from the scalp during a task in which subjects had their attention drawn to a particular location. The EEG clearly records increases in activity when the subject's attention reaches the target location, as well as slow-wave activity preceding this stage, which presumably indicates disengagement of attention from a previous location. Such methods, which are quite precise as to time and levels of activity, can be combined fruitfully with PET scans or magnetic resonance images to show both the anatomy of interest and time-sensitive readings of activity; researchers then have the means to trace and confirm the circuitry that is thought to be involved in visual attention tasks. Electrical recordings from single cells in alert monkeys, made while the animals' attention was drawn to particular locations, offer another form of information that can be compared with data from PET images and electrode recordings from the scalp in humans.

The posterior attention system apparently is specialized to respond to location in space rather than to other cues such as color or shape or motion. In an experiment in which subjects were asked to attend to these kinds of cues, PET scans showed a distinct pattern of activity, outside the primary visual cortex, associated with each kind of cue. When the target was uncertain, the scans revealed activity in the middle area of the frontal lobe, suggesting the existence of an anterior attention system as well. However, location in space is still the only known visual-attention cue for which the brain has developed exclusive circuitry.

Developing Pathways In Infancy And Childhood

The PET studies described earlier that show a specific region of the brain for processing the visual form of words have particular significance for studies of the brain's development. The reading of words is a learned skill that usually does not appear until about 5 years of age. This means that studies of development must take into account the remarkable fact that several years after birth the human brain develops an entirely new visual capacity within the occipital lobe—namely, the ability to recognize the written forms of words in a language (see Plate 7). This new ability presumably entails an important reorganization of the visual system.

To understand how brain development may best be studied in human infants, researchers such as Michael Posner have used the formation of the posterior attention system as a working model. During the first year of life, the six layers of cells that form the visual cortex develop in a definite order, from the deepest layer (closest to the core of the brain) to the most superficial (closest to the surface of the skull). The sequence of development is important because new pathways develop as different layers of the visual system mature. These pathways foster the rapid transmission of signals, perhaps between two areas that before had been connected less directly.

Mark Johnson, of the psychology department at Carnegie Mellon, postulates that at about 1 month of age the human brain develops a pathway that allows an infant to fix his or her eyes on one stimulus and not be distracted by other events at the periphery. This is essentially an inhibitory pathway, because it prevents stimulation by everything but the main target. Studies of laminar development in the brain, which base their work on a large collection of autopsy data assembled over many years, appear to support this idea, as does an interesting marker in infant behavior. From about 1 to 4 months of age, infants practice what is known as "obligatory looking"—that is, once their attention is drawn to a visual target, they seem unable to look away, often remaining fixed on that one target until they become overexcited or begin to cry. After about 4 months (as the posterior attention system matures), they acquire an ability to disengage their attention and are freer to shift their eyes from one object of interest to another.

Working from the adult state back to early development, Michael Posner and his colleagues have traced the origin of another useful element in the posterior attention system: a capacity that inhibits one's eyes from returning to a previous location. Clearly, it is more efficient, when searching for a visual target, not to look in locations that have already been searched. But this "inhibition of return" does not arise from conscious reasoning; instead, it takes form as a specific inhibiting pathway in the brain. The period of its development, between about 3 and 6 months of age, is also when most of the components of this attention system begin to take root.

Such "marker" behaviors as the 1-month-old's obligatory looking or the 3-month-old's tendency to return his eyes to an earlier site are a great help to researchers, who use them as guideposts in the study of development. It is thus possible to investigate, for instance, whether the growth and elaboration of this attention system can be correlated with other maturing abilities, such as the regulation of emotion. The capacity of an infant to be soothed—to shift emotional state under influence from the environment—is one example of a developmental advance that takes place concurrently with some of the early groundwork for the visual attention system.

Pathology In The Attention System And Beyond

The case files of illnesses and injuries to the brain that interfere with attention systems contain many curious observations (such as the disavowal of parts of one's body mentioned earlier). In these kinds of cases, the symptom makes its appearance on the side opposite the brain hemisphere that is injured. The location of such injuries is often the parietal lobe, and the system most affected by them is the posterior attention system, which is sensitive to location in space.

Meanwhile, the anterior attention system presents its own intriguing patterns of activity. The area of interest here is a little further forward than the site of the posterior attention system, in the middle part of the frontal lobes at a ridge in the brain called the cingulate gyrus. This portion of the brain shows high levels of activity, for example, on PET scans, when subjects in experiments are presented with written words and asked not merely to recognize them but to make some active response, such as saying the words aloud. The cingulate gyrus is also the most active part of the brain (as measured by PET scans of cerebral blood flow) in the well-known psychological phenomenon called the Stroop effect. The test for this consists of color names that are shown to the subject as written words, but with each word written in a color different from what it says: the word "blue" appears in red, "yellow" in brown, "red" in green, and so on (see Plate 8). When asked to name the color of the ink, most subjects find it all but impossible to override the word they see in favor of the color itself. The very strong activation of the anterior attention system appears to be associated with the compulsion to favor the recognition of the written word over the recognition of color.

Such evidence for the functions of the anterior attention system has led Posner and other researchers to wonder whether defects in this information-coordinating system might also be involved in schizophrenia. Several studies have noted that schizophrenic patients tend to focus on the left side of objects and that they have difficulty shifting attention to the right visual field. Both these signs suggest a dysfunction in the left hemisphere, in an area that is also involved with the processing of language—not that schizophrenic patients as a group show difficulties with language per se, but it is possible that some impairment in the way the brain handles language stimuli could contribute to the disturbances of thought that are characteristic of schizophrenia. Another familiar feature of schizophrenia is the "alien hand sign," in which the patient believes that his hand, although still attached to him, is controlled by an alien power; this is reminiscent of the "neglect" syndrome (discussed earlier) that arises from defects in the posterior attention system. Here, however, the patient still recognizes the hand as his own but imputes the control and direction of its actions to another mind.

One other system for attention seems from early evidence to involve the frontal lobe of the right hemisphere. Injury in this area appears to cause difficulty in so-called vigilance tasks: monitoring a visual (or auditory) field over a long time while on the lookout for rather subtle or infrequent signals. Interestingly, scanning shows that the right frontal cortex is highly active during such tasks but the anterior cingulate gyrus is quite inactive—in fact, it operates below its baseline level of activity. But when the experiment is changed so that the signals become more frequent, the cingulate gyrus increases its participation. This pattern suggests to Posner and others that activity of the vigilance network might effectively inhibit the anterior cingulate gyrus, allowing targets—when they occur—to have ready access to higher levels of attention.

The phenomenon of attention—or the full assortment of mental activities that can be collected under the term "attention"—represents a fine opportunity for research in the Decade of the Brain, because it is terrain on which the cognitive sciences, with their descriptions of processes at the mental level, can join with the anatomical explorations and interpretations of neurobiology. Most alluring for the long-term is the prospect that as more becomes known about the anatomical structures and neural circuitry that underlie attention in all its forms, researchers will ultimately be able to resolve a question that curious minds have pondered for a long time: just what goes on in the brain, at a physical level, to account for the subjective experiences of the perceiving mind.


Chapter 7 is based on presentations by David Hubel, Vernon Mountcastle, and Michael Posner.

Copyright © 1992 by the National Academy of Sciences.