Voluntary Movement: The Parietal and Premotor Cortex
Voluntary Movement Expresses an Intention to Act
Voluntary Movement Requires Sensory Information About the World and the Body
Reaching for an Object Requires Sensory Information About the Object’s Location in Space
Space Is Represented in Several Cortical Areas with Different Sensory and Motor Properties
The Inferior Parietal and Ventral Premotor Cortex Contain Representations of Peripersonal Space
Grasping an Object Requires Sensory Information About Its Physical Properties
The Activity of Neurons of the Inferior Parietal Cortex Is Influenced by the Purpose of an Action
The Activity of Neurons in the Ventral Premotor Cortex Correlates with Motor Acts
The Primary Motor Cortex Transforms a Grasping Action Plan into Appropriate Finger Movements
The Cortical Motor System Is Involved in Planning Action
The Premotor Cortex Contributes to Perceptual Decisions That Guide Motor Behavior
Cortical Motor Areas Contribute to Understanding the Observed Actions of Others
The Relationship between Motor Acts, the Sense of Volition, and Free Will Is Uncertain
IN THIS CHAPTER WE DESCRIBE HOW the cerebral cortex uses sensory information about the external world to decide which actions to take and how to organize voluntary movements to accomplish those actions. Studies over the past 25 years have shown that the cortical motor system is not an unthinking, passive circuit controlled by more intelligent parts of the brain. Instead, it is intimately involved in the many interrelated neural processes required to choose a plan of action, including processes that appear to be more perceptual and cognitive than motor in nature. The motor system also contributes to cognitive processes that appear unrelated to motor control, such as understanding the actions of others and the potential outcomes of observed events.
Voluntary Movement Expresses an Intention to Act
Voluntary behavior is the physical expression of an intention to act on the environment to achieve a goal. Let us say you want a cup of coffee. There may be many reasons why: You may wish to enjoy the stimulating effect of caffeine or may simply be thirsty. Whatever its origin, your behavioral goal is established by your motivational state but is fulfilled by voluntary motor behavior. The motor system has to transform your intention into action.
How you achieve your goal depends on the circumstances in which you find yourself. If the cup of coffee is already prepared and sitting in front of you, you can simply reach out, grasp the cup, and bring it to your lips. Often, however, the situation is more complex. The coffee might not be ready, or you might not have any coffee at home. In this case, to satisfy your craving for coffee you must organize and perform a complex series of actions to fulfill your goal of drinking coffee. You may go out to buy the coffee and return home, or you may go to a café, order a coffee, and drink it there. Alternatively, if it is too late in the evening or if the weather is inclement, you may alter your goal, such as drinking tea instead of coffee.
Each of these different voluntary behaviors is an action that serves an intermediate goal. However, only the entire series of actions can achieve your ultimate goal. The capacity to maintain a behavioral goal during a series of actions, and to develop alternative behavioral strategies and action sequences to fulfill that goal, is a hallmark of voluntary behavior. The prefrontal cortex, located rostral to the motor areas, plays a critical role in the organization of voluntary behavior. Here we focus on the neuronal mechanisms in the parietal and premotor cortex that mediate voluntary behaviors.
Voluntary behavior often involves physical interaction with objects in the external world. This requires the brain to convert sensory inputs about the state of the world and the individual’s internal state into motor commands. As described in Chapter 33, the transformation involves a sequence of neural operations in many cortical and subcortical areas. No single area is responsible for all the steps between intention and action, or indeed for any one particular operation. This distributed organization is characteristic of all aspects of the neural control of voluntary behavior.
Another important feature of voluntary behavior is that once an intention is formed, action can be delayed or not performed at all. One is not irrevocably compelled to act on an intention the moment it is formed. A reflex, by contrast, is evoked immediately by a stimulus. Without self-control over whether, how, and when to act, behavior would be driven by the moment—impulsive, compulsive, and even antisocial. These considerations suggest that the motor system operates in at least two stages: movement planning and execution. Planning involves deciding what action or series of actions to perform to fulfill an intention, whereas execution orchestrates actual movement.
Studies of nearly every cortical area involved in arm movement have attempted to identify the neural pathways specific to planning or execution. This is often done by imposing a delay between the instruction about what movement to make and the cue to execute it. These studies show that none of the cortical areas contains a homogeneous population of neurons dedicated only to planning or execution. Instead, a broad range of neuronal function is evident in each area. Some neurons respond only during the planning phase of the task, whereas others discharge during the execution phase. Still others show activity changes during both stages (Figure 38–1).
Figure 38–1 Neural processes related to movement planning and movement execution can be dissociated in time. (Reproduced, with permission, from Crammond and Kalaska 2000.)
A. In a reaction-time task a sensory cue instructs the subject both where to move (target cue) and when to move (go cue). All neuronal operations required to plan and initiate the execution of the movement are performed in the brief time between the appearance of the cue and the onset of movement. In an instructed-delay task an initial cue tells the subject where to move and only later is the cue given to start movement. The knowledge provided by the first cue permits the subject to plan the upcoming movement. Any changes in activity that occur after the first cue but before the second are presumed to be neuronal correlates of the planning stage.
B. Movement planning and execution are not completely segregated at the level of single neurons or neuronal populations in a given cortical area. Raster plots and cumulative histograms show the responses of three premotor cortex neurons to movements in each cell’s preferred direction during reaction-time trials and instructed-delay trials. In the raster plots each row represents activity in a single trial. The thin ticks represent action potentials, and the two thicker ticks mark the times of movement onset and end. In reaction-time trials the monkey does not know in which direction to move until the target appears. In contrast, in instructed-delay trials an initial cue informs the monkey where the target lies well in advance of the appearance of a second signal to initiate the movement. During the delay period many premotor cells show directionally tuned changes in activity that signal the direction of the impending delayed movement. The activity in cell 1 appears to be strictly related to the planning phase of the task, for there is no execution-related activity after the go signal in the instructed-delay task. The other two cells show different degrees of activity related to both planning and execution.
The major difference between cortical areas is whether the predominant neural activity is correlated with planning or execution. Whereas many primary motor cortex neurons discharge mainly during execution, premotor and parietal cortices contain more neurons that are strongly activated during the planning stage.
Neural activity during the planning stage also provides information about the intended act. The activity of single neurons and populations during the delay period of reach-to-grasp tasks conveys such information as the location of the target, the direction of arm movement, and the configuration of the hand required to grasp an object. This activity even encodes information about higher-order aspects of the action, such as its goal and expected reward value.
Even when a well-trained monkey makes the wrong movement in response to an instruction, the neural activity during the delay period before movement onset generally predicts the erroneous response. This is compelling evidence that the activity is a neural correlate of the intended motor act, not a passive sensory response to the stimulus that instructed it.
Further evidence of motor planning in the cortex comes from comparing the neural activity in a monkey when it has been instructed to make a reaching movement and when it has been instructed to withhold reaching. Many neurons in the premotor cortex generate directionally tuned activity during the delay period when the monkey is instructed to move, but not when it is instructed to refrain from moving. This differential activity represents an unequivocal signal about the monkey’s intention either to reach in different directions or not to move in response to an instructional cue seconds before the action is executed (Figure 38–2).
Figure 38–2 Decisions about response choices are evident in the activity of premotor cortex neurons. (Reproduced, with permission, from Crammond and Kalaska 2000.)
A. In a reaction-time task (reaching) a cell exhibits gradually increasing, nondirectional, tonic firing while waiting for the appearance of a target. When the target appears (go cue) the cell generates a directionally tuned response.
B. In an instructed-delay task, when a monkey is shown the target and instructed to move once the go cue appears, the cell generates a strong, directionally tuned signal for the duration of the delay period before the go cue. When the monkey is shown the target but is instructed not to move when the go cue appears, the cell’s activity decreases.
These studies demonstrate that activity in several movement-related cortical areas signals information about the nature of an intended motor act well before execution of the act. Many neurons in the same cortical areas also discharge during movement execution, implicating those areas in the control of movement. Given this close anatomical proximity of planning- and execution-related activity, even at the level of individual neurons, a major unresolved question is why planning-related neural activity does not immediately initiate the movement. There must exist a mechanism that either prevents movement execution during the delayed planning stage or permits the start of movement at a later time (see Box 38–2).
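The directionally tuned delay activity described above is often summarized with a cosine tuning model, in which a cell fires most for its preferred direction, and the intended direction can be read out from a population of such cells with a population-vector average. The sketch below is purely illustrative: the tuning curve, parameter values, and readout are standard modeling conventions, not recordings or methods reported in this chapter.

```python
import numpy as np

def cosine_tuning(direction, preferred, baseline=20.0, depth=15.0):
    """Idealized firing rate (spikes/s) of a directionally tuned cell."""
    return baseline + depth * np.cos(direction - preferred)

# A model population whose preferred directions tile the full circle
rng = np.random.default_rng(0)
preferred = np.linspace(0, 2 * np.pi, 64, endpoint=False)

intended = np.deg2rad(135)                 # direction the animal plans to move
rates = cosine_tuning(intended, preferred)
rates += rng.normal(0, 1.0, rates.shape)   # trial-to-trial variability

# Population-vector readout: each cell "votes" for its preferred direction,
# weighted by how far its rate deviates from the population mean.
x = np.sum((rates - rates.mean()) * np.cos(preferred))
y = np.sum((rates - rates.mean()) * np.sin(preferred))
decoded = np.arctan2(y, x)
print(f"decoded direction: {np.rad2deg(decoded):.1f} deg")  # close to 135
```

With even modest noise, the decoded direction tracks the planned direction well before any movement signal is issued, which is why delay-period activity can predict the upcoming (or erroneous) response.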
Voluntary Movement Requires Sensory Information About the World and the Body
Let us return to the action of getting a cup of coffee. The deceptively simple action of drinking from a cup represents not a single motor act but a series of motor acts, each with a specific goal: reaching for the cup, grasping, lifting, holding, and bringing the cup to the mouth. The sequence of acts must be coordinated so that the arm and hand can interact physically with the cup in an efficient manner to achieve the desired goal.
To reach out and grasp the cup the motor system must solve two basic problems. First, it has to localize the cup in space and transform this location into a reaching movement of the arm to bring the hand to the cup. Second, it must encode the physical properties of the cup, such as its size and shape, and transform them into a particular grip. One might suppose that reaching and grasping are conducted sequentially. However, recordings of hand and arm kinematics show that this is not so: The two acts occur largely simultaneously. As the arm reaches toward the cup, the hand starts to rotate and open to match the size, shape, and orientation of the target. The hand and fingers then begin to close even before the hand contacts the cup. Furthermore, although the two processes occur in parallel, they can influence each other. Both the velocity and acceleration of grasping and reaching, for example, can depend on the location, distance, orientation, size, and shape of the object to be lifted.
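The parallel evolution of transport and grip described above can be sketched schematically: the hand travels along a smooth reach profile while the grip aperture opens beyond the object's size and then closes before contact. The profiles and all numbers below are hypothetical illustrations of the described kinematics (a minimum-jerk transport curve and a simple aperture curve), not measured data.

```python
import numpy as np

def min_jerk(t):
    """Normalized minimum-jerk position profile, t in [0, 1]."""
    return 10 * t**3 - 15 * t**4 + 6 * t**5

t = np.linspace(0, 1, 101)          # normalized movement time
reach = 0.4 * min_jerk(t)           # hand transport toward the cup (m)

# Grip aperture opens wider than the object, then closes before contact.
obj = 0.08                          # cup diameter (m), hypothetical
aperture = obj + 0.04 * np.sin(np.pi * np.clip(t / 0.75, 0, 1))

peak_at = t[np.argmax(aperture)]
print(f"peak aperture {aperture.max()*100:.1f} cm "
      f"at {peak_at:.0%} of the reach")
```

The point of the sketch is qualitative: grip shaping unfolds during, not after, the transport phase, with peak aperture reached well before the hand arrives at the object.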
Along with information about the target object the motor system requires information about the current status of the arm, including its posture and motion and the position of the hand relative to the target. The various brain operations required to plan and guide arm movements are implemented in part by interconnected populations of neurons in the primary motor cortex, premotor cortex, and parietal cortex.
The parietal lobe is the principal target of the dorsal visual stream. It has long been implicated in a variety of functions such as the perception of the spatial structure of the world and the control of directed attention. As a result, the dorsal visual stream is often called the “where” pathway to distinguish it from the “what” pathway, the ventral visual stream that projects from the primary visual cortex into the temporal lobe and is involved in the recognition of objects.
Pioneering neurophysiological studies of the parietal lobe in active monkeys conducted independently by Vernon Mountcastle and Juhani Hyvärinen and their colleagues in the 1970s showed that many parietal neurons also discharge during eye, arm, or hand movements when an animal attentively explores and interacts with its environment. One striking property that both groups observed is that the discharge of many parietal neurons is highly dependent on the goal of the behavior. Neurons discharge strongly when a monkey reaches to grasp an object, searches for an object in a box, or manipulates an object with its hand, but are much less active when the monkey makes other arm and hand movements.
More recently, behavioral studies by Mel Goodale and David Milner and their collaborators have led to an important and still controversial hypothesis about the role of the dorsal visual stream. They propose that a primary function of the parietal lobe is to extract sensory information about the external world and one’s own body that is useful for the planning and guidance of movements. This sensory guidance of action may operate in parallel with and independently of perceptual processes evoked by the same sensory inputs. For instance, whereas our perception of the size and orientation of objects can be deceived by certain visual illusions, the motor system often behaves as if it is not fooled and makes accurate movements (Figure 38–3). As a result, the dorsal visual stream is also called the “how” pathway (see Chapter 18).
Figure 38–3 The visual information that serves object perception and movement may be processed in distinct, parallel pathways. In the Ebbinghaus illusion two orange disks of identical diameter appear to be of different size because one is surrounded by large disks and the other by small disks. Mel Goodale and collaborators reported that when subjects were asked to indicate the size of the central disks with their thumb and index finger, the separation between finger and thumb was significantly larger for the disk on the right. However, when subjects reached out to grasp the identical disks surrounded by larger or smaller disks, their thumb-finger separation was nearly the same in both cases. This and similar evidence suggests that visual pathways to the parietal lobe are distinct from those that support object perception and that the parietal inputs are not solely the output of the perceptual pathways.
This does not mean, however, that the parietal lobe has no role in spatial perception or attention. On the contrary, we now recognize that its contributions to spatial perception, attention, and sensorimotor transformations are intimately intertwined. This interconnectedness of function is clear in an examination of how different parts of the parietal lobe and associated precentral motor areas contribute to the planning and execution of the reach-to-grasp action required to drink a cup of coffee.
Reaching for an Object Requires Sensory Information About the Object’s Location in Space
Although we describe the neural processes underlying reach and grasp separately, the two actions are usually coordinated. Coordination is achieved through reciprocal axonal connections between reach- and grasp-related populations both within the same cortical areas and between different areas and through populations of neurons that discharge in connection with components of both reach and grasp.
Space Is Represented in Several Cortical Areas with Different Sensory and Motor Properties
The planning of a reaching movement is usually defined as the neural process by which the location of an object in space is translated into an arm movement that brings the hand into contact with the object. Our intuitive conception of space as a single continuous expanse—one that extends in all directions and within which objects have locations relative to one another and to ourselves—has long influenced neuroscience.
According to classical neurology the neural counterpart of the space that we experience is a single map in the parietal lobe constructed by inputs from different sensory modalities. This unified, multimodal neural replica of the world is assumed to provide all the information necessary for acting on an object and is shared by the different motor circuits that control the eyes, arm, hand, and other effectors.
An alternative view is that there are many maps each related to a different motor effector and adapted to its specific needs. These spatial representations are created when the individual interacts with its environment, defining a series of motor relations determined by the properties of a particular effector. For example, a rodent has a locomotion map in the hippocampus and adjacent entorhinal cortex representing the animal’s current location and direction of motion. This alternative hypothesis suggests that our intuitive sense of space arises at least in part from our motor interactions with the world.
Evidence collected in recent years clearly does not support the notion of a single topographically organized representation of space in the parietal cortex. First, the parietal cortex is organized as a series of areas working in parallel. Second, near space or peripersonal space, the space within our reach, is encoded in areas different from those that represent far space, the space beyond our reach. Third, the functional properties of the neurons in parietal and frontal areas of cortex involved in spatial coding vary depending on the body part controlled, such as the eyes versus the arm.
These findings support the idea that there are many spatial maps, some located in the parietal cortex and others in the frontal cortex, whose properties are tuned to the motor requirements of different effectors. Moreover, the spatial maps in each cortical area are not maps in the usual sense of a faithful point-to-point representation of surrounding space, but rather dynamic maps that may expand or shrink according to the motor requirements necessary to interact with a given stationary or moving object.
The Inferior Parietal and Ventral Premotor Cortex Contain Representations of Peripersonal Space
In monkeys several areas in the inferior parietal cortex and interconnected parts of the premotor cortex contain representations of peripersonal space. One such area, the ventral intraparietal area, is located in the fundus of the intraparietal sulcus (Figure 38–4A). It receives visual projections from components of the dorsal visual stream, including areas MST (medial superior temporal cortex) and MT (middle temporal cortex), that are involved in the analysis of optic flow and visual motion.
Figure 38–4 Separate parietofrontal pathways are involved in the visuomotor transformations for reaching and grasping.
A. The visuomotor transformation necessary for reaching is mediated by the parietofrontal network shown here. The areas located within the intraparietal sulcus are shown in an unfolded view of the sulcus. Two serial pathways are involved in the organization of reaching movements. The ventral stream has its principal nodes in the ventral intraparietal area (VIP) and area F4 of the ventral premotor cortex, whereas the dorsal stream has synaptic relays in the superior parietal lobe (MIP, V6A) and the dorsal premotor cortex (PMd), which includes area F2. (Parietal areas include AIP, anterior intraparietal area; LIP, lateral intraparietal area; and V6A, the parietal portion of the parieto-occipital area.) PEc and PEip are parietal areas according to the nomenclature of von Economo. Somatosensory areas 1, 2, and 3 and area PE, which provide somatosensory input to M1 (F1), are not shown in the figure. Precentral areas include F5, a subdivision of PMv, the ventral premotor cortex, and the primary motor cortex (M1, F1).
B. The visuomotor transformation necessary for grasping is mediated by the parietofrontal network shown here. The AIP and PFG areas are concerned mostly with hand movements, whereas area PF is concerned with mouth movements. PF and PFG are parietal areas according to the nomenclature of von Economo. Area F5 in PMv is concerned with both hand and mouth motor acts. Some grasping neurons have also been found in the ventral part of area F2 in PMd. Area M1 (or F1) contains a large sector that controls the fingers, hand, and wrist (see Figure 37–2A). Other abbreviations are explained in part A.
Some ventral intraparietal neurons respond only to visual stimuli and respond preferentially either to expanding (looming) or contracting (receding) stimuli or to stimuli moving in the horizontal or vertical plane. Others have polymodal receptive fields within which inputs from different sensory modalities lie in spatial register (Figure 38–5A). These neurons respond to tactile stimuli, most often near the mouth or on the face but also on the arm or trunk, as well as to visual stimuli located immediately adjacent to the tactile receptive field. Some even respond to auditory stimuli in the same spatial location. Certain polymodal neurons respond to both visual and tactile stimuli moving in the same direction, whereas others are strongly activated by visual stimuli moving toward the tactile receptive field, but only if the path of motion will eventually intersect it.
Figure 38–5 Some neurons in the parietal and premotor cortex respond to both tactile and visual stimuli within receptive fields that are spatially in register.
A. Some neurons in the ventral intraparietal cortex have tactile and visual receptive fields that are aligned in a congruent manner. Orange areas on the monkey represent tactile receptive fields; purple areas on the screen in front of the monkey’s face and centered on its nose represent visual receptive fields. Many of the neurons also share directional preferences for movement of tactile and visual stimuli (arrows). (Reproduced, with permission, from Duhamel, Colby, and Goldberg 1998.)
B. Neurons in ventral premotor cortex area F4 respond to either tactile or visual stimulation. Orange areas are tactile receptive fields; purple lines indicate the three-dimensional receptive fields within which visual stimuli activate the neuron. (Reproduced, with permission, from Fogassi et al. 1996.)
Ventral intraparietal neurons appear to represent an early stage in the construction of a peripersonal spatial map that is more fully expressed in a caudal part of the ventral premotor cortex, area F4, with which it is strongly interconnected. Virtually all neurons in area F4 respond to somatosensory inputs, especially tactile stimuli. The tactile receptive fields are located primarily on the face, neck, arms, and hands. Half of the neurons also respond to visual stimuli and a few to auditory stimuli.
As with ventral intraparietal neurons, the modality-specific receptive fields in area F4 lie in register (Figure 38–5B). This suggests that the visual receptive fields are not defined by the location of the visual stimulus on the retina, as in most neurons in the visual cortex, but are anchored to specific parts of the individual’s body. One striking feature of such a polymodal neuron, especially in the ventral premotor cortex, is that its visual receptive field remains aligned with the tactile receptive field when the monkey looks in different directions, but moves with the tactile receptive field to a different part of peripersonal space when the monkey moves the corresponding part of its body.
Nevertheless, area F4 is a motor area and its neurons also discharge in association with movements, most often of the arm, wrist, neck, and face. The neurons in this area control movements of the head and arm toward different parts of the body, or toward objects close to the body, to permit the animal to grasp them with its mouth or hand. Some neurons discharge during the entire action of bringing the hand to the mouth and opening the mouth to ingest food, as well as during arm reaching and associated neck- and trunk-orienting movements. Activity in other neurons is correlated not only with reaching but also with other behaviors such as the avoidance of threatening stimuli. The sensory representation of peripersonal space in area F4 contributes to the planning and execution of those behaviors.
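The body-anchored receptive fields described above can be made concrete with a toy model: the cell's visual field is centered on a body part, so gaze direction never enters the response, while moving the limb carries the field with it. The field size, positions, and Gaussian tuning below are all hypothetical; the sketch illustrates the coding scheme, not any recorded F4 neuron.

```python
import numpy as np

def f4_cell_rate(stimulus, arm, rf_offset=np.array([0.1, 0.0]), sigma=0.05):
    """Toy F4-like cell: visual receptive field anchored just beyond the
    hand, in body coordinates. Gaze direction is not an argument at all."""
    d = stimulus - (arm + rf_offset)   # receptive field moves with the arm
    return np.exp(-np.dot(d, d) / (2 * sigma**2))

arm = np.array([0.3, 0.0])
stim = arm + np.array([0.1, 0.0])      # stimulus just beyond the hand
r_near = f4_cell_rate(stim, arm)       # maximal: stimulus inside the field
arm2 = np.array([0.5, 0.2])            # move the arm; the field follows it
r_after_move = f4_cell_rate(stim, arm2)
print(r_near, r_after_move)
```

Because the field is expressed in body-part coordinates rather than retinal coordinates, shifting gaze leaves the response unchanged, while moving the arm silences the cell's response to the same external stimulus.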
The Superior Parietal Cortex Uses Sensory Information to Guide Arm Movements Toward Objects in Peripersonal Space
A key requirement for efficient reaching is knowledge of where the arm is before and during the action. Lesion studies suggest that this information is represented in Brodmann’s area 2 of the primary somatosensory cortex (S-I) and in the superior parietal lobule. Patients with lesions of these regions are unable to reach toward objects efficiently, even though they do not have the deficits of spatial perception, such as spatial neglect, that are typical of lesions in the inferior parietal lobe (see Chapter 19).
Although single-neuron studies confirm the role of these areas in providing information about arm location, there are clear functional differences between the two areas. Neurons in area 2 usually respond to tactile input from a limited part of the body or to movements of a single joint or a few adjacent joints in specific directions and most commonly on the contralateral side of the body. In contrast, many neurons in the superior parietal lobule discharge during combined movements of multiple joints, the assumption of specific postures, or movements of the limbs and the body. Some cells also respond during combined movements of the arms and hind limbs or bilateral movements of both arms.
These findings indicate that, unlike neurons in area 2 that encode the positions and movements of specific parts of the body, neurons in the superior parietal lobe integrate information on the positions of individual joints as well as the positions of limb segments with respect to the body. This integration creates a body schema that provides information on where the arm is located with respect to the body and how the different arm segments are positioned with respect to one another. This schema provides fundamental information for the proprioceptive guidance of arm movements.
More posterior and medial sectors of the superior parietal cortex also receive input from areas V2 and V3 of the extrastriate visual cortex. Important nodes in this network include areas V6A and PEc and an area of parietal cortex involved in reaching described by Richard Andersen and colleagues, which most likely corresponds to the medial intraparietal area (MIP) and nearby parts of the superior and inferior parietal cortex (see Figure 38–4A). In these areas the spatial representation for reaching is not based on body-centered coordinates. For example, neurons in V6A and PEc often signal the retinal location of possible targets for reaching, but their activity is also strongly modulated by complex combinations of inputs related to the direction of gaze and the current arm posture and hand position.
Andersen and his associates propose that the reach-related region of parietal cortex is particularly important for specifying the goal or target of reaching but not how the action should be performed. The activity of many neurons in this area varies with the location of the target relative to the hand. Remarkably, however, this motor error signal is not centered on the current location of the hand or target but rather on the current direction of gaze. Each time the monkey looks in a different direction the reach-related activity in the neurons changes (Figure 38–6). In contrast, the reach-related activity of many neurons in area PEip is less gaze-centered and more related to the current hand position and arm posture.
Figure 38–6 Neurons in the parietal reach area encode target location in eye-centered coordinates. An upright board contains an array of pushbuttons. The four panels show the possible behavioral conditions at the beginning of a trial. The initial hand position and point of visual fixation are indicated by the green and orange buttons, respectively. Histograms of activity in a single neuron are arranged to correspond to the locations of the buttons on the board that serve as the target of a reaching movement from the start position in different trials. The firing pattern of this neuron does not vary with changes in initial limb position (A, B), but shifts with a change in the initial direction of gaze (C, D). The neuron thus signals the target location relative to the current direction of gaze, independent of the direction of arm movement required to reach the target. (Modified, with permission, from Andersen and Buneo 2002.)
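The gaze-dependence illustrated in Figure 38–6 can be captured in a toy model in which a cell's tuning is Gaussian over the target's position relative to the fixation point, with hand position simply absent from the equation. All of the coordinates and parameters below are hypothetical; this is a sketch of the coding scheme, not of any recorded neuron.

```python
import numpy as np

def reach_cell_rate(target, gaze, pref=np.array([3.0, 1.0]),
                    sigma=1.0, base=5.0, peak=40.0):
    """Toy parietal reach cell: Gaussian tuning over the target's location
    in eye-centered coordinates. Hand position does not enter at all."""
    err = target - gaze - pref          # target relative to the fixation point
    return base + peak * np.exp(-np.dot(err, err) / (2 * sigma**2))

target = np.array([3.0, 1.0])           # button to be reached (board units)
# Moving the hand changes nothing: hand position is not an argument ...
r_fix = reach_cell_rate(target, gaze=np.array([0.0, 0.0]))
# ... but shifting gaze shifts the eye-centered target, and the response.
r_shifted = reach_cell_rate(target, gaze=np.array([2.0, 0.0]))
print(r_fix, r_shifted)
```

A cell of the PEip type would instead take hand position, rather than gaze, as the reference in the subtraction, which is the contrast drawn in the text.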
Another important property of neurons in the parietal reach region is that they respond not only to passive sensory inputs but also before the onset of movements and during the planning period of delayed-reaching tasks. This behavior indicates that these neurons receive centrally generated signals about motor intentions prior to movement onset, likely through their reciprocal connections with precentral motor areas. Recent theoretical and experimental findings suggest that this combination of peripheral sensory and central motor inputs permits the parietal reach region to integrate sensory input with efference copies of outgoing motor commands to compute a continuously updated estimate of the current arm state and a prediction about how the arm will respond to the motor command. This forward internal model of the arm could be used to make rapid corrections for errors in ongoing arm movements and to acquire motor skills.
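A forward model of this kind can be sketched as a simple one-dimensional predictor-corrector loop: the state estimate is advanced by an efference copy of the motor command and then nudged toward noisy, delayed sensory feedback. The gain and noise values below are arbitrary illustrations (a fixed-gain observer rather than a full Kalman filter), not physiological estimates.

```python
import numpy as np

dt = 0.01       # time step (s)
gain = 0.3      # weighting of the sensory correction (hypothetical)

x_true = 0.0    # actual arm position
x_hat = 0.0     # internal estimate of arm position
rng = np.random.default_rng(1)

for step in range(100):
    u = 1.0                                # motor command: constant velocity
    x_true += u * dt                       # the arm actually moves
    x_hat += u * dt                        # predict from the efference copy
    sense = x_true + rng.normal(0, 0.05)   # noisy proprioceptive feedback
    x_hat += gain * (sense - x_hat)        # correct the prediction

print(f"true {x_true:.3f}, estimate {x_hat:.3f}")
```

Because the prediction step uses the outgoing command rather than waiting for feedback, the estimate stays current despite sensory delay and noise, which is what makes rapid online corrections possible.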
The functional properties of areas in the superior parietal cortex concerned with reaching suggest an intriguing explanation of the clinical phenomenon of optic ataxia. Patients with a lesion of the superior parietal cortex have difficulty with visually guided arm movements toward an object. The arm makes errors in the frontal or sagittal plane, groping for the target until it encounters the object almost by chance. The deficit is severe when the target is in the peripheral part of the visual field, less when the target lies in the parafoveal region, and negligible when the patient fixates the target. The symptoms of optic ataxia may result from failure of the neural circuits that convert sensory information about targets and the arm into motor plans or from failure of the circuits that contribute to a predictive forward model of the arm’s current state.
Premotor and Primary Motor Cortex Formulate More Specific Motor Plans About Intended Reaching Movements
