Higher Cognitive Function and Behavioral Control




KEY CONCEPTS









  • Executive function, the cognitive control of behavior, depends on the prefrontal cortex, which is extensively developed in higher primates and especially humans.



  • Working memory is a short-term, capacity-limited cognitive buffer that stores information and permits its manipulation to guide decision making and behavior.



  • Attention permits the selection of relevant information from the enormous welter of sensory inputs. Attention may be effortful or may result “bottom-up” from the appearance of a salient stimulus.



  • Attention depends on working memory and on mechanisms that filter sensory inputs for access to working memory.



  • Attention and working memory are modulated by drugs that directly or indirectly stimulate dopamine D1 receptors and noradrenergic receptors. Such drugs, most notably the psychostimulants methylphenidate and amphetamine, are used to treat attention deficit hyperactivity disorder (ADHD).



  • ADHD is strongly genetically determined, but it has not yet been possible to identify specific risk genes.



  • Obsessive–compulsive disorder (OCD) and Tourette syndrome may represent related abnormalities in the circuitry connecting the prefrontal cortex, striatum, and thalamus. Both are highly heritable conditions, although the specific genes involved remain unknown.



  • Memory falls into two broad categories: declarative or explicit memory, the memory of facts and events, and nondeclarative or implicit memory, which includes all other forms, such as procedural memory and habits.



  • Structures in the medial temporal lobe, such as the hippocampus, are particularly important for the temporary storage of declarative memories.



  • The striatum plays a central role in procedural and habit memory.



  • The amygdala, part of the temporal lobe, and the nucleus accumbens, part of the ventral striatum, are important in emotional memory.



  • Long-lasting changes in synaptic efficacy (synaptic plasticity) are thought to be important mechanisms for storing memories.



  • Strong emotions enhance memory formation, likely because of the associated activation of diffusely projecting neurotransmitter systems (eg, monoamines, acetylcholine).



  • Despite intense interest in finding drugs that enhance memory under pathologic conditions (eg, Alzheimer disease) or normal conditions (eg, aging), no robustly effective agents have yet been developed.



  • Social cognition refers to cognitive and emotional processes that underlie social functioning, including social judgments and social affiliation.



  • Autism, one of the most strongly heritable neuropsychiatric illnesses, describes a spectrum of disorders characterized by abnormal social function, language impairments, restricted interests, and repetitive behaviors.





INTRODUCTION







The simplest nervous systems sense survival-relevant stimuli in the environment and respond reflexively to defend themselves, obtain nourishment, and reproduce. As organisms became more complex, an increasing number of neurons were interposed between sensory inputs and motor outputs, creating what has been called “the great intermediate net.” In the human nervous system, this net has reached remarkable levels of complexity, making possible diverse and subtle forms of cognition and behavioral control, including many kinds of learning and memory, abstract thought, the creation of fiction, music, and visual art, and the ability to navigate flexibly in uncertain and changing social situations. For the sake of survival, humans retain simple reflexive responses (eg, to noxious stimuli), simple unconscious homeostatic behaviors such as swallowing and control of respiration, and more complex but automatized behavioral responses to salient stimuli (eg, to certain types of threats or to rewards such as attractive foods).



Most human behavior, however, whether automatic or requiring effortful conscious control, whether learned or improvised, results from interactions among sensory stimuli, innate drives (eg, hunger, thirst, sexual arousal), memories, emotions, and diverse types of cognitive processing. This complexity, which brings with it the possibility of flexible and creative responses to sensory stimuli, also creates the potential for conflict among goals and confusion among possible responses. Without coordination among inputs and outputs, and some mechanism to supervise the selection of goals and behavioral outputs, the result could be self-defeating confusion rather than evolutionary advantage.



The first part of this chapter focuses on prefrontal regions of cerebral cortex and their connections with sensory and emotional centers in the brain that exert control over an individual’s behavior under normal and pathologic conditions. The second section covers different types of memory, including explicit memories, which are dependent on hippocampal function, and implicit procedural memories, which are dependent on the striatum. We end with a discussion of social cognition, which is extensively developed in humans and impaired in a range of neuropsychiatric disorders.




COGNITION, EMOTION, AND THE CONTROL OF BEHAVIOR







Anatomy and Function of the Prefrontal Cortex



The prefrontal cortex is the region that subserves executive function, the ability to exert cognitive control over decision making and behavior. It is also important for anticipating reward and punishment and for empathy and complex emotions. This brain area has grown disproportionately large in our close primate relatives and especially in humans compared with other mammals (14–1). Different subregions (14–2) participate in a large number of distinct circuits that permit the integration and processing of diverse types of information. The prefrontal cortex receives not only inputs from other cortical regions, including association cortex, but also, via the thalamus, inputs from subcortical structures subserving emotion and motivation, such as the amygdala (Chapter 15) and ventral striatum (or nucleus accumbens; Chapter 16). The prefrontal cortex is also innervated by widely projecting neurotransmitter systems, including norepinephrine, dopamine, serotonin, acetylcholine, and orexin/hypocretin. These diverse inputs and back projections to both cortical and subcortical structures put the prefrontal cortex in a position to exert what is often called “top-down” or cognitive control of behavior. Because behavioral responses in humans are not rigidly dictated by sensory inputs and drives, they can instead be guided in accordance with short- or long-term goals, prior experience, and the environmental context. The response to a delicious-looking dessert differs depending on whether a person is alone staring into his or her refrigerator, is at a formal dinner party attended by his or her punctilious boss, or has just formulated the goal of losing 10 lb. The response to a rattlesnake will differ depending on whether a person is a novice hiker or a herpetologist looking for specimens.
Adaptive responses depend on the ability to inhibit automatic or prepotent responses (eg, to ravenously eat the dessert or run from the snake) given certain social or environmental contexts or chosen goals and, in those circumstances, to select more appropriate responses. In conditions in which prepotent responses tend to dominate behavior, such as in drug addiction, where drug cues can elicit drug seeking (Chapter 16), or in attention deficit hyperactivity disorder (ADHD; described below), significant negative consequences can result.




14–1


Phylogenetic comparison of the proportion of the brain taken up by prefrontal cortex in five different mammalian species. (Adapted with permission from Kandel ER, Schwartz JH, Jessell TM. Principles of Neural Science, 3rd ed. New York: Elsevier; 1991:837.)






14–2


Regions of the prefrontal cortex involved in complex cognitive function. Left panel, lateral view showing the dorsolateral prefrontal cortex (DLPFC). Right panel, sagittal section showing the medial prefrontal cortex (MPFC), orbital frontal cortex (OFC), and anterior cingulate cortex (ACC).





Because the prefrontal cortex does not subserve primary sensory or motor functions, language production, or basic intelligence (eg, arithmetic calculation), it was once thought to be silent or nearly so. As late as the 1950s, prefrontal lobotomy was used to treat schizophrenia and other severe mental disorders and was rationalized, in part, because basic intelligence survived the surgery. However, damage to the prefrontal cortex has a significant deleterious effect on social behavior, decision making, and adaptive responding to the changing circumstances of life. An important role for regions of the prefrontal cortex was demonstrated as early as the 19th century by the case of Phineas Gage (14–1).



14–1 Phineas Gage


Phineas Gage was a 25-year-old construction foreman whose team was laying new track for the Rutland and Burlington Railroad when, on a September day in 1848, he became the victim of a dramatic accident. While using a tamping iron to pack blasting powder into a hole drilled in rock, he accidentally set off an explosion that sent the fine-pointed iron through his face, skull, and brain (see figure). The force of the explosion was such that the iron landed yards away. Remarkably, Gage rapidly regained consciousness and was able to talk and, with the help of his men, to walk. Just as remarkably, in the preantibiotic era, he returned to physical health and showed no signs of paralysis, impaired speech, or loss of memory or general intelligence.


Despite his recovery, Gage was a transformed man, and not for the better. Prior to the accident, he had been polite, responsible, capable, and well socialized; indeed, he had earned the role of foreman at an early age. After the accident he became unreliable and could not be trusted to keep his commitments and thus lost his job. He no longer observed social convention; it is documented, for example, that his language became quite profane. He wandered over the next several years and died in the custody of his family, never having held a responsible position again.


Gage’s physician, John Harlow, who had related Gage’s altered behavior to his brain injury, learned of Gage’s death from his family approximately 5 years after its occurrence and convinced them to exhume the body. As a result, Gage’s skull and the tamping iron, with which he had been buried, are available for study, thus permitting the reconstruction shown in the figure: left, reconstruction of path of tamping bar through the skull; right, midlevel transverse section of the brain showing damaged area of medial prefrontal cortex and preserved Broca area (yellow), motor cortex (red), Wernicke area (blue), and sensory cortex (green).


Gage’s lesion involved portions of the left orbital prefrontal cortex and portions of both left and right anterior medial prefrontal cortices. Based on Harlow’s reports of Gage’s behavior, and on what we now know of the functioning of these brain regions (see text), Gage’s lesions explain the degradation of his social behavior and his inability to guide his behavior in accordance with long-term goals. It appears that Gage’s dorsolateral prefrontal cortex was spared, thus preserving other domains of cognitive control.


Since Gage’s time, there have been many studies of human brain damage affecting the prefrontal cortex and more recently functional imaging studies. The case of Phineas Gage illustrates the central importance of the prefrontal cortex in integrating emotion and cognition in the service of executive function.



(From Damasio H, Grabowski T, Frank R, Galaburda AM, Damasio AR. The return of Phineas Gage: clues about the brain from the skull of a famous patient. Science. 1994;264:1102–1105.)




Several subregions of the prefrontal cortex (14–2) have been implicated in partly distinct aspects of cognitive control, although these distinctions remain somewhat vaguely defined. The anterior cingulate cortex is involved in processes that require correct decision making, as seen in conflict resolution (eg, the Stroop test), or cortical inhibition (eg, stopping one task and switching to another). The medial prefrontal cortex is involved in supervisory attentional functions (eg, action–outcome rules) and behavioral flexibility (the ability to switch strategies). The dorsolateral prefrontal cortex, the last brain area to undergo myelination during development in late adolescence, is implicated in matching sensory inputs with planned motor responses. The ventromedial prefrontal cortex seems to regulate social cognition, including empathy. The orbitofrontal cortex is involved in social decision making and in representing the valuations assigned to different experiences. It is also implicated in impulsive and compulsive behaviors.



Working Memory



Executive function depends on working memory, a short-term, capacity-limited cognitive buffer that maintains a representation of sensory information. Working memory permits the integration and manipulation of this information to guide thought, emotion, and behavior. Working memory can be demonstrated in a range of mammals and has been studied extensively in nonhuman primates as well as humans. Findings drawn from primate research have been used to examine the mechanisms of working memory and its role in executive function. A classic experiment involved the placement of bilateral lesions in the prefrontal cortex of a chimpanzee and subsequent testing of the animal for delayed spatial responses. The chimpanzee watched as a piece of food was placed under one of several opaque containers. After a brief delay, the animal was allowed to choose a container. Unlesioned control animals uniformly chose the container with food, whereas lesioned animals made random selections. After several additional experiments were performed to determine the types of cognitive deficits involved, it became apparent that basic sensory and cognitive functions were intact in the lesioned chimps. What the chimpanzees lacked was the ability to maintain an internal representation of the food and its significance. These chimps needed ongoing sensory stimulation to track the food.



Our understanding of working memory has expanded considerably since these initial experiments. Subsequent studies, for example, have examined the electrical activity of particular neurons in prefrontal cortex in experimental paradigms that require working memory. In one such study, a monkey was conditioned to fix its eyes on a central point on a video screen (14–3). Subsequently a box was displayed briefly in one of eight areas on the screen. After a 3- to 6-second delay, the central fixation point was removed from the screen, and the monkey was trained to shift its gaze to the area where the box previously had been displayed. This study enabled the identification of neurons in prefrontal cortex that are specific for the region of the screen where the box was displayed. Such neurons become more active during the delay phase of the task and return to baseline levels of activity when the gaze returns to the area in which the box appeared. The increased activity of neurons in the prefrontal cortex during the delay phase of a task is one signature of working memory. This activity appears to provide an internal representation of the box even when it is not visible.




14–3


Repeated recordings from a single neuron during many trials over which a rhesus monkey performed an oculomotor delayed-response working memory task. During a test session, the monkey’s ability to make correct memory-guided responses is evaluated 10 to 12 times per target location. The neuron’s response is collated over all the trials for a given target location (eg, 135°, 45°) as a histogram of the average response per unit time for that location. The activity is also shown in relation to task events (C, cue; D, delay; R, response) on a trial-by-trial basis for each target location. The particular neuron being recorded fires maximally during the delay period when the target at the 135° location disappears and the monkey is maintaining fixation. This neural activity is maintained throughout the delay period until a response is made. Activity is also seen for the 90° and 180° targets during the delay period, but less than that observed for this neuron’s best direction. Different neurons code different spatial locations, providing a spatial map in working memory. (Reproduced with permission from Funahashi S, Bruce CJ, Goldman-Rakic PS. Mnemonic coding of visual space in the primate dorsolateral prefrontal cortex. J Neurophysiol. 1989;61(2):331–349.)
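The logic of delay-period persistent activity can be illustrated with a toy rate model, a sketch only, not drawn from the chapter: a single "memory field" unit whose recurrent excitation offsets passive decay, so that a transient cue leaves the firing rate elevated throughout the delay. The functional form and all parameter values are illustrative assumptions.

```python
import math

def simulate_delay_activity(cue_steps=5, delay_steps=40, w=1.05, cue_drive=1.0):
    """Toy rate model of a prefrontal 'memory field' neuron.

    Recurrent excitation (gain w) offsets passive decay, so activity evoked
    by a transient cue persists through the delay period, a cartoon of
    sustained delay-period firing. Parameters are invented for illustration.
    """
    rate, rates, dt = 0.0, [], 0.1
    for step in range(cue_steps + delay_steps):
        drive = cue_drive if step < cue_steps else 0.0  # cue period, then delay
        # leaky dynamics with saturating recurrent feedback
        rate += dt * (-rate + math.tanh(w * rate + drive))
        rates.append(rate)
    return rates

persistent = simulate_delay_activity()        # w > 1: activity persists
leaky = simulate_delay_activity(w=0.5)        # w < 1: the memory trace fades
print(f"end-of-delay rate: {persistent[-1]:.2f} vs {leaky[-1]:.2f}")
```

In this caricature, setting the recurrent gain below 1 abolishes the delay-period activity, loosely analogous to the loss of the internal representation in the lesioned animals described above.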





The pharmacologic manipulation of working memory has been the focus of considerable investigation, in part because of the working memory deficits that characterize schizophrenia (Chapter 17) and in part because working memory may be impaired in ADHD and in several other conditions, including severe stress.



Mild dopaminergic stimulation of the prefrontal cortex enhances working memory; in contrast, higher levels of stimulation profoundly disrupt this function. Because stress is known to increase dopaminergic transmission from the ventral tegmental area to the prefrontal cortex, the actions of dopamine in this brain region may explain why low levels of stress can enhance performance in working memory tasks, whereas higher levels of stress can disrupt performance. These findings are consistent with evidence that working memory depends on an optimal level of stimulation of D1 dopamine receptors (see also Chapter 17). D1 agonists have, therefore, been studied as enhancers of working memory, although it has not yet been possible to generate such agonists devoid of troubling side effects such as nausea and vomiting.



Manipulation of the norepinephrine system also affects working memory. For example, α2-adrenergic agonists such as clonidine and guanfacine (14–4; see Chapter 6) enhance working memory, a finding that may explain the utility of these agents in the treatment of ADHD. The selective norepinephrine reuptake inhibitor (NRI) atomoxetine (14–4) is approved for the treatment of ADHD, although its effects on working memory per se have not been established. Atomoxetine does not appear to be distinct in its therapeutic properties from older tricyclic NRIs (eg, desipramine) that are approved for the treatment of depression (Chapter 15), although it does have milder side effects.




14–4


Medications used in the treatment of ADHD.





Therapeutic (relatively low) doses of psychostimulants, such as methylphenidate and amphetamine (14–4), improve performance on working memory tasks in individuals with ADHD and, at higher doses, in normal subjects. Positron emission tomography (PET) demonstrates that methylphenidate decreases regional cerebral blood flow in the dorsolateral prefrontal cortex and posterior parietal cortex while improving performance of a spatial working memory task. This suggests that cortical networks that normally process spatial working memory become more efficient in response to the drug. Both methylphenidate and amphetamines act by increasing synaptic levels of dopamine, norepinephrine, and serotonin, actions mediated via the plasma membrane transporters of these neurotransmitters and via the shared vesicular monoamine transporter (Chapter 16). Based on animal studies with micro-iontophoretic application of selective D1 dopamine receptor agonists (such as the partial agonist SKF38393 or the full agonist SKF81297) and antagonists (such as SCH23390), and on clinical evidence in humans with ADHD, it is now believed that dopamine and norepinephrine, but not serotonin, produce the beneficial effects of stimulants on working memory. At abused (relatively high) doses, stimulants can interfere with working memory and cognitive control, as discussed below. It is important to recognize, however, that stimulants act not only on working memory but also on general levels of arousal and, within the nucleus accumbens, increase the salience of tasks. Thus, stimulants improve performance on effortful but tedious tasks, probably acting at different sites in the brain through indirect stimulation of dopamine and norepinephrine receptors.



Although the animal studies described above involve very simple mental representations, working memory is at the heart of many complex mental tasks in higher mammals. When viewed as a rudimentary form of abstraction, working memory can be seen as crossing from the realm of the brain into that of the mind, where internal representations of the external world are consistent and reliable. The ability to think in abstract terms presumably allows humans to create a sense of identity, to establish goals, and to plan for the future.



Attention



An animal is bombarded at all times by a mass of sensory stimulation and must have mechanisms to select the information that is most relevant to its particular situation and ultimately its survival. Selection of relevant information is the role of attention. Once attended to, information gains access to working memory and can thereby be used to plan appropriate responses. In humans, responses may be simple, but they may also involve complex trains of cognition, the activation of emotional circuits, and the production of elaborate behaviors. It must be emphasized that much important sensory information is processed by the brain unconsciously and need not be made conscious to elicit significant responses. For example, subliminal processing of fearful faces can activate physiologic aspects of the fear response in humans without ever reaching consciousness (Chapter 15). However, information that is processed consciously is, at some point, attended to and entered into working memory.



Attention is not a simple unitary function. It may be commanded by “bottom-up” sensory information such as a stabbing pain or the sudden appearance of a loud noise or bright flashing light. The concept of attention also includes effortful “top-down” processing involving the prefrontal cortical circuits that connect with other specialized brain regions. For instance, we can purposefully allocate attention (eg, to a particular spatial location, which involves parietal cortex), we can pay selective attention to specific features of the world (eg, the color of an object, which involves the inferior temporal cortex), and we can divide our attention, suppress distractions, and concentrate (ie, sustain attention over time).



In one model, four basic cognitive processes are considered to be the building blocks of attention (14–5): (1) working memory, as mentioned earlier, is required to exert top-down control over those sensory representations that will be attended to and for effortful direction of “the searchlight” of attention. Working memory acts by guiding orientation, such as the direction of the body, the head, or the eyes, and also produces signals that influence the sensitivity of neural circuits that represent information. (2) The allocation of attention, for example, to a point in space, can be shown by functional neuroimaging in humans and by physiologic recordings in monkeys to alter activity in relevant brain regions or relevant neurons. Thus, working memory can exert sensitivity control over neural representations and thereby influence the selection of representations that are attended to. Patients with ADHD have difficulty sustaining attention or ignoring distractions, suggesting problems with sensitivity control. (3) Diverse sensory information is subjected to salience filters so that irrelevant information does not gain access to working memory. Patients with psychotic disorders including schizophrenia attend to irrelevant and to hallucinatory stimuli, suggesting a failure of filtering processes. Salience is determined, for example, by innate or learned predictors of threat or reward or by the appearance of rare or high-intensity stimuli. (4) Sensory representations that pass salience filters are subjected to competitive processes that select the strongest signals for access to working memory.




14–5


Cognitive building blocks of attention. These cognitive modules can have distributed implementation in the brain. The processes that contribute to attention are in red type. Bottom-up processing occurs when sensory information is permitted to pass by salience filters tuned to innate or learned survival-relevant stimuli and to novel or highly salient stimuli. The neural representations of these stimuli are processed in circuits relevant to the type of information they contain (eg, different sensory modalities, interoceptive information). These neural representations then enter a competitive process that selects the one with the highest signal strength for access to working memory. Working memory directs gaze and other orienting behaviors as well as signals that modulate sensitivity to representations for access to working memory. (Adapted with permission from Knudsen EI. Fundamental components of attention. Annu Rev Neurosci. 2007;30:57–78.)
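The building blocks above can be caricatured as a tiny pipeline. This is a hedged sketch only: the stimulus labels, threshold, and gain values are invented for illustration, and real salience filtering and competition are distributed neural processes, not dictionary operations.

```python
def select_for_working_memory(stimuli, salience_threshold=0.5, top_down_bias=None):
    """Toy sketch of the attention pipeline described in the text.

    stimuli: dict mapping a stimulus label to its bottom-up signal strength.
    top_down_bias: optional dict of working-memory 'sensitivity control'
    gains applied before the competition. All names and numbers are
    hypothetical, chosen only to illustrate the flow of information.
    """
    bias = top_down_bias or {}
    # 1. salience filter: weak, irrelevant signals never reach the competition
    candidates = {s: v for s, v in stimuli.items() if v >= salience_threshold}
    if not candidates:
        return None
    # 2. sensitivity control: top-down gain modulation from working memory
    biased = {s: v * bias.get(s, 1.0) for s, v in candidates.items()}
    # 3. competitive selection: the strongest representation gains access
    return max(biased, key=biased.get)

# bottom-up: the high-intensity stimulus wins the competition...
print(select_for_working_memory({"loud noise": 0.9, "page of text": 0.6, "hum": 0.2}))
# ...unless top-down sensitivity control boosts the task-relevant stimulus
print(select_for_working_memory({"loud noise": 0.9, "page of text": 0.6, "hum": 0.2},
                                top_down_bias={"page of text": 2.0}))
```

The point of the sketch is the ordering of operations: filtering precedes competition, and working memory biases the competition rather than inspecting every raw input.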





The monoamine neurotransmitters and orexin have a critical permissive role in attention by regulating arousal (Chapters 6 and 13). Performance on cognitive tasks has different optimal levels of arousal depending on the degree of effort required. This is captured in simplistic form by the Yerkes–Dodson principle (14–6), which expresses the relationship of arousal to performance for specific tasks as an inverted U-shaped curve. Such a curve also captures the effects of pharmacologic agents that influence arousal, ranging from drugs that increase arousal, such as the psychostimulants and caffeine (Chapter 13), to drugs that might decrease maladaptive arousal in an extremely anxious person, such as β-adrenergic antagonists and benzodiazepines. Beyond these general permissive effects, dopamine (acting primarily via D1 receptors) and norepinephrine (acting at several receptors) can, at optimal levels, enhance working memory and aspects of attention. Drugs used for this purpose include, as stated above, methylphenidate, amphetamines, atomoxetine, and desipramine. Modafinil is effective in improving both arousal and attention; it is believed to act (like methylphenidate) by inhibiting the dopamine transporter, although this is not established with certainty (Chapter 13).




14–6


Yerkes–Dodson Principle. This principle (dating from 1908) captures the inverted U-shaped relationship between arousal and performance. Performance on diverse cognitive tasks improves with arousal, but only up to a point; when arousal becomes too great, performance declines. Shown here are two Yerkes–Dodson curves illustrating that lower levels of arousal are optimal for hard tasks (eg, tasks that demand greater cognitive resources) and higher levels for easy tasks or tasks that require greater persistence.
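A minimal way to express the inverted-U relationship is a Gaussian-shaped performance function whose peak shifts with task difficulty. The Gaussian form and the numbers below are illustrative assumptions, not a fitted model of any dataset.

```python
import math

def performance(arousal, optimal_arousal, width=0.3):
    """Inverted-U (Gaussian-shaped) performance as a function of arousal.

    Both the functional form and the parameter values are hypothetical,
    chosen only to sketch the Yerkes-Dodson relationship.
    """
    return math.exp(-((arousal - optimal_arousal) / width) ** 2)

# hard tasks peak at lower arousal than easy tasks (two shifted curves)
hard_task = {a / 10: performance(a / 10, optimal_arousal=0.3) for a in range(11)}
easy_task = {a / 10: performance(a / 10, optimal_arousal=0.7) for a in range(11)}
print(max(hard_task, key=hard_task.get), max(easy_task, key=easy_task.get))
```

Shifting `optimal_arousal` between the two curves reproduces the qualitative claim in the figure: past the task-specific optimum, further arousal degrades rather than improves performance.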





Attention deficit hyperactivity disorder


ADHD is characterized by symptoms in three dimensions of behavior: inattention, impulsiveness, and hyperactivity. Hyperactivity may be absent, in which case the term attention deficit disorder (ADD) may be used. These dimensions are continuous with normality, but, when severe, ADHD produces significant impairment. ADHD can be a profound obstacle to success in school or work, despite normal intelligence. It also increases the risk for substance abuse and accidents, as well as for comorbid depression, anxiety disorders, and conduct disorders.



ADHD begins in childhood, often very early, but is typically diagnosed when children enter school. While symptoms often remit during the teen years, a substantial fraction of individuals with ADHD remain symptomatic in adulthood. Based on current diagnostic criteria, worldwide prevalence ranges between 3% and 5%, but widely divergent diagnostic practices in different regions of the world produce varying local estimates. Indeed, the diagnosis and treatment of ADHD vary widely in the United States, with far higher rates in affluent suburban communities than in poorer inner-city areas.



ADHD is highly heritable, with genetic factors accounting for ~75% of the risk. However, like all major psychiatric disorders, including depression, bipolar disorder, and schizophrenia, ADHD is genetically complex, making the identification of risk alleles very challenging. Elucidating the genetic factors that contribute to ADHD risk awaits large-scale genome-wide studies.



ADHD can be conceptualized as a disorder of executive function; specifically, ADHD is characterized by reduced ability to exert and maintain cognitive control of behavior. Compared with healthy individuals, those with ADHD have diminished ability to suppress inappropriate prepotent responses to stimuli (impaired response inhibition) and diminished ability to inhibit responses to irrelevant stimuli (impaired interference suppression). Such deficits have also been documented in functional MRI studies.



The ability to suppress prepotent responses is thought to require the action of frontal-striatal-thalamic circuits (14–7). A series of parallel loops connect the prefrontal cortex with specific regions of the basal ganglia and, via the thalamus, project back to the prefrontal cortex. These loops are thought to be involved in the initiation and control of motor behavior, attention, cognition, and reward responses. Functional neuroimaging in humans demonstrates activation of the prefrontal cortex and caudate nucleus (part of the dorsal striatum) in tasks that demand inhibitory control of behavior. Subjects with ADHD exhibit less activation of the medial prefrontal cortex than healthy controls, even when they succeed in such tasks, and appear to utilize different circuits.
