



© Springer International Publishing Switzerland 2015
Emiliano Bruner (ed.), Human Paleoneurology, Springer Series in Bio-/Neuroinformatics 3, DOI 10.1007/978-3-319-08500-5_8


8. Cognitive Archaeology and the Cognitive Sciences



Frederick L. Coolidge (1), Thomas Wynn (2), Karenleigh A. Overmann (1) and James M. Hicks (1)


(1)
Psychology Department, University of Colorado, Colorado Springs, CO, USA

(2)
Anthropology Department, University of Colorado, Colorado Springs, CO, USA

 



 




Abstract

Cognitive archaeology uses cognitive and psychological models to interpret the archaeological record. This chapter outlines several components that may be essential in building effective cognitive archaeological arguments. It also presents a two-stage perspective on the development of modern cognition, based primarily upon the work of Coolidge and Wynn. The first stage describes the transition from arboreal to terrestrial life in later Homo and the possible cognitive repercussions of terrestrial sleep. The second stage proposes that a genetic event may have enhanced working memory in Homo sapiens (specifically in terms of Baddeley’s multicomponent working memory model). The present chapter also reviews the archaeological and neurological bases for modern thinking, with the latter arguments grounded primarily in the significance of the morphometric rescaling of the parietal lobes, which appears to have distinguished Homo sapiens from Neandertals.


Keywords
Cognitive archaeology · Working memory · Enhanced working memory · Parietal lobes · Intraparietal sulcus · IPS · Precuneus · Homo sapiens · Neandertals


Cognitive archaeology is an approach to understanding cognitive evolution that privileges the material record of past actions. Archaeological remains constitute the only direct evidence of past behavior that science possesses and, despite inherent biases in their deposition and discovery, provide an indirect avenue to the prehistoric mind itself. The challenges in approaching the prehistoric mind through the archaeological record are methodological, not epistemological. Archaeology can and does provide evidence for a variety of prehistoric activities, including tool making and use, subsistence, social arrangement in space, and even symbol use. From these, archaeologists have constructed an evolutionary sequence of hominin behavior and culture that parallels the evolutionary sequence of hominin fossils for 2.6 million years. With appropriate methods, this evolutionary record has tremendous potential for informing science about the cognitive abilities of hominin ancestors and near relatives.

There are several essential components in an effective cognitive archaeological argument (Wynn 2002, 2009a, b). First, it is necessary to have an explicit theory of cognition; common-sense categories such as ‘abstract’ or ‘complex’ are simply too vague to have much interpretive power. Worse, common-sense categories have rarely been defined and evaluated experimentally. In comparison, an explicit theory of cognition provides evolutionary analysis with well-defined cognitive categories and experimental/ethological evidence to support them. Second, the cognitive archaeologist must identify behavioral sequelae of these well-defined cognitive categories that would leave, potentially at least, an archaeological record. Ideally, this step would be supported by actualistic and experimental studies, a corroborative step that has rarely been taken. Third, the cognitive archaeologist must identify specific material attributes by which one can document the presence of this activity/cognitive ability in the past.

The archaeological record itself has several inherent biases that distort its perspective on the past and which must be considered when forming a cognitive archaeological argument. Very little of the material vestiges of past activity survives the ravages of time. Most organic material decays away quickly, and even bone disintegrates in many sedimentary contexts. In addition, the further back in time one looks, the less has been preserved. Thus preservation and other natural processes of destruction have impressed a quasi-progressive pattern on the archaeological record; there is just a greater variety of evidence for more recent time periods. Dating is a perennial problem. In spite of a plethora of complex dating techniques, archaeologists of human evolution must often operate with only vague, geological-scale assessments of the age of important remains, a problem obviously shared with human paleontology. These biases are significant enough that it is often possible for the same evidence to be used in very different interpretations of the past.

Cognitive archaeologists must also eschew the traditional terms and concepts of Palaeolithic archaeology whenever possible. Terms such as ‘Mousterian’ or ‘gravette’ were defined over a century ago, when unilineal evolution and typology dominated prehistoric studies. They have no inherent cognitive implications. Indeed, unreflective use of such categories has been fairly rampant in recent archaeological attempts to document the evolution of the ‘modern’ mind. It is unfortunately impossible to banish these terms altogether. They are, after all, the common tongue of Palaeolithic archaeologists, and in this guise enhance academic communication. But it is possible, even necessary, to avoid using them as analytical tools and instead define appropriate terms and concepts using cognitive science.

Over the last 20 years cognitive archaeology has grown dramatically, and it is now impossible to summarize all of its contributions to understanding cognitive evolution. Here we will simply discuss three points in hominin cognitive evolution that appear to have been especially significant: the evolution of Homo erectus sensu lato 1.8 million years ago, the evolution of modern working memory capacity about 100,000 years ago, and the morphometric rescaling of the parietal lobes in Homo sapiens that is not evident in Neandertals. Note that our use of the term Homo erectus should be understood sensu lato, that is, ‘in the broad sense’, as the inclusive term for early African and Asian Homo populations. However, because these populations show relatively homogeneous cognitive and behavioral patterns, we will use the name archaic humans in the discussion that follows; our initial use of the term Homo erectus sensu lato is thus synonymous with our later use of the term archaic humans.


The First Leap in Cognition: Transition from Tree to Ground Sleep in Archaic Humans


As an example of the kinds of questions and issues that cognitive archaeologists address, Coolidge and Wynn (2009) proposed that there were two significant leaps in the evolution of human cognitive abilities. The first leap was proposed to have occurred about 1.8 million years ago with the transition from life in trees to life on the ground in archaic humans, the second major species appearing in the Homo lineage. The first group of Homo species, often referred to as habilines after one of the varieties, Homo habilis, was heterogeneous in many anatomical respects but was included in the genus Homo for two reasons: first, their brains were about 60 % larger than those of the earlier australopithecines, and second, they appear to have been associated with the stone tools of the Mode 1 or Oldowan technocomplex. However, there is possible evidence of stone tool use in butchery by australopithecines at Dikika, Ethiopia, at 3.39 million years ago (McPherron et al. 2010), although this claim has been challenged (Domínguez-Rodrigo et al. 2010). Despite this apparent difference in behavior (which, along with their increased brain size, suggests differences in cognition), the habilines had the same general body proportions as the australopithecines, suggesting that despite their bigger brains, they still spent a considerable amount of time in trees (see Wong (2006) for a review of the evidence for and the debate over the amount of bipedality or arboreality in these early species).

By comparison, archaic humans, which appeared after 2.0 million years ago, have been associated with Mode 2 or Acheulean stone tools (bifacial handaxes and cleavers) and almost certainly slept on the ground. One of the most famous specimens of these archaic humans was Nariokotome, who appeared, even at the age he died (about 8–11 years old), to have attained a stature of about 160 cm, and who might have attained an adult height of about 170–185 cm, although a more recent study estimates his adult height might have been only 163 cm (Graves et al. 2010). The muscle attachments on his long bones suggested he led a strenuous life, and his relatively long legs and narrow hips gave him a classic body type for both distance running and heat loss, ideal for life on the ground in hot, open habitats (Bramble and Lieberman 2004; Cachel and Harris 1998; Lieberman et al. 2009; Ruff 1991, 1993). This body type would have been very different from the smaller, tree-dwelling bodies of the australopithecines and habilines. Further, Nariokotome’s endocranial volume was about 880 cc, and his adult endocranial volume might have been 910 cc, which makes him fairly representative, since H. erectus crania range from a low of 700 cc in the earliest specimens to a high of over 1,200 cc in the latest (Antón 2003). His potential adult cranial volume therefore represents an absolute (but not relative) increase of nearly 50 % compared to that of the habilines.
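For readers who want the arithmetic spelled out, the comparison can be made explicit. The habiline value used here (a mean endocranial volume of roughly 610 cc) is an assumed round figure supplied for illustration only, not a value reported in this chapter:

\[
\frac{V_{\text{Nariokotome, adult}}}{V_{\text{habiline, assumed}}} \approx \frac{910\ \text{cc}}{610\ \text{cc}} \approx 1.49,
\]

an absolute increase of nearly 50 %. The increase is not matched in relative terms because archaic humans were also considerably larger-bodied than the habilines.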


The Archaeological Record for Archaic Humans


The archaeological record for archaic humans boasts several evolutionary ‘firsts’, including movement out of tropical woodlands and out of Africa altogether. Most of these firsts have at least some cognitive implications. Here we will focus on just two: the advent of biface technology and developments linked to sleeping on the ground.

Before turning to bifaces, a brief description of pre-biface lithic technology is in order. Hominins began making stone tools over 2.6 million years ago (Semaw et al. 1997, 2003, 2009), and these tools and the sites where they were found have been important clues in paleoanthropology’s attempts to understand the initial encephalization that occurred in the genus Homo (Schick and Toth 2006, 2009; Toth and Schick 2006, 2009). Various approaches to this Mode 1 or Oldowan evidence suggested that the advent of lithic technology marked a Rubicon in hominin evolution (Schick and Toth 2006). However, more formal cognitive archaeological analyses have yielded different results. Assessments from different cognitive theoretical perspectives—Piagetian (Wynn 1981) and cognitive neuroscience (Wynn 2002)—reveal that this earliest lithic technology was very apelike and therefore unlikely to have itself selected for larger brains or neural reorganization (e.g., Wynn et al. 2011). Neuroimaging research suggests that the attentional demands of Mode 1 tool creation were related more to perceptual-motor control than to cognitive control (e.g., Stout and Chaminade 2007; Stout et al. 2000, 2008, 2011).

By comparison, the Mode 2 bifaces produced by archaic humans were clearly outside the range of ape abilities. Bifaces were comparatively large stone tools made by trimming bifacially around most or all of a large flake or core to produce handaxes and cleavers. The resultant tool has a continuous edge suitable for a variety of tasks, including butchery (Toth 1987). Several characteristics of bifaces have significant cognitive implications. Here we will focus on the imposition of shape and the curation of the artifacts; we will also briefly mention the implications for social learning.

The idea that archaic humans imposed a shape on bifaces has been challenged by several archaeologists (Davidson and Noble 1989, 1993; McPherron 2000; Noble and Davidson 1996), but to date, none have been able to provide convincing evidence that bilateral symmetry like that found on even the earliest bifaces could emerge as a regular, unintended consequence of producing flakes. Most authorities agree that the hominins intended that the artifacts be bilaterally symmetrical. This does not require that they possessed a detailed visual image of a target biface (a ‘mental template’ in archaeological jargon), but it does require attention to shape. Here again cognitive archaeologists’ reliance on established theories has provided corroborating assessments. Wynn (2002) and later Hodgson (2009) pointed to an important and well-known feature of visual processing: spatial information is processed through a dorsal stream that includes occipital and parietal lobe resources, whereas shape information is processed through a ventral stream of occipital and inferior temporal lobe resources.

To make a biface, the hominin had to coordinate both information streams, a feat unknown in apes. In their experimental/neuroimaging studies of biface manufacture, Stout and colleagues have been able to document the use of neural resources in the manufacture of bifaces that were not evident in the motor/spatial procedures used in producing Mode 1 tools. In particular, they noted involvement of “…the anterior intraparietal sulcus and inferior frontal sulcus, both in left hemisphere” (Stout et al. 2011, p. 1334), areas that have been linked to the mirror neuron system and are thus suggestive of social learning, which would have been critical to imparting the skills involved in biface manufacture. While there are undoubted differences between the brains of modern human research subjects and those of Homo erectus, studies such as those performed by Stout and colleagues enable us to make inferences about ancient hominin brain activity during tool manufacture. Thus, cognitive archaeology has identified significant evolutionary developments in the procedural cognition of archaic humans: abilities for coordinating previously separate neural resources and for learning complex procedures by observation.

Curation refers to the habit of carrying tools from place to place as a regular component of a tool kit. Mode 1 tools, associated with the habilines, were rarely treated this way. Instead, raw material was carried to use sites where flakes were knapped, used, and discarded. Tool use was task specific and ad hoc (Toth 1985; Wynn 1981), although some researchers claim evidence that Oldowan hominins transported tools and raw material over longer distances (e.g., Braun et al. 2008). Bifaces, by contrast, were routinely curated, and it is relatively clear that tools did exist as a category of object in the minds of archaic humans; for the first time there was not only tool use but an idea of ‘tool’. The neural resources for this development probably reside in the inferior temporal lobe, the same area that distinguishes ‘animate’ and ‘inanimate’ objects. The ‘toolness’ of bifaces had a couple of important long-term cognitive consequences. First, via extended cognition, the temporal (in the sense of time) continuity of bifaces may have enabled the bootstrapping (Malafouris 2008) of an expanded personal awareness beyond the ape range: the biface made yesterday is still here today and will be here again tomorrow, and thus so will the knapper. This continuity also sets the stage for displaced reference (the ability to point to something not physically present) through the inherent indexical quality of the biface: the biface not only continues from yesterday, it could stand for its use yesterday or its potential use in the future. One of the puzzles of biface archaeology is why hominins made them and shared an idea of them throughout a significant temporal span and geographic distribution—over a million years and much of the Old World. A role as indexes in a pre-linguistic social system might well account for this troubling pattern (versions of this argument have been explored by many scholars, most influentially by Donald 1991 and Kohn and Mithen 1999).

The skills involved in producing a biface and the persistence of the biface tradition over significant temporal and geographic spans imply that the skills were transmitted between individuals in some form of social learning (Mithen 1999; Stout 2002). Social learning has been observed in modern non-human primates, as, for example, when chimpanzee juveniles learn to probe for termites by observing adults (McGrew 1992, 2004). Although bonobos have learned basic stone knapping, the more complex skills required for bifacial lithic reduction exceed their abilities, even after differences of anatomy (biomechanical constraints limit apes’ ability to deliver stone-knapping gestures as forcefully as humans can) and opportunity (apes lack the cultural intensity of humans) have been taken into account (Caruana et al. 2013; Savage-Rumbaugh et al. 2007). The key to learning and sharing ideas about shape (goal) and how to achieve it (process) is not communication as in language per se (since there is no evidence that archaic humans spoke), but the ability to come to a knowledge of what another understands to be appropriate (Wynn 1995). Social learning can take many nonlinguistic forms, including true imitation of motor procedures and less complex forms such as goal emulation, in which the novice does not copy the observed motor patterns but instead constructs an individual understanding of the goal (Whiten and Byrne 1991; Whiten and Ham 1992). The skill inherent in the successful production of biface technology suggests that archaic humans likely mastered the technique through long periods of observation and repetition.

Bifaces exist today as direct ambassadors from a remote time in evolution when hominins first stepped away from an ape way of doing things and entered a new cognitive/cultural milieu that had no precedents. The development of biface technology was not the only significant cognitive ‘first’ associated with archaic humans. There must also have been changes in sleep patterns linked to sleeping on the ground.


Challenges of Terrestrial Life


The first challenge of life on the ground was the issue of predation. The australopithecines and habilines probably lived, mated, and played in trees, and slept in nests in trees, a lifestyle that would have protected them from predation. This supposition rests on the fact that nest building is a plesiomorphic characteristic of extant apes (even gorillas build nests, though on the ground), which makes it likely that it was also the behavior of the australopithecines. Life on the ground, however, would have required some other form of protection, either a dramatic increase in group size or use of fire, both of which have cognitive implications. Managing an increased number of social relationships would require several cognitive abilities, among them increased facility in facial recognition (for allies and enemies), memory for reciprocity in things like food sharing, and an expanded ability to manage simultaneous relationships (e.g., Dunbar 2003). Controlled use of fire, for which there is now evidence extending back 1 million years (Berna et al. 2012), would likely have selected for some increase in executive function capacity to ensure provisioning of fires and in social cognition to coordinate fire maintenance (Twomey 2013). In addition, life on the ground appears to have included a greatly expanded territorial range, estimated at about 260 square kilometers (about 100 square miles) for archaic humans (Antón and Swisher 2004; Antón et al. 2002), suggesting a concomitant expansion of abilities in spatial cognition and memory.

Several anatomic changes in Homo erectus support the idea that this species spent more time on the ground and less in trees, including changes to upper body anatomy suggestive of reduced reliance on brachiation, as well as a reorganization of the inner ear consistent with the transition to ground life (Spoor and Zonneveld 1998; Steudel-Numbers 2006; Wong 2006). Further, the greater territorial range and overall geographic dispersal of Homo erectus also support the idea that the species had transitioned to a fully bipedal lifestyle that would have additionally entailed more time on the ground, including where it likely slept (Antón 2003; Antón and Swisher 2004; Antón et al. 2002).

However, some of the more interesting cognitive changes were linked to changes in sleep. Although there are at least four distinct stages of mammalian sleep, research tends to focus upon only two stages: Rapid-eye-movement sleep (REM or vivid dream sleep) and slow-wave (delta) sleep. All great apes have REM and slow-wave sleep, although REM varies from about 7–15 % of their total sleep (e.g., Allison and Cicchetti 1976). Most monkeys, with whom modern humans share a more distant common ancestor than is the case with the great apes, also spend about 5–15 % of their sleep in REM. By comparison, modern human REM constitutes 25 % of total sleep time, and it remains a relatively constant percentage across the lifespan after about 10 years of age has been reached (Ohayon et al. 2004).

Based in part on modern evidence of nest building in great apes, as noted previously, Sabater Pi et al. (1997) proposed that early hominins may also have nested in trees. Additional support for arboreal nesting among australopithecines and habilines rests on the evidence of hominin post-cranial anatomy, as climbing selects for more robust forelimb features, longer arms relative to legs, a higher brachial index (length of the forearm relative to the upper arm), a more cranially (headward) oriented glenoid cavity, and curved fingers (Stanley 1992). Simply put, their anatomy (particularly their arms) reflected an adaptation for arboreal locomotion (or brachiation). As all of the australopithecines and habilines retained these features of climbing anatomy, it is reasonable to conclude that these early hominins must have nested in trees and, by extension, retained an ape-like pattern of sleep. Fruth and Hohmann (1996) proposed that nest building may have been a “great leap forward” in the evolution of cognition because the proximate functions of nests may have aided the transfer of information; that is, primates nesting physically close to one another could warn nearby nesting primates of impending danger and transmit other types of signals. Further, they proposed that the protection and stability provided by sleeping in nests may have increased the proportions of REM and slow-wave sleep, which in turn may have helped to establish and consolidate memories.

By comparison, archaic humans (as exemplified by Nariokotome) possessed relatively short arms compared to the long, brachiation-adapted arms of the australopithecines and habilines, anatomy that suggests archaic humans had transitioned to life on the ground, a lifestyle that not only entailed new forms of protection, social organization, and technologies but also improved the quality and quantity of sleep. Full ground sleep would have preserved the general integrity of the sleep period (e.g., sleep became less fragmented because it was less subject to the danger of falling and to disruptive vagaries such as strong breezes and bad weather), further aiding increases in slow-wave and REM sleep. In addition, there may have been three specific evolutionary benefits for cognition from sleep on the ground: (1) threat simulation and priming, (2) creativity, and (3) procedural memory consolidation and enhancement.

Threat simulation and priming. Revonsuo (2000) argued that the waking lives of most modern dreamers are unlikely to involve high levels of daily threat, especially attacks by wild animals. He hypothesized that dreamers nonetheless relive archaic dream themes, particularly ones that simulate the real dangers of the ancestral environment, including falling, encounters with natural disasters, and threats posed by strange people and wild animals. He reasoned that, through natural selection, dreaming came to have the function of threat rehearsal, threat perception, and threat avoidance. The selective advantage would come from dream-theme repetition that would enhance and prepare waking threat-avoidance skills, a process experimental psychologists call priming. A study by Malcolm-Smith and Solms (2004) provides some support for Revonsuo’s hypothesis: about 9 % of a sample of college students reported realistic physical threats in a dream, and about 3 % reported a realistic escape response in the face of the threat. Although these percentages are rather low, they demonstrate that threat and escape-from-threat themes are neither rare nor absent in dreams. Revonsuo and others (Revonsuo and Valli 2000; Valli et al. 2005) thus view the present dream-production system as one that simulates threatening events of the ancestral environment while also making the dreamer more ready (priming) for an actual waking threat.

Franklin and Zyphur (2005) broadened Revonsuo’s hypothesis and proposed that dreaming may not only serve to simulate ancestral threats but may also serve as a more general rehearsal mechanism for scenarios encountered in daily life, subsequently and positively influencing waking encounters. Such a mechanism would facilitate decision making without significant influence from conscious awareness or language. If this hypothesis is correct, dreaming may have played a more important role in Homo evolution prior to the advent of language. Physiological evidence for the hypothesis comes from the activation during REM sleep of the ventromedial prefrontal cortex (PFC), amygdala, and anterior cingulate cortex, which mediate some of the affective executive functions and are thought to play a strong role in social and interpersonal decision-making and evaluations (e.g., Gazzaniga et al. 2013).

Creativity. There is extensive anecdotal evidence for a link between creativity and dreaming. Artists and musicians who have claimed inspiration from dreams include Dürer, Goya, Blake, Rousseau, Dalí, Mozart, Wagner, and Billy Joel, among many others. Perhaps most famously, Samuel Taylor Coleridge claimed that his poem Kubla Khan came to him in a dream. The Russian chemist Dmitri Mendeleyev said that he conceived of the periodic table in a dream, and the German chemist Friedrich August Kekulé claimed that the circular structure of the benzene molecule came to him in a dream (see Hartmann 1998; Van de Castle 1983 for a more complete discussion of dreams and creative thinking). In a survey of contemporary mathematicians, over 50 % reported that they had at least once solved a mathematical problem in a dream (Krippner and Hughes 1970). The Indian mathematician Ramanujan averred that a form of the goddess Lakshmi gave him many mathematical ideas in his dreams.

Empirical evidence for sleep and creative problem solving comes from a study by Wagner et al. (2004), who gave human participants a cognitive task that required learning stimulus-response sequences (which they deemed an implicit procedural memory task) and in which improvement, as measured by reaction time, was evident over trials. Participants could improve abruptly if they gained insight into a hidden abstract rule. Twice as many of the participants who slept became aware of the hidden rule as did those who stayed awake, regardless of the time of day. The authors postulated that the participants’ greater insight was not a strengthening of the procedural memory itself but involved a novel restructuring of the original representations. Wagner et al. suspected that cell assemblies representing newly learned tasks were reactivated by the hippocampal structures during sleep and incorporated by the neocortex into preexisting long-term memories. They hypothesized that this process of incorporation into long-term storage formed the basis for the remodeling and qualitatively different restructuring of representations in memory.

Visuospatial, autobiographical, and procedural memory consolidation and enhancement. Winson (1990) presented empirical evidence from rats for REM’s role in memory consolidation. He demonstrated that a theta rhythm (6 Hz) arises in the hippocampus in association with exploratory behavior. Because exploratory behavior appears critical to rats’ survival, Winson reasoned that one purpose of REM sleep might be the strengthening and consolidation of visuospatial (procedural) memories. There is also substantial empirical evidence for the enhancement of various kinds of procedural memories in human sleep. Karni et al. (1994) demonstrated consolidation-based enhancement on a procedural visuospatial discrimination task. Stickgold et al. (2000) used the same task and found that the consolidation enhancement depended only on the first night of sleep following acquisition and that learning was correlated with both the amount of slow-wave sleep and the amount of REM sleep. Again using the same task, Gais et al. (2000) concluded that consolidation-based enhancement might be instigated by slow-wave sleep, whereas REM and Stage 2 sleep may solidify and add to the enhancement effect, but that slow-wave sleep was a necessary component.

As a function of these sleep changes, and by means of the phenomenological contents of sleep (dreaming), (1) early archaic humans may have been primed to escape and avoid threatening events in their waking environments and may have rehearsed social scenarios; (2) the contents of REM sleep may have promoted creativity and innovation; and (3) sleep changes may have aided autobiographical (episodic), visuospatial, and procedural memories, which include memories for motor skills, visuospatial discriminations, and spatial locations, without any further acquisition or practice. It is also important to note that recent fMRI and polysomnography studies of sleeping humans have suggested that both REM and slow-wave sleep may have independent but complementary roles in emotional memory processing (Cairney et al. 2014). Further, Tononi and Cirelli (2014) have proposed a synaptic homeostasis hypothesis, which holds that during sleep spontaneous neuronal activity renormalizes synapses, restores synaptic strength, and promotes neuronal homeostasis. One net effect of this process is that memories may be strengthened and new associations may be created. These findings and assertions support our contention that the tree-to-ground sleep transition had a potentially profound effect on the lives of archaic humans.


The Second Leap in Cognition: Expanded Working Memory in Homo sapiens


Recently, paleoanthropologists (e.g., Henshilwood et al. 2004; Henshilwood and Dubreuil 2009) have made several claims for symbolic behavior and ‘fully syntactical language’ based on the presence of artifacts such as shell beads in South African Middle Stone Age sites. Botha (2008, 2012), Wynn (2009a), and Wynn and Coolidge (2010) have noted that these claims lack adequate bridging arguments, that is, the putative steps that tie the artifacts to symbolic behavior and symbolic behavior to fully modern language. As we noted earlier, to be persuasive, such linking arguments must articulate with well-established theories or models of cognition. Working memory is such a model. It was initially proposed in 1974 by the experimental psychologists Alan Baddeley and Graham Hitch and has dominated and stimulated contemporary memory research over the past four decades (Baddeley 2001, 2007; Baddeley and Hitch 1974; nota bene: the concept of working memory has also been used in a narrower sense, and such narrow use does not imply Baddeley’s multi-component model). Importantly, various psychometric measures of working memory capacity have been found to correlate with a variety of critical cognitive abilities, including reading comprehension, vocabulary learning, language comprehension, language acquisition, second-language learning, spelling, storytelling, logical and emotional reasoning, suppression of designated events, certain types of psychopathology, fluid intelligence, and general intelligence. The relationship with fluid intelligence is an important one because fluid intelligence measures the ability to solve novel problems. It appears to be less influenced by learning and culture and more influenced by some feral or inherent talent for figuring out solutions to new problems. Thus, Baddeley’s model, which incorporates the executive functions metaphor, is a natural heuristic for inquiries into the evolution of modern thinking.

Baddeley’s multi-component working memory model currently consists of (1) a central executive, (2) a phonological loop, (3) a visuospatial sketchpad, and (4) an episodic buffer. Baddeley’s idea of a central executive is virtually synonymous with concepts associated with the literature on the executive functions of the frontal lobes, concepts that remain primarily within the discipline of neuropsychology, whose suppositions generally (but not exclusively) rely upon inferences based on brain-damaged patients. Briefly, the central executive directs attention to complete tasks based on short-term and long-term goals. It does so by attending to appropriate tasks at hand that are consonant with short- and long-term goals, inhibiting irrelevant stimuli and distractions, and making decisions based on sequenced plans of action. The central executive also directs two subsystems, the phonological loop and the visuospatial sketchpad. The phonological loop contains two elements, a short-term phonological store of speech and other sounds and an articulatory loop that maintains and rehearses information either vocally or subvocally. The visuospatial sketchpad incorporates the maintenance and integration of visual (“what” information, such as objects) and spatial (“where” information, such as location in space) elements and a means of refreshing these elements by rehearsal. Finally, the episodic buffer serves as a temporary memory system for the central executive and, according to Baddeley (2000), integrates and stores information from the other two subsystems.
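Purely as an illustrative schematic of how these components relate to one another, the toy sketch below (our own illustration in Python; the class and method names are invented for this purpose and do not come from Baddeley or from the paleoneurological literature, and no computational claim is intended) casts the central executive as a coordinator that rehearses verbal material in the phonological loop, binds object and location information in the visuospatial sketchpad, and integrates both into the episodic buffer.

```python
from collections import deque


class PhonologicalLoop:
    """Short-term store for speech and other sounds, kept alive by rehearsal."""

    def __init__(self, capacity=4):
        self.store = deque(maxlen=capacity)  # limited-capacity phonological store

    def rehearse(self, item):
        # Articulatory (subvocal) rehearsal refreshes the item in the store.
        self.store.append(item)
        return item


class VisuospatialSketchpad:
    """Maintains and integrates 'what' (object) and 'where' (location) information."""

    def __init__(self):
        self.bindings = {}  # object -> location

    def bind(self, obj, location):
        self.bindings[obj] = location
        return (obj, location)


class EpisodicBuffer:
    """Temporary multimodal store that integrates material from the two subsystems."""

    def __init__(self, capacity=4):
        self.episodes = deque(maxlen=capacity)

    def integrate(self, *chunks):
        episode = tuple(chunk for chunk in chunks if chunk is not None)
        self.episodes.append(episode)
        return episode


class CentralExecutive:
    """Directs attention toward a goal, inhibits distraction, and coordinates the subsystems."""

    def __init__(self):
        self.loop = PhonologicalLoop()
        self.sketchpad = VisuospatialSketchpad()
        self.buffer = EpisodicBuffer()

    def attend(self, goal, sound=None, obj=None, location=None):
        verbal = self.loop.rehearse(sound) if sound is not None else None
        visuospatial = self.sketchpad.bind(obj, location) if obj is not None else None
        # The episodic buffer binds the goal, verbal, and visuospatial material into one episode.
        return self.buffer.integrate(goal, verbal, visuospatial)


if __name__ == "__main__":
    wm = CentralExecutive()
    print(wm.attend(goal="knap a biface", sound="strike the platform",
                    obj="core", location="left hand"))
```

The point of the sketch is only the division of labor the model describes: storage and rehearsal belong to the subsystems, while goal-directed coordination and integration are executive operations, which is the sense in which the model is invoked in the arguments that follow.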

A core component of Coolidge and Wynn’s (2009) second major leap in cognition is Baddeley’s central executive. Baddeley and others (e.g., Baddeley and Logie 1999; Miyake and Shah 1999; Osaka et al. 2007) view the central executive as comprising a variety of functions, including attention, active inhibition, decision making, planning, sequencing, temporal tagging, and the manipulation, updating, maintenance, and integration of multimodal information from the two subsystems. Importantly, the central executive takes control when novel tasks are suddenly introduced, and one of its key functions is to override pre-existing habits and to inhibit prepotent (previously well-learned) but task-inappropriate responses. The central executive also takes control when danger threatens and task-relevant decisions must be made.
