
CHAPTER 19
Early Intervention for Delirium


David Meagher1 and Walter Cullen2


1 Department of Psychiatry, University of Limerick Medical School, Limerick, Ireland


2 Department of General Practice, University of Limerick Medical School, Limerick, Ireland


Key points



  1. Delirium is common across all health care settings and has a major adverse impact upon outcomes, the magnitude of which is predicted by the occurrence and severity of delirium.
  2. Patients at high risk can be readily identified and a variety of interventions can reduce delirium incidence and severity of emergent cases.
  3. Delayed detection or nondetection occurs in approximately 50% of cases of delirium.
  4. Improved identification can be achieved through routine and systematic screening for cognitive impairment and neuropsychiatric symptoms of delirium.
  5. Evidence to support pharmacological management of delirium is gathering with the increasing emergence of randomised studies, including some placebo-controlled designs.
  6. Improved delirium care requires fundamental changes to the organisation of health care environments that require the combined efforts of clinicians and health care management.

Delirium – a key target for early intervention


Delirium is an acute neuropsychiatric syndrome characterised by a complex constellation of cognitive impairments and neuropsychiatric disturbances that reflect generalised impairment of brain function. It is common in many clinical settings, occurring in approximately one in five hospitalised patients, with rates of up to 90% reported among patients in palliative and intensive care settings [1]. No other psychiatric disorder has such penetration across health care settings – this frequency, along with the complexity of clinical presentation, where typically half of cases go undetected [2], makes more timely intervention a key health care priority.


Delirium is especially common in patients with diminished cognitive function and/or multiple chronic illnesses, in whom these predisposing factors interact with acute precipitants to result in acute brain failure. Many of these factors are avoidable (and many have a significant iatrogenic component), such that better service organisation can allow preventative measures to produce better patient outcomes and reduced health care costs. Early intervention includes strategies that embrace primary prevention as well as more optimal management of emergent cases so as to reduce the serious consequences of delirium and its complications.


The impact of delirium on health care outcomes


Optimising delirium care is an important health care priority for several reasons. In addition to its frequency, delirium has a considerable impact on patient outcomes and health care costs. Patients with delirium experience more prolonged hospitalisations, more complications, greater costs of care, reduced subsequent functional independence, and increased in-hospital and subsequent mortality [3]. Importantly, these adverse health and social outcomes are predicted by the presence of delirium and are relatively independent of confounding factors such as morbidity level, baseline cognition, age and frailty. In addition, delirium may be an accelerating and possibly causal factor in the development of dementia [4, 5].


Primary prevention of delirium


Delirium risk factors


Efforts at early intervention are assisted by studies that have identified a range of patient, illness and treatment factors that predict the likelihood of developing delirium. The vulnerability of certain individuals to delirium, or “delirium readiness”, emphasises how a variety of predisposing factors interact with acute precipitating insults to produce the acute brain failure of delirium.


Inouye and Charpentier [6] developed a model comprising four predisposing factors (cognitive impairment, severe illness, visual impairment, and dehydration) and five precipitating factors (polypharmacy, catheterization, use of restraints, malnutrition, any iatrogenic event) that predicted a 17-fold variation in the relative risk of developing delirium. Baseline risk is especially important such that patients with high baseline vulnerability can develop delirium even in response to minor precipitants.
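

This kind of predisposing/precipitating model lends itself to a simple rule-based check at the point of admission. The sketch below is purely illustrative, assuming hypothetical factor names and cut-offs rather than the published scoring rules, but it shows how baseline vulnerability and acute insults might be combined to flag patients for preventative measures.

```python
# Illustrative sketch only: a count-based stratification inspired by the
# predisposing/precipitating model described above. Factor names and cut-offs
# are hypothetical and do NOT reproduce the published Inouye scoring rules.

PREDISPOSING = {"cognitive_impairment", "severe_illness", "visual_impairment", "dehydration"}
PRECIPITATING = {"polypharmacy", "catheterization", "restraints", "malnutrition", "iatrogenic_event"}

def stratify_delirium_risk(patient_factors: set[str]) -> str:
    """Return a crude risk band from the number of baseline and acute factors present."""
    baseline = len(patient_factors & PREDISPOSING)    # vulnerability on admission
    acute = len(patient_factors & PRECIPITATING)      # insults during the admission
    if baseline >= 2 or (baseline >= 1 and acute >= 2):
        return "high"
    if baseline == 1 or acute >= 1:
        return "intermediate"
    return "low"

# Example: baseline cognitive impairment plus catheterisation and polypharmacy
# during the admission would be flagged as high risk.
print(stratify_delirium_risk({"cognitive_impairment", "catheterization", "polypharmacy"}))  # high
```

In practice such rules would be embedded in admission documentation or electronic care pathways rather than run as standalone code.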


Table 19.1 lists these factors in detail. Certain factors are more relevant in particular settings and patient groups, but age extremes, pre-existing cognitive problems, severe comorbid illness, and psychotropic medication exposure are robust predictors of delirium risk across populations. Many risk factors are modifiable, while others can help in assessing the risk–benefit balance of surgical and other interventions when deciding upon optimal care, especially in frail elderly patients with cognitive impairments. Minimising exposure to modifiable factors has been the focus of intervention studies (see below) that demonstrate that delirium is highly preventable. Many interventions involve elements of good medical and nursing care (e.g. avoiding unnecessary polypharmacy, correcting sensory deficits). That these practices need to be protocolised within complex interventions reflects the standard of routine care provided in real-world settings, such that delirium may be a marker of the dysfunctionality of health care systems.


Table 19.1 Risk factors for delirium





A. Patient
   Age
   Pre-existing cognitive impairment
   Previous delirium episode
   CNS disorder
   Increased blood–brain barrier permeability
   Poor nutritional status

B. Illness
   Severity of comorbidity
   Burns
   HIV/AIDS
   Organ insufficiency
   Infection (e.g. UTI)
   Hypoxemia
   Fracture
   Hypothermia/fever
   Metabolic disturbances
   Dehydration
   Low serum albumin
   Nicotine withdrawal
   Uncontrolled pain

C. Intervention
   Perioperative
   Type of surgery (e.g. hip)
   Emergency procedure
   Duration of operation
   Catheterization

D. Medications
   Polypharmacy
   Drug/alcohol dependence
   Psychoactive drug use
   Specific drugs (e.g. anticholinergics)

E. Environment
   Social isolation
   Sensory extremes
   Visual deficit
   Hearing deficits
   Immobility
   Novel environment
   Stress
   Use of restraints

Nonpharmacological interventions


Primary prevention of delirium through nonpharmacological risk-reduction strategies has been demonstrated in elderly medical [7, 8] and surgical [9–11] populations. These studies also indicate that incident delirium is less severe and of shorter duration when it occurs in the setting of active management of delirium proneness. Simple interventions (e.g. consensus guidelines, educational interventions) have limited impact [12, 13], which is unsurprising given the complex range of factors involved in delirium causation; these factors are more amenable to multifaceted interventions.


Multifaceted interventions include many common elements that focus upon assisting orientation, correcting sensory deficits, promoting sleep, providing pain relief, optimising physiological parameters (electrolytes, hydration), physical therapy/mobilisation and active review by specialist nurses [7, 9] or geriatricians/geriatric psychiatrists [14]. A widely adopted intervention, the Hospital Elder Life Program (HELP), which targets six risk factors using standardised protocols, can reduce the incidence and duration of delirium episodes relative to controls [7]. Other randomised studies comparing perioperative geriatric consultation with usual care in elderly hip fracture patients found reduced delirium incidence and severity in the intervention group [15–17].


Overall, the evidence indicates that improving awareness of the importance of delirium through proactive education of staff involved in the care of patients at high risk of delirium, combined with risk factor reduction, systematic screening for delirium and patient-tailored treatment of emergent delirium, can reduce the incidence, severity and duration of incident delirium and delirium-related mortality [9, 10, 17]. The impact of interventions depends on the degree of implementation [15]. However, these measures appear more effective in preventing delirium in patients at high risk for reasons other than dementia [10, 16–18]. Moreover, the impact upon longer-term outcomes such as independence and mortality at 1-year follow-up is less impressive, especially in populations with high rates of comorbid dementia [19].


The timing of preventative interventions may also be a key factor. ‘Prehabilitation’ involves risk factor reduction to optimise preparedness for nonemergency procedures or other interventions that represent key periods of delirium risk. Bjorkelund et al. [11] described an intervention commencing in the pre-hospital period that focused on optimising preparedness for surgery in patients with hip fracture (addressing hydration, oxygenation, analgesia, polypharmacy and care environment routines) and significantly reduced delirium incidence.


The success with which these programmes can be applied to other settings where risk factors for delirium may differ (e.g. palliative care, community-based settings) is less clear. Siddiqi et al. [20] described a delirium prevention programme for care homes based upon elements with demonstrated effectiveness in hospitalised patients. Marcantonio et al. [21] conducted a randomised trial of a delirium abatement program for post-acute nursing facilities involving nurse-led detection and treatment, which increased delirium detection (from 12% to 41%) but did not reduce delirium persistence at 1-month follow-up. Of note, implementation rates for the intervention were modest, reflecting the challenges of providing for delirious patients in such settings. Gagnon et al. [13] used a simple intervention targeting clinician awareness of delirium risk and monitoring of medication changes, combined with involving families in delirium recognition, but it did not significantly reduce delirium incidence, perhaps reflecting the different risk factor profile in palliative care (e.g. opioid exposure, vital organ failure). Moreover, the intervention was minimal, emphasising education and monitoring rather than implementation of specific protocolised measures, and as such highlights the knowledge–practice divide. These studies highlight the need to tailor interventions to the particular needs of different settings and patient groups.


The success of complex delirium prevention programmes is linked to system factors [22, 23] that include: (a) involvement of clinical leaders, (b) support from senior management, (c) linking the implementation of programmes to periods of system change (e.g. realignment of care pathways), (d) educational elements that are sustained and engaging, (e) mechanisms to support decision-making that are integrated into everyday routines (e.g. electronic care pathways), and (f) monitoring procedures to promote continued adherence. In general, improving delirium care through formalised interventions is best achieved where it is supported by activities that promote enthusiasm, support implementation, remove barriers and allow for progress monitoring.


The workload and cost implications of interventions are key considerations. The ‘HELP’ programme, for example, requires skilled interdisciplinary staff and trained volunteers to implement standardised protocols, and the use of its copyrighted protocols involves a fee. However, preliminary studies suggest that proactively dealing with delirium risk can be cost neutral [24] and reduces the nursing workload caused by disturbed behaviours [25]. Given the evidence that the impact of delirium reflects more prolonged hospitalisations and the need for interventions to address complications [26], better management of delirium and its complications should reduce health care costs.


Pharmacological prophylaxis


The prevailing neurochemical theory emphasises dopaminergic and cholinergic systems in the pathophysiology of delirium. As a consequence, interest has focused upon agents that diminish dopaminergic or enhance cholinergic function. Studies have explored the impact of neuroleptic and procholinergic agents in the prevention of delirium in high-risk populations.


Neuroleptic agents


Studies have explored the impact of prophylactic use of typical antipsychotic agents upon delirium incidence. Kalisvaart et al. [27], in a randomised, double-blind prophylaxis study of low-dose haloperidol versus placebo given for up to 3 days before and 3 days after hip surgery, found significantly shorter and less severe delirium, and shorter length of stay, in the active treatment group, but no significant impact upon delirium incidence.


Other studies have focused on atypical antipsychotic agents. Prakanrattana and Prapaitrakool [28] reported significantly reduced delirium incidence in patients receiving a single 1 mg dose of risperidone versus placebo when emerging from cardiac surgery. Similarly, Larsen et al. [29] reported a double-blind, placebo-controlled trial of olanzapine 5 mg given the day before and after orthopaedic surgery in elderly patients, in which the active treatment group experienced significantly reduced delirium incidence and a greater likelihood of being discharged home rather than to a rehabilitation facility.


Procholinergic strategies


The use of procholinergic strategies is also supported by neurochemical theories of delirium. Although some work with rivastigmine suggests that sustained use may protect against delirium in patients with dementia [30], studies with donepezil [31, 32] have not identified a prophylactic effect, perhaps due to the limited duration of use prior to delirium risk exposure.


Other strategies


Careful management of agitation and pain in severely ill patients can reduce delirium incidence. The α-2 agonist dexmedetomidine, used for post-operative sedation, is associated with reduced delirium incidence and reduced ventilator time compared with midazolam in patients undergoing mechanical ventilation [33], and compared with propofol or fentanyl/midazolam for sedation after cardiac surgery [34]. Pandharipande et al. [35] linked lorazepam use with increased risk of delirium in ICU patients. A placebo-controlled study of melatonin (0.5 mg nocte for up to 14 days) in elderly medical admissions found significantly reduced delirium incidence [36].


Careful titration of analgesia can reduce delirium incidence. Tokita et al. [37] found lower delirium incidence and superior pain control in elderly patients using patient-controlled epidural anaesthesia with bupivacaine and fentanyl compared with continuous epidural mepivacaine. An open-label study found that opioid rotation from morphine to fentanyl was associated with better pain control and reduced delirium severity [38]. Protocolised care to optimise the use of medications for pain and agitation in the ICU can reduce the duration of mechanical ventilation, improve pain management and reduce delirium symptoms [39].


Overall, the evidence in respect of antipsychotic agents is encouraging, but there are uncertainties as to their role across settings, with limited work in high-risk elderly medical inpatients and in patients with comorbid dementia. In addition, the interaction between pharmacological and nonpharmacological interventions needs to be clarified. Moreover, optimal doses and timing of treatment are uncertain since studies have focused on low-dose, brief exposure.


Improving the identification of patients at risk of delirium


In addition to the factors shown in Table 19.1, a variety of neuropsychological, biological and pathophysiological measures may allow for better identification of delirium risk and/or may be markers of the process by which delirium emerges.


Neuropsychological markers of delirium proneness


Pre-existing cognitive impairment is a well-recognised risk factor for delirium, with comorbid dementia evident in more than 50% of cases of delirium occurring in the hospitalised elderly. In addition, more subtle impairments of neuropsychological functioning are associated with increased delirium risk in nondemented patients. These include tests of attention, vigilance, memory, visuospatial function, graphomotor speed and executive function [40–45].


Cognitive performance in the immediate post-operative period also predicts delirium likelihood [28]. Lowery et al. [46] found that impaired cognitive performance is common during the post-operative period in patients who do not develop delirium, although the impairments differed: nondelirious patients experienced a decrease in vigilance, whereas delirious patients were impaired in the accuracy of attention. This work emphasises the need to qualify the character and extent of the acute deterioration of cognition that equates with delirium, since impairment of cognitive performance is common during periods of high morbidity or exposure to interventions that confer elevated delirium risk, including in many patients who do not develop syndromal delirium. These studies also highlight how formal neuropsychological testing can identify patients at risk of delirium for whom preventative actions are especially recommended.


Biological markers of delirium


Biological markers may assist in the recognition and monitoring of delirium. The EEG shows generalised slowing in delirium and can distinguish delirium from dementia [47], but generalised slowing lacks specificity and the practicalities of performing EEG on delirious patients limit widespread use. Other work has linked peripheral measures of anticholinergic activity to delirium proneness [48], but findings are inconsistent. Post-operative delirium has been characterised by reduced serum levels of amino acids, with tryptophan levels below 40 μg/mL suggested as a measure for delirium detection [49]. However, this pattern may reflect the physiological response to surgery rather than a specific pathophysiological mechanism for delirium.


Delirium symptoms overlap with features of the sickness behaviour that occurs in inflammatory responses. C-reactive protein is a marker of the acute phase of the inflammatory response that predicts delirium incidence in elderly medical admissions but lacks specificity for delirium and may be of greater use in monitoring progress [50]. Delirium often occurs in infectious states that involve cytokine activation, and therapeutic use of cytokines can induce delirium. Studies of cytokine levels and delirium incidence have been inconsistent, with some indicating elevated proinflammatory cytokines in elderly medical admissions [51, 52], while a study in surgical patients found elevated chemokine levels in the immediate post-operative period without a significant rise in cytokine levels [53]. Cytokines and chemokines are associated with cognitive function and cholinergic activity, such that their role as predisposing factors, precipitating factors or mere epiphenomena needs to be further explored.


Delirium detection


Delirium is poorly recognised in real-world practice, where 50% or more of cases are diagnosed late or missed completely [54–56]. Early detection of delirium can allow for more timely and effective intervention, while poor detection is associated with poorer outcomes that include elevated mortality [57, 58]. Low detection rates have been reported among cases that involve a hypoactive clinical presentation [58], comorbid dementia [59], a history of previous psychiatric problems [54] or prominent pain [54], and among cases that occur in the perioperative period [60].


The accurate detection of delirium is hampered by the continued use of a range of synonyms that reflect delirium occurring in different patient groups and treatment settings (e.g. acute confusion, ICU psychosis, toxic encephalopathy) rather than discrete scientific entities. Lakatos et al. [61] highlighted that delirium is not accurately diagnosed even where patients experience serious complications typical of delirium.


Screening


Delirium recognition in everyday practice is best achieved with routine screening. However, such screening is atypical, and the reality is ad hoc practice that lacks a systematic approach with validated tools. In part, this reflects uncertainty regarding the optimal screening method: because delirium is primarily a cognitive disorder, bedside assessment of cognition is a critical element of diagnosis, but limiting screening to cognitive assessment lacks specificity for delirium [62]. Screening may be best achieved by assessing for a combination of cognitive and noncognitive delirium features, although including noncognitive features adds greater subjectivity to ratings. However, the problem of false negatives means that tools need to prioritise sensitivity over specificity.
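

This trade-off can be made concrete with simple arithmetic. The sketch below uses invented figures (a 200-admission cohort with 20% delirium prevalence and two hypothetical tools) to show that sacrificing sensitivity for specificity converts a handful of extra confirmatory assessments into missed cases.

```python
# Illustrative arithmetic only: the cohort size, prevalence and tool characteristics
# below are invented for the example and are not taken from any study cited here.

def screening_outcomes(n_patients, prevalence, sensitivity, specificity):
    """Return (missed cases, false alarms) for a screening tool applied to a cohort."""
    cases = n_patients * prevalence
    non_cases = n_patients - cases
    missed = cases * (1 - sensitivity)              # false negatives: delirium not flagged
    false_alarms = non_cases * (1 - specificity)    # false positives: extra assessments
    return round(missed, 1), round(false_alarms, 1)

# 200 admissions, 20% prevalence (40 true cases):
print(screening_outcomes(200, 0.20, sensitivity=0.95, specificity=0.80))  # (2.0, 32.0)
print(screening_outcomes(200, 0.20, sensitivity=0.80, specificity=0.95))  # (8.0, 8.0)
```

A false alarm costs a brief confirmatory assessment; a missed case carries the adverse outcomes described earlier, which is why sensitivity is usually weighted more heavily.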


Wong [63] reviewed bedside instruments for delirium detection with regard to their sensitivity and specificity for DSM-IV delirium as well as their suitability for use by various health care professionals. The Confusion Assessment Method (CAM) [64] was identified as the optimal instrument, albeit with a number of caveats, which are discussed below. Key considerations include the availability of time for testing and the skill set of the tester; although the CAM is estimated to take approximately 5 minutes to complete, even this may have implications for clinical workload.


Cognitive tests


Bedside cognitive screening tests are sensitive to cognitive disorder but lack the specificity to differentiate delirium from dementia. Other work has focused upon cognitive functions that are disproportionately affected in delirium but relatively preserved in normal ageing and early dementia [65], including tests of visual attention [66] and of visual perception and memory [67]. However, these tests require tester expertise and patient cooperation, which is often lacking due to agitation and uncooperativeness in hyperactive patients, or lethargy and hypersomnolence in hypoactive patients. Leonard et al. [68] studied symptoms of depression and delirium in palliative care admissions and identified that any measure that required patient cooperation was sensitive to the presence of delirium, including self-report measures of mood. The extent to which inability to cooperate with testing is itself an indicator of delirium in different populations is a key consideration in determining approaches that allow assessment of uncooperative patients.


Delirium-specific screening tools


Other instruments incorporate neuropsychiatric and contextual features for delirium detection. Of these, the CAM is the most widely used delirium screening tool in general hospitals [64]. It is a brief (5-minute), four-item algorithm based on DSM-III-R criteria that has been adapted for use in the ICU [69] and nursing homes [70], where structured testing of each feature allows for more reliable rating. Wei et al. [71] reviewed the CAM's scale attributes across studies and reported a sensitivity of 94%, a specificity of 89% and inter-rater reliability typically over 70%. However, CAM accuracy is affected by the background and training of raters [72, 73] and by the frequency of comorbid dementia in the population under study [59, 71]. CAM sensitivity can be increased by using structured cognitive tests, but this involves more prolonged administration time. Its accuracy in identifying more difficult cases, such as those with hypoactive presentations, requires further study.
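

For illustration, the CAM's four-feature logic is commonly summarised as requiring features 1 and 2 plus either feature 3 or feature 4; a minimal sketch of that rule is shown below. The rule itself is trivial to encode; the reliability problems discussed above arise in how each feature is rated.

```python
# Minimal sketch of the CAM decision rule as commonly summarised:
# delirium is suggested when features 1 and 2 are present plus either 3 or 4.
# How each feature is rated (interview, structured cognitive testing, informant
# history) is where rater background and training exert their effect.

from dataclasses import dataclass

@dataclass
class CAMFeatures:
    acute_onset_fluctuating: bool   # feature 1: acute onset and fluctuating course
    inattention: bool               # feature 2
    disorganised_thinking: bool     # feature 3
    altered_consciousness: bool     # feature 4: altered level of consciousness

def cam_positive(f: CAMFeatures) -> bool:
    return (f.acute_onset_fluctuating
            and f.inattention
            and (f.disorganised_thinking or f.altered_consciousness))

# Example: a drowsy, inattentive patient with an acute, fluctuating change screens
# positive even without overt disorganised thinking.
print(cam_positive(CAMFeatures(True, True, False, True)))  # True
```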


Other instruments emphasise observed patient behaviour. The Delirium Rating Scale (DRS) [74], the Revised Delirium Rating Scale (DRS-R98) [75] and the Memorial Delirium Assessment Scale (MDAS) [76] are well-recognised delirium symptom assessment and diagnostic scales but are too detailed for the purposes of screening. The NEECHAM Confusion Scale [77] takes less than 5 minutes and includes cognitive, behavioural and physiological items; it has high sensitivity with moderate specificity in medical and intensive care settings [78, 79]. The Nursing Delirium Screening Scale (Nu-DESC) is even briefer (1 minute) and assesses five delirium features, including hypoactive symptoms [80]. It has high sensitivity and specificity in palliative care and internal medicine settings but requires that the rater take account of the patient’s medical condition. Interestingly, it does not include ratings of either inattention or the context of disturbances but compares favourably with the CAM in recovery room patients [81] and on surgical wards [82].


Improving delirium detection
