CHAPTER 10

Early Intervention in Older Adults – A Focus on Alzheimer's Dementia

Brian Lawlor¹ and Celia O'Hare²

¹ Mercer's Institute for Research on Ageing; Trinity College Dublin; St Patrick's and St James's Hospitals, Dublin, Ireland
² The Irish Longitudinal Study on Ageing (TILDA), Department of Medical Gerontology, Trinity College Dublin, Ireland

Early intervention in the elderly may, for those unfamiliar with the speciality of old age psychiatry, appear to be a contradiction in terms. The parallels with the early intervention movement in psychosis are many. It is notable, for example, that the Kraepelinian term 'dementia praecox' has been proposed as typifying the therapeutic nihilism endemic in the once-accepted clinical approach to psychosis. Similarly, the raison d'être for an early intervention movement in geriatric psychiatry could be a reaction to the pervading view that brain failure is an inevitable consequence of the ageing process. It thus appears that the same seismic shifts heralded in youth mental health by Pat McGorry and his contemporaries are coming to old age psychiatry [1]. It is, of course, the seismic shifts in population trends that have given new urgency to the search for better ways to diagnose and treat the common mental health problems of later life. If the population of those over the age of 65 continues to increase in line with current demographic trends, so too will the prevalence of all age-related mental health problems [2]. It is, however, in no small part what some see as the impending 'dementia emergency' that has crystallised research efforts and siphoned public funding.

The economic arguments

At the time of writing, the best estimates indicate that there are over 5 million people living with dementia in the USA [3]. In 30 years' time it is predicted that this figure will have trebled. Governments around the world are finally taking note of such seemingly apocalyptic predictions and of top-heavy demographic charts in which the proportion of younger generations is so graphically dwindling that the bottom pillars seem to shrink beneath the weight of this potential future burden. In the UK, the 'Dementia Strategy' has highlighted the key targets of early diagnosis along with changes to both public and professional attitudes to dementia and its treatment [4]. The current economic cost of dementia is estimated to be 1% of global GDP [5]. In 2010, the World Alzheimer Report estimated that if dementia were a country, it would be the world's eighteenth largest economy. In no other fatal illness would a diagnosis rate of 30% be tolerated, yet this is the current standard in the UK [6]. Unfortunately, however, even when a diagnosis does come, it is often when the illness is already advanced; too late to prevent harm and crises [4]. There are persuasive economic arguments in favour of early diagnosis and intervention. It is reported that, despite the significant upfront investment required to establish early intervention services, these will quickly prove cost-effective [7]. Early intervention in older adults makes sound fiscal policy, in particular given its potential to delay institutionalisation, perhaps the most pertinent outcome for older adults.
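The arithmetic behind such projections is easily unpacked. As a back-of-the-envelope illustration (ours, rather than a figure from the sources cited above), a trebling of prevalence over 30 years corresponds to a compound annual growth rate r satisfying

\[ (1 + r)^{30} = 3 \quad\Longrightarrow\quad r = 3^{1/30} - 1 \approx 0.037, \]

that is, growth of roughly 3.7% per year, which would carry the current 5 million US cases to approximately 15 million.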
Moving towards a staging model

'Symptomatic' Alzheimer's disease (AD)

As we have seen, in our current system even those with frank dementia may struggle to achieve a diagnosis. Whereas in specialist centres the accurate diagnosis of dementia is reported as above 90% [8], the patient presenting to their GP may still fail to be referred for further specialist assessment, let alone receive an accurate diagnosis. Such is the discrepancy between the pace of research and the standard of care most of our patients can expect that even the symptomatic treatments which are available remain out of reach for many. As is widely noted, until the advent of a disease-modifying treatment, many nonspecialists may be reluctant to diagnose a terminal neurodegenerative disorder [9]. It is critical, therefore, to translate recent leaps in knowledge not only into a source of empowerment and optimism for patients and carers, but also into concrete changes to the care received by some of our most vulnerable patients.

Making a better, earlier diagnosis

In 2011, diagnostic guidelines that had stood largely unaltered since 1984 were finally revised to reflect new understandings gleaned from the intervening decades of research [10]. The diagnostic criteria for mild cognitive impairment (MCI) and AD now incorporate 'biological markers', or biomarkers, although these remain outside current standard clinical practice and are still a source of debate. Some, for example, consider that the temporal ordering of biomarkers needs to be more convincingly determined [11]. The amyloid hypothesis remains the most accepted account of the neuropathological changes leading to AD. The theory posits that once the toxic amyloid-beta (Aβ) peptide begins to aggregate, a cascade of events is triggered that produces the classical phenotype of AD [12]. The biomarkers now in use, and most of those in development, are each considered to reflect different points in this underlying pathophysiology. As demonstrated in Figure 10.1, five AD biomarkers are currently considered sufficiently robust to be included in the new guidelines. These fall into two distinct categories: the major biomarkers of brain Aβ deposition (low cerebrospinal fluid (CSF) Aβ42 and positive amyloid imaging on positron emission tomography (PET)) and the biomarkers of neuronal injury or neurodegeneration (elevated CSF total tau and phosphorylated tau, decreased fluorodeoxyglucose (FDG) uptake on PET in the temporoparietal cortex, and regional atrophy on structural MRI). To determine definitively whether these biomarkers accurately reflect the temporal progression of the disease, longitudinal studies spanning 30–40 years would in theory be required [13]. In the absence of such data, it can only be assumed that biomarkers drawn from individuals at different clinical stages represent changes along the continuum of the disease. Based on such models, change in CSF Aβ is proposed as the earliest available marker, correlating best with the asymptomatic phase; the markers of neuronal injury occur later and tally better with changes in cognitive function. The clinical implications of such data are immense: we may soon be in a position to estimate time to clinical onset of symptoms, and thus perhaps to identify those in whom therapeutic interventions should be targeted.

'Preclinical' Alzheimer's disease

Perhaps the biggest shift in the guidelines comes in the proposals for 'preclinical' AD, that is, an asymptomatic or presymptomatic phase [14]. This highlights a readiness to suggest that those who may be destined to develop AD but are yet to display symptoms can be identified via biomarkers, albeit only for research purposes at this time.
This change in emphasis is based not only on a growing confidence that such biomarkers can accurately identify those with levels of amyloid outside what is normal for their age, but also on the imperative to ensure that research is less hampered by the effects of heterogeneous cohorts often overly dependent on subjective reports and clinical acumen. Many parallels can be drawn, and often are, between current thinking on dementia and the pervading view of cancer 40 years ago. These days, carcinoma in situ is accepted terminology for the preclinical stage at which intervention is key and definitive curative treatment is possible. Increasingly, the staging model of disease so familiar to other medical specialities is gaining favour in the study of dementia, not just as a schema for hypothesis building but as a means to invigorate research and encourage early intervention. It is proposed that, through accurate delineation of the neuropathological and cognitive features accompanying each stage, timely intervention will be possible. It is surmised that movement in both directions may occur, with such staging also helping to evaluate treatment outcomes accurately. Biomarkers promise to make AD diagnosis easier and faster, and to ensure that disease-modifying treatments can be targeted where they are most likely to make a difference: prior to the symptomatic phase.

'Prodromal' Alzheimer's disease

At present, those with amnestic MCI (considered by many to be the symptomatic 'prodromal' phase of AD, intervening between the asymptomatic phase and overt dementia) are the target intervention group of choice. Studies consistently show a substantial yearly rate of transition from MCI to AD [15]. The ethical considerations of trialling novel treatments in this group (who in a research paradigm are also likely to demonstrate more than one positive putative biomarker) may seem straightforward. There are, however, many remaining questions: why, for example, do some people transition back to normal functioning while still others never convert [16]? The ethics of treating those with an as yet ill-defined clinical trajectory should therefore not be underestimated. As discussed in more detail below, flaws in trial design and diagnostic variation across centres are thought to account for at least some of these discrepancies. The foremost current clinical indication for biomarkers, however, may indeed be in the MCI phase: despite such apparent anomalies, research indicates that those with the clinical syndrome of MCI together with both a positive Aβ biomarker and a positive biomarker of neuronal injury are the most likely to have underlying AD neurodegeneration, with only a relatively short time before progression to dementia occurs.
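The cumulative force of the yearly transition rates noted above is worth making concrete. As a back-of-the-envelope illustration (the 10% annual conversion rate here is assumed purely for this example; reported rates vary considerably between cohorts), if a proportion p of an MCI cohort converts each year, the expected proportion having converted within n years is

\[ P(n) = 1 - (1 - p)^{n}, \qquad \text{for example } 1 - (1 - 0.10)^{5} \approx 0.41, \]

that is, roughly four in ten converting within 5 years, which helps to explain why this group is the target intervention group of choice.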
Intervening in Alzheimer's disease – a life course perspective

Genetic risk

Each of us is born with a specific genetically determined risk profile for many diseases, including genes that will help to determine our personal cognitive trajectory. AD genetic risk can be broadly divided into autosomal dominant mutations (encoding 'familial' AD, which classically presents before the age of 65) and genes that increase risk for the sporadic form of the disorder. The best defined of the latter is apolipoprotein E (APOE); carrying the ε4 allele increases risk (markedly so with two copies), while the ε2 allele decreases risk for future AD [17]. As with most psychiatric disorders, many other genes are implicated to a lesser extent, and it is surmised that, despite an inherent genetic risk profile, it is ultimately gene–environment interactions that determine expression of the AD phenotype [18]: an underlying heterogeneity of genetic risk combined with an ever-varying array of lifestyle exposures and idiosyncratic responses. Those with familial forms of AD have a predictable age at which clinical symptoms become apparent. Researchers evaluating biomarkers in probands of varying ages have been able to distinguish a pattern of gradual accumulation of the hallmark features of AD. Studies following those with the autosomal dominant forms of AD suggest that the characteristic pathological brain changes in this group begin much earlier than previously thought, most likely decades prior to a formal diagnosis of AD [19]. Cognitive testing has also proven sensitive, in small cohorts, in discriminating those with hereditary forms of AD from sporadic AD and normally ageing cohorts. Impairment in 'shape–colour binding' has been posited as a possible specific early cognitive marker of AD [20]. Targeting the genes that code for the abnormal proteins may be the earliest feasible intervention point. Stem cell therapy and therapies targeting epigenetic mechanisms may make this possible in the future, although none has as yet progressed to clinical-stage trials [21]. Although studying those with the genetically determined form of AD undoubtedly adds to our overall knowledge and understanding, extrapolating these findings to the sporadic cases (which make up over 95% of the total) may be unsound. Indeed, not all genetic mutations necessarily confer an increased risk of AD: recent exciting research from Iceland has identified a rare mutation in amyloid-β precursor protein (APP) that is in fact protective against AD [22]. It is present in approximately 0.5% of the Icelandic population and has a similar prevalence in the other Scandinavian countries. It appears that this mutation reduces the build-up of amyloid in the brain (demonstrating a 40% reduction in the formation of amyloidogenic peptides in in vitro models), leading to improved cognitive outcomes: not just a reduced incidence of AD but also improved resilience to the cognitive decline associated with normal ageing.

Normal ageing

Studies charting the course of normal age-related cognitive change suggest that the initial changes may begin as early as the third or fourth decade, with a peak in cognitive performance around the age of 20 followed thereafter by a slow but inevitable decline [23]. Indeed, it is suggested that up to 50% of the age-related decline in cognitive ability occurring before 90 years is already established by the age of 60. Epidemiological observational studies pointing to universal age-related decrements have sparked headlines with their findings, as the decline once seen as the preserve of the decrepit moves perilously close to the rest of us [24]. Sensitive cognitive testing administered in the Whitehall cohort studies, which follow thousands of British civil servants, detected significant decline between the beginning of testing, when a person was in their mid-forties, and follow-up 10 years later. What triggers such decline, and how it starts, remains unknown. Discerning normal ageing from the subtle early pathological processes of AD is therefore difficult (never more so than with the Mini-Mental State Examination (MMSE), which is the extent of the cognitive testing with which most clinicians are familiar).
What we do know, however, is that the biggest risk factor for pathological decline in cognition is undoubtedly increasing age [25].

No cognitive health without physical health and social integration?

The fallacy of the mind–body dichotomy is laid bare in the later decades, as brain and mind are revealed as one as we age. Early intervention in older adult mental health therefore spans not just conventional psychiatric expertise but necessitates cross-disciplinary input, for example in cardiovascular and cerebrovascular health, while equally requiring a knowledge and understanding of the social implications of ageing. Following individuals who manage to resist functional decline, despite the accumulating insults of normal ageing and an ever-increasing risk of AD pathology, may provide insight into ways to maintain cognitive health in the ageing population. Although interventions at population level cannot yet be recommended, research points to a number of key lifestyle factors that may interact with genetic predisposition to increase individual risk, as we often highlight to our own service users in the Mercer's Institute (see Box 10.1: Seven Secrets). Specifically, increased levels of physical activity [26] and better social integration [27] are associated with a lower incidence of dementia. Furthermore, research indicates that working to maintain people's emotional and social networks as they age is likely to reap rewards beyond better mental well-being, helping to maintain cognitive health and independence [28].

Primary prevention

As psychiatrists we are well versed in risk. Stroke risk and ischaemic heart disease risk are arguably more readily assessed, and indeed managed, than any other type of risk in a psychiatric population. Here, perhaps, the role of the psychiatrist is as physician first: to acknowledge and manage these prime targets for possible primary prevention, whether alone or with the support of the patient's primary care physician or geriatrician. The path to a healthy heart may also lead to a healthy brain. Research points to mid-life hypertension, obesity, smoking and diabetes as potentially sowing the seeds of AD in later life [29]. There is substantial overlap between AD and cerebrovascular disease. Cerebral hypoperfusion resulting from ischaemia and cerebrovascular disease may accelerate the accumulation of amyloid, thus potentially playing a pivotal role in the pathogenesis of AD [30]; indeed, elevated levels of microinfarction are a consistent finding at autopsy in individuals with AD [31].

Secondary prevention

Secondary prevention may be promoted by an increased awareness of physical 'frailty', no longer simply an imprecise clinical impression in geriatric medicine but an increasingly well-defined phenotype [32] (one widely used operationalisation is sketched below). Indeed, the proposed mechanisms underlying physical frailty have much in common with those mentioned in this chapter in relation to AD, from oxidative stress to cardiovascular risk. Autopsy studies have also revealed significantly elevated levels of AD neuropathology in those with the highest levels of frailty close to death, leading some to argue further for a common underlying pathophysiology [33]. Again, it is unclear which precedes which, but intervening in either the physical or the cognitive domain may have knock-on benefits for overall function.
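To illustrate what an 'increasingly well-defined phenotype' can look like in practice, the following minimal sketch encodes the widely cited Fried criteria for physical frailty. This is offered purely as an illustration: the choice of the Fried operationalisation is our assumption rather than a definition given in this chapter, and the thresholds noted in the comments are simplified.

```python
from dataclasses import dataclass


@dataclass
class FrailtyAssessment:
    """Five Fried criteria; True means the criterion is met."""
    unintentional_weight_loss: bool  # e.g. >4.5 kg lost in the past year
    self_reported_exhaustion: bool
    weak_grip_strength: bool         # lowest quintile for sex and BMI
    slow_walking_speed: bool         # lowest quintile for sex and height
    low_physical_activity: bool      # lowest quintile of energy expenditure

    def score(self) -> int:
        # Count how many of the five criteria are present.
        return sum([
            self.unintentional_weight_loss,
            self.self_reported_exhaustion,
            self.weak_grip_strength,
            self.slow_walking_speed,
            self.low_physical_activity,
        ])

    def category(self) -> str:
        # Fried convention: 0 = robust, 1-2 = pre-frail, >=3 = frail.
        s = self.score()
        if s >= 3:
            return "frail"
        return "pre-frail" if s >= 1 else "robust"


# Example: two criteria met -> 'pre-frail'.
print(FrailtyAssessment(True, True, False, False, False).category())
```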
Tertiary prevention

Keeping physical function in mind once symptomatic AD is established is also particularly important. Optimisation of physical health is likely to help maintain function and quality of life, delay institutionalisation and lessen carer burden [34].

Cognitive reserve – a lifetime in the learning?

Just as with cancer, the ever-elusive cure may simply reflect individual complexity [35]. The parallels with oncology thus continue, as the definitive AD treatment is unlikely to be a 'one solution fits all'. Multifactorial origins will likely mean that complex, multifaceted treatments are required, just as the course of a malignancy and its response to treatment are rarely readily predictable. Explanations for such variation range from associated environmental risks to a person's personality and positivity. Such diverse explanations are also mooted for the variations observed in the clinical presentation of AD. Two people, one of whom in life lived with dementia while the other remained independent and self-reliant, may display equivalent levels of plaques and neurofibrillary tangles at post mortem [36]. This may be a sobering thought for those for whom biomarkers are the panacea for the judicious targeting of treatment. Cognitive reserve has been highlighted as one explanation for such variation [37]. Those who in life remained well despite potentially pathological levels of neurodegenerative change have been found to have higher levels of education, to have been more socially engaged and, quite simply, to have had larger brains. It is suggested that such factors enable the brain to cope with greater levels of AD pathology for a longer period, meaning that the individual may be more adept at enlisting compensatory mechanisms to maintain function. Thus arises the question of the ethical implications of identifying, on the basis of cumulative positive biomarkers, those who may be at risk of AD but who may never in fact develop it in their lifetime.

Disease-modifying drugs: battling the slings and arrows of outrageous fortune

Currently there are no disease-modifying agents available for AD. The results of prior trials of treatments aimed at cure remain disappointing for the millions of people living with dementia and their families. Some failures are attributed to targeting the disease process too late in the sequence of neuropathological change [38]. The case for early diagnosis may thus be made from a dual perspective: if the neuropathological processes are to be identified early enough to intervene, the diagnosis itself must also be made early. Here we discuss just some of the past disappointments and the newer agents hopeful of future success.

Anti-amyloid approaches

The majority of AD therapeutic strategies aim to reduce the accumulation of amyloid. The amyloid theory, however, has its detractors, who can readily cite the failure of multiple preclinical successes to translate into human-phase trials. Even when there has been a reduction in amyloid burden, there has not always been an amelioration in cognition. In the amyloid cascade hypothesis, Aβ plays an upstream and pivotal role in the pathogenesis of the disease. Two proteases, β-secretase and γ-secretase, which act on APP in a sequential manner to produce Aβ42, have long been considered prime targets for drug development [39]. A phase III trial of the γ-secretase-targeting compound semagacestat was stopped prematurely owing to significant decrements in cognitive function in the treatment group and poor selectivity of the compound (including inhibition of systemically active Notch signalling, with an associated increased incidence of skin cancer in the treatment group). A Europe-wide multicentre phase III clinical trial is currently underway to assess the efficacy of nilvadipine. This antihypertensive (a 1,4-dihydropyridine L-type calcium channel blocker) reduced the risk of developing AD in large population-based observational studies [40].
Nilvadipine has since been shown to be well tolerated and to stabilise cognition, albeit as yet only in small cohorts [41, 42]. Rather than being simply a selective anti-amyloid approach, work in transgenic mouse models hints at an ability to hit many of the hypothesised pressure points: blood flow, amyloid, tau and inflammation [43]. This multimodal mechanism of action is in some ways analogous to that of clozapine in schizophrenia; indeed, perhaps just as clozapine can lead to improvements in refractory psychosis via numerous receptor pathways, nilvadipine may work especially well in late-onset AD, given that many overlapping pathologies are likely at work.

Targeting the immune response

Inflammation is known to accompany the neuropathological processes of AD; whether it is a consequence or a driver of the disease is yet to be determined. Many remain proponents of the pivotal role of an abnormal neuroinflammatory response despite the failure of trials involving anti-inflammatory compounds. With epidemiological evidence pointing to a 50% reduction in the incidence of AD in those with chronic non-steroidal anti-inflammatory drug (NSAID) use, it is again considered that anti-inflammatory agents may need to be trialled earlier to have an impact [44]. Immunisation, harnessing the immune response rather than preventing it, is in theory capable of targeting the production, aggregation and deposition of amyloid [45]: a tantalising trio of the most pursued anti-amyloid mechanisms of action. This therapeutic strategy, however, has been disappointing, including recent results from trials involving the compound bapineuzumab and another monoclonal antibody, solanezumab, which failed to live up to the promise of earlier-phase studies. Subanalysis of the effects in those with milder forms of AD, however, does suggest some cognitive improvement, perhaps adding wind to the sails of the early interventionists [46]. The year 2013 sees the possibility of three trials commencing in asymptomatic individuals, finally putting the early intervention hypothesis truly to the test [46]. The best known of these will use the compound crenezumab (again a monoclonal antibody) in approximately 300 presymptomatic members of a Colombian kindred carrying a dominantly inherited form of early-onset AD. This trial is set to last 5 years.

Tau late?

Strategies targeting tau, widely accepted to act downstream of Aβ accumulation, offer another possible point of modification within the amyloid hypothesis. Tau may therefore represent the intervention point of choice in those in whom clinical features are already manifest. Two possible therapeutic compounds are lithium and methylthioninium chloride, thought to act by inhibiting tau phosphorylation and aggregation, respectively [45].

The future

Improving research in Alzheimer's disease

Both therapeutic and observational AD trials to date have had numerous widely cited limitations. Principal among these, perhaps, is the heterogeneity of cohorts in which participant selection has been necessarily reliant on subjective report and consensus-based clinical diagnosis [35]. Furthermore, the move towards multisite trials has brought with it problems of variation in training, instrumentation and analysis. Biomarkers may now help dictate the selection of participants for both disease-modifying clinical trials and longitudinal observational studies. Research has also been hampered by the lack of a definitive animal model. The advent of recombinant DNA technology and the discovery of the genes coding for dominantly inherited AD led to the production of murine models of AD and Aβ accumulation.
Critics suggest that these are not an accurate reflection of the human neuropathology: mice do not live long enough to develop AD naturally, and the levels of amyloid present in murine models far exceed those produced in the human brain [47]. This may explain why findings in basic science have not been replicated in humans. For some, improving the quality of research is not enough; the scope must also be broadened. It has been proposed that there is a need to think beyond the accepted confines of the amyloid hypothesis in order to find new therapeutic targets and accompanying therapeutic success. Herrup, for example, envisages a weakened neuron made vulnerable by ageing and an 'initiating injury', be that head trauma or less well-defined insults such as depression or grief [25]. Such theories may explain why specific psychiatric diagnoses have been suggested to relate intimately to cognitive decline, in some cases perhaps representing the earliest indicators of dementia and of transition from MCI. Inevitably, such 'injuries' accumulate over a lifetime, thus increasing the risk of AD.

Ethical considerations

Identifying individuals who are at risk for AD but not currently displaying symptoms is leading to significant ethical and moral debate in the internationally recognised leading research centres focused on AD [48]. In a recently published study focusing on familial forms of AD with specific genetic mutations (cited previously in this chapter), no information regarding participants' carrier status was given [19]. Furthermore, the published results were carefully screened so as to prevent inadvertent identification of participants. The results of any test confirming AD risk have, of course, potentially huge implications, particularly in the continuing absence of successful disease-modifying therapies and the lack of clear delineation of the prognostic implications of presumptive biomarkers [49]. Communicating such uncertainty to patients, their families and friends, and potentially their insurance companies, is laden with potentially life-altering repercussions.

Futurology

Despite the clear ethical pitfalls, the overarching goal remains ethically sound: to improve the diagnosis and treatment of AD, a debilitating, terminal neurodegenerative disorder set to reach epidemic proportions. Attaining this goal is closer than ever before. As detection of the early stages of the disease becomes feasible in a standard clinical setting, there will be ever greater opportunity for therapeutic intervention, to delay progression and perhaps even to arrest the disease process in its tracks. With recent research indicating that cognitive decline may begin as early as the third decade, the need for further collaboration with colleagues specialising in mid-life mental health (and potentially even younger groups) is greater than ever before. It is imperative for all those working in every mental health discipline to increase their knowledge of the key concepts of maintaining cognitive and mental health throughout life. While we await the next bold therapeutic strategy in AD, be that lifelong prevention or late-onset cure, it is incumbent upon all those working in the field to view each novel investigative and treatment modality with cautious optimism.
Staging and early intervention in AD will help prepare patients and their families for the next steps, allow informed choice, increase hope and perhaps eventually enable every patient not just to live well with Alzheimer's disease, but potentially never to develop Alzheimer's dementia at all.