Evolution of the Spinal Delivery of Opiate Analgesics




Abstract


Morphine and other alkaloids were isolated during the 19th century, and their application to ameliorate pain by hypodermic delivery began shortly thereafter. The development of conduction anesthesia by delivery of cocaine to peripheral nerves in the 1880s, and then to the spinal cord in the late 1890s, was accompanied by several reports in which morphine was added, but thereafter its neuraxial use disappeared until the latter half of the 20th century. At this time three important observations were made. First, the specific and precise pharmacology of opiates led to the conclusion that these agents were acting through specific sites (receptors). Second, consistent with the presumed actions, opiates were found to have a potent analgesic effect mediated through an action at specific brain sites, notably the periaqueductal gray. Third, while opiates indeed had a potent effect in the brain, the preclinical demonstration that intrathecal (IT) opiates specifically blocked nociception led to the conclusion that opiates with an action limited to the spinal cord would have behaviorally relevant effects upon spinal pain processing. Subsequent work in humans displayed an important effect upon acute pain processing, and the development of implanted pumps and catheters enabled the use of IT opiates in chronic pain.




Keywords

Cancer pain, Catheter-based infusion of intrathecal opiates, History of intrathecal opiates, Intrathecal granulomas, Intrathecal morphine, Opiate receptors

 






  • Outline

  • Introduction

  • Neuraxial Drugs: The First 20 Years

  • Neuraxial Opiates: The Last 40 Years

    • Preclinical Studies on the Analgesic Actions of Neuraxial Opiates

    • Early Clinical Studies on the Analgesic Actions of Neuraxial Opiates

    • Discovery of the Underlying MOA for Spinal Opiate Actions

    • Identification of Subtypes of Opioid Receptors: Mu, Delta, Kappa

  • Development of Catheter-Based Infusion Techniques

  • Evolution of the Neuraxial Delivery of Targeted Therapeutics

    • Epidural Versus Intrathecal Delivery

    • Intrathecal Pharmacokinetics

  • Other Intrathecal Analgesics

    • Opioids

    • Nonopioid Agents

    • Developing Concerns Over Spinal Drug Safety

  • Side-Effects of IT Administration of Opiates

  • Concluding Commentary

  • References




Introduction


The systematic study of spinal systems and their transmitter biology has shown them to play a defining role in the encoding of afferent input and the tuning of autonomic and somatomotor outflow. Studies on the pharmacology of these spinal linkages have revealed that drugs with an action limited to the spinal cord can have surprisingly robust but selective effects upon behaviorally relevant components, including pain and spasticity. Insights into the mechanisms of action (MOAs) of opiates revealed their potent regulatory effects upon spinal nociceptive processing. Accordingly, spinally directed drug therapy, particularly with opiates, has proven to be a valuable treatment for acute and chronic pain of varying etiologies, with a reduction in the side-effects associated with extraspinal targets affected by systemically delivered drugs. The development of implantable drug delivery systems (IDDS) led to therapeutic interventions that ensure a steady-stream delivery of medication, avoiding both the burden of frequent dosing and the peaks and troughs associated with systemic opiates and other medications. The journey to discovering and developing IT therapy has been fascinating and is the subject of this chapter.




Neuraxial Drugs: The First 20 Years


The development of neuraxial therapeutics begins with the development of a delivery system and the isolation, purification, and creation of a soluble formulation of two natural products, cocaine and morphine.


Delivery System


The notion of delivering agents into the body led to the development of hollow needles and plunger/barrel systems. Several 19th-century individuals are credited with contributing to various degrees to the final product, which appeared around 1870: a calibrated glass barrel with a tapered nose and a removable needle, introduced by Hermann Wülfing Luer (hence the Luer fitting), permitting sterilization and minimizing leakage ( ).


Drugs


It was long appreciated that the chewing of coca leaves by Peruvian Indians led to oral numbness ( ). Niemann, using differential solubility, produced pure white crystals which numbed the tongue; he named the product cocaine. Halsted and Hall in the United States demonstrated clinically efficacious regional or local anesthesia by perineural delivery to most of the somatic nerves, and its MOA was considered to be “conduction anesthesia” ( ).


Opium, or “poppy tears,” is harvested from the dried latex of the opium poppy. The word itself is derived from the Greek “opos,” meaning vegetable juice. An alternative name for opiates, narcotics, is derived from the Greek “narco,” meaning to numb or deaden. Opium was used at least as far back as 4200 BC, likely earlier than can be dated, and appears to have been cultivated for ritualistic functions since the Neolithic Age ( ). Opium was enjoyed by the Sumerians, who called it Hul Gil, the joy plant. 1


1 http://www.poppies.ws/poppies/opium-effects.html

The Sumerians passed it on to the Assyrians, who shared it with the Babylonians, who then gave it to the Egyptians. 2

2 http://opioids.com/timeline/

It was used in various ways, including eating it, drinking it in teas, and smoking it, and for many differing purposes, including insomnia, headache, and, of course, pain and anesthesia. Sertürner in 1803 isolated an alkaloid product in alcoholic extracts of the poppy and, by differential solubility, identified pure crystals that produced sedation in animals and humans. He christened this fraction morphine, after Morpheus, the Greek god of dreams ( ). Though opiate concoctions such as laudanum were typically given orally, the development of purified, water-soluble morphine ( ) and the introduction of hypodermic-needle technology permitted delivery of opiates at specific sites for pain alleviation.


Spinal Anesthesia


In the late 19th century the dangers of general anesthesia (ether, chloroform) were widely appreciated. The utility of cocaine for regional procedures (perineural block and infiltration) was well known ( ). As August Bier noted, “General anesthesia is dangerous, and its scope has fortunately and none too soon been greatly reduced by the advent of Schleich’s infiltration anesthesia and Oberst’s regional method of cocainization. However, for truly major operations, those two approaches have only limited application. I have therefore sought to render large areas of the body insensitive to pain by cocainizing the spinal cord” ( ).


This therapeutic use of intrathecal (IT) delivery was enabled by water-soluble purified cocaine; the syringe and needle; and the extensive work performed by Heinrich Quincke characterizing the lumbar IT tap ( ). In his well-known work, Bier and his assistant Hildebrandt demonstrated the robust, reversible anesthesia resulting from spinal delivery ( ). It was reported in the Lancet ( ) that “by January 1901 there has been very nearly 1000 publications on the use of medullary narcosis…”, proof of the rapid spread of this therapeutic procedure around the world.


Neuraxial Morphine


While the primary agents employed for neuraxial delivery were molecules that produced conduction block (e.g., local anesthetics), it was not lost on early clinicians that other agents might prove useful. Alexander Wood, considered by many to be among the first to use hypodermic drug delivery, published early descriptions of delivering drugs through the skin with a hollow needle, namely water-soluble muriate of morphine (i.e., morphine HCl) for pain relief ( ). Following the American Civil War, wounded soldiers were discharged with syringes and morphine tablets to manage pain secondary to nerve injury and amputation ( ). Accordingly, it is not unexpected that morphine, compatible with delivery by syringe and needle, would be considered for use by the IT route.


Shortly after the initial report by Bier showing the clinical utility of IT cocaine, the Romanian surgeon Racoviceanu-Pitesti first reported the use of morphine for IT analgesia ( ). In 1901 Rudolph Matas, a vascular surgeon in New Orleans, supplemented his spinal anesthetics (synthetic local anesthetics delivered for surgical anesthesia) with morphine. Also in 1901, a Japanese physician, Otojiro Kitagawa, reported the first use of IT morphine solely for analgesia, in patients with uncontrolled back pain ( ). He injected 10 mg of morphine with local anesthetic into two patients, and both achieved superb pain relief. Given the quantity, his patients likely survived without respiratory depression because much of the solution leaked into the epidural space. After these early reports, however, the neuraxial delivery of drugs other than local anesthetics largely disappeared from clinical therapy.




Neuraxial Opiates: The Last 40 Years


Preclinical Studies on the Analgesic Actions of Neuraxial Opiates


During the latter half of the 20th century an increasing interest in the MOA of opiate analgesia developed. At this time, three important observations were made.



  • 1.

By the late 1960s the specific and precise pharmacology of the biological effects of opiates in in vitro and in vivo assays led to the conclusion that these agents were acting through specific sites (receptors) ( ). These data laid the groundwork for subsequent studies describing the presence of specific high-affinity binding sites for opiates in brain tissues, using stereospecific displacement of a radioligand ( ).


  • 2.

Early work on the clinical actions of opiates emphasized their effects on the emotive components of pain behavior, their actions being described as the pharmacological equivalent of a prefrontal lobotomy, in that the pain sensation remained but the affective component was missing ( ). Consistent with the presumed actions on affect, opiates were found to have a potent analgesic effect mediated through an action at specific brain sites, notably the periaqueductal gray, demonstrated most directly by local microinjection of morphine through stereotactically implanted cannulae. These effects were dose dependent, site specific, and naloxone reversible in species from the mouse through primates ( ).


  • 3.

While opiates indeed had a potent effect in the brain, early work showed that morphine could also block spinal reflexes ( ) and reduce the firing of spinal neurons ( ) in spinally transected animals. Whether these local actions of opiates on spinal function had any direct bearing on the organized response of the intact, behaving animal to strong and otherwise aversive stimuli was unknown at this point. The subsequent preclinical demonstration that IT opiates specifically blocked nociceptive responses led to the conclusion that opiates with an action limited to the spinal cord would have behaviorally relevant effects in animals ranging from mice through primates ( ). In these studies, opioids bolused into the IT space inhibited spinal nociceptive reflexes such as tail flick ( ) and skin twitch ( ), as well as more organized responses to nociception, as in hot plate, pinch, or shock-titration tasks, in multiple species ( ). Studies also confirmed that nonnociceptive reflexes were not diminished by the spinal delivery of opiates: autonomic reflexes and light touch were unaffected ( ), and parturition was not altered in rats and rabbits, which opened the way for the use of spinal opiates in labor and delivery ( ).



Early Clinical Studies on the Analgesic Actions of Neuraxial Opiates


Some 3 years after the initial preclinical reports on the use of IT opiates, the efficacy of IT ( ) and epidural ( ) morphine in cancer patients was reported. Subsequent reports extended the use of spinal opiates to obstetric ( ), orthopedic ( ), and postoperative patients ( ).


Discovery of the Underlying MOA for Spinal Opiate Actions


In light of the certain action of opiates within the spinal cord, it was curious that early evidence based on stereospecific opiate binding in brain and spinal tissue homogenates indicated that the spinal cord had relatively little such binding activity ( ). In contrast, later opiate receptor autoradiography revealed that the modest activity present in monkeys and rats was localized within dorsal horn (DH) laminae I, II, and III of Rexed ( ). These early binding studies found that ganglionectomies and sensory root lesions resulted in a large but incomplete reduction in DH opioid binding ( ); that opioid binding sites were present on dorsal root ganglion cells ( ); and that neurotoxins (e.g., capsaicinoids), which destroy small peptidergic primary afferents, reduced opiate binding ( ). Consistent with these findings, early work showed that the in vivo release of substance P from small primary afferents was reduced by the local action of opiates ( ). Aside from an action on primary afferent terminals, several lines of evidence also suggested a postsynaptic site of opiate action in the DH. Thus extensive rhizotomy resulted in only a subtotal depletion of DH binding ( ). Furthermore, a postsynaptic action was demonstrated by the ability of opiates to directly block DH neuron excitation evoked by glutamate, presumed to reflect a direct activation of the DH neuron ( ). The presynaptic action corresponds to the ability of opiates to prevent the opening of voltage-sensitive calcium (Ca ++ ) channels in the afferent terminal, thereby preventing transmitter release; the activation of potassium (K + ) channels, leading to hyperpolarization, is consistent with direct postsynaptic inhibition ( ). The joint ability of opiates to reduce the release of excitatory neurotransmitters from C fibers and to decrease the excitability of DH neurons is believed to account for their powerful and selective effects upon spinal nociceptive processing ( ).


Identification of Subtypes of Opioid Receptors: Mu, Delta, Kappa


The concept of multiple opiate receptors arose from the in vivo work of William Martin at Lexington, Kentucky. He described multiple opiate sites based on: different structure–activity relationships; different potencies of the competitive antagonist naloxone in reversing the effects of different ligands (different competitive antagonist potencies were interpreted as reflecting binding sites for which the antagonist had different affinities, and which were therefore distinct binding sites); and a lack of cross-tolerance between different families of ligands. On the basis of these in vivo criteria, Martin proposed the existence of mu and kappa receptors mediating analgesia, and a sigma site, a non-naloxone-reversible site mediating opiate excitation ( ). In separate work, Kosterlitz et al. ( ) in Scotland found that several pentapeptides (enkephalins) isolated from brain tissue inhibited the evoked contraction of the mouse vas deferens in a naloxone-reversible fashion, as did opiates. The observations that the relative potencies of the enkephalins and morphine differed between the vas deferens and the guinea pig ileum, and that naloxone showed different potencies in reversing equivalently active doses of the two classes of agents, argued for two distinct receptors: mu (for the alkaloids) and delta (for the enkephalins). Based on these observations and the earlier work by Martin, it was suggested that there were three principal classes of naloxone-reversible opioid sites: mu, delta, and kappa. These three subtypes were subsequently confirmed by the isolation, cloning, and expression of three G-protein-coupled receptors displaying the respective pharmacological profiles ( ). The subsequent development of selective antagonists for the mu ( ), delta ( ), and kappa ( ) sites was consistent with the overall body of data.
Spinally administered agonists of the mu, delta, and kappa opioid receptor classes were shown in animal models to modulate nociceptive processing across several stimulus modalities (thermal, chemical, and mechanical) ( ). Selective mu (DAMGO, morphine, sufentanil), delta (DPDPE), and kappa (U50488H, butorphanol) agonists all altered behavioral responses to noxious stimuli. Of note, kappa opioids showed minimal effects on acute nociception but somewhat enhanced activity in models of inflammation and hyperalgesia ( ). As noted below, many of these agents were observed to have notable analgesic activity in humans with pain.




Development of Catheter-Based Infusion Techniques


Continuous spinal anesthesia was attempted as early as 1940 ( ) by inserting a malleable needle into the subarachnoid space and bending it down against the skin, where it was attached to rubber tubing and bolused intermittently as needed. This system was fraught with problems, including difficult lumbar puncture because of the pliability of the needle and concern over dislodgement when laying the patient supine ( ).


Love, a Mayo Clinic surgeon, reported on the placement of a red rubber tube to achieve continuous cerebrospinal fluid (CSF) drainage ( ). In 1945 a military anesthesiologist, Edward B. Tuohy, published a technique for placement of a catheter into the epidural space. An introducer needle was placed in the caudal epidural space; a ureteral catheter was passed through the needle and the needle removed. A Luer lock adapter was attached, through which injections were made. The catheter was taped to the patient’s shoulder to make it “accessible and ready for additional injections” ( ).


The safety of catheters was also examined, first in preclinical models and then validated in humans, with the prolonged presence (4–16 months) of indwelling IT polyethylene catheters. These long-term indwelling catheters were found to be safe; there was no evidence of neurological changes and, at autopsy, no evidence of cord pathology, demyelination, vacuolization, or degeneration ( ). Furthermore, no changes in CSF turbidity suggesting protein precipitation were noted for opiates such as morphine, methadone, and meperidine; the only exception was heroin ( ). After these promising results, the first case of continuous morphine infusion in a patient with chronic cancer pain was reported, using an indwelling catheter infused continuously by an implanted Infusaid pump ( ). The use of the epidural route for morphine delivery with implanted systems was also reported ( ). An alternative approach placed modified Broviac catheters into the epidural space and externalized them to deliver epidural morphine in patients with cancer pain ( ). In a later study of 350 patients with cancer pain ( ), superficial, deep, and epidural catheter infections occurred at a reported rate of 15.1%. Concern over this high infection rate led to the implanting of injection/reservoir ports. These implanted ports produced an almost 50% reduction in the frequency of dislodgments (as well as infections) when compared to externalized systems ( ).


The body of work on IDDS was expanded by Krames et al. with their study of continuous fixed-rate infusion of spinal morphine via implanted pumps (Infusaid model 400, Boston, MA, USA) in 17 patients, 16 of whom had pain of malignant origin. The patients achieved 50%–70% reductions in their pain scores and improved quality of life and function ( ). The one patient in the cohort whose pain was of nonmalignant origin did not benefit from continuous infusion of IT morphine, and the authors, prematurely and without the benefit of further study, argued against the use of IT morphine for pain of nonmalignant origin.


The question then arose as to whether continuous or bolus dosing was superior. One study compared bolus dosing to continuous infusion in 28 cancer patients receiving either IT or epidural morphine ( ). No significant difference was reported in visual analogue pain scores, pain relief scores, or satisfaction scores between the two groups. However, greater dose escalation occurred in the continuous infusion group, perhaps because at that time most IDDS were fixed-rate continuous infusion systems. Of note, alongside the growing interest in using IDDS to deliver IT morphine for pain, there was a parallel development of IT baclofen delivery for the management of spasticity. Intrathecal baclofen, a GABA-B agonist, was reported as early as 1978 to produce potent suppression of spinal motor tone ( ), and it was subsequently used to treat spasticity in humans ( ). By 1990 there were over 120 papers in the literature discussing and reporting on the use of IT drug infusion for the management of spasticity and pain in human subjects.




Evolution of the Neuraxial Delivery of Targeted Therapeutics


Over the ensuing decades, after the original description of the robust effects of opiates on spinal nociceptive processing, a number of issues regarding the IT delivery of opiates arose and were addressed.


Epidural Versus Intrathecal Delivery


Early clinical reports with bolus delivery described the efficacy of both IT and epidural delivery of morphine. While some clinicians ( ) found that epidural delivery of opiates provided excellent pain relief with minimal complications, others ( ) routinely found that this mode of delivery resulted in multiple problems/adverse events (AEs), including scarring, respiratory depression, epidural infections, meningitis, catheter blockage from metastasis, catheter leakage, dislocation, and pain upon injection. These AEs were prevalent enough, reported in up to 55% of patients, especially in the long term ( ), that many physicians treating pain with spinal opioids opted for IT over epidural delivery. Furthermore, epidural delivery appeared to result in more variable outcomes with respect to spinal uptake ( ), analgesic effect, and duration of action. Given these findings, as well as the significantly lower doses required for IT therapy compared to epidural administration of morphine ( ), a clear preference for IT placement in opiate therapy emerged. Moreover, clinicians were choosing to convert epidural to IT catheters, specifically in patients who had previously enjoyed excellent pain control but lost efficacy over time; this was thought to be due to epidural scarring creating a diffusion barrier ( ). Such scarring was widely observed in animal studies ( ) and in humans ( ).


At this same time Medtronic, which had developed a programmable IDDS for the delivery of spinal morphine that was then approved by the US Food and Drug Administration (FDA) only for epidural use, asked Elliot S. Krames of San Francisco, who had published on the use of IDDS for the delivery of IT morphine ( ), to present arguments to the FDA in favor of IT delivery of morphine. These arguments resulted in the FDA approving morphine for IT delivery (personal communication from E. Krames).


Intrathecal Pharmacokinetics


From Bier's first studies, the anesthetic effect after lumbar IT delivery was noted to remain localized, reflecting the restricted distribution of the administered IT drug. This restricted distribution was clearly inconsistent with the notion, which remained popular through the late 20th century, that spinal CSF moved caudally and rostrally like a flowing river ( ). An absence of discernible bulk flow was supported by the work of those studying IT drug redistribution after low-volume bolus injection and low-rate infusion ( ), as well as by a variety of noninvasive imaging procedures in humans and nonhuman models. Imaging studies, however, pointed to an oscillatory movement in the lumbar CSF, resulting from oscillatory pressure gradients established by the cardiac and respiratory cycles ( ). Given this restricted distribution, a number of elements became clear. Drug distribution within the CSF depends on several factors: the location and orientation of the catheter tip relative to the pain-generating spinal dermatomal level; the speed of injection (bolus versus slow continuous infusion), which governs spread from the catheter tip; the infusion dose, with increased doses broadening the concentration gradients around the catheter tip ( ); anatomical variants of the trabecular arachnoid and lumbar subarachnoid ligaments, which create barriers to laminar flow; and CSF oscillation driven by intracranial arterial pulsation, heart rate, and breathing ( ). Overall, IT-infused medications appear to be poorly distributed in the CSF, likely owing to low kinetic energy (low flow rate) and minimal CSF bulk flow.


Given that the target opioid receptors for IT therapy lie in the DH of the spinal cord, specifically lamina II (the substantia gelatinosa), rather than in the roots as with local anesthetics, it was appreciated that IT opiates must penetrate through the pia mater and white matter into the superficial DH of the spinal cord ( ).


Thus early work emphasized that the physicochemical properties of the molecules, notably lipid solubility and molecular weight, played an important role in their movement within the CSF. Polar molecules, such as morphine, and large molecules, such as proteins, were cleared slowly, while lipophilic agents had shorter CSF half-lives secondary to increased diffusion into tissue and increased vascular clearance ( ); lipophilic agents also showed smaller volumes of distribution, with deeper cord penetration, and were considered to have minimal rostral spread ( ).
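The qualitative contrast between slowly cleared polar agents and rapidly cleared lipophilic agents can be illustrated with a minimal first-order elimination sketch. This is not a model from the chapter: it assumes simple single-compartment kinetics, and the rate constants below are purely hypothetical, chosen only to make the half-life relationship concrete.

```python
import math

def half_life(k_elim):
    """Half-life of a first-order process: t1/2 = ln(2) / k."""
    return math.log(2) / k_elim

def csf_concentration(c0, k_elim, t):
    """First-order decay of CSF drug concentration: C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k_elim * t)

# Hypothetical elimination rate constants (per hour), for illustration only:
k_polar = 0.2       # slowly cleared, morphine-like polar molecule
k_lipophilic = 1.5  # rapidly cleared, lipophilic molecule

print(f"polar t1/2      ~ {half_life(k_polar):.2f} h")       # longer CSF residence
print(f"lipophilic t1/2 ~ {half_life(k_lipophilic):.2f} h")  # shorter CSF residence
```

Under these assumptions, the slowly cleared polar agent persists in CSF far longer, which is consistent with the greater rostral redistribution described for morphine relative to lipophilic agents.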




Other Intrathecal Analgesics


Opioids


As reviewed above, a number of opioids have been used intrathecally. Morphine was the first drug employed for IT and epidural delivery in humans and remains the only FDA-approved opioid for IT delivery; sufentanil was approved for epidural delivery. After the initial work with epidural and IT morphine, other opioids were examined for spinal use, including hydromorphone ( ), diamorphine ( ), fentanyl ( ), sufentanil ( ), methadone ( ), and meperidine ( ). Meperidine and sufentanil were seen as promising IT agents for pain control, as both were discussed as having local anesthetic properties ( ), but some meperidine preparations, because of their pH, were noted to damage the internal tubing of pumps, rendering the IDDS nonfunctional and unsuitable for use ( ). Fentanyl and sufentanil have also been used for the management of chronic pain ( ). As noted above, cephalad migration appeared to be less with lipid-soluble agents such as fentanyl and sufentanil, reflecting their clearance first from CSF into tissue, then from tissue into blood, and then, by vascular redistribution, to extraspinal sites ( ). Epidural delivery of butorphanol, an opioid agonist/antagonist with kappa agonist activity, was reported to display efficacy in obstetric patients ( ).


Other opioid agents with a peptide structure were shown to have efficacy after IT delivery in humans. These included β-endorphin, a 31-amino-acid mu/delta ligand ( ); d-Ala2-d-Leu5-enkephalin, a delta-preferring peptide ( ); and dermorphin, a mu-selective peptide ( ).


Nonopioid Agents


Aside from local anesthetics, the major nonopioid medications used intrathecally included baclofen, ziconotide, and clonidine; baclofen and ziconotide were FDA approved as IT monotherapy agents, while clonidine was approved for epidural infusion. Other drug targets were also explored. Several of these targets are discussed below.


Ziconotide. Ziconotide, an ω-conotoxin derived from the venom of the cone snail, is a selective ligand that blocks the N-type voltage-sensitive calcium channel ( ). It is a stable, water-soluble, polybasic, 25-amino-acid peptide containing three disulfide bridges, with a molecular weight of 2639 Da. Ziconotide is FDA approved for IT use in chronic severe pain.


The N-type calcium channel is densely expressed in afferent nerve terminals and mediates the influx of calcium in the afferent terminal, leading to transmitter release.


Intrathecal ziconotide was shown to produce potent antihyperpathic effects after nerve and tissue injury ( ) that did not display tachyphylaxis ( ). Extensive preclinical work displayed no evidence of tissue toxicity ( ). The IT use of ziconotide was fraught with early reports of overdoses, but careful titration revealed efficacy in a variety of chronic clinical pain states ( ).


Clonidine. Alpha-2 adrenoreceptor agonists were found to be analgesic in preclinical models. Mechanistically, these spinal drug effects are mediated by alpha-2 receptors expressed presynaptically on primary afferent terminals and postsynaptically on dorsal horn neurons ( ). Systematic preclinical work revealed no neuraxial toxicity ( ), but a high incidence of hypotension and bradycardia. Intrathecal alpha-2 adrenergic agents were noted to provide synergistic effects in combination with IT opioids ( ), and to enhance and prolong analgesia in combination with IT local anesthetics ( ). The efficacy of chronic IT clonidine for intractable pain was reported ( ).


Midazolam. This molecule, a benzodiazepine that is water soluble at acidic pH, was shown early to interact with the GABA-A receptor ( ). IT midazolam was observed to have depressive effects upon spinal nociceptive processing ( ).


Midazolam was examined for effects upon neuropathic pain, chronic low back pain ( ), and spasticity ( ). Unfortunately, the preclinical literature on IT midazolam was equivocal, with many investigators detecting spinal neurotoxicity. The mechanisms of neurotoxicity were not clear and may have reflected, in part, the use of hypotonic solutions ( ), midazolam formulations that were not preservative-free ( ), delays in postmortem tissue fixation, and untreated hypotension ( ). A robust preclinical study in sheep and pigs chronically exposed to midazolam over an extended period revealed minimal indices of toxicity ( ). Given this mixed preclinical dataset regarding the safety profile of IT midazolam, clinicians exercised restraint in its use due to concerns about neurotoxicity ( ).


N-methyl-d-aspartate (NMDA) receptors. NMDA receptors in the DH are involved in the development of peripherally induced central sensitization ( ), and antagonists such as ketamine were shown to reduce the development of hyperalgesia in persistent/chronic pain states ( ). IT ketamine (an NMDA antagonist) thus held great promise, but the preclinical findings were mixed: while some studies of IT ketamine boluses reported no neurotoxicity ( ), others reported necrotizing lesions with cellular infiltrates ( ). In a canine model ( ), a 28-day IT infusion trial of conventional competitive (AP5), noncompetitive (MK801, memantine), and nonconventional (amitriptyline, d-methadone) NMDA antagonists produced paresis, and histopathological evaluation demonstrated changes ranging from mild inflammation with perivascular cuffing to necrotizing vasculitis and meningomyelitis, suggestive of a vasculitis leading to ischemia and necrosis of the spinal cord. Spinal histopathology occurred in the same dose range as the antihyperalgesic effects in the neonatal preclinical model ( ), and because of the severity of the spinal pathology in these studies, IT ketamine is no longer pursued clinically. In humans, several case reports of IT ketamine demonstrated severe histological changes, but these findings could not be attributed directly to ketamine alone, owing to confounding factors such as the use of adjuvants, chemotherapy, radiation therapy, or metastatic disease ( ).


Somatostatin. The localization of somatostatin in intrinsic spinal cord interneurons in the superficial layer of the DH ( ) and its inhibitory effect on nociceptive neurons ( ) led to an interest in IT somatostatin for pain. In cancer patients with refractory pain with IT opiates, somatostatin-14 markedly reduced pain ( ) despite opiate tolerance, and was not reversible by naloxone, indicating a distinct MOA. In rodents, however, IT somatostatin was noted to produce pronounced neuropathology ( ).


Of note, preclinical trials in primates showed neurotoxic effects characterized by truncal ataxia, dysmetria, and severe bradykinesia ( ). Investigations turned to octreotide, a more stable analog of the rapid enzymatically degraded somatostatin. A combined preclinical and clinical study in a small case series of six cancer patients who failed IT opioids demonstrated minimal neurotoxicity in the two dogs and excellent efficacy of IT octreotide on nociceptive and neuropathic pain, with none of the reported side-effects that are expected with systemic somatostatin, including nausea, diarrhea, and hypotension ( ).


Other nonopioid agents. Other agents have been examined intrathecally following preclinical safety assessments, and include the cholinesterase inhibitor neostigmine ( ), the nonsteroidal antiinflammatory agent ketorolac ( ), adenosine ( ), and oxytocin ( ). For a greater discussion of the evaluation of nonopioids for IT use, see Chapter by Rauck and Hong in this section.


Developing Concerns Over Spinal Drug Safety


The clinical research outlined above that evolved after the initial demonstration of the actions of spinal morphine revealed a progressive appreciation that the safety of an agent after neuraxial delivery could not be predicated on prior assessments of safety after systemic delivery, and that changes in formulation or concentration might lead to deleterious effects. By the early 2000s concern regarding this issue had risen significantly ( ), such that several of the major journals publishing clinical pain research with spinally delivered analgesic agents ( Anesthesiology, Anesthesia and Analgesia ; Pain ) made it an editorial policy that they would not publish research if it had not been subject to appropriate and robust preclinical safety evaluation ( ). This thinking emphasized that robust preclinical toxicology required systematic dose/concentration assessments of extended exposures ( ) in small and, more importantly, large animal models. This interest has generated validated preclinical models of continuous infusion in dogs ( ), sheep ( ), and guinea pigs ( ).




Side-Effects of IT Administration of Opiates


Effects on Physiological Functions


In the first decade of IT opiate use, preclinical and clinical studies revealed a number of side-effects secondary to this mode of delivery, most of which were dose dependent and naloxone reversible. These early side-effects included actions known to be associated with opiate delivery, especially IT delivery: respiratory depression, somnolence, and nausea/vomiting, reflecting a rostral redistribution of the spinal opiate ( ). In addition, there were urinary retention ( ), associated with an opioid receptor-mediated effect upon bladder–spino–bladder reflexes; pruritus, secondary to a spinal action and perhaps to peripheral mast cell effects ( ); and altered neuroendocrine release ( ). Lower-extremity swelling secondary to epidural morphine delivery was observed later, though its mechanism remains unclear ( ).


Hyperalgesia


After the initiation of systematic studies on spinal opiate action, preclinical models revealed that high spinal concentrations of several opiates, including morphine, produced an ongoing pain/agitation state and tactile allodynia that was not naloxone reversible ( ). This paradoxical response to spinal opiates, now called opioid-induced hyperalgesia (OIH), also occurs with other routes of opioid administration. Preclinical studies ( ) have shown that high concentrations of morphine given acutely evoke evident pain behaviors and allodynia at the dermatomal levels proximal to the catheter tip ( ). These allodynic side-effects, however, appear to be independent of opiate receptor pathways: they are not reversible with naloxone, are not stereospecific, and can occur with doses of opiate metabolites that have no opiate activity, such as morphine-3-glucuronide or normeperidine ( ). This paradoxical OIH phenomenon has been reported in patients receiving chronic IT opiates ( ), with doses ranging from 30 to 200 mg/day.


Clonus


Motor side-effects of IT opiates were thought to be due to loss of intrinsic inhibition in the ventral (motor) horn, or perhaps to a direct stimulatory effect of IT morphine. Case studies have been published of patients developing myoclonic jerking at doses of just over 20 mg/day of IT morphine ( ); accordingly, consensus conference guidelines recommend maximum daily doses for the several opioids ( ).


Tolerance


With the increasing utilization of neuraxial opiates, the issue of tolerance in human patients was, and continues to be, a much-discussed topic ( ). While some studies find that the majority of patients require only relatively modest incremental increases in chronic IT opiate dosing ( ), others have shown dose escalations mostly associated with cancer pain ( ), age younger than 50 years ( ), and neuropathic pain ( ).


Mechanistically, tolerance after IT opiate dosing is not thought to arise from increased metabolism or clearance of the opiate. Instead, a number of compensatory mechanisms are brought into play by chronic opiate exposure at the receptor, including activation of the spinal cholecystokinin pathway ( ) and NMDA receptor activation through the phosphorylation that accompanies persistent opiate receptor occupancy ( ). Persistent receptor occupancy also leads to reciprocal phosphorylation, and thus inactivation, of the opiate receptor itself, propagating the end effect of tolerance ( ). As a consequence, NMDA antagonists ( ) are reported to reduce tolerance, at least in the short term.


Granuloma


Though morphine had been infused intrathecally since the late 1970s, it had not received systematic preclinical assessment for toxicity. The clinical experience was that, aside from the evident effects of opiate receptor activation, there was little evidence of pathology, as revealed by the occasional autopsy performed on patients who had received IT opioids during life ( ). However, Richard North et al. reported on a patient with a space-occupying IT mass after delivery of spinal morphine. This was the first principal toxicity found to be associated with IT delivery of morphine. Since that time there have been a number of case reports and series noting IT masses ( ).


Factors common to all these cases of IT masses after the IT delivery of opiates are the use of high concentrations of morphine or a congener (hydromorphone or methadone), the absence of an infectious agent, and the continuous delivery of the IT agent. Interestingly, up to the late 1980s the common concentration of morphine employed was 10 mg/mL ( ). The work by North et al. was among the first of many reports on the use of concentrations of 20 mg/mL and higher. Progressively higher concentrations of morphine were employed by clinicians to extend the interval between pump refills.
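The refill-interval tradeoff that clinicians were exploiting is simple arithmetic: for a fixed daily dose, doubling the drug concentration halves the daily infused volume and so doubles the time to empty the reservoir. A minimal sketch (the 18 mL reservoir and 5 mg/day dose are illustrative assumptions, not values from the text):

```python
# Refill interval = reservoir volume / daily infused volume,
# where daily infused volume = daily dose / drug concentration.
def refill_interval_days(reservoir_ml: float, daily_dose_mg: float,
                         concentration_mg_per_ml: float) -> float:
    daily_volume_ml = daily_dose_mg / concentration_mg_per_ml
    return reservoir_ml / daily_volume_ml

# Hypothetical 18 mL reservoir delivering 5 mg/day of morphine:
print(refill_interval_days(18, 5, 10))  # 10 mg/mL -> 36.0 days
print(refill_interval_days(18, 5, 20))  # 20 mg/mL -> 72.0 days
```

Moving from 10 to 20 mg/mL doubles the interval between refills, which is precisely the convenience that drove clinicians toward the higher, granuloma-associated concentrations.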


Of note, clinicians felt that IT fentanyl’s low propensity toward granuloma formation made it a reasonable agent to transition to after a granuloma had formed with IT morphine. However, recent case reports note granulomas with high-dose fentanyl (2700 mcg/day ( ) and 8866 mcg/day ( )), leading to new guidelines recommending a maximum daily delivery of 1000 mcg/day ( ).


Furthermore, bupivacaine has been implicated as trending toward significance as a risk factor for granuloma formation ( ), with resulting recommendations for a maximum dose of 15–20 mg/day ( ). Interestingly, bupivacaine has been shown in vitro to precipitate in CSF at concentrations as low as 0.75 mg/mL ( ), but this finding has not been corroborated in the clinical setting.
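The daily-dose ceilings mentioned here and in the fentanyl discussion above can be collected into a trivial screening check. The limit values are the ones cited in the text; the function itself is only an illustrative sketch, not a clinical tool:

```python
# Consensus daily maxima cited in the text (fentanyl 1000 mcg/day;
# bupivacaine 15-20 mg/day, taking the conservative 15 mg bound).
IT_DAILY_MAX = {
    "fentanyl_mcg": 1000.0,
    "bupivacaine_mg": 15.0,
}

def exceeds_consensus_max(drug: str, daily_dose: float) -> bool:
    """Return True if a proposed IT daily dose exceeds the cited ceiling."""
    return daily_dose > IT_DAILY_MAX[drug]

print(exceeds_consensus_max("fentanyl_mcg", 2700.0))  # True (a reported granuloma-case dose)
print(exceeds_consensus_max("bupivacaine_mg", 12.0))  # False
```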


Preclinical work has identified granuloma formation after IT infusion of opiates in several animal species, including the guinea pig ( ), dog ( ), and sheep ( ), showing that the IT mass arises from the spinal meninges in a concentration- and time-dependent fashion ( ) following a variety of IT-infused opiates ( ), in an opiate receptor-independent manner ( ). These masses are hypothesized to arise from the meninges adjacent to the catheter tip and to represent a hypertrophic proliferation of fibroblasts and collagen occurring secondary to meningeal mast cell degranulation ( ).


An important component of current thinking arising from these early experiences is the firm appreciation that new agents intended for neuraxial use must undergo appropriate preclinical safety assessment ( ). It is instructive to recall that, beginning at the turn of the last century, tissue toxicity had been reported after spinal injection of local anesthetics, including CSF pleocytosis and changes in Nissl staining in the dorsal root ganglion ( ).


The unexpected observations in the 1990s of IT granulomas and of radiculopathies after continuous infusion of high concentrations of IT opiates and local anesthetics, respectively, emphasized that a drug with an acceptable safety profile after systemic delivery may have deleterious effects after neuraxial delivery. This distinction reflects the high drug concentrations commonly employed and the poorly mixed environment represented by the lumbar IT space. Accordingly, considerable work is now focused on the appropriate and minimal requirements for transitioning a candidate drug from the bench to the bedside ( ).




Evolution of the Neuraxial Delivery of Targeted Therapeutics


Over the ensuing decades, after the original description of the robust effects of opiates on spinal nociceptive processing, a number of issues regarding the IT delivery of opiates arose and were addressed.


Epidural Versus Intrathecal Delivery


Early clinical reports with bolus delivery described the efficacy of both IT and epidural delivery of morphine. While some clinicians ( ) found that epidural delivery of opiates provided excellent pain relief with minimal complications, others ( ) routinely found that this mode of delivery resulted in multiple problems/adverse events (AEs), including scarring, respiratory depression, epidural infections, meningitis, catheter blockage from metastasis, catheter leakage, dislocation, and pain upon injection. These AEs were prevalent enough—reported in up to 55% of patients, especially in the long term ( )—for many physicians treating pain with spinal opioids to opt for IT over epidural delivery. Furthermore, epidural delivery appeared to result in more variable outcomes with respect to spinal uptake ( ), analgesic effect, and duration of action. Given these findings, as well as the significantly lower doses required for IT therapy compared to epidural administration of morphine ( ), a clear preference for IT placement in opiate therapy emerged. Moreover, clinicians were choosing to convert epidural to IT catheters, specifically in patients who had previously had excellent pain control but lost efficacy over time; this was thought to be due to epidural scarring creating a diffusion barrier ( ). Such scarring was widely observed in animal studies ( ) and in humans ( ).


At this same time Medtronic, which had developed a programmable IDDS for the delivery of spinal morphine (then approved by the US Food and Drug Administration (FDA) only for epidural use), asked Elliot S. Krames of San Francisco, who had published on the use of IDDS for the delivery of IT morphine ( ), to present arguments to the FDA in favor of IT delivery of morphine. These arguments resulted in the FDA approving morphine for IT delivery (personal communication from E. Krames).


Intrathecal Pharmacokinetics


From the first studies by Bier, the anesthetic effect after lumbar IT delivery was noted to remain localized, reflecting the restricted distribution of the administered IT drug. This restricted distribution was clearly inconsistent with the notion, which remained popular through the late 20th century, that spinal CSF moved caudally and rostrally like a flowing river ( ). An absence of discernible bulk flow was supported by the work of those studying IT drug redistribution after low-volume bolus injection and low-rate infusion ( ), as well as by a variety of noninvasive imaging procedures in human and nonhuman models. Imaging studies, however, pointed to an oscillatory movement of the lumbar CSF, resulting from oscillatory pressure gradients established by the cardiac and respiratory cycles ( ). Given this restricted distribution, a number of elements became clear. Drug distribution within the CSF depends on several factors: the location and orientation of the catheter tip relative to the pain-generating spinal dermatomal level; the speed of injection (bolus versus slow continuous infusion), which governs spread from the catheter tip; the infusion dose, with increased dose broadening the concentration gradients around the catheter tip ( ); anatomical variants of the trabecular arachnoid and lumbar subarachnoid ligaments, which create barriers to laminar flow; and CSF oscillation corresponding to intracranial arterial pulsation, heart rate, and breathing ( ). Overall, IT infusion medications appear to be poorly distributed in the CSF, likely owing to low kinetic energy (low flow rate) and minimal CSF bulk flow.
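One way to see why the lumbar space behaves as "poorly mixed" is to caricature transport as pure dispersion from the catheter tip with zero bulk flow, in which case the root-mean-square spread grows only as the square root of time. The dispersion coefficient below is an arbitrary illustrative number, not a measured value:

```python
import math

def rms_spread_cm(dispersion_cm2_per_s: float, t_s: float) -> float:
    """RMS axial spread of a bolus under pure (oscillation-enhanced)
    dispersion with no bulk flow: sigma = sqrt(2 * D * t)."""
    return math.sqrt(2.0 * dispersion_cm2_per_s * t_s)

# With a hypothetical effective dispersion coefficient of 1e-4 cm^2/s,
# a lumbar bolus spreads only a few centimeters over many hours:
for hours in (1, 6, 24):
    print(f"{hours:>2} h -> {rms_spread_cm(1e-4, hours * 3600):.2f} cm")
```

Square-root growth means waiting four times as long only doubles the spread, consistent with the observation that infused drug stays concentrated near the catheter tip.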


Given that the target opioid receptor sites for IT therapy are in the DH of the spinal cord, specifically lamina II or substantia gelatinosa, versus the roots, as with local anesthetics, it was appreciated that IT opiates must penetrate through the pia mater and white matter into the superficial DH of the spinal cord ( ).


Thus early work emphasized that the physicochemical properties of the molecules, notably lipid solubility and molecular weight, played an important role in their movement within the CSF. Polar molecules, such as morphine, and large molecules, such as proteins, were cleared slowly and showed greater rostral spread, while lipophilic agents had shorter CSF half-lives, secondary to increased diffusion into tissue and increased vascular clearance ( ), with smaller volumes of distribution, deeper cord penetration, and minimal rostral spread ( ).
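The contrast in CSF residence time can be sketched as first-order (exponential) clearance; the half-lives below are placeholders chosen only to show the polar-versus-lipophilic pattern, not measured pharmacokinetic values:

```python
import math

def csf_fraction_remaining(t_min: float, half_life_min: float) -> float:
    """Fraction of an IT bolus remaining in CSF under first-order clearance."""
    return math.exp(-math.log(2.0) * t_min / half_life_min)

# Placeholder half-lives: a polar agent (slow vascular clearance) versus a
# lipophilic agent (rapid diffusion into cord and clearance into blood).
for label, t_half in (("polar, morphine-like", 90.0),
                      ("lipophilic, fentanyl-like", 30.0)):
    print(f"{label}: {csf_fraction_remaining(60.0, t_half):.2f} remaining at 60 min")
```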




Other Intrathecal Analgesics


Opioids


As reviewed above, a number of opioids have been used intrathecally. Morphine was the first drug employed for IT and epidural delivery in humans and remains the only FDA-approved opioid for IT delivery; sufentanil was approved for epidural delivery. After the initial work with epidural and IT morphine, other opioids were examined for spinal use, including hydromorphone ( ), diamorphine ( ), fentanyl ( ), sufentanil ( ), methadone ( ), and meperidine ( ). Meperidine and sufentanil were seen as promising IT agents for pain control, given discussion of the local anesthetic properties of the two agents ( ), but some meperidine preparations, because of their pH, were noted to damage the internal tubing of pumps, rendering the IDDS nonfunctional ( ). Fentanyl and sufentanil have also been used for the management of chronic pain ( ). As noted above, cephalad migration appeared to be less with lipid-soluble agents such as fentanyl and sufentanil, reflecting their clearance first from CSF into tissue, then from tissue into blood, and then, by vascular redistribution, to extraspinal sites ( ). Epidural delivery of butorphanol, a kappa opioid partial agonist/antagonist, was reported to display efficacy in obstetric patients ( ).


Other opioid agents with a peptide structure were shown to have efficacy after IT delivery in humans. These included β-endorphin, a 31 amino acid mu/delta ligand ( ); D-Ala2-D-Leu5-enkephalin (DADLE), a delta-preferring peptide ( ); and dermorphin, a mu-selective peptide ( ).


Nonopioid Agents


Aside from local anesthetics, the major nonopioid medications used intrathecally included baclofen, ziconotide, and clonidine. Of these, baclofen and ziconotide were FDA approved as IT monotherapy agents, while clonidine was approved for epidural infusion. Other drug targets were also explored; several are discussed below.


Ziconotide. Ziconotide, an ω-conotoxin derived from the venom of the cone snail, has been demonstrated to be a selective ligand blocking the N-type voltage-sensitive calcium channel ( ). It is a stable, water-soluble, polybasic 25 amino acid peptide containing three disulfide bridges, with a molecular weight of 2639 Da. Ziconotide is FDA approved for IT use in chronic severe pain.


The N-type calcium channel is densely expressed in afferent nerve terminals and mediates the influx of calcium in the afferent terminal, leading to transmitter release.


Intrathecal ziconotide was shown to produce potent antihyperpathic effects after nerve and tissue injury ( ) that did not display tachyphylaxis ( ). Extensive preclinical work revealed no evidence of tissue toxicity ( ). The early IT use of ziconotide was marked by reports of overdose, but careful titration revealed efficacy in a variety of chronic clinical pain states ( ).


Clonidine. Alpha-2 adrenoreceptor agonists were found to be analgesic in preclinical models. Mechanistically, these spinal drug effects are mediated by alpha-2 receptors expressed presynaptically on the primary afferent terminal and postsynaptically on dorsal horn neurons ( ). Systematic preclinical work revealed no neuraxial toxicity ( ), but a high incidence of hypotension and bradycardia. Intrathecal alpha-2 adrenergic agents were noted to provide synergistic effects in combination with IT opioids ( ), and to enhance and prolong analgesia in combination with IT local anesthetics ( ). The efficacy of chronic IT clonidine for intractable pain has been reported ( ).


Midazolam. This molecule, a benzodiazepine that is water soluble at acidic pH, was shown early on to interact with the GABA-A receptor ( ). IT midazolam was observed to have depressive effects upon spinal nociceptive processing ( ).


Midazolam was examined for effects upon neuropathic pain, chronic low back pain ( ), and spasticity ( ). Unfortunately, the preclinical literature on IT midazolam was equivocal, with many investigators detecting spinal neurotoxicity. The mechanisms of this neurotoxicity were not clear and may have reflected, in part, the use of hypotonic solutions ( ), formulations of midazolam that were not preservative-free ( ), delays in postmortem tissue fixation, and untreated hypotension ( ). In contrast, a robust preclinical study in sheep and pigs chronically exposed to midazolam over an extended period revealed minimal indices of toxicity ( ). Given this mixed preclinical dataset on the safety profile of IT midazolam, clinicians exercised restraint in its use out of concern for neurotoxicity ( ).


N-methyl-d-aspartate (NMDA) receptors. NMDA receptors in the DH are involved in the development of peripherally induced central sensitization ( ), and antagonists like ketamine were shown to reduce the development of hyperalgesia in persistent/chronic pain states ( ). IT ketamine (an NMDA antagonist) thus held great promise, but here again the preclinical literature was mixed: some studies reported no neurotoxicity with IT ketamine boluses ( ), while others reported necrotizing lesions with cellular infiltrates ( ). In a canine model ( ), a 28-day IT infusion trial of conventional competitive (AP5), noncompetitive (MK801, memantine), and nonconventional (amitriptyline, d-methadone) NMDA antagonists produced paresis, and histopathological evaluation demonstrated changes ranging from mild inflammation with perivascular cuffing to necrotizing vasculitis and meningomyelitis, suggestive of a vasculitis leading to ischemia and necrosis of the spinal cord. Spinal histopathology occurred over the same dose range as the antihyperalgesic effects seen in the neonatal preclinical model ( ), and because of the severity of the spinal pathology in the aforementioned studies, IT ketamine is no longer pursued clinically. In humans, several case reports with IT ketamine demonstrated severe histological changes, but those findings could not be directly attributed to ketamine alone, owing to confounding factors such as the use of adjuvants, chemotherapy and radiation therapy, and metastatic disease ( ).


Somatostatin. The localization of somatostatin in intrinsic spinal cord interneurons of the superficial DH ( ), and its inhibitory effect on nociceptive neurons ( ), led to interest in IT somatostatin for pain. In cancer patients whose pain was refractory to IT opiates, somatostatin-14 markedly reduced pain ( ) despite opiate tolerance, and this effect was not reversible by naloxone, indicating a distinct MOA. In rodents, however, IT somatostatin was noted to produce pronounced neuropathology ( ).


Of note, preclinical trials in primates showed neurotoxic effects characterized by truncal ataxia, dysmetria, and severe bradykinesia ( ). Investigations therefore turned to octreotide, a more stable analog of the rapidly enzymatically degraded somatostatin. A combined preclinical and clinical study, pairing a two-dog toxicity assessment with a small case series of six cancer patients who had failed IT opioids, demonstrated minimal neurotoxicity in the dogs and excellent efficacy of IT octreotide against nociceptive and neuropathic pain, with none of the side-effects expected with systemic somatostatin, including nausea, diarrhea, and hypotension ( ).


Other nonopioid agents. Other agents have been examined intrathecally following preclinical safety assessments, including the cholinesterase inhibitor neostigmine ( ), the nonsteroidal antiinflammatory agent ketorolac ( ), adenosine ( ), and oxytocin ( ). For a fuller discussion of the evaluation of nonopioids for IT use, see the chapter by Rauck and Hong in this section.


Developing Concerns Over Spinal Drug Safety


The clinical research outlined above, evolving after the initial demonstration of the actions of spinal morphine, revealed a progressive appreciation that the safety of an agent after neuraxial delivery cannot be predicated on prior assessments of safety after systemic delivery, and that changes in formulation or concentration might lead to deleterious effects. By the early 2000s concern over this issue had risen significantly ( ), such that several of the major journals publishing clinical pain research on spinally delivered analgesic agents (Anesthesiology, Anesthesia and Analgesia, and Pain) adopted an editorial policy of not publishing such research unless it had been subject to appropriate and robust preclinical safety evaluation ( ). This thinking emphasized that robust preclinical toxicology requires systematic dose/concentration assessment of extended exposures ( ) in small and, more importantly, large animal models. This interest has generated validated preclinical models of continuous infusion in dogs ( ), sheep ( ), and guinea pigs ( ).
