Abstract
General anesthetics have been used to ablate consciousness during surgery for more than 150 yr. Despite significant advances in our understanding of their molecular-level pharmacologic effects, comparatively little is known about how anesthetics alter brain dynamics to cause unconsciousness. Consequently, while anesthesia practice is now routine and safe, many of its vagaries remain unexplained. In this paper, the authors review the evidence that cortical network activity is particularly sensitive to general anesthetics, and suggest that disruption to communication in, and/or among, cortical brain regions is a common mechanism of anesthesia that ultimately produces loss of consciousness. The authors review data from acute brain slices and organotypic cultures showing that anesthetics with differing molecular mechanisms of action share the ability to impair neurophysiologic communication. While many questions remain, together, ex vivo and in vivo investigations suggest that a unified understanding of both clinical anesthesia and the neural basis of consciousness is attainable.
Since the inception of clinical anesthesia in the mid-nineteenth century, scientists have greatly advanced our fundamental understanding of modern anesthetics. In particular, there is now consensus that their molecular targets are proteins that control neuronal excitability and connectivity, including ligand- and voltage-gated channels.1,2 Anesthetic interactions at the γ-aminobutyric acid (GABA) type A (GABAA) receptor are the best understood: anesthetics bind at receptor subunit interfaces to potentiate the effect of GABA binding and prolong the resulting increase in chloride conductance.3 This explains why many anesthetics broadly depress brain activity. However, understanding how molecular-level effects translate into the macroscopic, network-level phenomena underlying unconsciousness has proven problematic. This is where our understanding of anesthetic action falters, and bridging this gap is an important focus of ongoing research efforts.
The aim of this review is to highlight the insights into network-level mechanisms of anesthesia derived from experiments utilizing ex vivo brain slice preparations. We begin with a brief introduction to the methodology of these preparations. Next, we briefly review our still rudimentary understanding of anesthetic effects at the network level, focusing particularly on evidence that cerebrocortical effects are central. Finally, we review data from ex vivo models supporting the hypothesis that a breakdown in cortical functional connectivity is a unifying explanation for anesthetic effects on consciousness. Anesthetic agents are diverse in terms of their chemical structure and molecular targets. We focus primarily on the effects of classical sedative/hypnotic agents, such as volatile anesthetics, propofol, etomidate, and barbiturates, occasionally referencing studies on the dissociative agent ketamine to point out some of its unique actions. We also draw on studies employing benzodiazepines, agents that are sedative and amnestic and, at sufficient doses, can cause hypnosis, but are not used as general anesthetics in patients. These agents are highly specific for GABAA receptor–mediated inhibition and thus offer mechanistic insights that are not readily available from studies relying on less specific agents.
Ex Vivo/Slice Preparations of the Brain
Brain slice preparations have been a staple of in vitro neurophysiological research for decades.4 The acute brain slice method has been embraced by anesthesiologists for investigating anesthetic drug effects on neural networks, with the advantage that drug actions can be examined in isolated, locally connected networks under highly controlled, but flexible, conditions. Acute slices are obtained by swiftly, yet carefully, dissecting the brain out of an animal’s skull and cutting slices of 300- to 700-µm thickness. The slicing angle must be chosen deliberately, such that the structures of interest are preserved as well as possible (fig. 1). For example, in the neocortex, one usually opts for a coronal or sagittal plane of section, both of which are parallel to the elongated apical dendrites of pyramidal cells, the dominant cortical cell type (fig. 1B). In slices meant to preserve synaptic connections between brain regions, e.g., thalamocortical slices, the plane of section must account for thalamocortical axonal pathways.5
Ex vivo slice preparations of the brain and common recording configurations in neocortex. (A) A schematic illustrating preparation of brain slices. Brains are sliced in any desirable plane and orientation (shown is a coronal slicing plane) and, depending on subsequent use, the slices may be trimmed to include just the region of interest. The resulting slices are either allowed to recover 1 to 2 h and are then used for experimentation on the same day (“acute slice”) or are placed on a substrate and cultivated in nutritional medium, resulting in an organotypic slice culture. Acute slices can be prepared from animals of any age (commonly juveniles), whereas for cultures, neonatal animals or even embryos are often required. (B) A sketch of a partial coronal brain slice including neocortex, thalamus, and hippocampus. Blue dots illustrate commonly used orientations of multichannel recording sites within the neocortex. In the “horizontal” orientation, the sites are situated within one layer or along layer boundaries, allowing the recording of interareal propagation of neuronal activity. The “vertical,” cross-layer orientation, running parallel to pyramidal cells’ apical dendrites, is usually chosen if the spread of activity within a cortical column/across layers is of interest, e.g., upon thalamic stimulation. Two pyramidal cells (magenta) are shown schematically to illustrate the orientation of the long apical dendrites.
Slicing the brain is clearly a traumatic procedure, warranting a careful choice of conditions.6 Relatively recent successes at improving the viability of slices include new slicer designs,7 improvements in oxygen supply,8 and specific formulations of the bathing medium (artificial cerebrospinal fluid) for slicing or immediate postslicing recovery.9,10 Thus, after decades of procedural optimization, brain slices offer a number of attractive features to experimentalists. They provide comparatively easy access to even the deepest brain regions, and single regions or subnetworks of the brain can be studied in isolation. Combined with tight control over the environment of the neural tissue, including the ionic composition of the artificial cerebrospinal fluid and the concentrations of drugs of interest, they are ideal preparations for a reductionist approach to neuroscience research. However, our interpretation of past and present findings of brain slice research must take into consideration a specific bias inherent to these preparations: in brain regions like neocortex with prominent long-range connections, the majority of which are excitatory, slicing results in an imbalance of viable excitatory to inhibitory synaptic connections.11 In order to mitigate these and other side effects of excising neuronal tissue, “thick” slices (1 mm)12 and in vitro preparations of entire hippocampi13 have been devised.
An alternative to acute ex vivo preparations of the brain is the organotypic slice culture, essentially slices from neonatal or embryonic animals cultivated for days to months. Although they have been used in electrophysiologic experiments for an equally long time,14 they are not nearly as widely used as acute slices. As has been detailed elsewhere,15,16 there is a tradeoff: high levels of spontaneous activity, the possibility of constructing a wide range of brain subnetworks not feasible in acute preparations, and fast diffusion of drugs into the thin tissue must be weighed against a less faithful neuronal morphology and synaptic architecture.
To illustrate the important contributions ex vivo models have made to our understanding of anesthesia mechanisms, we present them in the context of related and equally important in vivo experimental and clinical investigations. With a spotlight on ex vivo models, we have endeavored to integrate a broad knowledge base into a coherent picture of cortical anesthetic effects.
Cortical versus Subcortical Anesthetic Action
While cerebrocortical networks are central to consciousness, there remains contention as to whether the behavioral effects of anesthetics are better explained by direct action on the cortex, or via subcortical effects on the thalamus or on natural sleep and arousal pathways. The potential influence of these neurophysiologic details on the quality of anesthetic induction, maintenance, and emergence highlights the importance of this research avenue.
The subcortical and cortical areas involved in anesthetic action are not easily functionally separated when considering the patient as a whole. It can be helpful to conceptualize general anesthesia appropriate for surgery as targeting two aspects of consciousness: level of consciousness, including overall arousal and responsiveness to external stimuli; and content of consciousness, including interpretation of the saliency of these signals as sensations that could be harmful or warrant decisive action, as well as mentation that may be unrelated or only tangentially related to the sensory environment.17 This functional separation can be used as a framework for understanding the neuroanatomy of clinical anesthesia at the systems neuroscience level. Accumulating evidence suggests that general anesthetics produce some aspects of hypnosis by acting on sleep and arousal centers in the brain. For example, anesthetics promote sleep-like patterns of activity in the subcortical nuclei of the hypothalamus and brainstem that control sleep, such as by activating sleep-active neurons in the ventrolateral preoptic nucleus.18–20 These agents also depress activity in subcortical arousal centers, e.g., noradrenergic, cholinergic, and dopaminergic neurons in the locus coeruleus, basal forebrain, and ventral tegmental area, respectively.21,22 There is also extensive evidence for anesthetic actions on the thalamus. The thalamus is a critical hub in the ascending arousal system, in relaying sensory information to cortex,23 and in corticocortical communication.24 Activity in the thalamus is suppressed by nearly all anesthetics, with the exception of ketamine.25 The functional roles of each of these anesthetic targets in producing and maintaining hypnosis and in emergence from anesthesia can be defined within the dichotomy of level versus content of consciousness.17 Actions on sleep and arousal centers (including the arousal-promoting functions of the thalamus) would control arousal, i.e., level of consciousness, whereas effects on the content of consciousness would be due to direct actions on the corticothalamic network, and also secondary to actions on sleep and arousal centers. Understanding the relative contributions of each to the functional endpoint of clinical anesthesia is a formidable challenge and may ultimately prove to be dependent on clinical context and individual-specific differences.
Electroencephalogram Changes under General Anesthesia
The importance of subcortical targets in anesthesia is supported by similarities between the electroencephalogram during general anesthesia and that during natural sleep. For agents that predominantly act by enhancing GABAA receptor inhibition (e.g., propofol and volatile agents, among others), surgical anesthesia is characterized by heightened power in the alpha and delta bands, resembling non–rapid eye movement stage 2 and non–rapid eye movement stage 3 sleep, respectively.26–29 Recent in vivo work in rats has demonstrated that the central medial thalamus may act as an initiation hub for natural sleep and propofol anesthesia, with changes in the dynamics of high-frequency bands occurring prior to similar changes in the neocortex.27 Neuroimaging studies also support functional thalamic disconnection during anesthesia.30,31 However, as reviewed by Antkowiak,32 most of the electroencephalogram field potential patterns indicative of general anesthesia (thalamocortical oscillations being the obvious exception) can be reproduced in isolated cortical networks in the absence of subcortical connections. This suggests that an electroencephalogram pattern resembling that of non–rapid eye movement sleep does not prove subcortical causation. Furthermore, not all anesthetics depress the electroencephalogram, ketamine being an obvious case in point.33,34
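As an aside on methodology, the spectral signatures described above are typically quantified as band-limited power from a Welch estimate of the electroencephalogram power spectrum. The sketch below illustrates this on surrogate data; the sampling rate, window length, and band edges are illustrative assumptions, not values drawn from the studies cited above.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

def band_power(eeg, fs, band):
    """Integrate a Welch power spectral density estimate over a band (Hz)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))  # 2-s windows
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return trapezoid(psd[mask], freqs[mask])

# Surrogate single-channel EEG: 60 s of noise sampled at 250 Hz (hypothetical).
fs = 250
rng = np.random.default_rng(0)
eeg = rng.standard_normal(60 * fs)

delta = band_power(eeg, fs, (0.5, 4.0))         # elevated under GABAergic agents
alpha = band_power(eeg, fs, (8.0, 12.0))        # frontal alpha under propofol
beta_gamma = band_power(eeg, fs, (14.0, 80.0))  # associated with arousal
print(delta, alpha, beta_gamma)
```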
Subcortical Microinjection Studies
Subcortical action is further supported by microinjection experiments showing that highly localized drug application to some mesopontine/midbrain regions can, on its own and without direct cortical effects, cause an anesthesia-like state,18,35,36 as well as by studies showing that stimulation of the dopaminergic ventral tegmental area can hasten arousal from anesthesia in rodents.37 Minert et al.38 have shown that anesthetic microinjection into a highly localized region of the upper brainstem called the mesopontine tegmental anesthetic area induces a reversible state of unconsciousness, along with associated anesthesia-like changes in the electroencephalogram.35 Investigation of fos protein expression suggests that general anesthetic cortical action may be attributable to effects at subcortical neuromodulatory sites.39 It should be noted, however, that these results have not been fully replicated elsewhere.40 Also, because the extent of drug diffusion is difficult to estimate, the choice of drug concentration is not straightforward in these types of microinjection studies and may result in a tissue concentration as much as 50 times the cerebral tissue concentration required for thiopental anesthesia in rats.35,41,42 Under the reasonable assumption that such concentrations incapacitate neurons in any brain region, these studies provide valuable information on the role of the subcortical site in question in consciousness. However, they should not be considered evidence of the site’s pivotal role in bringing about unconsciousness at clinically relevant concentrations of anesthetics. It is also worth noting that the microinjection studies that document dramatic anesthesia-like effects form a small minority of a large number of similar investigations. As summarized recently by Leung et al.,43 these studies generally report an increase in anesthetic sensitivity, but not general anesthesia outright. At the very least, no single subcortical site seems to form a unifying “anesthesia-sensitive” area.
Pharmacologic Investigations of a Hyperpolarized Thalamus
An idea that has remained prominent since its introduction by Angel44 in 1991 is that thalamic hyperpolarization blocks the transmission of sensory information from the periphery to the cortex. The premise is that consciousness is unsustainable in the absence of a sensory substrate. Although this hypothesis is appealing in its intuitive simplicity, there are now compelling experimental data refuting it. First, in keeping with its tendency to maintain cortical activity, ketamine at anesthetic concentrations neither suppresses the thalamus45 nor blocks thalamocortical sensory transmission.46 Cortical sensory evoked potential studies also show that primary sensory cortical responsiveness is preserved, and can even be enhanced, during anesthesia.47–50 It thus seems likely that unconsciousness occurs at the level of disruption to cortical information processing, not cortical sensory reception. Importantly, this does not preclude the thalamus as an essential anesthetic target, particularly in light of the role of thalamocortical relays in mediating corticocortical communication.51 Dexmedetomidine is reported to block resting-state thalamocortical connectivity to a greater extent than corticocortical connectivity.52 Recent data also show that xenon anesthesia may be mediated by blockade of the hyperpolarization-activated cyclic nucleotide-gated (HCN) cation channel type 2, with effects on thalamocortical primary sensory relay pathways.53 The role of HCN channels in mediating anesthetic effects is not clear, however. For example, targeted blockade of HCN channels with ZD-7288 has no effect on highly synchronized “seizure-like event” activity in cortical slices,54 while anesthetics such as propofol, etomidate, and ketamine consistently reduce seizure-like event activity in cortical slices.55
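The hyperpolarization argument can be made concrete with a toy calculation. In a single-compartment membrane model with only a leak conductance and a tonically active HCN (Ih) conductance, the resting potential is the conductance-weighted average of the two reversal potentials, so removing the HCN conductance (as ZD-7288 does pharmacologically) hyperpolarizes the cell toward the leak reversal. This is a minimal sketch with illustrative parameter values, not a model of any specific thalamic neuron.

```python
# Resting potential of a single-compartment model with leak and HCN (Ih)
# conductances: V_rest = (gL*EL + gh*Eh) / (gL + gh).
# All parameter values below are illustrative assumptions.
gL, EL = 10.0, -75.0  # leak conductance (nS) and reversal potential (mV)
Eh = -30.0            # approximate HCN reversal potential (mV)

for gh in (2.0, 1.0, 0.0):  # progressive HCN block (e.g., by ZD-7288)
    v_rest = (gL * EL + gh * Eh) / (gL + gh)
    print(f"g_h = {gh:.1f} nS -> V_rest = {v_rest:.1f} mV")
# Output runs from -67.5 mV (intact Ih) to -75.0 mV (full block):
# removing Ih hyperpolarizes the cell.
```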
Cortical versus Subcortical Anesthetic Sensitivity
The cortical/subcortical debate hinges not so much on whether anesthesia can be induced by action at one site or the other, as it seems that both are possible.17 Instead, the relative importance of each site depends on its anesthetic sensitivity at clinically relevant concentrations. This distinction is not only of theoretical interest, but is vital for understanding clinically relevant behavioral phenomena associated with anesthesia. For example, even though emergence from anesthesia may not be a simple passive reversal of the induction process,56 knowing the relative influence of dwindling anesthetic concentrations on spiking activity at cortical versus subcortical sites is pivotal for understanding the nature of these transitions. If the subcortex is more sensitive than the cortex, emergence from anesthesia may be dominated by lingering anesthetic effects on subcortical structures (breathing, detection of sensory stimuli). Conversely, lingering cortical effects during emergence could result in a mismatch between subcortical sensory input mediating arousal and readiness to integrate complex information from different cortical areas. The arousal disturbance in children that results in so-called “night terrors” may be a manifestation of this in natural sleep.57
The balance of evidence seems to favor cortical over subcortical sensitivity, with the proviso that this remains an area of active research and is perhaps even individual-specific. High-frequency electroencephalogram power (greater than 14 Hz, beta and gamma waves) represents the transfer of information among cortical regions during wakefulness,58 and when present during clinical anesthesia for surgery, these features are associated with patient movement and heightened arousal.59 These “cortical arousals” can be pharmacologically induced and appear to hasten emergence from anesthesia in vivo.60 Furthermore, anesthetic reduction in cortical firing rates has been shown to be similar with or without subcortical connectivity.61
Interactions among the Thalamus and Cortex
Thalamocortical slice studies add further support to the corticocentric view of anesthetic action. Although early reports provided evidence that anesthetics might suppress cortical activity via actions in the thalamus,62 more recent reports have demonstrated greater sensitivity of intracortical compared to thalamocortical signal pathways. By selectively activating corticocortical “top-down” and thalamocortical “bottom-up” pathways, Raz et al.49 showed that evoked corticocortical responses exhibited greater sensitivity to isoflurane compared to thalamocortical responses (fig. 2). In vivo, visual responses in the primary auditory cortex, mediated at least in part by projections from the secondary visual cortex,63 are blocked by anesthetics at doses that do not suppress auditory responses to pure tones.49 Consistent with these data, early- and mid-latency components of auditory evoked potentials are resistant to the effects of anesthesia, at least at doses causing loss of consciousness.64,65 These components most likely correspond to subcortical and thalamocortical synaptic potentials.66–68 At higher doses, corresponding to surgical levels of anesthesia, even these shorter components are suppressed by general anesthetics.69,70 By contrast, longer latency components, likely reflecting corticocortical signaling, are sensitive to even low doses of anesthetics.71,72 These data suggest that many of the effects on sensory evoked potentials during clinical anesthesia may be due to direct cortical effects rather than effects on thalamocortical afferents. In support of this, changes in cortical evoked potentials in isolated cortical slices closely resemble those observed in vivo73 for a range of anesthetic classes.50
Pathway specificity of isoflurane effects in auditory cortex. Current source density plots of corticocortical ([CC] A; top-down) and thalamocortical ([TC] B; bottom-up) synaptic responses in murine brain slices of primary auditory cortex in control (left), recovery (right), and three doses of isoflurane (middle). In each, the vertical axis corresponds to normalized cortical depth (pial surface at top, white matter at the bottom). Horizontal gray lines indicate cortical layer boundaries. Blue corresponds to current sinks, i.e., excitatory synaptic currents flowing into cells. (C) Magnitude of the layer 3/4 TC sink (red) and layer 1 CC sink (blue) from the data in A, showing greater suppression by isoflurane of corticocortical responses compared to thalamocortical responses. (D) Same as C but showing the two-dimensional cross-correlation between the sink pattern at each drug condition and the pattern in control. C, correlation; cond, drug condition; ctrl, control; Iso, isoflurane. Reproduced under the terms of the Creative Commons Attribution License from Raz et al.49
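Figure 2 is based on current source density analysis, which localizes synaptic current sinks and sources across cortical depth by taking the second spatial derivative of the laminar field potential. Below is a minimal sketch of the standard estimator, assuming a uniformly spaced probe and a nominal tissue conductivity; both values, and the sign convention, are assumptions for illustration rather than details from Raz et al.49

```python
import numpy as np

def csd(lfp, spacing_um=100.0, sigma=0.3):
    """Second-spatial-derivative current source density estimate.

    lfp: array of shape (n_channels, n_samples); channels ordered from
         pia to white matter at uniform spacing. Returns the CSD for the
         interior channels, with sinks negative under this sign convention.
    """
    h = spacing_um * 1e-6  # electrode spacing in meters
    d2phi = (lfp[:-2] - 2.0 * lfp[1:-1] + lfp[2:]) / h**2
    return -sigma * d2phi  # sigma: assumed tissue conductivity (S/m)

# Hypothetical 16-channel laminar recording, 1,000 samples per channel.
rng = np.random.default_rng(1)
lfp = rng.standard_normal((16, 1000))
print(csd(lfp).shape)  # (14, 1000): edge channels are dropped
```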
Imaging and electroencephalogram studies in primates and human subjects also point to corticocortical synaptic signaling as a critical locus for effects of anesthetics on consciousness.74,75 During midazolam-induced loss of consciousness and during slow-wave sleep, cortical responses to transcranial magnetic stimulation are enhanced locally, but the spread of activity due to corticocortical interactions is reduced47,76; similar effects are observed with the dissociative anesthetic ketamine.77–79 “Feedback” cortical connections are particularly sensitive to hypnotic doses of anesthetics,30,80–86 as they are in non–rapid eye movement sleep and in disorders of consciousness.87,88 These data, which are conserved across species,89 suggest that loss of consciousness is accompanied by reduced cortical connectivity in the presence of maintained responsiveness to thalamic inputs in the primary sensory cortex90,91 (fig. 3).
Summary of effects of general anesthetics on long-range connectivity in the corticothalamic network. Schematic showing major feedforward and feedback afferent pathways in the cortico-thalamic network. Under awake conditions (left), projections from “core” cells (blue) in thalamus carry specific information to granular layers (L4) in neocortex, while “matrix” cells (red) exert modulatory influences in supragranular and infragranular layers (L1, L5). Feedforward cortical projection cells (cyan) in supragranular layers (L2/3) project to higher-order cortex, while feedback cortical projection cells (magenta) in infragranular layers (L5/6) project to lower-order cortex (and subcortically; not shown). Under doses of anesthesia causing loss of consciousness (right), feedback corticocortical and matrix thalamocortical projections are suppressed relative to feedforward corticocortical and core thalamocortical projections.
In the remaining sections, we will explore the contribution that acute and organotypic slices can make to understanding and substantiating cortical anesthetic action. In particular, evidence suggests that cortical network activity may underlie bidirectional communication within the cortical hierarchy,92 a breakdown in which may disrupt the integrative processes central to conscious experience.93,94 While the cortex is central to the information that mediates our conscious experience, it should once again be pointed out that this does not preclude the involvement of thalamic and other subcortical nuclei.95,96
Anesthetic Effects on Physiologic Cortical Network Activity
Are effects on cortical networks consistent with a corticocentric view of anesthetic action? At the very least, there are strong data to indicate that the cerebral cortex is an important direct target of anesthetic drugs. Accordingly, models utilizing isolated cortical networks provide a valuable tool for investigating anesthetic mechanisms of action. Many have approached this by examining signature electrical patterns in cortical networks that are relevant to anesthetic mechanisms causing unconsciousness.
Network Activity in Organotypic Cultures Informs Anesthesia Research
A commonly reported effect of anesthetics is a reduction in network activity. Evidence that this effect is mediated at least in part by direct actions on cortical circuits is derived from experiments on neocortical organotypic cultures, where spontaneous activity rates are very sensitive to virtually all classes of anesthetics and sedatives.97 Diazepam was shown to depress spontaneous firing rates in a biphasic concentration-dependent manner, highlighting the different roles of the classical and nonclassical benzodiazepine binding sites of GABAA receptors.98 Analysis of activity rates and patterns in the same model system allowed the effects of midazolam to be disentangled from those of its main metabolite, which is suspected to have potent depressive actions on neuronal activity as well.99 In vivo, however, the picture is more complex. While a reduction in cortical neuronal firing (with accompanying changes in the electroencephalogram) does reliably accompany slow-wave sleep100 and anesthesia,61,101–104 dissociative anesthetics such as ketamine maintain or even increase cortical activity.105 In addition, depression of cortical activity by anesthetics can be unrelated to loss of consciousness per se.106–108 Therefore, we posit that although changes in activity rates in cortical networks ex vivo are a very useful approximate marker of the potency/efficacy of anesthetics, there is no straightforward causal relation between such changes and the state of consciousness.
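Concentration-response data of the kind described above are usually summarized by fitting a sigmoid to normalized activity rates. The sketch below fits a simple one-site Hill function to hypothetical firing rates; a biphasic data set such as the diazepam result would require a two-site extension, and the concentrations and rates shown are illustrative, not measurements from the cited studies.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(c, ec50, n):
    """Fractional firing rate remaining at drug concentration c (one-site model)."""
    return 1.0 / (1.0 + (c / ec50) ** n)

# Hypothetical normalized spontaneous firing rates (1.0 = drug-free)
# at increasing anesthetic concentrations (µM); illustrative values only.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])
rate = np.array([0.95, 0.80, 0.55, 0.25, 0.08])

(ec50, n), _ = curve_fit(hill, conc, rate, p0=(1.0, 1.0))
print(f"EC50 = {ec50:.2f} µM, Hill slope = {n:.2f}")
```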
Anesthetic Effects on Cortical Network Activity and Sensory Information Processing in Acute Slices
Emerging evidence suggests that information transfer within the cortical network may occur via all-or-none responses in cortical ensembles rather than stochastic firing of individual cells.92,109–111 In the auditory cortex, network bursts are triggered by sensory stimulation and contain specific spatiotemporal spike sequences (“packets”) whose organization is stimulus-specific and thus may underlie a population-based encoding process.112,113 Importantly, these network bursts and the network bistability that underlies them are observed in waking conditions,114–116 suggesting an important role in sensory awareness. Spontaneous and induced network activity similar to that recorded in vivo during sensory processing can readily be observed in acute slices of rodent neocortex.117–120 The occurrence and propagation of this activity in cortex are modulated by volatile anesthetics, which can both promote its occurrence by synchronizing network activity and disrupt its propagation by interfering with corticocortical communication. Indeed, we have shown that in murine auditory thalamocortical brain slices, nearly all spiking activity in response to stimulation of thalamocortical afferents occurs in the context of network bursts.119 In acute slices, which dwell primarily in quiescent “down” states, spontaneous and induced network bursts are suppressed by the volatile anesthetic isoflurane at moderate doses (fig. 4A).49,121 Importantly, spiking that is monosynaptically driven by thalamocortical afferents is significantly more resistant to blockade by isoflurane than the reverberant activity generated within the burst by local corticocortical connections (fig. 4, B and C). These results are consistent with a model in which anesthetics, at doses causing loss of consciousness and suppression of sensory awareness, act directly on circuits intrinsic to the cortex rather than on pathways carrying information to the cortex from the periphery.
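Network bursts of the kind discussed here are commonly detected by binning pooled spike times and thresholding the population rate. The sketch below shows this approach on a surrogate spike train; the bin width, threshold, and spike data are illustrative assumptions rather than parameters from the cited studies.

```python
import numpy as np

def detect_bursts(spike_times, bin_ms=10.0, thresh=5):
    """Return onset times (s) of network bursts, defined as bins in which
    the pooled population spike count reaches a threshold.

    spike_times: 1-D array of spike times (s) pooled across all units.
    """
    bin_s = bin_ms / 1000.0
    edges = np.arange(0.0, spike_times.max() + bin_s, bin_s)
    counts, _ = np.histogram(spike_times, bins=edges)
    active = counts >= thresh
    # Burst onsets: active bins whose preceding bin was quiescent.
    onsets = np.flatnonzero(active & ~np.roll(active, 1))
    return edges[onsets]

# Surrogate pooled spike train: sparse background plus one dense burst.
rng = np.random.default_rng(2)
spikes = np.sort(np.concatenate([
    rng.uniform(0.0, 10.0, 50),   # background firing over 10 s
    rng.uniform(4.0, 4.2, 200),   # a network burst near t = 4 s
]))
print(detect_bursts(spikes))  # ~[4.0]
```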
Isoflurane depresses polysynaptic bursts more than monosynaptically driven activity. (A) Current source density plots of activity in a mouse auditory thalamocortical slice induced by electrical stimulation of thalamic afferents. The vertical extent of the plots spans the entire cortical depth. Arrowheads indicate the times of occurrence of the stimulation pulses (four pulses at 40 Hz). Cold colors represent current sinks, warm colors current sources. Brief monosynaptic responses (~10 ms) appear immediately after each stimulation pulse, whereas the much longer bursts arise after the third stimulation pulse and evolve over hundreds of milliseconds poststimulus. Compare the almost complete depression of bursts by isoflurane to the moderate attenuation of the monosynaptic responses. (B) Depression of monosynaptically driven (“early”) spiking activity in thalamocortical slices by isoflurane. Each point represents the integral of these early responses (see Hentschke et al.121 for details) from a slice, normalized to the drug-free condition. (C) Integral of burst activity induced by thalamocortical (TC) stimulation and cortical layer 1 (L1) stimulation (same conventions as in B apply). Bu, burst; integ, integral; iso, isoflurane; norm, normalized to control. Reproduced under the terms of the Creative Commons Attribution License. A modified from Raz et al.49; B and C slightly modified from Hentschke et al.121
Sharing of information between local cortical networks is central to supporting consciousness.90 A major pathway for this integration process is via corticocortical connections between cortical columns, which form functional units for processing sensory information. Disruption of these connections by anesthetic agents would likely contribute to cortical disintegration during loss of consciousness. Experiments in brain slices support this model. Network activity propagates “horizontally” in the cortex from column to column via local connectivity,112,117,122,123 especially among infragranular pyramidal cells.123 In thalamocortical brain slices, isoflurane suppresses intercolumnar propagation of network activity, whether spontaneous or induced by thalamocortical or corticocortical afferent stimulation (fig. 5, A and B).121 These effects are consistent with a model in which isoflurane suppresses the activation of local networks by the propagating wave. Anesthetic interruption of propagating network activity is also seen in cortical slices activated by removal of magnesium from the artificial cerebrospinal fluid (fig. 6, A and B). In this model, spontaneous seizure-like event activity spreads freely across the full extent of the cortex, even across hemispheres. While this is strictly an epileptiform model, correlated network activity in the neocortex has also been described in the context of synchronized brain states that occur under surgical levels of anesthesia and during slow-wave sleep.124 Furthermore, anesthetic effects on seizure-like event activity correlate directly with in vivo anesthetic hypnotic potency.55,125–127 Two main conclusions can be drawn from recent experiments. First, agents of different classes, including ketamine, etomidate, and propofol, constrain the spatial extent of seizure-like event spread from the event initiation source.127 When multiple initiation locations are evident in the same slice, the effect is identical to isolating one source from the other by physically sectioning the tissue. Thus, the anesthetic effect seems to equate to a functional disconnection. Second, the population events recorded near the source of seizure-like event generation have shorter rise times and higher amplitudes during anesthetic exposure,128 suggestive of enhanced local network synchrony. This dual effect of an enhanced local response combined with reduced activity spread bears a striking resemblance to the effect of midazolam on cortical responses to transcranial magnetic stimulation in human subjects during loss of consciousness and during slow-wave sleep.47
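Propagation speed in these experiments is estimated from the relative onset latencies of the traveling wave at successive recording sites. A minimal sketch, assuming a single wavefront sweeping a uniformly spaced linear array, is shown below; the array geometry, nominal speed, and timing jitter are hypothetical, not values from Hentschke et al.121

```python
import numpy as np

def propagation_speed(onsets_s, positions_um):
    """Estimate wave propagation speed (mm/s) along a linear array by
    regressing burst-onset time against electrode position. Assumes a
    single wavefront sweeping the array in one direction.
    """
    slope, _ = np.polyfit(positions_um, onsets_s, 1)  # seconds per µm
    return 1.0 / slope / 1000.0  # µm/s -> mm/s

# Hypothetical onsets on a 16-site array with 100-µm spacing:
# a wave traveling ~25 mm/s, plus 2-ms timing jitter.
rng = np.random.default_rng(3)
pos = np.arange(16) * 100.0  # electrode positions (µm)
onsets = pos / 25_000.0 + rng.normal(0.0, 0.002, 16)
print(f"{propagation_speed(onsets, pos):.1f} mm/s")
```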
Anesthetics slow and impair propagation of cortical activity in acute thalamocortical slices. (A) Example of neocortical burst activity in a thalamocortical slice. Bursts were either induced by electrical stimuli (dotted lines) in auditory thalamus (TC) or in cortical layer 1 (L1) or arose spontaneously; they were recorded extracellularly with a linear 16-channel array placed in layer 5 of neocortex. Gray traces are three representative trials; colored thick traces are averages. Note the rapid uni- or bidirectional burst propagation during control, and its impairment by a very low concentration of isoflurane. (B) Speed of burst propagation at various isoflurane concentrations, normalized to control. Each filled circle is one slice. Bu, burst; iso, isoflurane; norm, normalized to control; S, spontaneous. Modified and reproduced under the terms of the Creative Commons Attribution License from Hentschke et al.121
Anesthetics slow and impair propagation of seizure-like event (SLE) activity in acute cortical slices. Schematic (A) and recorded data (B) showing the effect of etomidate on the pattern of zero-magnesium SLE activity in the cortical slice. Shown is one hemisphere of a coronally cut slice with two recording electrodes (R1 and R2), with a hypothetical (but realistic128) scenario of two independent sources of SLE activity (S1 and S2), each of which initiates repeating waves of excitation that spread across the full extent of the cortex in opposite directions (A, left). Under this baseline (drug-free) condition, each event will be recorded by both electrodes, with small interelectrode time lags reflecting the speed of wave propagation. As such, each event will appear “synchronized” across both channels (B, left). A proposed explanation for the effect of etomidate is shown schematically (A, right). Propagation of some of the SLE wavefronts is curtailed such that some of the events initiated at S1 will not reach R1 and vice versa. Consequently, the recordings will take on a “desynchronized” appearance (B, right). Variations of this theme will be apparent from slice to slice, according to the number of SLE initiation sources present and where those sources are located relative to the recording electrode positions. Recorded data are from Voss et al.128
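The shift from a “synchronized” to a “desynchronized” appearance across the two electrodes can be quantified as the fraction of events at one electrode that are matched by an event at the other within a short lag window. The sketch below implements this simple coincidence measure; the event times and lag window are hypothetical, not data from Voss et al.128

```python
import numpy as np

def coincident_fraction(events_a, events_b, max_lag=0.5):
    """Fraction of events at electrode A matched by an event at electrode B
    within max_lag seconds; a proxy for interelectrode SLE synchrony.
    """
    if len(events_a) == 0 or len(events_b) == 0:
        return 0.0
    b = np.sort(np.asarray(events_b))
    idx = np.searchsorted(b, events_a)
    left = b[np.maximum(idx - 1, 0)]        # nearest B event at or before
    right = b[np.minimum(idx, len(b) - 1)]  # nearest B event at or after
    nearest = np.minimum(np.abs(events_a - left), np.abs(events_a - right))
    return float(np.mean(nearest <= max_lag))

# Hypothetical SLE times (s) at electrode A, and at electrode B in the
# drug-free condition vs. under etomidate (some wavefronts curtailed).
a = np.array([10.0, 25.0, 41.0, 58.0])
b_ctrl = a + 0.1
b_etom = np.array([10.1, 58.1])
print(coincident_fraction(a, b_ctrl))  # 1.0: "synchronized"
print(coincident_fraction(a, b_etom))  # 0.5: "desynchronized"
```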
Hippocampal Slice Preparations and Their Role in Anesthesia Research
Much of the original work with brain slice techniques focused on the hippocampus, a specialized archicortical structure in the temporal lobe that is conserved in form and function across mammals.4 The hippocampal slice has led to many great advances in understanding learning and memory,129 and anesthesiology researchers have used this system extensively to better understand the amnesic effects of anesthetic drugs.130–132 However, even with its classic and well-described role in learning and memory consolidation, it is an oversimplification to exclude the hippocampus from a discussion of the hypnotic effects of anesthetics. The hippocampus is exquisitely sensitive to anesthetics,133 and a case can be built for it subserving a critical role in binding episodic memory formation to our experience of the external world.134 It has been argued that the neural processes underpinning episodic memory could actually be regarded as indistinguishable from those underpinning consciousness.134 The neuroanatomical connectivity of the hippocampus to some extent supports this. Although its connections are not spatially arranged exactly like those of the neocortex (i.e., motor and sensory cortices),23 the hippocampus maintains many reciprocal connections with corticothalamic networks.
Outright seizures are a rare but important complication of anesthesia.135,136 The hippocampal slice has also been used in combination with pharmacologic manipulation of the GABAA receptor for mechanistic investigations of seizures. For example, propofol133 and isoflurane137 have differential excitatory effects on cortical and hippocampal networks. Propofol, in particular, has a propensity to induce ictal-like events in hippocampal slices.133 Interestingly, recent evidence suggests that inhibition of GABA signaling by bath application of the macrolide antibiotic clarithromycin does not cause seizure-like events when administered alone to slices, but increases the frequency of these events when given in combination with a high-potassium solution known to cause hyperexcitability.138 Human studies link clarithromycin with improved vigilance in patients with hypersomnia,139 highlighting a potential avenue for hastening recovery from general anesthesia without hypoactive delirium in the recovery room.140
Functional Implications and Future Directions
While we present evidence for actions of anesthetics at multiple locations in the brain, each contributing to specific anesthetic endpoints, our emphasis has been on cortical actions that likely impact the content of consciousness. That is, while actions on the brainstem and midbrain nuclei play important roles in arousal and in the transitions into and out of awareness, there is substantial evidence that consciousness itself is a corticothalamic phenomenon. Within such a highly recurrent network, there may be multiple hubs and pathways that play critical roles, any one of which may represent the primary or “first” site of anesthetic action depending on the agent, context, and organism of interest. The greatest current gap in our understanding of loss and recovery of consciousness under anesthesia lies at this network level. Because monitors of brain activity in clinical settings invariably sample activity averaged over large numbers of cortical cells, improvements in assays of depth of anesthesia will derive from studies at this spatial scale. Herein, we have reviewed some of the evidence that cortical network activity is particularly sensitive to general anesthetics, but many questions remain. First on this list is the mechanism underlying this sensitivity. Is the disruption of cortical network activity a function of the structure of the network, i.e., sparsely firing cells densely interconnected via synaptic connections with low release probability, balanced excitation and inhibition, and columnar stratification of external connectivity,141 or do corticocortical synapses or cortical pyramidal cells possess intrinsic elements that are specifically sensitive to anesthetics, properties not found at thalamocortical synapses or in the membranes of subcortical neurons? A second critical question is why disruption of network activity is so deleterious, given that sensory stimulation still drives activity in the sensory cortex under anesthesia and gross features of tuning are preserved.142–144 Evidence suggests that cortical network activity may underlie bidirectional communication within the cortical hierarchy92 that is critical for information integration and predictive coding,93,94 processes central to the construction of self, mind, and experience. Investigations in vivo and ex vivo aimed at understanding how the transfer of information packets109 between cortical regions occurs and how this process is disrupted by general anesthetics will contribute to our understanding of both anesthesia in a clinical setting and the neural basis of consciousness itself.
Research Support
Supported by National Institutes of Health grant Nos. R01 GM109086 and GM116916A (Bethesda, Maryland; to Dr. Banks); the Waikato Medical Research Foundation (Hamilton, New Zealand; to Dr. Voss); the Veteran’s Affairs Career Development Award No. BX00167 (Washington, DC; to Dr. García); and the James S. McDonnell Foundation Grant No. 220023046 (St. Louis, Missouri; to Dr. García).
Competing Interests
The authors declare no competing interests.