One of the most noticeable publishing
trends of the last ten years has been the rise and rise of the
‘misery memoir’. In this genre, the authors recount the tough times
of their childhood and how they have risen above them to be
successful and fulfilled individuals. The genre can be sub-divided
into two categories. The first is the poor-but-happy tale, the ‘we
had nothing but we had love’ story. The second, which may or may
not also include poverty, tends to be much more disturbing. It
focuses on harrowing tales of childhood neglect and childhood
abuse, and some of these memoirs have been hugely successful. A
Child Called It by Dave Pelzer, possibly the most famous of
this category of books, spent over six years on the New York
Times bestseller list.
A substantial part of the appeal of these memoirs seems to lie in their triumph-over-adversity narratives. Readers seem to take heart from the stories of individuals
who, despite a terrible start in life, finally grow up to be happy,
well-balanced adults. We applaud those who become winners ‘against
the odds’.
This tells us something quite revealing.
It shows that, as a society, we believe that early childhood events
are extremely important in influencing adult life. It also shows
that we believe that it is very difficult to get over the effects
of early trauma. As a readership, we possibly value these
successful survivors because of what we perceive as their relative
rarity.
In many ways, we are correct in our
assumptions, as it is true that dreadful early childhood experiences
really can have a dramatic impact on adult life. There are all
sorts of ways in which this has been measured and the precise
figures may vary from study to study. Despite this, certain clear
trends have emerged. Childhood abuse and neglect result in adults
with a three times greater risk of suicide than the general
population. Abused children are at least 50 per cent more likely
than the general population to suffer from serious depression as
adults, and will find it much harder to recover from this illness.
Adults who were subjected to childhood abuse and neglect are also
at significantly higher risk of a range of other conditions
including schizophrenia, eating disorders, personality disorders,
bipolar disorder and generalised anxiety. They are also more likely
to abuse drugs or alcohol1.
An abusive or neglectful environment when
young is clearly a major risk factor for the development of later
neuropsychiatric disorders. We are so aware of this as a society
that sometimes we almost forget to question why this should be the
case. It just seems self-evident. But it’s not. Why should events
that lasted for two years, for example, still have adverse
consequences for that individual several decades
later?
One explanation that is often given is
that the children are ‘psychologically damaged’ by their early
experiences. Whilst true, this isn’t that helpful a statement. The
reason why it’s not helpful is that the phrase ‘psychologically
damaged’ isn’t really an explanation at all – it’s a description.
It sounds quite convincing but on certain levels it doesn’t really
tell us anything.
Any scientist addressing this problem
will want to take this description and probe it at another level.
What are the molecular events that underlie this psychological
damage? What happens in the brains of the abused or neglected
children that leaves them so prone to mental health problems as
adults?
There is sometimes resistance to this
approach from other disciplines, which work within different
conceptual frameworks.
This seems rather puzzling. If we don’t
accept there is a molecular basis to a biological effect, what are
we left with? A religious person may prefer to invoke the soul,
just as a Freudian therapist may invoke the psyche. Both of these
refer to a theoretical construct that has no defined physical
basis. Moving into such a model system, where it is impossible to
develop the testable hypotheses that are the cornerstone of all
scientific enquiry, is deeply unattractive to most scientists. We
prefer to probe for a mechanism that has a physical foundation,
rather than defaulting to a scenario in which there is something
which is assumed, somehow, to be a part of us, without having any
physical existence.
This can generate a cultural clash, but
it’s one that’s based on a misunderstanding. A scientist will
expect that observable events have a physical basis. For the topic
of this chapter, our proposed hypothesis is that terrible early
childhood experiences change certain physical aspects of the brain
during a key developmental period. This in turn affects the
likelihood of mental health problems in adult life. This is a
mechanistic explanation. It’s lacking in details, admittedly, but
we’ll fill in some of these in this chapter. Mechanistic
explanations often sit uncomfortably in our society, because they
sound too deterministic. Mechanistic explanations are
misinterpreted and taken to imply that humans are essentially
robots, wired and programmed to respond in certain ways to certain
stimuli.
But this doesn’t have to be the case. If
a system has enough flexibility, then one stimulus doesn’t always
have to result in the same outcome. Not every abused or neglected
child develops into a vulnerable, unwell adult. A phenomenon can
have a mechanistic basis, without being deterministic.
The human brain possesses sufficient
flexibility to generate different adult outcomes in response to
similar childhood experiences. Our brains contain one hundred
billion nerve cells (neurons). Each neuron makes links with ten
thousand other neurons to form an incredible three-dimensional
grid. This grid therefore contains a thousand trillion connections
– that’s 1,000,000,000,000,000 (a quadrillion). It’s hard to
imagine this, so let’s visualise each connection as a disc that’s
1mm thick. Stack up the quadrillion discs on top of each other and
they will reach to the sun (which is ninety-three million miles
from the earth) and back, three times over.
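For anyone who wants to check the arithmetic behind that image, here is a quick back-of-the-envelope sketch in Python. It is purely illustrative: the only inputs are the figures quoted in the paragraph above (one hundred billion neurons, ten thousand links each, 1mm discs, ninety-three million miles to the sun), and it confirms that the stack covers roughly three sun-and-back round trips.

```python
# Back-of-the-envelope check of the disc-stacking image in the text.
# All figures are the ones quoted in the paragraph above.

neurons = 100e9                  # one hundred billion nerve cells
links_per_neuron = 10_000        # connections made by each neuron
connections = neurons * links_per_neuron   # ~1e15, a quadrillion

disc_thickness_m = 0.001         # each connection pictured as a 1mm disc
stack_height_m = connections * disc_thickness_m

miles_to_sun = 93e6
metres_to_sun = miles_to_sun * 1609.34
round_trip_m = 2 * metres_to_sun

print(f"Connections: {connections:.0e}")
print(f"Stack height: {stack_height_m:.2e} m")
print(f"Sun-and-back trips: {stack_height_m / round_trip_m:.1f}")
# Prints roughly 3.3 round trips - 'to the sun and back, three times over'.
```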
That’s a lot of connections, so it’s
perfectly possible to imagine that our brains have a lot of
flexibility. But the connections are not random. There are networks
of cells within the giant grid which are more likely to link to
each other than to anywhere else. It’s this combination of huge
flexibility, but constrained within certain groupings, that is
compatible with a system that is mechanistic but not entirely
deterministic.
The child is (epigenetically) father to
the man
The reason scientists have hypothesised
that the adult sequelae of early childhood abuse may have an
epigenetic component is that we’re dealing with scenarios where a
triggering event continues to have consequences long after the
trigger itself has disappeared. The long-term consequences of
childhood trauma are very reminiscent of many of the effects that
are mediated by epigenetic systems. We have seen some examples of
this already. Differentiated cells remember what cell type they
are, even after the signal that told them to become kidney cells or
skin cells has long since vanished. Audrey Hepburn suffered from
ill-health her whole life because of the malnutrition she endured
as a teenager during the Dutch Hunger Winter. Imprinted genes get
switched off at certain stages in development, and stay off
throughout the rest of life. Indeed, epigenetic modifications are
the only known mechanism for maintaining cells in a particular
state for exceptionally long periods of time.
The hypothesis that epigeneticists are
testing is that early childhood trauma causes an alteration in gene
expression in the brain, which is generated or maintained (or both)
by epigenetic mechanisms. These epigenetically mediated
abnormalities in gene expression predispose adults to increased
risk of mental illnesses.
In recent years, scientists have begun to
generate data suggesting that this is more than just an appealing
hypothesis. Epigenetic proteins play an important role in
programming the effects of early trauma. Not only that, they are also involved in adult depression, drug addiction and ‘normal’
memory.
The focus of a lot of research in this
field has been a hormone called cortisol. This is produced by the adrenal glands, which sit on top of the kidneys. Cortisol is
produced in response to stress. The more stressed we are, the more
cortisol we produce. The average level of cortisol production tends
to be raised in adults who had traumatic childhoods, even if the
individuals are healthy at the time of measurement2,3. What this shows is that
adults who were abused or neglected as children have higher
background stress levels than their contemporaries. Their systems
are chronically stressed. The development of mental illness is, in
many cases, probably a little like the development of cancer. A lot
of things need to go wrong at the molecular level before a person
becomes clinically ill. The chronic stress levels in the abuse
survivors push them closer to that threshold. This increases their
vulnerability to disease.
How does this over-production of cortisol
happen? It’s a consequence of events that happen far from the
kidneys, in our brains. There is a whole signalling cascade
involved here. Chemicals produced in one region of the brain act on
other areas. These areas in turn produce other chemicals in
response and the process continues. Eventually a chemical leaves
the brain and signals to the adrenal glands and cortisol is
produced. During an abusive childhood, this signalling cascade is
very active. In many abuse survivors, this system keeps signalling
as if the person is still trapped in the abusive situation. It’s as
if the thermostat on a central heating system has malfunctioned,
and the boiler and radiators continue to pump out heat in August,
based on the weather from the previous February.
The process starts in a region of the
brain called the hippocampus, which gets its name from the ancient
Greek term for seahorse, because it is shaped a little like this creature. The hippocampus acts as a master switch in controlling
how much the cortisol system becomes activated. This is shown in
Figure 12.1. In this figure, a plus symbol
indicates that one event acts to stimulate the next link in the
chain. A minus symbol shows the opposite effect, where one event
decreases the level of activity of the next event in the
chain.
Because of changes in the activities of
the hippocampus in response to stress, the hypothalamus produces
and releases two hormones, called corticotrophin-releasing hormone
and arginine vasopressin. These two hormones stimulate the
pituitary, which responds by releasing a substance called
adrenocorticotrophic hormone, which gets into the bloodstream. When
the cells of the adrenal gland take up this hormone, they release
cortisol.
There’s a clever mechanism built into
this system. Cortisol circulates around the body in the
bloodstream, and some of it goes back into the brain. The three
brain structures shown in our diagram all carry receptors that
recognise cortisol. When cortisol binds to these receptors, it
creates a signal that tells these structures to calm down. It’s
particularly important for this to happen at the hippocampus, as
this structure can send out signals to dampen down all the others
involved in this signalling. This is a classic negative feedback
loop. Production of cortisol feeds back on various tissues, and the
final effect is that the production of cortisol declines. This
stops us from being constantly over-stressed.
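To make the logic of this feedback loop concrete, here is a deliberately crude toy simulation in Python. It is not a physiological model: the function name, the rate constants and the ‘receptor_level’ values are all invented for illustration. The brain drives cortisol release; circulating cortisol, sensed through its receptors, damps that drive back down. The ‘low receptor’ case anticipates the situation described next, where the feedback works less efficiently and the resting level creeps up.

```python
# Toy sketch of the negative feedback loop in Figure 12.1. Not a
# physiological model: variables, rate constants and receptor levels
# are invented purely to illustrate the feedback logic.

def baseline_cortisol(receptor_level, stress_drive=1.0, steps=50):
    """Crude discrete-time loop: the brain drives cortisol release, and
    circulating cortisol, sensed via its receptors, damps that drive."""
    cortisol = 0.0
    for _ in range(steps):
        drive = stress_drive / (1.0 + receptor_level * cortisol)
        cortisol += drive - 0.5 * cortisol   # production minus clearance
    return cortisol

# Plenty of receptor -> efficient feedback -> lower resting level.
print(f"high receptor: {baseline_cortisol(receptor_level=2.0):.2f}")   # ~0.8
print(f"low receptor:  {baseline_cortisol(receptor_level=0.2):.2f}")   # ~1.5
```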
But we know that adults who suffered
traumatic childhoods are actually over-stressed. They
produce too much cortisol, all the time. Something must be going
wrong in this feedback loop. There are a few studies in humans that
show that this is the case. These studies examined the levels of
corticotrophin-releasing hormone in the fluid bathing the brain and
spinal cord. As predicted, the levels of corticotrophin-releasing
hormone were higher in individuals who had suffered childhood abuse
than in individuals who hadn’t. This was true even when the
individuals were healthy at the time of the experiments4,5. Because it’s so hard to
investigate this fully in humans, a lot of the breakthroughs in
this field have come from using animal models of certain conditions
and then correlating them where possible with what we know from
human cases.
Relaxed rats and mellow mice
A useful model has been based around
the mothering skills of rats. In the first week of their lives, rat
babies love being licked and groomed by their mothers. Some mothers
are naturally very good at this, others less so. If a mother
is good at it, she’s good at it in all her pregnancies. Similarly,
if she’s a bit lackadaisical at the licking and grooming, this is
true for every litter she has.
If we test the offspring of these
different mothers when the pups are older and independent, an
interesting effect emerges. When we challenge these now adult rats
with a mildly stressful situation, the ones that were licked and
groomed the most stay fairly calm. The ones that were relatively
deprived of ‘mother love’ react very strongly to even mild stress.
Essentially, the rats that had been licked and groomed the most as
babies were the most chilled out as adults.
The researchers carried out experiments
where newborn rats were transferred from ‘good’ mothers to ‘bad’
and vice versa. These experiments showed that the final responses
of the adults were completely due to the love and affection they
received in the first week of life. Babies born to mothers who were
lacklustre lickers and groomers grew up nicely chilled out if they
were fostered by mothers who were good at this.
The low stress levels of the adult rats
that had been thoroughly nurtured as babies were shown by measuring
their behaviour when they were challenged by mild stimuli. They
were also monitored hormonally, and the effects were as we would
expect. The chilled-out rats had lower levels of
corticotrophin-releasing hormone in their hypothalamus and lower
levels of adrenocorticotrophic hormone in their blood. Their levels
of cortisol were also low, compared with the less nurtured
animals.
The key molecular factor in dampening
down the stress responses in the well-nurtured rats was the
expression of the cortisol receptor in the hippocampus. In these
rats, the receptor was highly expressed. As a result, the cells of
the hippocampus were very efficient at catching even low amounts of
cortisol, and using this as the trigger to subdue the downstream
hormonal pathway, through the negative feedback loop.
This showed that levels of the cortisol
receptor stayed high in the hippocampus, many months after the
all-important licking and grooming of the baby rats. Essentially,
events that only happened for seven days immediately after birth
had an effect that lasted for pretty much all of a rat’s
life.
The reason the effect was so long-lasting
is that the initial stimulus – being licked and groomed by the
mother – set off a chain of events that led to epigenetic changes
to the cortisol receptor gene. These changes occurred very early in
development when the brain was at its most ‘plastic’. By plastic,
we mean that this is the time when it’s easiest to modify the gene
expression patterns and cellular activities. As the animals get
older, these patterns stay set in place. That’s why the first week
in rats is so critical.
The changes that take place are shown in
Figure 12.2. When a baby rat is licked and
groomed a lot, it produces serotonin, one of the feel-good
chemicals in mammalian brains. This stimulates expression of
epigenetic enzymes in the hippocampus, which ultimately results in
decreased DNA methylation of the cortisol receptor gene. Low levels
of DNA methylation are associated with high levels of gene
expression. Consequently, the cortisol receptor is expressed at
high levels in the hippocampus, and can keep the rats relatively
relaxed6.
This is a very interesting model to
explain how early life events can influence long-term behaviour.
But it seems unlikely that just one epigenetic alteration – even
one as significant as DNA methylation levels at a very important
gene in a critical brain region – could be the only answer. Five
years after the work described above, another paper was published
by a different group. This also showed the importance of epigenetic
changes but in a different gene.
The later group used a mouse model of
early-life stress. In this model, baby mice were taken away from
their mothers for three hours a day, for the first ten days of
their lives. Just like the baby rats that hadn’t been licked or
groomed much, these babies developed into ‘high-stress’ adults.
Cortisol levels were increased in these mice, especially in
response to mild stress, just like the relatively neglected
rats.
The researchers working on the mice
studied the arginine vasopressin gene. Arginine vasopressin is
secreted by the hypothalamus, and stimulates secretion from the
pituitary. It is shown in Figure 12.1. The
stressed-out mice, those that had suffered separation from their
mothers in early life, had decreased DNA methylation of the
arginine vasopressin gene. This resulted in increased production of
arginine vasopressin, which stimulated the stress response7.
The rat and mouse experimental studies
show us two important things. The first is that when early life
events lead to adult stress, there is probably more than one gene
involved. Both the cortisol receptor gene and the arginine
vasopressin gene can contribute to this phenotype in
rodents.
Secondly, the studies also show us that a
particular class of epigenetic modification is not in itself good
or bad. It’s where the modification happens that matters. In the
rat model, the decreased DNA methylation of the cortisol receptor
gene is a ‘good’ thing. It leads to increased production of this
receptor, and a general dampening down of the stress response. In
the mouse model, the decreased DNA methylation of the arginine
vasopressin gene is a ‘bad’ thing. It leads to increased expression
of this hormone and a stimulation of the stress
response.
The decreased DNA methylation of the
arginine vasopressin gene in the mouse model occurred through a
different route to the one used in the rat hippocampus to activate
the cortisol receptor gene.
In the mouse studies, separation from the
mother triggered activity of the neurons in the hypothalamus. This
set off a signalling cascade that affected the MeCP2 protein. MeCP2
is the protein we met in Chapter 4, which
binds to methylated DNA and helps repress gene expression. The gene that encodes it is the one mutated in Rett syndrome, the devastating
neurological disorder. Adrian Bird has shown that the MeCP2 protein
is incredibly highly expressed in neurons8.
Normally, MeCP2 protein binds to the
methylated DNA at the arginine vasopressin gene. But in the
stressed baby mice, the signalling cascade mentioned in the
previous paragraph adds a small chemical group called a phosphate
to the MeCP2 protein and, because of this, MeCP2 falls off the
arginine vasopressin gene. One of the important roles of MeCP2 is
attracting other epigenetic proteins to where it is bound on a
gene. These are proteins that all cooperate to add more and more
repressive marks to that region of the genome. When the
phosphorylated MeCP2 falls off the arginine vasopressin gene, it
can no longer recruit these different epigenetic proteins. Because
of this, the chromatin loses its repressive marks. Activating
modifications get put on instead, such as high levels of histone
acetylation. Ultimately, even the DNA methylation is permanently
lost.
Amazingly, this all happens in the mice in
the first ten days after birth. After that, the neurons essentially
lose their plasticity. The DNA methylation pattern that’s in place
at the end of this stage becomes the stable pattern at this
location. If the DNA methylation levels are low, this will normally
be associated with abnormally high expression of the arginine
vasopressin gene. In this way, the early life events trigger
epigenetic changes which get effectively ‘stuck’. Because of this,
the animal continues to be highly stressed, with abnormal hormone
production, long after the initial stress has vanished. Indeed, the
response continues long after the animal would even normally ‘care’
about whether or not it has its mother’s company. After all, mice
are not renowned for hanging about to look after their ageing
parents.
In the depths
Researchers are gradually gathering
data that suggest some of the changes seen in the rodent models of
early stress may be relevant in humans. As mentioned earlier, there
are logistical, but more importantly ethical, issues which make it
impossible to perform the same kinds of studies in people. Even so,
some intriguing correlations are emerging.
The original work in the rat model was
carried out by Professor Michael Meaney at McGill University in
Montreal. His group subsequently performed some interesting studies
on human brain samples from individuals who had, sadly, committed
suicide. The group analysed the levels of DNA methylation at the
cortisol receptor gene in the hippocampus from these cases. Their
data showed that the DNA methylation tended to be higher in the
samples from people who had had a history of early childhood abuse
or neglect. By contrast, the DNA methylation levels at this gene
were relatively low in the suicide victims who had not had
traumatic childhoods9. The high DNA methylation
levels in the abuse victims would drive down expression of the
cortisol receptor gene. This would make the negative feedback loop
less efficient and raise the circulating levels of cortisol. This
was consistent with the findings from the rat work, where the
stressed-out animals from the less nurturing mothers had high
levels of DNA methylation at the cortisol receptor gene in the
hippocampus.
Of course, it isn’t just people who have
had abusive childhoods who develop mental illnesses. The global
figures for depression are startling. The World Health Organisation
estimates that over 120 million people worldwide are affected by
depression. Depression-related suicides have reached 850,000 per
annum and depression is predicted to become the second greatest
contributor to the global disease burden by 202010.
Effective treatment for depression took a
big step forwards in the early 1990s with the licensing by the US
Food and Drug Administration of a class of drugs called SSRIs –
selective serotonin re-uptake inhibitors. Serotonin is a
neurotransmitter molecule – it conveys signals between neurons.
Serotonin is released in the brain in response to pleasurable
stimuli; it’s the feel-good molecule that we met in our happy rat
babies. The levels of serotonin are low in the brains of people
suffering from depression. SSRI drugs raise the levels of serotonin
in the brain.
It makes sense that drugs that cause an
increase in serotonin levels would be useful in treating
depression. But there’s something odd about their action. The
serotonin levels in the brain rise quite quickly when patients are
treated with the SSRI drugs. But it usually takes at least four to
six weeks before the terrible symptoms of severe depression begin
to lift.
This suggests that there is more to
depression than simply a drop in the levels of a single chemical in
the brain, which perhaps isn’t that surprising. It’s very unusual
for depression to happen overnight – it’s not like coming down with
the flu. There’s now a reasonable amount of data showing that there
are much longer-term changes in the brain as depression develops.
These include alterations in the numbers of contacts that neurons
make with each other. This in turn is critically dependent on the
levels of chemicals called neurotrophic factors11.
These chemicals support healthy survival and function of brain
cells.
Researchers in the depression field have
moved away from a simple model based on levels of neurotransmitters
and into a more complex network system. This involves sophisticated
interactions between neuronal activity and a whole range of other
factors. These include stress, production of neurotransmitters,
effects on gene expression and longer-term consequences for neurons
and how they interact with each other. While this system is in
balance, the brain functions healthily. If the system moves out of
balance, this complicated network begins to unravel. This moves the
brain’s biochemistry and function further away from health and
closer to dysfunction and disease.
Scientists are beginning to focus their
attention in this field on epigenetics, because of its potential to
create and sustain long-lasting patterns of gene expression.
Rodents are the most common model system for these investigations.
Because a mouse or a rat can’t tell you how it’s feeling,
researchers have created certain behavioural tests that are used to
model different aspects of human depression.
We all recognise that different people
seem to respond to stress in different ways. Some people seem
fairly robust. Others can react really badly to the same stressful
situation, even developing depression. Mice from different inbred
strains are like this as well. Researchers exposed two different
strains to mildly stressful stimuli. After the stressful situation,
the researchers assessed the behaviour of the mice in some of the
tests which mimic certain aspects of human depression. One strain
was relatively non-anxious, whereas the other was relatively
anxious. These strains were called B6 and BALB, but we’ll call
them ‘chilled’ and ‘jumpy’, respectively, for
convenience.
The researchers focused their studies on
a region of the brain called the nucleus accumbens. This region
plays a role in various emotionally important brain functions.
These include aggression, fear, pleasure and reward. The
researchers analysed the expression of various neurotrophic factors
in the nucleus accumbens. The one that gave the most interesting
results was a gene called Gdnf (glial cell-derived neurotrophic factor).
Stress caused an increase in expression
of the Gdnf gene in the chilled mice. In the jumpy strain it
caused a decrease in expression of the same gene. Now, different
inbred strains of mice can have different DNA codes so the
researchers analysed the promoter region, which controls the
expression of Gdnf. The DNA sequence of the Gdnf
promoter was identical in the chilled and the jumpy strains. But
when the scientists examined the epigenetic modifications in this
promoter, they found a difference. The histones of the jumpy mice
had fewer acetyl groups than the histones of the chilled mice. As
we’ve seen, low levels of histone acetylation are associated with
low levels of gene expression, so this fitted well with the
decreased Gdnf expression in the jumpy mice.
This led the scientists to wonder what
had happened in the neurons of the nucleus accumbens. Why had the
levels of histone acetylation dropped at the Gdnf gene in
the jumpy mice? The scientists examined the levels of the enzymes
that add or remove acetyl groups from histones. They found only one
difference between the two strains of mice. A specific histone
deacetylase (a member of the class of proteins that remove acetyl groups) called Hdac2 was much more highly expressed in the neurons
of the jumpy mice12, compared with the chilled
out mice.
Other researchers tested mice in a
different model of depression, called social defeat. In these
experiments, mice are basically humiliated. They’re put in an
environment where they can’t get away from a bigger, scarier mouse,
although they are removed before they come to any physical harm.
Some mice find this really stressful; others seem to brush it
off.
In the experiments adult mice underwent
ten days of social defeat. At the end of this they were classified
as either susceptible or resistant, depending on how well they
bounced back from the experience. Two weeks later the mice were
examined. The resistant mice had normal levels of
corticotrophin-releasing hormone. This is the chemical released by
the hypothalamus. It’s the one which ultimately stimulates the
production of cortisol, the stress hormone. The susceptible mice
had high levels of corticotrophin-releasing hormone and low levels
of DNA methylation at the promoter of this gene. This was
consistent with the high levels of expression from this gene. They
also had low levels of Hdac2, and high levels of histone
acetylation, which again fits with over-expression of the
corticotrophin-releasing hormone13.
It might seem odd that in one model
system Hdac2 levels went up in the susceptible mice, whereas in
another they went down. But it’s important with all these
epigenetic events to remember that context is everything. There
isn’t just one way in which Hdac2 levels (or those of any other
epigenetic gene, for that matter) are controlled. The control will
depend on the region of the brain and the precise signalling
pathways that are activated in response to a stimulus.
The drugs do work
There’s more evidence supporting a
significant role for epigenetics in responses to stress. The
naturally jumpy BALB mice were the ones with the increased expression
of Hdac2 in the nucleus accumbens, and decreased expression from
the Gdnf gene. We can treat these mice with SAHA, the
histone deacetylase inhibitor. SAHA treatment leads to increased
acetylation of the Gdnf promoter. This is associated with
increased expression of the Gdnf gene. The crucial finding
is that the treated mice stop being jumpy and become chilled
instead14 – changing the histone
acetylation levels of the gene changed the mouse’s behaviour. This
supports the idea that histone acetylation is really important in
modulating the responses of these mice to stress.
One of the tests used to investigate how
depressed the mice become in response to stress is called the
sucrose-preference test. Normal happy mice love sugared water, but
when they are depressed they aren’t so interested in it. This
decreased response to a pleasant stimulus is called anhedonia. It
seems to be one of the best surrogate markers in animals for human
depression15. Most people who have been
severely depressed talk about losing interest in all the things
that used to make life joyful before they became ill. When the
stressed mice were treated with SSRI anti-depressants, their
interest in the sugared water gradually increased. But when they
were treated with SAHA, the HDAC inhibitor, they regained their
interest in their favourite drink much faster16.
It’s not just in the jumpy or chilled
mice that histone deacetylase inhibitors can change animal
behaviour. It’s also relevant to the baby rats who don’t get much
maternal licking and grooming. These are the ones that normally
grow up to be chronically stressed, with over-activation of the
cortisol production pathway. If these ‘unloved’ animals are treated
with TSA, the first histone deacetylase inhibitor to be identified,
they grow up much less stressed. They react much more like the
animals who received lots of maternal care. The levels of DNA
methylation at the cortisol receptor gene in the hippocampus go
down, increasing expression of the receptor and improving the
sensitivity of the all-important negative feedback loop. This is
presumed to be because of cross-talk between the histone
acetylation and DNA methylation pathways17.
In the social defeat model in mice, the
susceptible animals were treated with an SSRI anti-depressant drug.
After three weeks of treatment, their behaviour was much more like
that of the resilient mice. But treatment with this anti-depressant
drug didn’t just result in increased levels of serotonin in the
brain. The anti-depressant treatment also led to increased DNA
methylation at the promoter of the corticotrophin-releasing
hormone gene.
These studies are all very consistent
with a model where there is cross-talk between the immediate
signals from the neurotransmitters, and the longer-term effects on
cell function mediated by epigenetic enzymes. When depressed
patients are treated with SSRI drugs, the serotonin levels in the
brain begin to rise, and signal more strongly to the neurons. The
animal work described in the last paragraph suggests that it takes
a few weeks for these signals to trigger all the pathways that
ultimately result in the altered pattern of epigenetic
modifications in the cells. This stage is essential for restoring
normal brain function.
Epigenetics is also a reasonable
hypothesis to explain another interesting but distressing feature
of severe depression. If you have suffered from depression once,
you are at a significantly higher risk than the general population
of suffering from it again at some time in the future. It’s likely
that some epigenetic modifications are exceptionally difficult to
reverse, and leave the neurons primed to be more vulnerable to
another bout.
The jury’s out
So far, so good. Everything looks very
consistent with our theory about life experiences having sustained
and long-lasting effects on behaviour, through epigenetics. And
yet, here’s the thing: this whole area, sometimes called
neuro-epigenetics, is probably the most scientifically contentious
field in the whole of epigenetic research.
To get a sense of just how controversial it is,
consider this. We’ve met Professor Adrian Bird in this book before.
He is acknowledged as the father of the DNA methylation field.
Another scientist with a very strong reputation in the science
behind DNA methylation is Professor Tim Bestor from Columbia
University Medical Center in New York. Adrian and Tim are about the
same age, of similar physical type, and both are thoughtful and low
key in conversation. And they seem to disagree on almost every
issue in DNA methylation. Go to any conference where they are both
scheduled in the same session and you are guaranteed to witness
inspiring and impassioned debate between the two men. Yet the one
thing they both seem to agree on publicly is their scepticism about
some of the reports in the neuro-epigenetics field18.
There are three reasons why they, and
many of their colleagues, are so sceptical. The first is that many
of the epigenetic changes that have been observed are relatively
small. The sceptics are unconvinced that such small molecular
changes could lead to such pronounced phenotypes. They argue that
just because the changes are present, it doesn’t mean they’re
necessarily having a functional effect. They worry that the
alterations in epigenetic modifications are simply correlative, not
causative.
The scientists who have been
investigating the behavioural responses in the different rodent
systems counter this by arguing that molecular biologists are too
used to quite artificial experimental models, where they can study
extensive molecular changes with very on-or-off read-outs. The
behaviourists suspect that this has left molecular biologists
relatively inexperienced at interpreting real-world experiments,
where the read-outs tend to be more ‘fuzzy’ and prone to greater
experimental variation.
The second reason for scepticism lies in
the very localised nature of the epigenetic changes. Infant stress
affects specific regions of the brain, such as the nucleus
accumbens, and not other areas. Epigenetic marks are only altered
at some genes and not others. This seems less of a reason for
scepticism. Although we refer to ‘the brain’, there are lots of
highly specialised centres and regions within this organ, the
product of hundreds of millions of years of evolution. Somehow, all
these separate regions are generated and maintained during
development and beyond, and thus are clearly able to respond
differently to stimuli. This is also the case for all our genes, in
all our tissues. It’s true that we don’t really know how epigenetic
modifications can be targeted so precisely, or how the signalling
from chemicals like neurotransmitters leads to this targeting. But
we know that similarly specific events occur during normal
development – so why not during abnormal periods of stress or other
environmental disturbances? Just because we don’t know the
mechanism for something, it doesn’t mean it doesn’t happen. After
all, John Gurdon didn’t know how adult nuclei were reprogrammed by
the cytoplasm of eggs, but that didn’t mean his experimental
findings were invalid.
The third reason for scepticism is
possibly the most important and it relates to DNA methylation
itself. DNA methylation at the target genes in the brain is
established very early, possibly pre-natally but certainly within
one day of birth, in rodents. What this means is that the baby mice
or baby rats in the experiments all started life with a certain
baseline pattern of DNA methylation at their cortisol receptor gene
in the hippocampus. The DNA methylation levels at this promoter
alter in the first week of life, depending on the amount of licking
and grooming the rats receive. As we saw, the DNA methylation
levels are higher in the neglected rats than in the loved ones. But that’s not because the DNA methylation has gone up in the neglected rats. It’s because DNA methylation has gone down in the ones
that were licked and groomed the most. The same is also true at the
arginine vasopressin gene in the baby mice removed from their
mothers. It’s also true for the corticotrophin-releasing hormone
gene in the adult mice that were susceptible to social
defeat.
So, in every case, what the scientists
observed was decreased DNA methylation in response to a stimulus.
And that’s where, molecularly, the problem lies, because no-one
knows how this happens. In Chapter 4 we saw
how copying of methylated DNA results in one strand that contains
methyl groups and one that doesn’t. The DNMT1 enzyme moves along
the newly synthesised strand and adds methyl groups to restore the
methylation pattern, using the original strand as a template. We
could speculate that in our experimental animals, there was less
DNMT1 enzyme present and so the methylation levels at the gene
dropped. This is referred to as passive DNA
demethylation.
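As a toy illustration of how that passive dilution would play out, here is a short Python sketch. It does nothing more than strand-level book-keeping with made-up numbers: replication gives each daughter duplex one old, methylated strand and one new, unmethylated one, so without DNMT1 the proportion of methylated strands simply halves at every division.

```python
# Toy arithmetic for 'passive' DNA demethylation, as described above.
# If DNMT1 copies the marks onto each newly made strand, nothing is lost;
# if it is absent, the methylated fraction is diluted by half per division.
# Figures and function name are illustrative only.

def methylated_strand_fraction(divisions, dnmt1_present):
    fraction = 1.0                  # start: every strand carries the marks
    for _ in range(divisions):
        if dnmt1_present:
            continue                # maintenance methylation restores marks
        fraction /= 2               # no maintenance: dilution by half
    return fraction

print(methylated_strand_fraction(3, dnmt1_present=True))    # 1.0
print(methylated_strand_fraction(3, dnmt1_present=False))   # 0.125
```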
The problem is that this can’t work in
neurons. Neurons are terminally differentiated – they are right at
the bottom of Waddington’s landscape, and cannot divide. Because
they don’t divide, neurons don’t copy their DNA. There’s no reason
for them to do so. As a result, they can’t lose their DNA
methylation by the method described in Chapter
4.
One possibility is that maybe neurons
simply remove the methyl group from DNA. After all, histone
deacetylases remove acetyl groups from histones. But the methyl
group on DNA is different. In chemical terms, histone acetylation
is a bit like adding a small Lego brick onto a larger Lego brick.
It’s pretty easy to take the two bricks apart again. DNA
methylation isn’t like that. It’s more like having two Lego bricks
and using superglue to stick them together.
The chemical bond between a methyl group
and the cytosine in DNA is so strong that for many years it was
considered completely irreversible. In 2000, a group from the Max
Planck Institute in Berlin demonstrated that this couldn’t be the
case. They showed that in mammals the paternal genome undergoes
extensive DNA demethylation, during very early development. We came
across this in Chapters 7 and 8. What we glossed over at the time was that this
demethylation happens before the zygote starts to divide. In other
words, the DNA methylation was removed without any DNA
replication19. This is referred to as
active DNA demethylation.
This means there is a precedent for
removing DNA methylation in non-dividing cells. Perhaps there’s a
similar mechanism in neurons. There’s still a lot of debate about
how DNA methylation is actively removed, even in the
well-established events in early development. There’s even less
consensus about how it takes place in neurons. One of the reasons
this has been so hard to investigate is that active DNA
demethylation may involve a lot of different proteins, carrying out
a number of steps one after another. This makes it very difficult
to recreate the process in a lab, which is the gold standard for
these kinds of investigations.
Silencing the silencer
As we’ve seen repeatedly, scientific
research often throws up some very unexpected findings and so it
happened here. While many people in epigenetics were looking for an
enzyme that removed DNA methylation, one group discovered enzymes
that added something extra to methylated DNA. This is shown in
Figure 12.3. Very surprisingly, this has
turned out to have many of the same consequences as demethylating
the nucleic acid.
A small chemical group called hydroxyl,
consisting of one oxygen atom and one hydrogen atom, is added to
the methyl group, to create 5-hydroxymethylcytosine. This reaction
is carried out by enzymes called TET1, TET2 or TET320.
This is highly relevant to the question
of DNA demethylation, because it’s the effects of DNA methylation
that make this change important. Methylation of cytosine affects
gene expression because methylated cytosine binds certain proteins,
such as MeCP2. MeCP2 acts with other proteins to repress gene
expression and to recruit other repressive modifications like
histone deacetylation. When an enzyme such as TET1 adds the
hydroxyl group to the methylcytosine to form the
5-hydroxymethylcytosine molecule, it changes the shape of the
epigenetic modification. If a methylated cytosine is like a grape
on a tennis ball, the 5-hydroxymethylcytosine is like a bean stuck
to a grape stuck to a tennis ball. Because of this change in shape,
the MeCP2 protein can’t bind to the modified DNA any more. The cell
therefore ‘reads’ 5-hydroxymethylcytosine in the same way as it
reads unmethylated DNA.
Many of the techniques used until very
recently looked for the presence of DNA methylation. They often
couldn’t distinguish between unmethylated DNA and
5-hydroxymethylated DNA. This means that many of the papers which
refer to decreased DNA methylation may actually have been detecting
increased 5-hydroxymethylation without knowing it. It’s currently
unproven, but it may be that instead of actually demethylating DNA,
as reported in some of the behavioural studies, neurons really
convert 5-methylcytosine to 5-hydroxymethylcytosine. The techniques
for studying 5-hydroxymethylcytosine are still under development
but we do know that neurons contain higher levels of this chemical
than any other cell type21.
Remember, remember
Despite these controversies, research
is continuing into the importance of epigenetic modifications in
brain function. One area that is attracting a lot of attention is
the field of memory. Memory is an incredibly complex phenomenon.
Both the hippocampus and a region of the brain called the cortex
are involved in memory, but in different ways. The hippocampus is
mainly involved in consolidating memories, as our brains decide
what we are going to remember. The hippocampus is fairly plastic in
the way that it operates, and this seems to be associated with
transient changes in DNA methylation, again through fairly
uncharacterised mechanisms. The cortex is used for longer-term
storage of memories. When memories are stored in the cortex, there
are prolonged changes in DNA methylation.
The cortex is like a hard drive on a
computer with gigabytes of storage. The hippocampus is more like
the RAM (random access memory) chip, where data are temporarily
processed before being deleted, or transferred to the hard drive
for permanent storage. Our brain separates out different functions
to selected cell populations in different anatomical regions. This
is why memory loss is rarely all-encompassing. Depending on the
clinical condition, for example, either short-term or long-term memory may be lost while the other remains relatively intact. It makes a lot of sense for these different functions to be
separated in our brains. Just try to imagine life if we remembered
everything that ever happened – the phone number that we dialled
only once, every word a dull stranger said to us on a train, or the
canteen menu from a wet Wednesday three years ago.
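To push the computing analogy one step further, here is a throwaway Python sketch of that division of labour. The class and method names are invented purely for the analogy: a temporary buffer is wiped after ‘consolidation’, and only the items flagged as worth keeping survive into the long-term store.

```python
# Purely illustrative sketch of the RAM/hard-drive analogy above: a
# 'hippocampus' buffer holds recent items temporarily, and only the ones
# flagged as important are consolidated into a 'cortex' store.

class ToyMemory:
    def __init__(self):
        self.hippocampus = []   # temporary buffer (like RAM)
        self.cortex = set()     # long-term store (like a hard drive)

    def experience(self, item, important=False):
        self.hippocampus.append((item, important))

    def consolidate(self):
        """Keep what matters, discard the rest, wipe the buffer."""
        for item, important in self.hippocampus:
            if important:
                self.cortex.add(item)
        self.hippocampus.clear()

m = ToyMemory()
m.experience("canteen menu, wet Wednesday")
m.experience("first day at school", important=True)
m.consolidate()
print(m.cortex)   # {'first day at school'}
```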
The complexity of our memory systems is
one of the reasons why it is quite a difficult area to study: it can be difficult to set up experiments where we are
absolutely sure which aspects of memory our experimental techniques
are actually addressing. But one thing we know for sure is that
memory involves long-term changes in gene expression, and in the
way neurons make connections with one another. And that again leads
to the hypothesis that epigenetic mechanisms may play a
role.
In mammals, both DNA methylation and
histone modifications play a role in memory and learning. Rodent
studies have shown that these changes may be targeted to very
specific genes in discrete regions of the brain, as we have come to
expect. For example, the DNA methyltransferase proteins DNMT3A and
DNMT3B increase in expression in the adult rat hippocampus in a
particular learning and memory model. Conversely, treating these
rats with a DNA methyltransferase inhibitor such as 5-azacytidine
blocks memory formation and affects both the hippocampus and the
cortex22.
A particular histone acetyltransferase
(protein which adds acetyl groups to histones) gene is mutated in a
human disorder called Rubinstein-Taybi syndrome. Mental retardation
is a frequent symptom in this disease. Mice with a mutant version
of this gene also have low levels of histone acetylation in the
hippocampus, as we would predict. They also have major problems in
long-term memory processing in the hippocampus23.
When these mice were treated with SAHA, the histone deacetylase
inhibitor, acetylation levels in the hippocampus went up, and the
memory problems improved24.
SAHA can inhibit many different histone
deacetylases, but in the brain some of its targets seem to be more
important than others. The two most highly expressed enzymes of
this class are HDAC1 and HDAC2. These differ in the ways they are
expressed in the brain. HDAC1 is predominantly expressed in neural
stem cells, and in a supportive, protective population of
non-neurons called glial cells. HDAC2 is predominantly expressed in
neuronal cells25, so it’s unsurprising that
this is the histone deacetylase that is most important in learning
and memory.
Mice whose neurons over-express Hdac2
have poor long-term memory, even though their short-term memory is
fine. Mice whose neurons don’t express any Hdac2 have excellent
memories. These data show us that Hdac2 has a negative effect on
memory storage. The neurons which over-expressed Hdac2 formed far
fewer connections than normal, whereas the opposite was true for
the neurons lacking Hdac2. This supports our model of
epigenetically-driven changes in gene expression ultimately
altering complex networks in the brain. SAHA improves memory in the
mice that over-express Hdac2, presumably by dampening down its
effects on histone acetylation and gene expression. SAHA also
improves memory in normal mice26.
In fact, increased acetylation levels in
the brain seem to be consistently associated with improved memory.
Learning and memory both improved in mice kept in what are known as environmentally enriched conditions. This is a fancy way of saying they had
access to two running wheels and the inside of a toilet roll. The
histone acetylation levels in the hippocampus and cortex were
increased in the mice in the more entertaining surroundings. Even
in these mice, the histone acetylation levels and memory skills
improved yet further if they were treated with SAHA27.
We can see a consistent trend emerging.
In various different model systems, learning and memory improve
when animals are treated with DNA methyltransferase inhibitors, and
especially with histone deacetylase inhibitors. As we saw in the
last chapter, there are drugs licensed in both these classes, such
as 5-azacytidine and SAHA, respectively. It’s very tempting to
speculate about taking these anti-cancer drugs and using them in
conditions where memory loss is a major clinical problem, such as
Alzheimer’s disease. Perhaps we might even use them as general
memory enhancers in the wider population.
Unfortunately, there are substantial
difficulties in doing this. These drugs have side-effects which can
include severe fatigue, nausea and a higher risk of infections.
These side-effects are considered acceptable if the alternative is
an inevitable and fairly near-term death from cancer. But they
might be considered less acceptable for treating the early stages
of dementia, when the patient still has a relatively reasonable
quality of life. And they would certainly be unacceptable for the
general population.
There is an additional problem. Most of
these drugs are really bad at getting into the brain. In many of
the rodent experiments, the drugs were administered directly into
the brain, and often into very defined regions such as the
hippocampus. This isn’t a realistic treatment method for
humans.
There are a few histone deacetylase
inhibitors that do get into the brain. A drug called sodium
valproate has been used for decades to treat epilepsy, and clearly
must be getting into the brain in order to do this. In recent
years, we have realised that this compound is also a histone
deacetylase inhibitor. This would be extremely encouraging for
trying to use epigenetic drugs in Alzheimer’s disease, but unfortunately sodium valproate only inhibits histone deacetylases
very weakly. All the animal data on learning and memory have shown
that stronger inhibitors work much better than weak ones at
reversing these deficits.
It’s not just in disorders like
Alzheimer’s disease that epigenetic therapies could be useful if we
manage to develop suitable drugs. Between 5 and 10 per cent of
regular users of cocaine become addicted to the drug, suffering
from uncontrollable cravings for this stimulant. A similar
phenomenon occurs in rodents, if animals are allowed unlimited
access to the drug. Addiction to stimulants such as cocaine is a
classic example of inappropriate adaptations by memory and reward
circuits in the brain. These maladaptations are regulated by
long-lasting changes in gene expression. Changes in DNA
methylation, and in how methylation is read by MeCP2, underpin this
addiction. This happens via a set of poorly understood interactions
which include signalling factors, DNA and histone modifying enzymes
and readers, and miRNAs. Related pathways also underpin addiction
to amphetamines28,29.
If we return to the starting point of
this chapter, it’s clear that there’s a major need to stop children
who have suffered early trauma from developing into adults with a
substantially higher than normal risk of mental illness. It’s very
appealing to think we might be able to use epigenetic drug
therapies to improve their life chances. Unfortunately, one of the
problems in designing therapies for children who have been abused
or neglected is that it’s actually pretty difficult to identify
those who will be permanently damaged as adults, and those who will
have healthy, happy and fulfilled lives. There are enormous ethical
dilemmas around giving drugs to children, when we can’t be sure if
an individual child actually needs the treatment. In addition,
clinical trials to determine if the drugs actually do any good
would need to last for decades, which makes them economically
almost a non-starter for any pharmaceutical company.
But we mustn’t end on too negative a
note. Here’s a great story about an epigenetic event and behaviour.
There is a gene called Grb10 that is involved in various
signalling pathways. It’s an imprinted gene, and the brain only
expresses the paternally inherited copy. If we switch off this
paternal copy, the mice can’t produce any Grb10 protein in the brain, and the
animals develop a very odd phenotype. They nibble off the face fur
and whiskers of other mice in the same cage. This is a sort of
aggressive grooming, a bit like a pecking order in chickens. In
addition, if faced with a big mouse that they don’t know, the
Grb10 mutant mice don’t back away – they stand their
ground30.
Switching off Grb10 in the brain
results in what might sound like a rather impressive, kick-ass kind
of a mouse. It maybe even seems odd that this gene is normally
switched on in the brain. Wouldn’t mice that switched off
Grb10 be the butchest, most successful mice? Actually, it’s
more likely that they’d be the mice most likely to get themselves
beaten up. There are a lot of mice in the world, and they encounter
each other pretty frequently. It pays to recognise when you are
out-gunned.
When the Grb10 gene is switched
off in the brain, it’s like a bad Friday night for the mouse. Let’s
put this in human terms so we can see why. You’re down the pub when
a person twice your size and all muscle knocks against you and you
spill your pint.
When this gene is switched off, it’s as
if you have a friend next to you who says, ‘Go on, you can take
him/her, don’t wimp out.’ We all know how badly those scenarios
tend to play out. So let’s end this chapter by raising a cheer for
imprinted Grb10, the gene that likes to say, ‘Leave it mate,
it’s not worth it.’