Let Them Eat Dirt

The decision taken by the United States in August 1945 to force the surrender of Japan and end the Second World War undoubtedly changed the world. The bombings of Hiroshima and Nagasaki demonstrated that the world had entered the nuclear era: warfare, and the way politics would be conducted on an international scale, would never be the same again. While the United States had demonstrated nuclear power at its most destructive, it was clear that the technology had the potential to revolutionise the way governments could meet soaring energy demands. The United Kingdom led the charge to use nuclear energy in this peaceful light.

Calder Hall nuclear power plant, 1956. Image credit: Ann Ronan Picture Library/Heritage Images

In 1956, Calder Hall, part of the nuclear site now known as Sellafield, became the first nuclear power plant designed to produce electricity for the national power grid on an industrial scale. Having had a history of munitions production and the manufacture of weapons-grade plutonium, the site had shifted its focus and was to feed a country desperate to shake off the legacy of the Second World War. By 1983, with the wider site also encompassing the Windscale reactors, public opinion was turning against nuclear power, against a backdrop of the Cold War at its height, increasingly large-scale nuclear accidents and disasters around the world, and mining communities watching their opportunities for work dwindle. This was exacerbated by the release of a documentary by Yorkshire TV – Windscale: The Nuclear Laundry – which identified a substantial increase in cancer cases, particularly childhood leukaemia, in the villages surrounding Windscale.

The response from the public, and the government, was immediate. An investigation was launched into the levels of environmental radiation emitted by the plant, and the public, particularly those living in the affected villages, were outraged. However, while it was certainly true that there was a remarkable rise in leukaemia cases around the plant, environmental discharges were simply too low to explain them; they were comparable with background levels around the country. What has emerged since is a fascinating story that challenges our very understanding of how we keep ourselves healthy and protect against disease.

The underlying mechanism of all forms of cancer is well understood. Cancers arise through an accumulation of genetic mutations that confer both an increased level of cellular replication and a resistance to the cell-death mechanisms that safeguard against disease.

The cause of some cancers, such as cancer of the cervix, which is tightly linked to Human Papillomavirus (HPV), is clearly identified. Image by Unknown photographer [Public domain], via Wikimedia Commons

However, in the case of childhood leukaemia, the most common form of which – Acute Lymphoblastic Leukaemia (ALL) – accounts for one-third of paediatric cancer diagnoses, it is difficult to say with confidence what the source of these mutations is. The only identified cause of ALL with biological plausibility is ionising radiation, such as X-rays and gamma rays. Many forms of leukaemia in animals, including mice, cats, chickens and cattle, can be linked to a virus that can transform infected cells into cancerous ones – much like the role played by Human Papillomavirus (HPV) in the development of cervical cancer and Hepatitis viruses in cancers of the liver. However, no such relationship exists for human leukaemia. Now, a fascinating model has been suggested that explains the rise in leukaemia cases reported around Windscale in 1983 and attempts to shed light on how leukaemia arises.

The report, published in Nature Reviews Cancer by researchers from the Centre for Evolution and Cancer at the Institute of Cancer Research, outlines a two-step model for the development of childhood leukaemia and suggests a number of potential targets for the prevention of disease.

The two-step hypothesis originated through comparison of the genetic damage in monozygotic twins who had both been diagnosed with ALL. Sequencing of the DNA found in cancerous cells from both twins revealed a wide range of genetic damage. One mutation, which involved the fusion of two chromosomes, was routinely present in both affected twins, but there was little similarity between the other observed forms of genetic damage. It is believed, therefore, that since the chromosome fusion event is observed in most cases of ALL, it happens early in cancer development and can be seen as an initiating event. The wide range of other mutations seen across twins with ALL, it is argued, points to a role for an external factor in driving the disease forward after it has been initiated (see Fig 1).

Figure 1: Acute Lymphoblastic Leukaemia (ALL) is thought to develop in a two-step process. Increased exposure to microorganisms may reduce the frequency of the key second step.

This secondary, transforming event is clearly an important moment in determining the fate of these so-called pre-leukaemic clones, which harbour the initiating mutation but have not accumulated enough genetic damage to be transformed into fully cancerous cells. In an analysis of blood tests carried out on newborns, the initiating fusion mutation was extremely common: it was found in 1% of the tested samples – 100 times more common than the incidence of ALL. So, what is this transforming event, and can it be prevented?
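To put those figures in perspective, here is a rough back-of-the-envelope calculation. It uses only the proportions quoted above (a ~1% carrier rate and a disease that is ~100 times rarer), not precise epidemiological data, so treat the numbers as illustrative:

```python
# Illustrative arithmetic only, based on the proportions quoted in the text:
# ~1% of newborns carry the initiating fusion mutation, and ALL is said to be
# ~100 times rarer than that.

carrier_rate = 0.01                     # ~1 in 100 newborns carry the fusion
all_incidence = carrier_rate / 100      # implies ~1 in 10,000 children develop ALL

# Fraction of carriers in whom the second, transforming event ever occurs
progression_rate = all_incidence / carrier_rate

print(f"Carriers of the initiating mutation: ~{carrier_rate:.0%} of newborns")
print(f"Implied ALL incidence:               ~{all_incidence:.2%} of children")
print(f"Carriers who progress to ALL:        ~{progression_rate:.0%}")
```

In other words, on these figures roughly 99 out of every 100 children who carry the initiating mutation never develop leukaemia, which is why the nature of the second step matters so much.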

The search for the source of the transforming event has long focused on finding a virus or other infectious agent responsible for driving cancer development. Now, it appears, the driver of the transforming event is in fact a lack of infection, representing a paradox of progress in an increasingly sanitised world.

Infections, both those causing symptoms and ‘silent’ infections that go unnoticed, along with the presence of harmless bacteria and other foreign substances, are vital in the development of a fully functional immune system. Early stimulation of the immune system ‘educates’ the body to fight a wide range of infections properly throughout an individual’s life. Without exposure to a range of bacteria, viruses and other microorganisms in the first few years, the immune system will respond inappropriately when finally called into action: not only will it fail to clear the infection properly, but it may also cause further damage.

A Paradox of Progress
In the middle of the nineteenth century, a crisis was unfolding at the Vienna General Hospital, the Allgemeines Krankenhaus. Women giving birth in the hospital had a mortality rate of 22% – more than one in five who entered the hospital would not leave – due to an abnormally high rate of puerperal, or postpartum, infections. Nobody could understand why Austria’s leading hospital, one of the grandest in Europe, was suffering so greatly from this plight.

Determined to unravel the mystery, an obstetrician by the name of Ignaz Semmelweis set about detailing the cases of every birth in the hospital, whether the mother was healthy or not. He discovered that there was a dramatically increased level of postpartum mortality among women whose births had been assisted by doctors and medical students, while those who had the support of a midwife saw far better outcomes. At the time, doctors not only routinely performed autopsies themselves, but also fitted them in around other surgeries and consultations. Semmelweis proposed that doctors who had come into contact with deceased patients during autopsies were transferring the very substances that led to mortality directly into women during delivery. The hospital instituted a policy of all doctors washing their hands in chlorinated lime water, and postpartum mortality dropped almost immediately to 2%.

Despite clear evidence that Semmelweis had highlighted an invisible, transferable agent that led to disease, his work was not recognised by the scientific community at large. Now, of course, we know that Semmelweis had discovered evidence for the germ theory of disease, which states that many diseases are caused by microorganisms such as bacteria and viruses. It would not be long before the theory was commonplace, largely through the work of Louis Pasteur and Robert Koch.

Their seminal works, alongside the rapid development of microscope technology, have given rise to a society that is acutely aware of the threat microorganisms carry and that does all it can to sanitise the environments we live in. However, though sanitation efforts have led to the near-eradication of many infectious diseases in developed societies, there is now a threat that the benefits exposure to microorganisms brings are being lost in these very same countries.

This delayed activation of the immune response is proposed to be the crucial secondary, transforming event that drives the development of pre-leukaemic cells harbouring few mutations into cancerous ones riddled with genetic damage. Early-life experiences that expose young children to immune-stimulating microorganisms have been shown to be protective against the development of ALL: children who attended day care, were born into households with older siblings, were born naturally rather than by caesarean section, or were breastfed for longer are all at lower risk of developing ALL than those who grew up in a more sanitised environment.

Recently, the Italian city of Milan saw a remarkable space-time cluster of ALL diagnoses, with seven cases in just a four-week period. The local communities were extremely concerned that a deadly environmental trigger was causing their children to develop these cancers. However, an analysis of the children’s potential historical exposure to infectious agents yielded interesting results: none had attended day care, all were first-born children and all had been infected with the H1N1 influenza virus in the months before their diagnosis. Other clusters have been reported in Germany, following influxes of people into smaller communities that resulted in population mixing and the exposure of local children to new pathogens. These spikes indicate just how damaging starving the immune system in early life can be; one simple infection can bring leukaemia out of hiding.

So, what of the spike in leukaemia around Windscale in 1983? The minimal radiation emitted by the plant wasn’t directly creating cancer-causing mutations in local children, but its presence was having an effect. It is now thought that the influx of people into the area to staff the plant exposed what was a relatively isolated, rural community to a vast range of new microorganisms. The young residents in the area, whose immune systems had not been properly stimulated in early life, had an inappropriate response to the new infections, which drove the transformation of pre-leukaemic cells into cancerous ones.

Naturally, childhood cancers represent high-priority targets for research. Acute Lymphoblastic Leukaemia has historically been, and remains, one of the prime targets, accounting as it does for up to one-third of all paediatric cancer diagnoses. Outcomes for children have improved markedly with the advent of combination chemotherapy, which uses a cocktail of drugs to aggressively destroy cancerous cells. In just a matter of decades, childhood ALL has gone from a near-universally negative prognosis to a 90% survival rate. However, this substantial success story does not paint the whole picture. While a great many children do indeed go into remission, the disease is near-universally fatal should it resurface. What’s more, the combined chemotherapy treatments are aggressive, cause great distress in paediatric patients and have been associated with negative effects on long-term health, including early mortality, heart failure, additional cancers and neurocognitive dysfunction.

With an improved understanding of the way ALL develops, researchers are now hopeful of targeting specific processes and preventing leukaemia from arising in the first place. Some strategies could be behavioural: sending children to day-care centres in early life and breastfeeding for longer have both been shown to reduce the risk of developing ALL.

There is also great excitement about the potential use of prophylactic ‘vaccines’ that, when administered in the first year of life, would mimic the protective effects of exposure to common infections. A similar effect could be achieved through the oral administration of so-called harmless symbionts – microorganisms that cause asymptomatic infections, such as Lactobacillus species of bacteria. It is hoped that a combination of behavioural and pharmaceutical approaches could represent the next step in eradicating ALL. The mission started with revolutionising treatment and may now end with outright prevention.

What is clear is that failing to properly stimulate our immune system early in life can lead to substantial health concerns later on. So, though we shouldn’t encourage our children to eat dirt, perhaps it isn’t too bad if they do.  

Joe

Having studied Biomedical Sciences, I have spent my career sharing my passion for science and making life-changing educational opportunities accessible for anyone, no matter their background. This blog is another way of sharing the stories and ideas that fascinate me - I hope you find them just as interesting!
