know it. This was white, watery stuff, and there was no end of it. Each person could produce up to 20 litres per day, all of which was dumped in the cesspits beneath Soho’s cramped houses. The disease was cholera, and it killed people in their hundreds.
The British doctor John Snow was sceptical of the miasma theory, and had spent some years looking for an alternative explanation. From previous epidemics, he had begun to suspect that cholera was water-borne. The latest outbreak in Soho gave him the opportunity to test his theory. He interviewed Soho residents and mapped cholera cases and deaths, looking for a common source. Snow realised that the victims had all drunk from the same water pump on Broad Street (now Broadwick Street), at the heart of the outbreak. Even deaths further afield could be traced back to the Broad Street pump, as cholera was carried and passed on by those infected there. There was one anomaly: a group of monks in a Soho monastery who got their water from the same pump were completely unaffected. It was not their faith that had afforded them protection, though, but their habit of drinking the pump’s water only after they had turned it into beer.
Snow had looked for patterns – connections between those who had become ill, reasons why others had escaped, links explaining the appearance of the disease outside its Broad Street epicentre. His rational study used logic and evidence to unravel the outbreak and trace it to its source, eliminating red herrings and accounting for anomalies. His work led to the disabling of the Broad Street pump and the subsequent discovery that a nearby cesspit had overflowed and was contaminating the water supply. It was the first-ever epidemiological study – that is, one that used the distribution and patterns of a disease to understand its origin. John Snow went on to use chlorine to disinfect the water supplying the Broad Street pump, and his chlorination methods were quickly put to use elsewhere. By the time the nineteenth century came to a close, water sanitation had become widespread.
As the twentieth century unfolded, all three public health innovations became more and more sophisticated. By the end of the Second World War, a further five diseases could be prevented through vaccination, taking the total to ten. Medical hygiene techniques were adopted internationally, and chlorination became a standard process in water-treatment plants. The fourth and final innovation to put an end to the reign of microbes in the developed world began with one world war and concluded with the second. It was the result of the hard work, and good fortune, of a handful of men. The first of these, the Scottish biologist Sir Alexander Fleming, is famously credited with ‘accidentally’ discovering penicillin in his laboratory at St Mary’s Hospital in London. In fact, Fleming had been hunting for antibacterial compounds for years.
During the First World War he had treated wounded soldiers on the Western Front in France, only to see many of them die from sepsis. When the war came to an end and Fleming returned to the UK, he made it his mission to improve upon Lister’s antiseptic carbolic acid dressings. He soon discovered a natural antiseptic in nasal mucus, which he called lysozyme. But, as with carbolic acid, it could not penetrate beneath the surface of wounds, so deep infections festered. Some years later, in 1928, Fleming was investigating staphylococci bacteria – responsible for boils and sore throats – when he noticed something odd on one of his Petri dishes. He had been on holiday, and had returned to a messy lab bench full of old bacterial cultures, many of which had been contaminated with moulds. As he sorted through them, he noticed one dish in particular. Surrounding a patch of Penicillium mould was a clear ring, completely free of the staphylococci colonies that covered the remainder of the plate. Fleming spotted its significance: the mould had released a ‘juice’ that had killed the bacteria around it. That juice was penicillin.
Though growing the Penicillium had been unintentional, Fleming’s recognition of its potential importance was anything but accidental. It began a process of experimentation and discovery that would span two continents and twenty years, and revolutionise medicine. In 1939, a team of scientists at Oxford University, led by the Australian pharmacologist Howard Florey, thought they could make more use of penicillin. Fleming had struggled to grow significant quantities of the mould, or to extract the penicillin it produced. Florey’s team managed it, isolating small amounts of liquid antibiotic. By 1944, with the financial support of the War Production Board in the United States, penicillin was produced in sufficient quantities to meet the needs of wounded soldiers returning from the D-Day invasion of Europe. Sir Alexander Fleming’s dream of beating the infections of the war wounded was realised, and the following year he, Florey, and one other member of the Oxford team, Sir Ernst Boris Chain, received the Nobel Prize in Physiology or Medicine.
Over twenty varieties of antibiotics have subsequently been developed, each attacking a different bacterial weakness, and providing our immune systems with backup when they are overwhelmed by infection. Before 1944, even scratches and grazes could mean a frighteningly high chance of death by infection. In 1940, a British policeman in Oxfordshire called Albert Alexander was scratched by a rose thorn. His face became so badly infected that he had to have his eye removed, and he was on the verge of death. Howard Florey’s wife Ethel, who was a doctor, persuaded Florey that Constable Alexander should become the first recipient of penicillin.
Within twenty-four hours of his first injection with a tiny quantity of penicillin, the policeman’s fever dropped, and he began to recover. The miracle was not to be, however. A few days into his treatment, penicillin supplies ran out. Florey even attempted to extract any remaining penicillin from the constable’s urine to continue the treatment, but on the fifth day, the policeman died. It is unthinkable now to die from a scratch or an abscess, and we often take antibiotics without heed to their life-saving properties. Surgery, too, would carry enormous risk were it not for the protective shield of intravenous antibiotics given before the first cut is made.
Our twenty-first-century lives are a kind of sterile ceasefire, with infections held at bay through vaccinations, antibiotics, water sanitation and hygienic medical practice. We are no longer threatened by acute and dangerous bouts of infectious disease. Instead, the past sixty years have seen a collection of previously rare conditions rise to prominence. These chronic ‘twenty-first-century illnesses’ have become so common that we accept them as a normal part of being human. But what if they are not ‘normal’?
Looking around at your friends and family, you won’t see smallpox, measles or polio any more. You might think how lucky we are; how healthy we are these days. But look again and you might see things differently. You might see the sneezing and red, itchy eyes of your daughter’s hay fever in the spring. You might think of your sister-in-law, who has to inject herself with insulin several times a day because of her type 1 diabetes. You might be worried your wife will end up in a wheelchair with multiple sclerosis as her aunt did. You might have heard about your dentist’s little boy who screams, and rocks himself, and won’t make eye contact, now that he has autism. You might get impatient with your mother who is too anxious to do the shopping. You might be searching for a washing powder that doesn’t make your son’s eczema worse. Your cousin might be the awkward one at dinner who can’t eat wheat because it gives her diarrhoea. Your neighbour might have slipped unconscious whilst searching for his EpiPen after accidentally eating nuts. And you might have lost the battle to keep your weight where beauty magazines, and your doctor, say it should be. These conditions – allergies, autoimmune diseases, digestive troubles, mental health problems and obesity – are the new normal.
Let’s take allergies. Perhaps there’s nothing alarming about your daughter’s hay fever, as 20 per cent of her friends also snuffle and sneeze their way through summer. You are not surprised by your son’s eczema, because one in five of his classmates has it too. Your neighbour’s anaphylactic attack, terrifying though it was, is common enough that packaged foods carry warnings that they ‘may contain nuts’. But have you ever asked yourself why one in five of your children’s friends has to take an inhaler to school in case they suffer an asthma attack? Being able to breathe is fundamental to life, yet without medication, millions of children would find themselves gasping for breath. And what of the one in fifteen children who are allergic to at least one type of food? Can that be normal?
Allergies affect nearly half of us in developed countries. We dutifully take our antihistamines, avoid picking up the cat, and check the ingredients lists of everything we buy. We unthinkingly do what is necessary to stop our immune systems overreacting to the most ubiquitous and innocuous of substances: pollen, dust,