Vaccination programmes are designed to bring about ‘herd immunity’ by vaccinating a large enough proportion of the population that contagious diseases cannot continue to spread. They have meant that many infectious diseases are almost completely eliminated in developed countries, and one, smallpox, has been totally eradicated. Smallpox eradication, as well as reducing the incidence of the disease from 50 million cases a year worldwide to none in little more than a decade, has saved governments billions in both the direct costs of vaccination and medical care and the indirect societal costs of illness. The United States, which contributed a disproportionately large amount of money to the global eradication effort, recoups its investment every twenty-six days in unspent costs. Governmental vaccination schemes for a dozen or so other infectious diseases have dramatically cut the number of cases, reducing suffering and saving lives and money.
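To put a rough number on ‘large enough’ – a standard epidemiological sketch, not a figure from this text – suppose each infected person would, in a fully susceptible population, pass the disease on to $R_0$ others on average. An outbreak can no longer sustain itself once the immune fraction of the population exceeds the threshold

$$p_c = 1 - \frac{1}{R_0}$$

For a highly contagious disease such as measles, where $R_0$ is commonly estimated at 12 to 18, this works out to vaccinating roughly 92 to 94 per cent of the population.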
Today, most countries in the developed world run vaccination programmes against ten or so infectious diseases, and half a dozen are marked for region-wide elimination or global eradication by the World Health Organisation. These programmes have had a dramatic effect on the incidence of these diseases. Before the worldwide eradication programme for polio began in 1988, the virus affected 350,000 people a year. In 2012, the disease was confined to just 223 cases in only three countries. In just twenty-five years, around half a million deaths have been prevented and 10 million children who would have been paralysed are free to walk and run. Likewise for measles and rubella: in a single decade, vaccination against these once-common diseases has prevented 10 million deaths worldwide. In the United States, as in most of the developed world, vaccination has reduced the incidence of nine major childhood diseases by 99 per cent. In developed countries, for every 1,000 babies born alive in 1950, around forty would die before their first birthday. By 2005, that figure had fallen by an order of magnitude, to about four. Vaccination is so successful that only the oldest members of Western society can remember the horrendous fear and pain of these deadly diseases. Now, we are free.
After the development of the earliest vaccines came a second major health innovation: hygienic medical practice. Hospital hygiene is something we are still under pressure to improve today, but in comparison with the standards of the late nineteenth century, modern hospitals are temples of cleanliness. Imagine, instead, wards crammed full of the sick and dying, wounds left open and rotting, and doctors’ coats covered in the blood and gore of years of surgeries. There was little point in cleaning – infections were thought to be the result of ‘bad air’, or miasma, not germs. This toxic mist was believed to rise from decomposing matter or filthy water – an intangible force beyond the control of doctors and nurses. Microbes had been discovered 150 years earlier, but the connection between them and disease had not yet been made. Because miasma was believed incapable of passing between people by physical contact, infections were spread unchecked by the very people charged with curing them. Hospitals were a new invention, born of a drive towards public health care and a desire to bring ‘modern’ medicine to the masses. Despite the good intentions, they were filthy incubators of disease, and those attending them risked their lives for the treatment they needed.
Women suffered most from the proliferation of hospitals, as the risks of labour and birth, rather than falling, actually rose. By the 1840s, up to 32 per cent of women giving birth in hospital would subsequently die. Doctors – all male at that time – blamed the deaths on anything from emotional trauma to uncleanliness of the bowel. The true cause of this horrifyingly high death rate would at last be uncovered by a young Hungarian obstetrician by the name of Ignaz Semmelweis.
At the hospital where Semmelweis worked, the Vienna General, women in labour were admitted on alternate days into two different clinics. One was run by doctors, and the other by midwives. Every second day, as Semmelweis walked to work, he’d see women giving birth on the street outside the hospital doors. On those days, it was the turn of the clinic run by doctors to admit labouring women. But the women knew the odds for their survival would not be good if they could not hold on until the following day. Childbed fever – the cause of most of the deaths – lurked in the doctors’ clinic. So they waited, cold and in pain, in the hope that their baby would delay its entrance to the world until after midnight had struck.
Getting admitted to the midwife-run clinic was, relatively speaking, a far safer proposition. Between 2 and 8 per cent of new mothers would die of childbed fever in the care of midwives – far fewer than succumbed in the doctors’ clinic.
Despite his junior status, Semmelweis began to look for differences between the two clinics that might explain the death rates. He thought overcrowding and the climate of the ward might be to blame, but found no evidence of any difference. Then, in 1847, a close friend and fellow doctor, Jakob Kolletschka, died after being accidentally cut by a student’s scalpel during an autopsy. The cause of death: childbed fever.
After Kolletschka’s death, Semmelweis had a realisation. It was the doctors who were spreading death among the women in their ward. Midwives, on the other hand, were not to blame. And he knew why. Whilst their patients laboured, the doctors would pass the time in the morgue, teaching medical students using human cadavers. Somehow, he thought, they were carrying death from the autopsy room to the maternity ward. The midwives never touched a corpse, and the patients dying on their ward were probably those whose post-natal bleeding meant a visit from the doctor.
Semmelweis had no clear idea of the form that death was taking on its passage from the morgue to the maternity ward, but he had an idea of how to stop it. To rid themselves of the stench of rotting flesh, doctors often washed with a solution of chlorinated lime. Semmelweis reasoned that if it could remove the smell, perhaps it could remove the vector of death as well. He instituted a policy that doctors must wash their hands in chlorinated lime between conducting autopsies and examining their patients. Within a month, the death rate in his clinic had dropped to match that of the midwives’ clinic.
Despite the dramatic results Semmelweis achieved in Vienna, and later in two hospitals in Hungary, he was ridiculed and ignored by his contemporaries. The stiffness and stench of a surgeon’s coat were said to be a mark of his experience and expertise. ‘Doctors are gentlemen, and gentlemen’s hands are clean,’ said one leading obstetrician of the day, all the while infecting and killing dozens of women each month. The mere notion that doctors could bring death, not life, to their patients caused huge offence, and Semmelweis was cast out of the establishment. Women continued to risk their lives in childbirth for decades, paying the price of the doctors’ arrogance.
Twenty years later, the great Frenchman Louis Pasteur developed the germ theory of disease, which attributed infection and illness to microbes, not miasma. In 1884, Pasteur’s theory was proved by the elegant experiments of the German doctor Robert Koch, who would later win a Nobel Prize for his work on tuberculosis. By this time, Semmelweis was long dead. He had become obsessed with childbed fever, and had gone mad with rage and desperation. He railed against the establishment, pushing his theories and accusing his contemporaries of being irresponsible murderers. He was lured by a colleague to an insane asylum under the pretence of a visit, then forced to drink castor oil and beaten by the guards. Two weeks later, he died of a fever, probably caused by his infected wounds.
Nonetheless, germ theory was the breakthrough that gave Semmelweis’s observations and policies a truly scientific explanation. Steadily, antiseptic hand-washing was adopted by surgeons across Europe. Hygienic practices became common after the work of the British surgeon Joseph Lister. In the 1860s, Lister read of Pasteur’s work on microbes and food, and decided to experiment with chemical solutions on wounds to reduce the risk of gangrene and septicaemia. He used carbolic acid, which was known to stop wood from rotting, to wash his instruments, soak dressings and even to clean wounds during surgery. Just as Semmelweis had achieved a drop in the death rate, so too did Lister. Where 45 per cent of those he operated on had died before, Lister’s pioneering use of carbolic acid slashed mortality by two-thirds, to around 15 per cent.
Closely following Semmelweis’s and Lister’s work on hygienic medical practice was a third public health innovation – a development that prevented millions from becoming ill in the first place. As in many developing countries today, water-borne diseases were a major health hazard in the West before the twentieth century. The sinister forces of miasma were still at work, polluting rivers, wells and pumps. In August 1854, the residents of London’s Soho