(hence the name “bioterrorism”). A small number of deaths are sufficient to achieve this goal.
Among bacteria, Bacillus anthracis, the causative agent of anthrax disease, has been identified as a particularly useful weapon by bioterrorists. B. anthracis, as a spore-forming Gram-positive bacterium, is easier to store and “weaponize” than a more fragile organism, such as the Gram-negative Yersinia pestis, the causative agent of bubonic plague. In spore form, B. anthracis is also easier to handle than the highly contagious smallpox virus. The U.S. Army was worried enough about possible anthrax attacks to administer the anthrax vaccine to soldiers going to Iraq and Afghanistan. This sparked controversy because the efficacy and safety of the available anthrax vaccines were contentious issues at the time. Unfortunately, the anthrax attacks through the U.S. postal system in late 2001 only served to exacerbate the fear and solidify the realization that bioterrorism is a reality that we must now address.
An alternative bacterial choice of bioterrorists is Clostridium botulinum, another spore-forming bacterium that produces botulinum neurotoxin. Producing botulinum neurotoxin in your garage is inadvisable and can be extremely hazardous to your health, but this toxin is produced commercially (as Botox) for use in a variety of medical and cosmetic applications, ranging from correcting facial tics and strabismus (crossed eyes) to eliminating wrinkles and preventing scarring from reconstructive surgery. Thus, it is conceivable that terrorists might hijack Botox from the factories that produce it. Whether emptying vials of toxin into a city’s water supply would actually result in any deaths is not clear, as the purified toxin protein would be diluted and broken down in the environment. It is, however, better to err on the side of caution. The most recent concern, though so far only theoretical, is that botulinum toxin might be deliberately introduced into milk, juice, or soft drinks during processing.
A New Respect for Prevention
Major changes that hold great promise for the future have been occurring in the approach to controlling infectious diseases. Traditionally, medical establishments in developed countries have opted for a treatment-based approach. Although vaccinations were given to prevent some diseases and doctors used antibiotics prophylactically to prevent others, such as postsurgical infections or infections in cancer chemotherapy patients, the most common approach to treating infectious diseases was to wait for an infected person to seek medical help before intervening in the disease process. This approach has been criticized for being expensive and for allowing diseases to gain a foothold in the body before action is taken—a delay that in some cases results in long-term damage to the patient, even if the treatment successfully eliminates the infecting bacterium from the body.
Treatment-based approaches have also become much less effective as increasingly resistant bacteria make it more difficult to choose the appropriate antibiotic treatment. To better combat this escalating problem, it is important to understand the reasons for the rise in antibiotic resistance, particularly in hospital settings. If a bacterial infection is not cleared immediately, sepsis can kill a patient in just a few hours. Because waiting for proper diagnostics can thus be fatal to the patient, physicians have generally responded by using more advanced, broad-spectrum antibiotics to treat all bacterial infections, regardless of whether they might be treatable with less expensive, narrow-spectrum antibiotics. Adding diagnostic testing to the decision process also raises the overall cost of the clinical visit; it not only delays treatment but is actively discouraged by health insurance companies. Physicians have been advised to use frontline antibiotics first, but also to send samples to the microbiology laboratory for antibiotic resistance evaluation and then adjust the therapy if laboratory results indicate another, more appropriate treatment regimen. Nevertheless, the overall result is increased selective pressure on the bacteria to develop resistance against the frontline antibiotics.
A far preferable approach to controlling a disease is preventing it in the first place. This approach has been successful in ensuring the safety of food and water. Now, more and more public health officials, hospital managers, and executives of health maintenance organizations (HMOs) are rediscovering that prevention is also far more effective—and far less expensive—than treatment after infection has occurred. Prevention is suddenly center stage again. But, for a preventive approach to work, it is first necessary to have extensive information about the epidemiology of disease (i.e., information about disease patterns, their geographic distribution, and determinants of health-related states). It is also necessary to have a large-scale networking infrastructure in place that can serve as an early warning system to detect signals indicating new disease trends. Led by the CDC and the WHO, a variety of such epidemiological surveillance programs have been implemented to monitor the appearance of new diseases, the increased incidence of existing diseases, and the occurrence of antibiotic-resistant bacteria. Indeed, the CDC website (http://www.cdc.gov/) is now an extensive portal for the latest information about various infectious diseases, including scientific and medical information about a disease and its causative agent, up-to-date disease trends, precautionary measures, and travel alerts.
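As a rough illustration of what such an early warning signal can look like, the sketch below flags any week whose case count rises well above its recent baseline. The data, the flag_aberration function, and the simple z-score threshold are hypothetical stand-ins; real surveillance systems rely on more sophisticated statistical aberration-detection methods.

```python
# Hypothetical sketch of the kind of signal an epidemiological early warning
# system looks for: a weekly case count rising well above its recent baseline.
# The data, threshold rule, and function name are illustrative, not CDC/WHO code.

from statistics import mean, stdev

def flag_aberration(weekly_counts, window=8, z_threshold=2.0):
    """Flag weeks whose count exceeds the recent baseline by z_threshold SDs."""
    alerts = []
    for i in range(window, len(weekly_counts)):
        baseline = weekly_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (weekly_counts[i] - mu) / sigma > z_threshold:
            alerts.append(i)  # week index worth investigating
    return alerts

# Example: a stable background of ~20 cases per week, then a sudden jump.
counts = [19, 22, 18, 21, 20, 23, 19, 21, 20, 22, 47]
print(flag_aberration(counts))  # -> [10], the week the spike occurred
```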
The CDC has been monitoring a subset of particularly problematic infectious diseases for years, but the list of diseases covered had been far from exhaustive. Now, many more infectious diseases, such as infections with pathogenic E. coli and Chlamydia, have been placed on the list of reportable diseases. A problem the CDC has had to cope with is that reporting of diseases is voluntary on the part of state public health departments. Overworked and underfunded state health departments have sometimes, understandably, given reporting of diseases a low priority. The CDC and the National Institutes of Health (NIH) are fighting to alert government agencies to the importance of having consistently funded monitoring programs. Many recent epidemics, such as SARS, West Nile virus, Ebola, and various avian and swine influenzas, have lent urgency to these efforts.
Surveillance: An Early Warning System
An example of a CDC surveillance program is FoodNet, established in 1995, which attempts to count all cases of foodborne disease, such as those caused by Salmonella, E. coli O157:H7, Vibrio cholerae, Listeria, and Campylobacter, in 10 selected states in the United States and then to estimate from these data the incidence of these diseases nationwide. Attempts are also being made in several areas to monitor antibiotic-resistant pathogens. Prior to the introduction of FoodNet, the CDC had abundant information about large outbreaks of foodborne disease but had no idea how many isolated cases of foodborne disease occurred, so it was difficult to track sources of contamination. Based on the information gathered so far by FoodNet, the outlook is not positive. Although contamination of foods by some pathogens, such as E. coli O157:H7, has declined, there has been little or no recent reduction for most other infections. Indeed, Campylobacter and Vibrio infections have actually increased in recent years.
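To make the extrapolation step concrete, the sketch below scales cases observed in a surveillance catchment up to a national estimate, assuming equal incidence everywhere. The case counts, populations, and the national_estimate function are invented for illustration; FoodNet’s actual burden estimates also adjust for underdiagnosis, underreporting, and demographic differences between the surveillance sites and the nation as a whole.

```python
# A simplified sketch of how counts from a surveillance catchment can be
# extrapolated to a national estimate. All numbers below are made up for
# illustration and do not reflect actual FoodNet data or methodology.

def national_estimate(catchment_cases, catchment_pop, national_pop):
    """Scale observed cases by population, assuming equal incidence nationwide."""
    incidence_per_100k = catchment_cases / catchment_pop * 100_000
    return incidence_per_100k, incidence_per_100k * national_pop / 100_000

# Hypothetical: 7,500 Salmonella cases observed in a 50-million-person catchment.
rate, total = national_estimate(7_500, 50_000_000, 330_000_000)
print(f"{rate:.1f} cases per 100,000 -> ~{total:,.0f} cases nationwide")
# -> 15.0 cases per 100,000 -> ~49,500 cases nationwide
```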
Monitoring disease prevalence is only the first step. Next must come effective action to control the further spread of disease. Because perishable products have a limited shelf life, they must be shipped for distribution almost immediately, so companies generally wait until the final step in food processing to test foods for microbiological safety. However, it can take days to weeks to obtain microbiological test results, which allows shipments of contaminated food to leave processing plants and reach points of distribution before the results are known. There are encouraging signs that programs for prevention of foodborne diseases are beginning to be implemented. For example, the Food and Drug Administration (FDA) and the United States Department of Agriculture (USDA) now have hazard analysis and critical control point (HACCP) programs in place.
Based on risk assessment, HACCP programs monitor food and food safety practices at the control points along the food production chain where contamination is most likely to occur. This approach not only lessens the likelihood that contaminated foods will be shipped, but also identifies contamination problems, or situations that might lead to contamination, early enough that they can be rectified. At first the food industry was leery of the HACCP approach, viewing it as a needless and potentially expensive government intrusion, but it has become more enthusiastic about HACCP programs after seeing how expensive and injurious to a company’s reputation a large recall of contaminated products can be. A good HACCP program not only protects the public from disease, but also protects the company from recalls and lawsuits.
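Stripped to its essentials, the HACCP logic is simple enough to summarize in a few lines of code: each critical control point has a critical limit, each measurement is checked against that limit, and any deviation triggers corrective action before product ships. The control points, limits, and ControlPoint class below are invented for illustration and do not reflect any actual regulatory specification.

```python
# A minimal sketch of the HACCP idea: define critical control points with
# critical limits, check each measurement, and flag deviations for corrective
# action before product ships. All values here are hypothetical examples.

from dataclasses import dataclass

@dataclass
class ControlPoint:
    name: str
    limit: float          # critical limit for the monitored value
    higher_is_safe: bool  # e.g., cooking temperature vs. chilled storage

    def in_control(self, measured: float) -> bool:
        return measured >= self.limit if self.higher_is_safe else measured <= self.limit

# Hypothetical control points (and measurements) for a cooked, chilled product.
checks = [
    (ControlPoint("cook temperature (deg C)", 74.0, True), 75.2),
    (ControlPoint("chill storage (deg C)", 4.0, False), 6.1),
]

for ccp, measured in checks:
    status = "OK" if ccp.in_control(measured) else "DEVIATION: hold lot, take corrective action"
    print(f"{ccp.name}: measured {measured} -> {status}")
```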