A history of terrorism requires a very specific definition to avoid a never-ending summary of every violent act ever recorded. The brief, objective definition proposed by Dr. Boaz Ganor, an Israeli political scientist and deputy dean of the Lauder School of Government and Diplomacy at the Interdisciplinary Center Herzliya, works well for this purpose: terrorism is the intentional use of, or threat to use, violence against civilians or against civilian targets, in order to attain political aims.

This definition avoids subjective interpretation based on the perpetrator’s motivations, tactics, and civilian versus military status. When we discuss terrorism in the 21st century, however, we must include weapons of mass destruction and broaden the definition slightly to include indiscriminate targets, since many of the weapons and tactics of modern terrorism are capable of killing huge numbers of people at once.

Additionally, some forms of modern terror, such as cyberterrorism, do not fall neatly under the rubric of “violence”, at least in their initial employment, although in this increasingly computerized world, viruses and database intrusions could ultimately lead to deaths.

How real are the threats of WMD terrorism? What new or highly mutated forms of terrorist activities might lie ahead? And more to the point, how can countries hope to counter such violence, when one of the key components of “successful” terrorism is the element of surprise?

If you have ever seen photos of ordinary household germs and dust mites under an electron microscope, magnify your immediate, visceral recoil tenfold and you have a fair idea of how most people think about biological weapons.

Terrorism feeds on fear, and one thing people fear is fighting something invisible, insidious, and irreversible. Certain chemicals (and radioactive fallout) meet this description as well, but many do not. Biological pathogens, however, seem especially frightening, perhaps because to the layperson they appear the easiest to disseminate and because, unlike other weapons, contagious diseases such as smallpox, Ebola, AIDS, or plague can be passed from one person to the next, expanding an attack well beyond the original point of deployment.

Adding to this is the reality that the first responders are not members of law enforcement or the military, but members of the public health system: doctors, EMTs, firefighters, and other civilians.

Consider some staggering facts. According to a report issued by the World Health Organization in 1999, “Over the next hour alone, 1,500 people will die from an infectious disease – over half of them are children under five. Of the rest, most will be working-age adults – many of them breadwinners and parents. Both are vital age groups that countries can ill afford to lose.” That adds up to 13.1 million people a year (1,500 deaths an hour for each of the year’s 8,760 hours). Perhaps more frightening still, just six infectious diseases account for more than 90 percent of those deaths: pneumonia, tuberculosis, diarrheal diseases, malaria, measles, and HIV/AIDS (WHO, 1999, p. 2).

Improper use of antibiotics, as well as increased virulence and drug tolerance arising from the natural mutation process, have led to highly resilient strains of pneumonia, tuberculosis, cholera, and malaria.

Considering that accidental and naturally occurring outbreaks can cost so many millions of lives, it’s not difficult to imagine the effect deliberately mutated and weaponized strains of biological pathogens would have around the world.

Armies and individuals have employed biological weapons throughout recorded history. Many of the earliest recorded instances involve poisoning food and water supplies. During the 6th century BC, the Assyrians poisoned enemy wells with rye ergot, a fungal parasite that causes hallucinations and brain damage. Solon of Athens poisoned Krissa’s water supply with hellebore, a narcotic that can also cause heart attacks. Ancient armies routinely tossed rotting animal carcasses into their enemies’ water supplies; in the 12th century, Barbarossa used the bodies of his own dead soldiers.

Contaminating food and water supplies is not the only time-honored form of bioterrorism. Spreading infection and disease using conventional weapons and everyday objects has a long history as well. As far back as 400 BC, archers poisoned their arrows by dipping them into decomposing bodies or into blood mixed with feces. During the Second Macedonian War, in a crude but effective precursor to missiles with biological warheads, Hannibal won the naval battle of Eurymedon by launching pots of venomous snakes onto the decks of the Pergamene ships.

In 1346, when many of the Tatar soldiers attacking the Crimean port of Kaffa were dying of bubonic plague, their leader, de Mussis, catapulted the diseased corpses into the city. When the infected Genoese defenders fled, they precipitated the Black Plague epidemics that swept across Europe. Around the same period, a Spaniard killed enemies with wine mixed with the blood of lepers.

Two hundred years later another Spaniard, Francisco Pizarro, tried to speed along his invasion of South America by distributing clothing infected with smallpox. British forces tried the same tactic in the French and Indian War.

In the early part of the Civil War, a Confederate surgeon tried to infect the Union army with clothes carrying yellow fever, while his compatriots were tossing dead animals into wells as they retreated. At this time, the U.S. Government, concerned that its Union soldiers were far less experienced in military matters than were their Confederate counterparts, paid German lawyer Franz Lieber to prepare a code laying out the accepted principles of warfare.

The articles in the resulting document, “Instructions for the Government of Armies of the United States in the Field,” became part of General Order No. 100, issued April 24, 1863. One key article read as follows: “The use of poison in any manner, be it to poison wells, or food, or arms, is wholly excluded from modern warfare. He that uses it puts himself out of the pale of law and usages of war.”

Other countries were at work drafting similar codes. The nations participating in a conference in Brussels in August 1874 issued a declaration banning specific weapons, including poison. A 1907 addition prohibited the “employment of projectiles containing asphyxiating or deleterious gases.” These same prohibitions were upheld by later declarations, including the “Protocol for the Prohibition of the Use in War of Asphyxiating, Poisonous or Other Gases, and of Bacteriological Methods of Warfare” (the Geneva Protocol, signed June 17, 1925), which stated that “the use in war of asphyxiating, poisonous or other gases, and of all analogous liquids, materials or devices, has been justly condemned by the general opinion of the civilized world.”

Countries that ratified the protocol before World War II included Iran, Iraq, France, Germany, and the United Kingdom. The United States did not ratify it until 1975. The protocol was strengthened in 1972 by the Biological Weapons Convention, but efforts to add a legally binding verification mechanism failed in 2001 when President George W. Bush refused to sign.

One business-oriented publication that often supported the president’s policies had this reaction: “Alongside Mr. Bush’s refusal to ratify the Comprehensive Test-Ban Treaty, and his moves to scrap the ABM (anti-ballistic missile) treaty, this was more than an undiplomatic blunder. It seems to represent a dangerously ideological aversion to any sort of binding arms control.”

These noble agreements, however, failed to prohibit governments from continuing to research, develop, store, transport, or produce biological weapons; in effect, all that was truly outlawed was being the first to use them in a particular conflict. The result is that countries around the globe still maintain active biological and chemical stockpiles or, as in the case of the United States, operate facilities engaged in defense research.