
How the Terrible Gas Warfare of World War One Saved Millions of Lives from Cancer

By Tom Anderson


World War II Poster from the US Army

Ever since the power of science and technology to change the world was widely recognised, people have grappled with the dichotomy that they can change it both for better and for worse. Throughout history, and particularly in the twentieth century with its rapid pace of change, the leaders of political parties, corporate businesses and popular movements have attempted to turn scientific progress only to the rails on which they wished it to run, or to stop it altogether. The former two groups have typically focused on ‘utilitarian’ ends, attempting to direct research only towards areas they think will be of practical use, or which will make money. The third group may campaign against breakthroughs such as nuclear power, improvements in communication technology or widescale chemical production, for reasons sometimes justified but frequently imaginary or overblown. In any case, all such efforts are fundamentally doomed to failure, for the simple reason that science is not predictable. If it were, it wouldn’t be science—and there would be little point in doing research. Ninety-nine experiments out of a hundred, or more, may give the expected result—but it is the one which does not that leads to old theories being scrapped and new ones conceived. A quote attributed to Isaac Asimov states that the most exciting phrase in science is not ‘Eureka!’ but ‘Hmm, that’s funny…’


The upshot of all this is that attempting to direct the course of research is often misguided or futile. Strategies focused on utilitarian, money-making ends (such as the research funding model adopted by Edward Heath’s government in the UK) miss the point that so many scientific breakthroughs were achieved by accident, while attempting to do something else. Radio waves, and everything we now take for granted that came from them—television, radar, mobile telephones, wireless internet—were first produced by Heinrich Hertz in the 1880s not with a profit motive in mind, but because he wanted to find evidence for James Clerk Maxwell’s theories of electromagnetism, formulated two decades earlier. When interviewed by the media at the time, Hertz stated that he did not see any practical use for radio waves—it took the business-oriented mind of a man like Guglielmo Marconi to realise their potential to change the world. Time and again throughout history, it is such ‘blue-skies’ research, as it is sometimes sneeringly referred to by utilitarians, which has been the origin of world-changing inventions that make fortunes. Nuclear power and nuclear weapons stemmed from extremely esoteric investigations into the nature of matter. The World Wide Web was created almost incidentally by Tim Berners-Lee to keep track of particle accelerator data at CERN, and has changed the world more, perhaps, than even the most profound discoveries about particle physics made there. The list goes on.



Stephanie Kwolek. Photo from the Science History Institute and shared under the CC BY-SA 3.0 licence.

Of course, it is possible to go too far the other way. The Radio Corporation of America (RCA), after many brilliant early breakthroughs in the first half of the twentieth century, came unstuck in the latter half in part because they were unable to bring their blue-skies-focused research labs back under control. Simply throwing money at ideas men and women is not guaranteed to produce a successful business model. Yet even in such corporate settings, we find that scientific breakthroughs frequently take on a characteristic of serendipity—as someone once said, a useful word of Persian origin that sounds better than ‘dumb luck’. For example, Stephanie Kwolek, a chemist working at the chemical giant DuPont in 1965, discovered the material Kevlar (known for its use in bulletproof vests) while trying to develop a new material for racing tyres. Similarly, cyanoacrylate ‘superglues’ were discovered by a team led by Harry Coover Jr. in the 1940s and 50s while trying to make solid plastics for use in aircraft manufacturing. It took time to realise that, while the new material was far too sticky for its intended use, it could be sold as a powerful adhesive—and Coover’s employer, Eastman Kodak, made millions off it.


This brings us to an interesting question. If scientific research is truly blind, can it be said to have moral character? Deadly weapons can come from research into innocent uses: the new mining explosives of the nineteenth century, for example, were soon turned to weaponry as well. World War One was prolonged because German scientists had developed the Haber-Bosch Process to produce ammonia (and, from it, nitrates) from nitrogen and hydrogen—which meant Germany was no longer dependent on imports of naturally-occurring nitrates from guano islands, and the Allied blockade of those imports was ineffective. Yet those nitrates were used both for explosives and for fertilisers, as they still are to this day; an invention that killed so many soldiers also feeds millions the world over.
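For the chemically minded, the reaction at the heart of the process is simple to state, if fiendishly difficult to run at scale (the high pressures, temperatures and catalysts required were Haber and Bosch’s real achievement); in sketch form:

$$\mathrm{N_2 + 3\,H_2 \;\rightleftharpoons\; 2\,NH_3}$$

The ammonia produced could then be oxidised onward into nitric acid and nitrate salts for explosives, or used directly in fertilisers.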


Is it then fair to say that scientific research has no inherent moral character one way or the other, that it is never good or evil, and that it is only how we choose to deploy its results which determines this? That is a much-debated point; for example, one can certainly raise questions about the ethics of how an experiment is designed (such as testing on animals or, historically, on unwilling or unwitting humans). Yet in terms of the research direction, its stated goal, we have seen many examples so far of how an intended goal can be very far from what the research ultimately produces. Many of those examples have been negative; striving for a peaceful invention, to make the world a better place, leads to weapons or pollutants. Yet sometimes the reverse can be true. Let us take what, on the face of it, is surely one of the most morally indefensible uses of scientific research in history: the development of poison gas for use in warfare.


As with many weapons we associate with twentieth-century innovation, there are in fact many ancient uses of rudimentary chemical warfare. It was reasonably common to poison arrows or enemy food supplies, and besieged armies might be smoked out (there are Chinese records of using toxic smoke containing arsenic compounds). In the sixteenth and seventeenth centuries, a number of European inventors experimented with poison-based weapons; in 1675 France and the Holy Roman Empire signed the Strasbourg Agreement, the first ever international treaty banning the use of chemical weapons (then usually referring to poisoned bullets).


However, the first proposals of what we would consider gas warfare in the modern sense came during the Napoleonic Wars. Admiral Cochrane (whose family was the subject of an earlier article in this series) proposed coastal bombardments of France in 1812, using ‘stink vessels’ as well as conventional bomb-ships. The Admiralty rejected his plans, in part because of the prophetic fear that this would escalate into France developing the same weapons and using them against Britain. Undaunted, Cochrane proposed another gas weapon for use against Russia during the Crimean War, together with the Liberal politician and scientist Lyon Playfair, 1st Baron Playfair. The Cochrane-Playfair proposal was to use shells filled with a noxious organoarsenic compound called cacodyl cyanide, discovered by Robert Bunsen (after whom the Bunsen burner is named, though his lab assistant Peter Desaga had a bigger role in developing it). Cacodyl cyanide had the unusual property that even a dose far below the lethal level would produce tingling in the extremities and insensibility, as well as a noticeable black coating on the tongue.



Lyon Playfair, advocate of gas warfare.

The Prime Minister, Lord Palmerston, considered the proposal, but it was rejected by the Ordnance as being morally equivalent to poisoning wells. Playfair, like many later advocates of gas warfare, was irate at what he saw as an arbitrary moral judgement: “There was no sense in this objection. It is considered a legitimate mode of warfare to fill shells with molten metal which scatters among the enemy, and produced the most frightful modes of death. Why a poisonous vapor which would kill men without suffering is to be considered illegitimate warfare is incomprehensible. War is destruction, and the more destructive it can be made with the least suffering the sooner will be ended that barbarous method of protecting national rights. No doubt in time chemistry will be used to lessen the suffering of combatants, and even of criminals condemned to death.” Playfair’s latter prediction did come true, though the use of gas chambers as an execution method became controversial (in part due to their use by Nazi Germany in the mass murder of the Holocaust) and the last execution by gas chamber in the USA was carried out in 1999.


It would take another six decades before gas warfare was finally tried for real. A number of other international conventions had forbidden the use of war gases in the meantime, but these restrictions were allowed to slide away amid the terrible escalation of the war. France initially used tear gases originally intended for riot control, which blurred the line. Germany’s aforementioned powerful chemical industry weaponised first chlorine and then sulfur mustard, a chemical developed almost a century earlier. Mustard gas, as it became known, wrought terrible burns and blisters on its targets. Though other war gases were used in the First World War, such as phosgene, mustard gas is the best-known for the horrific damage it caused. Even soldiers who seemed physically unharmed would often die in hospital; the gas had burned their lungs and gut from the inside out. It later transpired that the gas was also a powerful mutagen and carcinogen, and those who recovered had a substantially increased risk of cancer. Unlike other gases that needed to be breathed in, mustard gas could harm even a soldier wearing a gas mask, because it attacked exposed skin as well as the airways. It was also persistent, contaminating uniforms, equipment and even the ground for weeks afterwards.


Britain reverse-engineered the gas and used it in return on the Germans; following the end of the war, the use of mustard gas was banned. While it has been used in warfare since, it has always been by regimes against asymmetric opponents who cannot retaliate in kind, such is the fear of that retaliation. Even during the horrors of the Second World War, gas was never used between Germany and the Western Allies for fear of reprisal. In some ways, this can be considered a precursor to the mutually assured destruction of the atomic age.


Surely, then, mustard gas must hold a position in a hall of infamy as one of the most unambiguously negative and horrific inventions that science and engineering have ever produced. And yet, even in a situation as dark as this, we see that scientific progress is not predictable. Mustard gas cost or ruined the lives of thousands in the First World War—yet it has saved the lives of millions since that time. How?


Autopsies of those who died from mustard gas exposure led to some interesting results. As noted above, soldiers without apparent wounds might die due to burns to the lungs and guts—the parts of the body which are constantly producing new cells to replace those worn out on a daily basis. The same is true of the white blood cells that form the basis of our immune system: the autopsies showed that the soldiers had unexpectedly low white cell counts. It was clear that though mustard gas attacked all cells, it disproportionately affected rapidly-dividing cells. It would take decades for researchers to find out exactly how this happened: mustard gas worked by alkylating nitrogen atoms in the guanine bases of DNA. Such damage is usually recognised by a cell’s internal defence systems, which cause the cell to self-destruct before it can become cancerous; in the rare cases where this fails, that is why mustard gas could cause cancer in survivors. If doctors could find a weakened form of mustard gas, they would have a drug that might kill only the rapidly-dividing cells of an existing cancer—or at least kill them faster than it killed the healthy cells in the rest of the body. It would be a form of therapy for cancer using a chemical: a chemotherapy!
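In very simplified terms (and glossing over a great deal of biochemistry), the chemistry runs something like this: one ‘arm’ of the sulfur mustard molecule curls around on itself to form a highly reactive ring, which then latches onto a guanine base in the DNA; with two such arms, a single molecule can tie the two strands of the double helix together, so a cell that tries to unzip and copy its DNA in order to divide finds that it cannot. As a sketch:

$$\mathrm{(ClCH_2CH_2)_2S} \;\longrightarrow\; \text{reactive cyclic sulfonium ion} \;\xrightarrow{\ \text{DNA}\ }\; \text{alkylated (and cross-linked) guanine}$$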


Further research was required (aided by studies of US servicemen exposed to gas released during an air raid on Bari in Italy), but in the 1940s researchers at the Yale School of Medicine produced the first suitable derivative, a nitrogen (rather than sulfur) mustard usually called mustine. Mustine was the first ever chemotherapy drug, used for treating lymphoma, a cancer of the white blood cells. It was still very crude, and chemotherapy has always remained a case of ‘find a poison that kills the cancer faster than it kills the patient’, but it nonetheless saved lives and began a research path that has brought us to where we are today.
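For the curious, the structural change was in fact a small one: swap the sulfur atom at the centre of the mustard molecule for a nitrogen atom (carrying, in mustine’s case, a methyl group), which tames its reactivity enough to be handled as a drug:

$$\underbrace{\mathrm{S(CH_2CH_2Cl)_2}}_{\text{sulfur mustard}} \qquad\longrightarrow\qquad \underbrace{\mathrm{CH_3N(CH_2CH_2Cl)_2}}_{\text{mustine, a nitrogen mustard}}$$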

Cisplatin

Once again, serendipity rules the roost. A later anti-cancer drug, cisplatin, was discovered by Barnett Rosenberg and co-workers in 1965 by chance. Rosenberg had been studying the growth of bacteria in an electric field, and was puzzled to find that bacterial growth was halted for no apparent reason—until he and his colleagues realised that platinum from his electrode was combining with other compounds present in the solution to create cisplatin, which bound to the bacterial DNA and stopped the cells from dividing. The same property could be used to target rapidly-dividing cells in cancer patients. In fact, the chemical cisplatin had been known for over a century, but nobody had ever thought to test whether it might have anti-cancer properties. Cisplatin went on to become one of the most important and broad-spectrum chemotherapy drugs in history, forming the basis for many later ones.
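Cisplatin itself is a remarkably simple molecule by the standards of modern drugs: a single platinum atom flanked by two ammonia groups and two chlorine atoms, the ‘cis’ referring to the two chlorines sitting on the same side:

$$\textit{cis-}\mathrm{[Pt(NH_3)_2Cl_2]}$$

In the body the chlorines are displaced and the platinum binds to (and cross-links) DNA, in a manner loosely analogous to the mustards.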


Arguably these two chance discoveries together have made more difference to the lives of cancer patients, people who once would have been condemned to certain death, than all the deliberately focused research aimed at finding new drugs—though the latter is, of course, important as well. It is a key reminder that science cannot be corralled or directed or blocked. It is an exploding flower of unpredictable consequences, for good and for ill. At the time of writing, the world is struggling with the outbreak of the novel Covid-19 coronavirus, and scientists across the world are working to make vaccines and countermeasures. The answers may come from grand and well-funded projects directed to this end goal, with researchers working in lockstep—but do not be surprised if they also come from one man or woman in an unrelated lab staring at a slide and muttering “Hmm, that’s funny…”

 
 
