Department of Medicine, Baystate Health, Springfield, MA, USA
Abstract
“Flexner’s serum” had a dramatic impact on the uniformly poor prognosis associated with meningococcal meningitis in the early decades of the twentieth century, reducing overall mortality by more than 50% and providing the first real glimmer of therapeutic hope in the century-long history of the disease. Similarly, serum therapy for pneumococcal disease—the most lethal of the common bacterial diseases of the time—showed substantial early promise during the 1920s and 1930s. It seemed at that juncture as if this would be the path towards managing some of the most dangerous bacterial infections of the era. However, a paradigm shift was about to take place in medical science, one that would transform the way in which infectious diseases were treated. In the late 1930s, this revolution would not only alter the prognosis of patients suffering from these maladies but would also alter the history of medical science—at least temporarily.
Throughout human history and until the latter portion of the nineteenth century, medicines comprised substances derived from plants, minerals, or other natural sources that had empirically been found to possess therapeutic qualities. In some cases, as with purging agents, the therapeutic effect was based on their purported ability to expel certain bad humors or toxins from the body. While this attribute may have had some benefit in select poisonings or other intoxications—and still does in some instances today—it clearly had no effect on the course of most infectious diseases. In fact, it may even have worsened the clinical course of patients with severe infections by further depleting them of bodily fluids, exacerbating an already debilitated state.
The first hundred years of therapy for meningococcal meningitis—before the introduction of serum therapy in the early twentieth century—involved largely symptomatic and “alexipharmic”—antidote—therapies. These included such treatments as “nutritious diet…Peruvian bark and bitters…tincture of iron…twigs of the hemlock tree…brandy, spiced wine, opium and arsenite of potash…calomel, ipecacuanha…cold and tepid ablution…rhubarb”; these specific interventions were recommended for specific symptoms or manifestations of the disease.1 Although such approaches gave both patients and their physicians at least some means of attempting to alleviate the suffering associated with meningitis, they did little to alter the dismal natural history of the disease.
With advances in the field of chemistry came advances in medicine. Although it was widely known that many plants possessed medicinal properties, delineation of their specific, active substances eluded science until the tools of technology had progressed to the point that such laboratory experiments could be performed.2 The bark of the cinchona tree—a species native to the tropical forests of South America—was known to have therapeutic properties against malaria and other “tropical fevers” for nearly 300 years before its chemical basis—the alkaloid quinine—was discerned in the laboratory.
As the field of synthetic, analytical organic chemistry developed through a series of rapid, incremental advances in the middle and late nineteenth century—in parallel with advances in microbiology, immunology, and other scientific disciplines during that period—researchers transitioned from isolating medicinally active substances from natural products to producing these substances through chemical synthesis in the laboratory. The first major breakthrough in this arena came from experiments with a thick, dark, oily liquid—coal tar—used in the dye industry.
In the spring of 1856, an 18-year-old chemistry assistant, William Henry Perkin, working to synthesize quinine in a makeshift laboratory in his East London home, made the serendipitous discovery that aniline—an organic compound derived from coal tar—could be chemically transformed to produce a brilliant purple pigment. Perkin proceeded to commercialize the pigment—“mauveine”—thus not only becoming a very wealthy young man but also becoming the inadvertent founder of the commercial dye industry.3
Early on, the English and French dominated the fledgling dye industry. But aided by lax or absent patent laws, Germany rapidly ascended to primacy in the field, a situation formalized with the founding of Friedrich Bayer & Co.—later Bayer—in Barmen, Germany by dye salesman Friedrich Bayer and dyer Johann Weskott in 1863. Soon after, Bayer had competition—Hoechst dyeworks was founded the same year—followed some years later by Baden Aniline and Soda Factory—BASF.4–6 The dye industry in Germany quickly became a massive one, spurred by the rapid growth in textile manufacturing—its main market—as a consequence of the Industrial Revolution. Within just a few decades, Bayer had become an international powerhouse in the production of industrial dyes.
The relationship between chemically synthesized dyes and medical science, although perhaps not self-evident, became apparent early in the history of the former. Newly invented dyes advanced the technology of microbial and tissue staining—of great importance to the fledgling field of microbiology—as evidenced in the early work of Paul Ehrlich with methylene blue. But synthetic dyes would contribute much more to the field of medicine than as simple laboratory tools. In the 1880s, the same decade in which Koch discovered the bacterial etiologies of tuberculosis and cholera and in which he laid down the principles for growing bacteria in the laboratory, it became apparent to German industrial chemists that compounds involved in the production of dyes had potential therapeutic value.
The common denominator of dyes and medicines was coal tar. A usually discarded by-product of the carbonization of coal, coal tar was therefore a highly abundant raw material—albeit one without obvious utility—of nineteenth century industrialization. It was also the basis of the first synthetic medicinal compounds. Derivative forms are still used today in various topical and shampoo formulations for skin conditions such as dandruff and psoriasis. Some of the aniline compounds derived from coal tar that were used to synthesize dyes—initially indigo but later other colors as well—were also found, in many cases through serendipity, to have medicinal effects.7
The first of these synthetic dyes-turned-medicinal agents was acetanilide, a highly toxic compound with antipyretic—fever-lowering—effects. In 1887, the same year that Weichselbaum identified meningococcal bacteria in patients with meningitis in Vienna, Carl Duisberg—the inaugural research director and later management board chairman of Bayer—devised a method to chemically modify acetanilide, rendering it a less toxic but highly potent antipyretic as well as analgesic.8 The widespread medicinal use of that compound—phenacetin, the parent drug of acetaminophen—Tylenol®—and its economic ramifications impelled the diversification of the rapidly growing German dye industry into the pharmaceutical research business.
Perhaps the most important landmark in the early history of the commercial pharmaceutical industry, occurring in the final days of the nineteenth century, also involved the relief of pain, headaches, and fevers. The medicinal effects of various plant extracts, including those from willow tree bark and the meadowsweet herb, had probably been known since the time of Hippocrates, the ancient Greek father of medicine. A synthetic derivative of salicylic acid, the medically active ingredient in these traditional remedies, first isolated during the rise of analytical chemistry in the mid-1800s, was produced by Bayer scientists in 1897 and sold worldwide under the trade name of “Aspirin” by 1899.9 This agent, by virtue of its profitability and global appeal, revolutionized and galvanized the nascent pharmaceutical research industry.
Because of its supremacy in the industrial manufacture of dyes and synthetic chemistry, Germany became the leader in medicinal chemistry during the first quarter of the twentieth century. Bayer was at the forefront, with its production of phenacetin followed by aspirin and other agents, including—for a brief period during the first decade of the new century—heroin, marketed as a cough suppressant until its addictive nature became widely manifest.10 In large part a response to the forced reparations and other measures levied against German economic interests after the country’s defeat in World War I, the German chemical industry underwent a massive restructuring in 1925. Three of its major companies—Bayer, Hoechst, and BASF—joined with three of its smaller ones to form I.G. Farben, which by the 1930s was one of the world’s largest companies.11 With the merger, German pharmaceutical research not only consolidated its standing but also advanced its reach into the well-traveled yet poorly understood world of infectious disease therapeutics.
Ehrlich is credited with coining the term “chemotherapy”—the use of chemical agents to treat germ causes of disease—in the early 1900s in describing his vision to unite advances in two great streams of scientific research of the era: infectious diseases and synthetic organic chemistry.12 Although “chemotherapy” is now more often synonymous with cancer treatment, Ehrlich specifically used it to refer to the treatment of infection.