This article summarizes the developments that led to the current approach to immunotherapy. These developments were characterized in the early years by empirically derived successive approximations to arrive at effective injection regimens, in the middle years by a sorting through of the wide variations in practice with placebo-controlled clinical trials, and more recently by a closer association of clinical and laboratory measures to better define evidence-based practices. The pace of investigation along with the scientific quality continues to increase.
- From the latter half of the 1800s to the early 1900s, allergy, infectious disease, and immunology practitioners were one and the same, with the view that sensitivity to pollen toxin could be addressed in the same way vaccination or subcutaneous injection remedied susceptibility to bacterial diseases. It was not until 1955 that bacterial vaccines became uncommon in the treatment of intrinsic asthma or chronic rhinitis.
- In the first half of the twentieth century, the basis for much of what we currently practice, from skin testing to progressive dose escalation immunotherapy, was derived empirically from uncontrolled (single-arm, unblinded) clinical studies with nonstandardized extracts. The efficacy of combining multiple allergens in a treatment regimen went largely unverified.
- In the 1920s, observations that house dust and climate allergens could cause rhinitis and exacerbate asthma failed to gain the attention of allergy practitioners. It was not until the 1960s, when the dust mite was characterized, that environmental modification joined the prior triad of treatment (counseling, pharmacotherapy, and immunotherapy).
- Oral immunotherapy was common in the United States until a multi-institutional trial in 1940, with the negative result now ascribed to the ineffective pill route of administration. Positive reports from European centers over the intervening period have caused a recent resurgence in interest.
- Beginning in the mid-1950s, clinical trials began to incorporate placebo controls and subject/doctor blinding, moving the practice of allergists from empiric- to evidence-based practices. Oral immunotherapy still trails injection therapy in corroborating data. Recently, organized allergy has promulgated practice parameters covering patient testing, allergen standardization, and administration and has agreed on standards for future studies.
Earliest history
The first inklings that one’s native reaction to a toxic substance could be altered may have originated with Mithridates (131–63 BC), a king of Pontus in Asia Minor. Because poison was a common way to dispatch rivals in those days, the king began ingesting small amounts of potential poisons in gradually increasing doses until he developed resistance. Vague references to alterable capabilities of the immune system thereafter surfaced sporadically, from Galen and others, but it was not until 1891 that Ehrlich confirmed the ability to induce tolerance in mice fed ricin, a potent toxin, after a prolonged and gradual dose escalation. Physicians did not distinguish between resistance to toxin or infection, now known to be principally IgG and/or IgA mediated, and hypersensitivity diseases, principally IgE mediated or cell mediated, until 1915, when verbiage naming a toxin (poison or infection) as the inciting factor for allergic rhinitis or asthma was abandoned in favor of the concept that such conditions were localized manifestations of anaphylaxis. In 1923, an allergy interest group, and subsequently the Journal of Allergy, was formed within the membership of the American Association of Immunologists, which itself had not had sufficient numbers to be self-sustaining until 1913. Skepticism in the scientific community about both disciplines was characterized by an admonition to the society’s first journal editor that “immunology is dead.”
Eighteenth and nineteenth century contributions
Many investigators of the nineteenth and early twentieth centuries who empirically derived much of the basics of what is still practiced clinically, from skin testing to progressive allergen dose escalation, were themselves afflicted with allergic rhinitis, chronic rhinosinusitis, and/or asthma. Pharmacologic alternatives were sparse, and there was no agreement on the nature of inhalant sensitivities. An article by Bishop in the first issue of Laryngoscope espoused a formula of morphine, atropine, and caffeine, laying the blame for seasonal rhinitis on an excess of uric acid in the blood. Adrenaline did not become commercially available until 1904, ephedrine until 1924, the first sedating antihistamine until 1936, and widely released steroid preparations until the mid-1950s. Given this dearth of effective remedies, it is not surprising that many investigators devoted their professional careers to sorting through the confusing morass of hypersensitivity disorders, with few laboratory-based assays to guide them, to identify effective treatments. The efforts of some prominent contributors in allergy and related immunology are detailed below.
The concept of an epicutaneous or transcutaneous route for conferring resistance started in 1795 with Jenner, who used a prick into a subject’s skin to deliver material derived from a cowpox pustule, which over time afforded protection from the more virulent smallpox virus. Jenner named the process vaccination (a Latin reference to the cow). In 1879, Pasteur demonstrated that a weak cholera strain given to chickens also provided protection from virulent strains and then replicated this procedure for anthrax in sheep. Pasteur chose a term just then appearing in the English literature, immunize, for this phenomenon and, in 1884, successfully applied a series of graduated subcutaneous inoculations for rabies prevention. However, there were issues, including significant reactions in some cases, now known to be from antineural autoantibodies. In the same time frame, Koch reported delayed hypersensitivity responses in some subjects to whom he had given tuberculosis culture inoculations with the hope of conferring immunity. These untoward effects encouraged alternate approaches, and by 1891, von Behring and Kitasato had successfully conferred passive immune protection against diphtheria in human patients injected with serum containing antitoxin from previously infected animals (usually the horse). However, reports of adverse reactions surfaced yet again, with the first fatality in 1896, soon joined by others, suppressing enthusiasm for passive immunization against other infectious diseases prevalent in that era. In an effort to identify what might be causing these issues, and also to introduce some standardization to antisera preparations, Ehrlich identified in the laboratory what is now known to be an antigen-antibody reaction as the culprit behind anaphylactic reactions to nonhuman sera.
The first description of the classic symptoms of hay fever and asthma was penned by Bostock in 1819, describing his personal illness. It was not until 1872 that Wyman tentatively identified pollen as a cause of autumn catarrh in the United States. The next year in England, Blackley followed with the same observation and a critical concept, not fully accepted for hay fever until 1915 and for asthma until the 1940s: that inhalant sensitivities were not the result of an infectious disease. Blackley was the first to try immunotherapy, a self-experiment in which he rubbed grass pollen into his abraded skin, following up in 1880 with observations on the cutaneous wheal, erythema, and pruritus response to what was essentially a scratch test and using it to detect and grossly quantitate patient sensitivity.
Early twentieth century contributions
The concepts of passive versus active immunity, anaphylaxis, and hypersensitivity were sorted out during the first 15 years of the twentieth century. von Pirquet was able to distinguish a basic difference between systemic horse serum reactions and a local response to smallpox vaccination, stating that “immunity and hypersensitivity can be closely related” and introducing the term allergy, from allos (other or altered state) and ergon (work), for the latter phenomenon. von Pirquet proposed the term allergen for a substance that stimulates an organism to change its intrinsic response after prior exposure to that substance. Attempting to answer the question of why an immunized organism could be protected from or supersensitive to the same disease or substance, von Pirquet tracked the progressively shortening interval to reactions to successive horse serum antitoxin injections in affected individuals, postulating a collision of antigen and antibody as per Ehrlich’s concept. The investigator added the distinction that because of previous exposure, the body was changed by antibodies, hence the heightened response. How to minimize these adverse reactions consumed the efforts of many. In 1907, Besredka and Steinhardt demonstrated in an animal model that injections of progressively larger but tolerable doses of antigen, like those of Ehrlich’s mice fed ricin, eventually conferred protection from an adverse response. This observation reinvigorated clinical interest in injection therapy for hay fever and asthma, although there were still continuing issues (which continue today), with acute asthma attacks being incited in some patients.
Curtis in the United States was an early proponent of injection therapy, albeit with very dilute pollen extracts. Although the details were sparse, Curtis reported some success with injection therapy as well as with oral administration at the same dosages. In Germany, Dunbar was still pursuing the antitoxin route then being practiced for diphtheria and tetanus, applying a horse and rabbit serum antipollen preparation variously into the conjunctiva, nose, or mouth or via aerosolized oral inhalation. This incited occasional life-threatening reactions, which convinced most investigators that passive immunization for hypersensitivity diseases was a dead end. However, Dunbar did develop a method of testing whereby pollen extracts were applied to a subject’s conjunctiva to identify specific sensitivities. This diagnostic approach became widespread and was adopted by Noon, who was practicing at St Mary’s Hospital in London, a major center for chest diseases.
Noon added a measure of quantification to the conjunctivally applied extracts, which he thought contained a plant toxin, setting the amount extracted from a thousandth part of a milligram (ie, a microgram) of a pollen sample as 1 “Noon unit.” Noon subdivided patients into those who were very sensitive, reacting to 4 or fewer Noon units; those with intermediate sensitivity, reacting at a 70-unit threshold; and those who were nonallergic, reacting to no less than 20,000 units. The investigator chose subcutaneous injections, still called inoculations, of the same extract as used for testing, given preseasonally and/or coseasonally at 5- to 10-day intervals for 2 months ( Fig. 1 ). Although Noon’s approach varied considerably among patients, he seemed to use higher doses of the extract in less-sensitive patients. Noon eventually settled on an optimum interval between injections of 1 to 2 weeks and noted that local or systemic reactions increased if the injections were too frequent or excessive in extract dose. He documented diminished conjunctival reaction to allergen challenge after preseasonal therapy. Noon’s personal affliction with tuberculosis necessitated his withdrawal from practice (he died 2 years later), and he transferred ongoing investigations to his partner, Freeman, himself a hay fever sufferer. Freeman established that some symptom relief persisted for at least a year after discontinuing the injections and noted that some individuals were also less troubled by asthma. The investigator introduced the concept of a placebo effect with immunotherapy, observing a “constant tendency to detect such improvements in adventitious fluctuations in health” and noting the potential for physician or patient bias or external circumstances such as a heavy or light pollen season. Freeman marketed Noon’s toxin solution via the Parke Davis pharmaceutical firm.
By 1920, Freeman reported experience with more than 200 pollen-sensitive patients (no controls), had discarded the 2-per-day limitation of conjunctival testing in favor of batteries of scratch tests, and was experimenting with injections for food and animal sensitivities ( Fig. 2 ). The investigator also suggested the option of rush immunotherapy over a day or two, an approach now used for vespid or in-hospital drug hyposensitization.
Early organization of allergy testing and treatment
From 1914 to the 1950s, most published clinical allergy experience shifted from Europe to the United States, with Koessler in Illinois and Goodale in Massachusetts, the former reporting positive results with 45 patients. Goodale had established the first university-based allergy clinic in the United States in the Department of Laryngology of the Massachusetts General Hospital, where he enjoyed a career-long collaboration with medicine colleagues Rackemann and Colmes, whose interest was asthma. A year later, Cooke established an allergy practice in the Department of Medicine at Cornell, where he also benefited from the presence of a learned colleague, Coca, who was running a fledgling immunology lab and later became the first editor of the Journal of Immunology . Cooke, who in the 1920s was a founder of both the first US allergy society and the Journal of Allergy , was striving to understand a personal affliction. He had had severe asthma during childhood, which vanished during a sojourn at boarding school, and it became apparent that his major sensitivity was toward horses. During an obligatory 6-month internship rotation on a horse-drawn ambulance in New York City, Cooke observed that this would not have been possible save for his adrenaline kit, stating “I put as much adrenaline under my skin as any human being.” After assisting at a tracheotomy in a patient with diphtheria, and hence being obliged to receive a prophylactic injection of horse serum antitoxin, the investigator lapsed into unconsciousness for 10 hours, requiring intubation. In his 1915 publication, choosing Goodale’s preferred journal, Laryngoscope , Cooke eschewed pollen toxin as the source of hay fever in favor of the concept that pollen incited a sensitization that could manifest as either localized (rhinitis, asthma) or systemic anaphylaxis.
The investigator postulated that immunity to infection likely resulted from induction of a widely circulating antibody, whereas anaphylaxis arose from a mostly tissue-fixed antibody. Observing that most sufferers had multiple sensitivities, Cooke preferred wide batteries of intracutaneous skin tests in a clinic session along with the scratch tests Freeman favored and the prick tests that had been introduced in 1908 by Mantoux as he searched for the best method to apply a tuberculosis skin test. All 3 cutaneous testing methods remained in common use until 1987, when issues with reproducibility of scratch tests led the American Medical Association to recommend against their routine use. Cooke tested for a wide variety of pollens as well as foods, including lobster, although he treated only inhalant sensitivities. Extracts of varying strengths were used to classify patients into 4 levels of sensitivity, and to some degree the initial injection dose was determined by this classification, ranging from 5 to 100 Noon units. Most treatment was preseasonal, although coseasonal was an option. The usual regimen was to start with a low dose and escalate weekly, beginning, as had Noon, about 2 months before the relevant season. After reporting on this regimen in 140 patients, Cooke teamed with VanderVeer (1916) to detail the medical histories, with emphasis on a possible inherited predilection to allergic diseases, of 621 patients. The investigators noted a familial tendency for asthma, allergic rhinitis, angioneurotic edema, and immediate food sensitivities but had insufficient evidence to include eczema. However, by the 1920s, Coca was using atopy to describe the trio of hay fever, asthma, and eczema. The incidence of inhalant allergy in the New York area was estimated by Cooke at 10%, of whom 42% had multiple sensitivities.
Cooke noted that, “sensitized individuals transmit to their offspring not their own specific sensitization but an unusual capacity for developing bioplastic reactivities to any foreign protein.” Although the conclusion was that hypersensitivity was transmitted as a dominant characteristic, it did not fit the simple Mendelian pattern, so the question remained open.
Immunotherapy for asthma or rhinitis was confined to pollen sensitivities until 1921 when Kern identified house dust as the causative agent in many cases. However, the major allergen, mite, was not isolated until 1959, and extract preparations until then were as varied among practitioners and commercial sources as any in immunotherapy; not unexpectedly, reports on effectiveness varied. Cooke and others began adding dust and many other nonstandardized allergens into testing and treatment mixes, but there were few published reports, none major, establishing efficacy. It was not until 1954 that the first placebo-controlled trial of immunotherapy for basic pollen sensitivity was published.
van Leeuwen and colleagues in 1924 observed the presence of climate allergens that could exacerbate asthma and rhinitis. The first allergen-free hospital rooms were fitted, and the substantial relief they produced in asthma patients brought to the attention of the medical community the potential effect of environmental modification beyond the simple advice to avoid outdoor activities during problematic pollen seasons. Without specifically identifying molds as the issue, the investigators established that damp sleeping quarters were bad for asthma patients, whereas dry mountain areas were favorable. Curiously, recommendations for environmental measures in patient homes or places of employment did not gain traction in the allergy community until the 1980s, when within a decade there was an explosion of publications distributed across the world’s literature and the introduction of the hygiene hypothesis.
Early use of bacterial vaccines for allergy
Another common practice during the early years of immunotherapy was the use of bacterial vaccines for asthma and/or chronic rhinitis. This practice was first mentioned by Allen, who in 1908 applied an autologous nasopharyngeal vaccine for chronic rhinitis (infectious or allergic, it seems). Lowdermilk advocated combining pollen and bacterial extracts in the same injection for the patient with both seasonal rhinitis and sinusitis issues. Bacterial vaccines were adopted for patients with sinusitis by Goodale and for selected asthma patients by his colleague, Rackemann, who thought intrinsic asthma was possibly of bacterial origin, unlike the better-characterized extrinsic variety, which was correctly ascribed to inhalant hypersensitivity. Cooke later became a proponent, correctly making the distinction that injections for allergies and those for bacterial issues caused different bodily responses, one being hyposensitization and the other immunity. This insight was confirmed by his colleague, Coca, who established in 1925 that a heat-labile reagin (now known to be IgE) was responsible for positive skin test results and hypersensitivity reactions and that a heat-stable antibody called blocking antibody (now known to be IgG) was induced by immunotherapy. Enthusiasm for bacterial vaccines did not wane until the advent of antibiotic therapies in the 1940s allowed trials of antibiotics in asthma patients, to no effect. As a final blow, in 1955, the first controlled study of bacterial vaccines was reported by Franklin, followed within a few years by other negative studies, removing mixed respiratory or autologous bacterial vaccines from the practices of most allergy practitioners ( Fig. 3 ).