Today's Chemist at Work
Introduction
We live today in a world of drugs. Drugs for pain, drugs for disease, drugs for allergies, drugs for pleasure, and drugs for mental health. Drugs that have been rationally designed; drugs that have been synthesized in the factory or purified from nature. Drugs fermented and drugs engineered. Drugs that have been clinically tested. Drugs that, for the most part, actually do what they are supposed to. Effectively. Safely.
By no means was it always so.
Before the end of the 19th century, medicines were concocted with a mixture of empiricism and prayer. Trial and error, inherited lore, or mystical theories were the basis of the world’s pharmacopoeias. The technology of making drugs was crude at best: Tinctures, poultices, soups, and teas were made with water- or alcohol-based extracts of freshly ground or dried herbs or animal products such as bone, fat, or even pearls, and sometimes from minerals best left in the ground—mercury among the favored. The distinction between a poison and a medicine was hazy at best: In the 16th century, Paracelsus declared that the only difference between a medicine and a poison was in the dose. All medicines were toxic. It was cure or kill.
“Rational treatments” and “rational drug design” of the era were based on either the doctrine of humors (a pseudoastrological form of alchemical medicine oriented to the fluids of the body: blood, phlegm and black and yellow bile) or the doctrine of signatures. (If a plant looks like a particular body part, it must be designed by nature to influence that part. Lungwort, for example, was considered good for lung complaints by theorists of the time because of its lobe-shaped leaves.) Neither theory, as might be expected, guaranteed much chance of a cure.
Doctors and medicines were popular, despite their failures. As pointed out by noted medical historian Charles E. Rosenberg, a good bedside manner and a dose of something soothing (or even nasty) reassured the patient that something was being done, that the disease was not being ignored.
Bloodletting dominated the surgeon’s art, and dosing patients with powerful purgatives and cathartics became the order of the day in an attempt to match the power of the disease with the power of the drug. Bleed them till they faint. (It is difficult to sustain a raging fever or pounding pulse when there is too little blood in the body, so the symptoms, if not what we would call the disease, seemed to vanish.) Dose them with calomel till they drool and vomit. (Animals were thought to naturally expel toxins this way.) Cleanse both stomach and bowels violently to remove the poisons there.
Certainly these methods were neither pleasant nor very effective at curing patients already weakened by disease. George Washington died in misery from bloodletting; Abraham Lincoln suffered chronic mercury poisoning and crippling constipation from his constant doses of “blue mass.” The “cure” was, all too often, worse than the disease.
In the second half of the 19th century, things changed remarkably as the industrial revolution brought technological development to manufacturing and agriculture and inspired the development of medical technology.
Spurred in part by a reaction against doctors and their toxic nostrums, patent medicines and in particular homeopathy (which used extreme dilutions of otherwise toxic compounds) became popular and provided an “antidote” to the heroic treatments of the past. Not helpful, but at least harmless for the most part, these new drugs became the foundation of a commodity-based medicine industry that galvanized pharmacist and consumer alike. Technology entered in the form of pill and powder and potion making.
Almost by accident, a few authentic drugs based on the wisdom and herbal lore of the past were developed: quinine, digitalis, and cocaine. Ultimately, these successes launched the truly modern era. The century ended with the development of two synthesized drugs that represent the triumph of chemistry over folklore and of technology over cookery. The development of antipyrine in 1883 and aspirin in 1897 set the stage for the next 10 decades of what we can look back on as the Pharmaceutical Century. With new knowledge of microbial pathogens and the burgeoning wisdom of vaccine technology, the first tentative steps were taken to place medicine on a truly scientific foundation.
From these scattered seeds, drug technology experienced remarkable if chaotic growth in the first two decades of the 20th century, a period that can be likened to a weedy flowering of quackery and patent medicines twining about a hardening strand of authentic science and institutions to protect and nourish it.
the Pharmaceutical Century
But it was not until the birth of medical microbiology that the true breakthroughs occurred, and science—rather than empiricism—took center stage in the development of pharmaceuticals.
Technology made the new framework possible. The brilliance of European lens makers and microscopists, coupled with the tinkering of laboratory scientists who developed the technologies of sterilization and the media and methods for growing and staining microbes, provided the foundation of the new medical science that would explode in the 20th century. These technologies offered proof and intelligence concerning the foe against which pharmaceuticals, seen thereafter as weapons of war, could be tested and ultimately designed.
In 1861, the same year that the American Civil War began, Ignaz Semmelweis published his research on the transmissible nature of puerperal (childbed) fever. His theories of antisepsis were at first vilified by doctors who could not believe their unwashed hands could transfer disease from corpses or dying patients to healthy women. But eventually, with the work of Robert Koch, Joseph Lister, and Louis Pasteur adding proof of the existence and disease-causing abilities of microorganisms, a worldwide search for the microbial villains of a host of historically deadly diseases began.
In 1879, as part of the new “technology,” Bacterium coli was discovered (it was renamed Escherichia after its discoverer, Theodor Escherich, in 1919). It quickly became the quintessential example of an easily grown, “safe” bacterium for laboratory practice. New growth media, new sterile techniques, and new means of isolating and staining bacteria rapidly developed. The ability to grow “pathogens” in culture proved remarkably useful. Working with pure cultures of the diphtheria bacillus in Pasteur’s laboratory in 1888, Emile Roux and Alexandre Yersin first isolated the deadly toxin that causes most of diphtheria’s lethal effects.
One by one over the next several decades, various diseases revealed their microbial culprits to the so-called microbe-hunters.
Initially, most American physicians were loath to buy into germ theory, seeing it as a European phenomenon incompatible with the “truth” of spontaneous generation and as a threat to the general practitioner from the growing cadre of scientifically trained laboratory microbiologists and specialist physicians.
“Anti-contagionists” such as the flamboyant Colonel George E. Waring Jr., pamphleteer, consulting engineer, and phenomenally effective warrior in the sanitation movement, initially held sway. Filth was considered the source of disease. A host of sewage projects, street-cleaning regimens, and clean water systems swept urban areas across the United States, with obvious benefits. Ultimately, the germ theory of infectious diseases had to be accepted, especially as the theoretical foundation behind the success of the sanitation movement. And with the production of vaccines and antitoxins, older medical frameworks fell by the wayside, though rural American physicians were still recommending bleeding and purgatives as cures well into the first few decades of the 20th century.
But active immunity was perhaps not the most impressive result of the immunologicals. Antitoxins (antibodies isolated against disease organisms and their toxins from treated animals), when injected into infected individuals, provided salvation from otherwise fatal diseases. This technology began in 1890 when Emil von Behring and Shibasaburo Kitasato isolated the first antibodies against tetanus and, soon after, diphtheria. In 1892, Hoechst Pharma developed a tuberculin antitoxin. These vaccines and antitoxins would form the basis of a new pharmaceutical industry.
Perhaps as important as the development of these new immunologics was the impetus toward standardization and testing that a new generation of scientist-practitioners such as Koch and Pasteur inspired. These scientists’ credibility and success rested upon stringent control—and ultimately, government regulation—of the new medicines. Several major institutions sprang up in Europe and the United States to manufacture and/or inspect in bulk the high volume of vaccines and antitoxins demanded by a desperate public suddenly promised new hope against lethal diseases. These early controls helped provide a bulwark against contamination and abuse. Such control would not be available to the new synthetics soon to dominate the scene with the dawn of “scientific” chemotherapy.
The chemical technology of organic synthesis and analysis seemed to offer for the first time the potential to scientifically ground the healer’s art in a way far different from the “cookery” of ancient practitioners. In 1887, phenacetin, a pain reliever, was developed by Bayer specifically from synthetic drug discovery research. The drug eventually fell into disfavor because of its side effect of kidney damage. Ten years later, also at Bayer, Felix Hoffman synthesized acetylsalicylic acid (aspirin). First marketed in 1899, aspirin has remained the most widely used of all the synthetics.
Many other new technologies also enhanced the possibilities for drug development and delivery. The advent of the clinical thermometer in 1870 spearheaded standardized testing and the development of the antifever drugs. In 1872, Wyeth invented the rotary tablet press, which was critical to the mass marketing of drugs. By 1883, a factory was producing the first commercial drug (antipyrine) in a ready-dosed, prepackaged form. With the discovery of X-rays in 1895, the first step was taken toward X-ray crystallography, which would become the ultimate arbiter of complex molecular structure, including proteins and DNA.
Of equal if not more importance to the adoption and implementation of the new technologies was the rise of public indignation—a demand for safety in food and medicines that began in Europe and rapidly spread to the United States.
food and public furor
In the United States, where the popular sanitation movement could now be grounded in germ theory, this fear of contagion, manifested among the developing middle classes, was directed especially against immigrants—called “human garbage” by pundits such as American social critic Herbert George in 1883. This led to the Immigration Act of 1891, which mandated physical inspection of immigrants for diseases of mind and body—any number of which could be considered cause for quarantine or exclusion. Also in 1891, the Hygienic Laboratory (founded in 1887 and the forerunner of the National Institutes of Health) moved from Staten Island (New York City) to Washington, DC—a sign of its growing importance.
That same year, the first International Sanitary Convention was established. Although restricted to efforts to control and prevent cholera, it would provide a model of things to come in the public health arena. In 1902, an International Sanitary Bureau (later renamed the Pan American Sanitary Bureau and then the Pan American Sanitary Organization) was established in Washington, DC, and became the forerunner of today’s Pan American Health Organization, which also serves as the World Health Organization’s Regional Office for the Americas.
Fears of contagion on the one hand and poisoning on the other, resulting from improperly prepared or stored medicines, led to the 1902 Biologics Control Act, which regulated the interstate sale of viruses, serums, antitoxins, and similar products.
One of the significant outgrowths of the new “progressive” approach to solving public health problems with technological expertise and government intervention was the popularity and influence of a new class of journalists known as the Muckrakers. Under their impetus, and as the result of numerous health scandals, the 1906 U.S. Pure Food and Drugs Act was passed easily after years of planning by U.S. Department of Agriculture (USDA) researchers such as Harvey Wiley. The act established the USDA’s Bureau of Chemistry as the regulatory agency. Unfortunately, the act gave the federal government only limited powers of inspection and control over the industry. Many patent medicines survived this first round of regulation.
The American Medical Association (AMA) created a Council on Pharmacy and Chemistry to examine the issue and then established a chemistry laboratory to lead the attack on the trade in patent medicines that the Pure Food and Drugs Act had failed to curb. The AMA also published New and Nonofficial Remedies annually in an effort to control drugs by highlighting serious issues of safety and inefficacy. This publication prompted rapid changes in industry standards.
International health procedures continued to be formalized also—L’Office International d’Hygiène Publique (OIHP) was established in Paris in 1907, with a permanent secretariat and a permanent committee of senior public health officials. Military and geopolitical concerns would also dominate world health issues. In 1906, the Yellow Fever Commission was established in Panama to help with U.S. efforts to build the canal; in 1909, the U.S. Army began mass vaccination against typhoid.
Nongovernmental organizations also rallied to the cause of medical progress and reform. In 1904, for example, the U.S. National Tuberculosis Society was founded (based on earlier European models) to promote research and social change. It was one of many groups that throughout the 20th century were responsible for much of the demand for new medical technologies to treat individual diseases. Grassroots movements such as these flourished. Public support was often behind the causes. In 1907, Red Cross volunteer Emily Bissell designed the first U.S. Christmas Seals (the idea started in Denmark). The successful campaign provided income to the Tuberculosis Society and a reminder to the general public of the importance of medical care. Increased public awareness of diseases and new technologies such as vaccination, antitoxins, and later, “magic bullets,” enhanced a general public hunger for new cures.
The movement into medicine of government and semipublic organizations such as the AMA and the Tuberculosis Society throughout the latter half of the 19th and beginning of the 20th centuries set the stage for a new kind of medicine that was regulated, tested, and “public.” Combined with developments in technology and analysis that made regulation possible, public scrutiny slowly forced medicine to come out from behind the veil of secret nostrums and alchemical mysteries.
crowning of chemistry
The flowering of biochemistry in the early part of the new century was key, especially as it related to human nutrition, anatomy, and disease. Some critical breakthroughs in metabolic medicine had been made in the 1890s, but they were exceptions rather than regular occurrences. In 1891, myxedema was treated with sheep thyroid injections. This was the first proof that animal gland solutions could benefit humans. In 1896, Addison’s disease was treated with chopped-up adrenal glands from a pig. These test treatments provided the starting point for all hormone research. Also in 1891, a pair of agricultural scientists developed the Atwater–Rosa calorimeter for large animals. Ultimately, it provided critical baselines for human and animal nutrition studies.
But it wasn’t until the turn of the century that metabolic and nutritional studies truly took off. In 1900, Karl Landsteiner discovered the first human blood groups: O, A, and B. That same year, Frederick Hopkins discovered tryptophan and demonstrated in rat experiments that it was an “essential” amino acid—the first discovered. In 1901, fats were artificially hydrogenated for storage for the first time (providing a future century of heart disease risk). Eugene L. Opie discovered the relationship of islets of Langerhans to diabetes mellitus, thus providing the necessary prelude to the discovery of insulin. Japanese chemist Jokichi Takamine isolated pure epinephrine (adrenaline). And E. Wildiers discovered “a new substance indispensable for the development of yeast.” Growth substances such as this eventually became known as vitamines and later, vitamins.
In 1902, proteins were first shown to be polypeptides, and the AB blood group was discovered. In 1904, the first organic coenzyme—cozymase—was discovered. In 1905, allergies were first described as a reaction to foreign proteins by Clemens von Pirquet, and the word “hormone” was coined. In 1906, Mikhail Tswett developed the all-important technique of column chromatography. In 1907, Ross Harrison developed the first animal cell culture using frog embryo tissues. In 1908, the first biological autoradiograph was made—of a frog. In 1909, Harvey Cushing demonstrated the link of pituitary hormone to gigantism.
Almost immediately after Svante August Arrhenius and Søren Sørensen demonstrated in 1909 that pH could be measured, Sørensen pointed out that pH can affect enzymes. This discovery was a critical step in the development of a biochemical model of metabolism and kinetics. So many breakthroughs of medical significance occurred in organic chemistry and biochemistry in the first decade of the Pharmaceutical Century that no list can do more than scratch the surface.
With the cure identified and the public increasingly aware of the subject, it was not surprising that the “progressive” U.S. government intervened in the public health issue of venereal disease. The Chamberlain–Kahn Act of 1918 provided the first federal funding specifically designated for controlling venereal disease. It should also not be a surprise that this attack on venereal disease came in the midst of a major war. Similar campaigns would be remounted in the 1940s.
“fall” of chemotherapy
Ultimately, despite the manifold breakthroughs in biochemistry and medicine, the end of the ’Teens was not a particularly good time for medicine. The influenza pandemic of 1918–1920 clearly demonstrated the inability of medical science to stand up against disease. More than 20 million people worldwide were killed by a flu that attacked not the old and frail but the young and strong. This was a disease that no magic bullet could cure and no government could stamp out. Both war and pestilence set the stage for the Roaring Twenties, when many people were inclined to “eat, drink, and make merry” as if to celebrate the optimism of a world ostensibly at peace.
Still, a burgeoning science of medicine promised a world of wonders yet to come. Technological optimism and industrial expansion provided an antidote to the malaise caused by failed promises revealed in the first two decades of the new century.
But even these promises were suspect as the Progressive Era drew to a close. Monopoly capitalism and renewed conservatism battled against government intervention in health care as much as in the economy, and the conflict became a familiar refrain. The continued explosive growth of cities undid many of the earlier gains in sanitation and hygiene as a host of new “imported” diseases arrived. The chronically poor health and nutrition of both the urban and rural poor around the world grew worse with the economic fallout of the war.
Many people were convinced that things would only get worse before they got better.
The Pharmaceutical Century had barely begun.
© 2000 American Chemical Society