In 1920, Canadian surgeon Frederick Banting visited the University of Toronto to speak to the newly appointed head of the department of physiology, John J.R. Macleod. Macleod had studied glucose metabolism and diabetes, and Banting had a new idea on how to find not only the cause but a treatment for the so-called “sugar disease.”
Late in the nineteenth century, scientists had realized there was a connection between the pancreas and diabetes. The connection was further narrowed down to the islets of Langerhans, a part of the pancreas. From 1910 to 1920, Oscar Minkowski and others tried unsuccessfully to find and extract the active ingredient from the islets of Langerhans. While reading a paper on the subject in 1920, Banting had an inspiration. He realized that the pancreas’ digestive juice was destroying the islets of Langerhans hormone before it could be isolated. If he could stop the pancreas from working, but keep the islets of Langerhans going, he should be able to find the stuff! He presented this idea to Macleod, who at first scoffed at it. Banting badgered him until finally Macleod gave him lab space, 10 experimental dogs, and a medical student assistant.
In May 1921, as Macleod left for a holiday in his native Scotland, Banting and his assistant Charles Best began their experiments. By August they had the first conclusive results: when they gave the material extracted from the islets of Langerhans (called “insulin,” from the Latin for “island”) to diabetic dogs, the dogs’ abnormally high blood sugar was lowered. Macleod, back from holiday, was still skeptical of the results and asked them to repeat the experiment several more times. They did, finding the results the same, though with problems due to the varying purity of their insulin extract.
Macleod assigned chemist James Bertram Collip to the group to help with the purification. Within six weeks, Collip felt confident enough of the insulin he had isolated to try it on a human for the first time: a 14-year-old boy dying of diabetes. The injection indeed lowered the boy’s blood sugar and cleared his urine of sugars and other signs of the disease. Banting and Best published the first paper on their discovery a month later, in February 1922. In 1923, the Nobel Prize was awarded to Banting and Macleod for the discovery, and each shared his portion of the prize money with the other researchers on the project.
Ironically, Banting’s original idea wasn’t entirely correct. He and Best later found they could obtain insulin even from an intact pancreas. Improved technology for testing and detecting sugar in the blood and urine provided information that earlier researchers didn’t have, and this encouraged them to pursue a line of thinking that may have looked like a dead end to those working in the decades before them.
The discovery of insulin was one of the most revolutionary moments in medicine. Though it took some time to work out proper dosages and to develop manufacturing processes to make enough insulin of consistent strength and purity, its introduction seemed like nothing short of a miracle. One year the disease was an automatic death sentence; the next, people — even children — had hopes of living full and productive lives with the disease. Estimates suggest there are more than 15 million diabetics living today who would have died at an early age without insulin.
by Lexi Krock
Accidents in medicine: The idea sends chills down your spine as you conjure up thoughts of misdiagnoses, mistakenly prescribed drugs, and wrongly amputated limbs. Yet while accidents in the examining room or on the operating table can be regrettable, even tragic, those that occur in the laboratory can sometimes lead to spectacular advances, life-saving treatments, and Nobel Prizes.
A seemingly insignificant finding by one researcher leads to a breakthrough discovery by another; a physician methodically pursuing the answer to a medical conundrum over many years suddenly has a “Eureka” moment; a scientist who chooses to study a contaminant in his culture rather than tossing it out stumbles upon something entirely new. Here we examine seven of medical history’s most fortuitous couplings of great minds and great luck.
A laborer scrapes the bark from a cinchona tree. The bark is then sun-dried and pulverized to make the drug quinine.
The story behind the chance discovery of the anti-malarial drug quinine may be more legend than fact, but it is nevertheless a story worthy of note. The account that has gained the most currency credits a South American Indian with being the first to find a medical application for quinine. According to legend, the man unwittingly ingested quinine while suffering a malarial fever in a jungle high in the Andes. Needing desperately to quench his thirst, he drank his fill from a small, bitter-tasting pool of water. Nearby stood one or more varieties of cinchona, which grows from Colombia to Bolivia on humid slopes above 5,000 feet. The bark of the cinchona, which the indigenous people knew as quina-quina, was thought to be poisonous. But when this man’s fever miraculously abated, he brought news of the medicinal tree back to his tribe, which began to use its bark to treat malaria.
Since the first officially noted use of quinine to fight malaria occurred in a community of Jesuit missionaries in Lima, Peru in 1630, historians have surmised that Indian tribes taught the missionaries how to extract the chemical quinine from cinchona bark. In any case, the Jesuits’ use of quinine as a malaria medication was the first documented use of a chemical compound to successfully treat an infectious disease. To this day, quinine-based anti-malarials are widely used as effective treatments against the growth and reproduction of malarial parasites in humans.
A depiction of Edward Jenner vaccinating James Phipps, a boy of eight, on May 14, 1796.
In 1796, Edward Jenner, a British scientist and surgeon, had a brainstorm that ultimately led to the development of the first vaccine. A young milkmaid had told him how people who contracted cowpox, a harmless disease easily picked up during contact with cows, never got smallpox, a deadly scourge.
With this in mind, Jenner took samples from the open cowpox sores on the hands of a young dairymaid named Sarah Nelmes and inoculated eight-year-old James Phipps with pus he extracted from Nelmes’ sores. (Experimenting on a child would be anathema today, but this was the 18th century.) The boy developed a slight fever and a few lesions but remained for the most part unscathed. A few months later, Jenner gave the boy another injection, this one containing smallpox. James failed to develop the disease, and the idea behind the modern vaccine was born.
Though doctors and scientists would not begin to understand the biological basis of immunity for at least 50 years after Jenner’s first inoculation, the technique of vaccinating against smallpox using the human strain of cowpox soon became a common and effective practice worldwide.
Physicist Wilhelm Conrad Röntgen (1845-1923), discoverer of the X-ray.
X-rays have become an important tool for medical diagnoses, but their discovery in 1895 by the German physicist Wilhelm Conrad Röntgen had little to do with medical experimentation. Röntgen was studying cathode rays, the phosphorescent stream of electrons used today in everything from televisions to fluorescent light bulbs. One earlier scientist had found that cathode rays can penetrate thin pieces of metal, while another showed that these rays could light up a fluorescent screen placed an inch or two away from a thin aluminum “window” in the glass tube.
Röntgen wanted to determine if he could see cathode rays escaping from a glass tube completely covered with black cardboard. While performing this experiment, Röntgen noticed that a glow appeared in his darkened laboratory several feet away from his cardboard-covered glass tube. At first he thought a tear in the paper sheathing was allowing light from the high-voltage coil inside the cathode-ray tube to escape. But he soon realized he had happened upon something entirely different. Rays of light were passing right through the thick paper and appearing on a fluorescent screen over a yard away.
Röntgen found that this new ray, which had many characteristics different from the cathode ray he had been studying, could penetrate solids and even record the image of a human skeleton on a photographic negative. In 1901, the first year of the Nobel Prize, Röntgen won for his accidental discovery of what he called the “X-ray,” which physicians worldwide soon adopted as a standard medical tool.
Charles Robert Richet, a French physiologist, conducted a series of experiments testing the reaction of dogs exposed to poison from the tentacles of sea anemones. Some of the dogs died of allergic shock, but others survived their reactions and made full recoveries.
Weeks later, because the recovered dogs seemed completely normal, Richet wasted no time in reusing them for more experiments. They were given another dose of anemone poison, this time much smaller than before. The first time, the dogs’ allergic symptoms, including vomiting, shock, loss of consciousness, and in some cases death, had taken several days to fully develop. But this time the dogs suffered the same serious symptoms just minutes after Richet administered the poison.
Though Richet was puzzled by what had happened, he realized he could not disregard the unexpected result of his experiment. Later, he noted that his eventual conclusions about the dogs’ affliction were “not at all the result of deep thinking, but of a simple observation, almost accidental; so that I have had no other merit than that of not refusing to see the facts which presented themselves before me, completely evident.”
Richet’s conclusions from his findings came to form the theoretical basis of the medical study and treatment of allergies. He eventually proved that there was a physiological state called anaphylaxis that was the antithesis of prophylaxis: When an allergic subject is exposed to an allergen a second time, he or she is even more sensitive to its effects than the first time. Instead of building immunity to the substance through exposure (prophylaxis), the allergic subject’s immunity becomes greatly reduced.
In 1913 Richet received a Nobel Prize for his discovery and articulation of diseases of allergy.
Frederick Banting (above) and John MacLeod discovered insulin, the elusive substance that regulates blood-sugar levels in the body.
Frederick G. Banting, a young Canadian doctor, and Professor John J.R. MacLeod of the University of Toronto shared a Nobel Prize in 1923 for their isolation and clinical use of insulin against diabetes. Their work with insulin followed from the chance discovery of the link between the pancreas and blood-sugar levels by two other doctors on the other side of the Atlantic decades earlier.
In 1889, German physicians Joseph von Mering and Oscar Minkowski removed the pancreas from a healthy dog in order to study the role of the pancreas in digestion. Several days after the dog’s pancreas was removed, the doctors happened to notice a swarm of flies feeding on a puddle of the dog’s urine. On testing the urine to determine the cause of the flies’ attraction, the doctors realized that the dog was secreting sugar in its urine, a sign of diabetes. Because the dog had been healthy prior to the surgery, the doctors knew that they had created its diabetic condition by removing its pancreas and thus understood for the first time the relationship between the pancreas and diabetes.
With more tests, von Mering and Minkowski concluded that a healthy pancreas must secrete a substance that controls the metabolism of sugar in the body. Though many scientists tried in vain to isolate the particular substance released by the pancreas after the Germans’ accidental discovery, it was Banting and MacLeod who established that the mysterious substance was insulin and began to put it to use as the first truly valuable means of controlling diabetes.
Dr. George N. Papanicolaou, who devised the “Pap” smear test for cancer, examines a slide in his laboratory in 1958.
Dr. George Nicholas Papanicolaou’s chance observation, while doing a genetic study, of cancer cells on a slide containing a specimen from a woman’s uterus spawned the routine use of the so-called “Pap smear,” a simple test that has saved millions of women from the ravages of uterine cancer.
In 1923, Papanicolaou undertook a study of vaginal fluid in women, in hopes of observing cellular changes over the course of a menstrual cycle. In female guinea pigs, Papanicolaou had already noticed cell transformation and wanted to corroborate the phenomenon in human females. It happened that one of Papanicolaou’s human subjects was suffering from uterine cancer.
Upon examination of a slide made from a smear of the patient’s vaginal fluid, Papanicolaou was astonished to discover that abnormal cancer cells could be plainly observed under a microscope. “The first observation of cancer cells in the smear of the uterine cervix,” he later wrote, “gave me one of the greatest thrills I ever experienced during my scientific career.” Papanicolaou quickly realized that doctors could administer a simple test to gather a sample of vaginal fluid and test it for early signs of uterine and other cancers.
Penicillium mold (enlarged here many times) is a fungus that differs little from one that appears on bread in warm, humid weather.
The identification of Penicillium mold by Dr. Alexander Fleming in 1928 is one of the best-known stories of medical discovery, not only because of its accidental nature, but also because penicillin has remained one of the most important and useful drugs in our arsenal, and its discovery triggered invaluable research into a range of other antibiotic drugs.
While researching the flu in the summer of 1928, Dr. Fleming noticed that some mold had contaminated a bacterial culture in one of his petri dishes. Instead of throwing out the ruined dish, he decided to examine the moldy sample more closely.
Fleming had reaped the benefits of taking time to scrutinize contaminated samples before. In 1922, Fleming had accidentally shed one of his own tears into a bacteria sample and noticed that the spot where the tear had fallen was free of the bacteria that grew all around it. This discovery piqued his curiosity. After conducting some tests, he concluded that tears contain an antibiotic-like enzyme that could stave off minor bacterial growth.
Six years later, the mold Fleming observed in his petri dish reminded him of this first experience with a contaminated sample. The area surrounding the mold growing in the dish was clear, which told Fleming that the mold was lethal to the potent staphylococcus bacteria in the dish. Later he noted, “But for the previous experience, I would have thrown the plate away, as many bacteriologists have done before.”
Instead, Fleming took the time to isolate the mold, eventually categorizing it as belonging to the genus Penicillium. After many tests, Fleming realized that he had discovered a non-toxic antibiotic substance capable of killing many of the bacteria that cause minor and severe infections in humans and other animals. His work, which has saved countless lives, won him a Nobel Prize in 1945.
Keep that mind open
For all you would-be Nobel Prize-winners, remember the one trait that tied all these lucky strikers together: openmindedness. As the American physicist Joseph Henry once noted, “The seeds of great discoveries are constantly floating around us, but they only take root in minds well prepared to receive them.”
Lexi Krock is editorial assistant of NOVA Online.
Observations Upon the Origin of the Malignant Bilious, or Yellow Fever in Philadelphia. From the holdings of Center for the History of Medicine/Francis A. Countway Library of Medicine—Harvard Medical School.
Yellow fever is known for bringing on a characteristic yellow tinge to the eyes and skin, and for the terrible “black vomit” caused by bleeding into the stomach. Known today to be spread by infected mosquitoes, yellow fever was long believed to be a miasmatic disease originating in rotting vegetable matter and other putrefying filth, and most believed the fever to be contagious.
The first major American yellow fever epidemic hit Philadelphia in July 1793 and peaked during the first weeks of October. Philadelphia, then the nation’s capital, was the most cosmopolitan city in the United States. Two thousand free blacks lived there, as well as many recent white French-speaking arrivals from the colony of Santo Domingo, who were fleeing from a slave rebellion. Major Revolutionary political figures lived there, and in the first week of September, Thomas Jefferson wrote to James Madison that everyone who could escape the city was doing so. The epidemic depopulated Philadelphia: 5,000 out of a population of 45,000 died, and chronicler Mathew Carey estimated that another 17,000 fled.
Benjamin Rush: Coffee and Blood
Benjamin Rush, a Philadelphia physician and signer of the Declaration of Independence, became highly regarded for his work during the 1793 epidemic. Rush thought that the outbreak had originated in a pile of rotting coffee beans left on the docks. He developed a very aggressive approach to treatment, copiously bleeding his patients and administering large quantities of mercury. These aggressive therapeutics became known, not always favorably, as “heroic medicine.”
As the population fled or died, few were left to attend to nursing and burying duties. Rush, who believed that blacks were immune to yellow fever, asked members of the African Society to come forward and care for the sick and the dead. Absalom Jones and Richard Allen, two free black men, volunteered. In a few weeks Jones, Allen, and others were bleeding hundreds of people a day under Rush’s direction, as well as nursing patients and carrying coffins.
About two months into the epidemic, however, Rush was proven wrong and blacks began to fall ill, dying from yellow fever at about the same rate as whites. Their efforts, though praised by Rush, were scorned by the white public as being profiteering and extortionist. In response, Jones and Allen published their own description of their experiences.
The Santo Domingan Influence
The Bush Hill Hospital, which housed the sick poor, was desperately understaffed. When Philadelphia’s mayor asked the public for help, a French-born merchant from Santo Domingo named Stephen Girard stepped up and recommended his compatriot, Dr. Jean Devèze, to head the hospital. Devèze refused to believe that yellow fever was contagious and he disapproved of Rush’s aggressive treatments. Devèze later became a world authority on yellow fever.
Selected Contagion Resources
This is a partial list of digitized materials available in Contagion: Historical Views of Diseases and Epidemics. For additional materials on the topic “The Yellow Fever Epidemic in Philadelphia, 1793,” search the collection’s Catalog and Full Text databases.
Poincaré, Émile Léon. Prophylaxie et géographie médicale: des principales maladies tributaires de l’hygiène. From the holdings of Center for the History of Medicine/Francis A. Countway Library of Medicine—Harvard Medical School.
Tuberculosis, also known as “consumption,” “phthisis,” or the “white plague,” was the cause of more deaths in industrialized countries than any other disease during the 19th and early 20th centuries. By the late 19th century, 70 to 90% of the urban populations of Europe and North America were infected with the TB bacillus, and about 80% of those individuals who developed active tuberculosis died of it.
Causes of Tuberculosis
For most of the 19th century, tuberculosis was thought to be a hereditary, constitutional disease rather than a contagious one. By the end of the 19th century, when infection rates in some cities were thought by public health officials to be nearly 100%, tuberculosis was also considered to be a sign of poverty or an inevitable outcome of the process of industrial civilization. About 40% of working-class deaths in cities were from tuberculosis.
Robert Koch’s identification of the tuberculosis bacillus in 1882 helped to convince members of the medical and public health communities that the disease was contagious. Preventing the spread of tuberculosis became the motivation for some of the first large-scale public health campaigns.
The sanatoria movement, which began around 1880, was an attempt to cure tuberculosis naturally and to prevent its spread by moving patients into quiet environments, isolated from normal life, where the air was pure and freely circulating. Major sanatoria included those in Davos, Switzerland, and Saranac Lake, New York.
At a sanatorium, rest in the open air was of paramount importance, and special houses, porches, and cabins were built to allow easy access to the outdoors. Because the sanatorium cure involved long periods of separation from home, work, and family, it was sometimes avoided until the disease was well advanced. Cost also was a deterrent.
The sanatorium model was adapted for use in urban environments, and dispensaries, free public clinics for the poor, also advised patients using the sanatorium model. However, following the dispensary’s advice was often difficult because of urban living conditions. If a patient had the means and the opportunity, moving to a different, more beneficial climate was another way to attempt a cure.
After Koch’s identification of the bacillus, hopes for a cure ran high. Despite enthusiastic publicity, Koch’s tuberculin treatment, announced in 1890, proved mostly useless—and sometimes harmful—though it did turn out to be helpful in diagnosing the disease.
Diagnosis: Percussion, the Stethoscope, and the X-ray
Until the development of antibiotics in the 1940s, major clinical developments surrounding tuberculosis involved diagnosis rather than cure.
Leopold Auenbrugger published Inventum novum ex percussione thoracis humani ut signo abstrusos interni pectoris morbos detegendi (“On percussion of the chest”) in 1761.
R. T. H. Laennec, associated with the French Clinical School, published De l’auscultation médiate (On Mediate Auscultation, or, Treatise on the Diagnosis of the Diseases of the Lungs and Heart) in 1819, describing his invention and use of the stethoscope.
In the early 1900s, the X-ray, discovered by Wilhelm Conrad Röntgen in 1895, became a widely used diagnostic tool.
Decline in Mortality
Mortality rates began declining in the late 19th century throughout Europe and the United States. Whether sanitary measures, sanatoria, improved nutrition, or larger epidemiological factors unrelated to human intervention were the cause of this decline remains undecided.
Selected Contagion Resources
This is a partial list of digitized materials available in Contagion: Historical Views of Diseases and Epidemics. For additional materials on the topic “Tuberculosis in Europe and North America, 1800–1922,” search the collection’s Catalog and Full Text databases.
The Hay–Bunau-Varilla Treaty of 1903 created the Panama Canal Zone and allowed the US government to begin building its 51-mile waterway through the Isthmus of Panama in May 1904. The transoceanic waterway opened in 1914, approximately four centuries after Charles I, King of Spain, conceived of a waterway across the Isthmus to facilitate Spain’s colonial interests in the New World.
In constructing the Panama Canal, American planners and builders faced challenges that went far beyond politics and engineering. The deadly endemic diseases of yellow fever and malaria were dangerous obstacles that had already defeated French efforts to construct a Panama Canal in the 1880s. The crippling effects of these diseases, which incapacitated many workers and caused at least 20,000 to die, led the French to abandon their goal in 1889.
For the later American effort, William Crawford Gorgas was appointed chief sanitary officer. His task was to prevent yellow fever and malaria infection among the laborers—a task that proved critical to American success.
The successful completion of the Panama Canal was a tribute to its organizers and specialists, among them Gorgas, whose highly effective sanitation measures eliminated the lethal or debilitating effects of yellow fever and malaria among workers.
Epidemiology of Yellow Fever and Malaria
By 1904, medical researchers had established that yellow fever and malaria were mosquito-borne diseases. In Cuba, the Walter Reed Commission confirmed Carlos Finlay’s hypothesis and proved that the mosquito of the genus Aedes transmits yellow fever. As for malaria, the cumulative investigations of Charles Louis Alphonse Laveran, Ronald Ross, and Giovanni Battista Grassi and his collaborators proved that the female Anopheles mosquito transmits the disease-causing parasite.
By applying the results of these epidemiological studies to his sanitation methods, Gorgas prevented yellow fever and contained malaria among workers. Building on the sanitation methods he had pioneered in Cuba’s disease-stricken capital, Havana, in 1898, Gorgas developed a formidable repertoire of measures against mosquitoes.
The Results of Gorgas’s Sanitation Measures
Gorgas’s success in Panama was as dramatic as it had been in Cuba: by 1906 he had eradicated yellow fever, and he contained malaria throughout the canal’s 10-year construction period. Gorgas’s sanitary workers drained, or covered with kerosene, all sources of standing water to prevent mosquitoes from laying their eggs and larvae from developing; fumigated areas infested with adult mosquitoes; isolated disease-stricken patients with screening and netting; and constructed quarantine facilities. In major urban centers, new domestic water systems provided running water to residents, thereby eliminating the need for collecting rain water in barrels, which had provided perfect breeding sites for mosquitoes carrying yellow fever.
The US government’s $20 million investment in the sanitation program also provided free medical care and burial services to thousands of employees. In addition, Gorgas’s sanitation department dispensed approximately one ton of prophylactic quinine each year at 21 dispensaries along the Panama Canal route and added hospital cars to trains that crossed the Isthmus. Each year, hospitals treated approximately 32,000 workers, and 6,000 were treated in sick camps.
Selected Contagion Resources
This is a partial list of digitized materials available in Contagion: Historical Views of Diseases and Epidemics. For additional materials on the topic “Tropical Diseases and the Construction of the Panama Canal, 1904–1914,” search the collection’s Catalog and Full Text databases.
Des Inoculations Syphilitiques. From the holdings of Center for the History of Medicine/Francis A. Countway Library of Medicine—Harvard Medical School.
Syphilis was first reported in Europe in 1494 among soldiers (and their camp followers) involved in a war between France and Naples. The disease was striking in two ways: for its unpleasantness and for its status as a new disease, unknown to the ancient medical authorities. Syphilis would remain a significant social and medical problem through the mid-20th century.
The “French Disease”
Until the 19th century, syphilis was known by many different names, but the most common was the “French Disease.” (The French called it the “Neapolitan disease,” in a pattern that would repeat itself elsewhere. Russians, for instance, sometimes called it the “Polish disease.”)
Syphilis is generally believed to have come originally from the New World, imported into Europe by Christopher Columbus’s sailors after their famous voyage of 1492. Two important early experiences with syphilis are recorded in Grunpeck’s ca. 1496 Tractatus de pestilentiali scorra sive male de Franzos (also available in the vernacular German) and Ulrich von Hutten’s ca. 1519 Of the vvood called guaiacum, that healeth the Frenche pockes. Fracastoro is credited with naming the disease in his 1530 poem, “Syphilis.”
The sexual nature of syphilis transmission and its contagiousness were noticed from the start. In Europe and the United States, the disease has long been connected with questions of morality, both individual and societal. Women were often assumed to be the source of infection, and, in the 19th century, the regulation of prostitution in order to control the spread of venereal disease became a priority in many European countries. England’s Contagious Diseases Acts are an example of this. Social hygiene—the attempt to regulate and control disease-causing behavior, especially that related to venereal disease, through moral self-discipline and legislation—was of great importance in the late 19th and early 20th century United States as well.
Before the Wassermann blood test—the first widely used serum diagnosis test for syphilis—was developed in 1906, diagnosing syphilis relied on the evaluation of visible symptoms like lesions, rashes, and chancres. Regulated prostitutes were sometimes examined every few days. Though it could produce false positives and though performing the test required great skill on the part of the laboratory technician, the Wassermann test affected both the social and the medical understanding of syphilis, because it could reveal the disease at the asymptomatic stage. This meant that a syphilitic might be a person with no current outward manifestation of disease who could have or spread syphilis without realizing it.
Treatments: Mercury, “Syphilization,” and Salvarsan
Effective treatment for syphilis was controversial because of the perception that a widely available cure would increase “immoral” behavior.
Until the early 20th century, the primary treatment for syphilis was mercury, in the form of calomel, ointments, steam baths, pills, and other concoctions. Side effects of mercury treatments could include tooth loss; mouth, throat, and skin ulcerations; neurological damage; and death.
Guaiacum, a New World tree, was the source of another early treatment for syphilis used in the 16th century. Numerous patent medicines were also developed, especially in the 19th century, often with euphemistic names and advertising.
In the mid-19th century, European physicians conducted experiments in “syphilization,” often on hospitalized prostitutes. “Syphilization” was the name given to repeated inoculations with syphilis matter in order to “saturate” the subject, on the theory that the larger the number of visible, or “primary,” lesions, the less likely it was that secondary syphilis would develop.
“Syphilization” was also used as a preventative, analogous to smallpox inoculation.
The syphilis spirochete organism, a bacterium, was discovered in 1905. In 1908, Sahachiro Hata, working in Paul Ehrlich’s laboratory, discovered the arsenic compound arsphenamine that became known after 1910 by its brand name, Salvarsan. It was also known as “606” because it was the 606th compound Hata and Ehrlich tested. Salvarsan was the first effective specific chemotherapy against syphilis, although it could involve an extended series of treatments and cause serious side effects.
Selected Contagion Resources
This is a partial list of digitized materials available in Contagion: Historical Views of Diseases and Epidemics. For additional materials on the topic “Syphilis, 1494–1923,” search the collection’s Catalog and Full Text databases.