4 Diseases that Don’t Exist Anymore

From smallpox to diphtheria, history has seen some gruesome diseases that made people’s lives miserable.

Apr 1, 2023 | By Molly Dowdeswell, MA Early Modern History, BA History

 

It is extremely rare to come across a disease which has been entirely eradicated. In fact, according to the World Health Organization (WHO), only two exist: smallpox and rinderpest. However, there are a great number of diseases that have been eradicated in the West, and these are often shrouded in legacies of horror and fear. While they occasionally appear in other parts of the world, for most in the Western world, they are a long-forgotten part of their ancestors’ lives. What was the nature of some of the most well-known of these? How did they influence people’s lives, and how were they controlled?

 

1. Smallpox

Edward Jenner vaccinating patients in the Smallpox and Inoculation Hospital at St Pancras: the patients develop features of cows by J. Gillray, 1802, via The Wellcome Collection

 

One of only two diseases ever to have been completely and officially eradicated, smallpox terrorized humanity for centuries. While its origins are unknown, there is evidence that it existed for at least 3,000 years.

 

Smallpox was caused by the variola virus, a member of the genus Orthopoxvirus, and spread through close contact with infected people and contaminated objects. Symptoms began with fever and vomiting before ulcers formed in the mouth and rashes appeared on the skin. The rash eventually turned into fluid-filled blisters, which would then scab over, fall off, and leave scars.

 

The earliest written record of the disease comes from 4th-century China. There is also evidence of it in 7th-century India and 10th-century Asia Minor. Smallpox was devastating: on average, three out of ten people who caught it died. Those who survived could be left with scars that carried stigma and were seen as an embarrassment.

 


In 1796, an English doctor named Edward Jenner noticed that milkmaids who had previously had cowpox did not catch smallpox. Drawing on his medical training, Jenner hypothesized that cowpox conferred protection against smallpox.

 

Jenner tested his theory by inoculating a young boy named James Phipps with material taken from a milkmaid’s cowpox sore. Over the course of the next few months, Jenner repeatedly exposed the eight-year-old boy to smallpox, but he never developed symptoms.

 

He conducted further experiments to be certain before publishing his treatise The Origin of the Vaccine Inoculation in 1801. Over time (and after further scientific developments), vaccination became widely accepted and saved countless people from this deadly disease.

 

In 1959, the WHO began its campaign to rid the world of smallpox for good. After the last outbreaks of the 1970s were contained, the disease was officially declared eradicated in 1980.

 

2. Diphtheria

An anxious mother with her little boy, representing the danger of diphtheria by unknown, 1934, via The Wellcome Collection

 

Dubbed by some “The Strangling Angel of Children,” diphtheria was shrouded in terror. The disease is a bacterial infection caused by Corynebacterium diphtheriae and is transmitted through close contact and airborne respiratory secretions.

 

The name “diphtheria” comes from the Greek word diphthera, meaning “leather” or “hide.” The choice, made in 1826, was inspired by the leathery coating that the disease causes in the throat.

 

A thick film develops in the victim’s throat, which makes it hard for them to breathe and can ultimately strangle them to death if left untreated. The toxin produced by the bacteria can also damage the heart. Left untreated, diphtheria’s fatality rate can reach 20% for children under five and adults over 40; for everyone else, it is between 5% and 10%.

 

The disease has a long history. The first clear records of diphtheria epidemics date from the early 17th century. It is thought that cases rose as cities grew and people moved from the countryside into cramped, crowded urban conditions.

 

The disease continued to spread and became a considerable problem in the 19th century. During the Industrial Revolution, it became a major cause of death for those living in the poor conditions of the industrial cities. While it was most commonly associated with poor children, it did not discriminate: people of all ages, classes, and genders suffered from it.

 

Gaston Ramon. Photograph by Richard Alfred O’Brien, c.1878-1970, via The Wellcome Collection

 

In the 1880s, the bacterium behind the deadly disease was identified, and an antitoxin was subsequently developed in Germany to counter it. By the 1890s, this antitoxin was available for doctors to use to treat their patients. Yet while it was within reach of those who could afford it, the very poorest in society still struggled to obtain it.

 

It wasn’t until the 1920s that Gaston Ramon, working at the Pasteur Institute in Paris, developed a toxoid that conferred immunity in the way a vaccine does. The toxoid was rolled out, and the disease seemed to die down. That was, until the breakup of the Soviet Union in 1991.

 

Very low vaccination rates in the former Soviet Union led to a vast increase in cases in the early 1990s, most of them among adults who had not been immunized. Since then, the disease has disappeared from the West again as vaccination coverage recovered. However, it still appears in parts of the world that lack public health infrastructure.

 

3. Leprosy

Christ healing a man of leprosy by Jost Amman, 1571, via The Wellcome Collection

 

Leprosy was particularly prevalent in medieval society. Because the illness caused visible marks on the skin, which made sufferers easy to recognize, it quickly became stigmatized.

 

Leprosy (or Hansen’s disease) is an infection caused by the bacterium Mycobacterium leprae. It affects the skin, nerves, eyes, and nose. It is thought to have first entered Britain in the 4th century, and it remained for centuries. In its most extreme forms, it could lead to the loss of fingers and toes, gangrene, blindness, the collapse of the nose, ulcers, lesions, and weakening of the bones.

 

The disease is commonly associated with the harsh treatment of its victims, who, it is often believed, were ostracized because of the illness. However, in some cases, the disease was also seen as a sign of God’s favor.

 

This was because the suffering the disease caused was likened to the suffering of Christ. It was thought that lepers were enduring purgatory on earth and, as a consequence, would go directly to heaven when they died. Because of this, they were believed to be closer to God than those who did not suffer from the disease.

 

Early hospitals were constructed to tend to people with leprosy. One of the earliest known in England was St. Mary Magdalen in Winchester, where burial excavations found the skeletons of people who had suffered from the disease between 960 and 1030 CE.

 

There were 320 religious houses and hospitals like this one in England that cared for lepers between the latter part of the 11th century and 1350. They could often be found on the edges of villages or towns by major travel routes or crossroads.

 

Christ cures a man of leprosy; an apostle holds a garment in front of him, via The Wellcome Collection

 

This was because it was beneficial for people with leprosy to remain near others, from whom they could beg alms. They also offered services such as praying for souls, which they performed because they were thought to be closer to God.

 

The stigma around the disease seems to have gained traction after the horrific outbreak of the Black Death between 1347 and 1350, which gave rise to fears of contagion unlike any seen before.

 

By this time, however, cases of leprosy were declining. The reason for this is unclear; one theory holds that people had developed greater immunity by this period. Many of the leprosy hospitals were transformed into general hospitals or almshouses.

 

While the disease has disappeared from the Western world, it continues to appear in other parts of the world where the medical infrastructure needed to tackle its devastating effects is not available.

 

4. Phossy Jaw (Phosphorus Necrosis)

Skull with jaw affected by phosphorus poisoning by J. Bartholomew, via The Wellcome Collection

 

The disease known as phossy jaw differs slightly from the others on this list in that it appeared as a direct result of human action: the working conditions of the Industrial Revolution.

 

In the 19th century, it was discovered that adding yellow phosphorus to the heads of matchsticks made them easier to light. The discovery quickly caught on, and these “strike-anywhere” matches came to dominate the market.

 

The industry took off and began to make factory owners extremely rich. However, this came at the expense of the factory workers, extremely poor men and women who worked long hours around the matches’ harmful ingredients.

 

Breathing in the toxic fumes of the yellow phosphorus could lead to phosphorus necrosis of the jaw. It caused horrible abscesses in sufferers’ mouths, eventually leading to facial disfigurement. The gums could even develop a greenish-white glow in the dark. In some cases, the abscesses became so severe that they caused fatal brain damage.

 

One of the best sources on this disease is an article written by James Rushmore Wood in 1857 entitled “Removal of the entire lower jaw.” The work contained the details of an operation Wood had conducted on a patient with the disease, along with illustrations of the result.

 

The case in the article is that of 16-year-old Cornelia, who had worked eight-hour days in one of New York’s match factories for two and a half years. Her symptoms began in May 1855 with what she assumed was a toothache, alongside some swelling on the right side of her lower jaw.

 

She had her gums lanced and a tooth taken out to cure the problem; however, the swelling only worsened until an opening formed in her jaw. Despite what would have been a very painful condition, she continued to work at the factory until the 17th of December 1855.

 

A government inspector visiting a factory, via The Wellcome Collection

 

On that day, she was taken to Bellevue Hospital, where it was recorded that she found chewing painful, her jaw ached, her face was swollen, and the bone of her lower jaw was damaged. Wood examined her and decided it was necessary to operate.

 

Wood removed the bone from the right side of her jaw on the 19th of January 1856 without anesthetic. The surgical instruments used included a chain saw, which eventually broke and was replaced with forceps.

 

However, the operation wasn’t entirely successful, and she required a second operation in February to remove the rest of her lower jaw. This time she was sedated with twenty drops of laudanum.

 

It was said that a few days later her condition had improved and her face appeared almost normal again. She recovered well, and Wood was happy with the outcome.

 

The disease was eradicated through the introduction of more stringent health and safety procedures in factories, as well as the 1906 international Berne Convention, which outlawed the use of yellow phosphorus in match production.

 

History has seen some truly horrifying diseases which were impossible to control with primitive tools and facilities. Nevertheless, doctors attempted to cure many of their patients, even if these attempts could be extremely painful or, in some cases, make the disease worse.


By Molly Dowdeswell, MA Early Modern History, BA History

Molly graduated from the University of Birmingham with a master’s degree in early modern history and from Swansea University with a bachelor’s degree in history. She has a long-standing interest in the subject, enjoys researching and writing on a broad range of historical topics, and is most interested in the history of medicine and disease. Molly is currently working as a writer based in Birmingham, England, and plans to return to university to complete a PhD in history.