Sept. 10, 2004 -- Score one for the medical experts of the distant past. The old practice of bloodletting may have worked, and new research may show us why.
Before antibiotics were developed, bloodletting was used to treat serious illnesses.
In fact, America's first president, George Washington, is said to have had 80 ounces of his blood drained from his body in a last-ditch effort to save him in his last hours of life.
He had fallen ill after being caught in sleet and snow while riding around his farm a few days earlier, according to biographer Jack Warren Jr.
It didn't work. Washington died on Dec. 14, 1799.
Some experts blame the bloodletting; others say infection was the problem.
Bloodletting was going out of style by then, but the fact that such an important person was given that treatment indicates it was once a state-of-the-art technique.
"As recently as 1942, Sir William Osler's highly regarded medical textbook advocated bloodletting as a treatment for acute pneumonia," writes Tracey Rouault, MD, of the National Institutes of Health in Science.
Shedding New Light
Those bygone doctors probably didn't know why bloodletting sometimes worked, but new research presents a possible reason.
Scientists, including Eric Skaar of the University of Chicago, recently studied a type of bacteria called Staphylococcus aureus, or simply "staph." This bacterium, which can be carried on the skin or in the nostrils of healthy people, is also responsible for skin infections such as boils or pimples. It can also cause serious infections of the blood, bones, and lungs (pneumonia). Recently these bacteria, like many others, have become more and more resistant to antibiotic therapy.
Staph thrives on iron, and it scavenges most of the iron it needs to grow during infection from the animals it infects.
Specifically, it prefers a kind of iron found in heme, the molecule in red blood cells that helps carry oxygen. It's as if the bacterium scans its host's menu of iron compounds, hoping to find heme.
"Heme iron is the preferred iron source during the initiation of infection," write Skaar and colleagues in the Sept. 10 issue of Science.
If no heme is available, the bacterium may fail to thrive.
The researchers identified a gene cluster within the bacterium that promotes heme transfer into the cell, to the bacterium's advantage.
But when those genes mutate, it's harder for the bacteria to launch a successful infection, according to the researchers' studies of mice and worms.
What does all this have to do with bloodletting?
Skaar's team didn't address bloodletting.
But the idea boils down to this: The less blood that's available, the harder it is for the bacterium to scrounge up enough heme to thrive.
"Bloodletting in the preantibiotic era may have been an effective mechanism for starving bacterial pathogens of iron and slowing bacterial growth," writes Rouault.
These days, we have different ways to handle infections.
Though bloodletting is out of vogue -- and none of the researchers is suggesting its revival -- the reasons why it sometimes worked may be clearer.
The researchers say that inhibiting the bacteria's ability to obtain iron is a promising area of research that may create novel options for therapy against infection.