BETTER RED THAN DEAD
“One night with Venus: a lifetime with Mercury” (19th-century maxim on the perils of syphilis)
Thanksgiving Day, November 1936. As was her custom, Mrs. Eleanor Roosevelt, wife of Franklin Delano Roosevelt, the 32nd president of the United States of America, had conjured up a sumptuous feast at the family home in Hyde Park, New York. The traditional Thanksgiving turkey was surrounded by all sorts of delicious holiday treats. The table looked glorious. However, just that morning a serious problem had arisen. A phone call from Boston had informed Mrs. Roosevelt that her son, Franklin D Roosevelt (FDR) Jr, had been taken to the hospital. The problem had apparently started with a sinus infection, but then things went rapidly downhill. The streptococcal bacterium with which FDR Jr was infected began to spread. He developed an abscess in his cheek, which then moved to his throat, producing an extremely painful case of “strep throat”. FDR Jr’s temperature rose to very high levels, and he began to cough up blood. He was admitted to Massachusetts General Hospital, and his doctors became seriously worried. If the infection entered his blood, producing a general systemic infection known as sepsis, it was quite possible he would die. The situation was critical. With FDR Jr’s fiancée Ethel du Pont and his mother continuously at his bedside, the Harvard doctors pondered their next move. There really wasn’t a standard treatment for this kind of condition. Nevertheless, Dr. George Loring Tobey Jr, the otolaryngologist involved in the case, had an idea. He would try a new experimental drug which he had previously tested on some of his most serious cases. The drug was called Prontosil Rubrum, or Prontosil Red. True to its name, it was a bright red dye that had been developed by a scientist at the IG Farben company in Germany as part of their program for developing drugs, derived from dyes, that could fight microbial infections.
Dr. Tobey explained his idea to Mrs. Roosevelt, who called her contacts at the Johns Hopkins medical school for a second opinion. They told her that the drug was indeed highly experimental but that it didn’t seem to have serious side effects. Mrs. Roosevelt gave her permission for the trial to proceed. Over the next 24 hours, Dr. Tobey gave FDR Jr several injections of the bright scarlet dyestuff. The results seemed quite miraculous. After three weeks in the hospital, during which he had sunk deeper and deeper into crisis, his condition completely turned around. His temperature started to come down, his strep throat improved, and the abscess in his cheek started to shrink. He clearly began to feel better, in spite of the fact that his skin now appeared a bright tomato color, a small price to pay under the circumstances. In a few days he was released from the hospital and, shortly after that, married his fiancée. FDR Jr’s skin gradually returned to its original healthy hue, and they lived happily ever after.
The newspapers were all over the story. “Young Roosevelt Saved by New Drug” trumpeted the New York Times. Time magazine declared Prontosil Red “The Medical Discovery of The Decade”. The author of an article in Collier’s magazine put it this way: “A scientist in Germany, tinkering with molecules, produced a red dye. Immediately patients scheduled to “dye” from streptococcus infection began to get well. Here is the dramatic story of a modern miracle.” And why not? In the late 19th century, the work of Louis Pasteur and Robert Koch had established the “germ theory” of infectious disease, leaving humanity with a better understanding of the underlying causes of these ailments but with no method of curing most of them. That was until now.
But, of course, this was neither the beginning nor the end of the story. The German scientist who had invented Prontosil Red was Gerhard Domagk, who worked in a program descended from one of the Bayer drug company’s oldest lines of research: the search for dye-related molecules with beneficial therapeutic effects, a project originally pioneered by the great Nobel Prize-winning scientist Paul Ehrlich. Ehrlich had conceived the idea of using dyestuffs such as methylene blue as “magic bullets” for treating infectious diseases and cancer. His approach, which became known as chemotherapy, had proven to be very promising indeed and was being actively followed up by drug companies all over the world. As we have discussed in the previous chapter, by the 1930s the major German chemical and pharmaceutical companies, including Bayer, Agfa, BASF, Hoechst, and Cassella, had consolidated themselves into a giant Nazi-controlled cartel named IG Farben (Interessengemeinschaft Farbenindustrie: the combined interests of the dye-making companies), thereby becoming one of the five largest companies in the world. By tweaking the chemistry of dyestuffs, Bayer chemists, now working for IG Farben, had already synthesized mepacrine in 1931, a drug that became one of the first widely used alternatives to quinine for the treatment of malaria (Chapter X). Domagk and his colleagues were charged with finding similar dye-related substances that were active against streptococcal infections, and he devised an assay system for testing newly synthesized molecules in infected mice. Having not gotten very far by modifying traditional aniline-based dyes, the chemists at IG Farben switched track and began making a series of substances based on azo dyes. In late 1932 they started to produce azo dyes containing the sulfonamide chemical nucleus and passed these on to Domagk for testing. These proved much more promising.
One drug in particular produced dramatic effects in mice infected with deadly streptococci. Fourteen mice given the drug all survived, whereas a control group that did not receive it all died. Indeed, this was the drug that would eventually become known as Prontosil Red.
The scientists at IG Farben soon confirmed that Prontosil Red was the real deal. Indeed, one incident confirmed this in the most dramatic fashion imaginable. In 1935, Gerhard Domagk’s young daughter Hildegard stabbed herself with a dirty needle, which led to a streptococcal infection. As we have seen in the case of FDR Jr, this was something to be taken very seriously indeed. Hildegard’s temperature rose to 104 degrees as the infection spread to her blood and the outcome looked bleak. Domagk arranged for his daughter to be treated with Prontosil Red and within a week she was completely cured. Further studies that he carried out convinced Domagk that Prontosil Red was active against other infections as well, including those that caused bacterial meningitis and gonorrhea. It appeared that the IG Farben scientists had made a breakthrough of historic importance that would not only help humanity treat a series of previously untreatable diseases but would also bring the company a windfall of profits. The first of these possibilities became reality but, alas for IG Farben, not the second.
In fact, right from the start there had been something curious about the way in which Prontosil Red worked. It was extremely effective in animals, including, as we have seen, humans; but it was completely inactive when added to bacteria in a culture dish or test tube. A group of scientists in France carefully read the IG Farben patent literature and came up with a way to produce their own version of Prontosil Red, which they named Rubiazol, and used this as a starting point for making their own derivatives for testing. Many of these drugs proved to work very well. As a control experiment, they then tested the sulfonamide moiety that was the common structural feature of the effective dye molecules. This was a substance called sulfanilamide. It turned out that sulfanilamide, given just by itself, was at least as effective as substances like Prontosil/Rubiazol. This made everything clear. As we have seen with heroin in the previous chapter, drugs like Prontosil Red were acting as “prodrugs”. They had no real antibacterial effects of their own, but once inside the body an enzyme would cleave them in half, releasing sulfanilamide, which was responsible for the beneficial effects observed. Unfortunately, sulfanilamide had already been synthesized and patented in 1906 for other purposes, meaning that IG Farben, having discovered its profound antibacterial effects, was left with nothing to show for it. Nevertheless, sulfanilamide was the first of a group of similar substances with genuine antibacterial properties. “Sulfa” drugs were the first generally useful “antibiotics”, which could be employed to fight many previously untreatable infections. They became very widely used for treating the wounds of soldiers during the Second World War.
Establishing the germ theory of disease.
It is hard to believe that genuinely effective antibiotics were unavailable prior to the 1930s. Bacteria surpass us massively in numbers, in total mass, and in time of existence on planet Earth. Before the development of sulfa drugs, mankind’s battles with infectious diseases almost inevitably ended in defeat, and some of these defeats were disasters of Biblical proportions. Historical records clearly attest that contagious diseases, including successive waves of the Black Death, tuberculosis, leprosy, and syphilis, killed hundreds of millions of people who, until the 20th century, were completely defenseless against such infections. There had long been some basic understanding of the ideas of contagion and even of immunity, but the underlying mechanisms that explained these processes were unknown. For example, producing immunity in people by deliberately exposing them to diseases was practiced in many Asian countries centuries before it was rediscovered in the West by Lady Mary Wortley Montagu, wife of the British ambassador to the Ottoman Empire, in the eighteenth century. This ultimately led to the concept of vaccination. But exactly why it worked was not at all clear. Similarly, the idea that some diseases were contagious was understood at a very basic level. During the siege of Caffa on the Black Sea coast of Crimea in 1346, the besieging Tartar army was devastated by the Black Death but had the idea of catapulting the corpses of their dead into the city so that the besieged Genoese might be similarly afflicted. Their ploy worked only too well, and the Genoese fled, taking the Black Death with them back to Europe. Likewise, it was understood in ancient times that lepers might pass their disease on to others, and so there were stringent laws isolating them from the rest of the population.
Although such phenomena contributed to the common notion of an infectious disease, there was no scientific understanding of the mechanism responsible for the spread of contagion. The first person to come up with a theory close to the truth was the Renaissance polymath Girolamo Fracastoro, who originated the name “syphilis”. In his book De Contagione et Contagiosis Morbis (1546), Fracastoro suggested that contagious diseases were spread through the agency of microscopic self-replicating particles (seminaria), different particles causing different diseases. Fracastoro suggested that this could happen in three basic ways: by mutual contact, through the air, or by touching some object (fomite) contaminated with the contagion in question. He wrote, “I call fomites such things as clothes, linen, etc., which although not themselves corrupt, can nevertheless foster the essential seeds of the contagion and thus cause infection.” If this sounds suspiciously like germ theory, that’s because it really is germ theory. These ideas were quite highly regarded at the time but, in the absence of any real evidence for the existence of “seminaria”, eventually fell into disuse. It wasn’t until the invention of the microscope in the early seventeenth century that the existence of microbes became a reality and, even then, there was no reason to connect them with Fracastoro’s ideas.
In the 19th century, several important observations began to provide a clear intellectual context for the arrival of the germ theory of disease. For example, puerperal or “childbed” fever was a common illness associated with childbirth in hospital maternity wards. We now know that it is caused by a bacterial infection of the genital tract. Between 1835 and 1844, in London’s General Lying-In Hospital, 63 women died of puerperal fever for every 1000 deliveries. At the time, the major treatment for the disease was still based on Greek and Roman humoral theories of medicine: women with high fevers were bled to rebalance their humors, but with little effect. Ignaz Semmelweis was a Hungarian doctor who worked at the lying-in hospital in Vienna and became interested in the mechanism of transmission of the disease. He carefully recorded the numbers of women who suffered from puerperal fever, the nature of their treatment, which doctors, nurses, and students provided that treatment, and so on. One thing he noticed was that the women who developed fevers had often been physically examined by medical students who had come straight from performing dissections in the morgue. In 1847, he developed the hypothesis that the fever was caused by something being transferred to the women’s wombs on the hands of the medical students, who had picked it up while dissecting cadavers. The solution seemed obvious and incredibly simple: after their dissections, and prior to examining women, the students should wash their hands with chloride of lime. The results of initiating this procedure were clear. The occurrence of puerperal fever fell precipitously. Unfortunately, Semmelweis’ ideas didn’t gain much traction with the Viennese medical community at the time and, ironically, he died years later in a mental hospital of sepsis resulting from an infected finger.
In retrospect however, Semmelweis’ results on the mechanism through which a contagious disease can be spread from one individual to another were clearly of central importance in the history of medicine.
A similarly important set of observations was made in 1865 by an English surgeon named Joseph Lister, who was working at a hospital in Glasgow. Lister was interested in why so many patients with compound fractures, in which the skin is pierced by the broken bone, became ill and died after surgical treatment. Indeed, more than half of them suffered this fate. The basic observation was that the wound usually became infected and septic, requiring amputation, which would then often lead to systemic infection and death. Lister imagined that the production of pus in a wound might be related to the process of putrefaction in rotting meat, both reflecting the effects of “germs” falling into the wound from the air. If this was so, then by killing these airborne germs one might be able to improve the outcome of surgeries. Here again it was the science of organic chemistry, which had been of such great importance to the development of dyes and early pharmaceuticals, that proved critical (Chapter 1). In 1834 the German chemist Friedlieb Runge (who also discovered caffeine) isolated phenol, or “carbolic acid”, from coal tar. Phenol, of course, has a famously strong smell. The professor of chemistry at the University of Glasgow explained to Lister that workers at the Carlisle sewage plant had used carbolic acid to remove the terrible smells caused by the germs that thrived there. As a bonus, the cows that lived near the sewage works stopped dying in abnormally large numbers. Lister hypothesized that applying carbolic acid to wounds might likewise prevent infection and enable successful surgery. This was an inspired idea, and the fatalities from his own surgeries declined from around half of his patients to around fifteen percent. Surgeons across Europe began to apply Lister’s ideas for “disinfecting” surgeries with similar success. A huge problem in medicine had been solved.
A third influential set of observations made around this time came from Dr. John Snow’s investigations into the spread of cholera in England. Snow was a remarkable doctor in all respects, and he was one of Queen Victoria’s most trusted physicians. In the early nineteenth century, the major theory concerning the spread of diseases held that they were due to the effects of bad smells or “miasmas”. This idea is encapsulated in the case of malaria, which derives its name from the Italian for “bad air” (mal’aria). Cholera arrived in Europe in the 19th century, probably from Asia, and it turned up in England in 1831, when an outbreak killed over 20,000 people. The idea that cholera came from the effects of foul odors from sewage and other sources had encouraged the government to overhaul the sewer systems in many large urban areas, something that was certainly a good idea in the rapidly growing cities of the Industrial Revolution. The situation in a crowded city like London was particularly awful, with sewage freely dumped into the River Thames. The Thames was London’s major water supply, with a variety of companies, using different standards, supplying drinking water to separate districts through a series of pumps. Snow didn’t believe in the miasma theory. He had investigated several cholera epidemics throughout the country and was struck by the fact that people on one side of a street would often all come down with cholera, whereas the residents on the other side were completely spared. Surely, they all breathed the same air. He had another idea: that the disease was spread through something in human excrement that got onto soiled clothing or into the water supply. He assembled a great deal of evidence for his theory, but his culminating investigation centered on the incident of the Broad Street pump.
In London in 1854, 500 people died of cholera in the area around Broad Street in Soho in the first 10 days of September. Snow made a map of where the victims lived. At the center of the map was the water pump in Broad Street from which the residents were drawing contaminated water. When the pump handle was removed and access to the water cut off, the epidemic ceased. Sometime later it was ascertained that the original victim was a baby whose diapers had been washed and drained into the local water supply. Today, Snow’s work is considered a classic example of the science of epidemiology.
The work of Semmelweis, Lister, Snow, and numerous others clearly pointed to a theory of infectious disease that was incompatible with the miasma theory but fitted a Fracastoro-like theory implicating living microorganisms. What was now needed was evidence that individual diseases could be caused by particular organisms, and this evidence was soon forthcoming. The first incontrovertible connection was provided by Robert Koch in 1876. Investigating anthrax, a disease of great importance in cattle farming, Koch observed that he could transmit the disease from sick cattle to mice. Moreover, he discovered that the blood of cattle suffering from anthrax contained a rod-shaped bacterium and that this alone was sufficient to transfer the disease to mice, or from one mouse to another. Clearly, therefore, the microorganism, which became known as Bacillus anthracis, was responsible for causing the disease. Over the next 25 years, the organisms responsible for a host of other infectious diseases were identified and isolated, including the causes of leprosy, malaria, tuberculosis, diphtheria, typhoid, tetanus, plague, and syphilis. Moreover, the entire science of microbiology, with its methods for growing and manipulating these organisms in culture, was developed, providing mankind with a much clearer picture of its tiny enemies. But, of course, this raised the question of how to deal with them on a practical level. One approach had been the development of vaccination. However, for the vast numbers of people exposed to contagious diseases, effective drugs were also needed that could rapidly treat large numbers of sick individuals. As we have seen, it wasn’t until the development of sulfa drugs in the 1930s that this was really achieved. However, these weren’t actually the first specific drugs for combating an infectious disease. For that honor, we need to talk about syphilis.
The conquest of syphilis.
Syphilis arrived in Europe at the very end of the 15th century, in the area in and around Naples. Where did it come from? There are many theories. Perhaps it had always been there but suddenly mutated to a more virulent form? Perhaps it came from Asia? Perhaps it came from the Americas, courtesy of Columbus’ crew returning to Europe? To this day nobody is certain. Most often, when the disease arrived in a particular country its citizens blamed its origins on their natural enemies. For the citizens of England, Italy, and Germany, syphilis was always known as the “French disease”. The Russians called it the “Polish disease” and the Poles called it the “German disease”. For people in Scandinavia, North Africa, and Portugal, it was the “Spanish disease”. Similarly, in India, the Muslims and Hindus blamed each other. Right from the start, however, contracting syphilis was associated with some degree of social disgrace. During the period between the sixteenth and twentieth centuries, syphilis had a status similar to that of AIDS in the late twentieth century: a sexually transmitted plague with all the social implications that go with it. When syphilis first appeared in Europe, the symptoms were particularly harsh. Woodcuts and paintings of the time, including one as early as 1496 by Albrecht Dürer, illustrated the problems. The disease began with genital ulcers, then progressed to a fever, a general rash, and joint and muscle pains. Then, weeks or months later, came large, painful, and foul-smelling abscesses, sores, and pockmarks all over the body. Muscles and bones became sore, especially at night. The sores became ulcers that could eat into bones and destroy the nose, lips, and eyes of the victim. They often extended into the mouth and throat, and sometimes early death occurred. Victims often resorted to gloves or masks to hide the ravages of the disease.
As one description of the time put it: “The contagion which gives rise to it comes particularly from coitus: that is, sexual commerce of a healthy man with a sick woman or to the contrary. … The first symptoms of this malady appear almost invariably upon the genital organs, that is, upon the penis or the vulva. They consist of small, ulcerated pimples of a color especially brownish and livid, sometimes black, sometimes slightly pale. These pimples are circumscribed by a ridge of callous like hardness. … Then there appear a series of new ulcerations on the genitalia … Then the skin becomes covered with scabby pimples or with elevated papules resembling warts. … A month and a half, about, after the appearance of the first symptoms, the patients are afflicted with pains sufficiently to draw from them cries of anguish. … Still very much later (a year or even longer after the above complication) there appear certain tumours of scirrhus hardness, which provoke terrible suffering.”
By the seventeenth century the disease had changed its character somewhat, progressing in several distinct phases. The first began with genital sores called chancres. We know, for example, that the composer Robert Schumann, who eventually died of the disease, originally had this symptom, as recorded by his doctor. After the chancre had healed, victims would usually develop a rash on the genitals and perhaps elsewhere, often accompanied by fevers, aches, and bone pains. In some sufferers, that was that. But in others the disease had just stealthily taken up residence, hiding out for what could be several decades before suddenly reappearing. This last phase consisted of abscesses, ulcers, and large swellings known as “gummas”, often ending in severe disability and death. These late phases of the disease, commonly known as tertiary syphilis, could also manifest as severe neurological symptoms affecting the brain (when they became known as “General Paralysis of the Insane”) or the spinal cord (a condition known as tabes dorsalis). Syphilis was viewed by ordinary people as a sign of sin, for which victims were shunned and punished.
Originally there was no effective treatment for the disease beyond various folk remedies such as bleeding, purging, and other ancient treatments. In 1530, however, the great alchemist Paracelsus came up with a new approach. Paracelsus lived in and around the regions of central Europe where mining was becoming an important enterprise and new elements, including several metals, were being discovered. Paracelsus considered the possibility that, rather than relying on ancient methods such as bleeding, diseases might be treated directly with metals, and he suggested the use of mercury for treating syphilis. This became the standard treatment over the next several hundred years. Mercury or mercury salts could be rubbed on the skin, ingested, or inhaled as a vapor. In fact, these procedures were somewhat effective in treating syphilis but, of course, mercury itself is horribly toxic, and the side effects of the treatment could be as bad as, or even worse than, the disease itself.
It wasn’t until the early 20th century that a further breakthrough emerged. By this point, other metals such as bismuth, and even arsenic, were being used to treat syphilis. Obviously, arsenic, the culprit in so many murder mysteries, is famously poisonous, and efforts had been made to temper its side effects by using the burgeoning science of organic chemistry to prepare organic derivatives of the element. An organic arsenic derivative called Atoxyl had been produced as early as 1863 and had found some use in the treatment of trypanosome-induced sleeping sickness. This eventually attracted the attention of the great Paul Ehrlich who, as we have seen, was one of the key figures in the history of pharmacology. As it was thought at the time that, like sleeping sickness, syphilis was caused by a trypanosome, Ehrlich teamed up with his chemical colleagues to make different organic derivatives of arsenic with a view to targeting the disease. In 1905, Fritz Richard Schaudinn, a German zoologist, and Erich Hoffmann, a dermatologist, discovered Spirochaeta pallida (a spiral-shaped bacterium, pale under dark-field illumination, now called Treponema pallidum) to be the causative organism of syphilis. Ehrlich therefore developed a new model for testing his novel arsenic-containing compounds, involving the infection of rabbits with T. pallidum. In 1909, following the systematic testing of numerous new molecules provided by his colleagues, Ehrlich eventually hit upon “compound number 606”, which he subsequently named arsphenamine and which effectively cured syphilis in rabbits with a single dose. The drug was effective against various strains of spirochetes, findings that led to clinical trials and the eventual marketing of the drug for the treatment of syphilis.
There was no FDA, or any German equivalent, in those days, and the new drug, now named Salvarsan, rapidly made it into the clinic and was being used to treat humans with great success by 1910. Salvarsan was certainly an enormous advance on the use of mercury or arsenic. Manufactured by the German chemical company Hoechst, Salvarsan quickly became the most widely prescribed drug in the world. It was really the world’s first blockbuster drug and remained the most effective treatment for syphilis until the 1940s. Not only was Salvarsan an important drug, it also provided clear support for Ehrlich’s entire approach to pharmacology: testing a series of novel molecules in search of the ultimate “magic bullet.” Indeed, much of modern pharmacology dates from Ehrlich’s work with Salvarsan, which provided a paradigm for drug discovery that has been used ever since. However, although Salvarsan and its subsequent derivatives were successful in treating many cases of syphilis, they were anything but ideal from a modern perspective. Being derived from arsenic, their use was still associated with numerous side effects. Moreover, the drug was unstable and therefore cumbersome to administer. Nevertheless, by the 1940s, the use of drugs like Salvarsan and sulfanilamide had revolutionized the treatment of infectious diseases. The enemy had been clearly identified, and weapons were now being developed so that humanity could begin to fight back in a war which had previously been completely one-sided. But perhaps Salvarsan and sulfanilamide might be improved upon further? In fact, they already had been.
The first natural antibiotics.
Even prior to the development of bacteriology in the latter part of the 19th century, there had been suggestions that Nature might provide mankind with substances that could counter the effects of infectious pathogens. For example, as early as the 17th century, the healing effects of molds were described by John Parkinson, an English apothecary, who suggested in his book Theatrum Botanicum that it might be possible to use the extracts of certain molds for the treatment of infections. This was something that would ultimately prove to have considerable medical importance but wasn’t taken up by doctors at the time. Some 200 years later, however, a German surgeon named Theodor Billroth reported that, when he grew cultures containing the mold Penicillium, bacteria would not grow on them. He suggested that perhaps the presence of the mold made the growth conditions unsuitable for bacteria. Around the same time, in 1871, Sir John Burdon-Sanderson reported that molds of the Penicillium group could inhibit the growth of bacteria in meat broth. Shortly after this, in 1875, the physicist John Tyndall was performing studies on the growth of airborne bacteria that contaminated growth medium in test tubes. He too observed that in test tubes where Penicillium mold grew, no bacterial growth could be observed. He interpreted this by suggesting that there was a “battle” between the mold and the bacteria, and that the mold was always victorious. As we have discussed, it was Joseph Lister who pioneered the use of antiseptics in surgery and in medicine generally. However, he didn’t only experiment with carbolic acid. He also examined the growth of bacteria in urine and observed that the presence of Penicillium inhibited their growth. Lister was quick to note the implications of his observations and had the idea that extracts of the mold might be useful for inhibiting the growth of bacteria in wounds.
He apparently tried this on a nurse whose infected wound had proven resistant to every other treatment. It is said that she was completely cured but, unfortunately, Lister never disseminated his results in the scientific literature. In the 1890s, Vincenzo Tiberio in Naples and Ernest Duchesne in France both demonstrated that extracts of Penicillium injected into animals could cure bacterial infections. Here again, these results were not widely published and so went unappreciated by research scientists in English-speaking countries even several decades later.
The eventual “discovery” of penicillin by Alexander Fleming in 1928 is one of those pieces of scientific mythology that has been told over and over again to scientists and nonscientists alike. Not only does it concern one of the discoveries that everybody agrees is among the most important in the history of science, but as drama it has all the essential elements: the original discovery by the young Alexander Fleming, which combined both luck and scientific acumen, and the fact that the discovery then languished in the scientific literature for a decade before it was rediscovered and developed by a team of brilliant but highly egotistical scientists who, racing against time, succeeded in purifying penicillin so that it could be used to treat soldiers in the Second World War. Really, you can’t make this kind of thing up. The “truth” of exactly how these events unfolded has become shrouded in the mists of time, and even today the precise details of what happened are still debated.
Nevertheless, the outlines of the story are not in doubt. Born in 1881 in Scotland, Alexander Fleming had studied to become a doctor and had served in that capacity during the First World War. He had therefore had ample opportunity to observe at close quarters the terrible effects of the bacterial infection of wounds. In 1928 he was working in the Inoculation Laboratory headed by Dr Almroth Wright at St Mary’s Hospital in London, where he was investigating new agents with potential antimicrobial activity. In fact, he had already made one interesting observation, that the enzyme lysozyme could produce bactericidal effects; but it had been difficult to see how this might be put to practical use. On returning from a vacation Fleming was faced with clearing up his bench, which was littered with the remains of experiments he had carried out prior to going away. Among the detritus were plates of media growing a pathogenic strain of Staphylococcus bacteria. On one of these plates a blob of Penicillium mold had begun to grow (the mold was identified as Penicillium notatum, although it has since been reclassified as P. rubens). It had apparently gained access to the laboratory through an open window. The area of the plate where the mold grew was bereft of bacteria. Because he had been trained to look for precisely this kind of effect, this piqued Fleming’s interest. Could the mold be producing something that killed the bacteria? He quickly established that Penicillium extracts were active against many, but not all, bacteria and that they were basically nontoxic and had no effects on the body’s immune system. 
He published his results in 1929. However, after that he did little to develop his findings further. It was obvious to him that “penicillin”, as he now called the unknown material produced by the mold, might be of use as an antimicrobial agent for therapeutic purposes, but he didn’t seem to have been interested enough to work on this possibility at the expense of his other projects, such as the production of vaccines. At any rate, apart from the work of a couple of his students, the topic of penicillin languished for the next decade.
And it might have done so even longer, were it not for events that occurred at the University of Oxford in 1939. It was here, in the department of pathology, that a team of brilliant individuals had come together who would ultimately make one of the greatest contributions in the history of medicine. Like most university laboratories, it brought together people from all over the world. The titular head of the group was Howard Florey from Adelaide in Australia. A brilliant polymath, he had become a highly respected pathologist. Florey was not a man to hide his light under a bushel. He was highly ambitious, sociable, and interactive, with a fantastic gift for leadership and organization. His perfect foil was the equally brilliant Ernst Chain, a Jewish refugee from Berlin. Apart from his great gifts as a chemist, Chain was multitalented, at one point in his youth having had to decide between careers as a pianist and as a scientist. Chain also had a photographic memory and, in spite of his somewhat sarcastic demeanor, was highly respected for his intellectual gifts. Another key member of the team was the Englishman Norman Heatley, a truly innovative experimentalist with a great gift for devising practical solutions to problems by constructing novel pieces of equipment. The Oxford laboratory was working on several projects in the field of bacteriology and had an interest in Fleming’s original discovery of lysozyme. In 1937 Florey and Chain had decided to begin a broad study of novel substances produced by microorganisms that might have antibacterial effects. At some point they came across Fleming’s 1929 paper and decided to add penicillin to the list of substances they would like to test. Early testing revealed that penicillin was promising; but the Oxford team soon found out, as Fleming had before them, that it was extremely difficult to work with. It was present in mold extracts only in vanishingly small amounts and was extremely unstable when handled. 
However, Norman Heatley’s ability to devise practical solutions gradually yielded methods for improving the isolation of the new drug. Soon the world was plunged into war, and the Oxford group’s interest in a powerful new drug for treating bacterial infection became much more than just another academically interesting project. Eventually enough penicillin was isolated to allow a meaningful test using live animals. The day before the Dunkirk evacuation, Florey infected 8 mice with a highly pathogenic strain of Streptococcus. Four of the mice received penicillin and four didn’t. One day later, all four of the control mice were dead and all four of the penicillin-treated mice appeared completely normal. These were the kinds of results that got everybody’s attention. But would the same effects be seen in humans? In February 1941 a Mr. Albert Alexander was dying in the local hospital of a horrible staphylococcal infection. Everybody agreed that he would be an ideal test subject. He was treated with penicillin and within hours his condition improved significantly. However, the amount of penicillin which the Oxford investigators had on hand was tiny and it soon ran out. The unfortunate Mr. Alexander eventually succumbed to his infection. Although this was tragic, the beneficial effects of the drug had been obvious. There could be little doubt as to the potential utility of penicillin, which now became the group’s number one priority.
To meet the pressures of wartime expectations more penicillin was needed; much more, and as quickly as possible. Britain’s wartime economy simply didn’t allow for this to happen. But Florey had cultivated many powerful American contacts over the years and now sought to capitalize on some of them. The Lancet, one of the world’s top medical journals, published several articles detailing the work of the Oxford group, and now everybody, including the Americans, and, for that matter, the Germans, could appreciate the exciting nature of the results. Florey and Heatley next braved wartime conditions to travel to the US, where they were able to persuade the US Department of Agriculture (USDA) to aid them in their quest. In particular, the USDA mobilized its largest laboratory, located in Peoria, Illinois, to help with making large amounts of penicillin. The Peoria lab requisitioned mold samples from all over the world in an effort to find the most potent source, which, ironically enough, ended up coming from a moldy cantaloupe purchased down the road in a local market. Intensive efforts to ramp up the levels of penicillin production were successful over the next year, so that appreciable amounts were available for further human testing.
Now the industrial production of penicillin during the rest of the war shifted to the United States. Nevertheless, some of the key studies which would lead to an understanding of how penicillin produced its miraculous effects took place in England. Of particular note was the development of a new technique enabling scientists to elucidate the 3-dimensional structure of molecules through the use of X-rays. The technique, known as X-ray crystallography, had been invented in Germany but many of its greatest practitioners worked at Oxford and Cambridge. These included a young chemist named Dorothy Crowfoot, later better known by her married name Dorothy Hodgkin. Using material purified by Ernst Chain’s group, Hodgkin was able to produce crystals of penicillin-related material and to determine its chemical structure. This turned out to be surprising. The core of the molecule contained an unusual chemical motif known as a β-lactam ring, something that would prove to be of central importance in our eventual understanding of how these molecules produce their effects.
And penicillin really did work. On November 28, 1942, a fire broke out at the Cocoanut Grove nightclub in Boston, killing 492 people and making it the deadliest nightclub fire in history. There were survivors, however, and thirteen of them were among the first humans to be treated with penicillin. As soon as the survivors were in hospital, the Merck company, which was one of the major US companies working on the project, sent 32 liters of the drug, in the form of the culture liquid in which the Penicillium mold had been grown, from New Jersey to Boston. The drug would be crucial in combating the bacteria which typically infected the skin grafts of burn victims, and none of the patients who received it died, a truly significant result. By the end of the war, penicillin was being manufactured in large amounts and used to treat people all over the world.
When scientists are asked what they consider to be the most important discovery in the history of medicine, many would point to the discovery and development of penicillin. Indeed, Fleming, Florey and Chain would share the Nobel Prize in 1945, and Dorothy Hodgkin would receive it in 1964 for her work on numerous projects, including penicillin. But it is important to view the development of penicillin in the context of the other discoveries that allowed the practice of medicine to enter the modern world. These include the work of Lister and Snow and also of Pasteur and Koch, the main developers of germ theory. Now our microbial enemies had been identified, the way they went about causing disease was better understood and, with penicillin, a method had been developed for defeating them. Humanity could now meet microbes on the battlefield on a more equal footing. Humans would win some of the battles but, as we shall see, not the war.
For one thing, although penicillin was fantastically effective in treating many transmissible diseases caused by bacteria such as staphylococci, streptococci, and clostridia, it wasn’t effective against all of them. For example, it wasn’t effective against bubonic plague caused by Yersinia pestis or cholera caused by Vibrio cholerae. At the time, very little was known about the molecular basis through which penicillin produced its effects. Yet there was at least one clue. As we have discussed, many of the great advances in medicine in the 19th century were connected with the development of the dye-making industry. Scientists like Paul Ehrlich had pioneered the use of dyes as histological stains and then, as we have seen, had further developed the entire field of chemotherapy, predicting that dye-based molecules could act as magic bullets to combat bacterial pathogens and cancer and even diseases like malaria and schizophrenia. In 1884 a Danish pathologist named Hans Christian Gram was performing research on bacterial infections of the lung in which he was trying to produce a stain that would identify bacterial cells against the background of the host tissue. He was successful in doing so and then observed that certain types of bacteria that he tested picked up his new stain while others didn’t. The reason for this, as we shall discuss later, is that the cell wall that surrounds the two groups of bacteria has different structural features, accounting for the differential effects of the stain. Henceforth, bacteria would become known as Gram-positive or Gram-negative. Interestingly, penicillin was active against many Gram-positive bacteria and inactive against most Gram-negative bacteria. Ultimately, the structural features that determined the ability of a bacterium to pick up the Gram stain would also help explain the mechanism of action of penicillin.
The golden age.
The discovery of penicillin really opened the doors to what would be a golden age of antibiotic discovery, replete with many interesting stories. One of these concerns the man who originally coined the term “antibiotic”. The disease known as tuberculosis is caused by infection with a bacterium called Mycobacterium tuberculosis. Tuberculosis is fantastically infectious and can be spread from one person to another through the air; even as few as 10 bacteria can transmit the disease effectively from a coughing person to a bystander. Not only is “TB” very contagious but it is also deadly. It most commonly takes up residence in the lungs but can also infect many other parts of the body. Infection of the lungs produces the coughing symptoms, sometimes including blood, that are typically associated with the disease, together with fevers and extreme fatigue. Loss of appetite and weight loss are also typical, the victim gradually being ‘consumed’ by the disease, hence the archaic name ‘consumption’. Until effective therapies for the disease arrived in the second half of the 20th century, tuberculosis was truly one of the greatest plagues of human history. In 1882 Koch and his colleagues isolated M. tuberculosis itself. We don’t know where the disease originated, although it is thought that TB probably tracked early humans as they migrated out of Africa. It is estimated that one out of every seven humans that have ever existed has died from TB, a total of some 15 billion people, surely more than for any other infectious disease. Until the latter half of the 20th century there really was no effective way of treating TB, even though a huge number of possible therapies were suggested over the years. Probably the best known to us these days is the sanatorium, often high in the mountains, where the clear air was supposed to help your lungs. And maybe it did a bit, but in the end it was no cure. 
TB was a one-way street as patients gradually got weaker and weaker until the inevitable end. The residents of Thomas Mann’s Magic Mountain were never going to leave. However much philosophizing they did about the meaning of life, their fate was sealed. Only Hans Castorp got to leave and that’s because he didn’t really have TB.
The idea that microorganisms might produce substances that kill each other goes back to the early part of the 20th century. After all, it had been well known since the work of Darwin and others that the world was a competitive place, and this was just as true of the world of microbes as it was of the world of men. It was easy to imagine that, in the tiny world of microorganisms, there was a constant war going on in which species competed with one another for nutrients and other resources so that they might establish dominance. One place where microorganisms exist in abundance is the soil, and so this is a natural place to look for interactions between different species. In the 1930s one laboratory with great expertise in this area of research was run by Selman Waksman, a professor at Rutgers University in New Jersey. Waksman, a Jewish immigrant from Ukraine, had come to the US when he was 22. After studying at Rutgers and Berkeley, he joined the faculty at Rutgers and worked his way up the academic ladder until he became a professor with his own laboratory specializing in soil bacteria. In 1939 one of his ex-students named René Dubos, working at the Rockefeller Institute (now Rockefeller University) in New York, isolated two bactericidal agents, gramicidin and tyrocidine, from a soil bacterium called Bacillus brevis. These substances could kill a wide range of bacteria but were also toxic to normal host blood cells and so could not be used to counter body-wide (“systemic”) infections. However, they could be used for topical application as ointments or salves and, indeed, are still used for that purpose today.
Waksman was interested in further mining soil bacteria for “antibiotics” and, with the backing of the Merck drug company, around 1939 he began an intensive and methodical search for unique soil-derived substances. The approach involved the systematic culturing and testing of a very large number of potential bacterial strains, a highly labor-intensive process. Fortunately for Waksman, working with him was an extremely industrious graduate student named Albert Schatz. Schatz particularly wanted to find something that would kill Mycobacterium tuberculosis, the organism responsible for TB. Concentrating on a family of soil bacteria known as actinomycetes, Schatz staged Petri-dish battles between different members of the family and TB. Starting in 1941, he worked long hours, isolated down in the basement because he was using actively infectious strains of M. tuberculosis, systematically comparing the effects of one bacterium after another. Then, in 1943, he found something interesting. Two different strains of a bacterium he named Streptomyces griseus seemed to be active against M. tuberculosis. Schatz and Waksman quickly isolated the substance responsible for the activity. This was the antibiotic that would become known as streptomycin, which, for the first time in human history, represented a weapon that humans could use against one of their deadliest foes. Naturally, there was still a great deal of work to be done, including tests on animals, clinical trials on humans, the design of industrial processes for growing and isolating the precious substance, and so on. Indeed, the development of streptomycin was to prove difficult, not least because of an extremely acrimonious fight between Waksman and Schatz about who should get the credit for the discovery. Waksman made a good deal of money and received the Nobel Prize in 1952, all the while studiously downplaying any contribution by Schatz. Indeed, Schatz’s work wasn’t really recognized for decades.
In reality, Waksman and Schatz hadn’t just discovered a new antibiotic but a whole new way of discovering them. The idea was to obtain soil samples of different kinds and intensively screen them for interesting activities. And the approach worked. Several pharmaceutical companies began programs based on Waksman’s method. In 1945 scientists at Lederle Laboratories were investigating a soil sample sent to them by somebody at the University of Missouri and detected an interesting antibacterial activity. Eventually they isolated an active substance with a golden color that they therefore named “Aureomycin” (chlortetracycline). This new antibiotic was a real breakthrough, as it was effective against both Gram-positive and several previously untreatable Gram-negative bacteria. Indeed, it was the first member of a new class of antibiotics called the tetracyclines. Not to be outdone, the Pfizer pharmaceutical company screened its own soil samples and came up with something similar, another tetracycline that it named Terramycin (oxytetracycline). Now the genie was truly out of the bottle. Over the next two decades there was fantastic progress in the field of antibiotic research. More and more novel antibiotics were discovered, particularly from soil samples originating all over the world. Not only that, but the chemistry and mechanism of action of antibiotics became much better understood. Several great chemists, such as the Nobel Prize winner Robert Burns Woodward, succeeded in synthesizing antibiotics from scratch in the laboratory, and new antibiotics were made that were semisynthetic analogues of the original substances isolated from the soil. Indeed, one of the most important later advances in the field was completely synthetic in nature. Nalidixic acid, synthesized in 1962, was found to be highly potent against Gram-negative bacteria such as Escherichia coli, Proteus mirabilis and Shigella flexneri. 
This discovery turned out to be highly significant in that it led to the emergence of quinolones, an important family of synthetic antibiotics whose structures are related to quinoline.
To people in the middle of the 20th century antibiotics seemed like miracle drugs. Prior to their discovery, small cuts and abrasions could become septic and eventually even require limbs to be amputated. Injured soldiers often died of their infections. Patients with pneumonia, syphilis or similar diseases died or were severely disabled. Now all of this had completely changed, and most such patients survived.
And so it seemed that the war was won. It has been estimated that the discovery of antibiotics has saved more lives than any other discovery in the history of medicine. It now seemed that many transmissible diseases produced by bacteria would no longer be a problem for humanity. After millions of years of suffering, the human race had at last confronted the enemy and defeated it. In 1900 the life expectancy for men and women in the United States was 46 and 48 years respectively, amazingly short by today’s standards. Most deaths were caused by bacterial pathogens such as those responsible for cholera, diphtheria, typhoid fever, plague, tuberculosis, typhus, scarlet fever, pertussis, and syphilis. Antibiotics changed all of that and, by 2018, life expectancy was in the region of 80, a truly amazing increase. Commenting on the effects of antibiotics, the Nobel laureate Sir Macfarlane Burnet said in 1962: ‘One can think of the middle of the 20th century as the end of one of the most important social revolutions in history, the virtual elimination of the infectious diseases as a significant factor in social life’.
But, of course, nothing could have been further from the truth.
Antibiotics bite the dust.
So, what went wrong? When one thinks about Nobel laureates pontificating on the topic, one would do well to remember the words of Sir Alexander Fleming who, when receiving his Nobel Prize in 1945, discussed the use of antibiotics in medicine: “There may be a danger in underdosage. It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them, and the same thing has occasionally happened in the body. The time may come when penicillin can be bought by anyone in the shops. Then there is the danger that the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug make them resistant. Here is an example. Mr. X has a sore throat. He buys some penicillin and gives himself, not enough to kill the streptococci but enough to educate them to resist penicillin. He then infects his wife. Mrs. X gets pneumonia and is treated with penicillin. As the streptococci are now resistant to penicillin the treatment fails. Mrs. X dies. Who is primarily responsible for Mrs. X’s death? Why, Mr. X, whose negligent use of penicillin changed the nature of the microbe. Moral: If you use penicillin, use enough.” Here Fleming was raising a red flag amid all the euphoria. Could disease-causing microbes become resistant to antibiotics, and what then? Fleming’s words of warning were not taken very seriously at the time, but they turned out to be extremely prescient because these are precisely the problems that have now come to pass. When antibiotics first came on the scene people couldn’t get enough of them. Business embraced them. They were a great marketing idea. Why listen to the warnings of some old scientist when there was so much money to be made? The drug manufacturers added antibiotics to ointments, throat lozenges, chewing gum, toothpaste, and even lipstick. Penicillin was freely available over the counter without a prescription. 
Nobody understood how the drugs worked anyway. It was assumed that they might target all kinds of “germs”: bacteria, viruses, fungi, pretty much everything. So why not take them? Better safe than sorry!
Sadly, it was very soon after the introduction of penicillin that organisms that were resistant to the antibiotic began to emerge, eventually greatly reducing its effectiveness. And the same pattern followed for all the other antibiotics that had been discovered. From the 1940s to the 1960s this didn’t seem like a major problem because new antibiotics were always found that could replace the older varieties which were no longer effective. But the human race was living on borrowed time. The situation was rather like the famous Battle on the Ice in Eisenstein’s film Alexander Nevsky. The Teutonic Knights charge forward across frozen Lake Peipus towards the Russian defenders. As the knights advance, the ice behind them begins to crack. For a time the knights move forward faster than the cracking ice advances. At some point, however, it catches up with them and they all drown in the freezing water.
Antibiotic resistance quickly became an accepted fact of life. Some horribly dangerous bacterial strains developed resistance to almost all approved antibacterial drugs, including such stalwarts as the tetracyclines, erythromycins, methicillin, gentamicin, and vancomycin, very shortly after their approval. Indeed, at the present time we are faced with a return to the situation prior to the 1940s, when we will once again be at the mercy of disease-producing pathogens. Currently, the number of people dying from antimicrobial-resistant infections is at least 1.2 million a year worldwide, according to a report recently commissioned by the British government and published in 2022 in the journal The Lancet. Moreover, if something doesn’t happen soon to reverse the trend, the estimated annual death toll due to antimicrobial resistance will reach at least 10 million by the year 2050, surpassing the mortality rate for cancer. Not only are some bacterial pathogens becoming resistant to a single antibiotic or class of antibiotics, but we have seen the emergence of “superbugs” which are resistant to more than one kind of antibiotic, or indeed to virtually all known antibiotics. Currently, one of the most notorious superbugs is the Gram-positive organism Staphylococcus aureus. Around a third of humans test positive for S. aureus, whose presence has long been linked to common skin infections. “Staph” skin infections generally start as swollen, painful red bumps resembling pimples. In some individuals, however, these can quickly turn into abscesses. And things don’t stop at that point. The bacteria can infiltrate deep into the body, causing potentially life-threatening infections in bones, joints, surgical wounds, the bloodstream, heart valves and lungs. These infections were originally very effectively treated with penicillin. Once penicillin resistance became a fact of life, methicillin, introduced in 1959, was supposed to take care of the problem. 
Soon, however, we were confronted with the appearance of methicillin-resistant S. aureus (MRSA: the M is for methicillin) and then by multiantibiotic-resistant variants (MRSA can now also stand for multidrug-resistant S. aureus). As with many resistant bacterial pathogens, MRSA first turned up in the context of hospitals and health care centers, but it has now moved outside the hospital and become a major community-wide problem. In some highly publicized instances, infection with MRSA can produce necrotizing fasciitis, a serious condition that can be fatal and which is now commonly known as “flesh-eating disease”. MRSA is therefore sometimes referred to as the “flesh-eating bacterium” and has been widely featured by the tabloid press in numerous titillating horror stories. However, this is only one example of the effects of pathogens that are now difficult to control with antibiotics. Particularly dangerous are the drug-resistant bacterial strains collectively known as the ESKAPE pathogens (vancomycin-resistant Enterococci, MRSA, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter spp.), which can cause difficult-to-treat infections, particularly in patients in hospital settings. Our realization of the seriousness of the problem, like our appreciation of the results of climate change, necessarily lags somewhat behind the events themselves. For example, just this year (2022) a paper in the journal Nature Microbiology reported that analysis of the genomes of Shigella bacteria, which cause lethal diarrhea, found that over 95% of the strains collected between 2007 and 2011 were resistant to at least 3 types of antibiotics, including the very widely used fluoroquinolone drug ciprofloxacin. One can safely assume that the situation at the present time is even more serious. Moreover, there is evidence that antibiotic-resistant bacteria are traveling all over the world. 
In January 2008 a report documented an unexpectedly high prevalence of multidrug-resistant Escherichia coli in Arctic wildlife, which the authors speculated may have been carried there by migratory birds.
So, how exactly did we get into this mess, and how can we move forward to solve the problem? The first thing to understand is that, although hundreds of antibiotics were discovered during the “golden years” of antibiotic research, which really cover the period from the 1940s to the 1960s, all of them produced their effects by employing relatively few biochemical mechanisms. Penicillin and similar antibiotics interfere with the synthesis of the cell wall that surrounds Gram-positive organisms. The cell walls of Gram-negative organisms have an extra layer which penicillin cannot easily penetrate or, if it does, the drug can be destroyed by an enzyme called β-lactamase. On the other hand, antibiotics like streptomycin and the tetracyclines, which have a broader specificity, target the ribosomes, the structures within bacteria that are the sites of protein synthesis. Synthetic quinolone antibiotics such as ciprofloxacin inhibit the enzyme DNA gyrase and, as a consequence, block DNA synthesis and bacterial cell division. Trimethoprim inhibits the enzyme dihydrofolate reductase and ultimately interferes with DNA synthesis. These mechanisms either reduce the rate at which bacteria replicate (bacteriostatic) or kill them outright (bactericidal). However, bacteria can adapt. The genomes of bacteria are very nimble. Their rate of reproduction is extremely fast relative to that of an animal cell, giving them many more chances to produce a mutation that allows them to bypass the effects of an antibiotic and so produce a resistant bacterium. Even if such mutations are very rare, there are so many bacteria that they will occur at some point and may then be “selected for” in the appropriate environment. The process of mutation and selection is traditionally how we think about the mechanisms through which organisms adapt. In addition, bacteria can use another genetic trick known as horizontal gene transfer. 
In this case circles of DNA, which are separate from the bacterial chromosome, can replicate independently and move between bacteria, even between bacteria of different types. These “plasmids” may carry several antibiotic resistance genes, producing bacterial resistance of multiple types. For example, one isolate of the bacterium A. baumannii was found to contain a ‘resistance island’ of 45 resistance genes derived from several other species of bacteria, which was picked up in a single horizontal transfer event.
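The arithmetic of mutation and selection described above can be made concrete with a toy simulation. This is a deliberately simplified sketch, not a biological model: the mutation rate, number of generations, and kill fraction are invented, illustrative numbers. A growing population of sensitive bacteria occasionally throws off a resistant mutant, and an antibiotic that kills only the sensitive cells then flips those rare mutants into the dominant population.

```python
# Toy model of mutation and selection (illustrative numbers, not real biology):
# sensitive bacteria double each generation, rarely producing resistant
# mutants; an antibiotic then kills sensitive cells only.

MUTATION_RATE = 1e-6  # assumed chance that a division yields a resistant daughter


def generation(sensitive, resistant, mu=MUTATION_RATE):
    """One round of division: every cell doubles, and a tiny fraction
    of the newly made sensitive cells mutate to resistance."""
    new_mutants = 2 * sensitive * mu
    return 2 * sensitive - new_mutants, 2 * resistant + new_mutants


def treat(sensitive, resistant, kill_fraction):
    """The antibiotic kills sensitive cells only; mutants are untouched."""
    return sensitive * (1 - kill_fraction), resistant


sensitive, resistant = 1.0, 0.0          # a single founding sensitive cell
for _ in range(30):                      # ~10^9 cells after 30 doublings
    sensitive, resistant = generation(sensitive, resistant)
pre_frac = resistant / (sensitive + resistant)

sensitive, resistant = treat(sensitive, resistant, 0.999999)
for _ in range(30):                      # the survivors regrow
    sensitive, resistant = generation(sensitive, resistant)
post_frac = resistant / (sensitive + resistant)

print(f"resistant fraction before treatment: {pre_frac:.1e}")
print(f"resistant fraction after treatment and regrowth: {post_frac:.2f}")
```

Run with these numbers, the resistant fraction is about 3 × 10⁻⁵ before treatment and roughly 0.97 after regrowth: a drug that spares even an exceedingly rare mutant hands it the whole population. Note that this sketch covers only chromosomal mutation; horizontal transfer of plasmids, as described above, spreads resistance faster still.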
Bacteria resist the effects of antibiotics by using several different genetic strategies that have thousands of variations. First of all, they can produce enzymes that destroy antibiotics. An example of this is the expression of the enzyme β-lactamase, which destroys β-lactam antibiotics such as penicillin. Bacteria can also mutate so as to modify the normal targets of antibiotics. For example, they can modify the proteins that allow antibiotics to bind to ribosomes so that the drugs can no longer inhibit bacterial protein synthesis. A particularly effective strategy is for bacteria to express “pump”-like proteins which can remove antibiotics by pumping them out of the cell. Some organisms produce a “biofilm” which prevents antibiotics from reaching the cells at all. These are the kinds of tricks that bacteria can use to avoid the effects of antibiotics. Events like these were observed, for example, with the first- and second-generation tetracyclines, which gradually lost their efficacy owing to widespread resistance conferred mainly by ribosomal protection and pumping mechanisms.
However, it would be wrong to think that antibiotic resistance genes are a new phenomenon. In fact, many of them are extremely ancient. It is now generally recognized that the natural environment harbors a vast diversity of antibiotic resistance genes. In an interesting experiment reported in the journal Nature in 2011, investigators examined bacteria taken from ice cores obtained by drilling deep into the Yukon tundra in the Arctic. This yielded samples of ancient bacterial DNA from 30,000-year-old Beringian permafrost. The investigators found genes encoding β-lactamase, which can destroy penicillin. In addition, however, they also found sets of genes (known as operons) which can counter the effects of the powerful antibiotic vancomycin. Resistance to vancomycin was considered very surprising when it was first observed in the 1980s. But we now know, as shown by these experiments, that this kind of resistance is really an ancient, naturally occurring phenomenon normally present in the environment. Under the correct conditions such genes can be selected for and become widespread. These conclusions have been further confirmed by the observation of antibiotic resistance genes in an 11th century A.D. pre-Columbian Andean mummy and in bacteria taken from a cave in New Mexico estimated to have been isolated for over 4 million years.
Although such genes nowadays manifest themselves as the phenomenon we know as antibiotic resistance, it is likely that they originally served different purposes, raising the question as to why bacteria make antibiotics anyway. Answers to this fascinating question include the likelihood that antibiotics have “signaling” functions, acting at low environmental concentrations to help coordinate the behavior of bacterial communities, or that they can even act as a food supply under some circumstances. Clearly, if bacteria are to eat antibiotics, they must possess enzymes that can break them down, as well as mechanisms for moving them in and out of cells. Moreover, given their ancient nature, we would expect some degree of primary antibiotic resistance among bacteria in their normal environment. Indeed, we know that some bacteria are naturally resistant to some antibiotics. Of course, the context in which these natural events take place underwent a radical reorganization with the discovery of antibiotics by human beings. Once these molecules started to be produced in large quantities and made their way back into the environment, bacteria were presented with an entirely new selection pressure, one that they had never previously experienced, allowing for the selection of genes that might previously have been very rare. Unfortunately, Alexander Fleming and a few others aside, these potential problems were not foreseen by a humanity too busy congratulating itself on having banished infectious diseases forever.
Farming follies.
While it is true that antibiotics have been misused over the years, primarily through the habit of overprescribing them for purposes for which they are inappropriate, this accounts for less than half of the enormous load of antibiotics found in the environment. There is another reason as well, one that was completely unexpected. In the early part of the twentieth century, the diet of many Americans began to change, turning much more towards the consumption of meat, including red meat. The American farming community found it difficult to keep up with the demand. Prices for meat rose astronomically, and public demonstrations followed. In response to these pressures, the production of red meat by farmers became much less haphazard. Meat production started to turn into what eventually became known as factory farming, which involved keeping animals in highly structured environments on diets designed to make them put on weight as efficiently as possible. The two world wars put even greater stress on the situation, as important animal food supplements such as Norwegian cod liver oil and Japanese fish meal, both widely used to boost the growth of chickens and pigs, became unavailable, and there was enormous interest in finding food additives that were both effective and cheap. Such concerns continued after the Second World War. In 1948 the Merck pharmaceutical company discovered that the broth they used to make streptomycin contained a substance, vitamin B12, that helped chickens grow faster, leading other companies to look for similar substances. With this in mind, Dr. Thomas Jukes, a scientist at the Lederle drug company, tried supplementing chicken feed with the broth used to grow their new antibiotic Aureomycin (chlortetracycline). The results proved to be a revelation. The chickens fed with broth-supplemented food grew much faster than normal.
As it turned out, this was due to small amounts of the antibiotic itself. It seemed like an answer to farmers’ prayers. Aureomycin was relatively cheap to make, and small quantities added to animal feed allowed animals to grow 50–300% more quickly. This meant they would reach “market weight” much more rapidly and could be fed less food under increasingly restrictive battery farm conditions. Other antibiotics proved to work equally well and, in 1951, the FDA approved the widespread use of six antibiotics as animal food additives. This was great news for companies like Lederle, who now had a gigantic new market for their products. Indeed, it became clear that if animals were fed higher doses of antibiotics, they would not only put on weight more quickly but also fend off various infectious diseases. As a result, the amounts of antibiotics used by the food industry grew astronomically. But not everybody thought this was such a good idea. In 1954, Alexander Fleming visited the University of Minnesota. When the farmers on campus demonstrated that by feeding antibiotics to hogs they saved millions of dollars in feed, Fleming seemed disturbed by the thought of applying the same logic to humans. “I can’t predict that feeding penicillin to babies will do society much good,” he said. “Making people larger might do more harm than good.” Once again, his words would prove to be prescient.
And so, some 70 years after the discovery of penicillin, production of antibiotics reached astronomically high levels. Because of the large amounts used by humans (often inappropriately) and for promoting animal growth, huge quantities of these materials have ended up back where they started, in the soil, but at much higher concentrations than at any time prior to human meddling. Now, instead of being exposed to minute concentrations of antibiotics for the purposes of biological signaling or as part of their diet, microorganisms have been forced to evolve by selecting genetic traits that allow them to defend themselves. As usual, humans have succeeded in messing the world up by not thinking things through properly. And it’s not the first time. Indeed, the entire idea ushered in by germ theory, that all microorganisms are our enemies and must be destroyed, turned out to be an extreme oversimplification. This is a war we can never win. There is no “endgame”, and our habits will only lead to ever greater antibiotic resistance.
By around 2010, some 80–90% of the antibiotics made in the USA were being used to supplement animal feed. Because of the mounting evidence that this practice has fueled phenomena such as antibiotic resistance, laws are now in place in the US and many other countries restricting the use of antibiotics in animals to situations where they are employed appropriately for fighting disease. Nevertheless, much damage has already been done. Of course, all of this also raises an extremely important question: why do antibiotics make animals put on weight more rapidly?
The microbiome revolution.
Answering this question has resulted in a completely new evaluation of the role of bacteria and other microorganisms in human health and our environment. Perhaps many bacteria aren’t our enemies after all? Maybe some of them are actually our friends, and perhaps large-scale disruption of the relationship between humans and bacteria is, in many respects, harmful to our health and to the health of our planet. These considerations include the growing realization that all animals normally exist as “superorganisms” in association with colonies of bacteria, archaea, protists, and viruses, something that has become known as the “microbiome”. Not only that, but we have an ever-expanding appreciation of the critical significance of this association for our everyday health. The vast majority of these bacteria are non-pathogenic and are referred to as “commensal”, that is, humans and the bacteria “come to the table together” in a mutually beneficial manner. In humans, one of the most important “niches” for the microbiome is the gastrointestinal tract. The density of bacteria in the human colon reaches as high as 10¹¹ per gram of content, making it the most concentrated bacterial microenvironment known. Overall, the human gastrointestinal tract contains more than 100 trillion bacteria, a staggeringly large number, representing over 1,000 different species. These bacteria mostly belong to four phyla known as Firmicutes, Bacteroidetes, Actinobacteria and Proteobacteria. Just to put things in perspective, the number of gut bacteria exceeds the number of cells that make up the human body. Moreover, these bacteria contain some 600,000 genes, approximately 25 times more than the number of genes in our own genome. So, with respect to the hybrid human/bacterial superorganism, in many respects the tail is really wagging the dog.
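The numbers quoted above are easy to sanity-check with a little arithmetic. In the snippet below, the microbiome figures come from the text, while the human figures (roughly 3.7 × 10¹³ cells and about 23,000 protein-coding genes) are rough literature estimates supplied here as assumptions, not claims made in this chapter.

```python
# Figures from the text
gut_bacteria = 100e12        # "more than 100 trillion bacteria"
microbiome_genes = 600_000   # genes carried by the gut microbiome

# Rough literature estimates (assumptions, not from the text)
human_cells = 3.7e13         # approximate number of cells in a human body
human_genes = 23_000         # approximate protein-coding genes in the human genome

bacteria_per_cell = gut_bacteria / human_cells
gene_ratio = microbiome_genes / human_genes
print(f"gut bacteria per human cell: ~{bacteria_per_cell:.1f}")
print(f"microbiome genes per human gene: ~{gene_ratio:.0f}")
```

On these estimates the gut bacteria outnumber our own cells by roughly three to one, and the microbiome’s gene count exceeds ours by a factor of about 25, consistent with the figures in the text.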
Studies have shown that humans and their microbiome have evolved together to form a mutually beneficial association. What functions does the human microbiome serve? We can get an idea of the kind of thing that is going on by taking a moment to consider a much simpler system: fungus-growing ants. These ants evolved millions of years ago in the Amazon rain forest and cultivate types of fungus on which they feed. The ants carry symbiotic bacteria in specialized glands that produce nutrients to support the bacteria. In return, the bacteria produce specific antibiotics that protect the ants’ fungus from other microbes. Symbiotic relationships of this type also exist in the human gastrointestinal system. Gut bacteria are important for the production of vitamins and for the fermentation of food components that would otherwise be indigestible. We also know that the gut microbiome is important for the development of the immune system, protection against pathogens, and drug metabolism. These beneficial effects are the reason why certain bacteria, such as Bifidobacterium spp. and Lactobacillus spp., are consumed by people as “probiotics” to promote gut health. The gut microbiome is a protean superorganism, constantly changing in response to our diet and other factors. For example, a recent paper demonstrated that low doses of laxatives can radically alter the genetic makeup of gut microbes.
An important consequence of the numerous roles of the gut microbiome is that, if its structure is altered, resulting in an abnormal balance of its constituent microorganisms, this might be an important factor in the development of human disease. Such a connection has now been clearly demonstrated for some autoimmune diseases, particularly inflammatory bowel disease, as well as allergic diseases, cardiovascular diseases, and metabolic syndromes such as diabetes and obesity. Treatment of humans and animals with antibiotics clearly alters the composition of the gut microbiome. In one experimental report, for example, weanling mice were given low doses of penicillin, chlortetracycline, or vancomycin. The investigators observed changes in the genetic structure of the microbiome, which included key genes involved in the metabolism of carbohydrates to short-chain fatty acids, increases in colonic short-chain fatty acid levels, and associated changes in the metabolism of lipids and cholesterol. The investigators then discovered something else very interesting. They took the feces (which contain the microbiome) of the antibiotic-treated mice and transplanted them into the intestines of germ-free mice. The recipient mice then developed exactly the same traits as the antibiotic-treated donors, demonstrating that it was indeed the altered microbiome, and not the antibiotics per se, which produced the observed metabolic changes.
It is now clear that the entire context in which antibiotics operate is much more complicated than originally envisaged, when they seemed to be the ideal weapons for humanity to use in its “war” against microorganisms. If we want to be able to continue to use antibiotics to treat important diseases, the way we use them must be radically rethought. Handing out antibiotics willy-nilly to everybody who thinks they might be useful for treating a cold or stomachache needs to stop urgently. Pouring millions of tons of antibiotics into the environment through their use as a growth additive for farm animals is something we must also continue to bring to an end.
But what else does the future hold? Are there things we can do in lieu of drawing on our current store of antibiotics for treating deadly infectious diseases, saving them for times when they are really necessary? There are several interesting possibilities. For one thing, it is important to realize that Nature’s potential supply of antibiotics has certainly not been exhausted; in fact, we may have hardly even scratched the surface. One reason is this. The antibiotics obtained from natural sources so far all come from organisms that can be cultured in the laboratory; otherwise, how could we investigate them? As it happens, the vast majority of microorganisms cannot be cultured, at least not by conventional means. In fact, it has been estimated that some 99% of existing bacteria have never been successfully cultured. Attempting to grow uncultured bacteria is thus an obvious route to the discovery of novel antibiotics. Some investigators have now established methods for doing this more successfully by growing bacteria in their natural environments, for example by adding marine sediments or soil samples to the culture in an effort to reinstate unknown growth-promoting molecules or other factors. This approach has started to yield promising results, including a number of unique antibiotic substances (e.g., teixobactin) which appear to have novel mechanisms of action. New antibiotics from sources such as these may, if their use is regulated appropriately, help us regain an advantageous position when trying to control pathogens.
Added to this is the fact that there have been extraordinary technical advances in gene sequencing, in gene editing with tools like CRISPR/Cas9, and in other methodologies that make screening and manipulating the genomes of bacteria much more efficient. These techniques have revealed that many bacteria possess previously unknown metabolic pathways, not normally expressed under conventional laboratory culture conditions, that could potentially produce interesting new antibiotics.
Viruses to the rescue.
In addition to antibiotics there are also quite different approaches to curing infectious diseases. One is to help the body’s immune system take care of the problem. Of course, activation of immunity is the major method our bodies use to clear away pathogens, and it is immunity in the form of antibodies that underlies the principle of vaccination, which can be a powerful method for preventing disease. There are now ongoing attempts to activate the process of innate immunity to help deal with infection. Innate immunity is the branch of the immune system that is linked to the process of inflammation and has a general role in detecting and helping to clear away all pathogenic agents. Our understanding of this branch of immunity, and of exactly how it can be controlled, has advanced greatly in recent years. We now know that drugs can be made that target innate immunity and increase or decrease its activity. This approach could also be used to help our natural defenses deal better with infectious agents.
Some additional, very creative approaches for dealing with pathogenic bacteria are also in the works. Consider the case of Dr. Tom Patterson, a professor of psychiatry at the University of California, San Diego, who was on vacation with his wife in Egypt in 2015. At some point during his visit he became violently ill, with symptoms including abdominal pain, fever, nausea, vomiting and a racing heartbeat. Local doctors determined that he had an infection of the pancreas but could not treat him with the drugs they had on hand, and so he was rapidly transferred to an American hospital in Frankfurt, Germany. Here the doctors identified the infectious pathogen as multidrug-resistant Acinetobacter baumannii which, as we discussed above, has emerged as a highly antibiotic-resistant and extremely dangerous organism. Eventually it was found that if he was treated with a combination of the antibiotics meropenem, tigecycline and colistin, the latter being rather nephrotoxic and so considered a treatment of “last resort”, his condition stabilized somewhat, and he was transferred back to the ICU at the UC San Diego hospital. Here he continued to improve slowly. Unfortunately, an accident then caused the bacterium to enter his bloodstream, and he immediately began to suffer the symptoms of septic shock. The antibiotics being used to treat him were no longer effective under these circumstances, and he became comatose. It was clear he was going to die. However, his wife, also a doctor, was determined to find something that might help.
To understand what eventually happened, one has to go back to a time before antibiotics had been introduced for the routine treatment of infectious diseases. In 1915, Frederick Twort, superintendent of the Brown Institution of London, discovered something that could infect and kill bacteria. The agent consisted of very small particles and could easily pass through most filters. Similar observations in France by Felix d’Herelle, working at the Pasteur Institute in Paris, led to the realization that the material was actually a virus, which was subsequently named “bacteriophage”, from “bacteria-eating virus”. Phages bind to specific receptors expressed by bacteria and inject their genetic material into them. The virus then replicates and, when it is mature, destroys the bacterium, releasing the newly formed viruses. Phages are common, to say the least. There are more than 10³¹ bacteriophages on the planet; that is ten million trillion trillion, or “rather a lot”. In fact, there are more phages than every other organism on Earth, large and small, combined. Because phages normally kill bacteria, there was considerable research in the 1920s and 1930s attempting to use them as a method for treating bacterial pathogens, something that is well described in Sinclair Lewis’ novel Arrowsmith. These attempts met with some success by the standards of the time. For example, phages were reported to be successful in the treatment of dysentery caused by Shigella dysenteriae as early as 1919, and other promising trials in Eastern Europe have continued over the years. In a 1938 clinical trial, 219 patients with bacterial dysentery were treated solely with bacteriophages and 74% were greatly relieved of their symptoms.
Additionally, during a 1974 typhoid epidemic, 18,577 children were treated prophylactically with phages, resulting in a 5-fold decrease in typhoid incidence. However, despite such promising results, once antibiotics were discovered, research on “phage therapy” virtually stopped in the US and Western Europe. Why bother when you could just use a wonder drug instead? Interestingly, the research did continue to some extent, particularly at the Eliava Institute in the Soviet republic of Georgia and at the Hirszfeld Institute in Wrocław, Poland.
Nowadays, however, with the rise of antibiotic resistance, there has been something of a resurgence of interest in the idea of phage therapy. Because there are many different types of bacteria and a lot of different phages, it is necessary to find the correct phage for treating a particular bacterium or, alternatively, to give a cocktail of phages in the hope that one of them will hit the target. This is precisely what happened in the case of Tom Patterson. With her husband lying in a coma, waiting to die from infection by an organism that was completely resistant to all known antibiotics, Tom’s wife Steffanie had to think fast. Steffanie, who holds a PhD and is an expert in global health, remembered hearing of people with antibiotic-resistant infections traveling to Georgia, where phage therapy was still being used, and being cured. She quickly contacted several institutions that had worked on phages specific for the organism that had infected Tom. Once these had all been collected in San Diego and purified, they were injected intravenously into the patient. Just as with Franklin D. Roosevelt Jr, all those years ago, being tested with a new therapy, the effects on Tom Patterson were dramatic. Within 3 days he was out of his coma and much recovered. In truth, the path to his recovery was not an altogether simple one. The bacterium began to mutate and evolve within Tom as it became resistant to the effects of one phage cocktail, necessitating the use of another, and so on. But eventually the doctors were victorious, the bacterium was vanquished, and Tom emerged emaciated but cured. This result, which has been described as “a game changer” in the field, has led to the founding of the Center for Innovative Phage Applications and Therapeutics (IPATH) at the University of California, San Diego, which is specifically dedicated to phage therapy.
So, given the fact that there is no limit to human ingenuity, it may be that we can save ourselves from the crisis of antibiotic resistance. And make no mistake about it, like climate change, this is a crisis of critical importance to the future of humanity. Apart from finding ways to rectify the situation and ensure that we can deal with destructive pathogens in the future, the story of the rise and fall of antibiotics has a much more fundamental message for the human race. It is that we do not live apart from Nature. We are fundamentally an aspect of the entire ecosystem that makes up our planet. The history of antibiotics is an example of how human beings have discovered important substances from Nature and exploited them for what seems, at least in the short term, our own advantage. But, using our natural resources without really thinking through what we are doing, often simply with an eye on the financial bottom line, has now been shown, time and time again to be something that will come back to haunt us and, in the end, may ultimately destroy us. The history of antibiotics is something from which we can learn important lessons. If we continue to ignore them, the future of humanity is in serious jeopardy.