
Unit 3: Surgery c.1840-c.1918

The situation at the beginning of the 19th century

In 1800, surgery rarely went beyond setting broken bones, removing growths and performing amputations. However, surgery was beginning to lose its reputation as a “second rate” branch of medicine and well-trained surgeons were emerging from medical schools. The first 15 years of the nineteenth century saw the Napoleonic Wars between France and the rest of Europe, and new surgeons at this time had lots of practice on the battlefield and the quarterdeck.

As the first half of the 19th century went by, surgeons began to go a little further than the traditional operations. From Germany came operations to cure cleft palates and squinting. In America, internal cysts were removed. For example, the surgeon Ephraim McDowell removed a 15-pound ovarian cyst from Mrs Todd of Kentucky. The operation lasted 25 minutes; Mrs Todd sang hymns to drown the pain, and she lived for a further 31 years!

But this was unusual. Mortality rates in surgery were high. In 1800, about 40% of patients died, mostly through post-operative infections. Even surgeons who tried hard to be clean, like Spencer Wells in London, had high mortality rates – Wells’ was 25%.

The three main problems causing this death rate were pain, infection and bleeding. This meant operations had to be done in the quickest time possible so that shock, loss of blood and the chance for germs to enter the wound were minimised. In 1824, it took 20 minutes to amputate a leg through the hip joint; in 1834, the same operation was done in 90 seconds.

The most important English surgeons at this time were:

John Abernethy, who became Professor of the College of Surgeons in 1814 and placed great emphasis on anatomy, turning surgery from a craft into a science.

Astley Cooper, who operated on King George IV and spent years perfecting his operations on hernias. He was the first surgeon to operate successfully on an aneurysm (a ballooning in a large artery which would otherwise burst and cause the patient to bleed to death), after practising on a cadaver in the dead-room next door.

Robert Liston, who was renowned for holding his blade between his teeth so that he could save time in operations. He taught anatomy at Edinburgh Medical School from 1818 and then moved to University College London in 1835.

Hospitals in London provided operating theatres for the greatest surgeons and operating day was a weekly show with celebrity surgeons performing before their colleagues, students and the public. However, top surgeons, like Astley Cooper of Guy’s Hospital, earned their vast salaries from performing operations on private patients in their homes.

How was the problem of pain dealt with?


Since medieval times, surgeons had tried to find ways to deaden the pain during operations. The main products used were alcohol and opium – but experience showed that they were dangerous. For example, alcohol thins the blood, so a drunk patient might not feel much pain but would probably bleed to death during the operation.

In 1795, Humphry Davy experimented with nitrous oxide (laughing gas) as a painkiller. In 1800, he wrote a report of his experiments, showing that neat nitrous oxide would kill animals, but that when mixed with oxygen it produced reversible unconsciousness. He used it on a human patient to relieve the pain of an inflamed gum and suggested that it might be useful in surgical operations, but no-one took up his ideas. Later, nitrous oxide was used by the dentist Horace Wells in the USA for the extraction of teeth. He developed a bellows system to administer the gas and gave a public demonstration in Massachusetts – which went badly wrong, leaving his patient in agony. Wells lost medical support, grew depressed, became a drug addict and committed suicide whilst in jail for hurling sulphuric acid at prostitutes. Nitrous oxide gained a poor reputation; it was also found that its effects did not last very long.

The next substance to be tried as a painkiller was ether. This had been discovered in the sixteenth century in Germany and had been used to make people cough up phlegm to balance their humours. An American doctor, William Clarke, used ether to deaden the pain of a patient having a tooth extracted in 1842, and another American doctor, Crawford Long, used it to remove a cyst from a patient's neck. Both operations were successful. An American dentist, William Morton, then popularised the use of ether in dentistry in the USA.

News of ether soon spread to Europe. The first use of it in England was by Robert Liston in December 1846, during an amputation. Its effects lasted longer than nitrous oxide and the operation was completed successfully. The newspaper headlines the next day read: “Hail Happy Hour! We Have Conquered Pain!” However, ether had problems. It irritated the lungs and led to long bouts of coughing, even when the patient was unconscious. It caused many patients to vomit. It was flammable – which was not good in an age of candles and gas lamps. It could knock a patient out for days, which was dangerous. It had to be transported in heavy glass jars – not good in an age when surgeons had to travel to people's homes to operate on them. An alternative anaesthetic was still needed.


James Young Simpson discovered the anaesthetic properties of chloroform in 1847. Simpson was Professor of Midwifery in Edinburgh. One evening, he took home some chemicals with his assistants to try to find a decent anaesthetic. Someone knocked over the chloroform bottle and, when his wife brought dinner in, Simpson and his assistants were all found asleep. Later, Simpson tried giving half a teaspoon of chloroform on a rag to a woman in labour and was so pleased with the results that another 30 patients were given chloroform that week.

The key event in the acceptance of chloroform was its use by Queen Victoria in the birth of Prince Leopold on 7 April 1853. John Snow administered the anaesthetic and the Queen recorded in her journal that the effect of chloroform was “soothing, quieting and delightful beyond measure”. With royal approval, chloroform began to increase in popularity. However, there were protests against its use, including:

- It was seen as cowardly to have pain relief – men were not real men if they took it.
- It was seen as anti-religious: the Bible taught that childbirth was supposed to be painful.
- It was possible to kill people by giving them an overdose of chloroform, particularly if they were panicking and breathing too heavily. This happened to Hannah Greener, a 14-year-old girl who took too much when she was about to have an operation to remove a toenail. She died within seconds.
- It actually killed more people in operations – this was true. As the patient was no longer thrashing about, surgeons began to attempt riskier and longer operations, inevitably killing more patients through bleeding and infection. This increase in the number of fatalities was known as the Black Period of surgery and lasted from the 1850s to the 1870s.

John Snow tackled the problem of overdosing on chloroform by developing the portable inhaler during the 1850s. This regulated how much chloroform patients were given.

It was accepted that knocking people out entirely was a risk. Surgeons also recognised that not all operations required patients to be totally unconscious. A search developed for substances which would numb a particular area for local surgery – such as the removal of a cyst. Coca leaves were traditionally used in South America for deadening pain and, in 1859, the active ingredient – cocaine – was isolated. By 1885, cocaine was being produced commercially as a local anaesthetic.

A few more advances against pain came in the twentieth century: in 1902, anaesthetics began to be injected into the bloodstream to control doses even more precisely, and synthetic substances for local anaesthetics were developed, such as novocaine in 1905.

James Simpson was the first man to be knighted for services to medicine. A plaque was dedicated to him in Westminster Abbey to show how much people appreciated his discovery of chloroform.

How was the problem of infection dealt with?


In the nineteenth century, it was common for a surgeon to operate in an old blood-caked frock coat (to show how experienced he was) and to wash his hands only after the operation. The operating theatre would contain a wooden table and sawdust to soak up the blood. All of these things meant that germs were rife and sepsis (infection of the blood and tissue) was common.

The link between sepsis and cleanliness had already been noted. In 1795, Alexander Gordon had argued that mothers who contracted childbed fever after childbirth had been infected by their doctors or midwives, and had recommended that the operator should wash before coming into contact with the mother. In 1843, the American Oliver Wendell Holmes argued that doctors were bringing “germs” into contact with mothers, but he was overruled by other doctors.

The sepsis problem came to a head in the 1840s in Vienna. The Vienna General Hospital had the largest maternity clinic in the world. There were two great wards: in Ward One, childbed fever raged and the mortality rate was 29%; in Ward Two, childbed fever was rarer and the mortality rate was 3%. An assistant physician, Ignaz Semmelweis, tried to work out what the difference was. He knew that Ward One was handled by the medical students and Ward Two by the midwifery students. Medical students came straight to the wards from the autopsy rooms with soiled hands and instruments; midwifery students did not. When the two groups swapped over, Ward Two became the place to die. His suspicions were confirmed when a doctor cut his finger during an autopsy and died of septicaemia (blood poisoning) with the exact symptoms of the women dying of childbed fever.

In 1847, Semmelweis ordered that everyone in the wards wash their hands with chlorinated water. Mortality rates plummeted. However, colleagues refused to believe his theories about putrid particles being passed from the medical students to the women – remember, this was before the discovery of Germ Theory – and resisted his attempts to clean up. Semmelweis left Vienna for Budapest and introduced chlorine disinfection to the maternity wards at his new hospital. Childbed fever rates fell below 1%, and in 1861 he published a book on childbed fever. However, in 1865, Semmelweis died in a Viennese mental asylum.

By this time, antiseptics of a general kind, like Semmelweis’s, were used widely. Iodine was used for bathing wounds, and other substances like bromine, creosote, zinc chloride and nitric acid were used for washing. James Simpson directly copied Semmelweis’s hand-washing routine at his midwifery hospital in Edinburgh. Florence Nightingale had introduced the idea of spotless hospital environments. Whitewashing walls was common. Surgeons were urged to use soap and it was known that dressings should be changed (although belief in miasmas meant that bandages were tied tightly, raising the temperature of the wound and encouraging bacteria to grow). But no-one had yet introduced one clear, effective way of limiting infection and got everybody to use it. This was the achievement of Joseph Lister.


Lister studied at University College London and became an assistant surgeon in Edinburgh in 1854, before moving on to head surgery in Glasgow in 1859. In Glasgow, patients often contracted sepsis. He began to research where sepsis came from, experimenting on frogs. Then he read Pasteur's Germ Theory in 1865 and became convinced that sepsis was being caused by microbes in the air.

Lister now understood that he needed to get rid of these microbes. Normally the skin provided the barrier against them – if the skin was open, he would need a chemical barrier instead. Drawing on the use of carbolic acid in treating sewage, he dressed a compound fracture of the tibia with a bandage soaked in carbolic. The boy, James Greenlees, who had been run over by a cart, walked out of the infirmary fully healed after six weeks. The experiment was repeated nine months later and worked again. From this success, Lister worked out a ritual for operations – antisepsis (killing infection in the wound by smothering it in carbolic-drenched dressings and tin foil) and asepsis (preventing new infection entering the wound by spraying the whole operating theatre with carbolic acid). He wrote up all his findings in 1867 in the medical journal The Lancet and went on to develop a “donkey engine”, a steam-driven device to spray the operating theatre with carbolic acid. A widely publicised operation using the spray in 1877 led other surgeons to copy his methods. He also introduced the use of catgut ligatures, which could be dipped in carbolic to sterilise them, and developed a form of catgut which would dissolve, so that threads no longer had to be left dangling outside the body.

Criticisms of his methods immediately followed:

- Some doctors denied the existence of bacteria in the air and said all the carbolic was unnecessary.
- Lister kept changing his methods to improve them – but this made people think he was changing them because they didn't work.
- Carbolic acid slowed down an operation – many people believed that speed was the most important thing, and it often was if the patient was not to bleed to death.
- When people tried to copy Lister's methods and were not so careful, they didn't work.
- Nurses and assistants in the operations complained of the carbolic fumes and the damage the acid did to their hands.
- The equipment required to do an operation in carbolic was expensive. Not all surgeons could afford it.

By the late nineteenth century, operations looked quite different as they were carried out in antiseptic conditions. Theatres were full of carbolic, and surgeons and nurses worked in clean white aprons and shirts. Instruments were laid out on a clean tray and all used equipment was put straight into a bowl to move it away easily. Nurses had to wear caps to keep their hair from bringing in infection. But at the end of the century, as Koch's identification of germs continued, Lister's antiseptic surgery (getting rid of germs) began to develop into aseptic surgery (no germs in the first place). Rubber gloves were worn (1894) and face masks began to be used (1897). Koch's work showed that heat was more effective than carbolic for sterilising surgical instruments, and the spray began to be abandoned in 1890. By 1910, operating theatres were filled with people wearing sterilised gowns, masks and gloves, using metal furniture and operating under electric lights. Surgeons were actively pursuing higher and higher standards of cleanliness to reduce death rates. They were also able to undertake more complex operations, such as repairing a heart that had been damaged by a stab wound (1896).

Lister was made a baronet in 1883 and a baron in 1897. He was given a huge funeral at Westminster Abbey. The Lister Medal was created in his honour, and is the highest honour that can be given to a British surgeon.

How was the problem of bleeding dealt with?

Bleeding was a concern for two reasons: it made it difficult for the surgeon to see what was going on, which could lead to mistakes, and it could lead to the patient bleeding to death. In medieval times, bleeding was stopped by cautery: placing a red-hot iron on the skin to fuse it together, or pouring boiling oil over it to do the same. In the Renaissance, Ambroise Pare developed ligatures (silk threads) to tie off the blood vessels, which was far less painful but only worked if the threads were tied correctly (difficult amid all the blood). Consequently, cauterisation continued to be the main way of dealing with bleeding until Lister's developments in the late 19th century.

The idea of blood transfusions to replace the blood that the patient was losing went back to the 17th century, when doctors had tried to pump animal blood into patients, without success. In the 19th century, human-to-human transfusions were tried, but no-one could explain why they sometimes worked and sometimes didn't.

In 1901, Karl Landsteiner discovered blood groups. He noted that when patients died during a transfusion it was because their blood was clumping together. Looking carefully at blood cells under a microscope, he found that they carried different types of tiny markers, which he called antigens. The first type of blood he called Group A and the second Group B. When the antigens of the patient matched the antigens of the transfused blood, all was well. If they did not match, the blood would clump and the patient would die. As he continued to test this theory, he discovered that some blood cells had no antigens (so he called them Group O) and some had both types of antigen (so he called them Group AB). Other blood groups were discovered later, and Landsteiner received a Nobel Prize for his work in 1930. He went on to discover the positive and negative (Rhesus) blood types in 1940.

Landsteiner's work made it possible to have a successful transfusion every time. The problem was that a donor was needed on the spot, which was not very practical, so his discovery did not have a big effect on surgery at first. When WW1 broke out and thousands of soldiers were dying of gunshot and shrapnel wounds in the trenches, it was not possible to have donors on the spot. The war drove doctors to find an answer, and in 1915 the American doctor Richard Lewisohn found that adding sodium citrate to blood stopped it clotting when it came into contact with the air. From this, doctors were able to store blood and have it transported to the Front, although it had to be used quite quickly as the blood cells deteriorated after a time. Further experiments showed that refrigeration slowed this deterioration. In 1916, Francis Rous and James Turner found that adding a citrate-glucose solution allowed blood to be stored for even longer. The government was now in a position to ask the public to donate blood before an offensive, in preparation for the casualties. The first British blood depot was set up in 1917 for the Battle of Cambrai, with huge stocks of Group O blood.

How has war helped in the development of surgery?

The First World War brought surgeons into contact with a large number of new wounds. They also had to work on hundreds of soldiers in poor conditions. Surgeons gained an enormous wealth of experience and were able to try out procedures on patients. The war accelerated their training.

The war also pressured surgeons into finding new techniques. For example, shrapnel wounds often contained shreds of clothing which caused infections. Since 75% of the wounds they treated came from shrapnel, surgeons had to find a way of dealing with this. They learnt to soak the wound in saline solution and cut away the infected tissue, which could solve some infections (although not the most serious ones). Surgeons also came up against brain injuries in large numbers for the first time, which led to the development of brain surgery, and huge advances were made in ear, nose and throat (ENT) surgery.

WW1 decisively advanced skin grafts and plastic surgery. Shells caused horrific facial injuries. Harold Gillies set up a plastic surgery unit in Aldershot and dealt with 2,000 cases of facial damage after the Battle of the Somme. He developed the new technique of pedicle tubes, in which a narrow flap of skin was lifted from the body and stitched into a tube at one end, while the other end remained attached to the body so that blood continued to circulate and healthy skin could develop. When the tube was long enough, the free end was attached to the new site and the skin grafted on. Once the new skin had taken and was growing, the tube could be cut free at the base. Gillies was careful to keep precise records of all the procedures he carried out, and these notes allowed plastic surgery to develop further in WW2 and afterwards.

Between 1914 and 1921, over 41,000 men in the British Armed Forces lost a limb. This meant that there had to be new developments in prosthetic limbs. New metal alloys and mechanisms were developed, although waiting lists for these were long.

So war led to a great deal of progress. However, it should be noted that war focused surgeons on wound-related problems, so that other areas of surgery (such as anaesthetics) were neglected for a time.

How have science and technology helped in the development of surgery?

Chemistry played an important role: knowledge of chloroform and carbolic helped with anaesthetics and antiseptics, and knowledge of citrates helped with the storage of blood. Clearly, Louis Pasteur's scientific work on Germ Theory was very important in the development of Lister's ideas, in the move to heat sterilisation and in the creation of aseptic operations (masks, gowns, gloves etc.).


William Röntgen discovered x-rays in 1895 when he noticed that certain rays could pass through human tissue but not bone. After writing up his ideas, he chose not to patent them – which meant that people were free to copy them. As a result, the use of x-rays spread very quickly – the London Royal Hospital had an x-ray machine by 1896. X-rays meant that surgeons could be far more precise when cutting into the body, reducing the need to dig around in the wound and cause further bleeding and infection.

X-ray machines became vital in WW1, when surgeons needed to know precisely where bullets and shrapnel had lodged. Marie Curie helped to set up mobile x-ray units on the Western Front so that surgeons could have access to them near the front line. By the end of the war, surgeons were so familiar with, and reliant upon, x-rays that all hospitals were equipped with them afterwards.

Other technological breakthroughs in surgery in this period included:

- Ophthalmoscope (1851) – allowed the interior of the eye to be seen.
- Hypodermic needle (1853) – allowed blood transfusions.
- Chloroform inhaler (1850s) – John Snow's invention to prevent overdoses.
- Oesophagoscope (1868) – allowed foreign objects in the gullet to be seen and removed.
- Donkey-engine (1877) – Lister's invention to spray an operation with antiseptic.
- Rectoscope (1895) – to see into the rectum.
- Gastroscope (late 1890s) – to see into the stomach.
- Cardiograph (1903) – monitored the beating of the heart during an operation.

How has communication helped in the development of surgery?

During the 19th century a great many scientific and medical journals were established that allowed ideas to be shared. The Royal College of Surgeons had one, which encouraged surgeons to discuss new ideas and problems. The Lancet, the most famous medical journal, published the report of Hannah Greener's death to invite discussion of what had caused it. Lister read Pasteur's work in a journal, which enabled him to develop his carbolic methods. X-rays spread so quickly because Röntgen published his work and made it patent-free.

Surgeons often wanted their work to be recorded, so they made notes and had photographs taken. This meant that later surgeons could build on their work. A good example of this is Harold Gillies, whose photographs are particularly important (what would you have made of the description of his work above if you hadn't had the photographs to help you?!)

Newspapers reporting surgical news to the public helped to popularise new techniques. The best example is the reporting of Queen Victoria's use of chloroform whilst giving birth.

Surgeons and scientists also travelled and visited each other. Lister travelled around Germany and the USA discussing his ideas with other surgeons. He met with Louis Pasteur in 1892 at a conference of 2,500 surgeons. Interestingly, though, Lister did not know about the work of Semmelweis.

To get really good marks in the exam, see if you can show how different factors worked together to produce advances – it rarely comes down to just one thing.