
CHAPTER 1. Infusion Nursing as a Specialty

Ann Corrigan, MS, RN, CRNI®





19th Century
20th Century
21st Century
Infusion Nursing
Infusion Nurses Society
Certification
Infusion Nurse Specialist
Role Delineation
Scope of Practice
Competency
Education
Summary


Infusion therapy is a highly specialized form of treatment that has evolved from an extreme measure used only on the most critically ill to a therapy used for 90% or more of all hospitalized patients. No longer confined to the hospital setting, infusion therapies are now delivered in alternative care sites such as the home, skilled nursing facilities, and physician offices. Infusion therapy refers to the administration of solutions, medications, nutritional products, and blood and blood components via the parenteral route.


EARLY HISTORY


The practice of using veins to inject substances essentially began in the 17th century, although the first documented use of infusion therapy dates to 1492, when blood from three young boys was administered as a treatment to Pope Innocent VIII (Greenwalt, 1997). It was not until 1615 that Libavius again considered the concept of infusing blood from one person to another. It would be several centuries before person-to-person transfusion became possible, and even longer before it became a safe practice.

In 1628 William Harvey’s experimental work resulted in the development of the theory of circulation, which led to an understanding of blood flow and of the presence and importance of valves. This new information about the circulatory system led others to experiment with injecting substances into the vascular system to observe the effect on the recipient. One of the earliest to document this type of experiment was Sir Christopher Wren, famed architect of St. Paul’s Cathedral in London. In 1656 he experimented with injecting opium and wine into the veins of dogs using a quill and animal bladder (Weinstein, 2007). The first successful injection into humans was accomplished in 1662 by J. D. Major. About this time, Richard Lower presented a paper before the Royal Society in England on intravenous (IV) feeding and blood transfusion in dogs. He was able to demonstrate his transfusion theory with a successful animal-to-animal transfusion (Greenwalt, 1997).

The first documented animal-to-human transfusion is credited to Jean Baptiste Denis. In 1667 Denis, a physician to the French royalty, successfully transfused 9 ounces of lamb’s blood into a 15-year-old boy suffering from madness (Cosnett, 1989; Greenwalt, 1997). Subsequent transfusions to the boy were not successful and resulted in the first documented transfusion reaction. Denis’s initial success led to the indiscriminate use of transfusions, with fatal results. Because of the many fatalities, the Church and the French parliament banned the transfusion of blood from animals to humans in 1687.

Following the 1687 edict banning blood transfusions, little growth in the field of infusion therapy was noted for the next 150 years. The one significant event in the 18th century occurred in 1795, when Philip Syng Physick, from the University of Edinburgh, noted that the use of blood transfusions in obstetric hemorrhage had some success in decreasing mortality from this complication.


19th CENTURY


Infusion therapy, as practiced today, had its real beginnings in the 19th century, a time of rapid advancement in medicine. In 1818 James Blundell performed the first human-to-human transfusion in London (Greenwalt, 1997). In 1834 Blundell again used human blood to transfuse women threatened by hemorrhage during childbirth. Blundell is further credited with recognizing the correlation between blood loss and hypoxemia during hemorrhage.

One complication of early transfusions was coagulation of the blood during the procedure. In 1821 Jean Louis Prevost, a French physician, experimented with preventing this coagulation; Prevost and Jean B. Dumas were the first to use defibrinated blood in animal transfusions (Weinstein, 2007). By 1875 Landois had discovered that the serum of one animal species could lyse the blood cells of another, an observation that later contributed to the understanding of antigen-antibody reactions.

The cholera epidemic of 1831 was an important event in the advancement of infusion therapy. Dr. William O’Shaughnessy, an Edinburgh physician, identified the significance of water and salt loss from the blood of cholera victims. In 1832 Dr. Thomas Latta of Leith, Scotland, is credited with taking this information and experimenting with the administration of a saline solution to a patient. He described the patient as one who “apparently had reached the last moments of her earthly existence and now nothing could injure her. Indeed, so entirely was she reduced that I feared that I shall be unable to get my apparatus ready, ere she expire” (Cosnett, 1989). Latta was initially successful in resuscitating the patient, but she relapsed and eventually died. Dr. Latta wrote in the Lancet of June 12, 1832, “I have no doubt the case would have issued in complete reaction, had the remedy, which had already produced such effect, been repeated” (Cosnett, 1989). The initial success of the saline injection led to further use of this therapy during the epidemic, but these efforts met with only limited success. Of the first 25 reported cases so treated, only 8 recovered, and the therapy drew much criticism, most often from outside the medical journals.

Further work continued and in 1853 Claude Bernard, a French physiologist, experimented with injecting sugar solutions into dogs. For the next two decades, he continued to experiment and infused not only sugar solutions but also egg whites and milk into animals, with some success.

In 1852 the importance of protein in relation to nitrogen balance, weight gain, and general well-being was observed by Bidder and Schmidt and confirmed by Voit in 1866. This correlation between protein and health led to the concept of nutritional support, although the effect of this relationship would not be fully known for another 75 to 100 years.

Major advances were made during the 1860s that would have an impact on infusion nursing and on all of medicine. Louis Pasteur developed the germ theory of disease, and demonstrated that fermentation and putrefaction result from the growth of germs. Building on this theory, Joseph Lister, Professor of Surgery at the University of Glasgow, hypothesized that microbes might be responsible for wound suppuration. He further postulated that infection could be prevented by destroying organisms and preventing contaminated air from coming into contact with the wound (Lyons and Petrucelli, 1987). Lister’s studies describing the use of carbolic acid spray as an antiseptic were published in 1867.

Many physicians began observing strict rules of cleanliness without understanding the implications. In France, surgeons continued to focus on the use of antisepsis during procedures instead of asepsis. Not until the early 1900s were the principles of asepsis fully understood. By then, it was common practice that only sterile items could contact the patient.

The use of gloves for procedures was introduced in 1889, when William Halsted of Johns Hopkins Hospital had the Goodyear Rubber Co. make a pair of rubber gloves for his operating room nurse (Weinstein, 2007). The use of gloves became popular, and by 1899 rubber gloves were being used on all clean cases. Today, gloves provide protection not only for the patient but also for the practitioner.

The last half of the 19th century also saw advances in the field of nutritional support. In 1869 Menzel and Perco of Vienna wrote a paper on the use of fat, milk, and camphor injected subcutaneously. The successful administration of a glucose solution is credited to Biedl and Krause in 1896 (Millam, 1996).


20th CENTURY


For almost 250 years, experiments in which different substances were injected into the body had yielded limited results. As with most of medicine, major advances occurred during the 20th century. By this time, the use of saline and glucose solutions was a more widely accepted practice, although these solutions were still used only in the critically ill patient. With the advent of heat sterilization in 1910, and with the medical profession’s acceptance that everything coming into contact with the patient needed to be sterile, equipment was routinely cleaned and sterilized between uses. The discovery of pyrogens in 1923 led to measures that helped eliminate them from fluids and drugs. Dr. Florence Seibert of the Phipps Institute in Philadelphia solved the serious problem of pyrogenic reactions to IV infusions in 1925, thus paving the way for safer practice (Greenwalt, 1997).


ADVANCES IN THERAPIES


Early in the 20th century, Karl Landsteiner discovered naturally occurring antibodies in the blood that caused a reaction when the blood was mixed with blood from another subject. This discovery led to the identification of the ABO blood groups in 1901. In 1907 Reuben Ottenberg began using blood type differences as a basis for donor selection, and by 1908 Epstein had set forth the hypothesis that ABO blood groups are inherited. Even with this information, transfusion therapy was still potentially fatal. Matching donor and recipient blood types helped reduce the incidence of transfusion reactions, but coagulation during the procedure continued to be a problem. During World War I, Oswald Robertson introduced the use of preserved, anticoagulated blood, and by 1915 sodium citrate was being used successfully as an anticoagulant in blood transfusions (Greenwalt, 1997).

Levine and Stetson discovered the anti-Rh antibody in 1939, and in 1941 Levine and Burnham recognized that maternal alloimmunization to the Rh antigen during pregnancy causes hemolytic disease of the newborn. These developments were important steps toward the safe transfusion of blood.

World War II is important in the history of transfusion therapy because transfusions were used more widely during this time than ever before. Out of necessity, blood was being administered to the wounded troops in an effort to save more lives. Plasma was the first component to be used, and new techniques for the separation of plasma were developed in 1941. It was soon recognized, however, that plasma transfusions could not meet all the needs of the wounded, and by 1943 red blood cells were being salvaged and transfused. In 1962 the first filter to reduce white cell contamination and help reduce fibrin clots was designed. This helped solve an undesirable effect of transfusion therapy that had been recognized for more than four decades.

Today, blood can be separated into many different components, and each component is administered to correct a specific deficiency. Improved techniques make it possible to obtain, test, store, and administer these components. The risk of transfusion therapy has diminished as a result of the discovery and understanding of antigen-antibody reactions and the development of improved methods for detecting bloodborne diseases. Administration sets, filters, infusion and warming devices, and other types of equipment are constantly being modified and improved. Agents developed in the late 20th century stimulate the body’s own production of certain blood components, thereby reducing the need for transfusions and further reducing risks. In addition, research continues into alternatives to the use of blood.

During the 20th century, advances were also being made in the area of nutritional support. Between 1904 and 1906, research on maintaining nitrogen balance for general well-being was conducted, and the rectal administration of protein for nutrition was documented. By 1918 Murlin and Riche were experimenting with the administration of fats to animals as a source of nutrition. The 1930s became a time of intense experimentation in nutritional support. In 1935 Emmett Holt of Baltimore administered an infusion of cottonseed oil and has since been credited with the first infusion of a fat emulsion. By 1939 Dr. Robert Elman, along with Weiner, infused a solution of 2% casein hydrolysate and 8% dextrose without adverse effects. Following this success, various protein hydrolysates were studied, and in 1940 Schohl and Blackfan infused synthetic crystalline amino acids into infants. By 1944 Helfrick and Abelson were able to provide nutritional support to a 5-day-old infant with a solution of 50% glucose and 10% casein hydrolysate, alternated with a 10% olive oil-lecithin emulsion.

With the assistance of Dr. Harry Vars at the Harrison Department of Surgical Research at the University of Pennsylvania, Stanley Dudrick conducted a series of experiments on beagle puppies in an attempt to support them totally by the parenteral route. By the early 1970s Dudrick had proven the effectiveness of protein and dextrose solutions for nutritional support. Today, primarily because of Dudrick’s work, patients can receive parenteral nutrition through the intravenous route and survive diseases and conditions that had formerly resulted in death.

The use of fat emulsions as a caloric source was also investigated, but the severe adverse reactions encountered with the IV administration of these substances led the U.S. Food and Drug Administration (FDA) to ban their use in the United States in 1964. Fats were still being administered in Europe, however, and an emulsion derived from soybean oil was developed. This refined product, which produced no significant side effects, led the FDA to reverse its ban on the IV administration of fat emulsions in 1980, and soybean and safflower oil emulsions were approved for IV administration.


ADVANCES IN EQUIPMENT


Advances in fluids and medications used for IV administration continued to evolve. Medical science provided the information necessary to replace and maintain the body’s fluid and electrolyte balance, to maintain or improve nutritional status, and to treat many disease states intravenously. The technology for administering IV solutions and medications has also advanced since Sir Christopher Wren used the quill, vein, and bladder of an animal for his treatments (Weinstein, 2007).

During the early trials with transfusion therapy, scientists and physicians used feather quills (sometimes with metal tips), animal veins, and animal bladders. This equipment was described in a 1670 Amsterdam publication called Clysmatic Nova. The crude apparatus of Wren was later replaced by metal needles, rubber tubing, and glass containers. Originally, the equipment was designed to be reused and required cleaning, and eventually sterilization, between uses. The first fluid container was an open glass flask covered with a piece of gauze to keep debris out. By the 1930s the container had evolved into a closed, vacuum glass bottle. The technology for refining plastics has done much to improve infusion therapy equipment. Rubber was the precursor of the polyvinyl chloride plastic that was used first for administration sets and later for fluid containers. Today, plastic containers and administration sets are the state of the art in infusion therapy equipment.

Devices for accessing the vein have also progressed rapidly in the last 60 years. Metal cannulas and crude metal needles that required cleaning and resharpening between uses were first used in the 19th century. Problems with infiltration, however, led to the development of the plastic cannula in 1945. These first catheters were made of flexible plastic tubing that required either a cutdown (an incision to access the vein) or a needle for introduction into the vein. In 1950 the Rochester needle, introduced by Gautier and Massa, revolutionized the IV catheter (Weinstein, 2007). The over-the-needle catheter is now the state of the art and is used to deliver almost all peripheral infusions. The metal needle is still available, but it is now a disposable device modified for short-term use.

Another catheter available is the through-the-needle device, which allows the plastic catheter to be threaded into the vein through the needle after venipuncture has been completed. This device was first introduced in 1958 by the Deseret Pharmaceutical Co., and its successors are still used today for the placement of percutaneous central catheters and peripherally inserted central catheters (PICCs).

Peripheral IV catheters, both metal and plastic, are available in a variety of gauge sizes to allow for the delivery of different therapies to all age groups. Gauges range from a large lumen (12 gauge) to a neonatal size (27 gauge). Central catheters also come in a variety of sizes, from 1.8 French to 13 French. Catheters are available in varying lengths, from 0.75 to 30 inches or longer. The length is generally determined by the route of administration, peripheral or central, and by the size and age of the patient. The Infusion Nursing Standards of Practice recommends using the shortest catheter with the smallest gauge that will accommodate the prescribed therapy (INS, 2006).

Before 1949 IV therapy could be administered only through a peripheral vein. At that time, Meng and colleagues documented the use of a catheter placed in the central venous system of a dog to administer a hypertonic dextrose and protein solution. The subclavian puncture for accessing the central veins came into more frequent use after Aubaniac described it in 1952, based on his experience in Vietnam. In 1967 Dudrick adapted the subclavian approach for the administration of high concentrations of dextrose and proteins, which minimized the side effects caused by the tonicity of the solution.

Further expansion on this concept led to the development of a catheter that is placed in the subclavian vein and then tunneled under the subcutaneous tissue to exit on the chest wall. Originally designed for use with children, this catheter became known as the Broviac catheter. A size appropriate for adults, the Hickman catheter, was developed soon after. The evolution of the Hickman-Broviac catheter has allowed for the administration of therapies over long periods, with minimal technical complications. It has also revolutionized infusion therapy by allowing safer administration of solutions in the home setting.

The 1980s saw further evolution of the use of central venous access with the introduction of the totally implanted system. This system consists of the central catheter and a device referred to as a port. The catheter is placed by percutaneous puncture into the subclavian vein, and the port is placed in a subcutaneous pocket generally on the chest wall. Access to this port is by puncture through the skin with a specially designed needle for the portal septum. By the end of the 20th century, venous ports were being placed in the arm with the catheter threaded into the superior vena cava via the basilic or cephalic vein, and ports were being placed for other therapies such as epidural pain management.

The PICC was introduced in the last quarter of the 20th century. Its advantages include lower overall risk, fewer injuries during insertion, and a reduced risk of infection. It can be inserted by a registered nurse (RN) who is trained in its placement.

Peripheral catheters primarily consist of a single lumen, although experiments have been carried out with dual lumen catheters. Central catheters—percutaneous, PICC, tunneled, and ports—are available in both single lumen and multilumen design. The multilumen design makes it possible for multiple therapies to be delivered through one device, thus sparing the patient from numerous venipunctures.

To make the delivery of infusion therapy safer, various administration devices have been developed. Filters were first used in 1943 to remove fibrin clots during blood transfusions. Filters are now of two types—screen and depth—and are available in a number of micron sizes. Filters remove particulate matter from the solution and, depending on the micron size, can also eliminate air and remove endotoxins.