The Health Care System




Introduction to Health and the Health Care System


The World Health Organization (WHO) defines health not merely as the absence of illness or disease, but as a state of complete physical, mental, and social well-being in which the individual feels well and is able to carry out the daily functions of life without difficulty or pain. In reality, no one reaches this optimum level of health. Everyone has aches and pains, psychological if not physical.


In our health care system, the physician's responsibility is to examine hundreds of people in the course of a week and to focus on medical problems that meet two criteria: the problem causes, or could cause, serious difficulty in carrying out the daily functions of life, and the problem can be treated, either by reducing the effects of its symptoms or by eradicating it altogether.


Each individual the physician sees has a different group of presenting physical symptoms as well as a different set of social circumstances and emotional issues. The physician listens to the patient’s description of his or her life, performs objective laboratory and diagnostic tests, identifies medical problems, and assesses the nature of each problem.


Physicians know that the vast majority of medical problems do not pose a long-term threat to health. Most medical conditions get better over time. Effective treatments are available to cure many conditions (curative treatment). In other cases the physician can reduce the symptoms even if the underlying medical condition is not significantly affected. This type of treatment is called symptomatic treatment (responding to symptoms) or palliative treatment (seeking to reduce the effects of a disease or condition without curing the underlying disease). For example, a patient with a urinary tract infection who is given a prescription for antibiotics receives curative treatment, whereas a patient who has diabetes mellitus receives palliative treatment. The patient is prescribed insulin, which alleviates the symptoms of the diabetes; however, the treatment does not cure the diabetes.


Most treatments are based on scientific study. In Western scientific medicine, as in no other medical tradition, approaches to diagnosis and treatment have been studied and tested over hundreds of years. As long ago as the fourth century BC, the Greek physician Hippocrates believed that disease was not a punishment for transgressions against the gods, but rather the result of physiologic and environmental factors that could be studied. Since the time of Hippocrates, the practice of medicine has changed considerably in response to scientific discoveries (Table 1-1).



Table 1-1


Milestones in the History of Medicine


3000 BC Writings about the circulation of blood in China.
c. 460 BC Birth of Hippocrates (called the “Father of Medicine”) in Greece—based medical care on observation and believed that illness was a natural biologic event.
1514-1564 Andreas Vesalius—wrote the first relatively correct anatomy textbook.
1578-1657 William Harvey—discovered circulation of blood (England).
1632-1723 Antony van Leeuwenhoek—built powerful simple microscopes and was the first to observe microorganisms (Holland).
1728-1793 John Hunter—developed and taught scientific surgical techniques.
1749-1823 Edward Jenner—first vaccine for smallpox (England).
1818-1865 Ignaz Semmelweis—theorized that handwashing prevents childbirth fever (Austria); his theories were rejected during his lifetime and not accepted until the work of Pasteur and Lister.
1820-1910 Florence Nightingale—began training for nurses; established first nursing school; before this time nurses received no training and the profession had little status (England).
1821-1910 Elizabeth Blackwell—first woman to complete medical school in the United States; helped establish medical colleges for women in both the United States and England.
1821-1912 Clara Barton—acted as a nurse on the battlefields of the Civil War; was a civil rights activist and suffragette; organized the American Red Cross.
1822-1895 Louis Pasteur—developed pasteurization of wine, beer, and milk to prevent growth of microorganisms; microbiology (France).
1827-1912 Joseph Lister—applied the germ theory to surgery; his experiments with carbolic acid (phenol) and other antiseptics laid the groundwork for modern surgery (England).
Mid-1800s First large hospitals, such as Bellevue, Johns Hopkins, and Massachusetts General, established in U.S. cities. Discovery of anesthesia in the United States is credited to a Southern physician named Crawford Williamson Long.
1843-1910 Robert Koch—isolated the bacteria that cause anthrax and cholera; established principles to determine that a specific type of bacteria causes a specific disease (Germany).
1845-1923 Wilhelm Roentgen—discovered x-rays in 1895 (Germany); his discovery was soon followed by the work of Marie Curie (1867-1934) and Pierre Curie (1859-1906) on radioactivity and radium.
1851-1902 Walter Reed—proved that yellow fever is transmitted by mosquitoes, not by direct contact, while working as a U.S. Army physician in Cuba. An aggressive mosquito-control program later made it possible to complete the Panama Canal.
1854-1915 Paul Ehrlich—coined the term chemotherapy; predicted autoimmunity; developed Salvarsan (arsphenamine), an effective treatment for syphilis, in 1909. This led to the development of sulfa drugs and other antibiotics (Germany).
1881-1955 Alexander Fleming—discovered penicillin in 1928 (England), but an efficient method of producing large amounts was not developed until it was needed in World War II. Other antiinfective medications, such as the sulfa drugs, were developed during the same period.
1891-1941 Frederick Banting—co-discoverer of insulin with Charles Best and John Macleod in 1922 (Canada).
1906-1993 Albert Sabin—developed oral polio vaccine.
1914-1995 Jonas Salk—developed parenteral polio vaccine.
1922-2001 Christiaan Neethling Barnard—South African surgeon who performed the first human-to-human heart transplant in 1967.
1978 Birth of Louise Joy Brown, the first child born by in vitro fertilization, in Great Britain.


Shift From Hospital-Based to Community-Based Health Care


Three trends running in parallel through modern medicine have led to an increasingly important role for office-based health care.


The first trend is the desire by those who pay the bills—employers, the federal and state governments, and insurance companies—to reduce the costs of health care whenever and wherever possible.


A second trend is the pressure on medical offices and clinics to provide a broad range of diagnostic and treatment services so that patients do not have to be admitted to the hospital. The rising cost of hospitalization has created this pressure, while developments in diagnostic equipment, the increased availability of home health care, and less invasive surgical procedures have made the shift possible.


The third trend is an increased understanding, through empirical evidence (information learned from experimental research), that people feel better the less they must be confined to a hospital or travel to one for treatment. Being diagnosed and treated in an outpatient setting, with follow-up at home, allows people to feel more in control of their lives as medical patients. This is especially important for people who have frequent contact with the medical system, such as the parents of infants and children, the elderly, and those with chronic illnesses. Many people who would have been hospitalized for long periods, or possibly even institutionalized, 50, 25, or even 10 years ago are currently living independently in the community.


Today the hospital’s role is primarily to provide acute care and diagnostic services. In order for a patient to be hospitalized, his or her condition must be unstable or necessitate constant regulation of therapy. If the patient does not meet these strict criteria, he or she goes home to be followed as an outpatient; is transferred to a rehabilitation facility for intense, regular rehabilitative treatment; or is sent to a nursing home for long-term maintenance care.



Managed Care Versus Patient Care: Competing Forces Facing the Medical Office in the Twenty-First Century


Fee-for-Service Insurance Plans


Traditionally, medical care in the United States was paid for on a fee-for-service basis. Each service was billed and paid for as a separate charge: so much for the office visit, so much for the electrocardiogram, so much for the urinalysis, and so on. Fee-for-service payments can be thought of as ordering food at a restaurant à la carte: so much for the main course, so much for a salad, so much for coffee.


During the first part of the twentieth century, health insurance (if the patient had any) paid only for hospitalization, and usually the patient completed most of the paperwork. Health insurance is a system by which a person or the person’s employer pays an insurance company a yearly amount of money, and the insurance company pays some or most of the person’s medical expenses for that year. The theory behind insurance is that, although a few people will have large medical bills over the course of the year, most people will have small bills. By setting the fee for everyone at a level above the actual cost of care for most people, the insurance company can pay for the care of the well, the occasionally ill, and the often ill and still make a profit.
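The arithmetic behind this kind of risk pooling can be sketched with a small, purely hypothetical example; the enrollment counts, costs, and margin below are invented for illustration and do not come from this chapter or from any real insurance plan.

```python
# Hypothetical illustration of insurance risk pooling.
# All enrollment numbers, costs, and the margin are invented for demonstration.

# Annual medical costs for three groups of enrollees
groups = {
    "well": {"members": 700, "avg_cost": 300},                # rarely see a physician
    "occasionally_ill": {"members": 250, "avg_cost": 2_000},
    "often_ill": {"members": 50, "avg_cost": 30_000},          # a few members with large bills
}

members = sum(g["members"] for g in groups.values())
total_cost = sum(g["members"] * g["avg_cost"] for g in groups.values())
break_even_premium = total_cost / members

# Setting the premium above the break-even level covers everyone's care
# and leaves a margin for administration and profit.
premium = round(break_even_premium * 1.15, 2)

print(f"Total expected claims: ${total_cost:,}")
print(f"Break-even premium per member: ${break_even_premium:,.2f}")
print(f"Premium charged (15% margin): ${premium:,.2f}")
```

In this sketch a small number of often-ill members account for most of the expected claims, yet a premium set modestly above the average cost per member is enough to cover the whole group.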


This system encouraged health care providers to provide a high level of care for everyone with health insurance because the insurance paid for every test and every procedure. Physicians’ incomes soared between the end of World War II and the early 1980s. With the increasing costs of laboratory and diagnostic testing, hospital services, and office visits, the cost of medical care increased far more rapidly than the cost of other goods and services in the U.S. economy. (In economic terms, health care inflation increased much more rapidly than the general rate of inflation.)


During this time, ever-better health insurance became a standard employee benefit at many companies. The first kind of health insurance offered, in the 1950s and 1960s, was coverage for hospital care. Coverage for office visits became standard in the 1970s.



Highlight on the History of Medical Treatment of Infectious Disease


Historians generally place the beginning of Western medicine with Hippocrates, an ancient Greek physician who saw medicine as an independent discipline based on clinical practice rather than prayer and ritual. For several centuries there were few treatment methods other than rest, exercise, diet, and a few medications derived from plants. The intensive study of the human body in the 1500s and 1600s fostered a better understanding of physiologic processes. For example, the English scientist William Harvey rejected the traditional belief that blood was made up of “spirits” and that body fluids were “humors”; he developed a theory, later proved true, that blood flows from the heart to the lungs, throughout the body via arteries, and back to the heart via veins.



Microscopes powerful enough to reveal living things invisible to the naked eye were built in the 1670s by the Dutch scientist Antony van Leeuwenhoek. Through his microscopes van Leeuwenhoek saw yeasts, molds, and algae, adding evidence to the theory that diseases could be caused by particles too small to be seen with the eyes. He also observed red blood cells passing through capillaries.


Throughout the nineteenth century, other scientists and physicians advanced the understanding of the cause of disease. Some found ways to combat disease without understanding the mechanism by which the disease acted; others determined the actual cause of a particular disease.


In the 1840s the Hungarian physician Ignaz Semmelweis, working as an obstetric assistant in Vienna, discovered that puerperal fever, or so-called “childbed fever,” an often fatal illness of women who had just given birth, could be reduced if physicians washed their hands. He came to believe that physicians were infecting women by transferring disease-causing substances from one woman to another.




Semmelweis conducted what today would be called an epidemiologic study. He studied the records of women who had died and determined which physicians and medical students had attended each birth. The records led him to conclude that most of the women who died had been attended by physicians and medical students who had come into the birthing room directly from the anatomy laboratory, where they had worked with cadavers, without first washing their hands. Most of Semmelweis's colleagues dismissed as nonsense his notion that simple handwashing could reduce childbirth deaths, and during his lifetime Semmelweis was ridiculed. It was not until decades later that physicians began washing their hands routinely.


The English surgeon Joseph Lister built on similar ideas to develop the first practice of antisepsis (destroying germs in areas where they may be present) and, later, asepsis (creating a germ-free environment). Lister started by applying carbolic acid to the wounds of patients who had just had surgery. Over time, he found milder substances. Lister found that far fewer patients who were treated with these substances died from gangrene that developed in the open wounds.


Semmelweis, Lister, and others worked empirically, which means they relied on experiments whose results could be reproduced. Although they were able to decrease infection rates, they never completely understood what caused infectious diseases. Other scientists set out to prove that specific bacteria caused specific diseases.


The German physician Robert Koch is called the “Father of Microbiology” because of his work with specific bacteria such as Mycobacterium tuberculosis, the bacterial agent that causes tuberculosis. Koch also isolated the bacterial agent that causes anthrax. Koch grew the anthrax bacillus in a number of different liquid media in his laboratory, used the microscope to identify it, injected the organism into a healthy animal, waited for the animal to become sick, and then recovered the same organism from the sick animal. This proved that one specific type of bacteria causes one specific disease. Today we know that it is possible to break the chain of illness by keeping those who are contagious away from those who are vulnerable to disease.


The work of Louis Pasteur and Koch, among others, helped set the stage for the understanding of infectious disease and for worldwide vaccination programs to eradicate smallpox and to try to eradicate the “childhood illnesses” of mumps, measles, and rubella (German measles).


The first vaccination had actually been performed a century earlier. In 1796 Edward Jenner, an English physician in the farming country of Gloucestershire, used the pus from one person's cowpox lesion to vaccinate another individual against smallpox.



Cowpox is a disease closely related to smallpox, but it causes only a relatively mild illness in cattle and in humans. For centuries, people had noticed that those who had been infected with cowpox did not develop smallpox. Today we understand why: their immune systems had developed antibodies to cowpox that also attacked the smallpox virus and so prevented smallpox infection.


Jenner used “humanized cowpox” to establish immunity: he took pus from a lesion on a person infected with cowpox and rubbed it into an open wound on another person. A couple of weeks later, he inoculated the second person with smallpox. Not only did the individual not become ill, but he also was not contagious. Almost a century later, Pasteur extended the principle of vaccination to other diseases, and vaccines were eventually developed for many illnesses. In the early twentieth century, vaccines were developed for diphtheria and tetanus, and by the middle of the century most children received these vaccines as infants. New immunizations continue to be developed, not only for infants but also for adolescents, adults, and the elderly.


Medications to kill bacteria were another important tool in the fight against infectious disease. Paul Ehrlich is credited with developing the first such medication: in 1909 he developed a drug called Salvarsan (arsphenamine), which could be used to treat syphilis effectively. Unfortunately, the medication itself was extremely toxic. The first of the sulfonamide (sulfa) drugs, Prontosil, was developed in 1932 in Germany. It was effective against infections caused by streptococci and some other types of bacteria, and the sulfonamides became popular before and during World War II because they were the only antiinfective agents widely available. Penicillin, an antibacterial substance produced by a mold, had been discovered in 1928 by Alexander Fleming in London after the mold contaminated agar plates on which he was growing bacteria and killed the surrounding colonies. Initially the scientific community did not believe that it would be effective inside the body, and little follow-up research was done. During World War II, two medical researchers, Howard Florey and Ernst Chain, took up the research on penicillin and proved that the medication was effective. The first human patient was treated in 1941, and within a few years mass production had been established and penicillin was in widespread use.


Discovery of the first virus is credited to Dimitri Ivanowski, a Russian botanist, in 1892. He discovered that a substance could pass through a ceramic filter that trapped all known bacteria and still cause a disease of tobacco plants called tobacco mosaic disease. We now know that the culprit is the tobacco mosaic virus. Yellow fever was the first viral disease of humans to be identified. During construction of the Panama Canal, workers were devastated by this disease. Research done by Walter Reed established that it was caused by a virus transmitted by mosquitoes, not by direct contact, and controlling mosquitoes facilitated the work on the canal. The development of the electron microscope in the early 1930s allowed viruses to be seen, but progress in controlling viral diseases was slow. For most viruses the body has adequate defenses to overcome the infection, but there are some significant exceptions. The retroviruses, such as human immunodeficiency virus (HIV), are notable because they are able to overcome the body's immune system. In the early 1980s the first deaths from acquired immunodeficiency syndrome (AIDS) were reported in the United States, and within the next 20 years a worldwide epidemic occurred. By 1997 more than 6 million deaths worldwide had been caused by the AIDS virus. Treatments have been developed to slow the progression of the disease, but to date there is no effective immunization or cure for this disease. The ability of viruses to mutate rapidly has also led to recent outbreaks such as severe acute respiratory syndrome (SARS) in 2003 and the H1N1 influenza pandemic in 2009.



Government Insurance Plans


Recognizing that large segments of the population lacked insurance because they were not employed, the federal government began to provide health insurance programs in the 1960s. Medicaid initially provided health insurance for low-income children without parental support and later expanded to cover other low-income groups. Medicare provided health insurance for the elderly, the disabled, and those with end-stage kidney disease. The Civilian Health and Medical Program of the Uniformed Services (CHAMPUS; now called TRICARE) provided health insurance for dependents of active-duty military personnel. With these programs, the federal government has become the primary insurer for more than 50 million Americans. These plans, which included payments for office visits for illness, greatly increased the number of Americans who had medical insurance. There was little incentive for the consumer (the patient) to control costs, because insurance was covering those costs and care in most cases was “free” to the consumer.


Although most Americans who were insured did not feel that they were “paying” for their medical care, they were, indirectly. The huge increases in health care costs were one of the major sources of the generally high rates of inflation in the 1970s. Employers, who paid for the insurance, faced ever-rising premiums and offset these large premium increases by granting only small increases in cash wages, which did not keep up with inflation. So American workers did, in fact, pay for health insurance and health care costs through the lower purchasing power of their salaries.



Managed Care


Health maintenance organizations (HMOs) were originally formed on the belief that consistent, routine care would help prevent more expensive care later. When health insurance first expanded to cover office visits, it covered only visits for illness or injury, not so-called “routine care” (well-child visits, immunizations, regular checkups, or physical examinations). Managed care was based on the belief that increasing prevention and promoting early detection and diagnosis of chronic and life-threatening medical conditions would reduce costs. The HMO movement, which gained acceptance in the 1970s, pushed traditional health insurance companies to begin providing coverage for routine care.


In the late 1970s, insurance companies began to respond to escalating health care costs by reviewing care to find out if it was medically necessary. This process, called utilization review, identifies patients who, according to the insurance companies, no longer need to be hospitalized. Originally, utilization review was used by Medicare and Medicaid. Other insurance companies soon realized that shortening hospital stays was an important way of reducing overall health care costs. The combination of HMO insurance plans and strict utilization review for hospitalized patients is the basis of what we call managed care.


The original HMO model combined two components: insurance and the delivery of services, including diagnostic tests and pharmacy. HMO plans set up full-service medical clinics in which the physicians were employees. The HMO established a contractual relationship with a hospital for inpatient services, and patients had to go to the specific hospital with which the HMO had a contract.


In the late 1980s HMO services began to separate from HMO insurance. A second type of HMO model, based on networks of physicians who agreed to provide care for HMO patients, came into being. Some of these networks operated under the old fee-for-service arrangements but agreed to discounted fees from the HMOs in exchange for access to the rapidly growing patient populations enrolled in HMOs. In an effort to reduce payments further, HMOs tried to have physicians accept a flat monthly fee for each subscriber in their practice and agree to provide all necessary primary care for that fee. This type of payment is called capitation. It reduces the incentive to provide extra services, because their cost is not reimbursed separately.
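To make the incentive difference concrete, here is a minimal, hypothetical comparison of the two payment methods; every figure (visit counts, fees, and the capitation rate) is invented solely for illustration and does not reflect actual reimbursement rates.

```python
# Hypothetical comparison of fee-for-service and capitation payment,
# using invented figures purely to illustrate the incentive difference.

patients = 500
visits_per_patient_per_year = 3

fee_per_visit = 80             # fee-for-service: each visit is billed separately
extra_services_per_visit = 25  # tests and procedures are also billed separately

capitation_rate_per_month = 22  # flat payment per enrolled patient, regardless of use

fee_for_service_revenue = patients * visits_per_patient_per_year * (
    fee_per_visit + extra_services_per_visit
)
capitation_revenue = patients * capitation_rate_per_month * 12

print(f"Fee-for-service revenue: ${fee_for_service_revenue:,}")
print(f"Capitation revenue:      ${capitation_revenue:,}")
# Under capitation, revenue does not rise when more visits or tests are provided,
# which removes the financial incentive to deliver extra services.
```

Under fee-for-service, every additional visit or test increases the practice's revenue; under capitation, the monthly payment is fixed no matter how many services the patient uses.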


The managed care movement in general, and the trend toward decreased reimbursement for primary care in particular, put the burden on physicians to compete with one another to provide the most care for the least money. As a result, physicians often feel pressure to limit diagnostic tests, reduce hospitalizations and the number of days patients stay in the hospital, and use generic instead of brand-name drugs. (Generic drugs contain the same active ingredients as their brand-name equivalents and can be manufactured only after the brand-name drug's patent protection has expired.)


Managed care also puts pressure on physicians to see more patients, spend less time with each patient, and justify all services including diagnostic tests and referrals. The expense of handling sicker patients is expected to be balanced by those patients who use less than the average amount of medical services.


In addition, insurance plans have tried to reduce their costs for prescription medications by restricting drug coverage to lists of approved drugs. Such a list, called a formulary, usually includes one or two of the less expensive drugs for each possible medical condition. Exceptions are made if the physician can show that the less expensive drugs have been ineffective for his or her patient and that a more expensive drug is necessary. In some plans the patient can receive a more expensive medication by paying more of the cost.
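As a rough sketch only, the exception logic described above might be modeled as follows; the condition, drug names, and approval rules are hypothetical and are not drawn from any actual formulary or insurance plan.

```python
# Hypothetical sketch of formulary logic; the condition, drug names,
# and approval rules are invented for illustration only.

formulary = {
    # condition: covered (less expensive) drugs, listed cheapest first
    "hypertension": ["generic_drug_a", "generic_drug_b"],
}

def review_prescription(condition, requested_drug, failed_drugs=()):
    """Approve a formulary drug, or grant an exception if the covered
    alternatives have already been tried and found ineffective."""
    covered = formulary.get(condition, [])
    if requested_drug in covered:
        return "approved"
    if covered and all(drug in failed_drugs for drug in covered):
        return "approved as an exception"
    return f"denied: covered alternatives are {', '.join(covered)}"

print(review_prescription("hypertension", "generic_drug_a"))
print(review_prescription("hypertension", "brand_drug_x"))
print(review_prescription("hypertension", "brand_drug_x",
                          failed_drugs=("generic_drug_a", "generic_drug_b")))
```

The sketch mirrors the rule described in the text: the less expensive listed drugs are approved outright, and a more expensive drug is covered only when the physician documents that the covered alternatives have failed.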



Health Care Reform


Despite these measures, however, beginning in the late 1990s both insurance premiums and health care costs began to increase at double and even triple the underlying rate of inflation. There has also been an increase in the number of individuals and families who do not qualify for government insurance plans and do not have health insurance through their employers, often because they work part time or are self-employed. The Patient Protection and Affordable Care Act, which became law in March 2010, expands insurance coverage to an estimated 32 million Americans who were previously uninsured. Among the provisions that went into effect in September 2010, insurance companies are no longer allowed to exclude children with preexisting health conditions or to drop customers after discovering technical mistakes on applications. By 2014 the law will require all individuals to purchase health insurance or pay an annual fine. Even though the law has been challenged in the courts, a strong belief persists that society has an obligation to make appropriate health care accessible to all citizens.
