Chapter 1

Hospital Introduction

Key Terms

Accounts receivable (AR)

Acute care facility

Admission process

American College of Surgeons (ACS)

American Hospital Association (AHA)

American Medical Association (AMA)

Ancillary services

Blue Cross

Community hospital

Demographic information

Diagnostic service

Electronic health record (EHR)

Emergency Department (ED)

Evaluation and Management (E/M)

General hospital

Group health insurance

Health Information Management (HIM)

Hill-Burton Act

History and Physical (H & P)

Hospital Standardization Program

Insurance information

Integrated delivery system

Joint Commission on Accreditation of Healthcare Organizations (JCAHO)

Managed care plan

Medical record documentation

Medicare Severity-Diagnosis Related Groups (MS-DRG)

Non-patient care

Palliative service

Peer review

Per diem

Percentage of accrued charges

Prepaid health plan

Preventive service

Primary care network

Primary care physician (PCP)

Private hospital

Professional component

Prospective Payment System (PPS)

Quality Improvement Organization (QIO)

Specialty hospital

Teaching hospital

Technical component

Tertiary care hospital

The Joint Commission (TJC)

Therapeutic service

Third-party payer

Trauma center

Utilization management (UM)

Utilization review (UR)

The purpose of this chapter is to provide an understanding of how hospitals evolved and to provide an overview of a hospital’s organizational structure and functions. Over the centuries, hospitals evolved from ancient healing centers to facilities designed to care for the sick. We will explore some of the most significant factors that influenced the evolution of hospitals from ancient times to today. The complex network of personnel in a hospital is designed to contribute to the accomplishment of hospital goals. The mission of a hospital is to provide effective and efficient patient care. The survival of a hospital is directly related to carrying out the hospital mission and to maintaining a sound financial base. A review of organizational structures and departmental functions will provide a basis for understanding how departments and personnel work together to accomplish organizational goals. It is essential for personnel to understand how they contribute to the organization’s mission. Billing and coding professionals play a significant role in maintaining a sound financial base through accurate coding and billing. Additionally, billing and coding personnel play an important role in ensuring that the hospital is in compliance with billing and coding guidelines. To code and bill hospital services accurately, it is important to have an understanding of the types of hospitals, services provided, and levels for provision of care.

Hospital Introduction

The term hospital comes from the Latin word hospitalis, meaning “a house or institution for guests.” This definition described the hospital when it served as housing for traveling guests, such as the pilgrims who journeyed to distant lands. Today’s definition of hospital is “an institution where the sick or injured receive medical or surgical care.” A hospital is a facility where patients with health care problems go to seek diagnosis and treatment of their condition(s). A hospital may be housed in one building or in many buildings on a campus. Some hospitals are affiliated with medical universities that train medical personnel. Hospitals today have five major characteristics:

The main purpose of a hospital is to diagnose and treat illness. Hospitals provide medical care, including diagnostic and therapeutic services, to patients with various types of illness. Patient illness can range from a simple fracture to a condition that requires an organ transplant. Administrative, financial, operational, and clinical departments within the hospital are coordinated to ensure that patient care is efficient and effective. Hospitals maintain financial stability through the billing process. Medical care provided to a patient must be processed for billing to the patient or other payers. A payer is an insurance company or government program that pays health benefits for patient care services. The complex process of billing health care payers and patients begins when patient demographic and insurance information is obtained. Patient demographic information consists of specific identifying information about a patient, including name, address, date of birth, sex, and Social Security number. Insurance information describes the insurance plan or government program under which the patient is insured, including the plan name and identification number, the group name and number, and the insured’s name and number.

Hospital personnel must record all diagnostic and therapeutic services and items in the patient’s medical record. Examples of diagnostic and therapeutic services include room and board, medical-surgical supplies, medications, laboratory and radiology procedures, and surgeries. The medical record is used to record pertinent information regarding the patient’s condition and treatment. It improves communication among providers, and it serves as supporting documentation for charges billed. Patient conditions, services, and items are coded in preparation for submission of the charges to payers. Coding is the process of translating written descriptions of procedures, services, items, and patient conditions into numeric or alphanumeric codes. A claim form containing information about the services provided and the patient’s condition is used to submit charges to payers. Each payer conducts a review of the claim to determine payment or denial. Patient statements outlining charges, payments, and the outstanding balance are prepared after services are rendered. Patient statements are sent when the patient does not have insurance and when the insurance company determines there is a portion that the patient is required to pay.

The final stage of the billing process involves the collection of outstanding account balances. Accounts receivable (A/R) is the term that describes money owed to the hospital from patients, insurance companies, and government programs, such as Medicare, Medicaid, and TRICARE.

To gain an understanding of the structure and function of today’s hospital, it is necessary to explore historical factors that influenced their development and growth. The evolution of hospitals was greatly influenced by advances in medicine that came in response to cultural and environmental factors affecting health care. The following review of hospital history, from antiquity to the twentieth century, will highlight some of the major factors that contributed to the growth and development of hospitals.

Evolution of Hospitals

The evolution of hospitals began in ancient times when some of the first institutions used for the sick were temples built in Egypt, Greece, and Rome for worshipers of the gods. These institutions evolved to serve as a place to treat injured soldiers during the wars and a place to isolate those with contagious diseases. Throughout history, the spread of infectious disease and the need to care for injured soldiers contributed to the pursuit of new cures and treatments. These pursuits led to medical advances that greatly influenced the growth and development of hospitals.

Ancient Medicine and Healing Centers

The practice of medicine dates back to ancient times. There is evidence of primitive procedures, such as boring a hole into the skull (trephining), and the use of medicinal herbs and fungi dating back to 10,000 B.C. or earlier (Figure 1-1). These procedures and natural remedies appeared to be effective, and research has shown that the treatment of disease and wounds was very “modern” in approach. Despite the “modern” approach, medical care during this period was dominated by religious beliefs, and healing was intertwined with spiritual and ritualistic ceremonies.

Ancient temples in Egypt, Greece, and Rome were said to be the first institutions built for those with an illness. The temples were dedicated to the worship of healer gods. Healer gods, such as Aesculapius, were thought to be responsible for curing illness. Research indicates that the first hospital-like institutions were built between the fifth and second centuries B.C. in places such as India, Ireland, and Rome. Some of these hospitals were built specifically to provide care for the sick, and many were used to provide medical care to injured soldiers. It was only with the rise of Greek civilization that the principles of medicine shifted to a holistic and realistic approach.

Classic Greece and Rome

Contributions of the high cultures of classic Greece and Rome are considered to be the foundations of modern medicine as we know it today. In the historical view, credit for the birth of medicine is given to Hippocrates, an ancient Greek physician. He is credited with changing theories of medicine from those based on supernatural beliefs to those with a scientific foundation. Hippocrates did not believe illness was caused by evil spirits. He believed that illness came in response to poor environment, diet, and hygiene. Hippocrates’s approach to medicine involved observation through physical examination and conservative treatment. If diet and exercise did not succeed, then medicines formulated from minerals and plants were used. Surgery was performed only if treatments with diet and medicine were not successful.

Hippocrates also laid the foundations for the development of the Hippocratic Oath (Figure 1-2). The Hippocratic Oath outlines standards for medical and ethical behavior that physicians follow today. Hippocrates traveled throughout Greece practicing his medicine. He founded a medical school in Greece and began teaching his ideas. Hippocrates is known as the “Father of Medicine” because his philosophies and methods formed the foundations of medicine and treatment we recognize today.

Classic Rome represented a culmination of Greek and Roman civilizations. Rome developed superior armies that conquered Greece by 133 B.C. Many elements of Greek civilization were adopted and modified by Rome. Hippocratic foundations of medicine were advanced by Claudius Galen, a physician from Asia Minor. Galen assumed the role of primary physician in Rome, and his teachings carried into the modern age of medicine. Rome’s contribution to the advancement of medical care was enormous.

Roman hospitals were primarily found in military settlements and were commonly referred to as field hospitals. Medical care provided in field hospitals showed evidence of the first systemized approach to medical care, emphasizing diet, sanitation, and hygiene. Advances in medical treatments and instruments were made using knowledge gained from treating injured soldiers through many wars. Instruments found in a Roman legion’s field kit included suture kits, scalpels, wound spreaders, and splints (Figure 1-3). Drugs known to clean wounds and promote healing were similar to those used today.

Middle Ages

The Middle Ages was a period marked by the chaos of war and the global spread of disease. The growth of hospitals was stimulated as the Roman Empire was converted to Christianity. War and the global spread of disease also contributed to the establishment of many hospitals.

During the Early Middle Ages, the fall of the Roman Empire had a profound effect on all aspects of life, including medicine. As established health and hygiene standards vanished, disease continued to spread. The centers of medicine and learning decayed, and were eventually lost in Europe. However, the growth of hospitals in Europe continued as a result of the Christian commitment to caring for the sick and poor and the need to treat soldiers of war.

For many centuries after the fall of the Roman Empire, the Arab world was the center of scientific and medical knowledge. The Arabs developed and refined Hippocratic theories, and Islamic physicians began to use the regulation of diet and exercise and the prescription of medicinal herbs to treat their patients. Arabic pharmacists became skilled in the formulation of medicines from plants and minerals. By A.D. 931, large Islamic hospitals were involved in the training and licensing of doctors and pharmacists.

A more significant factor in the growth in the number of hospitals during the High Middle Ages was the epidemic spread of contagious diseases. It is said that lack of sanitation and poor hygiene contributed to the spread of infectious diseases such as the black plague, or “Black Death.” By A.D. 1348, the black plague had killed a large percentage of the total European population. Hospitals became well known as places where people with contagious diseases were isolated. Many hospitals became involved in teaching physicians. It was during this period that the title “doctor” was adopted through legislative implementation of curriculum and licensing requirements for physicians. The practice of medicine was now an official profession. Over time, schools and universities were organized for the purpose of training individuals in the field of medicine.

The Renaissance period was initially marked by the establishment of commercial enterprise and a rekindling of the scientific spirit. As commercial enterprise flourished, a shift from villages and feudal estates to cities occurred. The growing economy and increasing population of the cities led to the need for more hospitals. The establishment and endowment of hospitals was accomplished through contributions from municipalities, guilds, and private individuals. The last 200 years of this period were marked by widespread abuse and disorder. Hospital revenues were lost, confiscated, or distributed by the magistrates and used for purposes other than caring for the sick. The continued misappropriation of funds from endowments and charities came to light, and this led to a major upheaval. The responsibility for management of institutions founded to care for the sick and injured began to shift to secular authorities. Toward the end of the Renaissance period, hospitals began to be operated by nonreligious organizations rather than religious orders.

Throughout the Enlightenment period, the patterns established in the Middle Ages continued as towns and cities throughout Europe built hospitals for the purpose of caring for victims of contagious diseases and also to care for the poor. It was during this period that the voluntary hospital movement started with the establishment of a voluntary hospital in England in 1718. These hospitals were funded through voluntary contributions and managed by representatives appointed by the contributors. Many other voluntary hospitals were established from 1719 to 1729. By the close of the century, hospital direction had been assumed by private interests with land and funds. By the eighteenth century, hospitals had been constructed in the new lands colonized in the Americas.

BOX 1-1   Test Your Knowledge



Select the answer option that matches each description below.

Trephining

Ancient temples

Hippocratic Oath

Classic Greece and Rome

Hippocrates

Roman Empire

Doctor

Black plague

Renaissance

Voluntary hospitals


11. ____ Name a primitive procedure performed in ancient times, dating back to 10,000 B.C. or earlier.

12. ____ The first institutions built in Egypt, Greece, and Rome for those with an illness.

13. ____ The standards for medical and ethical behavior that physicians follow today.

14. ____ Contributions of these cultures are considered to be the foundations of modern medicine as we know it today.

15. ____ The name of the Greek physician who is given credit for the birth of medicine and is known as the “Father of Medicine.”

16. ____ The fall of this empire led to the vanishing of established health and hygiene standards, which contributed to the spread of disease.

17. ____ Name the profession that became official during the High Middle Ages through legislative implementation of curriculum and state licensing requirements for physicians.

18. ____ A significant factor in the growth in the number of hospitals during the High Middle Ages was the epidemic spread of this contagious disease.

19. ____ The responsibility for management of institutions founded to care for the sick and injured began to shift to secular authorities during this period.

20. ____ These hospitals were established during the Renaissance period as the result of a movement that shifted hospital direction to private interests.

History of Hospitals in the United States

With the discovery and conquest of new lands came many of the same health care problems experienced in foreign lands. The population in early American colonies grew rapidly, and contagious diseases continued to spread. The need to treat injured soldiers, care for the sick and poor, and isolate those with contagious diseases led to the establishment of American hospitals. Early institutions were primarily military hospitals, almshouses for the poor, and places to isolate those with contagious diseases. Hospitals were later established specifically to care for the sick. American hospitals later took the form of modern-day hospitals. These hospitals became teaching and research environments that contributed greatly to advances made in medicine and ultimately the growth and development of hospitals. To understand the history of hospitals in the United States it is important to explore several factors that influenced the establishment and continued development of hospitals, including public health status, medical practice, scientific advances, medical standards and accreditation, and economic influences on hospital development.

Public Health Status

The health status of the early American population was like that of the countries from which its settlers came. The population in early American colonies grew from around 2,000 in the year 1620 to more than 1 million by 1760. Settlers lived in primitive housing situated near waterways. There were no sanitation systems, and eventually the water became contaminated by human waste. The rapid growth of the population made it impossible to provide healthy food and living conditions. The population was stricken with diseases such as malaria, smallpox, yellow fever, cholera, and typhoid. A limited number of educated physicians had traveled to America; however, there were no known cures for these diseases. It was difficult to obtain care because the population was spread over large areas, and physicians traveled to patients’ homes to treat them.

Medical Practice

The practice of medicine in early colonial America was based largely on religious superstitions and rituals, and it relied heavily on home cures and folk remedies. It was believed that an imbalance of the body’s humors (blood, phlegm, yellow bile or choler, and black bile or gall) was responsible for illness. Medical care was provided by a few trained physicians and by others who were self-taught or learned through apprenticeships. Despite the training of some physicians, there were no known cures for many of the diseases seen in those days. Medical treatment focused on treating the symptoms. Treatments used to bring the humors into balance included bleeding, purging, and trephining. Herbs and other natural remedies were also used.

It is said that the first hospital established in the United States was a military hospital on Manhattan Island, New York. Voluntary hospitals were the first institutions established in the United States specifically to care for the sick. These hospitals were funded with money collected through voluntary contributions. Physicians served without pay, and patients were required to pay for services. Those who could not pay donated time to perform various duties such as cleaning and assisting others. Three of the first voluntary hospitals founded were Pennsylvania Hospital in 1751, New York Hospital in 1771, and Massachusetts General Hospital in 1811. Voluntary hospitals were funded by contributions from the public and by matching government funds. Each hospital was governed by a 12-member board that established policies such as those for admissions and the selection of physicians. The contributors met annually to elect members to serve on the board. These hospitals served as models for modern-day hospitals and teaching institutions.

The cause of many diseases was unknown, so the death rate of hospital patients was very high. Sanitary conditions in hospitals were poor compared with those of today, which resulted in the spread of infection within hospitals. With this came the stigma that admission to a hospital was a last resort. Hospitals were not thought to be a safe place where someone could get well. It was not until scientific advances were made that hospital conditions improved.

Scientific Advances

The focus of medicine shifted to research and scientific exploration of the cause of disease. Discoveries in technology led to the invention of new diagnostic tools used in the identification and treatment of disease (Table 1-1). Diagnostic, technological, and therapeutic advances made in medicine through the 20th century contributed to the growth of hospitals.


TABLE 1-1   Advances in Medicine

Anatomy: A greater understanding of anatomy and physiology (the study of the structure and function of the body) supported the identification and classification of disease. Drawings of anatomy were created based on observations made during autopsies.

Body systems: A greater understanding of various body systems, such as the circulatory, respiratory, digestive, and nervous systems, was gained.

Contagious diseases: Knowledge of how infections were spread increased as theories of contagious disease were developed.

Prevention of disease: The realization that sanitation and hygiene were directly related to disease brought about changes in attitudes toward public health.

Microscopic innovation: Provided a way to examine anatomic tissue and cells.

The creation of gunpowder: Resulted in the need to identify ways to treat gunshot wounds and was instrumental in the development of advanced surgical procedures.

Development of techniques for measurement of blood pressure and temperature: Enhanced the ability to monitor a patient’s status by measuring vital signs, including pulse and respiratory rate.

Advanced exploration in the science of chemistry: Led to the discovery of various laboratory tests to diagnose specific conditions.

Introduction of anesthetic agents: Allowed the performance of more complex surgical procedures through the alleviation of pain.

Discovery of X-ray technology: Provided ways to diagnose various internal conditions such as foreign bodies and fractures.

Medications: Through the study of plants, various drugs were developed and categorized, including antibiotics (penicillin), insulin, and birth control drugs.

Vaccinations: The development of vaccinations prevented thousands of deaths from smallpox, diphtheria, typhoid fever, measles, and chickenpox.

Surgery: The treatment option of surgery enhanced the ability to cure more complex conditions.


Diagnostic advances came as the result of research in the areas of anatomy, body systems, contagious disease, and prevention of disease. Exploration of the human anatomy provided graphic descriptions of body parts that were used to develop principles of the structure and function of body systems. These principles were applied and used to identify organisms that contribute to dysfunction in those systems. Ultimately, the study of anatomy contributed to the ability to identify, classify, treat, and prevent disease.

Technological advances included new equipment used to study disease. The invention of the microscope allowed for the examination of anatomic tissue, cells, and other living organisms. After the creation of gunpowder, new surgical procedures were developed to treat soldiers suffering from gunshot wounds. Equipment developed to measure blood pressure and body temperature enhanced the ability to monitor the patient’s vital signs, which were seen as an indicator of health status. The exploration of chemistry led to the discovery of various laboratory tests used to diagnose specific conditions. More complex surgical procedures were performed after the introduction of anesthetic agents. Diagnostic and therapeutic procedures were improved with the advent of X-ray technology.

Therapeutic advances were made through the study of plants, which led to the identification and categorization of many new medications, such as antibiotics, insulin, and those used for birth control. Vaccinations for diseases such as smallpox, diphtheria, typhoid, measles, and chickenpox prevented thousands of deaths. Surgery became a more viable option with anesthetics, and more complex conditions were treated through the performance of new surgical techniques. Advanced surgical procedures included removal of cancerous tissue, open heart surgery, and organ transplants.

Medical Standards and Accreditation

Early European and U.S. hospitals were developed without knowledge of bacteria and how disease spreads. It was later realized that disease and infection could be prevented through the use of disinfectants and antiseptics. A great contributor to the development of standards of hygiene and sanitation was Florence Nightingale. She worked throughout Europe in military hospitals that were dirty and unsanitary. She later developed standards for nursing that included sanitary measures, which had not existed up to that point. Hospital units, clothing, linen, and surgical instruments were cleaned regularly. The realization that more sterile environments prevented infections led to a general acceptance of hospitals as a place to receive medical care.

As medicine became more scientific, the need for trained physicians and nurses was highlighted. U.S. hospitals contributed significantly to medical education by providing access to a large number of patients available for observation. By the nineteenth century, there were more than 400 medical schools in the United States. A group of U.S. physicians formed the American Medical Association (AMA) in 1847, with a mission to improve the standards of medical education. It is said that the AMA’s efforts in raising medical standards led to the creation of the first state licensing boards. The AMA initiated an accreditation process that ranked schools according to their performance based on a model for medical education established by the Flexner Report. This model is still used in many schools today.

As the role of hospitals changed, it was realized that standards for care provided in the hospital must also be developed. An organization called the Association of Hospital Superintendents was formed in 1898 to provide a forum for hospital superintendents to discuss areas of concern and to share ideas. Membership in this organization was later expanded to include other hospital personnel, and the name was changed to the American Hospital Association (AHA) in 1906. The mission of the organization was to promote public welfare by improving health care in hospitals. The AHA conducted inspections of hospitals and, over time, established standards for the provision of care in hospital facilities.

During the 1900s the expansion of hospitals continued throughout the United States, and thus the need for standardization in hospitals was highlighted. The American College of Surgeons (ACS) was established in 1913 for the purpose of developing hospital standards by collecting patient care data. The ACS was at the forefront of developing and maintaining standards for hospital medical care, which were referred to as the Hospital Standardization Program.

The Hospital Standardization Program established by the ACS was adopted in 1951 by the newly formed Joint Commission on Accreditation of Healthcare Organizations (JCAHO), which was founded for the purpose of developing guidelines for hospitals and other health care organizations. The JCAHO announced its name change to The Joint Commission (TJC) in 2007. TJC evaluates and accredits health care organizations nationally, based on established standards of quality for operations and medical services. Accreditation will be discussed further in the Hospital Regulatory Environment chapter of this text.

Rising Health Care Costs

The rising cost of hospital care made health care a financial burden for people who could not afford it. Individuals and hospitals continued to experience financial difficulty, especially after the Great Depression. In an effort to deal with this financial difficulty, a third-party payer, Blue Cross, introduced one of the first prepaid health plans in 1929 to provide coverage for hospital care. A third-party payer is an organization or entity other than the patient or provider that pays for health care services. A prepaid health plan provides health benefits for specified medical services in exchange for prepayment of monthly or annual premiums. Prepaid insurance spreads the financial burden of hospital care among a large group of members, thus reducing the individual cost. The Blue Cross prepaid health plan provided coverage for 21 days of hospitalization. The prepaid insurance helped people protect themselves against the cost of hospitalization. It also helped to improve the financial position of hospitals, because they could now receive payment for the care provided. During World War II, employers began to offer full-time employees group health insurance as an incentive for employment. Wages could not be increased because of the wartime wage freeze; therefore, pretax dollars were used to purchase the insurance. Group health insurance provides coverage for medical services to members of a group, such as an employer group or association.

Hospital Overcrowding

Hospitals became overcrowded as the number of patients increased because of the growing elderly population, military involvement, and medical advances. Legislative action was taken, and the Hospital Survey and Construction Act, otherwise known as the Hill-Burton Act, was passed in 1946. The Act made funding available, based on state need, through government grants to modernize existing hospitals and build new hospitals. Hospitals receiving the government funds were required to provide care to individuals who were unable to pay, at reduced rates or for free. The Hill-Burton Act provided funding for the growth and development of hospitals for many years.

Many other factors through this period and into the 1960s contributed to the rising cost of health care. The field of medicine began to see more chronic disease, and the need for hospitalization continued to rise. Hospitals expanded services and began to provide laboratory tests, X-rays, and other therapies in outpatient clinics. Another factor included government-sponsored programs created to address the growing concern about the number of uninsured and underinsured individuals among the elderly population.

Government Health Care Programs

In 1965, through an amendment of the Social Security Act (SSA) of 1935, health insurance for the aged was implemented. Medicare was implemented under Title XVIII of the SSA to provide health care benefits for medical services provided to individuals over age 65, the disabled, and other qualified individuals. Another program, Medicaid, was implemented under Title XIX of the SSA to provide health care benefits for medical services provided to specified individuals and low-income families. Medicaid is administered at the state level. Both of these programs will be discussed in greater detail in the Health Care Payers chapter. Through the implementation of these programs, the federal government became one of the largest payers of health care services.
