2
History of Long-Term Care
CAROLE HABER
CHAPTER OVERVIEW
This chapter focuses on the development of long-term care in the American colonies and the United States from the 17th through the 20th centuries. It demonstrates the effects economic, demographic, political, and cultural changes have had on both the attitudes and practices shaping long-term care, as well as the lasting impact of historical ideas and institutions.
LEARNING OBJECTIVES
After reading this chapter, you should understand:
• The English philosophy of care that shaped early colonial policy
• Why almshouses were established and their impact on support of the poor and aged
• The role and impact of 19th-century religious beliefs on institutional care
• The foundation of old-age homes in 19th-century America
• Beliefs about older adults in the first half of the 20th century
• The effect of Social Security on institutional care
• How legislative and judicial rulings in the second half of the 20th century affected long-term care
KEY TERMS
Americans with Disabilities Act (ADA)
Federal Nursing Home Reform Act (OBRA ’87)
Omnibus Budget Reconciliation Act of 1987 (OBRA ’87)
EXHIBIT 2.1 Long-Term Care Timeline
1601 | English Poor Law |
1612 | First hospital erected in Jamestown |
1664 | First almshouse built in Boston |
18th century | Development of almshouses throughout the colonial era and early republic |
Late 18th and early 19th centuries | Cities attempt to eliminate outdoor relief |
Early 19th century | The Second Great Awakening changes attitudes about institutional reform |
1810–1840 | Cities and states build mental hospitals, orphanages, prisons |
1817 | First old-age home established in Philadelphia |
1900–1935 | Progressive attitudes reevaluate the almshouse |
1935 | Social Security Act established |
1950 | Key Social Security Amendments allow payments to public institutions |
1960 | Kerr–Mills Act—Medical assistance to the aged |
1963 | Fitchville Fire illustrates need for federal support |
1965 | Passage of Medicare and Medicaid |
1987 | Federal Nursing Home Reform Act (OBRA ’87) |
1999 | Olmstead Decision |
INTRODUCTION
In recent years, issues surrounding long-term care have become both a national concern and, for many, a difficult family and personal issue. Yet such matters—although certainly magnified by the number and proportion of elderly individuals in the population—are neither new nor the creation of a very recent past. In the United States, since its colonial beginnings, government authorities, welfare advocates, medical experts, family members, and the old and infirm themselves have all struggled to define and create suitable long-term care solutions. Not surprisingly, over the course of 400 years, although the problems of meeting the needs of these groups have remained, the possible options available for addressing their demands have significantly evolved. Individuals who required extensive care in the colonial era faced a very different landscape from that of their modern counterparts. Some ended their days in newly established almshouses, whereas others depended on the benevolence of their communities or assured their continued care through legal agreements with family members. No nursing home existed to provide extended care; no national government program offered financial assistance, allocated medical aid, or set institutional regulations. By the mid-20th century, the changing institutional, medical, and political conditions that defined the long-term care of the old and infirm resulted in a range of options that spoke to modern realities and attitudes. At the same time, these alternatives harkened back to the beliefs of the past and the long shadow cast by the first long-term care institutions.
COLONIAL AND EARLY 19TH-CENTURY PRACTICES
Among the first White settlers in British North America, few who dared to cross the ocean were of great age or incapacity. In the first few decades of passage, only the young seemed foolish or desperate enough to undertake the voyage. At times, nearly half of those who embarked on the journey were unable to withstand the storms and privations of the trip; the Atlantic Ocean became their final resting place. As was well known in England, the New World was hardly a hospitable place for the old or disabled. In the first century of settlement, therefore, colonial governments provided little institutional care. In Jamestown, the first hospital, built in 1612 to accommodate 80 “sicke and lame patients,” burned down in 1622. Another century would pass before the establishment of a stand-alone hospital (Duffy, 1976).
In time, of course, the earliest surviving settlers did grow old and others were incapacitated by accident, disease, or childbirth. In fact, within a century of settlement, the British colonies boasted of some of their inhabitants’ advanced age. In the 1700s, even with the high fertility rates, 5% of the population was above 60 (Wells, 1975). In a society in which the median age of the population was around 16, such individuals often attracted the notice of commentators. Their gray hair and stooped backs were signs not only of their remarkable longevity, but also of the community’s need to offer support when necessary (Haber, 1983).
For these individuals, as well as those who were incapacitated or mentally challenged, both tradition and law dictated that they be cared for in their own homes or in the residences of others. Following the principles established by the English Poor Law in 1601, colonial governments decreed that families were required to provide for the needs of their relatives; kin were to take in the orphaned or infirm; and the old were to be sheltered alongside their offspring. If individuals required additional support, they often received “outdoor relief” in the form of a cord of wood, food, or clothing.
Although relatives generally opened their doors to the needy, not everyone found coresidence to be ideal. The colonial law that specified that kin had the financial and moral responsibility to care for their relatives did not dictate how that obligation was to be met. If they chose to place the individual with others, government officials generally approved such an arrangement. In 1715, for example, Mary Thomas of Boston explained to the selectmen that she had contracted care for her mother in Dorchester. Although she did not specify why she had chosen this course, the town leaders seemed satisfied that her aging mother would not be deserted or neglected (Record Commissioners of the City of Boston, 1884).
Even in colonial society, some failed to fulfill their responsibilities entirely. In response, both ministers and magistrates railed against families who had turned their backs on their kin. Samuel Willard, for example, chastised the young who “desert their helpless parents, as thinking it now time to look to themselves, and let them shift as they can” (Willard, 1726, p. 608). Other ministers went directly to the families to remind them of their duties. In 1713, Cotton Mather, the esteemed Puritan minister, noted, “there is an aged woman in my neighborhood, poor, lame, and sick, and in miserable circumstances. I must not only releeve [sic] her myself but also summon together her several relatives, that they agree to have her well-provided for” (Mather, 1912, p. 208). Local officials were also quick to take action, ready to penalize those who failed to meet their obligations. In the late 17th century, for example, the selectmen of Boston notified James Barber that if he did not provide care for his father, “you may expect wee [sic] shall; prosecute the law upon you” (Record Commissioners of the City of Boston, 1692, p. 62).
The colonial tradition of providing long-term care for the needy did not rest on sermon and statute alone. Rather, the agrarian nature of society established the economic basis for long-term care. Following the practices of England, upon settlement in the New World, land was distributed to male heads of household. Retaining title to the property until their deaths, aging individuals were able to rely on this valuable asset to guarantee the continued residence and support of at least one family member. Sons knew that if they deserted the household, they would also be leaving behind their inheritance. Even later generations of colonists, who began to deed some of the property to their heirs before their demise, made sure they retained the status of household head with at least one child in the household (Greven, 1970).
In the early 18th century, Joseph Abbot of Andover, Massachusetts, exemplified this pattern. One of six sons, he was selected to inherit the family estate upon his father’s death. As a result, Joseph remained a bachelor in his family home, responsible for the care of his aging parent. In 1731, at age 73, his father died. A year later, Joseph, now 45, married and began the family that would, in time, presumably support him in his old age (Greven, 1970).
For elderly widows, colonial landholding practices also tended to ensure care, although not necessarily the power that traditionally rested with the male head of household. According to legal statute, widows were entitled to one third of the estate. In addition, wills generally listed the wife’s portion of the estate. The documents often explicitly stated the room she would inhabit as well as the livestock and property that could be used for her needs. The widow would receive “the bedstead we lie on, the bedding thereto belonging” or “two cows, by name Reddy and Cherry, and one yearling heifer” (Demos, 1978, p. 238). Well into the late 18th century, these patterns persisted. In 1789, Adam Deemus of Pennsylvania decreed in his will that after his death, his wife would have “the privilege to live in the house we now live in until another is built and a room prepared for herself if she chuses [sic], the bed and beding [sic] she now lays on, saddled bridle with the horse called Tom; like ten milch cows, three sheep” (Chelfant, 1955, p. 35).
However, even in colonial America and the early republic, not all infirm and aged individuals could find such support. In small towns and communities, well-established residents without a kinship network became the responsibility of the selectmen. Often, government officials would find a suitable family to shelter them, providing funds for the boarders’ support. In other cases, they would arrange for a family to reside in the home of the debilitated individual. Although such practices did not guarantee the quality of care, they assured that the person would not be abandoned.
Others faced a more uncertain fate. Those without any ties to the community would be “warned out” and unceremoniously escorted to the town’s limits. In 1707, for example, Nicholas Warner was repeatedly ordered to leave the city of Boston. Despite the fact that he was over 80 and infirm, he merited little sympathy or support from the town’s leaders (Record Commissioners of the City of Boston, 1884, p. 57). Colonies even passed laws that barred individuals such as Warner from ever entering their borders. Just before the revolution, Pennsylvania ratified an act requiring a bond from anyone who chose to bring an elderly person into the community. If the individual then became a burden on the town, the money could be used to transport the individual back to the community of origin (Assembly of the Province of Pennsylvania, 1775, p. 160). Over the course of 100 years, town after town and colony after colony enacted laws to dissuade any person who might require relief or long-term care from entering the community.
In making such laws and allocating relief only to their local residents, communities and colonies gave little special attention to the age or infirmity of the individual. As was evident in Boston’s “warning out” of Nicholas Warner, the ties to the city, rather than his advanced years, physical condition, or mental state, dictated his treatment. In providing for long-term care, custom and statute grouped together the orphaned, the insane, the incapacitated, as well as the old. Categorized as the “worthy poor,” they were judged not to be responsible for their impoverished state. As such, and if they had long-standing ties to the community, they were seen as proper recipients of the community’s assistance (Rothman, 1971).
With the growing cities of the colonies and early republic, localized solutions to the needs of these groups increasingly became insufficient. The numerous colonial wars of the 18th century brought large numbers of impoverished refugees into the cities; the high mortality rate of colonial society meant that being an orphan or a widow was a common state of affairs. Without families or established communities, individuals had needs that could not be met by reliance on kinship support.
In response, and following practices established in England, colonial cities began to create the earliest institutional solution to long-term care—the municipal almshouse. First founded in Boston in 1664, the almshouse became a well-known and easily recognized institution in scores of cities throughout the 18th century. Within its walls could be found the most problematic of the needy: the orphan without a family, the insane individual whose care had become too difficult for his or her kin, the widow who had outlived her relatives, and the diseased whose sickness was incapacitating.
Initially, when almshouses were established, they were both small and rather unregulated institutions. Generally, they served a minute population. The Overseers of the Poor continued to provide outdoor relief to most needy individuals, limiting institutional care to only the most debilitated. In 1700, for example, New York City opened its first, very small almshouse, but continued the practice of outdoor relief. Between 1724 and 1729, the Mayor’s Court granted relief to 51 cases. For 18 persons they allocated outdoor relief, providing assistance to individuals within their own homes. They placed 19 individuals in the residences of others. Many of the individuals in the first two categories faced temporary illness or could survive with minimum support. The 14 they placed in the institution, however, had far greater demands. In the roster of the almshouse, they were described as extremely handicapped, blind, insane, or, in the words of the court documents, “ancient” (Rothman, 1971).
Other major cities soon followed Boston’s and New York’s example. Charleston founded its poorhouse in 1712; Philadelphia’s first public institution opened its doors in 1732. As in the case of the earliest institutions, these homes sheltered small numbers of residents. The municipal overseers assumed that only a few—the most debilitated individuals—would require long-term institutional care.
In the growing colonial cities of the 18th century, however, as migrants and immigrants flooded into urban areas, the small almshouses quickly became outdated. Their lack of discrete space meant that rooms became crowded with all types of indigent individuals. The young mixed with the old; the insane shared space with the diseased. As a result, throughout the colonies and then into the early republic, officials in the largest urban areas began to replace the small and unregulated buildings with far more imposing institutions.
In time, the existence of these new almshouses not only provided for greater numbers, but also had a direct impact on the colonial approach to long-term care. Convinced that many individuals had come to rely needlessly on charity, officials in numerous cities attempted to eliminate all outdoor relief. They declared that anyone who needed assistance had to reside in the publicly supported buildings (Rothman, 1971). In 1769, for example, Barnet and Sarah Campbell of Great Barrington, Massachusetts, petitioned the Overseers of the Poor for outdoor relief. Old and ailing, they assumed that their advanced age and roots in the community would qualify them as worthy of such assistance. The city’s authorities, however, responded that if they wanted care, they would have to enter the poorhouse (Quadagno, 1986).
By the early 19th century, welfare authorities across the new nation adopted this policy. Josiah Quincy of Massachusetts, a leading expert on charity allocation, demanded an end to all outdoor relief. In his influential Report on Poor Relief, he wrote, “the diminution of the evil is best, and most surely to be effected by making Alms Houses, Houses of Industry, not abodes of idleness, and denying for the most part all supply from public provision, except on condition of admission into the public institution” (Quincy, 1821, n.p.). Similarly, John Yates of New York asserted that even the worthy and disabled should be placed in institutions if they wished public support (Trent, 1994). Such statements did not mean that either man felt “the ancient” unworthy of assistance. Quincy, for example, stated, “of all classes of the poor that of virtuous old age has the most unexceptionable claims upon society” (Quincy, 1821, n.p.). Nonetheless, he asserted that they, too, should receive care only if they were willing to leave their homes for long-term care in the almshouse.
Despite their sympathy for the needy old, welfare advocates such as Quincy, in requiring all who petitioned for assistance to enter the poorhouse, did not believe that the institution should be designed as a hospitable environment. Subscribing to the principle of “less eligibility,” they declared that the almshouse’s environment could not meet or exceed conditions found beyond the institution’s walls. If it did, they feared, unworthy individuals would wish to seek shelter rather than provide for themselves. As a result, the modern concern for the quality of the resident’s life was turned on its head: conditions within the institutions were purposefully intended to be inhospitable. The squalid environment was created to dissuade individuals from applying for entrance or remaining within the institution (Foucault, 1972; Rothman, 1971). Given this philosophy, conditions were often abhorrent: the insane were chained; the orphan went unschooled. Infected prostitutes interacted with alcoholic men while the old languished on their beds. In southern asylums, the races were segregated, or Black inmates were placed in distinct, and far less funded, institutions (Case Study 2.1). Well into the 19th and even the early 20th century, institutions that sheltered African Americans received one third of the budget of White institutions and were generally devoid of plumbing, electricity, or mattresses. In the north, such institutions became filled with impoverished immigrants (Haber & Gratton, 1986).
Case Study 2.1: Long-Term Care in a Southern City: The Segregated Almshouses of Charleston, South Carolina
In 1712, Charleston opened the American colonies’ second public almshouse. Its founding reflected the booming nature of the 18th-century city as a center for trade, a bustling port, and a magnet for immigrants and migrants. By the early 19th century, the Charleston almshouse had become a shelter to a growing number of impoverished and ailing individuals, nearly one in five of whom was of advanced age.
Although the southern poorhouse bore many similarities to northern asylums, it exhibited one striking regional difference. In 1811, city leaders dictated that Blacks could enter the institution only if they were insane, largely because all “lunatics” were to be housed in separate quarters in the basement. The authorities made it quite clear that they would not tolerate large numbers of African Americans sheltered alongside needy Whites. In 1837, Mayor Henry L. Pinckney declared that the almshouse was “specifically intended for destitute whites.” Moreover, despite the fact that free Blacks comprised the neediest of the city’s citizens, the Commissioners for the Poor rarely extended outdoor aid to individuals of color.
By 1856, however, the city leaders began to revise the policy of excluding Blacks from the welfare system. The “aged and infirm free colored poor of the City and State,” asserted the Commissioners of the Poor, “are fit and rightful objects of public relief.” Rather than agree to house Blacks and Whites together, however, Charleston city officials established a new almshouse—eventually called the Charleston Home—for needy Whites and turned the old, dilapidated institution over to impoverished freed Black men and women.
Before the Civil War only a few destitute Blacks seemed to be housed in this institution; after the conflict, the notion of having two very different asylums for the long-term care of the poor became firmly established. As large numbers of impoverished, recently freed slaves sought assistance, the city opened a new long-term care institution, the Ashley River Asylum, solely for Black inmates. By 1899, it housed 91 paupers, far more than the 73 in the White institution.
But it was not only the numbers that differentiated the two shelters. The amount of resources and the resulting quality of care contrasted sharply between the institutions. Throughout the late 19th and early 20th centuries, the city routinely allocated two to three times as much for the White asylum as for the Black. In 1928, for example, the city distributed $263 for each White inmate and only $115 for each Black resident. Thus, although the White home boasted a library, flowers, an icebox, and ample food and heat, the Black asylum had no water, mattresses, sewage system, or lights. Perhaps not surprisingly, mortality rates also differed sharply between the two homes. At the Black Ashley River Asylum, annual death rates ranged around 40%, whereas at the Charleston Home, yearly mortality rates rarely rose to 10%.
With the implementation of Social Security, the century-long system of segregated institutions came to an end. As the federal legislation mandated that residents of the city’s public asylums could not receive aid, in 1949, both the Charleston Home and Ashley River Asylum finally closed their doors. Instead, their residents now turned to support from National Old Age Assistance and private nursing homes (Charleston City Year Book for 1887, 1914, 1925, 1938, 1949; City of Charleston, 1967; Haber & Gratton, 1986).
Case Study Discussion Question:
1. To what extent do you feel it is appropriate to make special accommodation in long-term care facilities through separation on the basis of age, sex, marital status, mental status, religion, or culture?
In demanding that paupers relocate into the repugnant asylum if they wished assistance, welfare advocates were cognizant that even such substandard relief was more expensive than outdoor relief in the form of an occasional cord of wood or parcel of food. To limit expenses, they demanded that long-term inmates contribute to their own sustenance. All ambulatory individuals in the almshouse were assigned tasks such as growing food and producing goods for sale. Although they did not require work from the bedridden, they demanded that once their health improved, these recovering individuals would serve as nurses for those in worse health than they were. In the early 19th century, for example, the Philadelphia almshouse divided the women’s wards into four categories (Rosenberg, 1987):
Aged and helpless women in bad health
Aged and helpless women who could sew and knit
Aged and helpless women who are good sewers
Spinners
Despite this classification, the almshouse overseers knew very well that only a minority of inmates was able to work. Those who entered the poorhouse were not only indigent, but also extremely infirm and incapacitated. In response, many of the almshouses opened hospital wings attached to the asylum. Here, the chronically ill often passed their final days or were moved back and forth from the medical annex to the main residence depending on the availability of beds. Increasingly, the medical function of the institution came to dominate its character. By 1826, in the Philadelphia almshouse, for example, 13 of the 18 women’s wards were defined as medical care wards; only three of the 19 male wards had inmates who were actually able to work. By 1849, the Philadelphia almshouse listed 756 male paupers of whom 449 were patients and 67 served as nurses (Rosenberg, 1987). Overshadowed by disease and dependence, the entire facility increasingly turned into a medical, long-term care institution.
Yet such medical attention did not mean that the almshouse provided even the old and disabled with consideration or care. In 1797, for example, the Pennsylvania Hospital received one inmate of the almshouse who, as the Overseers of the Poor noted, had received a broken jaw “occasioned by a stroke from Dr. C.” Despite the degree of their infirmities, residents were seen as inmates—they were to respond properly to their betters or suffer the consequences. They could not question—or even talk to—the physicians who attended to the institution without first being addressed by the doctors and given permission to speak. They were, after all, spending their final days dependent on the city’s benevolence (Rosenberg, 1987).
Basic to the philosophy behind such long-term care was the expectation that the institutionalized individuals had little hope of being cured. Long-term care in the almshouse provided food and shelter rather than hope or reform. Those who resided in its wards were categorized as society’s most dissolute, destined to spend their final days relying on public support. The Overseers of the Poor did not believe that it was wise or necessary to devote considerable resources to their care.
RELIGIOUS REFORM AND LONG-TERM CARE
In the 19th century, religious and social beliefs led to a questioning of these assumptions. Followers of the Second Great Awakening challenged the biblical notion that “the poor will always be with us.” Within the walls of the almshouse, they identified groups whom they believed could be transformed or saved. They saw little reason to mix, indiscriminately, the orphan who might have a redeemable future with the old or debilitated. They declared that even the insane or mentally challenged might be saved and the criminal reformed, whereas the orphan or able-bodied could be taught to live a productive existence (Grob, 1973).
Thus, over the course of the 19th century, welfare advocates established specialized institutions for those who had once spent their days among the wards of the almshouse. Almost simultaneously they founded orphan asylums, mental hospitals, chronic care hospitals, prisons, and homes for the blind and deaf. In each of these institutions, their organizers believed that residents would find a beneficial environment that would lead to almost miraculous cures or reformation. Once removed from the chaos of the modern world, the inmate was to become a far more perfect individual who could then be returned to the outside community (Grob, 1973).
THE NEW ASYLUMS AND THE IMPACT ON THE OLD
In the mid-19th century, for example, reformers established private mental asylums that touted cure rates of nearly 100%. Although such figures greatly exaggerated the therapeutic nature of such care by counting each returning patient as a new case, their enthusiastic superintendents argued that they were able to eliminate insanity in America. Focusing on the newly afflicted, they promised that if they were given a patient soon after the onset of the disease, they could restore sanity by teaching order and discipline. These were not to be places where the chronic languished for years, but rather institutions of hope and restoration (Rosenkrantz & Vinovskis, 1978).
In stressing the appropriateness of such institutions for those only recently suffering the onset of insanity, the superintendents made it clear that they had little room for the old or chronically ill. The men in charge, as well as theorists on insanity, argued forcibly against admitting elderly people. In the mid-19th century, for example, the Superintendent of the Massachusetts State Hospital strongly cautioned against any type of long-term care for the insane of advanced age. “As there is no reasonable hope for cure,” wrote Dr. George Choate in 1860, “I have generally advised friends to retain them at home” (Trustees of the State Lunatic Hospital at Taunton, 1860, p. 34). Despite these original intentions, by the late 19th century such institutions became filled with the chronically ill and elderly, thereby turning the asylums into long-term care institutions.
The renowned British psychiatrist T.S. Clouston supported this admonition. The old, he informed his students, should not be admitted to a mental asylum as, he wrote, the “difficulty of managing such cases satisfactorily … is extreme. They are very restive, always meddling with something or somebody, very obstinate, entirely forgetful and purposeless.” Such elderly patients, he noted, required the best individualized care in order that they not get hurt or hurt others. And yet, he concluded, “all this needs to be done … under the depressing feeling that it is of no use in the long run towards the cure of the patient” (Clouston, 1884, pp. 401–402).
Not surprisingly, in the 19th century the proportion of older adults found in mental institutions was far below their expected proportion. In 1854, in a study of Massachusetts, the old had the highest reported rate of insanity of any age group. Although over 18% of the state’s reported insane were of advanced age, fewer than 10% of the inmates in the asylum were above 60 (Rosenkrantz & Vinovskis, 1978). Similarly, in New York, in 1871, although 80% of all insane persons of ages 30 to 40 were placed in asylums, only 38% of the old who were identified as insane were permitted to enter such supposedly curative institutions (Board of State Commissioners of Public Charities of the State of New York, 1873).
In denying older people such care, the superintendents argued that the almshouse was more suitable for their mental illness. Assuming that everyone of advanced age was simply “senile,” they asserted they could do little for their disease. “We never intentionally send an insane person to the almshouse,” explained Dr. Charles T. Gaynor, “except in the case of an old person who is pretty senile and can be treated as well [there] as at any other hospital” (Goldhamer & Marshall, 1953, p. 79). Although actual cure rates of insane elderly people who did enter the asylum mirrored those of their younger counterparts, superintendents continued to refuse entrance to the majority of older adults, asserting that their institutions were not intended for long-term care (Boston Commissioners, 1904).
A similar rejection of older adults occurred in hospitals throughout the nation. Originally established as institutions that housed their patients for extended periods of time, hospitals often set aside a ward for the long-term custodial care of old and incapacitated people. In 1869, when Roman Catholic Carney Hospital of Boston opened its doors, it reserved one of its five floors for “old people who may not be sick but come here for a home.” By the 1890s, however, the hospital clearly changed its function and its willingness to provide a long-term residence for elderly individuals. In the 1890s, it made it clear that it did not want, nor would support, “chronic, lingering inmates” (Vogel, 1974, p. 138).
OLD-AGE HOMES
Not surprisingly, given such admonitions, and without the young, able-bodied, or insane who were now housed in other institutions, almshouses increasingly turned into long-term care institutions for elders. Although charity advocates and benevolent organizations deemed such a fate to be acceptable for the scores of immigrants who crowded the poorhouse, they judged the institution to be an unwarranted end for the minority of native-born persons who, without family or resources, had nowhere else to turn. Perhaps ironically, the same welfare experts who argued for the end of outdoor relief and the establishment of the detestable poorhouse also led organizations that created private, long-term care residences for specific groups among older inhabitants. In their views, individuals of their own religion or background did not deserve to die with the stigma of having been an almshouse pauper. Rather, they argued that specialized old-age homes should be established to provide long-term care for the upstanding, and especially, native-born elderly people.
THE GROWTH OF OLD-AGE HOMES
This initiative began early in the 19th century in Philadelphia. In 1817, horrified when they discovered two native-born Christian widows among the immigrants in an almshouse ward, a group of middle-class women established the first old-age home in the United States—the Indigent Widows’ and Single Women’s Society. In the view of the founders of the asylum, the ideal residents of their old-age home were women “whose earlier lives had been passed in more refined walks of life and whom experience, therefore, had not inured to the struggles of penury.” Such individuals, the founders asserted, were “too respectable to be classed with the poor that come under the notice of most charitable societies, and unwilling to be inmates of an Almshouse.” The new old-age home was to provide a suitable alternative (Haber, 1983).
However, the founders did not simply assume that those who applied to them deserved to be rescued from the almshouse. To ensure that, as the Society wrote, “unworthy objects would be excluded,” every applicant had to provide evidence of a “character and habits beyond reproach,” as well as recommendations from “acceptable persons” (Indigent Widows’ and Single Women’s Society, 1824). In addition, in 1823, the Society established a sizable $150 entrance fee. This money, the benevolent women argued, would be a sign that the individual’s needy state was not due to a lifetime of impoverishment. Those who lacked the funds, but had led an exemplary life, they reasoned, could appeal to their church or their friends for the necessary support (Haber, 1983).
Nonetheless, in its earliest years, the old-age home, like the almshouse, required its able-bodied inmates (as they were called) to contribute to the home through their labor. Women in the Philadelphia asylum often spent their days making shirts or participating in other domestic duties of the house. Not until the mid-1830s did the requirement to labor disappear from the old-age home’s bylaws. In other asylums, inmates produced the food they ate or were required to spin and weave (Haber, 1983).
The early founders also imposed rules to “preserve perfect harmony in the family” (Indigent Widows’ and Single Women’s Society, 1819). Establishing a resident matron who served as the central authority figure, they decreed that any woman disagreeing with her policies would be expelled from the institution and face ending her days in the almshouse. During its first 17 years, the Philadelphia asylum determined that 20 of its 140 inmates were not suitable members of the community and expelled them. By 1887, to ensure that residents met their expectations, the society established a 1-year probationary period before an individual assumed permanent resident status (Haber, 1983).
Chief among these rules was the notion that the individual would separate herself from the world and prepare spiritually for her future destiny. Proudly, the founders noted, “What is more important, many exemplary Christians devote their best efforts to instruct [the inmates] and to induce them to prepare for the awful change from their sojourn in this life to a weary rest from their labours [sic], and where sorrow can never intrude” (Haber, 1983, p. 95). Repeatedly, they relayed stories in their annual reports of women who had lost their beliefs as they faced increasing hardships. Forced to end their days in the almshouse, they certainly would have died without any spiritual guidance. In their asylum, however, and under the tutelage of Christian women and ministers, they died knowing that they would meet their heavenly reward.
The initiative that began in Philadelphia quickly spread to other major cities. Like the earliest asylum, Boston’s first old-age home, established in 1849, was created to rescue worthy elderly women who had “a natural repugnance … to be herded with paupers of every character, condition, and clime” (Rogers, 1850, p. 3). The almshouse, Henry B. Rogers, asserted, had been taken over by “foreigners.” As a result, the new asylum was to be for those who were “bone of our bone and flesh of our flesh” (Rogers, 1850, p. 3).
By the end of the 19th century, scores of homes existed across the country in both large cities and smaller towns. They housed not only Christian White widows, but men, married couples, immigrants of varied ethnicities, as well as Black men and women. In Philadelphia alone, by the end of the century, the city roster noted the existence of 24 homes. Numerous churches created asylums for their congregants, whereas labor organizations erected homes for working men of specific occupations. Separate asylums were founded for African Americans, retired actors, as well as aged soldiers and sailors (Haber, 1983). Especially for groups whose lifetime employment denied them the ability to start a family or establish a household, the old-age home offered an alternative to almshouse residency.
Although the residents within these homes often differed by religion, occupation, race, or gender, the founders shared specific beliefs about their residents. Not only were the old-age homes intended to save the inmates from ending their days in the almshouse, but they were also fashioned to reflect contemporary views of senescence. Sharing the medical view of the era that the last years of life were a time of disease and dependence, the homes segregated the old people based on chronological age and offered supposedly age-specific activities: a rare trip to the park, a young people’s visit, and a Christmas concert. In their annual reports, the matrons of the homes repeatedly celebrated the fact that they had libraries stocked with appropriate books and flowers that lifted the spirit of those who faced a declining number of days.
THE MYTH OF ALMSHOUSE RESIDENCY
In portraying their homes as a superior alternative to the dreaded almshouse, the founders and managers of such homes clearly played into the widespread fear that without such institutions elderly people would inevitably end their days in the almshouse. For the great majority of aged individuals, such a concern was unfounded. Throughout the 19th century, the proportion of the elderly population in an asylum—whether in an almshouse or in an old-age home—remained rather stable. No more than 2% of the older population at any time sought such shelter. For many in the working and middle classes, industrialization actually increased their wealth and family support rather than led to impoverishment and desertion (Haber & Gratton, 1993).
Despite this reality, the popular culture of the early 20th century presented a very different picture. In movies, such as D.W. Griffith’s What Should We Do With Our Old? and in two popular songs, each titled “Over the Hill to the Poorhouse,” even hardworking and thrifty elderly individuals were portrayed as having no choice but to end their days in the almshouse. In Griffith’s film, the aged man’s fate seemed inevitable. Thrown out of work by a younger laborer and arrested when he tried to steal food, he watched helplessly as his emaciated wife succumbed to starvation and old age. The subtitle explained that such a condition was hardly his fault; he had simply been “wounded in the battle of life.” Without the support of family or work, he now had little choice but to go “over the hill to the almshouse” (Braham & Catlin, 1874; Carleton, 1871; Griffith, 1911).
The stereotype of the inevitability of the almshouse was reinforced by statistical studies published by welfare advocates. In city after city, these experts noted, the municipal almshouse had become dominated by the old. In San Francisco, for example, in 1870, the average age of the inmates was 37; by 1894, it had risen to nearly 60 (Smith, 1895). Nationally, the numbers seemed to reflect the same trend: whereas in 1880 only 33% of the almshouse population was above 65, by 1923 the proportion had reached 67% (Hoffman, 1909).
Investigations by government commissions also contributed to the myth that an increasing proportion of elders was now institutionalized. In 1934, in their call for federal pensions, the Committee on Economic Security argued that, with industrialization, the almshouse was becoming the final home for ever-increasing numbers of aged individuals. Using a graph composed ominously of black-silhouetted stooped men, they charted the seemingly exploding growth of “Paupers 65 & Over in Almshouses.” In 1860, as the graph revealed, there were 18,903 aged paupers, or 25.6% of the institutionalized population. By 1923, the number had ballooned to 41,980, or 53.9% of the residents. “The predominance of the aged in the almshouse,” the Committee then conclusively asserted, “is a sign of their increasing dependency” (Committee on Economic Security, 1934, n. p.).
In reality, the growing proportion of aged paupers simply reflected the removal of other groups to institutions dedicated to their specific needs, such as orphanages, mental asylums, or hospitals. Nonetheless, in often citing the ballooning absolute numbers of the old residents within the almshouse, social advocates stressed what appeared to be an indisputable fact: not only the very poor, but also the native born and upstanding were likely to face impoverishment in old age and the danger of institutionalization. “The risks of being left without means of meeting [the needs] of old age,” wrote Lucille Eaves in her influential study, Aged Clients of Boston’s Social Agencies, “are not confined to workers with low earning capacity, but are shared by persons in all ranks of society” (Eaves, 1925, p. 3). Bent down by age and impoverishment, they then had little choice but to die amid the almshouse’s squalor.
Even attempts to make the almshouse appear more like a benevolent institution did little to allay such fears or lessen the horror of ending one’s life in the institution. In 1903, for example, the New York City almshouse changed its name to the Home for the Aged and Infirm; in 1913, the city of Charleston followed suit, now calling their almshouse “The Charleston Home” (Charleston City Year Book, 1914). Other superintendents focused on the amenities in the home, such as food, heat, or indoor plumbing. Regardless of these actions, the stigma of the almshouse remained. “The poorhouse,” as social analyst Henry C. Evans wrote, “is a word of hate and loathing, for it includes the composite horrors of poverty, disgrace, loneliness, humiliation, abandonment, and degradation” (Evans, 1900, cited by Epstein, 1929, p. 128).
This belief in the inevitability of the almshouse and the terror it evoked among older people had a significant impact on welfare philosophy. Even charity advocates whose predecessors had stressed the importance of the poorhouse as a deterrent now began to argue against its existence. If the almshouse could not be avoided, it now seemed to make little sense to punish the unfortunates who had nowhere else to turn or to follow the philosophy of “less eligibility.” Those needing institutional care had obviously done nothing to deserve this fate. Rather than a preventive option, therefore, the institution had become little more than a reflection of the nation’s lack of concern for its elderly citizens. “The placing of these unfortunate poor in the almshouse,” declared Earnest C. Marshall in 1898, in his Annual Report of the Institutions Commissioners to the City of Boston, “is not the kind, humane or just way of treating them. … It should no longer be known that Massachusetts brings shame to old age, the blush to wrinkled faces, by classing them under the shameful name of paupers” (Marshall, 1898, n.p.).
THE ATTACK ON THE ALMSHOUSE
Given increasing animosity toward the almshouse, the attack on the very existence of the institution became widespread. Over the course of the first three decades of the 20th century, welfare advocates were joined by religious, fraternal, and government officials in their rejection of the asylum’s central role in providing long-term care for elderly individuals. By the time of the Great Depression, few could seem to remember why the institution had even come into existence. Rather, opponents of the institution agreed there were myriad reasons for its elimination.
First, they argued, not only was the almshouse a barbaric way to treat the old, but it was not even cost-efficient. In fact, the large and ever-increasing budgets necessary to run the asylums seemed to negate their very purpose. When the almshouse was first established, the Overseers of the Poor knew all too well that supporting a pauper in the institution cost far more than an occasional cord of wood, a small allocation of money, or a basket of food. They believed, however, that the loathsome nature of the institution would deter individuals from seeking any relief at all. Convinced now that older people could not avoid the institution, and that the numbers of the needy older population were exponentially increasing, charity experts asserted that the costs of such institutionalized long-term care were neither prudent nor manageable. “It is well known,” wrote the Illinois State Federation of Labor in 1923, “that the cost of maintaining an aged person in a public institution is far in excess of the amount it proposed to pay an individual in the form of a pension” (Illinois State Federation of Labor, cited by Quadagno, 1986, pp. 170–171). According to the Massachusetts Commission on Pensions, support of an individual outside the institution would at most be a dollar a day, or $365 a year; the cost of a couple would be no more than $500. Both these sums, the Commission declared, would be far less than the expense within an institution (Massachusetts Commission on Pensions, 1925).
Moreover, opponents of the asylum noted, regardless of these high costs, the institution rarely provided suitable care. Even as early as 1903, a study of Boston’s Long Island almshouse concluded that, although considerable sums were spent, the inmates often lacked sufficient food, heating, or clothing. By 1923, the Department of Labor echoed this finding. Calculating that the average cost per inmate in 2,183 almshouses was $440, the study found that given the great fixed costs of the asylums, they had few resources left to devote to the care or sustenance of their inmates (United States Bureau of the Census, 1923). It mattered little whether the institution was called an almshouse or had changed its name to an “old folks” home; the residents’ basic needs were not being met. The elderly people in the almshouse spent their final days surrounded by disease and impoverishment.
Finally, advocates for the old asserted that reliance on the almshouse for support of older adults not only threatened their dignity, it also had a negative impact on their family members. For too long, the relatives of older people had been faced with an untenable choice. Either they watched their aged parents go “over the hill to the poorhouse,” or they sacrificed their own needs and those of their children to rescue their aged family members from dying in an almshouse. “Consider the dilemma,” wrote Mabel Nassau in 1915 in her landmark study, Old Age Poverty in Greenwich Village, “of the middle generation trying to decide whether to support the aged parents and thus have less to eat for themselves and their children … or to put the old people in an institution” (Nassau, cited by Lubove, 1968, p. 153). A decade later, Reverend George B. O’Conor, director of Boston’s Charities, agreed. His parishioners, he declared, would rather “starve to death” than allow family members to enter the poorhouse (O’Conor, The Boston Globe, January 9, 1923, cited by Haber & Gratton, 1993, p. 136).
PENSIONS, SOCIAL SECURITY, AND LONG-TERM CARE
As numerous segments of society joined together to condemn the existence of the almshouse, they were also unanimous in their support of the solution to long-term care: pensions based on advanced age would eliminate the need for institutional aid. Previously, such funds had been allocated by the government for veterans and their families, as well as provided by select industries for long-time workers (Skocpol, 1992). By the first decades of the 20th century, the movement for national pensions became viewed not only as a way of rewarding the aged worker or disabled veteran, but also as a means to escape almshouse residency.
For fraternal and labor organizations, the symbol of the almshouse loomed large; they repeatedly evoked its seeming inevitability in their campaigns to increase membership. Only through dues-supported pensions, they argued, could even the most hardworking avoid such an ignominious fate. A 1925 illustration from The Eagle, the magazine of the Fraternal Order of Eagles, clearly portrayed this message. Despite the middle-class appearance of the elderly couple’s home, complete with “bless our home” framed needlepoint, a sign outside the window on the factory door reading “young men only need to apply” sealed their destiny. Held in the elderly man’s hand was the inescapable future: the certificate declared that it was the couple’s “Passport to the Poorhouse” (The Eagle, January 1925, cited by Haber & Gratton, 1993, p. 58).
By the 1920s, the call for pensions was also supported by government officials and candidates for political office. In unison, they claimed that the benefits of a pension system would not only restore the dignity of old age, but also would forever eliminate the almshouse as the means of providing the old with long-term care. In his campaign for governor of New York, Franklin Roosevelt, for example, evoked the symbol of the poorhouse in his call for government pensions. The great majority of aged inmates, Roosevelt declared, were institutionalized not as a “result of lack of thrift or energy,” but “as a mere by-product of modern life. … The worn-out worker, after a life of ceaseless effort and useful productivity must look forward for his declining years to a poorhouse.” Roosevelt and others knew that the attack on the almshouse was a salient campaign issue; those who opposed such measures risked their political futures (Roosevelt, Box 16, 1928).
In the enactment of Social Security, therefore, legislators took direct aim at the poorhouse. Through the measure, they not only provided assistance to the needy older adult and insurance for the long-time worker, but also attempted to eliminate the poorhouse as a welfare institution. In its initial formulation, Social Security strictly barred pensions for any resident of a public asylum. According to the legislation, individuals needed to be “sixty-five years of age or older and … not an inmate of a public institution” (“Title 1 – Grants to the States, Section 3, Payments to the States,” Social Security Act, 1935). In including this clause, the creators of the bill were well aware of its impact. “We were” as the Deputy Secretary of Public Assistance of Pennsylvania explained, “rather enthusiastic to empty the poorhouses” (Thomas, 1969, p. 97). Only at a risk to their political future could legislators object to a measure that would eradicate the greatest fear of growing old.
Even after the passage of Social Security, deep-seated fear of the almshouse served to support arguments that the act was both necessary and constitutional. In 1937, the Supreme Court upheld the measure, declaring that the fear of the almshouse was a very real threat. Writing for the majority, Justice Cardozo proclaimed that “the hope behind this statute is to save men and women from the rigors of the poorhouse as well as from the haunting fear that such a lot awaits them when journey’s end is near” (Helvering v. Davis, 1937). With pensions, guaranteed support was to provide funds to replace the need for institutional care.
Had the promises of the welfare advocates been accurate, the history of long-term care would have ended with Social Security’s goal of eradicating the almshouse. Individuals would have lived out their days in the comfort of their own homes or in the residences of their children, supported by small but guaranteed monthly annuities. Following the passage of the measure, in fact, scores of smaller almshouses disappeared from the country’s landscape. Public officials quickly moved the residents to private facilities, or, in some cases, simply restructured the institution. In Kansas, for example, although the supervisors of the public county homes remained the same, the asylums were transferred to private control. No longer an almshouse in name, and now able to receive its residents’ pensions, the institution continued to provide for the long-term needs of elderly people (Fischer, 1943). In other states and cities, as the almshouse population was transferred to private asylums or individuals were given support in their own homes, the institution lost its reason for being.
But not all the old who needed long-term care were able to live independently. Even before Social Security was enacted, some welfare advocates asserted that pensions would not eliminate the need for long-term care. In the 1930s, welfare advocate Homer Folks argued against the notion that the old ended their days in the almshouse simply because they were poor. Only about 15% of the residents, he stated, fit this description. Although this group would be able to live independently on a pension, “the others,” he declared at a hearing of the New York Commission on Old Age Security, “are physically infirm and sick, and have various kinds of ailments and conditions that require personal attention of the kind that you could not get in an individual home.” In response, he asserted, the government should include in the Social Security legislation a provision for the institutional support of the infirm elderly (Folks, cited by Thomas, 1969, p. 40).
Immediately following the passage of the bill, the economist and future senator Paul H. Douglas made a similar argument. Writing in 1936, he asserted that:
the dislike for institutional care has, however, commonly been carried too far, so that there is a movement to prevent the pensions from being paid to those who are in any private institution for the aged or who may be receiving treatment in a state hospital, etc. The truth of the matter is of course that while a large proportion of the aged ought not to be in an institution, there is nevertheless a large group who need the skilled and specialized care which only an institution can present. A great many of the old people are ill or crippled and need specialized nursing and medical care. This can commonly be furnished more effectively in a home for the aged or an infirmary than in the private home or lodgings of the aged person.
(Douglas, 1936, pp. 244–245)