The Tragic Failure of the War on Poverty
THE LONG WAR ON POVERTY has managed to eradicate 1960s-style poverty from our midst, or very nearly so—even if our federal authorities today are not competent to describe this accomplishment (or, seemingly, even to recognize it in the first place). This is an important fact in favor of the War on Poverty—but other important facts must be considered as well, all seemingly weighing on the other side of the ledger. For the institutionalization of antipoverty policy has been attended by the rise and spread of an ominous “tangle of pathologies” in the society whose ills antipoverty policies were intended to heal. Those pathologies appear to be conjoined with antipoverty policies; in some cases, antipoverty policies may even create them. But whatever the causality at work, these pathologies are today very largely financed by antipoverty policies.
The phrase “tangle of pathologies” harkens back to the famous Moynihan Report of 1965, which warned of the crisis of the family then gathering for black America. (The now-eminent author of that report, the late Daniel Patrick Moynihan, was then assistant secretary of labor and went on to serve as a Democratic U.S. senator from New York for nearly a quarter-century.) That report was criticized, even viciously denounced, by some at the time, but in retrospect much of it seems positively prophetic.
The Moynihan argument also assumed that the troubles impending for black America were unique—a consequence of the singular historical burdens that black Americans had endured in our country. That argument was not only plausible at the time, but also persuasive. Yet today that same “tangle of pathology” can no longer be described as characteristic of just one group within our country. Quite the contrary: for whatever reasons, these pathologies are evident throughout all of America today, regardless of race or ethnicity. Three of the most disturbing of these many entangled pathologies are welfare dependency, the flight from work, and family breakdown.
Unlike, say, an old-age pension awarded on retirement after a lifetime of work, a bequest of charity or aid to the indigent is a transaction that establishes a relationship of dependence. As a people who have historically prized their independence, financial as well as political, Americans throughout history have attempted to avoid dependence on “relief” and other handouts, whether informal or institutionalized. Recovery from the Great Depression was reflected in the great decline in the numbers of Americans on public aid: in 1951, the commissioner of Social Security was pleased to report that just 3.8 percent of Americans were receiving public aid, compared to 11.5 percent as recently as 1940. But with the War on Poverty and its successor programs, such dependency has become virtually the norm for modern America. The United States today is richer than at any previous juncture in its history—yet, paradoxically, more Americans than ever before are officially judged to be in need. Welfare dependence is at an all-time high and by all indications set to climb in the years ahead.
Perhaps tellingly, the U.S. government did not get around to collecting data and publishing figures on the proportion of the population dependent on need-based benefits on a systematic basis until nearly two decades after the start of the War on Poverty, during the Reagan era. By then (1983), nearly one American in five (18.8 percent) lived in a home taking in one or more antipoverty (means-tested) benefits. By late 2011, according to one Census Bureau source, that proportion had risen above 35 percent—over one American in three.
By 2012, according to a different Census Bureau count, the proportion was slightly lower: 32.3 percent, and “only” 29.4 percent if school lunches were excluded from the tally. But this still left more than 90 million Americans applying for and accepting aid from government antipoverty programs. Yet only 33 million people from America’s “poverty population” were enrolled in those same means-tested programs. In other words, nearly twice as many Americans above the poverty line as below it were getting antipoverty benefits. Evidently, the American welfare state has been defining dependence upward.
By 2012, according to one Census Bureau count, significant demographic subgroups within the American population were well along the path to means-tested majorities—that is to say, toward the point where most members of the groups in question would be claiming benefits from government antipoverty programs, if they were not already doing so. More than 47 percent of all black Americans and fully 48 percent of Hispanic Americans of all ages were reckoned to be taking home means-tested benefits (excluding school lunches from the tally, here and in the rest of this discussion). More than 60 percent of black and Hispanic children, and nearly 43 percent of all American children, were depending on antipoverty programs for at least some support. Dependency was less pronounced among children of Asian Americans and non-Hispanic whites (Anglos), but only to a degree—for both those groups, the ratio was close to 30 percent. In all of the aforementioned cases, most of the beneficiaries drawing on government poverty program resources were men, women, and children not officially counted as poor.
The most revealing measure of the spread of dependence since the start of the War on Poverty is the declining financial independence of working-age American men. Among men 25 to 44 years of age, more than 25 percent lived in homes taking aid from antipoverty programs by 2012. For non-poor men those same ages, the ratio was over 20 percent.
The reach of dependence is perhaps best highlighted by its inroads into the parts of American society traditionally least ensnared by it. Historically, Anglos have had the lowest dependence on public aid of any major racial or ethnic group delineated within official statistics—yet by 2012, nearly one in five nonpoor Anglo men ages 25–44, and about one in 11 nonpoor Anglo men under 65 who were living alone, was on the government poverty benefit rolls.
The Flight from Work
Men have been a diminishing presence within the workforce—and not only because of the rising share of women who seek to work. The proportion of men 20 and older who are employed has dropped dramatically and almost steadily since the start of the War on Poverty, falling from 80.6 percent in January 1964 to 67.6 percent 50 years later. No less remarkable: the proportion of adult men in the labor force—either working or looking for work—has likewise plunged over those same years, from 84.2 percent then to 72.1 percent today. Put another way, our country has seen a surge of men exiting the workforce entirely over the past 50 years. Whereas fewer than 16 percent of men 20 or older neither had work nor were looking for it in early 1964, the corresponding share today is almost 28 percent.
In purely arithmetic terms, the main reason American men today are not working is not unemployment. Rather, it is because they have opted out of the labor market altogether. For every adult man in America who is between jobs and looking for new work, more than five are neither working nor looking for employment.
In early 1964, just over 6 percent of civilian noninstitutionalized men aged 20–64 were entirely out of the workforce. By early 2014, the corresponding share had almost tripled, to more than 17 percent. In early 1964, for every man of these ages who was unemployed, roughly 1.6 were not looking for work at all. The unemployment rate is much worse today than it was back then. Even so, the ratio among working-age men of nonworkers to unemployed is more than twice as high as 50 years ago, with well over three men not looking for work for each one who is looking.
The withdrawal of progressively greater proportions of men—including relatively young men—from the U.S. workforce seems especially paradoxical when we consider the major improvements in health conditions (as reflected in life expectancy improvements) and educational attainment (as reflected in mean years of schooling) for the cohorts under consideration over those same years. All other things being equal, one might have assumed these changes would make men more capable of working, not less.
It is curious, and noteworthy, that the male flight from work for prime working-age groups, striking as it has been, did not proceed uninterrupted over the entire postwar period. No, it took place only after the War on Poverty commenced. Between early 1948—when the Bureau of Labor Statistics began the current system for tracking workforce data—and early 1964, a period stretching more than a decade and a half, the proportion of unworking men 25–54 years of age in America remained essentially unchanged. The same was true for men 35–44 years of age. For men 25–34, the labor force participation rate actually rose between 1948 and 1964 (96.1 percent in January 1948 vs. 97.1 percent in January 1964). It is tempting to observe that only since the War on Poverty began to offer alternatives to work for able-bodied men have we as a society seen a major migration out of the time-established path of work by men in prime working ages.
The flight from work among African American men has merely preceded the same flight for Anglos. Although the black American labor force participation rate for men of peak working ages (25–54) was sharply lower than that of Anglos for 2013, it had actually been a bit higher in 1973 than the Anglo rate would be 40 years later. The same is true for men in their 20s, 30s, and 40s. The strange and disturbing fact is that a lower share of Anglo men today are working or looking for work than was true for their African American counterparts four decades earlier—notwithstanding all the disadvantages borne by their black counterparts in those earlier years.
Family Breakdown
In the early postwar era, the norm for childbearing and child rearing was the married two-parent household. Norm and reality were not identical, of course—but for the country as a whole, the gap was not immense. Illegitimacy was on the rise in the early postwar era, but as late as 1963, on the eve of the War on Poverty, more than 93 percent of American babies were reportedly coming into the world with two married parents. According to the 1960 census, nearly 88 percent of children under 18 were then living with two parents. That fraction was slightly higher than it had been before World War II, thanks in part to improving survival chances for parents and the correspondingly diminished risk of orphanhood.
Unfortunately, the rise of the new welfare policies inaugurated by the War on Poverty coincided with a marked change in family formation patterns in America. Out-of-wedlock births exploded. Divorce and separation soared. The fraction of children living in two-parent homes commenced a continuing downward spiral. These new patterns are so pervasive, and so politically sensitive, that some now object even to describing the phenomenon as “family breakdown” in the first place. But the phenomenon has swept through all of American society over the past 50 years, leaving no ethnic group untouched.
Pre–Great Society statistics on birth outside marriage may understate the true extent of nonmarital childbearing, given the stigma that illegitimacy was widely judged to confer in those days. Be that as it may, for the quarter-century extending from 1940 to 1965, official data recorded a rise in the fraction of births to unmarried women from 3.8 percent to 7.7 percent. Over the following quarter-century—1965 to 1990—out-of-wedlock births jumped from 7.7 percent of the nationwide total to 28.0 percent. Twenty-two years later (the most recent available data are for the year 2012), America’s overall out-of-wedlock birth ratio had surpassed 40 percent.
By 2013, nearly 32 percent of America’s children were living in arrangements other than a two-parent home. For older children (12–17 years of age), the odds against living in an intact two-parent home were greater still. In all, nearly one child in four was living with just his or her mother, and another 7 percent were living with a father, a grandparent, or some other relative.
Moreover, given current trends in cohabitation, divorce, and remarriage, not all children living in two-parent homes nowadays are with both their biological parents—and even where they are, those biological parents are not always married. A Census Bureau study for 2009 reported just under 69 percent of America’s children lived in two-parent homes that year—but only 60 percent were biological offspring of both parents in their home, and only 57 percent were with both married biological parents.
Throughout the 20th century, the two-married-parent family was always frailest among African Americans of all major U.S. population groups. In 1969—the first year for which such data are available for African Americans by themselves for both childbearing patterns and living arrangements for children—35 percent of black babies were born out of wedlock, and 30 percent of black children were living with a single mother. By 2012, more than 72 percent of black births were outside marriage, and in 2013 more than half of black children were living only with their mother—many more than the 37 percent who were in a two-parent home. Even smaller fractions of black children were living with both biological parents or with married biological parents (29 percent and 26 percent, respectively, for 2009—and likely lower today).
But out-of-wedlock birth ratios and living arrangements for children have been changing in the rest of America as well since the start of the War on Poverty—and radically. Among Hispanic Americans, more than 30 percent of children were in single-parent homes by 2013—and well over half were born out of marriage by 2012. By 2009, fewer than 60 percent of Latino children were living with both biological parents, and fewer than 55 percent lived with biological parents who were married. Corresponding data are not available for 1964, but these figures are much higher than for 1980 (when 21 percent were in single-parent homes and fewer than 25 percent were born out of marriage).
The collapse of the traditional family structure has been underway among the majority population of Anglos as well. For this population, there were few signs of impending family breakdown in the generation before the War on Poverty; between 1940 and 1963, the out-of-wedlock birth ratio increased, but only from 2 percent to 3 percent, and in 1960, just 6 percent of white children lived with single mothers. As of 2012, the proportion of out-of-wedlock births was 29 percent—nearly 10 times as high as it was just before the War on Poverty. As of 2013, more than 18 percent of Anglo children were in single-mother homes—three times the proportion before the War on Poverty—and more than one-quarter lived outside two-parent homes. By 2009, fewer than two-thirds of Anglo children were living with both biological parents, and fewer than five out of eight were with biological parents who were married to each other. Thus, Anglo whites today register illegitimacy ratios markedly higher than the ratios for African Americans when Moynihan called attention to the crisis in the black family—and proportions of single-parent children are eerily comparable.
The reason the Moynihan Report sounded an alarm about family trends for black America was that a very large body of research already existed back in the 1960s concerning the manifold disadvantages conferred on children who grew up in what were then called “broken homes.” Over the intervening decades, a small library of additional studies has accumulated to corroborate and document the whole tragic range of disadvantages that such children face. (This is not to say that children from alternative living arrangements cannot end up thriving—obviously, many do; it is, rather, that their odds of suffering adverse educational, health, behavioral, psychological, and other outcomes are much higher.) These disadvantages are starkly evident even after controlling for such important factors as socioeconomic status or ethnicity and race.
One of the many risks children of broken homes confront is a much higher chance of becoming a violent offender in our criminal justice system—and more broadly, a much higher risk of being arrested for crime. Since the launch of the War on Poverty, criminality in America has taken an unprecedented upward turn. Although reported rates of crime victimization—including murder and other violent crimes—have been falling for two decades, the percentage of Americans behind bars has continued to rise (though those proportions appear to have peaked—or temporarily paused—since 2009).
As of year-end 2010, more than 5 percent of all black men in their 40s and nearly 7 percent of those in their 30s were living in state or federal prisons, with additional numbers incarcerated in local jails awaiting trial or sentencing. For Latinos, the corresponding numbers were more than 2 percent and nearly 3 percent. Among Anglos, slightly more than 1 percent of all men in their 30s were sentenced offenders in state or federal prisons—a lower share than for these others, but a higher proportion than in earlier generations. This huge convict population may be described in many different ways—but one way to describe most of them is as children of the earthquake that shook family structure in America in the era of expansive antipoverty policies.
Financing America’s Social Pathologies
The Great Society’s role in modern America’s social pathologies seems fated for endless and inconclusive debate. What is indisputable, however, is that the new American welfare state facilitated these new American trends by helping to finance them: by providing support for working-age men who are no longer seeking employment and for single women with children who would not be able to maintain independent households without government aid. Regardless of the origins of the flight from work and family breakdown, the War on Poverty and successive welfare policies have made each of these modern tendencies more feasible as mass phenomena in our country today.
Suffice it to say that none of these troubling mass phenomena were envisioned when the War on Poverty commenced. Just the opposite: President Johnson saw the War on Poverty as a campaign to bring dependency on government handouts to an eventual end, not a means of perpetuating it for generations to come. He made this very clear three months after his Great Society speech at the signing ceremony for some of his initial War on Poverty legislation, when he announced:
We are not content to accept the endless growth of relief rolls or welfare rolls. … Our American answer to poverty is not to make the poor more secure in their poverty but to reach down and to help them lift themselves out of the ruts of poverty and move with the large majority along the high road of hope and prosperity. The days of the dole in our country are numbered.
Held against this ideal, the actual unfolding of America’s domestic antipoverty policies can be seen only as a tragic failure. Dependence on government relief, in its many modern versions, is more widespread today, and possibly also more habitual, than at any time in our history. To make matters much worse, such aid has become integral to financing lifestyles and behavioral patterns plainly destructive to our commonwealth—and on a scale far more vast than could have been imagined in an era before such antipoverty aid was all but unconditionally available.
Dr. Eberstadt is the Henry Wendt Chair in Political Economy at the American Enterprise Institute. This article is excerpted from his monograph “The Great Society at Fifty: The Triumph and the Tragedy,” published by the American Enterprise Institute in May 2014. Reprinted with permission of the American Enterprise Institute.