The Tragic Failure of the War on Poverty

by Nicholas Eberstadt

THE LONG WAR ON POVERTY has managed to eradicate 1960s-style poverty from our midst, or very nearly so—even if our federal authorities today are not competent to describe this accomplishment (or, seemingly, even to recognize it in the first place). This is an important fact in favor of the War on Poverty—but other important facts must be considered as well, all seemingly weighing on the other side of the ledger. For the institutionalization of antipoverty policy has been attended by the rise and spread of an ominous “tangle of pathologies” in the society whose ills antipoverty policies were intended to heal. Those pathologies appear to be conjoined with antipoverty policies; in some cases such policies may even create them; but irrespective of the causality at work, they are today very largely financed by antipoverty programs.

The phrase “tangle of pathologies” harks back to the famous Moynihan Report of 1965, which warned of the crisis of the family then gathering for black America. (The author of that report, the late Daniel Patrick Moynihan, was then assistant secretary of labor and went on to serve as a Democratic U.S. senator from New York for nearly a quarter-century.) The report was criticized, even viciously denounced, by some at the time, but in retrospect much of it seems positively prophetic.

The Moynihan argument also assumed that the troubles impending for black America were unique—a consequence of the singular historical burdens that black Americans had endured in our country. That argument was not only plausible at the time, but also persuasive. Yet today that same “tangle of pathology” can no longer be described as characteristic of just one group within our country. Quite the contrary: for whatever reasons, these pathologies are evident throughout all of America today, regardless of race or ethnicity. Three of the most disturbing of these many entangled pathologies are welfare dependency, the flight from work, and family breakdown.

Welfare Dependency

Unlike, say, an old-age pension awarded on retirement after a lifetime of work, a bequest of charity or aid to the indigent is a transaction that establishes a relationship of dependence. Americans have historically prized their independence, financial as well as political, and dependence on “relief” and other handouts, whether informal or institutionalized, is a condition most Americans throughout history have attempted to avoid. Recovery from the Great Depression was attested by the great decline in the numbers of Americans on public aid: in 1951, the commissioner of Social Security was pleased to report that just 3.8 percent of Americans were receiving public aid, compared to 11.5 percent as recently as 1940. But with the War on Poverty and its successor programs, such dependency has become virtually the norm for modern America. The United States today is richer than at any previous juncture in its history—yet, paradoxically, more Americans than ever before are officially judged to be in need. Welfare dependence is at an all-time high and by all indications set to climb in the years ahead.

Perhaps tellingly, the U.S. government did not get around to collecting data and publishing figures on the proportion of the population dependent on need-based benefits on a systematic basis until nearly two decades after the start of the War on Poverty, during the Reagan era. By then (1983), nearly one American in five (18.8 percent) lived in a home taking in one or more antipoverty (means-tested) benefits. By late 2011, according to one Census Bureau source, that proportion had risen above 35 percent—over one American in three.

By 2012, according to a different Census Bureau count, the proportion was slightly lower: 32.3 percent, and “only” 29.4 percent if school lunches were excluded from the tally. That still left more than 90 million Americans applying for and accepting aid from government antipoverty programs. Yet only 33 million people from America’s “poverty population” were enrolled in those same means-tested programs. In other words, nearly twice as many Americans above the poverty line as below it were getting antipoverty benefits. Evidently, the American welfare state has been defining dependence upward.

By 2012, according to that same Census Bureau count, significant demographic subgroups within the American population were well along the path to means-tested majorities—that is to say, toward the point where most members of the groups in question would be claiming benefits from government antipoverty programs, if they were not already doing so. More than 47 percent of all black Americans and fully 48 percent of Hispanic Americans of all ages were reckoned to be taking home means-tested benefits (excluding school lunches from the tally, here and in the rest of this discussion). More than 60 percent of black and Hispanic children, and nearly 43 percent of all American children, were depending on antipoverty programs for at least some support. Dependency was less pronounced among children of Asian Americans and non-Hispanic whites (Anglos), but only to a degree—for both those groups, the ratio was close to 30 percent. In all of the aforementioned cases, most of the beneficiaries drawing on government poverty program resources were men, women, and children not officially counted as poor.

The most revealing measure of the spread of dependence since the start of the War on Poverty is the declining financial independence of working-age American men. Among men 25 to 44 years of age, more than 25 percent lived in homes taking aid from antipoverty programs by 2012. For nonpoor men of those same ages, the ratio was over 20 percent.

The reach of dependence is perhaps best highlighted by its inroads into the parts of American society traditionally least ensnared by it. Historically, Anglos have had the lowest dependence on public aid of any major racial or ethnic group delineated within official statistics. Yet by 2012, nearly one in five nonpoor Anglo men ages 25–44 was on the government poverty benefit rolls, as was about one in 11 nonpoor Anglo men under 65 living alone.

The Flight from Work

Men have been a diminishing presence within the workforce—and not only thanks to the rising share of women who seek to work. The proportion of men 20 and older who are employed has dramatically and almost steadily dropped since the start of the War on Poverty, falling from 80.6 percent in January 1964 to 67.6 percent 50 years later. No less remarkable: the proportion of adult men in the labor force—either working or looking for work—has likewise plunged over those same years, from 84.2 percent then to 72.1 percent today. Put another way: our country has seen a surge of men completely exiting the workforce over the past 50 years. Whereas fewer than 16 percent of men 20 or older neither had work nor were looking for it in early 1964, the corresponding share today is almost 28 percent.

In purely arithmetic terms, the main reason American men today are not working is not unemployment. Rather, it is because they have opted out of the labor market altogether. For every adult man in America who is between jobs and looking for new work, more than five are neither working nor looking for employment.

In early 1964, just over 6 percent of civilian noninstitutionalized men aged 20–64 were entirely out of the workforce. By early 2014, the corresponding share had almost tripled, to more than 17 percent. In early 1964, for every man of these ages who was unemployed, roughly 1.6 were not looking for work at all. The unemployment rate is much worse today than it was back then. Even so, the ratio among working-age men of nonworkers to unemployed is more than twice as high as 50 years ago, with well over three men not looking for work for each one who is looking.

The withdrawal of progressively greater proportions of men—including relatively young men—from the U.S. workforce seems especially paradoxical when we consider the major improvements in health conditions (as reflected in life expectancy improvements) and educational attainment (as reflected in mean years of schooling) for the cohorts under consideration over those same years. All other things being equal, one might have assumed these changes would make men more capable of working, not less.

It is curious, and noteworthy, that the male flight from work for prime working-age groups, striking as it has been, did not proceed uninterrupted over the entire postwar period. No, it took place only after the War on Poverty commenced. Between early 1948—when the Bureau of Labor Statistics began the current system for tracking workforce data—and early 1964, a period stretching more than a decade and a half, the proportion of unworking men 25–54 years of age in America remained essentially unchanged. The same was true for men 35–44 years of age. For men 25–34, the labor force participation rate actually rose between 1948 and 1964 (96.1 percent in January 1948 vs. 97.1 percent in January 1964). It is tempting to observe that only since the War on Poverty began to offer alternatives to work for able-bodied men have we as a society seen a major migration out of the time-established path of work by men in prime working ages.

The flight from work among African American men has merely preceded the same flight for Anglos. Although the black American labor force participation rate for men of peak working ages (25–54) was sharply lower than that of Anglos for 2013, it had actually been a bit higher in 1973 than the Anglo rate would be 40 years later. The same is true for men in their 20s, 30s, and 40s. The strange and disturbing fact is that a lower share of Anglo men today are working or looking for work than was true for their African American counterparts four decades earlier—notwithstanding all the disadvantages borne by their black counterparts in those earlier years.

Family Breakdown

In the early postwar era, the norm for childbearing and child rearing was the married two-parent household. Norm and reality were not identical, of course—but for the country as a whole, the gap was not immense. Illegitimacy was on the rise in the early postwar era, but as late as 1963, on the eve of the War on Poverty, more than 93 percent of American babies were reportedly coming into the world with two married parents. According to the 1960 census, nearly 88 percent of children under 18 were then living with two parents. That fraction was slightly higher than it had been before World War II, thanks in part to improving survival chances for parents and the correspondingly diminished risk of orphanhood.

Unfortunately, the rise of the new welfare policies inaugurated by the War on Poverty coincided with marked change in family formation patterns in America. Out-of-wedlock births exploded. Divorce and separation soared. The fraction of children living in two-parent homes commenced a continuing downward spiral. These new patterns are so pervasive, and so politically sensitive, that some today now object even to describing the phenomenon as “family breakdown” in the first place. But the phenomenon has swept through all of American society over the past 50 years, leaving no ethnic group untouched.

Pre–Great Society statistics on births outside marriage may understate the true extent of nonmarital childbearing, given the stigma that illegitimacy was widely judged to confer in those days. Be that as it may, for the quarter-century extending from 1940 to 1965, official data recorded a rise in the fraction of births to unmarried women from 3.8 percent to 7.7 percent. Over the following quarter-century—1965 to 1990—out-of-wedlock births jumped from 7.7 percent of the nationwide total to 28.0 percent. Twenty-two years later (the most recent available data are for the year 2012), America’s overall out-of-wedlock birth ratio had surpassed 40 percent.

By 2013, nearly 32 percent of America’s children were living in arrangements other than a two-parent home. For older children (12–17 years of age), the odds against living in an intact two-parent home were greater still. In all, nearly one child in four was living with just his or her mother, and another 7 percent were living with a father, a grandparent, or some other relative.

Moreover, given current trends in cohabitation, divorce, and remarriage, not all children living in two-parent homes nowadays are with both their biological parents—and even where they are, those biological parents are not always married. A Census Bureau study for 2009 reported just under 69 percent of America’s children lived in two-parent homes that year—but only 60 percent were biological offspring of both parents in their home, and only 57 percent were with both married biological parents.

Of all major U.S. population groups, the two-married-parent family was frailest among African Americans throughout the 20th century. In 1969—the first year for which such data are available for African Americans by themselves for both childbearing patterns and living arrangements for children—35 percent of black babies were born out of wedlock, and 30 percent of black children were living with a single mother. By 2012, more than 72 percent of black births were outside marriage, and in 2013 more than half of black children were living only with their mother—many more than the 37 percent who were in a two-parent home. Even smaller fractions of black children were living with both biological parents or with married biological parents (29 percent and 26 percent, respectively, for 2009—and likely lower today).

But out-of-wedlock birth ratios and living arrangements for children have been changing in the rest of America as well since the start of the War on Poverty—and radically. Among Hispanic Americans, more than 30 percent of children were in single-parent homes by 2013—and well over half were born out of marriage by 2012. By 2009, fewer than 60 percent of Latino children were living with both biological parents, and fewer than 55 percent lived with biological parents who were married. Corresponding data are not available for 1964, but today’s figures are much higher than those for 1980, when 21 percent of Hispanic children were in single-parent homes and fewer than 25 percent were born out of marriage.

The collapse of the traditional family structure has been underway among the majority population of Anglos as well. For this population, there were few signs of impending family breakdown in the generation before the War on Poverty; between 1940 and 1963, the out-of-wedlock birth ratio increased, but only from 2 percent to 3 percent, and in 1960, just 6 percent of white children lived with single mothers. As of 2012, the proportion of out-of-wedlock births was 29 percent—nearly 10 times as high as it was just before the War on Poverty. As of 2013, more than 18 percent of Anglo children were in single-mother homes—three times the proportion before the War on Poverty—and more than one-quarter lived outside two-parent homes. By 2009, less than two-thirds of Anglo children were living with both biological parents, and fewer than five out of eight were with biological parents who were married to each other. Thus, Anglos today register illegitimacy ratios markedly higher than the ratios for African Americans when Moynihan called attention to the crisis in the black family—and proportions of single-parent children are eerily comparable.

The reason the Moynihan Report sounded an alarm about family trends for black America was that a very large body of research already existed back in the 1960s concerning the manifold disadvantages conferred on children who grew up in what were then called “broken homes.” Over the intervening decades, a small library of additional studies has accumulated to corroborate and document the whole tragic range of disadvantages that such children face. (This is not to say that children from alternative living arrangements cannot end up thriving—obviously, many do; it is, rather, that their odds of suffering adverse educational, health, behavioral, psychological, and other outcomes are much higher.) These disadvantages are starkly evident even after controlling for such important factors as socioeconomic status or ethnicity and race.

One of the many risks children of broken homes confront is a much higher chance of becoming a violent offender in our criminal justice system—and more broadly, a much higher risk of being arrested for crime. Since the launch of the War on Poverty, criminality has taken an unprecedented upward turn within our nation. Although reported rates of crime victimization—including murder and other violent crimes—have been falling for two decades, the percentage of Americans behind bars has continued to rise (though those proportions appear to have peaked—or temporarily paused—since 2009).

As of year-end 2010, more than 5 percent of all black men in their 40s and nearly 7 percent of those in their 30s were living in state or federal prisons, with additional numbers incarcerated in local jails awaiting trial or sentencing. For Latinos, the corresponding numbers were more than 2 percent and nearly 3 percent. Among Anglos, slightly more than 1 percent of all men in their 30s were sentenced offenders in state or federal prisons—a lower share than for these others, but a higher proportion than in earlier generations. This huge convict population may be described in many different ways—but one way to describe most of its members is as children of the earthquake that shook family structure in America in the era of expansive antipoverty policies.

Financing America’s Social Pathologies

The Great Society’s role in modern America’s social pathologies seems fated for endless and inconclusive debate. What is indisputable, however, is that the new American welfare state facilitated these new American trends by helping to finance them: by providing support for working-age men who are no longer seeking employment and for single women with children who would not be able to maintain independent households without government aid. Regardless of the origins of the flight from work and family breakdown, the War on Poverty and successive welfare policies have made each of these modern tendencies more feasible as mass phenomena in our country today.

Suffice it to say that none of these troubling mass phenomena were envisioned when the War on Poverty commenced. Just the opposite: President Johnson saw the War on Poverty as a campaign to bring dependency on government handouts to an eventual end, not a means of perpetuating it for generations to come. He made this very clear three months after his Great Society speech at the signing ceremony for some of his initial War on Poverty legislation, when he announced:

We are not content to accept the endless growth of relief rolls or welfare rolls. … Our American answer to poverty is not to make the poor more secure in their poverty but to reach down and to help them lift themselves out of the ruts of poverty and move with the large majority along the high road of hope and prosperity. The days of the dole in our country are numbered.

Held against this ideal, the actual unfolding of America’s domestic antipoverty policies can be seen only as a tragic failure. Dependence on government relief, in its many modern versions, is more widespread today, and possibly also more habitual, than at any time in our history. To make matters much worse, such aid has become integral to financing lifestyles and behavioral patterns plainly destructive to our commonwealth—and on a scale far more vast than could have been imagined in an era before such antipoverty aid was all but unconditionally available.

Dr. Eberstadt is the Henry Wendt Chair in Political Economy at the American Enterprise Institute. This article is excerpted from his monograph “The Great Society at Fifty: The Triumph and the Tragedy,” published by the American Enterprise Institute in May 2014. Reprinted with permission of the American Enterprise Institute.

Fall of the Berlin Wall: people from East and West Berlin climbing on the Wall at the Brandenburg Gate, Berlin, Germany (Photo via Newscom)

Lessons of the Cold War

by Lee Edwards and Elizabeth Edwards Spalding

ALL THE GREAT HISTORICAL PERIODS and events are instructive. The Cold War is no exception. It offers lasting lessons that can help us deal with the challenges of the present and the future.

What, then, are the major lessons of the birth and death of the Cold War that can be applied to the conduct of U.S. foreign policy today? The world has changed since 1945 when the Cold War began and 1991 when it ended, but certain things remain true.

Regime Matters

Contrary to Machiavelli and his modern-day realpolitik disciples, power is not everything, even in totalitarian regimes. The philosophical ideas undergirding the regime matter as well, because they guide governments and help us to understand their conduct.

The United States was shaped by ideas drawn from its founding principles. By contrast, the Soviet regime from beginning to end was shaped by Marxism-Leninism: Gorbachev initiated perestroika and glasnost in order to save Soviet Communism, not to introduce Western democracy. Once Communists in the Soviet Union and Eastern Europe admitted they no longer believed in Communism, they destroyed the ideological glue that bound their façade of power and authority.

Similarly, in Iran today, the mullahs who govern the country are guided by their commitment to Islam, a commitment that shapes their world view and influences their conduct on the world stage. In China, the Communist government struggles to rationalize the contrary demands of economic liberalization and political control. As China’s economy inevitably weakens, there will be increased pressure for political liberalization.

Friends and Allies, Real and Potential, Matter

Early and late in the Cold War, the United States assembled and led a grand alliance against the Soviet Union through economic and strategic instruments such as the Marshall Plan, the North Atlantic Treaty Organization, the “police action” in Korea, the deployment of Euromissiles to counter the Soviet SS-20s, its “special relationship” with Great Britain, and the multifaceted Reagan Doctrine. Where it acted more unilaterally, as in Vietnam, it was not successful.

In contrast, the Soviet Union was never able to command true allegiance from the members of the Warsaw Pact or the various nationalities and peoples within the Soviet empire. The Soviet Union was not a true nation, but rather a conglomeration of captive peoples and nationalities united by the Red Army.

Marxism-Leninism was an alien doctrine imposed on the peoples of Eastern Europe and the Soviet Union by an imperial power. Once Western governments began to encourage the people within the “evil empire” to stand up, they did so with increasing confidence and success. The Hungarian Revolution of 1956 was crushed by Soviet tanks, but in the early 1980s, the Communist government of Poland could only “ban” the Solidarity union for fear of alienating the West.

Leadership Matters

The history of the Cold War is the biography of leaders on both sides of the Iron Curtain. It began under Truman and Stalin and was ended by leaders who included Ronald Reagan, Margaret Thatcher, Pope John Paul II, Czech dissident Vaclav Havel, Solidarity founder Lech Walesa, and even Soviet president Mikhail Gorbachev, who helped to end the Cold War by reluctantly abandoning the Brezhnev Doctrine that had propped up the Communist regimes of Eastern Europe for decades. Containment might have continued to be the policy of the United States for years if Reagan had not laid down a new way to wage the Cold War: “We win, and they lose.”

A firm commitment to freedom is something the two Presidents who served at the beginning and end of the Cold War had in common. As important as Ronald Reagan was to the winning of the war, there is much to be learned from the American President who was there at its birth. Harry Truman’s Cold War was a conflict between good and evil, between freedom and tyranny, between liberal democracy and totalitarianism, between capitalism and Communism. His strategy was (1) to articulate America’s basic principles of freedom and equality and (2) to assist those who lived under such principles to maintain them and to aid those under totalitarianism to realize them in the future.

The United States enjoyed successes in the Cold War when led by visionaries, including Truman and Reagan. When American leaders sought merely to manage the Communist threat through détente, however, they were less successful.

Meanwhile, the Soviet Union and its satellites were led by aging tyrants like East German Communist boss Erich Honecker, who in early 1989 declared that the Berlin Wall would stand for at least another 100 years. Gorbachev’s three immediate predecessors had believed that the Soviet Union could spend an estimated 40 percent of its budget on military weapons indefinitely.

Statecraft Matters

Victory over a determined adversary requires not only strength and resolve, but also a strategy relevant to the times and the nation-states involved. Containment was an appropriate strategy in the beginning of the Cold War when the United States was sorting out its domestic and foreign responsibilities and the Soviet Union was in place and in power in Eastern Europe. Forty years later, the United States could take the offensive against an economically weakened Soviet Union and its Communist satellites that had failed to deliver the goods to their peoples and whose Marxist ideology was disintegrating.

A successful U.S. foreign policy depends on the exercise of prudence. It is impossible to predetermine the extent, priority, and immediacy of the nation’s security requirements: They shifted constantly throughout the Cold War as the balance of world forces changed. Likewise, it is impossible to predetermine the challenges and opportunities for furthering American principles and interests in the world. It is therefore impossible to know beforehand what policy prudence will dictate at any particular time and place.

Cold War policies such as the Marshall Plan were prudent: its economic aid helped our World War II allies get back on their feet and at the same time created markets for our goods. Less prudent policies were failures, among them Jimmy Carter’s human rights fixation, which resulted in a Marxist Nicaragua and an Islamist Iran, and the Nixon-Kissinger détente, which allowed the Soviets to surpass us in strategic weapons.

A grand strategy for U.S. foreign policy should begin with the thesis that the United States should step in only when its vital interests are at stake and it has the capability to act. Those interests are:

  • Protecting American territory, sea lanes, and airspace;
  • Preventing a major power from controlling Europe, East Asia, or the Persian Gulf;
  • Ensuring U.S. access to world resources;
  • Expanding free trade throughout the world; and
  • Protecting Americans against threats to their lives and well-being.

Whether it is clashes with Islamic terrorists or long-term challenges from autocratic Communist China or Russia’s economic-strategic attempts to expand its sphere of influence, a prudent foreign policy guided by our founding principles of liberty and justice and based on our capabilities offers the best path for the United States. That is a strategy for the ages.

Dr. Edwards is the Distinguished Fellow in Conservative Thought at the B. Kenneth Simon Center for Principles and Politics at The Heritage Foundation. Dr. Spalding is Associate Professor of Government at Claremont McKenna College. This article is excerpted from their book A Brief History of the Cold War, published by The Heritage Foundation, © 2014 by The Heritage Foundation.


Margaret Thatcher’s 10 Principles for Successful Conservative Leadership

by Nile Gardiner and Stephen Thompson

IN A SPEECH TO THE CONSERVATIVE WOMEN’S CONFERENCE in 1989, toward the end of her time as Prime Minister, Margaret Thatcher took pride in declaring that “we in the Conservative Party are conviction politicians. We know what we believe. We hold fast to our beliefs. And when elected, we put them into practice.” Thatcher’s conviction was fundamental to her success as Britain’s longest continuously serving Prime Minister of the 20th century. As she put it to fellow members of Parliament, “we never put power before principles.”

Adherence to one’s convictions is one of the key principles that Thatcher followed throughout her career, principles that are essential for successful conservative leadership today at every level of government. They also apply to conservative business leaders, whether they are captaining a Fortune 500 company or operating a small business with 50 employees. America needs strong conservative leadership, both in government and the private sector. Because the Iron Lady gave such a splendid example of how it is done, her guiding principles are worth reflecting on.

1. Walk with Destiny and Serve a Higher Purpose

Throughout her political life, Thatcher was driven by a sense of purpose, a clear sense of destiny, and a deep-seated patriotism. “Our supreme loyalty is to the country and the things for which it stands,” she reminded the British people in 1979. Her mission as Prime Minister was never in doubt—to save her country from socialist-driven decline and to stand up for freedom in the face of tyranny. On both fronts she succeeded, changing the course of history for the British nation and, with Ronald Reagan, bringing down a monstrous Soviet empire of tyranny.

The example she tried to follow was that of Churchill during World War II, who was to a great degree her role model, shaping her sense of resolve and determination. She said that Churchill “was the man of that hour, a true figure of destiny, and himself profoundly conscious of the fact.”

This sense of mission and destiny, of living for a higher purpose, distinguishes a great leader from a mediocre one. Thatcher, Churchill, and Reagan all possessed it, yet it is largely absent from the seats of power in Washington and London today. Today’s generation of conservative leaders needs to recapture the spirit of these great figures if the world’s superpower is to be revived and America is to be saved from decline. Margaret Thatcher always thought big, with the future of her nation at heart. She may have come from a small village in Lincolnshire, but her outlook and vision were on a grand scale, driven by selflessness and sacrifice for country. As she declared in a speech to the General Assembly of the Church of Scotland, “there is little hope for democracy if the hearts of men and women in democratic societies cannot be touched by a call to something greater than ourselves.”

2. Lead with Conviction

Margaret Thatcher was above all a conviction politician. Even her fiercest critics would acknowledge that she was driven by unshakeable beliefs. “It is the half-hearted who lose—it is those with conviction who carry the day,” she insisted. Without courage and conviction, Thatcher noted as a newly elected MP for Finchley, “the others are hollow and useless.” The notion that steely conviction is a fault in a leader seemed ridiculous to the Iron Lady: “There would have been no great prophets, no great philosophers in life, no great things to follow, if those who propounded the views had gone out and said ‘Brothers, follow me, I believe in consensus.’” Consensus for its own sake was the preoccupation of the feeble and the faint of heart. It has no place in true leadership.

3. Stick to Core Conservative Ideas

It is no coincidence that Thatcher won three successive general elections and never lost one. She firmly stuck to conservative principles and advanced a consistent message that voters understood. The British electorate knew what they were getting with Margaret Thatcher, and they rewarded her with unprecedented success. She stood for limited government, free enterprise, privatization, low taxation, strong defense, and an unyielding opposition to socialism. She was a champion of small businesses, declaring war on red tape and burdensome regulations, and an enemy of big government and the heavy hand of bureaucracy.

In order to win the war of ideas, conservatives have to be clear in their message and confident in their values. There is always the temptation to “soften” the message, to “reinvent” the brand, to bend and reshape central principles in order to appeal to different sections of society. The UK Conservative Party gave in to this temptation in recent years, a mistake that cost it an outright majority in the 2010 general election (it won only 36 percent of the vote) and forced it into a difficult coalition with the Liberal Democrats.

A conservative party must not sacrifice its principles in pursuit of popularity. Eighteen months after becoming Prime Minister, Thatcher insisted that “the worst betrayal the British people could suffer at the hands of this Government would be for us to seek a little more popularity now by sacrificing all hope of future stability and prosperity. That is not our way.”

4. Understand the Grassroots

Margaret Thatcher was able to lead her country for 11 years because she understood the beating heart of the British people. She was in touch with “Middle England” and the traditional conservative values of the typical British voter, concerned with bread-and-butter issues like the economy, taxes, law and order, immigration, and the quality of public services. As Thatcher noted in the second year of her premiership, “Those who seek to govern must be willing to allow their hearts and minds to lie open to the people.”

Like Ronald Reagan in the United States, she was not from the metropolitan elite. As a grocer’s daughter Thatcher understood the needs and concerns of hard-working, ordinary people trying to make ends meet, often under the most difficult of circumstances.

She appealed not only to middle-class voters, but also to large sections of the working class, who benefited from lower taxes and the selling of millions of council houses (public housing), which greatly boosted home ownership in Britain. Thatcherism managed to win over large cross sections of society because of its truly aspirational nature, offering an opportunity for less well-off voters to share in Britain’s new economic prosperity, through purchasing their own homes and buying shares in newly privatized companies. At the beginning of Margaret Thatcher’s premiership, there were just 3 million private shareholders in Britain. By the end of it that figure had risen to more than 11 million. During the same period, the percentage of Britons who owned their own home rose from 55 percent to 63 percent.

While politicians of the Left stirred up resentment between different class and economic groups, Thatcher’s vision was of a country united by common principles where economic freedom fostered opportunity and achievement. Socialism is the politics of division, fear, and loathing, appealing to the worst instincts of humanity. In contrast, as Thatcherism demonstrated, the free enterprise system appeals to man’s nobler instincts, to his desire to be creative and work hard and to advance prosperity through individual initiative and limited government.

Prime Minister Margaret Thatcher, with Norman Tebbit, acknowledges the cheers of supporters at Conservative headquarters in Smith Square after the Tory victory in May 1979. Photo by The Times.

5. Be Courageous

“Courage,” Margaret Thatcher once said, “is what you show in the heat of the battle, not at the post-mortem.” When running for high office, American presidential candidates are invariably asked how they will respond to that “3 a.m. call,” the moment when a leader must respond to a crisis with fortitude and boldness. That moment came for George W. Bush with the 9/11 attacks on Washington and New York. He rose to the occasion by swiftly launching Operation Enduring Freedom to remove the Taliban from power in Afghanistan and hunt down the terrorists of al-Qaeda. Rudy Giuliani also responded with true grit, inspiring a nation to fight back against Islamist terrorism and recover from the biggest assault on American soil since Pearl Harbor. Who can forget the sight of New York’s mayor walking through the dust-covered, debris-strewn streets of lower Manhattan, leading the city’s rescue efforts on a day when nearly 3,000 people lost their lives at the hands of a barbaric enemy?

“There will be times when the unexpected happens,” Margaret Thatcher said of moments like these. “There will be times when only you can make a certain decision.” She demonstrated that fearlessness herself when Argentina invaded the Falklands in 1982 and again when she confronted the power of Britain’s trade unions during the miners’ strike of 1984-1985. Her leadership in the face of trade union militancy was vital in rescuing the country from its economic paralysis.

But Thatcher’s courage was more than just political. She also displayed tremendous personal courage when the IRA tried to assassinate her in 1984. Not even a terrorist bomb, which narrowly missed killing her in her hotel in Brighton, could keep her from delivering her address to the Conservative Party Conference just a few hours later. The IRA taunted her that day: “Today we were unlucky, but remember, we only have to be lucky once; you will have to be lucky always.” She took no heed of this kind of intimidation and led a sustained British military campaign against Irish Republican terrorists that made them understand that they would gain nothing through a campaign of mass murder. As Thatcher remarked four years later in a speech to women leaders, “The ultimate virtue is courage, the ultimate, the only thing you have got left sometimes—courage and fellowship.”

6. Be Decisive

Political courage and decisiveness go hand in hand, as Margaret Thatcher’s leadership during the Falklands War showed. It is often forgotten that the British task force that sailed 8,000 miles across the world had been assembled within 48 hours. Her decision to launch a liberation force at such short notice was an act of extraordinary leadership, and it carried huge risks. There can be no doubt that the failure of the Falklands mission would have been a national calamity, big enough to bring down the Thatcher government. It would also have scarred the British nation for a generation, deepening a sense of post-imperial decline. “I had total faith in the professionalism, and in the loyalty and morale of the British armed forces,” said Thatcher, and that faith proved to be justified.

7. Be Loyal

On June 11, 2004, Lady Thatcher paid tribute to her great friend Ronald Reagan, delivering a eulogy for the American president at his state funeral in Washington’s National Cathedral. Advised by her doctors against speaking publicly, she had recorded her remarks in advance, and they were played by video during the service.

Thatcher’s tribute to Reagan was so powerful because every word came from the heart of a leader who had stood with Reagan through adversity. Loyalty mattered to Margaret Thatcher, and the strength of the Reagan-Thatcher partnership is unlikely to be matched in our time. Thatcher stood with Reagan not only against the Soviet Union but also against the Libyan dictator Muammar Gaddafi.

The relationship was not a one-way street: Ronald Reagan frequently gave his support and encouragement to Thatcher. Without America’s military backing during the Falklands War, as Thatcher made clear in her memoirs, Britain would not have been able to defeat Argentina and liberate the islands. The close ties between the White House and Downing Street enhanced Thatcher’s influence on the world stage. The world is a far better place, and a safer one, thanks to the strength of the Anglo-American alliance, a partnership that depends upon shared interests and values as well as upon the principle of loyalty between leaders on opposite sides of the Atlantic. For Margaret Thatcher, loyalty was essential to successful leadership. As she put it in a press conference in Washington in 1988, “loyalty is a very positive quality. If you cannot give it yourself, you should not be entitled to receive it from others.”

8. Know Your Brief and Prepare

Margaret Thatcher’s opponents could disagree with her message, criticize her ideas, and condemn her policies, but they could rarely find fault with her command of the facts, her knowledge of her brief, or the power of her delivery. She took pride in being exceptionally well informed on the details of government policy and the issues that her administration faced, no matter how complex or seemingly unimportant. Parliament can be an extremely unforgiving environment. The separation of the executive and legislative branches of the United States government shields the American president from direct questioning by members of Congress. In contrast, a British Prime Minister is expected to face questions from members of Parliament every week when the House is in session—twice a week when Thatcher was Prime Minister. This requires an extraordinary mastery of many subjects, often with little time to acquire it. Thatcher was a formidable debater, as a series of Labour leaders found when they faced her across the dispatch box.

She was also meticulous in her preparation for speeches and television interviews. Major speeches were carefully rehearsed to ensure that every line was delivered with the right tone and emphasis. Thatcher was a naturally gifted orator with a tremendous talent for appealing to the hearts of her audience. But even the greatest public speakers also depend on practice for successful delivery, a lesson that every conservative politician and businessman should heed. There is no substitute for good preparation, and no room for over-confidence, no matter how familiar the speaker is with the topic. A stumbling statement, factual error, or weak message can undercut any political candidate or business leader. In some cases it can even finish a career.

9. Make Your Message Clear

Margaret Thatcher was one of the most successful communicators of the modern era. She could present complex issues in a way that most voters could easily understand. Few politicians in the last 60 years—perhaps only Ronald Reagan and John F. Kennedy—could rival her as a public speaker. She never hid her own admiration for President Reagan’s extraordinary talent for communicating big ideas to ordinary voters, once remarking that “his fundamental instincts are the instincts of most decent, honourable people in democracy. That is why they felt such a sympathy with him—and then he could communicate.”

Thatcher’s speeches, interviews, and statements, like Reagan’s, always conveyed a clear-cut message. Her 1980 speech to the Conservative Party Conference was a case in point, with her delivery of one of the most memorable lines in recent British history. Addressing the party faithful, she confidently declared, “To those waiting with bated breath for that favourite media catchphrase, the ‘U’ turn, I have only one thing to say. ‘You turn if you want to. The lady’s not for turning.’” With a single turn of phrase, she projected resolve and determination and sent a signal that her free-market revolution was here to stay.

That speech powerfully reinforced her image as the “Iron Lady,” which she had earned in 1976 at Kensington Town Hall. Her “Britain Awake” speech, delivered as leader of the opposition, had sent shockwaves through the other side of the Iron Curtain. Her warnings against a Soviet Union bent on “world dominance” forced the Russians to take the measure of a formidable new adversary. The speech made Thatcher, still three and a half years away from governing, an international figure. It also projected Cold War leadership at a time when there was little of it coming from either London or Washington. It was a sequel to Churchill’s landmark Iron Curtain speech and a precursor to Reagan’s 1983 Evil Empire speech. It was one of the few speeches in history that have threatened an empire and compelled the grudging respect—even admiration—of its rulers.

Great speeches rely on brilliant lines, and often on gifted speechwriters. But they will always ring hollow if they are not matched by a clear set of beliefs and a leader who delivers them with conviction.

Thatcher’s speeches succeeded because the message was compelling and based on a core set of beliefs. They were delivered from the heart by a truly great communicator who understood the importance of a clearly defined message.

10. Deliver a Message of Hope and Optimism

Margaret Thatcher’s put-downs of her political opponents are legendary. In hundreds of appearances at the House of Commons dispatch box during Prime Minister’s Questions, she shattered the egos of countless opposition MPs. Her speeches, too, were filled with devastating critiques of Britain’s socialist opposition and its bankrupt ideology, broadsides that frequently brought the house down at party conferences.

In the arena of political combat, Margaret Thatcher had no equal in 1970s and 1980s Britain. But her speeches were also replete with messages of hope and optimism for the future. They were invariably positive in tone, offering a brighter future for the British people. Her words were frequently inspirational, focused on national renewal and the restoration of British greatness. As a candidate for Prime Minister, she made the rejection of national decline her constant theme, presenting an overwhelmingly bright conservative vision for the future.

There is much for American conservatives today to learn from Thatcher’s spirit of optimism. Like Reagan, she was uncompromising in her condemnation of left-wing ideology, but she frequently joined harsh words with a theme of renewal. In politics it is essential to point out and illustrate the flaws and follies of your opponents. As Thatcher demonstrated, it is also vital to present an alternative based on conservative ideas that an electorate can understand.

The depth of despair and economic ruin in 1970s Britain was a national humiliation. Thatcher’s message of hope seemed to many, both at home and abroad, an impossible dream. But she succeeded in turning her country around through an extraordinary faith in the human spirit and in the principles of liberty that sustain it.

Dr. Gardiner is the Director of the Margaret Thatcher Center for Freedom at The Heritage Foundation and a former researcher for Lady Thatcher. Mr. Thompson is a writer and consultant who experienced firsthand how Margaret Thatcher changed Britain while living in Cambridge and London. This article is excerpted from their book, Margaret Thatcher on Leadership: Lessons for American Conservatives Today, published by Regnery Publishing (2013).

You Could Break the Law and Not Even Know It: Why Overcriminalization is Everyone’s Problem and How to Fix It

by Jordan Richardson

[I]f the public are bound to yield obedience to the laws, to which they cannot give their approbation, they are slaves to those who make such laws and enforce them. —Samuel Adams, writing as Candidus, Essay in the Boston Gazette (January 20, 1772)


IMAGINE IF YOUR SON OR DAUGHTER were charged with a crime because he or she reported being bullied or abused by other students in the classroom.

That is what happened to Christian Stanfield, a 15-year-old boy from Pennsylvania who suffers from Attention Deficit Hyperactivity Disorder, anxiety, and a comprehension delay disorder. Because of his condition, Stanfield’s classmates frequently ridiculed and bullied him. They often shoved and tripped him. Once he was nearly burned with a cigarette lighter.

One day, Stanfield decided to put a stop to the abuse. He used his school-issued iPad to make an audio recording of the insults and threats that the students directed at him during math class. According to a transcript of the audio he captured, one student near Stanfield told another: “You should pull his pants down!” The other student replied with a vulgar comment about Stanfield. Moments later came a loud bang as a textbook was slammed shut near Stanfield’s head, followed by the sound of a group of boys laughing in the background.

Confident he had proof that would finally compel the school authorities to intervene, Stanfield brought the audio recording to the principal of South Fayette High School. But instead of receiving help and assistance, Stanfield was met with resistance and the threat of criminal prosecution.

The local police were called to the school and advised Stanfield that he would face a felony wiretapping charge for recording the bullies without their consent. The Pennsylvania wiretapping law was being unreasonably enforced and, by its own terms, was inapplicable in this case. Nevertheless, the threat was enough for the school to suspend the young boy for reporting the abuse.

The bullies in school were now getting help avoiding accountability from bullies in the government.

Stanfield’s story is just one of many episodes of American citizens becoming trapped by an overcriminalized legal system. It is a system that puts a Florida fisherman in prison for six years for importing lobsters packed in plastic rather than paper, that jails a North Carolina man for 45 days for selling hot dogs without a license, that jails a Florida woman for filming a traffic stop. Citizens are finding themselves trapped by the very system that is supposed to protect them; and they are being prosecuted for actions that most people would not recognize as criminal offenses.

While it is important to maintain the rule of law to ensure order in society, it is equally vital to apply the law with fairness and justice. Punishing unsuspecting citizens for morally blameless behavior is unjust and fosters disrespect for the legal system.

What Is Overcriminalization?

When most people hear the term “overcriminalization,” they naturally associate it with an abundance of laws relating to criminal behavior. It is that, but more.

Overcriminalization is the overuse or misuse of the criminal law. Professor Darryl K. Brown of the University of Virginia describes overcriminalization as a “term that captures the normative claim that government creates too many crimes and criminalizes things that should not properly be crimes.” Overcriminalization manifests itself in a variety of ways, including overly broad definitions of criminal acts, excessively harsh sentencing, and imposing criminal sanctions for simple mistakes or accidents under a theory of strict liability. Each aspect of this phenomenon puts ordinary Americans in legal jeopardy for their everyday behavior and undermines the legitimacy of the rule of law.

Take, for example, the bizarre criminal investigation of the Gibson Guitar Company. In 2011, armed federal marshals raided Gibson’s factories searching for illicit wood guitar components. Allegedly, Gibson had imported wood from India and Madagascar in violation of those nations’ laws. The federal government eventually charged Gibson under the Lacey Act, which makes it illegal for an American company to violate the environmental laws of another country. Even though the Madagascan law was not even written in English, Gibson Guitars was forced to sign a “criminal enforcement agreement” and pay a $300,000 penalty.

Overcriminalization also punishes individuals with good intentions for honest mistakes. In 2014, Shaneen Allen, a single mother from Pennsylvania, bought a handgun for protection after being robbed twice in a year. The next week, she drove to her son’s birthday party in New Jersey, where the police pulled her over for a traffic infraction. The police found the gun and, instead of giving Allen a ticket, decided to charge her with a felony gun violation. Allen carried a lawful Pennsylvania permit for the gun and believed she was in compliance with New Jersey regulations. The state eventually decided not to prosecute Allen, offering instead a pretrial intervention program that allowed her to avoid jail time. That decision, however, was made only after a public furor over the possibility that Allen would have to serve at least three years in jail simply for failing to realize her Pennsylvania permit was not valid in New Jersey.

The effects of overcriminalization extend to citizens from all walks of life. Racing legend Bobby Unser was charged with trespassing on federal property when he became lost in a blizzard and drove his snowmobile onto federal land to seek shelter from the storm. Although he was lost and simply trying to escape the blizzard, the federal government charged him, under a theory of strict liability, with trespassing, an offense that carried a potential penalty of a six-month prison sentence and a $5,000 fine.

These instances highlight the reality of how easily the average citizen can be ensnared by the law. But the problem with overcriminalization is not limited to aggressive prosecution or misapplication of the criminal code. As a practical matter, the current system perpetuates the problem of overcriminalization by creating voluminous additions to the law. The list of illegal activities grows longer every year. As a result, people are being deemed criminals for breaking rules and regulations they did not know were in effect.

Consider how many laws exist at the federal level alone: The Code of Federal Regulations (a compendium similar to the U.S. Code) currently contains 80,000 laws in over 200 volumes. There are nearly 5,000 federal crimes and an estimated 300,000 implementing regulations, also known as regulatory crimes, on the books, with more being created every year. These numbers do not even take into account the abundance of state criminal laws and regulations that also exist and continue to expand. The adage that “ignorance of the law is no excuse” no longer squares with the tremendous growth of the criminal law and the modern reality that even judges and lawyers have a hard time discerning what is and is not legal.

James Madison, writing in The Federalist No. 62, warned against the expansive nature of the American legal system:

It will be of little avail to the people, that the laws are made by men of their own choice, if the laws be so voluminous that they cannot be read, or so incoherent that they cannot be understood; if they be repealed or revised before they are promulgated, or undergo such incessant changes that no man, who knows what the law is to-day, can guess what it will be to-morrow. Law is defined to be a rule of action; but how can that be a rule, which is little known, and less fixed?

Wise words. It cannot truly be said that modern citizens can or should reasonably be expected to know, understand, and follow every aspect of the criminal law.

How did America become a place where everyday activity could be construed as a criminal act? One obvious explanation for the proliferation of new laws is political. Institutional incentives encourage legislators seeking re-election to create new laws in an effort to appear “tough on crime” and to respond to constituent demands to “do something” whenever anything bad happens in our society. There is significant political will to enact new criminal statutes, but scant support to repeal bad ones. Each new criminal law, however vague, empowers prosecutors, many of whom have a tendency to treat every questionable act that harms somebody as a crime. The roots of overcriminalization are ingrained in the system.

How Do We Fix the Problem?

A program of reform should do two things: first, reexamine the way laws are written; second, modify the way laws are enforced.

John Malcolm of The Heritage Foundation has laid out a strategy to accomplish the first part of that reform. Malcolm says Congress should identify every federal statute and regulation that contains a criminal provision and post it for the public to read in an easily accessible location. Then, he says, Congress should pass a law that says that in order for a person to be convicted of a crime, he must be proven to have had an intent to break the law. Many laws today lack this criminal state of mind standard—also known as a mens rea standard. Such a standard would protect the average citizen from being turned into a criminal for otherwise normal behavior. Under Malcolm’s plan, such a standard would be the default standard for criminal statutes, unless Congress specifies otherwise.

The second component of reform is to modify the way laws are enforced by creating a “mistake of law” defense. Paul Larkin Jr. of The Heritage Foundation has conducted extensive research about the merits of such a proposal. He writes:

Properly defined and applied, a mistake of law defense would be a valuable addition to the criminal law today. It would exculpate morally blameless parties for conduct that no reasonable person would have thought was a crime. The defense would ensure that no one could be convicted of a crime when criminal liability was unforeseeable.

The importance of this defense is immediately evident when applied to the vast number of obscure regulatory crimes. Citizens who act reasonably and in good faith would be shielded from the unjust application of unduly burdensome or vague laws. Overcriminalization has ruined the lives of thousands of Americans; the response to its effects should be correspondingly vigorous.

The good news is that momentum is building for fixing overcriminalization. The effort is supported by a diverse array of organizations, including the American Civil Liberties Union, Families Against Mandatory Minimums, The Heritage Foundation, Justice Fellowship, the Manhattan Institute, the National Association of Criminal Defense Lawyers, and the Texas Public Policy Foundation. Because overcriminalization could ensnare anybody, the solutions offered to solve this problem unite people from all backgrounds.

Continued pressure is needed to remedy the injustices of overcriminalization and to prevent its continuation. Without serious reform, stories like Stanfield’s will persist as government officials target citizens who never imagined they were committing a crime. Unless the rights of victims of overcriminalization are vindicated, the American system of justice will lose its integrity.

Mr. Richardson is a visiting legal fellow in the Edwin Meese III Center for Legal and Judicial Studies at The Heritage Foundation.