The Need for More Pro-Growth Policies

by Kirk A. Johnson and Rea S. Hederman Jr.

ON AUGUST 30, 2005, THE CENSUS BUREAU reported that the poverty rate was essentially unchanged in 2004 and that income inequality likewise held steady. Nevertheless, the familiar refrain that the “rich are getting richer” has been echoed once again. It is important to understand the Census numbers on poverty and income inequality and to draw the correct conclusions from them. While the report overstates both income inequality and poverty, the findings do show that Americans would benefit from expanded economic opportunities. Pro-growth policies, such as a less distortionary tax code, are the way to achieve this end.

The poverty rate increased to 12.7 percent in 2004, up slightly from the 12.5 percent reported in 2003. While a rise in the poverty rate is not a good sign, there are a variety of positive indicators in the latest report.

Poverty rose in only one region of the country, the Midwest, while rates in the Northeast, South, and West were unchanged. Poverty rose for white (non-Hispanic) adults but not for children or minorities. The poverty rate for Asians dropped by two percentage points, and the rate among the elderly fell from 10.2 to 9.8 percent.

The rise in poverty occurred almost exclusively among working-age adults. This points the way to improvement: more pro-growth policies that produce more jobs, lowering the unemployment rate and poverty in America. Between late 2001 and early 2004, the unemployment rate fluctuated between 5.7 percent and 6.3 percent nationwide. Only in the past year, after the full force of the President’s pro-growth 2003 tax cuts had been felt, has there been a persistent decrease in the unemployment rate, which now stands at 5.0 percent, according to the Department of Labor.

Policymakers should resist calls for increased government social spending. Welfare reform demonstrated that greater economic opportunity comes from increased work and economic independence, not from generous government programs. If the current positive employment numbers persist for the remainder of the year, the 2005 poverty rate will likely come in below the 2004 rate.

The poverty numbers, as the Census Bureau currently presents them, have a variety of flaws that limit their usefulness. The poverty rate is determined by the money income definition, which excludes in-kind, non-cash benefits such as Medicare, Medicaid, food stamps, and other forms of assistance. The Census figures also do not account for taxes paid or for refundable tax credits such as the Earned Income Tax Credit (EITC). If taxes were subtracted from income and non-cash benefits were added, the poverty rate would be more than two percentage points lower than reported, based on historical experience. For example, a single parent who earns $11,000 per year and has two children would be considered poor by Census standards. However, that family would receive more than $4,000 under the EITC, raising its income sufficiently to clear the Census definition of poverty.
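The adjustment described above can be sketched in a few lines. Note that the poverty threshold and the exact EITC amount used here are illustrative assumptions, not official Census Bureau or IRS figures:

```python
# Sketch of the income adjustment described above. The threshold and
# EITC amounts are illustrative assumptions, not official figures.
POVERTY_THRESHOLD_FAMILY_OF_3 = 15_000  # assumed threshold for a family of three

def money_income(earnings):
    """Census 'money income': cash earnings only, before taxes,
    excluding the EITC and non-cash benefits."""
    return earnings

def adjusted_income(earnings, eitc=0, noncash_benefits=0, taxes_paid=0):
    """Income after adding refundable credits and in-kind benefits
    and subtracting taxes actually paid."""
    return earnings + eitc + noncash_benefits - taxes_paid

earnings = 11_000
# Poor under the money-income definition...
print(money_income(earnings) < POVERTY_THRESHOLD_FAMILY_OF_3)
# ...but above the assumed threshold once an assumed $4,100 EITC is counted.
print(adjusted_income(earnings, eitc=4_100) < POVERTY_THRESHOLD_FAMILY_OF_3)
```

The same family is classified as poor or not poor depending purely on which definition of income is applied, which is the article's point about the limits of the headline statistic.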

Income inequality is an example of bad data informing public policy. The Census Bureau considers only money income in its inequality calculations and does not consider the effects of taxation. Because the United States has a progressive tax structure, low-income individuals pay little in taxes, while high-income individuals pay most of the individual income taxes in America. Thus, Census reported that the top quintile earned 50.1 percent of money income in 2004, slightly above its 49.8 percent share in 2003. The share of income going to the bottom two quintiles was unchanged at 3.4 percent and 8.7 percent, respectively. If taxes were included in these calculations, the results would show much less income inequality in the United States.

Another major problem with the Census numbers is that the Census Bureau bases its quintile statistics on households instead of individuals. Many households in the bottom quintile are single-parent households or single-person units. In contrast, households in the upper quintile are generally larger. Earlier work shows that the top quintile of households may account for almost a quarter of the population, compared to only 15 percent in the bottom quintile. Thus, households in the upper quintile contain more earners than households in the lower quintile. Census data show that the level of income inequality is partly explained by the difference in hours of work between the different quintiles. The top two quintiles, for example, account for well over half of all work in the United States.
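The household-versus-person distinction can be made concrete with a toy calculation; the household sizes and incomes below are invented purely for illustration, not drawn from Census data:

```python
# Invented data: bottom-quintile households are single persons,
# top-quintile households hold four persons each.
bottom = [(1, 10_000), (1, 12_000)]   # (persons, household income)
top = [(4, 150_000), (4, 160_000)]

def per_household(group):
    # Average income per household, the basis of the Census quintiles.
    return sum(inc for _, inc in group) / len(group)

def per_person(group):
    # Average income per person living in those households.
    return sum(inc for _, inc in group) / sum(n for n, _ in group)

# The gap between top and bottom looks far larger per household
# than per person, because top-quintile households contain more people.
household_ratio = per_household(top) / per_household(bottom)  # ~14x
person_ratio = per_person(top) / per_person(bottom)           # ~3.5x
print(household_ratio, person_ratio)
```

With these made-up numbers the top-to-bottom gap shrinks from roughly 14-to-1 per household to about 3.5-to-1 per person, which is the direction of the effect the article describes.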

While the annual income and poverty report provides good information, it is limited in its usefulness. In particular, the report overstates both income inequality and poverty in America. Policymakers would therefore be wise to take the report’s findings with a grain of salt.

Furthermore, policymakers should continue to enact pro-growth policies that expand the economic opportunities of Americans rather than support more social program spending.

Kirk A. Johnson, Ph.D., is Senior Policy Analyst, and Rea S. Hederman, Jr., is Senior Policy Analyst, in the Center for Data Analysis at The Heritage Foundation.

The Roads to Serfdom

by Theodore Dalrymple

PEOPLE IN BRITAIN WHO LIVED through World War II do not remember it with anything like the horror one might have expected. In fact, they often remember it as the best time of their lives. Even allowing for the tendency of time to burnish unpleasant memories with a patina of romance, this is extraordinary. The war, after all, was a time of material shortage, terror, and loss: What could possibly have been good about it?

The answer, of course, is that it provided a powerful existential meaning and purpose. The population suffered at the hands of an easily identifiable external enemy, whose evil intentions it became the overriding purpose of the whole nation to thwart. A unified and preeminent national goal provided respite from the peacetime cacophony of complaint, bickering, and social division. And privation for a purpose brings its own content.

The war having instantaneously created a nostalgia for the sense of unity and transcendent purpose that prevailed in those years, the population naturally enough asked why such a mood could not persist into the peace that followed. Why couldn’t the dedication of millions, centrally coordinated by the government—a coordinated dedication that had produced unprecedented quantities of aircraft and munitions—be adapted to defeat what London School of Economics head Sir William Beveridge, in his wartime report on social services that was to usher in the full-scale welfare state in Britain, called the “five giants on the road to reconstruction”: Want, Disease, Ignorance, Squalor, and Idleness?

By the time Beveridge published his report in 1942, most of the intellectuals of the day assumed that the government, and only the government, could accomplish these desirable goals. Indeed, it all seemed so simple a matter that only the cupidity and stupidity of the rich could have prevented these ends from already having been achieved. The Beveridge Report states, for example, that want “could have been abolished in Britain before the present war” and that “the income available to the British people was ample for such a purpose.” It was just a matter of dividing the national income cake into more equal slices by means of redistributive taxation. If the political will was there, the way was there; there was no need to worry about effects on wealth creation or any other adverse effects.

The growing spirit of collectivism in Britain during the war provoked an Austrian economist who had taken refuge there, F. A. von Hayek, to write a polemical counterblast to the trend: The Road to Serfdom, published in 1944. It went through six printings in its first year, but its effect on majority opinion was, for many years to come, negligible. Hayek believed that while intellectuals in modern liberal democracies—those to whom he somewhat contemptuously referred as the professional secondhand dealers in ideas—did not usually have direct access to power, the theories that they diffused among the population ultimately had a profound, even determining, influence upon their society. Intellectuals are of far greater importance than appears at first sight.

Hayek was therefore alarmed at the general acceptance of collectivist arguments—or worse still, assumptions—by British intellectuals of all classes. He had seen the process—or thought he had seen it—before, in the German-speaking world from which he came, and he feared that Britain would likewise slide down the totalitarian path. Moreover, at the time he wrote, the “success” of the two major totalitarian powers in Europe, Nazi Germany and Soviet Russia, seemed to have justified the belief that a plan was necessary to coordinate human activity toward a consciously chosen goal. Against the collectivists, Hayek brought powerful—and, to my mind, obvious—arguments that, however, were scarcely new or original. Nevertheless, it is often, perhaps usually, more important to remind people of old truths than to introduce them to new ones.

Hayek pointed out that the wartime unity of purpose was atypical: In more normal times, people had a far greater, indeed an infinite, variety of ends, and anyone with the power to adjudicate among them in the name of a conscious overall national plan, allowing a few but forbidding most, would exert vastly more power than the most bloated plutocrat of socialist propaganda had ever done in a free-market society.

Collectivist thinking arose, according to Hayek, from impatience, a lack of historical perspective, and an arrogant belief that, because we have made so much technological progress, everything must be susceptible to human control. While we take material advance for granted as soon as it occurs, we consider remaining social problems as unprecedented and anomalous, and we propose solutions that actually make more difficult further progress of the very kind that we have forgotten ever happened. While everyone saw the misery the Great Depression caused, for example, few realized that, even so, living standards actually continued to rise for the majority. If we live entirely in the moment, as if the world were created exactly as we now find it, we are almost bound to propose solutions that bring even worse problems in their wake.

In reaction to the unemployment rampant in what W. H. Auden called “the low dishonest decade” before the war, the Beveridge Report suggested that it was government’s function to maximize security of income and employment. This proposition was bound to appeal strongly to people who remembered mass unemployment and collapsing wages; but however high-minded and generous it might have sounded, it was wrong. Hayek pointed out that you can’t give everyone a job irrespective of demand without sparking severe inflation. And you can no more protect one group of workers’ wages against market fluctuations without penalizing another group than you can discriminate positively in one group’s favor without discriminating negatively against another. This is so, and it is beyond any individual human’s control that it should be so. Therefore, no amount of planning would ever make Beveridge’s goals possible, however desirable they might be in the abstract.

But just because a goal is logically impossible to achieve does not mean that it must be without effect on human affairs. As the history of the twentieth century demonstrates perhaps better than any other, impossible goals have had at least as great an effect on human existence as more limited and possible ones.

The most interesting aspect of Hayek’s book, however, is not his refutation of collectivist ideas—which, necessary as it might have been at that moment, was not by any means original. Rather, it is his observations of the moral and psychological effects of the collectivist ideal that, 60 years later, capture the imagination—mine, at least.

Hayek thought he had observed an important change in the character of the British people, as a result both of their collectivist aspirations and of such collectivist measures as had already been legislated. He noted, for example, a shift in the locus of people’s moral concern. Increasingly, it was the state of society or the world as a whole that engaged their moral passion, not their own conduct. “It is, however, more than doubtful whether a fifty years’ approach towards collectivism has raised our moral standards, or whether the change has not rather been in the opposite direction,” he wrote. “Though we are in the habit of priding ourselves on our more sensitive social conscience, it is by no means clear that this is justified by the practice of our individual conduct.” In fact, “It may even be . . . that the passion for collective action is a way in which we now without compunction collectively indulge in that selfishness which as individuals we had learnt a little to restrain.”

Thus, to take a trifling instance, it is the duty of the city council to keep the streets clean; therefore my own conduct in this regard is morally irrelevant—which no doubt explains why so many young Britons now leave a trail of litter behind them wherever they go. If the streets are filthy, it is the council’s fault. Indeed, if anything is wrong—for example, my unhealthy diet—it is someone else’s fault, and the job of the public power to correct. Hayek—with the perspective of a foreigner who had adopted England as his home—could perceive a further tendency that has become much more pronounced since then: “There is one aspect of the change in moral values brought about by the advance of collectivism which at the present time provides special food for thought. It is that the virtues which are held less and less in esteem and which consequently become rarer are precisely those on which the British people justly prided themselves and in which they were generally agreed to excel. The virtues possessed by the British people in a higher degree than most other people . . . were independence and self-reliance, individual initiative and local responsibility . . . non-interference with one’s neighbour and tolerance of the different and queer, respect for custom and tradition, and a healthy suspicion of power and authority.”

The British are no longer sturdily independent as individuals, either, and now feel no shame or even unease, as not long ago they would have felt, at accepting government handouts. Indeed, 40 percent of them now receive such handouts: For example, the parents of every child are entitled not merely to a tax reduction but to an actual payment in cash, no matter the state of their finances. As for those who, though able-bodied and perfectly able to work, are completely dependent on the state for their income, they unashamedly call the day when their welfare checks arrive “payday.” Between work and parasitism they see no difference. “I’m getting paid today,” they say, having not only accepted but thoroughly internalized the doctrine propounded in the Beveridge Report, that it is the duty of the state to assure everyone of a decent minimum standard of life regardless of his conduct. The fact of having drawn 16 breaths a minute, 24 hours a day, is sufficient to entitle each of them to his minimum; and oddly enough, Hayek saw no danger in this and even endorsed the idea. He did not see that to guarantee a decent minimum standard of life would demoralize not only those who accepted it, but those who worked in the more menial occupations, and whose wages would almost inevitably give them a standard of living scarcely higher than that of the decent minimum provided merely for drawing breath.

In any case, Hayek did not quite understand the source of the collectivist rot in Britain. It is true, of course, that an individualist society needs a free, or at least a free-ish, market; but a necessary condition is not a sufficient one. It is not surprising, though, that he should have emphasized the danger of a centrally planned economy when so prominent a figure as Orwell—who was a genuine friend of personal liberty, who valued the peculiarities of English life, and who wrote movingly about such national eccentricities as a taste for racy seaside postcards and a love of public school stories—should so little have understood the preconditions of English personal liberty that he wrote, only three years before Hayek’s book was published: “The liberty of the individual is still believed in, almost as in the nineteenth century. But this has nothing to do with economic liberty, the right to exploit others for profit.”

It is depressing to see a man like Orwell equating profit with exploitation. And it is certainly true that Britain after the war took no heed of Hayek and for a time seemed bent on state control of what were then called “the commanding heights of the economy.” Not only did the Labour government nationalize health care, but also coal mining, electricity and gas supply, the railways and public transportation (including the airlines), telecommunications, and even most of the car industry. Yet at no time could it remotely be said that Britain was slipping down the totalitarian path.

The real danger was far more insidious, and Hayek incompletely understood it. The destruction of the British character did not come from Nazi- or Soviet-style nationalization or centralized planning, as Hayek believed it would. For collectivism proved to be not nearly as incompatible with, or diametrically opposed to, a free, or free-ish, market as he had supposed. The effect of collectivist thought on a capitalist society would not be socialism, but something quite distinct. The means of production would remain in private hands, but the state would offer workers certain benefits, in return for their quiescence and agreement not to agitate for total expropriation as demanded in socialist propaganda.

The state action that was supposed to lead to the elimination of Beveridge’s five giants of Want, Disease, Ignorance, Squalor, and Idleness has left many people in contemporary Britain with very little of importance to decide for themselves, even in their own private spheres. They are educated by the state (at least nominally), as are their children in turn; the state provides for them in old age and has made saving unnecessary or, in some cases, actually uneconomic; they are treated and cured by the state when they are ill; they are housed by the state, if they cannot otherwise afford decent housing. Their choices concern only sex and shopping.

No wonder that the British have changed in character, their sturdy independence replaced with passivity, querulousness, or even, at the lower reaches of society, a sullen resentment that not enough has been or is being done for them. For those at the bottom, such money as they receive is, in effect, pocket money, like the money children get from their parents, reserved for the satisfaction of whims. As a result, they are infantilized. If they behave irresponsibly—for example, by abandoning their own children wherever they father them—it is because both the rewards for behaving responsibly and the penalties for behaving irresponsibly have vanished. Such people come to live in a limbo, in which there is nothing much to hope or strive for and nothing much to fear or lose. Private property and consumerism coexist with collectivism, and freedom for many people now means little more than choice among goods. The free market, as Hayek did not foresee, has flourished alongside the collectivism that was—and, after years of propaganda, still is—justified by the need to eliminate the five giants. For most of the British population today, the notion that people could solve many of the problems of society without governmental Gleichschaltung, the Nazi term for overall coordination, is completely alien.

Of course, the majority of Britons are still not direct dependents of the state. “Only” about a third of them are: the 25 percent of the working population who are public employees (the government has increased them by nearly 1 million since 1997, no doubt in order to boost its election chances); and the 8 percent of the adult population either unemployed or registered as disabled, and thus utterly dependent on government handouts. But the state looms large in all our lives, not only in its intrusions, but in our thoughts: for so thoroughly have we drunk at the wells of collectivism that we see the state always as the solution to any problem, never as an obstacle to be overcome. One can gauge how completely collectivism has entered our soul—so that we are now a people of the government, for the government, by the government—by a strange but characteristic British locution. When, on the rare occasions that our Chancellor of the Exchequer reduces a tax, he is said to have “given money away.” In other words, all money is his, and whatever we have in our pockets is what he, by grace and favor, has allowed us.

Our Father, which art in Downing Street. . . .

Theodore Dalrymple is a Contributing Editor of City Journal. A version of this article originally appeared in the Fall 2005 edition of City Journal. Reprinted with permission from City Journal.

The Collapse of Free Association

by Frank Prochaska

WHILE CENTRAL GOVERNMENT was little noticed in the 1850s, the tendrils of the state were everywhere to be seen a century later, from the local surgery to the unemployment office on the High Street. Translated into quantitative terms, British government spent less than 8 percent of gross national product in the 1900s and over 50 percent in the 1960s (Jose Harris, “Society and state in twentieth-century Britain,” in The Cambridge Social History of Britain, 1750–1950, ed. F. M. L. Thompson (Cambridge, 1990), vol. 3, p. 64). Victorians held government in esteem but expected little from it on social issues. In a national culture dominated by Christianity, they commonly believed that poverty was ineradicable, yet they sought its amelioration through voluntary service. A century later, most Britons believed poverty could be abolished, but that responsibility for welfare provision resided in the political process. With collectivism in the ascendant, the payment of taxes had become the primary civic duty.

To the Victorian mind, democracy was immanent in institutions. Before the advent of universal suffrage, the nation’s charities, societies for mutual aid, voluntary schools and various other bodies represented the most effective way for disparate groups to have an influence in their communities and integrate into the wider society. Self-governing, voluntary institutions gave a voice to those who were excluded, or felt excluded, from the political nation: dissenters, minorities, women and the working classes. Through the culture of free association, which had its origins in the Reformation, the most obscure sects could prosper in their own enclave of belief. Voluntary societies not only made life more bearable and human, but propelled those traditions of free association that were thought essential to the creation of a vibrant democracy.

Associations, it was often said, were the nurseries of democracy, providing opportunities for grassroots participation, moral training, and lessons in decision-making and organization. In the nineteenth century there were literally millions of them in Britain, from the humble mothers’ meeting and burial club to the great missionary societies and charitable hospitals.

By the twentieth century, however, voluntarists were increasingly on the defensive. In an age of social science, mass politics and national priorities, they looked increasingly parochial. In a culture growing more urban and diverse, they had difficulty rebutting criticisms that charity and mutual aid were patchy and inadequate. Unemployment and two world wars, which accelerated government controls, pushed the voluntary sector to the margins of social reform. The extraordinary circumstances of the Second World War had boosted Labour’s planning mentality, and its leadership paid scant heed to the democratic impulses and good offices of voluntary societies with their ethic of contributory citizenship. By the end of the Second World War, the citizenry looked to government, not to self-governing institutions, for redress. The representative principle, which developed a magical hold on the citizenry after the extension of the suffrage in 1918, had trumped the principle of duty.

The vast expansion of state-directed health and welfare services after the war threw voluntarists into disarray. Central government largely displaced the vast array of voluntary institutions in the provision of health and social services. Politicians of all parties, transfixed by the role of the welfare state in their election prospects, narrowed discussion of social policy down to government action. So did civil servants in the expanding welfare departments, who jealously guarded their new authority. In the heyday of centralized bureaucratic administration, social policy shifted from the local to the national and from the religious to the secular. Indirect, representative democracy, expressed through Cabinet government, now reigned supreme in educational and social policy over the spontaneous form of democracy inherent in voluntary institutions. To put it another way, the ministerial, civil service state had routed civic pluralism, whose foundations lay in Christian and humanistic notions of individual responsibility.

As a consequence of the state’s ascendancy in welfare, the public and the surviving voluntary institutions generally dealt less directly with social issues, leaving the individual disconnected. Individuals were in some ways more impotent in an age of universal suffrage and parliamentary democracy than their disenfranchised ancestors had been under an oligarchic system. Those self-governing local institutions, which connected citizens to their communities, gave them a measure of direct control over their own affairs. But the nationalization and professionalization of the social services made such institutions look provincial and amateurish. Clearly, something fundamental had happened to British culture, once so voluntarist, in which the burden of care shifted so radically to government, in which volunteering became characterized as a frill and faceless officials doled out the nation’s capital in the name of progress and “the people.”

In compensation for the decline of rival sources of democracy, politicians and social commentators sought to replace the sense of community, which people had built up in the past out of family life and self-governing institutions, with a sense of national community, built out of central bureaucratic structures and party politics. In passing social legislation, government acted in the name of freedom, progress and social justice. The beauty of such abstractions perhaps blinded the public to the dangers of overburdening the state. But the more the government expanded its role into areas that were formerly the responsibility of families and voluntary institutions, the more it reduced the scope for individual service and social interaction. With the years, the notion that a representative government had tutelary power over the citizenry took hold, and with it the concept of ministerial responsibility for social provision from the cradle to the grave.

In the 1950s and 1960s, there was a rearguard campaign to counter the effects of an impersonal state devastating traditional allegiances and local institutions, but the public in general seemed content to queue up for their false teeth and child benefits. A few social critics, often Christian in background, complained of a bloodless takeover of civic responsibility by anonymous bureaucrats. Tocqueville, who believed that Christian charity was essential to social well-being, had argued that without a culture of association, democratic nations became prey to overbearing government prone to a benign form of despotism, in which the citizenry exchanged freedom for benefits. As he put the case forcefully in Democracy in America:

Such a power does not destroy, but it prevents existence; it does not tyrannize, but it compresses, enervates, extinguishes, and stupefies a people, till each nation is reduced to nothing better than a flock of timid and industrious animals, of which the government is the shepherd.

Dr. Frank Prochaska is Lecturer in History, Yale University. A version of this article originally appeared on the Web site of the Social Affairs Unit.

The Most Influential Person You Never Heard Of

by Tim Worstall

ARTHUR SELDON DIED on October 11 at the age of 89. Few outside policy wonk circles will have heard of him. He may thus merit the title of the most influential person most people have never heard of, for he was behind the intellectual sea change that led to both Thatcherism and Reaganism. As such he merited obituaries in the New York Times, The Times of London, the Daily Telegraph and the Guardian as well as appreciations from think tanks like the Adam Smith Institute, and it’s the latter that gives a clue as to why he was indeed so influential.

Sir Anthony Fisher, having made his fortune by introducing broiler chickens to the UK (sort of a Frank Perdue for his times), got to know Friedrich Hayek and expressed an interest in going into politics in order to contribute to the ongoing debate as to how and where the country was going. Hayek convinced him that influencing the debate, providing the ideas, was a better way of wielding such influence—and so the Institute of Economic Affairs was formed. Seldon was the editorial director and Ralph Harris (now Lord Harris) the general director. Seldon had been educated by both Hayek and Lionel Robbins at the London School of Economics in the 1930s and had also taught there after the war.

What followed was a flood of books, articles, and pamphlets by Seldon, Harris, and any number of eminent economists (Hayek, Milton Friedman, and other Nobel Laureates amongst them) which, in time, raised the precepts of classical liberalism from the low point they had reached at the end of the 1950s.

The New York Times described Seldon as a libertarian and, while this may be true in an American sense, he was a liberal of the old school—he had indeed been prominent in the Liberal Party. As the Times pointed out:

For years the State had been seen as the preeminent force in managing the economy and providing social security. Seldon was a tireless advocate of replacing the welfare state and of allowing natural economic laws of supply and demand to increase national wealth more effectively than the man in Whitehall could ever do.

Not that he had ever been an enthusiast for the Conservative Party. Fundamentally Seldon was an old-fashioned Liberal who believed in the liberty—and responsibility—of the individual.

This was in fundamental opposition to both Labour and Conservative thinking at the time, which held that the Man in Whitehall really did know best. It was simply the duty of those parties to manage that Man as best they could, rather than any ideal of getting the State off the backs of the people. Seldon wasn’t at all a proponent of what are thought of as the more extreme shores of libertarianism but rather thought that the State crowded out those examples of voluntary cooperation and communalism which had existed before the welfare system overcame them—the Friendly Societies, for example, from which his adoptive mother had benefited at the time of her husband’s death.

As the Telegraph wrote:

For Seldon, the profit motive governed by consumers in an open competitive economy was more truly democratic—and wholesome—than the vote motive operating in a regime of so-called representative government dominated by pressure groups.

A sentiment which should have some resonance for those in the Porkbusters campaign going on at the moment. Wouldn’t ours be a better place, a fairer society, if we were indeed left alone to make our own decisions, actually empowered in our dealings by being consumers and customers rather than supplicants to the bureaucracy? It was towards this end that he was an untiring champion of educational vouchers, something he lived to see enter the mainstream political debate in the UK but, alas, not to see implemented.

To give a true measure of his influence, consider this from the writer Mark Steyn in the Spectator:

[S]uccessful conservatives don’t move towards the “political centre.” They move the political centre towards them. That’s what Thatcher and Reagan both did … . If Labour is at 1 on the scale and the Tories are at 9, and their focus groups tell them to move to 5, they have ensured that henceforth the centre will be 3, and they’ll be fighting entirely on the Left’s terms and the Left’s issues. …

Conservatives win when they champion ideas. They win in two ways: sometimes they get elected; but, even if they don’t, their sheer creative energy forces an ever more intellectually bankrupt Left to grab whatever right-wing ideas they figure they can slip past their own base.

Replace “conservatives” with “liberals” (for that is what Seldon was—far too radical to be conservative in the English sense, and very much a liberal in that same sense) and that is exactly what he did.

Remember, when he and Harris started out in the ’50s, both the Conservatives and Labour thought that the Health Service should be exclusively provided by the State, with what private provision was left a mere hangover from an earlier time. The school system was just beginning to be made comprehensive, with parental choice being removed. The “commanding heights” of the economy were nationalized or about to be (steel, coal, shipbuilding, car manufacturing and so on) and it was thought by all that this should continue to be so. Government should micromanage the economy, to the extent of deciding how much money each individual could take out of the country when on holiday. In everything, a bureaucrat in his office knew better than individuals.

I might also point out that the Liberal Party of the day was so sidelined that at one point its entire number of MPs could fit in one London taxi…each with his own seat.

The Thatcher Revolution, of course, made a difference, but it is the ideas themselves that have lasted much longer. It is the current Labour Government that is bringing academic selection and parental choice back into schools, insisting that private companies be allowed to bid for work from the National Health Service and privatizing the Air Traffic Control system.

To have, as the phrase goes, not so much won the game as to have pulled the board—the place of conflict—over to your ground is a grand and great achievement in politics, one showing how much more influential one can be when proposing ideas rather than a specific electoral program.

Tim Worstall is a writer in the United Kingdom. Reprinted with permission from

No Longer Standing “Athwart History, Yelling Stop”

A HALF-CENTURY AGO, the National Review published its first issue, a thirty-two-page weekly that featured writers such as Willmoore Kendall, Frank Meyer, Russell Kirk, and, of course, William F. Buckley. While many of the pressing issues of November 1955 no longer trouble the scene of 2005—most notably, the problem of the Soviet Union—a quick perusal of this first Review reveals that some things have stayed the same. The France of 1955 suffered from “economic, social, political and moral weaknesses,” an observation that would please NR political reporter John J. Miller, who in 2004 wrote Our Oldest Enemy: A History of America’s Disastrous Relationship with France. The Ninth U.S. Circuit Court of Appeals was angering conservatives then, as it still does today. In 1955, that court would make it impossible “for a concealed agent of the FBI to put the finger on clandestine Communists.” Today, it wants to make it impossible for school children to recite the Pledge of Allegiance.

The similarities between then and now do not indicate that conservatism occupies the same place that it did a half-century ago. Reading the National Review of 1955, one senses some desperation. Conservatives of that era stood “athwart history, yelling Stop,” fighting against what seemed like the unstoppable movement toward collectivism. Conservatives of 2005, as the recent groundswell over government spending reminds us, are firmly on the offensive. The National Review and subsequent journals and organizations deserve credit for the rise of conservatism, but this rise would not have been possible without core principles that are as relevant today as they were in 1955. This is how the National Review put its “Credenda” in the first issue.

Reprinted with permission from National Review.

Taxpayer-Funded Lobbying: Taxman vs. Taxpayer

by Peggy Venable

WHEN THOMAS JEFFERSON WROTE, “To compel a man to furnish funds for the propagation of ideas he disbelieves and abhors is sinful and tyrannical,” he likely had no idea that the practice would become commonplace at state capitols across the country.

Lobbying with tax dollars is a growth industry. The Texas Association of Counties (TAC) was established 30 years ago to provide services to Texas counties and includes lobbying as one of those services. Just five years ago, TAC had three registered lobbyists at the Texas Capitol. By 2005, their lobby list grew to 15. TAC is only one of many organizations which use taxpayer dollars to lobby—often against taxpayer interests—pitting the tax man against the taxpayer.

During the recent legislative session in Texas, TAC joined the Texas Municipal League (TML) and other taxpayer-funded organizations to oppose what many taxpayers supported—a Taxpayer Bill of Rights-type measure to limit the growth in government and require a vote to increase spending. The legislation was introduced to allow taxpayers to keep any property tax savings that would be realized by offsetting local property taxes with state funds.

These organizations’ opposition to taxpayer protections, coupled with the TML’s support for the City of New London in the infamous Kelo decision, makes it clear that public dollars are funding efforts which, when revealed, leave many taxpayers steamed.

How does this happen? Some cities, counties and other government entities hire lobbyists to represent them at the capitol while others register staff as lobbyists during session. And most cities and counties (as well as other local taxing entities) join organizations that do the lobbying for them, essentially carrying the water for local elected officials at taxpayer expense.

Taxpayer-funded lobbying clearly distorts the democratic process. Government should not be in the business of providing funding to give voice to points of view that may not represent the views of the majority of the taxpayers. Allowing the government the authority to allocate taxpayer funds for lobbying transforms government from its appropriate role as a neutral policymaker into an advocate of certain policies and ideologies. This situation produces fertile ground for abuse and shields elected officials in cities, counties, and schools by allowing them to hide behind the lobbying activities of associations and other hired guns.

Texas legislation which would have required disclosure of lobbying expenses by school districts—part of an extensive education reform bill—was met with massive opposition by the education lobby. That lobby, much of it fueled by tax dollars, fervently fought greater accountability and financial disclosure measures.

The Texas legislature failed to pass reforms during the regular session and two subsequent called sessions this year. But Texas Gov. Rick Perry issued an executive order requiring school districts to disclose their expenses for lobbying, public relations, and lawsuits, including expenses for the school finance lawsuit suing the state for more funding.

Disclosure is long overdue. But prohibiting the practice of using public funds to lobby is the relief taxpayers are seeking—and help may be on the way.

During the first called session, one media outlet wrote about two announcements made that day. One announcement—when Agriculture Commissioner Susan Combs announced her candidacy for Texas Comptroller—was described as “expected,” but the second announcement was described as having “sent shockwaves through the halls of the capitol.”

Those “shockwaves” were caused by three taxpayers filing a lawsuit against one Texas county, alleging the county illegally expended funds to join an association that lobbies. The lawsuit is Venable v. Williamson County, and I am joined in the lawsuit by Americans for Prosperity (AFP) members Janice Brauner and Judy Morris, fellow Williamson County taxpayers.

The lawsuit was filed as a last resort, and it seeks to force counties to comply with state law. The suit asserts that Williamson County has used general revenue funds to support county associations, including the Texas Association of Counties (TAC). TAC employs 15 registered lobbyists and participates in lobbying activities, yet it has contended that its 15 paid lobbyists don’t really lobby and that it is in compliance with state law.

Texas Local Government Code statute 89.002 states, “The commissioners court may spend, in the name of the county, money from the county’s general fund for membership fees and dues of a nonprofit state association of the counties if…neither the association nor an employee of the association directly or indirectly influences or attempts to influence the outcome of any legislation pending before the legislature….” The lawsuit cites numerous examples of TAC communications which are clearly efforts to influence legislation.

Though this legal action is against one county, the practice is widespread. Of the 254 Texas counties, we have not found a single county which joined associations and did not use general revenue funds.

In the recent legislative session, county associations, funded with public dollars, opposed taxpayer protections that would have provided greater public accountability and fiscal transparency. The legislation these associations opposed would have helped end “taxation by valuation” and would have provided “truth in taxation.” The county associations joined the cities’ lobby and opposed both appraisal caps and local tax and expenditure limitations (TELs).

Gov. Perry had reason for concern. When then-Governor George W. Bush provided over $1 billion in local property tax relief in the late ’90s, taxpayers never got to realize the savings: local taxing entities devoured them, and with rising property appraisals, tax bills increased even when tax rates were lowered. Texas legislators were also lobbied hard by taxpayer-funded lobbyists against a new “Truth in Taxation” initiative that would require local elected officials to vote on whether to keep the extra revenue generated by appraisal increases, legislation which would essentially stop taxation by valuation. As it stands, local officials can tell voters they didn’t raise their tax rates even as local entities increase their revenues through appraisal increases.

Lobbying with taxpayer dollars has cost Williamson County residents approximately half a million dollars over the past 10 years, and AFP estimates that as much as half a billion dollars in public funds is spent on lobbying over the course of a biennium. Though this cost is considerable, the positions these lobbyists take in opposing taxpayer protections are even more expensive to Texas taxpayers.

Taxpayers have impressive representation in this lawsuit. Attorneys representing the plaintiffs are former Supreme Court Justice Steven Wayne Smith and David Rogers, both with the Texas Legal Foundation. These distinguished attorneys have taken this case on as a cause.

“The Texas Legal Foundation is pleased to represent the plaintiffs in this case because we believe in the principles of government accountability, limited government spending, the rule of law, and representation of the people through their elected representatives,” Smith said. “We believe that all these principles are endangered when taxpayers are illegally taxed in order to pay lobbyists to persuade the Legislature to raise taxes.”

Some public officials like Cheryl Johnson, County Tax Assessor and Collector for Galveston County, agree that the concept of using tax dollars to lobby is inappropriate. She said that the county associations’ responsibility should be to educate and assist the counties and not to lobby.

This is an unusual action for AFP and represents the first Texas lawsuit challenging the use of public funds for lobbying. It may well serve as a wake-up call to taxpayers and elected officials that taxpayer money should not be used for lobbying, and shining the light of public scrutiny on the practice may finally bring accountability to elected officials who currently face no consequences for expending public funds in this way.

Just what is the extent of taxpayer-funded lobbying? The numbers are mind-boggling.

The City of Austin has 26 lobbyists listed with the Texas Ethics Commission; the City of Houston has 23; Dallas Independent School District has 8; the Texas Municipal League has 14; and the Texas Association of Counties has its 15 lobbyists (which TAC claims don’t actually lobby).

Texas cities have 146 lobbyists registered and spend as much as $9 million on lobbying.

County-related taxing entities have 106 registered lobbyists (some representing more than one county) and spend as much as $5 million at the Texas legislature, while individual counties have 27 registered lobbyists and may spend as much as $1 million on lobbying.

In addition to a myriad of organizations representing educator interests, independent school districts in Texas have 26 registered lobbyists (Northwest ISD in Bexar County has 9) and spend as much as $855,000.

The Texas Municipal League (TML) spends as much as $1,285,000. The TML’s agenda for 2005 sums it up: it lists the legislation the board chose to actively support with its resources and the proposals it opposed. In order of priority, the top four all involve increases in taxes in one way or another. The TML’s highest priority is not just to oppose, but to defeat, “any legislation that is deemed detrimental to cities”—including legislation which would impose revenue caps of any kind. Recently, the TML, along with the Texas Association of Counties, openly and vigorously opposed the taxpayer protections and revenue caps in the Governor’s plan to revamp school finance.

In stark contrast, the Texas Association of Business—essentially the state chamber of commerce—has only five lobbyists. (And they are not at public expense.)

There may be as much as $36 million spent on lobbying in Texas alone this year by groups or entities that oppose government spending limitations.

Pitting the tax spender against the taxpayer clearly distorts the legislative process. The voices of taxpayers are overshadowed by powerful entities, and those taxpayers who would fight back find themselves pitted against hired guns funded by their very own tax dollars.

Relief may be in sight as the legal action runs a parallel track with efforts to get legislation enacted next session to end the practice of using taxpayers’ own money to lobby against taxpayer interests. Thomas Jefferson is joined by legions of taxpayers dedicated to putting this tyrannical practice to rest.

Peggy Venable is Texas director of Americans for Prosperity, a national grassroots organization which supports restraining the growth of government and empowering taxpayers. She can be reached at