Two things may be said about Americans and taxes with a fairly high degree of certainty. One is that Americans always have been resentful of them.
Sure, colonials were upset because they were taxed without any representation in government, but even when they had representation they didn’t like taxes much. The Whiskey Rebellion offers proof: it took President Washington and federal troops, in 1794, to enforce a 1791 excise on distillers.
The other is that Americans have fairly consistently argued over whether the rich are paying enough. Today’s rhetoric from President Obama is nothing new.
With the latest version of the fiscal cliff looming March 1, and with many Americans approaching April 15 like schoolchildren with an assignment due, it may be instructive to pause a moment and contemplate an anniversary that has gone largely unnoticed, and certainly uncelebrated.
One hundred years ago this month, Wyoming won a three-way race with New Jersey and New Mexico to become the state that pushed ratifications of the 16th Amendment to the Constitution past the required three-fourths of state legislatures, legalizing the personal income tax.
Thank you, neighbors to the north.
This seemed a popular move at the time, with news reports speculating Congress would exempt everyone earning less than $5,000 a year. In reality, single people earning more than $3,000 and married couples earning more than $4,000 ended up being taxed 1 percent, up to $20,000 in income (a whopping $463,826 in 2012 dollars). The highest rate, for the very rich, was 7 percent.
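The 1913 schedule described above can be sketched in a few lines. This is a simplified illustration based only on the figures in the paragraph: a flat 1 percent on income above the exemption, omitting the graduated surtaxes on very high incomes that carried the top rate to 7 percent.

```python
def tax_1913(income, married=False):
    """Rough sketch of the original 1913 'normal' income tax:
    1 percent of income above the exemption ($3,000 single,
    $4,000 married). The surtax brackets that raised the top
    rate to 7 percent for the very rich are omitted."""
    exemption = 4000 if married else 3000
    taxable = max(0, income - exemption)
    return taxable * 0.01

# A single filer earning $5,000 owed 1 percent of $2,000:
print(tax_1913(5000))  # 20.0
# A married couple earning $4,000 or less owed nothing:
print(tax_1913(4000, married=True))  # 0.0
```

By today's standards, a $20 bill on a comfortable $5,000 income helps explain why the tax seemed so painless at first.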
But the rules began to change rapidly, and the tax spread like water from a leaky pipe.
Less than a quarter century later, the popular play “You Can’t Take It With You” won the Pulitzer Prize. Its most sympathetic character was Martin Vanderhof, or “Grandpa,” a happy-go-lucky old man who proudly tells a tax agent he has never paid a cent of income tax because he doesn’t believe in it and because he is sure the government wouldn’t know what to do with his money if he paid it.
At the time the play opened, in 1936, the bottom marginal rate was 4 percent for everyone earning up to $4,000 in income, and it was 79 percent at the top. But the culture of deductions already had taken root, making it difficult to calculate how much people actually were paying.
From the earliest years, the tax code made reality quite different from outward appearances. As Time magazine recently noted, the rich actually pay more when their marginal rate is lower. Among other reasons for this, when rates are high, the wealthy search harder for loopholes. In those early days, for instance, the rich often incorporated their wealth, which allowed them to pay at the much lower corporate income tax rate.
One month before the income tax became legal a century ago, the Wall Street Journal wondered aloud how much taxation a nation could handle. “The assumption seems to be made in certain quarters — particularly by those who do not have to pay directly themselves — that the tax-bearing resources of modern society are unlimited,” an editorial said before drawing ominous comparisons between where things seemed to be heading and ancient Rome, “when farms were so heavily taxed that they were deserted by their owners...”
Those worries, it turned out, were overblown. Instead, the worst part of the income tax is that it has grown into a hopelessly tangled thicket of rules and loopholes.
The tax code now fills nearly 74,000 pages and continues to grow yearly. Two years ago, a board appointed by President Obama and led by former Federal Reserve chairman Paul Volcker estimated Americans spend a combined 7.6 billion hours and $140 billion a year to wade through all the rules and file their returns.
This is because the ability to directly tax individuals allows government to reward behavior it deems appropriate, such as buying a house or giving to charity, and also to reward businesses powerful enough to lobby for breaks.
Which brings up the third thing that can be said about taxes in America. They may be as inevitable as death, but they themselves are not likely to die.
Copyright 2017, Deseret News Publishing Company