https://commons.wikimedia.org/wiki/File:Historical_Marginal_...
As Mitt Romney pointed out in 2012 [1]: "Forty-seven percent of Americans pay no income tax."
[1] https://www.politifact.com/factchecks/2012/sep/18/mitt-romne...
For example, 57% of households did not pay any federal income tax in 2021: https://www.cnbc.com/2022/03/25/57percent-of-us-households-p...
That's not just a COVID effect; the percentages have been fairly similar going back a number of years.
I am really opposed to taxing things that aren't actually realized (stock). The number of horror stories on HN about stock options just leads me to believe we do it wrong. I would rather we do a better job on the transaction to cash out.
After the standard deduction, the net income tax rate for a single filer making $20k is 3.75%. For a couple filing jointly it's 0 up through $25.5k/yr.
In both cases, the EITC guarantees those with at least one child a (substantially) negative effective tax rate, while single child-free filers are left with about a 2.6% effective rate.
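As a sanity check on the single-filer figure, here's the arithmetic, assuming the 2021 standard deduction ($12,550) and the 10% bottom bracket (both assumptions; the parent may have used slightly different-year numbers, which is why this lands at ~3.7% rather than exactly 3.75%):

```python
# Effective federal income tax rate for a single filer making $20k,
# assuming 2021 figures: $12,550 standard deduction, 10% bottom bracket.
gross = 20_000
standard_deduction = 12_550              # 2021, single filer (assumed)
taxable = max(0, gross - standard_deduction)  # $7,450
tax = taxable * 0.10                     # entirely within the 10% bracket
effective_rate = tax / gross
print(f"effective rate: {effective_rate:.2%}")
```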
I'm not sure of the scale at which this is taken advantage of, but it's a well-known benefit if your situation allows it.
Two theories: 1) Maybe it’s better for everyone to feel like they are ‘chipping in’ … for pride and unity.
2) Maybe the government is really invested in knowing what everyone makes, regardless of the net on it.
Because you want people to feel like they have skin in the game.
There's gonna be a lot of comments in here saying ^^ but unironically. It's just ingrained in the fabric of our society.
It's actually kind of amusing to think about what a marginal rate over 100% would lead to. If the top bracket is $1M+ and you earn $100M, and that last $99M is taxed at 102%, then you owe roughly $101M of your $100M earned, leaving you negative for the year. Better not go above the max! Quick! Donate that $99M in order to maximize your earnings!
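To make that concrete, here's a sketch of the arithmetic with made-up brackets (37% up to $1M, then the hypothetical 102% above it; both numbers are illustrative, not any real schedule):

```python
def progressive_tax(income, brackets):
    """Total tax under progressive brackets: a list of (threshold, rate),
    where each rate applies to income above that threshold up to the next one."""
    tax = 0.0
    for (lo, rate), nxt in zip(brackets, brackets[1:] + [(float("inf"), None)]):
        if income <= lo:
            break
        tax += (min(income, nxt[0]) - lo) * rate
    return tax

# Hypothetical schedule: 37% on the first $1M, 102% on everything above $1M.
brackets = [(0, 0.37), (1_000_000, 1.02)]
income = 100_000_000
tax = progressive_tax(income, brackets)
print(f"tax: ${tax:,.0f}, left over: ${income - tax:,.0f}")
# Tax comes to about $101.35M on $100M earned: you end the year in the hole.
```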
Yeah, you could also pay your workers a lot more and take less yourself. But in that case you'd spend the whole year working to make what you could make in 1/100 of the year. So again, what's the point?
So I suspect a >100% (or even >90%) marginal tax rate would have a lot of very negative side effects. (Keep in mind that back when the top rate was officially >90%, there were so many loopholes that nobody actually paid that. Closing the loopholes and drastically lowering the top rate was actually revenue-neutral.)
You'd likely have a hard time getting them to come back the next year. Who is going to take a job that pays a normal daily wage but only employs you for three days? And can you really ramp the org back up in three days? Maybe you'd be better off working for a few days and then handing the job off to someone else for the rest of the year. Even then, good luck convincing the board that you're doing anything worthwhile coming in 3 days per year.
I think the equilibrium for this would be fascinating.
It seems like when it first started, it only taxed the rich and at a very low rate. Then it expanded from there, to the point where 20% was the min and 91% was the max. Then lower to what we have now.
Compare these days, where the brackets run from 10% at the bottom to 37% at the top, with the top bracket starting around $500k.
If we were to tax at 37% using 1961 thresholds, you'd have to be making $95k inflation-adjusted (that's the 38% threshold), or $10k nominally.
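The inflation adjustment in that comparison is just a multiplication; a sketch with an assumed CPI multiplier of ~9.5x between 1961 and today (the exact factor depends on the index and end year you pick):

```python
nominal_1961_threshold = 10_000  # approximate 1961 bracket threshold (from the parent comment)
cpi_multiplier = 9.5             # assumed 1961 -> today CPI factor; varies by index and year
inflation_adjusted = nominal_1961_threshold * cpi_multiplier
print(f"${inflation_adjusted:,.0f}")  # roughly $95,000
```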
Something else I found fascinating was that we had over 25 brackets back in the day. I can only imagine the headache that would be without an Excel spreadsheet.
People (myself included) might complain about taxes now, I can only imagine in the 60s.
Thanks for the insights, OP.
For example, 401k, IRA, 529, 403b, etc didn't exist. A lot of states and localities didn't have income taxes at that time either. If the deductions and overall tax code was simpler, then calculating the brackets would be easy - you're basically taking the percent times each bracket max until you get to your top bracket, then the amount in that bracket times that percent.
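The per-bracket method described above, sketched with a toy three-bracket schedule (the rates and thresholds here are illustrative, not any real year's):

```python
# Tax owed = (bracket width x rate) for every full bracket below yours,
# plus (income above your bracket's floor) x your bracket's rate.
brackets = [  # (upper bound of bracket, rate) -- illustrative numbers only
    (10_000, 0.10),
    (40_000, 0.20),
    (float("inf"), 0.30),
]

def tax_owed(income):
    tax, floor = 0.0, 0.0
    for upper, rate in brackets:
        if income > upper:
            tax += (upper - floor) * rate   # full bracket below yours
        else:
            tax += (income - floor) * rate  # partial amount in your top bracket
            break
        floor = upper
    return tax

print(tax_owed(55_000))  # 10k*10% + 30k*20% + 15k*30% = 11,500
```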
My understanding is that most people could do their taxes just based on the instructions on the back of the form just because there weren't so many deductions, credits, and complicated securities/instruments.
I couldn't find info on the deductions, but this was interesting.
I have to imagine that the IRS tax tables aren't a recent creation, so the math should have been similarly easy.
Also, loans don't evade the capital gains tax, they merely delay it. Stepped-up basis is what evades the capital gains tax.
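A numeric sketch of why the step-up is the part that matters (the dollar amounts and the 20% long-term capital gains rate are assumptions for illustration):

```python
cost_basis = 1_000_000       # what the asset was originally bought for (assumed)
value_at_death = 10_000_000  # market value when inherited (assumed)
ltcg_rate = 0.20             # assumed long-term capital gains rate

# Heir sells immediately after inheriting.
tax_without_stepup = (value_at_death - cost_basis) * ltcg_rate    # gain taxed on sale
tax_with_stepup = (value_at_death - value_at_death) * ltcg_rate   # basis reset to market value
print(tax_without_stepup, tax_with_stepup)  # the deferred gain disappears entirely
```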
99% of the borrowing is coming from people worth close to a billion or more (0.001%).
Better yet, do overall tax burden rather than just income tax.
Everyone loves to get a good ideological circle jerk going over the nominal 1950s rates, but the actual tax burden at various points on the income spectrum paints a very, very different picture.
Do you have this data? I'd be very interested in this picture.
The economy overall grew by 37% during the 1950s. At the end of the decade, the median American family had 30% more purchasing power than at the beginning. Inflation was minimal, in part because of Eisenhower's efforts to balance the federal budget.
Unemployment remained low, about 4.5%.
It took nearly 20 years to rebuild Europe to the point that it could compete, and the USSR was largely cut off from Western supply chains.
It’s not the same today and one cannot possibly assert with a straight face that we would enjoy a decade of 37% growth with those rates.
https://taxfoundation.org/taxes-on-the-rich-1950s-not-high/
The average rate paid by the top 1% back then was around 41%.
"The data comes from a recent paper by Thomas Piketty, Emmanuel Saez, and Gabriel Zucman that attempts to account for all federal, state, and local taxes paid by different groups of Americans over the last 100 years"
But regardless, it was higher in 1950, almost 6 points higher.
And look at all the caveats...
[1] Some of the distributional assumptions in the Piketty, Saez, and Zucman paper are questionable. In particular, the authors assume that the full burden of the corporate income tax falls on owners of capital, which may not be correct. However, the authors note that they "have tested a number of alternative tax incidence assumptions, and found only second-order effects."
[3] It is worth noting that, per the Piketty, Saez, and Zucman data, the tax rates of the top 0.1 and 0.01 percent of taxpayers have dropped substantially since the 1950s. The average tax rate on the 0.1 percent highest-income Americans was 50.6 percent in the 1950s, compared to 39.8 percent today. The average tax rate on the top 0.01 percent was 55.3 percent in the 1950s, compared to 40.8 percent today.
[4] The data from Piketty, Saez, and Zucman is not divided among federal, state, and local taxes, so it is difficult to tell exactly how much the rich were paying in federal income taxes specifically during this period.
What, then, is "fair"? 94%? Do you have a formula? At least a principle? If I make $100MM (I don't), should I pay 50%, 60%, 70%? I mean, even if I pay 99%, I still retain more than 99% of Americans do, right?
Brains here are clearly in overdrive trying to come up with every conceivable excuse and mitigation, everything but confronting the hard fact staring them in the face on a G-damn spreadsheet.
Folks, it doesn't get more straightforward than a spreadsheet. Income tax in the US was higher in the mid-1900s than it is today in Europe.
Arranging your income to reduce the tax paid in accordance with the law isn't tax evasion.