I saw a post http://www.early-retirement.org/forums/f28/2017-year-end-tax-planning-88717-3.html#post1947711
where the poster was puzzled that ACA subsidy phaseouts of 10% or less could effectively lead to "marginal rates" like 15% or 17%, as mentioned by other posters:
http://www.early-retirement.org/forums/f28/2017-year-end-tax-planning-88717-3.html#post1947378
http://www.early-retirement.org/forums/f28/2017-year-end-tax-planning-88717-3.html#post1947398
Here's an explanation of the phenomenon (with made-up numbers, analogous to the ACA calculations).
Suppose you would have to pay
10% tax on $50,000, that is, $5,000
but you would have to pay
10.1% tax on $51,000, that is, $5,151
so the extra $1,000 of income costs an extra $151 in tax, for a "marginal tax rate" of 15.1%.
What happened? There is the extra 10.1% * $1,000 = $101, which contributes an unsurprising 10.1% to the "marginal tax rate".
But there is also an extra 0.1% * $50,000 = $50, which contributes another (possibly surprising) 5% to the "marginal tax rate".
Think of the change in area when a rectangle grows from 50×10 to 51×10.1. There's an extra L-shaped region, and you'll miscalculate if you miss one of the legs of the L. See the rectangle picture at this link:
World Web Math: The Product Rule
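The decomposition above can be checked with a few lines of Python (a standalone sketch using the made-up numbers from the example, not real ACA figures):

```python
# Made-up numbers from the example: income rises from $50,000 taxed at 10%
# to $51,000 taxed at 10.1%.
base_income, base_rate = 50_000, 0.100
new_income, new_rate = 51_000, 0.101

extra_tax = new_income * new_rate - base_income * base_rate
# Leg 1 of the L: the new rate applied to the extra income.
on_extra_income = new_rate * (new_income - base_income)
# Leg 2 of the L: the rate increase applied to the entire original income.
rate_bump_on_base = (new_rate - base_rate) * base_income

print(round(extra_tax, 2))         # 151.0
print(round(on_extra_income, 2))   # 101.0
print(round(rate_bump_on_base, 2)) # 50.0
print(round(extra_tax / (new_income - base_income), 3))  # 0.151, i.e. 15.1%
```

The two rounded pieces sum to the whole: miss either leg of the L and you undercount the marginal rate.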
Some college cost formulas behave similarly. For example, a college may say that for income in the range $50k to $150k, you'll pay a percentage of income ranging from 0% to 20%, increasing linearly (1% per $5k of income above $50k) as a function of income, so:
0% * $50k = $0
1% * $55k = $550
2% * $60k = $1,200
............
19% * $145k = $27,550
20% * $150k = $30,000
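As a sanity check, the whole table can be regenerated with a short loop (plain Python, just re-deriving the made-up numbers above):

```python
# Income runs from $50k to $150k in $5k steps; the stated percentage
# rises 1 point per $5k of income above $50k.
for income in range(50_000, 150_001, 5_000):
    pct = (income - 50_000) // 5_000   # 0, 1, ..., 20 (percent)
    print(f"{pct}% * ${income:,} = ${pct * income / 100:,.0f}")
```

The first and last lines printed match the table's endpoints ($0 and $30,000).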
But even though the "rate" increases from 0% to 20%, the "marginal rate" increases from 10% to 50%!
Consider:
The first $5k income increase, from $50k to $55k, costs an extra $550, which is 11% of $5k.
The last $5k income increase, from $145k to $150k, costs an extra $30,000 - $27,550 = $2,450, which is 49% of $5k.
In formulas: for X between $50k and $150k, the percentage you pay is
R(X)=(X-$50k)/$500k which goes from 0% to 20% as X goes from $50k to $150k,
so the college cost is
C(X)=X*R(X)=X(X-$50k)/$500k which goes from $0 to $30k as X goes from $50k to $150k,
but the marginal cost (the derivative of C(X) with respect to X) is
C'(X)=(2X-$50k)/$500k which goes from 10% to 50% as X goes from $50k to $150k.
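These formulas can be verified numerically (a sketch; the function names rate, cost, and marginal are mine, not from any college's materials):

```python
def rate(x):      # R(X) = (X - $50k) / $500k
    return (x - 50_000) / 500_000

def cost(x):      # C(X) = X * R(X) = X(X - $50k) / $500k
    return x * (x - 50_000) / 500_000

def marginal(x):  # C'(X) = (2X - $50k) / $500k
    return (2 * x - 50_000) / 500_000

print(rate(50_000), rate(150_000))          # 0.0 0.2
print(cost(50_000), cost(150_000))          # 0.0 30000.0
print(marginal(50_000), marginal(150_000))  # 0.1 0.5
# Finite differences over $5k steps reproduce the 11% and 49% figures:
print((cost(55_000) - cost(50_000)) / 5_000)    # 0.11
print((cost(150_000) - cost(145_000)) / 5_000)  # 0.49
```

Note the marginal rate at each endpoint sits well above the stated rate there, exactly as the derivative predicts.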
So this is one way marginal rates can turn out surprisingly higher than expected.