Quote:
Originally Posted by Dtail
Thanks, so with that thought, I would think the lowest asset level (Firecalc) with the 3% inflation rate assumption would have been better than the asset level with the historical 1966 retiree using actual CPI.
I just did the option to download a spreadsheet for a starting year, so I did two for 1966. One with the Spending Models tab (had trouble finding it!) set to CPI, one with 3%.
edit/add: I used $1,000,000 starting portfolio for easy math, $40,000 start for 4% WR, and 30 years (spreadsheet requires it)
I'm not too familiar with these spreadsheet outputs, but it sure looks like inflation was the killer for 1966, and way worse than a constant 3%.
For the CPI, I get (excuse poor formatting please):
Period - Inflation Factor - Infl Adj W/D - Starting Portfolio
1966 1.035 $40,691.82 $958,616.35
...
1995 4.855 $194,213.84 -$1,411,685.64
For constant 3%, I get...
1966 1.0300 $40,600.00 $958,800.00
...
1995 2.4273 $97,090.50 $2,688,451.12
Seems to make sense: 1.03^30 is 2.4273, so an actual cumulative inflation factor of 4.855 over the same 30 years (roughly double the constant-3% case) passes the smell test?
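For anyone who wants to re-run that smell test, here's a quick sketch. The only assumption (which the spreadsheet numbers seem to confirm) is that FIRECalc applies the inflation factor starting in year 1, so the 1966 withdrawal is $40,000 * 1.03 and the 1995 factor is 1.03^30:

```python
# Sanity-check the constant-3% column from the FIRECalc spreadsheet.
# Assumption: the inflation factor is applied starting in year 1,
# so the year-30 (1995) factor is 1.03**30.

start_wd = 40_000.00  # 4% of the $1,000,000 starting portfolio

factor_y30 = 1.03 ** 30
print(f"1.03^30     = {factor_y30:.4f}")                  # ~2.4273
print(f"1995 W/D    = ${start_wd * factor_y30:,.2f}")     # ~$97,090.50

# The actual CPI path compounded to 4.855 over the same 30 years,
# which works out to an average annual inflation of:
avg_cpi = 4.855 ** (1 / 30) - 1
print(f"Avg CPI '66-'95 = {avg_cpi:.2%}")                 # roughly 5.4%
```

So the 1966 retiree wasn't facing one terrible year, just ~5.4% average inflation instead of 3%, compounded for three decades.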
So with 3% inflation, the 1966 retiree would have been golden, instead of broke in ~ year 24 of 30.
The success rate stayed at 95%, but there are two possible explanations for that. One is that "success rate" doesn't give a lot of resolution; there could be a fairly big gap between 95% and 94%. And/or some years may have just traded places: 1966 'wins', but a year that barely passed with an average 2% inflation now loses at 3%?
Check my work, as I said I am not very familiar with the spreadsheet output. But this seems to add up to me.
-ERD50