Old 12-08-2013, 08:52 AM   #21
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
donheff's Avatar
 
Join Date: Feb 2006
Location: Washington, DC
Posts: 11,331
Quote:
Originally Posted by FIRE'd@51 View Post
Midpack

I find your tone a bit offensive. I started this thread with the hope that one (or more) of the thousands of readers of this forum better versed in statistics than myself would suggest a way to do this, or even if it can be done. If I knew how to do it myself, I would have done it already.
I still don't understand how you could do something to statistically improve an algorithm that specifically eschews statistics for arriving at its results. If you statistically removed the correlations, outliers, or any other historical facts you would no longer have a historical calculator.
__________________
Idleness is fatal only to the mediocre -- Albert Camus
donheff is offline   Reply With Quote

Old 12-08-2013, 09:00 AM   #22
Administrator
MichaelB's Avatar
 
Join Date: Jan 2008
Location: Chicagoland
Posts: 40,725
Quote:
Originally Posted by FIRE'd@51 View Post
I am not faulting FireCalc. To the contrary, I think it's the best program around. I am only wondering if there is a way to manipulate the overlapping historical data to better simulate a statistical ensemble of "independent" data points. This has nothing to do with PE10 or other valuations, which will always be present. I believe these affect the mean of the ex-ante distribution, but not the variance. I'm interested in focusing solely on the statistical analysis which, I believe, can give us a better handle on the variance, which ultimately determines the ex-ante probabilities.
Not a statistician, but what other options are there in addition to looking at either historical cycles or random rates of return? Wouldn't variance from either of these just be adding bias and reducing the value of the output?
MichaelB is online now   Reply With Quote
Old 12-08-2013, 09:09 AM   #23
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
 
Join Date: Sep 2005
Location: Northern IL
Posts: 26,896
Quote:
Originally Posted by FIRE'd@51 View Post
I ... I am only wondering if there is a way to manipulate the overlapping historical data to better simulate a statistical ensemble of "independent" data points. This has nothing to do with PE10 or other valuations, which will always be present. I believe these affect the mean of the ex-ante distribution, but not the variance. I'm interested in focusing solely on the statistical analysis which, I believe, can give us a better handle on the variance, which ultimately determines the ex-ante probabilities.
I'm lost as to how this could be done, or how there could be any validity to the results. I agree that the data points really only represent a handful of patterns. And I think that's about all we have.


edit/ - cross posted with the others

-ERD50
ERD50 is online now   Reply With Quote
Old 12-08-2013, 09:25 AM   #24
Thinks s/he gets paid by the post
FIRE'd@51's Avatar
 
Join Date: Aug 2006
Posts: 2,433
Quote:
Originally Posted by donheff View Post
I still don't understand how you could do something to statistically improve an algorithm that specifically eschews statistics for arriving at its results. If you statistically removed the correlations, outliers, or any other historical facts you would no longer have a historical calculator.
I didn't say remove correlations - I said remove the serial correlation from the overlap. They are totally different things. For example, if the year were 5013 instead of 2013, we would have approximately 105 independent non-overlapping 30-year data points, which would be a better sample size to try to estimate future probabilities from, analogous to flipping a coin 105 times instead of 5 times to estimate the probability of heads.
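The arithmetic behind this point can be sketched in a few lines of Python. This is only an illustration of the counting argument (the exact year counts are approximate; FIRECalc's record starts in 1871):

```python
# Sketch of the overlap problem: with N years of annual data, how many
# 30-year retirement windows are there, and how many are truly
# non-overlapping (independent in the sense meant above)?
def window_counts(n_years, horizon=30):
    overlapping = max(0, n_years - horizon + 1)  # rolling windows, FIRECalc-style
    independent = n_years // horizon             # disjoint windows
    return overlapping, independent

# ~142 years of data (1871-2013) vs. the hypothetical year 5013 (~3142 years)
print(window_counts(142))   # -> (113, 4)
print(window_counts(3142))  # -> (3113, 104)
```

So today's record offers over a hundred rolling windows but only four or five disjoint ones, which is the coin-flip analogy in numbers.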
__________________
I'd rather be governed by the first one hundred names in the telephone book than the Harvard faculty - William F. Buckley
FIRE'd@51 is offline   Reply With Quote
Old 12-08-2013, 09:41 AM   #25
Thinks s/he gets paid by the post
FIRE'd@51's Avatar
 
Join Date: Aug 2006
Posts: 2,433
Quote:
Originally Posted by ERD50 View Post
I'm lost as to how this could be done, or how there could be any validity to the results. I agree that the data points really only represent a handful of patterns. And I think that's about all we have.
Maybe it can't be done, although I have to believe there are statistical methods that attempt to deal with the overlap problem, as it must occur often in time-series data. As I said above to Midpack, I started this thread with the hope that someone better versed in statistics than myself could opine on this subject.
__________________
I'd rather be governed by the first one hundred names in the telephone book than the Harvard faculty - William F. Buckley
FIRE'd@51 is offline   Reply With Quote
Old 12-08-2013, 09:46 AM   #26
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
samclem's Avatar
 
Join Date: May 2004
Location: SW Ohio
Posts: 14,404
Quote:
Originally Posted by FIRE'd@51 View Post
IOW, I'm suggesting that a 4% SWR could actually lead to a significantly lower success rate than 95% due to this data overlap effect; and, since we really don't know what the probability of success is, one should rely upon the FireCalc SWR that has never failed.
Even if it were possible to add a greater number of "cases" to the retrospective analysis (e.g. by disaggregating the present US data to make the years independent and nonsequential, adding data from Australia or Canada, etc.), I think it would introduce more uncertainties than it resolves. More fundamentally, it would remain retrospective analysis, with all the inherent uncertainty about its applicability to the future.
We say that "past results are not indicative of future returns", but in fact that's something each of us is forced to ignore, to a greater or lesser extent, as we plan for our future. Given that we are breaking the rules in this way, how much will be gained, in a practical sense, by improving the accuracy of the FIRECalc results? They provide a rough starting point.
samclem is offline   Reply With Quote
Old 12-08-2013, 09:51 AM   #27
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
Lsbcal's Avatar
 
Join Date: May 2006
Location: west coast, hi there!
Posts: 8,809
Quote:
Originally Posted by FIRE'd@51 View Post
I didn't say remove correlations - I said remove the serial correlation from the overlap. They are totally different things. For example, if the year were 5013 instead of 2013, we would have approximately 105 independent non-overlapping 30-year data points, which would be a better sample size to try to estimate future probabilities from, analogous to flipping a coin 105 times instead of 5 times to estimate the probability of heads.
Maybe you have touched on the problem yourself. How many years are needed to separate world economic events that affect US stock and bond returns? And so how many years to find independent periods to group together into a neat statistical study where we can apply STAT 101?

My guess is you will need a lot of years to reduce correlations towards zero -- at least several decades. If we really want to be conservative, maybe 100 years. For instance, WW1 is thought to have had a lot to do with the 1930's Great Depression, so maybe the events and economics of 1914 were related to the economics 20 years later. My feeling is that the 1930's economics (80+ years ago) are still somewhat a ghostly presence in our current thought processes. I certainly consider the 1930's when deciding on our spending strategies.
Lsbcal is online now   Reply With Quote
Old 12-08-2013, 09:54 AM   #28
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
haha's Avatar
 
Join Date: Apr 2003
Location: Hooverville
Posts: 22,983
Quote:
Originally Posted by FIRE'd@51 View Post
Midpack

I find your tone a bit offensive. I started this thread with the hope that one (or more) of the thousands of readers of this forum better versed in statistics than myself would suggest a way to do this, or even if it can be done. If I knew how to do it myself, I would have done it already.
It can't really be done. Either something is there, in the data, or it is not. In this case, it is not. The Monte Carlo comes at the problem from an entirely different direction, and of course has its own set of limitations.

Ha
__________________
"As a general rule, the more dangerous or inappropriate a conversation, the more interesting it is."-Scott Adams
haha is offline   Reply With Quote
Old 12-08-2013, 10:46 AM   #29
Thinks s/he gets paid by the post
photoguy's Avatar
 
Join Date: Jun 2010
Posts: 2,301
Quote:
Originally Posted by Huston55 View Post
It's true that current PE10 is high historically, in the range of the historical highs. But that's why we use >=95% success rates and analyze any failures. To call the current situation 'biased' seems to be counter to the methodology and overly pessimistic.
I'm not saying the current economic situation is biased. What I am trying to say is that if one uses the number straight out of firecalc (say 5% failure) and believes that starting today their chance of failure is also 5%, they are likely underestimating the true probability.


Quote:
Originally Posted by FIRE'd@51 View Post
I am only wondering if there is a way to manipulate the overlapping historical data to better simulate a statistical ensemble of "independent" data points.
I think this is what the creators of MC simulations are trying to achieve. They realize the limitations of the historical data record and set up a simulation with the parameters of their model informed by the past data. If they've done a good job, then hopefully they've captured all the important behaviors in the historical data (I have my doubts about how well this can be done).


Another thing one could do is take a MC simulation (which doesn't need overlapping data points for each 30 year run) and do so many runs that you get a very precise measure of the failure rate. Then modify the simulation to be like firecalc (overlapping 30 years runs, ~100 cycles) and get an idea of the distribution of results. This wouldn't let you improve the results but would at least give you a better idea of the range of error due to the overlapping runs.
photoguy is offline   Reply With Quote
Old 12-08-2013, 10:46 AM   #30
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
donheff's Avatar
 
Join Date: Feb 2006
Location: Washington, DC
Posts: 11,331
Quote:
Originally Posted by FIRE'd@51 View Post
I didn't say remove correlations - I said remove the serial correlation from the overlap. They are totally different things. For example, if the year were 5013 instead of 2013, we would have approximately 105 independent non-overlapping 30-year data points, which would be a better sample size to try to estimate future probabilities from, analogous to flipping a coin 105 times instead of 5 times to estimate the probability of heads.
OK, I think I get what you want, but it would be a different sort of calculator from Firecalc. Instead of evaluating how your portfolio would have done against all actual x-year historical periods, you would want to simulate how future 30-year periods might statistically play out given the historical record we have to date. But isn't that precisely how some of the Monte Carlo simulators are constructed? Why not just look at how they describe their approaches and pick one that does what you want?
__________________
Idleness is fatal only to the mediocre -- Albert Camus
donheff is offline   Reply With Quote
Old 12-08-2013, 11:17 AM   #31
Thinks s/he gets paid by the post
 
Join Date: Jul 2005
Posts: 4,366
There is not enough data to do any better historically.

Ed Easterling at Crestmont Research was the first person I know of to look at starting conditions for retirement. He used only P/E though. He actually joined this forum for a while, though he did not receive a lot of support from most here.

Crestmont Research: Financial Market and Economic Research

At one time he had a nice spreadsheet of returns versus starting P/E, but I don't see it in a quick browse of his website.
Animorph is offline   Reply With Quote
Old 12-08-2013, 11:24 AM   #32
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
Lsbcal's Avatar
 
Join Date: May 2006
Location: west coast, hi there!
Posts: 8,809
Quote:
Originally Posted by Animorph View Post
...
At one time he had a nice spreadsheet of returns versus starting P/E, but I don't see it in a quick browse of his website.
The data is fairly easy to get. Just take the Shiller spreadsheet and calculate the trailing P/E. Then take the FIRECalc run spreadsheet data.

But how to present this is perhaps the harder part. P/E is one point and from that retirement year we get a curve from FIRECalc going out maybe 30 years. One would want the low point on any curve and maybe also the final portfolio value. Guess that is a nice task for someone here.
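The trailing-P/E step Lsbcal describes can be sketched as follows. The flat series here is made up purely to show the arithmetic; real inputs would come from the monthly real price and real earnings columns of Shiller's spreadsheet:

```python
# Toy sketch of Shiller's PE10 (CAPE): real price divided by the average
# of the trailing 120 months of real earnings.
def pe10(real_prices, real_earnings, i):
    if i < 120:
        raise ValueError("need at least 120 prior months of earnings")
    window = real_earnings[i - 120:i]
    return real_prices[i] / (sum(window) / len(window))

prices = [100.0] * 121    # constant real price, made-up data
earnings = [5.0] * 121    # constant real earnings, made-up data
print(pe10(prices, earnings, 120))  # -> 20.0
```

With the real data, one would compute this for each retirement start year and plot it against the corresponding FIRECalc outcome (lowest portfolio value and/or ending value).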
Lsbcal is online now   Reply With Quote
Old 12-08-2013, 11:25 AM   #33
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
haha's Avatar
 
Join Date: Apr 2003
Location: Hooverville
Posts: 22,983
Quote:
Originally Posted by Animorph View Post
There is not enough data to do any better historically.

Ed Easterling at Crestmont Research was the first person I know of to look at starting conditions for retirement. He used only P/E though. He actually joined this forum for a while, though he did not receive a lot of support from most here.
He was insufficiently respectful of our forum gods.

I have read two of his books, he seems very good to me. Though when markets are boom bust as they have been, and IMO must continue to be as long as we have this massive FRB intervention that we have had since 2009, his work can't catch it. IMO nothing other than momentum can deal with this, and of course this has its own difficulties.

Ha
__________________
"As a general rule, the more dangerous or inappropriate a conversation, the more interesting it is."-Scott Adams
haha is offline   Reply With Quote
Old 12-08-2013, 11:26 AM   #34
Full time employment: Posting here.
 
Join Date: Mar 2008
Posts: 800
I think there is too much wishing to improve accuracy when there is no way to do so in a meaningful way. As was said, past performance is not a predictor of future results. We have no way to know what the distant future holds. There are just too many unknown future world events that will affect your results.

In addition, we aren't static. I don't believe anyone will go blindly forward taking their SWR as events happen. Adjustments up and down will be made to reflect changes to your situation.

Firecalc is just a tool to help plan your retirement. Like horseshoes and hand grenades, it just needs to be close enough that you're in the ballpark of having enough.
akck is offline   Reply With Quote
Old 12-08-2013, 12:16 PM   #35
Full time employment: Posting here.
CaliforniaMan's Avatar
 
Join Date: Dec 2013
Location: San Diego
Posts: 880
Very interesting thread. My problem is that I seem to agree with everyone on this. FIRE'd@51 has a very good point: we are looking at a lot of overlapping data, and that will underestimate the variation.

On the other hand, Monte Carlo approaches can, for example, have you going from 2% inflation one year to 18% inflation the next, then back to 2% the year after, and the same with interest rates, etc. We can model the deltas from year to year, which would tend to be more realistic, but then you could get some years with really, really high or low values. Unrealistically high variation.

As many have pointed out here, and I think I agree, the overlapping time period approach as used by FireCalc and others is probably the best we can use: get a baseline of what may be "reasonable" and then adjust as necessary.

The real problem in all of this is, of course, that we do not have a stationary time series. Is the 1900s really the same as the 2000s? Will the rate of technology change be the same? Will the US still be as important as we were for most of this period? Will it matter?

And even if we had data back many more decades or centuries would it make any difference to the validity of our projections?

And then what about that asteroid strike? I hear that all of the gold we have found has actually come from past meteor strikes. (The original primordial gold sank to the core along with the other heavy elements) What if it contains tons and tons of gold, or maybe diamonds? Hmmm....

By the way, this is my first post as a new member.
__________________
Merrily, merrily, merrily, merrily,
Life is but a dream.
CaliforniaMan is offline   Reply With Quote
Old 12-08-2013, 01:22 PM   #36
Recycles dryer sheets
 
Join Date: May 2013
Posts: 127
Another Firecalc problem that hasn't been mentioned is undersampling of recent years. If, for example, you are looking at 30-year periods, then the 2008 meltdown is only included in 5 time series, while years 1982 and earlier are included 30 times. And, of course, it gets worse the longer the time period you are using in the calculator. I'm guessing that the underweighting of 2008 makes Firecalc overoptimistic.
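The undersampling is easy to quantify. A small sketch, assuming annual data from 1871 through 2012 and 30-year windows:

```python
# How many 30-year rolling windows in an 1871-2012 record include a given year?
def inclusion_count(year, first=1871, last=2012, horizon=30):
    starts = range(first, last - horizon + 2)  # valid window start years
    return sum(1 for s in starts if s <= year <= s + horizon - 1)

print(inclusion_count(2008))  # -> 5
print(inclusion_count(1982))  # -> 30
print(inclusion_count(2012))  # -> 1
```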

I wrote my own little calculator using three methods: Firecalc, MC analysis assuming a normal distribution, and MC analysis using resampling with replacement (analogous to bootstrap sampling). Firecalc is much more optimistic than the MC analysis, so I tend to be more comfortable with MC.
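A minimal sketch of the resampling-with-replacement method described above. The return series and the fixed 4% real withdrawal are illustrative stand-ins, not Fred123's actual inputs:

```python
# Bootstrap-style Monte Carlo: each simulated retirement draws 30 annual
# real returns at random (with replacement) from a historical return list,
# then we count how often the portfolio is exhausted.
import random

def bootstrap_failure_rate(annual_real_returns, wr=0.04, horizon=30,
                           n_sims=10_000, seed=1):
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    failures = 0
    for _ in range(n_sims):
        balance = 1.0  # portfolio normalized to 1; spending in real terms
        for _ in range(horizon):
            balance -= wr            # withdraw at the start of the year
            if balance <= 0:
                failures += 1
                break
            balance *= 1 + rng.choice(annual_real_returns)
    return failures / n_sims

toy_returns = [0.20, 0.10, 0.07, 0.03, -0.05, -0.15, 0.12, 0.01]  # made up
rate = bootstrap_failure_rate(toy_returns)
print(f"simulated failure rate: {rate:.1%}")
```

Resampling single years like this discards any serial correlation in the record, which is exactly the property being debated in this thread.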
Fred123 is offline   Reply With Quote
Old 12-08-2013, 01:41 PM   #37
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
Midpack's Avatar
 
Join Date: Jan 2008
Location: NC
Posts: 21,304
Quote:
Originally Posted by Fred123 View Post
Another Firecalc problem that hasn't been mentioned is undersampling of recent years. If, for example, you are looking at 30-year periods, then the 2008 meltdown is only included in 5 time series, while years 1982 and earlier are included 30 times. And, of course, it gets worse the longer the time period you are using in the calculator. I'm guessing that the underweighting of 2008 makes Firecalc overoptimistic.
True. However the actual Great Depression (which was far worse than 2008) is included 30 times, and that hasn't happened again for about 80 years (let's hope it never does). And the worst 30-year periods since 1871 began in 1965-69 and 1973 (ending 2003), which include neither the 1930's Great Depression nor the 2008 meltdown (yet, though the market has already more than recovered those losses).

FIRECALC is history; it's not optimistic or pessimistic, in that it makes no predictions about the future whatsoever.

Again, retirement calculator tools are axes, not scalpels - even in the most gifted academic hands. And even if you could nail down portfolio returns, there are so many other unknown variables that you'd still be left with statistical probabilities...
__________________
No one agrees with other people's opinions; they merely agree with their own opinions -- expressed by somebody else. Sydney Tremayne
Retired Jun 2011 at age 57

Target AA: 50% equity funds / 45% bonds / 5% cash
Target WR: Approx 1.5% Approx 20% SI (secure income, SS only)
Midpack is offline   Reply With Quote
Old 12-08-2013, 01:44 PM   #38
Give me a museum and I'll fill it. (Picasso)
Give me a forum ...
samclem's Avatar
 
Join Date: May 2004
Location: SW Ohio
Posts: 14,404
Quote:
Originally Posted by Fred123 View Post
I wrote my own little calculator using three methods: Firecalc, MC analysis assuming a normal distribution, and MC analysis using resampling with replacement (analagous to bootstrap sampling). Firecalc is much more optimisitic than the MC analysis, so I tend to be more comfortable with MC.
Would it be best to simply append data from previous years to the end of the real-world set? So, a 30 year sample could be made from 1984-2012 (29 years) plus 1983. Another set would be 1985-2012 plus 1983-1984. Yes, there would be a big discontinuity where the "splice" occurs, but it is no worse than the discontinuity in >every< year-to-year of a typical MC simulation, and the payoff is that you can in this way include the very significant 2008-2012 data in a larger number of "runs."
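The splicing idea can be sketched like this (years are just labels here; a real run would carry the return data along with them):

```python
# Wrap-around splice: build 30-year runs by appending the earliest years
# after the end of the record, so recent years appear in as many runs as
# older ones.
def wrapped_windows(years, horizon=30):
    n = len(years)
    return [[years[(start + i) % n] for i in range(horizon)]
            for start in range(n)]

years = list(range(1871, 2013))   # 142 calendar years, 1871-2012
runs = wrapped_windows(years)

print(len(runs))                          # -> 142, one run per starting year
print(runs[-2][:3])                       # -> [2011, 2012, 1871]: the splice
print(sum(1 for r in runs if 2008 in r))  # -> 30, same weight as older years
```

Every year now appears in exactly 30 runs, at the cost of the artificial discontinuity at each splice point that the post above acknowledges.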
samclem is offline   Reply With Quote
Old 12-08-2013, 02:24 PM   #39
Thinks s/he gets paid by the post
 
Join Date: Oct 2012
Location: Reno
Posts: 1,338
Quote:
Originally Posted by samclem View Post
Would it be best to simply append data from previous years to the end of the real-world set? So, a 30 year sample could be made from 1984-2012 (29 years) plus 1983. Another set would be 1985-2012 plus 1983-1984. Yes, there would be a big discontinuity where the "splice" occurs, but it is no worse than the discontinuity in >every< year-to-year of a typical MC simulation, and the payoff is that you can in this way include the very significant 2008-2012 data in a larger number of "runs."

I'm not sure exactly how FireCalc uses its data set. My impression is that it runs the SWR against consecutive-year sets as long as the length of the withdrawal period, but I'm likely wrong.
If so, one could randomize (or Monte Carlo) consecutive 5- or 7-year sets of data, which would hybridize the historical data set with the single-year Monte Carlo approach. I chose 5-7 years simply because that period would hold a typical market cycle or more.
I've probably got this wrong and maybe FireCalc already uses this approach.
RobLJ is offline   Reply With Quote
Old 12-08-2013, 03:51 PM   #40
Thinks s/he gets paid by the post
 
Join Date: Jan 2008
Posts: 1,495
There's a very good (recent) thread on the Bogleheads site regarding the use (and reliability) of various calculators. One in particular lists calculators (including FC) from most pessimistic to least. See this Boglehead site for a listing of several calculators, their attributes, and methodologies ranging from MC to deterministic.

Retirement calculators and spending - Bogleheads

I also recall reading here somewhere that a poster stated he would trust no single input to determine a comfort level for FIRE, and I tend to agree. I've run several scenarios/calculators which all tell me I'm on track. Next spring, I intend to buy ESPlanner (recommended as most detailed and helpful). If it concurs with the research I've done thus far--and I believe it will--I'll pull the plug and book that trip down the Amazon...
Options is offline   Reply With Quote