Originally Posted by Fred123
Here's an example: suppose you have $100 invested in the stock market. In year 1 the stock declines by 50%; in year 2 it increases by 100%. After two years you have $100. But the average return is (-50% + 100%)/2, or 25%. If you had instead put the money in a CD earning 1%/year, you would have (roughly) $102, and the average return is only 1%. Yet you've earned more money in the second case.
I don't think you can average rates of return that were earned on different-sized piles of cash and merge them into one number. In the example above, the 50% loss hits the full $100, while the 100% gain is earned on only the $50 that remains, so the loss applies to twice as much money as the gain. Take that into account and the right average is the geometric mean: ((1 - 0.50) x (1 + 1.00))^(1/2) - 1 = (0.5 x 2.0)^(1/2) - 1 = 0% per year, which matches the fact that you end with exactly the $100 you started with.
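To make the difference concrete, here's a minimal Python sketch (the numbers and variable names are just illustrative) that computes the misleading arithmetic mean alongside the geometric mean for this example:

```python
# Contrast the arithmetic-mean return with the geometric mean (CAGR)
# for the -50% / +100% example. Illustrative numbers only.

returns = [-0.50, 1.00]  # year 1: -50%, year 2: +100%

# Arithmetic mean: misleading, because each return applies to a
# different amount of capital.
arithmetic_mean = sum(returns) / len(returns)

# Compound the returns to get the actual ending wealth per dollar.
growth = 1.0
for r in returns:
    growth *= 1.0 + r

# Geometric mean (CAGR): the constant annual return that would
# produce the same ending wealth.
geometric_mean = growth ** (1.0 / len(returns)) - 1.0

print(f"arithmetic mean:        {arithmetic_mean:.1%}")  # 25.0%
print(f"ending wealth per $1:   ${growth:.2f}")          # $1.00
print(f"geometric mean (CAGR):  {geometric_mean:.1%}")   # 0.0%
```

The geometric mean compounds to the actual ending balance, which is why it comes out to 0% here while the arithmetic mean claims 25%.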
Obviously you picked up on that, since you weren't fooled by the 25% figure; I just wanted to spell out why the math doesn't work.