I did a sort of study, which should at least be applicable if you have a fairly energy-efficient house in a moderate climate. It was part of my thread on the electricity-measuring device I bought about 2-3 months ago (the "Kill-A-Watt").
I put the Kill-A-Watt on my furnace and measured its electricity use for 24 hours without a setback (leaving it at 69 degrees) and for 24 hours with a setback (to 65). I figured (with no dissenting argument so far) that if the furnace was on, it was using electricity, and if it was off, it wasn't. So measuring the electricity was an implicit measurement of how long the furnace ran in aggregate.
Without pulling up the thread, the usage in the 24-hour period with the setback was roughly 30% lower than without it.
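If anyone wants to redo the arithmetic with their own readings, here's a quick sketch of it. The kWh figures and the blower wattage below are made-up placeholders (my actual readings are back in the other thread), but the method is the same: constant draw while running means kWh divided by watts gives runtime.

```python
# Estimate furnace runtime and setback savings from Kill-A-Watt totals.
# Assumes the furnace draws a roughly constant load while running.

BLOWER_WATTS = 400  # assumed constant draw while the furnace runs (placeholder)

def runtime_hours(kwh):
    """Hours the furnace ran, given a kWh total over the period."""
    return kwh * 1000 / BLOWER_WATTS

def setback_savings(kwh_no_setback, kwh_setback):
    """Percent reduction in electricity use (and so runtime) from the setback."""
    return 100 * (kwh_no_setback - kwh_setback) / kwh_no_setback

# Hypothetical 24-hour readings, not my real numbers:
no_setback = 2.0  # kWh at a constant 69 degrees
setback = 1.4     # kWh with the 65-degree setback

print(runtime_hours(no_setback))                    # 5.0 hours of runtime
print(round(setback_savings(no_setback, setback)))  # 30 percent lower
```

Since runtime scales with fuel burned, the percent drop in electricity should track the percent drop in gas or oil use, at least roughly.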
Colder house, colder climate, different energy efficiency: maybe different numbers. I also want to do a run of 3-4 days at similar temperatures to get a better sample than a single 24-hour period.
About the only caveat I can think of in this area is the timing of the setback. Air heats up pretty fast, but the "stuff" in your house (furniture, floors, walls and so forth) takes a long time to cool off and a long time to warm back up. In a very cold climate like MN or New England, in a house with very little insulation, maybe the "stuff" gets pretty cold pretty fast and the furnace has to work really hard to warm it all back up in the morning, versus just maintaining the status quo.
But then again, my knowledge of thermodynamics extends just as far as being able to read a thermometer fairly accurately.