Meat Thermometer Puzzle

I just put that CDN in some microwaved water. It went from 66 degrees to 158 degrees in 23 seconds!

The ChefAlarm only took about 15 seconds.

One trick is to put your instant read thermometer in a glass of very hot water before sticking it in the meat. Then it only takes a few seconds.
 
I thought I'd found the problem:

I fill up a two-cup measuring cup with water and nuke it in the microwave.

I hold the probe by the little plastic bump (seen here):

[photo: ChefAlarm probe, held by the plastic bump]


I put it in the water so that only the narrow half-inch tip is submerged.

It stabilizes at 152 degrees.

I then put it all the way into the water.

The temperature drops quickly to 142 degrees.
But then I realized that the water was indeed 10 degrees hotter at the top. If I stirred the water while measuring, there was no difference.
 
Wow, that ChefAlarm of yours is no slouch.

Oh wait! It is also made by Thermoworks!

I guess you do not need to use anything else. Please disregard my earlier suggestions.

But your actual "speed test" made me want to "race" the Thermapen. Please hold...
 
I just put that CDN in some microwaved water. It went from 66 degrees to 158 degrees in 23 seconds!

The ChefAlarm only took about 15 seconds.

One trick is to put your instant read thermometer in a glass of very hot water before sticking it in the meat. Then it only takes a few seconds.

Just to be clear, the CDN that T-Al is talking about is the mechanical kind. The CDN I discussed is ~$18, has a digital readout, and is pretty fast (the spec says 6-8 seconds, but it seems faster in most cases). It has an 'auto-calibrate' mode for an ice-water bath.

Amazon.com: CDN DTQ450X ProAccurate Quick-Read Thermometer: Kitchen & Dining

No accuracy is spec'd, but it sure seems to be good enough, and repeatable enough, for home use. We purchased it in Dec 2009, and I have replaced the coin cell just once, I think.

I just tried T-Al's test, from a 74F water bath to a dunk in boiled water out of the microwave - it moves pretty fast; I caught it at 203 @ 5 seconds and 211 @ 10 seconds. I'd expect it to be a little faster if I stirred it around; that makes a big difference in heat transfer. I'll check the ice-bath cal later.

-ERD50
 
I was thinking about doing my own test with the Thermapen, and thought that a video would be needed to catch the readings. But then it occurred to me that surely someone had already done it and posted it on YouTube.

Yes, see the following video for yourself. Thermapens RULE!

 
I am not saying they are doing this, but manufacturers know how their units are tested, and they can and do bias their results to account for that. Many years ago, when I was testing a competitor's ear thermometer, I found that they biased the reading toward 98.6. For every degree Fahrenheit away from 98.6 that they computed, they added or subtracted 0.1 degree toward 98.6. They knew, of course, that most people who evaluate the thermometers have a normal temperature and want to see that number.

Anyone who has used a digital scale knows that most of them sold in the U.S. show exactly the same weight when you step off and step on again, giving you a false sense of their repeatability. You have to step off and then either step on again with half your weight or wait a period of time, to reset it and get a newly computed result. When you do this, you will see that there is substantial variability in the measurements.

I am always suspicious of the potential tampering with actual results with microprocessor based instruments and have to account for this possibility when testing.
 
Speaking of measurement accuracy: in a past industrial project, I built a circuit to interface a thermocouple to a microcontroller. While the thermocouple sensor itself is simple and accurate, the tough part is the cold-junction compensation. It was tough to get 1 deg F accuracy, and in the end we decided that we really did not need it, because the object whose temperature we were trying to measure was a turbine engine that ran so damn hot anyway that an error of a few degrees was of no concern.

But when it comes to speed, or how fast the sensor can stabilize, a thermocouple has no rival other than infrared temperature sensors. However, the latter cannot be as accurate, due to their nature.
 
I am not saying they are doing this, but manufacturers know how their units are tested, and they can and do bias their results to account for that. Many years ago, when I was testing a competitor's ear thermometer, I found that they biased the reading toward 98.6. For every degree Fahrenheit away from 98.6 that they computed, they added or subtracted 0.1 degree toward 98.6. They knew, of course, that most people who evaluate the thermometers have a normal temperature and want to see that number.

Anyone who has used a digital scale knows that most of them sold in the U.S. show exactly the same weight when you step off and step on again, giving you a false sense of their repeatability. You have to step off and then either step on again with half your weight or wait a period of time, to reset it and get a newly computed result. When you do this, you will see that there is substantial variability in the measurements.

I am always suspicious of the potential tampering with actual results with microprocessor based instruments and have to account for this possibility when testing.

98.6 is a bogus number anyway...

Normal Body Temperature : Rethinking the Normal Human Body Temperature - Harvard Health Publications
 
Based on that video, I would definitely want to use a Thermapen when I want a very quick measurement of something that is 33 degrees Fahrenheit. Does anyone ever need to do that?

Anyone who has used a digital scale knows that most of them sold in the U.S. show exactly the same weight when you step off and step on again.
Yes, Health-o-meter caught me with that once. I tried one at someone's house, got three readings the same, and said "Wow, I'm buying one of these!"

It is the worst kind of lie. I'll bet a marketing exec said "We can't sell a scale that goes up and down by 2 pounds as you stand on it, go find a solution."
 
Based on that video, I would definitely want to use a Thermapen when I want a very quick measurement of something that is 33 degrees Fahrenheit. Does anyone ever need to do that? ...


I assume it applies to any temperature delta of that magnitude; room temp to meat-cooking temperatures is a similar delta. The time constant works in percent of the delta, so the delta itself doesn't really change things much, except that the typical 'within a few degrees' we shoot for is a larger fraction of a small delta, so the reading gets into that range quicker (in absolute rather than relative terms).


Yes, Health-o-meter caught me with that once. I tried one at someone's house, got three readings the same, and said "Wow, I'm buying one of these!"

It is the worst kind of lie. I'll bet a marketing exec said "We can't sell a scale that goes up and down by 2 pounds as you stand on it, go find a solution."

I'm not sure if I'm more disgusted by the trickery or impressed by the creative 'solution'. Pretty darn clever! BTW, I was hoping they didn't do this with our kitchen scale, since there you can be adding or removing small amounts for a recipe. Looks like it gives it its best shot w/o too much filtering.

I looked at that kitchen scale a while back, and it is also quite clever. They have a separate pressure transducer floating in each of the four legs, and they sum the outputs. It doesn't matter where you place the item - any corner or the center of the platform - it reads the same.

-ERD50
 
I assume it applies to any temperature delta of that magnitude; room temp to meat-cooking temperatures is a similar delta. The time constant works in percent of the delta, so the delta itself doesn't really change things much, except that the typical 'within a few degrees' we shoot for is a larger fraction of a small delta, so the reading gets into that range quicker (in absolute rather than relative terms).

-ERD50

+1

The temperature rise (or drop) of a thermometer as it equilibrates with the medium is approximately an exponential decay toward the final value. So the time to get within, say, 1F of the final reading (which may not be correct) is set by the time constant of the thermometer and also grows with the size of the temperature change.
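That exponential approach is easy to sketch in a few lines of Python. The time constant below is a made-up illustrative value, not a measured spec for any of these thermometers:

```python
import math

def settle_time(t_start, t_final, tau, tolerance=1.0):
    """Seconds for a first-order sensor to read within `tolerance`
    degrees of t_final, given time constant tau (seconds).
    Model: T(t) = t_final + (t_start - t_final) * exp(-t / tau)."""
    delta = abs(t_final - t_start)
    if delta <= tolerance:
        return 0.0
    return tau * math.log(delta / tolerance)

# The 66F -> 158F dunk from earlier in the thread, with an
# assumed (hypothetical) 3-second time constant:
print(round(settle_time(66, 158, 3.0), 1))   # larger delta -> longer settle
# Pre-warming the probe in hot water shrinks the delta:
print(round(settle_time(140, 158, 3.0), 1))  # settles noticeably sooner
```

Note the time grows with the logarithm of the delta, which is why the pre-warming trick mentioned above helps but doesn't scale linearly.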
 
Just another thought about response time. All of the electronic oral thermometers out there now are linear predictive thermometers. Basically, they look at the rate of change of temperature readings and predict the final temperature. They display and lock that reading long before the actual sensor reaches that temperature. I am guessing that is the case here. You can see how in this case it would be easy to bias a thermometer when you know the testing temperature. Again, I am NOT saying that they are doing this; I don't know. But it is something that is possible and, sad to say, probably increasingly common. As T-Al suggests, those "marketing execs"...
 
I wonder if the problem here is simply one of original calibration. County Meath, Ireland is about ⅓ of the way around the planet from Al's location, and if he's trying to use a Meath thermometer as stated in the thread title he may not be conforming to the manufacturer's intent.

Just a thought. :LOL:
 
I understand how one can cheat with a medical thermometer, where the expected human body temperature has a narrow range. I do not see how one can apply this "technique" to a food thermometer that, in the case of the Thermapen, claims to cover a range from -58F to 572F.

Of course, I could be wrong. But the easiest way is for me to do some tests on the Thermapen myself, as I own one. I need to get off this forum for a little while though. :)
 
Here's another thought.

In order to catch the medical thermometer maker cheating, I can devise a simple test. With the thermometer at room temperature, dip it into a glass of water at 80F for a couple of seconds, then quickly move it to a 2nd glass at 100F. What will the doggone thing read?

Next, start with 100F temperature for the first few seconds, then move to 80F. What will it read?

One can do simple tests to check out various hypotheses.

And people think that thermometer testing is boring. :)
 
Here's another thought.

In order to catch the medical thermometer maker cheating, I can devise a simple test. With the thermometer at room temperature, dip it into a glass of water at 80F for a couple of seconds, then quickly move it to a 2nd glass at 100F. What will the doggone thing read?

Next, start with 100F temperature for the first few seconds, then move to 80F. What will it read?

One can do simple tests to check out various hypotheses.

And people think that thermometer testing is boring. :)
You are correct, and there are a lot of papers and patents addressing this issue. One good way to fool an IR thermometer, for example, is to suddenly move it into a very different ambient temperature. All of them assume stable temperatures for their stated accuracy. Temperature measurement and control is indeed much more difficult (and interesting) than it appears at first blush.
 
OK, I am back.

I did some tests similar to that video, dunking the Thermapen and other meat thermometers into glasses of water at different temperatures. Nope, I never saw any sign of hanky-panky on any of them, other than the different stabilization times to the final reading. I do not have video because that takes way too much work to set up, and this job does not pay all that well.

But it should come as no surprise that the speed demon of them all is my engineering thermocouple. See photo below. The meter this probe plugs into has a display update rate of perhaps 10 Hz, and it reaches final readings in 1/2 to 1 second (too fast to measure without videotaping). Even so, I think the limitation is in the electronics of the meter, which must do some averaging to filter out the noise.

A K-type thermocouple puts out about 41 uV per deg C, and this low-level signal has to be amplified in the presence of noise like AC hum and other radiated radio interference. So, some filtering is needed to prevent fluctuations in the reading, and that would cause most of the lag. Just two tiny strands of wire like that thermocouple should not take more than 1/10 of a second to equalize in temperature with the water bath.
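The cold-junction arithmetic behind that 41 uV/degC figure can be illustrated with a linear sketch. This is an approximation only - the sensitivity is not truly constant, and real meters use the NIST polynomial tables rather than a single coefficient:

```python
# Approximate K-type sensitivity near room temperature, in uV per deg C.
# (Linear approximation for illustration; not valid over the full range.)
SEEBECK_UV_PER_C = 41.0

def hot_junction_temp_c(measured_uv, cold_junction_c):
    """Recover the hot-junction (probe tip) temperature from the
    measured thermoelectric voltage plus the cold-junction
    (meter terminal) temperature - the compensation step that
    makes thermocouple interfacing hard."""
    return cold_junction_c + measured_uv / SEEBECK_UV_PER_C

# 3280 uV measured with the meter terminals at 25 C:
print(round(hot_junction_temp_c(3280.0, 25.0), 1))  # -> 105.0
```

The point of the example: a 1 deg C error in sensing the terminal temperature produces a 1 deg C error in the answer, which is why the cold junction dominates the accuracy budget.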

I guess my medical thermometer deserves its own later scrutiny, due to the "serious allegations" in previous posts.

 
Just another thought about response time. All of the electronic oral thermometers out there now are linear predictive thermometers. Basically, they look at the rate of change of temperature readings and predict the final temperature. They display and lock that reading long before the actual sensor reaches that temperature. I am guessing that is the case here. You can see how in this case it would be easy to bias a thermometer when you know the testing temperature. Again, I am NOT saying that they are doing this; I don't know. But it is something that is possible and, sad to say, probably increasingly common. As T-Al suggests, those "marketing execs"...

Yes, but I don't think the predictive algorithms are really 'cheating' in any way. These thermometers are typically used in liquids or meat, so once they have characterized the response curve in that type of substance (and I assume meat and liquids are similar enough in this regard), they can predict the output based on the temp change over a short period of time. Each time, they would update it as they approach the 'real' settled temperature, so the absolute error is less with each update. I don't think the temperatures themselves make any difference at all to the algorithm; it's all about that exponential curve, regardless of the specific start/end points.

And I assume they undershoot a bit in their approximation - it probably 'looks' better to the user to see a temp go from say....

72..... 120.....130....135....136....136... than to see it go....

72..... 130.....138....134....136....136...
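A minimal sketch of how such a predictor could work, assuming a pure first-order (exponential) response and just three equally spaced readings. Real instruments use many more samples plus the error checks mentioned later in the thread; this only shows the core curve-fitting idea:

```python
def predict_final(t0, t1, t2):
    """Predict the settled temperature from three equally spaced
    readings on an exponential approach curve.
    Model: T_n = Tf + A * q**n, where q = exp(-dt/tau), so the
    ratio of successive differences recovers q, and then Tf."""
    d1, d2 = t1 - t0, t2 - t1
    if d1 == 0 or d2 == d1:
        raise ValueError("readings are not on a converging exponential")
    q = d2 / d1          # ratio of successive steps, 0 < q < 1 if converging
    return t0 - d1 / (q - 1)

# Readings 72, 104, 120 (halving steps) lie on a curve settling at 136:
print(predict_final(72.0, 104.0, 120.0))  # -> 136.0
```

Note the start and end temperatures never enter the algorithm directly - only the shape of the curve does, which matches the point above about the specific start/end points not mattering.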

I wonder if the problem here is simply one of original calibration. County Meath, Ireland is about ⅓ of the way around the planet from Al's location, and if he's trying to use a Meath thermometer as stated in the thread title he may not be conforming to the manufacturer's intent.

:LOL: :LOL:

I'm guessing these kinds of thermal dynamics jokes would not go over well with the late night comedians! :cool: and I think we lost any non-techy types a while back in this thread ;)

-ERD50
 
Yes, but I don't think the predictive algorithms are really 'cheating' in any way. These thermometers are typically used in liquids or meat, so once they have characterized the response curve in that type of substance (and I assume meat and liquids are similar enough in this regard), they can predict the output based on the temp change over a short period of time. Each time, they would update it as they approach the 'real' settled temperature, so the absolute error is less with each update. I don't think the temperatures themselves make any difference at all to the algorithm; it's all about that exponential curve, regardless of the specific start/end points.

And I assume they undershoot a bit in their approximation - it probably 'looks' better to the user to see a temp go from say....

72..... 120.....130....135....136....136... than to see it go....

72..... 130.....138....134....136....136...



:LOL: :LOL:

I'm guessing these kinds of thermal dynamics jokes would not go over well with the late night comedians! :cool: and I think we lost any non-techy types a while back in this thread ;)

-ERD50
I did not mean to imply that they were cheating by using predictive algorithms (I worked on them myself), only that you might not be able to easily infer the response time of the sensor. And as you show, they do lock in the predicted reading, not showing the wiggle at the end. Most algorithms do have some form of error check, though, and discard a prediction that results from obviously bad input.

My intention was to indicate the possibility of some vendors "tuning" the test points to make the accuracy and precision look better than they actually are. Not that this is necessarily all that bad, because any device only has to function over a specific range. But any engineer testing the device would need to be aware of these possibilities.

I was, however, surprised at the time when I saw the built-in bias in the ear thermometer. It would certainly give a casual tester the appearance of lower variation around 98.6. My personal opinion was that this was not ethical, and I would not do it even if asked. I was never asked.

My bathroom scale also has a reset button, so it computes new results every time I use it. I bought it in Asia; I couldn't find one like that here. I do hate the ones that lock in the last reading. With mechanical scales and the older electronic ones, you could step on three times and keep the lowest reading. We want the lowest reading, right? :cool:
 
All the above talks about the bathroom scale cheating made me think.

OK, what you are saying is that if they can sense that it is the same person stepping on the scale for a repeated reading, they will present the last readout to give a sense of repeatability, which causes the user to infer accuracy, or at least reliability, which of course is not the same thing.

So, how does one devise a test? Simple. If the bathroom scale shows a resolution to 0.1 lb, one can do this test.

1) Step on the scale. The readout says "X".
2) Step off, then back on the scale. The readout says "X" again. Good.
3) Step off, then back on the scale while holding something weighing 0.2 lb. Does the readout say "X+0.2"?

I did that test with my bathroom scale, and the thing kept saying "X", even though I had that extra object in hand. Damn!

So, I now hold a 2nd thing in my other hand, which should cause the scale to read even higher. It does now! Damn! There's a threshold beyond which they "release" the lock.

Then, I release both objects and step on the scale again empty-handed. Does it read "X" again? Nope, it reads "X+0.2".

So, the repeatability of the scale is perhaps 0.2 lb, which is worse than its resolution of 0.1 lb. Its absolute accuracy is something else, which I do not know until I can compare it with a calibrated scale.
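The lock behavior inferred from that test can be modeled as a toy in a few lines. The class and the threshold value are hypothetical, just to make the logic concrete:

```python
class LockingScale:
    """Toy model of a bathroom scale that re-displays the previous
    reading whenever the new weight is within `threshold` lb of it.
    Hypothetical behavior, inferred from the step-on/step-off test."""

    def __init__(self, threshold=0.3):
        self.threshold = threshold
        self.last = None          # last displayed reading

    def weigh(self, true_weight):
        if self.last is not None and abs(true_weight - self.last) < self.threshold:
            return self.last      # "locked": repeat the last display
        self.last = round(true_weight, 1)
        return self.last

scale = LockingScale(threshold=0.3)
print(scale.weigh(180.0))   # -> 180.0
print(scale.weigh(180.2))   # -> 180.0  (within the lock window)
print(scale.weigh(180.4))   # -> 180.4  (threshold exceeded, lock released)
```

This reproduces the observations above: a small added weight is masked, a larger one breaks the lock, and repeatability ends up coarser than the display resolution.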
 
My experiments showed that the variability was even greater -- perhaps 0.6 pounds. That is: weigh yourself, step off, reset by weighing half of you (one foot on the scale), then weigh yourself again. Results could vary by 0.6 pounds.
 
Back to the original problem...

Last night I cooked a 1.25 inch thick rib-eye to 145 degrees, and used my thermometers to monitor its "resting" back to 125 degrees. So I had the thing on the table, and could conveniently experiment with probe positioning.

I could never get the ChefAlarm and CDN instant read thermometers to agree. The CA was usually 8-10 degrees above the other.
 
I am going to throw out a question since this is getting a lot of attention from engineer types...

What about the probes that you stick in and leave in the meat... the ones with a lead that goes outside the grill... does speed really matter? :confused: I would think that accuracy would be more important...
 
I am going to throw out a question since this is getting a lot of attention from engineer types...

What about the probes that you stick in and leave in the meat... the ones with a lead that goes outside the grill... does speed really matter? :confused: I would think that accuracy would be more important...

Hey, that would be like asking the owner of a Porsche with a 195-mph top speed if that speed is really important. I would probably arrive at the destination in about the same time as he does, but I am guessing he would get more girls. Sometimes it is all about bragging rights! :cool:
 