
Most temperature sensor software is designed for measuring temperatures under stress. Idle temperatures need more careful calibration because they are generally closer to room temperature, which makes them more prone to error.

Also, it depends on the software that measures the temperatures. Some programs work well with certain systems and poorly with others, so it is best to try several and see which gives the most accurate readings on your system. Get a feel for what you would expect to happen as you stress the system, and see whether one monitor tracks those conditions better than the others.
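
For what it's worth, here is a rough sketch of the kind of comparison I have in mind, in Python. It assumes an NVIDIA card and the pynvml package (both assumptions on my part, and they would not apply to an AMD card like mine); the idea is just to record the temperature once a second yourself, at idle and then under stress, so you can compare the log against what each monitoring program reports.

import time
import pynvml

# Illustrative only: requires an NVIDIA GPU and the pynvml package.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        # Core GPU temperature in degrees C, as reported by the driver.
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{time.strftime('%H:%M:%S')}  GPU temp: {temp} C")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

Run something like that in one window, run your usual monitor next to it, and watch how closely the two track each other at idle and under load.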

In my experience, most monitors are more accurate under stress than at idle. I attribute this to two things: they are designed to watch for high temperatures rather than low ones, and there is the room-temperature error I mentioned earlier. I am not sure I have answered the question properly, since I am making educated guesses about what is likely happening. Given that your idle temperatures are quite a bit higher than room temperature, though, I would guess they are fairly accurate, but I cannot say for sure.

The stress temperatures are more important, at any rate, and if those are well below 70 C on your card, you are doing very well. My AMD HD 4850 in my desktop runs in the 90-95 C range under stress, for instance, and around 50-60 C at idle. It is clear of dust, but I have never been able to get it to run any cooler. No problems with the system, though *knock on wood*.