Erm, I don't really get why the difference is so big, there really shouldn't be any. Anyway, go for the safe bet, and assume it's the higher temperature (it most probably is). Just to give an idea, my current CPU temp is 50C, and the 4 cores are all at 47C.
The individual core temps are read via a DTS (digital thermal sensor) inside each core; the CPU temp usually comes from a thermal diode on the motherboard, next to or below the CPU.
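For anyone curious, you can compare the two sets of readings yourself. A minimal Python sketch for Linux, assuming the coretemp driver (for the per-core DTS values) and your board's sensor chip driver are loaded and exposed under /sys/class/hwmon (paths and labels vary by board):

import glob, os

def read_hwmon_temps():
    # Collect every temperature the kernel exposes, keyed by chip/label.
    readings = {}
    for hwmon in glob.glob("/sys/class/hwmon/hwmon*"):
        try:
            chip = open(os.path.join(hwmon, "name")).read().strip()
        except OSError:
            continue
        for temp_file in glob.glob(os.path.join(hwmon, "temp*_input")):
            label_file = temp_file.replace("_input", "_label")
            label = (open(label_file).read().strip()
                     if os.path.exists(label_file)
                     else os.path.basename(temp_file))
            millidegrees = int(open(temp_file).read().strip())  # values are in millidegrees C
            readings[chip + "/" + label] = millidegrees / 1000.0
    return readings

for sensor, celsius in sorted(read_hwmon_temps().items()):
    print("%s: %.1f C" % (sensor, celsius))

The "coretemp" entries are the DTS readings; the CPU diode reading comes from whatever Super I/O chip your board uses.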
Actually, one of the sensors is probably faulty, at least that would be my guess. Morpha has it right: there are sensors inside each core, and one close to the CPU. They should be reporting the same, or at least very similar, temperatures.
The low reading is the CPU one, so that's the motherboard sensor. It's probably faulty, or placed so poorly that it doesn't actually pick up the correct temperature ^^
Well, I have seen benchmarks where some games gain a decent amount of extra FPS, and things like video converting or rendering also finish faster. I bought the cooler because my PC was running quite hot anyway and I wanted to cool it down; while I was at it, I figured I might as well overclock.
Do you find your performance lacking in any particular app? What kind of framerates are you getting?
Slightly faster for the sake of being faster is pointless if you're already getting 90+ fps.
Likewise, a 25% boost in clockspeed (3.75GHz vs. 3.00GHz) might get you a 10% boost in framerate. At 30 fps, that gets you an extra 3 fps. Not exactly earth shattering, nor worth the effort/risk, IMO.
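Spelling that arithmetic out (the 0.4 scaling factor is just an assumption picked to match the "25% clock, ~10% fps" figures above; real games scale anywhere from almost nothing to nearly 1:1):

base_clock, oc_clock = 3.00, 3.75          # GHz
clock_gain = oc_clock / base_clock - 1.0   # 0.25 -> a 25% clock boost
fps_scaling = 0.4                          # assumed: fps rarely scales 1:1 with CPU clock
fps_gain = clock_gain * fps_scaling        # ~0.10 -> roughly a 10% framerate boost
base_fps = 30.0
print(round(base_fps * (1.0 + fps_gain)))  # 33 -> an extra 3 fps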
If you want higher framerates, get a new video card.
Thing is, though, you're not getting 90+ fps everywhere unless you have the most up-to-date computer, which most people don't. Overclocking an older CPU might give you enough of a boost to run more demanding applications, especially if you overclock the whole system.
Besides, it's mostly free unless you spend money on expensive cooling, and luke only spent around $30 on his. Considering a new video card would cost over $100 at the very least, overclocking is good value for money. The risk, if you know what you're doing, is minimal. I don't often hear of people breaking their computers by overclocking, and when I do, it's almost always down to basic beginner errors.
Well, Everest, RealTemp and CoreTemp all give the same temperature readings, within a degree either way.
Maybe I will reapply the thermal paste, as I did use a bit too much.
Edit: Hmm, I will wait a few days, as there is apparently a 'break-in' period during which temps can drop 1 to 5C, so I shall see in a few days :P
Edit 2: I went for an OC. I'm currently at 3.26GHz; it idles at 36/43C and hits 43/50C at full load (that was playing a game). I had to knock the voltage up to just under 1V.
Still no mention of what apps/games the OP is running. If all he's doing is playing games, he's either completely limited by his video card, or getting such astronomical framerates that more performance is pointless.
Sounds to me like he just wants to enlarge his e-peen.
Not trying to be a jerk, just putting things in perspective.
Lowering your graphics settings means lowering the load on your GPU, allowing it to render more frames in the same amount of time. However, if your CPU is the bottleneck, your graphics settings are irrelevant.
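A toy frame-time model makes this concrete (hypothetical numbers, not measurements): each frame costs max(cpu_time, gpu_time), so the slower side sets your FPS.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    # Whichever stage takes longer dictates the frame time.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# CPU-bound case: 25 ms of CPU work per frame; GPU work shrinks with lower settings.
print(fps(cpu_ms_per_frame=25, gpu_ms_per_frame=20))  # 40 fps at high settings
print(fps(cpu_ms_per_frame=25, gpu_ms_per_frame=10))  # still 40 fps at low settings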
OK, I quickly tested it: between the settings I use and the lowest, there was about a 10-20 FPS difference. Considering I usually run at 1920x1200 and dropped to 1024x768, that's not a lot.
Different games place different demands on your CPU, GPU and other components. So it's always a good idea to get the most out of your current computer.
Anyway, it's entirely possible that both luke's CPU and GPUs are bottlenecking him, considering they're not the newest components, especially if he's playing newer games like BFBC2. In that situation, if he's not ready to spend hundreds of dollars on new parts, overclocking remains the best option.
It won't matter much if he only overclocks his CPU. If both parts are bottlenecks, overclocking just the processor won't help; since both are slow, he'd have to overclock both, or buy something better, which the OP isn't going to do.
For example, my old PC with a Q6600 and an 8800GT in GTA IV: to test for bottlenecks, I lowered the resolution to 1024x768 and turned every detail down. The FPS difference between 1680x1050 and 1024x768 was about 10 fps, so by my improvised rule of thumb both components were bottlenecking. GTA IV not only hammers the CPU with complex physics and geometry, it also uses a massive amount of GPU memory for textures. On my new i7 860 rig, (newly stable) at 3.6GHz with a 5870, the same tests showed a noticeable difference, 50-70 fps average vs. 80-100 average, so a bottleneck isn't the key issue there.
In conclusion, if the OP wants a meaningful FPS boost, both the CPU and the GPU would have to be overclocked, which puts more stress on the parts, and judging by his posts he doesn't seem very well informed about how voltages, multipliers and FSB work. Encouraging this will make his PC much less stable. Hope that helps.
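That resolution-drop test, written out as a rough helper (the 15% threshold is just an illustrative assumption, not a standard):

def bottleneck_guess(fps_native, fps_low_res, threshold=0.15):
    # Compare FPS at native resolution against a much lower one.
    # If dropping the resolution barely helps, the GPU wasn't the limit.
    gain = (fps_low_res - fps_native) / float(fps_native)
    if gain < threshold:
        return "CPU-bound (or both): lowering GPU load barely helped"
    return "GPU-bound: the lower resolution gave a clear FPS gain"

print(bottleneck_guess(fps_native=50, fps_low_res=55))  # small gain -> CPU side (or both)
print(bottleneck_guess(fps_native=50, fps_low_res=90))  # big gain -> GPU side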
EDIT: Oh, and one more thing, in case anyone brings up a question about SLI here: 512MB + 512MB does not equal 1024MB. SLI (and CFX too) effectively gives you the video memory of ONE GPU, since the same data is duplicated on each card; there's nothing additive going on.
If I'm not mistaken, SLI/CF uses the memory from BOTH cards, but just to read/write faster? Almost like using RAID, if you know what I mean. At least that's how I think it works, if I remember correctly. =)