Run BurnInTest; if you have a broken gfx card it will tell you, with 2D and 3D errors. Don't let your monitor go to sleep or whatever (that will give you a "3D test interrupted" error, but that isn't a fault, not to confuse you).
We had a batch of 20 and 5 did exactly the same thing. Sent them all back to get replaced. When graphics cards fail in a big way they can take down a whole system. If you fired up a machine and pulled out the graphics card while it was running it'd crash the whole machine. It doesn't know what to do, so it will shit the bed. That is why your IGA screen shows a picture but is still frozen.
Rocking those stupid overclocks is a great way to ruin hardware. The option is there, but unless you've got money to chuck at the problems it creates, just don't do it.
Just thought I'd sneak this question in here: is there anything wrong with pairing RAM from two different manufacturers, both at 1600MHz? I have Mushkin Blackline (CL8) and thought about adding Crucial Ballistix, also at 1600MHz but CL9. Should I expect any problems?
It isn't recommended as it causes instability; you will need to slacken the timings on the Mushkin to match the Crucial by setting them manually in the BIOS. The board should do this by default, but I've seen some boards try to tighten the timings, which banjo'd the slower RAM.
I was wondering the same. AMD makes absolutely stellar GPUs (although quite power hungry compared to Nvidia) that are unmatched for GPGPU (at least until Titan), but their current desktop processors are... well, I'd probably be banned from here if I told you my honest opinion.
Depends on what you're building. If you're after a high-end system then Bulldozer is absolute crap compared to Ivy. If you're making an HTPC/light-gaming Steam box sort of build that relies on integrated graphics, I'd say Trinity is the way to go.
I don't think AMD make a CPU as fast as mine at stock clocks (i7 3770k) let alone overclocked (4.8 GHz).
The GPU was just because I was lazy and could swap it over without installing new drivers (previous one was an AMD card because years ago I bought an AMD "Dragon" bundle).
On-Topic - Haven't had my iPhone connected and it hasn't crashed yet...
I have the Phenom II X6 1090T running at 4.0GHz, up from the stock 3.2GHz. It benchmarks around the same as an i5 2500 and 13% slower than an i7 3720.
I haven't pushed the overclock much beyond 4.2GHz, but it's watercooled and load temps are only 44°C, so it could probably go a bit higher.
For basic use and gaming it works fine in conjunction with my graphics card @ 1GHz. I would rather go with Intel, but I couldn't afford a decent i7, and I got this CPU from a friend for £75 last year when it was retailing around £200+, so it was a very good bargain at the time.
It works for me so no need to upgrade yet
USB drivers, a faulty USB port, the iPhone itself? Maybe connect it and leave it 30 mins or so, then disconnect it, and repeat. If it crashes, then it may be the cause of the problem, or at least linked to it.
CPU: AMD>Intel
GPU: Nvidia>AMD
Just because one makes a faster product at a higher price doesn't mean they are better. AMD was kicking ass once. If some people had their way, there would be no competition for Intel. Which is stupid.
I think the definition of a "better CPU" is that it's faster. You wouldn't say a Ford Pinto is a better car than a Ferrari Enzo because the Pinto is cheaper, would you?
AMD wins for price vs performance, but in terms of pure performance AMD is falling way behind Intel's i7s, even with their latest FX.
Well, the FX-8350 is about the same as a 3770K, but... AMD needs 8 cores and 8 threads to do what Intel does with 4 cores and 8 threads? xD
I stand by my Pinto vs Ferrari argument. With a CPU, the only thing that matters is speed.
It's not a qualitative thing like "ooh, nice shiny leather" like a car has. It's purely quantitative. You might like AMD, or not have as much money so you decide to buy AMD, but to state that they make "better" CPUs is flat wrong.
Indeed, Intel's CPUs are far superior to AMD's and probably always will be. AMD boasts about their new 8/12-core CPUs, yet Intel's 4/6-core i7s can perform 50% better, and more in some cases.
Cheerio said they were better. I said the price vs performance was better.
The FX-8350 is about £150 or so, and when overclocked it performs just like a 3770K at stock clocks. Yet the 3770K costs about £250, £100 more than the AMD.
Intel will always be better than AMD at CPUs, and AMD should concentrate more on graphics to take on Nvidia, since AMD will never catch Intel imho.
AMD were better during the 00s but that is only because their competition was the P4. Once Intel pulled up their socks they quickly started to leave AMD in the dust again.
And since games only use two cores at maximum right now, you might as well buy the most powerful chip you can get on 2 cores. Right now that crown belongs to Intel. If you're into video editing, that crown belongs to Intel. It was a common claim that AMD are always 10 years behind Intel. Since the iX range, that is more believable than ever.
And apparently it's actually not that much cheaper. The higher-end Bulldozer CPUs cost the same as a comparable Intel CPU, and the savings would be negated after 200 days of use due to the higher power usage.
Uhh... no. The 8350, which is the fastest CPU that AMD offers, can barely compete with an i5-3570K, which is Intel's mid-range chip. When AMD FX chips are overclocked, they push out insane amounts of heat and pull in equally insane amounts of watts. They pull insane amounts of watts even at stock clocks compared to Intel's opposition. The savings in your electricity bill with Intel make all current AMDs pointless, unless you still live at your parents' and don't have to pay for electricity; then I might see a point to the AMD FX Bulldozers if you're really tight on cash.
I really hope not. Intel is already abusing its dominance in the enthusiast market. All 39XX-series chips are 8-core processors, but Intel locks two cores out of them, just because they don't have anyone to compete with in that market. If AMD dropped out completely from the mid- to high-end markets, we would all be in deep sh*t very soon. Without AMD in the game, Intel would have no reason to keep pushing R&D forwards.
Looking at a lot of different benchmarks, the 8350 @ ~4.5GHz is about the same as a stock i7 3770K. Obviously Intel is the better of the two, but it does pretty well against its opposition; even if it needs to be overclocked to its max and watercooled, it can keep up for most purposes.
As with all, or most, AMD chips, they run quite hot, especially when you start to overclock and they start pulling scary amounts of wattage.
I have the six-core 1090T running at 4.5GHz watercooled and it's 20-23°C idle, ~45°C at 100% load. I've never verified it, but at the current clock speed and voltage it pulls around 147W at load.
Without having it watercooled it'd probably be approaching its thermal limit of ~67°C; even with the best air cooler on the market it'd be almost impossible to reach those clocks and that voltage, or it'd be scary hot under load... and even at idle.
Yeah, obviously AMD needs to at least try to keep up and keep releasing competing chips, otherwise Intel would have 100% share of the CPU market, but I think AMD needs to change their game plan: Intel's quad chips perform significantly better than AMD's 6- and 8-cores. More cores doesn't mean it's better.
Other than in true multi-threaded applications that can utilize those cores/threads, there is no need for any more than 4; for gamers it's useless and a waste of power. I'd rather see a quad-core AMD that can perform close to what Intel's i7s can.
(But then, I guess, as Intel runs 2 threads per core and AMD just 1 thread per core... AMD needs the extra cores to get those extra threads...)
Well, the 1090T is actually a better CPU than FX when it comes to power efficiency. This graph speaks for itself.
I'm still not buying that the 8350 could ever match stock 3770K performance. Maybe it could get near with insane overclocks in some rendering/video exporting application that could stress all those cores, but for gaming and generic usage it's pretty hopeless even against a 3570K. Just compare SC2 and Skyrim framerates; the FX gets demolished.
When overclocked, things aren't much better if the 3570K is overclocked as well. Since the price difference between the 8350 and 3570K is only ~50 euros or so with a motherboard, you'd save that much in electricity in just a year.
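The "save that much in electricity in a year" claim can be sanity-checked with a quick back-of-the-envelope script. A minimal sketch, assuming a ~60W load-power difference, 0.20 €/kWh and 8 hours of use per day; all of these numbers are illustrative guesses, not measurements:

```python
# Back-of-the-envelope payback estimate for the electricity claim above.
# All inputs are illustrative assumptions, not measured figures.
def payback_days(watt_diff, eur_per_kwh, hours_per_day, price_gap_eur):
    """Days of use until the extra power draw eats up the purchase saving."""
    kwh_per_day = watt_diff / 1000 * hours_per_day
    cost_per_day = kwh_per_day * eur_per_kwh
    return price_gap_eur / cost_per_day

# Hypothetical: the FX draws ~60W more at load, 0.20 EUR/kWh, 8h/day,
# and the FX build is 50 EUR cheaper up front.
print(round(payback_days(60, 0.20, 8, 50)))  # ~521 days
```

With heavier daily use or pricier electricity the payback drops well under a year; the point is the calculation, not the exact figure.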
As I said, IMO.
I've had 3 Intel CPUs and 2 AMD CPUs.
All 3 Intel were (1 is, got a Q8200 atm...) crap.
For me at least.
I don't doubt that they outdo AMD number-wise, but for me they wouldn't do...
I've had a Phenom X3 @ 2.8GHz before this (mainboard died and screwed everything else up).
Compared to this Q8200, which is a quad... the Phenom felt like 500% faster.
Massive FPS drop in games, slower rendering etc.
Either way, I'm a happy customer of AMD even if I will have a slower PC.
As already mentioned, price-performance is far better there.
I'm not on the hunt for high-end machines anyway.
Guys.... how often do you have a CPU maxed out at 100%???!!!!
So while a 150€ Bulldozer is indeed slower than the 1500€ Extreme Edition, you will barely ever notice that. The Extreme Edition costs like 10 times as much and is like 30% faster (I obviously don't have the exact values, but the € to performance gap becomes MASSIVE).
Just recently I upgraded my cheap Core 2 Quad to a 3770K. I didn't really see a speed difference when using it normally (gaming, internet, booting etc.). I pretty much only notice the difference when doing 3D rendering etc., so maxing it out at 100% (and damn, it's nice).
So when I build a PC for gaming (or pretty much anything whose primary goal isn't something like rendering), I definitely would look at the performance-to-price ratio, which usually is on AMD's side, and invest the extra 1000€ into the GPU, where the power actually is needed and makes a big difference.
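The price-to-performance point can be made concrete with a toy calculation. The prices and scores below are placeholder assumptions (a 150€ chip with score 100 vs a 1500€ chip that is ~30% faster), not real benchmark data:

```python
# Toy price-vs-performance comparison. Prices and scores are placeholder
# assumptions, not real benchmark results.
def perf_per_euro(score, price_eur):
    return score / price_eur

budget = perf_per_euro(100, 150)     # hypothetical 150 EUR chip, score 100
extreme = perf_per_euro(130, 1500)   # hypothetical 1500 EUR chip, ~30% faster
print(round(budget / extreme, 1))    # 7.7 -> the cheap chip gives ~7.7x the score per euro
```

Even if the fast chip were 50% faster, the cheap one would still win the score-per-euro race by a wide margin, which is exactly the "MASSIVE gap" argument above.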
Here is a nice benchmark with even cheap single-core CPUs vs the expensive multi-core and multi-thread CPUs.
The same goes for mobile phones. I buy 150€ phones from China. Those phones with Android 4.0 can do everything your beloved iPhone does (and more, Android FTW). While it is a tad slower, the 700€ difference is just not worth it. You never ever need that much power on a damn phone for listening to music and surfing the web... and a damn lot of guys buy a new phone every 1-2 years anyway.
Since there is nothing more to be said on topic, I thought I'd comment a bit on this "benchmark".
This is one helluva idiotic test. DirectX didn't support multithreaded display lists until version 11, so you cannot really benefit from improved rendering speed on multicore CPUs. GPU drivers are optimized for parallel processing, but they can only do so much when the rendering engine is ultimately running as a single thread. The fact that gaming engines have yet to fully embrace the power of parallel processing is one of the reasons why Bulldozer performs so poorly at gaming: there is no point in having 8 cores when the game cannot use more than two. In workloads that can actually take advantage of all available cores it performs reasonably well.
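The single-thread bottleneck argument is essentially Amdahl's law, which can be sketched like this (the parallel fractions chosen below are illustrative, not measured for any real engine):

```python
# Amdahl's law: theoretical speedup from n cores when only a fraction p
# of the work can run in parallel. The p values below are illustrative.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A mostly single-threaded game engine (say p = 0.3) barely gains
# anything from doubling 4 cores to 8:
print(round(speedup(0.3, 4), 2))   # 1.29
print(round(speedup(0.3, 8), 2))   # 1.36
# A heavily parallel workload (p = 0.95) keeps scaling:
print(round(speedup(0.95, 8), 2))  # 5.93
```

That is the whole AMD-gaming story in one formula: extra cores only pay off when the workload's parallel fraction is high.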
BTW, there are no "threads" on the CPU; "logical CPU" is the term you're looking for.
Not quite; they have plenty of competition, although at the low-performance, low-power-consumption end of the scale.
Also, the 39XX series is pretty much useless for anything other than scientific work, or for people with too much money to waste.
Not sure how you come up with those numbers, since Z77 motherboards are actually cheaper than 990FX ones and the RAM Bulldozer requires is a bit more expensive as well, so I come up with a price difference of pretty much zero for a complete system.