Care to explain, or are you just a blind Intel fanboy?
The difference with an i3 is +/- a few fps, no big difference. Plus, the FX-4100 is really easy to OC; you can probably get +1 GHz just by raising the multiplier, no need to even add voltage.
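Just to put rough numbers on that multiplier claim, here's a minimal sketch; the ~200 MHz reference clock and the stock x18 / overclocked x23 multipliers are assumptions for an FX-4100, so check your own board's BIOS for the real values:

#include <stdio.h>

int main(void) {
    /* Assumed FX-4100 figures: ~200 MHz reference clock, stock x18 multiplier. */
    const double ref_clock_mhz    = 200.0;
    const int    stock_multiplier = 18;  /* 200 MHz * 18 = 3600 MHz stock           */
    const int    oc_multiplier    = 23;  /* 200 MHz * 23 = 4600 MHz, roughly +1 GHz */

    printf("Stock clock: %.0f MHz\n", ref_clock_mhz * stock_multiplier);
    printf("OC clock:    %.0f MHz\n", ref_clock_mhz * oc_multiplier);
    return 0;
}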
Why would you want an i3 in a mid to high-end gaming PC? The comparison is totally irrelevant IMO. I think you're better off with an Ivy Bridge i5/i7 and spending some more bucks for more performance and less power usage.
Most high-performance applications (Photoshop, 3DS Max, Vegas, et cetera) are compiled with the Intel C++ Compiler, which enables a bunch of significant optimizations when the app is running on an Intel CPU. Benchmarking apps, on the other hand, are not. So even if some AMD CPU comes close to a similarly-priced Intel CPU in a benchmark, it'll still suck in real-life usage scenarios.
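For the curious, this is roughly the kind of vendor check a CPU dispatcher can key off. It's only an illustration, not the Intel compiler's actual dispatch code, and it assumes a GCC/Clang x86 build since it uses their <cpuid.h> helper:

/* Read the CPUID vendor string and branch on it -- illustration only. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX (in that order). */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    if (strcmp(vendor, "GenuineIntel") == 0)
        printf("Intel CPU detected: an optimized code path could be selected\n");
    else
        printf("Non-Intel CPU (%s): generic code path\n", vendor);
    return 0;
}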
Oh, by the way, the $180 i5-2500K is faster than the $190 AMD FX-6200 in both single-threaded and multi-threaded benchmarks.
E.Reiljans, that $180 2500K looks kinda weird; you can't even get it for that money if it were in euros. In Estonia it's 191€, now put that in dollars; the FX-6200 is 147€.
On the same page you gave for the FX-6200, the 2500K is $235.66.
Not a big difference in absolute terms, but if you are doing a budget build, that's a huge difference.
Just a word of advice: don't go for ATI video cards. They are very appealing with their cheap prices and good specs, but when it's time to perform they always fail at something (I speak from my own experience).
Most games are developed with Nvidia drivers/graphics in mind, so you can bet your money that they will perform better on an Nvidia chip. With ATI cards it's a bit of a gamble: they can work wonderfully well in one game and suck very hard in another game with the same requirements (again, I speak from my own experience).
I think it's something worth considering.
I'm gonna have to go ahead and disagree with you there. ATI cards are alright; they won't fail at anything. Look at the benchmarks, they are right up there with Nvidia. It's true that in some games Nvidia cards perform better than ATI cards, but it's also true that in other games ATI cards perform better than Nvidia cards. You can't just say that ATI makes bad cards; it's simply not true. It's up to the maker of the game to decide which cards to optimize their game for.
I've had fans fail on a few ATI cards that I've had over the years. They're pretty much on par with each other, but Nvidia seems more reliable and worth the extra 50-100 bucks in the long run, IMO. That's personal preference though.
In my experience Nvidia is more compatible with games and other computer hardware. I've had multiple ATI cards that just didn't work with the other hardware on my motherboard. That says nothing about which is better in performance, though.
As for a computer: I bought a 700-euro machine for less than 300 euro.
This is because my boss was buying more than 10 of these for his business. Maybe you can do the same and save a pretty penny. Of course this is illegal in Belgium, because everything you write off on your taxes has to be present at your company.
With the 400 euro I saved on the system itself, I bought myself a 300-euro graphics card and a 200-euro monitor. (My limit for full system upgrades is always 1000 euro.)
This is just a hint for all people who work full-time who might be able to do the same deal with their boss.
Otherwise my next upgrade will be overclocking every clock (RAM, CPU, GPU) to 2x its speed, just by submerging my PC! http://www.pugetsystems.com/submerged.php
I have the ATI 5770 and I've never had a problem with any game I've tried. I play BF3 at a high fps, plus iRacing, BFBC2 and many others. I've also had many Nvidia cards in the past and they were good enough too.
I didn't say ATI cards were bad, man. Read the post carefully: I said they will fail at something, because that's the experience I've had over the years with ATI cards. I owned 4 ATI cards before this new Nvidia I have, and all of them disappointed me in something (the biggest problem being the lack of decent drivers from AMD/ATI that make the card work 100% with everything).
But I'll give an example: my last ATI card was an HD 5770, not a high-end card, but not too shabby either. For a 1 GB GDDR5 card I expected much more from it. I could run Skyrim on medium graphics at 1920x1080 and it was fine, just as I could run Mass Effect 3 and a few others on medium settings with good FPS. But GTA IV = 20 fps, TDU2 = 15 fps, APB Reloaded = 25-30 fps, Deus Ex = 25-30 fps... not acceptable for a card with those specs!
The cards themselves are good; it's the drivers that are not, which makes ATI a gamble. That's all I meant in my earlier post.
I have a 5770 and have none of those issues; high fps in everything I've tried. Maybe something is set wrong, or something else on your system is causing issues. If you do a search on Google you'll find that GTA IV has been an issue for a lot of people regardless of which GPU brand they are using, and the same goes for Test Drive Unlimited 2.
For me BF3 is considered a game that can have an impact on fps, but I have the settings on full and have no problems with fps. One setting in the CCC that can have a bad impact on fps is Anti-Aliasing Mode: it only affects my fps in LFS, but if I set it to super-sample AA my fps drops by about 100, and only in LFS have I ever experienced this.
If you look at the games that have poor performance, you will see they are horrible console ports and thus not optimised. If you took a similar card from Nvidia you would see low FPS in those games too.
I didn't read much of this thread, but sticking to the opening post, I'd go for this (pic attached). Unless you have more than one monitor, there's no real need for more than one GPU. I'm also assuming you can use your current hard drives.
So: CPU = 8 cores,
GPU = enough to play today's games,
RAM = 16 GB DDR3.
The i5-3570Ks should be in stock now, at least they are over here.
And for the love of all that's holy, don't buy a Bulldozer. They may be temptingly cheap, but that's for a good reason. They aren't 8-core as AMD claims (a quad core with four virtual cores in reality), they are slow as s**t when it comes to single-threaded apps, and they have MASSIVE power consumption.
You should probably get an AMD/ATI graphics card at the moment though, as Nvidia are taking their sweet time releasing mid-range Kepler cards and AMD are dropping their own card prices to counter Nvidia's Keplers. So something like the HD 7850 or HD 6870 should fit in your budget just fine.
Get over yourself. Help the guy or simply **** off.
EDIT: Even if the Bulldozers are as bad as people say, swap it for a high-end Phenom. I've been using the 550 (dual core) and it's only really starting to show its age. It's still AM3+, so you could drop the latest chips into it further down the line. The reason for so much RAM is that I'm thinking you'd like to dabble with Ableton and the like. RAM + CPU are important for that.
Let's not turn this thread into something about you, OK? Make your own thread and I'll go there and call you all sorts of wonderful things. You've now made two posts that help nobody.
Nothing nice to say? Then STFU. Simples...
See? That's why I never ask for advice on the Interwebs; I don't feel like turning my problems into flamewar fuel.
If you're not already up to speed with the Bulldozer architecture and you don't feel like googling it, Bulldozer:
- Is a real 8-core CPU
- Has 4 modules with 2 cores each
- Each module has a shared L2 cache and FPU. The shared FPU only becomes a bottleneck when both cores want to execute a 256-bit SIMD instruction at once (today only AVX is that wide)
- Scales much better than Intel's HT, with about 80% of the effectiveness of a full-blown 8-core CPU (rough numbers in the sketch after this list)
- Works best when accompanied by a Bulldozer-aware CPU scheduler. It doesn't really help performance, but it lowers power consumption a lot; Windows 8 seems to have such a scheduler.
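Here is a back-of-the-envelope sketch of that scaling bullet. The 80% figure is the one quoted in the list above; the ~25% Hyper-Threading gain and the FX-8150 / i7-2600 names are my own illustrative assumptions, not measured numbers:

#include <stdio.h>

int main(void) {
    const double cmt_scaling = 0.80;  /* ~80% of a "true" 8-core, per the post above      */
    const double ht_gain     = 0.25;  /* assumed ~25% throughput gain per HT thread       */

    double fx_effective = 8 * cmt_scaling;      /* e.g. FX-8150: 4 modules, 8 cores       */
    double i7_effective = 4 * (1.0 + ht_gain);  /* e.g. i7-2600: 4 cores, 8 threads       */

    printf("Bulldozer (CMT): ~%.1f cores' worth of multi-threaded throughput\n", fx_effective);
    printf("Quad + HT (SMT): ~%.1f cores' worth of multi-threaded throughput\n", i7_effective);
    return 0;
}

With those assumptions the FX lands around 6.4 cores' worth of multi-threaded throughput versus roughly 5 for a Hyper-Threaded quad, which is where the "scales much better than HT" claim comes from.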
Bottom line: if you're looking to save as much money as possible and you don't build your life around dominating 3DMark leaderboards, Bulldozer is not as bad a choice as some "experts" here would like you to believe. It has a quite poor instructions-per-cycle ratio though, so some unoptimized single-threaded apps won't run as well as they do on Sandy Bridge chips or even older Phenoms.