With the dozens of different configurations that call themselves "GT 640", it's pretty important to specify which one was used in these tests. GF116? GK107? GK208? GDDR3? GDDR5?
I still find this article interesting even if IGPs are certainly not the main focus of gamers. I don't consider myself a hardcore gamer, but I don't game on an IGP. I am currently using a GTX 560, which provides me with decent performance in pretty much any situation. On the other hand, the article gives an idea of the progress made by IGPs. I certainly would enjoy more performance from the one I am using at work, which is a GMA 4500 paired with an E8400. There are markets for good IGPs, but gaming is not one of them. As I see it, IGPs are more suited to be paired with low- to mid-range CPUs, which would make for a very decent all-around machine.
Looks like you used a 65W GT640, released just over a year ago. You could have used the slightly newer and faster 49W or 50W models, or a 65W GTX640 (37% faster than the 65W GT640). Better still, a GeForce GT 630 Rev. 2 (25W) with the same performance as a 65W GT640! (I'm sure you don't have every GPU that's ever been released lying around, so just saying what's there.)
An i7-4770K, or one of its many siblings, costs ~$350. For most light gaming and GPU apps, the Celeron G1610T (35W) along with a 49W GT640 would outperform the i7-4770K. The combined wattage is exactly the same - 84W - but the relative price is $140! Obviously the 25W GK208 GeForce GT 630 Rev. 2 would save you another $20 and give you a combined TDP of 60W, which is 40% better than the i7-4770K's. It's likely that there will be a few more GT600 Rev. 2 models, and the GK700 range has to fill out. Existing mid-range GPUs offer >5 times the performance of the i7-4770K's iGPU. The reasons for buying an i7 still have little or nothing to do with its GPU!
I shudder to think what an A10 Kaveri can bring to the table, considering it'll be equipped with AMD's GCN architecture and additional IPC improvements. Low price + 4 cores + (possibly) hybrid Crossfire with a 7xxx series Radeon? A great starting point for a decent gaming rig. Not to mention that the minimum baseline for PC gaming will rise from decent to respectable.
Sometimes I really don't understand your comparisons, and even less the conclusions. Why compare a Richland to a Haswell when obviously they will get used for totally different purposes? Who will purchase a desktop Haswell without a graphics card for gaming? Why use super expensive 2133 memory with a super bad processor?
There are really 3 conclusions to be had:
- CPU-wise, Richland sucks aplenty.
- GPU-wise, there is next to no progress compared to Trinity, the difference being fully explained by a small frequency increase.
- If you want cheap desktop gaming, you will be much better served by a Pentium G2020 + Radeon HD 6670 or HD 7750 for the same price as a crappy A10-6800 or A10-6700.
You make me laugh. I normally do not post comments on these things, since I read them just to get a laugh, but I do have to point out how wrong you are. I have a G1620, G2020, i3-3240, A8, A10 and more, and have run benchmarks with a 6450, 6570, 6670, 7730, 7750 and 7770 for budget gaming builds for customers. Your build of a G2020 with a 6670 was, in my tests, beaten hands down by the A10-6800K in hybrid crossfire with a 7750 (yes, I said it: hybrid crossfire with a 7750; it can be done, although not officially supported by AMD). A G2020 with a 6670 will run you about $130, and an A10 with a 7750 is about $230. To match the A10 + HXF 7750 ($230 value) performance with Intel I had to use a 7750/7770 or higher with the Pentiums, and an i3 + 7750 ($210 value) did quite well but was still beaten in quite a few graphics-related things.
Point being, a discrete GPU changes the whole aspect of the concept. The i3 + 7750 is very close to the A10 + HXF 7750 in more ways than just performance, but that's not the point of this topic. It was AMD 8670D vs Intel HD 4600. I know lots of people that buy an Intel i5 or i7 and live off the iGPU, thinking one day they will have the money to get a nice GPU and call it good; 60% of the time this does not happen, as new tech comes out that's better and they just change their minds and try to get a whole new system. The APU, on the other hand, would have been cheaper and performed better for what they needed, had they just gone that road, and I am not the only one that came to that conclusion.
AMD has done a great job with the APU, and after testing many myself, I have become a believer. A stock i5 computer for $700 got smashed by a stock A10 at $400 in CS6, sitting side by side; I could not believe it. I do not have to argue how well the APU is doing, because Microsoft and Sony have already done it. So I leave with a question: if the APU was not a fantastic alternative that delivers a higher standard of graphics performance, then why is it going to be used in the Xbox One and PS4?
This is a slanted review. The i7 with the separate Nvidia card skews the results, perhaps erroneously, toward Intel. How about the A10 with the same separate Nvidia card and/or the comparable separate AMD video card? The performance difference can be quite drastic.
IMHO, one should compare apples to apples as much as possible. Doing so yields a much more complete comparison. I realize that these APUs tout their built-in graphic abilities, but Intel is trying to do so as well. It's the only way to give the CPU part of the APU a fair shake. That or leave the i7-Nvidia results out completely.
With Xeons you are getting into multi-processor boards, which brings up a question I have been wondering about. Does AMD have any plans to make their new APUs multiprocessor- and Crossfire-capable? At that price point I wouldn't mind buying two of them to stick on a motherboard...
coder543 - Thursday, June 6, 2013 - link
Why are we not testing versus Crystalwell-enabled Iris Pro 5200? This is the most important information, even if it isn't in the same price category necessarily.
testbug00 - Thursday, June 6, 2013 - link
Because that makes no sense as a testing point. The i7 tested is not in the same price range either (an i3 would probably lose a lot of the partial-CPU stuff), but the iGPU performance barely changes.
And how is that the most important information? If anything it is the least important, as you cannot buy an Iris (Pro) 5x00 iGPU on the desktop unless it's embedded.
JarredWalton - Thursday, June 6, 2013 - link
The bigger factor is that Iris Pro simply isn't available as a desktop part -- it's only available in the BGA package i7-4770R, which OEMs can use in things like all-in-one PCs. The other place where we'll see Iris Pro (for now) is on laptops with the HQ series parts, but again that's not going up against desktops. GT3 and GT3e effectively don't exist as desktop offerings right now, but that's not too surprising, as laptops stand to benefit most from improved iGPUs.
jeffkibuule - Thursday, June 6, 2013 - link
On the desktop, it's only in one SKU that's only for OEM systems.
FriendlyUser - Thursday, June 6, 2013 - link
Let me remind you that the Iris 5200 is a $650 part. In fact, the ONLY situation where the Crystalwell part makes sense is when TDP and power requirements/battery concerns are the absolute priority. Otherwise, it's much cheaper to get a non-Iris part and a separate mobile GPU (say, a Radeon 8970M) that offers vastly superior performance.
mikk - Thursday, June 6, 2013 - link
No, this is wrong. Iris Pro starts at 440 USD in mobile. Crossfire is not comparable, since you get horrible micro stuttering.
Hrel - Thursday, June 6, 2013 - link
They fixed the stuttering a long time ago, mikk.
Gigaplex - Thursday, June 6, 2013 - link
It wasn't that long ago, and it's still not completely fixed.
Guspaz - Friday, June 7, 2013 - link
They improved the stuttering for single-GPU use. Both nVidia and AMD suffer from micro stuttering with multi-GPU solutions. It's a different problem set.
Samus - Friday, June 7, 2013 - link
AMD's stuttering is worse than nVidia's, but it is easily fixed by adding a third GPU. So instead of using two high-end cards in Crossfire/SLI, using three mid-high-end cards virtually eliminates stutter.
FriendlyUser - Thursday, June 6, 2013 - link
Indeed, there is a $468 part. You can still fit a decent dGPU and a decent CPU on that budget for, once again, vastly superior performance. And you don't need Crossfire, but you do lose on power consumption, which is the only point the Iris has going for it.
I wonder how much of a discount OEMs generally get from Intel. 30% off the $440 tray price, i.e. ~$308/chip? If the CPU used to cost them $200, plus $100 for the GPU, I guess the space savings of a 2-in-1 solution and lower power usage, while giving similar performance, are going to be attractive enough.
testbug00 - Friday, June 7, 2013 - link
My desktop cost less than that... Mine is probably only a little slower, even with a 1.1GHz GPU and 4.4GHz CPU (my A10-5800K w/ 1866 OCed to 2133).
Sabresiberian - Friday, June 7, 2013 - link
Yah, for me, the only consideration for a system with on-die CPU graphics is if I buy a low-end notebook that I want to do a little gaming on, and the chips with Iris price themselves out of that market. I've recommended AMD for that kind of product to my friends before, and I don't see any reason to change that.
Sabresiberian - Friday, June 7, 2013 - link
What does Crossfire have to do with it? Using on-die graphics with an added discrete card doesn't have anything to do with Crossfire.
max1001 - Friday, June 7, 2013 - link
Because AMD likes to call the APU+GPU combo Hybrid Crossfire.
Spunjji - Friday, June 7, 2013 - link
Who said anything about Crossfire?!
MrSpadge - Thursday, June 6, 2013 - link
No, Crystalwell also makes sense on any high-performance part, be it the topmost desktop K-series or the Xeons. That cache can add ~10% performance in quite a few applications, which equals 300 - 500 MHz more CPU clock. And at $300 there'd easily be enough margin left for Intel. But no need to push such chips...
Gigaplex - Thursday, June 6, 2013 - link
There isn't a single K-series part with Crystalwell.
mdular - Thursday, June 6, 2013 - link
As others have already pointed out, it's not the "most important information" at all. Crystalwell isn't available on a regular desktop socket.
Most importantly though, that is also for a good reason: who would buy it? At the price point of the Crystalwell-equipped CPUs you would get hugely better gaming performance with an i3/i5/FX and a dedicated GPU. You can build an entire system from scratch for the same amount and game away with decent quality settings, often high - in full HD.
There is a point to make for HTPCs and gaming laptops/laplets, but I would assume that they don't sell a lot of them at the Crystalwell performance target.
Since the article is about desktops, however, and considering all of the above, Crystalwell is pretty irrelevant in this comparison. If you seek info on Crystalwell performance, I guess you will know where to find it.
Death666Angel - Sunday, June 9, 2013 - link
"Who would buy it?" If it was just the added cost of the eDRAM put on top of the -K SKU (so 50€ or something on top of the i5-4670K and i7-4770K), I'd buy it in a heartbeat. First of all, it offers better QS functionality, and second of all, the 128MB L4 cache is not exclusive to the iGPU but can be used by the CPU as well, which offers some serious performance gains in programs that can make use of it.
shinkueagle - Sunday, June 9, 2013 - link
Because it's stupid to make such a comparison... And even more stupid of you to bring up such NONSENSE....
gfluet - Monday, June 10, 2013 - link
Mostly because there are no desktop Crystalwells yet, and the comparison is between socketed CPUs.
But yeah, I look forward to when AnandTech gets a review model of the i7-4770R. I want to put one of those in a supercompact system.
Ewram - Thursday, June 6, 2013 - link
Excuse me, but what is the MSRP of the A10-6800K versus the i7-4770K? Also, wouldn't benchmarks also be affected by CPU performance to at least some extent?
3DoubleD - Thursday, June 6, 2013 - link
Given how GPU and memory bandwidth limited these systems are, I'm sure the difference in CPU performance plays only a small if not negligible role in the final score.
Even if we were talking about a single 7970, the difference between AMD and Intel was pretty insignificant: http://anandtech.com/show/6985/choosing-a-gaming-c...
CannedTurkey - Thursday, June 6, 2013 - link
The i7-4770 is roughly double the price of the A10-6800.
BSMonitor - Thursday, June 6, 2013 - link
MSRP really isn't a valid comparison here, as they are at entirely different price points/target audiences. The point is to test the iGPU capability.
AMD and Intel have very different approaches to iGPUs and processor SKUs today. AMD and its Fusion parts are specifically targeting low price points, where AMD believes the value of an iGPU is most attractive. The CPU cores are similar to its FX line, but it's an entirely different die than its flagship desktop parts, which have NO iGPU whatsoever.
Intel on the desktop for the most part has a single die for all its mainstream Core i7's down to budget Core i3, Pentiums. The Core i7's iGPU isn't really focused on giving a budget gaming experience. And this is where Anand's criticism is aimed. They could make an amazing APU with a very balanced iGPU and CPU in the high-end desktop parts but have chosen not to. It would seem the powers that be have decided there is no market for Iris Pro in its high-end desktop parts.
MSRP would be a valid comparison in the Mobile Core i7 with Iris Pro vs the Richland Mobile parts.
silverblue - Thursday, June 6, 2013 - link
Perhaps. In that case, the price of the CPU would be partially obscured by the total BOM. If Iris Pro is that good, and you got double the performance for twice the price, it wouldn't be too bad.
Concillian - Thursday, June 6, 2013 - link
"Intel on the desktop for the most part has a single die for all its mainstream Core i7's down to budget Core i3, Pentiums."
Not true. Dual core and quad core have had different silicon since they started the i3 / i5 / i7 naming convention. I'm no mobile expert, but I know that on the desktop the i3 has never had the same die as the i7.
eanazag - Thursday, June 6, 2013 - link
A10-6800K is sitting around $150 on Newegg, while the 4770K is pushing $349 daisies. The comparison is still sensible and useful. Spend less money on an Intel CPU and the clocks go down. So in an iGP setting for gaming AMD makes more sense, but if you throw a discrete card in the mix you'll have to rethink what your goals are. After staring at those prices, for a gaming-only rig I might rather spend the price difference on a discrete card and call it a day if the monitor resolution is 1080p or less.
BSMonitor - Thursday, June 6, 2013 - link
No, the comparison is absolutely meaningless. You are saying that someone's decision to buy a Core i7 4770 is influenced by the iGPU. It is not.
Ortanon - Thursday, June 6, 2013 - link
The comparison isn't for people buying Core i7, lol. You wouldn't need a comparison if you were already going to buy it. The comparison is exactly what it says: Radeon HD 8670D vs. Intel HD 4600.
BSMonitor - Thursday, June 6, 2013 - link
That's what I said. He is saying the price between the two should also matter. And as you say, people buying a Core i7 are not comparing iGPUs.
BSMonitor - Thursday, June 6, 2013 - link
The comparison in question is not the article... It's the price comparison; go back to the beginning of the thread.
silverblue - Thursday, June 6, 2013 - link
Unless you're very much interested in QuickSync, that is.
jjj - Thursday, June 6, 2013 - link
1080p monitors can be found even below $100, so there isn't really a point in reviewing anything desktop below 1080p. Going lower to find where a game becomes playable is fine, but the review should have 1080p tests even if the products are not good enough.
Would be nice if Kaveri would double the SP count, but AMD might be going for a smaller die to cut costs given their difficult financial situation. It wouldn't quite match the Xbox in perf, but it would be close enough and could do a decent job at playing console ports for the next few years.
DigitalFreak - Thursday, June 6, 2013 - link
Why test at a resolution where you're not going to get playable frame rates? If you can only get playable frame rates @ 768p by running medium quality, I'm pretty sure it's going to be unplayable at anything other than low / minimum @ 1080p.
jjj - Thursday, June 6, 2013 - link
That's one of the points: "you are pretty sure", not sure, because the review doesn't do its job of showing you for sure, and if you want a clear picture you need to look elsewhere.
Death666Angel - Thursday, June 6, 2013 - link
Considering that nothing here is playable at 900p, it is quite possible to extrapolate that 1080p won't be playable either. So I'm pretty fine with them not testing it. If you get a $150 APU to play the latest games at 1080p (a resolution much larger than current consoles support in gaming, might I add), you are deluded.
britjh22 - Thursday, June 6, 2013 - link
Are the tests currently set up to show higher res but lower detail settings? I know there is a set of benchmark settings that they use to normalize, which is fine for high-end CPU/mid- to high-end GPU testing. If I remember correctly, the settings, as they climb in "quality" (low, medium, high), increase both resolution and detail concurrently. With Trinity/Richland and eventually Kaveri, it would be interesting to see if these APUs can handle recent games at higher resolution but lower detail settings. Essentially, can you get any recent games to play at common resolutions, even if you have to crank down settings?
whatthehey - Thursday, June 6, 2013 - link
I can't imagine anyone really wanting minimum quality 1080p over Medium quality 1366x768. Where it makes a difference in performance, the cost in image quality is generally too great to be warranted. (e.g. in something like StarCraft II, the difference between Low and Medium is massive! At Low, SC2 basically looks like a high-res version of the original StarCraft.) You can get a reasonable estimate of 1080p Medium performance by taking the 1366x768 scores and multiplying by 0.51 (there are nearly twice as many pixels at 1080p as at 1366x768). That should be the lower limit, so in some games it may only be 30-40% slower rather than 50% slower, but the only games likely to stay above 30FPS at 1080p Medium are older titles, and perhaps Sleeping Dogs, Tomb Raider, and (if you're lucky) Bioshock Infinite. I'd be willing to wager that relative performance at 1080p Medium is within 10% of relative performance at 1366x768 Medium, though, so other than dropping FPS the additional testing wouldn't matter too much.
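A quick Python sketch of that back-of-the-envelope estimate (the 768p frame rates below are hypothetical placeholders, not numbers from the review; the 0.51 factor and the "30-40% slower" band come from the comment above):

```python
# Pixel-count scaling estimate: treat the x0.51 factor as the fully
# GPU-bound worst case, and "only 30-40% slower" as the less-bound case.
low_res = 1366 * 768    # 1,049,088 pixels
full_hd = 1920 * 1080   # 2,073,600 pixels
worst_case = low_res / full_hd  # ~0.51

print(f"1080p has {full_hd / low_res:.2f}x the pixels of 1366x768")
for fps_768 in (30, 40, 60):    # hypothetical 1366x768 Medium results
    print(f"{fps_768} fps @ 768p -> roughly {fps_768 * worst_case:.0f}"
          f" to {fps_768 * 0.65:.0f} fps @ 1080p")
```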
THF - Friday, June 7, 2013 - link
You're wrong. Most people who take StarCraft 2 seriously are actually playing at the highest resolution they can get, with the low detail setting. Sure, the game looks flashier with more detail, but it's easier to play with less. All pros do it.
Also, as for myself, I like to have games running at the native resolution of the display. It makes "alt-tabbing" (or the equivalent thereof on Linux) much more responsive.
Calinou__ - Friday, June 7, 2013 - link
+1. "Low" in today's AAA games is far from ugly if you keep the texture detail at maximum.
tential - Thursday, June 6, 2013 - link
This is my BIGGEST pet peeve with some reviewers, who will test 1080p and only show those results when testing these types of chips. All of the frame rates will be unplayable, yet they'll try to draw "some conclusion" from the results. Test resolutions where the minimum frame rate is like 20-25 fps for the contenders, so I can see how smooth it actually will be when I play.
I didn't purchase an IGP solution to play 10 FPS games at 1080p. I purchased it to play at low resolution with OK frame rates.
zoxo - Thursday, June 6, 2013 - link
I always start by setting the game to my display's native res (1080p), and then find out at what settings I can achieve passable performance. I just hate non-native resolution too much :(
taltamir - Thursday, June 6, 2013 - link
Because you render at a lower res and upscale with iGPUs.
jamyryals - Thursday, June 6, 2013 - link
Love that die render on the first page. It's dumb, but I always like seeing those.
Homeles - Thursday, June 6, 2013 - link
It's not dumb. You can learn a lot about CPU design from them.
Bakes - Thursday, June 6, 2013 - link
Not that it really matters, but I think he's saying it's dumb that he always likes seeing those.
Gigaplex - Thursday, June 6, 2013 - link
Homeles' comment could be interpreted in a way that agrees with you and says it's not dumb to like seeing them, because you can learn lots.
taltamir - Thursday, June 6, 2013 - link
"Richland maintains a 17 - 50% GPU performance advantage (~30% on average) over Intel's HD 4600 (Haswell GT2)"
And yet it consumes more than 2x the power, according to your own charts.
And what about the CPU performance? These are desktop parts, not laptop parts; their iGPU performance is meaningless.
silverblue - Thursday, June 6, 2013 - link
Did you skip straight to the conclusion? The reason for the lack of CPU benchmarks is on the first page.
How much power would a 4770K with a GT640 use, incidentally? And at the other end of the scale, what about that 4600M which is rated at 35W yet in a couple of tests beat even the 4770K with its HD 4600? You're asking for results that Anand hasn't managed to grab just yet, for reasons as stated on the first page.
There are some strange results in here. In the 3DMark: Fire Strike Extreme test, all three APUs have the same result, but in 3DMark06, the 6800K significantly beats everything else. However, regardless of one or two oddities, the 6800K isn't a real progression over the 5800K... but it was never really made out nor expected to be.
FriendlyUser - Thursday, June 6, 2013 - link
Who cares about power on the desktop? What are you running, a server farm? We're not talking about a 200W part here. The Richland is easy enough to cool as it is. Just because Intel based its strategy around a mobile part doesn't mean we have to run behind absolute power/performance ratios. Price/performance makes more sense for the average user.
Also, the iGPU is very important at that price point. If a $150 CPU saves you an $80 GPU, it's quite attractive. USA readers can probably afford to spend $200 on a dGPU, but struggling European economies and the developing world are a big part of the international market.
ChoadNamath - Thursday, June 6, 2013 - link
"iGPU performance is meaningless" ...except in an APU, where it's pretty much the whole point.
shing3232 - Sunday, June 9, 2013 - link
It would not be meaningless for a low-end desktop, and it is not 2x the power consumption in the real world. They also have the A10-6700 at 65W TDP.
whyso - Thursday, June 6, 2013 - link
Well, I think this also shows how close mobile will be. GT2 at slightly lower speeds than the 4770K (say 1200 MHz and 3MB cache for an i5/i3 vs 1300 MHz and 8MB cache for the 4770K) will be about 10-15% slower. Mobile Trinity is going to be approximately equal to Haswell mobile GT2, and Richland may be slightly ahead, but the gap is largely gone.
FriendlyUser - Thursday, June 6, 2013 - link
I have always wondered why anyone paying $330 for a high-end CPU would care for a barely adequate iGPU. It's much more reasonable to expect that people looking at the $150 price point will appreciate an iGPU, especially one that is quite decent. The cheapest GT640 I could find was ~$85 (local price), which is no small change. And don't think that the GT640 will get the same scores if paired with the i3...
whyso - Thursday, June 6, 2013 - link
Yes it will; the 640 is way too weak to be bottlenecked by an i3.
Spunjji - Friday, June 7, 2013 - link
Indeed. It'll still cost more, though!
BSMonitor - Thursday, June 6, 2013 - link
The vast majority of these CPUs do not go to gamers... Their performance is more than acceptable for a large number of use cases if an OEM doesn't want to include a dGPU. However, for $330 you get CPU performance that AMD cannot touch at ANY price point, nor in performance/watt.
FriendlyUser - Thursday, June 6, 2013 - link
I see your point. But how many of these users need the CPU performance of the 4770? Do you think that the average business user needs a 4770 to do Excel and answer emails? Will he even notice the difference? I can't really show you statistics, but I imagine that a big part of demanding users are in fact gamers.
JDG1980 - Thursday, June 6, 2013 - link
I understand why the average Intel CPU has an integrated graphics processor, but the K-series parts are specifically targeted at enthusiasts. Why don't they omit the IGP from those?
Gigaplex - Thursday, June 6, 2013 - link
QuickSync is one possible answer. Another is that enthusiasts tend to swap out GPUs more frequently than other demographics, so having a basic iGPU can come in handy for diagnostics now and then. And not all enthusiasts are gamers.
dbcoopernz - Thursday, June 6, 2013 - link
Are you going to look at HTPC performance for Richland? e.g. madVR, refresh rate timings.
RoyYoung - Thursday, June 6, 2013 - link
Thanks for squeezing this out with such a short turnaround. However, I just don't think this is useful or new. I have never met anyone in the market for an i7 xxxxK CPU looking to play AAA games using the iGPU; have you? The iGPU in the i7 is just a bonus because it shares a die with the mobile counterpart, and it gives you Quick Sync if you prefer speed over quality in your transcoding.
In today's market, the only reason to invest in the space, noise, heat, and money for a desktop gaming PC is to play games at 1080p or higher. Just get an Xbox if you need 720p. From the benchmarks it's clear that neither the i7 Haswell nor the Richland is playable at 900p, let alone 1080p. On the other hand, the same tests at 768p on mobile Richland and Haswell parts make perfect sense given the typical resolution and thermals of laptops. Given the power usage delta between the AMD and Intel desktop parts, I suspect the race is going to be a lot closer on laptops.
kallogan - Thursday, June 6, 2013 - link
Richland is really lame. I mean, it brings barely more than peanuts over Trinity. Why even release it? And it's expensive. Desktop parts are really boring right now.
kyuu - Thursday, June 6, 2013 - link
I'd be interested in seeing if you can get a stable RAM clock @ 2400MHz, and if so, how much Richland scales with that. Hope you take a look when you do the more thorough piece.
johnny_boy - Thursday, June 6, 2013 - link
APUs scale very well with faster memory, almost regardless of timings. I'd like to see Richland benchmarks with DDR3 2400, though I can already make a pretty good guess of what those figures would look like.
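The scaling described above follows directly from raw bandwidth: the iGPU feeds off the same memory controller as the CPU. A minimal sketch of the peak numbers, assuming a standard dual-channel, 64-bit-per-channel DDR3 configuration (sustained bandwidth is lower in practice):

```python
# Peak theoretical DDR3 bandwidth: transfers/s x bus width x channels.
# Assumes dual-channel, 64-bit (8-byte) channels, as on FM2 boards.
def ddr3_peak_gbs(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

for speed in (1600, 1866, 2133, 2400):
    print(f"DDR3-{speed}: {ddr3_peak_gbs(speed):.1f} GB/s peak")
# DDR3-2400 has ~12.5% more raw bandwidth than DDR3-2133, which is why a
# bandwidth-starved iGPU keeps scaling almost linearly with memory clock.
```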
Samastrike - Thursday, June 6, 2013 - link
So what I'm seeing here is very similar GPU performance to Trinity for $20 more? Except in 3DMark06, where it suddenly has a huge jump. Doesn't strike me as worth it.
firewall597 - Thursday, June 6, 2013 - link
I'm a little disappointed that AMD cared to release this as a new model generation at all. There's barely enough argument to avoid throwing the "rebranding" flag. Shoulda just fit the upclocked parts appropriately into the current gen's numbering and adjusted prices accordingly.
The effort is always appreciated, but the marketing is somewhat misleading on the surface.
jrs77 - Thursday, June 6, 2013 - link
Sorry to say, but if you wanna play games, then grab a dedicated GPU. No one plays modern games on integrated GPUs. iGPUs are only good for some casual gaming that doesn't demand a powerful GPU.
The only thing I'm interested in is the CPU part, and Intel wins this comparison by a mile. AMD needs better CPU performance or they'll never win me back as a customer for desktop parts.
Voldenuit - Thursday, June 6, 2013 - link
Exactly. I'm mystified by Richland's existence. Exactly where does it make sense for desktop users?
If you're gaming on a desktop and you're on a budget, it's more expensive than the outgoing A10 5800K and not much faster. Nor does it make any games playable that were unplayable before on the older APU.
If you're gaming on the desktop and budget is not the biggest factor, why even bother looking at AMD parts right now?
If you're planning an HTPC build, the 100W TDP is too high. Get a Haswell or an older (and lesser model number) Trinity APU.
If you plan to build your own all-in-one, again the TDP is too high.
So why would anyone buy Richland on the desktop? It's $20 (15%) more expensive than the A10 5800K but only ~5% faster. Is the +1000 model number simply there to justify the price hike?
silverblue - Friday, June 7, 2013 - link
silverblue - Friday, June 7, 2013 - link
It's slightly better silicon released to make AMD look a little better. There's also the much-touted software bundle they've been mentioning - AMD seems to be more about the "experience" nowadays.
As an upgrade to my old PII X3 710, it'd be significant... but the GPU wouldn't be better than my 4830. Kaveri, on the other hand, would likely improve on the latter as well as providing more than double my current CPU speed. In 2009, the CPU and GPU cost me approx. £180 (about $250?), but in 2013, I'd be surprised if I had to pay more than two-thirds of that for something much better. :)
Calinou__ - Friday, June 7, 2013 - link
There are games that don't require a powerful GPU and yet are hard to play.
kgh00007 - Thursday, June 6, 2013 - link
For **** sake, please increase the size of the text on your site. It's too small and I'm getting eye strain reading all the articles on Computex. Do you not listen to your readers? The text is too small!!!
Sent from my nexus 7 with eye strain!
Gigaplex - Thursday, June 6, 2013 - link
It's too big for me, so I just use the zoom setting in my browser. I suggest you try the same, or stop using a 7" tablet.
kgh00007 - Friday, June 7, 2013 - link
I suggest you stop giving useless advice; I'm not going to stop using my tablet. Even with text at 120% the text is too small, while the text on other websites is huge @ 120%.
The text on every other website I have ever visited on my Nexus 7 is fine, so why can't AnandTech be the same? There is an issue here that needs to be resolved.
bji - Friday, June 7, 2013 - link
Regardless of your feelings, posting about your issue in this discussion is off topic and pointless, as Anand is not going to read and act on your post.
I'd say to write to Anand directly, but I've tried that and it seems the emails go ignored anyway. I think you either just have to live with the site as it is or stop visiting it. But whatever you do, don't introduce more off-topic posts, please.
duploxxx - Friday, June 7, 2013 - link
By not running the Trinity on its maximum supported memory you affected the whole review; Richland ends up in that case being a very minor update....
On top of that, not a single word on the improved power.
But then again, as you mention, with all the things going on with Intel and all the nice motherboard lineups for Haswell, etc., why even bother, it's just AMD. Please continue efforts like this; in a few years you'll regret that you have to pay double for the same CPU, with designs predefined and dominated by marketing geeks, etc... way to go.
DeviousOrange - Friday, June 7, 2013 - link
It's pointless doing an iGPU review if you don't have frame metering factored in. Tom's Hardware, TechReport and PCPer all did frame transition ratings, and this is where the HD 4600 took a massive beating: in most titles Intel's iGPU hit 60+ms frame lags, while the APU shows very low latencies (for many titles at lower, sub-HD resolutions, under 1ms). Intel's spikes will result in noticeable lags and stutters despite looking close on FPS, so those FPS figures are basically not worth what they appear to be.
BF3 @ 1080p on low settings with DDR3 2400 on my 5800K manages about 30FPS, but it's almost lag free; tested on HD 4000 it was completely undesirable to even persist with, basically playing a slideshow. Since the HD 4600 doesn't fix this much, it still puts AMD on top in the iGPU stakes by a healthy margin, irrespective of frames per second. Since we are comparing top vs top, there is no ambiguity. The THG showdown between an i3 and the 6800K was interesting: a 6800K can beat an i3 + 6670 in a few titles, so that is another testament to the improvement of APU technology.
mrSmigs - Friday, June 7, 2013 - link
It still amazes me that websites bench the i7s vs the AMD A10 platforms. Whole A10 systems can be purchased for the price of an i7 CPU. Why don't you test a $200 graphics card in an A10 system vs the i7's integrated graphics, if that's the comparison, and show people what they can get for their $350 spent on CPU + graphics? The closest-priced processor from Intel vs the 6800K that I found locally was an i3 3240; why don't you use these in the comparisons? And why use a GT 640 only with the i7 and not on the AMD system, to show the CPU bias in the benches?
DeviousOrange - Friday, June 7, 2013 - link
While yes, the i7 being the fastest mainline x86 part available will help distort its numbers a bit, the issue is nevertheless not x86; the issue is iGPU performance only. That being said, the i7 4770K and A10 6800K are both the top-line parts in the review, so there is no ambiguity as to one being entry level and the other top line; these are both flagships, so the test is a top-product-on-top-product iGPU showdown.
What is disappointing is that Anandtech still doesn't have any workable frame scaling tool to assess performance, as frames per second means diddly squat when frame latency is the true indicator. As before, the A10 is always below 10ms and often below 1ms, while the i7's HD 4600 often hits 60ms+ latencies, which is basically a microstut.tut.tut.tut.tut.ter.
I haven't personally tested the HD 4600, but I have been told it has boosted FPS; in terms of frame transitions, which are often a combination of hardware and driver support, the HD 4600 is not much better. So again, while the HD 4600 is close in FPS in some instances, it's very far off in latency. In short, I would rather have an iGPU averaging 27FPS with 0.1ms latencies than one averaging 35FPS with 50ms latencies.
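To make the frame-metering argument above concrete, here is a minimal sketch of the kind of analysis the frame-latency sites perform, assuming a Fraps/FCAT-style list of per-frame render times in milliseconds; the frame-time numbers are invented purely for illustration. It shows how a run with a higher average FPS can still be the worse experience once percentile frame times are considered.

```python
# Minimal frame-time analysis sketch: average FPS vs percentile frame times.
# The frame-time lists below are made up for illustration only.

def summarize(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)   # frames per second
    ordered = sorted(frame_times_ms)
    p99 = ordered[min(n - 1, int(n * 0.99))]     # 99th-percentile frame time
    return avg_fps, p99, ordered[-1]

smooth = [33.3] * 100                  # steady ~30 FPS, every frame ~33 ms
stuttery = [25.0] * 95 + [60.0] * 5    # "faster" on average, with 60 ms spikes

for label, times in (("smooth", smooth), ("stuttery", stuttery)):
    fps, p99, worst = summarize(times)
    print(f"{label}: avg {fps:.1f} FPS, 99th pct {p99:.0f} ms, worst {worst:.0f} ms")
```

The stuttery run averages about 37 FPS against the smooth run's 30, yet its 99th-percentile frame time is nearly twice as long - exactly the disparity that FPS-only charts hide.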
mrSmigs - Friday, June 7, 2013 - link
The comparison is AMD's budget APU line vs Intel's top desktop line. You should throw a Radeon 7850 in the AMD system to even up the specs (at least dollar-wise) and then run a few graphics benches... please, someone do a comparison of an i7-4770K with its iGPU vs an A10-6800K + HD 7850 in gaming. I'm pretty sure I could guess the winner here...
sireangelus - Friday, June 7, 2013 - link
Just think about the performance of an i7-4770KR... 128 MB of L4 cache, an amount to put any server to shame.
Phiro69 - Friday, June 7, 2013 - link
" AMD is proud of its valiation on the A10-6800K" - you mean validation, right?halbhh2 - Friday, June 7, 2013 - link
While it is interesting to see gaming benches, since I might game on an HTPC, I'm actually thinking more about how perfectly the chips can display 4K video, *and* I'm interested in whether I can run 2 HD movies (two at once) on 2 displays without any dropped frames or stutters (this actually matters a bit at the moment for me). I know there is one article here on the Intel 4600 graphics re 4K.
hobagman - Saturday, June 8, 2013 - link
Anand, I think this is a waste of time. I don't know anybody who buys desktops anymore, unless they intend to use it as a workstation or as a gaming platform. In the first case, they don't usually care about graphics, and in the second case, they will absolutely have discrete graphics and these graphics benchmarks are utterly irrelevant. I like this website and appreciate the work, but I would rather you spend your time on something more useful for us -- for example, comparing the notebook platform integrated graphics would be reasonable. This article puzzles me.
fteoath64 - Saturday, June 8, 2013 - link
"comparing the notebook platform integrated graphics would be reasonable.". There are many reviews about IGP for the APUs for specific users or gamers. This article highlights how much AMD has arrived in terms of gpu and cpu balance in their desktop and notebook parts (ie APUs specifically) which will pose a serious challenge to Intel's dominance. To many, Intel based cpu with IGP is clearly not the way to go only Intel cpu plus Nvidia discrete gpu or go the APU route and compromise cpu somewhat gaining close to discretely gpu performance for way less money.Also shows that Intel 4600 IGP has gone a long way to within striking distance of AMD APU gpus but not good enough as the sliding scale of reference MOVES each time Intel approaches. The GT3e potentially can match a NV 650M discrete but at the cost of Intel $$$ to the user. Most manufacturers rather go Nvidia discrete which is cheaper and better as well. So unless Intel goes into heterogeneous core architecture chips for cpus, there is nothing really new in their offering.
zyky - Saturday, June 8, 2013 - link
With the dozens of different configurations that call themselves "GT 640", it's pretty important to specify which one was used in these tests. GF116? GK107? GK208? GDDR3? GDDR5?
Ryan Smith - Tuesday, June 18, 2013 - link
There's only one retail GT 640 (as of this article): the GK107-based DDR3 version.
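For anyone wanting to verify which GT 640 variant a given card actually is, the marketing name alone isn't enough; the PCI device ID is what distinguishes GF116 from GK107 from GK208. A minimal sketch, assuming a Linux system with the usual sysfs layout (the printed IDs can then be looked up in the pci.ids database):

```python
# List NVIDIA PCI device IDs via Linux sysfs. 0x10de is NVIDIA's PCI vendor ID.
import glob

for dev in glob.glob("/sys/bus/pci/devices/*"):
    with open(dev + "/vendor") as f:
        vendor = f.read().strip()
    if vendor == "0x10de":
        with open(dev + "/device") as f:
            device_id = f.read().strip()
        print(f"NVIDIA device at {dev.split('/')[-1]}: device ID {device_id}")
```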
Will Robinson - Sunday, June 9, 2013 - link
LOL... NeelyCam must be crying his eyes out over those results. Good work AMD!
Wurmer - Sunday, June 9, 2013 - link
I still find this article interesting, even if IGPs are certainly not the main focus of gamers. I don't consider myself a hardcore gamer, but I don't game on an IGP; I am currently using a GTX 560, which provides me with decent performance in pretty much any situation. On the other hand, the article gives an idea of the progress made by IGPs. I certainly would enjoy more performance from the one I am using at work, a GMA 4500 paired with an E8400. There are markets for good IGPs, but gaming is not one of them. As I see it, IGPs are more suited to being paired with low to mid CPUs, which would make very decent all-around machines.
lordmetroid - Monday, June 10, 2013 - link
Using high end games that will never be played on the internal graphics processor is totally pointless; why not use something like ETQW?
skgiven - Monday, June 10, 2013 - link
Looks like you used a 65W GT 640, released just over a year ago. You could have used the slightly newer and faster 49W or 50W models, or a 65W GTX 640 (37% faster than the 65W GT 640).
Better still, a GeForce GT 630 Rev. 2 (25W) with the same performance as a 65W GT 640!
(I'm sure you don't have every GPU that's ever been released lying around, so I'm just saying what's out there.)
An i7-4770K, or one of its many siblings, costs ~$350.
For most light gaming and GPU apps, the Celeron G1610T (35W) along with a 49W GT640 would outperform the i7-4770K.
The combined wattage is exactly the same - 84W - but the total price is just $140!
Obviously the 25W GK208-based GeForce GT 630 Rev. 2 would save you another $20 and give you a combined TDP of 60W; the i7-4770K's 84W is 40% more.
It's likely that there will be a few more GT 600 Rev. 2 models, and the Kepler-based GeForce 700 range still has to fill out. Existing mid-range GPUs offer more than five times the GPU performance of the i7-4770K.
The reasons for buying an i7 still have little or nothing to do with its GPU!
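As a quick sanity check of the arithmetic above, a sketch using the commenter's own figures (mid-2013 prices and TDPs); the $40/$100 split of the $140 total, and the $40/$80 split of the $120 alternative, are my assumptions:

```python
# Combined TDP and price for each configuration discussed above.
builds = {
    "i7-4770K (iGPU only)":          {"tdp_w": 84,      "price_usd": 350},
    "Celeron G1610T + 49W GT 640":   {"tdp_w": 35 + 49, "price_usd": 40 + 100},
    "Celeron G1610T + GT 630 Rev.2": {"tdp_w": 35 + 25, "price_usd": 40 + 80},
}

for name, b in builds.items():
    print(f"{name}: {b['tdp_w']} W combined, ~${b['price_usd']}")
```

The first two rows land on exactly the same 84 W, with the discrete pairing at well under half the price, which is the core of the argument.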
skgiven - Monday, June 10, 2013 - link
- meant GTX 645 (not GTX 640)
NoKidding - Monday, June 24, 2013 - link
i shudder to think what an a10 kaveri can bring to the table considering it'll be equipped with amd's gcn architecture and additional ipc improvements. low price + 4 cores + (possibly) hybrid xfire with a 7xxx series radeon? a great starting point for a decent gaming rig. not to mention that the minimum baseline for pc gaming will rise from decent to respectable.
Silma - Friday, June 28, 2013 - link
Sometimes I really don't understand your comparisons, and even less the conclusions.
Why compare a Richland to a Haswell when obviously they will be used for totally different purposes? Who will purchase a desktop Haswell without a graphics card for gaming? Why use super expensive 2133 memory with a super bad processor?
There are really 3 conclusions to be had:
- CPU-wise, Richland sucks aplenty.
- GPU-wise there is next to no progress compared to Trinity, the difference being fully explained by a small frequency increase.
- If you want cheap desktop gaming you will be much better served by a Pentium G2020 + Radeon HD 6670 or HD 7750, for the same price as a crappy A10-6800K or A10-6700.
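The same-price argument is easiest to see as performance per dollar. A hedged sketch: the prices are rough mid-2013 street prices, and the relative performance figures are placeholders assumed purely for illustration, so substitute real benchmark averages before drawing conclusions:

```python
# Rough perf-per-dollar framing; rel_perf values are illustrative assumptions.
options = {
    "A10-6800K (iGPU only)":   {"price_usd": 150,      "rel_perf": 1.0},
    "Pentium G2020 + HD 6670": {"price_usd": 65 + 80,  "rel_perf": 1.5},
    "Pentium G2020 + HD 7750": {"price_usd": 65 + 100, "rel_perf": 2.0},
}

for name, o in options.items():
    ppd = o["rel_perf"] / o["price_usd"] * 100
    print(f"{name}: ~${o['price_usd']}, {ppd:.2f} perf per $100")
```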
XmenMR - Monday, September 2, 2013 - link
You make me laugh. I normally do not post comments on these things, given that I read them just for a laugh, but I do have to point out how wrong you are. I have a G1620, G2020, i3-3240, A8, A10 and more, and have run benchmarks with a 6450, 6570, 6670, 7730, 7750 and 7770 for customers' budget gaming builds.
Your build of a G2020 with a 6670 was, in my testing, beaten hands down by the A10-6800K in hybrid crossfire with a 7750 (yes, I said it: hybrid crossfire with a 7750 - it can be done, although not officially supported by AMD). A G2020 with a 6670 will run you about $130, and an A10 with a 7750 is about $230. To match the A10 + hxf 7750 ($230 value) performance with Intel I had to use a 7750/7770 or higher with the Pentiums; an i3 + 7750 ($210 value) did quite well but was still beaten in quite a few graphics-related tests.
Point being, a discrete GPU changes the whole aspect of the concept. The i3 + 7750 is very close to the A10 + hxf 7750 in more ways than just performance, but that's not the point of this topic; it was AMD's 8670D vs Intel's HD 4600. I know lots of people who buy an Intel i5 or i7 and live off the iGPU, thinking one day they'll have the money to get a nice GPU; 60% of the time this never happens - new and better tech comes out and they just change their minds and go for a whole new system. The APU, on the other hand, would have been cheaper and performed better for what they needed had they just gone that road, and I am not the only one who has come to that conclusion. AMD has done a great job with the APU, and after testing many myself I have become a believer. A stock i5 computer for $700 got smashed by a stock $400 A10 in CS6, sitting side by side; I could not believe it. I do not have to argue how well the APU is doing, because Microsoft and Sony have already done that. So I leave with a question: if the APU were not a fantastic alternative that delivers a higher standard of graphics performance, then why are APUs going to be used in the Xbox One and PS4?
ezjohny - Tuesday, September 10, 2013 - link
When are we going to get an APU where you could go in-game and adjust the graphics settings to very high without a bottleneck!
nanomech - Sunday, December 8, 2013 - link
This is a slanted review. The i7 with the separate Nvidia card skews the results, perhaps erroneously, toward Intel. How about the A10 with the same separate Nvidia card and/or a comparable separate AMD video card? The performance difference can be quite drastic.
IMHO, one should compare apples to apples as much as possible. Doing so yields a much more complete comparison. I realize that these APUs tout their built-in graphics abilities, but Intel is trying to do so as well. It's the only way to give the CPU part of the APU a fair shake. That, or leave the i7/Nvidia results out completely.
alphacrasher - Tuesday, December 10, 2013 - link
With Xeons you are getting into multi-processor boards, which brings up a question I have been wondering about.
Does AMD have any plans to make their new APUs multiprocessor and crossfire capable? At that price point I wouldn't mind buying two of them to stick on a motherboard...
boogerlad - Wednesday, December 25, 2013 - link
For the LuxMark benchmark, is it CPU only, GPU only, or is it both?
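LuxMark is an OpenCL benchmark, and OpenCL can target the CPU, the GPU, or both at once depending on which devices are selected, so the answer comes down to the review's chosen configuration. A minimal sketch (assuming the third-party pyopencl package is installed) that lists the devices a LuxMark-style workload could be dispatched to:

```python
# Enumerate OpenCL platforms and devices, labeling each as CPU/GPU/etc.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print(f"{platform.name}: {device.name} [{kind}]")
```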