These aren't Zen 5 CPUs; they're Zen 4 chips with better integrated graphics (which gamers likely won't care about anyway) and some AI accelerators (which probably won't see wide adoption for at least another year).
I think they meant people playing on really old dedicated GPUs, which are likely slower than the 8700G to the point where an 8700G might be a good replacement if their setup breaks or similar.
The 5700G was for some time one of the best options for budget gaming.
The 8700G should be seen as a successor to that, I think.
Enough to play a huge number of games nicely at 1080p.
But, in a certain way, at the lowest end of desktop gaming.
Still better than the low-to-mid range of laptop gaming, though.
But for a lot of people that is exactly the right balance of what they get for what they pay. Whether that's because they don't have a lot of money, or because they don't game a lot and don't want to waste money on it.
The main "problem" the 8700G might run into is I guess that the cost of AM5 motherboards and RAM.
Also, AMD (and I think Intel, too) has in recent years slowly worked on taking advantage of integrated graphics even when you also have dedicated graphics (from the same vendor). If that sees some more improvements, the 8700G could also become interesting in use cases you wouldn't expect today. We will see.
A quick check puts a Ryzen 5 5600 and a Radeon RX 6600 at $360 combined, where the 8700G is set to $330. And an RX 6600 will deliver around double the graphics performance. So even without factoring in motherboard and memory, the 8700G is hard to justify for a cheap gaming rig.
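For what it's worth, here's that comparison as back-of-envelope arithmetic. The prices and the ~2x graphics-performance figure are the ones quoted above, not benchmarks of mine; adjust to current street prices:

```python
# Back-of-envelope value comparison using the figures quoted above
# (prices and the ~2x performance ratio come from the comment, not measurements).
combo_price = 360.0   # Ryzen 5 5600 + Radeon RX 6600, USD
apu_price = 330.0     # Ryzen 7 8700G, USD
combo_perf = 2.0      # RX 6600 graphics performance, relative to the 8700G iGPU
apu_perf = 1.0

print(f"CPU+dGPU combo: {combo_perf / combo_price:.4f} perf/$")
print(f"8700G APU:      {apu_perf / apu_price:.4f} perf/$")
# -> (2.0/360) / (1.0/330) ~= 1.83x the graphics performance per dollar
#    for the combo, before motherboard/RAM costs even enter the picture.
```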
Wrong prices. Anyway, you'd get even lower graphics performance. The 8700G is 12 CUs, and those two models come with 8 and 4 CUs, making them even less compelling alternatives to a dedicated graphics card.
The 8600G seems to be $229 instead of $299, which seems to have been a mistake by the interviewer in the video.
The 8500G seems to be $179 instead of $176, which was a typo on my side.
Like OP mentioned, they have only 8 and 4 CUs respectively.
In comparison, the Steam Deck has 8 (RDNA 2) CUs, so while you can't really call an 8600G a "desktop gaming" APU, it can still somewhat run a lot of games; enough for many very casual gamers.
For comparison, the minimal GPUs in the 7x00X CPUs have 2 (RDNA 2) CUs, which is still enough for most office PCs and can still run stuff like Dead Cells at 1080p reasonably well. Now, Dead Cells is highly optimized, but what I'm trying to say is: if you get yourself a cheap office PC, or upgrade one, and want to occasionally play some simpler older games, an 8500G can still be all you need.
Generally speaking, you can set the power target of a high-spec GPU below 100% to get most of the performance at significantly reduced power draw. So if you're cost-sensitive, it's still probably better to get a dGPU and just cap the power (e.g. the sketch below).
The dGPUs have more cores and wider memory buses with dedicated memory, which deliver higher performance than the space-constrained integrated GPUs.
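A minimal sketch of what "just cap the power" looks like in practice, assuming Linux with root privileges and the vendor CLI tools installed; the 150 W figure is purely illustrative, not a recommendation:

```python
# Sketch: cap a dGPU's power limit by shelling out to the vendor tools.
# "nvidia-smi -pl <watts>" and "rocm-smi --setpoweroverdrive <watts>" are
# the relevant flags; both require elevated privileges.
import subprocess

def cap_power_nvidia(watts: int) -> None:
    subprocess.run(["nvidia-smi", "-pm", "1"], check=True)         # persistence mode keeps the setting
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)  # set power limit in watts

def cap_power_amd(watts: int) -> None:
    # rocm-smi ships with the ROCm stack on Linux
    subprocess.run(["rocm-smi", "--setpoweroverdrive", str(watts)], check=True)

if __name__ == "__main__":
    cap_power_nvidia(150)  # e.g. run a ~200 W card at 75% of its default limit
```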
I used to play WoW on a Llano APU. Plenty of gamers are addicted to something like that and just want a serviceable cheap machine. Pretty sure my current 3070 cost more than the entire computer I was killing Deathwing with.
I played healer, so I could just point my camera at the floor, zoomed in as far as possible, and suffer the least bad performance possible. And scramble to get back in-game after guaranteed disconnects on big events, e.g. every time Nefarian spawned a wave of adds. What a struggle.
AMD's support is terrible, if you believe their documentation. ROCm/HIP works just fine (well, with Torch; TensorFlow is shamelessly an NVIDIA shill) on many "unsupported" GPUs if you enable an override envvar.
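For reference, the override in question is the `HSA_OVERRIDE_GFX_VERSION` environment variable. A minimal sketch of using it with a ROCm build of PyTorch; the `10.3.0` value maps an RDNA 2 card onto the officially supported gfx1030 target, so pick the value matching your GPU:

```python
# Sketch: running PyTorch-on-ROCm on an "unsupported" consumer card.
# HSA_OVERRIDE_GFX_VERSION must be set before the ROCm runtime initializes,
# so it's usually exported in the shell; setting it before importing torch
# generally works too. "10.3.0" (gfx1030) is an example for RDNA 2 cards.
import os
os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

import torch

print(torch.version.hip)          # non-None confirms this is a ROCm build
print(torch.cuda.is_available())  # ROCm builds reuse the torch.cuda API
x = torch.randn(1024, 1024, device="cuda")
print((x @ x).sum().item())       # quick sanity check that kernels actually run
```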
To add a bit more context: I remember reading somewhere (may have been on the Phoronix forums) from an official AMD engineer that "supported" means validated and actively tested in their pipeline, while "unsupported" generally just means "we do minimal or no testing on these; they should work and we don't explicitly prevent them from working, but we don't guarantee anything" (at least when it comes to same-die/same-gen cards).
In the same post they also wrote that they are going to look at integrating more of those "unsupported" cards into their test suite.
Honestly, I hope they change their wording to something like "validated", "supported", and "unsupported", with actual explanations of what each of these means (fully tested; works in theory; does not work even in theory).
ROCm is also incredibly fragile and buggy at the best of times, so anything not actively tested by them stands a good chance of not working. Hell, I remember that a while back people were having problems with machine learning code giving garbage results on one of the few consumer GPUs that was officially listed as supported, and AMD eventually replied to the bug report declaring that, actually, no, they weren't going to support it, and they'd remove it from the list rather than try to fix the issue. I think this was back when newer consumer GPUs were genuinely unsupported, as in the code simply wasn't there either. Integrated GPUs have also always had a lot of problems.
There's also questionable OS compatibility. ROCm is Linux-based, with extremely limited and rather experimental Windows support. Their fancy new neural processing unit is Windows-only, tied in with a Microsoft framework, and they don't seem to have any definite plan for supporting it elsewhere. So there's quite possibly no single OS where all of the hardware in these chips that could theoretically be used for machine learning and AI actually has even vaguely functioning code.
> Their fancy new neural processing unit is Windows-only, tied in with a Microsoft framework, and they don't seem to have any definite plan for supporting it elsewhere. So there's quite possibly no single OS where all of the hardware in these chips that could theoretically be used for machine learning and AI actually has even vaguely functioning code.
What they did was embed their Vitis IP in these CPUs. Those IP modules have been available in ASIC accelerator cards from AMD and in their Xilinx FPGA offerings for a while now, and they have Linux support.
So they are releasing these APUs (which are primarily aimed at a Windows audience) with Windows support via ONNX. And then, once there are chips in circulation, AMD can get Linux Vitis support for these chips set up, with members of the Linux community able to review and validate those patchsets.
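If I understand the stack correctly, targeting the NPU today looks roughly like this through ONNX Runtime. `VitisAIExecutionProvider` is the provider name AMD's Ryzen AI SDK registers; `model.onnx` is a placeholder, and real deployments typically also pass provider options (e.g. a `vaip_config.json`):

```python
# Rough sketch of NPU inference via ONNX Runtime with AMD's Vitis AI
# execution provider (assumes the Ryzen AI SDK is installed; operators
# the NPU can't handle fall back to the CPU provider).
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession(
    "model.onnx",  # placeholder model path
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
)

inp = sess.get_inputs()[0]
# Replace symbolic/dynamic dims (strings or None) with 1 for a dummy input.
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
x = np.zeros(shape, dtype=np.float32)
print(sess.run(None, {inp.name: x})[0].shape)
```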
TensorFlow is open source on GitHub. What's stopping AMD engineers from contributing pull requests? A casual search through the open PRs shows nothing of interest submitted by team red.
I'm guessing no, as these newly announced processors are the first with those accelerators.
> AMD is also bringing the power of a dedicated AI neural processing unit (NPU) to desktop PC processors for the first time with the introduction of Ryzen™ AI
They're merely the first desktop processors from AMD with these NPUs. This is last year's laptop silicon repackaged for their desktop socket. They also re-branded the laptop version with new model numbers, because laptop OEMs basically demand new model numbers on an annual cadence. This year their marketing for these chips is heavily emphasizing the NPUs, because they actually have some tooling for them now, but they were basically dead weight when the silicon first shipped.
The problem there is with PyTorch. They've got bugs in their AMD build flow that people keep pointing out to them, but because it's people with consumer cards which are "unsupported" (which just means AMD won't personally solve your problems with those cards, not that they aren't intended to work), they close the issues/PRs without fixing the bug.
They are enough to run the Ally, and they're closely related to the one in the Steam Deck. So maybe they're at the extreme (top) end of the integrated graphics segment?
Only APUs like the 5700G, the 8700G, or the recent Steam Deck APU are reasonable comparison targets.
The integrated GPUs of the 7000 series (the non-APU parts) are maximally minimalist, to the point that I think they couldn't have been shrunk any further without running into unusual issues (assuming no change to the architecture/design). They are only suited for web browsing, office use cases, and debugging (but then, they were made only for those use cases). Well, I guess Dead Cells and similar "low requirement/highly optimized" games still run nicely on them at 1080p.
Still, it's nice to have them if you only have such use cases; it saves a ton of money.
APU is a marketing term. The GPUs in them are still integrated GPUs. Intel didn't have anything that held a candle to them until the Iris-branded parts, and even those lagged behind.
> integrated GPUs of the 7000 series (the non-APU parts) are maximally minimalist, to the point that I think they couldn't have been shrunk any further
Right, because these are meant as a backup display output in parts that are aimed at the gaming market, where they will almost always be paired with a dGPU. Comparable parts in prior Ryzen generations didn't even have an iGPU. Intel offers you the choice (no iGPU in the "F" SKUs, e.g. the KF parts), which is nice.
Seriously? I'm through with buying expensive space heaters. These CPUs are 65 W, so I might upgrade the CPU in my x86 box to one of these and just use the integrated graphics.
I think they were hinting at the fact that the iGPU is more interesting to users of the mobile CPUs (Steam Deck/laptops), since this is a desktop CPU, and desktops usually have dedicated graphics.
>they're Zen 4 chips with better integrated graphics (which gamers likely won't care about anyway)
Integrated GPUs are becoming far more appealing to the general consumer, including (especially?) gamers, with how fucking expensive discrete GPUs are getting these days (AMD is part of that problem).
We might just witness dGPUs becoming the sound cards of this decade.
If you want to buy new (which I don't think is unreasonable), there's not much in the market that doesn't cost a lot of money.
Maybe a Radeon 6500 XT or an Arc A380; a GeForce GTX 1650 is still kind of a lot of money, and you're looking at a GeForce GT 1030 if you want something for not a lot of money. But why buy a GT 1030, unless you really just need something low-profile?
... has similar performance to a 480/580, or to what's expected from these APUs.
The APU is cheaper overall, uses less power (even cheaper!), and has a clear upgrade path (add a beefy discrete GPU later, or replace it with a newer APU).
An RTX 4060 (with a pitiful 8 GB of VRAM) is about $300. Something a bit more practical, like an RTX 4070 (with 12 GB of VRAM), is about $550.
That is bullshit expensive. It's still nowhere near as bad as during the cryptomining craze, but remember that a GTX 1080 MSRP'd at $599 back in the old days. An RTX 4080, for context, is around $1,500 to $2,000 right now.
If I'm just looking to game and only have a reasonable budget, like most people, I'll just grab some AMD APU and be done with it. Better yet, just go buy a console or game on my phone.
How do the best integrated graphics compare performance-wise, though? If they are way behind in performance, you might be able to buy a cheap 2 GB VRAM GPU for less than a cutting-edge integrated-graphics CPU.
> How do the best integrated graphics compare performance-wise though
A bunch of handhelds (Steam Deck, Asus ROG Ally) can comfortably run even AAA games at 30+ fps on small screens, despite battery and thermal limitations (as an example, Red Dead Redemption 2 runs at 50 fps on my LCD Deck). That would imply that desktop versions would be totally acceptable, unless you're looking at 4K on the latest big titles.
It's another ~$100 that would not yield you any performance gains.
The older 5700G was ~$300, so it was hardly a part of a budget build, but the Ryzen 5 5500GT is $125 (and will probably drop below $100 within half a year) and would clearly offer at least 30 fps at Full HD[0], and you would buy it anyway because it's a CPU.
This is the part that needs to be emphasized more: the iGPU is practically free because it comes with the CPU we're buying anyway.
It's already hard enough to compete with free, but on top of that, dGPUs are asking for Keksimus Maximus monies. The value proposition for gamers, let alone most consumer users, is very strongly in favor of iGPUs.
Yo! I tolerate relatively poor performance out of my laptop (a several-year-old model with a Ryzen and iGPU) while having a fairly kickass home desktop[0]. So chalk me up in that group.
[0]Which currently sees me playing a whole bunch of Rule the Waves 3..... poor underutilized 3080.
The only reason I didn't list Fate/Grand Order (Japan) is because there is no Windows client. This also means most of my gaming time and money spent is, in fact, not on Windows but on Android. :V
I'm more of a sim fan, and my wallet thanks me for it. I do still feel profoundly bad for my friends who grew into playing Star Citizen though... hope they're making smart financial decisions in the new year (lmao)
In a relativistic sense, spending money on fake space insurance and gacha pulls is better than an alcohol addiction or a hospital bill. In a pragmatic sense, the opportunity cost of investing in space travel simulation and softcore catgirl pornography is the progress towards real world space travel and actual catgirls.
When humanity eventually collapses in on itself due to resource exhaustion or outright hubris, our heavenly benefactors will zoom into our tech tree and say "Damn! They were just about to unlock interstellar travel and goodwill towards all mankind. What the hell were they speccing into earlier?"