6 Cores vs. 16 Cores: How Well Have Ryzen 5 CPUs Aged?

Status
Not open for further replies.
Again highlighting that there's a point where throwing more cores at the problem (for non-perfectly-parallel tasks) doesn't magically result in more performance; as long as the CPU has enough horsepower to keep the GPU as the bottleneck, you won't see any significant change in performance beyond a certain point.

*Assuming no other apps in the background, of course. Figured I'd add this before someone proclaims "productivity".
 
"Needless to say, the Ryzen 5 5600 and 5600X were always competitively priced and presented excellent value for gamers on a tight budget. "

No, no it wasn't; this is misinformation. When the 5600X came out, it was regarded as an expensive turd, ESPECIALLY for "budget" gamers. $300 is not "budget". The 3600X was commonly available for $140-150 around that time, so for 20% more performance via the 5600X, you had to pay 100% more. Very Nvidia-esque "value". If you really wanted budget, the 3500X could be found on Alibaba for as little as $120, with the same cache as a 3600X. The 5600X was 2.5x as expensive as that, and a whopping 25% faster.
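To put that price/performance argument in concrete terms, here's a quick sanity check on the arithmetic, using the figures quoted above ($300 for the 5600X, $150 for the 3600X, ~20% faster); the function name is just for illustration:

```python
# Rough perf-per-dollar comparison using the numbers from the comment above.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance units per dollar spent."""
    return relative_perf / price

r5_3600x = perf_per_dollar(1.00, 150)  # baseline chip at street price
r5_5600x = perf_per_dollar(1.20, 300)  # ~20% faster, twice the price

# Ratio below 1.0 means the newer chip delivered LESS performance per dollar.
ratio = r5_5600x / r5_3600x
print(f"5600X value vs 3600X: {ratio:.2f}x")  # 0.60x
```

In other words, at launch pricing the 5600X delivered about 40% less performance per dollar than the discounted 3600X, which is the commenter's point.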

Wow, such value, many cores. This was right around the time AMD decided that budget buyers don't deserve computer parts and jacked up their prices because they could, now that they were faster than Intel.

The real budget move was to nab a 3500X cheap from Alibaba, run that for a few years, then grab a 5800X3D when they dropped way down in price (I picked mine up for $249 on an AntOnline sale).

Otherwise, 6 cores are enough for gaming, since consoles only use 6 cores for games. That will continue until at least the PS6/next Xbox. 8 cores are better for high FPS, especially the 3D parts, but if you're playing at 60 FPS then even older CPUs are still fine. I would have liked to see some of those 3000-series 6-core parts for comparison.
 
The latest 6-core article was quite brilliant and extremely useful, but this one seems to return to the terrible trend of trying to portray real-world situations with fantasy scenarios - in this case, a 5600X paired with a 4090.

The conclusion of this article is still sound, but I bet that if you had paired this processor with an upper-mid-range card and shown 1440p results (thank god at least the 4K results made the cut), which is what I suppose the majority of users (such as myself) are doing, the differences vs. the rest of the pack would be even smaller.
 
Damn it Steve, get with the program. More cores = future-proofing. Ten years from now, when 99% of people have moved on from these CPUs, you'll run the same test and see the 5950X giving you a buttery-smooth 20 FPS while the 5600X sluggishly pulls in 17 FPS; you will feel awfully silly.
 
The latest 6-core article was quite brilliant and extremely useful, but this one seems to return to the terrible trend of trying to portray real-world situations with fantasy scenarios - in this case, a 5600X paired with a 4090.

The conclusion of this article is still sound, but I bet that if you had paired this processor with an upper-mid-range card and shown 1440p results (thank god at least the 4K results made the cut), which is what I suppose the majority of users (such as myself) are doing, the differences vs. the rest of the pack would be even smaller.
Hey there!

The intent of the chosen hardware and resolutions in this article is to illustrate the pure performance of a CPU, but only compared to other CPUs. The 4090 is used only to illustrate the worst possible gaming scenario for any CPU on the market; it's having a very easy time at 1080p. The 4090 is able to produce far higher frame rates as a result of the ease this resolution provides, which means it's going to be asking the CPU for information far more often than it would at 4K. 1080p helps to illustrate how well these CPUs can keep up with the GPU's requests for information. So what you're seeing at 1080p is that some CPUs aren't able to respond as quickly or efficiently as others, showing the difference in gaming performance from a CPU perspective when paired with their worst possible GPU.

At 4K, we see that the GPU is having a more difficult time and doesn't need information from the CPU as often, so the gaps between the different CPUs start to close. The X3D part is no longer crushing its cousins like it did at 1080p because the requests from the GPU aren't taxing enough at this resolution to illustrate the actual gaming performance difference between each CPU.

Testing or reviewing CPUs and GPUs is all about managing bottlenecks. To know the difference in gaming performance between CPUs, we use the best GPU on the market and we put it in a scenario where that GPU will tax each CPU enough to illustrate that difference. To know the difference in gaming performance between GPUs, we pair them all with the best possible gaming CPU on the market to illustrate the best that we're able to allow these GPUs to perform.

As a consumer, we're looking for the best possible part for our dollar and these tests give us an idea of how well these parts can perform. When buying parts, some may aim to build a perfectly balanced system at the start while others might go big on one with the intent to upgrade the other down the road. Everyone's goals are different, but I think most of us can agree that illustrating a part's potential makes it much easier to determine which part is best for the individual.
 
Bruh my FX-8350 printing NUMBERS now!
Don't laugh; I literally had someone telling me their FX-8350 was a better gaming chip in modern games than the Intel 2500K, about ten years after their release. My response: fantastic, ten years of mediocre gaming performance so you can now say your slideshow is slightly faster than another slideshow, meanwhile all those people with the 2500K moved on years ago.
 
The 5600X still does a decent job even today, and yes, the launch price was $300, so not such good value there. I got it two years ago for $200. It did an awesome job for its time. Back then the 5800X3D was $450-500 and it didn't make sense to me. I will do a full platform upgrade when GTA6 comes out. Until then...
 
No, no it wasn't; this is misinformation. When the 5600X came out, it was regarded as an expensive turd, ESPECIALLY for "budget" gamers. $300 is not "budget". The 3600X was commonly available for $140-150 around that time, so for 20% more performance via the 5600X, you had to pay 100% more. Very Nvidia-esque "value". If you really wanted budget, the 3500X could be found on Alibaba for as little as $120, with the same cache as a 3600X. The 5600X was 2.5x as expensive as that, and a whopping 25% faster.
You're technically correct, but remember the Ryzen 5000 series launched early in the COVID pandemic amid global shortages, at a time when nothing in the tech market made sense. Once the pandemic/crypto problems eased, just as the RX 6000 cards dropped to sensible prices, the Ryzen 5600 quickly dropped below $200 too.

8 cores are better for high FPS, especially the 3D parts.
This part is nonsense. The 7600 has 6 cores and wipes the floor with the 5800X and 5950X, and slightly outperforms the 5800X3D too.
The whole point of this series of articles about CPU core counts is to show how, if you're concerned about gaming, you shouldn't look at core count at all. It's a completely irrelevant metric.
 
"Needless to say, the Ryzen 5 5600 and 5600X were always competitively priced and presented excellent value for gamers on a tight budget. "

Just an observation in relation to "budget desktop builds": 10 years ago, the lowest-priced AMD A4-5300 (Piledriver, iGPU, 2 cores, 2 threads) was priced at 40€. Today, the lowest-priced AMD desktop CPU with an iGPU [at my location] is the Ryzen 3 4300G (Zen 2, iGPU, 4MB L3 cache, 4 cores, 8 threads) at 90€. That is a 125% price increase in 10 years (roughly 8.4% per year, compounded).
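The 40€-to-90€ comparison works out as follows (a quick compound-growth calculation on the prices quoted in the comment; the function name is just for illustration):

```python
# Annualized price increase implied by the 40 EUR -> 90 EUR example above.
def annualized_rate(start: float, end: float, years: float) -> float:
    """Compound annual growth rate between two prices."""
    return (end / start) ** (1 / years) - 1

total_increase = 90 / 40 - 1          # 1.25 -> a 125% increase over the decade
yearly = annualized_rate(40, 90, 10)  # ~8.4% per year, compounded
print(f"total: {total_increase:.0%}, per year: {yearly:.1%}")
```

Note the final price is 225% *of* the original, which is a 125% *increase*; the two figures are easy to mix up.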
 
Thanks for the article, Steve! But you haven't included the top CPUs to compare against. They're 60% faster than the Ryzen 5600X in gaming. So while the article is good, the results are perhaps intentionally twisted. Don't get me wrong, Steve, I love your articles, but this one needs an update. The Ryzen 5600X is good, but not enough for high-end builds.
 
I have been using my 3600X for 4 years at 1080p. To this day it's still serving me well, since I OCed it to 4.4 GHz and paired it with a 3070 Ti. I still don't encounter any noticeable bottlenecks in the games I usually play (Baldur's Gate, RDR2, The Witcher, etc.). I'm planning to upgrade my monitor to 1440p 165 Hz and still consider my 3600X very reliable for the task.
That's a 3600X, so I strongly believe a 5600 or 5600X will stay relevant for a long time.
 
Just an observation in relation to "budget desktop builds": 10 years ago, the lowest priced AMD A4-5300 (Piledriver, iGPU, 2 cores, 2 threads) was priced 40€. Today, the lowest priced AMD desktop CPU with an iGPU [at my location] is Ryzen 3 4300G (Zen2, iGPU, 4MB L3 cache, 4 cores, 8 threads) priced at 90€. That is 225% price inflation in 10 years (average inflation per year: 8.5%).
Either prices in Europe are much higher or that seller has limited inventory. Available from Newegg are the Athlon 3000G for $52 and an OEM Athlon 200GE for $48. A Ryzen 3 2200G is $65.
 
The top budget CPU is the Ryzen 4500 for 72 euro. The socket is still alive, i.e. the Ryzen 4500 is easily upgradeable to a faster processor. Still, I think the i3-13100F or Ryzen 5600 are the sweet spot for a budget processor at the moment.
 
You're technically correct, but remember the Ryzen 5000 series launched early in the COVID pandemic amid global shortages, at a time when nothing in the tech market made sense. Once the pandemic/crypto problems eased, just as the RX 6000 cards dropped to sensible prices, the Ryzen 5600 quickly dropped below $200 too.


This part is nonsense. The 7600 has 6 cores and wipes the floor with the 5800X and 5950X, and slightly outperforms the 5800X3D too.
The whole point of this series of articles about CPU core counts is to show how, if you're concerned about gaming, you shouldn't look at core count at all. It's a completely irrelevant metric.

I wouldn't say "you shouldn't look at core count at all", just not to give it too much weight unless you're using software that needs the cores. A perfect example I've used before is Cities: Skylines 2. As your city grows, the core-count demands increase, quite a lot. Linus (yeah, I know) showed it saturating a Threadripper. In the end, every part has strengths and weaknesses, including cost, and a user simply has to decide what trade-offs they can live with.
 
I wouldn't say "you shouldn't look at core count at all", just not to give it too much weight unless you're using software that needs the cores. A perfect example I've used before is Cities: Skylines 2. As your city grows, the core-count demands increase, quite a lot. Linus (yeah, I know) showed it saturating a Threadripper. In the end, every part has strengths and weaknesses, including cost, and a user simply has to decide what trade-offs they can live with.
Even in that case, "you shouldn't look at core count at all" is still correct. For example, the Ryzen 7600 gives you the same multicore performance as the 5800X despite having 2 fewer cores. Even in a game that scales perfectly with core count, a 5800X still won't outperform a 7600.
Core count is a meaningless spec in the sense that you should only look at the performance you get, not how many cores the chip has. And, especially when comparing chips of different architectures, core count doesn't necessarily correlate with the performance you get. We've seen that multiple times, like the 12400F outperforming older eight-cores, or the Ryzen 5600/7600 outperforming their high core count predecessors. You can technically make that argument for something like the 7950X today, because it's a current gen chip so there's nothing that outperforms it (in MT) with lower core counts, but once Zen 5/Arrow Lake launches later this year there will be.
There is no such thing as "software that needs X amount of cores". If you say "you need a 16-core 7950X for optimal performance in Cities Skylines 2", but later a 12-core Zen 5 chip outperforms the 7950X in Cities Skylines 2, then core count wasn't the part that actually mattered.
 
There is no such thing as "software that needs X amount of cores". If you say "you need a 16-core 7950X for optimal performance in Cities Skylines 2", but later a 12-core Zen 5 chip outperforms the 7950X in Cities Skylines 2, then core count wasn't the part that actually mattered.

I couldn't really care less about gaming performance. The GPU is more important as long as you aren't using a stupidly old/slow CPU. For productivity, no one would ever pick the 8- or 6-core CPU over the 12/16-core one for anything but price. Plenty of software scales with cores; the 5950X will crush the 5600X in all those apps. Blender, for example, will always favour more cores for rendering, as will compression and decompression, compiling, etc.

Too bad the sole focus of this article was gaming. Almost no one was buying 12 or 16 cores just for gaming, so of course they make less sense if you're only doing frivolous things.
 
The Ryzen 5600X is good, but not enough for high-end builds.

No one is/was using a 5600 for high-end builds... it's a low- to mid-tier CPU and should be used accordingly. As many have posted, however, it was released at $300, which is perhaps why a few people think that?

What this article SHOULD be comparing this CPU to isn't other AMD chips, but INTEL chips... then we could see if it really was a good value...
 
The people who talk about cores are the same people who talk about video RAM: they don't get it, and they never will. It's not that cores or video RAM are unimportant; they're part of the whole. It's about performance, end of story. If a 4C/8T CPU has performance equal to a 6C CPU in all benchmarks, they are equal in performance across those benchmarks. One having more or fewer cores or threads makes no difference.
 
The R7 5800X and R9 5950X weren't competitively priced for the first two years, so it was either the R5 5600X (later the non-X) or the R9 5900X. At least that was the case in the UK.
 