# Ryzen 3000 Series Specs Revealed



## Darren

https://www.tomshardware.com/news/amd-ryzen-3000-series-matisse-specs,38310.html


Looks like the R7s will be pushing 12c/24t and the R5s 8c/16t


Dem boost clocks.


----------



## beers

If 3850x is true then I'm buying that lol


----------



## Darren

They've basically done to themselves what they did to Intel when Ryzen 1 dropped: knocked core counts up a notch across the line. A two-year-old i7 with 4c/8t is likely about to get out-benchmarked by a $99 6c/12t chip. Similarly, my 8c/16t R7 is now well below the spec of an R5. Gr8


----------



## Intel_man

That's great news for us normal bois that don't need 16c/32t. Things just got a lot cheaper.


----------



## Jiniix

I've got my eyes on that 3700X. Even adjusted for local taxes (+25%), it's spot on the price of a 2700X and about $100 cheaper than an i7-9700K.


----------



## zeppelin04

I've been avoiding upgrades for a little while now and doing fine with a 6600K. Ryzen has been interesting to me but these new parts will likely get me to upgrade.  The price for those cores and speed is amazing.


----------



## Darren

zeppelin04 said:


> I've been avoiding upgrades for a little while now and doing fine with a 6600K. Ryzen has been interesting to me but these new parts will likely get me to upgrade.  The price for those cores and speed is amazing.


My 7th gen i5 at work feels like molasses when multitasking compared to my Ryzen machine.


----------



## Cromewell

Darren said:


> My 7th gen i5 at work feels like molasses when multitasking compared to my Ryzen machine.


Does your workplace install a bunch of security agents too? I've got a newish i7 with 16GB of RAM and an SSD, and my work machine feels like a 386. A CPU core is routinely pegged at 100% dealing with scans from the array of shitty software.

The low-end price/clock on these new Ryzens is tempting though. I could probably switch my wife's machine to integrated graphics when the G models come out and she'd never know. She's rocking an ancient HD6850 right now.


----------



## Darren

We run Kaspersky, although we're in the process of transitioning off of it. It's given us a lot of issues, and after 3 ransomware incidents in a year we're looking elsewhere. Turns out they didn't add the ransomware component to the servers and didn't tell or notify us, even though it's included in our package. I think ESET is what my boss said.


----------



## zeppelin04

Darren said:


> My 7th gen i5 at work feels like molasses when multitasking compared to my Ryzen machine.



That's good to know.  My system is starting to feel sluggish when running a few programs.  There are more background processes running than I realized.  I have a feeling part of it is from Windows 10/chrome updates.


----------



## OmniDyne

Darren said:


> My 7th gen i5 at work feels like molasses when multitasking compared to my Ryzen machine.



If I had known the Ryzen 5 2600 was gonna be so legit, I would have waited 2 months and ditched the 8400 build.


----------



## Darren

OmniDyne said:


> If I had known the Ryzen 5 2600 was gonna be so legit, I would have waited 2 months and ditched the 8400 build.


For quite a while I thought I had an actual problem, since I figured there was no way an i5 could be this slow. I'd personally never used one for heavier work before.

Turns out that problem is just Intel.


----------



## Jiniix

As the "personal IT guy" for a lot of IRL friends and online guilds, I've seen a big rise in people complaining about low FPS, all of them on these 3.3-3.7GHz i5s from 4th gen and up.
But there's rarely been an upgrade path I'd consider anywhere near worthwhile, since Ryzen has been "low" frequencies and Intel pricey AF.
I imagine a lot of people will take the plunge for something that competes with an i9-9900K at about 60% of the price, excluding the platform itself, which is also typically cheaper than its Intel counterparts.
Thank you AMD, very cool!


----------



## Darren

My Ryzen 7 doesn't bottleneck my 1080 at all at 1440p, even being first gen. I expect these new chips to game just as well as, if not better than, Intel's.


----------



## _Kyle_

I really should have waited on getting a new mobo, lol. I'll have to pick up another one for R3 unless it'll run on a B350 with a BIOS update.


----------



## Darren

_Kyle_ said:


> I really should have waited on getting a new mobo, lol. I'll have to pick up another one for R3 unless it'll run on a B350 with a BIOS update.


Probably will. I expect my board to support it, although I don't plan to upgrade for at least another couple years.


----------



## Jiniix

Darren said:


> My Ryzen 7 does not bottleneck my 1080 at all at 1440p. Even being a first gen. I expect these new chips to game just as well if not better than Intels.


My friend with 3-4000 hours in CSGO dropped from ~350 to ~250 FPS (on Windows) going from an Intel 6700K to a Ryzen 7 2700X, which he wasn't too happy about. But he can live with it, since his Gentoo compiles about 2-3x faster now.


----------



## Darren

Jiniix said:


> My friend with 3-4000 hours in CSGO dropped from ~350 to ~250 FPS (on Windows) going from an Intel 6700K to a Ryzen 7 2700X, which he wasn't too happy about. But he can live with it, since his Gentoo compiles about 2-3x faster now.


And I'd expect nothing less. Intel > AMD in strictly gaming, particularly high FPS scenarios.

I feel this might change though if performance correlates with these numbers.


----------



## Cromewell

Jiniix said:


> CSGO dropped from ~350 to ~250 FPS


Even at 250fps, who cares? That's still more than high enough to play.


----------



## _Kyle_

Any rumored release dates?

May just save up and build a new rig with the new AMD GPUs that'll be coming out soon. My little brother has been wanting a gaming PC for a while, so I'll gift my current one to him if I do end up building a new PC.


----------



## Darren

Cromewell said:


> Even at 250fps, who cares? That's still more than high enough to play.


Competitive players will swear that the higher frames, even at that level, help with input lag. Meh.


----------



## Intel_man

Higher frame rates draw fast moving images better without blurring/ghosting. 

Imagine moving a folder with the name on the bottom really fast across your desktop and being able to read the name without a slight hint of blur.


----------



## Cromewell

Intel_man said:


> Higher frame rates draw fast moving images better without blurring/ghosting.


You don't see any of the frames over 60, or 144, or whatever your monitor is doing. You're more likely to see the frame closest to your last mouse movement, but that's all.


----------



## OmniDyne

Cromewell said:


> You don't see any of the frames over 60, or 144 or whatever your monitor is doing. You are more likely to see the closest to your last mouse movement but that's all.


----------



## Intel_man

Cromewell said:


> You don't see any of the frames over 60, or 144 or whatever your monitor is doing. You are more likely to see the closest to your last mouse movement but that's all.


I think the high-end monitors currently go up to 240Hz? That being said, even when capped with V-Sync or, hopefully, adaptive sync (FreeSync and G-Sync), a CPU that can only produce around 10 fps more than the refresh rate will dip below 240Hz more often than a CPU capable of producing 100 fps more than the refresh rate. At least that's the idea. 
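Rough toy model of that overhead idea, with made-up jitter numbers (nothing measured, just an illustration):

```python
import random

def missed_refresh_fraction(mean_fps: float, hz: float, jitter: float = 0.3,
                            frames: int = 100_000, seed: int = 1) -> float:
    """Fraction of frames that take longer than one refresh interval,
    with frame times jittering +/- `jitter` around their mean (toy model)."""
    rng = random.Random(seed)
    mean_ft = 1.0 / mean_fps   # mean frame time, seconds
    budget = 1.0 / hz          # one refresh interval, seconds
    late = sum(
        1 for _ in range(frames)
        if mean_ft * rng.uniform(1 - jitter, 1 + jitter) > budget
    )
    return late / frames

# ~10 fps of headroom vs ~100 fps of headroom on a 240Hz panel
print(missed_refresh_fraction(250, 240))   # dips are common
print(missed_refresh_fraction(340, 240))   # dips basically vanish
```

With those made-up numbers, the 250 fps machine blows its 240Hz budget on roughly 4 in 10 frames, while the 340 fps machine never does.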

Those who swear by letting the fps go unlimited and not capped to the refresh rate of the monitor are not well informed.


----------



## Cromewell

Intel_man said:


> I think currently the high end monitors go up to 240hz? That being said, even when capped with V-Sync or hopefully adaptive sync (Freesync and Gsync), when the CPU is capable of producing an overhead of around 10 fps more than the refresh rate, the dips below the 240hz can be more common than say a CPU capable of producing 100fps more than the refresh rate. At least that's the idea.
> 
> Those who swear by letting the fps go unlimited and not capped to the refresh rate of the monitor are not well informed.


Sure, if your monitor does 240, I can see wanting some amount in excess of that. All I was saying is that if you're running 600,000 fps, all you have is a lot of frames getting drawn and overwritten without ever being displayed. Other than potentially having a newer frame in the buffer to send to the monitor, there's little point in running that high.


OmniDyne said:


> ...


Don't worry, I know how refresh works. My point is if the monitor isn't drawing the frame, you will never see it.


----------



## Jiniix

Cromewell said:


> Sure, if your monitor does 240, I can see wanting some amount in excess of that. All I was saying is that if you are running 600000fps all you have is a lot of frames getting drawn and overwritten without ever having them displayed so other than potentially having a newer frame in the buffer to send to the monitor there is little point to running that high.
> 
> Don't worry, I know how refresh works. My point is if the monitor isn't drawing the frame, you will never see it.


He has a 144 or 165Hz display, but more importantly, look up frame times. You can have 60 FPS but still be laggy on a 60Hz display.

I have not heard one actually good CS player say anything below 300 FPS is acceptable; afaik it's something with the engine.

I can personally attest that 90 FPS feels like 30 in CSGO on a 60Hz display.


----------



## OmniDyne

Jiniix said:


> He has a 144 or 165 hz, but more importantly look up Frame Times. You can have 60 FPS but still be laggy on a 60hz display.
> 
> I have not heard one actually good CS player say anything below 300 FPS is acceptable, afaik it's something with the engine.
> 
> I can personally attest that 90 FPS feels like 30 in CSGO, on a 60hz display.



Have you messed with the buffering settings? Triple buffering and reduced buffering?

It's worked wonders for me in CS GO and Overwatch playing at 60Hz. My buddy plays CS GO at 144Hz without a problem.

Frametime consistency should be the same in every game, in the sense that every title should deliver the same frametime at every frame, low or high.

If you're having consistency issues, something is amiss, and I doubt it's the game engine, but I could be wrong.


----------



## Darren

You can't rationalize technology with CSGO players in relation to their framerate. I've tried. Don't waste your breath.

That's not a stab at @Jiniix btw, just speaking generally.


----------



## Jiniix

OmniDyne said:


> Have you messed with the buffering settings? Triple buffering and reduced buffering?
> 
> It's worked wonders for me in CS GO and Overwatch playing at 60Hz. My buddy plays CS GO at 144hz without a problem.
> 
> Frametime consistency is the same in every game, as in every title should have the same frametimes at every frame, low or high.
> 
> If you're having consistency issues something is amiss and I doubt it's the game engine, but I could be wrong.


On my main PC I've played with everything at 60Hz from 1024x768 to 2560x1440, highest and lowest settings, never with V-Sync though. Always 250-450 FPS and I've never had an issue. I'm turbocasual though.
On my laptop however, an i7-720QM with an HD5650M, it feels laggy and choppy with everything on the lowest possible settings, even though it reports 90 FPS. Other games like WoW would run just dandy at 45+ FPS.
As far as I know, which may not be much, frame time consistency will differ from game to game, depending on the engine and other variables.



Darren said:


> You can't rationalize technology with CSGO players in relation to their framerate. I've tried. Don't waste your breath.
> 
> That's not a stab at @Jiniix btw, just speaking generally.


I don't disagree at all, but I'm basing this mostly on information from my 3000+ hour CSGO friend, who's also a server admin for an ISP and an avid Gentoo enthusiast. 

It has something to do with the tick rates of the server and how it communicates with the client. That's pretty much the extent of what I know.


----------



## OmniDyne

Jiniix said:


> i7-720QM with an HD5650M



A 10 year old mobile processor and 10 year old integrated graphics. It's not surprising you're having issues during gameplay.



Jiniix said:


> everything on the lowest possible settings, even though it reports 90 FPS



Yes, this is likely an optimization issue because the hardware is so old. Turning down graphical settings isn't going to fix frametime inconsistency. Driver optimization will affect consistency, and I doubt NVIDIA is optimizing the 720QM for current games.



Jiniix said:


> like WoW would run just dandy at 45+ FPS



WoW is probably a far more optimized title. They have the resources to cater to older platforms.



Jiniix said:


> As far as I know, which may not be much, frame time consistency will differ from game to game, depending on the engine and other variations.



Frametime consistency will differ depending on optimization, hardware, hardware design, age of hardware, processor frequency, etc., etc., etc.

Ideally, you want as consistent frametimes delivered as possible in every game depending on the FPS:

16.7 ms for 60 FPS, 8.3 ms for 120 FPS, 6.9 ms for 144 FPS, 4.2 ms for 240 FPS.

You can have consistently higher or lower frametimes, but constant jumps of 8 ms or more in either direction are generally perceivable even at super high framerates.
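Those targets are just 1000 ms divided by the framerate; a throwaway Python check:

```python
def frametime_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds at a given framerate."""
    return 1000.0 / fps

for fps in (60, 120, 144, 240):
    print(f"{fps:>3} FPS -> {frametime_ms(fps):.1f} ms per frame")
```

which prints 16.7, 8.3, 6.9 and 4.2 ms respectively.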


----------



## Jiniix

OmniDyne said:


> A 10 year old mobile processor and 10 year old integrated graphics. It's not surprising you're having issues during gameplay.
> 
> 
> 
> Yes, this is likely an optimization issue because the hardware is so old. Turning down graphical settings isn't going to fix frametime inconsistency. Driver optimization will affect consistency, and I doubt NVIDIA is optimizing the 720QM for current games.
> 
> 
> 
> WoW is probably a far more optimized title. They have the resources to cater to older platforms.
> 
> 
> 
> Frametime consistency will differ depending on optimization, hardware, hardware design, age of hardware, processor frequency, etc., etc., etc.
> 
> Ideally, you want as consistent frametimes delivered as possible in every game depending on the FPS:
> 
> 16.7 ms for 60 FPS, 8.3 ms for 120 FPS, 6.9 for 144, 4.2 for 240hz.
> 
> You can have higher or lower frametimes consistently, but constant jumps up or down 8ms and above/ below is generally perceivable even at super high framerates.




i7-720QM: Q3'09
HD5650M: Q3'10
The WoW I'm playing: 2005 and 2007.
CSGO: August 21, 2012

I doubt CSGO is so un-'optimized' for a 2.8GHz Intel CPU and an AMD dGPU that's two years older that 30 FPS above the screen's refresh rate becomes unenjoyable. Not sure where NVIDIA comes into the picture either.

And exactly on the topic of frame times, which I mentioned earlier: a higher-frequency, stronger CPU usually produces more stable frame times and, coincidentally, higher FPS. So as for why 300+ FPS is recommended for CSGO, it's usually a product of stable frame times plus the headroom to absorb an FPS drop while still staying well above the refresh rate of any monitor currently on the market. 

On the topic of AMD again, I can't wait to get my hands on the 3700X and test it at equal GHz in a 1:1 comparison against my 8700K.


----------



## OmniDyne

Jiniix said:


> i7-720QM: Q3'09
> HD5650M: Q3'10
> The WoW I'm playing: 2005 and 2007.
> CSGO: August 21, 2012



Yes, basically 10 years old. The age of the game has nothing to do with optimization. They constantly release updates. GTA and numerous older games had to release patches to update for Ryzen, for example.

CS GO has had endless amounts of patches, graphical and otherwise. It makes total sense that your laptop is underperforming; it's not in their interest to optimize for such old hardware. Not only that, the performance of the i7-720QM is abysmal by this point, especially as a mobile chip.



Jiniix said:


> I doubt CSGO not being 'optimized' for a 2.8GHz Intel CPU and AMD dGPU that's two years older, to the point where 30 FPS above the screen Hz is unenjoyable. Not sure where NVIDIA comes in to the picture either.



It's absolutely an optimization issue.

Read any NVIDIA or AMD driver update; they have to release drivers to optimize for games. In fact, NVIDIA recently released a driver update that lowered performance.



Jiniix said:


> And exactly on the topic of frame times, which I mentioned earlier, having a high frequency stronger CPU usually produces more stable frame times and coincidentally higher FPS. So in regards of why 300+ FPS is recommended for CSGO



Not really, but kind of. Ryzen doesn't differ by any meaningful amount in terms of frametimes against Coffee Lake:

https://www.gamersnexus.net/hwreviews/3407-intel-i5-9600k-cpu-review-vs-2700-2600-8700k

https://www.gamersnexus.net/hwreviews/3421-intel-i7-9700k-review-benchmark-vs-8700k-and-more

In fact, even high frequency (5GHz+ i5 8th gen processors) are having serious framerate issues in some titles. Ryzen can even outperform Coffee Lake in frametimes at much lower frequencies:

https://techreport.com/review/31546/where-minimum-fps-figures-mislead-frame-time-analysis-shines

300+ FPS doesn't reduce frametimes by a meaningful amount. In fact, trying to constantly push for such high FPS can negatively affect frametimes.


----------



## Jiniix

Except for one graph, with a heavily overclocked i5, Intel is winning all those frame time graphs. I tried finding graphs from GN with the same CPUs across different games, but gave up since I was (and am) at work and it took too long. 
As for updating drivers and optimization, sure, they keep working on them, but they don't throw away the old optimizations. They can easily detect which CPU/GPU you have and apply the correct optimizations for that processor/GPU. They don't throw away the old; they build the new alongside it.


----------



## OmniDyne

Jiniix said:


> Except for one graph, with a heavily overclocked i5, Intel is winning all those frame time graphs.



Hence why I used the word "can". Any Ryzen 5 or Ryzen 7 processor can and will provide better frametimes in certain titles against any i5 processor due to thread limitations.



Jiniix said:


> They can easily detect which CPU/GPU you have and apply the correct optimizations for that processor/GPU



That's not the way it works. There's no way AMD is optimizing drivers for the HD5650M.






Game developers can optimize their games for older hardware. Blizzard has the resources (money and manpower) to do this in WoW. I'm not sure how much you've played Counter Strike, but with how janky the entire UI is, it's quite obvious that CS GO is not a super optimized title.

This is further evidenced by Doom 2016 and the use of Vulkan API.


----------



## Jiniix

So I just got a reply back from my friend, a former low-tier pro, and the simple answer is input lag from your mouse. It's related to frame times: more FPS = more up-to-date frames. We're talking milliseconds, which is why a casual like myself won't notice, but also why it matters for professionals.


----------



## OmniDyne

Jiniix said:


> So I just got a reply back from my friend, a former low-tier pro, and the simple response is input lag from your mouse. It's related to frame times, more FPS = more up-to-date frames. We're talking milliseconds, which is why a casual won't notice (like myself) but also why it matters for professionals.



Frametime does not necessarily correlate with input lag.

Higher framerates help because you're getting more updates on screen.

If two computers had identical specifications and all else were equal, with one producing 60 frames and one producing 300, input lag would be identical. Your friend is simply able to react more quickly to the faster framerate.

This is further evidenced by the fact that professional players enable "reduce buffering" in the game settings, even at 300+ FPS; this significantly reduces input lag.

If higher framerates equalled lower input lag, reduce buffering would not exist.


----------



## Intel_man

I haven't seen it myself, but I'd be interested to see how much of an impact the CSGO server's tick rate has on FPS games.

Like... people swear by stupidly high frame rates, when the public/competitive servers only have a tick rate of 64. The fluidity of the game surely can only matter so much up to a certain point before it just becomes a placebo effect.


But... I digress. This thread is to talk about the upcoming Ryzen cpus.


----------



## Jiniix

In a perfect world with perfectly synced frames, 60 FPS on a 60Hz display would have no input lag. But that's not the world we live in. More frames mean a denser pool of frames to pick from, so a more up-to-date frame gets chosen for each refresh. That's pretty objective, however minimal it may seem.
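To put rough numbers on it (my own idealized model, assuming the game and display aren't synced): the frame shown at each refresh is on average about half a frame interval old, so its staleness shrinks as FPS climbs:

```python
def mean_frame_staleness_ms(fps: float) -> float:
    """Average age of the newest finished frame at an arbitrary refresh
    instant, assuming frame completions land uniformly at random relative
    to the refresh (no V-Sync): half a frame interval on average."""
    return (1000.0 / fps) / 2

for fps in (60, 144, 250, 350):
    print(f"{fps:>3} FPS -> shown frame is ~{mean_frame_staleness_ms(fps):.1f} ms old")
```

Going from 250 to 350 FPS only shaves off about half a millisecond on average; milliseconds indeed.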
And indeed Intel_man, let's talk about the AMD CPUs


----------



## OmniDyne

Jiniix said:


> In a perfect world with perfectly synced frames, 60 FPS on a 60Hz display would have no input lag. But that's not the world we live in. More frames, more dense frame array and a more updated frame will be chosen for each Hz. That's pretty objective, however minimal it may seem.
> And indeed Intel_man, let's talk about the AMD CPUs



Right, so more frames delivered can provide a smoother experience. It could also be argued that higher framerates provide a competitive edge. But again, more frames delivered does not equal reduced input lag.

So on the topic of Ryzen processors, even though your friend went with the 2700X and lost deliverable frames, frametimes don't vary in any meaningful way, and the input lag would be virtually the same.


----------



## Jiniix

He noticed a difference


----------



## Darren

Alright boyos, moving on.


----------



## diypartsjoy

I am not actually very excited about the Ryzen 3000. I am not going to buy another CPU until DDR5 comes out anyway.


----------



## millz

diypartsjoy said:


> I am not actually very excited about the Ryzen 3000. I am not going to buy another CPU until DDR5 comes out anyway.



I, on the other hand, am so freaking excited to finally upgrade. I came to this site back in 2011-2012 and asked for advice on building a gaming rig. 

I'm still using the i5 2500K, which is sorely in need of an upgrade, and it's going to be Ryzen 3000 for sure.


----------



## C4C

I'm giddy.. Time to start saving for my next PC build. This i5-6600K is doing alright but it'd be nice to upgrade...


----------



## Darren

C4C said:


> I'm giddy.. Time to start saving for my next PC build. This i5-6600K is doing alright but it'd be nice to upgrade...


But you only need 4 cores for gaming! /s


----------



## zeppelin04

Darren said:


> But you only need 4 cores for gaming! /s



We are planning on upgrading from the 6700k.  My girlfriend wants to upgrade but has requested we wait until summer for the 3000 series. I'm not going to fight with her about it. Who am I to argue if she wants 8 cores and more speed.


----------



## beers

zeppelin04 said:


> Who am I to argue if she wants 8 cores and more speed.


Giggity.


----------



## C4C

Darren said:


> But you only need 4 cores for gaming! /s



Hehe... Starting to get back into video editing. My buddy wants me to create visuals for his DJ sets. Gonna upgrade to 16GB of RAM here soon.. also need another HDD for games


----------



## zeppelin04

I just saw a story that Intel will be releasing yet another Skylake refresh this year. Ryzen 3000 is looking that much better.  Can't wait.  Now I just need to start saving.


----------



## Darren

zeppelin04 said:


> I just saw a story that Intel will be releasing yet another Skylake refresh this year. Ryzen 3000 is looking that much better.  Can't wait.  Now I just need to start saving.


They still seem to have enough hive mind customers to limp along until their engineering department figures out a new die size.


----------



## zeppelin04

Darren said:


> They still seem to have enough hive mind customers to limp along until their engineering department figures out a new die size.



I have been happy with the Intel products I have used, but it's tough right now. That 9900K is way too expensive. I have a habit of buying a higher tier than needed, but I am not going to pay $600 for it.


----------



## Darren

Looks like we'll see an unveil at Computex in June, likely followed by a release soon after, since that's been their pattern for the previous two generations.

https://www.techradar.com/news/amd-to-unleash-ryzen-3000-cpus-and-navi-graphics-cards-at-computex


----------



## zeppelin04

Seeing those 4.5GHz rumors. Can't wait to get a cheap processor that can likely overclock close to those Intel chips. Ready to try AMD for once.


----------



## Darren

With their keynote today we've learned a few things.



https://www.anandtech.com/show/1440...-cores-for-499-up-to-46-ghz-pcie-40-coming-77

Them boost clocks and with the IPC gains of 7nm I'm really interested to see some hard benchmarks.


----------



## OmniDyne

Interesting perspective, especially when he starts talking about motherboards.


----------



## Darren

OmniDyne said:


> Interesting perspective, especially when he starts talking about motherboards.


Ya know I've been telling myself to buy AMD stock since it was like 3 bucks a share.

Still haven't...


----------



## OmniDyne

Darren said:


> Ya know I've been telling myself to buy AMD stock since it was like 3 bucks a share.
> 
> Still haven't...



Probably not a bad time. Manufacturers are investing heavily in AMD products now; Linus said one motherboard manufacturer alone is producing 30 different boards for Ryzen 3000. My buddy has been holding onto AMD stock for a long time, and I think his time has finally come to reap the benefits, ha.

Intel could be put in a precarious position, especially considering the introduction of ARM processors in laptops.


----------



## beers

I had like 10k @ $4 but sold it like a dumbass. 

Where's that 16-core at? Excited to see some real benches though.


----------



## Darren

beers said:


> I had like 10k @ $4 but sold it like a dumbass
> 
> Where's that 16 core at?  Excited to see some real benches though


I haven't been keeping up as well as I used to, but weren't there rumors of Threadripper being discontinued or something like that? 

That 3700X is tempting. Completely unnecessary, but I want it.


----------



## Jiniix

Has anyone seen any technical difference between the 3700X and 3800X? 
A difference of 40W TDP surely can't be down to binning and a few hundred MHz. Granted, TDP is rather useless as a measurement, but the 12-core is also 105W with a higher boost and almost the same base clock.


----------



## OmniDyne

Jiniix said:


> Has anyone seen any technical difference between the 3700X and 3800X?
> A difference of 40W TDP can't surely be down to binning and a few hundres MHz. Granted TDP is rather useless as a measurement, but the 12core is also 105W with a higher boost and almost same base.



You'll notice that the 6 core 3600X has a higher TDP rating (95W) than the 8 core 3700X (65W). As Ian mentioned in his Anandtech article:


> This CPU has a TDP of 105W, which for AMD processors is usually a good measure of all-core power consumption




I think the variations in TDP are absolutely related to binning.


----------



## Darren

OmniDyne said:


> You'll notice that the 6 core 3600X has a higher TDP rating (95W) than the 8 core 3700X (65W). As Ian mentioned in his Anandtech article:
> 
> 
> 
> I think the variations in TDP are absolutely related to binning.



Same as the 1700 supposedly being 65W, but when I clocked it at 4.0GHz it pulled as much as a stock 1800X.


----------



## Jiniix

TDP has been rather useless since 2006, but exactly as you point out, a six-core at 3.8/4.4GHz supposedly has pretty much the same TDP as a 12-core at 3.8/4.6GHz.
I hate TDP with a passion, and wish we could make something like their ACP a required industry standard: power consumption based on a specific set of benchmarks.


----------



## zeppelin04

I'm happy with the processors themselves, but the costs I'm hearing for motherboards have me a little worried. It might be a costly upgrade once I factor in new RAM.

In reference to AMD stock: I bought in at $1.80 and sold around $6.00. Wish I'd held that a little longer.


----------



## Shlouski

These new CPUs are looking good, thank goodness. But before everyone starts singing about how great AMD is, just remember they are the reason Intel has had the best part of a decade-long advantage to exploit, and a company's main priority is to make money; you don't have to like it.
Many seem to want Intel to suffer and the AMD underdog to rise up and leave Intel in the dust, but that's the last thing the consumer should want. We don't need a repeat of the last decade with AMD superiority instead; we need a competitive market to push technology forward and keep prices in check.

May AMD, Intel and NVIDIA prosper, so we consumers may reap the rewards.


----------



## Darren

Jiniix said:


> TDP has been rather useless since 2006, but exactly as you also point out, a six-core at 3.8/4.4GHz is supposedly pretty much the same TDP as a 12-core with 3.8/4.6Ghz
> I hate TDP with a passion, and wish we could make something like their ACP an industry standard and required. Power consumption based on a specific set of benchmarks.


TDP is Thermal Design Power, which I understood to mean how much heat the chip puts off under full load, measured in watts, and not an actual power consumption figure. I've used HWiNFO to measure actual power draw at the CPU socket (although I'm unsure how accurate it is): at stock clocks my consumption is usually under 65 watts, but at 4.0GHz I regularly see it north of 65, and over 110-ish at full load. I remember my 8320 would sometimes pull over 200 watts when I had that bad boy cranked all the way up.  

Again, I'm not sure how accurate those measurements are, but it just shows that TDP is more a general guideline than an actual rule/measurement. 

I'm hoping we see more laptops this year too.


----------



## OmniDyne

Jiniix said:


> TDP has been rather useless since 2006, but exactly as you also point out, a six-core at 3.8/4.4GHz is supposedly pretty much the same TDP as a 12-core with 3.8/4.6Ghz
> I hate TDP with a passion, and wish we could make something like their ACP an industry standard and required. Power consumption based on a specific set of benchmarks.





Darren said:


> TDP is Thermal Design Power which I understood to mean how much heat it will put off under full load measured in watts, and not an actual power consumption figure. I've used HWInfo to measure actual power draw on the CPU socket (although unsure how accurate it is) and I'll see my actual consumption usually under 65 watts when at stock clocks but at 4.0GHz I see it north of 65 regularly and over 110ish at full load. I remember my 8320 would sometimes pull over 200 watts when I had that bad boy cranked all the way up.
> 
> Again not sure how accurate those measures are but just shows that TDP is more like a general guideline than an actual rule/measurement.
> 
> I'm hoping we see more laptops this year too.




I wouldn't call TDP useless, but the way Intel uses TDP is definitely, well, dishonest. It's why the i7-8700 thermal throttles under the stock cooler.



			
Ian Cutress - Anandtech said:

> But TDP, in its strictest sense, relates to the ability of the cooler to dissipate heat. TDP is the minimum capacity of the CPU cooler required to get that guaranteed level of performance. Some energy dissipation also occurs through the socket and motherboard, which means that technically the cooler rating can be lower than the TDP, but in most circles TDP and power consumption are used to mean the same thing: how much power a CPU draws under load.





> The value of TDP, or thermal design power, is not a measure of power consumption. It is technically a measure of cooler performance, and a cooler needs to be rated at the TDP level in order to perform regular functions. Actual power consumption should technically be higher – thermal losses from the processor into the socket and from the socket into the motherboard also contribute to cooling, but are not involved in the TDP number. However, for most use cases, TDP and power consumption are used interchangeably, as their differences are minor.





> Over the last decade, while the use of the term TDP has not changed much, the way that its processors use a power budget has. The recent advent of six-core and eight-core consumer processors going north of 4.0 GHz means that we are seeing processors, with a heavy workload, go beyond that TDP value. In the past, we would see quad-core processors have a rating of 95W but only use 50W, even at full load with turbo applied. As we add on the cores, without changing the TDP on the box, something has to give.





> For the last however many years, this is the definition of TDP that Intel has used. For any given processor, Intel will guarantee both a rated frequency to run at (known as the base frequency) for a given power, which is the rated TDP. This means that a processor like the 65W Core i7-8700, which has a base frequency of 3.2 GHz and a turbo of 4.7 GHz, is only guaranteed to be at or below 65W when the processor is running at 3.2 GHz. Intel does not guarantee any level of performance above this 3.2 GHz / 65W value.




AMD uses a different formula, and it scales much better and more accurately with power consumption. That doesn't necessarily apply to overclocking, obviously.
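The Anandtech article quoted above describes Intel's scheme in terms of two power limits: PL1 (equal to the rated TDP) and a higher short-term PL2 that applies for roughly a time window tau after load begins. Real silicon tracks an exponentially weighted average rather than a hard cutover, and the PL2/tau values below are common motherboard defaults used here purely as assumptions, but a step-function sketch shows the shape of it:

```python
# Simplified sketch of Intel's PL1/PL2/tau turbo budget. The real
# mechanism uses an exponentially weighted moving average of power;
# this step model and the PL2/tau defaults are illustrative only.

def power_limit(elapsed_s, pl1=65.0, pl2=81.25, tau=28.0):
    """Allowed sustained package power: PL2 for roughly tau seconds
    after a load starts, then clamped down to PL1 (= rated TDP)."""
    return pl2 if elapsed_s < tau else pl1

for t in (0, 10, 30, 120):
    print(f"t={t:3d}s -> limit {power_limit(t):.2f} W")
```

This is why a 65 W chip can legitimately draw 80+ W for the first stretch of a benchmark and still meet Intel's definition.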


----------



## Shlouski

Darren said:


> TDP is Thermal Design Power which I understood to mean how much heat it will put off under full load measured in watts, and not an actual power consumption figure.



As far as I understand it, power consumption wattage and heat output wattage can't be different values; they go hand in hand. Anything that uses 200 watts of power will be generating 200 watts of heat, irrespective of what it is, so you can't, for example, have a CPU using 65 watts of power while generating 100 watts of heat. A 100W LED will produce the same amount of heat (100 watts) as a 100W incandescent bulb; the difference is that the LED is more efficient at producing light, so you could use a lower-wattage LED to produce the same amount of light as the 100W incandescent bulb.

TDP is the maximum wattage a cooling solution should be able to dissipate, so a cooling solution should be built to handle the maximum wattage expected in normal operation. The problem lies in how TDP values are determined by manufacturers: chip manufacturers will likely underestimate heat output, and cooling solution manufacturers will likely overestimate a product's cooling performance.
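Since at steady state electrical power in equals heat out, the check being described reduces to simple arithmetic: the cooler's rated dissipation either covers the package power actually drawn, or heat accumulates. A toy sketch, reusing the hypothetical 50 W/70 W figures from this thread:

```python
# At steady state, power consumed equals heat produced, so a cooler's
# rated dissipation must cover the package power actually drawn.
# The wattages below are hypothetical examples, not vendor specs.

def cooling_headroom(cooler_rating_w, measured_draw_w):
    """Positive means spare cooling capacity; negative means heat
    accumulates and the CPU will throttle."""
    return cooler_rating_w - measured_draw_w

print(cooling_headroom(95, 50))  # 45 W to spare (the old quad-core case)
print(cooling_headroom(50, 70))  # -20 W: the underestimated-TDP case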


----------



## Intel_man

Intel's TDP calculations: https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo
AMD's TDP calculations are covered in this article: https://www.anandtech.com/show/13124/the-amd-threadripper-2990wx-and-2950x-review/12

You guys should probably read up on how TDP is established... before making statements.


----------



## OmniDyne

Intel_man said:


> You guys should probably read up on how TDP is established...



You mean the articles I've already directly quoted from?



Intel_man said:


> before making statements.



Wow, ha. The pot calling the kettle black on that one. Correcting misinformation is one thing, but just copying and pasting links and then hypocritically saying "before making statements" does nothing but cause bitter dissension.



Shlouski said:


> so you can't for example have a cpu using 65 watts of power generating 100 watts of heat.



Yes, but:



			
Ian - Anandtech said:

> In the past, we would see quad-core processors have a rating of 95W but only use 50W, even at full load with turbo applied.


----------



## Intel_man

OmniDyne said:


> You mean the articles I've already directly quoted from?
> 
> Wow ha. The pot calling the kettle black on that one. Correcting misinformation is one thing, but just copying and pasting links and then behaving hypocritically by stating "before making statements" does nothing but cause bitter dissension.


Stop getting your panties in a twist. My post wasn't directed at you.


----------



## Shlouski

OmniDyne said:


> Yes, but



I mean that irrespective of a CPU's power rating, if a CPU was measured to be using 65 watts, then it wouldn't be producing 100 watts of heat; if a CPU with a 95W TDP was only using 50 watts, then it would only be generating 50 watts of heat.



Intel_man said:


> You guys should probably read up on how TDP is established... before making statements



Dunno if this is directed at me, but I only stated what the goal of a TDP rating is (supplying enough cooling to dissipate the expected wattage of a chip), not how manufacturers calculate their TDPs. In my opinion a chip manufacturer can use whatever calculations they want as long as they get it right. It's fine if they want to overestimate a TDP and build a cooling solution for that rating, which would be good practice since no two CPUs are the same and it allows a margin of error; but it becomes a problem if a TDP is underestimated and not enough cooling is provided. If my 50W-rated CPU is using 70W and my cooling solution can barely handle 50W, then the printed 50W TDP on the box ain't going to magically dissipate the extra 20W of heat.


----------



## OmniDyne

Intel_man said:


> Stop getting your panties in a twist. My post wasn't directed at you.



I stopped wearing panties years ago. I'm a thong guy now. 



Shlouski said:


> it's fine if they want to overestimate a tdp and build a cooling solution for that rating , which would be good practise as no two cpu's are the same so this would allow a margin of error, but it becomes a problem if a tdp is underestimate and then not enough cooling is provided.



And that is why Ryzen TDP scales well with power consumption. AMD assigns TDP values based on stock cooler performance, which is more than adequate under stock usage. However, if you apply a stronger cooling solution to a Ryzen chip, you actually lower the TDP rating, according to AMD's formula.
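The Threadripper article linked earlier in this thread gives AMD's formula as TDP (W) = (tCase - tAmbient) / theta_ca, where theta_ca is the cooler's case-to-ambient thermal resistance. A minimal sketch; the temperature and resistance inputs below are illustrative values for a stock-class cooler, not official AMD figures:

```python
# AMD's published TDP formula, per the Anandtech Threadripper article:
#   TDP (W) = (tCase - tAmbient) / theta_ca
# theta_ca is the heatsink's case-to-ambient thermal resistance (C/W).
# The inputs below are illustrative, not official AMD figures.

def amd_tdp(t_case_c, t_ambient_c, theta_ca):
    """Compute AMD-style TDP from case temp, ambient temp, and cooler
    thermal resistance."""
    return (t_case_c - t_ambient_c) / theta_ca

# Hypothetical stock-class cooler: ~105 W with these inputs.
print(f"{amd_tdp(61.8, 42.0, 0.189):.0f} W")
```

Note that theta_ca appears directly in the formula, so the same piece of silicon gets a different computed TDP depending on which cooler the rating assumes.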


----------



## Jiniix

Intel's definition of TDP: _Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at *Base Frequency* with all cores active under an Intel-defined, high-complexity workload._

That gives us these numbers: 

i7-7800X, 6C/12T - 140W TDP (197W according to Techspot)

i9-7920X, 12C/24T - 140W TDP (7900X: 259W same Techspot)

i5-9600K, 6C/6T - 95W TDP (119W according to TomsHW)

i9-9900K, 8C/16T - 95W TDP (130W according to Arstech)
I tried finding reviews that said how they came to their numbers, like running Cinebench/Prime95 for a long time, etc. Case in point: Intel can suck a bag of dicks.

AMD's definition of TDP: 
_Thermal Design Power. The thermal design power is the maximum power a processor can draw for a thermally significant period while running commercially useful software_

AMD's proposed standard, called ACP:
_According to AMD documentation, ACP (Average CPU Power) is the average (geometric mean) power a processor was measured to dissipate while running a collection of benchmarks (Transaction Processing Performance Council TPC Benchmark-C, SPEC CPU2006, SPECjbb2005, and STREAM)._
AMD's TDP is still very vague, but to their credit they're the only ones I've seen release a 200W+ TDP CPU. 

TL;DR: TDP is a waste of bits on my screen when I'm looking for a new CPU. But we could _easily_ have a defined standard of max load that would be the same for both vendors, in a perfect world.
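An ACP-style figure as described above is just the geometric mean of measured power across a fixed benchmark suite. A minimal sketch; the per-benchmark wattages below are made-up sample readings, not published numbers:

```python
# ACP-style metric: geometric mean of power measured across a fixed
# benchmark suite. The readings below are made-up samples, not data.
from math import prod

def acp(readings_w):
    """Geometric mean of per-benchmark power measurements (watts)."""
    return prod(readings_w) ** (1 / len(readings_w))

samples = {"TPC-C": 92.0, "SPECcpu": 88.0, "SPECjbb": 95.0, "STREAM": 101.0}
print(f"ACP: {acp(list(samples.values())):.1f} W")
```

A fixed, published suite like this is exactly the kind of vendor-neutral "defined standard of max load" the post is asking for.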


----------



## OmniDyne

https://www.anandtech.com/show/14516/amd-16-core-ryzen-9-3950x-up-to-4-7-ghz-105w-coming-september


----------



## Darren

OmniDyne said:


> https://www.anandtech.com/show/14516/amd-16-core-ryzen-9-3950x-up-to-4-7-ghz-105w-coming-september


Beat me to it.

Not trying to count my chickens before they hatch, but I have a feeling they're finally gonna jump Intel in gaming performance.


----------



## OmniDyne

Yeah I think you're right and I think it's time to buy some shares ha. The price has jumped substantially since last year.


----------



## Darren

OmniDyne said:


> Yeah I think you're right and I think it's time to buy some shares ha. The price has jumped substantially since last year.


Been saying it for years. Still haven't; at this point I almost just don't want to out of principle, knowing how much I could have made a couple years ago, pre-Zen.


----------



## Darren

Just gonna drop this here. I usually avoid this site, but those are pretty impressive numbers and they look legitimate based on the graphs.

https://wccftech.com/amd-ryzen-5-3600-zen-2-7nm-cpu-review-published-online/


Single core perf of a Ryzen 5 is knocking on the door of an i9-9900K, and not terribly behind previous-gen Ryzen 7s.


----------



## OmniDyne

Darren said:


> Just gonna drop this here. Usually avoid this site but those are pretty impressive numbers and look legitimate based on the graphs.
> 
> https://wccftech.com/amd-ryzen-5-3600-zen-2-7nm-cpu-review-published-online/
> 
> 
> Single core perf of a Ryzen 5 is knocking on the door of an i9 9900K.
> 
> 
> 
> 
> 
> 
> 
> 
> And not terribly behind previous gen Ryzen 7's.



I wonder if those 9900K scores are from before the security vulnerability mitigations.


----------



## OmniDyne

Close enough, and yet somehow also better ha.

https://www.anandtech.com/show/14605/the-and-ryzen-3700x-3900x-review-raising-the-bar

3600 review:

https://www.gamersnexus.net/hwreviews/3489-amd-ryzen-5-3600-cpu-review-benchmarks-vs-intel

Gamers Nexus gets the 3600 gaming scores closer to the 9600K and 8700K somehow.


----------

