It's vaguely amusing how they go for the same stock approaches every time. When a product becomes noncompetitive, either release a "special" bin that blows through acceptable power/stability limits, or ram a server CPU down the stack into "consumer" territory. The EE has a special place in my heart because they panicked so hard they did both of those things.
All true, but AMD does the same thing, as does Nvidia when it comes to GPUs. Remember the GTX 480? Or the FX-9590. :) If you mess up, that is mostly your only option to have something. Some people don't care much about efficiency and just want the fastest at a certain thing. They probably did the ROI numbers and it came out positive for them.
But...there are still quite a few tests where there is an orange bar at the top :). One thing I'm a bit curious about on the AMD side though is that there are several cases where the 3700X beats the 3900X. 3900X is both more cores and higher clocked so shouldn't it win everywhere?
Also for those of us where price is an object, that 3700X looks pretty darn good against most everything else. :)
It’s possible it’s related to the quality of the cores. With the Ryzen chips not all cores have the same limits. So in theory you could have a chip with fewer total cores but more of them at a higher spec.
Little need for any 'panic' as, gaming-wise at least, all AMD has managed is to tie the 8700K.....; everything 9700K and higher in the product stack remains virtually unopposed.
I'm still trying to figure out how a higher clocked Intel CPU which processes data slower than a lower-clocked AMD cpu is a "clear advantage" for Intel....;) Perhaps Dr. Cutress might enlighten me...?....;)
Can't tell if playing stupid or serious. AMD's IPC advantage is about 10%; Intel's clock speed advantage is more than 10%.
On Intel chips, all the cores are able to boost to the same levels. AMD chips now have varying-spec cores, and only the best 1-2 hit the advertised speeds, and only when just a few threads are running. If you are doing heavy multitasking, most of the AMD cores will be dramatically slower than those of the 9900KS.
That said, the AMD chips still have a number of other advantages, such as more PCIe bandwidth and no tiny DMI bottleneck. Lower cost, lower power use and generally more cores.
Right now it really comes down to what you plan to use it for. There is little doubt AMD is in a better spot right now, but it remains to be seen if they can hold it past 2021 as Intel transitions to 7nm and MCM.
" The IPC advantage is 10%, the clock speed advantage is >10%." and WHY do you think intel has the lead, even barely, in some cases. because of the clock speed advantage. Intel's cpus NEED this advantage just to keep the performance they have. think about it, a LOWER clocked chip over all is performing about the same,( depending on usage ) and in some cases better, then the higher clocked equivalents clock these chips the same, and i bet the story would be quite different. " Intel chips all the cores are able to boost to the same levels " at quite a bit more power usage too. " 2021 as intel transitions to 7nm and MCM. " more like IF intel can transition to 7nm by then, look how long it has taken them to get to 10nm..
:) Funny, I had exactly the same thought. But honestly, for many real world uses (games etc.) 4 cores at an even faster frequency would be even better. Physically separated on the die as far as possible (in the corners) by a huge amount of shared L3. I guess 5GHz base/6GHz turbo is not out of the question within the same TDP with liquid cooling.
AHAHAHA... okay, we have a blind fanboy here. Do you know anything about IPC? This CPU gets destroyed in EVERYTHING except old games running outdated engines at 1080p. So unless you buy this with a 2080 Ti and a 240Hz 1080p monitor, you are not going to benefit from it.
Basically, on a budget, the 9900KS is a waste of money, period. The money you save buying a 3700X and investing in your GPU will give you some serious gaming performance increases.
People fail to consider other use cases. For competitive gaming, or someone running 240Hz 1080p with a high end GPU and willing to tweak settings to make their games CPU bound, this is still the best CPU. Unfortunately not all testers optimize their CPU tests to be CPU bound in games. But if you look at the ones that do, Intel still poops on AMD. Sure, most gamers don't give a shit about fps above 160 or so, but some do. When I ran Overwatch I tweaked the config file and ran 400fps. If I was running CS:GO I would push the fps as high as possible as well. Also, imo the biggest use case for AMD CPUs among gamers is futureproofing by having more cores. Most gamers are just gonna play their games with a few tabs open and maybe some music and Discord running. Not everyone is running CPU-based streaming encoding at the same time.
Well, I don't seem to notice the same thing you do for max fps in games where you need 240Hz, for example. At most, I can see a 10 to 15 fps difference in Counter-Strike at around 400fps. I looked around and found a lot of tests/benchmarks. There is no such thing as "this is the best CPU and you'll notice a difference in the games that matter for competitive gaming". I might be wrong; if so, enlighten me please. I'm about to buy a new gaming rig and, like 99.98% of the population, I'm not a competitive gamer. I don't consider streaming competitive either.
But in Ubisoft's single player games, I noticed it does help to get closer to 120Hz at the resolution and detail settings that matter for these non-competitive games.
Look at the 95th percentiles. Ignore average fps. AMD and Intel are virtually tied in nearly every game. I cannot believe we have reached this point. Finally after a decade, AMD is back in business.
You do realize that running your CPU or GPU at 100% max utilization increases input lag, correct? FPS isn't the only thing that matters. If the CPU cannot process new inputs in a timely manner because it's too busy feeding the GPU, then the whole exercise of increasing your FPS was pointless. You should cap your FPS so that neither your CPU nor your GPU exceeds 95% utilization. For the CPU this includes the core/cores that the game is running on. You lose maybe a handful of FPS by doing this but ensure consistent input lag.
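A rough rule-of-thumb version of that capping advice, sketched in Python (the 95% target and the function name are just illustrative, not from any particular tool):

def suggested_fps_cap(uncapped_fps: float, target_utilization: float = 0.95) -> int:
    # If the system sustains uncapped_fps with the limiter off, the busiest
    # component is effectively pegged at ~100%. Capping slightly below that
    # leaves headroom so new inputs are handled promptly instead of queueing
    # behind in-flight frames.
    return int(uncapped_fps * target_utilization)

# Example: a rig that renders ~260 FPS uncapped gets limited to 247 FPS.
print(suggested_fps_cap(260))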
Not true. AMD has the entire market pretty much cornered, though. So it doesn't matter whether you buy high end or mid range, Intel chips in general are a bad choice currently. Intel desperately needs to rethink their strategy going forward.
You might want to look at the benchmarks. Intel won most of them, with fewer cores. I was seriously considering an 8- or 12-core AMD, but Intel still ended up the better option for everything I do except video transcoding, in which AMD clearly wins. Other considerations: no cooling fan on the Intel motherboard, better Intel quality control and testing in general, a more mature product (because the 9900 is an iteration of an iteration of an iteration... etc.)
AMD does, indeed, include a cooler. But stop acting like a frog and admit that you cannot (properly) use the CPU with that cooler. You'd still have to, eventually, get an AIO or a Noctua. So the fact that you'll have to pay an extra $90 or so is moot, in this case.
Moot ?? I don't think so. It's the cost of this CPU + $90 at least, just to be able to use it. With Ryzen 3000 there is no need for a cooler as it is included; if one wants better cooling, then it is an option, not a necessity. For price, AMD wins. For performance, for the most part, AMD wins as well, especially in MT. And power usage, AMD wins there again. You know, like how people were bashing AMD before Zen came out ?? WHY are people not bashing Intel the SAME way now ??
The cooling fan on X570 is if you're doing something as crazy as dual PCIe 4.0 NVMe RAID 0. Otherwise, they don't need to spin, as the passive heat dissipation is enough, and Gigabyte makes a board with no fan at all.
It's great that Intel works better for you, but the use cases for it come down mostly to super high refresh rates (240Hz displays at 1080p) with medium settings. Otherwise, who cares if you lose maybe 5 frames if you're never gonna see those frames anyway, and if the 12 core Ryzen does almost everything better than the 8 core 9900KS in multi-threaded tasks, then I don't see the drawbacks in buying AMD.
Intel does lead AMD in the core speed and overclocking department however. If you're an enthusiast overclocker, then by all means, disregard AMD
Substantiated proof please? Otherwise, stop talking out of your backsie, many thanks.
Hey, look, I can also be anecdotal: in 10 years of building PCs I've never had a CPU die, Intel or AMD, and only 3 motherboards, two of which were my fault.
I've had two AMD CPUs die. One was a Duron that I ran with the heatsink improperly mounted, causing it to go pop, and the other was a Barton-core Athlon XP that had a corner of the die chipped off from having a heavy CPU cooler dismounted and remounted too many times. Both my fault, albeit both consequences of design choices (no thermal regulation on the Duron, no protective heat spreader on the Barton).
I've had 3 Intel CPUs go faulty. All were Sandy Bridge, and every single one appeared to work correctly most of the time but would do weird things like suddenly stutter in games or crash without warning. None of them had been overclocked or otherwise "misused".
On that basis I'd say AMD do better than Intel. *shrug*
AMD's random number generator instruction is crashing Linux and compromising security because it always reports that it can generate a legitimately random number, yet it always generates the same one. This has made systemd in Linux issue a patch that assumes the generator is bad whenever it returns that value, 0xffffffff, which means that when a working implementation like Intel's generates that number legitimately, Linux will assume it is defective too. If they had, say, tested the chip properly, the fact that many distributions of the world's most popular server operating system have CPU-related crashes might have been found sooner. (A quick illustration of that kind of sanity check follows after this comment.)
AMD made the first chip I have ever used that could not run its stock speed stable, but could run underclocked (AthlonXP).
AMD owns the company formerly known as ATI. Somehow, the quality of their drivers is controversial, but after giving them 6 or 7 chances since the 90's, I will never use that hardware again due to the hilariously low-quality drivers and driver support software. The hardware is often fine, but has at many times been reputed to run too close to the line, requiring intense cooling with leaf blower-like fans, and allowing almost zero overclocking.
Really, the idea that AMD's QA is inferior to Intel's seems strange to even question. AMD, even with their successes here and there (AthlonXP, Athlon 64, Ryzen... and that's pretty much it) has always held a reputation for being the off-brand. Anecdotally, problems with AMD systems have long been considered more likely than with Intel.
I wish AMD the best, and I hope their next-gen Ryzen 4000 series shows a clear win over Intel other than in just core count, but for as excited as I was to get a 3800X or 3900X, when I looked at the numbers and at my own long history as an AMD advocate yet purveyor of fact, I had to conclude that AMD still doesn't quite measure up other than in specific corner cases.
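For anyone unfamiliar with the RDRAND issue mentioned a few comments up: the reported failure mode is the instruction claiming success while always returning an all-ones value. Below is a minimal sketch, in Python, of the kind of sanity check a consumer of that output can apply - a hypothetical illustration, not systemd's actual code, and the fallback to the secrets module is an assumption made for the example:

import secrets

def looks_like_stuck_rdrand(sample: int, width_bits: int = 64) -> bool:
    # A broken generator reportedly signals success but always hands back
    # all ones (0xFFFF...FFFF). A healthy generator emits that value only
    # once in 2**width_bits draws, so rejecting it costs essentially nothing
    # while catching the stuck case.
    return sample == (1 << width_bits) - 1

def get_random_u64(hw_sample: int) -> int:
    # Discard suspect hardware samples and fall back to another entropy source.
    if looks_like_stuck_rdrand(hw_sample):
        return secrets.randbits(64)
    return hw_sample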
I owned multiple systems over the years; here are some of the CPUs: Intel 80286, Intel Pentium 133, Intel Pentium II 233 and 350, AMD Athlon 64 3200+, AMD Athlon X2 4200+, Intel Core 2 Duo E5400 and Q6600, AMD FX-8350, Intel i5 2500K, and then my brother got a Ryzen 3700X that we just built lately. I won't go into all the video cards I've owned/purchased to build systems for my friends/family, but here are some of them: ATI Rage 128, Nvidia Riva TNT2, GeForce 2 MX, ATI Radeon 8500, ATI Radeon 9700 non-pro, Nvidia 8800 GT, ATI Radeon 4850 then upgraded to a 4870 1GB, and then I pretty much tried everything that came out after that for both teams.
Sincerely, the biggest problem I've had so far was my GeForce 6800 GT's drivers being totally incompatible with my Athlon 64 x2 3200+, and that was due to the chipset. Nvidia knew the problem was widespread. We could only fix this by using a very old driver (can't remember which), and sometimes underclocking would work. They never fixed this.
My father still uses the FX-8350 and that CPU was not well received back in the day. He has no problems whatsoever.
Again, I'm not a linux user but I've got a friend who swears by linux and he uses AMD hardware. I could ask him why but I do not really care anyway. I remember when I bought my i5 2500k there was a major recall on motherboard including mine for b3 stepping because of the S-ATA controller risk of dying or something like that. 90% of the systems I named above are still running strong and of that 10% dead, it's mostly the motherboards or ram, which aren't built by AMD nor INTEL.
I'm no fanboy; I bought so many systems, always with the same objective: the best performance for the dollar depending on the usage of said system. I just had to comment because when I see someone commenting like that, I think they were simply unlucky and that it has nothing to do with what they believe.
A little mistake on my part, it was my AMD Athlon 64 3200+ and not the x2. The worst about this problem was that the chipset (nForce3-250 or something like that) was made by Nvidia and it had problems with their video cards only.
I forgot to mention that we bought a new system for my brother because the CPU died. I think that's the only CPU that ever died on me, and it was an i5-2500K. The most shocking thing about that is that I overclocked mine to 4.7GHz for all these years and it's still going strong (hence the reason I'm still waiting to upgrade). He bought some closed loop water cooling and never overclocked the CPU, yet it still died on him... shame.
Hmm, so Intel has better quality etc.? Let's consider for a moment all the security issues with Intel. Then let's look at the way they refused to develop the CPU until AMD came along with 12nm, then 12nm+, then 7nm and shortly 7nm+, while meanwhile Intel cannot make a decent 10nm chip; that speaks volumes about your argument. Then let's look at the TDP: the AMD chip at 65W is almost neck and neck with the Intel one at 255W! The only CPU I ever had fail was a Core 2 Duo, and I never lost a GPU from either camp, but guess what... the Intel GPU is being design-led by a former AMD/ATI staffer, as is the new Intel CPU. Think we can leave it there.
Curious you think that someone considering a 9900KS is a 'budget' gamer. You could easily make that argument with any high end component. I'd expect them to be pairing this with both a 2080ti and a high refresh monitor.
I think you are mistaking enthusiast for "fool". I've bought a 980 Ti and a 1080 Ti but I sure as hell ain't going to buy a 2080 Ti. I had a 5820K and bought a 3700X.
Thankfully, there are some of us with some fiscal responsibility.
And to add to that, I can easily fit a high end 200W+ TDP CPU cooler in my small mATX case, but I cannot fit a graphics card that is longer than 26 cm (23.5 cm after my front fan modifications) into my case. SFF systems are more limited by graphics card size than cooler size most of the time. And the best cards in a smaller form factor are 1080 TI and 2070 Super as far as I know.
Imagine what it would be like if the clock speeds were higher... if Zen 2 is this close, or faster, with the clock speed disadvantage it has now.. what will it be like if Zen 2 is hitting 4.6+ GHz ???
Don't Intel CPUs NEED the higher clocks in order to have the performance they get ? Clock them at the Ryzen equivalents.. and see how well they perform.. When will people realize clock speed isn't everything ??
I am very eager to see Zen 3 though, regardless of all the fanboys. I feel like it’s very much a mixed bag choosing between Intel and AMD just because of some of the instruction sets and the number of threads used by various things.
But if even half of the stuff about Zen 3 is true, 2020 should be AMD’s year, as I don’t believe anything Intel launches until 2021 is really gonna be competitive.
Zen 2 has more like a 13%-15% IPC gain over released Intel CPUs right now at the same clock speeds, depending on workload. There was a video on YouTube where one of the bigger YT channels did a side by side of an AMD Zen 2 CPU and an Intel 9900K, both @4GHz, and I was surprised that the AMD chip was ahead in most everything by a fair amount.
When it came to gaming, though, Intel had a slight lead in a few games that seemed to favor Intel. But there were also games that AMD got wins in as well. This would explain why AMD, with a CPU of the same core count, is now able to match Intel even though AMD has a lower clock speed, and even come close or match in a lot of games. I am no fanboy for either camp; I currently only own Intel based systems but would be more than willing to look at either camp's hardware when I do my next set of upgrades. That is just how good things are now on either side, and with AMD finally back in the game and putting pressure on Intel, prices are also now getting better on the Intel side of the street as well. It is a win for everyone when things like this start happening.
You sure about that ??? I have 2 comps, both with an Asus Strix 1060 Gaming OC, one with a 5930K, the other with an FX-8350, both at max eye candy less AA ( 4x ) and AF ( 4x as well, I think ), and I get about the same FPS. The 3900 will prob use less power over this.
I'm surprised an FX-8350 can saturate a GTX 1060, but your maxing out the details and quality is the only reason the FX keeps up...; it's like saying the i5-8400 matches the 9900KS at 4K with a GTX 1070. (Of course it does)
Because it's drop-in compatible with the poor sap who isn't getting enough from the cheap Walmart i3 gaming PC they overpaid for.
Unfortunately AMD just doesn't have the presence in retail to capitalize on that. Ironic, because traditionally AMD has had a superior upgrade path, keeping sockets longer and (provided motherboard vendors support their boards) offering new microcode support via BIOS updates.
I find it funny that in the past Intel CPUs were praised for their power efficiency over AMD ones. Now that AMD has a 65W CPU that is almost as fast as the reviewed CPU, it doesn't matter at all...
Indeed, the 7nm process is clearly a win here. That said, total platform power with Intel (9900KS excluded!) still tends to be very similar or even lower, in part due to the rather power-hungry 14nm AMD X570 chipset. X470-based AMD systems still win in most cases, but not by an extremely large amount.
Please read the first couple of sentences before criticizing. No one is saying the 9900KS is a power efficiency winner. Indeed, when running compute-intense tasks, AMD's larger CPUs are likely the winner most of the time, but you cannot post one set of measurements with one specific configuration and draw broad conclusions. Other measurements differ, and most pit the idle power draw (where most PCs remain most of the time) as lower for Intel. For example, https://www.guru3d.com/articles-pages/amd-ryzen-7-...
That's funny.. cause before Zen.. that's what people were saying about AMD.. and the power those chips used. Now it's the opposite, and it's OK for Intel to use so much power ??? Come on.
Just to add to this... der8auer just tested the 9900KS and it runs games at all-core 5GHz all the time, at between 98W and 126W.... The video showing this starts at 3:50.. The only time it uses more power is if you are doing high compute loads and/or AVX workloads... but for gaming it will stay well below... https://www.youtube.com/watch?v=YWSn0cHauJ4
Surprising that it seems to hurt some that others might want to buy an RTX 2080 Ti or a 9900K or a Ferrari for whatever reason. Are we now all to be lambasted for buying anything that maybe costs more?... or, heaven forbid, they want the best!!! Oh no... run!!! You might as well start on all those that buy better shoes, cookers, microwaves, TVs, bread, cars, chocolates, phones, etc., etc. ad infinitum, or for that matter any branded products period, as they cost more than their non-branded counterparts... Hold on, we are not going far enough. Here's an idea: let's all wear the same clothes, live in the same houses, watch the same TV, eat the same gruel, have only one manufacturer of all products....... so no one can be different and, most importantly, we certainly do not want any choice, creativity, design et al....
Historically this has been very true. However browsers nowadays can take advantage of many cores, as can games - they have been forced to, or else be outcompeted by those who do.
Considering it's a brief holiday special I would think this is all the golden samples from the past year. I'm quite sure they all pre-bin and keep chips that are "perfect" in some way like no defects/high frequency/ultra low voltage even though they don't have an SKU for it yet. Normally they'd launch one notch up as a 9950K or something, with this time limited special I guess Intel is saying we'll never produce these chips in enough volume to justify that so we're doing a limited edition for PR instead.
They keep the duds, too. That's one reason the Pentium and Celeron come out later. The other being they don't want to spoil the market for higher-end chips.
Um.. it is a 172W CPU.... did you read it, jorgp2 ?? Intel TDP... is crap.. this is listed as being a 127W CPU.. at BASE frequency. At 5GHz... it will probably use 172W.. or more.
Err... no... der8auer just tested the 9900KS and it runs games at all-core 5GHz at between 98W and 126W.... The video showing this starts at 3:50.. It just depends on what you are doing and whether you are using heavy compute applications, but for most games and normal day to day usage, no, it will happily run well under 172W... and that will never be an issue, and that's really where this CPU is targeted... albeit at those with deeper pockets... https://www.youtube.com/watch?v=YWSn0cHauJ4
It's possible that it is if the chip is more power efficient (i.e. needs a lower voltage for a given load), which it has to be just to reach 5Ghz on all cores without sending power through the roof
Yep, I did. Their testing showed that all cores running at 5Ghz had the CPU using 172W. If you're not going to keep all cores running at 5Ghz, except during idle, what's the point of buying this over a 9900K?
I am not sure how you missed it, but he clearly says that when running CS:GO, SOTTR and even Timespy GT1 and GT2 the CPU is at all-core 5GHz all the time with max power draw between 75W and 126W..... He clearly states that at 4 minutes and onwards...
The only time this CPU will draw max power of 170w is when you are doing heavy duty compute loads or testing with AVX...For normal use like gaming...this is what most people will buy it for it will never be at 170w plus..
Again.. you don't state what the test setup was, or how it was configured.. so either he did something wrong, or AnandTech did, and sorry to say, I trust AT more...
Asus Z390 Maximus XI Extreme, RTX 2080 Ti and not sure what RAM... but as he clearly states, this was during the benchmark runs, and in the comments he covers forgetting to mention the 2080 Ti and clarifies this point. Bottom line: for 'normal' gaming usage this CPU will run all-core 5GHz all the time at or under 127W...
Y'all are not understanding vMax65 whatsoever. He's talking about IN GAMES and not in benchmarks. It is extremely rare for your CPU to come close to 100% utilisation while playing games unless you're running a competitive setup at super low settings in a game like Overwatch or CS:GO. Just because a CPU is running at all-core 5GHz doesn't mean it's consuming 172W. No, it's consuming 172W at 100% utilisation. So with that being said, when you're using it at 70-80% utilisation it will consume far less power, so the 90ish W to 125W makes complete sense. I hate Intel and their practices, and their CPUs are overpriced and uncompetitive, but don't spread false information without understanding how a CPU even draws power...
I'm very, very sure it will not run at all-core 100% 5 GHz for CS:GO and I still don't know why you'd want to do it. I'd limit it to just above monitor refresh rate.
For a CPU directed towards power gamers, tell me, please, in what circumstances do you see a gamer or a normal, day to day user, execute Cinebench on their PC? Stop being such a dummy and stop acting like a rabid dog over a product that not only you will never get but clearly, it's not geared towards you.
And keep in mind the test beds were DIFFERENT; that is a factor there as well. How would you know if someone would never get this CPU ?? And now you resort to insults and name calling ?? That makes your opinions pointless.
Would it be possible to find a compilation benchmark? Chrome or Firefox perhaps? I remember Linux kernel compilation finished in 30 seconds, perhaps it's getting too small...
Given .Net Core is cross platform, maybe compiling the platform itself or Roslyn compiler? If I remember correctly, compiling the framework right now is not trivial, but the goal is to streamline it for .Net 5 release.
As long as you have good cooling every single 9900K ever made does all cores at 5Ghz all the time without any sort of turbo time limit when the motherboard has MCE(multi core enhancement) enabled- most have this enabled by default. The 9900KS is nothing new.
I think it would be interesting to see how the 3700x and 3800x do if you bump their power limit up to 9900KS levels. The 3700x is probably being held back quite a bit at 65W but still is competitive.
"AMD also has the 16-core Ryzen 9 3950X coming around the corner, promising slightly more performance than the 3900X, and aside from the $749 MSRP, it’s going to be an unknown on availability until it gets released in November."
Wasn't that delayed till December? I swear I saw that in the news..
Your IGP tests aren't very useful. I'm not going to run a game at 720p with the lowest settings on my IGP if I'm getting 250+ FPS. I'm going to increase the resolution or graphics settings until I'm getting the best visuals I can while still hitting 60 FPS.
Please compare with settings that give around 60 FPS on the IGPs so we can get an idea of how the integrated graphics performs when you actually stress it some. At 720p, you are just testing the CPU cores and not the IGP at all.
This has been a bugaboo of mine for quite a while. There are *no* IGP benchmarks. What you're seeing is a benchmark detail setting that AT refers to as "IGP." In effect, "Very Low" detail. The test is still run on the GeForce GTX 1080 and not the CPU's built-in graphics.
My other complaint is that the use of a GTX 1080 makes the High preset results entirely meaningless, as they're almost always limited by the GPU. It's a waste of graphs and a waste of test time.
The gaming benchmarks as they are shown here are absolutely pointless - all they show is that the GPU is the more important factor. Them saying that they keep the GTX 1080 for the comparability with older results is pointless, when the variability between three generations of CPUs is 3% at best, because the whole thing is nowhere near being limited by the CPU.
What a joke and pos, and they have the audacity to claim 127W TDP?
A 9900K out of the box, without messing with the BIOS, is drawing 180 watts at its default 4.7GHz turbo; you expect me to believe a 127W TDP? I know fanboys are going to defend it with "TDP is not actual power usage" blah blah blah, but it should be usable as a point of reference, and frankly, actual power draw that is 100% over TDP is not good enough for most people.
Are you not capable of reading? A 9900K out of the box runs a 4.7GHz all-core turbo without "ramping up" anything; that is its default speed without overclocking. Do you even own the processor?
Yes, the 9900K turbos at 4.7 GHz if given proper cooling. But, again, you do not understand the power rating nor do you understand turbo. The 9900K does not turbo to 4.7 GHz in many circumstances (think of a field application in a hot desert, or in an enclosed box without air holes, or in a dusty environment, or in a low pressure environment, or an application with passive cooling, etc).
The 9900K is actually a 3.6 GHz processor at 95W of power. If you happen to be in an environment and application that allows great cooling, then the 9900K will go up to 4.7 GHz at far more power.
Wow you are a pathetic loser. Just admit you are wrong instead of bringing up extreme cases like "hot desert" "enclosed box" to make yourself look ridiculous.
Of all the processors sold, how many are going to run "in an enclosed box without air holes"? 1%? 2%? Even without proper cooling 9900K still runs 4.7GHz turbo as long as the motherboard can support it, hitting 100 Celsius without throttling, that is a FACT. Now go away, stupid fanboy.
The answer is "Whatever percent of computers aren't regularly cleaned". Haven't you ever opened someone else's computer and found enough dust/fur to create a large stuffed animal? Reviews of new CPUs in new computer cases with sparkling new fans are nothing like the real life for the vast majority of CPU usage after real world usage.
I'm only restating the way TDP is defined. TDP is the max power used at base speed. CPUs are only guaranteed to run at base speed. Turbo is NEVER a guarantee. And when a CPU is in turbo it will use more power than TDP. If that makes me a fanboy, then which product am I a fanboy of?
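The gap between a base-clock TDP and turbo power isn't mysterious either. To first order, dynamic CPU power follows the usual switching relation P ≈ C × V² × f, and because voltage has to rise along with frequency near the top of the range, power climbs much faster than clock speed. With that caveat (this is a textbook approximation, not a figure from the review), going from a 3.6 GHz base to a ~4.7-5.0 GHz all-core turbo plausibly doubling power draw is consistent with the ~95 W rating and the ~170-180 W measurements quoted elsewhere in this thread.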
As airdrifiting said below, it's misleading because absolutely none (read: NONE) of the Intel Turbo-capable CPUs run at their base clock only. I've seen 'reviewers' and people comparing power use characteristics based on TDP and concluding that the 9900K (non-S, mind, this was before) is more efficient than the 3800X. Go figure.
Take a look at der8auer's latest video on the 9900KS covering TDP and power consumption, specifically around gaming at an all-core 5GHz. The CPU draws under 127W all-core in games and only goes up on serious workloads like compute. Start at 3 minutes 45 seconds... ish.
der8auer just tested the 9900KS and it runs games at all-core 5GHz all the time at between 98W and 126W.... The video showing this starts at 3:50.. https://www.youtube.com/watch?v=YWSn0cHauJ4
In terms of value, yeah it pretty much does. It beats Intel's entire lineup. -5-10% performance, (even in gaming), -40% price, and +30% efficiency, includes a cooler too. The thing is, the Ryzen 5 3600 does it, and then some.
The Ryzen 5 3600 is probably overall the "Smart person's CPU" while those that don't know how money works, like myself, quibble over a few % difference across $500 CPUs.
It isn't AMD propaganda, one of the problems is the way people perceive written communication with strangers, you can't share ideas with people you don't know, especially contrary ones.
If I check my Asus motherboard and 9700k, it uses 170W at default settings. Then it drops to 3.9ghz and uses exactly 95W. So it is a 170W CPU with all those benchmarks you see out there, that is highly misleading. I'd expect this new one to use more. People have been angry at Intel for some time for not listing their turbo power as TDP.
If I check my 3700x, it uses 65 watts. So my 9700k uses 2.6x times more power, and yes, it is faster (not in MT, but in ST) than my 9700k. That's all fair to point out, and not propaganda.
I think the 8700k was a genuinely good processor, with a large cache (not artificially limited like the 9700k) and 50 percent more speed for the same money. Now we have the 9900k only 33 percent faster at best, and WAY more expensive and much too power hungry. I haven't built any 9900k computers, and only have one 9700k computer (I bought the CPU used for a low price).
Correction: If I check my 3700X, it uses 65 watts. So my 9700K uses 2.6 times more power, and yes, it is faster (not in MT, but in ST) than my 3700X, but not overall. That's all fair to point out, and not propaganda.
Take a look at the TDP and power consumption of the 9900KS when gaming at an all-core 5GHz and you will notice it is between 98W and 126W, and that's 5GHz all the time....
The same can be said about the Intel chips, and the fact that practically the only thing left for them to brag about is single core performance, cause Intel pretty much loses everywhere else.
Sure Zen 2 is great, but not great enough to spend almost $1000 on a new Motherboard and Processor right now. My Intel Core i9 9900K seems to hit 5GHz easily, which is nice boost for every workload. I'll make that AMD vs Intel decision again though someday ... maybe in a couple of years.
The difference between the 9900KS and 3900X in gaming is unnoticeable. The difference in everything else (in the 3900X's favour) very much is. And people are starting to get mad at Intel for being so arrogant as to charge over 500 bucks for this garbage CPU that doubles as an immersion heater. They won't even include the 150 bucks worth of cooling you need to actually have it run at 5 GHz.
'AMD propaganda artists'? Nah; people are finally waking up to Intel and their stranglehold on the consumer CPU market.
Factual: There is no difference in gaming at 2160p. We are limited by graphics cards. At 1440p it's +/- 3%. At 1080p: only there is Intel's 5GHz faster than AMD, since this is not limited by graphics. But still, most panels are 60Hz, so Intel can produce 10-15% more frames that you can't see. "But... maybe I want to game in 1920x1080 at 240Hz." Well, then again, there is no difference since it's graphics card limited. Maybe with next gen graphics a fast CPU at 1440p/2160p can make a difference. But then again: maybe games can be threaded better over more cores.
'Haters' dislike this CPU because once again Intel is relying on marketing ploys and borderline misinformation to sell recycled parts based on tech from 2015, at a price that is higher than the competition's part with 4 more cores, significantly higher MT perf, higher efficiency, an included cooler, a platform that isn't EOL like Z390, etc. Do you see what I'm saying? These 'haters' are sick and tired of Intel's stranglehold and price-gouging of the CPU market, and they voice their opinions now that we have some really viable competition. Shocker: people care about value and features.
And to some customers, this chip has the features they want at a value level they can afford. I'm not sure why that's so difficult a concept to understand. Rather, I think that the "haters" understand perfectly well, they irrationally fear what other people purchase.
None of what you said discounts what he said - it's just whataboutism followed by some unflattering generalisations about people you think you disagree with. :|
What does the ring bus clock run at by default? If it's still 3.7 - 4 GHz, these may be a little more overclockable than the turbo clock implies.
I'm curious how the power and temperature compares to the regular 9900K with everything pushed as far as it will go.
If and when there's a desktop Comet Lake, I'd also like to see a comparison from Skylake and on. With chiplets, CPU manufacturing processes may no longer completely go out of style, so it would be interesting to see how 14nm progressed over 5 years.
It's kind of funny, that the only thing different about this 9900k is the possible voltage that it might be able to run at, and he didn't test that. "voltage" is nowhere to be found in this review. I.e. this CPU is just a 9900k, nothing improved. Just set your voltage to 1.3V with a normal 9900k and set 5ghz, there, the same. This by all rights will be crushed by the 3950x, I'd rather take double the cores for almost the same money, imo.
"I'd rather take double the cores for almost the same money".
I jumped on the 9900K long before Zen 2 even had a release date. I'm not disappointed with the performance I get in multi-threaded tasks, certainly not enough to spend $1000 to switch platforms. AMD's 3900X and (eventually) the 3950X might have more cores but it's going to be at least two years before I buy another platform. Instead, I will use that $1000 to purchase an RTX 3080Ti the very instant nvidia puts them up for sale.
What makes you think I had an 8700K? My previous platform was an Intel Core i7 5930K on an X99 Deluxe motherboard. The jump from 6 - 8 cores wasn't huge, but the Coffee Lake cores are faster than the Haswell-E - my Cinebench score still doubled. However, Cinebench is kind of stupid now, and why I dropped the HEDT platform. I'd rather spend more on a GPU, not just for great 4K gaming, but Blender can do faster rendering on a GPU, and you can also use your GPU to accelerate video encoding. 8 cores and 16 threads is plenty for everything else.
Still haven't seen anything about the ring bus clock.
Silicon Lottery posted their 9900KS binning results: 5 GHz at 1.250v, 5.1 GHz at 1.287v, and 5.2 GHz at 1.325v. The most recent numbers for 9900K and 9900KF had 5 GHz at 1.30v.
After the rather uneventful 8086K, if this is going to be a recurring thing, I think they need a little more special sauce. For example, cleaner solder and/or lapped IHS.
@Ian I feel like fanboyism is on the rise at Anandtech article comments sections lately, but I'm not sure if it's just a nostalgia thing where I just think the past was better. Do you have an archive of the comments from P4 vs Athlon-xp, Athlon64 days? I swear back then was just a simpler time of "In Soviet Russia... " jokes.
I think it's a symptom of wider trends on the internet. Astroturfing on tech forums has been a thing for a while now, so people are even more suspicious of the motives of others. The communities are larger, so there are more names to keep track of and less trust as a result. More people have learned stupid "debate" tactics like whataboutism and gish galloping from useless right-wing Youtubers, so even small disagreements and misunderstandings spiral out into lengthy pissing matches.
lol, this is supposed to be the chip Intel launches in response to AMD's second-best to get the 'last word' - instead it's a replay of Zen2 launch day all over again with the 9900K's sporty younger brother getting humiliated.
"...but the system was fully table the entire time." "stable" not "table". "...but the system was fully stable the entire time."
"It would appear that there is something else the bottleneck in this test." Mis-written: "It would appear that there is some other bottleneck in this test."
"One question that does remain however, is which set of results should we keep?" Keep to Intel guidelines. That's what you do with memory speeds and other things, I'm sure. That is also the guaranteed results for all consumers.
Normally this hardware site is my "go to" for hardware reviews. However, holy mother of Jebus was this test bed terrible. Let's buy a Porsche and throw old outdated tires on it. This should have been paired with a 2080 Ti, but you pair it with a 1080?
The thing is, that would invalidate all older results. I'm guessing they will update the GPU in 2020 and retest certain legacy CPUs. What do you think would fundamentally change about the results with a newer GPU, though? I doubt it would be much, since the trends are the same with all the different graphical settings they already test at.
Only one year of warranty with this CPU, reduced from 3 years. So it’s marginally faster, uses more power, offers no gaming advantages, and its price hike doesn’t justify the performance gain and warranty disadvantage over the 9900K.
Counter strike really needs to be added to benchmarks. It’s just silly how useless these gaming benchmarks are. There is virtually nothing that separates any of the processors. How can you recommend it for gaming when your data shows that a processor half the price is just as good? Test the real scenarios that people would want to use this chip.
It's more because you need a specific set of circumstances these days to see the difference in gaming that's more than margin of error.
You need at least a 2080, but preferably a 2080 Ti. You need absolutely nothing else running on the computer other than the OS, the game and its launcher. You need the resolution to be set at 1080p. You need the quality to be at medium to high.
Then you can see differences. CS:GO shows nice differences... but there's no monitor in the world that can display 400 to 500 FPS, so yeah... Anandtech still uses a 1080, which is hardly taxing to any modern CPU; that's why you see no differences.
CS:GO is a proper use case. It isn’t intense graphically, and people regularly play at 1440p120. Shaving milliseconds off input-to-display latency matters. I won’t go into an in-depth analysis of why, but imagine human response time has a Gaussian distribution and whoever responds first wins. Even if the mean response time is 150 ms, if the standard deviation is 20 ms and your input-to-display latency is 50 ms, then there are gains to cutting 20, 10, even 5 ms off of it.
And yes, more fps does reduce input latency, even in cases where the monitor refresh rate is lower than the fps.
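That argument is easy to sanity-check with a quick simulation. A sketch in Python - the 150 ms mean, 20 ms standard deviation and the latency figures are assumptions for illustration, matching the numbers in the comment above:

import random

def first_to_react_probability(lat_a_ms, lat_b_ms, mean_ms=150.0, sd_ms=20.0, trials=200_000):
    # Estimate how often player A reacts to an on-screen event before player B,
    # when both draw Gaussian reaction times but have different input-to-display
    # latencies added on top.
    wins = 0
    for _ in range(trials):
        a = lat_a_ms + random.gauss(mean_ms, sd_ms)
        b = lat_b_ms + random.gauss(mean_ms, sd_ms)
        wins += a < b
    return wins / trials

print(first_to_react_probability(50, 50))  # equal pipelines: ~0.50 by symmetry
print(first_to_react_probability(30, 50))  # 20 ms faster pipeline: ~0.76 in this model

So even though 20 ms is far below the mean reaction time, it shifts the odds of reacting first from a coin flip to roughly three out of four in this toy model.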
If you visually can't react fast enough, doesn't matter how quickly the game can take an input, you're still limited on the information presented to you. 240hz is the fastest you can go, and 400FPS vs 450FPS isn't gonna win you tournaments.
CS:GO is not a valid test, as there's more to gaming than FPS. Input lag is more about the drivers and peripherals, and there's even lag between your monitor and GPU to consider. But go on, pretend 50FPS at 400+ makes that huge of a difference.
I dunno if anyone mentioned yet, but the KS has additional security measures to mitigate exploits which are probably causing the performance regressions.
I expect I will never own an i9-9900KS or a Ryzen 7 3700X, but it is interesting to see how close AMD's 65W 8 core chip gets to Intel's 127+W special edition CPU in terms of performance in most of these benchmarks.
The Processor Specification field in the CPU-Z screenshot has "(ES)" after the CPU name. Did you test an engineering sample or an actual retail sample?
It was only launched a couple of days ago, so they'd have to have had a sample, you couldn't run the tests and write the article having bought it at retail.
Nice to see a 10+ year old Thermalright True Copper: 2kg for a CPU that has a mini-nuclear thermal reactor powering itself. LoL. I can't really say much. I'm using an 11-year-old Zalman CNPS 9900MAX (300 watt) AIR cooler on a 5.3GHz golden 8700K at 1.380V in a Z370 Aorus Gaming 7. I thought about getting a new cooler with the $400 open-box 9900K that I just found. But after seeing the 9900KS running on the 10-year-old Thermalright True Copper (2 kg), I don't have a doubt the Zalman 9900MAX (300 watt) cooler will handle a 5GHz (PO) 9900K without any problems.
So, this is a direct competitor to 2700X, but costs twice as much?
I wonder if we'll see intel as a budget offering in a couple of years, cutting the price of their 14nm tech to half of what AMD charges just to get some sales.
It's like a Caterpillar engine that would need hydrogen fuel to beat most benchmarks. That means an expensive mobo, GPU, PSU, memory, cooler, and case. It's Intel stretching to the limits for PR purposes, yet well below AMD in terms of value for money.
It's unbelievable, I swear. Even Anandtech's comment section has turned into a PCGamer shit show. Instead of just taking the article as it is - a piece of info - most of the people here either start a revolution against Intel or just plainly dismiss it as "fake information" in regards to TDP. But I see nobody here admitting that Intel's 14nm is ON PAR with AMD's 7nm (I've heard they have the same density, but don't quote me on that). Or that said product is not geared towards you or your acquaintances. I swear to God, it's like Intel has raped some family member or something. Just buy whatever you think is fit for you and leave others to enjoy proper journalism (which is so f*cking rare nowadays).
Cause it is fake info: 127 watts ?? Nope, 200+ more than likely. And some, I think, are tired of the lies, BS, and overcharging Intel has had us pay over the years. Face it, if AMD didn't bring out Zen, the chances are we would still be stuck at quad core for the mainstream, and anything above that would be HEDT. FYI, it may be on par, but Intel should also be on 10nm by now... maybe even the next node, and yet Intel kept saying 10nm is on track. There are better, less expensive options than this CPU; this is just Intel being Intel, a last ditch effort to try to save face.
Thank you, liquid_c. This needs to be repeated over and over again. You have this stellar quality content, painstakingly researched and presented, and it's clear many of the readers (or at least the ones that comment) are neither understanding it, reading it in context, or even trying to think about it. The smartphone reviews are identical. It's all just "how do I read what I want so I can wage my emotional holy war because I have literally nothing else going for me."
Here, I'll go one step further: there's zero reason for a site like Anandtech to have comments following articles at all. You could delete 9 out of 10 attached to this article and zero of value would be lost. Given how terribad the comment system is (no editing, tiny and unscalable input window, spam posts that go undetected for days, etc.) and the effort/investment it would likely take Anandtech's limited team to improve it, just getting the hell rid of it would be a far more sensible solution. Let the content be the star. Send people to the forums in a special board requiring a 500 post barrier in the broader community to contribute to article discussions.
One of the few reasons i keep reading articles from sites / news outlets like Anandtech and ArsTechnica is the fact that besides good, well developed and portrayed journalism, i also expect knowledgeable people commenting on said articles. I always learn(ed) a little bit of extra info by doing so and it pains me to see this ongoing fan war between Intel / AMD fans, Apple / *Insert any other Android vendor name* fans, etc. So instead of finding out the “ifs” and “whens” of specific tech topics, i have to skip through countless hate posts.
People in glass houses.... liquid_c - Sunday, November 03, 2019 - Stop being such a dummy and stop acting like a rabid dog over a product that not only you will never get but clearly, it's not geared towards you.
Although I agree in principle that there are a fair number of toxic comments, yours are among them, so I don't think you have much room to complain while at the same time contributing to the problem.
The main problem is: 16 PCIe lanes. You can't really connect anything to the system without starving it for bandwidth. The 16 PCIe lanes are used for graphics. The DMI link to the motherboard's PCIe lanes has a bandwidth of 3.8GB/s. 4-year-old NVMe SSDs are already at 3.5GB/s. Forget using 2 NVMe drives. Usually, the graphics card is pushed to 8 PCIe lanes, killing 10% performance. Forget connecting fun stuff over Thunderbolt, or using high-end capture cards and so on. There is no bandwidth. AMD Ryzen 3000 with X570 is a bit better: 24 PCIe lanes, 16 for graphics and 2x4 for dual NVMe SSDs with 8GB/s support.
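For context on that DMI figure (standard PCIe 3.0 per-lane numbers, not measurements from this review): the DMI 3.0 link is electrically equivalent to four PCIe 3.0 lanes, so everything hanging off the chipset shares roughly 4 × 8 GT/s × 128/130 ≈ 3.94 GB/s of raw bandwidth, which is why a single fast NVMe SSD at ~3.5 GB/s can already come close to saturating it.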
Really? In Far Cry I'm missing the Ryzen 3000 generation, why?... Sorry, but in Vulkan (Strange Brigade) it's a shame! A 65W AMD beats a 250W+ Intel at 5GHz? :D:D:D LOL
So this has been the main front page story on this site for the last 8 days. Strikes me as a little biased; other CPUs come and go, yet Intel's last-gasp attempt to make something matter is front and centre for over a week. Strikes me as a little unfair.
So.... In other reviews, Ryzen slaughtered the 9900KS in Blender; even the 3700X is faster :) Some say in gaming it's faster than the 3900X, but so is the 9700K with a 2080 :) The 9700K is on par with the 9900KS. Other than "having the best of the best", the 9900KS requires a lot more spending to open its full potential. From this review, the 3700X is really, really good all-round with minimal spending: a $100 B450 and a good 550W PSU.
Any Ryzen CPU is getting slaughtered in gaming by the piss poor 9700K without HT, so what's your point? Don't hate things just because you are too poor to afford them. We rich bois don't care. Don't forget to attend Fridays for Future today.
:) How could you live a day without insults. People asked why there are no Ryzen 3000 results for Blender, and "slaughters" is when a 127W-rated, "special edition", "all cores 5GHz all the time", $550+ CPU doesn't even manage to beat the 3700X at $325. People also asked why a 1080 was used and argued that the 9900KS is still better for gaming; guess what - on a 2080, the difference is not so great, other than in Hitman. The 9700K (piss poor? I suppose rich boys don't know that it, at the moment, is $30 more expensive than the 3700X, with no cooler AND requiring a Z mobo to overclock :) )
Yeah, I am asking too: why is that allegedly excellent 7nm CPU with HT unable to beat a piss poor 14nm 9700K with HT disabled? See, there is a reason why it is so cheap.
Yes, it is a piss poor version of the 9900K with disabled HT; the silicon quality is subpar by Intel's high standards and those pieces are unable to reach 5GHz boosts with HT enabled, so Intel disabled the HT and sells them as the 9700K. Of course, I should not forget to mention that the 9700K is a piss poor quality CPU only by the high Intel standards and high Intel customers' standards. We do not bother with anything below 4.7 GHz.
By AMD standards, anything reaching 4.0 GHz is a godly CPU and is being binned and sold as a 3900X for a premium, and they are pumping up to 1.55V into it so it can reach that mighty 4.3GHz boost on a single core whilst they promised 4.6GHz.
Let's wait for the 3950X and the promised 4.7GHz boost. You know, I smell a trap. Not that kind of one, you pervert. The 3900X is able to reach 4.6GHz for microseconds, so that 4.7GHz boost most likely won't even be measurable, because a time unit tiny enough to capture how long the boost stays active hasn't been discovered yet.
Shame you don't complain about AMD CPUs being unable to reach their promised boost clocks as much as you care about Intel power consumption. We get it, you are poor, you could finally afford 8 cores thanks to AMD, yet they are losing to a Skylake refresh crippled by security patches, so you are venting your frustration here. Difficult time to be an AMD fan, especially after the first gen Threadripper support drop fiasco; suddenly a new socket and no backward compatibility is not an issue. Don't hate things just because you can't afford them. Fridays for Future is up today again, vent your problems there, thanks. Anyway, bye, a private plane is waiting; gonna have a pizza for dinner in Italy to piss off Greta because I can.
"Skylake refresh crippled by security patches" - you must be kidding, right? It shouldn't have had those security holes in the first place. Please stop talking shit about the poor, because people here talk about optimization - the best for the least money at a given price point. And please stop bashing AMD's Ryzen - it's not Bulldozer; without Ryzen this shit here would be sold as an "Intel i11 Unobtanium Edition" for $1k and you, rich boy, would have 6 or more cores only on LGA 2011. Nobody hates a product - I don't like Intel's practices: 5% increases per generation, to the point where my i7 3840QM is only 10-15% slower than a 7700HQ with a 4(!) generation gap. Speaking about private planes - nobody gives 1 cent about a rich boy's approach to tech at this level because, while you can afford to be stupid, the rest of us have to be smart. Now you can fly and eat your pizza :)
Wow maxiking... resorting to insults and name calling still ?? Still believing all the Intel BS ?? Still believing Intel's BS about how much power their CPUs use ?? Talking like you have money is supposed to impress people ?? Good for you.. nice to see you are also an arrogant, rich, spoiled brat.
How dare you? Where did I name call anyone? If someone is fat and I call them fat or if they smell and I tell them so, it is not an insult, it is called stating a fact.
I see you still do not get that TDP does not mean power consumption; it is even stated and explained in the review.
If I were you, I would be more concerned about the 1700X, 1800X, 2700X and 3900X TDPs and AMD's misleading marketing about boost frequencies, because there have so far been 3 BIOS patches which were supposed to fix the issue and guess what: nothing has changed. People have to use a makeshift custom power profile created by a geek in order to get closer to the promised boost clocks.
Typical AMD. I give it 3 months till he starts fixing their awful GPU drivers as well.
Calling people poor.. among many other things in previous posts by you.. and yes, it is an insult to call someone fat.. or tell them they smell.. but I bet you do that because your self-esteem and self-worth are so low that you have to say things like that to make yourself feel better.. Yet you still cry about Ryzen and its clock speeds.. but you STILL refuse to admit the fraud Intel calls its TDP spec ?? So whatever, maxipadking. Go back to your cave...
Yeah, my self-esteem is so low that I regularly visit Mercedes and BMW showrooms only to tell them how their cars are overpriced and my Dacia is cheaper and can perform the same while consuming less gas, like you do. If Intel's TDP is fraud, then so is AMD's, and their promised boost clocks, and the video on YouTube where they promise you can overclock the chips even further with sufficient cooling. What do they mean by that? Ay, and what about the Bulldozer fraud?
Yea, sure you do; you're the one who is probably poor... You are becoming the worst Intel shill on here now.... All you EVER do is talk. If you are so sure AMD is committing fraud as you claim, then put your supposed money where your mouth is and take AMD to court, or shut up.
Again, it is you, you and only you perpetuating lies. I never come here first talking **********, I only reply to AMD fanboys' comments.
I do not own any AMD CPU; I do not buy subpar products, so I cannot take them to court.
Anyway, if you are so sure about Intel's wrongdoings, take them to court. EZ.
Unfortunately for you, it is AMD who lost in court and got caught misleading about that parody of a CPU called Bulldozer, claiming to possess 2 times more cores than it actually had.
maxipadking.. you are so full of it... What about Intel's lies about its 10nm node being on track for the last, what, 6 years ?? What about the lies about them not doing anything wrong to prevent AMD from selling its products ?? Among various other things over the years that you so easily forget... You never come here first ?? BS. Actually, you DO buy subpar products.. Intel is subpar now.. but in your Intel blindness you just don't see it... Intel's marketing has been worse over the years than AMD's.. deal with that.
OK maxiking, IF you THINK you are so smart and know it ALL, let me ask you this:
WHY does it take a HIGHER clocked Intel chip to beat a LOWER clocked equivalent ?? I would LOVE to read your reason for this one.
Clock both Intel and AMD at the SAME clocks, and I bet AMD's chips would slaughter Intel's in EVERY metric: performance and power usage.
I will tell you what, I will save you the trouble: you WON'T reasonably answer those 2 questions with a real, logical answer; you will just attack me like you do others on here, call me names, or insult me, among other things. But the answer to both is that AMD's current chips now either match or surpass Intel in IPC and in performance per watt.
Come on maxiking, I dare you to answer those 2 questions with a reasonable, logical answer, WITHOUT being insulting or resorting to name calling.
P4 cough cough..
*cough, cough* Emergency Edition *cough, cough* Man, whatever is going around is really catchy.
And much like the P4 EE, the power consumption is through the roof. Ahh, the days of Presshot warming my dorm seeding a Napster queue. You really have to appreciate (again) what AMD is able to pull off here at 65W. It's literally on the heels of a CPU burning 3x more power.
https://www.phoronix.com/scan.php?page=article&...
*cough, cough* butthurt AMD fanboy detected *cough, cough*
Qasar - Monday, November 11, 2019 - link
how so??? seems there are more Intel fans butthurt lately...
peevee - Friday, November 1, 2019 - link
:)Funny, I had exactly the same thought.
But honestly, for many real world uses (games etc) 4 cores at even faster frequency would be even better. Physically separated on the die as far as possible (in the corners) by huge amount of shared L3.
I guess 5GHz base/6GHz turbo is not out of the question within the same TDP with liquid cooling.
eva02langley - Thursday, October 31, 2019 - link
A joke of a CPU. How can this cost more than a 3900X?
eva02langley - Thursday, October 31, 2019 - link
Not to mention that with the price gouging, you are almost near the MSRP of the 3950X.
prophet001 - Thursday, October 31, 2019 - link
People want this for clock speed. 12 slow cores aren't helpful if you need a few fast ones.
eva02langley - Thursday, October 31, 2019 - link
AHAHAHA... okay, we have a blind fanboy here. Do you know anything about IPC? This CPU get destroyed in EVERYTHING except old games running outdated engines at 1080p. SO unless you buy this with a 2080 TI and a 240Hz 1080p monitor, you are not going to benefit from it.Basically, with a budget, the 9900KS is a waste of money, period. The money you save buying a 3700x and investing in your GPU will give you some serious gaming performances increase.
xenol - Thursday, October 31, 2019 - link
Honestly anything more than a midrange CPU and GPU is a waste of money for most people.Opencg - Thursday, October 31, 2019 - link
People fail to consider other use cases. For competitive gaming, or someone running 240Hz 1080p with a high end GPU and willing to tweak settings to make their games CPU bound, this is still the best CPU. Unfortunately not all testers optimize their CPU tests to be CPU bound in games, but if you look at the ones that do, Intel still poops on AMD. Sure, most gamers dont give a shit about fps above 160 or so, but some do. When I ran Overwatch I tweaked the config file and ran 400fps. If I was running CS:GO I would push the fps as high as possible as well.
Also imo the biggest use case for AMD CPUs for gamers is futureproofing by having more cores. Most gamers are just gonna play their games with a few tabs open and maybe some music and Discord running. Not everyone is running CPU-based streaming encoding at the same time.
Galid - Thursday, October 31, 2019 - link
Well I don't seem to notice the same thing you do for max fps in games where you need 240Hz, for example. At most, I can see 10 to 15 fps difference in Counter-Strike at around 400fps. I looked around and found a lot of tests/benchmarks. There is no such thing as ''this is the best CPU and you'll notice a difference in the games that matter for competitive gaming''. I might be wrong; if so, enlighten me please. I'm about to buy a new gaming rig and, like 99.98% of the population, I'm not a competitive gamer. I don't consider streaming competitive either.
But in Ubisoft's single player games, I noticed it does help to get closer to 120Hz at the resolutions and details that matter for these non-competitive games.
Galid - Thursday, October 31, 2019 - link
BTW I compared ryzen 7 3700x and i9 9900k and got to the above conclusion.eek2121 - Friday, November 1, 2019 - link
Look at the 95th percentiles. Ignore average fps. AMD and Intel are virtually tied in nearly every game. I cannot believe we have reached this point. Finally after a decade, AMD is back in business.evernessince - Friday, November 1, 2019 - link
You do realize that running your CPU or GPU at 100% max utilization increases input lag, correct? FPS isn't the only thing that matters. If the CPU cannot process new inputs in a timely manner because it's too busy with the GPU, then the whole exercise of increasing your FPS was pointless. You should cap your FPS so that neither your CPU nor GPU exceeds 95% utilization. For the CPU this includes the core/cores that the game is running on. You lose maybe a handful of FPS by doing this but ensure consistent input lag.
CptnPenguin - Friday, November 1, 2019 - link
Not sure how you managed that. The engine hard cap for Overwatch is 300 FPS.eek2121 - Friday, November 1, 2019 - link
Not true. AMD has the entire market pretty much cornered, though. So it doesn't matter whether you buy high end or mid range, Intel chips in general are a bad choice currently. Intel desperately needs to rethink their strategy going forward.bji - Thursday, October 31, 2019 - link
Well kudos for at least admitting that you are a blind fanboy early in your post.Slash3 - Thursday, October 31, 2019 - link
WCCFTech's comment section keeps leaking.Sivar - Thursday, October 31, 2019 - link
You might want to look at the benchmarks. Intel won most of them, with fewer cores.
I was seriously considering an 8- or 12-core AMD, but Intel still ended up the better option for everything I do except video transcoding, in which AMD clearly wins.
Other considerations: No cooling fan on the Intel motherboard, better Intel quality control and testing in general, more mature product (because the 9900 is an iteration of an iteration of an iteration...etc.)
Korguz - Thursday, October 31, 2019 - link
one thing you should consider.. NO cooling for this CPU at all.. so add at least $90 for that...
AshlayW - Friday, November 1, 2019 - link
This too, the Intel CPU doesn't even include a cooler.liquid_c - Sunday, November 3, 2019 - link
AMD does, indeed, include a cooler. But stop acting like a frog and admit that you cannot (properly) use the CPU with that cooler. You'd still have to, eventually, get an AIO or a Noctua. So the fact that you'll have to pay an extra $90 or so is moot, in this case.
Korguz - Sunday, November 3, 2019 - link
moot?? i dont think so. the cost of this CPU + $90 at least, just to be able to use it; Ryzen 3000, no need for a cooler as it is included. if one wants better cooling, then it is an option, not a necessity. for the price, AMD wins. for performance, for the most part, AMD wins as well, especially in MT. and power usage, AMD wins there again. you know, like how people were bashing AMD before Zen came out?? WHY are people not bashing Intel the SAME way now??
amnesia0287 - Friday, November 8, 2019 - link
3950X doesn't include a cooler.
Xyler94 - Thursday, October 31, 2019 - link
The cooling fan on X570 is if you're doing something as crazy as dual PCIe 4.0 NVMe RAID 0. Otherwise, they don't need to spin, as the passive heat dissipation is enough, and Gigabyte makes a board with no fan at all.It's great that Intel works better for you, but the use cases for it comes mostly down to super high refresh rates (240hz displays at 1080p) with medium settings. Otherwise, who cares if you lose maybe 5 frames if you're never gonna see those frames anyways, and if the 12 core Ryzen does almost everything better than the 8 core 9900KS in multi-threaded tasks, then I don't see the drawbacks in buying AMD.
Intel does lead AMD in the core speed and overclocking department however. If you're an enthusiast overclocker, then by all means, disregard AMD
AshlayW - Friday, November 1, 2019 - link
"Better Intel quality control and testing"Substantiated proof please? Otherwise, stop talking out of your backsie, many thanks.
Hey, look I can also be anecdotal: In 10 years of building PCs I've never had a CPU die, Intel or AMD, and only 3 motherboards, two of which was my fault.
Spunjji - Friday, November 1, 2019 - link
My anecdotes: I've had two AMD CPUs die. One was a Duron that I ran with the heatsink improperly mounted, causing it to go pop, and the other was a Barton-core Athlon XP that had a corner of the die chipped off from having a heavy CPU cooler dismounted and remounted too many times. Both my fault, albeit both consequences of design choices (no thermal regulation on the Duron, no protective heat spreader on the Barton).
I've had 3 Intel CPUs go faulty. All were Sandy Bridge, and every single one appeared to work correctly most of the time but would do weird things like suddenly stutter in games or crash without warning. None of them had been overclocked or otherwise "misused".
Off that basis I'd say AMD do better than Intel. *shrug*
Sivar - Friday, November 1, 2019 - link
AMD's random number generator instruction is crashing Linux and compromising security because it always reports that it can generate a legitimately random number, and it always generates the same one. This has made systemd in Linux issue a patch that always assumes the generator is bad when it returns that value, 0xffffffff, which means that when a working implementation like Intel's generates that number legitimately, Linux will assume it is defective too. If they had, say, tested the chip properly, the fact that many distributions of the world's most popular server operating system have CPU-related crashes might have been found.
AMD made the first chip I have ever used that could not run stable at its stock speed, but could run underclocked (Athlon XP).
AMD owns the company formerly known as ATI. Somehow, the quality of their drivers is controversial, but after giving them 6 or 7 chances since the 90's, I will never use that hardware again due to the hilariously low-quality drivers and driver support software. The hardware is often fine, but has at many times been reputed to run too close to the line, requiring intense cooling with leaf blower-like fans, and allowing almost zero overclocking.
Really, the idea that AMD's QA is inferior to Intel's seems strange to even question. AMD, even with their successes here and there (AthlonXP, Athlon 64, Ryzen... and that's pretty much it) has always held a reputation for being the off-brand. Anecdotally, problems with AMD systems have long been considered more likely than with Intel.
I wish AMD the best, and I hope their next-gen Ryzen 4000 series shows a clear win over Intel other than in just core count, but for as excited as I was to get a 3800X or 3900X, when I looked at the numbers and at my own long history as an AMD advocate yet purveyor of fact, I had to conclude that AMD still doesn't quite measure up other than in specific corner cases.
Galid - Friday, November 1, 2019 - link
I owned multiple systems over the past here are some of the cpus: Intel 80286, intel pentium 133, intel pentium II 233 and 350, AMD Athlon 64 3200+, AMD Athlon x2 4200+, Intel Core 2 duo e5400 and q6600, AMD FX-8350, intel i5 2500k and then my brother got a ryzen 3700x that we just built lately. I won't go into all the video cards I've owned/purchased to build systems for my friends/family but here are some of them: ATI rage 128, Nvidia riva tnt2, geforce 2 mx, ATI Radeon 8500, ATI radeon 9700 non-pro, Nvidia 8800 gt, ATI radeon 4850 then upgraded to 4870 1gb and then I pretty much tried everything that came out after that for both team.Sincerely, the biggest problems I've had so far was my geforce 6800 gt's drivers totally incompatible with my Athlon 64 x2 3200+ and that was due to the chipset. Nvidia knew the problem was widespread. We could only fix this by using a very old driver(can't remember which) and sometimes underclock would work. They never fixed this.
My father still uses the FX-8350 and that cpu was not well received back in the days. He has no problems whatsoever.
Again, I'm not a linux user but I've got a friend who swears by linux and he uses AMD hardware. I could ask him why but I do not really care anyway. I remember when I bought my i5 2500k there was a major recall on motherboard including mine for b3 stepping because of the S-ATA controller risk of dying or something like that. 90% of the systems I named above are still running strong and of that 10% dead, it's mostly the motherboards or ram, which aren't built by AMD nor INTEL.
I'm no fanboy, I bought so many systems with always the same objective being the best performance for the dollar depending on the usage of the said system. I just had to comment because when I see someone commenting like that, I think they were simply unlucky and that it has nothing to do with what they beleive.
Galid - Friday, November 1, 2019 - link
A little mistake on my part, it was my AMD Athlon 64 3200+ and not the x2. The worst about this problem was that the chipset (nForce3-250 or something like that) was made by Nvidia and it had problems with their video cards only.I forgot to mention that we bought a new system for my brother because the CPU died, I think that's the only cpu that ever died on me and it was an i5-2500k. The most shocking about that is I overclocked mine to 4.7ghz for all these years and it's still going strong(hence the reason I'm still waiting to upgrade). He bought some closed loop water cooling and never overclocked the cpu still it died on him... shame.
alufan - Wednesday, November 6, 2019 - link
hmm, so Intel has better quality etc? lets consider for a moment all the security issues with Intel. then lets look at the way they refused to develop the CPU until AMD came along with 12, then 12+, then 7 and shortly 7+, meanwhile Intel cannot make a decent 10nm chip. speaks volumes about your argument. then lets look at the TDP: the AMD chip at 65w is almost neck and neck with the Intel one at 255w!
Only CPU I ever had fail was a Core 2 Duo, never lost a GPU from either camp, but guess what... the Intel GPU is being design-led by a former AMD/ATI staffer, as is the new Intel CPU as well. think we can leave it there
outsideloop - Friday, November 1, 2019 - link
If you want stream your game while paying, get the 3900X.outsideloop - Friday, November 1, 2019 - link
While playing...
flyingpants265 - Monday, November 4, 2019 - link
No more of the weird streaming comments please. Nobody really streams.BikeDude - Wednesday, November 6, 2019 - link
"more mature product"
But all reports so far indicate that Intel has been hit much harder by Spectre-class bugs?
"All the issues that came out this year, were reported not to be an issue on AMD" (https://www.theregister.co.uk/2019/10/29/intel_dis...
Midwayman - Thursday, October 31, 2019 - link
Curious you think that someone considering a 9900KS is a 'budget' gamer. You could easily make that argument with any high end component. I'd expect them to be pairing this with both a 2080ti and a high refresh monitor.evernessince - Friday, November 1, 2019 - link
I think you are mistaking enthusiast for "fool". I've bought a 980 Ti and a 1080 Ti but I sure as hell ain't going to buy a 2080 Ti. I had a 5820K and bought a 3700X.Thankfully, there are some of us with some fiscal responsibility.
Spunjji - Friday, November 1, 2019 - link
Cheers for doing your bit to rein in the madness.
Death666Angel - Saturday, November 2, 2019 - link
And to add to that, I can easily fit a high end 200W+ TDP CPU cooler in my small mATX case, but I cannot fit a graphics card that is longer than 26 cm (23.5 cm after my front fan modifications) into my case. SFF systems are more limited by graphics card size than cooler size most of the time. And the best cards in a smaller form factor are 1080 TI and 2070 Super as far as I know.imaheadcase - Thursday, October 31, 2019 - link
So he makes a point about why people want a CPU and he is instant fanboy? Well you certainly got the boy part down pat..Jorgp2 - Thursday, October 31, 2019 - link
What are you talking about?
Zen 2 only has a tiny IPC advantage over Skylake.
Spunjji - Friday, November 1, 2019 - link
10% isn't tiny, especially when boosting clock speeds by 10% is no longer trivial.Korguz - Friday, November 1, 2019 - link
imagine what it would be like if the clock speeds were higher... if zen 2 is this close, or faster with the clockspeed disadvantage it has now.. what will it be like if zen 2 was hitting 4.6+ ghz ???outsideloop - Friday, November 1, 2019 - link
We have a few months until Zen 3 clocks will leak.MDD1963 - Thursday, November 7, 2019 - link
Don't a full 10% of the 3900X samples actually have a core or two hit their advertised 4600 MHz for about 5 full consecutive seconds....sometimes? :)Korguz - Friday, November 8, 2019 - link
dont Intel CPUs NEED the higher clocks in order to have the performance they get? clock them at the Ryzen equivalents.. and see how well they perform.. when will people realize clock speed isnt everything??
amnesia0287 - Friday, November 8, 2019 - link
What would it be like if it hit 6ghz?Does it matter since it can’t?
I am very eager to see Zen 3 tho. Regardless of all the fanboys. I feel like it’s very much a mixed bag to chose between intel and amd just cause of some of the instruction sets and #of threads used by various things.
But if even half of the stuff about Zen3 is true 2020 should be AMDs year as I don’t believe anything intel launches until 2021 is really gonna be competitive.
rocky12345 - Tuesday, November 5, 2019 - link
Zen 2 has more like a 13%-15% IPC gain over released Intel CPUs right now at the same clock speeds, depending on workload. There was a video on YouTube where one of the bigger YT channels did a side by side of an AMD Zen 2 CPU and an Intel 9900K both @4GHz, and I was surprised that the AMD chip was ahead in most everything by a fair amount.
When it came to gaming, though, Intel had a slight lead in a few games that seemed to favor Intel. But there were also games that AMD got wins in as well. This would explain why AMD with a CPU with the same core count is now able to match Intel even though AMD has a lower clock speed, and even come close or match in a lot of games. I am no fanboy for either camp, I currently only own Intel based systems, but would be more than willing to look at either camp's hardware when I do my next set of upgrades. That is just how good things are now on either side, and with AMD finally back in the game and putting pressure on Intel, prices are also now getting better on the Intel side of the street as well. It is a win for everyone when things like this start happening.
AshlayW - Friday, November 1, 2019 - link
Hey, it's a good thing the 3900X doesn't have slow cores then, does it :)Seriously, though. i3-9350KF exists, go buy that if you want clock speed :P
prophet001 - Monday, November 4, 2019 - link
I mainly play WoW and this would do a much better job than a 3900.Why does that tilt people?
Qasar - Monday, November 4, 2019 - link
you sure about that??? i have 2 comps, both with an Asus Strix 1060 Gaming OC, one with a 5930K, the other with an FX 8350, both max eye candy less AA (4x) and AF (4x as well i think), and get about the same FPS. the 3900 will prob use less power over this.
MDD1963 - Thursday, November 7, 2019 - link
I'm surprised an FX8350 can saturate a GTX1060, but, you maxing out the details and quality is the only reason the FX keeps up...; it's like saying the i5-8400 matches the 9900KS at 4k with a GTX1070. (Of course it does)eek2121 - Friday, November 1, 2019 - link
bwahahah, have you looked at the benchmarks? Enjoy your 3-5 extra FPS in gaming. ;)Korguz - Friday, November 1, 2019 - link
and the added power usage....Chaitanya - Thursday, October 31, 2019 - link
Also comes with only 1 year warranty. By special it really should mean mentally defective edition.amnesia0287 - Friday, November 8, 2019 - link
How many people ever actually use the warranty anyway lol.Samus - Friday, November 1, 2019 - link
Because it's drop-in compatible with the poor sap who isn't getting enough from the cheap Walmart i3 gaming PC they overpaid for.Unfortunately AMD just doesn't have the presence in retail to gloat that. Ironic, because traditionally AMD has had a superior upgrade path, keeping sockets longer and (provided motherboard vendors support their boards) new microcode support via BIOS updates.
josiasmat - Thursday, October 31, 2019 - link
I find it funny that in the past Intel CPUs were praised for their power efficiency over AMD ones. Now that AMD has a 65W CPU that is almost as fast as the reviewed CPU, it doesn't matter at all...Sivar - Thursday, October 31, 2019 - link
Indeed, the 7nm process is clearly a win here. That said, total platform power with Intel (9900KS excluded!) still tends to be very similar or even lower, in part due to the rather power-hungry 14nm AMD X570 chipset.
X470-based AMD systems still win in most cases, but not by an extremely large amount.
Spunjji - Friday, November 1, 2019 - link
172W for 8 cores at 5 GHz with the 9900KS.
142W for 12 cores at ~4.2 GHz for the 3900X.
The X570 chipset TDP is around 15W, 6W for Z390. There's simply no aspect of power efficiency where Intel come out on top here.
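Taking the figures quoted just above at face value, the efficiency gap can be reduced to core-GHz per watt, a crude proxy that ignores IPC and workload. A minimal sketch:

```python
# Crude throughput-per-watt estimate from the figures quoted in the comment above.
# core_ghz = cores * clock is a rough proxy only; it ignores IPC, memory and workload effects.
parts = {
    "i9-9900KS": {"cores": 8,  "clock_ghz": 5.0, "watts": 172},
    "Ryzen 9 3900X": {"cores": 12, "clock_ghz": 4.2, "watts": 142},
}

for name, p in parts.items():
    core_ghz = p["cores"] * p["clock_ghz"]
    print(f"{name}: {core_ghz / p['watts']:.3f} core-GHz per watt")
```

By that rough measure the 3900X delivers about half again as much core-GHz per watt as the 9900KS, which is the gap this sub-thread is arguing over.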
Sivar - Friday, November 1, 2019 - link
Please read the first couple of sentences before criticizing.No one is saying the 9900KS is a power efficiency winner. Indeed, when running compute-intense tasks, AMD's larger CPUs are likely the winner most of the time, but you cannot post one set of measurements with one specific configuration and draw broad conclusions. Other measurements differ, and most pit the idle power draw (where most PCs remain most of the time) as lower for Intel. For example,
https://www.guru3d.com/articles-pages/amd-ryzen-7-...
Korguz - Friday, November 1, 2019 - link
thats funny.. cause before Zen.. thats what people were saying about AMD.. and the power those chips use. now its the opposite, and its ok for Intel to use so much power??? come on
vMax65 - Friday, November 1, 2019 - link
Just to add to this... der8auer just tested the 9900KS and it runs games all core 5GHz all the time at between 98W and 126W.... Video showing this starts at 3:50.. the only time it uses more power is if you are doing high compute loads and/or AVX workloads... but for gaming, it will stay well below... https://www.youtube.com/watch?v=YWSn0cHauJ4
Korguz - Friday, November 1, 2019 - link
well, this review, says other wise. maybe different test setups ??Jorgp2 - Thursday, October 31, 2019 - link
That's because AMD's CPUs were woefully underpowered.
Intel is still more efficient in the mobile space.
evernessince - Friday, November 1, 2019 - link
Humans find ways to defend their purchases. RTX 2080 Ti owners have 1,200 reasons to defend theirs.AshlayW - Friday, November 1, 2019 - link
If only I could upvote this post...
vMax65 - Friday, November 1, 2019 - link
Surprising that it seems to hurt some that others might want to buy an RTX 2080 Ti or a 9900K or a Ferrari for whatever reason. Are we now all to be lambasted for buying anything that maybe costs more?... or heaven forbid they want the best!!! Oh no... run!!! You might as well start on all those that buy better shoes, cookers, microwaves, TVs, bread, cars, chocolates, phones etc, etc ad infinitum, or for that matter any branded products period, as they cost more than their non-branded counterparts... Hold on, we are not going far enough. here's an idea, lets all wear the same clothes, live in the same houses, watch the same TV, eat the same gruel, have only one manufacturer of all products....... so no one can be different and most importantly we certainly do not want any choice, creativity, design et al.... Madness......
Midwayman - Thursday, October 31, 2019 - link
I'm mostly reading it's a bad time to buy a 9900K since Intel will have binned out all the golden samples.
Supercell99 - Thursday, October 31, 2019 - link
Yea you are not going to be able to OC a 9900K much. They have panned through all the 9900 gold.
Jorgp2 - Thursday, October 31, 2019 - link
This is a new stepping.TEAMSWITCHER - Thursday, October 31, 2019 - link
The time to hop on the 9900K train was at the introduction.. Now just wait for Comet Lake.AshlayW - Friday, November 1, 2019 - link
Or be a smart consumer and wait for 3900X to be back in stock :)Sivar - Friday, November 1, 2019 - link
Most PC users make better use of fast cores than more cores.GreenReaper - Friday, November 1, 2019 - link
Historically this has been very true. However browsers nowadays can take advantage of many cores, as can games - they have been forced to, or else be outcompeted by those who do.Kjella - Thursday, October 31, 2019 - link
Considering it's a brief holiday special I would think this is all the golden samples from the past year. I'm quite sure they all pre-bin and keep chips that are "perfect" in some way like no defects/high frequency/ultra low voltage even though they don't have an SKU for it yet. Normally they'd launch one notch up as a 9950K or something, with this time limited special I guess Intel is saying we'll never produce these chips in enough volume to justify that so we're doing a limited edition for PR instead.Spunjji - Friday, November 1, 2019 - link
Nailed it.GreenReaper - Saturday, November 2, 2019 - link
They keep the duds, too. That's one reason the Pentium and Celeron come out later. The other being they don't want to spoil the market for higher-end chips.DigitalFreak - Thursday, October 31, 2019 - link
Once you cut through all the Intel marketing BS, to run all cores at 5Ghz as advertised it's a 172W processor.Jorgp2 - Thursday, October 31, 2019 - link
Did you actually read the article?
Korguz - Thursday, October 31, 2019 - link
um.. it is a 172W CPU.... did you read it jorgp2?? Intel TDP... is crap.. this is listed as being a 127W CPU.. at BASE frequency. at 5GHz... it will probably use 172W.. or more
vMax65 - Friday, November 1, 2019 - link
Err... no... der8auer just tested the 9900KS and it runs games all core 5GHz at between 98W and 126W.... Video showing this starts at 3:50.. It just depends on what you are doing and whether you are using heavy compute applications, but for most games and normal day to day usage it will happily run well under 172W... and that will never be an issue, and that's really where this CPU is targeted... albeit those with deeper pockets... https://www.youtube.com/watch?v=YWSn0cHauJ4
Korguz - Friday, November 1, 2019 - link
what was his test setup??? does it state that?? i dont see how this chip can use less power than its own siblings.
GreenReaper - Saturday, November 2, 2019 - link
It's possible that it is if the chip is more power efficient (i.e. needs a lower voltage for a given load), which it has to be just to reach 5Ghz on all cores without sending power through the roofDigitalFreak - Friday, November 1, 2019 - link
Yep, I did. Their testing showed that all cores running at 5Ghz had the CPU using 172W. If you're not going to keep all cores running at 5Ghz, except during idle, what's the point of buying this over a 9900K?vMax65 - Friday, November 1, 2019 - link
I am not sure how you missed it, but he clearly says when running CSGO, SOTTR and even Timespy GT1 and GT2 the CPU is always at all core 5GHz all the time with max power draw between 75W and 126W..... He clearly states that at 4 minutes and onwards... The only time this CPU will draw max power of 170W is when you are doing heavy duty compute loads or testing with AVX... For normal use like gaming, which is what most people will buy it for, it will never be at 170W plus..
Korguz - Friday, November 1, 2019 - link
again.. you dont state what the test setup was. how it was configured.. so, either he did something wrong, or anandtech did, and sorry to say, i trust AT more...vMax65 - Saturday, November 2, 2019 - link
Asus Z390 Maximus XI Extreme, RTX 2080 Ti and not sure what RAM... but as he clearly states this was during the benchmark runs, and in the comments he covers forgetting to mention the 2080 Ti but clarifies this point. Bottom line: for 'normal' gaming usage this CPU will run all core 5GHz all the time at or under 127W...
vMax65 - Saturday, November 2, 2019 - link
And Anandtech tested power with Cinebench...not with games....Shekels - Saturday, November 2, 2019 - link
Y'all are not understanding vMax65 whatsoever. He's talking about IN GAMES and not in benchmarks. It is extremely rare for your CPU to come close to 100% utilisation while playing games unless you're running a competitive setup at super low settings in a game like Overwatch or CSGO. Just because a CPU is running at all core 5GHz doesn't mean it's consuming 172W. No, it's consuming 172W at 100% utilisation. So with that being said, when you're using it at 70-80% utilisation it will consume far less power, so the 90ish W to 125W makes complete sense. I hate Intel and their practices, and their CPUs are overpriced and uncompetitive, but don't spread false information without understanding how a CPU even draws power...
RSAUser - Wednesday, November 6, 2019 - link
I'm very, very sure it will not run at all-core 100% 5 GHz for CS:GO and I still don't know why you'd want to do it. I'd limit it to just above monitor refresh rate.Korguz - Saturday, November 2, 2019 - link
ahhh.. so you are trying to compare apples to oranges......liquid_c - Sunday, November 3, 2019 - link
For a CPU directed towards power gamers, tell me, please, in what circumstances do you see a gamer or a normal, day to day user, execute Cinebench on their PC? Stop being such a dummy and stop acting like a rabid dog over a product that not only you will never get but clearly, it's not geared towards you.Korguz - Sunday, November 3, 2019 - link
and keep in mind the test beds were DIFFERENT, that is a factor there as well. how would you know if someone would never get this CPU?? and now you resort to insults and name calling?? that makes your opinions pointless.
satai - Thursday, October 31, 2019 - link
Can we get compilation benchmarks, pretty please?lux44 - Thursday, October 31, 2019 - link
Would it be possible to find a compilation benchmark? Chrome or Firefox perhaps? I remember Linux kernel compilation finished in 30 seconds, perhaps it's getting too small...Given .Net Core is cross platform, maybe compiling the platform itself or Roslyn compiler? If I remember correctly, compiling the framework right now is not trivial, but the goal is to streamline it for .Net 5 release.
Slash3 - Thursday, October 31, 2019 - link
Michael over at Phoronix has some compilation test results, but they are Linux centric.https://www.phoronix.com/scan.php?page=article&...
lux44 - Friday, November 1, 2019 - link
Thank you!dguy6789 - Thursday, October 31, 2019 - link
As long as you have good cooling every single 9900K ever made does all cores at 5Ghz all the time without any sort of turbo time limit when the motherboard has MCE(multi core enhancement) enabled- most have this enabled by default. The 9900KS is nothing new.Jorgp2 - Thursday, October 31, 2019 - link
9900KS most likely does it at a lower voltage and power, plus it has hardware mitigations.Korguz - Thursday, October 31, 2019 - link
this right there... shows you didn't read this article.
edzieba - Friday, November 1, 2019 - link
The 9900KS has a different hardware mitigation report to the 9900K. See Phoronix's article p.1: https://www.phoronix.com/scan.php?page=article&...
Khenglish - Thursday, October 31, 2019 - link
I think it would be interesting to see how the 3700x and 3800x do if you bump their power limit up to 9900KS levels. The 3700x is probably being held back quite a bit at 65W but still is competitive.imaheadcase - Thursday, October 31, 2019 - link
"AMD also has the 16-core Ryzen 9 3950X coming around the corner, promising slightly more performance than the 3900X, and aside from the $749 MSRP, it’s going to be an unknown on availability until it gets released in November."Wasn't that delayed till December? I swear i saw that in the news..
imaheadcase - Thursday, October 31, 2019 - link
oh nm, i got months messed up. I read that last month.. forgot it's already Oct. LOL
jgarcows - Thursday, October 31, 2019 - link
Your IGP tests aren't very useful. I'm not going to run a game at 720p with the lowest settings on my IGP if I'm getting 250+ FPS. I'm going to increase the resolution or graphics settings until I'm getting the best visuals I can while still hitting 60 FPS.Please compare with settings that give around 60 FPS on the IGPs so we can get an idea of how the integrated graphics performs when you actually stress it some. At 720p, you are just testing the CPU cores and not the IGP at all.
Slash3 - Thursday, October 31, 2019 - link
This has been a bugaboo of mine for quite a while.There are *no* IGP benchmarks. What you're seeing is a benchmark detail setting that AT refers to as "IGP." In effect, "Very Low" detail. The test is still run on the GeForce GTX 1080 and not the CPU's built-in graphics.
My other complaint is that the use of a GTX 1080 makes the High preset results entirely meaningless, as they're almost always limited by the GPU. It's a waste of graphs and a waste of test time.
katsetus - Friday, November 1, 2019 - link
The gaming benchmarks as they are shown here are absolutely pointless - all they show is that the GPU is the more important factor.Them saying that they keep the GTX 1080 for the comparability with older results is pointless, when the variability between three generations of CPUs is 3% at best, because the whole thing is nowhere near being limited by the CPU.
Cellar Door - Thursday, October 31, 2019 - link
Blender chart is missing Ryzen 3700 and 3900.airdrifting - Thursday, October 31, 2019 - link
What a joke and POS, and they have the audacity to claim 127W TDP?
A 9900K out of the box, without messing with the BIOS, is drawing 180 watts at its default 4.7GHz turbo, and you expect me to believe 127W TDP? I know fanboys are going to defend TDP with "it's not actual power usage" blah blah blah, but it should be usable as a point of reference, and frankly actual power draw being 100% over TDP is not good enough for most people.
dullard - Thursday, October 31, 2019 - link
I'm confused by your post. It is a 4 GHz processor with 127W. When you ramp up the speed over 4 GHz, it of course uses more power. Plain and simple.Your problem is that you think it is a 5 GHz processor.
airdrifting - Thursday, October 31, 2019 - link
Are you not capable of reading? 9900K out of box runs 4.7GHz all core turbo without "ramp up" anything, that is its default speed without overclocking. Do you even own the processor?dullard - Friday, November 1, 2019 - link
Yes, the 9900K turbos at 4.7 GHz if given proper cooling. But, again, you do not understand the power rating nor do you understand turbo. The 9900K does not turbo to 4.7 GHz in many circumstances (think of a field application in a hot desert, or in an enclosed box without air holes, or in a dusty environment, or in a low pressure environment, or an application with passive cooling, etc).The 9900K is actually a 3.6 GHz processor at 95W of power. If you happen to be in an environment and application that allows great cooling, then the 9900K will go up to 4.7 GHz at far more power.
airdrifting - Friday, November 1, 2019 - link
Wow you are a pathetic loser. Just admit you are wrong instead of bringing up extreme cases like "hot desert" "enclosed box" to make yourself look ridiculous.Of all the processors sold, how many are going to run "in an enclosed box without air holes"? 1%? 2%? Even without proper cooling 9900K still runs 4.7GHz turbo as long as the motherboard can support it, hitting 100 Celsius without throttling, that is a FACT. Now go away, stupid fanboy.
dullard - Friday, November 1, 2019 - link
The answer is "whatever percent of computers aren't regularly cleaned". Haven't you ever opened someone else's computer and found enough dust/fur to create a large stuffed animal? Reviews of new CPUs in new computer cases with sparkling new fans are nothing like real life for the vast majority of CPUs after real-world usage.
I'm only explaining the way TDP is defined. TDP is the max power used at base speed. CPUs are only guaranteed to run at base speed. Turbo is NEVER a guarantee. And when a CPU is in turbo it will use more power than TDP. If that makes me a fanboy, then which product am I a fanboy of?
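Since the disagreement here is really about the gap between the TDP printed on the box and what the chip draws under turbo, the most direct check is to read the package power counter on your own machine rather than argue from the spec sheet. A minimal sketch, assuming a Linux system that exposes the intel-rapl powercap interface (the path can differ by system and reading it may require root):

```python
# Sample the RAPL package energy counter twice and report the average watts in between.
# Assumes /sys/class/powercap/intel-rapl:0 exists (Linux, Intel CPU); adjust the path if not.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

e0, t0 = read_uj(), time.time()
time.sleep(5)                        # run your game or benchmark load during this window
e1, t1 = read_uj(), time.time()

watts = (e1 - e0) / 1e6 / (t1 - t0)  # microjoules -> joules -> watts
print(f"Average package power: {watts:.1f} W")
```

The counter eventually wraps around, so long sampling intervals need extra handling, but for a quick spot check during a game or benchmark run this is enough to see how far above (or below) the rated TDP the package actually sits.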
Sivar - Saturday, November 2, 2019 - link
I always know that someone is seasoned, wise, and definitely not a 14-yr-old when they throw personal insults out over a CPU wattage disagreement.AshlayW - Friday, November 1, 2019 - link
As airdrifting said below, it's misleading because absolutely none of the Intel Turbo capable CPUs run at their base clock only. I've seen 'reviewers' and people comparing power use characteristics based on TDP, and concluding that the 9900K (non S, mind, this was before) is more efficient than the 3800X. Go figure.
vMax65 - Friday, November 1, 2019 - link
Just to add to this...der8aur jus tested the 9900KS and it runs games all core 5GHz at between 98w and 126w....Video showing this starts at 3:50..https://www.youtube.com/watch?v=YWSn0cHauJ4
Korguz - Friday, November 1, 2019 - link
broken record
AshlayW - Friday, November 1, 2019 - link
Also: This CPU is advertised as a "5 GHz" processor, well, pretty much everywhere.Spunjji - Friday, November 1, 2019 - link
This. When the whole selling point of the CPU is a higher turbo speed, advertising the TDP at the base clocks is just plain misleading.Jorgp2 - Thursday, October 31, 2019 - link
Do you know what TDP is?Korguz - Thursday, October 31, 2019 - link
do you???
diehardmacfan - Thursday, October 31, 2019 - link
what's the point of reading a review if you are still going to pay so much attention to a number on a box?
Korguz - Thursday, October 31, 2019 - link
cause it is kind of an important stat ??diehardmacfan - Friday, November 1, 2019 - link
It's important for OEMs and cooler manufacturers and means very little to the consumer, as it doesn't represent power usage on AMD or Intel.
Korguz - Friday, November 1, 2019 - link
but it could be used to select a cooler for that CPU.. as Intel's CPUs dont come with a cooler
vMax65 - Friday, November 1, 2019 - link
take a look at der8auer's latest video on the 9900KS on TDP and power consumption, specifically around gaming at an all core 5GHz. The CPU draws under 127W all core in games and only goes up on serious workloads like compute. Start at 3 minutes 45 seconds...ish. https://www.youtube.com/watch?v=YWSn0cHauJ4
vMax65 - Friday, November 1, 2019 - link
der8auer just tested the 9900KS and it runs games all core 5GHz all the time at between 98W and 126W.... Video showing this starts at 3:50.. https://www.youtube.com/watch?v=YWSn0cHauJ4
mikato - Thursday, October 31, 2019 - link
Is it just me or does it seem like the 3700X wins all those tests?Sivar - Thursday, October 31, 2019 - link
I am pretty sure it's just you. The 3700x won exactly one test case: Strange Brigade on low detail.AshlayW - Friday, November 1, 2019 - link
In terms of value, yeah, it pretty much does. It beats Intel's entire lineup: -5-10% performance (even in gaming), -40% price, and +30% efficiency, and it includes a cooler too. The thing is, the Ryzen 5 3600 does it, and then some.
Sivar - Friday, November 1, 2019 - link
The Ryzen 5 3600 is probably overall the "Smart person's CPU" while those that don't know how money works, like myself, quibble over a few % difference across $500 CPUs.Death666Angel - Saturday, November 2, 2019 - link
Or if you do any sort of work or intensive stuff with your PC besides game. :)The_Assimilator - Thursday, October 31, 2019 - link
Imagine if Intel had spent as much time and effort on fixing their 10nm process, as they're spending on binning and advertising 5GHz 14nm++++ chips./sarc
Jorgp2 - Thursday, October 31, 2019 - link
Any idea when Cascade Lake drops?
nathanddrews - Thursday, October 31, 2019 - link
Seems like a beast CPU, not sure why all the haters.TEAMSWITCHER - Thursday, October 31, 2019 - link
I think there are a set of AMD propaganda artists that like to hit pieces like this. You see the same comments in lots of different articles.Alistair - Thursday, October 31, 2019 - link
It isn't AMD propaganda, one of the problems is the way people perceive written communication with strangers, you can't share ideas with people you don't know, especially contrary ones.If I check my Asus motherboard and 9700k, it uses 170W at default settings. Then it drops to 3.9ghz and uses exactly 95W. So it is a 170W CPU with all those benchmarks you see out there, that is highly misleading. I'd expect this new one to use more. People have been angry at Intel for some time for not listing their turbo power as TDP.
If I check my 3700x, it uses 65 watts. So my 9700k uses 2.6x times more power, and yes, it is faster (not in MT, but in ST) than my 9700k. That's all fair to point out, and not propaganda.
I think the 8700k was a genuinely good processor, with a large cache (not artificially limited like the 9700k) and 50 percent more speed for the same money. Now we have the 9900k only 33 percent faster at best, and WAY more expensive and much too power hungry. I haven't built any 9900k computers, and only have one 9700k computer (I bought the CPU used for a low price).
Alistair - Thursday, October 31, 2019 - link
Correction: If I check my 3700x, it uses 65 watts. So my 9700k uses 2.6x times more power, and yes, it is faster (not in MT, but in ST) than my 3700k, but not overall. That's all fair to point out, and not propaganda.Alistair - Thursday, October 31, 2019 - link
3700x damn it, hard to read on the phone...vMax65 - Friday, November 1, 2019 - link
Take a look at TDP and power consumption for the 9900KS for gaming at an all core 5GHz and you will notice it is between 98w and a 126w and thats 5GHz all the time....The power consumption figures start at 3 minutes 45 seconds...ish
https://www.youtube.com/watch?v=YWSn0cHauJ4
Korguz - Thursday, October 31, 2019 - link
the same can be said about the Intel chips, and the fact that practically the only thing left for them to brag about is single core performance, cause Intel pretty much loses everywhere else.
TEAMSWITCHER - Friday, November 1, 2019 - link
Sure Zen 2 is great, but not great enough to spend almost $1000 on a new motherboard and processor right now. My Intel Core i9 9900K seems to hit 5GHz easily, which is a nice boost for every workload. I'll make that AMD vs Intel decision again though someday... maybe in a couple of years.
Korguz - Friday, November 1, 2019 - link
$1000 ??? nice try... maybe if you were going from amd to intel...AshlayW - Friday, November 1, 2019 - link
The difference between 9900KS and 3900X in gaming is unnoticable. The difference in everything else (in 3900X's favour) very much is. And people are starting to get mad at Intel for being so arrogant as to charge over 500 bucks for this garbage CPU that doubles as an immersion heater. They won't even include the 150 bucks worth of cooling you need to actually have it run at 5 GHz.'AMD propaganda artistis'? Nah; people are finally waking up to Intel and their stranglehold on the consumer CPU market.
shompa - Tuesday, November 5, 2019 - link
Factual: There is no difference in gaming in 2160P. We are limited by graphic cards. In 1440/1460P its +/- 3%. In 1080P: Only there Intels 5ghz is faster than AMD, since this is not limited by graphics. But still: P = 60hz. So Intel can produce 10-15% more frames that you can't see. "but... maybe I want to game in 1920x1080 with 240hz". Well. Then again, there is no difference since its graphic card limited. Maybe next gen graphics a fast CPU in 1440/1460/2160P can make a difference. But then again: maybe games can be threaded better over more cores.Korguz - Tuesday, November 5, 2019 - link
source for your post.. or is it your own "facts" and therefore.. your own opinion??
Korguz - Friday, November 1, 2019 - link
maybe you are just too used to the Intel propaganda.. and Intel BS???
AshlayW - Friday, November 1, 2019 - link
'Haters' dislike this CPU because once again Intel is relying on marketing ploys and borderline misinformation to sell recycled parts based on tech from 2015, at a price that is higher than the competition's part with 4 more cores, significantly higher MT perf, higher efficiency, an included cooler, and a platform that's not EOL like Z390, etc. Do you see what I'm saying? These 'haters' are sick and tired of Intel's stranglehold on, and price-gouging of, the CPU market, and they voice their opinions now that we have some really viable competition. Shocker: people care about value and features.
nathanddrews - Friday, November 1, 2019 - link
And to some customers, this chip has the features they want at a value level they can afford. I'm not sure why that's so difficult a concept to understand. Rather, I think that the "haters" understand perfectly well, they irrationally fear what other people purchase.Spunjji - Friday, November 1, 2019 - link
None of what you said discounts what he said - it's just whataboutism followed by some unflattering generalisations about people you think you disagree with. :|GreenReaper - Saturday, November 2, 2019 - link
It's not fear; it's pity.
brantron - Thursday, October 31, 2019 - link
What does the ring bus clock run at by default? If it's still 3.7 - 4 GHz, these may be a little more overclockable than the turbo clock implies.I'm curious how the power and temperature compares to the regular 9900K with everything pushed as far as it will go.
If and when there's a desktop Comet Lake, I'd also like to see a comparison from Skylake and on. With chiplets, CPU manufacturing processes may no longer completely go out of style, so it would be interesting to see how 14nm progressed over 5 years.
Alistair - Thursday, October 31, 2019 - link
It's kind of funny, that the only thing different about this 9900k is the possible voltage that it might be able to run at, and he didn't test that. "voltage" is nowhere to be found in this review. I.e. this CPU is just a 9900k, nothing improved. Just set your voltage to 1.3V with a normal 9900k and set 5ghz, there, the same. This by all rights will be crushed by the 3950x, I'd rather take double the cores for almost the same money, imo.TEAMSWITCHER - Thursday, October 31, 2019 - link
"I'd rather take double the cores for almost the same money".I jumped on the 9900K long before Zen 2 even had a release date. I'm not disappointed with the performance I get in multi-threaded tasks, certainly not enough to spend $1000 to switch platforms. AMD's 3900X and (eventually) the 3950X might have more cores but it's going to be at least two years before I buy another platform. Instead, I will use that $1000 to purchase an RTX 3080Ti the very instant nvidia puts them up for sale.
Korguz - Thursday, October 31, 2019 - link
heh,.. and you really think nvidia is going to charge 1k for a 3080Ti ??? more like 2000+Alistair - Thursday, October 31, 2019 - link
um... you could have just kept the 8700k instead if you wanted the ST performance, that hasn't changedTEAMSWITCHER - Friday, November 1, 2019 - link
What makes you think I had an 8700K? My previous platform was an Intel Core i7 5930K on an X99 Deluxe motherboard. The jump from 6 - 8 cores wasn't huge, but the Coffee Lake cores are faster than the Haswell-E - my Cinebench score still doubled. However, Cinebench is kind of stupid now, and why I dropped the HEDT platform. I'd rather spend more on a GPU, not just for great 4K gaming, but Blender can do faster rendering on a GPU, and you can also use your GPU to accelerate video encoding. 8 cores and 16 threads is plenty for everything else.AshlayW - Friday, November 1, 2019 - link
People like you are why Nvidia and Intel can get away with this xD
Spunjji - Friday, November 1, 2019 - link
$1000 to switch platforms? Wut? $500 CPU, $200 motherboard, and then what..?GreenReaper - Saturday, November 2, 2019 - link
That's not true; according to Phoronix it has additional hardware mitigations to reduce context switching times.brantron - Sunday, November 3, 2019 - link
Still haven't seen anything about the ring bus clock.
Silicon Lottery posted their 9900KS binning results: 5 GHz at 1.250v, 5.1 GHz at 1.287v, and 5.2 GHz at 1.325v. The most recent numbers for 9900K and 9900KF had 5 GHz at 1.30v.
After the rather uneventful 8086K, if this is going to be a recurring thing, I think they need a little more special sauce. For example, cleaner solder and/or lapped IHS.
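Those binning voltages give a rough way to quantify what the better bin is worth: to a first approximation, CMOS dynamic power scales with frequency times voltage squared, so a lower stable voltage at the same clock translates fairly directly into lower power. A minimal sketch using only the figures quoted above (it ignores leakage and assumes everything else is equal):

```python
# First-order dynamic-power comparison, P ~ f * V^2 (ignores leakage current).
# Voltages are the Silicon Lottery figures quoted above; treat the output as a rough estimate only.
bins = {
    "9900K  @ 5.0 GHz": (5.0, 1.300),
    "9900KS @ 5.0 GHz": (5.0, 1.250),
    "9900KS @ 5.2 GHz": (5.2, 1.325),
}

baseline = 5.0 * 1.300**2  # 9900K at 5 GHz as the reference point
for name, (f, v) in bins.items():
    rel = (f * v**2) / baseline
    print(f"{name}: ~{rel:.2f}x the reference dynamic power")
```

On that approximation the better-binned silicon draws roughly 7-8% less dynamic power at the same 5 GHz, which is about the margin a binned special edition can realistically offer.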
crashtech - Thursday, October 31, 2019 - link
The box is dodecahedral, a minor quibble.Khenglish - Thursday, October 31, 2019 - link
@Ian I feel like fanboyism is on the rise at Anandtech article comments sections lately, but I'm not sure if it's just a nostalgia thing where I just think the past was better. Do you have an archive of the comments from P4 vs Athlon-xp, Athlon64 days? I swear back then was just a simpler time of "In Soviet Russia... " jokes.Slash3 - Thursday, October 31, 2019 - link
It's definitely worse.Spunjji - Friday, November 1, 2019 - link
I think it's a symptom of wider trends on the internet. Astroturfing on tech forums has been a thing for a while now, so people are even more suspicious of the motives of others. The communities are larger, so there are more names to keep track of and less trust as a result. More people have learned stupid "debate" tactics like whataboutism and gish galloping from useless right-wing Youtubers, so even small disagreements and misunderstandings spiral out into lengthy pissing matches.AshlayW - Friday, November 1, 2019 - link
I like to think my comments aren't 'fanboyism' but I am very open that I absolutely hate Intel as a company. Here's one for the 9900KS:
"In Soviet Russia, CPU buys YOU"
Jorgp2 - Friday, November 1, 2019 - link
I blame /r/pcmasterrace
Just a big circlejerk
1_rick - Thursday, October 31, 2019 - link
As far as pricing goes, Microcenter lists it as in stock for $599 (might be an in-store only price). Newegg lists it for $569, but they're sold out.Sahrin - Thursday, October 31, 2019 - link
lol, this is supposed to be the chip Intel launches in response to AMD's second-best to get the 'last word' - instead it's a replay of Zen2 launch day all over again with the 9900K's sporty younger brother getting humiliated.ballsystemlord - Thursday, October 31, 2019 - link
Spelling and grammar errors:"...but the system was fully table the entire time."
"stable" not "table".
"...but the system was fully stable the entire time."
"It would appear that there is something else the bottleneck in this test."
Mis-written:
"It would appear that there is some other bottleneck in this test."
ballsystemlord - Thursday, October 31, 2019 - link
"One question that does remain however, is which set of results should we keep?" Keep to Intel guidelines. That's what you do with memory speeds and other things, I'm sure.That is also the guaranteed results for all consumers.
Goloith - Thursday, October 31, 2019 - link
Normally this hardware site is my "go to" for hardware reviews. However, holy mother of Jebus was this test bed terrible. Let's buy a Porsche and throw old outdated tires on it. This should have been paired with a 2080 to, but you pair it with a 1080?
Rule #1, don't go full retard.
Goloith - Thursday, October 31, 2019 - link
Or with a 2080 tiDeath666Angel - Saturday, November 2, 2019 - link
The thing is, that would invalidate all older results. I'm guessing they will update the GPU in 2020 and retest certain legacy CPUs. What do you think would fundamentally change about the results with a newer GPU, though? I doubt it would be much, since the trends are the same with all the different graphical settings they already test at.ablevemona - Friday, November 1, 2019 - link
"in its hexagonal box"?Nichronos - Friday, November 1, 2019 - link
Why isn't there a voltage table like all other Anandtech reviews have? We want to know how much you can undervolt the KS at the stock 5GHz!!!
Thunder 57 - Friday, November 1, 2019 - link
Is that Katamari all over your new toys again??Agent Smith - Friday, November 1, 2019 - link
Only one year warranty with this CPU, reduced from 3 yrs. So it's marginally faster, uses more power, offers no gaming advantages, and its price hike doesn't justify the performance gain and warranty disadvantage over the 9900K.... and the 3950X is about to arrive. Mmm?
willis936 - Friday, November 1, 2019 - link
Counter strike really needs to be added to benchmarks. It’s just silly how useless these gaming benchmarks are. There is virtually nothing that separates any of the processors. How can you recommend it for gaming when your data shows that a processor half the price is just as good? Test the real scenarios that people would want to use this chip.Xyler94 - Friday, November 1, 2019 - link
It's more because you need a specific set of circumstances these days to see the difference in gaming that's more than margin of error.You need at least a 2080, but preferably a 2080ti
You need absolutely nothing else running on the computer other than OS, Game and launcher
You need the resolution to be set at 1080p
You need the quality to be at medium to high.
then, you can see differences. CS:GO shows nice differences... but there's no monitor in the world that can display 400 to 500FPS, so yeah... Anandtech still uses a 1080, which is hardly taxing to any modern CPU, that's why you see no differences.
willis936 - Friday, November 1, 2019 - link
csgo is a proper use case. It isn't graphically intense, and people regularly play at 1440p120. Shaving milliseconds off input to display latency matters. I won't go into an in depth analysis of why, but imagine a human response time has a gaussian distribution and whoever responds first wins. Even if the mean response time is 150 ms, if the standard deviation is 20 ms and your input to display latency is 50 ms then there are gains to cutting 20, 10, even 5 ms off of it.
And yes, more fps does reduce input latency, even in cases where the monitor refresh rate is lower than the fps.
https://youtu.be/hjWSRTYV8e0
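The Gaussian argument above is easy to put numbers on: model both players' reaction times as normal distributions and add the extra input-to-display latency to one side. A minimal Monte Carlo sketch using the 150 ms mean and 20 ms standard deviation from the comment (the latency values are illustrative, not measurements):

```python
# Monte Carlo estimate of how extra input-to-display latency shifts the odds
# when both players have Normal(150 ms, 20 ms) reaction times and the faster response wins.
import random

def win_rate(extra_latency_ms, trials=200_000, mean=150.0, sd=20.0):
    wins = 0
    for _ in range(trials):
        you = random.gauss(mean, sd) + extra_latency_ms
        opponent = random.gauss(mean, sd)
        if you < opponent:
            wins += 1
    return wins / trials

for extra in (0, 5, 10, 20):
    print(f"+{extra:>2} ms latency -> ~{win_rate(extra):.1%} of duels won")
```

Under these assumptions even 10 ms of added latency drops the win rate from about 50% to roughly 36%, which is the sense in which shaving a few milliseconds off the chain matters.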
Xyler94 - Tuesday, November 5, 2019 - link
If you visually can't react fast enough, doesn't matter how quickly the game can take an input, you're still limited on the information presented to you. 240hz is the fastest you can go, and 400FPS vs 450FPS isn't gonna win you tournaments.CS:GO is not a valid test, as there's more to gaming than FPS. Input lag is more about the drivers and peripherals, and there's even lag between your monitor and GPU to consider. But go on, pretend 50FPS at 400+ makes that huge of a difference.
solnyshok - Friday, November 1, 2019 - link
No matter what GHz, buying a 14nm/PCIE3 chip/mobo just before 10nm/PCIE4 comes to the market... Seriously? Wait another 6 months.mattkiss - Friday, November 1, 2019 - link
10nm/PCIe 4 isn't coming to desktop next year, where did you hear that?
eek2121 - Friday, November 1, 2019 - link
The 3700X is totally trolling Intel right now.
RoboMurloc - Friday, November 1, 2019 - link
I dunno if anyone mentioned it yet, but the KS has additional security measures to mitigate exploits, which are probably causing the performance regressions.
PeachNCream - Friday, November 1, 2019 - link
I expect I will never own an i9-9900KS or a Ryzen 7 3700X, but it is interesting to see how close AMD's 65W 8-core chip gets to Intel's 127+W special edition CPU in terms of performance in most of these benchmarks.
mattkiss - Friday, November 1, 2019 - link
The Processor Specification field in the CPU-Z screenshot has "(ES)" after the CPU name. Did you test an engineering sample or an actual retail sample?
GreenReaper - Sunday, November 3, 2019 - link
It was only launched a couple of days ago, so they'd have to have had a sample; you couldn't run the tests and write the article having bought it at retail.
BAF2782 - Friday, November 1, 2019 - link
Nice to see a 10+ year old Thermalright True Copper (2 kg) on a CPU that has a mini nuclear reactor powering itself. LoL. I can't really say much: I'm using an 11-year-old Zalman CNPS9900 MAX (300 W) air cooler on a 5.3 GHz golden 8700K at 1.380 V in a Z370 Aorus Gaming 7. I thought about getting a new cooler with the $400 open-box 9900K that I just found, but after seeing the 9900KS running on the 10-year-old Thermalright True Copper, I don't doubt the Zalman will have no problems running a 5 GHz (PO) 9900K.
crotach - Saturday, November 2, 2019 - link
So, this is a direct competitor to the 2700X, but costs twice as much? I wonder if we'll see Intel as a budget offering in a couple of years, cutting the price of their 14nm tech to half of what AMD charges just to get some sales.
My my, how the tables have turned :)
Dragonstongue - Saturday, November 2, 2019 - link
If they can say 4 GHz base but 5 GHz all-core turbo, why not just spec it as 5 GHz directly?? I'm confused; it seems like another marketing ploy of some sort (no real surprise there).
That being said, it's likely good for AMD as well, as it sets a new target for them, TSMC et al.
Orkiton - Saturday, November 2, 2019 - link
It's like a Caterpillar engine that would need hydrogen fuel to beat most benchmarks. That means an expensive mobo, GPU, PSU, memory, cooler, and case. It's Intel stretching to the limits for PR purposes, yet well below AMD in terms of value for money.
liquid_c - Sunday, November 3, 2019 - link
It's unbelievable, I swear. Even AnandTech's comment section has turned into a PCGamer shit show. Instead of just taking the article as it is - a piece of info - most of the people here either start a revolution against Intel or just plain dismiss it as "fake information" in regards to TDP. But I see nobody here admitting that Intel's 14nm is ON PAR with AMD's 7nm (I've heard they have the same density, but don't quote me on that). Or that said products is not geared towards you or your acquaintances. I swear to God, it's like Intel has raped some family member or something. Just buy whatever you think is fit for you and leave others to enjoy proper journalism (which is so f*cking rare nowadays).
liquid_c - Sunday, November 3, 2019 - link
*are not geared. Sigh, I miss an edit button :(
Korguz - Sunday, November 3, 2019 - link
Because it is fake info. 127 watts?? Nope, 200+ more than likely. And some, I think, are tired of the lies, BS, and overcharging Intel has had us pay for over the years. Face it: if AMD hadn't brought out Zen, chances are we would still be stuck at quad core for the mainstream, and anything above that would be HEDT. FYI, it may be on par, but Intel should also be on 10nm by now... maybe even the next node, and yet Intel kept saying 10nm is on track. There are better, less expensive options than this CPU; this is just Intel being Intel, a last-ditch effort to try to save face.
The Garden Variety - Sunday, November 3, 2019 - link
Thank you, liquid_c. This needs to be repeated over and over again. You have this stellar quality content, painstakingly researched and presented, and it's clear many of the readers (or at least the ones that comment) are neither understanding it, nor reading it in context, nor even trying to think about it. The smartphone reviews are identical. It's all just "how do I read what I want so I can wage my emotional holy war because I have literally nothing else going for me."
Here, I'll go one step further: there's zero reason for a site like Anandtech to have comments following articles at all. You could delete 9 out of 10 attached to this article and zero value would be lost. Given how terribad the comment system is (no editing, tiny and unscalable input window, spam posts that go undetected for days, etc.) and the effort/investment it would likely take Anandtech's limited team to improve it, just getting rid of it would be a far more sensible solution. Let the content be the star. Send people to the forums, in a special board requiring a 500-post history in the broader community, to contribute to article discussions.
liquid_c - Sunday, November 3, 2019 - link
One of the few reasons I keep reading articles from sites/news outlets like AnandTech and Ars Technica is that, besides good, well-developed journalism, I also expect knowledgeable people commenting on said articles. I always learn(ed) a little bit of extra info by doing so, and it pains me to see this ongoing fan war between Intel/AMD fans, Apple/*insert any other Android vendor name* fans, etc. So instead of finding out the "ifs" and "whens" of specific tech topics, I have to skip through countless hate posts.
PeachNCream - Monday, November 4, 2019 - link
People in glass houses.... liquid_c - Sunday, November 03, 2019: "Stop being such a dummy and stop acting like a rabid dog over a product that not only you will never get but clearly, it's not geared towards you."
Although I agree in principle that there are a fair number of toxic comments, yours are among them, so I don't think you have much room to complain while at the same time contributing to the problem.
sorten - Sunday, November 3, 2019 - link
The 65W 3700X is the star of this show.
shompa - Monday, November 4, 2019 - link
The main problem is the 16 PCIe lanes. You can't really connect anything to the system without starving it for bandwidth. The 16 PCIe lanes are used for graphics. The DMI link to the motherboard's PCIe lanes has a bandwidth of 3.8 GB/s; 4-year-old NVMe SSDs are already at 3.5 GB/s. Forget using 2 NVMe drives. Usually the graphics card gets pushed down to 8 PCIe lanes, killing 10% of performance. Forget connecting fun stuff over Thunderbolt, or using high-end capture cards and so on. There is no bandwidth. AMD Ryzen 3000 with X570 is a bit better: 24 PCIe lanes - 16 for graphics and 2x4 for dual NVMe SSDs with 8 GB/s support.
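A quick sanity check of the bandwidth claim in that comment (treating the quoted ~3.8 GB/s DMI link and ~3.5 GB/s per NVMe drive purely as illustrative assumptions):

```python
# Rough check of the DMI bottleneck argument above (figures taken from the comment, as ballpark assumptions).
DMI_BANDWIDTH_GBPS = 3.8      # CPU-to-chipset link shared by everything behind the chipset
NVME_DRIVE_GBPS = 3.5         # peak sequential throughput of one fast PCIe 3.0 x4 SSD

def dmi_utilisation(n_drives: int) -> float:
    """Fraction of the DMI link consumed if n chipset-attached NVMe drives
    run at peak sequential throughput simultaneously."""
    return (n_drives * NVME_DRIVE_GBPS) / DMI_BANDWIDTH_GBPS

for n in (1, 2):
    print(f"{n} NVMe drive(s) flat out -> {dmi_utilisation(n):.0%} of the DMI link")
# One drive already eats ~92% of the link; two cannot run at full speed at once,
# and USB, SATA and network traffic all share whatever is left.
```

trojtalen - Wednesday, November 6, 2019 - link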
Really? In Far Cry I'm missing the Ryzen 3000 generation - why?... Sorry, but in Vulkan (Strange Brigade) it's a shame! A 65W AMD chip beats a 250W+ Intel at 5 GHz? :D:D:D LOL
alufan - Thursday, November 7, 2019 - link
So this has been the main front-page story on this site for the last 8 days. Strikes me as a little biased; other CPUs come and go, yet Intel's last-gasp attempt to make something matter is front and centre for over a week. Strikes me as a little unfair.
jonbar - Thursday, November 7, 2019 - link
So.... in other reviews, Ryzen slaughtered the 9900KS in Blender; even the 3700X is faster :) Some say in gaming it's faster than the 3900X, but so is the 9700K with a 2080 :) The 9700K is on par with the 9900KS.
Other than "having the best of the best", 9900ks requires a lot more spending to open it's full potential. From this review, 3700x is really, really good all-round with minimal spending a 100$ B450 and a good 550w psu
Maxiking - Thursday, November 7, 2019 - link
Any Ryzen CPU is getting slaughtered in gaming by the piss poor 9700K without HT, so what's your point? Don't hate things just because you are too poor to afford them. We rich bois don't care. Don't forget to attend Fridays for Future today.
jonbar - Friday, November 8, 2019 - link
:) How could you live a day without insults? People asked why there are no Ryzen 3000 results for Blender, and "slaughter" is when a 127W-rated, "special edition", "all cores 5GHz at all times", $550+ CPU doesn't even manage to beat the 3700X at $325.
People also asked why the 1080, and argued that the 9900KS is still better for gaming. Guess what - on a 2080 the difference is not so great, other than Hitman. The 9700K piss poor? I suppose rich boys don't know that it is, at the moment, $30 more expensive than the 3700X, comes with no cooler AND requires a Z mobo to overclock :)
Maxiking - Friday, November 8, 2019 - link
Yeah, I am asking too why that allegedly excellent 7nm CPU with HT is unable to beat a piss poor 14nm 9700K with HT disabled? See, there is a reason why it is so cheap. Yes, it is a piss poor version of the 9900K with disabled HT; the silicon quality is subpar by Intel's high standards and those pieces are unable to reach 5GHz boosts with HT enabled, so Intel disables the HT and sells them as the 9700K. Of course, I should not forget to mention that the 9700K is a piss poor quality CPU only by the high Intel standards and high Intel customer standards. We do not bother with anything below 4.7 GHz.
By AMD standards, anything reaching 4.0 GHz is a godly CPU and is binned and sold as a 3900X for a premium, and they are pumping up to 1.55V into it so it can reach that mighty 4.3GHz boost on a single core whilst they promised 4.6GHz.
Let's wait for the 3950X and the promised 4.7GHz boost. You know, I smell a trap. Not that kind of one, you pervert. The 3900X is able to reach 4.6GHz for microseconds, so that 4.7GHz boost most likely won't even be measurable, because a time unit tiny enough for the boost to be active hasn't been discovered yet.
Maxiking - Thursday, November 7, 2019 - link
Shame you don't complain as much about AMD CPUs being unable to reach their promised boost clocks as you do about Intel power consumption. We get it, you are poor, you could finally afford 8 cores thanks to AMD, yet it's losing to a Skylake refresh crippled by security patches, so you're venting your frustration here. Difficult time to be an AMD fan, especially after the first-gen Threadripper support drop fiasco; suddenly a new socket and no backward compatibility is not an issue. Don't hate things just because you can't afford them. Fridays for Future is up today again, vent your problems there, thanks. Anyway, bye, a private plane is waiting; gonna have a pizza for dinner in Italy to piss off Greta because I can.
jonbar - Friday, November 8, 2019 - link
"Skylake refresh crippled by security patches" - you must be kidding, right? It shouldn't have those security holes. Please stop talking shit about poor because people here talk about optimization - the best for the least money at a price point. And please stop bashing AMD's ryzen - it's not bulldozer, without ryzen this shit here would be sold as "Intel i11 Unobtanium Edition" for 1k$ and you, rich boy, would have 6 cores or more only on LGA 2011.Nobody hates a product - I don't like Intel practices - 5% increase per generation to the point. Where my i7 3840qm is 10-15% slower than 7700hq with a 4! generations gap.
Speaking of private planes - nobody gives one cent about a rich boy's approach to tech at this level because, while you can afford to be stupid, the rest of us have to be smart. Now you can fly off and eat your pizza :)
Korguz - Friday, November 8, 2019 - link
Wow maxiking... resorting to insults and name calling still?? Still believing all the Intel BS?? Still believe Intel's BS about how much power their CPUs use?? Talking like you have money is supposed to impress people?? Good for you... nice to see you are also an arrogant, rich, spoiled brat.
Maxiking - Friday, November 8, 2019 - link
How dare you? Where did I name-call anyone? If someone is fat and I call them fat, or if they smell and I tell them so, it is not an insult, it is called stating a fact. I see you still do not get that TDP does not mean power consumption; it is even stated and explained in the review.
If I were you, I would be more concerned about the 1700X, 1800X, 2700X, 3900X TDPs and AMD's misleading marketing about boost frequencies, because there have so far been 3 BIOS patches which were supposed to fix the issue, and guess what - nothing has changed. People have to use a makeshift custom power profile created by a geek in order to get closer to the promised boost clocks.
Typical AMD. I give it 3 months till they start fixing their awful GPU drivers as well.
Korguz - Friday, November 8, 2019 - link
Calling people poor... among many other things in previous posts by you. And yes, it is an insult to call someone fat, or say they smell, but I bet you do that because your self-esteem and self-worth are so low you have to say things like that to make yourself feel better. Yet you still cry about Ryzen and the clock speeds, but you STILL refuse to admit the fraud Intel calls its TDP spec?? So whatever, maxipadking, go back to your cave...
Maxiking - Sunday, November 10, 2019 - link
Yeah, my self-esteem is so low that I regularly visit Mercedes and BMW showrooms only to tell them how their cars are overpriced and my Dacia is cheaper and can perform the same while consuming less gas, like you do. If Intel's TDP is fraud, then so is AMD's, and their promised boost clocks, and the video on YouTube where they promise you can overclock the chips even further with sufficient cooling. What do they mean by that? Ay, and what about the Bulldozer fraud?
Korguz - Sunday, November 10, 2019 - link
Yeah, sure you do; you're the one who is probably poor... You are becoming the worst Intel shill on here now... all you EVER do is talk. If you are so sure AMD is committing fraud as you claim, then put your supposed money where your mouth is and take AMD to court, or shut up.
Maxiking - Monday, November 11, 2019 - link
Again, it is you, you and only you perpetuating lies. I never come here first talking **********, I only reply to AMD fanboys' comments. I do not own any AMD CPU; I do not buy subpar products, so I can not take them to court.
Anyway, if you are so sure about Intel's wrongdoings, take them to court. EZ.
Unfortunately for you, it is AMD who lost in court and got caught misleading about that parody of a CPU called Bulldozer, claiming to possess 2 times more cores than it actually had.
This is your AMD marketing in a nutshell
https://cdn.vox-cdn.com/uploads/chorus_asset/file/...
QQ more. Deal with it.
Korguz - Monday, November 11, 2019 - link
maxipadking... you are so full of it. What about the Intel lies about its 10nm node being "on track" for the last, what, 6 years?? What about the lies about not doing anything wrong to prevent AMD from selling its products?? Among various other things over the years that you so easily forget... You never come here first?? BS. Actually, you DO buy subpar products; Intel is subpar now, but in your Intel blindness you just don't see it. Intel's marketing has been worse over the years than AMD's. Deal with that. Keep QQing more about it... you're good at it.
Gastec - Tuesday, November 19, 2019 - link
Maxiking, this is a tech site, not your favourite social network for trolling. Your shameless trolling should be punishable with a ban.
Lolimaster - Sunday, November 10, 2019 - link
Let's not forget the new AGESA efficiency mode where, losing 1-3% fps, the CPU's power consumption goes down by 35%.
Maxiking - Sunday, November 10, 2019 - link
Yeah, let's not forget that AGESA was supposed to fix the boost and, again, it didn't.
Qasar - Sunday, November 10, 2019 - link
OK maxiking, IF you THINK you are so smart and know it ALL, let me ask you this: WHY does it take a HIGHER clocked Intel chip to beat a LOWER clocked equivalent?? I would LOVE to read your reason for this one.
Clock both Intel and AMD at the SAME clocks, and I bet AMD's chips would slaughter Intel's in EVERY metric: performance and power usage (the rough arithmetic is sketched below).
I will tell you what, I will save you the trouble: you WON'T reasonably answer those 2 questions with a real, logical answer, you will just attack me like you do others on here, call me names, or insult me, among other things. But the answer to both is that AMD's current chips now either match or surpass Intel in IPC and performance per watt.
Come on maxiking, I dare you to answer those 2 questions with a reasonable, logical answer, WITHOUT being insulting or resorting to name calling.
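The arithmetic behind that challenge, with purely hypothetical numbers (a chip with 10% higher IPC against one with roughly an 11% higher clock), would look something like this:

```python
# Toy model of the IPC-vs-clock argument: single-thread performance ~ IPC * frequency.
# All numbers below are hypothetical and chosen only to illustrate the trade-off.
def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

high_clock = relative_perf(ipc=1.00, clock_ghz=5.0)   # lower IPC, ~11% higher clock
high_ipc   = relative_perf(ipc=1.10, clock_ghz=4.5)   # 10% higher IPC, lower clock

print(f"high-clock vs high-IPC chip: {high_clock / high_ipc:.2f}x")   # ~1.01 -- roughly a wash
print(f"same 4.5 GHz clock instead:  {relative_perf(1.00, 4.5) / relative_perf(1.10, 4.5):.2f}x")  # ~0.91 -- the IPC lead decides it
```

Whether real chips behave like this toy model depends on the workload, memory subsystem and boost behaviour, so treat it as an illustration of the clock-for-clock argument rather than a measurement.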
Qasar - Tuesday, November 12, 2019 - link
Yep... just as I thought... NO response from maxiking, because he CAN'T give a logical answer to those 2 questions.
shaolin95 - Monday, November 11, 2019 - link
Interesting that you guys always "forget" to enable the iGPU for video encoding....