48 Comments
DigitalFreak - Monday, October 7, 2019 - link
Only a $10 price difference between the 9900K and KF @ Newegg.... and someone is selling the 9900K box and bag(?) only, no processor, for $74 there as well. WTF?
Slash3 - Monday, October 7, 2019 - link
Newegg started allowing third party vendors to sell products through their site (similar to Amazon) a while back - one of the side effects is that you end up with ridiculous listings like that.

The art of selling an empty product box has a long and storied history on everyone's favorite online auction site (empty launch day PS3 boxes, anyone?), but the unique packaging for the i9-9900K/KF made it a prime candidate for flipping as well.
imaheadcase - Monday, October 7, 2019 - link
Newegg went to crap after it was acquired and changed into a "market". It was so much better when it just focused on computer-related sales.

Slash3 - Monday, October 7, 2019 - link
1,000%

eek2121 - Tuesday, October 8, 2019 - link
I use Newegg to browse and Amazon to buy.

Surfacround - Tuesday, October 8, 2019 - link
1000% percent...

jwcalla - Monday, October 7, 2019 - link
It's such a dumb strategy for at least i7 and above. They waste so much die area and TDP on these embedded GPUs that nobody is ever going to use. I'll just buy a CPU that dedicates the full die to CPU cores, thank you.

brakdoo - Monday, October 7, 2019 - link
Business PCs use the integrated GPU. Just look at your average desktop from Dell/HP/Lenovo.
Not every PC is for gamers...
Ninhalem - Monday, October 7, 2019 - link
That's why jwcalla said for the i7 and above. I highly doubt anyone is using an integrated GPU on the i7 or i9 without a discrete graphics solution. Business and workstation-class machines that have those processors will almost always have a discrete graphics card in the chassis.

jwcalla never said all PCs are for gamers, but thanks for assuming that gamers never use their PCs for anything but games. Some of us may have need for those extra cores for things like CAD rendering and 3D art.
brakdoo - Monday, October 7, 2019 - link
I'm "working" for a Japanese electronics company (office not in japan) as a developer and EVERY developer PC here has an i7 (main device) and none of those standard machines has a discrete GPU.We have Nvidia GPUs on some machines where we experiment with CUDA acceleration and AI but those are Xeon anyway.
You're making weird assumptions in regards to i7...
Alexvrb - Monday, October 7, 2019 - link
Those are probably pre-Zen quad cores. Current-model desktop i7/i9 are a different beast. How many desktop workloads make large performance gains from an expensive 8- or 16-thread CPU, yet don't benefit at all from having decent graphics? Even modern web browsers offload work to the GPU.

brakdoo - Tuesday, October 8, 2019 - link
1. Please read up on what exactly browsers offload before you post...

2. Compiling large projects (what millions of developers do every day) is always multi-threaded. Linux developers are sometimes even obsessed with compiling the kernel on server CPUs with many cores and sockets (just look at Phoronix or ServeTheHome ;)).

I am mostly doing .NET stuff, locally and on Team Foundation Server. They are smart and try to compile/build parts as soon as they can, but every time we merge our work to build and test, we always hope that we get more cores to work with next time ;)
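The "more cores next time" hope runs into the serial parts of a build (linking, packaging, non-parallel test steps). A rough Amdahl's-law sketch of that ceiling; the 90% parallel fraction below is an assumed, illustrative number, not a measurement of any real build:

```python
def amdahl_speedup(cores: int, parallel_frac: float) -> float:
    """Ideal build speedup on `cores` threads when `parallel_frac`
    of the single-threaded wall time parallelizes perfectly (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_frac) + parallel_frac / cores)

# Hypothetical build: 90% parallel compile, 10% serial link/package.
for cores in (4, 8, 16, 64):
    print(f"{cores:>2} cores -> {amdahl_speedup(cores, 0.9):.2f}x")
```

With these assumed numbers, going from 8 to 16 cores gains noticeably less than doubling, which matches the everyday experience that the serial steps eventually dominate.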
jwcalla - Tuesday, October 8, 2019 - link
If you're a developer then you're the perfect case for what I'm talking about. You would be far better off with a CPU whose die area is completely dedicated to CPU cores, plus a $30 discrete GPU thrown into a PCIe slot to run a display.

As it is now, you have HALF of a CPU wasted just so you can drive a display. It's dumb beyond belief.
peevee - Monday, October 7, 2019 - link
"That's why jwcalla said for the i7 and above. I highly doubt anyone is using an integrated gpu on the i7 or i9 without a discrete graphics solution. "Stop doubting. Practically nobody in business world needs separate graphics card while a lot of people benefit from faster processors - almost everybody but "Outlook warriors".
Icehawk - Monday, October 7, 2019 - link
I'm sorry but you are incorrect. We have hundreds of i7 desktops and not a one has a graphics card in it. You will find that repeated across every major business.

haukionkannel - Tuesday, October 8, 2019 - link
Yep... in our department one machine has a dedicated GPU, and that is for a research project... So there definitely is a need for a high-power CPU with a built-in GPU!

Slash3 - Monday, October 7, 2019 - link
A lot of people also use them for accelerated encoding (Quick Sync), in addition to powering a second display in a mixed refresh rate setup (e.g., 144 Hz on primary, 60 Hz on secondary). They're handy if a primary GPU fails, too, or if you want to sell a card before a new part arrives or is available.

Death666Angel - Monday, October 7, 2019 - link
Why is the iGPU useful in multi-monitor refresh rate scenarios? I've had a 1440p 100 Hz display together with a 1200p 60 Hz one connected to a GTX 1080 or a GTX 960 and it was fine.

And again (see below), how often do you guys break your GPU? You should look into that. If that is a recurring thing, the iGPU option might make sense of course. But I would look into my setup and investigate why that happens so often and try to avoid it in the future.
Slash3 - Monday, October 7, 2019 - link
I haven't experienced the issue personally (both of my displays are still 60Hz), but there are a lot of people for whom it has been a problem.

https://www.nvidia.com/en-us/geforce/forums/discov...
I mentioned GPU failure as a scenario in which an integrated GPU can be useful, nowhere did I suggest that people were breaking their GPUs. Parts fail (ask RTX 2080 Ti early adopters who ended up with Micron GDDR6) and it's nice to have a backup graphics output.
Like you, I'd love to see a pared-down i9-9900 style CPU with no physical GPU transistors on the die, assuming it brought with it a cost savings, improved thermals or more overclocking headroom.

Zoolook13 - Tuesday, October 8, 2019 - link
Zoolook13 - Tuesday, October 8, 2019 - link
The unused die area helps soak up some thermal power and makes it easier to keep the chip cool, so the iGPU is beneficial even when it's not used. You are not getting better thermals by removing it, and therefore you are not getting more overclocking headroom either, unless you can decrease the maximum length of the timing pathways by removing it - and since they are on different clock planes, I doubt it.

Zoolook13 - Tuesday, October 8, 2019 - link
Sorry for the long sentence, blame the nonexistent editing options.

regsEx - Monday, October 7, 2019 - link
APUs are far more popular, and not only because of business and average home users. They're useful for gamers and overclockers as well, as you don't have to connect a huge graphics card when it's not needed or when it's broken, which happens quite often. And when devs finally deprecate DX11, there is the useful multi-GPU feature of DX12.

Death666Angel - Monday, October 7, 2019 - link
"which happens quite often" uh, huh? I've had zero GPUs get broken by me and I'm in the business of PC gaming since the TNT2 M64. Overclockers generally don't want to use iGPUs (unless they want a WR for iGPU OC) since that will skew the results of any CPU OC and most overclockers are also interested in combined OC. Most gamers that are not on a budget want that huge GPU to be able to play all titles at acceptable visual levels without low FPS hindering their enjoyment. And if even AMD and Nvidia aren't investing any resources into multi GPU setups these days (when they can directly benefit fincancially from it) I doubt just going with DX12 will suddenly make asynchronus multi GPU setups between iGPU and dGPUs viable.Death666Angel - Monday, October 7, 2019 - link
No TDP gets wasted on inactive iGPUs.

Edkiefer - Monday, October 7, 2019 - link
The biggest issue I have with their lineup has nothing to do with F pricing but with the lack of anything lower than the 9900 with HT. Right now you can get an 8-thread CPU, but to get more threads you need to go to the 9900, a 16-thread part; there should be a 12-thread (6/12) CPU in between.

drexnx - Monday, October 7, 2019 - link
their segmentation doesn't allow it - where would it fit? the i5? there would be cases where a 6/12 i5 might beat the 8/8 i7; can't have that!Edkiefer - Monday, October 7, 2019 - link
It depends on the application; right now there's a huge gap in thread count - the next step after 8 is 16, and only in the top CPU. Sure, I know why they do it, just saying it's not good IMO :)

shabby - Monday, October 7, 2019 - link
It's funny how only their top-of-the-line CPU had HT. Pathetic.

Great_Scott - Monday, October 7, 2019 - link
It looked like Intel was deprecating HT; I assumed it was for memory security reasons, but then they still have it for the 9900 series.

So I have no idea what they're thinking. Maybe no HT allows for higher clocks?
Korguz - Monday, October 7, 2019 - link
Or: you want HT? You will need to buy this CPU instead, and Intel will gladly take your money, too :-)

AshlayW - Monday, October 7, 2019 - link
There's plenty of alternatives: the Ryzen 7 3700X or 3800X are great examples. However, I agree that a 6/12 would be nice in the current lineup of Intel Core parts. You may find a discount on an i7-8700, perhaps?

bolkhov - Tuesday, October 8, 2019 - link
The i7-8700 is 6/12.

The 8xxx and 9xxx CPUs in fact belong to a single lineup, as do the 7xxx and even 6xxx (aside from a new stepping, with hardware mitigations for Spectre and Meltdown, in *some* 9xxx SKUs).
AshlayW - Monday, October 7, 2019 - link
Disgusting anti-consumer company. Yes, they've now actually reduced the price of these salvaged parts, but the fact that it's taken them this long is appalling. I'm so glad you don't 'have' to buy Intel these days for the highest performance, efficiency and features, because nobody should be supporting this company willingly in 2019.

TEAMSWITCHER - Monday, October 7, 2019 - link
AMD is no princess either. Advertised max turbo speeds for Ryzen 3000 parts were an exaggeration, 3900X pricing still hasn't reached MSRP months after launch, and the 3950X was announced and then failed to reach market on schedule. Competition is great, but it has to be more than a promise; you have to deliver. It's hard to buy AMD when they drop the ball so frequently.

Alistair - Monday, October 7, 2019 - link
Come on, there was a bug that limited the turbo boost by 25 MHz; that was not noticeable. And none of the Ryzen systems I've built with Asus boards ever had a problem - it depends on the board that you bought, and the problem has been fixed anyway. The 3900X has been at MSRP since launch (and I was able to buy one the first week); if you don't know how to buy a CPU, that's on you. Don't pay attention to online listings - there are 10 x 3900X in the store near where I live. I own a 9700K, and not surprisingly, my 3900X is WAY faster for work.

ilt24 - Monday, October 7, 2019 - link
@Alistair ... "Come on, there was a bug that limited the turbo boosts by 25 MHz"

There was a patch that provided an extra 25 MHz to 50 MHz, but this still doesn't mean all Ryzens are hitting their advertised boost speeds.
Slash3 - Monday, October 7, 2019 - link
To be fair, 3900X supplies have been exceedingly poor outside of some regional Microcenter stores and the occasional ten-minute in-stock notification. Most vendors have now priced the CPU at ~$570 rather than the $499 MSRP (as found at Best Buy and the occasional Amazon listing).

As of right now, it's in stock at both Amazon and B&H Photo for ~$570, for anyone trying to track one down.

Korguz - Monday, October 7, 2019 - link
Korguz - Monday, October 7, 2019 - link
Slash3, maybe where you are... no problem getting them here 90% of the time.

alufan - Monday, October 7, 2019 - link
Hmm, don't think that's a fair comment. The issue was a software error that was fixed, and AMD has never said all cores will boost to X on any CPU. As for dropping the ball, let's not mention the security issues with Intel, the anti-competitive actions that saw huge fines, or the fact that they have blatantly overcharged for years for a CPU that hasn't a hope of staying within its TDP rating.

AMD is simply making a decent profit on a good product. They are no longer required to sell bargain-basement products, but they are still more reasonable than Intel, as can be seen from Intel's recent need to drop prices to compete with a better product. It's called competition, and it's great for us, the consumers.
Alistair - Monday, October 7, 2019 - link
Intel's 8086K's 5 GHz was for one thread only, with AVX off. That never happens in practice, thus you never see 5 GHz. At least AMD's TDP means something, and the boost clock doesn't require AVX to be off. I find it a bit bizarre that Intel never reached their boost clocks for years and it wasn't a scandal, yet it became one when AMD came up 25 MHz short. I think the only reason for that is that the boost clocks you can get with AMD exceed the all-core clocks, while with Intel the boost clocks are lower than the possible all-core OC, so it doesn't bother anyone since you just set the clock speed higher than the boost anyway.

TEAMSWITCHER - Monday, October 7, 2019 - link
It is a fair comment. The Ryzen 3900X is really just two defective 7nm chiplets, availability has been poor, and it's certainly NOT at MSRP. Intel's i9-9900K is available below MSRP ($450 @ microcenter.com) and leads the 3900X in many gaming benchmarks. With Hyper-Threading it's certainly no slouch and can get work done when you need it to. The 3950X is the "flagship" part worth waiting for, and it was a no-show in September.

Finally... While everyone is thanking AMD, I would like to point out that Intel has routinely dropped per-core pricing on new HEDT processors:
10-core Broadwell-E was $1723, while 10-core Skylake-X was only $999.
8-Core Haswell-E was $999, while 8-core Skylake-X was only $599
6-Core Ivy Bridge-E was $999, while 6-Core Haswell-E was only $583.
Yes this year's price drop is a bit bigger than before, but the pattern was there long before AMD entered the market with Ryzen or Threadripper.
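For what it's worth, the per-core figures quoted above work out as follows (prices as stated in the comment, not independently verified):

```python
# Per-core launch pricing for the HEDT parts quoted above.
hedt = [
    ("10C Broadwell-E",  1723, 10),
    ("10C Skylake-X",     999, 10),
    ("8C Haswell-E",      999,  8),
    ("8C Skylake-X",      599,  8),
    ("6C Ivy Bridge-E",   999,  6),
    ("6C Haswell-E",      583,  6),
]
for name, price, cores in hedt:
    print(f"{name:>16}: ${price / cores:.2f}/core")
```

Per-core price fell by roughly 40-60% generation over generation in each of these pairs, which is the pattern the comment describes predating Ryzen and Threadripper.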
Korguz - Monday, October 7, 2019 - link
TEAMSWITCHER" Ryzen 3900X is really just two defective 7nm chiplets " or really, says who ? you have proof if this, or is this just your own opinion ?
i bet the 9900k also wasnt as MSRP when it was 1st released, your point is????
According to here: https://www.anandtech.com/bench/product/2263?vs=25... it seems that, for the most part, they are even in games, and even then the difference in some is 10 frames or less - and that's with the 3900X having a clock speed disadvantage to boot. Imagine if the clocks were the SAME.
" that Intel has routinely dropped per-core pricing on new HEDT processors:" yea.. probably because intel released the replacement for it :-) and we SHOULD be thanking AMD for the price drops, case in point, look at what intel charges for the Core i9-10980XE, $979, vs $1979 for the Core i9-9980XE. and over all for cascade lake X vs skylake X.
_Shorty - Monday, October 7, 2019 - link
While there obviously is a huge business/budget PC market that will benefit from having an integrated GPU, there is also a considerable market that doesn't care about it or want it, as it will never be used. Yields make such a big impact on their bottom line that it's a wonder they've never bothered to make a second line that omits the GPU entirely - not merely salvaging defective units by disabling the GPU, but removing the GPU from the die altogether, so that zero silicon is spent on it and they get many more CPUs from each wafer.

QuickSync encoding quality is horrible when using actually-sane bitrates. Software encoding with x264/x265 is always going to be better, because the people adding these QuickSync/NVENC features don't actually care about making quality encodes; they just want another marketing checkbox to fill in. If features like this worked as quickly as they do while also delivering quality comparable to a reasonable target, such as x264/x265's medium preset or better, then they would actually be useful - especially if they matched medium or better at streamable bitrates. But their quality at streamable bitrates is so bad. So, so bad.
If they were smart, they'd do a limited run of such a CPU without a GPU even existing on it just to see how it would actually sell. Sell something equivalent to the 9900K sans GPU, and price it with a discount compared to the 9900K that relates directly to the percentage of silicon saved per die/wafer. Does it make the CPU die 20% smaller? Sell it for 20% less. I think they'd be surprised at just how many they sold. Rather, how quickly it would sell out.
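That proportional-discount idea can be sketched with rough numbers. The iGPU share of the die below is an assumed figure for illustration only, not a measured one:

```python
# Hypothetical: if the iGPU occupied ~25% of an i9-9900K-class die,
# a GPU-less variant of the same design could (in this sketch) be
# ~25% smaller and priced proportionally.
msrp = 488                 # i9-9900K launch price in USD (1k-unit RCP)
igpu_die_fraction = 0.25   # assumed fraction, for illustration only

discounted = msrp * (1 - igpu_die_fraction)
print(f"GPU-less price at proportional discount: ${discounted:.0f}")
```

In practice the saving could exceed the area fraction: at a fixed defect density, a smaller die yields better than proportionally, so more good CPUs come out of each wafer.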
Quantumz0d - Monday, October 7, 2019 - link
Intel should have gotten rid of that whole iGPU BS on the die - precious die space laid to waste. The uncore power is still there no matter how idle the iGPU is. Unlocked processors do not need the extra overhead of the iGPU junk, especially since the new DCH driver rubbish is horrid.

I hope their new 10C Comet Lake doesn't have that stupid garbage.
eek2121 - Tuesday, October 8, 2019 - link
IMO it isn't worth it. It kills too many valid use cases (Quick Sync, Linux KVM with GPU passthrough without a second discrete GPU, troubleshooting, etc.).

This move is just a response to Zen 2 - Intel's way of coming closer to AMD's pricing (AMD's Zen 2 based chips don't have integrated graphics).
Side note: this site is slowly becoming an unusable, ad-infested mess. A video with sound that autoplays right above the comment box? Really? Time to turn the adblocker back on or stop visiting AnandTech. Purch is slowly killing off tech sites.
Beaver M. - Tuesday, October 8, 2019 - link
They just admitted they can't fix all the security issues. Not in the next generation, nor the one after that, and probably not even the one after that.

So why should I pay that much money for insecure CPUs? Huh, Intel?
Korguz - Tuesday, October 8, 2019 - link
Beaver M, proof of this???
hanselltc - Tuesday, October 8, 2019 - link
Something like a $35 shave across the board? Nice. Still not sure why one would buy anything but an i9 from Intel, other than for compatibility reasons, though.