The Sorry State of the CPU Industry: INCEL IS BANKRUPT & FINISHED

Started by Magyarorszag, August 16, 2018, 07:46:07 PM


Magyarorszag

Quote from: bluaki on August 16, 2018, 06:42:00 AMIntel's woes continue: it looks like they've finally managed to make some 10nm CPUs for fairly widely-distributed devices, but can't fit a working video core on the same die.

https://www.anandtech.com/show/13233/intel-crimson-canyon-nuc-with-cannon-lake-10nm

The first 10nm NUC lineup doesn't have any Intel GPUs. Instead they're pairing with Radeon again, this time even on the lower models. Before, they only used Radeon in the most expensive high-powered Hades Canyon NUC, which, like the Skull Canyon before it, is quite a huge step up from the regular NUCi7 in price, size, and power draw.

I have always been of the opinion that a compelling argument in favor of Intel CPUs over AMD's is that Intel's entire lineup (minus server products) always comes with a GPU built right in. It seems odd and maybe even slightly suspicious to me that this huge advantage is so commonly ignored or dismissed by professional CPU reviewers and AMD fanboys alike. befuddlement

flagship amd cpus necessarily require you to go out and buy a separate gpu, which might be okay for gamers but is less attractive to enterprise and academic customers, for example (i mean yeah you could get an apu but those came over a year later and are way weaker)

So to hear now that Intel is struggling to integrate video cores onto their next-gen CPUs really is much bigger news than I think many people realize. goowan

i would be absolutely shaking if i were incel, because this will greatly weaken their value proposition to all customers vs amd, unless they ultimately are able to squeeze a gpu back in in time for the general cannon lake release (...in 2024 at this rate smth;)

Quote from: bluaki on August 16, 2018, 06:42:00 AMThis new 10nm-based "Crimson Canyon" NUC line is supposedly releasing this September, more than a full year before the currently-projected (and probably prone to yet another delay) full run of 10nm chips. And this is only two months after the Coffee Lake NUCs "launched" (even though they're barely launched yet considering only like one online retailer has it in stock for an excessive price). Confusingly, both NUC lines are "8th Generation"

incel's suddenly very rapid and confusing release schedule strikes me as panicky lol sillydood;

they were so much more composed (outwardly, at least) before ryzen struck

Quote from: bluaki on August 16, 2018, 06:42:00 AMAlso bafflingly, Intel is replacing the USB-C/TB3 port of the previous two NUC generations and the miniDP port of the three generations before that with a second HDMI port. There's no DisplayPort support at all. Also because of Radeon the PSU is bumped up from 65W to 90W despite sharing the same form factor as previous NUCs.

Just... why?

Quote from: bluaki on August 16, 2018, 06:42:00 AMAlso because of Radeon the PSU is bumped up from 65W to 90W despite sharing the same form factor as previous NUCs.

the sorry state of radeon

bluaki

Quote from: Magyarorszag on August 16, 2018, 07:46:07 PMI have always been of the opinion that a compelling argument in favor of Intel CPUs over AMD's is that Intel's entire lineup (minus server products) always comes with a GPU built right in. It seems odd and maybe even slightly suspicious to me that this huge advantage is so commonly ignored or dismissed by professional CPU reviewers and AMD fanboys alike. befuddlement

flagship amd cpus necessarily require you to go out and buy a separate gpu, which might be okay for gamers but is less attractive to enterprise and academic customers, for example (i mean yeah you could get an apu but those came over a year later and are way weaker)

So to hear now that Intel is struggling to integrate video cores onto their next-gen CPUs really is much bigger news than I think many people realize. goowan

i would be absolutely shaking if i were incel, because this will greatly weaken their value proposition to all customers vs amd, unless they ultimately are able to squeeze a gpu back in in time for the general cannon lake release (...in 2024 at this rate smth;)
I doubt they'll release desktop CPUs without integrated graphics. I just mentioned this as a sign of how incomplete Intel's ability to manufacture 10nm chips really is at this point: maybe the defect rate is so bad right now that they have to keep the chips simpler to get an acceptable pass rate. If every transistor has an equal chance of being defective, a chip with, say, half as many transistors is drastically more likely to come out defect-free. The graphics core is always a huge portion of a modern Intel CPU's die, so half is probably a pretty fair estimate.
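The defect-rate argument above can be sketched with the classic Poisson yield model, where yield drops exponentially with expected defects per die. The defect numbers below are made up purely for illustration; real 10nm defect densities are proprietary and unknown:

```python
import math

def poisson_yield(defects_per_die: float) -> float:
    """Fraction of dies expected to have zero defects,
    assuming defects land independently (Poisson model)."""
    return math.exp(-defects_per_die)

# Hypothetical numbers for illustration only: suppose an immature
# process averages 2 defects per full CPU+GPU die. Dropping the GPU
# roughly halves die area, and so halves the expected defect count.
full_die_yield = poisson_yield(2.0)  # CPU + integrated GPU
half_die_yield = poisson_yield(1.0)  # CPU only, GPU stripped out

print(f"full die: {full_die_yield:.1%}")  # ~13.5%
print(f"half die: {half_die_yield:.1%}")  # ~36.8%
```

In this regime, halving the die nearly triples the fraction of defect-free chips, which matches the intuition that shipping a GPU-less die is a yield-salvage move.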

Funny thing is, I didn't see any news reports mention this at all. They just treat the included discrete GPU as a bonus and don't think about the deeper reason why this happened. Honestly, that's probably playing right into Intel's hands, since I'm sure they don't want the shareholders thinking about this.

Quote from: Magyarorszag on August 16, 2018, 07:46:07 PM
Quote from: bluaki on August 16, 2018, 06:42:00 AMAlso because of Radeon the PSU is bumped up from 65W to 90W despite sharing the same form factor as previous NUCs.
the sorry state of radeon
I mean, the power bump is more because of moving to discrete graphics at all than because of Radeon in particular. If Intel wants to claim that dropping their own GPU is a feature, they need to pick a GPU powerful enough to justify going discrete.

bluaki

https://twitter.com/IntelGraphics/status/1029792940648878080

Intel's trying to tease their future (2020) discrete graphics cards by bragging about their experience with graphics. "IntelGraphics" is actually a brand new Twitter account, indicating Intel's shift to more graphics marketing I guess.

"Because at Intel, graphics are integrated into our core" sure unless you're having so much trouble actually making processors that you have to leave it out
"We were the first PC graphics to play Netflix movies in 4K" yeah you're sure leading the way on DRM

Magyarorszag

Quote from: bluaki on August 17, 2018, 07:43:16 PMhttps://twitter.com/IntelGraphics/status/1029792940648878080

Intel's trying to tease their future (2020) discrete graphics cards by bragging about their experience with graphics. "IntelGraphics" is actually a brand new Twitter account, indicating Intel's shift to more graphics marketing I guess.

"Because at Intel, graphics are integrated into our core" sure unless you're having so much trouble actually making processors that you have to leave it out
"We were the first PC graphics to play Netflix movies in 4K" yeah you're sure leading the way on DRM


i wonder why intel isn't bragging about larrabee goowan

last year, just as soon as it became clear how much of a miserable flaming blunder amd's vega was, amd's head gpu architect raja koduri (that's marijuana btw LOL) took a """""""""""""sabbatical"""""""""""""" from which he of course never returned, and then almost immediately got poached by intel upon separating from amd

https://newsroom.intel.com/biography/raja-m-koduri/

https://twitter.com/rajaontheedge

i am convinced the guy is a complete hack

i would be beyond (pleasantly) surprised to see a competent gpu come out of intel's newest discrete graphics project by 2020

my expectations, knowing their previous failures and knowing anything about koduri, are below even my expectations for radeon

Snowy

I loved my i5 4690K but I'll be damned if I shell out that kind of money again on a CPU. Pretty sure I got my Ryzen and mobo for the same price as just my cpu back when I made my original build.
Quote from: Samus Aran on November 05, 2009, 12:50:30 PMlast night i bludgeoned a bull moose to death with my cock

Magyarorszag

Quote from: Snowy on August 20, 2018, 06:41:25 AMI loved my i5 4690K but I'll be damned if I shell out that kind of money again on a CPU. Pretty sure I got my Ryzen and mobo for the same price as just my cpu back when I made my original build.

What exactly did you end up getting again? Has it been a worthy improvement for you over the 4690k? befuddlement

Snowy

Quote from: Magyarorszag on August 24, 2018, 12:34:32 PM
Quote from: Snowy on August 20, 2018, 06:41:25 AMI loved my i5 4690K but I'll be damned if I shell out that kind of money again on a CPU. Pretty sure I got my Ryzen and mobo for the same price as just my cpu back when I made my original build.

What exactly did you end up getting again? Has it been a worthy improvement for you over the 4690k? befuddlement
I have the Ryzen 5 1600X and an Asus X370 Pro mobo. I noticed immediate improvements in any CPU-intensive games, and on the rare occasion that I did stream I wouldn't have to turn graphics settings down.

With my CPU at 4 GHz @ 1.3 V I can stream Destiny 2 on the highest settings at 60 fps with zero frame drops. I do plan on getting a Ryzen 7 when Zen 2 comes out; I like the 1600X a lot, but you can tell that it's a gen 1/early pressing of it.

Magyarorszag

i'd like to upgrade myself but it doesn't really make sense until at least the next generation (cannon lake/zen 2) of cpus

so i have high hopes for zen 2, plz don't disappoint amd :'(

Snowy

I mean even Zen+ is really good lol. I know the 2700X is supposed to be really good.

Magyarorszag

oh i know they're good

there's just nothing on the market that's quite good enough compared to the i5 6500 to justify spending $400+ atm lol

honestly i'll probably wait until ddr5 becomes the standard before i do anything, and they claim it'll begin phasing ddr4 out in (probably late) 2019

bluaki

A couple days ago, Intel released a microcode update patching some speculative-execution vulnerabilities, with a license agreement asserting that users are forbidden from publishing any kind of benchmark data after installing the update. The language was so vague it could even apply to having, for example, a visible FPS counter while streaming.

After a very predictable Internet outrage, they responded and removed that clause from the user agreement.

Snowy

I mean obv they have something to hide. Didn't the last one gimp performance as well?
