17 Comments
WaitingForNehalem - Wednesday, November 30, 2016 - link
Lol, as usual they have no idea what to invest in... "ok guys, time to find the next predicted hot trend that we know nothing about". Mobile... nope, IoT... nope, wearables... nope, security... nope.
beginner99 - Wednesday, November 30, 2016 - link
It's more that they usually are half a decade too late to the party.
Qwertilot - Wednesday, November 30, 2016 - link
This one (and deep learning in general) is a much, much better fit for Intel than those earlier things were. Huge computational requirements, premium components, etc. Some very real competition for them, of course, but still.
WaitingForNehalem - Wednesday, November 30, 2016 - link
Oh, I agree, but as stated above, they're always late. It's as if they have no long-term vision. I'm a grad CS student studying machine learning. Guess what I'll be using for deep learning: an Nvidia GPU.
p1esk - Wednesday, November 30, 2016 - link
Do you not want any competition to Nvidia?
shabby - Wednesday, November 30, 2016 - link
Is Intel even capable of competing with ATI/Nvidia?
willis936 - Wednesday, November 30, 2016 - link
ATI?
shabby - Wednesday, November 30, 2016 - link
Hah, oops, you know what I mean.
nerd1 - Wednesday, November 30, 2016 - link
BUT no one's using Intel CPUs for deep learning.
beginner99 - Wednesday, November 30, 2016 - link
Well, Knights Landing is available as a CPU and should be very good for deep learning...
Jaybus - Wednesday, December 7, 2016 - link
Yes, but CPUs will continue to be used in AI systems. GPUs are better at deep learning than CPUs, but they are far inferior to neuromorphic chips for deep learning, particularly for sensory processing, and take far too much power. The sensory processing will likely be farmed out to neuromorphic co-processors (think updated, more powerful versions of the Quark SE or CogniMem's CM1K).
webdoctors - Wednesday, November 30, 2016 - link
But you could make the same argument for the GPU market. Large computation requirements, premium components, etc.
However, you still can't buy a mid- or high-end Intel GPU, or even a discrete Intel GPU. They should work on fixing their current core competencies before jumping into another area with token amounts of funding...
Qwertilot - Thursday, December 1, 2016 - link
The GPU market has/had masses of built-in development effort behind it, which is very hard to duplicate.
This kind of AI compute mostly has repurposed general-purpose GPUs as competition - NV have only recently moved towards specialising their designs for it - so a priori Intel should have a plausible shot at it. So should the sundry other companies who may well spring up going forwards.
Agree with the note on the previous page that it might be better if they were there already, but...
Ariknowsbest - Friday, December 2, 2016 - link
And Altera IP could be used for machine learning just as well as a GPU.
Shadowmaster625 - Wednesday, November 30, 2016 - link
Since when does GPS require 50KB of data per second? Shouldn't it be more like 50 bytes per second?
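[Editor's note: as a sanity check on that figure - the GPS L1 C/A navigation message broadcasts at just 50 bits per second, so assuming the slide meant the raw nav-message rate (an assumption on our part), the arithmetic does favour bytes, not kilobytes, per second. A minimal sketch:]

```python
# Sanity check on the slide's GPS figure.
# Assumption: the relevant stream is the GPS L1 C/A navigation
# message, which broadcasts at 50 bits per second.
NAV_MESSAGE_BPS = 50                      # bits per second
nav_bytes_per_sec = NAV_MESSAGE_BPS / 8   # 6.25 B/s

slide_bytes_per_sec = 50 * 1000           # the slide's "50KB" per second

print(f"nav message: {nav_bytes_per_sec} B/s")
print(f"slide figure is ~{slide_bytes_per_sec / nav_bytes_per_sec:,.0f}x larger")
# nav message: 6.25 B/s
# slide figure is ~8,000x larger
```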
risa2000 - Wednesday, November 30, 2016 - link
I suspect it is a "fix" from some marketing guy who got the original slide from the engineers with "50 B" in it and considered it a typo. Notice that all figures have a space between the number and the unit, except GPS. That means the other data are probably legit, even though the GPS value may suggest otherwise.
GhostOfAnand - Thursday, December 1, 2016 - link
Intel should just open up its fabs to others.