Intel has long believed that computers work best when they are designed with powerful general-purpose processors that do just about everything. That’s not too surprising since Intel’s success has been built almost entirely on general-purpose processors, from the 8086 of the late 1970s to today’s Core family.
But there has always been a counter-thrust in the semiconductor industry that has promoted specialized silicon to handle specific jobs, such as signal processing. Overall, the fight has been something of a draw, with Intel owning the PC market and the makers of more specialized chips such as Texas Instruments winning in the world of smartphones, cameras, and the like.
In the last couple of years, however, the world of PC graphics has become a major battleground between these two philosophies. And right now, it looks like Intel is in some trouble. Intel still supplies the graphics adapters for the great majority of PCs sold. But these are low-end systems that have graphics adapters integrated into the chipsets that support the main processor and that share memory with the processor and everything it does. The high-end part of the business is dominated by separate, or discrete, graphics adapters that feature dozens of specialized processors and large amounts of dedicated memory. This business is owned by Nvidia and the ATI unit of AMD.
The problem is that the graphics demands of computers are soaring. At one time, only gamers, artists and graphic designers, scientists, and engineers really cared very much about graphics quality. Now, nearly everyone is running graphics-intense applications, particularly high-quality video. To see the difference an improvement in graphics processing can make, all you have to do is look at Hulu.com on a standard Atom-powered netbook with Intel’s integrated graphics and on one, such as the Lenovo IdeaPad S12 netbook, with Nvidia’s Ion graphics.
Intel recognized this growing problem a couple of years ago and started development of its own discrete graphics adapter chip. But the project, called Larrabee, has gone anything but smoothly, and after falling further and further behind schedule, Intel has had to admit that its plans to introduce a Larrabee product in the next year or two are dead.
“Larrabee silicon and software development are behind where we had hoped to be at this point in the project,” Intel spokesman Nick Knupffer wrote in an email. “As a result, our first Larrabee product will not be launched as a standalone discrete graphics product, but rather be used as a software development platform for internal and external use.”
The announcement is likely to cause much gloating at Nvidia and AMD. Nvidia CEO Jen-Hsun Huang has stopped just short of saying it really doesn’t matter what sort of general-purpose processor you have in a computer as long as you have enough firepower in your graphics adapter, now more often called a graphics processing unit. Nvidia has been promoting a technology called CUDA that offloads processing chores from the CPU to the GPU. AMD, which of course sells both general-purpose processors and GPUs, has been singing a less aggressive version of the same tune.
Despite its setbacks with Larrabee, Intel can’t afford to get out of this game. There’s a major push within the industry to make GPUs major players in computation, not just graphics. Apple is leading a push for a software standard called OpenCL that will help accomplish this, and most major players, including Intel, have signed on. The question, now more than ever, is just when Intel will be able to get its own dog into this fight.
“The performance of the initial Larrabee product for throughput computing applications — as demonstrated at SC09 — is extremely promising and we will be adding a throughput computing development platform based on Larrabee, too,” said Knupffer’s email. “While we are disappointed that the product is not yet where we expected, we remain committed to delivering world-class many-core graphics products to our customers. Additional plans for discrete graphics products will be discussed some time in 2010.”