Larrabee isn’t coming just yet.

Hmm… I am disappointed (story). No, I wasn’t expecting the first versions of the technology to be a game changer in graphics, or for that matter in the HPC or compute world, but I was very, very interested in knowing more about the Larrabee technology. Thus far Intel has thrown out only bits and pieces about the new tech, and that in no way gives one a clear picture. No, Intel hasn’t given up on the technology, but it seems to have postponed the release in its current form because the performance targets weren’t being met. Ironically, Intel had initially claimed that Larrabee chips would stand up to discrete solutions from ATI and Nvidia. However, it looks like the tech still needs more work to measure up to that.

At this point all we can do is speculate, but the fact is, building a chip that can handle HPC, compute, and graphics, and have the drivers, software, and optimizing compilers all working perfectly, is a tall order even for a giant like Intel. I am sure they have done most of it right, but most of it isn’t all of it, and that’s probably why we are seeing the launch canceled in its current form.

Many-core computing is the next big thing, and technologies like Larrabee are the future. I am disappointed because, more than the tech itself, Larrabee would have been a window into how things are shaping up. How does software development scale to the future? Would the new optimizing compilers allow the use of current software methods? Or does it mean a radical shift in the way software systems are built? How would the new tech address task parallelism? I guess we will have to wait a while longer to see how these questions (and I am sure many more) are answered.

2 thoughts on “Larrabee isn’t coming just yet.”

  1. So how does Larrabee differ from CUDA? Nvidia already has this in their GPUs. Larrabee was just hype and some clever spin doctoring from Intel.

  2. The difference is subtle, but significant. Larrabee is designed with more general-purpose compute in mind. It’s not a GPU per se, though Intel claims Larrabee could be used for graphics. GPUs (as of writing this comment) are specifically designed for graphics, though this is changing very fast. However, a lot of functionality is still hard-wired into the GPU specifically for graphics. Larrabee will do all of that in software (depth calculations, clipping, and so on).

    Larrabee has wide vector units like a GPU, but a coherent cache hierarchy like a CPU. The most important aspect of Larrabee (for developers), though, is the x86 instruction set. My interest in Larrabee was in how one would program for such an architecture, plus there have been some new additions like LRBni (a rough sketch of what that style of code looks like is at the end of this reply). There were some interesting articles by Michael Abrash on that very aspect:
    http://www.ddj.com/architect/216402188
    http://www.ddj.com/architect/217200602

    You should read those. He has given clear and concise answers to your questions (far better than I ever could).
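
    Just to give a flavour of the programming style (this is not LRBni code; the snippet below uses plain SSE intrinsics, which are only 4 floats wide, and the function name is my own), here is a minimal sketch of the kind of explicit data-parallel loop a wide-vector x86 core wants. Larrabee’s vector registers are 16 floats wide, so the same pattern would simply process 16 elements per iteration.

    #include <xmmintrin.h>  /* SSE intrinsics: 4-wide single-precision vectors */

    /* Illustrative sketch: y[i] = a*x[i] + y[i], processed 4 floats at a time. */
    void saxpy_sse(float a, const float *x, float *y, int n)
    {
        __m128 va = _mm_set1_ps(a);            /* broadcast the scalar into all lanes */
        int i;
        for (i = 0; i + 4 <= n; i += 4) {
            __m128 vx = _mm_loadu_ps(x + i);   /* load 4 floats from x */
            __m128 vy = _mm_loadu_ps(y + i);   /* load 4 floats from y */
            vy = _mm_add_ps(_mm_mul_ps(va, vx), vy);
            _mm_storeu_ps(y + i, vy);          /* store 4 results back to y */
        }
        for (; i < n; ++i)                     /* scalar tail for leftover elements */
            y[i] = a * x[i] + y[i];
    }

    The point is that on Larrabee a loop like this would still be ordinary x86 code, just with wider vectors and the usual coherent caches underneath, rather than a separate device-side language.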
