
Intel Larrabee architecture details revealed - [hardware]
06:40 PM EDT - Aug,04 2008

A new chip unveiled by hardware maker Intel aims to take graphics processing back to the x86 instruction set while still offering DirectX and OpenGL support. The chip will be offered as a discrete part on motherboards as well as a standalone processor to compete directly with the GeForce and Radeon products. Dubbed "Larrabee", the chip was unveiled at this year's SIGGRAPH conference and sports a variety of new technical features, including a fully coherent memory subsystem that allows for more efficient multi-chip implementations. Should x86-based graphical rendering prove viable to developers, Larrabee could serve as a cost-efficient alternative to expensive PC video cards, such as those produced by AMD and Nvidia.
- An x86-compatible basic processing core derived from the original Pentium. This core has been heavily modified for use in Larrabee to include a 16-ALU-wide vector unit. Each core has L1 instruction and data caches plus a 256KB L2 cache, all fully coherent.

- A 1024-bit ring bus (512 bits in each direction) for inter-core communication. This bus will carry data between the cores and other major units on the chip, including data being shared between the cores' individual L2 cache partitions and the cache coherency traffic needed to keep track of it all.

- Very little fixed-function logic. Most stages of the traditional graphics pipeline will run as programs on Larrabee's processing cores, including primitive setup, rasterization, and back-end frame buffer blending. The major exception here is texture sampling, where Intel has chosen to use custom logic for texture decompression and filtering. Intel expects this approach to yield efficiency benefits by allowing for very fine-grained load balancing; each stage of the graphics pipeline will occupy the shaders only as long as necessary, and no custom hardware will sit idle while other stages are processed.

- DirectX and OpenGL support via tile-based deferred rendering. With Larrabee's inherent programmability, support for traditional graphics APIs will run as software layers on Larrabee, and Intel has decided to implement those renderers using a tile-based deferred rendering approach similar to the one last seen on the PC in the Kyro II chip. My sense is that tile-based deferred rendering can be very bandwidth-efficient, but may present compatibility problems at first since it's not commonly used on the PC today. Should be interesting to see how it fares in the wild.

- A native C/C++ programming mode for all-software renderers. Graphics developers will have the option of bypassing APIs like OpenGL altogether and writing programs for Larrabee using Intel's C/C++-style programming model. They wouldn't get all of the built-in facilities of an existing graphics API, but they'd gain the ability to write their own custom rendering pipelines with whatever features they might wish to include at each and every stage.
Dunno if it's just me, but I see a lot of promise in this architecture. If they get the drivers right...

