- Joined: Aug 28, 2007
- Messages: 36,154
Thread Author · #1
ATI already won next gen Xbox deal
Written Tuesday, 13 October 2009 12:59
Game over
We've learned from industry sources that AMD/ATI has already won the GPU deal for the next-generation Xbox console. It looks like Microsoft was happy with the first Xbox 360's Xenos GPU and wants to continue with the same vendor, especially since the new ATI GPU should keep compatibility with legacy games.
The console refresh was supposed to happen in 2010, but due to the recession both Microsoft and Sony have decided to push their plans to 2012 and keep the Xbox 360 and PlayStation 3 alive for longer than originally planned.
We don't know what the GPU looks like, but judging from the timeline for when it is supposed to be delivered, we suspect it might be a 28nm part.
cybercore
New Member · Joined: Jul 7, 2009 · Messages: 15,641
Good news.
WITH MICROSOFT'S OFFICIAL announcement of the next-generation Xbox 360 console this week, ATI has decided to disclose some of the architectural details of the graphics processor that it created for the system. I had a brief but enlightening conversation with Bob Feldstein, Vice President of Engineering at ATI, who helped oversee the Xbox 360 GPU project. He spelled out some of the GPU's details for me, and they're definitely intriguing.
Feldstein said that ATI and Microsoft developed this chip together in the span of two years, and that they worked "from the ground up" to do a console product. He said that Microsoft was a very good partner with some good chip engineers who understood the problems of doing a non-PC system design. Also, because the part was custom created for a game console, it could be designed specifically for delivering a good gaming experience as part of the Xbox 360 system.
System architecture
Now that we've delved into the shaders a bit, we should take a step back and look at the bigger picture. The Xbox 360 GPU not only packs a lot of shader power, but it's also the central hub in the Xbox 360 system, acting as the main memory controller as well as the GPU. The Xbox 360 has 512MB of GDDR3 memory onboard running at 700MHz, with a 128-bit interface to ATI's memory controller. The ATI GPU, in turn, has a very low latency path to the Xbox 360's three IBM CPU cores. This link has about 25GB/s of bandwidth. Feldstein said the graphics portion of the chip has something of a crossbar arrangement for getting to memory, but he didn't know whether the CPU uses a similar scheme.
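As a quick sanity check of the numbers above, a back-of-envelope calculation (my own, not from the article, and assuming GDDR3's standard double data rate) gives the main memory bandwidth implied by a 700MHz clock on a 128-bit bus:

```python
# Rough bandwidth estimate for the Xbox 360's main memory, based on the
# figures quoted above. The double-data-rate factor is an assumption
# about GDDR3, not a number from the article.
clock_hz = 700e6            # 700 MHz memory clock
transfers_per_clock = 2     # GDDR3 transfers data on both clock edges
bus_width_bytes = 128 // 8  # 128-bit interface = 16 bytes per transfer

bandwidth_gb_s = clock_hz * transfers_per_clock * bus_width_bytes / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # 22.4 GB/s
```

That lands in the same ballpark as the ~25GB/s CPU link the article mentions, which fits the picture of the GPU acting as the system's central memory hub.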
Unified shaders
Feldstein cited several major areas of innovation where the Xbox 360 GPU breaks new ground. The first of those is the chip's unified shader array, which does away with separate vertex and pixel shaders in favor of 48 parallel shaders capable of operating on data for both pixels and vertices. The GPU can dynamically allocate shader resources as necessary in order to best address a computational constraint, whether that constraint is vertex- or pixel-related.
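The load-balancing idea can be sketched in a few lines. This is a hypothetical policy of my own for illustration, not ATI's actual arbitration logic; only the unit count (48) comes from the article:

```python
# Toy sketch of dynamic shader allocation: split a fixed pool of unified
# shader units between vertex and pixel work in proportion to demand.
# The proportional policy is invented; real hardware arbitration differs.
def allocate(total_units, vertex_queue, pixel_queue):
    """Return (vertex_units, pixel_units) for the given queued workloads."""
    total = vertex_queue + pixel_queue
    if total == 0:
        return 0, 0
    v = round(total_units * vertex_queue / total)
    return v, total_units - v

# A pixel-heavy frame pulls most of the 48 units toward pixel work.
print(allocate(48, 100, 300))  # (12, 36)
```

The point is that neither class of work can starve while the other has idle capacity, which is exactly the inefficiency of fixed vertex/pixel pipelines that the unified design avoids.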
This sort of graphics architecture has been rumored as a future possibility for some time, but ATI worried that using unified shaders might cause some efficiency loss. To keep all of the shader units utilized as fully as possible, the design team created a complex system of hardware threading inside the chip itself. In this case, each thread is a program associated with the shader arrays. The Xbox 360 GPU can manage and maintain state information on 64 separate threads in hardware. There's a thread buffer inside the chip, and the GPU can switch between threads instantaneously in order to keep the shader arrays busy at all times.
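The thread-switching scheme above can be illustrated with a small simulation. Everything here (the stall model, the work sizes) is invented for the sketch; only the core idea, parking a stalled thread and immediately issuing another ready one, reflects the article:

```python
# Toy model of hardware multithreading: when the running thread stalls
# (e.g., waiting on a memory fetch), switch to another ready thread so
# the execution units never sit idle. Real hardware switches at no cost.
from collections import deque

class ThreadContext:
    def __init__(self, tid, work):
        self.tid = tid
        self.work = work  # remaining work units before completion

def run(threads, stall_every=3):
    ready = deque(threads)
    completed = []
    cycles = 0
    while ready:
        t = ready.popleft()
        burst = min(stall_every, t.work)  # execute until stall or finish
        t.work -= burst
        cycles += burst
        if t.work > 0:
            ready.append(t)       # stalled: park it, pick the next thread
        else:
            completed.append(t.tid)
    return cycles, completed

cycles, order = run([ThreadContext(i, 5) for i in range(4)])
print(cycles, order)  # 20 [0, 1, 2, 3] -- all work issued, no idle cycles
```

With 64 such contexts held on-chip, there is almost always a ready thread to cover any one thread's memory latency, which is how the design keeps 48 shader units fed.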
Embedded DRAM for "free" antialiasing
The GPU won't be using system memory itself quite as much as one might expect, because it packs 10MB of embedded DRAM right on the package. In fact, the Xbox 360 GPU is really a two-die design, with two chips in a single package on a single substrate. The parent die contains the GPU and memory controller, while the daughter die consists of the 10MB of eDRAM and some additional logic. There's a high-speed 2GHz link between the parent and daughter dies, and Feldstein noted that future revisions of the GPU might incorporate both dies on a single piece of silicon for cost savings.
The really fascinating thing here is the design of that daughter die. Feldstein called it a continuation of the traditional graphics pipeline into memory. Basically, there's a 10MB pool of embedded DRAM, designed by NEC, in the center of the die. Around the outside is a ring of logic designed by ATI. This logic is made up of 192 component processors capable of doing the basic math necessary for multisampled antialiasing. If I have it right, the component processors should be able to process 32 pixels at once by operating on six components per pixel: red, green, blue, alpha, stencil, and depth. This logic can do the resolve pass for multisample antialiasing right there on the eDRAM die, giving the Xbox 360 the ability to do 4X antialiasing on a high-definition (1280x768) image essentially for "free"—i.e., with no appreciable performance penalty. The eDRAM holds the contents of all of the back buffers, does the resolve, and hands off the resulting image into main system memory for scan-out to the display.
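What a multisample "resolve" actually does can be shown in miniature: average each pixel's sub-pixel samples down to one final color. The sample values below are invented for illustration; the real hardware does this per component (the six listed above) on the eDRAM die rather than in software:

```python
# Minimal sketch of an MSAA resolve pass: each pixel's N samples are
# averaged into a single output color. Sample data here is made up.
def resolve(samples_per_pixel):
    """samples_per_pixel: list of pixels, each a list of (r, g, b) samples."""
    resolved = []
    for samples in samples_per_pixel:
        n = len(samples)
        resolved.append((
            sum(s[0] for s in samples) / n,
            sum(s[1] for s in samples) / n,
            sum(s[2] for s in samples) / n,
        ))
    return resolved

# A pixel whose 4X samples straddle a black/white edge resolves to grey.
pixel = [(255, 255, 255), (255, 255, 255), (0, 0, 0), (0, 0, 0)]
print(resolve([pixel]))  # [(127.5, 127.5, 127.5)]
```

Doing this averaging next to the eDRAM, instead of shipping all the samples across the memory bus, is why the article can describe the antialiasing as essentially "free."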