AMD respond to NVIDIA's tough Radeon HD 5800 questions
Author: Shane Baxtor Posted: 1 day, 7 hours ago
Why is AMD focusing so much on DX11 when most games are DX9 console ports?
Today, and over the life of the Radeon HD 5000 series, dozens of triple-A titles will leverage DirectX 11 features. If NV was able to produce a DirectX 11 card today, they'd be delivering a much different narrative on DirectX 11 to the press. If NV really believes that DirectX 11 doesn't matter, then we challenge them to say so publicly, on the record.
I would say DX11 is the future, right? The future of better gaming, of a much better quality gaming environment, of moving on to extreme gaming, of creating history for what is coming?
If Nvidia had come out with DX11 first, would AMD be asking these same questions, or would they try to compete rather than question? Maybe Nvidia needs to get on the ball?
I dunno, 'tis just my thought.
Thread Author (#82)
Highwayman (Extraordinary Member)
I suppose they do have a point in regards to the DX9 ported games issue... BUT what have they done to push the devs into using DX10 as the primary renderer since release? ~muffled silence~ Yeah, bugger all. Also, the article was inaccurate about the number of PhysX-enabled games being in the single digits... unless he's counting in some new maths I've never heard of... there are at least 60 games listed that I know of. Still a terribly low number, but let's get the facts accurate, AMD.
As for the lack of DX11 games being a chicken-and-egg situation, well, that's no surprise... but once devs get out of the "console" frame of mind they will stop being lazy t**ts and start using the PC as the flagship, not the poor country cousin... well, I hope... but PC games don't rake in the money, and most devs are money-focused mega corps that aren't going to give up the cash cow while there are people willing to pay well over the odds for console games. I wouldn't be so annoyed at this marketing strategy IF they got their collective arses into gear and ported things better, or ported some of the more interesting titles the PC never seems to get. They can't use piracy as an excuse, as more people pirate console stuff; car boot markets are always full of them.
I'll always look down on the consoles with fury at the fact that they will always be the devs' mummy's boy.
Highwayman (Extraordinary Member)
It's just occurred to me that even the first GT300-series card out from Nvidia should own the 5870... merely based on the fact that the 5870 only equals the GTX 295 in most of the tests using DX9 and DX10. Let's remind ourselves that the 295 is essentially two 275-class GPUs on one card, with 480 cores in total... I read some time back that the first GT300 was going to be 512 cores before it even starts, so in theory it should squeeze a small but noticeable jump past the new AMD offerings. Guess it's going to come down to DX11 performance and price.
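To put rough numbers on that core-count reasoning, here is a back-of-the-envelope sketch (the 512-core GT300 figure was only a rumour at the time, and raw shader counts ignore architecture, clocks and drivers, so treat this as illustration, not prediction):

```python
# Crude shader-count comparison; shader count alone is only a rough proxy
# for performance, so this just quantifies the post's reasoning.
cards = {
    "GTX 295 (2 x GT200b)": 480,  # dual-GPU card: 2 x 240 shaders
    "GT300 (rumoured)": 512,      # single-GPU figure cited in the thread
}
baseline = cards["GTX 295 (2 x GT200b)"]
for name, shaders in cards.items():
    print(f"{name}: {shaders} shaders ({shaders / baseline:.1%} of the 295)")
# 512 / 480 is only about 7% more shaders than the dual-GPU 295, so any
# real jump would have to come from the new architecture and clocks too.
```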
Thread Author (#85)
Radenight (New Member)
I would say DX11 is the future, right? The future of better gaming, of a much better quality gaming environment, of moving on to extreme gaming, of creating history for what is coming?
If Nvidia had come out with DX11 first, would AMD be asking these same questions, or would they try to compete rather than question? Maybe Nvidia needs to get on the ball?
I dunno, 'tis just my thought.
It will be the future of gaming until DX12 comes out...
loathe (New Member)
It will be the future of gaming until DX12 comes out...
True, which is why I won't wait for the 2nd-gen cards before I buy. There's something better every month with GPUs. Maybe I will try DiRT out on my 4870 X2 first, but I can't see that lasting long. It will be interesting to see how well DX10.1 cards do with DX11.
As for Nvidia, it's the usual story with them: wait for AMD, then release something faster lol.
Radenight (New Member)
True, which is why I won't wait for the 2nd-gen cards before I buy. There's something better every month with GPUs. Maybe I will try DiRT out on my 4870 X2 first, but I can't see that lasting long. It will be interesting to see how well DX10.1 cards do with DX11.
As for Nvidia, it's the usual story with them: wait for AMD, then release something faster lol.
It will be interesting to see if there's another "halt" in video cards... There was a good 5 or 6 months at least where nothing much happened with GPUs...
loathe (New Member)
It will be interesting to see if there's another "halt" in video cards... There was a good 5 or 6 months at least where nothing much happened with GPUs...
Depends on how you look at it. There's always a card with higher clocks, or a new cooler, being released. A recent example is EVGA's new Link Removed. As for the hardware on the card itself, you're right.
Thread Author (#90)
Nvidia say yields are fine...
After a lot of rumours about bad yields with GT300, Nvidia has decided to talk to us and give us a simple quote about the state of its next-generation 40nm product line.
A senior manager from Nvidia told Fudzilla that "Our (Nvidia's) 40nm yields are fine. The rumors you are hearing are baseless."
This comes after some stories that GT300 yields are really bad and that next-generation Nvidia is in really bad shape. According to AMD's competitive analysis team, only nine chips should work per wafer, but they got this number from a bad translation, and we can definitely dismiss it as completely inaccurate.
As we've said before, the chip we call GT300 is a completely new design and it simply could not come out earlier. Nvidia certainly doesn't like the fact that ATI is first with new technology again, but it comforts them that they can make their DirectX 11 part faster.
You can expect a lot of architectural changes: the chip is completely redesigned and much faster than the GT200 generation. We would not be surprised if Nvidia started talking about the chip ahead of the official announcement, as it is currently hurting from the positive Radeon 5870 reviews around the globe.
Link Removed due to 404 Error
Thread Author (#91)
Thread Author (#92)
ATI Hemlock (Radeon HD 5870 X2) Pictures
Link Removed due to 404 Error
ATI Radeon HD 5870 SIX/Eyefinity edition on the right, the beefier HD 5870 X2 on the left. Note that both cards require 6+8-pin PCIe power connectors, as opposed to 6+6-pin for the regular HD 5870.
Link Removed due to 404 Error
The full length of Hemlock from behind: all 12 inches of it. Since this is the longest consumer card ever, you might want to check for space inside your case if you are planning to buy one. The single CrossFire connector is visible, allowing quad-CrossFireX. This picture also pretty much crushes rumours of an MCM single-package Hemlock chip. However, it is interesting to note how close the two Cypress dies are to each other. This could suggest a possible relocation of the PLX bridge chip, or perhaps a new interconnect.
Hemlock is scheduled to launch in one month's time at a rumoured $499. For more information, do read our earlier report on AMD's demonstration.
Reference: eNet
Link Removed
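For context on the 6+8-pin note above, the connector configuration caps how much power the board may draw, since PCIe power budgets are additive. A minimal sketch of that arithmetic, using the PCIe specification limits (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin):

```python
# PCIe power budget: slot power plus auxiliary connectors caps board power.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150  # watts, per the PCIe spec

def board_power_cap(*aux_connectors):
    """Maximum board power in watts for a card with the given aux connectors."""
    return SLOT + sum(aux_connectors)

print("HD 5870 (6+6-pin):", board_power_cap(SIX_PIN, SIX_PIN), "W")       # 225 W
print("HD 5870 X2 (6+8-pin):", board_power_cap(SIX_PIN, EIGHT_PIN), "W")  # 300 W
```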
Thread Author (#93)
TweakTown have done a comparison between the 4890 and the 5870:
AMD's ATI Radeon HD 5870 vs. HD 4890 GPU Comparison Introduction :: TweakTown
Thread Author (#94)
AMD vs Nvidia DX11 GPU War Heats Up This Christmas
Published on Friday, September 25, 2009, 11:42 pm by Visionary
Filed under: AMD, Graphics Cards, News, Nvidia
Link Removed due to 404 Error
VR-Zone has put together a list of all the upcoming graphics cards that are going to be released by AMD and NVIDIA in the next few months. Following the launch of the ATI Radeon HD 5800 series (Cypress) on 23rd September, AMD is planning its next important launch for mid-October: the Radeon HD 5700 series (Juniper) for the sub-$200 mainstream market.
At the same time, Nvidia will finally launch their much-delayed 40nm DX10.1 discrete graphics cards: the GeForce G210 based on GT218/D10M1 and the GeForce GT 220 based on GT216/D10M2. The GeForce G210, a sub-$40 entry-level card, has 24 shaders with DDR2 memory on a 64-bit interface, while the GeForce GT 220, a sub-$60 mainstream card, has 48 shaders and DDR3 memory on a 128-bit interface. Shortly after, Nvidia plans to release another DX10.1 card based on GT215/D10P1: the GeForce GT 240, with 96 shaders and GDDR5 memory on a 128-bit interface.
Initially, NVIDIA had planned an enthusiast GPU codenamed GT212, but it was canned as it was pointless against the more formidable Evergreen Cypress. Instead, NVIDIA concentrated their efforts on the next-generation DX11 GT300 GPU and is determined to get it out this coming December. Interestingly, sources told us that AIC partners will get to design the GT300 cards themselves, which means no more boring reference cards. Information we have gathered so far implies that GT300 has more than double the shader processors of the previous generation, which means more than 480 shader processors, since GT200 has 240. The GT300 card will be fitted with GDDR5 memory on a 384-bit memory interface.
NVIDIA seems pretty confident about this card snatching the performance crown back from AMD, with a performance target of P2xxxx in 3DMark Vantage. However, it is too early to tell who will win the next-gen GPU performance war, especially when the DX11 titles aren't out yet. Of course, AMD has their dual-"Cypress" card, "Hemlock", in place to stop NVIDIA's advance. We have also heard faintly that there is a Radeon HD 5890 card somewhere in the pipeline too. Surely, this Christmas is going to be an interesting one.
Early next year, AMD is going to push ahead with their mainstream and value offerings, Redwood and Cedar, which will complete the whole Evergreen rollout.
Link Removed
A 5890? Now I may go for that, depending on figures and what Nvidia bring to the table, of course...
cybercore (New Member)
Good post here, kemical; a very high-quality comparison.
Link Removed - Invalid URL
Talking about video card memory bus width: it is important. 128-, 256- or 512-bit refers to how much data the video card can move internally at once; together with the memory clock, the bus width sets the card's peak memory bandwidth. The higher the resolution you're playing at, the more impact the memory bus width has (at lower resolutions it's the GPU core that matters most). We all want to play games at 1280x1024 and higher, so I reckon ATI should widen it to 512-bit, and I would prefer 1024.
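To make that concrete, a quick theoretical-bandwidth calculation (a sketch only; the HD 5870 figures below are its published 256-bit bus and 1200 MHz quad-pumped GDDR5, while the 512-bit line simply assumes the same memory clock):

```python
# Peak memory bandwidth = (bus width in bytes) x effective transfer rate.
# GDDR5 is quad-pumped, so the effective rate is the memory clock times 4.
def bandwidth_gbs(bus_bits, mem_clock_mhz, pump=4):
    """Theoretical peak bandwidth in GB/s."""
    return (bus_bits / 8) * (mem_clock_mhz * pump) / 1000

print("HD 5870, 256-bit @ 1200 MHz GDDR5:", bandwidth_gbs(256, 1200), "GB/s")  # 153.6
print("Same memory on a 512-bit bus:", bandwidth_gbs(512, 1200), "GB/s")       # 307.2
```

Doubling the bus doubles the theoretical bandwidth, but it also roughly doubles the memory controller size and PCB routing cost, which is why buses wider than 512-bit have never appeared on consumer cards.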
Highwayman (Extraordinary Member)
Also spotted this special edition version ASUS is doing... up to 38% faster than the stock version:
"Asus has today launched what is being called the world's first EAH5800 series graphics card with "voltage tweak technology" that promises up to a 38% boost in performance. The new 1GB EAH5870 and 1GB EAH5850 are the planet's first two cards to utilize this new tech, which essentially gives power users the ability to boost GPU voltages via the SmartDoctor application to enjoy up to a 38% improvement in performance."
"Asus has today launched what is being called the world's first EAH5800 series graphics card with "voltage tweak technology" that promises up to a 38% boost in performance. The new 1GB EAH5870 and 1GB EAH5850 are the planet's first two cards to utilize this new tech, which essentially gives power users the ability to boost GPU voltages via the SmartDoctor application to enjoy up to a 38% improvement in performance."
Thread Author (#97)