Months ago at NVIDIA's GPU Technology Conference, CEO Jen-Hsun Huang announced the upcoming Fermi architecture to the world. Due to the nature of that conference, very few, if any, details were revealed regarding the architecture's performance in the one area where NVIDIA's roots lie: 3D graphics processing. All we knew at that time was that Fermi-based cards weren't anywhere near ready for mass production and that ATI already had troops on the ground in the DX11 marketplace. Since then, ATI has continued to run away with the DX11 GPU market and things couldn't have looked worse for Team Green. Designing a whole new architecture from the ground up takes time, and NVIDIA was always the first to admit that.
If anything, NVIDIA has been extremely tight-lipped about all things Fermi ever since the GTC, but from the information we received from our moles within their Santa Clara offices, things were moving along at a quick pace. New technologies were being developed to showcase the chip's advanced capabilities, features were being toyed with, drivers were being written, and TSMC was pistol-whipped into shape after showing disappointing yields on early silicon. As CES rolled around, NVIDIA wanted to put most of their cards on the table and finally disclosed how they have adapted Fermi's architecture for the consumer GPU marketplace. This included selective tech demos and hardware being shown on the show floor, as well as all-day "Deep Dive" briefings conducted behind the scenes for select journalists. We attended one of these briefings, and in this article we will finally shed some light on the technologies that will make the Fermi architecture hum along in games and other 3D apps.
Spotted this early sneak look at the Fermi GF100 vs the GTX 285 and HD 5870/5970 using Dark Void and Far Cry 2 benchmarks... take the results with a pinch of salt as there's no mention of drivers and this isn't the final product yet, but it's shaping up to be great IF the price is right.
Snippet from Hexus.net's first look, full link below.
The Fermi and GTX 285 numbers were provided by NVIDIA, under NVIDIA-controlled conditions, at an event in Las Vegas.
NVIDIA divulged further details on its upcoming DX11 'Fermi' GF100 GPU architecture during an event held just after the Consumer Electronics Show in Las Vegas, Nevada.
Providing the meat on the bones that were laid out back in October 2009, NVIDIA is set to deliver Fermi at the start of March 2010, according to senior representatives present.
What NVIDIA kept close to its chest were the finer details that would, once released, define performance. There was no mention of clock-speed, frame-buffer sizes, heat, power, or, well, any real performance metric that could be inferred by looking at the architecture.
Towards the end of the day-long presentations, however, the graphics goliath took time out to compare a Fermi GF100 card against the fastest single-GPU card that's currently available from the green stable, the GeForce GTX 285.
Again, we repeat that no mention was made of Fermi clock-speeds, but it would be sensible to imagine that NVIDIA wanted to create a good impression and therefore had a well-screened, high-frequency card in the box.
As an update, an NVIDIA spokesperson contacted us and said that 'Well, all I can say is that it was not final and not running final clocks. Final perf will be higher'.
Here are the same two NVIDIA cards running Dark Void, released in the US today and in Europe on 22nd January. Based on the Unreal Engine 3 with lots of PhysX thrown in for good measure, NVIDIA ran the timedemo at 1,920x1,200 with 4x AA and 16x AF and PhysX set to medium.
Dark Void test: The GeForce GTX 285 returns an average frame-rate of 38.38fps and a minimum of 29.28fps, suggesting that the game will need a beefy GPU if it's to run well at this setting. The same test on Fermi shows an average frame-rate of 78.32fps and a minimum of 53.32fps.
Far Cry 2 test: The GeForce GTX 285 returns an average frame-rate of 50.32fps with a maximum of 73.13fps and a minimum of 38.40fps. Fermi returns an average frame-rate of 84.05fps with a maximum of 126.20fps and a minimum of 64.60fps. The minimum frame-rate is higher than the GTX 285's average, and the 67 per cent increase in average frame-rate is significant.
The Radeon HD 5870 scores an average frame-rate of 65.84fps with a maximum of 136.47fps (we can largely ignore this as it's the first frame) and a minimum of 40.40fps, rising to 48.40fps on the highest of three runs. With the dual-GPU Radeon HD 5970, the average frame-rate increases to 99.79fps, beating out Fermi handily; the maximum frame-rate is 133.52fps and the minimum 76.42fps.
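As a quick sanity check of the "67 per cent" claim above, the uplift can be recomputed directly from the quoted averages. This sketch uses only the numbers reported in the article; it is an illustration, not a measurement of our own.

```python
# Sanity-check the frame-rate uplift figures quoted above.
# All numbers are the average fps values reported from NVIDIA's
# controlled demo; we merely recompute the percentage increase.

def uplift(new_fps: float, old_fps: float) -> float:
    """Percentage increase of new_fps over old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Far Cry 2 averages: GTX 285 = 50.32fps, Fermi GF100 = 84.05fps
farcry2 = uplift(84.05, 50.32)

# Dark Void averages: GTX 285 = 38.38fps, Fermi GF100 = 78.32fps
dark_void = uplift(78.32, 38.38)

print(f"Far Cry 2 uplift: {farcry2:.0f}%")   # ~67%, matching the article
print(f"Dark Void uplift: {dark_void:.0f}%") # roughly double the GTX 285
```

The Far Cry 2 result lands at about 67 per cent, consistent with the article's figure, while the Dark Void numbers work out to slightly more than a twofold increase.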
We've learned that some serious DirectX 11 games are expected in late February or, more probably, March.
We are of course talking about the heavily delayed Aliens vs Predator, expected sometime in Q1 2010, and Battlefield: Bad Company 2, which should arrive on March 2nd in the US and March 5th in Europe.
Dirt 2 has been out since December and is the only game ATI can really rely on, but let's not forget that BattleForge has had DirectX 11 support since late September and that S.T.A.L.K.E.R.: Call of Pripyat is also a DirectX 11 game.
The catch with S.T.A.L.K.E.R.: Call of Pripyat is that the game is currently out only in Russia, Germany, Switzerland and Austria, while the English and rest-of-world version is expected in February.
There are at least six more games that will support DirectX 11, and some of them, such as Crysis 2, are highly anticipated but will probably launch in the latter part of 2010.
Native DirectX 11 games will only come with the next-generation console refresh, which is expected in 2012.
Aliens vs Predator is DX11 too I heard, and seriously I hate the term "Next Gen Console" cos they're only "next gen" for a nano-second before they're "current gen". I wish people would drop that term, it really pisses me off, cos the Xbox 360 is still called "next gen" even though it's a stale four-year-old bit of legacy cack.
To come in 2010
If all goes well, the second-generation Fermi architecture should be coming in late 2010. Since Fermi is Nvidia's big transition that got delayed, it looks like Nvidia plans to have two new major generations in a single year. Traditionally, a new chip comes in one year and, roughly 12 months later, the company is ready to launch a faster version of the same chip with more muscle.
We also learned that mainstream Fermi won't be significantly delayed from the original plan, and we hope to see the GF100-based GT300 single-chip card in March of this year.
By November Nvidia should be ready to announce its new architecture, but this time we only know that Fermi is the first step in a new architectural line that should stay with us for a few years. The G92 foundations lasted almost three years, and we expect Fermi and future Fermi-derived chips to share the same fundamental design over the next few years.
After all, this is what ATI and Nvidia have been doing for quite some time now. If Nvidia doesn't push second-generation Fermi to a later date, you should see these chips by Q4 2010.
A significant number of ATI users have taken to the AMD user-to-user forums to complain about grey screens, crashes and colored-stripe hangups.
(There are actually countless forums out there with threads like this, including at least four threads in our own forums. I'm concentrating on the AMD forums because these users, between them, have collected nearly 40 pages of possible causes involving everything from Windows, to mobos, to RAM.)
So far it's unclear what is causing the problem. Users report that grey, brown or colored stripes appear on screen while playing games, watching movies and, in some cases, while idle. The problem seems to be confined to the HD 5xxx series, although there are a couple of mentions of 4xxx cards.
I have an HIS HD5770 which, when I first installed it with the drivers that came on the CD, used to black-screen when starting anything, and I do mean anything: Explorer, Task Manager, even Spider Solitaire.
So I uninstalled them and tried the Cat 9.12s; same thing, black screen on everything, BAH... so I uninstalled them too. At my wits' end, I then found the Cat 9.12 hotfix drivers, and these worked like a charm, and now the Cat 10.1 drivers do as well.
I have heard/read in my trawling of forums that those with a 6-pin power plug on a not-so-stellar PSU shouldn't use it, but should use a 2x 4-pin Molex to 6-pin adapter instead.
A short tweet on Nvidia's Twitter account has revealed what appear to be the first two cards that will be based on the Fermi architecture: the GTX 480 and GTX 470. This should at least clear up some confusion, and the average consumer can keep buying cards on the assumption that a bigger number means a newer and better card, at least until Nvidia starts with more rebranding.
Fermi will apparently be the 400 series, and it has been known for a while that two cards are going to launch as soon as Nvidia gets them ready. The specifications of these two models were apparently too long for a tweet, so we'll have to wait a bit longer to get those from Nvidia.
Even Nvidia's partners have stopped talking about dates, but everyone hopes that these cards will be available right after CeBIT.
You can find the "Fun Fact of the Week" tweet here.
According to our info, the Fermi GTX 480 and GTX 470 names were a last-minute change, as even partners were talking about GTX 3xx, but it looks like Nvidia decided to clear up some confusion and went with the GTX 4xx names instead.
The cards should be showcased at CeBIT by various Nvidia partners, which means that they might get them by the end of February, but, as always when we're talking about Fermi, nothing is certain.
Apparently, the old names for the GF100 were GTX 380 and GTX 360, something that would make sense given the previous naming history. However, since Nvidia rebranded some old cards as the 300 series, those names could have created a lot of confusion for the average consumer, so we guess the change is for the better.
On the other hand, we aren't sure why Nvidia went with the GTX 470 name rather than GTX 460, which would have made perfect sense. This might have something to do with the specification difference between these first cards: performance-wise, they might be quite close to each other.
Of course, the performance numbers are still blurry as Nvidia is keeping them tightly under wraps and partners are yet to receive their first cards, so things should start to get clearer as we draw closer to the CeBIT show in March.
Nvidia is working on what looks to be its first DirectX 11 card and, as was the case before, Nvidia will start with DirectX 11 at the ultra-high end and pass it down to slower cards at a later date. The codename we've heard is a quite logical one, GT300, and this card should help Nvidia fight for, and eventually take, the performance crown in the ultra-high-end market.
We do know that ATI should have its high-end DirectX 11 part out at a similar date, and in the meantime both companies will focus more on cheaper cards in 2009, as the Year of the Ox will probably be a good year for selling cheaper, more affordable stuff. All of the mainstream and entry-level cards launching in the next three quarters will stick with DirectX 10 or 10.1, depending on whom you ask.
Many of you know that the high end helps sell entry-level and mainstream parts, and it is rather important who wins this round, but it is still way too early to tell.
ATI is currently rocking a substantial lead in the DirectX 11 space—it’s blazing a trail where it has, in the past, often followed suit. “But Chris, ATI gave us DirectX 10.1!” And look how pervasive or impactful that turned out to be. This time it’s different, though. ATI and Nvidia both agree that DirectX 11 is the API that’ll change the figurative game. Has it yet? Is this boat already sailing off into the sunset? Decidedly not. ATI’s own schedule of DirectX 11-compatible software lists three titles currently shipping with support, three more slated for Q1, and two more expected in 2010. The undeniable reality is that ATI is out in front, and Nvidia brings up the rear this time around. But its tardiness means very little in the big picture (so long as you’re an enthusiast, and not a shareholder).

Playing Catch-Up
The real values in ATI’s Radeon HD 5800-series lineup, as it stands today, are gaming performance in the more pervasive DirectX 9 and DirectX 10 titles, Eyefinity, and the family’s handling of high-def audio/video content.
And while today’s preview of Nvidia’s GF100 graphics processor (the first model based on the company’s Fermi architecture) is full of detail covering the chip’s building blocks and architectural philosophies, we’re left to make educated guesses on how a productized GF100 will stand up to its competition.