ATI Morphological Anti-Aliasing

kemical

Morphological Anti-Aliasing

Morphological anti-aliasing (AA) is an all-new option for the Radeon HD 6000-series cards. It takes a different approach to the aliasing problem in that it needs no insight into the scene's geometry; morphological AA is a post-process filtering technique, accelerated with DirectCompute and compatible (in theory) with any application from DirectX 9 to 11. After a frame is rendered, it is passed through the morphological AA shader, which looks for high-contrast edges and patterns consistent with aliasing. It then blends the colors of adjacent pixels to approximate a smooth transition along a line instead of aliased steps. This means the smoothing effect isn't limited to the edges of geometry or alpha textures, as it is with CFAA; it applies to every pixel in the scene.

Conceptually, this method promises results similar to super-sampling, but with performance comparable to edge-detect AA. AMD suggests that some applications will look better than others, and that the technique is not ideal for all scenes and games. So, we tried it ourselves to see what the actual result looks like.
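
To make the edge-detect-and-blend idea above a little more concrete, here is a rough sketch in Python/NumPy of that style of post-process filtering. It is only an illustration under my own assumptions (the 0.1 luma-contrast threshold and the naive 50/50 blend are arbitrary), not AMD's actual DirectCompute shader, which additionally classifies edge shapes to weight the blend.

```python
import numpy as np

def luma(img):
    # Perceptual luminance of an RGB frame with values in [0, 1].
    return 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]

def toy_morphological_aa(img, threshold=0.1):
    """Toy post-process AA: find high-contrast pixel pairs and blend across them.

    `img` is an H x W x 3 float array in [0, 1]. This is a conceptual sketch
    of edge-detect-and-blend filtering, not AMD's MLAA implementation.
    """
    y = luma(img)
    out = img.copy()

    # Contrast between each pixel and its right-hand neighbour (vertical edges).
    edge_x = np.abs(np.diff(y, axis=1)) > threshold          # shape (H, W-1)
    rows, cols = np.nonzero(edge_x)
    blend = 0.5 * (img[rows, cols] + img[rows, cols + 1])
    out[rows, cols] = blend
    out[rows, cols + 1] = blend

    # Contrast between each pixel and the one below it (horizontal edges).
    edge_y = np.abs(np.diff(y, axis=0)) > threshold          # shape (H-1, W)
    rows, cols = np.nonzero(edge_y)
    blend = 0.5 * (img[rows, cols] + img[rows + 1, cols])
    out[rows, cols] = blend
    out[rows + 1, cols] = blend

    return out
```

Because a filter like this runs on the finished frame, it touches geometry edges, alpha-tested textures and shader aliasing alike, which is why the technique can be bolted onto any DirectX 9-11 application.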

Read the full article here: Link Removed - Invalid URL
 
The method to unlock this for 5-series cards seems to have been leaked. I just tried the mod and it worked! (see screenshots)
Don't forget you need the 10.10a Cats for this to work:

Here's the method:

Start the downloaded file and extract it to a folder.
Go to (where you extracted)\Catalyst_8.782.1RC5_Win7_MLAA_Oct21\Packages\Drivers\Display\W7_INF\ and open CW106976.inf

Now download this .txt file:
http://www.forum-3dcenter.org/vbulle...5&postcount=18. Select everything with Ctrl+A and copy the modded lines into the .inf.
Do the same with C7106976.inf in \Catalyst_8.782.1RC5_Win7_MLAA_Oct21\Packages\Drivers\Display\W76A_INF\.

Now start the setup and the function should be unlocked in CCC.
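
If you'd rather script the INF swap than hand-edit, here's a minimal Python sketch of the same steps. The extraction path, the file name of the saved mod text, and the assumption that the same modded text (a complete INF) replaces both files are mine, not from the post above, so adjust them to match what you actually downloaded; the script also keeps a .bak copy of each stock INF first.

```python
import shutil
from pathlib import Path

# Where you extracted the 10.10a package -- adjust to your own path (assumption).
ROOT = Path(r"C:\AMD\Catalyst_8.782.1RC5_Win7_MLAA_Oct21\Packages\Drivers\Display")

# The modded INF text saved from the forum link (file name is an assumption).
MOD_TXT = Path(r"C:\AMD\mlaa_modded_inf.txt")

# The two INF files named in the post.
INF_FILES = [
    ROOT / "W7_INF" / "CW106976.inf",
    ROOT / "W76A_INF" / "C7106976.inf",
]

for inf_path in INF_FILES:
    # Keep a backup of the stock INF before overwriting it.
    backup = inf_path.with_name(inf_path.name + ".bak")
    shutil.copy2(inf_path, backup)
    # Replace the INF with the modded text (the Ctrl+A / copy step above).
    shutil.copy2(MOD_TXT, inf_path)
    print(f"Backed up {inf_path.name} and applied the modded INF")
```

After that, run the setup as described above; nothing about the installer itself changes.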


Thanks to Jakshi
Ref: ATI/AMD's new Morphological Anti-aliasing - Guru3D.com Forums

I've not really tested this yet but that's about to change, results shortly.
 
I tested by turning AA completely off in New Vegas and running the new setting. I still couldn't see any 'jaggies', and lines were crisp and sharp.
 
I wanted to express my sincere thanks for posting this information. For those interested, I'm genuinely unable to notice a difference on my end after making the unofficial patches needed to access the filtering. However, it looks like this could become a "big deal" very soon for those looking to get more out of their 5XXX-series cards in games. Nice find; I'll look forward to the next driver release, which will officially unlock it. If the promise of lower resource consumption is real, I think you'll see some MMORPGs make use of it, like World of Warcraft and Final Fantasy XIV. I mention the latter because I've tried it, and the graphics are absolutely breathtaking; it is the most ambitious MMO I have seen in a long time, with truly beautiful landscapes and visuals. I could see them adding support to make the game use less GPU resources, and WoW might well take advantage of it for their millions of customers. I think of "living games" as the ones that are patched frequently and would introduce it if it's practical. Games released late this year, and older games, may be patched to support the feature natively if it proves attractive enough to improve scaling for those on low-end hardware.

Here's my take: I tested the new feature on my ATI Radeon 5870. Since buying a Samsung monitor, everything I do is in full HD. I stick to the same few games these days, as my time is short, but when I have it I go all out on long-lasting games. My favorite is easily the Fallout series, so I tried this there by disabling in-game AA and applying the max settings in the Catalyst Control Center (CCC). I could not notice a difference, but this may be one of those games where you won't. I'm not sure if running the textures at such a high resolution has something to do with it. These days it's very common for textures to be rendered in HD to begin with and scaled down for lower-resolution monitors.

I can't tell whether using this feature reduces GPU resource consumption, because I generally don't measure it to begin with. It would be great if some people who do heavy benchmarking on their graphics cards could come by and do some thorough comparisons. To complement this great find and new feature, we need some Radeon users who run benchmarks frequently to see what the impact is. I'm sure it must provide some benefit. I will usually jump on something like this when there is a hint of a big performance boost, but the system I am using right now is such a powerhouse to begin with that I honestly cannot see any difference. I am sure someone will start number crunching soon, though. Excellent find and post.

I found the instructions a bit confusing, as they are spread across multiple sites and files. For AMD/ATI Radeon 5-series owners, be sure to read carefully. Updating to 10.10a alone is not going to help you if you have a 5XXX-series card: 10.10a was released after 10.10, activates this feature for 6-series cards, and ships with an unsigned driver. 5XXX-series owners must replace the INF files in the setup folder before running the 10.10a setup. You do this by downloading 10.10a and replacing the default INFs in the subdirectories with the ones supplied in the links kemical posted. You must then restart to see the feature in the Catalyst Control Center.

Because I did not notice a massive improvement, and because I am not big into benchmarks, I would recommend that anyone who finds these instructions confusing wait for the next driver version, which will likely support this feature on the Radeon 5-series cards.
 
Hmm... ok..

Well, I'm still benchmarking and the like; when I have better data I'll share...
 
Thanks for that news, kemical. With your help I have recently installed the software, ATI Morphological Anti-Aliasing. I await with interest your findings via benchmarking etc. :)
 
These two shots show the morph feature enabled and then disabled. I've arrowed the most obvious difference, as it is slight. I had disabled AA in both the game and on the GPU before starting the test.
Using Fraps to capture the above images, I noticed that there was no FPS drop-off when morph was enabled. This may not be the same for every game.
 
I also did a quick bench using Crysis..

Crysis @ 4xAA, Very High, morph disabled = 48 FPS

Crysis @ 0xAA, Very High, morph enabled = 49 FPS

Crysis @ 0xAA, Very High, morph disabled = 50 FPS

Crysis @ 4xAA, Very High, morph enabled = 48 FPS

Remember that running with 4xAA plus morph enabled is roughly equivalent to 8xAA.
To attain each figure, Crysis was benched three times at that particular setting.

It's still early days but so far things look promising. I'll be back with more benching later....
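
For anyone who wants to number-crunch figures like the ones above, the bookkeeping is simple; here's a small Python sketch that averages repeated runs per setting and reports the cost of enabling morph. The per-run numbers are placeholders matching the averaged figures posted above (the raw per-run values weren't posted), so substitute your own.

```python
from statistics import mean

# FPS from repeated benchmark runs per setting. These per-run values are
# placeholders; only the averaged figures were posted above.
runs = {
    "4xAA, morph off": [48, 48, 48],
    "4xAA, morph on":  [48, 48, 48],
    "0xAA, morph off": [50, 50, 50],
    "0xAA, morph on":  [49, 49, 49],
}

avg = {setting: mean(fps) for setting, fps in runs.items()}

for aa in ("4xAA", "0xAA"):
    off, on = avg[f"{aa}, morph off"], avg[f"{aa}, morph on"]
    cost = (off - on) / off * 100  # % FPS lost by enabling morphological AA
    print(f"{aa}: {off:.1f} -> {on:.1f} FPS ({cost:+.1f}% cost)")
```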
 
If you're a 64-bit user, I've uploaded a modded driver to Rapidshare:

Link Removed - Invalid URL
 