• Anisotropic Filtering revisited: Radeon HD 6800 and Geforce 400 series compared

    Anisotropic texture filtering was invented to improve image quality (not only) in 3D games. That improved image quality, however, has fallen victim not only to so-called optimizations but from time to time also to hardware bugs - for example the banding encountered in the HD 5000 series, which was confirmed at the launch of AMD's brand-new HD 6800 series of DirectX 11 compatible graphics processors. Today, I'll compare a bunch of screenshots from both the Radeon HD 6800 series (with Catalyst 10.10 WHQL) and Nvidia's Geforce 400 series (with Geforce 260.99 WHQL) and highlight some of the differences, not only with the highest possible quality settings in the driver but also with the default quality in the case of AMD's Radeon HD 6800.

    Reviews & Image Quality: victims of the optimization race

    Current graphics cards in the segment from 100 Euros upwards have plenty of texture units to spend a cycle or two on improving the texture filtering, yet extensive economizations were put in place on both Geforce and Radeon cards to win the ever so important benchmark reviews. What most of these reviews do not take into account, though, is image quality, so they end up comparing only the reported frames per second without evaluating the actual work done. That last part is very important, because both big IHVs have become quite adept at skipping texture filtering work in order to cross the finish line first, figuratively speaking. In some extreme cases, more than 25 percent of the work is left undone.

    Now, it's quite tricky to objectively evaluate image quality because there's no definitive standard. With healthy eyesight, a bit of experience, a few different pieces of hardware to compare against each other and a lot of time, you can nevertheless come to pretty conclusive results. While most consumers at least lack the stack of hardware, the real problem lies with reviewers, who should have at least three of the four points mentioned above at their command (and the eyesight can be remedied with a pair of glasses, if necessary). But almost all of them feel the pressure of having their respective reviews ready on launch day of a given product - and more often than not have only a few precious days to run pre-production hardware with pre-qualification drivers through a bunch of benchmarks, in the worst case on top of their day job.

    So it's quite understandable that some things are left on the table - mostly image quality, since it is the hardest to evaluate and, more importantly, the hardest to convincingly convey to readers. For some obscure reason, 99 out of 100 readers (my guess) trust a random number incorporated into a pretty benchmark bar to a far greater extent than they would a worded statement about image quality.

    HD 5000 shipped with a hardware bug - supposedly, HD 6800 fixes that
    As I've briefly mentioned before, HD 5000 cards (the Evergreen series) had a problem with some textures showing hard banding between two Mip-Map levels (different resolution levels of a given texture), which I took a closer look at here. Contrary to my earlier conclusion, AMD engineering confirmed that under specific circumstances the hardware in fact selected the wrong filter kernels, causing the abrupt end of one Mip level and the beginning of the next. As AMD presented at the HD 6800 launch with a separate slide in their deck, this problem was supposed to be fixed, the official wording being as follows:
    Improved Anisotropic Filtering
    • Refined Algorithm
    • Addresses visible discontinuities in very noisy textures
    • Smoother transitions between filter levels
    • Maintains full performance and angle independence
    Next to it, there's a dual screenshot showing the problem (HD5k) and its solution (HD6k).
    But there is still visible banding - albeit under slightly different circumstances, which I will show in just a second. First, though, I have to mention one thing: you need to look at image quality as a whole, not just a single aspect, and there are many pros to the cons I am going to concentrate on. But those have been repeated sufficiently in a gazillion reviews of the marketing slides, I guess.

    I am still using the ever-famous classic version of the so-called AF tester. That's a small program showing the view into a virtually endless tunnel made up of 200 faces, so it appears pretty round. You can fiddle around with various texture settings, and as I have shown in the other article (see link above), certain settings do a better job of showing the issue at hand than others, so I'll concentrate on those today rather than post all 158 screenshots I took.
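
    For those who want to reproduce the setup, the scene itself is simple. Below is a minimal sketch (in Python; the function name and parameters are my own, not taken from the actual tool) of how such a tunnel can be built: a ring of flat strips around the view axis, with the test texture tiled along its length, so that the footprint of each pixel gets more and more stretched towards the centre of the image - exactly the situation anisotropic filtering has to cope with.

        import math

        def tunnel_mesh(faces=200, segments=64, radius=1.0, length=100.0, v_repeat=32.0):
            # Build a tunnel out of `faces` flat strips around the z axis - enough of
            # them that it looks round, as in the AF tester. The viewer is assumed to
            # sit at the origin looking down +z. Returns triangle vertices and UVs.
            verts, uvs = [], []
            for i in range(faces):
                a0 = 2.0 * math.pi * i / faces
                a1 = 2.0 * math.pi * (i + 1) / faces
                for j in range(segments):
                    z0, z1 = length * j / segments, length * (j + 1) / segments
                    v0, v1 = v_repeat * j / segments, v_repeat * (j + 1) / segments
                    # one quad of the strip, split into two triangles
                    p = [(radius * math.cos(a0), radius * math.sin(a0), z0),
                         (radius * math.cos(a1), radius * math.sin(a1), z0),
                         (radius * math.cos(a1), radius * math.sin(a1), z1),
                         (radius * math.cos(a0), radius * math.sin(a0), z1)]
                    t = [(i / faces, v0), ((i + 1) / faces, v0),
                         ((i + 1) / faces, v1), (i / faces, v1)]
                    for k in (0, 1, 2, 0, 2, 3):
                        verts.append(p[k])
                        uvs.append(t[k])
            return verts, uvs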

    Below you'll find, from left to right, pictures of the Radeon HD 6850 using Catalyst 10.10 WHQL with the AI texture slider first set to "Quality" (which is the default), then set to "High Quality", a HD 5770 using Catalyst 10.9 WHQL with AI disabled (which should be analogous to 10.10's High Quality setting) and finally a comparative image from a GF108-based Geforce GT 430, which should by all means be identical to the images rendered by its larger brethren. I have to mention one thing first, though. Compared to earlier Catalyst drivers (and possibly earlier HD-series cards), AMD opted to decouple the texture economizations from other, useful optimizations by adding the mentioned Catalyst AI slider for texture filtering quality. Supposedly, with the slider set to High Quality all texture optimizations are turned off, as was already the case on HD 5800 with Catalyst 10.9 and AI set to default (see here and the following post). The Quality setting enables some trilinear and anisotropic optimizations that, as I have just shown, amount to up to 28 percent of work skipped (or performance boost, if you like the euphemism) - in "real games" the effect is less pronounced and closer to 5 to 7 percent. AMD claims they are designed to have no visible impact on image quality.
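
    AMD doesn't document what exactly the Quality setting does, so take the following only as an illustration of the general class of trick involved, not AMD's actual scheme: the classic way to save trilinear work is to blend between two mip levels only in a narrow band around the transition and to fetch a single level everywhere else ("brilinear" filtering), which halves the texture fetches over most of the screen. A minimal sketch, with the blend width as a made-up parameter:

        import math

        def trilinear(lod):
            # Full trilinear: always blend the two nearest mip levels.
            # Returns (finer level, weight given to the next-coarser level).
            base = int(math.floor(lod))
            return base, lod - base

        def brilinear(lod, blend_range=0.25):
            # Reduced trilinear ("brilinear", illustrative only): fetch a single
            # mip level except inside a narrow window around the transition point,
            # where the blend is remapped to the full 0..1 range.
            base = int(math.floor(lod))
            frac = lod - base
            lo, hi = 0.5 - blend_range / 2.0, 0.5 + blend_range / 2.0
            if frac < lo:
                return base, 0.0        # only the finer level is fetched
            if frac > hi:
                return base + 1, 0.0    # only the coarser level is fetched
            return base, (frac - lo) / (hi - lo)
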
    OK, enough talk, let's see the shots!

    This time, I'll start with 16x AF, no colorization of Mip-Maps ("normal") and the texture set to 3 right away:

    [Screenshots, left to right: Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "Quality", 16xAF | Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "High Quality", 16xAF | Radeon HD 5770, Catalyst 10.9 WHQL, AI disabled, 16xAF | Geforce GT 430, Geforce 260.99 WHQL, High Quality, 16xAF]
    Comparing High Quality on the HD 6850 and AI off on the HD 5770 (which should be comparable!), we see that where the latter card shows harsh transitions between filter levels, the former smoothes the transitions better. Comparing High Quality and Quality, the image suggests that the LOD is pushed back a bit, resulting in sharper textures but at the same time making the image more prone to texture shimmering, which would only show in movement though. The Geforce card has even slightly less visible Mip transitions (especially visible at the transition to the part displayed as a grey disc) and also more regular texture patterns, indicating a lower tendency to exhibit the dreaded shimmering.
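
    As a quick illustration of why a pushed-back LOD trades sharpness against shimmering, here is a minimal sketch of isotropic mip selection with a bias (toy numbers of my own, not measured driver behaviour): a negative bias picks a finer mip level, so more than one texel lands in each pixel - sharp, but undersampled, which shows up as shimmering once the scene moves.

        import math

        def mip_level(texels_per_pixel, lod_bias=0.0):
            # Isotropic mip selection: lod = log2(texel-to-pixel ratio) + bias.
            return math.log2(max(texels_per_pixel, 1.0)) + lod_bias

        # 8 texels of the base texture mapping onto one pixel:
        print(mip_level(8.0))        # 3.0 -> mip 3, about one texel per pixel
        print(mip_level(8.0, -0.5))  # 2.5 -> finer mips, ~1.4 texels per pixel
                                     #        remain: sharper, but shimmer-prone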

    Now, moving the texture to a setting of "2", thus increasing the detail or "noisiness":

    [Screenshots, left to right: Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "Quality", 16xAF | Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "High Quality", 16xAF | Radeon HD 5770, Catalyst 10.9 WHQL, AI disabled, 16xAF | Geforce GT 430, Geforce 260.99 WHQL, High Quality, 16xAF]
    Now it's starting to get really interesting. While we can once again observe the better transitions for HD 6800 compared to HD 5770 (and the rest of Evergreen), and "Quality" pushing the LOD further back than "High Quality", something else becomes visible as well: at the far edge of each transition band, there still is a visible, sharp border between texture resolutions. This is not the case in the Geforce image.

    Texture set to "1":

    [Screenshots, left to right: Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "Quality", 16xAF | Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "High Quality", 16xAF | Radeon HD 5770, Catalyst 10.9 WHQL, AI disabled, 16xAF | Geforce GT 430, Geforce 260.99 WHQL, High Quality, 16xAF]
    Still the same as above, just a tad more pronounced, and due to the higher texture graininess there are more bands visible.

    And finally, the hardest test: Texture set to 0.

    [Screenshots, left to right: Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "Quality", 16xAF | Radeon HD 6850, Catalyst 10.10 WHQL, AI Texture Slider "High Quality", 16xAF | Radeon HD 5770, Catalyst 10.9 WHQL, AI disabled, 16xAF | Geforce GT 430, Geforce 260.99 WHQL, High Quality, 16xAF]
    Here, we have a family reunion with the banding - no matter whether the driver is set to High Quality. It even first appears in the same place as on the HD 5770. Beyond the nearest band there's much more visible detail with HD 6800 and Geforce 400 (but also additional bands in the case of the former) compared to the HD 5770. Strangely enough, the moiré patterns (or what I can make out of them) seem more closely related between GF400 and Evergreen, whereas the repetitions on the HD 6800 look like they were rotated to some degree.


    The verdict

    Now, there's definitely an improvement with regard to the banding that plagued Evergreen - no doubt about that. Quite a bit seems to have changed in the way texture filtering is handled - either by HD6k or by the new Catalyst AI. The good news is that up until the most detailed structures, the banding present in the Evergreen series of GPUs is remedied to a degree where it isn't noticeable any more. The bad news is that the more detailed (call it noisy if you want) your texture gets, the higher the tendency for the banding to creep back in. Coming through the backdoor at first, it may in the worst case re-enter in full glory, as the last row of screenshots shows.

    Now, there's more. Seemingly, there is more visible detail in the more distant textures in the tunnel. Some people regard a uniform grey there as the optimum, indicating that enough sampling occurs to remove the artefacts still showing on Geforce 400. That would also mean that, apart from the harsh banding, Evergreen's texture filter is as good as it gets - at least with AI disabled - and HD6k is a step back with regard to texture quality.
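
    There is at least a plausible argument for the grey-is-correct camp. Taking the approximation from the EXT_texture_filter_anisotropic extension as a stand-in (actual hardware implementations differ in detail): once the pixel footprint is stretched beyond the 16:1 that 16x AF covers, the remaining compression has to be absorbed by a coarser mip level, which is exactly what averages the far end of the tunnel towards grey. Visible structure there would then mean the filter kept a finer mip and undersampled. A minimal sketch:

        import math

        def aniso_footprint(p_max, p_min, max_aniso=16):
            # p_max / p_min: texel-space extent of the pixel footprint along its
            # major and minor axis. Returns (number of probes along the major axis,
            # mip lod of each probe), following the EXT_texture_filter_anisotropic
            # approximation: n = min(ceil(p_max/p_min), max_aniso), lod = log2(p_max/n).
            n = min(math.ceil(p_max / max(p_min, 1e-6)), max_aniso)
            return n, math.log2(max(p_max / n, 1.0))

        print(aniso_footprint(8.0, 1.0))   # (8, 0.0)  -> fully covered, full detail
        print(aniso_footprint(64.0, 1.0))  # (16, 2.0) -> 16 probes can't cover a
                                           #              64:1 footprint, the lod
                                           #              rises and detail greys out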

    I cannot tell for sure who is right and who is wrong. Obviously, there's more research and testing to be done. Hopefully, other people (possibly more knowledgeable ones) can pick up where I left off and do a thorough investigation of what's going on in texture filter land.