• Radeon HD 6990 & Geforce GTX 590: What's the purpose of Crossfire/SLI-on-a-stick really?

    With both cats now out of the bag, to literally translate a German proverb, I wonder more and more what these cards, essentially Crossfire/SLI-on-a-stick, are really made for. My guess is one thing only: get reviews for "the fastest graphics card" - well, Nvidia failed at that this time - and let that halo shine over the less powerful products. My reasoning is as follows.

    Horsepower? Both companies have quite fast and proven single-GPU cards (Geforce GTX 580 and Radeon HD 6970, in order of release date) that can be run in pairs, triplets and, in AMD's case, even quartets. So people who need more graphics horsepower than the fastest single-GPU card can provide can already step up quite a bit without the Radeon HD 6990 and Geforce GTX 590. Additionally, both cards are clocked lower than their respective single-GPU brethren - more so for the Geforce GTX 590, less so for the Radeon HD 6990 (which, even in its 450-watt mode, if you are about to argue that point, still has lower memory clocks than a regular Radeon HD 6970). So I cannot see graphics horsepower per system as the primary reason, with the small exception that really rich people can only team up four Nvidia GPUs by using two dual-GPU cards. But I'd wager that is a design choice by Nvidia, not something inherently impossible with their products - letting four GPUs work together.

    Cost? Both cards sell for enormous prices: about 550 EUR for the Radeon HD 6990, which has already had more than two weeks of availability and time for street prices to adjust, and around 640 EUR for the Geforce GTX 590, where prices have not settled yet and could climb higher still if you follow the rumour that only limited quantities will be available. In total, that's not much cheaper than two Radeon HD 6970s or Geforce GTX 570s would cost (in SLI, the latter are performance-wise closer to a Geforce GTX 590 than two Geforce GTX 580s would be). Enthusiasts being the target group for these cards anyway, I doubt there's a group of relevant size not willing or able to upgrade their motherboard to one with two PCIe slots (which could boost performance ever so slightly again) - if they do not own one already, since many enthusiast-level boards come with full Crossfire and oftentimes full SLI support as well.

    Power? Ok, it sounds a bit ridiculous to connect ecological aspects to the use of multiple high-end chips for higher fps rates in games. But let me consider it nevertheless: both cards utilize chips screened for low leakage and - in the case of the Geforce GTX 590 - for operability at reduced voltages, thus requiring less power than two single cards of similar performance. Point taken, BUT: the screening could be done without dual-GPU cards requiring it, and the resulting Radeon HD 6970 LP or Geforce GTX 570 Eco would also require significantly less power. Moreover, those cards would, or at least could, be of interest to more people, thus having a greater effect on planet welfare. AMD and Intel have been doing this with CPUs for a long time, even with high-performance models in the server space. Personally, I'd buy a low-voltage version of any given card over an overclocked version of the same card any day - it saves me the trouble of fiddling with the BIOS to do that myself.

    Noise? Nvidia seems to have loudness pretty well in check compared to AMD, but being less noisy than the fastest model has been a trait of the second-fastest cards for a long time. The Geforce GTX 590 is barely louder than a Geforce GTX 580, if at all - ironically not in idle mode, where you'd get the least from your dual-GPU card; the Radeon HD 6990 is only slightly better in this regard when doing nothing. But there are two catches. First, noise doesn't simply sum up: even if you put two, say, 5.6-sone cards to work in tandem, the result is not 11.2 sone but something in between. Second, both the Radeon HD 6990 and the Geforce GTX 590 have the less-than-desirable trait of blowing the hot air from one of the GPUs, and from part of the power circuitry as well, back into the case - directly against the suggested airflow direction of most cases. I have yet to see the effects of that over prolonged gaming sessions, but odds are it's not helping temperatures, thus increasing both fan noise and power consumption. In contrast, when you put two reference cards like the GTX 570 and HD 6970 to work in tandem in your system, the cooler of each card blows out its hot air individually, leaving only the pretty much unavoidable residual heat from memory and power circuitry inside the case.
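    To put a rough number on the first catch, here is a minimal back-of-envelope sketch - my own illustration, not a measurement - assuming two equally loud, incoherent sources and the usual Stevens mapping between sone and phon, with the +3 dB from a second source treated as +3 phon (an approximation that only roughly holds):

```python
# Rough estimate of how loudness in sone combines when a second, equally loud
# card is added. Assumptions: incoherent sources, and the ~3 dB rise in sound
# pressure level maps roughly one-to-one onto the phon scale (fair near 1 kHz).
import math

def sones_to_phons(sones):
    # Stevens' relation: N [sone] = 2 ** ((L - 40) / 10) for L >= 40 phon
    return 40 + 10 * math.log2(sones)

def phons_to_sones(phons):
    return 2 ** ((phons - 40) / 10)

single_card = 5.6                              # sone, the example value above
level = sones_to_phons(single_card)            # ~64.9 phon
combined_level = level + 10 * math.log10(2)    # second identical source: +3 dB
combined = phons_to_sones(combined_level)

print(f"one card:  {single_card:.1f} sone")
print(f"two cards: {combined:.1f} sone (not {2 * single_card:.1f})")
# -> roughly 6.9 sone: clearly louder, but nowhere near double
```

    So "something in between" lands much closer to the single-card figure than to the doubled one.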


    I seriously cannot come up with other reasons or aspects that could make a dual-GPU card attractive to anyone, assuming that person is willing to spend 500 EUR or more on a single graphics card. You can drive multiple displays with two cards just as well, if not better, when relying on the still prevalent DVI output (that's four DVI ports against one on the Radeon HD 6990 or three on the Geforce GTX 590). You can drive two independent displays, and the GPUs will be able to enter their most efficient power state individually even when using different resolutions and timings for the displays. This is something that left me a little baffled when testing the power consumption of both cards at my day job: AMD and Nvidia could have routed the display outputs, or at least part of them, individually from both GPUs, making sure that each GPU drives a single display when two are connected and can enter its deepest slumber - yet they didn't bother. But I guess that falls under the premise from above: enthusiasts don't care about power, they just want more of it. Furthermore, in the case of the Geforce, you can assign one of your GPUs to do the dirty PhysX work. Still no advantages for the duallies-on-their-respective-sticks.

    Come to think of it, two individual cards even have advantages beyond the clock speeds:
    • GPU computing. Serious tasks are often bound, or at least impaired, by the bandwidth to the host system. Two individual PCIe x16 slots should be more capable than one link shared between two GPUs (see the rough sketch after this list).
    • Fault tolerance. When one card breaks and you have to send it in for RMA, you can still keep playing with the other one.
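    For the GPU-computing point above, a crude bandwidth comparison - purely illustrative, assuming PCIe 2.0 with roughly 500 MB/s of usable bandwidth per lane and direction, and a dual-GPU card whose onboard bridge shares a single x16 link between both GPUs; real-world numbers depend on chipset, bridge and driver:

```python
# Back-of-envelope host-bandwidth comparison: one shared x16 link versus two
# dedicated x16 slots. Figures assume PCIe 2.0 (~500 MB/s per lane, per direction).
PER_LANE_GBPS = 0.5
LANES = 16

single_link = PER_LANE_GBPS * LANES     # one x16 link: ~8 GB/s per direction
dual_card_shared = single_link          # both GPUs behind one bridge share this
two_cards_dedicated = 2 * single_link   # two cards in two x16 slots: ~16 GB/s aggregate

print(f"dual-GPU card, shared x16:       ~{dual_card_shared:.0f} GB/s to the host")
print(f"two single cards, two x16 slots: ~{two_cards_dedicated:.0f} GB/s aggregate")
```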

    And I am not even mentioning typical multi-GPU problems like a stronger dependency on driver profiles, or this:
    http://www.youtube.com/watch?v=zOtre...el_video_title

    Little personal anecdote/blast from the past: back in the day, when fast internet in Germany meant having a 56k modem (that's kilobit, not byte, for all you younger people), we'd often meet at a friend's house for a weekend-long multiplayer session - or LAN party, if you like. Another guy and I each had two 3Dfx (still with a capital D) Voodoo 2 cards working in SLI, but for the weekend we spread them out over four machines to keep things fair and let everyone play the same games at mostly the same smoothness.