Nvidia XGL V. 4

And then there were three…

Nvidia's 750XGL
Imagine my chagrin when, just a few days after I posted my review of the FireGL 8800 from ATI, I received an email from a random animator on the Maya listserv taunting me about the Nvidia 750XGL card and how its numbers were supposed to beat those of the FireGL 8800 I had just bought with my own blood money. It was doubly aggravating for me since it was only an email and I had no one's neck I could immediately put my foot on in blind fury.

But, as fate would have it, another email came flying into my Inbox a day later from a kindly person who was, in effect, waving a candy bar in front of a sugar-crazed toddler. "Hey," she more or less wrote, "you wanna review an Nvidia 750XGL card?" I'm not sure how long it's supposed to take the human vocal cords to generate the word yes, but I'm pretty sure my answer was faster. Then I realized I was talking to my computer screen, so I composed an email response, strongly flavored with the affirmative tone and a little Amaretto. About a week later I had sitting on my desk, not the lovely Cate Blanchett whom you should usually find sitting on my desk, but the Nvidia 750XGL and 900XGL cards. Both are dual-display, workstation-class video accelerators based on the Quadro4 chipset from Nvidia, sporting 128MB of DDR memory. And they both look almost as tasty as Cate holding out a tray of gin and cookies.

The cards are almost exactly the same physically, except for their external connectors. The 750XGL sports an analog VGA connector and a DVI port, while the 900XGL sports two DVIs. Both cards support dual analog, dual digital, or mixed analog/digital displays using DVI-to-VGA adapters.

When we last spoke (well, you read, I wrote), I had replaced my GVX1 cards with the FireGL 8800, which proved quite the improvement in my computing experience. But I wondered how much more of an improvement the Nvidias would provide. Since I knew the performance of all three cards would be pretty similar, I buckled and finally decided to research 3D benchmarks, despite my distaste for them, so I could take some more precise measurements.

I installed each of the three cards into the same system, one after another over about a six-week period. During that time I must have cut myself on the edges of my case at least three times, dropped all sorts of things into it and had to fish them all out, and stuck a finger into one of the 92mm high-powered, knuckle-busting case fans at least 18 times (I look like I teach wood shop). At this point my crazy Apple-loving friends would point out that none of this would ever happen if I owned an easy-to-operate G4, to which I would respond by throwing my chair at them. But that's another story entirely.

Now, one of my initial attractions to the FireGL 8800, funny enough, was the card's size. I originally owned a GVX1 card, which reached from the back of my system all the way to the front, where it overlapped the back of one of my hard drives, and was very annoying to get into my case. The ATI card was half that length and fit easily without my having to remove the drive cage or unplug the ATA cables (but that's only because my hog of a system is lovingly crammed full of crap). The Nvidia cards were almost the length of my old GVX1, but fitting them was no big deal with a little aggressive cable management. After backing up my C: drive with Norton Ghost, I fitted the 750XGL into my Soyo Dragon Ultra's AGP slot.

I powered up PentiKoosh (my workstation) and immediately noticed a relatively louder, higher-pitched whine from the case, due to the heatsink fan on the Nvidia card. Since my system sits in my Les Nessman-inspired office, on the other side of a curtain partition from my living room, I really prefer things to be quiet in my office corner.

Immediately, though, the ZM17Cu VGA cooler made by the folks at Zalman came to mind. This all-copper heatsink sits on the GPU and disperses heat laterally across its length, and it does without an attached fan, which is the source of the high-pitched whine. Installing it would negate the extra noise, so I decided to ask the folks at Zalman for one, and they obliged quite graciously. But more on that in a later article; I've got things to do here first.

Once I slapped the 750XGL in there, I booted up and began the driver installation. Easy as pie. Well, easier than that, since I couldn't bake store-bought cookie dough with a full manual and a how-to video. The installation of the Nvidia drivers was a snap, as was their dual-display management software, nView, similar in many ways to ATI's HydraVision. Within one more reboot I had the resolution I like (1280x1024) on both monitors, for a total of 2560x1024. Who's yer daddy?

Now, simple tooling around Windows and general applications will net you absolutely no discernible performance difference between the 8800 and the 750XGL. Where these cards shine is in 3D, and that's exactly where I went as soon as I regained control of my mouse after the reboot.

I'm not one for benchmarks and tables of numbers, but I knew I would have to run them to tease some performance differences out of these cards. I downloaded a few different benchmarks to hit as many aspects of these cards as I could: SPEC Viewperf 7.0's 3D Studio Max 4.2 test, the 3DMark 2001 gaming benchmark, and Ziff Davis' 3D WinBench 2000 v1.1.

More important to me are real-world results, so before I ran any of those benchmarks, I loaded Maya 4.02 and opened up the 3D test scene as a real-world benchmark. I wanted to know exactly how fast this file would play back on all three of these cards. I was also curious about the cards' hardware particle performance, so I devised a benchmark of my own to test Maya particle quality and playback speeds.
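For a timed playback run like this, the number I actually compare across cards boils down to average frames per second: how many frames the scene played divided by how long it took. A minimal sketch of that arithmetic (the function name and the sample figures here are my own illustration, not part of Maya or any benchmark suite):

```python
def playback_fps(frame_count, elapsed_seconds):
    """Average frames per second over a timed playback run.

    frame_count: total frames played back
    elapsed_seconds: wall-clock time for the run
    """
    if elapsed_seconds <= 0:
        raise ValueError("elapsed time must be positive")
    return frame_count / elapsed_seconds

# Hypothetical example: a 300-frame scene that takes 12.5 seconds
# to play back averages 24 frames per second.
print(playback_fps(300, 12.5))
```

Run each card through the same scene, note the elapsed time with a stopwatch (or a script), and the resulting fps figures are directly comparable.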

