In 1997, a 3D accelerator would not necessarily accelerate your 3D graphics - just remember S3's inglorious ViRGE series, which was dubbed a 3D decelerator for a reason. But if you were one of the lucky few and had opted for 3dfx's Voodoo Graphics, an ATi Rage, an Nvidia Riva 128 or a few select others, then you could impress your friends with benchmarks like Final Reality, a co-development of Remedy Entertainment and VNU European Labs.

The benchmark uses DirectX, although I could not confirm which exact version; chances are it is either 5 or 6, with the former being more likely. Besides the 2D tests - which were basically just CPU tests in the form of a radial blur and a chaos zoomer - it offers a few scenes with actual 3D rendering, such as a spinning spaceship showing off simple texturing, environment mapping and transparency. Next up is a scene with a mech walking menacingly through a valley base lit by moving lights. After that, you follow the path of a small glider through a futuristic though deserted city with neon lights, ads (among them one for Max Payne, Remedy's upcoming 3D shooter) and a layer of fog/clouds.

In the advanced section, there's also an AGP test, where you can select the size of the textures used to see whether or not your system really was using AGP texturing - then a much-hyped capability supposed to let graphics cards make do with only 4 MiByte of local memory. One of the early chips that could really shine in this particular test was Intel's and Real3D's i740 and, later in 1998, S3's Savage3D.

The features used in the benchmark were as follows (according to the benchmark itself):

  • Bilinear Texture Filtering
  • Z-Buffer Sorting
  • Texture Mip-Mapping
  • Texture Tri-Linear Mip-Mapping
  • Depth Fog
  • Specular Gouraud
  • Vertex Alpha
  • Alpha Blending (Crossfade)
  • Additive Alpha (Lighten)
  • Multiplicative Alpha (Darken)
  • Subpixel Accuracy
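The three alpha modes in that list map onto well-known per-channel blend equations. As a minimal sketch (not taken from the benchmark; function names and normalised [0, 1] channel values are my own illustration):

```python
def crossfade(src, dst, alpha):
    # classic alpha blending: linear interpolation between source and destination
    return src * alpha + dst * (1.0 - alpha)

def additive(src, dst):
    # "lighten": add source to destination, clamped at full brightness
    return min(src + dst, 1.0)

def multiplicative(src, dst):
    # "darken": multiply channels, result never brighter than either input
    return src * dst
```

Crossfade is the workhorse for transparency effects, while additive blending is what makes the neon lights and glows in scenes like the city flythrough pop.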


The scores achieved are barely reproducible today, because they depend almost entirely on the CPU used. So, outside of a pure retro system with a Pentium, Pentium MMX (yes, Final Reality already made use of it) or at most a Pentium II-class processor, the values are worthless.

Modern graphics cards, on the other hand, can no longer correctly display the 16-bit rendering that Final Reality uses. Initially, I was under the impression that only DX10 parts lacked this backwards compatibility, but when I tried running the benchmark on a GeForce 6200A, there was also some banding that wasn't supposed to be there. So I used my HD 5870 to capture the 2D part and an ancient GeForce 3 for the 3D scenes. Since Fraps was running in the background, grabbing 30 uncompressed frames per second from the video card's memory, the performance seen in the video is not indicative of actual performance in the benchmark.
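The banding itself comes from the reduced colour depth: in a typical 16-bit mode such as RGB565, each channel keeps only 5 or 6 bits, so a smooth 256-step gradient collapses into a few dozen visible bands. A minimal sketch of that quantisation (my own illustration, not related to any actual driver code):

```python
def to_rgb565(r, g, b):
    # quantise 8-bit channels down to 5/6/5 bits and expand back,
    # as a 16-bit framebuffer effectively does
    r5 = (r >> 3) << 3
    g6 = (g >> 2) << 2
    b5 = (b >> 3) << 3
    return r5, g6, b5

# a 256-step red gradient survives with only 32 distinct levels,
# which is exactly the stepping you perceive as banding
levels = {to_rgb565(v, 0, 0)[0] for v in range(256)}
```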


To make this a little more of a look into the past, I have compiled a video depicting the benchmark as faithfully as I could. The 2D part had to be recorded separately from the 3D part, because Fraps would display garbled colours and a double image as soon as 3D mode was entered. Also, to preserve the retro impression, I opted not to scale the resolution up from the original 640 x 480 pixels, instead embedding it in a 720p frame to give you a feeling of how far this benchmark already is even from today's lower standards.