You can view the page at http://www.gpu-tech.org/content.php/129-3D-Graphics-Benchmark-1997-style-Enter-Final-Reality-v1.1?
Some additional background info:
I'm quite sure that DirectX 5 is the correct number, as DirectX 6 was not out yet by the time the benchmark was released (the 1.01 build is dated January 28th, 1998; 1.0 is from early 1997, I believe).
It could actually even be an older version than DirectX 5.
The city scene is actually a reference to the legendary Second Reality demo (as is the name Final Reality):
http://www.youtube.com/watch?v=54XHDUOHuzU
The Finnish programmers who founded the Remedy company had a demoscene past. Futuremark/MadOnion was spun off after Final Reality, resulting in the 3DMark series (still working closely with Remedy, sharing the MAX-FX 3D engine, and sometimes using Max Payne content in benchmark scenes).
I think what you are referring to with 16-bit rendering is 'dithering' (unless you are having other problems; I can still run Final Reality 1.01 just fine on my GeForce GTX 460 under Windows 7 x64). Whether or not this is the 'correct' way to render 16-bit graphics is debatable: PowerVR cards, for example, would render internally at 32-bit and perform colour reduction on the final image, producing pretty much the same results as modern cards.
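To make the PowerVR-style approach concrete, here is a minimal sketch (in Python, just for illustration; the function names are mine, not from any actual driver) of reducing a full-precision blend result to 16-bit RGB565 only at the very end, so the blend itself never loses precision:

```python
def rgb888_to_rgb565(r, g, b):
    """Truncate 8-bit-per-channel colour to 16-bit RGB565 (5-6-5 bits)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rgb565_to_rgb888(c):
    """Expand RGB565 back to displayable 8-bit channels (bit replication)."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# Blend two greys at full precision first, quantise only the final value:
blended = (200 + 100) // 2   # full-precision 50/50 blend -> 150
final = rgb565_to_rgb888(rgb888_to_rgb565(blended, blended, blended))
print(final)                 # (148, 150, 148)
```

A card with a true 16-bit framebuffer would instead quantise after every blend operation, so errors accumulate; reducing once on the final image is why the PowerVR output looks closer to 32-bit rendering.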
But you are correct that most older videocards have a specific 16-bit mode in which alpha blending is done with dithering rather than actual blending of colours, which gives a distinctive look.
Since 16-bit modes have long been obsolete, modern videocards no longer support them and lack the dithering logic (although it could be simulated in a shader).
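As a rough sketch of what such a simulation could look like, here is ordered (Bayer) dithering in Python; the same per-pixel logic would map directly to a pixel shader. To be clear, the 4x4 Bayer matrix and the threshold scaling below are my assumptions for illustration, not the exact logic any specific old card used:

```python
# 4x4 Bayer ordered-dither matrix, values 0..15 (a common choice; an
# assumption here, not taken from any particular card's hardware).
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value, x, y, bits):
    """Quantise an 8-bit channel down to `bits` bits, using an
    ordered-dither threshold that depends on the pixel position."""
    levels = 1 << bits
    step = 255 / (levels - 1)                  # size of one quantisation step
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
    q = int(value / step + threshold)          # round up or down per pixel
    return min(levels - 1, q) * step           # back to the 0..255 range

def dither_rgb565(pixel, x, y):
    """Dither an (r, g, b) pixel to the 5-6-5 bit depths of RGB565."""
    r, g, b = pixel
    return (dither_channel(r, x, y, 5),
            dither_channel(g, x, y, 6),
            dither_channel(b, x, y, 5))
```

In a shader you would do the same per fragment, indexing the matrix with the fragment's screen coordinates; neighbouring pixels then snap to different quantisation levels, which is exactly the checkerboard-ish pattern the old 16-bit modes produced.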
It looks like YouTube did not do a very good job of preserving the dithered look, however.