Friday, December 11, 2009

NVidia: 3 Ben: 0

This is getting embarrassing - I'm at risk of getting shut out. I was able to fix the "Null texture, how" error users were seeing on NVidia hardware.

It turns out it was an uninitialized variable in code that was never used until NV changed their drivers. As far as I can tell, NV dropped support for FSAA in 16-bit mode a few months ago, at least on some of their newer GPUs. (It is also possible that the incantation necessary to get FSAA has changed a lot and I simply don't know what it is.)

So the dialog between X-Plane and the video card ran something like the Monty Python cheese shop sketch:
X-Plane: So ... can you do full screen anti-aliasing?
GeForce 8: Oh yes, of course! (Please, I'm a GeForce 8 card.)
X-Plane: Splendid! So...how about 16x FSAA?
GeForce 8: Sorry, I can't do that.
X-Plane: Ah. How about 8x FSAA?
GeForce 8: Sorry, can't do that either.
X-Plane: I see. Well then, how about 4x FSAA?
GeForce 8: Nope.
X-Plane: 2x FSAA?
GeForce 8: No way.
X-Plane: Ah. I see.
At this point in the dialog, X-Plane would promptly lose track of what it had been doing in the setup process, throw out its notes on the GPU setup, and then freak out a bit later when it realized its note-taking left something to be desired.

This is the first case I've hit where a video card advertises FSAA and can't actually do it.
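To make the failure mode concrete, here is a minimal sketch of the kind of fall-back loop involved. This is not X-Plane's actual setup code (the sim goes through whatever window-system path the platform provides); it's an illustration using an FBO-based multisample probe, and probe_fsaa is a hypothetical helper. The key point is the first line of the function: the result is initialized to "no FSAA" before the loop, so a card that advertises FSAA but rejects every sample count degrades cleanly instead of leaving garbage behind.

// Sketch only - assumes a GL 3.0+ context and an extension loader
// (e.g. GLEW) already initialized.
#include <GL/glew.h>

// Returns the sample count actually achievable: 16, 8, 4, 2, or 0 (no FSAA).
int probe_fsaa(int width, int height)
{
    int chosen_samples = 0;                      // default: no FSAA (the fix!)

    GLuint fbo = 0, color_rb = 0;
    glGenFramebuffers(1, &fbo);
    glGenRenderbuffers(1, &color_rb);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glBindRenderbuffer(GL_RENDERBUFFER, color_rb);

    const int try_samples[] = { 16, 8, 4, 2 };
    for (int s : try_samples)
    {
        while (glGetError() != GL_NO_ERROR) {}   // clear any stale errors

        // Ask the driver for an s-sample color buffer...
        glRenderbufferStorageMultisample(GL_RENDERBUFFER, s, GL_RGBA8,
                                         width, height);
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                  GL_RENDERBUFFER, color_rb);

        // ...and believe it only if it actually worked.
        if (glGetError() == GL_NO_ERROR &&
            glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE)
        {
            chosen_samples = s;
            break;
        }
        // Driver said no - fall through and ask for fewer samples.
    }

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
    glDeleteRenderbuffers(1, &color_rb);
    return chosen_samples;
}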

Anyway, if you have hit this bug:
  1. Update to 941 final - it should fix it.

  2. Stop trying to run with FSAA and 16-bit color. This is a somewhat crazy combination. FSAA attempts to clean up rendering artifacts at the cost of fill rate. 16-bit color creates artifacts to save fill rate. If your GPU needs 16-bit color to run at high framerate, it's time to turn FSAA off.

(I realize that 16-bit color and aliasing are different kinds of artifacts, and some users might prefer harsh color transitions to harsh polygon transitions. But I still say, go for 32-bit color, no FSAA. When the sim is running in 16-bit mode, a good chunk of the sim still runs in 32-bit mode, because 16-bit RGB surfaces only have 1 bit of alpha.* So you're not quite getting universal savings, but you do get 16-bit output colors, so the results look universally bad.)

*This assumes 5551 or 565 pixels. There is a 4-bit alpha, 16-bit color format, cleverly called 4444, but if you thought 16-bit looked bad...
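For the curious, here is roughly how those 16 bits get divided up. This is just an illustration of the formats named above (assuming 8-bit input channels), not the sim's texture code; with 565 there is no alpha at all, and with 5551 there is exactly one bit of it.

#include <cstdint>

// RGB565: 5 bits red, 6 bits green, 5 bits blue, no alpha.
inline uint16_t pack_565(unsigned r, unsigned g, unsigned b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

// RGBA5551: 5 bits each of R/G/B, plus a single on/off alpha bit.
inline uint16_t pack_5551(unsigned r, unsigned g, unsigned b, unsigned a)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 3) << 6) | ((b >> 3) << 1) | (a >> 7));
}

// RGBA4444: 4 bits per channel - real alpha, but even chunkier color.
inline uint16_t pack_4444(unsigned r, unsigned g, unsigned b, unsigned a)
{
    return (uint16_t)(((r >> 4) << 12) | ((g >> 4) << 8) | ((b >> 4) << 4) | (a >> 4));
}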

2 comments:

Anonymous said...

People still run games in 16-bit color??? Wow ... reminds me of the days when I used a dial-up modem.

Anonymous said...

I was actually about to ask for nice 8-bit paletted colors. With pixel shaders.