Implementing DirectX 7 light attenuation in Unity’s Universal Render Pipeline

I am creating a modern Unity-based client for an old MMO from 2000, and I am working on implementing its lighting system. Based on the data I have reverse engineered, it looks like it used a custom attenuation formula. The original client uses DirectX 7, and I have found a reference sheet for DX7 lighting:

http://developer.download.nvidia.com/assets/gamedev/docs/GDC2K_D3D_7_Vert_Lighting.pdf

The relevant info starts on page 17. In URP's Lighting.hlsl I have set up linear attenuation like this:

float range = rsqrt(distanceAndSpotAttenuation.x); // .x seems to store 1 / (range * range), so rsqrt recovers the range
float dist = distance(positionWS, lightPositionWS);
half attenuation = 1.0f - dist / range;

Looks good! https://i.imgur.com/5bfJ5fF.png

That works as expected. Now, based on the document, I am trying to recreate that same linear attenuation with the DX7 formula. It should look something like this:

// DX7-style attenuation: 1 / (c0 + c1*d + c2*d*d), per the doc
float range = rsqrt(distanceAndSpotAttenuation.x);
float dist = distance(positionWS, lightPositionWS);
half c0 = 1.0f; // constant term
half c1 = 1.0f; // linear term
half c2 = 0.0f; // quadratic term
half d = dist / range; // normalized distance -- is this what the formula expects?
half attenuation = 1.0f / (c0 + c1 * d + c2 * d * d);

According to the doc this should give me linear attenuation, but it ends up bleeding light way past the range of the light. With these coefficients the result never actually reaches zero: at the edge of the range (d = 1 with my normalized distance) it is still 1 / (1 + 1) = 0.5. So I am convinced the distance I am feeding in is wrong. What kind of value is expected here? A normalized distance, the raw world-space distance? The document doesn’t specify. I have played around with the distance a lot: I set it to the attenuation from the first code example, to the raw value, and to a bunch of other values, with no luck: https://i.imgur.com/JWJ0HF2.png
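
For reference, here is one of the variants I tried. It just swaps in the raw world-space distance and keeps the same coefficients (the coefficient values are guesses on my part, matching the constant/linear/quadratic terms in the doc):

// Variant I tried: raw world-space distance instead of normalized
float range = rsqrt(distanceAndSpotAttenuation.x);
float dist = distance(positionWS, lightPositionWS);
half c0 = 1.0f;
half c1 = 1.0f;
half c2 = 0.0f;
half d = dist; // raw distance, not dist / range
half attenuation = 1.0f / (c0 + c1 * d + c2 * d * d);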

Does anyone have any ideas?