This tutorial covers **textures for light attenuation** or — more generally spoken — textures as lookup tables.

It is based on Section “Cookies”. If you haven't read that tutorial yet, you should read it first.

### Texture Maps as Lookup Tables

One can think of a texture map as an approximation to a two-dimensional function that maps the texture coordinates to an RGBA color. If one of the two texture coordinates is kept fixed, the texture map can also represent a one-dimensional function. Thus, it is often possible to replace mathematical expressions that depend only on one or two variables by lookup tables in the form of texture maps. (One limitation is that the resolution of such a lookup table is bounded by the size of the texture image; therefore, the accuracy of a texture lookup might be insufficient.)

The main advantage of using such a texture lookup is a potential gain of performance: a texture lookup doesn't depend on the complexity of the mathematical expression but only on the size of the texture image (to a certain degree: the smaller the texture image the more efficient the caching up to the point where the whole texture fits into the cache). However, there is an overhead of using a texture lookup; thus, replacing simple mathematical expressions — including built-in functions — is usually pointless.

Which mathematical expressions should be replaced by texture lookups? Unfortunately, there is no general answer because it depends on the specific GPU whether a specific lookup is faster than evaluating a specific mathematical expression. However, one should keep in mind that a texture map is less simple (since it requires code to compute the lookup table), less explicit (since the mathematical function is encoded in a lookup table), less consistent with other mathematical expressions, and has a wider scope (since the texture is available in the whole fragment shader). These are good reasons to avoid lookup tables. However, the gains in performance might outweigh these reasons. In that case, it is a good idea to include comments that document how to achieve the same effect without the lookup table.
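To illustrate the idea outside of shaders, here is a minimal sketch in plain JavaScript (all names are made up for illustration; this is not Unity API code): it tabulates a one-variable function at a fixed resolution and then approximates the function by linearly interpolating between neighboring table entries, which mimics a bilinearly filtered texture lookup with one texture coordinate kept fixed.

```javascript
// Sketch: replacing a one-variable function by a lookup table.
// All names here are hypothetical; this is not shader or Unity code.

// Build a table of "resolution" samples of f over [0, 1],
// sampled at texel centers as a GPU does: (i + 0.5) / resolution.
function buildTable(f, resolution) {
  var table = new Array(resolution);
  for (var i = 0; i < resolution; i++) {
    table[i] = f((i + 0.5) / resolution);
  }
  return table;
}

// Approximate f(x) for x in [0, 1] by linear interpolation
// between the two nearest table entries (cf. bilinear filtering
// with clamped texture coordinates).
function lookup(table, x) {
  var pos = x * table.length - 0.5;      // continuous texel position
  var i = Math.floor(pos);
  var frac = pos - i;
  var i0 = Math.min(table.length - 1, Math.max(0, i));     // clamp
  var i1 = Math.min(table.length - 1, Math.max(0, i + 1)); // clamp
  return table[i0] * (1.0 - frac) + table[i1] * frac;
}

// Example: tabulate the attenuation-like function (1 - x)^2
// with 16 entries; lookup(table, 0.25) is then close to 0.5625.
var table = buildTable(function (x) { return (1.0 - x) * (1.0 - x); }, 16);
```

The accuracy of `lookup` depends only on the table size, not on how expensive the tabulated function is; this is exactly the trade-off discussed above.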

### Unity's Texture Lookup for Light Attenuation

Unity actually uses a lookup texture `_LightTextureB0` internally for the light attenuation of point lights and spotlights. (Note that in some cases, e.g. point lights without cookie textures, this lookup texture is set to `_LightTexture0` without `B`. This case is ignored here; thus, you should use spot lights to test the code.) In Section “Diffuse Reflection”, it was described how to implement linear attenuation: we compute an attenuation factor that includes one over the distance between the position of the light source in world space and the position of the rendered fragment in world space. In order to represent this distance, Unity uses the z coordinate in light space. Light space coordinates have been discussed in Section “Cookies”; here, it is only important that we can use the Unity-specific uniform matrix `_LightMatrix0` to transform a position from world space to light space. Analogously to the code in Section “Cookies”, we store the position in light space in the vertex output parameter `posLight`. We can then use the z coordinate of this parameter to look up the attenuation factor in the alpha component of the texture `_LightTextureB0` in the fragment shader:

```
float distance = input.posLight.z;
   // use z coordinate in light space as signed distance
attenuation = tex2D(_LightTextureB0, float2(distance, distance)).a;
   // texture lookup for attenuation
// alternative with linear attenuation:
// float distance = length(vertexToLightSource);
// attenuation = 1.0 / distance;
```

Using the texture lookup, we don't have to compute the length of a vector (which involves three squares and one square root) and we don't have to divide by this length. In fact, the actual attenuation function that is implemented in the lookup table is more complicated in order to avoid saturated colors at short distances. Thus, compared to a computation of this actual attenuation function, we save even more operations.
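The two paths can be contrasted in plain JavaScript (a sketch with hypothetical names; the array `table` stands in for the alpha channel of `_LightTextureB0`, and the argument `z` for the light-space coordinate `input.posLight.z`):

```javascript
// Sketch of the two attenuation paths discussed above, in plain
// JavaScript (hypothetical names; not shader code).

// Direct computation: three squares, one square root, one division.
function linearAttenuation(vertexToLightSource) {
  var d = Math.sqrt(
    vertexToLightSource[0] * vertexToLightSource[0] +
    vertexToLightSource[1] * vertexToLightSource[1] +
    vertexToLightSource[2] * vertexToLightSource[2]);
  return 1.0 / d;
}

// Lookup: a single fetch from a precomputed table, indexed by a
// distance-like coordinate in [0, 1].
function lookupAttenuation(table, z) {
  var i = Math.round(z * (table.length - 1));     // nearest entry
  i = Math.min(table.length - 1, Math.max(0, i)); // clamp, cf. wrapMode
  return table[i];
}
```

Whatever function is baked into the table, the cost of `lookupAttenuation` stays the same; that is why the savings grow when the tabulated attenuation function is more complicated than `1.0 / d`.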

### Complete Shader Code

The shader code is based on the code of Section “Cookies”. The `ForwardBase` pass was slightly simplified by assuming that the light source is always directional without attenuation. The vertex shader of the `ForwardAdd` pass is identical to the code in Section “Cookies”, but the fragment shader includes the texture lookup for light attenuation, which is described above. However, the fragment shader lacks the cookie attenuation in order to focus on the attenuation with distance. It is straightforward (and a good exercise) to include the code for the cookie again.

```
Shader "Cg light attenuation with texture lookup" {
   Properties {
      _Color ("Diffuse Material Color", Color) = (1,1,1,1)
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1)
      _Shininess ("Shininess", Float) = 10
   }
   SubShader {
      Pass {
         Tags { "LightMode" = "ForwardBase" }
            // pass for ambient light and
            // first directional light source without attenuation

         CGPROGRAM

         #pragma vertex vert
         #pragma fragment frag

         #include "UnityCG.cginc"
         uniform float4 _LightColor0;
            // color of light source (from "Lighting.cginc")

         // User-specified properties
         uniform float4 _Color;
         uniform float4 _SpecColor;
         uniform float _Shininess;

         struct vertexInput {
            float4 vertex : POSITION;
            float3 normal : NORMAL;
         };
         struct vertexOutput {
            float4 pos : SV_POSITION;
            float4 posWorld : TEXCOORD0;
            float3 normalDir : TEXCOORD1;
         };

         vertexOutput vert(vertexInput input)
         {
            vertexOutput output;

            float4x4 modelMatrix = _Object2World;
            float4x4 modelMatrixInverse = _World2Object;
               // multiplication with unity_Scale.w is unnecessary
               // because we normalize transformed vectors

            output.posWorld = mul(modelMatrix, input.vertex);
            output.normalDir = normalize(
               mul(float4(input.normal, 0.0), modelMatrixInverse).xyz);
            output.pos = mul(UNITY_MATRIX_MVP, input.vertex);
            return output;
         }

         float4 frag(vertexOutput input) : COLOR
         {
            float3 normalDirection = normalize(input.normalDir);

            float3 viewDirection = normalize(
               _WorldSpaceCameraPos - input.posWorld.xyz);
            float3 lightDirection =
               normalize(_WorldSpaceLightPos0.xyz);

            float3 ambientLighting =
               UNITY_LIGHTMODEL_AMBIENT.rgb * _Color.rgb;

            float3 diffuseReflection =
               _LightColor0.rgb * _Color.rgb
               * max(0.0, dot(normalDirection, lightDirection));

            float3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = float3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = _LightColor0.rgb
                  * _SpecColor.rgb * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }

            return float4(ambientLighting + diffuseReflection
               + specularReflection, 1.0);
         }

         ENDCG
      }

      Pass {
         Tags { "LightMode" = "ForwardAdd" }
            // pass for additional light sources
         Blend One One // additive blending

         CGPROGRAM

         #pragma vertex vert
         #pragma fragment frag

         #include "UnityCG.cginc"
         uniform float4 _LightColor0;
            // color of light source (from "Lighting.cginc")
         uniform float4x4 _LightMatrix0; // transformation
            // from world to light space (from Autolight.cginc)
         uniform sampler2D _LightTextureB0;
            // cookie alpha texture map (from Autolight.cginc)

         // User-specified properties
         uniform float4 _Color;
         uniform float4 _SpecColor;
         uniform float _Shininess;

         struct vertexInput {
            float4 vertex : POSITION;
            float3 normal : NORMAL;
         };
         struct vertexOutput {
            float4 pos : SV_POSITION;
            float4 posWorld : TEXCOORD0;
               // position of the vertex (and fragment) in world space
            float4 posLight : TEXCOORD1;
               // position of the vertex (and fragment) in light space
            float3 normalDir : TEXCOORD2;
               // surface normal vector in world space
         };

         vertexOutput vert(vertexInput input)
         {
            vertexOutput output;

            float4x4 modelMatrix = _Object2World;
            float4x4 modelMatrixInverse = _World2Object;
               // multiplication with unity_Scale.w is unnecessary
               // because we normalize transformed vectors

            output.posWorld = mul(modelMatrix, input.vertex);
            output.posLight = mul(_LightMatrix0, output.posWorld);
            output.normalDir = normalize(
               mul(float4(input.normal, 0.0), modelMatrixInverse).xyz);
            output.pos = mul(UNITY_MATRIX_MVP, input.vertex);
            return output;
         }

         float4 frag(vertexOutput input) : COLOR
         {
            float3 normalDirection = normalize(input.normalDir);

            float3 viewDirection = normalize(
               _WorldSpaceCameraPos - input.posWorld.xyz);
            float3 lightDirection;
            float attenuation;

            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(_WorldSpaceLightPos0.xyz);
            }
            else // point or spot light
            {
               float3 vertexToLightSource =
                  _WorldSpaceLightPos0.xyz - input.posWorld.xyz;
               lightDirection = normalize(vertexToLightSource);

               float distance = input.posLight.z;
                  // use z coordinate in light space as signed distance
               attenuation = tex2D(_LightTextureB0,
                  float2(distance, distance)).a;
                  // texture lookup for attenuation
               // alternative with linear attenuation:
               // float distance = length(vertexToLightSource);
               // attenuation = 1.0 / distance;
            }

            float3 diffuseReflection =
               attenuation * _LightColor0.rgb * _Color.rgb
               * max(0.0, dot(normalDirection, lightDirection));

            float3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0)
               // light source on the wrong side?
            {
               specularReflection = float3(0.0, 0.0, 0.0);
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation
                  * _LightColor0.rgb * _SpecColor.rgb
                  * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection),
                  viewDirection)), _Shininess);
            }

            return float4(diffuseReflection
               + specularReflection, 1.0);
         }

         ENDCG
      }
   }
   // The definition of a fallback shader should be commented out
   // during development:
   // Fallback "Specular"
}
```

If you compare the lighting computed by this shader with the lighting of a built-in shader, you will notice a difference in intensity by a factor of about 2 to 4. However, this is mainly due to additional constant factors in the built-in shaders. It is straightforward to introduce similar constant factors in the code above.

It should be noted that the z coordinate in light space is not equal to the distance from the light source; it's not even proportional to that distance. In fact, the meaning of the coordinate depends on the matrix `_LightMatrix0`, which is an undocumented feature of Unity and can therefore change at any time. However, it is rather safe to assume that a value of 0 corresponds to positions very close to the light source and a value of 1 corresponds to positions farther away.

Also note that point lights without cookie textures specify the attenuation lookup texture in `_LightTexture0` instead of `_LightTextureB0`; thus, the code above doesn't work for them. Moreover, the code doesn't check the sign of the z coordinate, which is fine for spot lights but results in a lack of attenuation on one side of point light sources.
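One plausible fix for the sign problem (a sketch, not Unity's actual solution) would be to take the absolute value of the signed coordinate before the lookup, i.e. `abs(input.posLight.z)` in the fragment shader, so that both sides of the light are attenuated symmetrically. The idea in plain JavaScript (hypothetical names; the array `table` again stands in for the lookup texture's alpha channel):

```javascript
// Sketch (hypothetical): make the lookup symmetric around the light
// by using the absolute value of the signed distance coordinate.
function symmetricLookup(table, signedZ) {
  var z = Math.abs(signedZ);                      // same on both sides
  var i = Math.round(z * (table.length - 1));     // nearest entry
  i = Math.min(table.length - 1, Math.max(0, i)); // clamp to [0, 1]
  return table[i];
}
```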

### Computing Lookup Textures

So far, we have used a lookup texture that is provided by Unity. If Unity didn't provide us with the texture in `_LightTextureB0`, we would have to compute such a texture ourselves. Here is some JavaScript code to compute a similar lookup texture. In order to use it, you have to change the name `_LightTextureB0` to `_LookupTexture` in the shader code and attach the following JavaScript to any game object with the corresponding material:

```
@script ExecuteInEditMode()

public var upToDate : boolean = false;

function Start()
{
   upToDate = false;
}

function Update()
{
   if (!upToDate) // is lookup texture not up to date?
   {
      upToDate = true;
      var texture = new Texture2D(16, 16);
         // width = 16 texels, height = 16 texels
      texture.filterMode = FilterMode.Bilinear;
      texture.wrapMode = TextureWrapMode.Clamp;
      renderer.sharedMaterial.SetTexture("_LookupTexture", texture);
         // "_LookupTexture" has to correspond to the name
         // of the uniform sampler2D variable in the shader

      for (var j : int = 0; j < texture.height; j++)
      {
         for (var i : int = 0; i < texture.width; i++)
         {
            var x : float = (i + 0.5) / texture.width;
               // first texture coordinate
            var y : float = (j + 0.5) / texture.height;
               // second texture coordinate
            var color = Color(0.0, 0.0, 0.0, (1.0 - x) * (1.0 - x));
               // set RGBA of texels
            texture.SetPixel(i, j, color);
         }
      }
      texture.Apply(); // apply all the texture.SetPixel(...) commands
   }
}
```

In this code, `i` and `j` enumerate the texels of the texture image, while `x` and `y` represent the corresponding texture coordinates. The function `(1.0 - x) * (1.0 - x)` for the alpha component of the texture image happens to produce results similar to Unity's lookup texture.

Note that the lookup texture should not be computed in every frame. Rather it should be computed only when necessary. If a lookup texture depends on additional parameters, then the texture should only be recomputed if any parameter has been changed. This can be achieved by storing the parameter values for which a lookup texture has been computed and continuously checking whether any of the new parameters are different from these stored values. If this is the case, the lookup texture has to be recomputed.
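The parameter check described above can be sketched as follows (hypothetical names; in a Unity script this logic would live in `Update()`, as in the script above):

```javascript
// Sketch (hypothetical names): recompute a lookup table only when
// the parameter it depends on has changed since the last computation.
var cachedExponent = null; // parameter value used for the cached table
var cachedTable = null;    // the lookup table itself

function getTable(exponent, resolution) {
  if (cachedTable === null || cachedExponent !== exponent) {
    // first call, or the parameter changed: recompute and store
    cachedExponent = exponent;
    cachedTable = new Array(resolution);
    for (var i = 0; i < resolution; i++) {
      var x = (i + 0.5) / resolution; // sample at texel centers
      cachedTable[i] = Math.pow(1.0 - x, exponent);
    }
  }
  return cachedTable; // unchanged parameter: reuse the cached table
}
```

With `exponent` fixed at 2.0, this computes the same `(1.0 - x) * (1.0 - x)` values as the script above, but only once instead of in every frame.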

### Summary

Congratulations, you have reached the end of this tutorial. We have seen:

- How to use the built-in texture `_LightTextureB0` as a lookup table for light attenuation.
- How to compute your own lookup textures in JavaScript.

### Further Reading

If you still want to know more

- about light attenuation for light sources, you should read Section “Diffuse Reflection”.
- about basic texture mapping, you should read Section “Textured Spheres”.
- about coordinates in light space, you should read Section “Cookies”.
- about the SECS principles (simple, explicit, consistent, minimal scope), you could read Chapter 3 of David Straker's book “C Style: Standards and Guidelines”, published by Prentice-Hall in 1991, which is available online.