GLSL Programming/Unity/Lighting of Bumpy Surfaces

This tutorial covers normal mapping.

“The Incredulity of Saint Thomas” by Caravaggio, 1601-1603.

It's the first in a series of tutorials about texturing techniques that go beyond two-dimensional surfaces (or layers of surfaces). In this tutorial, we start with normal mapping, which is a very well established technique to fake the lighting of small bumps and dents — even on coarse polygon meshes. The code of this tutorial is based on Section “Smooth Specular Highlights” and Section “Textured Spheres”.

Perceiving Shapes Based on Lighting

The painting by Caravaggio depicted to the left is about the incredulity of Saint Thomas, who did not believe in Christ's resurrection until he put his finger in Christ's side. The furrowed brows of the apostles not only symbolize this incredulity but clearly convey it by means of a common facial expression. However, why do we know that their foreheads are actually furrowed instead of being painted with some light and dark lines? After all, this is just a flat painting. In fact, viewers intuitively make the assumption that these are furrowed instead of painted brows — even though the painting itself allows for both interpretations. The lesson is: bumps on smooth surfaces can often be convincingly conveyed by the lighting alone without any other cues (shadows, occlusions, parallax effects, stereo, etc.).

Normal Mapping

Normal mapping tries to convey bumps on smooth surfaces (i.e. coarse triangle meshes with interpolated normals) by changing the surface normal vectors according to some virtual bumps. When the lighting is computed with these modified normal vectors, viewers will often perceive the virtual bumps — even though a perfectly flat triangle has been rendered. The illusion can certainly break down (in particular at silhouettes) but in many cases it is very convincing.

More specifically, the normal vectors that represent the virtual bumps are first encoded in a texture image (i.e. a normal map). A fragment shader then looks up these vectors in the texture image and computes the lighting based on them. That's about it. The problem, of course, is the encoding of the normal vectors in a texture image. There are different possibilities and the fragment shader has to be adapted to the specific encoding that was used to generate the normal map.
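
One common encoding simply maps each coordinate from the range [-1, 1] to a color component in the range [0, 1]; this is, in fact, the encoding that the decoding for mobile platforms below inverts. A minimal sketch of such an encoder in GLSL (the function encodeNormal is a hypothetical helper for illustration, not part of Unity's API):

            vec4 encodeNormal(vec3 n)
               // hypothetical helper; maps [-1,1] to [0,1] per component
            {
               return vec4(0.5 * n + vec3(0.5), 1.0);
            }

This encoding also explains the typically bluish appearance of normal maps: an unperturbed normal (0, 0, 1) is encoded as the color (0.5, 0.5, 1.0).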

 
A typical example for the appearance of an encoded normal map.

Normal Mapping in Unity

The very good news is that you can easily create normal maps from gray-scale images with Unity: create a gray-scale image in your favorite paint program and use a specific gray for the regular height of the surface, lighter grays for bumps, and darker grays for dents. Make sure that the transitions between different grays are smooth, e.g. by blurring the image. When you import the image with Assets > Import New Asset, change the Texture Type in the Inspector View to Normal map and check Generate from greyscale. After clicking Apply, the preview should show a bluish image with reddish and greenish edges. As an alternative to generating a normal map, the encoded normal map to the left can be imported directly (don't forget to uncheck the Generate from greyscale box).
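
How can a normal map be computed from a gray-scale height map in the first place? Unity's importer is not documented here, but the standard approach is to estimate the slope of the height field with finite differences and tilt the normal against this slope. A sketch of this idea in GLSL; the function name heightToNormal and the parameters texelSize and strength are hypothetical:

            vec3 heightToNormal(sampler2D heightMap, vec2 uv, 
               vec2 texelSize, float strength)
            {
               // sample the height field left/right and below/above
               float hL = texture2D(heightMap, uv - vec2(texelSize.x, 0.0)).g;
               float hR = texture2D(heightMap, uv + vec2(texelSize.x, 0.0)).g;
               float hD = texture2D(heightMap, uv - vec2(0.0, texelSize.y)).g;
               float hU = texture2D(heightMap, uv + vec2(0.0, texelSize.y)).g;

               // central differences approximate the gradient of the
               // height field; the normal tilts against this gradient
               return normalize(vec3(strength * (hL - hR), 
                  strength * (hD - hU), 1.0));
            }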

The not so good news is that the fragment shader has to do some computations to decode the normals. First of all, the texture color is stored in a two-component texture image, i.e. there is only an alpha component $A$ and one color component available. The color component can be accessed as the red, green, or blue component; in all cases the same value is returned. Here, we use the green component $G$ since Unity also uses it. The two components, $A$ and $G$, are stored as numbers between 0 and 1; however, they represent coordinates $n_x$ and $n_y$ between -1 and 1. The mapping is:

$n_x = 2A - 1$   and   $n_y = 2G - 1$

From these two components, the third component $n_z$ of the three-dimensional normal vector $\mathbf{n} = (n_x, n_y, n_z)$ can be calculated because of the normalization to unit length, $n_x^2 + n_y^2 + n_z^2 = 1$:

$n_z = \pm\sqrt{1 - n_x^2 - n_y^2}$

Only the “+” solution is necessary if we choose the $z$ axis along the axis of the smooth normal vector (interpolated from the normal vectors that were set in the vertex shader), since we aren't able to render surfaces with an inwards-pointing normal vector anyway. The code snippet from the fragment shader could look like this:

            vec4 encodedNormal = texture2D(_BumpMap, 
               _BumpMap_ST.xy * textureCoordinates.xy 
               + _BumpMap_ST.zw);
            vec3 localCoords = 
                vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0);
            localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
               // approximation without sqrt:  localCoords.z = 
               // 1.0 - 0.5 * dot(localCoords, localCoords);
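
The approximation in the comment comes from the first-order Taylor expansion of the square root, $\sqrt{1 - x} \approx 1 - \tfrac{1}{2}x$ for small $x$, applied here with $x = n_x^2 + n_y^2$. It is acceptable as long as the decoded normal deviates only moderately from the smooth normal, and the remaining error matters little because the resulting vector is normalized in the fragment shader anyway.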

The decoding for devices that use OpenGL ES is actually simpler since Unity doesn't use a two-component texture in this case. Thus, for mobile platforms the decoding becomes:

            vec4 encodedNormal = texture2D(_BumpMap, 
               _BumpMap_ST.xy * textureCoordinates.xy 
               + _BumpMap_ST.zw);
            vec3 localCoords = 2.0 * encodedNormal.rgb - vec3(1.0);

However, the rest of this tutorial (and also Section “Projection of Bumpy Surfaces”) will cover only (desktop) OpenGL.

 
Tangent plane to a point on a sphere.

Unity uses a local surface coordinate system for each point of the surface to specify normal vectors in the normal map. The $z$ axis of this local coordinate system is given by the smooth, interpolated normal vector N in world space, and the $x$-$y$ plane is a tangent plane to the surface, as illustrated in the image to the left. Specifically, the $x$ axis is specified by the tangent attribute T that Unity provides to vertices (see the discussion of attributes in Section “Debugging of Shaders”). Given the $x$ and $z$ axes, the $y$ axis can be computed by a cross product in the vertex shader, e.g. B = N × T. (The letter B refers to the traditional name “binormal” for this vector.)

Note that the normal vector N is transformed with the transpose of the inverse model matrix from object space to world space (because it is orthogonal to a surface; see Section “Applying Matrix Transformations”) while the tangent vector T specifies a direction between points on a surface and is therefore transformed with the model matrix. The binormal vector B represents a third class of vectors which are transformed differently. (If you really want to know: the skew-symmetric matrix B corresponding to “B×” is transformed like a quadratic form.) Thus, the best choice is to first transform N and T to world space, and then to compute B in world space using the cross product of the transformed vectors.
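
A one-line check of why the transpose of the inverse model matrix is the correct transformation for normal vectors: a normal vector $\mathbf{n}$ is characterized by being orthogonal to all tangent directions $\mathbf{t}$ of the surface. If tangent vectors are transformed with the model matrix $M$, then transforming normal vectors with $(M^{-1})^T$ preserves this orthogonality:

$\left((M^{-1})^T\mathbf{n}\right) \cdot (M\mathbf{t}) = \mathbf{n}^T M^{-1} M \mathbf{t} = \mathbf{n} \cdot \mathbf{t} = 0$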

With the normalized directions T, B, and N in world space, we can easily form a matrix that maps any normal vector n of the normal map from the local surface coordinate system to world space, because the columns of such a matrix are just the vectors of the axes; thus, the 3×3 matrix for the mapping of n to world space is:

$M = \begin{pmatrix} T_x & B_x & N_x \\ T_y & B_y & N_y \\ T_z & B_z & N_z \end{pmatrix}$

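In GLSL, a mat3 is constructed column by column, so this matrix could also be written directly from the three axis vectors. A minimal sketch, assuming t, b, and n already hold the normalized world-space tangent, binormal, and normal:

            mat3 localSurface2World = mat3(t, b, n);
               // the three vec3 arguments become the matrix columns

The vertex shader below builds the same matrix by assigning the columns localSurface2World[0], localSurface2World[1], and localSurface2World[2] individually.
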
These calculations are performed by the vertex shader, for example this way:

         varying vec4 position; 
            // position of the vertex (and fragment) in world space 
         varying vec4 textureCoordinates; 
         varying mat3 localSurface2World; // mapping from 
            // local surface coordinates to world coordinates
 
         #ifdef VERTEX
 
         attribute vec4 Tangent;

         void main()
         {                                
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors
 
            localSurface2World[0] = normalize(vec3(
               modelMatrix * vec4(vec3(Tangent), 0.0)));
            localSurface2World[2] = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            localSurface2World[1] = normalize(
               cross(localSurface2World[2], localSurface2World[0]) 
               * Tangent.w); // factor Tangent.w is specific to Unity

            position = modelMatrix * gl_Vertex;
            textureCoordinates = gl_MultiTexCoord0;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif

The factor Tangent.w in the computation of the binormal vector is specific to Unity, i.e. Unity provides tangent vectors and normal maps such that we have to do this multiplication.

In the fragment shader, we multiply the matrix localSurface2World with the decoded normal vector n, for example with this line:

            vec3 normalDirection = 
               normalize(localSurface2World * localCoords);

With the new normal vector in world space, we can compute the lighting as in Section “Smooth Specular Highlights”.

Complete Shader Code

This shader code simply integrates all the snippets and uses our standard two-pass approach for pixel lights.

Shader "GLSL normal mapping" {
   Properties {
      _BumpMap ("Normal Map", 2D) = "bump" {}
      _Color ("Diffuse Material Color", Color) = (1,1,1,1) 
      _SpecColor ("Specular Material Color", Color) = (1,1,1,1) 
      _Shininess ("Shininess", Float) = 10
   }
   SubShader {
      Pass {      
         Tags { "LightMode" = "ForwardBase" } 
            // pass for ambient light and first light source
 
         GLSLPROGRAM
 
         // User-specified properties
         uniform sampler2D _BumpMap;	
         uniform vec4 _BumpMap_ST;
         uniform vec4 _Color; 
         uniform vec4 _SpecColor; 
         uniform float _Shininess;
 
         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc" 
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")
 
         varying vec4 position; 
            // position of the vertex (and fragment) in world space 
         varying vec4 textureCoordinates; 
         varying mat3 localSurface2World; // mapping from local 
            // surface coordinates to world coordinates
 
         #ifdef VERTEX
 
         attribute vec4 Tangent;

         void main()
         {                                
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors
 
            localSurface2World[0] = normalize(vec3(
               modelMatrix * vec4(vec3(Tangent), 0.0)));
            localSurface2World[2] = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            localSurface2World[1] = normalize(
               cross(localSurface2World[2], localSurface2World[0]) 
               * Tangent.w); // factor Tangent.w is specific to Unity

            position = modelMatrix * gl_Vertex;
            textureCoordinates = gl_MultiTexCoord0;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            // in principle we have to normalize the columns of 
            // "localSurface2World" again; however, the potential 
            // problems are small since we use this matrix only to
            // compute "normalDirection", which we normalize anyways

            vec4 encodedNormal = texture2D(_BumpMap, 
               _BumpMap_ST.xy * textureCoordinates.xy 
               + _BumpMap_ST.zw);
            vec3 localCoords = 
               vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0);
            localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
               // approximation without sqrt: localCoords.z = 
               // 1.0 - 0.5 * dot(localCoords, localCoords);
            vec3 normalDirection = 
               normalize(localSurface2World * localCoords);

            vec3 viewDirection = 
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;
 
            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            } 
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation 
               lightDirection = normalize(vertexToLightSource);
            }
 
            vec3 ambientLighting = 
               vec3(gl_LightModel.ambient) * vec3(_Color);
 
            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(_Color) 
               * max(0.0, dot(normalDirection, lightDirection));
 
            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(_SpecColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), _Shininess);
            }
 
            gl_FragColor = vec4(ambientLighting 
               + diffuseReflection + specularReflection, 1.0);
         }
 
         #endif
 
         ENDGLSL
      }
 
      Pass {      
         Tags { "LightMode" = "ForwardAdd" } 
            // pass for additional light sources
         Blend One One // additive blending 
 
         GLSLPROGRAM
 
         // User-specified properties
         uniform sampler2D _BumpMap;	
         uniform vec4 _BumpMap_ST;
         uniform vec4 _Color; 
         uniform vec4 _SpecColor; 
         uniform float _Shininess;
 
         // The following built-in uniforms (except _LightColor0) 
         // are also defined in "UnityCG.glslinc", 
         // i.e. one could #include "UnityCG.glslinc" 
         uniform vec3 _WorldSpaceCameraPos; 
            // camera position in world space
         uniform mat4 _Object2World; // model matrix
         uniform mat4 _World2Object; // inverse model matrix
         uniform vec4 _WorldSpaceLightPos0; 
            // direction to or position of light source
         uniform vec4 _LightColor0; 
            // color of light source (from "Lighting.cginc")
 
         varying vec4 position; 
            // position of the vertex (and fragment) in world space 
         varying vec4 textureCoordinates; 
         varying mat3 localSurface2World; // mapping from 
            // local surface coordinates to world coordinates
 
         #ifdef VERTEX

         attribute vec4 Tangent;
 
         void main()
         {                                
            mat4 modelMatrix = _Object2World;
            mat4 modelMatrixInverse = _World2Object; // unity_Scale.w 
               // is unnecessary because we normalize vectors
 
            localSurface2World[0] = normalize(vec3(
               modelMatrix * vec4(vec3(Tangent), 0.0)));
            localSurface2World[2] = normalize(vec3(
               vec4(gl_Normal, 0.0) * modelMatrixInverse));
            localSurface2World[1] = normalize(
               cross(localSurface2World[2], localSurface2World[0]) 
               * Tangent.w); // factor Tangent.w is specific to Unity

            position = modelMatrix * gl_Vertex;
            textureCoordinates = gl_MultiTexCoord0;
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
 
         #endif
 
         #ifdef FRAGMENT
 
         void main()
         {
            // in principle we have to normalize the columns of 
            // "localSurface2World" again; however, the potential 
            // problems are small since we use this matrix only to
            // compute "normalDirection", which we normalize anyways

            vec4 encodedNormal = texture2D(_BumpMap, 
               _BumpMap_ST.xy * textureCoordinates.xy 
               + _BumpMap_ST.zw);
            vec3 localCoords = 
               vec3(2.0 * encodedNormal.ag - vec2(1.0), 0.0);
            localCoords.z = sqrt(1.0 - dot(localCoords, localCoords));
               // approximation without sqrt: localCoords.z = 
               // 1.0 - 0.5 * dot(localCoords, localCoords);
            vec3 normalDirection = 
               normalize(localSurface2World * localCoords);

            vec3 viewDirection = 
               normalize(_WorldSpaceCameraPos - vec3(position));
            vec3 lightDirection;
            float attenuation;
 
            if (0.0 == _WorldSpaceLightPos0.w) // directional light?
            {
               attenuation = 1.0; // no attenuation
               lightDirection = normalize(vec3(_WorldSpaceLightPos0));
            } 
            else // point or spot light
            {
               vec3 vertexToLightSource = 
                  vec3(_WorldSpaceLightPos0 - position);
               float distance = length(vertexToLightSource);
               attenuation = 1.0 / distance; // linear attenuation 
               lightDirection = normalize(vertexToLightSource);
            }
 
            vec3 diffuseReflection = 
               attenuation * vec3(_LightColor0) * vec3(_Color) 
               * max(0.0, dot(normalDirection, lightDirection));
 
            vec3 specularReflection;
            if (dot(normalDirection, lightDirection) < 0.0) 
               // light source on the wrong side?
            {
               specularReflection = vec3(0.0, 0.0, 0.0); 
                  // no specular reflection
            }
            else // light source on the right side
            {
               specularReflection = attenuation * vec3(_LightColor0) 
                  * vec3(_SpecColor) * pow(max(0.0, dot(
                  reflect(-lightDirection, normalDirection), 
                  viewDirection)), _Shininess);
            }
 
            gl_FragColor = 
               vec4(diffuseReflection + specularReflection, 1.0);
         }
 
         #endif
 
         ENDGLSL
      }
   } 
   // The definition of a fallback shader should be commented out 
   // during development:
   // Fallback "Bumped Specular"
}

Note that we have used the tiling and offset uniform _BumpMap_ST as explained in Section “Textured Spheres” since this option is often particularly useful for bump maps.

Summary

Congratulations, you finished this tutorial! We have looked at:

  • How human perception of shapes often relies on lighting.
  • What normal mapping is.
  • How Unity encodes normal maps.
  • How a fragment shader can decode Unity's normal maps and use them for per-pixel lighting.

Further Reading

If you still want to know more about texturing or per-pixel lighting, you should read Section “Textured Spheres” and Section “Smooth Specular Highlights”, on which the code of this tutorial is based.



Unless stated otherwise, all example source code on this page is granted to the public domain.