# GLSL Programming/Blender/Lighting of Bumpy Surfaces

This tutorial covers **normal mapping**.

It's the first of two tutorials about texturing techniques that go beyond two-dimensional surfaces (or layers of surfaces). In this tutorial, we start with normal mapping, which is a very well established technique to fake the lighting of small bumps and dents — even on coarse polygon meshes. The code of this tutorial is based on the tutorial on smooth specular highlights and the tutorial on textured spheres.

### Perceiving Shapes Based on Lighting

The painting by Caravaggio that is depicted to the left is about the incredulity of Saint Thomas, who did not believe in Christ's resurrection until he put his finger in Christ's side. The furrowed brows of the apostles not only symbolize this incredulity but clearly convey it by means of a common facial expression. However, why do we know that their foreheads are actually furrowed instead of being painted with some light and dark lines? After all, this is just a flat painting. In fact, viewers intuitively make the assumption that these are furrowed instead of painted brows — even though the painting itself allows for both interpretations. The lesson is: bumps on smooth surfaces can often be convincingly conveyed by the lighting alone without any other cues (shadows, occlusions, parallax effects, stereo, etc.).

### Normal Mapping

Normal mapping tries to convey bumps on smooth surfaces (i.e. coarse triangle meshes with interpolated normals) by changing the surface normal vectors according to some virtual bumps. When the lighting is computed with these modified normal vectors, viewers will often perceive the virtual bumps — even though a perfectly flat triangle has been rendered. The illusion can certainly break down (in particular at silhouettes) but in many cases it is very convincing.

More specifically, the normal vectors that represent the virtual bumps are first **encoded** in a texture image (i.e. a normal map). A fragment shader then looks these vectors up in the texture image and computes the lighting based on them. That's about it. The problem, of course, is the encoding of the normal vectors in a texture image. There are different possibilities, and the fragment shader has to be adapted to the specific encoding that was used to generate the normal map.

### Normal Mapping in Blender

Normal maps are supported by Blender; see the description in the Blender 3D: Noob to Pro wikibook. Here, however, we will use the normal map to the left and write a GLSL shader to use it.

For this tutorial, you should use a cube mesh instead of the UV sphere that was used in the tutorial on textured spheres. Apart from that, you can follow the same steps to assign a material and the texture image to the object. Note that you should specify a default **UV Map** in the **Properties window > Object Data tab**. Furthermore, you should specify **Coordinates > UV** in the **Properties window > Textures tab > Mapping**.

When decoding the normal information, it would be best to know how the data was encoded. However, there are not so many choices; thus, even if you don't know how the normal map was encoded, a bit of experimentation can often lead to sufficiently good results. First of all, the RGB components are numbers between 0 and 1; however, they usually represent coordinates between -1 and 1 in a local surface coordinate system (since the vector is normalized, none of the coordinates can be greater than +1 or less than -1). Thus, the mapping from RGB components to coordinates of the normal vector **n** could be:

n_x = 2r − 1,  n_y = 2g − 1,  and  n_z = 2b − 1,

where r, g, and b are the red, green, and blue components of the texture color.

However, the z coordinate n_z is usually positive (because surface normals are not allowed to point inwards). This can be exploited by using a different mapping for n_z:

n_x = 2r − 1,  n_y = 2g − 1,  and  n_z = b

If in doubt, the latter decoding should be chosen because it will never generate surface normals that point inwards. Furthermore, it is often necessary to normalize the resulting vector.
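To make the two decodings concrete, here is a small Python sketch (illustrative only; the helper names are not part of the tutorial's shaders) that decodes an RGB triple with both conventions and normalizes the result:

```python
import math

def normalize(v):
    # scale a vector to unit length
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def decode_signed(r, g, b):
    # all three channels map from [0, 1] to [-1, 1]
    return normalize((2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0))

def decode_positive_z(r, g, b):
    # x and y map to [-1, 1]; z keeps its [0, 1] value, so the
    # decoded normal can never point inwards
    return normalize((2.0 * r - 1.0, 2.0 * g - 1.0, b))

# a "flat" normal-map texel (0.5, 0.5, 1.0) decodes to (0, 0, 1) either way
print(decode_signed(0.5, 0.5, 1.0))      # (0.0, 0.0, 1.0)
print(decode_positive_z(0.5, 0.5, 1.0))  # (0.0, 0.0, 1.0)
```

This also explains the typical bluish tint of normal maps: a flat region encodes the normal (0, 0, 1), i.e. a color dominated by blue.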

An implementation in a fragment shader that computes the normalized vector **n** in the variable `localCoords` could be:

```
vec4 encodedNormal = texture2D(normalMap, vec2(texCoords));
vec3 localCoords =
   normalize(vec3(2.0, 2.0, 1.0) * vec3(encodedNormal)
      - vec3(1.0, 1.0, 0.0));
```

Usually, a local surface coordinate system for each point of the surface is used to specify the normal vectors in the normal map. The z axis of this local coordinate system is given by the smooth, interpolated normal vector **N**, and the x-y plane is a tangent plane to the surface, as illustrated in the image to the left. Specifically, the x axis is specified by the tangent attribute **T** that Blender provides to vertices (see the discussion of attributes in the tutorial about debugging of shaders). Given the x and z axes, the y axis can be computed by a cross product in the vertex shader, e.g. **B** = **N** × **T**. (The letter **B** refers to the traditional name "binormal" for this vector.)

Note that the normal vector **N** is transformed with the transpose of the inverse model-view matrix from object space to view space (because it is orthogonal to a surface; see “Applying Matrix Transformations”) while the tangent vector **T** specifies a direction between points on a surface and is therefore transformed with the model-view matrix. The binormal vector **B** represents a third class of vectors which are transformed differently. (If you really want to know: the skew-symmetric matrix B corresponding to “**B**×” is transformed like a quadratic form.) Thus, the best choice is to first transform **N** and **T** to view space, and then to compute **B** in view space using the cross product of the transformed vectors.
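The different transformation rules for **N** and **T** can be checked with a small Python sketch (illustrative; plain tuples instead of GLSL types, and 2×2 matrices suffice): under a non-uniform scaling, transforming the normal with the matrix itself breaks its orthogonality to the surface, while the transpose of the inverse preserves it.

```python
def mat_vec(m, v):
    # multiply a 2x2 matrix (row tuples) by a 2D vector
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

M = ((2.0, 0.0), (0.0, 1.0))        # non-uniform scaling
M_inv_T = ((0.5, 0.0), (0.0, 1.0))  # transpose of the inverse of M

tangent = (1.0, 1.0)   # direction along a slanted surface
normal = (1.0, -1.0)   # orthogonal to the tangent

t = mat_vec(M, tangent)             # tangents transform with M
n_wrong = mat_vec(M, normal)        # (2, -1): no longer orthogonal to t
n_right = mat_vec(M_inv_T, normal)  # (0.5, -1): still orthogonal to t

print(dot(t, n_wrong))  # nonzero
print(dot(t, n_right))  # 0.0
```

This is exactly why the vertex shader below transforms both **N** and **T** first (with `gl_NormalMatrix` and `gl_ModelViewMatrix` respectively) and only then takes the cross product in view space.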

Also note that the configuration of these axes depends on the tangent data that is provided, the encoding of the normal map, and the texture coordinates. However, the axes are practically always orthogonal and a bluish tint of the normal map indicates that the blue component is in the direction of the interpolated normal vector.

With the normalized directions **T**, **B**, and **N** in view space, we can easily form a matrix that maps any normal vector **n** of the normal map from the local surface coordinate system to view space, because the columns of such a matrix are just the axis vectors; thus, the 3×3 matrix for the mapping of **n** to view space is the matrix ( **T** | **B** | **N** ), whose first, second, and third columns are the vectors **T**, **B**, and **N**.
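Why the columns are just the axis vectors can be seen from a small Python sketch (illustrative; plain tuples instead of GLSL's `mat3`): multiplying such a matrix by a local-surface vector **n** mixes the axes with **n**'s coordinates as weights.

```python
def local_to_view(T, B, N, n):
    # matrix-vector product where the matrix columns are T, B, N:
    # the result is simply n_x * T + n_y * B + n_z * N
    return tuple(n[0] * T[i] + n[1] * B[i] + n[2] * N[i]
                 for i in range(3))

# with an orthonormal basis, the local "up" vector (0, 0, 1) maps to N
T = (1.0, 0.0, 0.0)
B = (0.0, 1.0, 0.0)
N = (0.0, 0.0, 1.0)
print(local_to_view(T, B, N, (0.0, 0.0, 1.0)))  # (0.0, 0.0, 1.0)
```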

These calculations are performed by the vertex shader, for example this way:

```
attribute vec4 tangent;

varying mat3 localSurface2View; // mapping from local
   // surface coordinates to view coordinates
varying vec4 texCoords; // texture coordinates
varying vec4 position; // position in view coordinates

void main()
{
   // The signs, and whether tangent goes into localSurface2View[0]
   // or localSurface2View[1], depend on the tangent attribute, the
   // texture coordinates, and the encoding of the normal map.

   // gl_NormalMatrix is the precalculated inverse transpose of
   // gl_ModelViewMatrix; using it keeps normals correct
   // under non-uniform scaling of the mesh.
   localSurface2View[0] =
      normalize(gl_NormalMatrix * tangent.xyz);
   localSurface2View[2] =
      normalize(gl_NormalMatrix * gl_Normal);

   // localSurface2View[1] is multiplied by the handedness sign
   // stored in tangent.w; this allows for mirrored UVs
   // (tangent.w is +1 when normal, -1 when mirrored).
   localSurface2View[1] = normalize(
      cross(localSurface2View[2], localSurface2View[0])
      * tangent.w);

   texCoords = gl_MultiTexCoord0;
   position = gl_ModelViewMatrix * gl_Vertex;
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

In the fragment shader, we multiply this matrix with **n** (i.e. `localCoords`), for example with this line:

```
vec3 normalDirection =
   normalize(localSurface2View * localCoords);
```

With the new normal vector in view space, we can compute the lighting as in the tutorial on smooth specular highlights.

### Complete Shader Code

The complete fragment shader simply integrates all the snippets and the per-pixel lighting from the tutorial on smooth specular highlights. Also, we have to request tangent attributes and set the texture sampler (make sure that the normal map is in the first position of the list of textures, or adapt the second argument of the call to `setSampler`). The Python script is then:

```
import bge
cont = bge.logic.getCurrentController()
VertexShader = """
attribute vec4 tangent;

varying mat3 localSurface2View; // mapping from local
   // surface coordinates to view coordinates
varying vec4 texCoords; // texture coordinates
varying vec4 position; // position in view coordinates

void main()
{
   // The signs, and whether tangent goes into localSurface2View[0]
   // or localSurface2View[1], depend on the tangent attribute, the
   // texture coordinates, and the encoding of the normal map.

   // gl_NormalMatrix is the precalculated inverse transpose of
   // gl_ModelViewMatrix; using it keeps normals correct
   // under non-uniform scaling of the mesh.
   localSurface2View[0] =
      normalize(gl_NormalMatrix * tangent.xyz);
   localSurface2View[2] =
      normalize(gl_NormalMatrix * gl_Normal);

   // localSurface2View[1] is multiplied by the handedness sign
   // stored in tangent.w; this allows for mirrored UVs
   // (tangent.w is +1 when normal, -1 when mirrored).
   localSurface2View[1] = normalize(
      cross(localSurface2View[2], localSurface2View[0])
      * tangent.w);

   texCoords = gl_MultiTexCoord0;
   position = gl_ModelViewMatrix * gl_Vertex;
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""
FragmentShader = """
varying mat3 localSurface2View; // mapping from local
   // surface coordinates to view coordinates
varying vec4 texCoords; // texture coordinates
varying vec4 position; // position in view coordinates

uniform sampler2D normalMap;

void main()
{
   // In principle we would have to normalize the columns of
   // "localSurface2View" again; however, the potential
   // problems are small since we use this matrix only to
   // compute "normalDirection", which we normalize anyway.
   vec4 encodedNormal = texture2D(normalMap, vec2(texCoords));
   vec3 localCoords =
      normalize(vec3(2.0, 2.0, 1.0) * vec3(encodedNormal)
         - vec3(1.0, 1.0, 0.0)); // constants depend on encoding
   vec3 normalDirection =
      normalize(localSurface2View * localCoords);

   // compute per-pixel Phong lighting with normalDirection
   vec3 viewDirection = -normalize(vec3(position));
   vec3 lightDirection;
   float attenuation;

   if (0.0 == gl_LightSource[0].position.w) // directional light?
   {
      attenuation = 1.0; // no attenuation
      lightDirection =
         normalize(vec3(gl_LightSource[0].position));
   }
   else // point light or spotlight (or other kind of light)
   {
      vec3 positionToLightSource =
         vec3(gl_LightSource[0].position - position);
      float distance = length(positionToLightSource);
      attenuation = 1.0 / distance; // linear attenuation
      lightDirection = normalize(positionToLightSource);

      if (gl_LightSource[0].spotCutoff <= 90.0) // spotlight?
      {
         float clampedCosine = max(0.0, dot(-lightDirection,
            gl_LightSource[0].spotDirection));
         if (clampedCosine < gl_LightSource[0].spotCosCutoff)
            // outside of spotlight cone?
         {
            attenuation = 0.0;
         }
         else
         {
            attenuation = attenuation * pow(clampedCosine,
               gl_LightSource[0].spotExponent);
         }
      }
   }

   vec3 ambientLighting = vec3(gl_LightModel.ambient)
      * vec3(gl_FrontMaterial.emission);

   vec3 diffuseReflection = attenuation
      * vec3(gl_LightSource[0].diffuse)
      * vec3(gl_FrontMaterial.emission)
      * max(0.0, dot(normalDirection, lightDirection));

   vec3 specularReflection;
   if (dot(normalDirection, lightDirection) < 0.0)
      // light source on the wrong side?
   {
      specularReflection = vec3(0.0, 0.0, 0.0); // no specular reflection
   }
   else // light source on the right side
   {
      specularReflection = attenuation
         * vec3(gl_LightSource[0].specular)
         * vec3(gl_FrontMaterial.specular)
         * pow(max(0.0, dot(reflect(-lightDirection,
            normalDirection), viewDirection)),
            gl_FrontMaterial.shininess);
   }

   gl_FragColor = vec4(ambientLighting + diffuseReflection
      + specularReflection, 1.0);
}
"""
mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
        shader.setAttrib(bge.logic.SHD_TANGENT)
        shader.setSampler('normalMap', 0)
```

### Summary

Congratulations! You finished this tutorial! We have looked at:

- How human perception of shapes often relies on lighting.
- What normal mapping is.
- How to decode common normal maps.
- How a fragment shader can decode a normal map and use it for per-pixel lighting.

### Further Reading

If you still want to know more

- about texture mapping (including tiling and offsetting), you should read the tutorial on textured spheres.
- about per-pixel lighting with the Phong reflection model, you should read the tutorial on smooth specular highlights.
- about transforming normal vectors, you should read “Applying Matrix Transformations”.
- about normal mapping, you could read Mark J. Kilgard: “A Practical and Robust Bump-mapping Technique for Today’s GPUs”, GDC 2000: Advanced OpenGL Game Development, which is available online.