# GLSL Programming/Blender/Layers of Textures

Layers of human skin.

This tutorial introduces multitexturing, i.e. the use of multiple texture images in a shader.

It extends the shader code of the tutorial on textured spheres to multiple textures and shows one way of combining them. If you haven't read that tutorial, this would be a very good opportunity to do so.

### Layers of Surfaces

Many real surfaces (e.g. the human skin illustrated in the image to the left) consist of several layers of different colors, transparencies, reflectivities, etc. If the topmost layer is opaque and doesn't transmit any light, this doesn't really matter for rendering the surface. However, in many cases the topmost layer is (semi)transparent and therefore an accurate rendering of the surface has to take multiple layers into account.

In fact, the specular reflection that is included in the Phong reflection model (see the tutorial on specular highlights) often corresponds to a transparent layer that reflects light: sweat on human skin, wax on fruits, transparent plastics with embedded pigment particles, etc. On the other hand, the diffuse reflection corresponds to the layer(s) below the topmost transparent layer.

Lighting such layered surfaces doesn't require a geometric model of the layers: they can be represented by a single, infinitely thin polygon mesh. However, the lighting computation has to compute different reflections for different layers and has to take the transmission of light between layers into account (both when light enters the layer and when it exits the layer). Examples of this approach are included in the “Dawn” demo by Nvidia (see Chapter 3 of the book “GPU Gems”, which is available online) and the “Human Head” demo by Nvidia (see Chapter 14 of the book “GPU Gems 3”, which is also available online).

A full description of these processes is beyond the scope of this tutorial. Suffice it to say that layers are often associated with texture images that specify their characteristics. Here we just show how to use two textures and one particular way of combining them. The example is in fact not related to layers and therefore illustrates that multitexturing has more applications than layers of surfaces.

Map of the unlit Earth.
Map of the sunlit Earth.

### Lit and Unlit Earth

Due to human activities, the unlit side of the Earth is not completely dark. Instead, artificial lights mark the position and extent of cities, as shown in the image to the left. Therefore, diffuse lighting of the Earth should not just dim the texture image of the sunlit surface but actually blend it into the unlit texture image. Note that the sunlit Earth is far brighter than human-made lights on the unlit side; however, we reduce this contrast in order to show off the nighttime texture.

The shader code extends the code from the tutorial on textured spheres to two texture images and uses the computation described in the tutorial on diffuse reflection for a single, directional light source:

$I_\text{diffuse} = I_\text{incoming}\,k_\text{diffuse} \max(0,\mathbf{N}\cdot \mathbf{L})$

According to this equation, the level of diffuse lighting levelOfLighting is max(0, N·L). We then blend the colors of the daytime texture and the nighttime texture based on levelOfLighting. This could be achieved by multiplying the daytime color with levelOfLighting and multiplying the nighttime color with 1.0 - levelOfLighting before adding them to determine the fragment's color. Alternatively, the built-in GLSL function mix can be used (mix(a, b, w) = b*w + a*(1.0-w)), which is likely to be more efficient. Thus, the fragment shader could be (again with our particular computation of texture coordinates in longitudeLatitude):

varying float levelOfLighting;
varying vec4 texCoords;
uniform sampler2D daytimeTexture;
uniform sampler2D nighttimeTexture;

void main()
{
   vec2 longitudeLatitude = vec2(
      (atan(texCoords.y, texCoords.x) / 3.1415926 + 1.0) * 0.5,
      1.0 - acos(texCoords.z) / 3.1415926);

   vec4 nighttimeColor =
      texture2D(nighttimeTexture, longitudeLatitude);
   vec4 daytimeColor =
      texture2D(daytimeTexture, longitudeLatitude);
   gl_FragColor = mix(nighttimeColor, daytimeColor,
      levelOfLighting);
   // = daytimeColor * levelOfLighting
   //    + nighttimeColor * (1.0 - levelOfLighting)
}
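
For reference, the per-fragment computation can be mirrored on the CPU. The following sketch in plain Python (standard library only; the function names `sphere_uv` and `mix` are ours, not part of any API) maps a point on the unit sphere to the same longitude/latitude texture coordinates as the shader and blends a day color and a night color with the same `mix` formula:

```python
import math

def sphere_uv(x, y, z):
    # Same mapping as the shader: atan2 for longitude, acos for latitude.
    u = (math.atan2(y, x) / math.pi + 1.0) * 0.5
    v = 1.0 - math.acos(z) / math.pi
    return (u, v)

def mix(a, b, w):
    # GLSL's mix: component-wise b*w + a*(1.0-w).
    return tuple(ai * (1.0 - w) + bi * w for ai, bi in zip(a, b))

# A point on the equator facing +x maps to the center of the texture.
u, v = sphere_uv(1.0, 0.0, 0.0)  # u = 0.5, v = 0.5

night = (0.1, 0.1, 0.3)
day = (0.2, 0.5, 0.9)
level_of_lighting = 0.25  # max(0, N . L) from the vertex shader
color = mix(night, day, level_of_lighting)
```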


Note that this blending is very similar to the alpha blending that was discussed in the tutorial on transparency except that we perform the blending inside a fragment shader and use levelOfLighting instead of the alpha component (i.e. the opacity) of the texture that should be blended “over” the other texture. In fact, if daytimeTexture specified an alpha component (see the tutorial on transparent textures), we could use this alpha component to blend daytimeTexture over nighttimeTexture. This would correspond to a layer which is partially transparent on top of an opaque layer that is visible where the topmost layer is transparent.
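
The "over" blending mentioned above can be sketched the same way; the only difference is that the weight is the alpha component of the top layer instead of levelOfLighting (plain Python, illustrative names only):

```python
def blend_over(top_rgba, bottom_rgb):
    # Composite a (semi)transparent top layer over an opaque bottom layer:
    # result = top * alpha + bottom * (1 - alpha), i.e. mix(bottom, top, alpha).
    r, g, b, a = top_rgba
    return tuple(t * a + bot * (1.0 - a)
                 for t, bot in zip((r, g, b), bottom_rgb))

# A half-transparent white layer over an opaque black layer gives mid-gray.
result = blend_over((1.0, 1.0, 1.0, 0.5), (0.0, 0.0, 0.0))
```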

The shader code multiplies the texture color of the nighttime texture component-wise by gl_FrontMaterial.emission in order to make it possible to control its overall brightness by changing Shading > Emit in the Material tab of the Properties window. Furthermore, the final color is multiplied (also component-wise) by the color of the light source, gl_LightSource[0].diffuse, in order to take colored light sources into account.
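
Component-wise multiplication of colors, as used here for the emission color and the light color, is simply a per-channel product; a minimal sketch in plain Python (the name `modulate` is ours):

```python
def modulate(c1, c2):
    # Component-wise (per-channel) product of two RGBA colors,
    # as in GLSL's vec4 * vec4.
    return tuple(a * b for a, b in zip(c1, c2))

# Dimming the nighttime texel with a gray emission color:
night_texel = (0.8, 0.8, 0.4, 1.0)
emission = (0.5, 0.5, 0.5, 1.0)
night_color = modulate(night_texel, emission)
```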

import bge

cont = bge.logic.getCurrentController()

VertexShader = """
varying float levelOfLighting; // the level of diffuse
   // lighting that is computed in the vertex shader
varying vec4 texCoords;

void main()
{
vec3 normalDirection =
normalize(gl_NormalMatrix * gl_Normal);
vec3 lightDirection;

if (0.0 == gl_LightSource[0].position.w)
// directional light?
{
lightDirection =
normalize(vec3(gl_LightSource[0].position));
}
else // point light or spotlight (or other kind of light)
{
vec3 vertexToLightSource =
vec3(gl_LightSource[0].position
- gl_ModelViewMatrix * gl_Vertex);
lightDirection = normalize(vertexToLightSource);
}

levelOfLighting =
max(0.0, dot(normalDirection, lightDirection));
texCoords = gl_MultiTexCoord0;
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""

varying float levelOfLighting;
varying vec4 texCoords;
uniform sampler2D daytimeTexture;
uniform sampler2D nighttimeTexture;

void main()
{
vec2 longitudeLatitude = vec2(
(atan(texCoords.y, texCoords.x)/3.1415926+1.0)*0.5,
1.0 - acos(texCoords.z) / 3.1415926);

vec4 nighttimeColor =
texture2D(nighttimeTexture, longitudeLatitude)
* gl_FrontMaterial.emission;
vec4 daytimeColor =
texture2D(daytimeTexture, longitudeLatitude);
gl_FragColor = mix(nighttimeColor, daytimeColor,
levelOfLighting) * gl_LightSource[0].diffuse;
// = daytimeColor * levelOfLighting
// + nighttimeColor * (1.0 - levelOfLighting) * ...
}
"""

mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
        shader.setSampler('daytimeTexture', 0)
        shader.setSampler('nighttimeTexture', 1)


When using this shader, you have to make sure that the daytime texture occupies the first position in the texture list (Properties window > Textures tab) and the nighttime texture the second position. Otherwise, you have to adapt the integers in the last two lines of the Python script.

### Summary

Congratulations! You have reached the end of the last tutorial on basic texturing. We have looked at:

• How layers of surfaces can influence the appearance of materials (e.g. human skin, waxed fruits, plastics).
• How artificial lights on the unlit side can be taken into account when texturing a sphere representing the Earth.
• How to implement this technique in a shader.
• How this is related to blending an alpha texture over a second opaque texture.