GLSL Programming/Blender/Shading in View Space
This tutorial introduces uniform variables. It builds on the tutorial about a minimal shader, the tutorial about an RGB cube, and the tutorial about debugging of shaders.
In this tutorial we will look at a shader that changes the fragment color depending on its position in “view space”, which is just a rotated and translated version of world space in which the camera is at the origin and points in the direction of the negative z axis while the y axis is aligned with the vertical axis of the camera; see the description in “Vertex Transformations”. The concept is not too complicated; however, it has extremely important applications, e.g. shading with lights and environment maps. We will also have a look at shaders in the real world; i.e., what is necessary to enable non-programmers to use your shaders?
Transforming from Object to View Space
As mentioned in the tutorial about debugging of shaders, the attribute gl_Vertex
specifies object coordinates, i.e. coordinates in the local object (or model) space of a mesh. The object space (or object coordinate system) is specific to each object; however, all objects are transformed into one common coordinate system — the view space. Conceptually they are first transformed to the common world space (with the model transformation) and then to the view space (with the view transformation). In practice, both transformations are combined into one transformation (the model-view transformation).
If an object is put directly into the world space, the object-to-world transformation is specified in Blender by the transform section of the object properties. To see it, select the object in the 3D View and then open the Object tab of the Properties window and expand the Transform section. There are parameters for “Location”, “Rotation” and “Scale”, which specify how vertices are transformed from object coordinates to world coordinates. The transformations of vertices by translations, rotations and scalings, as well as the combination of transformations and their representation as 4×4 matrices are discussed in “Vertex Transformations”. That page also discusses the transformation from world to view space, which only depends on the position and orientation of the camera (which can be inspected by selecting the camera and then opening a Properties window; it can be activated by choosing View > Camera in the menu of a 3D View).
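To make the role of these parameters more concrete, the following small Python sketch composes a 4×4 object-to-world (model) matrix from made-up “Location”, “Rotation” and “Scale” values with Blender's mathutils module. (This is only an illustration; it uses the * operator for matrix multiplication, which is appropriate for the Blender 2.7x API used by the game engine, while newer Blender versions use @ instead.)

import math
from mathutils import Euler, Matrix, Vector

location = Vector((1.0, 2.0, 0.5))                       # "Location" in the Transform section
rotation = Euler((0.0, 0.0, math.radians(45.0)), 'XYZ')  # "Rotation" (here in radians)
scale = (2.0, 1.0, 1.0)                                   # "Scale"

translation_matrix = Matrix.Translation(location)
rotation_matrix = rotation.to_matrix().to_4x4()
scale_matrix = (Matrix.Scale(scale[0], 4, (1.0, 0.0, 0.0)) *
                Matrix.Scale(scale[1], 4, (0.0, 1.0, 0.0)) *
                Matrix.Scale(scale[2], 4, (0.0, 0.0, 1.0)))

# Scaling is applied first, then the rotation, then the translation:
model_matrix = translation_matrix * rotation_matrix * scale_matrix

# Transforming a point from object coordinates to world coordinates
# (the 4th coordinate is 1 for points):
point_in_world_space = model_matrix * Vector((0.0, 0.0, 0.0, 1.0))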
In the following example, the transformation from object space to view space is put into a 4×4 matrix, which is also known as “model-view matrix” (since it combines the “model transformation” and the “view transformation”). This matrix is available in the built-in uniform variable gl_ModelViewMatrix
(similarly to the gl_ModelViewProjectionMatrix
that has been used in previous tutorials). This uniform is used in the following vertex shader, which should be assigned to the variable VertexShader
of the Python script in the tutorial about a minimal shader:
varying vec4 position_in_view_space;
void main()
{
   position_in_view_space = gl_ModelViewMatrix * gl_Vertex;
      // transformation of gl_Vertex from object coordinates
      // to view coordinates;
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
The fragment shader should be assigned to the variable FragmentShader
:
varying vec4 position_in_view_space;
void main()
{
   float dist = distance(position_in_view_space,
      vec4(0.0, 0.0, 0.0, 1.0));
      // computes the distance between the fragment position
      // and the origin (4th coordinate should always be 1
      // for points). The origin in view space is actually
      // the camera position.
   if (dist < 5.0)
   {
      gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
         // color near origin
   }
   else
   {
      gl_FragColor = vec4(0.3, 0.3, 0.3, 1.0);
         // color far from origin
   }
}
Usually, an OpenGL application has to set the value of uniform variables; however, Blender takes care of setting the values of predefined uniform variables such as gl_ModelViewMatrix
and gl_ModelViewProjectionMatrix
(which is actually the matrix product gl_ProjectionMatrix * gl_ModelViewMatrix
); thus, we don't have to worry about it. (In some cases, however, Blender appears to set some of these matrices incorrectly (in particular gl_ModelViewMatrix
) unless the camera view is activated with 3D View > View > Cameras > Active Camera.)
This shader transforms the vertex position to view space and gives it to the fragment shader in a varying variable. For the fragment shader the varying variable contains the interpolated position of the fragment in view coordinates. Based on the distance of this position to the origin of the view coordinate system, one of two colors is set. Thus, if you move an object with this shader around, it will turn green near the camera. Farther away from the camera it will turn dark grey.
Transforming from View Space to World Space
Many computations in shaders can be performed in view space. Since the positions of light sources are specified in view space and no transformations to world space are provided by default, this is often the most convenient and efficient coordinate system to work in. However, in some cases it is necessary to work in world space (for example, when using environment maps, which are positioned in world space). In these cases the shader has to be provided with the transformation from view space to world space in a uniform variable. The uniform variable has to be defined outside the main function in any shader that uses it:
uniform mat4 viewMatrixInverse;
mat4
is just the data type for 4×4 matrices and uniform
indicates that the variable has the same value for all vertices and fragments and that it has to be set externally. In Blender, it is set in a Python script with the line
shader.setUniformMatrix4('viewMatrixInverse', value_of_uniform)
where shader
is the Python object of the shader program, 'viewMatrixInverse'
is the name of the uniform variable in the shader, and value_of_uniform
is the value it should be set to. setUniformMatrix4
is used to set a 4×4 matrix uniform, as documented in Blender's Python API. More functions are available to set uniforms of other types; a few of them are sketched below.
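For illustration, the following lines sketch a few of these setter functions of bge.types.BL_Shader, assuming that shader is the shader object obtained with mat.getShader() as in the scripts of this tutorial. The uniform names are made up for this example; each of them would have to match a uniform declared in the GLSL code of the shader.

shader.setUniform1f('my_float', 0.5)                  # a float uniform
shader.setUniform3f('my_color', 1.0, 0.5, 0.0)        # a vec3 uniform
shader.setUniform4f('my_vector', 0.0, 0.0, 0.0, 1.0)  # a vec4 uniform
shader.setUniform1i('my_flag', 1)                     # an int (or bool) uniform
shader.setUniformfv('my_offset', [0.5, 0.25])         # a list of floats, here a vec2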
Here, we need the 4×4 matrix that transforms points from view space (also known as eye or camera space) to world space, which can be obtained this way:
value_of_uniform = bge.logic.getCurrentScene().active_camera.camera_to_world
That is, the current scene provides an active camera, which in turn provides the transformation that we need. (Just for completeness: the camera also provides the matrix for the world_to_camera
transformation, i.e. the view matrix itself.)
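As a small sanity check (not required for the rest of the tutorial), the following sketch queries both matrices from the active camera and prints their product, which should be approximately the 4×4 identity matrix since the two transformations are inverses of each other. (Again, the * operator of the Blender 2.7x API is used for matrix multiplication.)

import bge

camera = bge.logic.getCurrentScene().active_camera
view_matrix = camera.world_to_camera           # world-to-view transformation
view_matrix_inverse = camera.camera_to_world   # view-to-world transformation

# The product of a transformation and its inverse is the identity
# (up to floating-point rounding errors):
print(view_matrix * view_matrix_inverse)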
We can now rewrite the example to work in world space instead of view space, i.e. fragments are colored green if they are close to the origin in world space:
import bge
cont = bge.logic.getCurrentController()
VertexShader = """
uniform mat4 viewMatrixInverse; // view to world
// transformation (the view matrix corresponds to the
// world to view transformation; thus, the inverse
// matrix corresponds to the view to world trafo)
varying vec4 position_in_world_space;
void main()
{
   vec4 position_in_view_space =
      gl_ModelViewMatrix * gl_Vertex;
      // transformation of gl_Vertex from object coordinates
      // to view coordinates;
   position_in_world_space = viewMatrixInverse *
      position_in_view_space;
      // transformation from view coordinates
      // to world coordinates;
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""
FragmentShader = """
varying vec4 position_in_world_space;
void main()
{
   float dist = distance(position_in_world_space,
      vec4(0.0, 0.0, 0.0, 1.0));
      // computes the distance between the fragment position
      // and the origin (the 4th coordinate should always
      // be 1 for points).
   if (dist < 5.0)
   {
      gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
         // color near origin
   }
   else
   {
      gl_FragColor = vec4(0.3, 0.3, 0.3, 1.0);
         // color far from origin
   }
}
"""
mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
        value_of_uniform = \
            bge.logic.getCurrentScene().active_camera.camera_to_world
        shader.setUniformMatrix4('viewMatrixInverse', value_of_uniform)
Note that we should set the current value of the uniform in every frame while we set the source code of the shader only once.
Also note that there is one additional transformation in the vertex shader. This is a matrix-vector multiplication, which requires 16 multiplications and therefore costs some performance. If possible, this kind of additional transformation should be avoided. In practice, we will therefore do lighting computations in view space so that we don't have to transform the positions of light sources from view space to world space.
More OpenGL-Specific Uniforms
The positions of light sources are actually available as built-in uniforms similar to gl_ModelViewMatrix
and gl_ModelViewProjectionMatrix
. All these built-in uniforms are defined by the OpenGL compatibility profile. The corresponding transformations are described in detail in “Vertex Transformations”.
As you can see in the shader above, these uniforms don't have to be defined; they are always available in GLSL shaders in Blender. If you had to define them, the definitions would look like this:
uniform mat4 gl_ModelViewMatrix;
uniform mat4 gl_ProjectionMatrix;
uniform mat4 gl_ModelViewProjectionMatrix;
uniform mat4 gl_TextureMatrix[gl_MaxTextureCoords];
uniform mat3 gl_NormalMatrix;
// transpose of the inverse of gl_ModelViewMatrix
uniform mat4 gl_ModelViewMatrixInverse;
uniform mat4 gl_ProjectionMatrixInverse;
uniform mat4 gl_ModelViewProjectionMatrixInverse;
uniform mat4 gl_TextureMatrixInverse[gl_MaxTextureCoords];
uniform mat4 gl_ModelViewMatrixTranspose;
uniform mat4 gl_ProjectionMatrixTranspose;
uniform mat4 gl_ModelViewProjectionMatrixTranspose;
uniform mat4 gl_TextureMatrixTranspose[gl_MaxTextureCoords];
uniform mat4 gl_ModelViewMatrixInverseTranspose;
uniform mat4 gl_ProjectionMatrixInverseTranspose;
uniform mat4 gl_ModelViewProjectionMatrixInverseTranspose;
uniform mat4 gl_TextureMatrixInverseTranspose[
gl_MaxTextureCoords];
struct gl_MaterialParameters {
    vec4 emission;  // = Properties > Material tab > Diffuse Color *
                    // Shading > Emit (alpha = 1)
    vec4 ambient;   // = black
    vec4 diffuse;   // = black
                    // if Properties > Material tab > Diffuse Color activated;
                    // diffuse = Properties > Object tab > Object Color
    vec4 specular;  // = Properties > Material tab > Specular Color
                    // * Intensity (alpha = Intensity!)
    float shininess;
                    // = Properties > Material tab > Specular > Hardness / 4.0
};
uniform gl_MaterialParameters gl_FrontMaterial; // see above
uniform gl_MaterialParameters gl_BackMaterial;
    // same as gl_FrontMaterial

struct gl_LightSourceParameters {
    vec4 ambient;   // = black
    vec4 diffuse;   // = Properties > Object Data tab > Color *
                    // Energy (alpha = 1)
    vec4 specular;  // = Properties > Object Data tab > Color *
                    // Energy (alpha = 1)
    vec4 position;  // in view space, w = 0 for Sun (directional),
                    // w = 1 otherwise
    vec4 halfVector;    // = average of vectors to light and viewer
    vec3 spotDirection; // in view space
    float spotExponent; // = Properties > Object Data tab >
                        // Spot Shape > Blend * 128.0
    float spotCutoff;   // = Properties > Object Data tab >
                        // Spot Shape > Size / 2.0 (180.0 if not spotlight)
    float spotCosCutoff; // derived: cos(spotCutoff)
                         // (range: [1.0,0.0] or -1.0 if not spotlight)
    float constantAttenuation;  // = 1.0
    float linearAttenuation;    // = 0.0
    float quadraticAttenuation; // = 0.0
};
uniform gl_LightSourceParameters gl_LightSource[gl_MaxLights];
    // see above for each light

struct gl_LightModelParameters
{
    vec4 ambient; // = Properties > World tab > World > Ambient Color
                  // * Material tab > Shading > Ambient
};
uniform gl_LightModelParameters gl_LightModel; // see above
...
In fact, the compatibility profile of OpenGL defines even more (less interesting) uniforms; see Chapter 7 of the “OpenGL Shading Language 4.10.6 Specification” available at Khronos' OpenGL page. Apparently Blender supports many of them, but not all.
Some of these uniforms are arrays, e.g. gl_TextureMatrix
. In fact, an array of matrices
gl_TextureMatrix[0]
, gl_TextureMatrix[1]
, gl_TextureMatrix[2]
, ..., gl_TextureMatrix[gl_MaxTextureCoords - 1]
is available, where gl_MaxTextureCoords
is a built-in integer.
Others are structs, e.g. gl_LightModel
; thus, you have to use the dot-notation to access its members, e.g. gl_LightModel.ambient
for the ambient scene color.
Computing the Model Matrix
As we saw above, it is possible to access the view matrix and the inverse view matrix with the help of a Python script. On the other hand, OpenGL offers the product of the model matrix M and the view matrix V, i.e. the model-view matrix V M, which is available in the uniform gl_ModelViewMatrix
. What is not so easily available is the model matrix M itself.
However, we can easily compute it. The mathematics looks like this:
M = (V⁻¹ V) M = V⁻¹ (V M)
In other words, the model matrix is the product of the inverse view matrix and the model-view matrix. Assuming that we have defined the uniform viewMatrixInverse
as above, we can compute the model matrix this way in GLSL:
mat4 modelMatrix = viewMatrixInverse * gl_ModelViewMatrix;
User-Specified Uniforms: Game Properties
There is an important application of uniform variables: uniforms that can be set by the user. Actually, these are called game properties in Blender. You can think of them as parameters of objects and, more specifically, of their shaders. A shader without parameters is usually used only by its programmer because even the smallest necessary change requires some programming. On the other hand, a shader using parameters with descriptive names can be used by other people, even non-programmers, e.g. CG artists. Imagine you are in a game development team and a CG artist asks you to adapt your shader for each of 100 design iterations. It should be obvious that a few parameters, which even a CG artist can play with, might save you a lot of time. Also, imagine you want to sell your shader: parameters will often dramatically increase the value of your shader.
The Blender documentation about game properties describes how to set up a new game property: Select the object that the Python script is attached to. Open the Logic Editor and choose View > Properties from the menu (or press n or click the small icon in the upper left corner). Click on Add Game Property and set the name to my_uniform
and the type to Float
. We can then define a uniform in the shader (which could have a different name if we want to confuse ourselves) and set it with the Python function
shader.setUniform1f('my_uniform', cont.owner.get('my_uniform'))
where the first 'my_uniform'
refers to the name in the shader and the second 'my_uniform'
refers to the name of the game property. setUniform1f
is used to set one-dimensional floating-point uniforms, while other functions are available to set uniforms of other types, as documented in Blender's Python API.
A simple example could then use the following Python script, which uses the value of the game property to set the intensity of the red component of the fragment's color:
import bge
cont = bge.logic.getCurrentController()
VertexShader = """
void main()
{
   gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""
FragmentShader = """
uniform float my_uniform;
void main()
{
   gl_FragColor = vec4(my_uniform, 0.0, 0.0, 1.0);
}
"""
mesh = cont.owner.meshes[0]
for mat in mesh.materials:
    shader = mat.getShader()
    if shader is not None:
        if not shader.isValid():
            shader.setSource(VertexShader, FragmentShader, 1)
        shader.setUniform1f('my_uniform', cont.owner.get('my_uniform'))
Summary
Congratulations, you made it! We discussed:
- How to transform a vertex into view coordinates.
- How to transform a vertex into world coordinates.
- The most important OpenGL-specific uniforms that are supported by Blender.
- How to make a shader more useful and valuable by adding game properties.
Further Reading
If you want to know more
- about vector and matrix operations (e.g. the distance() function), you should read "Vector and Matrix Operations".
- about the standard vertex transformations, e.g. the model matrix and the view matrix, you should read "Vertex Transformations".
- about the application of transformation matrices to points and directions, you should read “Applying Matrix Transformations”.
- about setting shader uniforms in Blender's Python API, you should read Blender's documentation about the class bge.types.BL_Shader.