GLSL Programming/Unity/Debugging of Shaders

This tutorial introduces attribute variables. It is based on Section “Minimal Shader” and Section “RGB Cube”.

A false-color satellite image.

This tutorial also introduces the main technique to debug shaders in Unity: false-color images, i.e. a value is visualized by setting one of the components of the fragment color to it. The intensity of that color component in the resulting image then allows you to draw conclusions about the value in the shader. This might appear to be a very primitive debugging technique because it is a very primitive debugging technique. Unfortunately, there is no alternative in Unity.

Where Does the Vertex Data Come from?

In Section “RGB Cube” you have seen how the fragment shader gets its data from the vertex shader by means of varying variables. The question here is: where does the vertex shader get its data from? Within Unity, the answer is that the Mesh Renderer component of a game object sends all the data of the mesh of the game object to OpenGL in each frame. (This is often called a “draw call”. Note that each draw call has some performance overhead; thus, it is much more efficient to send one large mesh with one draw call to OpenGL than to send several smaller meshes with multiple draw calls.) This data usually consists of a long list of triangles, where each triangle is defined by three vertices and each vertex has certain attributes, including position. These attributes are made available in the vertex shader by means of attribute variables.

Built-in Attribute Variables and how to Visualize Them

In Unity, most of the standard attributes (position, color, surface normal, and texture coordinates) are built in, i.e. you need not (in fact must not) define them. The names of these built-in attributes are actually defined by the OpenGL “compatibility profile” because such built-in names are needed if you mix an OpenGL application that was written for the fixed-function pipeline with a (programmable) vertex shader. If you had to define them, the definitions (only in the vertex shader) would look like this:

   attribute vec4 gl_Vertex; // position (in object coordinates, 
      // i.e. local or model coordinates)
   attribute vec4 gl_Color; // color (usually constant)
   attribute vec3 gl_Normal; // surface normal vector 
      // (in object coordinates; usually normalized to unit length)
   attribute vec4 gl_MultiTexCoord0; //0th set of texture coordinates 
      // (a.k.a. “UV”; between 0 and 1) 
   attribute vec4 gl_MultiTexCoord1; //1st set of texture coordinates 
      // (a.k.a. “UV”; between 0 and 1)
   ...

There is only one attribute variable that is provided by Unity but has no standard name in OpenGL, namely the tangent vector, i.e. a vector that is orthogonal to the surface normal. You should define this variable yourself as an attribute variable of type vec4 with the specific name Tangent as shown in the following shader:

Shader "GLSL shader with all built-in attributes" {
   SubShader {
      Pass {
         GLSLPROGRAM

         #ifdef VERTEX

         varying vec4 color;

         attribute vec4 Tangent; // this attribute is specific to Unity 
         
         void main()
         {
            color = gl_MultiTexCoord0; // set the varying variable

            // other possibilities to play with:

            // color = gl_Vertex;
            // color = gl_Color;
            // color = vec4(gl_Normal, 1.0);
            // color = gl_MultiTexCoord0;
            // color = gl_MultiTexCoord1;
            // color = Tangent;
            
            gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
         }
         
         #endif

         #ifdef FRAGMENT
               
         varying vec4 color;

         void main()
         {
            gl_FragColor = color; // set the output fragment color
         }
         
         #endif

         ENDGLSL
      }
   }
}

In Section “RGB Cube” we have already seen how to visualize the gl_Vertex coordinates by setting the fragment color to those values. In this example, the fragment color is set to gl_MultiTexCoord0 such that we can see what kind of texture coordinates Unity provides.

Note that only the first three components of Tangent represent the tangent direction. The scaling and the fourth component are set in a specific way, which is mainly useful for parallax mapping (see Section “Projection of Bumpy Surfaces”).
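If you want a first impression of this attribute, you can visualize it like the others. The following line is a small sketch that maps the first three components of Tangent from the range -1 to +1 to the range 0 to 1 (this mapping is discussed below); it assumes that the tangent direction is roughly normalized, which may not hold for every mesh:

            color = vec4((vec3(Tangent) + vec3(1.0, 1.0, 1.0)) / 2.0, 1.0); 
               // assumes a roughly normalized tangent direction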

How to Interpret False-Color Images

When trying to understand the information in a false-color image, it is important to focus on one color component only. For example, if the standard attribute gl_MultiTexCoord0 for a sphere is written to the fragment color, then the red component of the fragment visualizes the x coordinate of gl_MultiTexCoord0; i.e., it doesn't matter whether the output color is pure red, yellow, or magenta at maximum intensity: in all cases the red component is 1. On the other hand, it also doesn't matter for the red component whether the color is blue, green, or cyan of any intensity, because the red component is 0 in all cases. If you have never learned to focus solely on one color component, this is probably quite challenging; therefore, you might consider looking only at one color component at a time, for example by using this line to set the varying in the vertex shader:

            color = vec4(gl_MultiTexCoord0.x, 0.0, 0.0, 1.0);

This sets the red component of the varying variable to the x component of gl_MultiTexCoord0 but sets the green and blue components to 0 (and the alpha or opacity component to 1, but that doesn't matter in this shader).

If you focus on the red component or visualize only the red component, you should see that it increases from 0 to 1 as you go around the sphere and drops back to 0 after 360°. It actually behaves similarly to a longitude coordinate on the surface of a planet. (In terms of spherical coordinates, it corresponds to the azimuth.)

If the x component of gl_MultiTexCoord0 corresponds to the longitude, one would expect that the y component would correspond to the latitude (or the inclination in spherical coordinates). However, note that texture coordinates are always between 0 and 1; therefore, the value is 0 at the bottom (south pole) and 1 at the top (north pole). You can visualize the y component as green on its own with:

            color = vec4(0.0, gl_MultiTexCoord0.y, 0.0, 1.0);

Texture coordinates are particularly nice to visualize because they are between 0 and 1 just like color components are. Almost as nice are coordinates of normalized vectors (i.e., vectors of length 1; for example, gl_Normal is usually normalized) because they are always between -1 and +1. To map this range to the range from 0 to 1, you add 1 to each component and divide all components by 2, e.g.:

            color = vec4((gl_Normal + vec3(1.0, 1.0, 1.0)) / 2.0, 1.0);

Note that gl_Normal is a three-dimensional vector. Black then corresponds to the coordinate -1, and full intensity of a component to the coordinate +1.
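If you want to combine this mapping with the focus on a single color component, a sketch like the following visualizes only the z component of gl_Normal in the red channel:

            color = vec4((gl_Normal.z + 1.0) / 2.0, 0.0, 0.0, 1.0); 
               // red channel: z component of the normal, mapped to 0..1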

If the value that you want to visualize is in a range other than 0 to 1 or -1 to +1, you have to map it to the range from 0 to 1, which is the range of color components. If you don't know which values to expect, you just have to experiment. What helps here is that color components outside of the range 0 to 1 are automatically clamped to this range; i.e., values less than 0 are set to 0 and values greater than 1 are set to 1. Thus, when the color component is 0 or 1, you know at least that the value is less than or greater than what you assumed, and you can adapt the mapping iteratively until the color component lies between 0 and 1.
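For example, if you guessed that a value lies roughly between 0 and 10, you could map it linearly to the range from 0 to 1 with a sketch like the following (the value gl_Vertex.y and the bounds 0.0 and 10.0 are only assumptions that you would adapt in further iterations):

            float value = gl_Vertex.y; // some value you want to inspect
            color = vec4((value - 0.0) / (10.0 - 0.0), 0.0, 0.0, 1.0); 
               // maps the assumed range from 0.0 to 10.0 to the range 0 to 1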

Debugging Practice

In order to practice the debugging of shaders, this section includes some lines that each produce a black color when they replace the assignment to color in the vertex shader. Your task is to figure out for each line why the result is black. To this end, you should visualize any value that you are not absolutely sure about and map values less than 0 or greater than 1 to other ranges such that they become visible and you have at least an idea in which range they lie. Note that most of the relevant functions and operators are documented in Section “Vector and Matrix Operations”.

            color = gl_MultiTexCoord0 - vec4(1.5, 2.3, 1.1, 0.0);


            color = vec4(gl_MultiTexCoord0.z);


            color = gl_MultiTexCoord0 / tan(0.0);

The following lines require some knowledge about the dot and cross product:

            color = dot(gl_Normal, vec3(Tangent)) * gl_MultiTexCoord0;


            color = dot(cross(gl_Normal, vec3(Tangent)), gl_Normal) * 
               gl_MultiTexCoord0;


            color = vec4(cross(gl_Normal, gl_Normal), 1.0);


            color = vec4(cross(gl_Normal, vec3(gl_Vertex)), 1.0); 
               // only for a sphere!
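As a hint for inspecting these expressions: a scalar such as a dot product might be negative, so you can move it into the visible range before looking at it. This is only a sketch of the technique, not the answer to any of the exercises:

            float d = dot(gl_Normal, vec3(Tangent)); // a scalar that might be negative
            color = vec4((d + 1.0) / 2.0); // all components set to (d + 1.0) / 2.0; 
               // a medium gray therefore means that d is 0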

Do the functions radians() and noise4() always return black? What's that good for?

            color = radians(gl_MultiTexCoord0);


            color = noise4(gl_MultiTexCoord0);

Consult the documentation in the “OpenGL ES Shading Language 1.0.17 Specification” available at the “Khronos OpenGL ES API Registry” to figure out what radians() is good for and what the problem with noise4() is.
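If you want to check for yourself whether one of these results is really zero or just very small, you can scale it up before visualizing it, for example with an arbitrarily chosen factor as in this sketch:

            color = radians(gl_MultiTexCoord0) * 50.0; 
               // the factor 50.0 is arbitrary; small but nonzero values become visible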

Special Variables in the Fragment Shader

Attributes are specific to vertices, i.e., they usually have different values for different vertices. There are similar variables for fragment shaders, i.e., variables that have different values for each fragment. However, they are different from attributes because they are not specified by a mesh (i.e. a list of triangles). They are also different from varyings because they are not set explicitly by the vertex shader.

Specifically, a four-dimensional vector gl_FragCoord is available, containing the screen (or: window) coordinates (x, y, z, 1/w) of the fragment that is processed; see Section “Vertex Transformations” for the description of the screen coordinate system.
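Since gl_FragCoord is only available in the fragment shader, you would visualize it there instead of via the varying. The following sketch of a fragment shader main function maps the x and y window coordinates to the red and green components; the screen size of 1024 × 768 pixels is just an assumption that you should replace with your actual resolution:

          void main()
          {
             gl_FragColor = vec4(gl_FragCoord.x / 1024.0, // assumed screen width
                gl_FragCoord.y / 768.0, 0.0, 1.0); // assumed screen height
          }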

Moreover, a boolean variable gl_FrontFacing is provided that specifies whether the front face or the back face of a triangle is being rendered. Front faces usually face the “outside” of a model and back faces face the “inside” of a model; however, there is no clear outside or inside if the model is not a closed surface. Usually, the surface normal vectors point in the direction of the front face, but this is not required. In fact, front faces and back faces are specified by the order of a triangle's vertices: if the vertices appear on the screen in counter-clockwise order, the front face is visible; if they appear in clockwise order, the back face is visible. An application is shown in Section “Cutaways”.
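A quick way to see gl_FrontFacing in action is a fragment shader that colors front faces green and back faces red; to actually see the back faces, you have to deactivate back-face culling, e.g. with the ShaderLab statement Cull Off in the Pass. A minimal sketch of such a fragment shader main function:

          void main()
          {
             if (gl_FrontFacing) // is a front face being rendered?
             {
                gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0); // green for front faces
             }
             else
             {
                gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // red for back faces
             }
          }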

Summary

Congratulations, you have reached the end of this tutorial! We have seen:

  • The list of built-in attributes in Unity: gl_Vertex, gl_Color, gl_Normal, gl_MultiTexCoord0, gl_MultiTexCoord1, and the special Tangent.
  • How to visualize these attributes (or any other value) by setting components of the output fragment color.
  • The two additional special variables that are available in fragment programs: gl_FragCoord and gl_FrontFacing.

Further Reading

If you still want to know more, have another look at the sections referenced above and at the “OpenGL ES Shading Language 1.0.17 Specification” available at the “Khronos OpenGL ES API Registry”.



Unless stated otherwise, all example source code on this page is granted to the public domain.