GLSL Programming/Unity/Textured Spheres
This tutorial introduces texture mapping.
It's the first in a series of tutorials about texturing in GLSL shaders in Unity. In this tutorial, we start with a single texture map on a sphere. More specifically, we map an image of the Earth's surface onto a sphere. Based on this, further tutorials cover topics such as lighting of textured surfaces, transparent textures, multitexturing, gloss mapping, etc.
Texture Mapping
The basic idea of “texture mapping” (or “texturing”) is to map an image (i.e. a “texture” or a “texture map”) onto a triangle mesh; in other words, to put a flat image onto the surface of a three-dimensional shape.
To this end, “texture coordinates” are defined, which simply specify a position in the texture (i.e. the image). The horizontal coordinate is officially called S and the vertical coordinate T. However, it is very common to refer to them as x and y. In animation and modeling tools, texture coordinates are usually called U and V.
In order to map the texture image to a mesh, every vertex of the mesh is given a pair of texture coordinates. (This process (and the result) is sometimes called “UV mapping” since each vertex is mapped to a point in UV space.) Thus, every vertex is mapped to a point in the texture image. For any point inside a triangle, the texture coordinates of the triangle's three vertices can then be interpolated; thus, every point of every triangle of the mesh has a pair of (interpolated) texture coordinates. These texture coordinates map each point of the mesh to a specific position in the texture map and therefore to the color at that position. Thus, rendering a texture-mapped mesh consists of two steps for all visible points: interpolation of the texture coordinates and a look-up of the color of the texture image at the position specified by the interpolated texture coordinates.
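Conceptually, this interpolation is just a weighted average of the three vertices' texture coordinates. The following is only an illustration, not part of the shader below; the function name and the barycentric weights are made up for this example, and real GPUs additionally apply perspective correction, which is omitted here:
// Hypothetical helper (illustration only): interpolation of the texture
// coordinates uvA, uvB, uvC of a triangle's three vertices for a point
// with barycentric weights (weights.x + weights.y + weights.z = 1).
vec2 interpolateTexCoords(vec2 uvA, vec2 uvB, vec2 uvC, vec3 weights)
{
   return weights.x * uvA + weights.y * uvB + weights.z * uvC;
}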
In OpenGL, any valid floating-point number is a valid texture coordinate. However, when the GPU is asked to look up a pixel (or “texel”) of a texture image (e.g. with the “texture2D” instruction described below), it will internally map the texture coordinates to the range between 0 and 1 in a way that depends on the “Wrap Mode” that is specified when importing the texture: wrap mode “repeat” basically uses the fractional part of the texture coordinates to determine texture coordinates in the range between 0 and 1. On the other hand, wrap mode “clamp” clamps the texture coordinates to this range. These internal texture coordinates in the range between 0 and 1 are then used to determine the position in the texture image: (0, 0) specifies the lower, left corner of the texture image; (1, 0) the lower, right corner; (0, 1) the upper, left corner; etc.
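As a rough sketch of these two wrap modes (the function names are made up for this illustration; the actual mapping is performed by the texture hardware, not by your shader code), the mapping of a single coordinate could be written like this:
// Illustration only: how a texture coordinate s outside the range [0, 1]
// might be mapped into [0, 1] by the two wrap modes (filtering and
// border handling are ignored here).
float wrapRepeat(float s)
{
   return fract(s); // "repeat": keep only the fractional part, e.g. 2.3 -> 0.3
}

float wrapClamp(float s)
{
   return clamp(s, 0.0, 1.0); // "clamp": e.g. 2.3 -> 1.0 and -0.5 -> 0.0
}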
Texturing a Sphere in Unity
To map the image of the Earth's surface onto a sphere in Unity, you first have to import the image into Unity. Click the image until you get to a larger version and save it (usually with a right-click) to your computer (remember where you saved it). Then switch to Unity and choose Assets > Import New Asset... from the main menu. Choose the image file and click on Import in the file selector box. The imported texture image should appear in the Project View. When you select it there, details about how it is imported appear (and can be changed) in the Inspector View.
Now create a sphere, a material, and a shader, and attach the shader to the material and the material to the sphere as described in Section “Minimal Shader”. The shader code should be:
Shader "GLSL shader with single texture" {
Properties {
_MainTex ("Texture Image", 2D) = "white" {}
// a 2D texture property that we call "_MainTex", which should
// be labeled "Texture Image" in Unity's user interface.
// By default we use the built-in texture "white"
// (alternatives: "black", "gray" and "bump").
}
SubShader {
Pass {
GLSLPROGRAM
uniform sampler2D _MainTex;
// a uniform variable referring to the property above
// (in fact, this is just a small integer specifying a
// "texture unit", which has the texture image "bound"
// to it; Unity takes care of this).
varying vec4 textureCoordinates;
// the texture coordinates at the vertices,
// which are interpolated for each fragment
#ifdef VERTEX
void main()
{
textureCoordinates = gl_MultiTexCoord0;
// Unity provides default longitude-latitude-like
// texture coordinates at all vertices of a
// sphere mesh as the attribute "gl_MultiTexCoord0".
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
#endif
#ifdef FRAGMENT
void main()
{
gl_FragColor =
texture2D(_MainTex, vec2(textureCoordinates));
// look up the color of the texture image specified by
// the uniform "_MainTex" at the position specified by
// "textureCoordinates.x" and "textureCoordinates.y"
// and return it in "gl_FragColor"
}
#endif
ENDGLSL
}
}
// The definition of a fallback shader should be commented out
// during development:
// Fallback "Unlit/Texture"
}
Note that the name _MainTex was chosen to make sure that the fallback shader Unlit/Texture can access it (see the discussion of fallback shaders in Section “Diffuse Reflection”).
The sphere should now be white. If it is grey, you should check whether the shader is attached to the material and the material is attached to the sphere. If the sphere is magenta, you should check the shader code. In particular, you should select the shader in the Project View and read the error message in the Inspector View.
If the sphere is white, select the sphere in the Hierarchy View or the Scene View and look at the information in the Inspector View. Your material should appear under Mesh Renderer, and under it there should be a label Texture Image. (Otherwise, click on the material bar to make it appear.) The label “Texture Image” is the same as the one we specified for our shader property _MainTex in the shader code. There is an empty box to the right of this label. Either click on the small Select button in the box and select the imported texture image, or drag & drop the texture image from the Project View to this empty box.
If everything went right, the texture image should now appear on the sphere. Congratulations!
How It Works
Since many techniques use texture mapping, it pays off very well to understand what is happening here. Therefore, let's review the shader code:
The vertices of Unity's sphere object come with attribute data in gl_MultiTexCoord0 for each vertex, which specifies texture coordinates that are similar to longitude and latitude (but range from 0 to 1). This is analogous to the attribute gl_Vertex, which specifies a position in object space, except that gl_MultiTexCoord0 specifies texture coordinates in the space of the texture image.
The vertex shader then writes the texture coordinates of each vertex to the varying variable textureCoordinates. For each fragment of a triangle (i.e. each covered pixel), the values of this varying at the three triangle vertices are interpolated (see the description in Section “Rasterization”) and the interpolated texture coordinates are given to the fragment shader. The fragment shader then uses them to look up a color in the texture image specified by the uniform _MainTex at the interpolated position in texture space and returns this color in gl_FragColor, which is then written to the framebuffer and displayed on the screen.
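If you want to convince yourself of this data flow, a simple debugging trick (not part of this tutorial's shader) is to replace the texture look-up in the fragment shader by a color that directly encodes the interpolated texture coordinates:
// Debugging sketch: output the interpolated texture coordinates as colors
// (red = horizontal coordinate, green = vertical coordinate) instead of
// looking up the texture image.
void main()
{
   gl_FragColor = vec4(textureCoordinates.x, textureCoordinates.y, 0.0, 1.0);
}
With this fragment shader, you should see a red gradient that wraps around the sphere (the longitude-like coordinate) and a green gradient from one pole to the other (the latitude-like coordinate).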
It is crucial that you gain a good idea of these steps in order to understand the more complicated texture mapping techniques presented in other tutorials.
Repeating and Moving Textures
In Unity's interface for the shader above, you might have noticed the parameters Tiling and Offset, each with an x and a y component. In built-in shaders, these parameters allow you to repeat the texture (by shrinking the texture image in texture coordinate space) and move the texture image on the surface (by offsetting it in texture coordinate space). In order to be consistent with this behavior, another uniform has to be defined:
uniform vec4 _MainTex_ST;
// tiling and offset parameters of property "_MainTex"
For each texture property, Unity offers such a vec4 uniform with the ending “_ST”. (Remember: “S” and “T” are the official names of the texture coordinates, which are usually called “U” and “V”, or “x” and “y”.) This uniform holds the x and y components of the Tiling parameter in _MainTex_ST.x and _MainTex_ST.y, while the x and y components of the Offset parameter are stored in _MainTex_ST.z and _MainTex_ST.w. The uniform should be used like this:
gl_FragColor = texture2D(_MainTex,
   _MainTex_ST.xy * textureCoordinates.xy
   + _MainTex_ST.zw);
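For example, with Tiling = (2, 2) and Offset = (0.5, 0), an interpolated texture coordinate of (0.25, 0.5) would be mapped to (2 · 0.25 + 0.5, 2 · 0.5 + 0) = (1.0, 1.0), i.e. the upper, right corner of the texture image.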
This makes the shader behave like the built-in shaders. In the other tutorials, this feature is usually not implemented in order to keep the shader code a bit cleaner.
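By the way, since this transformation is affine (a scale plus an offset), it could just as well be applied once per vertex in the vertex shader instead of once per fragment; interpolating the transformed coordinates gives the same result. A sketch of this variant (the tutorial's shaders keep the transformation in the fragment shader) would replace the first line of the vertex shader's main function with:
// Sketch of an alternative: apply tiling and offset per vertex, so that the
// fragment shader can use the interpolated coordinates directly.
// This requires "uniform vec4 _MainTex_ST;" to be declared as above.
textureCoordinates = vec4(
   _MainTex_ST.xy * gl_MultiTexCoord0.xy + _MainTex_ST.zw, 0.0, 0.0);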
And just for completeness, here is the complete shader code with this feature:
Shader "GLSL shader with single texture" {
Properties {
_MainTex ("Texture Image", 2D) = "white" {}
}
SubShader {
Pass {
GLSLPROGRAM
uniform sampler2D _MainTex;
uniform vec4 _MainTex_ST;
// tiling and offset parameters of property
varying vec4 textureCoordinates;
#ifdef VERTEX
void main()
{
textureCoordinates = gl_MultiTexCoord0;
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
#endif
#ifdef FRAGMENT
void main()
{
gl_FragColor = texture2D(_MainTex,
_MainTex_ST.xy * textureCoordinates.xy
+ _MainTex_ST.zw);
// textureCoordinates are multiplied with the tiling
// parameters and the offset parameters are added
}
#endif
ENDGLSL
}
}
// The definition of a fallback shader should be commented out
// during development:
// Fallback "Unlit/Texture"
}
Summary
You have reached the end of one of the most important tutorials. We have looked at:
- How to import a texture image and how to attach it to a texture property of a shader.
- How a vertex shader and a fragment shader work together to map a texture image onto a mesh.
- How Unity's tiling and offset parameters for textures work and how to implement them.
Further Reading
If you want to know more
- about the data flow in and out of vertex shaders and fragment shaders (i.e. vertex attributes, varyings, etc.), you should read the description in Section “OpenGL ES 2.0 Pipeline”.
- about the interpolation of varying variables for the fragment shader, you should read the discussion in Section “Rasterization”.