Cg Programming/Unity/Textured Spheres
This tutorial introduces texture mapping.
It's the first in a series of tutorials about texturing in Cg shaders in Unity. In this tutorial, we start with a single texture map on a sphere. More specifically, we map an image of the Earth's surface onto a sphere. Based on this, further tutorials cover topics such as lighting of textured surfaces, transparent textures, multitexturing, gloss mapping, etc.
Texture Mapping
The basic idea of “texture mapping” (or “texturing”) is to map an image (i.e. a “texture” or a “texture map”) onto a triangle mesh; in other words, to put a flat image onto the surface of a three-dimensional shape.
To this end, “texture coordinates” are defined, which simply specify a position in the texture (i.e. the image). In OpenGL, the horizontal coordinate is called S and the vertical coordinate T. However, it is very common to refer to them as x and y. In animation and modeling tools, texture coordinates are usually called U and V.
In order to map the texture image to a mesh, every vertex of the mesh is given a pair of texture coordinates. (This process (and the result) is sometimes called “UV mapping” since each vertex is mapped to a point in UV space.) Thus, every vertex is mapped to a point in the texture image; for example, a vertex with texture coordinates (0.5, 1.0) is mapped to the middle of the top edge of the image. For any point inside a triangle, the texture coordinates of the three triangle vertices can then be interpolated; thus, every point of every triangle of the mesh gets a pair of (interpolated) texture coordinates. These texture coordinates map each point of the mesh to a specific position in the texture map and therefore to the color at this position. Thus, rendering a texture-mapped mesh consists of two steps for all visible points: interpolation of the texture coordinates and a look-up of the color of the texture image at the position specified by the interpolated texture coordinates.
Usually, any floating-point number is a valid texture coordinate. However, when the GPU is asked to look up a pixel (or “texel”) of a texture image (e.g. with the “tex2D” instruction described below), it will internally map the texture coordinates to the range between 0 and 1 in a way that depends on the “Wrap Mode” specified when importing the texture: wrap mode “repeat” basically uses the fractional part of the texture coordinates to determine texture coordinates in the range between 0 and 1, while wrap mode “clamp” clamps the texture coordinates to this range. These internal texture coordinates in the range between 0 and 1 are then used to determine the position in the texture image: (0, 0) specifies the lower left corner of the texture image; (1, 0) the lower right corner; (0, 1) the upper left corner; etc.
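Just to illustrate what the GPU does internally (this sketch is not part of the tutorial's shader; in practice the wrap mode is applied automatically based on the texture's import settings), the two wrap modes can be approximated with the built-in Cg functions frac and saturate. The function names fragRepeat and fragClamp and the uniform _MainTex are only placeholders here:

uniform sampler2D _MainTex; // the texture to look up

float4 fragRepeat(float4 tex : TEXCOORD0) : COLOR
{
   // "repeat": use only the fractional part of the coordinates
   return tex2D(_MainTex, frac(tex.xy));
}

float4 fragClamp(float4 tex : TEXCOORD0) : COLOR
{
   // "clamp": restrict the coordinates to the range between 0 and 1
   return tex2D(_MainTex, saturate(tex.xy));
}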
Texturing a Sphere in Unity
To map the image of the Earth's surface onto a sphere in Unity, you first have to import the image into Unity. Click the image until you get to a larger version and save it (usually with a right-click) to your computer (remember where you saved it). Then switch to Unity and choose Assets > Import New Asset... from the main menu. Choose the image file and click on Import in the file selector box. The imported texture image should appear in the Project Window. (Alternatively, you can simply drag & drop the image file into the Project Window.) By selecting it there, details about the way it is imported appear (and can be changed) in the Inspector Window.
Now create a sphere, a material, and a shader, and attach the shader to the material and the material to the sphere as described in Section “Minimal Shader”. The shader code should be:
Shader "Cg shader with single texture" {
Properties {
_MainTex ("Texture Image", 2D) = "white" {}
// a 2D texture property that we call "_MainTex", which should
// be labeled "Texture Image" in Unity's user interface.
// By default we use the built-in texture "white"
// (alternatives: "black", "gray" and "bump").
}
SubShader {
Pass {
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
uniform sampler2D _MainTex;
// a uniform variable referring to the property above
// (in fact, this is just a small integer specifying a
// "texture unit", which has the texture image "bound"
// to it; Unity takes care of this).
struct vertexInput {
float4 vertex : POSITION;
float4 texcoord : TEXCOORD0;
};
struct vertexOutput {
float4 pos : SV_POSITION;
float4 tex : TEXCOORD0;
};
vertexOutput vert(vertexInput input)
{
vertexOutput output;
output.tex = input.texcoord;
// Unity provides default longitude-latitude-like
// texture coordinates at all vertices of a
// sphere mesh as the input parameter
// "input.texcoord" with semantic "TEXCOORD0".
output.pos = UnityObjectToClipPos(input.vertex);
return output;
}
float4 frag(vertexOutput input) : COLOR
{
return tex2D(_MainTex, input.tex.xy);
// look up the color of the texture image specified by
// the uniform "_MainTex" at the position specified by
// "input.tex.x" and "input.tex.y" and return it
}
ENDCG
}
}
Fallback "Unlit/Texture"
}
Note that the name _MainTex was chosen to make sure that the fallback shader Unlit/Texture can access it (see the discussion of fallback shaders in Section “Diffuse Reflection”).
The sphere should now be white. If it is grey, you should check whether the shader is attached to the material and the material is attached to the sphere. If the sphere is magenta, you should check the shader code. In particular, you should select the shader in the Project Window and read the error message in the Inspector Window.
If the sphere is white, select the sphere in the Hierarchy Window or the Scene View and look at the information in the Inspector Window. Your material should appear under Mesh Renderer, and under it should be a label Texture Image. (Otherwise click on the material bar to make it appear.) The label “Texture Image” is the same that we specified for our shader property _MainTex in the shader code. There is an empty box to the right of this label. Either click on the small Select button in the box and select the imported texture image, or drag & drop the texture image from the Project Window into this empty box.
If everything went right, the texture image should now appear on the sphere. Congratulations!
How It Works
Since many techniques use texture mapping, it pays off very well to understand what is happening here. Therefore, let's review the shader code:
The vertices of Unity's sphere object come with texture coordinates for each vertex in the vertex input parameter texcoord with semantic TEXCOORD0. These coordinates are similar to longitude and latitude (but range from 0 to 1). This is analogous to the vertex input parameter vertex with semantic POSITION, which specifies a position in object space, except that texcoord specifies texture coordinates in the space of the texture image.
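Just to give an idea of how such longitude-latitude-like coordinates relate to positions on a sphere, they could be computed from a normalized direction in object space as in the following sketch. Unity already provides these coordinates for its sphere mesh, so this helper function (the name sphericalTexCoords is made up here) is for illustration only and is not used in the tutorial's shader:

float2 sphericalTexCoords(float3 direction)
{
   // longitude around the vertical axis, range -pi .. pi
   float longitude = atan2(direction.x, direction.z);
   // latitude from the south pole to the north pole, range -pi/2 .. pi/2
   float latitude = asin(direction.y);
   // map both angles to the range 0 .. 1
   return float2(longitude / (2.0 * 3.14159265) + 0.5,
      latitude / 3.14159265 + 0.5);
}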
The vertex shader then writes the texture coordinates of each vertex to the vertex output parameter output.tex. For each fragment of a triangle (i.e. each covered pixel), the values of this output parameter at the three triangle vertices are interpolated (see the description in Section “Rasterization”), and the interpolated texture coordinates are passed to the fragment shader as input parameters. The fragment shader then uses them to look up a color in the texture image specified by the uniform _MainTex at the interpolated position in texture space and returns this color as the fragment output parameter, which is then written to the framebuffer and displayed on the screen.
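If you want to convince yourself that interpolated texture coordinates actually arrive in the fragment shader, you can visualize them as colors instead of using them for a texture look-up. The following variant of the fragment shader (a debugging sketch only, using the same vertexOutput structure as above) shows the horizontal coordinate in the red channel and the vertical coordinate in the green channel:

float4 frag(vertexOutput input) : COLOR
{
   // debugging sketch: visualize the interpolated texture
   // coordinates as colors instead of looking up the texture
   return float4(input.tex.x, input.tex.y, 0.0, 1.0);
}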
It is crucial that you gain a good idea of these steps in order to understand the more complicated texture mapping techniques presented in other tutorials.
Repeating and Moving Textures
In Unity's interface for the shader above, you might have noticed the parameters Tiling and Offset, each with an x and a y component. In built-in shaders, these parameters allow you to repeat the texture (by shrinking the texture image in texture coordinate space) and to move the texture image on the surface (by offsetting it in texture coordinate space). In order to be consistent with this behavior, another uniform has to be defined:
uniform float4 _MainTex_ST;
// tiling and offset parameters of property "_MainTex"
For each texture property, Unity offers such a float4 uniform with the ending “_ST”. (Remember: “S” and “T” are the official names of the texture coordinates, which are usually called “U” and “V”, or “x” and “y”.) This uniform holds the x and y components of the Tiling parameter in _MainTex_ST.x and _MainTex_ST.y, while the x and y components of the Offset parameter are stored in _MainTex_ST.z and _MainTex_ST.w. The uniform should be used like this:
return tex2D(_MainTex,
_MainTex_ST.xy * input.tex.xy + _MainTex_ST.zw);
This makes the shader behave like the built-in shaders. In the other tutorials, this feature is usually not implemented in order to keep the shader code a bit cleaner.
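Note that the same transformation could also be applied once per vertex in the vertex shader instead of once per fragment. As a sketch (this is an alternative, not how the complete shader below is written), the vertex shader would then set the output texture coordinates like this, and the fragment shader would use input.tex.xy unchanged:

vertexOutput vert(vertexInput input)
{
   vertexOutput output;
   // apply the tiling and offset parameters already per vertex
   output.tex = float4(
      _MainTex_ST.xy * input.texcoord.xy + _MainTex_ST.zw, 0.0, 0.0);
   output.pos = UnityObjectToClipPos(input.vertex);
   return output;
}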
And just for completeness, here is the complete shader code with this feature:
Shader "Cg shader with single texture" {
Properties {
_MainTex ("Texture Image", 2D) = "white" {}
// a 2D texture property that we call "_MainTex", which should
// be labeled "Texture Image" in Unity's user interface.
// By default we use the built-in texture "white"
// (alternatives: "black", "gray" and "bump").
}
SubShader {
Pass {
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
uniform sampler2D _MainTex;
uniform float4 _MainTex_ST;
// tiling and offset parameters of property
struct vertexInput {
float4 vertex : POSITION;
float4 texcoord : TEXCOORD0;
};
struct vertexOutput {
float4 pos : SV_POSITION;
float4 tex : TEXCOORD0;
};
vertexOutput vert(vertexInput input)
{
vertexOutput output;
output.tex = input.texcoord;
output.pos = UnityObjectToClipPos(input.vertex);
return output;
}
float4 frag(vertexOutput input) : COLOR
{
return tex2D(_MainTex,
_MainTex_ST.xy * input.tex.xy + _MainTex_ST.zw);
// texture coordinates are multiplied with the tiling
// parameters and the offset parameters are added
}
ENDCG
}
}
Fallback "Unlit/Texture"
}
Unity provides a macro for this kind of texture coordinate transformation in UnityCG.cginc. To use it, you have to include this line in the Pass:
#include "UnityCG.cginc"
With this, you can use the macro TRANSFORM_TEX() to rewrite the return statement from above:
return tex2D(_MainTex, TRANSFORM_TEX(input.tex, _MainTex));
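For reference, this macro performs essentially the same multiplication and addition as the explicit code above; its definition in UnityCG.cginc looks approximately like this:

// approximate definition from UnityCG.cginc (for illustration only)
#define TRANSFORM_TEX(tex, name) (tex.xy * name##_ST.xy + name##_ST.zw)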
Summary
You have reached the end of one of the most important tutorials. We have looked at:
- How to import a texture image and how to attach it to a texture property of a shader.
- How a vertex shader and a fragment shader work together to map a texture image onto a mesh.
- How Unity's tiling and offset parameters for textures work and how to implement them.
Further reading
If you want to know more
- about the data flow in and out of vertex shaders and fragment shaders (i.e. vertex input and output parameters, etc.), you should read the description in Section “Programmable Graphics Pipeline”.
- about the interpolation of vertex output parameters for the fragment shader, you should read the discussion in Section “Rasterization”.