Blender 3D: Noob to Pro/Glossary



A

  • Alpha Channel is an additional channel in a 2D image that stores transparency. For an image element which stores a color for each pixel, an additional value between 0 and 1 is stored in the alpha channel. A value of 0 means that the pixel has no coverage information; i.e. there was no color contribution from any geometry because the geometry did not overlap this pixel. A value of 1 means that the pixel is fully opaque because the geometry completely overlapped the pixel. (A small compositing sketch is given at the end of this section.)
  • Ambient Light is light that doesn't seem to come from a specific source, but is just there. Look under the desk - it's pretty dark, but there's some light there. In the real world, this is caused by stray photons bouncing around and occasionally ricocheting under the desk. Ambient light is the basic, minimal amount of light in the whole scene. Adding too much ambient light makes a scene look washed out. Since the light doesn't come from anywhere, all sides of an object are illuminated equally, and it won't have any shading on it.
  • Ambient Occlusion (AO) is a ratio of how much ambient light a surface point would be likely to receive. It simulates a huge dome light surrounding the entire scene. If a surface point is under a foot or table, it will end up much darker than the top of someone's head or the tabletop.
  • Armature is the interconnection of bones that form the skeleton of an animated figure. The Inverse Kinematics library contains the code to make armatures move. The armature must still be rigged with 3D objects to give shape to its head, hands, trunk, feet, etc.
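
To illustrate how the alpha value is used when compositing (a minimal sketch in Python with made-up function names, not anything from Blender itself), the common "over" operation blends a foreground colour onto a background:
def composite_over(foreground, background, alpha):
    # "over" operator: alpha = 1.0 keeps only the foreground colour,
    # alpha = 0.0 keeps only the background colour
    return tuple(alpha * f + (1.0 - alpha) * b
                 for f, b in zip(foreground, background))

# a half-transparent red pixel over a white background gives pink
print(composite_over((1.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.5))   # (1.0, 0.5, 0.5)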

B

  • Background image: a 2D image ("picture") that is placed "behind" the entire 3D scene, like a backdrop on a movie set. Blender permits the placement of these images in all six directions from the origin: back, front, top, bottom, left, right.
  • Bake: to precompute computationally-intensive elements of an animation. For example, in a physics simulation involving the behaviour of fluids or clothing, you would set up the physical parameters, then compute (bake) the positions and shapes of the objects over the duration of the animation. Afterwards, you can assign materials and lighting, and then render the frames to produce the actual animation. Doing the baking as a separate step, and saving the results from that, means you can change your mind about the materials and lighting and rerender the frames more quickly.
  • Bézier surfaces were first described in 1972 by the French engineer Pierre Bézier who used them to design automobile bodies. Bézier surfaces can be of any degree, but bicubic Bézier surfaces generally provide enough degrees of freedom for most applications.
  • BF is the Blender Foundation.
  • Blend: to work in Blender ("blending"); also Blender's file extension (.blend).
  • Bounce Light: Simple lighting situations have a single light, called a key light, illuminating one side of an object. This creates strong shading and definition of the volume of the object. However, a 3D light will often make the contrast too great - the dark side of the object is completely black since no light is hitting it. In reality it would still be lit a little, just not as much as the brightly lit side, because of light bouncing around the room and hitting the dark side of the object. In realtime 3D, bounce light is not calculated, so you have to create it yourself. Either add a little ambient color, or put a second, less bright directional light pointing the opposite direction to give a little light to the shadows.
  • Bump mapping is a technique where, at each pixel, a perturbation to the surface normal of the object being rendered is looked up in a texture map and applied before the illumination calculation is done. Bump mapping uses a grayscale image map to change the direction of surface normals. You can use this to simulate height, so that you can paint wrinkles and bumps. 50% grey means neutral (no change is made), lighter means higher, darker means lower. Note that the position of faces is not actually changed; because only the normals are rotated, the lighting changes too, giving the illusion of a height difference. This has downsides too: the outline of objects isn't changed, so the trick is given away. For similar effects you can use Displacement Mapping and Normal Mapping. A minimal sketch of the idea follows.
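
The sketch below assumes a hypothetical height argument holding the grey values of the map as a 2D list of numbers between 0 and 1; it is an illustration of the principle, not Blender's actual shading code:
def bump_normal(height, x, y, strength=1.0):
    # finite differences between neighbouring grey values approximate the slope
    dhdx = height[y][x + 1] - height[y][x]
    dhdy = height[y + 1][x] - height[y][x]
    # tilt the straight-up normal (0, 0, 1) against the slope and renormalise
    nx, ny, nz = -strength * dhdx, -strength * dhdy, 1.0
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (nx / length, ny / length, nz / length)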

C

  • Caustics, in optics, are the envelopes or concentrations of light rays reflected or refracted by a curved surface. For example, a caustic effect may be seen when light refracts or reflects through a transparent or reflective material, creating a more focused, stronger light on the final location. Such amplification, especially of sunlight, can burn — hence the name. A common situation where caustics are visible is when light shines on a glass: there is a shadow behind the glass, but also a brighter spot of light. Nowadays, almost every advanced rendering system supports caustics; some even support volumetric caustics. This is accomplished by raytracing the possible paths of the light beam through the glass, accounting for the refraction, reflection, etc.
  • CG is Computer Graphics
  • CGI is Computer Generated Imagery

D

  • Depth of Field (DOF) is the distance in front of and behind the subject which appears to be in focus. For any given lens setting, there is only one distance at which a subject is precisely in focus, but focus falls off gradually on either side of that distance, so there is a region in which the blurring is tolerable. This region is greater behind the point of focus than in front of it, because the angle of the light rays changes more rapidly close to the lens, while the rays approach being parallel with increasing distance.
  • Diffuse Light is even, directed light coming off a surface. For most things, the diffuse light is the main lighting we see. Diffuse light comes from a specific direction or location, and creates shading. Surfaces facing towards the light source will be brighter, while surfaces facing away from the light source will be darker.
  • Directional Light is a light that has a specific direction, but no location. It seems to come from an infinitely far away source, like the sun. Surfaces facing the light are illuminated more than surfaces facing away, but their location doesn't matter. A Directional Light illuminates all objects in the scene, no matter where they are.
  • Displacement Mapping uses a greyscale heightmap, like Bump Mapping, but the image is used to physically move the vertices of the mesh at render time. This is of course only useful if the mesh has a large number of vertices, but the (relatively) new "Simple Subdiv" subsurf option allows you to add more vertices at render time which will be moved by the displacement. This makes it much slower than Bump Mapping, as there are many more faces to render, but it is much more realistic.

E

  • Environment Maps (EnvMaps) are a method of calculating reflections: images are rendered at strategic positions and applied as textures to the reflecting object. In most cases this is now superseded by Raytracing, which, though slower, is easier to use and more accurate.

F

  • Focal Length of a lens is the distance along the optical axis from the lens to the focus (or focal point). The inverse of a lens' focal length is called its power.
  • Focus of a lens is the point onto which collimated light parallel to the axis is focused.
  • Foreshortening is the apparent compression of an object along the direction of view caused by perspective: parts closer to the viewer appear disproportionately large compared to parts farther away, so the object seems shorter than it really is.
  • Fresnel lens is a type of lens invented by Augustin-Jean Fresnel. Originally developed for lighthouses, the design enables the construction of lenses of large size and short focal length without the weight and volume of material which would be required in a lens of conventional design. As it relates to rendering, fresnel refers to the tendency of materials to be more reflective when light strikes at a high angle of incidence - think of how sunlight reflects from distant water but penetrates closer water, or of how road glare is most extreme at dawn or dusk. The effect varies with material, and specifying it is an important part of material definition. A rough sketch of a common approximation follows.
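
One common way renderers approximate this angle dependence is Schlick's approximation; the sketch below (plain Python for illustration, not Blender code) shows how reflectivity rises towards 1.0 at grazing angles:
def schlick_fresnel(cos_theta, f0):
    # f0 is the reflectivity when looking straight at the surface;
    # cos_theta is the cosine of the angle between view direction and surface normal
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

print(schlick_fresnel(1.0, 0.04))   # straight on: about 0.04
print(schlick_fresnel(0.1, 0.04))   # grazing angle: about 0.61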

G

  • GE is Game Engine.
  • Global Illumination (GI) is a superset of radiosity and ray tracing. The goal is to compute all possible light interactions in a given scene, and thus obtain a truly photorealistic image. All combinations of diffuse and specular reflections and transmissions must be accounted for. Effects such as colour bleeding and caustics must be included in a global illumination simulation.
  • Gouraud shading is a method used in computer graphics to simulate the differing effects of light and colour across the surface of an object. In practice, Gouraud shading is used to achieve smooth lighting on low-polygon surfaces without the heavy computational requirements of calculating lighting for each pixel. The technique was first presented by Henri Gouraud in 1971.
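
In rough terms, lighting is evaluated once per vertex and then interpolated across each face. A minimal sketch of that idea (made-up helper names, normalised vectors assumed, not Blender's renderer):
def vertex_brightness(normal, light_dir):
    # simple Lambertian term evaluated at one vertex
    return max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)

def gouraud_shade(corner_brightnesses, barycentric_weights):
    # a point inside the triangle gets a weighted mix of the three corner values
    return sum(b * w for b, w in zip(corner_brightnesses, barycentric_weights))

corners = [vertex_brightness(n, (0.0, 0.0, 1.0))
           for n in [(0.0, 0.0, 1.0), (0.0, 1.0, 0.0), (1.0, 0.0, 0.0)]]
print(gouraud_shade(corners, (1/3, 1/3, 1/3)))   # about 0.33 at the triangle centre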

H

  • High Dynamic Range Image (HDRI) is a set of techniques that allow a far greater dynamic range of exposures than normal digital imaging techniques. The intention is to accurately represent the wide range of intensity levels found in real scenes, ranging from direct sunlight to the deepest shadows. The use of high dynamic range imaging in computer graphics has been popularised by the work of Paul Debevec. Blender uses Yafray for these techniques.

I

  • Index Of Refraction (IOR) describes the way that light passes through different types of transparent materials: diamond, glass, water etc. When a light ray travels through a single volume it follows a straight path. However, if it passes from one transparent volume to another, it bends. This is why a straw in water looks bent. The amount of bending differs between materials. The angle by which the ray is bent can be determined by knowing two things: the angle at which the incoming ray has been cast, and the Index of Refraction. This IOR value is characteristic of each material: glass has an IOR of about 1.5 and water about 1.3. By increasing the IOR value for a Blender material, you can control how much the environment behind the transparent object is distorted, thus improving the realism of the shader. (A small Snell's-law example is given at the end of this section.)
  • Interpolation (IPO) is an animation curve: it indicates how the object must "move" between an initial and a final position, and at what rate; the rendering engine evaluates the curve at each frame in between. Objects can be animated in many ways. They can be animated as Objects, changing their position, orientation or size in time; they can be animated by deforming them, that is, by animating their vertices or control points; or they can be animated via very complex and flexible interaction with a special kind of object: the Armature.
  • Inverse Kinematics (IK) is the process of determining the movement of interconnected segments of a body or model, starting from the desired motion of the endpoints of the armature. Using ordinary (forward) kinematics on a hierarchically structured object, you can, for example, move the shoulder of a puppet; the upper and lower arm and hand will automatically follow that movement. IK allows you to move the hand instead and have the lower and upper arm go along with the movement. Without IK the hand would come off the model and move independently in space. The Blender Armature System includes Inverse Kinematics. For general armatures there are many possible solutions to the IK problem.
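
Returning to the Index of Refraction entry above, the bending is governed by Snell's law, n1·sin(θ1) = n2·sin(θ2). A small worked example in Python (an illustration only, not part of Blender):
import math

def refraction_angle(theta_in_degrees, n1, n2):
    # Snell's law: n1 * sin(theta1) = n2 * sin(theta2)
    s = n1 * math.sin(math.radians(theta_in_degrees)) / n2
    return math.degrees(math.asin(s))

# a ray entering water (IOR about 1.33) from air (IOR about 1.0) at 45 degrees
print(refraction_angle(45.0, 1.0, 1.33))   # roughly 32 degrees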

J

  • JPEG, an acronym for Joint Photographic Experts Group (pronounced jay-peg), is a commonly used standard method of lossy compression for photographic images. The file format which employs this compression is commonly also called JPEG; the most common file extensions for this format are .jpeg, .jfif, .jpg, .JPG, or .JPE, although .jpg is the most common on all platforms.

K

  • Keyframe is a frame in an animated sequence of frames that was drawn or otherwise constructed directly by the user. When all frames were drawn by animators, the senior artist would draw these frames, leaving the "in between" frames to an apprentice. Now, the animator creates only the first and last frames of a simple sequence; the computer fills in the gap. This is called tweening.
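
In the simplest (linear) case, tweening between two keyed values is just interpolation. A minimal Python sketch (illustrative only, not Blender's animation code):
def tween(value_a, value_b, t):
    # t runs from 0.0 (at the first keyframe) to 1.0 (at the second keyframe)
    return value_a + (value_b - value_a) * t

# an object keyed at x = 1.0 on frame 1 and x = 5.0 on frame 11,
# evaluated halfway in between (frame 6)
print(tween(1.0, 5.0, 0.5))   # 3.0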

L

  • Luminosity (more properly called luminance) is the density of luminous intensity in a given direction. In astronomy, luminosity is the amount of energy a body radiates per unit time. It is typically expressed in watts (SI units), in ergs per second (cgs units), or in terms of solar luminosities, Ls; that is, how many times more energy the object radiates than the Sun, whose luminosity is 3.827×10²⁶ W.

M

  • Motion Blur is the simulation of the phenomenon that occurs when we perceive a rapidly moving object. The object appears to be blurred because of our persistence of vision. Doing motion blur makes computer animation appear more realistic. It can be thought of as adding back some of the time dependence expressed in the Rendering Equation.

N

  • Nabla. Ton wrote: Almost all procedural textures in Blender use derivatives for calculating normals for texture mapping (with the exception of "Blend" and "Magic"). The texture normal, the derivative, is calculated by using four samples in the texture formula:
# sample the texture at the shading point and at a small offset along each axis
s0 = texture(x, y, z)
s1 = texture(x + nabla, y, z)
s2 = texture(x, y + nabla, z)
s3 = texture(x, y, z + nabla)
# the differences approximate the derivative of the texture in each direction,
# which is used as the texture normal
normal[0] = s0 - s1
normal[1] = s0 - s2
normal[2] = s0 - s3

Up to now, the "nabla" offset was a constant (0.025), which worked fine in most cases but doesn't give proper control over the way a texture is sampled - for example, to make the effect smoother or sharper. This feature is especially useful in combination with the ColorBand feature.

  • Non-Linear Animation (NLA) allows the animator to edit motions as a whole, not just the individual keys. Nonlinear animation is not just about editing and manipulating groups of keyframes, but it also allows you to combine, mix, and blend motions to create entirely new animations.
  • Normal (Surface Normal) to a flat surface is a three-dimensional vector which is perpendicular to that surface. A normal to a non-flat surface at a point p on the surface is a vector which is perpendicular to the tangent plane to that surface at p.
  • Normal Mapping is similar to Bump Mapping, but instead of the image being a greyscale heightmap, the colours define in which direction the normal should be shifted, the 3 colour channels being mapped to the 3 directions X, Y and Z. This allows more detail and control over the effect.
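
A common convention (assumed here for illustration; this is not code from Blender) maps each colour channel from the 0–1 range to a normal component in the -1 to 1 range:
def decode_normal(r, g, b):
    # map each 0..1 colour channel to a -1..1 normal component, then renormalise
    nx, ny, nz = 2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0
    length = (nx * nx + ny * ny + nz * nz) ** 0.5
    return (nx / length, ny / length, nz / length)

# the typical pale blue of a flat area, (0.5, 0.5, 1.0), decodes to straight up
print(decode_normal(0.5, 0.5, 1.0))   # (0.0, 0.0, 1.0)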

O

  • Orange is the first Blender open movie project.
  • Oversampling (OSA), also called Anti-Aliasing, is the technique of minimizing aliasing when representing a high-resolution signal at a lower resolution. In most cases, anti-aliasing means removing signal components whose frequency is too high to be represented. When such components are left in a signal, they cause unpredictable artifacts.

P

  • Phong shading is a term used indiscriminately to describe both an illumination model and an interpolation method in 3D computer graphics. Phong reflection is a local illumination model that can produce a certain degree of realism in three-dimensional objects by combining three elements - diffuse, specular and ambient - for each considered point on a surface. It makes several assumptions: all lights are points, only surface geometry is considered, diffuse and specular reflection are modelled only locally, the specular colour is the same as the light colour, and the ambient term is a global constant. (A small sketch of the reflection model is given at the end of this section.)
  • Point Light is a light that has a specific location and radiates equally out in all directions. Examples of point lights would be candles or bare lightbulbs. Surfaces close to the point light are brighter than those which are far away. Point lights have attenuation, which controls how quickly the light intensity drops off as you move away from it. Lights with high attenuation are very localized, while lights with low attenuation will spread farther.
  • Polygonization (of meta-surfaces) is the process of approximating the meta-surface via polygons so it can be displayed/rendered in Blender.
  • Purple runs as a normal Verse client. It implements a node database to mirror the contents of its host. It loads the plug-ins, which reside in libraries, from local disk as DLLs or shared objects depending on the platform.
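
Returning to the Phong shading entry above, the three components combine roughly as follows (a sketch of the classic reflection model with made-up parameter names, normalised vectors assumed; not Blender's exact shader):
def phong(normal, light_dir, view_dir, ka, kd, ks, shininess):
    # ambient term: a constant minimum amount of light
    ambient = ka
    # diffuse term: brighter the more directly the surface faces the light
    n_dot_l = max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)
    diffuse = kd * n_dot_l
    # specular term: a highlight where the reflected light lines up with the viewer
    reflected = [2.0 * n_dot_l * n - l for n, l in zip(normal, light_dir)]
    r_dot_v = max(sum(r * v for r, v in zip(reflected, view_dir)), 0.0)
    specular = ks * r_dot_v ** shininess
    return ambient + diffuse + specular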

Q

  • Quaternion is a representation of 3D rotations with four numbers. It can be interpreted as an extension of complex numbers to 3D. The interpretation of the four numbers is not very intuitive for a human, but the numerical advantage of quaternions is that it is the smallest mathematical representation that does not suffer from the gimbal lock singularity problem. This problem occurs for example with Euler angle representations, when a small change in the 3D orientation can give rise to a large change in Euler angles.
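
As an illustration of the four numbers (not code from Blender), a rotation by an angle about a unit axis can be stored as the quaternion (w, x, y, z) = (cos(angle/2), axis·sin(angle/2)):
import math

def axis_angle_to_quaternion(axis, angle_degrees):
    # axis is assumed to be a unit vector; the result is (w, x, y, z)
    half = math.radians(angle_degrees) / 2.0
    s = math.sin(half)
    return (math.cos(half), axis[0] * s, axis[1] * s, axis[2] * s)

# a 90-degree rotation about the Z axis
print(axis_angle_to_quaternion((0.0, 0.0, 1.0), 90.0))   # roughly (0.707, 0.0, 0.0, 0.707)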

R

  • Radiosity is a more accurate but also more process-intensive technique than raytracing for calculating patterns of light and shadow when rendering graphics images from three-dimensional models. It is one of the many different tools which can simulate diffuse lighting in Blender.
  • Raytracing works by tracing the path taken by a ray of light through the scene, and calculating reflection, refraction, or absorption of the ray whenever it intersects an object in the world. More accurate than Scanline, but much slower.
  • Refraction, in geometric optics, is the change in direction of a wave due to a change in velocity. It happens when waves travel from a medium with a given refractive index to a medium with a different one. At the boundary between the media the wave changes direction; its wavelength increases or decreases but its frequency remains constant. For example, a light ray will refract as it enters and leaves glass.
  • Relative Vertex Keys (RVK) are part of a keyframe animation system that operates on vertex level objects. Each (shape) key is stored as a morph target such that several keys may be blended together to achieve complex mesh animation. With RVK you can create facial expressions, speech, and other detailed animated keyframed movements within your mesh-based models. (A small blending sketch is given at the end of this section.)
  • Render: to generate actual viewable images from a 3D model or scene. This may or may not need to happen in real time; for example, an interactive game requires real-time rendering, whereas a feature film does not. These situations require very different rendering techniques.
  • Rig: the controls that aid in the manipulation of a digital character.
  • Rigging is the process by which a person creates constraints and relationships between objects that will generate controls to aid in the manipulation of a digital character.
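
Returning to the Relative Vertex Keys entry above, blending morph targets can be sketched as adding weighted offsets to the base shape (an illustration with made-up names, not Blender's implementation; coordinates are flattened into one list for brevity):
def blend_shape_keys(base, keys, weights):
    # each key stores the same number of coordinates as the base shape;
    # the result is the base plus the weighted difference of every key
    result = list(base)
    for key, weight in zip(keys, weights):
        for i, (b, k) in enumerate(zip(base, key)):
            result[i] += weight * (k - b)
    return result

# one coordinate, two shape keys blended at 50% and 25% influence
print(blend_shape_keys([0.0], [[2.0], [4.0]], [0.5, 0.25]))   # [2.0]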

S

  • Scanline is one row of pixels in the final render. It is also the term for one of the rendering methods Blender can use: much faster than Raytracing, but allowing fewer effects, such as reflections, refractions, motion blur and focal blur.
  • Seed: a starting number for generating a random-looking number sequence. Using the same seed will always give you the same sequence. Technically, such a sequence is not truly random, it is only pseudorandom.
  • Shader: an algorithm for computing the appearance of a given material based on the colour, angle and intensity of the light. Specular shaders produce a more shiny, mirrorlike finish, while diffuse shaders give a duller surface appearance. There are also toon shaders, which are deliberately designed to produce an effect more akin to a cartoon drawing, with delineated object borders and less gradation of colours over a surface.
  • Shadows: simulated lights don't normally cast shadows. They also pass through solid objects, so a light inside a closed box would actually illuminate things outside the box as if the box were transparent. The shading on objects is calculated based only on the angle of the surface.
  • Specular Light refers to the highlights on reflective objects, like diamonds, billiard balls, and eyes. Specular highlights often appear as bright spots on a surface, at a point where the light source hits it directly. Ambient, Diffuse, and Specular are called the three components of a light source. Each one is given a color; added together, these create the final color of the light. For most lights, the main overall color of the light is defined by the Diffuse color: sunlight or lightbulbs would be white, moonlight a darker blue, and a candle yellow. You can use the ambient color to adjust the overall color range of the light source; for example, you can give the shadows a slight tint by making the diffuse component yellow and the ambient slightly blue. In many lights, the ambient color is left at black, meaning that it won't have any effect. Specular components are often left at white, but you can make them different colors to get interesting effects. Most of the time you can leave the specular and ambient settings of a light alone; just be aware that you set the light's overall color by setting its diffuse color. The final color that an object appears to be is a combination of the light hitting it and the color of the surface.
  • Spotlight is a light with both location and direction. A spotlight sends out a cone of light defined by the spotlight angle, and illuminates only objects within that cone. Spotlights also have attenuation, as well as a parameter that controls whether the spot of light is sharply defined or has smooth edges. The light types described here (ambient, directional, point and spot) are listed in order of computational complexity; the more lights you have, the more work the computer has to do. Generally it's a good idea to use directional lights whenever possible, since they're the cheapest, and to use point lights and spotlights sparingly.
  • Stucci is one of the classes of blender textures. Stucci is not an English word, but is used in Blender as the plural of stucco.
  • Subdivision Surface (Subsurf) is the tool which subdivides your model at render-time, without affecting your mesh at design-time. There are two subsurf algorithms in Blender to choose from - Simple Subdiv, which doesn't affect the shape of your mesh, and is used to add detail to displacement mapping or render-time radiosity, both of which operate on a per-vertex basis. The other is Catmull-Clark, a common subdivision algorithm which smooths out curves, and allows you to make complicated smooth surfaces (e.g. people, plants, etc.) with very few faces. However this algorithm can sometimes (read: often) have strange results with meshes containing triangles or vertices with many edges ("poles"), unless it is correctly handled.
  • Sub Surface Scattering (SSS) is a mechanism of light transport in which light penetrates the surface of a translucent object, is scattered by interacting with the material, and exits the surface at a different point. All non-metallic materials are translucent to some degree. In particular, materials such as marble, skin, and milk are extremely difficult to simulate realistically without taking subsurface scattering into account.

T

  • Tuhopuu is an experimental version of Blender that works like a code playground: developers can put their new code in there to be tested and played with by users before it gets put into the official Blender. Tuhopuu is Finnish for "Tree of destruction".
  • Tweening is short for in-betweening, the process of generating intermediate frames between two images to give the appearance that the first image evolves smoothly into the second image. Tweening is a key process in all types of animation, including computer animation. Sophisticated animation software enables one to identify specific objects in an image and define how they should move and change during the tweening process. Another word for tweening is interpolation.

U

  • UV Mapping (UV) This refers to the process of re-parameterizing a 3d object with dimensions x, y and z into a 2d plane with coordinates u and v. Most texturing requires this step because it tells the program HOW to apply a 2d image map onto a 3d object. If all your textures must be 2d and flat, the easiest way to determine what pixel goes where is if your model is flattened and made 2d. It also establishes a relationship between a 2d image and the mesh such that if the mesh deforms, the image map will deform along with it. Think of it as skinning a cat and pinning its hide onto cardboard to facilitate painting it!
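
As a minimal illustration of what the u and v coordinates do (made-up code, not Blender's), sampling a 2D image at a given (u, v) simply converts the 0–1 coordinates into pixel positions:
def sample_texture(image, u, v):
    # u and v run from 0.0 to 1.0 across the image, whatever its size in pixels
    width, height = len(image[0]), len(image)
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return image[y][x]

checker = [[0, 1],
           [1, 0]]
print(sample_texture(checker, 0.75, 0.25))   # 1 (the bright square in that quarter)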

V

  • Verse is a network protocol that lets multiple applications act together as one large application by sharing data over a network. If one application makes a change to shared data, the change is distributed instantly to all the other interested clients.

W

  • WC is Weekend Challenge.
  • WIP is Work In Progress.

X

Y

  • Yet Another Free Raytracer (YafRay) is an open source ray tracing program that uses an XML scene description language. It has been integrated into, and is often used to render scenes made in, Blender.

Z