TRANSCRIPT
CS-378: Game Technology
Lecture #7: More Mapping
Prof. Okan Arikan, University of Texas, Austin
Thanks to James O’Brien, Steve Chenney, Zoran Popovic, Jessica Hodgins. V2005-08-1.1
Today
More on mapping
Environment mapping
Light mapping
Bump mapping
Buffers
Environment Mapping
Environment mapping produces reflections on shiny objects
Texture is transferred in the direction of the reflected ray from the environment map onto the object
Uses a ray with the same direction, but starting at the object center
Map contains a view of the world as seen from the center of the object
[Figure: the viewer’s reflected ray off the object is re-cast as a lookup ray from the object center into the environment map]
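The lookup direction is just the view ray reflected about the surface normal. A minimal sketch of that step (the helper name `reflect` is my own, not from the lecture):

```python
import math

def reflect(incident, normal):
    """Reflect an incident direction about a unit surface normal:
    R = I - 2 (N . I) N."""
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

# A view ray looking straight down the normal bounces straight back:
# reflect((0, 0, -1), (0, 0, 1)) -> (0.0, 0.0, 1.0)
```

The resulting direction R is what indexes the environment map; the approximation is that R is treated as if it started at the object center.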
Environment Mapping
www.debevec.org
Need For Speed Underground
Far Cry
Lat/Long Mapping
The original algorithm (1976) placed the map on a sphere centered on the object
Mapping functions assume that s,t texture coordinates equate to latitude and longitude on the sphere:
What is bad about this method?
Sampling
Map generation
Complex texture coordinate computations
s = (1/2) (1 + (1/π) tan⁻¹(R_y / R_x)),  t = (1/π) cos⁻¹(R_z)
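The texture-coordinate computation can be sketched as follows (a reconstruction of the 1976-style mapping; the function name and the use of `atan2` to handle the quadrant are my own choices):

```python
import math

def latlong_st(r):
    """Map a unit reflection vector to lat/long texture coordinates.
    Longitude comes from atan2(Ry, Rx), latitude from acos(Rz);
    both are normalized into [0, 1]."""
    rx, ry, rz = r
    s = 0.5 * (1.0 + math.atan2(ry, rx) / math.pi)
    t = math.acos(max(-1.0, min(1.0, rz))) / math.pi
    return s, t
```

Note the inverse-trig calls per lookup: this is the "complex texture coordinate computations" drawback, on top of the highly non-uniform sampling near the poles.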
Cube Mapping
Put the object at the center of a cube
Represent the environment on the cube faces
Assumptions?
Hardware supported
[Figure: view ray reflecting off the object into a face of the surrounding cube]
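The hardware indexes a cube map by the dominant axis of the reflection direction. A sketch of that face selection (function name illustrative; face labels follow the usual ±x/±y/±z convention):

```python
def cube_face(r):
    """Pick the cube face hit by direction r: the axis with the
    largest absolute component, signed by that component.  The
    remaining two components, divided by the dominant one, become
    the 2D coordinates within that face."""
    rx, ry, rz = r
    ax, ay, az = abs(rx), abs(ry), abs(rz)
    if ax >= ay and ax >= az:
        return '+x' if rx > 0 else '-x'
    if ay >= az:
        return '+y' if ry > 0 else '-y'
    return '+z' if rz > 0 else '-z'
```

Unlike the lat/long map, this needs only comparisons and one divide, which is why it maps so well to hardware.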
Sphere Mapping
Again the map lives on a sphere, but now the coordinate mapping is simplified
To generate the map:
Take a map point (s,t), cast a ray onto a sphere in the -Z direction, and record what is reflected
Equivalent to photographing a reflective sphere with an orthographic camera (long lens, big distance)
Again, makes the method suitable for film special effects
A Sphere Map
Indexing Sphere Maps
Given the reflection vector:
Implemented in hardware
Problems:
Highly non-uniform sampling
Highly non-linear mapping
m = 2 √(R_x² + R_y² + (R_z + 1)²)
s = R_x / m + 1/2
t = R_y / m + 1/2
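The indexing computation, written out directly (this follows the standard OpenGL sphere-map formula; the function name is my own):

```python
import math

def sphere_map_st(r):
    """OpenGL-style sphere-map indexing:
    m = 2*sqrt(Rx^2 + Ry^2 + (Rz + 1)^2)
    s = Rx/m + 1/2,  t = Ry/m + 1/2."""
    rx, ry, rz = r
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5

# A mirror reflection straight back at the viewer indexes the
# center of the map: sphere_map_st((0, 0, 1)) -> (0.5, 0.5)
```

Directions near R = (0, 0, -1) all crowd toward the map’s outer rim, which is the non-uniform sampling and non-linear mapping complained about next.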
Non-uniform Sampling
Non-linear Mapping
Linear interpolation of per-vertex texture coordinates picks up the wrong texture pixels
Use small polygons!
[Figure: correct lookup vs. linear interpolation of texture coordinates]
Example
Other Env. Map Tricks
Partially reflective objects
First stage applied color texture
Second stage does environment mapping using alpha blend with existing color
Just put the lights in the environment map
What does this simulate?
Recursive reflections
Bad cases for environment maps?
Light Maps
Speed up lighting calculations by pre-computing lighting and storing it in maps
Allows complex illumination models to be used in generating the map (e.g. shadows, radiosity)
Used in complex rendering algorithms (Radiance), not just games
Issues:
How is the mapping determined?
How are the maps generated?
How are they applied at run-time?
Example
www.flipcode.com
Call of Duty
Choosing a Mapping
Problem: In a preprocessing phase, points on polygons must be associated with points in maps
One solution:
Find groups of polygons that are “near” co-planar and do not overlap when projected onto a plane
Result is a mapping from polygons to planes
Combine sections of the chosen planes into larger maps
Store texture coordinates at polygon vertices
Lighting tends to change quite slowly (except when?), so the map resolution can be poor
Generating the Map
Problem: What value should go in each pixel of the light map?
Solution:
Map texture pixels back into world space (using the inverse of the texture mapping)
Take the illumination of the polygon and put it in the pixel
Advantages of this approach:
Choosing “good” planes means that texture pixels map to roughly square pieces of polygon - good sampling
Not too many maps are required, and not much memory is wasted
Example
[Figure: nearest vs. linear interpolation of the light map]
What type of lighting (diffuse, specular, reflections) can the map store?
Example
[Figure: scene without vs. with light maps]
Applying Light Maps
Use multi-texturing hardware
First stage: Apply color texture map
Second stage: Modulate with light map
Actually, make points darker with light map
DirectX allows you to make points brighter with texture
Pre-lighting textures:
Apply the light map to the texture maps as a pre-process
Why is this less appealing?
Multi-stage rendering:
Same effect as multi-texturing, but modulating in the frame buffer
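The modulate stage is just a per-channel multiply of the color-texture texel by the light-map texel, which is why it can only darken when light-map values stay in [0, 1]. A sketch (function name illustrative):

```python
def apply_light_map(albedo, light):
    """Modulate a color-texture texel by a light-map texel:
    per-channel multiply, all values in [0, 1].  Since light <= 1,
    the result can only be darker than the albedo."""
    return tuple(a * l for a, l in zip(albedo, light))
```

Pre-lighting would bake this product into the color texture itself, which is less appealing because every differently-lit copy of a surface then needs its own (full-resolution) texture.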
Dynamic Light Maps
Light maps are a preprocessing step, so they can only capture static lighting
Texture transformations allow some effects
What is required to recompute a light map at run-time?
How might we make this tractable?
Spatial subdivision algorithms allow us to identify nearby objects, which helps with this process
Compute a separate, dynamic light map at runtime using same mapping as static light map
Add additional texture pass to apply the dynamic map
Fog Maps
Dynamic modification of light-maps
Put fog objects into the scene
Compute where they intersect with geometry and paint the fog density into a dynamic light map
Use same mapping as static light map uses
Apply the fog map as with a light map
Extra texture stage
Fog Map Example
Bump Mapping
Bump mapping modifies the surface normal vector according to information in the map
Light dependent: the appearance of the surface depends on the lighting direction
View dependent: the effect of the bumps may depend on which direction the surface is viewed from
Bump mapping can be implemented with multi-texturing, multi-pass rendering, or pixel shaders
Storing the Bump Map
Several options for what to store in the map
The normal vector to use
An offset to the default normal vector
Data derived from the normal vector
Illumination changes for a fixed view
Embossing
Apply height field as a modulating texture map
First application, apply it in place
Second application, shift it by amount that depends on the light direction, and subtract it
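The shift-and-subtract step can be sketched on a height field stored as a 2D array (a minimal version; sign convention and edge clamping are my own choices, and real embossing does this in texture hardware, not in a loop):

```python
def emboss(height, lx, ly):
    """Emboss a height field: subtract a copy shifted one texel
    against the light direction (lx, ly each in {-1, 0, 1}).
    The difference approximates the slope toward the light;
    edges are clamped."""
    rows, cols = len(height), len(height[0])
    out = [[0.0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            sx = min(max(x + lx, 0), cols - 1)
            sy = min(max(y + ly, 0), rows - 1)
            out[y][x] = height[y][x] - height[sy][sx]
    return out
```

A texel just before a rise, with the light coming from +x, comes out negative (in shadow); flat regions come out zero.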
Dot Product bump mapping
Store normal vectors in the bump map
Specify light directions instead of colors at the vertices
Apply the bump map using the dot3 operator
Takes a dot product
Lots of details:
Light directions must be normalized – can be done with a cubic environment map
How do you get the color in?
How do you do specular highlights?
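The dot3 stage treats both textures as encoded vectors: each [0, 1] color channel hides a [-1, 1] component. A sketch of the decode-and-dot step (function names are illustrative):

```python
def decode(c):
    """Undo the [0, 1] encoding used to store a signed unit
    vector in RGB: component = 2*channel - 1."""
    return tuple(2.0 * v - 1.0 for v in c)

def dot3(normal_rgb, light_rgb):
    """The dot3 combiner: decode the bump-map normal and the
    interpolated light direction, take their dot product, and
    clamp at zero for back-facing bumps."""
    n = decode(normal_rgb)
    l = decode(light_rgb)
    return max(0.0, sum(a * b for a, b in zip(n, l)))
```

A flat normal stored as RGB (0.5, 0.5, 1.0) lit head-on gives full intensity; the remaining questions on the slide (renormalizing L, bringing the color texture back in, specular) are what the extra stages and the cubic normalization map are for.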
Dot Product Results
www.nvidia.com
Normal Mapping
DOOM 3
James Hastings-Trew
Environment Bump Mapping
Perturb the environment map lookup directions with the bump map
Far Cry
Nvidia
Multi-Pass Rendering
The pipeline takes one triangle at a time, so only local information and pre-computed maps are available
Multi-pass techniques render the scene, or parts of the scene, multiple times
Makes use of auxiliary buffers to hold information
Make use of tests and logical operations on values in the buffers
Really, a set of functionality that can be used to achieve a wide range of effects
Mirrors, shadows, bump-maps, anti-aliasing, compositing, …
Buffers
Buffers allow you to store global information about the rendered scene
Like scratch work space, or extra screen memory
They are only cleared when you say so
This functionality is fundamentally different from that of vertex or pixel shaders
Buffers are defined by:
The type of values they store
The logical operations that they influence
The way they are accessed (written and read)
OpenGL Buffers
Color buffers: Store RGBA color information for each pixel
OpenGL actually defines four or more color buffers: front/back (double buffering), left/right (stereo) and auxiliary color buffers
Depth buffer: Stores depth information for each pixel
Stencil buffer: Stores some number of bits for each pixel
Accumulation buffer: Like a color buffer, but with higher resolution and different operations
Fragment Tests
A fragment is a pixel-sized piece of shaded polygon, with color and depth information
After pixel shaders and/or texturing
The tests and operations performed with the fragment on its way to the color buffer are essential to understanding multi-pass techniques
Most important are, in order:
Alpha test
Stencil test
Depth test
Blending
Tests must be explicitly enabled
As the fragment passes through, some of the buffers may also have values stored into them
Alpha Test
The alpha test either allows a fragment to pass, or stops it, depending on the outcome of a test:
Here, fragment is the fragment’s alpha value, and reference is a reference alpha value that you specify
op is one of: <, <=, =, !=, >, >=
There are also the special tests: Always and Never
Always let the fragment through or never let it through
What is a sensible default?
if ( fragment op reference )
    pass fragment on
Billboards
Billboards are texture-mapped polygons, typically used for things like trees
Image-based rendering method where complex geometry (the tree) is replaced with an image placed in the scene (the textured polygon)
The texture has alpha values associated with it: 1 where the tree is, and 0 where it isn’t
So you can see through the polygon in places where the tree isn’t
Alpha Test and Billboards
You can use texture blending to make the polygon see-through, but there is a big problem
What happens if you draw the billboard and then draw something behind it?
Hint: Think about the depth buffer values
This is one reason why transparent objects must be rendered back to front
The best way to draw billboards is with an alpha test: Do not let alpha < 0.5 pass through
Depth buffer is never set for fragments that are see through
Doesn’t work for partially transparent polygons - more later
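The key point is the ordering: the alpha test discards a transparent fragment *before* the depth test, so it never writes depth and later geometry can still show through. A sketch of that interaction (names and the 0.5 cutoff follow the slide; the depth convention of "smaller = nearer" is an assumption):

```python
ALPHA_CUTOFF = 0.5  # "do not let alpha < 0.5 pass through"

def billboard_fragment(frag_alpha, frag_depth, depth_buffer_value):
    """Process one billboard fragment.  Returns the new depth
    buffer value: unchanged if the fragment was discarded by the
    alpha test or hidden by the depth test, otherwise the
    fragment's depth."""
    if frag_alpha < ALPHA_CUTOFF:
        return depth_buffer_value      # discarded: depth never written
    if frag_depth < depth_buffer_value:
        return frag_depth              # visible: depth written
    return depth_buffer_value          # occluded
```

With plain alpha *blending* instead, the transparent fragment would still write its depth, and anything drawn later behind the billboard would be wrongly rejected.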
Stencil Buffer
The stencil buffer acts like a paint stencil - it lets some fragments through but not others
It stores multi-bit values – you have some control of #bits
You specify two things:
The test that controls which fragments get through
The operations to perform on the buffer when the test passes or fails
All tests/operation look at the value in the stencil that corresponds to the pixel location of the fragment
Typical usage: One rendering pass sets values in the stencil, which control how various parts of the screen are drawn in the second pass
Stencil Tests
You give an operation, a reference value, and a mask
Operations:
Always let the fragment through
Never let the fragment through
Logical operations between the reference value and the value in the buffer: <, <=, =, !=, >, >=
The mask is used to select particular bit-planes for the operation
(reference & mask ) op ( buffer & mask )
Stencil Operations
Specify three different operations
If the stencil test fails
If the stencil passes but the depth test fails
If the stencil passes and the depth test passes
Operations are:
Keep the current stencil value
Zero the stencil
Replace the stencil with the reference value
Increment the stencil
Decrement the stencil
Invert the stencil (bitwise)
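The test itself is a masked comparison, exactly as written on the previous slide: (reference & mask) op (buffer & mask). A sketch (function name illustrative; the operator symbols match the slide’s list):

```python
def stencil_test(ref, mask, buf, op):
    """OpenGL-style stencil test: compare (ref & mask) against
    (buf & mask) with the chosen operator.  'always'/'never'
    ignore the values entirely."""
    if op == 'always':
        return True
    if op == 'never':
        return False
    a, b = ref & mask, buf & mask
    return {'<': a < b, '<=': a <= b, '==': a == b,
            '!=': a != b, '>': a > b, '>=': a >= b}[op]
```

The mask selects bit-planes: with mask 0x01, reference 1 matches any buffer value whose low bit is set, which is how one pass can tag regions for the next.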
Depth Test and Operation
Depth test compares the depth of the fragment and the depth in the buffer
Depth increases with greater distance from viewer
Tests are: Always, Never, <, <=, =, !=, >, >=
Depth operation is to write the fragment’s depth to the buffer, or to leave the buffer unchanged
Why do the test but leave the buffer unchanged?
Each buffer stores different information about the pixel, so a test on one buffer may be useful in managing another
Copy to Texture
You can copy the framebuffer contents to a texture
Very powerful
Why ?
Multi-Pass Algorithms
Designing a multi-pass algorithm is a non-trivial task
At least one person I know of has received a PhD for developing such algorithms
References for multi-pass algorithms:
Real Time Rendering has them indexed by problem
The OpenGL Programming guide discusses many multi-pass techniques in a reasonably understandable manner
Game Programming Gems has some
The book by Watt and Policarpo has others
Several have been published as academic papers
Multipass examples
Transparent objects
Reading
Core Techniques & Algorithms in Game Programming
Chapter 18, pages 565-600