
Page 1:

CS-378: Game Technology

Lecture #7: More Mapping

Prof. Okan Arikan, University of Texas, Austin

Thanks to James O’Brien, Steve Chenney, Zoran Popovic, Jessica Hodgins (V2005-08-1.1)

Page 2:

Today

More on mapping

Environment mapping

Light mapping

Bump mapping

Buffers

Page 3:

Environment Mapping

Environment mapping produces reflections on shiny objects

Texture is transferred in the direction of the reflected ray from the environment map onto the object

Uses ray with same direction but starting at the object center

Map contains a view of the world as seen from the center of the object

[Figure: the viewer, the object, the reflected ray at the surface, and the lookup ray into the environment map]
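A minimal sketch (not from the slides) of that lookup: reflect the direction toward the viewer about the surface normal, and use the resulting direction, ignoring where the surface point actually is, to index the map.

#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* v: unit vector from the surface point toward the viewer, n: unit normal.
   The returned direction (treated as starting at the object center) is the
   environment-map lookup ray. */
static Vec3 reflect_dir(Vec3 v, Vec3 n) {
    float d = 2.0f * dot3(n, v);
    Vec3 r = { d*n.x - v.x, d*n.y - v.y, d*n.z - v.z };
    return r;
}

int main(void) {
    Vec3 n = { 0.0f, 1.0f, 0.0f };
    Vec3 v = { 0.0f, 0.7071f, 0.7071f };
    Vec3 r = reflect_dir(v, n);
    printf("lookup ray: %f %f %f\n", r.x, r.y, r.z);
    return 0;
}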

Page 4:

Environment Mapping

www.debevec.org

Need For Speed Underground

Far Cry

Page 5:

Lat/Long Mapping

The original algorithm (1976) placed the map on a sphere centered on the object

Mapping functions assume that s,t texture coordinates equate to latitude and longitude on the sphere:

What is bad about this method?

Sampling

Map generation

Complex texture coordinate computations

s = (1/2) (1 + (1/π) tan⁻¹(Rx / Rz)),   t = (1/2) (1 + Ry)

where R = (Rx, Ry, Rz) is the lookup ray direction

Page 6:

Cube Mapping

Put the object at the center of a cube

Represent the environment on the cube faces

Assumptions ?

Hardware supported

[Figure: the view ray reflects off the object and the reflection ray indexes a face of the surrounding cube]
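One illustrative way to index a cube map in software (sign and axis conventions vary between APIs, so treat this as a sketch rather than the exact hardware rule): the dominant component of the reflection ray picks the face, and the other two components, divided by it, give in-face coordinates in [-1, 1].

#include <math.h>

typedef enum { POS_X, NEG_X, POS_Y, NEG_Y, POS_Z, NEG_Z } CubeFace;

/* Assumes a non-zero reflection ray (rx, ry, rz). */
CubeFace cube_face(float rx, float ry, float rz, float *s, float *t) {
    float ax = fabsf(rx), ay = fabsf(ry), az = fabsf(rz);
    if (ax >= ay && ax >= az) {            /* X face dominates */
        *s = rz / ax;  *t = ry / ax;
        return rx > 0.0f ? POS_X : NEG_X;
    }
    if (ay >= az) {                        /* Y face dominates */
        *s = rx / ay;  *t = rz / ay;
        return ry > 0.0f ? POS_Y : NEG_Y;
    }
    *s = rx / az;  *t = ry / az;           /* Z face dominates */
    return rz > 0.0f ? POS_Z : NEG_Z;
}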

Page 7:

Sphere Mapping

Again the map lives on a sphere, but now the coordinate mapping is simplified

To generate the map:

Take a map point (s,t), cast a ray onto a sphere in the -Z direction, and record what is reflected

Equivalent to photographing a reflective sphere with an orthographic camera (long lens, big distance)

Again, makes the method suitable for film special effects

Page 8:

A Sphere Map

Page 9:

Indexing Sphere Maps

Given the reflection vector:

Implemented in hardware

Problems:

Highly non-uniform sampling

Highly non-linear mapping

m = 2 √( Rx² + Ry² + (Rz + 1)² )

s = Rx / m + 1/2,   t = Ry / m + 1/2
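The same indexing as code, assuming R = (Rx, Ry, Rz) is the normalized eye-space reflection vector (a sketch, not hardware-exact):

#include <math.h>

void sphere_map_coords(float rx, float ry, float rz, float *s, float *t) {
    float m = 2.0f * sqrtf(rx*rx + ry*ry + (rz + 1.0f)*(rz + 1.0f));
    *s = rx / m + 0.5f;    /* both coordinates end up in [0, 1] */
    *t = ry / m + 0.5f;
}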

Page 10:

Non-uniform Sampling

Page 11:

Non-linear Mapping

Linear interpolation of per-vertex texture coordinates picks up the wrong texture pixels

Use small polygons!

Correct vs. linear interpolation

Page 12:

Example

Page 13:

Other Env. Map Tricks

Partially reflective objects

First stage applies the color texture

Second stage does environment mapping using alpha blend with existing color

Just put the lights in the environment map

What does this simulate?

Recursive reflections

Bad cases for environment maps?

Page 14:

Light Maps

Speed up lighting calculations by pre-computing lighting and storing it in maps

Allows complex illumination models to be used in generating the map (e.g. shadows, radiosity)

Used in complex rendering algorithms (Radiance), not just games

Issues:

How is the mapping determined?

How are the maps generated?

How are they applied at run-time?

Page 15:

Example

www.flipcode.com

Call of Duty

Page 16:

Choosing a Mapping

Problem: In a preprocessing phase, points on polygons must be associated with points in maps

One solution:

Find groups of polygons that are “near” co-planar and do not overlap when projected onto a plane

Result is a mapping from polygons to planes

Combine sections of the chosen planes into larger maps

Store texture coordinates at polygon vertices

Lighting tends to change quite slowly (except when?), so the map resolution can be poor

Page 17:

Generating the Map

Problem: What value should go in each pixel of the light map?

Solution:

Map texture pixels back into world space (using the inverse of the texture mapping)

Take the illumination of the polygon and put it in the pixel

Advantages of this approach:

Choosing “good” planes means that texture pixels map to roughly square pieces of polygon - good sampling

Not too many maps are required, and not much memory is wasted
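A rough sketch of that baking loop; texel_to_world() and compute_illumination() are hypothetical helpers standing in for the inverse texture mapping and the offline lighting model:

/* texel_to_world() and compute_illumination() are hypothetical,
   not part of the slides or any particular engine. */
for (int t = 0; t < map_height; ++t) {
    for (int s = 0; s < map_width; ++s) {
        Vec3  p = texel_to_world(polygon, s, t);            /* world-space point on the polygon */
        Color c = compute_illumination(p, polygon_normal);  /* shadows, radiosity, ...          */
        light_map[t * map_width + s] = c;
    }
}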

Page 18:

Example

Nearest interpolation vs. linear interpolation

What type of lighting (diffuse, specular, reflections) can the map store?

Page 19:

Example

Without light maps vs. with light maps

Page 20:

Applying Light Maps

Use multi-texturing hardware

First stage: Apply color texture map

Second stage: Modulate with light map

Actually, make points darker with light map

DirectX allows you to make points brighter with texture

Pre-lighting textures:

Apply the light map to the texture maps as a pre-process

Why is this less appealing?

Multi-stage rendering:

Same effect as multi-texturing, but modulating in the frame buffer
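A minimal fixed-function OpenGL sketch of the two stages described above; color_texture and light_map_texture are assumed texture handles:

glActiveTexture(GL_TEXTURE0);                  /* stage 1: color texture */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, color_texture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

glActiveTexture(GL_TEXTURE1);                  /* stage 2: light map */
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, light_map_texture);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);  /* darken by the map */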

Page 21:

Dynamic Light Maps

Light maps are a preprocessing step, so they can only capture static lighting

Texture transformations allow some effects

What is required to recompute a light map at run-time?

How might we make this tractable?

Spatial subdivision algorithms allow us to identify nearby objects, which helps with this process

Compute a separate, dynamic light map at runtime using same mapping as static light map

Add additional texture pass to apply the dynamic map

Page 22:

Fog Maps

Dynamic modification of light-maps

Put fog objects into the scene

Compute where they intersect with geometry and paint the fog density into a dynamic light map

Use same mapping as static light map uses

Apply the fog map as with a light map

Extra texture stage

Page 23:

Fog Map Example

Page 24:

Bump Mapping

Bump mapping modifies the surface normal vector according to information in the map

Light dependent: the appearance of the surface depends on the lighting direction

View dependent: the effect of the bumps may depend on which direction the surface is viewed from

Bump mapping can be implemented with multi-texturing, multi-pass rendering, or pixel shaders

Page 25:

Storing the Bump Map

Several options for what to store in the map

The normal vector to use

An offset to the default normal vector

Data derived from the normal vector

Illumination changes for a fixed view

Page 26:

Embossing

Apply height field as a modulating texture map

First application: apply it in place

Second application: shift it by an amount that depends on the light direction, and subtract it
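A per-pixel sketch of the emboss term; height() is a hypothetical bump-map lookup, (lu, lv) is the light direction projected into the surface's texture space, and the one-texel shift and 0.5 bias are illustrative choices:

float emboss_term(float u, float v, float lu, float lv) {
    const float shift = 1.0f / 256.0f;                  /* roughly one texel        */
    float h0 = height(u, v);                            /* first application        */
    float h1 = height(u + lu * shift, v + lv * shift);  /* shifted, then subtracted */
    return 0.5f + (h0 - h1);   /* modulates the base color */
}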

Page 27:

Dot Product Bump Mapping

Store normal vectors in the bump map

Specify light directions instead of colors at the vertices

Apply the bump map using the dot3 operator

Takes a dot product

Lots of details:

Light directions must be normalized – can be done with a cubic environment map

How do you get the color in?

How do you do specular highlights?
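One way to set up the dot3 stage with the fixed-function combiner (OpenGL 1.3 / ARB_texture_env_dot3), as a sketch; normal_map is an assumed texture handle holding range-compressed normals, and the per-vertex primary color carries the range-compressed light direction:

glBindTexture(GL_TEXTURE_2D, normal_map);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB,  GL_DOT3_RGB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB,  GL_TEXTURE);        /* normals from the map       */
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB,  GL_PRIMARY_COLOR);  /* light direction per vertex */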

Page 28:

Dot Product Results

www.nvidia.com

Page 29:

Normal Mapping

DOOM 3

James Hastings-Trew

Page 30:

Environment Bump Mapping

Perturb the environment map lookup directions with the bump map

Far Cry

Nvidia

Page 31:

Multi-Pass Rendering

The pipeline takes one triangle at a time, so only local information and pre-computed maps are available

Multi-pass techniques render the scene, or parts of the scene, multiple times

Makes use of auxiliary buffers to hold information

Makes use of tests and logical operations on values in the buffers

Really, a set of functionality that can be used to achieve a wide range of effects

Mirrors, shadows, bump-maps, anti-aliasing, compositing, …

Page 32:

Buffers

Buffers allow you to store global information about the rendered scene

Like scratch work space, or extra screen memory

They are only cleared when you say so

This functionality is fundamentally different from that of vertex or pixel shaders

Buffers are defined by:

The type of values they store

The logical operations that they influence

The way they are accessed (written and read)

Page 33:

OpenGL Buffers

Color buffers: Store RGBA color information for each pixel

OpenGL actually defines four or more color buffers: front/back (double buffering), left/right (stereo) and auxiliary color buffers

Depth buffer: Stores depth information for each pixel

Stencil buffer: Stores some number of bits for each pixel

Accumulation buffer: Like a color buffer, but with higher resolution and different operations

Page 34:

Fragment Tests

A fragment is a pixel-sized piece of shaded polygon, with color and depth information

After pixel shaders and/or texturing

The tests and operations performed with the fragment on its way to the color buffer are essential to understanding multi-pass techniques

Most important are, in order:

Alpha test

Stencil test

Depth test

Blending

Tests must be explicitly enabled

As the fragment passes through, some of the buffers may also have values stored into them

Page 35:

Alpha Test

The alpha test either allows a fragment to pass, or stops it, depending on the outcome of a test:

Here, fragment is the fragment’s alpha value, and reference is a reference alpha value that you specify

op is one of: <, <=, =, !=, >, >=

There are also the special tests: Always and Never

Always let the fragment through or never let it through

What is a sensible default?

if ( fragment op reference )
    pass fragment on
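In legacy OpenGL this is two calls; GL_GREATER with a 0.5 reference is the kind of setting used for billboards later on:

glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);   /* pass only if fragment alpha > 0.5 */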

Page 36:

Billboards

Billboards are texture-mapped polygons, typically used for things like trees

Image-based rendering method where complex geometry (the tree) is replaced with an image placed in the scene (the textured polygon)

The texture has alpha values associated with it: 1 where the tree is, and 0 where it isn’t

So you can see through the polygon in places where the tree isn’t

Page 37:

Alpha Test and Billboards

You can use texture blending to make the polygon see-through, but there is a big problem

What happens if you draw the billboard and then draw something behind it?

Hint: Think about the depth buffer values

This is one reason why transparent objects must be rendered back to front

The best way to draw billboards is with an alpha test: Do not let alpha < 0.5 pass through

The depth buffer is never written for fragments that are see-through

Doesn’t work for partially transparent polygons - more later

Page 38:

Stencil Buffer

The stencil buffer acts like a paint stencil - it lets some fragments through but not others

It stores multi-bit values – you have some control over the number of bits

You specify two things:

The test that controls which fragments get through

The operations to perform on the buffer when the test passes or fails

All tests/operations look at the value in the stencil buffer that corresponds to the pixel location of the fragment

Typical usage: One rendering pass sets values in the stencil, which control how various parts of the screen are drawn in the second pass

Page 39:

Stencil Tests

You give an operation, a reference value, and a mask

Operations:

Always let the fragment through

Never let the fragment through

Logical operations between the reference value and the value in the buffer: <, <=, =, !=, >, >=

The mask is used to select particular bit-planes for the operation

( reference & mask ) op ( buffer & mask )

Page 40:

Stencil OperationsSpecify three different operations

If the stencil test fails

If the stencil passes but the depth test fails

If the stencil passes and the depth test passes

Operations are:

Keep the current stencil value

Zero the stencil

Replace the stencil with the reference value

Increment the stencil

Decrement the stencil

Invert the stencil (bitwise)
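In OpenGL the test and the three operations map onto glStencilFunc and glStencilOp; a sketch of the typical two-pass usage mentioned above:

/* Pass 1: write 1 into the stencil wherever the masking geometry lands. */
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_ALWAYS, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
/* ... draw the masking geometry ... */

/* Pass 2: draw only where the stencil equals 1, leaving the buffer unchanged. */
glStencilFunc(GL_EQUAL, 1, 0xFF);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
/* ... draw the second pass ... */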

Page 41:

Depth Test and Operation

Depth test compares the depth of the fragment and the depth in the buffer

Depth increases with greater distance from viewer

Tests are: Always, Never, <, <=, =, !=, >, >=

Depth operation is to write the fragment's depth to the buffer, or to leave the buffer unchanged

Why do the test but leave the buffer unchanged?

Each buffer stores different information about the pixel, so a test on one buffer may be useful in managing another
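In OpenGL the comparison and the write are controlled separately, which is what makes "test but don't update" possible:

glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);   /* the comparison                         */
glDepthMask(GL_FALSE);    /* do the test, but don't write new depth */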

Page 42:

Copy to Texture

You can copy the framebuffer contents to a texture

Very powerful

Why?
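One legacy-OpenGL route, for example to feed a rendered view back in as a texture; scratch_texture, width, and height are assumed to exist:

glBindTexture(GL_TEXTURE_2D, scratch_texture);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0,    /* target, mip level                */
                    0, 0,                /* offset within the texture        */
                    0, 0, width, height  /* source rectangle in framebuffer  */);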

Page 43:

Multi-Pass Algorithms

Designing a multi-pass algorithm is a non-trivial task

At least one person I know of has received a PhD for developing such algorithms

References for multi-pass algorithms:

Real-Time Rendering has them indexed by problem

The OpenGL Programming Guide discusses many multi-pass techniques in a reasonably understandable manner

Game Programming Gems has some

Watt and Policarpo have others

Several have been published as academic papers

Page 44:

Multi-Pass Examples

Transparent objects

Page 45:

Reading

Core Techniques & Algorithms in Game Programming

Chapter 18, pages 565-600