Dynamically Reparameterized Light Fields
Aaron Isaksen, Leonard McMillan (MIT), Steven Gortler (Harvard)
Siggraph 2000
Presented by Orion Sky Lawlor, cs497yzy, 2003/4/24


TRANSCRIPT

Page 1:

Dynamically Reparameterized Light Fields

Aaron Isaksen, Leonard McMillan (MIT), Steven Gortler (Harvard)

Siggraph 2000

Presented by Orion Sky Lawlor, cs497yzy, 2003/4/24

Page 2:

Introduction
Lightfield Acquisition
Image Reconstruction
Synthetic Aperture

Page 3:

Introduction

Rendering cool pictures is hard
Rendering them in realtime is even harder
(Partial) Solution: Image-based rendering
• Acquire or pre-render many images
• At display time, recombine existing images somehow
• Standard sampling problems: aliasing, acquisition, storage

Page 4:

Why use Image-based Rendering?

Captures arbitrarily complex material/light interactions
• Spatially varying glossy BRDF
• Global, volumetric, subsurface, ...
Display speed independent of scene complexity
• Excellent for natural scenes
Non-polygonal description avoids
• Difficulty doing sampling & LOD
• Cracks, watertight, manifold, ...

Page 5:

Why not use Image-based?

Must acquire images beforehand
Fixed scene & lighting
• Often only the camera can move
Predetermined sampling rate
• Undersampling, aliasing problems
Predetermined set of views
• Can’t look in certain directions!
Acquisition painful or expensive
Must store many, many images
• Yet access must be quick

Page 6:

How do Lightfields not Work?

At every point in space, take a picture (or environment map):
3D Space, 2D Images => 5D
Display is just image lookup!

Page 7:

Why don’t Lightfields work like that?

These images all contain duplicate rays, again and again

3D Space, 2D Images => 5D

Page 8:

How do Lightfields actually Work?

We can thus get away with just one layer of cameras:
2D Cameras, 2D Images => 4D Lightfield
(Figure: camera array and a reconstructed novel viewpoint)
Only assumption: rays are unchanged along their path
Display means interpolating several views
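As a rough sketch of what that last point boils down to: for each pixel of the novel view, build its outgoing ray and query the stored 4D lightfield. This is Python/NumPy with a hypothetical lightfield.lookup(origin, direction) helper (not from the paper) that returns the radiance interpolated from the nearest stored rays.

```python
import numpy as np

def render_novel_view(lightfield, eye, width, height, fov):
    """Render a novel viewpoint by querying the 4D lightfield once per pixel.

    `lightfield.lookup(origin, direction)` is a hypothetical helper that
    returns the radiance interpolated from the closest captured rays.
    """
    image = np.zeros((height, width, 3))
    focal = 0.5 * width / np.tan(0.5 * fov)   # pinhole focal length, in pixels
    for y in range(height):
        for x in range(width):
            # Outgoing ray for this pixel; the novel camera looks down +z.
            direction = np.array([x - 0.5 * width, y - 0.5 * height, focal])
            direction /= np.linalg.norm(direction)
            image[y, x] = lightfield.lookup(eye, direction)
    return image
```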

Page 9:

Camera Array Geometry

(Illustration: Isaksen, MIT)

Page 10:

Introduction
Lightfield Acquisition
Image Reconstruction
Synthetic Aperture

Page 11:

How do you make a Lightfield?

Synthetic scene
• Render from different viewpoints
Real scene
• Sample from different viewpoints
In either case, need
• Fairly dense sampling (lots of data, compression useful)
• Good antialiasing, over both the image plane (pixels) and the camera plane (apertures)
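For the synthetic case, acquisition is little more than sweeping a camera over a regular grid on the camera plane. A minimal sketch, assuming a hypothetical render_view(position) hook into the renderer (or motion-controlled camera rig) that returns an image shot from that position:

```python
import numpy as np

def capture_lightfield(render_view, n_s=16, n_t=16, spacing=0.05,
                       height=256, width=256):
    """Sample a 4D lightfield on a regular (s, t) grid of camera positions.

    `render_view(pos)` is a hypothetical hook that returns an HxWx3 image
    taken from `pos` on the camera plane (z = 0), with every camera sharing
    the same orientation and field of view.
    """
    data = np.zeros((n_s, n_t, height, width, 3))
    for i in range(n_s):
        for j in range(n_t):
            pos = np.array([(i - 0.5 * (n_s - 1)) * spacing,
                            (j - 0.5 * (n_t - 1)) * spacing,
                            0.0])
            data[i, j] = render_view(pos)
    return data
```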

Page 12:

Page 13:

XY Motion Control Camera Mount

(Isaksen, MIT)

Page 14:

8 USB Digital Cameras, covers removed

(Jason Chang, MIT)

Page 15:

Lens array (bug boxes!) on a flatbed scanner (Jason Chang, MIT)

Page 16:

(Lightfield: Isaksen, MIT)

Page 17:

Introduction
Lightfield Acquisition
Image Reconstruction
Synthetic Aperture

Page 18:

Lightfield Reconstruction

To build a view, just look up light along each outgoing ray:
(Figure: camera array and a reconstructed novel viewpoint)
Need both direction and camera interpolation

Page 19:

Two-Plane Parameterization

Parameterize any ray via its intersections with two planes:
• Camera plane
• Focal plane, for ray direction
May need 6 pairs of planes to capture all sides of a 3D object

(Slide: Levoy & Hanrahan, Stanford)
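A small sketch of the parameterization itself, assuming the camera plane sits at z = 0 and the focal plane at z = F (the plane placement and names here are illustrative, not the paper's notation):

```python
import numpy as np

def two_plane_coords(origin, direction, focal_z):
    """Parameterize a ray by where it crosses the two planes.

    (s, t): intersection with the camera plane z = 0
    (u, v): intersection with the focal plane z = focal_z
    Assumes the ray is not parallel to the planes (direction[2] != 0).
    """
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    t_cam = (0.0 - origin[2]) / direction[2]
    t_foc = (focal_z - origin[2]) / direction[2]
    s, t = (origin + t_cam * direction)[:2]
    u, v = (origin + t_foc * direction)[:2]
    return s, t, u, v
```

Reconstruction then interpolates the stored samples whose (s, t) and (u, v) are closest to the query ray, which is the camera-and-direction interpolation shown on the next slide.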

Page 20:

Camera and Direction Interpolation

(Slide: Levoy & Hanrahan, Stanford)

Page 21:

Mapping camera views to screen

Can map a camera view to the new viewpoint using texture mapping (since everything’s linear)

(Figure: old camera, new camera, and focal plane; Isaksen, MIT)
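A point-wise sketch of that mapping, under simplifying assumptions (source cameras sit on the z = 0 plane and look down +z, the focal plane is z = F, and the source camera's intrinsics are passed in explicitly; all names are illustrative). Since each step is a perspective projection through a plane, the composite map from new-view pixels to old-camera pixels is projective, which is exactly the kind of transform texture-mapping hardware applies:

```python
import numpy as np

def old_camera_pixel(new_eye, old_eye, pixel_dir, focal_z, old_focal_px,
                     old_center):
    """Map one ray of the new view to a pixel of a source camera.

    The ray (new_eye, pixel_dir) is intersected with the focal plane
    z = focal_z; the hit point is then projected into a source camera at
    `old_eye` that looks down +z with focal length `old_focal_px` (pixels)
    and principal point `old_center`.
    """
    new_eye = np.asarray(new_eye, dtype=float)
    old_eye = np.asarray(old_eye, dtype=float)
    pixel_dir = np.asarray(pixel_dir, dtype=float)

    hit = new_eye + pixel_dir * (focal_z - new_eye[2]) / pixel_dir[2]
    rel = hit - old_eye                  # hit point in the source camera frame
    x = old_focal_px * rel[0] / rel[2] + old_center[0]
    y = old_focal_px * rel[1] / rel[2] + old_center[1]
    return x, y
```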

Page 22:

Lightfield Reconstruction (again)

To build a view, just look up light along each outgoing ray:
(Figure: camera array and a reconstructed novel viewpoint)
Reconstruction done via graphics hardware & the laws of perspective

Page 23:

Related: Lenticular Display

Replace cameras with directional emitters, like many little lenses:
(Figure: a lens array over an image, with optional blockers, reconstructing a novel viewpoint; Isaksen)
Reconstruction done in free space & the laws of optics

Page 24:

Related: Holography

A hologram is just a sampling plane with directional emission:
(Figure: holographic film, reference beam, and a reconstructed novel viewpoint; Hanrahan)
Interference patterns on the film act like little diffraction gratings, and give directional emission.
Reconstruction done in free space & coherent optics

Page 25:

Introduction
Lightfield Acquisition
Image Reconstruction
Synthetic Aperture

Page 26:

Camera Aperture & Focus

Non-pinhole cameras accept rays from a range of locations:
(Figure: rays reaching one pixel on the CCD or film pass through the lens; stuff’s in focus at one depth and blurry farther out)

Page 27:

Camera Aperture

Can vary effective lens size by changing the physical aperture (“hole”)
On a camera, this is the f-stop
• Small aperture: not much blurring (long depth of field)
• Big aperture: lots of depth blurring (short depth of field)
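As background (standard photographic optics, not from this deck): the f-stop is the f-number N, the lens focal length f divided by the aperture diameter D, so a larger f-number means a smaller hole and a longer depth of field:

N = f / D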

Page 28:

Synthetic Aperture

Can build a larger aperture in postprocessing, by combining smaller apertures
(Figure: a big aperture assembled from many small ones)
Note: you can assemble a big aperture out of small ones, but not split a small aperture from a big one; it’s easy to blur, but not to un-blur.
Same depth blurring as with a real aperture!
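A minimal sketch of assembling the aperture, assuming the source images are already registered on the chosen focal plane, so combining them reduces to averaging over the cameras that fall inside the synthetic aperture (names are illustrative):

```python
import numpy as np

def synthetic_aperture(images, positions, center, radius):
    """Average the source cameras inside a synthetic aperture.

    `images`: list of HxWx3 arrays, pre-aligned on the focal plane.
    `positions`: matching (s, t) camera-plane coordinates.
    Cameras within `radius` of `center` are averaged; a larger radius acts
    like a larger physical aperture, giving a shallower depth of field.
    """
    chosen = [np.asarray(img, dtype=float)
              for img, p in zip(images, positions)
              if np.linalg.norm(np.asarray(p) - np.asarray(center)) <= radius]
    return np.mean(chosen, axis=0)
```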

Page 29:

Synthetic Aperture Example

Vary reconstructed camera’s aperture size: a larger synthetic aperture means a shorter “depth of field”—shorter range of focused depths.

(Illustration: Isaksen, MIT)

Page 30:

Camera Focal Distance

Can vary the real focal distance by changing the camera’s physical optics
(Figure: focus set far vs. near)

Page 31:

Synthetic Aperture Focus

With a synthetic aperture, can vary focus by varying direction
(Figure: synthetic focus set far vs. near)
Note: this only works exactly in the limit of small source apertures, but works OK for finite apertures.
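One common way to implement this (a sketch, not necessarily the paper's exact reparameterization) is shift-and-add refocusing: translate each source image by an amount proportional to its camera's offset on the camera plane, then average; the proportionality constant selects which depth lands in focus. Assumes pre-rectified images, a fronto-parallel synthetic focal plane, and scipy for the subpixel shift; the sign convention depends on how the cameras are indexed.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def refocus(images, positions, disparity_per_unit, center=(0.0, 0.0)):
    """Shift-and-add refocusing of a synthetic aperture.

    Each pre-rectified HxWx3 image is translated by an amount proportional
    to its camera's (s, t) offset from `center`, then the stack is averaged.
    `disparity_per_unit` (pixels of shift per unit of camera offset) picks
    the depth that ends up in focus: 0 keeps the original focal plane,
    other values move the focused plane nearer or farther.
    """
    acc = np.zeros_like(np.asarray(images[0], dtype=float))
    for img, (s, t) in zip(images, positions):
        dx = disparity_per_unit * (s - center[0])
        dy = disparity_per_unit * (t - center[1])
        acc += nd_shift(np.asarray(img, dtype=float), (dy, dx, 0), order=1)
    return acc / len(images)
```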

Page 32:

Synthetic Aperture Focus: Aliasing

Aliasing artifacts can be caused by focal plane mismatch
(Figure: synthetic far vs. synthetic near focus; point sampling along one plane causes aliasing artifacts, while the other plane is blurred by the source focal length)

Page 33:

Variable Focal Plane Example

Vary the reconstructed camera’s focal distance: just a matter of changing the directions before aperture assembly.

(Illustration: Isaksen, MIT)

Page 34:

Advantages of Synthetic Aperture:

Can simulate a huge aperture (impractical with a conventional camera)
Can even tilt the focal plane (impossible with conventional optics!)

(Illustration: Isaksen, MIT)

Page 35:

Conclusions

Lightfields are a unique way to represent the world
• Supports arbitrary light transport
• Equivalent to holograms & lenticular displays
Isaksen et al.’s synthetic aperture technique allows lightfields to be refocused
Opportunity to extract more information from lightfields