Title: Dynamically Reparameterized Light Fields
1. Dynamically Reparameterized Light Fields
- Aaron Isaksen, Leonard McMillan (MIT), Steven Gortler (Harvard), SIGGRAPH 2000
- Presented by Orion Sky Lawlor
- cs497yzy, 2003/4/24
2. Outline: Introduction, Lightfield Acquisition, Image Reconstruction, Synthetic Aperture
3. Introduction
- Rendering cool pictures is hard
- Rendering them in realtime is even harder
- (Partial) solution: image-based rendering
- Acquire or pre-render many images
- At display time, recombine the existing images somehow
- Standard sampling problems: aliasing, acquisition, storage
4. Why use Image-based Rendering?
- Captures arbitrarily complex material/light interactions
- Spatially varying glossy BRDFs
- Global, volumetric, subsurface, ...
- Display speed is independent of scene complexity
- Excellent for natural scenes
- A non-polygonal description avoids
- Difficulty with sampling and level of detail (LOD)
- Cracks, watertightness, manifold requirements, ...
5. Why not use Image-based Rendering?
- Must acquire the images beforehand
- Fixed scene lighting
- Often only the camera can move
- Predetermined sampling rate
- Undersampling and aliasing problems
- Predetermined set of views
- Can't look in certain directions!
- Acquisition is painful or expensive
- Must store many, many images
- Yet access must be quick
6. How do Lightfields not Work?
- At every point in space, take a picture (or environment map)
- 3D space, 2D images → 5D
- Display is just image lookup!
7. Why don't Lightfields work like that?
- These images all contain duplicate rays, again and again
- 3D space, 2D images → 5D: highly redundant
8. How do Lightfields actually Work?
- We can thus get away with just one layer of cameras
- 2D plane of cameras, 2D images → 4D lightfield
- Display means interpolating several views into a reconstructed novel viewpoint (a minimal lookup sketch follows below)
- Only assumption: radiance is unchanged along a ray's path
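To make the 4D indexing concrete, here is a minimal Python sketch (all names hypothetical, not from the paper): the lightfield is a 4D table of radiance, indexed by a camera-plane sample (s, t) and a focal-plane sample (u, v), and display is one lookup per ray.

    import numpy as np

    # Hypothetical discretized lightfield: 16x16 cameras, 256x256 images, RGB.
    L = np.zeros((16, 16, 256, 256, 3))

    def lookup_ray(L, s, t, u, v):
        """Nearest-neighbor radiance along the ray through (s, t) on the
        camera plane and (u, v) on the focal plane.  A real display
        interpolates between neighboring samples instead (slide 20)."""
        S, T, U, V, _ = L.shape
        i = min(max(int(round(s)), 0), S - 1)
        j = min(max(int(round(t)), 0), T - 1)
        k = min(max(int(round(u)), 0), U - 1)
        m = min(max(int(round(v)), 0), V - 1)
        return L[i, j, k, m]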
9. Camera Array Geometry
(Illustration: Isaksen, MIT)
10. Outline: Introduction, Lightfield Acquisition, Image Reconstruction, Synthetic Aperture
11. How do you make a Lightfield?
- Synthetic scene: render from different viewpoints (see the sketch below)
- Real scene: sample from different viewpoints
- In either case, you need
- Fairly dense sampling
- Lots of data, so compression is useful
- Good antialiasing, over both the image plane (pixels) and the camera plane (apertures)
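For the synthetic case, acquisition is just one render per camera-plane sample. A rough sketch, assuming a hypothetical render_view(eye) callback supplied by whatever renderer you actually use:

    import numpy as np

    def capture_lightfield(render_view, n_s=16, n_t=16, spacing=0.1):
        """Render one image per (s, t) position on a regular grid in the
        z = 0 camera plane.  render_view(eye) is a placeholder that should
        return an (H, W, 3) image for a camera at position eye."""
        images = {}
        for i in range(n_s):
            for j in range(n_t):
                eye = np.array([(i - n_s / 2) * spacing,
                                (j - n_t / 2) * spacing,
                                0.0])
                images[(i, j)] = render_view(eye)
        return images

The spacing parameter sets the sampling density; as noted above, it must be fairly dense or the reconstruction will alias.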
12. (Image-only slide; no transcript)
13. XY Motion-Control Camera Mount (Isaksen, MIT)
14. 8 USB Digital Cameras, Covers Removed (Jason Chang, MIT)
15. Lens Array (Bug Boxes!) on a Flatbed Scanner (Jason Chang, MIT)
16. (Lightfield: Isaksen, MIT)
17. Outline: Introduction, Lightfield Acquisition, Image Reconstruction, Synthetic Aperture
18. Lightfield Reconstruction
- To build a view, just look up the light along each outgoing ray
(Diagram: rays from a reconstructed novel viewpoint traced back to the camera array)
- Need both direction and camera interpolation
19. Two-Plane Parameterization
- Parameterize any ray via its intersections with two planes (intersection sketch below)
- Focal plane, for ray direction
- Camera plane
- May need 6 pairs of planes to capture all sides of a 3D object
(Slide: Levoy & Hanrahan, Stanford)
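Concretely, a ray's (s, t, u, v) coordinates are just two ray-plane intersections. A minimal sketch, assuming the camera plane sits at z = 0 and the focal plane at z = 1 (any two parallel planes work the same way):

    def two_plane_coords(origin, direction, z_st=0.0, z_uv=1.0):
        """Return (s, t, u, v): where the ray origin + a*direction crosses
        the camera plane z = z_st and the focal plane z = z_uv.  Assumes
        direction[2] != 0, i.e. the ray is not parallel to the planes."""
        a = (z_st - origin[2]) / direction[2]
        b = (z_uv - origin[2]) / direction[2]
        s = origin[0] + a * direction[0]
        t = origin[1] + a * direction[1]
        u = origin[0] + b * direction[0]
        v = origin[1] + b * direction[1]
        return s, t, u, v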
20. Camera and Direction Interpolation
(Slide: Levoy & Hanrahan, Stanford)
21. Mapping Camera Views to Screen
- Can map a camera view to the new viewpoint using texture mapping (since everything's linear)
(Figure: old camera, new camera, and focal plane; Isaksen, MIT)
22. Lightfield Reconstruction (again)
- To build a view, just look up the light along each outgoing ray, from the reconstructed novel viewpoint back to the camera array
- Reconstruction is done via graphics hardware and the laws of perspective (a software sketch follows below)
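In software, the reconstruction amounts to one lookup per screen pixel, with bilinear weights over the four nearest cameras. The paper does this with texture-mapping hardware; the Python sketch below is only an illustration, and the per-image direction interpolation is reduced to nearest-pixel sampling.

    import numpy as np

    def reconstruct_sample(images, s, t, u, v):
        """Bilinearly blend the four cameras nearest to (s, t) on the grid.
        images[(i, j)] is camera (i, j)'s picture; (u, v) indexes the
        focal-plane direction, here sampled nearest-neighbor."""
        i0, j0 = int(np.floor(s)), int(np.floor(t))
        fs, ft = s - i0, t - j0
        acc = 0.0
        for di, wi in ((0, 1.0 - fs), (1, fs)):
            for dj, wj in ((0, 1.0 - ft), (1, ft)):
                img = images[(i0 + di, j0 + dj)]
                acc = acc + wi * wj * img[int(round(v)), int(round(u))]
        return acc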
23. Related: Lenticular Displays
- Replace the cameras with directional emitters, like many little lenses
(Diagram: image, optional blockers, and lens array producing a reconstructed novel viewpoint)
- Reconstruction is done in free space via the laws of optics
(Isaksen)
24. Related: Holography
- A hologram is just a sampling plane with directional emission
(Diagram: reference beam striking holographic film)
- Interference patterns on the film act like little diffraction gratings, giving directional emission toward the reconstructed novel viewpoint
- Reconstruction is done in free space via coherent optics
(Hanrahan)
25. Outline: Introduction, Lightfield Acquisition, Image Reconstruction, Synthetic Aperture
26. Camera Aperture and Focus
- Non-pinhole cameras accept rays from a range of locations (the thin-lens relation below makes this precise)
(Diagram: lens focusing rays onto one pixel on the CCD or film; stuff's in focus at one depth and blurry elsewhere)
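For reference, the geometry in this diagram is the standard thin-lens relation: an object at distance d_o from the lens is sharp when the sensor sits at distance d_i satisfying

    1/f = 1/d_o + 1/d_i

where f is the focal length. Points at other depths image to a disk (the circle of confusion) whose diameter grows with the aperture size, which is exactly the trade-off the next slide shows.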
27. Camera Aperture
- Can vary the effective lens size by changing the physical aperture (hole); on a camera, this is the f-stop
- Small aperture: not much blurring, long depth of field
- Big aperture: lots of depth blurring, short depth of field
28. Synthetic Aperture
- Can build a larger aperture in postprocessing by combining smaller apertures
- Note: you can assemble a big aperture out of small ones, but not split a small aperture from a big one; it's easy to blur, but not to un-blur
- Same depth blurring as with a real aperture!
(Diagram: big aperture assembled from small ones; a code sketch follows below)
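In code, assembling the big aperture is literally an average, as sketched below (assuming the small-aperture images are already registered so that the shared focal plane aligns across cameras):

    import numpy as np

    def synthetic_aperture(images):
        """Average registered small-aperture images to simulate one big
        aperture.  Points on the shared focal plane line up across cameras
        and stay sharp; everything else averages out into blur."""
        stack = np.stack([img.astype(np.float64) for img in images.values()])
        return stack.mean(axis=0)

The average is a many-to-one map, which is exactly the one-way street noted above: easy to blur, not to un-blur.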
29. Synthetic Aperture Example
- Vary the reconstructed camera's aperture size: a larger synthetic aperture means a shorter depth of field, i.e., a shorter range of focused depths
(Illustration: Isaksen, MIT)
30. Camera Focal Distance
- Can vary the real focal distance by changing the camera's physical optics
(Diagram: near vs. far focus)
31. Synthetic Aperture Focus
- With a synthetic aperture, can vary the focus by varying the directions
- Note: this only works exactly in the limit of small source apertures, but works OK for finite apertures
(Diagram: synthetic far vs. synthetic near focus; a refocusing sketch follows below)
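A hedged shift-and-add sketch of this: each camera's image is translated in proportion to that camera's offset on the camera plane, then averaged. The proportionality constant alpha selects the synthetic focal depth (its exact mapping to metric depth depends on the rig's geometry), and alpha = 0 reduces to the plain synthetic aperture above. Integer shifts via np.roll keep the sketch short; a real implementation would interpolate subpixel shifts.

    import numpy as np

    def refocus(images, positions, alpha):
        """Shift-and-add refocusing: translate camera (i, j)'s image by
        alpha * positions[(i, j)] pixels, then average.  Varying alpha
        sweeps the synthetic focal plane through the scene."""
        acc = None
        for key, img in images.items():
            dx, dy = alpha * np.asarray(positions[key], dtype=np.float64)
            shifted = np.roll(img.astype(np.float64),
                              (int(round(dy)), int(round(dx))), axis=(0, 1))
            acc = shifted if acc is None else acc + shifted
        return acc / len(images)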
32. Synthetic Aperture Focus: Aliasing
- Aliasing artifacts can be caused by a focal-plane mismatch
- Blurring along one plane is due to the source focal length
- Point sampling along the other plane causes aliasing artifacts
(Diagram: synthetic far vs. synthetic near focus)
33. Variable Focal Plane Example
- Vary the reconstructed camera's focal length: just a matter of changing the directions before aperture assembly
(Illustration: Isaksen, MIT)
34. Advantages of Synthetic Aperture
- Can simulate a huge aperture, impractical with a conventional camera
- Can even tilt the focal plane, impossible with conventional optics!
(Illustration: Isaksen, MIT)
35. Conclusions
- Lightfields are a unique way to represent the world
- Support arbitrary light transport
- Equivalent to holograms and lenticular displays
- Isaksen et al.'s synthetic aperture technique allows lightfields to be refocused
- Opportunity to extract more information from lightfields