Slide 1: Synthetic Aperture Focusing Using Dense Camera Arrays
Vaibhav Vaish
Computer Graphics Laboratory, Stanford University
Slide 2: Cameras are Getting Cheaper
Slide 3: Why Camera Arrays?
- High performance imaging
- Virtual reality
Slide 4: Stanford Multi-camera Array
- 100 cameras x 640x480 pixels x 30 frames/sec ≈ 1 GB/sec
- Scalable
- Flexible
Slide 5: Stanford Multi-camera Array
Slide 6: Demo 1: High-Speed Video
- N cameras, each running at 30 Hz
- Stagger the frames of the cameras by 1/Nth of a frame time (see the sketch below)
- Align the images to a single perspective
- Video: 52 cameras, 1560 Hz
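As a rough sketch of the staggered-trigger arithmetic (the function and variable names here are illustrative, not from the talk):

    # Staggered triggering: N cameras at a base rate of 30 Hz, each delayed by
    # 1/N of a frame time, give an effective capture rate of N * 30 Hz.
    def stagger_schedule(n_cameras, base_fps=30.0):
        frame_time = 1.0 / base_fps                             # seconds per frame
        offsets = [i * frame_time / n_cameras for i in range(n_cameras)]
        effective_fps = n_cameras * base_fps
        return offsets, effective_fps

    offsets, fps = stagger_schedule(52)                         # the demo's 52 cameras
    print(int(fps))                                             # -> 1560 (Hz)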
Slide 7: Demo 2: High-Resolution Video
- 12 x 8 array of VGA cameras
- total field of view 29° wide
- seamless stitching
- cameras individually metered
- Tiled video: 7 megapixels
Slide 8: Camera Array: Portable Version
- 48 cameras in a 16 x 3 layout
- 2 m wide baseline
Slide 9: Synthetic Aperture Focusing: Scene
- distance to occluder: 33 m
- distance to targets: 40 m
- field of view at target: 3 m
Slide 10: Synthetic Aperture Focusing: Results
Synthetic aperture sequence
Slide 11: Synthetic Aperture Focusing: Results
Synthetic aperture sequence
Slide 12: Outline
- Synthetic aperture focusing basics
- Technical challenges
- Determining the image transformations
- How to refocus efficiently
- Real-time system
- Future Work
Slides 13-18: Synthetic Aperture Focusing (diagram sequence illustrating the focusing principle)
Slide 19: Synthetic Aperture Focusing: Properties
- Focusing is a computational process (as opposed to optical): the focal plane can be varied after capturing the images
- Can use arbitrary apertures
- Averaging multiple images improves the signal-to-noise ratio
Slide 20: Focusing on One Plane
- Backproject each camera image onto the focal plane
- This is a 2D image warp called a homography (a 3x3 matrix)
Slide 21: Focusing on One Plane
- The final image is the average of all the backprojected camera images (sketched below)
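A minimal sketch of this backproject-and-average step, assuming the per-camera homographies onto the focal plane are already known (OpenCV's warpPerspective is used here simply as one way to apply a 3x3 warp):

    import numpy as np
    import cv2

    def synthetic_aperture_image(images, homographies, out_size):
        """Warp every camera image onto the focal plane and average them.
        images: list of HxWx3 uint8 arrays; homographies: list of 3x3 arrays
        mapping each camera's pixels to focal-plane pixels; out_size: (w, h)."""
        acc = np.zeros((out_size[1], out_size[0], 3), dtype=np.float64)
        for img, H in zip(images, homographies):
            acc += cv2.warpPerspective(img, H, out_size)   # backproject via homography
        return (acc / len(images)).astype(np.uint8)        # average = synthetic aperture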
Slide 22: Example: Focusing on One Plane
Add the camera images so that points on one plane are in good focus
Slide 23: Technical Challenges
- How do we determine the projections for focusing digitally?
  - What camera parameters do we need to calibrate?
  - What are the image warps required?
- Are there efficient algorithms for varying the focal plane?
  - A homography requires 3 adds and 1 divide per pixel (see the sketch below)
  - 100 video cameras produce roughly 900 million pixels/sec
  - Computationally intensive!
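To make the per-pixel cost concrete, here is a hedged sketch (the loop structure and names are illustrative, not the authors' implementation) of evaluating a homography incrementally along one scanline: moving one pixel to the right costs 3 adds plus a single divide.

    def warp_row_incremental(H, y, width):
        """Map every pixel of output row y through the 3x3 homography H,
        evaluated incrementally: stepping x by 1 adds column 0 of H to
        (u, v, w), i.e. 3 adds, plus one divide for the perspective division."""
        u = H[0][1] * y + H[0][2]        # coordinates at x = 0
        v = H[1][1] * y + H[1][2]
        w = H[2][1] * y + H[2][2]
        coords = []
        for _ in range(width):
            inv_w = 1.0 / w              # the one divide per pixel
            coords.append((u * inv_w, v * inv_w))
            u += H[0][0]                 # the 3 adds per pixel
            v += H[1][0]
            w += H[2][0]
        return coords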
Slide 24: Varying the Focal Plane
(Diagram: focusing on plane 1)
Slide 25: Varying the Focal Plane
(Diagram: camera images are projected onto the reference plane, then reprojected onto the new focal plane)
Slide 26: Planar Homologies
- Refocusing requires reprojecting the image from the reference plane onto the new focal plane
- This reprojection is called a planar homology
- A homology is described by a matrix of the form H_i = I + μ_i e_i l^T (see the sketch below)
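As a small numerical illustration (the example vertex, axis, and values are mine, not from the talk), a homology H = I + μ e l^T fixes every point on its axis l and moves other points along lines through the vertex e:

    import numpy as np

    def homology(e, l, mu):
        """Planar homology H = I + mu * e * l^T: vertex e (an epipole) and
        axis l (a line), both in homogeneous coordinates."""
        e = np.asarray(e, dtype=float).reshape(3, 1)
        l = np.asarray(l, dtype=float).reshape(1, 3)
        return np.eye(3) + mu * (e @ l)

    H = homology(e=[2.0, 1.0, 1.0], l=[0.0, 1.0, 0.0], mu=0.5)
    x_on_axis = np.array([3.0, 0.0, 1.0])      # satisfies l . x = 0
    print(H @ x_on_axis)                       # -> [3. 0. 1.], unchanged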
Slide 27: Focusing Algorithm
- Calibration (pre-process):
  - homographies for projection onto the reference plane
  - epipoles
  - [Vaish 2004, Hartley 2000]
- Project the images onto the reference plane
- Vary the focal plane by applying the homologies given by H_i = I + μ_i e_i l^T
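A hedged end-to-end sketch of the two stages (reusing the helpers from the earlier sketches; the calibration outputs are assumed to be given, and mu_i selects the desired focal plane):

    import numpy as np
    import cv2

    def refocus(images, ref_homographies, epipoles, mus, l, out_size):
        """Stage 1 (fixed, precomputable): warp each image onto the reference plane.
        Stage 2 (cheap, repeated per focal plane): apply the homology
        H_i = I + mu_i * e_i * l^T, then average over all cameras."""
        l = np.asarray(l, dtype=float)
        acc = np.zeros((out_size[1], out_size[0], 3), dtype=np.float64)
        for img, H_ref, e, mu in zip(images, ref_homographies, epipoles, mus):
            on_ref = cv2.warpPerspective(img, H_ref, out_size)          # stage 1
            H_i = np.eye(3) + mu * np.outer(np.asarray(e, float), l)    # stage 2
            acc += cv2.warpPerspective(on_ref, H_i, out_size)
        return (acc / len(images)).astype(np.uint8)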
Slide 28: Example: Varying Focal Planes
Rotating focal plane
Slide 29: Case 1: Parallel Reference Plane
(Diagram: camera plane, reference plane, and focal planes)
- When the reference plane is parallel to the camera plane, the epipoles e_i = (x_i, y_i, 0)^T are points at infinity
- The homologies H_i = I + μ_i e_i l^T reduce to affine transforms (see the check below)
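A short check of this reduction, writing the axis as l = (l_1, l_2, l_3)^T:

    H_i = I + \mu_i e_i l^T =
    \begin{pmatrix}
    1 + \mu_i x_i l_1 & \mu_i x_i l_2 & \mu_i x_i l_3 \\
    \mu_i y_i l_1 & 1 + \mu_i y_i l_2 & \mu_i y_i l_3 \\
    0 & 0 & 1
    \end{pmatrix}
    \quad \text{for } e_i = (x_i, y_i, 0)^T,

so the bottom row stays (0, 0, 1) and H_i is an affine transform.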
Slide 30: Case 2: Parallel Focal Planes
(Diagram: reference plane with parallel focal planes)
- When the focal planes are parallel to the reference plane, the axis l = (0, 0, 1)^T is the line at infinity
- The homologies H_i = I + μ_i e_i l^T reduce to a scale and a shift
Slide 31: Case 3: Frontoparallel Planes
(Diagram: camera plane, reference plane, and focal planes, all parallel)
- When the reference plane, camera plane, and focal planes are all parallel, the epipoles e_i and the line l are both at infinity
- The homologies H_i = I + μ_i e_i l^T reduce to shifts [Vaish 2004]
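This is the case the real-time system exploits: refocusing becomes shift-and-add. A minimal sketch under the frontoparallel assumption (the shift being proportional to camera position, and all names below, are illustrative):

    import numpy as np

    def shift_and_add(images, camera_xy, disparity):
        """Frontoparallel refocusing: shift each image in proportion to its
        camera's position, then average. Varying the single scalar `disparity`
        moves the focal plane. (Integer shifts with wrap-around, for brevity.)"""
        acc = np.zeros_like(images[0], dtype=np.float64)
        for img, (cx, cy) in zip(images, camera_xy):
            dx = int(round(disparity * cx))
            dy = int(round(disparity * cy))
            acc += np.roll(img, shift=(dy, dx), axis=(0, 1))
        return (acc / len(images)).astype(images[0].dtype)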
Slide 32: Case 4: Scheimpflug Configuration
(Diagram: camera plane, reference plane, and focal planes meeting in a common line)
- When the camera plane, reference plane, and focal planes all intersect in the same line, l and the e_i can both be mapped to infinity
- The homologies H_i = I + μ_i e_i l^T again reduce to shifts
Slide 33: Taxonomy of Homologies
1. Affine
2. Scale and shift
3. Shift
4. Shift, plus a post-warp of the final image
Slide 34: Real-time Implementation
- Projection onto the fixed reference plane (look-up table)
- Shift the image for the desired focal plane (FPGA)
- Send an MPEG stream to a client PC (FireWire)
- Decompress and add the streams on the client PC
- The server adds the streams from the clients and displays live synthetic aperture video
Slide 35: Real-time System: Demo
Slide 36: Real-time System: Discussion
- The reference plane is fixed
  - The initial projection (homographies) can be implemented via a look-up table computed beforehand
- Varying the focal plane requires only shifting images
  - Easy to realize in FPGAs (2 adds/camera)
  - Keeps the per-camera cost low
- Extensions
  - Implement affine warps (2 adds/pixel)
  - Computer-assisted focusing
  - Study other architectures for digital focusing (GPU)
Slide 37: Outline
- Synthetic aperture focusing basics
- Technical challenges
- Determining the image transforms
- How to refocus efficiently
- Real-time system
- Future Work
- Matted synthetic apertures
- General focal surfaces
Slide 38: Matted Synthetic Apertures
Slide 39: Crowd Scene
Slide 40: Crowd Scene
Slide 41: Curved Focal Surfaces
Slide 42: General Focal Surfaces
- Can we reconstruct the correct focal depth for every pixel?
Slide 43: General Focal Surfaces
- Can we reconstruct the correct focal depth for every pixel?
- Shape from stereo
Slide 44: Summary
- Using a camera array for synthetic aperture focusing
  - a large synthetic aperture allows seeing through partial occluders
- Geometry of digital focusing
- Real-time system
- Future work
  - explore general apertures
  - reconstruct the correct focal depth for each pixel
  - study the design space of synthetic aperture camera arrays
Slide 45: Acknowledgements
- Sponsors
  - Bosch Research
  - NSF IIS-0219856-001
  - DARPA NBCH 1030009
- Acquisition assistance
  - Augusto Roman, Billy Chen, Abhishek Bapna, Mike Cammarano
- Listeners
  - Gaurav Garg, Ren Ng, Jeff Klingner, Doantam Phan, Niloy Mitra, Sriram Sankaranarayanan
Slide 46: The Camera Array Team
- staff
- Mark Horowitz
- Marc Levoy
- Bennett Wilburn
- students
- Vaibhav Vaish
- Gaurav Garg
- Eino-Ville Talvala
- Emilio Antunez
- Andrew Adams
- Neel Joshi
- Georg Petschnigg
- Guillaume Poncin
- Monica Goyal
- collaborators
- Mark Bolas
- Ian McDowall
- Microsoft Research
- funding
- Bosch Research
- Intel
- Sony
- Interval Research
- NSF
- DARPA
http://graphics.stanford.edu/projects/array
Slide 47: Effect of Feature Size
- s = 2 in (feature size), a = 6 ft (aperture width), d = 125 ft (distance from the cameras), Δz = 15 ft (occluder-to-target separation)
- see-around ability: a Δz / (d s)
- see-around ability increases with aperture width (a) and separation (Δz) relative to the distance (d) from the cameras and to the feature size (s); worked out below
- can see around 2 in leaves at 125 ft using a 16 in aperture (ours was 6 ft)
- independent of the number of cameras
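Reading the figures with s in inches and a, d, Δz in feet (an assumption consistent with the 16-inch number above), the minimum aperture that just sees around the leaves is

    a_{\min} = \frac{s \, d}{\Delta z}
             = \frac{2\,\mathrm{in} \times 125\,\mathrm{ft}}{15\,\mathrm{ft}}
             \approx 16.7\,\mathrm{in},

so the 6 ft (72 in) aperture used here is well above the roughly 16 in needed.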
Slide 48: Effect of Occluder Density
- see-through ability increases with the number of cameras (n) relative to the occluder opacity (α)
- independent of aperture size
- our bushes averaged 97% opaque (needs better measurement)
- a qualitative figure of merit depends on human perception