Inverse Rendering Methods for Hardware-Accelerated Display of Parameterized Image Spaces

1
Inverse Rendering Methods for Hardware-Accelerated Display of Parameterized Image Spaces
Ziyad S. Hakura
2
Real-time rendering of Parameterized Image
Spaces with Photorealistic Image Quality
Object Motion
Viewpoint Position
3
  • Animation time parameter
  • Viewpoint parameter along circle

4
(No Transcript)
5
(No Transcript)
6
Cockpit Lighting
Day light
Night sky
7
Interactive Toy Story
  • Limited viewpoint motion
  • Head motion parallax puts the user in the scene
  • Character parameters, e.g. happiness/sadness
  • Rose98, Gleicher98, Popovic99

8
Parameterized Image Spaces
  • Space can be 1D, 2D or more
  • Content author specifies parameters

Light motion
Object motion
Viewpoint position
9
(No Transcript)
10
Interactive Motion
View
Light
An interactive user is free to move anywhere in
the parameter space
11
(No Transcript)
12
Ray-Tracing
Texture-Mapping Graphics Hardware
13
Ray Tracing
Display
Eye
14
Z-Buffer Graphics Hardware
15
Pixel Fill Rate vs. Time
16
Overall Model
17
Overall Model
18
Overall Model
19
Related Work
  • Hardware Shading Models
  • Diefenbach96, Walter97, Ofek98, Udeshi99,
    Cabral99,
  • Kautz99, Heidrich99
  • Image-Based Rendering (IBR)
  • Chen93, Levoy96, Gortler96, Debevec96, Shade98,
  • Miller98, Debevec98, Bastos99, Heidrich99, Wood00
  • Animation Compression
  • Guenter93, Levoy95, Agrawal95, Cohen-Or99

20
Contributions
  • Inverse rendering method for inferring texture maps
  • Hardware-accelerated decoding of compressed parameterized image spaces
  • Parameterized environment map representation for moving away from pre-rendered samples
  • Hybrid rendering for refractive objects

21
Outline
  • Motivation
  • Texture Inference
  • Parameterized Texture Compression
  • Parameterized Environment Maps
  • Hybrid Rendering
  • Conclusion

22
Consider a Single Image
A single image is one point in the parameterized image space (parameter axes p1, p2)
How do we represent the shading on each object?
23
Texture Mapping


3D Mesh
24
Texture Inference by Inverse Rendering


3D Mesh
2D Texture
25
Linear Hardware Model
A x = b
A: hardware filter coefficients
x: unknown texture pixels
b: ray-traced image
26
Texture Inference
x: texture values
b: ray-traced image
minimize ||Ax - b||^2 = ||r(x)||^2, where r(x) = Ax - b
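The fit on this slide can be sketched as ordinary linear least squares; the sizes, the bilinear-style weight construction, and all variable names below are illustrative stand-ins, not the actual hardware filter model from the thesis:

```python
import numpy as np

# Toy texture inference: solve min_x ||A x - b||^2 for texture pixels x,
# where each row of A holds filter weights mapping texels to one
# ray-traced image pixel b[i]. Sizes and weights are illustrative only.
rng = np.random.default_rng(0)
n_tex, n_img = 16, 64                     # 16 unknown texels, 64 image pixels
A = np.zeros((n_img, n_tex))
for i in range(n_img):                    # each image pixel blends 2 texels
    j = rng.integers(0, n_tex - 1)
    w = rng.random()
    A[i, j], A[i, j + 1] = w, 1.0 - w     # bilinear-style weights sum to 1
x_true = rng.random(n_tex)                # "ideal" texture values
b = A @ x_true                            # simulated ray-traced image
x_fit, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(A @ x_fit, b))          # fitted texture reproduces the image
```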
27
Forward Mapping Method
28
Ray-Traced
Inverse Fitted, PSNR 41.8 dB
Forward Mapped, PSNR 35.4 dB
29
Outline
  • Motivation
  • Texture Inference
  • Parameterized Texture Compression
  • Parameterized Environment Maps
  • Hybrid Rendering
  • Conclusion

30
Parameterized Texture Compression
Light
View
31
Why compress textures instead of images?
  • Textures better capture coherence
  • Shading is independent of where the object appears in the image
  • Object silhouettes are rendered correctly from geometry
  • Viewpoint can move away from original samples
  • No geometric disocclusions

32
Laplacian Pyramid
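A minimal Laplacian pyramid, to make the decomposition on this slide concrete; box filtering and nearest-neighbour upsampling below are simplifying stand-ins for the usual Gaussian kernels:

```python
import numpy as np

def downsample(img):
    # 2x2 box-filter downsample (stand-in for Gaussian blur + decimate)
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def upsample(img):
    # nearest-neighbour upsample back to the finer resolution
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels):
    # Each level stores the detail lost by downsampling; the coarse base
    # plus all detail levels reconstructs the texture exactly.
    pyr = []
    for _ in range(levels):
        low = downsample(img)
        pyr.append(img - upsample(low))   # band-pass detail level
        img = low
    pyr.append(img)                       # low-resolution base
    return pyr

def reconstruct(pyr):
    img = pyr[-1]
    for detail in reversed(pyr[:-1]):
        img = upsample(img) + detail
    return img

tex = np.arange(64, dtype=float).reshape(8, 8)
pyr = laplacian_pyramid(tex, 2)
print(np.allclose(reconstruct(pyr), tex))   # lossless without quantization
```

Compression comes from quantizing or dropping the mostly-near-zero detail levels, which is where the coherence across the parameter space concentrates.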
33
Adaptive Pyramid
34
MPEG (image), 355:1, PSNR 36.8 dB
Laplacian (texture), 379:1, PSNR 38.7 dB
35
(No Transcript)
36
Runtime System
  • Decompresses texture images
  • Caches uncompressed textures in memory
  • Textures in top of pyramid likely to be re-used
  • Generates rendering calls to graphics system
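The caching behaviour described above can be sketched with a small LRU cache; the key scheme, the `decompress` callback, and the capacity are hypothetical, not the runtime system's actual interface:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU cache for decompressed textures. Hypothetical keys are
    (object_id, pyramid_level) tuples; coarse pyramid levels are shared
    across many parameter points, so they tend to stay resident."""
    def __init__(self, capacity):
        self.capacity = capacity
        self._cache = OrderedDict()
        self.hits = self.misses = 0

    def get(self, key, decompress):
        if key in self._cache:
            self.hits += 1
            self._cache.move_to_end(key)      # mark most-recently used
            return self._cache[key]
        self.misses += 1
        tex = decompress(key)                 # decode on a cache miss
        self._cache[key] = tex
        if len(self._cache) > self.capacity:
            self._cache.popitem(last=False)   # evict least-recently used
        return tex

cache = TextureCache(capacity=2)
for key in [("teapot", 0), ("teapot", 1), ("teapot", 0), ("floor", 0)]:
    cache.get(key, decompress=lambda k: f"texels:{k}")
print(cache.hits, cache.misses)   # 1 3
```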

37
(No Transcript)
38
(No Transcript)
39
Outline
  • Motivation
  • Texture Inference
  • Parameterized Texture Compression
  • Parameterized Environment Maps
  • Hybrid Rendering
  • Conclusion

40
How to handle reflective objects?
Problem: movement away from pre-rendered views gives a pasted-on look.
Solution: Parameterized Environment Maps.
41
Static Environment Maps (EMs)
  • Generated using standard techniques
  • Photograph a physical sphere in an environment
  • Render six faces of a cube from object center

42
Problem with Static EM
Ray-Traced
Static EM
Self-reflections are missing
43
ParameterizedEnvironment Maps (PEM)
44
(No Transcript)
45
Environment Map Geometry
46
Why Parameterize Environment Maps?
  • Captures view-dependent shading in the environment
  • Accounts for geometric error due to approximation of the environment with simple geometry

47
Surface Light Fields (Miller98, Wood00)
Surface light field: dense sampling over surface points of low-resolution lumispheres
PEM: sparse sampling over viewpoints of high-resolution EMs
48
Layering of Parameterized Environment Maps
  • Segment environment into local and distant maps
  • Allows different EM geometries in each layer
  • Supports parallax between layers

49
Segmented, Ray-Traced Images
Fresnel
EMs are inferred for each layer separately
50
Inferred EMs per Viewpoint
Distant
Local Color
Local Alpha
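One plausible way to combine the three inferred maps on this slide is standard "over" compositing of the local layer onto the distant layer; the premultiplied-alpha convention and the pixel values below are assumptions for illustration:

```python
import numpy as np

# Toy two-layer environment-map composite: the local layer carries
# premultiplied color and alpha and is composited over the distant
# (infinite-sphere) layer with the standard "over" operator.
local_color = np.array([[0.4, 0.0], [0.0, 0.2]])   # premultiplied local layer
local_alpha = np.array([[1.0, 0.0], [0.0, 0.5]])   # 1 = fully local
distant     = np.array([[0.8, 0.8], [0.8, 0.8]])   # distant layer

composite = local_color + (1.0 - local_alpha) * distant
print(composite)   # distant environment shows through where alpha < 1
```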
51
Experimental Setup
  • 1D view space
  • 1° separation between views
  • 100 sampled viewpoints

52
Ray-Traced vs. PEM
PEM closely matches local reflections such as self-reflections
53
(No Transcript)
54
Movement Away from Viewpoint Samples
Ray-Traced
PEM
55
(No Transcript)
56
Layered PEM vs. Infinite Sphere PEM
Layered PEM
Infinite Sphere PEM
57
(No Transcript)
58
(No Transcript)
59
Outline
  • Motivation
  • Texture Inference
  • Parameterized Texture Compression
  • Parameterized Environment Maps
  • Hybrid Rendering
  • Conclusion

60
How to handle refractive objects?
Problem: outgoing ray direction is hard to predict from the first surface intersection
N
Refractive Path
Eye
Outgoing ray
Solution: Hybrid Rendering
61
Hybrid Rendering
62
Hybrid Rendering
  • Greedy Ray Path Shading Model
  • Adaptive Tessellation
  • Layered, Parameterized Environment Maps

63
Greedy Ray Path Shading Model
Reflective Path
Refractive Object
N
Refractive Path
Eye
Trace two ray paths until rays exit refractive
object
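The two traced paths start from the standard reflection and refraction directions at each intersection; a common textbook formulation (not code from the thesis) looks like:

```python
import numpy as np

def reflect(d, n):
    # Mirror the incoming direction d about the surface normal n
    # (both assumed to be unit vectors).
    return d - 2.0 * np.dot(d, n) * n

def refract(d, n, eta):
    # Snell's law with eta = n_incident / n_transmitted. Convention:
    # d points into the surface, n points against d. Returns None on
    # total internal reflection.
    cos_i = -np.dot(d, n)
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None                  # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

d = np.array([0.0, -1.0, 0.0])       # ray hitting a horizontal surface head-on
n = np.array([0.0, 1.0, 0.0])
print(reflect(d, n))                 # reflects straight back along +y
print(refract(d, n, 1.0 / 1.5))      # passes straight through at normal incidence
```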
64
Comparison of Shading Models
Two-term greedy ray path
Full ray tree
65
Adaptive Tessellation
  • Two criteria
  • Ray path topology
  • Outgoing ray distance
  • Consider both terms of shading model
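A toy decision function for the two criteria above; the topology encoding, the direction threshold, and the function name are all hypothetical illustrations:

```python
import numpy as np

def needs_subdivision(corner_topologies, corner_exit_dirs, min_cos=0.99):
    # Toy version of the two adaptive-tessellation criteria:
    #  1. ray-path topology: corners whose traced paths hit different
    #     surface sequences require finer tessellation between them;
    #  2. outgoing ray distance: exit directions that diverge too much
    #     mean the patch is too coarse to interpolate across.
    if len(set(corner_topologies)) > 1:
        return True
    dirs = np.asarray(corner_exit_dirs, dtype=float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    pairwise_cos = dirs @ dirs.T          # cosines between all exit rays
    return bool(pairwise_cos.min() < min_cos)

# Same topology, nearly parallel exit rays: patch is fine as-is.
print(needs_subdivision(["RR", "RR"], [[0, 0, 1], [0, 1e-3, 1]]))   # False
# One corner's path differs (e.g. extra internal bounce): subdivide.
print(needs_subdivision(["RR", "RTR"], [[0, 0, 1], [0, 0, 1]]))     # True
```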

66
Layered EMs
67
Inferred EMs
L1
L2
L3
Reflection Term
Refraction Term
Inferred Environment Maps
68
Ray-Traced vs. Hybrid
69
(No Transcript)
70
Benefit of Hybrid Rendering over Ray-Tracing
  • Lower cost
  • Adaptive ray-tracing algorithm
  • Lower cost and higher predictability
  • Greedy two-term shading model
  • Substitute environment with layered shells

71
Outline
  • Motivation
  • Texture Inference
  • Parameterized Texture Compression
  • Parameterized Environment Maps
  • Hybrid Rendering
  • Conclusion

72
Conclusion
  • Photorealistic rendering of parameterized image spaces
  • Texture inference by Inverse Rendering
  • Parameterized Texture Compression
  • Parameterized Environment Maps
  • Hybrid Rendering

73
Recommendations for Graphics Hardware
  • Decompression of textures in hardware
  • Compression algorithms
  • Decoding from parameter-dependent texture blocks
  • More dynamic range in texture pixels
  • Ray-tracing for local models

74
Future Work
  • More sophisticated models for hardware rendering, e.g. fitting area light sources
  • Effect of hybrid rendering on compression
  • More efficient pre-rendering of ray-traced images
  • Multi-dimensional Ray-Tracing
  • Higher dimensions

84
Acknowledgements
  • Bernard Widrow
  • John Snyder
  • Anoop Gupta
  • Pat Hanrahan
  • Jed Lengyel, Turner Whitted, and others at
    Microsoft Research
  • Graphics Friends at Stanford
  • Administrators
  • John Gerth, Charles Orgish, Kevin Colton
  • Darlene Hadding, Ada Glucksman, Heather Gentner
  • Friends
  • Ulrich Stern, Ravi Soundararajan, Kanna Shimizu, Gaurishankar Govindaraju, Luke Chang, Johannes Helander
  • Mother and Sisters Dima and Dalia
  • Father

85
END