Title: Lecture 8 - Texture Mapping
1. CSC 830 Computer Graphics, Lecture 6: Texture (more)
Course note credit: some of these slides are extracted from the course notes of Prof. Mathieu Desbrun (USC) and Prof. Han-Wei Shen (Ohio State University).
2. Announcements
- Term project proposal due April 14th (next class)
- Assignment 4: Shading; Assignment 5: Texture
3. Can you do this?
4. (image-only slide)
5. Texture Mapping
- Surfaces in the wild are very complex
- Cannot model all the fine variations
- We need to find ways to add surface detail
- How?
6. Texture Mapping
- Particles and fractals
- gave us lots of detail information
- not easy to model
- mathematically and computationally challenging
7. Texture Mapping
- Of course, one can model the exact micro-geometry and material properties to control the look and feel of a surface
- But this may get extremely costly
- So graphics uses a more practical approach: texture mapping
8. Texture Mapping
- Solution (it's really a cheat!)
- How? MAP surface detail from a predefined multi-dimensional table (the texture) to a simple polygon
9. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
10. OpenGL functions - demo
- During initialization, read in or create the texture image and place it into the OpenGL state:

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, imageWidth, imageHeight,
                 0, GL_RGB, GL_UNSIGNED_BYTE, imageData);

- Before rendering your textured object, enable texture mapping and tell the system to use this particular texture:

    glBindTexture(GL_TEXTURE_2D, 13);
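A slightly fuller initialization sketch in the same fixed-function style (the texture-name variable and the glGenTextures/glTexParameteri calls are typical additions, not from the slide; imageWidth, imageHeight, imageData are assumed already loaded):

    GLuint texId;
    glGenTextures(1, &texId);                /* allocate a texture name */
    glBindTexture(GL_TEXTURE_2D, texId);     /* make it the current 2D texture */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, imageWidth, imageHeight,
                 0, GL_RGB, GL_UNSIGNED_BYTE, imageData);
    glEnable(GL_TEXTURE_2D);                 /* turn texturing on */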
11. OpenGL functions
- During rendering, give the Cartesian coordinates and the texture coordinates for each vertex:

    glBegin(GL_QUADS);
      glTexCoord2f(0.0, 0.0);  glVertex3f( 0.0,  0.0, 0.0);
      glTexCoord2f(1.0, 0.0);  glVertex3f(10.0,  0.0, 0.0);
      glTexCoord2f(1.0, 1.0);  glVertex3f(10.0, 10.0, 0.0);
      glTexCoord2f(0.0, 1.0);  glVertex3f( 0.0, 10.0, 0.0);
    glEnd();
12. What happens outside the 0-1 range?
- (u, v) should be in the range [0, 1]
- What happens when you request (1.5, 2.3)?
- Tile / repeat (the OpenGL default): the integer part of the value is dropped, and the image repeats itself across the surface
- Mirror: the image repeats itself but is mirrored (flipped) on every other repetition
- Clamp to edge: values outside of the range are clamped to this range
- Clamp to border: everything outside is rendered with a separately defined border color
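In OpenGL these behaviors are selected per texture axis with glTexParameter; a brief sketch (the border color is an arbitrary example value):

    /* S = u axis, T = v axis */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);           /* tile   */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_MIRRORED_REPEAT);  /* mirror */
    /* other options: GL_CLAMP_TO_EDGE, GL_CLAMP_TO_BORDER */
    GLfloat border[4] = { 1.0f, 0.0f, 0.0f, 1.0f };  /* color used by GL_CLAMP_TO_BORDER */
    glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, border);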
13. Methods for modifying the surface
- After a texture value is retrieved (it may be further transformed), the resulting values are used to modify one or more surface attributes
- Called combine functions or texture blending operations
- Replace: replace the surface color with the texture color
- Decal: replace the surface color with the texture color, blending with the underlying color according to the texture's alpha value; the alpha component in the framebuffer is not modified
- Modulate: multiply the surface color by the texture color (gives a shaded, textured surface)
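In fixed-function OpenGL these modes are chosen with the texture environment; a minimal sketch:

    /* how the texture combines with the incoming fragment color */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);  /* or GL_REPLACE, GL_DECAL, GL_BLEND */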
14. Okay, then how can you implement it?
Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
15. Texture and Texel
- Each pixel in a texture map is called a texel
- Each texel is associated with a (u, v) 2D texture coordinate
- The range of u and v is [0.0, 1.0] due to normalization
16. The (u, v) tuple
- For any (u, v) in the range (0-1, 0-1), multiplied by the texture image width and height, we can find the corresponding value in the texture map
17. How do we get F(u,v)?
- We are given a discrete set of values:
- F[i,j] for i = 0,...,N, j = 0,...,M
- Nearest neighbor:
- F(u,v) = F[round(Nu), round(Mv)]
- Linear interpolation:
- i = floor(Nu), j = floor(Mv)
- interpolate from F[i,j], F[i+1,j], F[i,j+1], F[i+1,j+1]
- Filtering in general! (see the sketch below)
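A minimal C sketch of both lookups for a single-channel texture stored row-major (the array layout, clamping, and float texel type are assumptions for illustration):

    /* F holds (N+1) x (M+1) samples, row-major; u, v in [0, 1]. */
    float tex_nearest(const float *F, int N, int M, float u, float v) {
        int i = (int)(u * N + 0.5f);            /* round(Nu) */
        int j = (int)(v * M + 0.5f);            /* round(Mv) */
        if (i > N) i = N;
        if (j > M) j = M;
        return F[j * (N + 1) + i];
    }

    float tex_bilinear(const float *F, int N, int M, float u, float v) {
        float x = u * N, y = v * M;
        int i = (int)x, j = (int)y;             /* floor(Nu), floor(Mv) */
        if (i >= N) i = N - 1;                  /* keep i+1, j+1 inside the table */
        if (j >= M) j = M - 1;
        float fx = x - i, fy = y - j;           /* fractional position in the cell */
        float f00 = F[ j      * (N + 1) + i    ];
        float f10 = F[ j      * (N + 1) + i + 1];
        float f01 = F[(j + 1) * (N + 1) + i    ];
        float f11 = F[(j + 1) * (N + 1) + i + 1];
        float a = f00 + fx * (f10 - f00);       /* interpolate along u ... */
        float b = f01 + fx * (f11 - f01);
        return a + fy * (b - a);                /* ... then along v */
    }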
18. Interpolation
19. Filtering Textures
(Figure: a texture resampled to the image; one case has Texture > Image, the other Image > Texture.)
The footprint changes from pixel to pixel, so there is no single filter.
Resampling theory: magnification -> interpolation; minification -> averaging.
We would like a constant cost per pixel.
20. Mip Mapping [Williams]
MIP = "multum in parvo": many things in a small place.
(Figure: the R, G, B image pyramid packed into a single map, indexed by u, v, and the level-of-detail d.)
Trilinear interpolation
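A sketch of requesting mipmapped (trilinear) filtering in classic OpenGL; gluBuild2DMipmaps builds the pyramid from the base image (variable names follow the earlier slides):

    #include <GL/glu.h>

    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, imageWidth, imageHeight,
                      GL_RGB, GL_UNSIGNED_BYTE, imageData);
    /* bilinear within a level, then linear between levels (the d axis) */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);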
21. Mip Mapping - Example
22. (image-only slide)
23. Was it working correctly?
24. Perspective Correction
(Figure: texture-mapped polygon without perspective correction.)
25. (image-only slide)
26. (image-only slide)
27. (image-only slide)
28. Probably the most common form of perspective texturing is done via a divide by Z. It's a very simple algorithm. Instead of interpolating U and V, we interpolate U/Z and V/Z; 1/Z is also interpolated. At each pixel, we take our texture coordinates and divide them by Z. Hang on, you're thinking: if we divide by the same number twice (Z), don't we get back to where we started, like a double reciprocal? Well, sort of. Z is also interpolated, so we're not dividing by the same Z twice. We then take the new U and V values, index into our texture map, and plot the pixel. Pseudo-code might be:

    su = Screen-U = U/Z
    sv = Screen-V = V/Z
    sz = Screen-Z = 1/Z
    for x = startx to endx
        u = su / sz              // recover perspective-correct u = (U/Z) / (1/Z)
        v = sv / sz
        PutPixel(x, y, texture[v][u])
        su += delta_su           // step the linearly interpolated quantities
        sv += delta_sv
        sz += delta_sz
    end

Very simple, and very slow.
29. (image-only slide)
30. Are correct_s and correct_t integers?
31. Useful links (Google "perspective correct texture")
- http://www.whisqu.se/per/docs/graphics16.htm
- http://p205.ezboard.com/fyabasicprogrammingfrm20.showMessage?topicID=20.topic (Perspective correction with Z-buffering)
- http://csdl2.computer.org/persagen/DLAbsToc.jsp?resourcePath=/dl/proceedings/&toc=comp/proceedings/cgiv/2005/2392/00/2392toc.xml&DOI=10.1109/CGIV.2005.58 (Perspective Correct Normal Vectors for Phong Shading)
- http://easyweb.easynet.co.uk/~mrmeanie/tmap/tmap.htm
32. Procedural Texture
Checkerboard, with scale s = 10:

    if ( ((int)(u * s) + (int)(v * s)) % 2 == 0 )
        texture(u, v) = 0;   // black
    else
        texture(u, v) = 1;   // white
33. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
34. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
35. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
36. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
37. Environment Maps
- Use texture to represent reflected color
- Texture indexed by reflection vector
- Approximation works when objects are far away
from the reflective object
38. Environment Maps
- Using a spherical environment map
Spatially varying resolution
39. Environment Maps
- Using a cubical environment map
40. Environment Mapping
- Environment mapping produces reflections on shiny objects
- Texture is transferred in the direction of the reflected ray from the environment map onto the object
- Reflected ray: R = 2(N · V)N - V
- What is in the map?
(Figure: viewer, object, reflected ray, and the surrounding environment map.)
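A small sketch of the reflection-vector computation used to index the map (the Vec3 type and helper are made up for illustration; N and V are unit vectors, with V pointing from the surface toward the viewer):

    typedef struct { float x, y, z; } Vec3;

    static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    /* R = 2 (N . V) N - V */
    Vec3 reflect_view(Vec3 N, Vec3 V) {
        float d = 2.0f * dot(N, V);
        Vec3 R = { d*N.x - V.x, d*N.y - V.y, d*N.z - V.z };
        return R;   /* direction used to look up the environment map */
    }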
41. Approximations Made
- The map should contain a view of the world with the point of interest on the object as the eye
- We can't store a separate map for each point, so one map is used with the eye at the center of the object
- This introduces distortions in the reflection, but the eye doesn't notice
- Distortions are minimized for a small object in a large room
- The object will not reflect itself
- The mapping can be computed at each pixel, or only at the vertices
42. Environment Maps
- The environment map may take one of several forms:
- Cubic mapping
- Spherical mapping (two variants)
- Parabolic mapping
- Describes the shape of the surface on which the map resides
- Determines how the map is generated and how it is indexed
- What are some of the issues in choosing the map?
43. Example
44. Refraction Maps
- Use texture to represent refraction
45. Opacity Maps
- Use texture to represent opacity
Useful for billboarding
46. Illumination Maps
- Use texture to represent the illumination footprint
47. Illumination Maps
48. Bump Mapping
- Use texture to perturb normals
- creates a bump-like effect
(Figure: original surface, bump map, modified surface.)
Does not change silhouette edges
49. Bump Mapping
- Many textures are the result of small perturbations in the surface geometry
- Modeling these changes would result in an explosion in the number of geometric primitives.
- Bump mapping attempts to alter the lighting across a polygon to provide the illusion of texture.
50. Bump Mapping
- This modifies the surface normals.
51. Bump Mapping
52. Bump Mapping
53. Bump Mapping
- Consider the lighting for a modeled surface.
54. Bump Mapping
- We can model this as deviations from some base surface.
- The question is then how these deviations change the lighting.
55. Bump Mapping
- Step 1: Put everything into the same coordinate frame as B(u,v).
- x(u,v), y(u,v), z(u,v): this is given for parametric surfaces, but easy to derive for other analytical surfaces.
- Or, collectively, O(u,v)
56. Bump Mapping
- Define the tangent plane to the surface at a point (u,v) by using the two vectors Ou and Ov.
- The normal is then given by
- N = Ou × Ov
57. Bump Mapping
- The new surface positions are then given by
- O'(u,v) = O(u,v) + B(u,v) n
- where n = N / |N|
- Differentiating leads to
- O'u = Ou + Bu n + B (n)u ≈ Ou + Bu n
- O'v = Ov + Bv n + B (n)v ≈ Ov + Bv n
- if B is small.
58. Bump Mapping
- This leads to a new normal:
- N'(u,v) = O'u × O'v = N + Bu(n × Ov) - Bv(n × Ou) + Bu Bv(n × n)
-          = N + Bu(n × Ov) - Bv(n × Ou)     (since n × n = 0)
-          = N + D
(Figure: the perturbation D added to the original normal N.)
59. Bump Mapping
- For efficiency, we can store Bu and Bv in a 2-component texture map.
- The cross products are geometry terms only.
- N' will of course need to be normalized after the calculation and before lighting.
- This floating-point square root and division makes it difficult to embed into hardware.
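A per-point sketch of the perturbed normal N' = N + Bu(n × Ov) - Bv(n × Ou), reusing the Vec3 type and dot() from the environment-mapping sketch above (Bu and Bv would be read from the derivative texture):

    #include <math.h>   /* sqrtf */

    static Vec3 cross(Vec3 a, Vec3 b) {
        Vec3 c = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
        return c;
    }
    static Vec3 add_scaled(Vec3 a, Vec3 b, float s) {   /* a + s*b */
        Vec3 c = { a.x + s*b.x, a.y + s*b.y, a.z + s*b.z };
        return c;
    }

    Vec3 bump_normal(Vec3 Ou, Vec3 Ov, float Bu, float Bv) {
        Vec3 N = cross(Ou, Ov);
        float len = sqrtf(dot(N, N));
        Vec3 n = { N.x/len, N.y/len, N.z/len };          /* unit normal */
        Vec3 Np = add_scaled(N, cross(n, Ov),  Bu);
        Np = add_scaled(Np, cross(n, Ou), -Bv);
        float l = sqrtf(dot(Np, Np));                    /* the sqrt/divide the slide mentions */
        Vec3 out = { Np.x/l, Np.y/l, Np.z/l };
        return out;
    }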
60. Displacement Mapping
- Use texture to displace the surface geometry
Bump mapping only affects the normals; displacement mapping changes the entire surface (including the silhouette)
61. 3D Textures
Usually stored procedurally
Can simulate an object carved from a material
62. 3D Textures - Noise
- Pseudo-random
- Band-limited
- few high or low frequencies
- Controllable
63. 3D Textures - Noise
- Create an n-D integer-aligned lattice of random numbers
- For any n-D point p, noise is defined as:

    noise(nDPoint p)
        find the 2^n lattice neighbors of p
        linearly interpolate the neighbors' table values
        return the interpolated value
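A minimal 1-D sketch of this lattice noise (the integer hash and its constants are one common arbitrary choice; the 3-D version interpolates the 2^3 = 8 surrounding lattice values the same way):

    #include <math.h>   /* floorf */

    /* pseudo-random value attached to an integer lattice point */
    static float lattice(int i) {
        i = (i << 13) ^ i;
        return 1.0f - ((i * (i * i * 15731 + 789221) + 1376312589) & 0x7fffffff) / 1073741824.0f;
    }

    /* 1-D value noise: linearly interpolate the two lattice neighbors of x */
    float noise1(float x) {
        int i = (int)floorf(x);
        float f = x - i;                 /* fractional position between neighbors */
        return lattice(i) + f * (lattice(i + 1) - lattice(i));
    }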
64. Turbulence
- Noise with self-similarity
Add together many octaves of noise
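A sketch of summing octaves on top of the noise1 function above (the octave count and the absolute values follow the usual turbulence recipe and are assumptions, not from the slide):

    #include <math.h>   /* fabsf */

    /* sum |noise| over octaves, halving amplitude and doubling frequency each time */
    float turbulence1(float x, int octaves) {
        float sum = 0.0f, amp = 1.0f, freq = 1.0f;
        for (int k = 0; k < octaves; k++) {
            sum += amp * fabsf(noise1(x * freq));
            amp  *= 0.5f;
            freq *= 2.0f;
        }
        return sum;
    }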
65. Turbulence
66. Animating Turbulence
- Use an extra dimension as time
67. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
68. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT
69. Slide courtesy of Leonard McMillan and Jovan Popovic, MIT