Title: Last Time
Last Time
- Shading Interpolation
- Texture Mapping introduction
Today
- Texture Mapping details
- Modeling Intro (maybe)
- Homework 5
Basic OpenGL Texturing
- Specify texture coordinates for the polygon
  - Use glTexCoord2f(s,t) before each vertex
  - E.g. glTexCoord2f(0,0); glVertex3f(x,y,z);
- Create a texture object and fill it with texture data (see the sketch below)
  - glGenTextures(num, indices) to get identifiers for the objects
  - glBindTexture(GL_TEXTURE_2D, identifier) to bind the texture
    - Following texture commands refer to the bound texture
  - glTexParameteri(GL_TEXTURE_2D, param, value) to specify parameters for use when applying the texture
  - glTexImage2D(GL_TEXTURE_2D, ...) to specify the texture data (the image itself)
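A minimal setup sketch tying these calls together; the image array, its size, and the texture id variable are illustrative placeholders, not from the slides:

    #include <GL/gl.h>

    GLuint texId;   /* identifier returned by glGenTextures */

    /* Create a texture object and fill it with image data (assumed to be a
     * tightly packed RGB image whose width and height are powers of 2). */
    void setupTexture(const GLubyte *imageData, int width, int height)
    {
        glGenTextures(1, &texId);              /* get an identifier for the object */
        glBindTexture(GL_TEXTURE_2D, texId);   /* following texture commands refer to it */

        /* parameters used when the texture is applied */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        /* the texture data (the image itself) */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, imageData);
    }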
Basic OpenGL Texturing (cont)
- Enable texturing: glEnable(GL_TEXTURE_2D)
- State how the texture will be used: glTexEnvf()
- Texturing is done after lighting
- You're ready to go (a drawing sketch follows)
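Continuing the sketch above, a textured quad might be drawn like this; the vertex positions and texture coordinates are made up for illustration:

    /* Draw one textured quad using the texture object created above. */
    void drawTexturedQuad(void)
    {
        glEnable(GL_TEXTURE_2D);                  /* enable texturing */
        glBindTexture(GL_TEXTURE_2D, texId);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);  /* how the texture is used */

        glBegin(GL_QUADS);                        /* texture coordinate before each vertex */
            glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
            glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
            glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
            glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
        glEnd();
    }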
Nasty Details
- There is a large range of functions for controlling the layout of texture data
- You must state how the data in your image is arranged
  - E.g. glPixelStorei(GL_UNPACK_ALIGNMENT, 1) tells OpenGL not to skip bytes at the end of a row
- You must state how you want the texture to be put in memory: how many bits per pixel, which channels, and so on (see the sketch below)
- Texture width and height must be powers of 2
  - Common sizes are 32x32, 64x64, 256x256
  - Smaller uses less memory, and there is a finite amount of texture memory on graphics cards
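A sketch of the layout calls above, assuming a hypothetical 64x64, tightly packed RGB image:

    #include <GL/gl.h>

    /* Describe the layout of a tightly packed 64x64 RGB image to OpenGL. */
    void specifyImage(const GLubyte *imageData)
    {
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* rows are not padded; don't skip bytes at row ends */
        glTexImage2D(GL_TEXTURE_2D,
                     0,                          /* level 0 = the base image */
                     GL_RGB8,                    /* storage in texture memory: 8 bits per channel */
                     64, 64,                     /* width and height, powers of 2 */
                     0,                          /* no border */
                     GL_RGB, GL_UNSIGNED_BYTE,   /* arrangement of the application's array */
                     imageData);
    }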
Controlling Different Parameters
- The pixels in the texture map may be interpreted as many different things, for example:
  - As colors in RGB or RGBA format
  - As grayscale intensity
  - As alpha values only
- The data can be applied to the polygon in many different ways:
  - Replace: replace the polygon color with the texture color
  - Modulate: multiply the polygon color with the texture color or intensity
  - Similar to compositing: composite the texture with the base color using an operator
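One combination of these choices, as a sketch: the internal format says how the texels are interpreted, the texture environment mode says how they are applied. Here the data is treated as grayscale intensity and modulated with the polygon color; the other interpretations (GL_RGB, GL_RGBA, GL_ALPHA) and modes (GL_REPLACE, GL_DECAL) are selected the same way.

    #include <GL/gl.h>

    /* Interpret the texels as grayscale intensity and apply them by modulation. */
    void specifyIntensityTexture(const GLubyte *grayData, int w, int h)
    {
        glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, w, h, 0,
                     GL_LUMINANCE, GL_UNSIGNED_BYTE, grayData);      /* grayscale intensity */
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); /* multiply with polygon color */
    }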
Example: Diffuse Shading and Texture
- Say you want to have an object textured and have the texture appear to be diffusely lit
- Problem: texture is applied after lighting, so how do you adjust the texture's brightness?
- Solution:
  - Make the polygon white and light it normally
  - Use glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE)
  - Use GL_RGB for the internal format
  - Then the texture color is multiplied by the surface (fragment) color, and alpha is taken from the fragment
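One possible rendering of that recipe; the material and light setup are illustrative, not prescribed by the slides:

    #include <GL/gl.h>

    /* White, normally lit polygon modulated by an RGB texture. */
    void setupLitTexturing(void)
    {
        GLfloat white[] = { 1.0f, 1.0f, 1.0f, 1.0f };

        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);
        glMaterialfv(GL_FRONT, GL_AMBIENT_AND_DIFFUSE, white);  /* white surface, lit normally */

        glEnable(GL_TEXTURE_2D);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
        /* with a GL_RGB internal format, the texture color multiplies the lit
         * fragment color, so the texture appears diffusely shaded; alpha is
         * taken from the fragment */
    }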
Some Other Uses
- There is a decal mode for textures, which replaces the surface color with the texture color, as if you stuck on a decal
  - Texturing happens after lighting, so the lighting information is lost
- BUT you can use the texture to store lighting information and generate better looking lighting (overriding OpenGL's lighting)
  - Put the color information in the polygon, and use the texture for the brightness information
  - Called light maps
  - Normally done with multiple texture layers: one for color, one for light (sketched below)
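A rough multitexturing sketch of a light map, using OpenGL 1.3 style calls; the texture ids, coordinates, and vertex values are placeholders:

    #include <GL/gl.h>

    /* Bind a color texture to unit 0 and a light map to unit 1, modulating them. */
    void bindLightMappedTextures(GLuint colorTexId, GLuint lightTexId)
    {
        glActiveTexture(GL_TEXTURE0);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, colorTexId);

        glActiveTexture(GL_TEXTURE1);
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, lightTexId);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);  /* color x stored brightness */
    }

    /* Per vertex, give one coordinate pair per texture unit. */
    void lightMappedVertex(float s0, float t0, float s1, float t1,
                           float x, float y, float z)
    {
        glMultiTexCoord2f(GL_TEXTURE0, s0, t0);
        glMultiTexCoord2f(GL_TEXTURE1, s1, t1);
        glVertex3f(x, y, z);
    }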
Textures and Aliasing
- Textures are subject to aliasing
  - A polygon pixel maps into a texture image, essentially sampling the texture at a point
  - The situation is very similar to resizing an image, but the resize ratio may change across the image
- Standard approaches:
  - Pre-filtering: filter the texture down before applying it
  - Post-filtering: take multiple pixels from the texture and filter them before applying the result to the polygon fragment
Point Sampled Texture Aliasing
[Figure: a texture map, a polygon far from the viewer in perspective projection, and the rasterized, textured result]
- Note that the back row is a very poor representation of the true image
Mipmapping (Pre-filtering)
- If a textured object is far away, one screen pixel (on an object) may map to many texture pixels
  - The problem is how to combine them
- A mipmap is a low resolution version of a texture
  - The texture is filtered down as a pre-processing step: gluBuild2DMipmaps() (see the sketch below)
  - When the textured object is far away, use the mipmap chosen so that one image pixel maps to at most four mipmap pixels
- A full set of mipmaps requires only about one third more storage than the original texture
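A sketch of building the mipmap chain with GLU and selecting it for minification; imageData and its dimensions are placeholders as before:

    #include <GL/gl.h>
    #include <GL/glu.h>

    /* Build the full mipmap chain and use it when the texture is minified. */
    void setupMipmaps(const GLubyte *imageData, int width, int height)
    {
        gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, width, height,
                          GL_RGB, GL_UNSIGNED_BYTE, imageData);   /* filters the image down level by level */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                        GL_LINEAR_MIPMAP_LINEAR);                 /* use the mipmaps when far away */
    }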
Many Texels for Each Pixel
[Figure: a texture map with screen pixels drawn on it; some pixels cover many texture elements (texels). The polygon is far from the viewer in perspective projection.]
Mipmaps
[Figure: the mipmap chain - small levels for far objects, middle levels for middle-distance objects, the full resolution texture for near objects]
Mipmap Math
- Define a scale factor, ρ = texels/pixel
  - A texel is a pixel from a texture
  - ρ is actually the maximum over x and y
- The scale factor may vary over a polygon
  - It can be derived from the transformation matrices
- Define λ = log2 ρ
  - λ tells you which mipmap level to use (a small sketch follows)
  - Level 0 is the original texture, level 1 is the next smallest texture, and so on
  - If λ < 0, then multiple pixels map to one texel: magnification
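A minimal sketch of this level computation; the function name is made up, and real implementations derive ρ per pixel from the transformation matrices:

    #include <math.h>

    float mipmapLevel(float rho)        /* rho = texels per pixel, the max over x and y */
    {
        float lambda = log2f(rho);      /* lambda selects the mipmap level */
        if (lambda < 0.0f)              /* lambda < 0: magnification, stay at level 0 */
            return 0.0f;
        return lambda;                  /* e.g. rho = 1.4 gives lambda of about 0.49 */
    }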
Post-Filtering
- You tell OpenGL what sort of post-filtering to do
- Magnification: when λ < 0 the image pixel is smaller than the texel
  - glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, type)
  - type is GL_LINEAR or GL_NEAREST
- Minification: when λ > 0 the image pixel is bigger than the texel
  - glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, type)
  - Can choose to:
    - Take the nearest point in the base texture, GL_NEAREST
    - Linearly interpolate the nearest 4 pixels in the base texture, GL_LINEAR
    - Take the nearest mipmap and then take the nearest point or interpolate within that mipmap, GL_NEAREST_MIPMAP_NEAREST or GL_LINEAR_MIPMAP_NEAREST
    - Interpolate between the two nearest mipmaps using the nearest or interpolated points from each, GL_NEAREST_MIPMAP_LINEAR or GL_LINEAR_MIPMAP_LINEAR
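For example, bilinear filtering for magnification and trilinear filtering for minification could be requested as:

    #include <GL/gl.h>

    /* Bilinear magnification, trilinear minification. */
    void setPostFiltering(void)
    {
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);               /* lambda < 0 */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR); /* lambda > 0 */
    }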
Filtering Example
[Figure: mipmap levels 0, 1, and 2 for a sample point with s = 0.12, t = 0.1, ρ = 1.4, λ ≈ 0.49.
NEAREST_MIPMAP_NEAREST takes pixel (0,0) of level 0;
LINEAR_MIPMAP_NEAREST takes a combination of pixels (0,0), (1,0), (1,1), (0,1) of level 0;
NEAREST_MIPMAP_LINEAR blends pixel (0,0) of level 0 (weight 0.51) with pixel (0,0) of level 1 (weight 0.49).]
Boundaries
- You can control what happens if a point maps to a texture coordinate outside of the texture image
- All texture images are assumed to go from (0,0) to (1,1) in texture space
- The problem is how to extend the image to make an infinite space
- Repeat: assume the texture is tiled
  - glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT)
- Clamp: the texture coordinates are clamped (truncated) to valid values and then used
  - glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP)
- Can specify a special border color
  - glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, color), where color is an {R,G,B,A} array
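The three behaviors sketched as calls; the choice of red for the border color is purely illustrative:

    #include <GL/gl.h>

    /* Tile in the s direction, clamp in the t direction, and set a border color. */
    void setWrapModes(void)
    {
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

        GLfloat border[] = { 1.0f, 0.0f, 0.0f, 1.0f };
        glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, border);
    }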
Repeat Border
[Figure: a surface whose texture coordinates run past the (0,0)-(1,1) range; the texture tiles across it]
Clamp Border
[Figure: the same surface with clamping; the edge texels extend beyond (0,0)-(1,1)]
Border Color
[Figure: the same surface with the region outside (0,0)-(1,1) filled with the border color]
Other Texture Stuff
- Texture must be in fast memory - it is accessed for every pixel drawn
  - If you exceed it, performance will degrade horribly
  - Skilled artists can pack textures for different objects into one image
- Texture memory is typically limited, so a range of functions are available to manage it
- Specifying texture coordinates can be annoying, so there are functions to automate it (one is sketched below)
- Sometimes you want to apply multiple textures to the same point: multitexturing is now in most new hardware
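One of the functions that automates texture coordinates is coordinate generation; a sketch that derives s from the object-space x coordinate (the plane coefficients are just an example):

    #include <GL/gl.h>

    /* Generate s automatically from the object-space x coordinate. */
    void enableTexCoordGeneration(void)
    {
        GLfloat sPlane[] = { 1.0f, 0.0f, 0.0f, 0.0f };          /* s = 1*x + 0*y + 0*z + 0*w */
        glTexGeni(GL_S, GL_TEXTURE_GEN_MODE, GL_OBJECT_LINEAR);
        glTexGenfv(GL_S, GL_OBJECT_PLANE, sPlane);
        glEnable(GL_TEXTURE_GEN_S);                             /* s is now supplied per vertex automatically */
    }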
Yet More Texture Stuff
- There is a texture matrix: apply a matrix transformation to texture coordinates before indexing the texture (sketched below)
- There are image processing operations that can be applied to the pixels coming out of the texture
- There are 1D and 3D textures
  - Mapping works essentially the same way
  - 3D textures are very memory intensive, and how they are used is very application dependent
  - 1D saves memory if the texture is inherently 1D, like stripes
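A sketch of the texture matrix in use; the offset parameter is a placeholder, e.g. for scrolling a texture over time:

    #include <GL/gl.h>

    /* Translate texture coordinates before lookup. */
    void scrollTexture(float offset)
    {
        glMatrixMode(GL_TEXTURE);
        glLoadIdentity();
        glTranslatef(offset, 0.0f, 0.0f);   /* shift s before the texture is indexed */
        glMatrixMode(GL_MODELVIEW);         /* back to the usual matrix */
    }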
Procedural Texture Mapping
- Instead of looking up an image, pass the texture coordinates to a function that computes the texture value on the fly
  - RenderMan, the Pixar rendering language, does this
  - Available in a limited form with vertex shaders on current generation hardware
- Advantages:
  - Near-infinite resolution with small storage cost
  - The idea works for many other things
- Has the disadvantage of being slow in many cases
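The idea in miniature: a checkerboard computed directly from (s, t) rather than looked up in an image. This is a plain C sketch of the concept, not a fixed-pipeline OpenGL feature:

    /* Return white or black depending on which cell of an 8x8 checkerboard (s,t) falls in,
     * for s and t in [0,1]. */
    void checkerColor(float s, float t, float rgb[3])
    {
        int cell = ((int)(s * 8.0f) + (int)(t * 8.0f)) & 1;
        rgb[0] = rgb[1] = rgb[2] = cell ? 1.0f : 0.0f;
    }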
Other Types of Mapping
- Environment mapping looks up incoming illumination in a map
  - Simulates reflections from shiny surfaces
- Bump mapping computes an offset to the normal vector at each rendered pixel
  - No need to put bumps in the geometry, but the silhouette looks wrong
- Displacement mapping adds an offset to the surface at each point
  - Like putting bumps on the geometry, but simpler to model
- All are available in software renderers, such as RenderMan compliant renderers
- All of these are becoming available in hardware
The Story So Far
- We've looked at images and image manipulation
- We've looked at rendering from polygons
- Next major section:
  - Modeling
Modeling Overview
- Modeling is the process of describing an object
- Sometimes the description is an end in itself
  - E.g. Computer Aided Design (CAD), Computer Aided Manufacturing (CAM)
  - The model is an exact description
- More typically in graphics, the model is then used for rendering (we will work on this assumption)
  - The model only exists to produce a picture
  - It can be an approximation, as long as the visual result is good
- The computer graphics motto: "If it looks right, it is right"
  - Doesn't work for CAD
Issues in Modeling
- There are many ways to represent the shape of an object
- What are some things to think about when choosing a representation?
Categorizing Modeling Techniques
- Surface vs. Volume
  - Sometimes we only care about the surface
    - Rendering and geometric computations
  - Sometimes we want to know about the volume
    - Medical data with information attached to the space
  - Some representations are best thought of as defining the space filled, rather than the surface around the space
- Parametric vs. Implicit
  - Parametric representations generate all the points on a surface (volume) by plugging in a parameter, e.g. (sin θ cos φ, sin θ sin φ, cos θ)
  - Implicit models tell you whether a point is on (in) the surface (volume), e.g. x² + y² + z² - 1 = 0 (both are sketched below)
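The two sphere descriptions used as examples above, written out as a small sketch; eps is a tolerance for the floating point test:

    #include <math.h>

    /* Parametric: plug in (theta, phi) and generate a point on the unit sphere. */
    void spherePoint(double theta, double phi, double p[3])
    {
        p[0] = sin(theta) * cos(phi);
        p[1] = sin(theta) * sin(phi);
        p[2] = cos(theta);
    }

    /* Implicit: test whether a given point satisfies x^2 + y^2 + z^2 - 1 = 0. */
    int onUnitSphere(double x, double y, double z, double eps)
    {
        return fabs(x * x + y * y + z * z - 1.0) < eps;
    }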
Techniques We Will Examine
- Polygon meshes
  - Surface representation, parametric representation
- Prototype instancing and hierarchical modeling
  - Surface or volume, parametric
- Volume enumeration schemes
  - Volume, parametric or implicit
- Parametric curves and surfaces
  - Surface, parametric
- Subdivision curves and surfaces
- Procedural models