
MODULE - 4

3D Viewing and Visible Surface Detection

3D Viewing

3D viewing involves tasks that are not present in 2D viewing:
• Projection
• Visibility checks
• Lighting effects, etc.
3D Viewing

• First, set up a viewing (or camera) coordinate reference.
• Then choose the position and orientation of a view plane (or projection plane), which corresponds to a camera's film plane.
3D Viewing Concepts

[Figure: a scene described in the world coordinate system and the same scene relative to the viewing coordinate system.]
3D Viewing Transformation Pipeline
Modeling Coordinates
   | Construct world-coordinate scene from modeling-coordinate transformations
   v
World Coordinates
   | Convert world coordinates to viewing coordinates
   v
Viewing Coordinates
   | Projection transformation
   v
Projection Coordinates
   | Transform projection coordinates to normalized coordinates
   v
Normalized Coordinates
   | Map normalized coordinates to device coordinates
   v
Device Coordinates
• The model is given in its own modeling (local) coordinates.
• It is converted to world coordinates.
• A viewing coordinate system, which defines the position and orientation of the projection plane (the film plane in a camera), is selected, and the scene is converted to it.
• A 2D clipping window (the camera's lens) is defined on the projection plane (film plane), and a 3D clipping region, called the view volume, is established (see the OpenGL sketch below).
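As a rough illustration, here is a hedged sketch of how these pipeline stages map onto legacy fixed-function OpenGL calls; all numeric values and the helper name setupViewing are placeholders, not from the slides:

    #include <GL/glu.h>   /* also pulls in GL/gl.h */

    void setupViewing(void)
    {
        /* Projection + normalization: define the view volume. */
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glOrtho(-1.0, 1.0, -1.0, 1.0, 0.1, 100.0);

        /* World -> viewing coordinates: position the camera. */
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        gluLookAt(0.0, 0.0, 5.0,   /* eye position  */
                  0.0, 0.0, 0.0,   /* look-at point */
                  0.0, 1.0, 0.0);  /* view-up vector */

        /* Normalized -> device coordinates. */
        glViewport(0, 0, 640, 480);
    }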
• The shape and size of the view volume are defined by the dimensions of the clipping window, the type of projection, and the limiting positions along the viewing direction.
• Objects are mapped to normalized coordinates, and all parts of the scene outside the view volume are clipped off.
• Clipping is applied after all device-independent transformations are completed, so efficient transformation concatenation is possible.
• A few other tasks, such as hidden-surface removal and surface rendering, take place along the pipeline.
Projection Transformations – Parallel Projection
Projection Transformations
• In parallel projection, coordinate positions are transferred to the view plane along parallel lines:
  • orthogonal (orthographic)
  • oblique
• In perspective projection, coordinates are transferred to the view plane along lines that converge at a point.
Projection Transformations
The next step in the 3D viewing pipeline is projection of the object onto the view plane.

Parallel Projection
Coordinates are transferred to the view plane along parallel lines.

[Figure: parallel projection lines meeting the view plane.]

Parallel projection preserves the relative sizes of an object's portions. The projection can be perpendicular or oblique to the view plane.
Orthogonal Projection Coordinates

• If the projection direction is parallel to z_view:
      x_p = x,  y_p = y
• The z coordinate is kept for visibility-detection procedures.

Fig. An orthogonal projection of a spatial position onto a view plane.
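A minimal sketch of this mapping; the type and function names are illustrative:

    typedef struct { double x, y, z; } Point3;

    /* Orthogonal projection onto the view plane: x and y pass
     * through unchanged, and z is retained for later depth tests. */
    void orthoProject(Point3 p, double *xp, double *yp, double *depth)
    {
        *xp    = p.x;
        *yp    = p.y;
        *depth = p.z;
    }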
Orthogonal (orthographic) projections

[Figure: orthogonal projections of an object onto a view plane.]
Normalizing Orthogonal Projection

[Figure: the orthogonal-projection view volume, with corners (xw_min, yw_min, z_near) and (xw_max, yw_max, z_far) in viewing coordinates (x_view, y_view, z_view), is mapped to the normalized view volume with corners (−1, −1, −1) and (1, 1, 1) in (x_norm, y_norm, z_norm).]

The display coordinate system is usually left-handed.
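A hedged sketch of that mapping in C: each axis is scaled and translated into [−1, 1], the same form glOrtho produces. The function name and parameter layout are illustrative:

    /* Map a point p inside the view volume
     * [xwmin, xwmax] x [ywmin, ywmax] x [znear, zfar]
     * into the normalized cube [-1, 1]^3. */
    void normalizeOrtho(const double p[3], double out[3],
                        double xwmin, double xwmax,
                        double ywmin, double ywmax,
                        double znear, double zfar)
    {
        out[0] = 2.0 * (p[0] - xwmin) / (xwmax - xwmin) - 1.0;
        out[1] = 2.0 * (p[1] - ywmin) / (ywmax - ywmin) - 1.0;
        /* A sign flip on z (as glOrtho applies) would switch to the
         * left-handed frame that display systems usually use. */
        out[2] = 2.0 * (p[2] - znear) / (zfar - znear) - 1.0;
    }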
Orthogonal (orthographic) projections
Projection lines are parallel to the view-plane normal.

[Figure: plan view, front elevation view, and side elevation view of an object.]

Orthographic projections are used in engineering and architecture: lengths and angles can be measured directly from the drawings.
Orthogonal projections

• Projection is along lines parallel to the view-plane normal N.
• Front, side, and rear orthogonal projections are often called elevations.
• The top one is called the plan view.
Clipping Window and View Volume
[Figure: an orthogonal-projection view volume bounded by near and far clipping planes, with the clipping window lying on the view plane in (x_view, y_view, z_view).]
View Volume of Oblique Parallel Projection
[Figure: side and top views of an oblique parallel-projection view volume, showing the view plane, clipping window, near and far planes, and the projection vector Vp, which is not perpendicular to the view plane.]
Projection Transformations – Perspective Projection
Projection Transformations

Perspective Projection

Projection lines converge at a point behind the view plane.

Perspective projection does not preserve relative size, but it looks more realistic.
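A minimal sketch of that convergence, assuming the standard formulation with the projection reference point at the origin and the view plane at z = d; this formulation is an assumption, not spelled out on the slide:

    /* Project (x, y, z) through the origin onto the plane z = d.
     * By similar triangles, xp / d = x / z, so distant points
     * (large |z|) shrink toward the center of projection. */
    void perspectiveProject(double x, double y, double z, double d,
                            double *xp, double *yp)
    {
        *xp = x * d / z;   /* requires z != 0 */
        *yp = y * d / z;
    }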
Perspective Projection
There are four parameters involved:
1. Eye (the viewing position)
2. View direction
3. View-up vector
4. Normal to the image plane

In the canonical setup, the eye is at (0, 0, 0), the view direction and the normal to the image plane coincide with the z axis, and the view-up vector coincides with the y axis.
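These are exactly the parameters gluLookAt consumes. A hedged sketch of how the viewing axes could be derived from them; the vector helpers are illustrative:

    #include <math.h>

    typedef struct { double x, y, z; } Vec3;

    static Vec3 sub(Vec3 a, Vec3 b) {
        Vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r;
    }
    static Vec3 cross(Vec3 a, Vec3 b) {
        Vec3 r = { a.y * b.z - a.z * b.y,
                   a.z * b.x - a.x * b.z,
                   a.x * b.y - a.y * b.x }; return r;
    }
    static Vec3 unit(Vec3 a) {
        double len = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
        Vec3 r = { a.x / len, a.y / len, a.z / len }; return r;
    }

    /* Build the viewing axes from the eye point, look-at point, and
     * approximate up vector (the usual right-handed construction). */
    void cameraBasis(Vec3 eye, Vec3 look, Vec3 up,
                     Vec3 *u, Vec3 *v, Vec3 *n)
    {
        *n = unit(sub(eye, look));  /* view-plane normal, +z_view */
        *u = unit(cross(up, *n));   /* right vector,      +x_view */
        *v = cross(*n, *u);         /* true view-up,      +y_view */
    }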
Vanishing Points
Vanishing points occur where the view plane intersects the axes of the viewing coordinate system.

[Figure: the principal axes of a cube and its one-point perspective projection, with the vanishing point on the z axis.]

Lines parallel to the z axis in the xz plane, and lines parallel to the z axis in the yz plane, converge to the vanishing point. A vanishing point on the view plane corresponds to infinity in the world.
When the view plane is parallel to the y axis and intersects both the x axis and the z axis, there are two vanishing points: an x-axis vanishing point and a z-axis vanishing point (two-point perspective). Vanishing points for all three axes occur when the view plane intersects all three axes (three-point perspective).
Perspective-Projection View Volume

[Figure: a rectangular-frustum view volume bounded by near and far clipping planes, with the clipping window on the view plane, the field-of-view angle at the apex, and the projection reference point in (x_view, y_view, z_view).]
Settings of Perspective Projection
• Perspective projection (reference) point: where the viewer (camera, eye) is positioned in the world.
• Position of the view plane with respect to the viewing coordinates: results in one, two, or three vanishing points.
• Clipping window on the view plane: defines the infinite-pyramid view volume.
• Near and far clipping planes (parallel to the view plane): define the rectangular-frustum view volume.
• Scale and translation parameters of the perspective matrix: define the normalization range.
Perspective Projection vs. Parallel Projection

• Description:
  Perspective – projector lines (lines of sight) converge at the center of projection, which produces many of the visual effects of the object.
  Parallel – the object in three-dimensional space is projected onto a fixed plane, called the projection (or image) plane, along lines of sight (projection lines) that are parallel to each other.

• Types:
  Perspective – one-point, two-point, and three-point perspective projection.
  Parallel – orthographic parallel projection and oblique parallel projection.

• Accurate view of the object:
  Perspective – cannot give an accurate view of the object.
  Parallel – can give an accurate view of the object.

• Object representation:
  Perspective – represents the object in a three-dimensional way.
  Parallel – represents the object in a different way, as a telescope does.

• Realistic view of the object:
  Perspective – forms a realistic picture of the object.
  Parallel – does not form a realistic view of the object.

• Distance of the object from the center of projection:
  Perspective – finite.
  Parallel – infinite.

• Projectors:
  Perspective – not parallel.
  Parallel – parallel.

• Preservation of the relative proportions of an object:
  Perspective – cannot preserve the relative proportions of an object.
  Parallel – can preserve the relative proportions of an object.

• Lines of projection:
  Perspective – not parallel.
  Parallel – parallel.
OpenGL 3D viewing functions

gluLookAt - Specifies three-dimensional viewing parameters.
glOrtho - Specifies parameters for a clipping window and the near and far clipping planes for an orthogonal projection.
gluPerspective - Specifies the field-of-view angle and other parameters for a symmetric perspective projection.
glFrustum - Specifies parameters for a clipping window and near and far clipping planes for a perspective projection (symmetric or oblique).
glClipPlane - Specifies parameters for an optional clipping plane.
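A hedged example combining two of these calls; all numeric parameters are placeholders:

    /* Symmetric perspective projection: 60-degree field of view,
     * 4:3 aspect ratio, near plane at 1.0, far plane at 100.0. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, 4.0 / 3.0, 1.0, 100.0);

    /* Optional extra clipping plane Ax + By + Cz + D = 0; this one
     * keeps points with z >= -10 in eye coordinates, clipping away
     * more distant geometry. */
    GLdouble eqn[4] = { 0.0, 0.0, 1.0, 10.0 };
    glClipPlane(GL_CLIP_PLANE0, eqn);
    glEnable(GL_CLIP_PLANE0);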
Visible Surface Detection Methods – Back-Face Detection Algorithm
The problem of Visibility – Occlusion

Given a set of 3D surfaces to be projected onto a 2D screen, find the nearest surface corresponding to each point on the screen.
• Visible-surface detection algorithms can be classified according to whether they deal with the object definitions or with their projected images.
• The two approaches to visible-surface detection are object-space methods and image-space methods.
Visible surface detection algorithm
• Object space – back-face detection algorithm
• Image space – depth-buffer algorithm
• An object-space method compares objects and parts of objects to each other within the scene definition to determine which surfaces, as a whole, should be labeled as visible.
• In an image-space algorithm, visibility is decided point by point at each pixel position on the projection plane.
Object-space methods
Compare parts of objects to each other to determine which surfaces should be labeled as visible (using bounding boxes and checking limits along each direction). Order the surfaces being drawn so that they give the correct impression of depth variations and positions.

Image-Space Methods
Visibility is decided point by point at each pixel position on the projection plane. Screen resolution can be a limitation.

Hidden-surface (or hidden-line) removal applies whether the scene is rendered as surfaces or drawn as lines.
Back-Face Detection Algorithm
A fast and simple object-space method for locating the back faces of a polyhedron is based on front-back tests. A point (x, y, z) is behind a polygon surface if

    Ax + By + Cz + D < 0

where A, B, C, and D are the plane parameters for the polygon. When this position is along the line of sight to the surface, we must be looking at the back of the polygon. Therefore, we can use the viewing position to test for back faces.
Back-Face Detection Algorithm
We can simplify the back-face test by considering the direction of the normal vector N of a polygon surface. If V_view is a vector in the viewing direction from the camera position (Figure 1), then a polygon is a back face if

    V_view · N > 0

If the viewing direction is parallel to the viewing z_v axis, we need to consider only the z component of the normal vector N.
Back-Face Detection Algorithm
In a right-handed viewing system with the viewing direction along the negative z_v axis (Figure 2), a polygon is a back face if the z component, C, of its normal vector N satisfies C < 0. We also cannot see any face whose normal has z component C = 0, because the viewing direction grazes that polygon. In general, we can label any polygon as a back face if its normal vector has a z component that satisfies C ≤ 0.
Back-Face Detection Algorithm – Summary
Compute N for every face of the object.

    if (C < 0)        // C = z component of N
        back face: don't draw
    else
        front face: draw

In a right-handed system, if the z component of the normal vector is negative, the face is a back face; if the z component is positive, it is a front face.
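A hedged C sketch of the complete test for this right-handed setup; the Vec3 type and function names are illustrative, not from the slides:

    typedef struct { double x, y, z; } Vec3;

    /* Normal of a polygon from three counter-clockwise vertices,
     * via the cross product (v1 - v0) x (v2 - v0). */
    Vec3 polygonNormal(Vec3 v0, Vec3 v1, Vec3 v2)
    {
        Vec3 a = { v1.x - v0.x, v1.y - v0.y, v1.z - v0.z };
        Vec3 b = { v2.x - v0.x, v2.y - v0.y, v2.z - v0.z };
        Vec3 n = { a.y * b.z - a.z * b.y,
                   a.z * b.x - a.x * b.z,
                   a.x * b.y - a.y * b.x };
        return n;
    }

    /* Viewing along the negative z_view axis: a face is a back face
     * if the z component (C) of its normal satisfies C <= 0. */
    int isBackFace(Vec3 n)
    {
        return n.z <= 0.0;
    }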
Back-Face Detection Algorithm
• The surface normal points outwards.
• Seen from the viewer's side, the edges of a front-facing polygon appear to be drawn counter-clockwise.
• The polygon is a back face if C > 0 (viewing along the positive z direction).
• If the view direction is along the negative z direction, the condition becomes C < 0.
Drawbacks of back-face culling:
• Partially hidden faces cannot be determined by this method.
• Not useful for ray tracing or photometry/radiosity.

However, back-face culling is still useful as a preprocessing step, since roughly 50% of the surfaces are eliminated.
Visible Surface Detection Methods – Depth-Buffer (Z-Buffer) Algorithm
Depth-Buffer (Z-Buffer) Algorithm
• Each (x, y, z) point on a polygon surface corresponds to the orthographic projection point (x, y) on the view plane.
• At each point (x, y) on the projection plane, object depths are compared using the depth (z) values.
Depth-Buffer (Z-Buffer) Algorithm
• The depth-buffer method is an image-space approach that compares surface depth values throughout a scene at each pixel position on the projection plane.
• Each surface of a scene is processed separately, one pixel position at a time, across the surface.
• The algorithm is usually applied to scenes containing only polygon surfaces, because depth values can be computed very quickly and the method is easy to implement.
Depth-Buffer (Z-Buffer) Algorithm
Assume normalized depth coordinates: 0 ≤ z ≤ z_max, with z = 0 at the front clipping plane (the view plane) and z_max = 1 at the back clipping plane.

Two buffer areas are used:
(i) Depth (z) buffer: stores the depth value for each (x, y) position as surfaces are processed.
(ii) Refresh buffer: stores the intensity (color) value at each (x, y) position.
Depth-Buffer (Z-Buffer) Algorithm
• This visibility-detection approach is also frequently referred to as the z-buffer method, because object depth is usually measured along the z axis of the viewing system.
• However, rather than using actual z coordinates within the scene, depth-buffer algorithms often compute a distance from the view plane along the z axis.
• The depth (z) buffer stores the depth value for each (x, y) position as surfaces are processed.
Depth-Buffer (Z-Buffer) Algorithm
The depth-buffer processing steps are summarized in the following algorithm, which assumes that depth values are normalized on the range from 0.0 to 1.0, with the view plane at depth 0 and the farthest depth at 1.
Z-Buffer Algorithm
Given the depth values at the vertex positions of any polygon in a scene, we can calculate the depth at any other point on the plane containing the polygon. At surface position (x, y), the depth is calculated from the plane equation Ax + By + Cz + D = 0 as

    z = (−Ax − By − D) / C,   C ≠ 0
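A small sketch of that calculation in C; the incremental form noted in the comment is the usual scan-line optimization, derived from the same plane equation, though it is not spelled out on this slide:

    /* Depth at surface position (x, y) on the plane
     * Ax + By + Cz + D = 0, assuming C != 0. */
    double depthAt(double A, double B, double C, double D,
                   double x, double y)
    {
        return (-A * x - B * y - D) / C;
    }

    /* Moving one pixel right along a scan line changes the depth by
     * a constant amount:  z(x + 1, y) = z(x, y) - A / C,
     * so successive depths need only one subtraction. */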
Z-Buffer Algorithm – Summary

Initialize the depth of each pixel:  d(x, y) = maximum depth (1.0 in normalized coordinates).
Initialize the color of each pixel:  c(x, y) = background color.

    for each polygon:
        for each pixel (x, y) in the polygon's projection:
            z = depth of the polygon at (x, y)   // from the plane equation
            if (z < d(x, y)) {
                d(x, y) = z;       // record the new nearest depth
                c(x, y) = color;   // and the polygon's color
            }
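For concreteness, a minimal C sketch of the same per-pixel test; the buffer sizes and function names are illustrative assumptions:

    #define WIDTH  640
    #define HEIGHT 480

    static double   depthBuf[HEIGHT][WIDTH];  /* depth (z) buffer */
    static unsigned colorBuf[HEIGHT][WIDTH];  /* refresh buffer   */

    /* Reset both buffers before processing any surfaces. */
    void clearBuffers(unsigned background)
    {
        for (int y = 0; y < HEIGHT; y++)
            for (int x = 0; x < WIDTH; x++) {
                depthBuf[y][x] = 1.0;   /* farthest normalized depth */
                colorBuf[y][x] = background;
            }
    }

    /* Per-pixel visibility test: keep whichever surface is nearest. */
    void plotIfVisible(int x, int y, double z, unsigned color)
    {
        if (z < depthBuf[y][x]) {
            depthBuf[y][x] = z;
            colorBuf[y][x] = color;
        }
    }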
OpenGL Visibility detection functions

glCullFace - Specifies front or back faces of polygons for culling operations, when activated with glEnable(GL_CULL_FACE).
glutInitDisplayMode - Specifies depth-buffer operations using the argument GLUT_DEPTH.
glClear(GL_DEPTH_BUFFER_BIT) - Initializes depth-buffer values to the default (1.0) or to a value specified by the glClearDepth function.
glClearDepth - Specifies an initial depth-buffer value.
OpenGL Visibility detection functions
glEnable(GL_DEPTH_TEST) - Activates depth-testing operations.
glDepthRange - Specifies a range for normalizing depth values.
glDepthFunc - Specifies a depth-testing condition.
glDepthMask - Sets the write status for the depth buffer.
glPolygonOffset - Specifies an offset to eliminate hidden lines in a wire-frame display when a background fill color is applied.
glFog - Specifies linear depth-cueing operations and values for the minimum and maximum depth in the depth-cueing calculations.
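A hedged sketch of a typical initialization sequence using these functions; GLUT window creation and the render loop are omitted:

    /* Request a depth buffer along with the color buffer. */
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB | GLUT_DEPTH);

    /* Enable depth testing and back-face culling. */
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);      /* keep fragments with smaller depth */
    glDepthRange(0.0, 1.0);    /* default normalization range       */
    glClearDepth(1.0);         /* initial (farthest) depth value    */
    glEnable(GL_CULL_FACE);
    glCullFace(GL_BACK);       /* discard back-facing polygons      */

    /* At the start of each frame: */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);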
