Slicing: 3D texture mapping
Store the volume in solid (3D) texture memory.
For all k screen-parallel image planes at distances l_k:
– intersect the slicing plane l_k with the volume and trilinearly interpolate f using 3D texture mapping (this requires computing 3D texture coordinates for the vertices of the slice polygon).
– blend the texture-mapped slice into the frame buffer (using back-to-front alpha blending).
3D texture mapping is implemented in hardware.
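As a rough illustration of the blending step, here is a minimal CPU sketch in Python/NumPy that composites pre-classified RGBA slices back to front; a real implementation would instead rasterize view-aligned slice polygons with hardware 3D texture mapping, and the demo volume below is invented.

# Minimal CPU sketch of slice compositing, assuming the transfer function has
# already turned the volume into per-voxel RGBA values.
import numpy as np

def composite_back_to_front(rgba_volume: np.ndarray) -> np.ndarray:
    """rgba_volume: (K, H, W, 4) slices ordered front-to-back along the view axis.
    Returns an (H, W, 3) image produced by back-to-front alpha blending."""
    k, h, w, _ = rgba_volume.shape
    image = np.zeros((h, w, 3))
    # Walk the slices from the farthest (index k-1) to the nearest (index 0).
    for i in range(k - 1, -1, -1):
        color = rgba_volume[i, :, :, :3]
        alpha = rgba_volume[i, :, :, 3:4]
        # C_out = alpha * C_slice + (1 - alpha) * C_in  (the "over" operator)
        image = alpha * color + (1.0 - alpha) * image
    return image

# Example: a random pre-classified volume of 64 slices.
demo = np.random.rand(64, 32, 32, 4) * np.array([1, 1, 1, 0.05])
print(composite_back_to_front(demo).shape)  # (32, 32, 3)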
Slice-based Rendering
[Figure: an object sampled by parallel slices, each contributing color and opacity that are accumulated toward the viewer.]
Similar to ray-casting with simultaneous rays.
Rendering by Slicing - examples
MRI 256³ (back-to-front blending):
[Figure: slice-based renderings of the MRI dataset.]
The 3DIVE System
Visualization
– Rendering
• Object-based (region-based)
• Fast
– Data filtering
• Color lookup table
• 3D Image processing
Virtual environment
– Display
– Interaction
Virtual Environment
CAVE
– 8 ft. cubed room
– Front, left, right and floor rear-projection
– Flock of Birds magnetic tracking
ImmersaDesk
– 4 x 5 ft. screen
– Single rear-projected display
– Ascension magnetic tracking
Rendering Method: 3D texture mapping
Transfer functions
A function mapping from scalar values (and maybe their gradients or other evaluated quantities) to color and opacity values.
May involve a sequence of scalar-to-scalar mappings followed by a “coloring” process (color lookup table, shading, etc.).
[Figure: per-channel transfer function curves for Red, Green, Blue, and Alpha.]
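A minimal sketch of such a mapping in Python/NumPy, assuming the transfer function is a piecewise-linear RGBA lookup table over normalized scalar values; the control points below are invented for illustration.

import numpy as np

control_scalars = np.array([0.0, 0.3, 0.5, 1.0])
control_rgba = np.array([
    [0.0, 0.0, 0.0, 0.00],   # lowest values: transparent
    [0.8, 0.5, 0.3, 0.05],   # soft material: faint
    [1.0, 0.9, 0.8, 0.40],   # denser material: more opaque
    [1.0, 1.0, 1.0, 0.90],   # densest material
])

def apply_transfer_function(scalars: np.ndarray) -> np.ndarray:
    """Map an array of scalars in [0,1] to RGBA by interpolating each channel."""
    return np.stack([np.interp(scalars, control_scalars, control_rgba[:, c])
                     for c in range(4)], axis=-1)

volume = np.random.rand(16, 16, 16)
print(apply_transfer_function(volume).shape)  # (16, 16, 16, 4)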
Color & opacity
[Figure: opacity transfer functions plotted over intensity, contrasting a surface-rendering style transfer function with a semi-transparent rendering style.]
Transfer function design
Infinite search space --- requires search-space reduction to avoid invalid and bad transfer functions
Dependent on the visual result --- an interactive process
Approaches:
– Optimizing transfer function parameters by genetic algorithms
– Design galleries
– Image analysis
– Integrated image processing
TF: Parameter optimization
[Diagram (a): Dataset → Parameter generation → Volume rendering → Image population → User evaluation, feeding back into parameter generation.]
[Diagram (b): Dataset → Parameter generation → Volume rendering → Image population → Automatic evaluation guided by user objectives.]
TF: Parameter optimization (2)
1. Encode the solution
2. Generate the initial population
3. Evaluate the initial solutions and assign a fitness to each solution
4. While no satisfactory solution is found:
   4.1 Stochastically select an intermediate population
   4.2 Generate a new solution population
   4.3 Evaluate the new solutions
   4.4 Load-balance the population
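A minimal Python sketch of this loop, assuming a fitness() callback that renders a candidate transfer function and returns a score in [0,1] (user rating or automatic image analysis); population size, mutation range, and the stopping threshold are illustrative, and crossover and load balancing are omitted.

import random

def optimize_tf(fitness, n_samples=32, pop_size=20, generations=50):
    # 1. Encode each solution as n samples of a normalized function [0,1] -> [0,1].
    population = [[random.random() for _ in range(n_samples)] for _ in range(pop_size)]
    for _ in range(generations):
        # 3. / 4.3 Evaluate every solution and assign a fitness value.
        scores = [fitness(sol) for sol in population]
        if max(scores) > 0.95:                 # 4. stop once a solution is satisfactory
            break
        # 4.1 Fitness-proportionate ("roulette wheel") selection.
        intermediate = random.choices(population, weights=scores, k=pop_size)
        # 4.2 Generate the new population by mutation (crossover omitted in this sketch).
        population = [[min(1.0, max(0.0, s + random.uniform(-0.05, 0.05))) for s in sol]
                      for sol in intermediate]
    return max(population, key=fitness)

# Toy fitness: prefer transfer functions that increase with intensity.
toy_fitness = lambda sol: sum(b >= a for a, b in zip(sol, sol[1:])) / (len(sol) - 1)
print(len(optimize_tf(toy_fitness)))  # 32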
TF: Parameter optimization (3)
Solution encoding
– Normalized functions: [0,1] → [0,1]
– A solution: X_i = [s_1, s_2, s_3, …, s_n] (i.e. samples)
Initial solutions (population)
– Random
– User-defined
– Pre-defined simple math functions
Selection of intermediate population
– Genetic algorithm
– Proportionate selection based on fitness values: the expected number of offspring of solution i is f_i / f̄, where f_i is the fitness value of solution i and f̄ is the average fitness value in the population.
TF: Generation of new solutions
Mutation: For each solution X = [s_1, s_2, s_3, …, s_n], a new solution may be generated: Y = [t_1, t_2, t_3, …, t_n], where t_i is a mutation of s_i. For example:
    t_i = s_i + f_m · d_m, where f_m ∈ [−1, 1] is random and d_m is a constant that decreases exponentially.
Crossover:
– Randomly pair solutions
– For each pair, randomly select two points and exchange the segments between the two points.
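A small Python sketch of these two operators; the decay schedule and parameter ranges are illustrative choices, not values given in the slides.

import random

def mutate(solution, generation, d0=0.2, decay=0.9):
    """t_i = s_i + f_m * d_m with f_m random in [-1, 1] and d_m decaying exponentially."""
    d_m = d0 * (decay ** generation)
    return [min(1.0, max(0.0, s + random.uniform(-1.0, 1.0) * d_m)) for s in solution]

def crossover(parent_a, parent_b):
    """Two-point crossover: swap the segment between two random cut points."""
    i, j = sorted(random.sample(range(len(parent_a)), 2))
    child_a = parent_a[:i] + parent_b[i:j] + parent_a[j:]
    child_b = parent_b[:i] + parent_a[i:j] + parent_b[j:]
    return child_a, child_b

a = [0.1] * 8
b = [0.9] * 8
print(mutate(a, generation=3))
print(crossover(a, b))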
TF: Solution evaluation
– Generate an image by volume rendering using each transfer function solution
– User selection (1: like; 0: don’t like; a value in (0,1): somewhere in between)
– Automatic evaluation: an objective function computed by analyzing the images (e.g. histogram analysis)
Parameter optimization (examples)
TF: Design galleries
Parameter optimization: narrowing down solutions
Design galleries: dispersing solutions
Design principles
– Input vector: a piecewise function (polyline)
– Output vector: a selected set of pixels from each rendered image
– Dispersion: finding a set of input-vector parameters that optimally disperse the output vectors (by measuring nearest-neighbor distances)
– Arrangement: organizing the resulting images for easy selection and browsing
TF: Design galleries (2)
Input: a random set of input vectors, I, and their output vectors, O, with |I| = |O| = n
Output: modified input & output vectors, I and O
Procedure Disperse(I, O, t) {
    for i = 1 to t do {
        j = ran_int(1, n);
        u = perturb(I[j], i);           // new transfer function
        map(u, v);                      // generate output vector “v”
        k = worst_index(O);             // index with the smallest nearest-neighbor distance
        if (is_better(v, O[k], O)) {    // replace O[k] with v
            I[k] = u; O[k] = v;
        } else if (is_better(v, O[j], O)) {  // replace O[j] with v
            I[j] = u; O[j] = v;
        }
    }
}
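The helpers ran_int, perturb, map, worst_index and is_better are not defined on the slide; below is a hypothetical Python fill-in for the two dispersion tests, treating each output vector as a plain array of selected pixel values.

import numpy as np

def nearest_distance(v, vectors):
    """Distance from v to its nearest neighbor among 'vectors' (itself excluded by identity)."""
    dists = [np.linalg.norm(np.asarray(v) - np.asarray(o)) for o in vectors if o is not v]
    return min(dists)

def worst_index(O):
    """Index of the output vector that is least dispersed (smallest nearest-neighbor distance)."""
    return min(range(len(O)), key=lambda k: nearest_distance(O[k], O))

def is_better(v, old, O):
    """Would v be farther from its nearest neighbor than 'old' currently is?"""
    others = [o for o in O if o is not old]
    return nearest_distance(v, others) > nearest_distance(old, O)

O = [np.random.rand(5) for _ in range(6)]
print(worst_index(O), is_better(np.random.rand(5), O[0], O))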
TF: Image analysis
Kindlmann and Durkin (IEEE VolVis Symposium ’98)
Looking for boundary features
Edge-detection based
Transform the dataset into a histogram volume
Study the relationship between f, f′ and f″
Position function
Define the opacity function based on distance to the surface, giving a constant boundary thickness.
Assume a Gaussian distribution of f′ across the boundary.
Let v = f(x), p(v) = x, b(x) = b(p(v)).
Boundary model: f(x) = c1 + c2 ∫_{−∞}^{x} e^{−u²/(2σ²)} du,
so that f″(x) = −(x/σ²) f′(x).
Position function: p(v) = −σ² f″(f⁻¹(v)) / f′(f⁻¹(v)) ≈ −σ² h(v) / g(v),
where g(v) is the average f′ along the boundary and h(v) is the average f″ along the boundary.
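A Python sketch of how the position function can be turned into an opacity transfer function of constant thickness, assuming g(v) and h(v) have already been averaged from the histogram volume; the tent-shaped boundary-emphasis function b() and all parameters are illustrative.

import numpy as np

def opacity_from_position(g, h, sigma, thickness=2.0, max_alpha=0.8, eps=1e-6):
    """alpha(v) = b(p(v)), with p(v) = -sigma^2 * h(v) / g(v)."""
    g = np.asarray(g, dtype=float)
    h = np.asarray(h, dtype=float)
    p = -sigma**2 * h / np.where(np.abs(g) > eps, g, eps)
    # b(x): a tent function, fully opaque at the boundary center (p = 0),
    # falling off linearly to zero over the chosen thickness.
    return max_alpha * np.clip(1.0 - np.abs(p) / thickness, 0.0, 1.0)

# Toy example: a single boundary centered on value index 128.
v = np.arange(256, dtype=float)
g = np.exp(-((v - 128) ** 2) / (2 * 20.0 ** 2))   # averaged f' (a Gaussian bump)
h = -(v - 128) / (20.0 ** 2) * g                  # averaged f'' = -(x / sigma^2) * f'
print(opacity_from_position(g, h, sigma=20.0)[124:133].round(2))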
TF: Integrated image processing
Fang, Biddlecome, Tuceryan, IEEE Vis ’98
Integrating image processing and visualization: a more general approach.
Represent a transfer function as a sequence of image processing procedures, with intuitive parameterization:
    F = f_n ∘ f_{n−1} ∘ … ∘ f_2 ∘ f_1
where each f_i is an intensity mapping defined over the volume space, representing the result of an image processing procedure.
Coloring: shading and color table.
Two basic types of intensity mappings
Intensity table: an intensity-to-intensity lookup table representing a piecewise linear function over the volume’s intensity domain: [0,1] → [0,1]
Neighborhood function: a function involving the intensity values in an m×m×m neighborhood of the voxel: D → [0,1]
A typical form is the 3D spatial convolution of the volume V with a mask T:
    f(x, y, z) = Σ_{i,j,k = −m/2}^{m/2} T[i, j, k] · V[x+i, y+j, z+k]
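A Python/SciPy sketch of the two mapping types, and of composing them as F = f2 ∘ f1; the mask and table control points are illustrative, and the volume is assumed to hold floats in [0,1].

import numpy as np
from scipy.ndimage import convolve

def intensity_table(volume, xs, ys):
    """Piecewise-linear intensity-to-intensity mapping [0,1] -> [0,1]."""
    return np.interp(volume, xs, ys)

def neighborhood_function(volume, mask):
    """f(x,y,z) = sum_{i,j,k} T[i,j,k] * V[x+i, y+j, z+k] (3D convolution)."""
    return convolve(volume, mask, mode="nearest")

V = np.random.rand(32, 32, 32)
box = np.full((3, 3, 3), 1.0 / 27.0)                # 3x3x3 averaging mask T
# Compose the mappings, F = f2 o f1: first smooth, then apply a contrast ramp.
V1 = neighborhood_function(V, box)
V2 = intensity_table(V1, xs=[0.0, 0.4, 0.6, 1.0], ys=[0.0, 0.1, 0.9, 1.0])
print(V2.shape, float(V2.min()), float(V2.max()))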
[Diagram: three ways of evaluating the transfer function pipeline f1, f2, f3 on a volume V0 under parameter modification, each followed by coloring and rendering:
(a) point-based approach (ray-casting): the mappings are applied per sample point;
(b) volume-based approach (3D texture mapping): each fi produces an intermediate volume V1, V2, V3;
(c) slice-based approach (2D texture mapping): the mappings are applied per slice.]
In the point-based approach, when multiple neighborhood functions are used in one transfer function, each voxel may be computed multiple times, since it may fall into the neighborhoods of several sampling points.
Buffering in the point-based approach
A small fraction (often less than 10%) of the total set of voxels is actually used for each rendering.
Buffers can be used to avoid repeated computation.
[Diagram: V0 → f1 → Buffer 1 → f2 → Buffer 2 → f3 → Buffer 3, with per-point results cached in intermediate buffers.]
Enhancement operations
Point enhancement
– Intensity modification
– Histogram modification (e.g. histogram equalization)
Spatial enhancement
– Smoothing
– Sharpening
Smoothing & Sharpening
Smoothing
– Gaussian: T(i, j, k) = (1 / (2πσ²)^{3/2}) e^{−(i² + j² + k²)/(2σ²)}
– Median filter: median value in a neighborhood
Sharpening
– Laplacian filter: f(x, y, z) = g(x, y, z) − ∇²g(x, y, z)
– Unsharp masking: blend the low-frequency component V1 and the high-frequency component (V − V1) using a convolution mask, e.g.
    V′ = α · V1 + β · (V − V1)
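A Python/SciPy sketch of these operators applied to a volume; the sigmas and the unsharp-masking blend weights are illustrative.

import numpy as np
from scipy.ndimage import gaussian_filter, median_filter, laplace

V = np.random.rand(32, 32, 32)

smoothed_gauss = gaussian_filter(V, sigma=1.5)     # Gaussian mask T(i,j,k)
smoothed_median = median_filter(V, size=3)         # median of a 3x3x3 neighborhood

sharpened_laplace = V - laplace(V)                 # f = g - laplacian(g)

low = gaussian_filter(V, sigma=2.0)                # low-frequency component V1
high = V - low                                     # high-frequency component V - V1
unsharp = 1.0 * low + 2.0 * high                   # blend; the weights are made up

print(smoothed_gauss.shape, float(sharpened_laplace.mean()), float(unsharp.std()))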
Iso-surface rendering by boundary detection
Apply a boundary (edge) detection operator to identify all boundary voxels (e.g. gradient thresholding).
Generate a histogram of the boundary voxels.
Extract the intensity values (iso-values) at which the histogram reaches local maxima.
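A Python/NumPy sketch of this procedure, using gradient-magnitude thresholding for boundary detection; the threshold and bin count are illustrative, and the toy volume is invented.

import numpy as np

def candidate_isovalues(volume, grad_threshold=0.1, bins=64):
    gz, gy, gx = np.gradient(volume)
    grad_mag = np.sqrt(gx**2 + gy**2 + gz**2)
    boundary_values = volume[grad_mag > grad_threshold]       # intensities of boundary voxels
    hist, edges = np.histogram(boundary_values, bins=bins, range=(0.0, 1.0))
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Local maxima of the histogram give the candidate iso-values.
    is_peak = (hist[1:-1] > hist[:-2]) & (hist[1:-1] > hist[2:])
    return centers[1:-1][is_peak]

# Toy volume: a blurred sphere.
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
vol = 1.0 / (1.0 + np.exp((np.sqrt(x**2 + y**2 + z**2) - 0.5) / 0.05))
print(candidate_isovalues(vol))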
Dynamic boundary rendering
Dynamically determine (by edge detection) the boundary points during rendering.
For surfaces that cannot be well defined by iso-values (e.g. in microscopy, photobleaching causes the same material to have different intensities in different focal planes).
Only simple edge detection procedures are used (e.g. convolution based).
Multi-scale iso-value detection
[Witkin, 1983]: “Scale-Space Filtering”.
Describes signals qualitatively, managing the ambiguity of scale in an organized and natural way.
The signal is expanded by convolution with Gaussian masks over a continuum of sizes.
The “scale-space” image is then collapsed, using its qualitative structure (e.g. zero-crossing points), into a tree providing a concise but complete qualitative description covering all scales of observation.
[Figures: scale-space smoothing of a signal and the corresponding scale-space map of zero crossings over scale s.]
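A 1D Python sketch of the idea: smooth a signal with Gaussians of increasing sigma and track the zero crossings of its second derivative across scales; the signal and sigma values are illustrative.

import numpy as np
from scipy.ndimage import gaussian_filter1d

def scale_space_zero_crossings(signal, sigmas):
    """For each sigma, return the indices where the smoothed second derivative changes sign."""
    rows = []
    for s in sigmas:
        second = gaussian_filter1d(signal, sigma=s, order=2)   # smoothed 2nd derivative
        crossings = np.where(np.diff(np.sign(second)) != 0)[0]
        rows.append((s, crossings))
    return rows

# Toy signal: two bumps; at coarse scales they merge into one event.
x = np.linspace(0, 1, 512)
sig = np.exp(-((x - 0.35) / 0.03) ** 2) + 0.6 * np.exp(-((x - 0.5) / 0.05) ** 2)
for s, c in scale_space_zero_crossings(sig, sigmas=[2, 8, 32]):
    print(f"sigma={s:>3}: {len(c)} zero crossings")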
Vector Visualization
The data set is given by vectors:
Gaseous and fluid flow (car, ship and aircraft design, blood vessels)
Techniques:
– Hedgehogs/glyphs
– Particle tracing
– Stream-, streak-, time- and path-lines, stream-ribbons, stream-surfaces, stream-polygons, stream-tubes
Steady and unsteady flows: the vector field stays constant or changes with time.
Mappings - Hedgehogs, Glyphs
Put “icons” at certain places in the flow: oriented lines, glyphs, vortex icons, etc.
Use icon size (length, volume, area) and direction to encode the vector.
Tend to clutter the image very quickly.
[Figure: examples of glyphs, oriented lines, and vortex icons.]
Examples
Weather Data
[Figures: weather vector data with direction mapped to hue, and with direction mapped to value.]
Tornado
Mappings – Warping
Warping: animate displacement by deformation and distortion.
Mappings – Displacement Plot
Displacement plot: use the scalar values
    s = v · n
(the velocity component along the surface normal n).
Mappings - Path-lines
Lines from a particle trace.
A collection of particle traces gives a sense of the time evolution of the flow.
Computed by
    dx/dt = v(x, t),   or   dx = v(x, t) dt
Path-line tracing
Euler method: x_{i+1} = x_i + v_i Δt   (error: O(Δt²))
Runge-Kutta (2nd order) method:
    x_{i+1} = x_i + (Δt/2)(v_i + v_{i+1})   (error: O(Δt³))
    where v_i = v(x_i, t) and v_{i+1} = v(x_i + Δt·v_i, t + Δt)
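A Python sketch of both integrators applied to a time-dependent velocity field v(x, t); the rotating sample field is illustrative.

import numpy as np

def trace(v, x0, t0, dt, steps, method="rk2"):
    """Integrate dx/dt = v(x, t) from x0, returning the list of positions."""
    x, t, path = np.asarray(x0, dtype=float), t0, [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        v_i = v(x, t)
        if method == "euler":                 # x_{i+1} = x_i + v_i * dt, error O(dt^2)
            x = x + dt * v_i
        else:                                 # RK2: x_{i+1} = x_i + dt/2 * (v_i + v_{i+1})
            v_next = v(x + dt * v_i, t + dt)  # v_{i+1} = v(x_i + dt*v_i, t + dt)
            x = x + 0.5 * dt * (v_i + v_next)
        t += dt
        path.append(x)
    return np.array(path)

# Example: a rotating 2D field, v = (-y, x); path-lines should stay near a circle.
rotation = lambda x, t: np.array([-x[1], x[0]])
print(trace(rotation, x0=[1.0, 0.0], t0=0.0, dt=0.1, steps=5, method="rk2")[-1])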
Time-lines
The positions, at an instant of time, of a batch of particles that were released simultaneously.
[Figure: a timeline advected through the flow, shown at T=1, T=2, T=3.]
Mappings - Streak-lines
The locus at time t0 of all fluid elements that have previously passed through x0.
Gives information about the past history of the flow.
Obtained by linking all the end-points of the traces of
    dx/dt = v(x, t)
Compute the positions of particles released from the origin x0 at times t0 − iΔt.
Mappings - Streamlines
Everywhere tangent to the flow; a mathematical curve that exists only at a fixed time t0.
Same as path-lines and streak-lines in steady flow.
An integral curve along the parameter s (s is the arc length of the curve):
    dx/ds = v(x, t0),   x = ∫ v ds
Mappings - compare
Mappings - Contours
Mappings - Stream-ribbon
Need to see vorticity, i.e. places where the flow twists; this requires surface information.
Idea: trace neighboring particles and connect them with polygons, then shade those polygons appropriately to show the twisting.
A problem: flow divergence (the “spread”).
Solution: trace one streamline and carry a constant-size vector (the curve’s normal vector) along with it.
Mappings - Stream-tube
Stream-tube: generate a stream-line and connect circular cross-flow sections along the stream-line.
Mappings - Stream-surface
Stream-surface: a collection of stream-lines passing through a base curve (rake).
If the rake is closed: a stream-tube.
If the rake is open and short: a stream-ribbon.
No flow can pass through a stream-surface.
Constructed by connecting polygons.
Mappings - Flow Volumes
Instead of tracing a line, trace a small polyhedron.
Flow Volume (1)
A seed polygon (square) is used as a smoke generator.
The polygon is placed perpendicular to the flow.
The square can be subdivided into a finer mesh.
The volume is adaptively subdivided in areas of high divergence (e.g. when edges become too long).
There is no merging.
Produces an irregular volume (various topologies).
Flow Volume (2)
Can simulate puffing smoke.
Can be color-coded to represent other fields.
Rendering
Hedgehogs & glyphs:
– Oriented lines
– Polygonal representations
Stream-* techniques:
– Curves
– Polygonal models
– Volumetric models (flow volumes)
Image Processing
Apply a vector field to an image to create motion.