Realtime 3D Computer Graphics / Virtual Reality
WS 2005/2006 – Marc Erich Latoschik

Human Visual Perception
The human visual system
• 2 eyes
• Optic nerve: 1.5 million fibers per eye (each fiber is the axon from a neuron)
• 125 million rods (achromatic low-light sight)
  • most outside the fovea
• 6 million cones (high-detail color sight)
  • most concentrated inside the fovea
  • 3 types: R, G, B, sensitive to long, middle and short wavelengths (λ) respectively
• Approximately 100:1 compression from the number of receptors to the number of fibers in the optic nerve.
• Quick transmission rate through the optic nerve.
• Monocular visual field is 160° (w) x 135° (h).
• Binocular visual field is 200° (w) x 135° (h).
• Processing is not uniform across the visual field:
  • 25% of the cortex is devoted to the central 5° of the field of view.

[Figure: electromagnetic spectrum (radio, heat, visible light); visible light spans wavelengths λ of roughly 350 to 780 nm.]
Human visual perception
• Human visual perception processes
  • position/orientation and movement in 3 dimensions plus
  • color.
• The third dimension, depth, is processed based on several physiological and psychological depth cues.
• Depth cues¹ can be binocular or monocular.

Binocular depth cues:
• Convergence
  • Difference in the direction of the eyes.
  • Our eyes point slightly inward for closer objects.
  • Only effective at short distances (< 10 meters).
• Binocular Parallax
  • Difference in the images sensed by our two eyes.
  • Our eyes see the world from slightly different locations; → the images sensed are slightly different.
  • The human visual system is very sensitive to these differences; → most important depth cue for medium viewing distances.
  • Can be used to achieve a depth sense even if all other depth cues are removed.

[Figure: convergence of the left and right eye on a point; the two eyes receive different images (size, position and content) on the real retina as well as on the virtual projection plane.]

¹ Depth cues following (Okoshi, 1976)
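To get a feel for why convergence only works at short range, here is a small sketch (not from the slides; it assumes a typical interpupillary distance of 6.5 cm) that computes the angle between the two eyes' viewing directions for a point fixated straight ahead:

    import math

    def convergence_angle_deg(distance_m, ipd_m=0.065):
        """Angle between the two lines of sight when both eyes fixate a
        point straight ahead at the given distance (symmetric case)."""
        return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

    for d in (0.3, 1.0, 3.0, 10.0):
        print(f"{d:5.1f} m -> {convergence_angle_deg(d):5.2f} deg")
    # ~12.4 deg at 0.3 m, ~3.7 deg at 1 m, but only ~0.4 deg at 10 m:
    # beyond roughly 10 m the angular change is too small to serve as a cue.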
Human visual perception
Monocular depth cues:
• Monocular Movement (Motion) Parallax
  • Depth perception by moving each of our eyes (head).
  • Depth information is extracted from consecutive similar images in the same way as images from different eyes are combined.
• Retinal Image Size
  • The brain compares the sensed size of an object to its "known" real size.

[Figure: motion parallax of a single (left) eye over time t.]
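As a rough illustration of the retinal-image-size cue (the object and numbers below are my own, not from the slides): if the real size of an object is known, the visual angle it subtends determines its distance, and vice versa.

    import math

    def angular_size_deg(real_size_m, distance_m):
        """Visual angle subtended by an object seen face-on."""
        return math.degrees(2.0 * math.atan(real_size_m / (2.0 * distance_m)))

    def distance_from_known_size(real_size_m, angle_deg):
        """Invert the relation: known real size + sensed angle -> distance."""
        return real_size_m / (2.0 * math.tan(math.radians(angle_deg) / 2.0))

    # A person of known height 1.8 m subtending only about 2 degrees
    # is therefore judged to be roughly 50 m away.
    alpha = angular_size_deg(1.8, 50.0)                  # ~2.06 deg
    print(alpha, distance_from_known_size(1.8, alpha))   # recovers ~50.0 m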
Human visual perception
Monocular depth cues continued:
• Linear Perspective
  • Straight parallel lines meet at the horizon.
  • Important depth cue.
• Texture Gradient
  • Closer objects look more detailed.
  • Objects with smooth surface textures are usually interpreted as being farther away (especially true if the texture spans from near to far).
• Occlusion, Overlapping
  • Out-of-sight blocking of objects.
• Aerial Perspective
  • Distant objects (mountains on the horizon) always look slightly bluish or hazy due to small water and dust particles in the air in between.
• Shades and Shadows
  • Objects shadowing others are closer to light sources.
  • Useful to resolve ambiguities.
  • Bright objects seem to be closer to the observer than dark ones. (Example: three-dimensional looking WIMP interfaces.)
Simulating visual stimuli

3D CG rendering provides (depth cue → methods):
• Retinal Image Size, Linear Perspective → Perspective projection
• Texture Gradient → High tessellation, LOD, texturing (images, bump maps, normal maps, height maps, …)
• Occlusion → Occlusion culling, z-buffer algorithm
• Aerial Perspective → Fogging (sketched after this list), atmospheric models
• Shades and Shadows → Special lighting equations, shadow maps, shadow casts

VR requires immersion and hence a simulation of visual stimuli which provides a mature depth perception:
• Convergence, Binocular Parallax → Stereoscopy, channel separation
• Monocular and binocular Motion Parallax → Head (motion) tracking, dynamic view frustum
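As one concrete example of the methods listed above, here is a minimal sketch of exponential fogging as an approximation of aerial perspective (the fog color and density are arbitrary values of my own): the farther a fragment is from the eye, the more its color is blended toward a light, slightly bluish fog color.

    import numpy as np

    def apply_exp_fog(surface_rgb, eye_distance, density=0.05,
                      fog_rgb=(0.7, 0.75, 0.8)):
        """Classic exponential fog: f = exp(-density * d).
        f = 1 means no fog, f -> 0 means fully fogged."""
        f = np.exp(-density * eye_distance)
        surface = np.asarray(surface_rgb, dtype=float)
        fog = np.asarray(fog_rgb, dtype=float)
        return f * surface + (1.0 - f) * fog

    print(apply_exp_fog((0.2, 0.5, 0.2), eye_distance=5.0))    # barely hazy
    print(apply_exp_fog((0.2, 0.5, 0.2), eye_distance=100.0))  # almost fog-colored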
Implementing additional depth cues
• Stereoscopy:
  • Render from two offset eye points (IPD) or centers of projection (COPs): off-axis projection.
  • [Figure: off-axis projection geometry for the left and right eye. A point p projects to different image points p' and p'' for the two eyes (binocular parallax), relative to the view plane normal (VPN) and the center of screen (COS); the off-axis angle (and its vertical counterpart in 3D) is marked. Shown once for an image plane/screen fixed w.r.t. the (right) eye and once for an image plane/screen fixed w.r.t. the world.]
• Motion parallax:
  • Track the head (and hence eye movements) and calculate a new perspective projection.
  • Calculate a dynamic view frustum in case the image plane is fixed w.r.t. the world (see the sketch after this slide).
  • [Figure: motion parallax for an image plane/screen fixed w.r.t. the world. A head translation moves the center of projection from COP_t-1 to COP_t; the projections p' and q' of a near point p and a far point q shift by different amounts over time, encoding nearness and leftness.]
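A minimal sketch (my own, not from the slides) of the off-axis projection and dynamic view frustum for a screen fixed w.r.t. the world, assuming the eye position is given in a screen-centred coordinate system. The same function serves stereoscopy (two eye points offset by half the IPD each) and head-tracked motion parallax (the tracked eye position changes every frame):

    import numpy as np

    def off_axis_projection(eye, screen_w, screen_h, near=0.1, far=100.0):
        """Asymmetric (off-axis) frustum for an eye at `eye` (metres),
        given relative to the screen centre: +x right, +y up, +z out of
        the screen towards the viewer.  The screen is screen_w x screen_h,
        centred at the origin in the z = 0 plane.  Returns a 4x4
        glFrustum-style projection matrix."""
        ex, ey, ez = eye                       # ez > 0: eye in front of the screen
        l = (-screen_w / 2 - ex) * near / ez   # frustum extents, scaled from the
        r = ( screen_w / 2 - ex) * near / ez   # screen plane back to the near plane
        b = (-screen_h / 2 - ey) * near / ez
        t = ( screen_h / 2 - ey) * near / ez
        return np.array([
            [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
            [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
            [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
            [0.0,          0.0,          -1.0,                   0.0]])

    # Head-tracked stereo: one frustum per eye, eyes offset by half the IPD.
    head = np.array([0.10, 0.05, 0.60])    # hypothetical tracked head position (m)
    ipd = 0.065
    P_left  = off_axis_projection(head + [-ipd / 2, 0, 0], screen_w=0.52, screen_h=0.32)
    P_right = off_axis_projection(head + [ ipd / 2, 0, 0], screen_w=0.52, screen_h=0.32)
    # The matching view transform simply translates the world by -eye.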
Stereoscopy
Features of binocular parallax (a short numeric check follows this list):
• Negative: object in front of the screen.
• Zero: object on the screen.
• Positive: object behind the screen.
• Focus vs. convergence
  • Focus is on the image plane.
  • Convergence is on the virtual object.
  • → Large parallax puts strain on the eyes.
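The sign of the parallax can be checked with a short calculation (the viewing distances below are my own example values). By similar triangles, the signed on-screen separation of the two eyes' image points of an object straight ahead is p = IPD * (d_object - d_screen) / d_object, so it never exceeds the IPD for objects behind the screen but grows without bound (crossed) as objects approach the viewer:

    def screen_parallax_m(object_dist_m, screen_dist_m, ipd_m=0.065):
        """Signed horizontal separation on the screen between the left-eye
        and right-eye image of a point straight ahead of the viewer."""
        return ipd_m * (object_dist_m - screen_dist_m) / object_dist_m

    screen = 2.0                               # viewer 2 m from the screen (assumed)
    print(screen_parallax_m(1.0, screen))      # -0.065  m: negative, in front of the screen
    print(screen_parallax_m(2.0, screen))      #  0.0    m: zero, on the screen
    print(screen_parallax_m(8.0, screen))      #  0.0488 m: positive, behind the screen
    print(screen_parallax_m(1e9, screen))      #  -> IPD: upper bound for positive parallax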
• Stereoscopy methods
  Feed each channel and its rendered picture to one specific eye by
  • using one screen per eye (HMD),
  • time-multiplexing the generated images (shutter glasses),
  • filtering the images through polarization filters,
  • filtering the images using color filters (anaglyph), or
  • using autostereoscopic displays.
• Shutter technology
  • Close the left eye when the right-eye image is displayed and vice versa.
  • Controlled through infrared or a wired connection.
  • Usually coupled to the V-sync signal (vertical retrace of the CRT).
Stereoscopy
Polarization
• Light: wavelength and direction of polarization; two components orthogonal to each other.
• [Figure: "normal" (unpolarized) light vs. polarized light.]
• Filters can block certain directions of polarization.
• Stereo viewing through linear polarization (use two projectors):
  • Left view: vertical filter in front of the lens.
  • Right view: horizontal filter in front of the lens.
  • Wear glasses with polarizing filters:
    • Left eye: vertical
    • Right eye: horizontal
Stereoscopy
• Linear polarization
  • Can't tilt the head.
  • Little ghosting.
• Stereo viewing through circular polarization (using two projectors):
  • Left view: clockwise filter.
  • Right view: counter-clockwise filter.
  • Allows arbitrary head orientations.
  • In general more ghosting than linear polarization.
• [Figure: linear polarization vs. circular polarization.]
• Anaglyph stereo
  • Combine each channel's R, G, B values by two complementing transformations to calculate an integrated channel.
  • Several anaglyph versions exist. Usually black/white images; color is possible, but filter and image colors may interfere.
  • Example for a red/blue transformation (one common variant is sketched below).
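The matrices of the slide's example were not recovered here; the sketch below shows one common red/blue scheme as an assumption (not necessarily the transformation meant on the slide): the luminance of the left image drives the red channel and the luminance of the right image drives the blue channel of the combined picture.

    import numpy as np

    def red_blue_anaglyph(left_rgb, right_rgb):
        """Combine two float RGB images (H x W x 3, values in [0,1]) into a
        single red/blue anaglyph.  Each input is first reduced to luminance,
        i.e. this is the black/white variant mentioned on the slide."""
        lum = np.array([0.299, 0.587, 0.114])           # Rec.601 luma weights
        left_y  = left_rgb  @ lum                        # seen through the red filter
        right_y = right_rgb @ lum                        # seen through the blue filter
        out = np.zeros_like(left_rgb)
        out[..., 0] = left_y                             # red  <- left channel
        out[..., 2] = right_y                            # blue <- right channel
        return out

    # Tiny usage example with random 4x4 "images":
    l = np.random.rand(4, 4, 3)
    r = np.random.rand(4, 4, 3)
    print(red_blue_anaglyph(l, r).shape)                 # (4, 4, 3)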
Stereoscopy
• Pulfrich effect
  • At low light levels the eye-brain visual response is slower.
  • Put a neutral (transparent gray) filter over one eye.
  • Movement perception by that eye will lag behind perception by the unimpeded eye.
  • The lag induces a difference in the images perceived by each eye.
  • This induces a binocular vision illusion of depth.
• Autostereograms
• Autostereoscopic displays
  • Holographic displays, e.g. laser projection on gas or fluids.
  • Modified LCDs
    • Assign alternating pixel columns to each eye (see the sketch below).
    • Filter the outgoing light by prisms or by two vertically striped masks located in front of the LCD.
    • Slightly dislocate the masks in depth and displace them horizontally.
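A toy sketch of the column-interleaving idea (the striped masks and prisms are hardware, so only the image preparation is shown; which column parity reaches which eye is an assumption):

    import numpy as np

    def interleave_columns(left_rgb, right_rgb):
        """Build the panel image for a column-interleaved autostereoscopic
        LCD: even pixel columns carry the left view, odd columns the right
        view, so each eye only receives half the horizontal resolution."""
        assert left_rgb.shape == right_rgb.shape
        panel = right_rgb.copy()
        panel[:, 0::2, :] = left_rgb[:, 0::2, :]   # even columns <- left view
        return panel

    left  = np.full((1200, 1600, 3), (1.0, 0.0, 0.0))   # dummy left view (red)
    right = np.full((1200, 1600, 3), (0.0, 0.0, 1.0))   # dummy right view (blue)
    panel = interleave_columns(left, right)             # 1600x1200 panel, 800x1200 per eye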
SeeReal Technologies C-nt:
• Resolution: 1600 x 1200 (mono), 800 x 1200 (stereo)
• Sweet spot distance: 650 mm
• Sweet spot width/depth: 50/150 mm