Transcript and Presenter's Notes

Title: Immersion, Presence, Distributed VR


1
Immersion, Presence, Distributed VR
  • Bob Hobbs
  • Staffordshire University
  • Computing School

2
  • Outline
  • Context
  • Immersion
  • Presence
  • Shared Environments

3
Virtual Reality is a Tool
  • What it is:
  • Use of highly interactive real-time immersive
    systems to convey information
  • What it is not:
  • Desktop graphics
  • Text-based
  • Non-interactive
  • Linear

4
Immersion: Realisation of an Environment
  • generates displays ideally in all sensory
    systems
  • fully encloses the participant in those displays
  • tracks the body, limbs, head
  • determines the optical, auditory... arrays as a
    function of head tracking
  • Either
  • displays a Virtual Body whose movements are a
    function of the tracking (mainly with an HMD)
  • Participant can visualise self and world (CAVE)

5
Virtual Body
  • At any moment there is a position in the geometry
    with respect to which sensory data is generated -
    the egocentric self-reference position.
  • This corresponds to the place occupied by the
    human actor in the environment.
  • At the self-reference position there is a
    functioning VB represented by the displays.

6
CAVE
7
Position Tracking Systems
  • Polhemus Inc. (http://www.polhemus.com)
  • 3Space ISOTRAK (1 sensor)
  • 3Space FASTRAK (many sensors)
  • Ascension Technology Corp.
    (http://www.ascension-tech.com)
  • Flock of Birds
  • pcBIRD
  • SpacePad

8
Tracker Calibration
  • Dynamic errors
  • caused by external electromagnetic fields
  • can be corrected by increasing the measurement
    frequency, synchronizing the measurements with
    the external field source, and filtering
  • Static errors
  • caused by field distortions due to the
    surrounding metal and external fields
  • can be corrected via tracker calibration

9
Calibration Table
10
Calibration Example
  • CAVE, Flock of Birds (FoB)
  • 4 feet from the floor
  • 1 foot grid
  • 4th order polynomial fit (a sketch follows below)

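A minimal sketch of the kind of static correction described above, assuming the calibration table pairs tracked (distorted) positions with known true positions and that a separate 4th-order polynomial in x, y and z is fitted per output axis by ordinary least squares. The function names are illustrative, not taken from the CAVE/FoB setup.

```python
import itertools
import numpy as np

def poly_features(points, degree=4):
    """Monomials x^i * y^j * z^k with i + j + k <= degree, per point."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    cols = []
    for i, j, k in itertools.product(range(degree + 1), repeat=3):
        if i + j + k <= degree:
            cols.append((x ** i) * (y ** j) * (z ** k))
    return np.column_stack(cols)

def fit_correction(tracked, true, degree=4):
    """Least-squares fit of the true positions as a polynomial of the tracked ones."""
    A = poly_features(tracked, degree)
    coeffs, *_ = np.linalg.lstsq(A, true, rcond=None)  # one coefficient column per axis
    return coeffs

def apply_correction(coeffs, tracked, degree=4):
    """Correct one or more tracked positions with the fitted coefficients."""
    return poly_features(np.atleast_2d(tracked), degree) @ coeffs

# Synthetic example: a 1-foot grid 4 feet above the floor, distorted by a
# smooth stand-in for the field distortion; the fit recovers the true points.
grid = np.array([[x, y, 4.0] for x in range(-4, 5) for y in range(-4, 5)], float)
distorted = grid + 0.05 * np.sin(grid)
C = fit_correction(distorted, grid)
print(apply_correction(C, distorted[:3]))   # close to the first three grid points
```
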
11
Interpolation
  • True space
  • Tracked space

V. Kindratenko, A. Bennett, Evaluation of
Rotation Correction Techniques for
Electromagnetic Position Tracking Systems, in
Proc. VE 2000, pp. 13-22
12
Data Acquisition Techniques
  • Size and type of a calibration table depend on
  • Type of the calibration technique to be used
  • Severity of the field distortions
  • Required calibration quality
  • A calibration table can be
  • Irregular (for a high-order polynomial fit)
  • Regular in the true space (for interpolation)
  • Regular in the tracked space (for tri-linear
    interpolation, sketched below)

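For a calibration table that is regular in the tracked space, the tri-linear interpolation mentioned above can be sketched as follows. The table is assumed to store corrected (true) positions on a regular grid indexed directly by the tracked reading; the layout and parameter names are illustrative.

```python
import numpy as np

def trilinear_lookup(table, origin, spacing, p):
    """Tri-linear interpolation in a table that is regular in tracked space.

    table   : array (nx, ny, nz, 3) of corrected (true) positions
    origin  : tracked-space coordinates of table[0, 0, 0]
    spacing : grid spacing of the table in tracked space
    p       : tracked position to correct
    """
    u = (np.asarray(p, float) - origin) / spacing        # continuous grid coords
    i0 = np.clip(np.floor(u).astype(int), 0, np.array(table.shape[:3]) - 2)
    f = u - i0                                           # fractional part per axis
    out = np.zeros(3)
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((f[0] if dx else 1 - f[0]) *
                     (f[1] if dy else 1 - f[1]) *
                     (f[2] if dz else 1 - f[2]))
                out += w * table[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return out

# Example: a 3x3x3 identity table on a unit grid returns the query point itself.
xs = np.arange(3.0)
table = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), axis=-1)
print(trilinear_lookup(table, origin=np.zeros(3), spacing=1.0, p=[0.5, 1.25, 2.0]))
```
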
13
Regular Grid in the True Space
14
An Immersive Participant
  • A user will be head tracked
  • Have a Wand
  • Stereo glasses in CAVE
  • HMD user may have additional tracking sensors:
    Data Glove or motion tracker



15
Data Glove

Hand measurement devices must sense both the flexing
angles of the fingers and the position/orientation of
the wrist in real time. A typical example of a hand
measurement device is the DataGlove from VPL Research.
The DataGlove consists of a lightweight nylon glove
with optical sensors mounted along the fingers.
16
  • Each sensor is a short length of fiber-optic cable,
    with a light-emitting diode (LED) at one end and a
    phototransistor at the other end.
  • When the cable is flexed, some of the LED's light is
    lost, so less light is received by the phototransistor
    (a mapping sketch follows below).
  • Attached to the back is a 3Space Isotrak system to
    measure the orientation/position of the gloved hand.

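How a phototransistor reading might be mapped to a joint angle can be sketched as below, assuming a simple per-sensor linear calibration between a 'finger straight' and a 'finger fully bent' reading. Real glove drivers use their own (often non-linear) calibration, so the numbers and the linear map are only illustrative.

```python
def flex_to_angle(reading, flat_reading, bent_reading, bent_angle_deg=90.0):
    """Map a flex-sensor reading to an approximate joint angle.

    flat_reading : value recorded with the finger straight (0 degrees)
    bent_reading : value recorded with the finger fully bent (bent_angle_deg)
    A linear map between the two calibration poses is assumed.
    """
    t = (reading - flat_reading) / (bent_reading - flat_reading)
    t = min(max(t, 0.0), 1.0)              # clamp to the calibrated range
    return t * bent_angle_deg

# Example: a reading halfway between the calibration poses -> about 45 degrees.
print(flex_to_angle(512, flat_reading=900, bent_reading=124))
```
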
17
Data Suit
  • Much less popular than the DataGlove; allows the
    positions of the body to be measured.
  • A typical example of the use of a datasuit:
  • a film by Fuji TV, the Dream of Mr. M.
  • The 3D character approximately performs the same
    motion as the animator.
  • Another way of measuring body positions is simply
    to use a collection of sensors such as the Flock of
    Birds.
  • However, this needs algorithms for calibration and
    conversion (see the paper by Molet et al.)

18
Sound
  • MIDI equipment and workstation audio for sound
    generation and effects, filter processors and
    3D-audio cards for spatial audio.
  • Two categories of sound in VR can be identified:
  • Simulation of real-world acoustics: based on our
    experiences in everyday life, the physical behavior
    of sound can be modeled.
  • This comprises sound generation, e.g. caused by
    object collision, sound propagation and auralization.
  • Immersive user interfaces can be used to evaluate
    simulation results.
  • Sound at the user interface: sound can be applied to
    support the user in the current task or to provide
    information about invisible proceedings (a minimal
    spatialization sketch follows below).

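As a rough illustration of 'sound at the user interface', the sketch below derives a stereo gain pair from the listener-relative source position using inverse-distance attenuation and a constant-power pan. It is a minimal stand-in under those assumptions, not the audio pipeline the slide refers to.

```python
import math

def spatialise(source, listener, listener_yaw, ref_dist=1.0):
    """Return (left_gain, right_gain) for a point source in the horizontal plane.

    source, listener : (x, y) positions
    listener_yaw     : facing direction of the listener in radians
    Inverse-distance attenuation and a constant-power pan law are assumed.
    """
    dx, dy = source[0] - listener[0], source[1] - listener[1]
    dist = max(math.hypot(dx, dy), ref_dist)
    gain = ref_dist / dist                        # 1/r attenuation
    rel = math.atan2(dy, dx) - listener_yaw       # bearing relative to facing
    pan = math.sin(rel)                           # +1 = fully left, -1 = fully right
    theta = (pan + 1.0) * math.pi / 4.0           # constant-power pan law
    return gain * math.sin(theta), gain * math.cos(theta)

# Example: a source 2 m directly to the left of a listener facing along +x.
print(spatialise((0.0, 2.0), (0.0, 0.0), listener_yaw=0.0))   # roughly (0.5, 0.0)
```
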
19
Presence
  • Presence is a state of consciousness where the
    human actor has a sense of being in the location
    specified by the displays.
  • We take presence as the central feature of
    "virtual reality".
  • "A virtual reality is defined as a real or
    simulated environment in which a perceiver
    experiences telepresence" (Steuer).
  • The unique feature of "virtual reality" systems
    is that they are general-purpose
    presence-transforming machines.

20
Meaning of Presence
  • Presence is the psychological sense of being
    there in the environment specified by the
    displays.
  • a high degree of presence in the VE should lead
    to the participant experiencing objects and
    processes in the virtual world as (temporarily)
    more the presenting reality than the real world
    in which the VE experience is actually embedded.
  • A correlate of this is that the participant
    should exhibit behaviours that are the same as
    those they would carry out in similar
    circumstances in everyday reality.
  • The VE experience - should be more like visiting
    a place, rather than like seeing images
    designating a place

21
Design in Immersive VEs
  • With design in immersive virtual environments...
  • designer shares same space as objects
  • a degree of evaluation can take place in the
    virtual space
  • presence leads to the designer behaving in a
    manner appropriate to everyday reality in similar
    circumstances.
  • Special "interactive techniques" and behaviours
    do not have to be learned...

22
Feedback
  • Two forms of feedback
  • Force Feedback
  • Manipulating virtual objects
  • Gravity
  • Simulation
  • Touch (tactile) Feedback
  • Texture appreciation
  • Navigation
  • Sensitive
  • Use Haptic Devices

23
What is a haptic interface?
  • A haptic interface is a force reflecting device
    which allows a user to touch, feel, manipulate,
    create, and/or alter simulated 3D-objects in a
    virtual environment.
  • Movement trackers do not provide feedback

24
Tactile Feedback
25
Usage
  • It could be used to
  • train physical skills such as those jobs
    requiring specialized hand-held tools (e.g.
    surgeons, astronauts, mechanics),
  • to provide haptic-feedback modeling of three
    dimensional objects without a physical medium
    (such as automobile body designers working with
    clay models), or
  • to mock-up developmental prototypes directly from
    CAD databases (rather than in a machine shop).

26
Phantom
A very common haptic device, mainly used with
augmentation on desktop systems
27
Exoskeleton
28
Actuators
  • Electrical current drives actuators controlling
    individual joints
  • Directly to motors or solenoids
  • To valves controlling flow of fluids to hydraulic
    or pneumatic systems

29
Presence in Multi-participant Environments
  • Sense of being in a place
  • sense of sharing the same space as other
    individuals
  • Sense of belonging to a totality more than just
    the sum of the individuals
  • Awareness may be an important factor enhancing
    shared presence.
  • Shared presence may correspondingly enhance
    awareness

30
Robot arm
  • Simplest sort of robot
  • Typical arm has 7 segments, 6 joints
  • 6 DOF
  • Human arm has 7 DOF
  • Usually driven by stepper motors
  • Main use is in manufacturing

31
Robot Arm
  • Fitted with end effector
  • Usually interchangeable
  • Artificial hand, paint gun, welding rod
  • Pressure sensor needed to prevent crushing
  • Programmed by incremental steps which are then
    replicated ad infinitum

32
Frameworks, Chains (or Skeletons)
  • Many mechanical objects in the real world
    consist of solid sections connected by joints
  • Obviously the robot arm, but also
  • Creatures such as humans and animals
  • Car suspension
  • Ropes, string and chains

33
Frameworks, Chains (or Skeletons)
  • Sections and joints of a robot arm are known as a
    'chain'
  • In creatures this could be referred to as a skeleton
  • Moveable sections correspond to bones
  • Attachments between bones are joints.

34
Frameworks, Chains (or Skeletons)
  • Motions of chains can be specified in terms of
    translations and rotations.
  • Forward Kinematics - From the amounts of rotation
    and bending of each joint in an arm, for example,
    the position of the hand can be calculated.
  • Inverse Kinematics - If the hand is moved, the
    rotation and bending of the arm is calculated, in
    accordance with the length and joint properties
    of each section of the arm.

35
Joint Translation-Rotation
  • We can use a transform (T) to transform each
    point relative to the body to a position in world
    coordinates.
  • If we want to model both linear and angular
    (rotational) motion then we need to use a 4x4
    matrix to represent the transform (sketched below).

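A minimal sketch of such a 4x4 homogeneous transform, assuming a rotation about the z-axis combined with a translation and applied to a body-relative point; the helper names are illustrative.

```python
import numpy as np

def transform(angle_z, translation):
    """4x4 homogeneous transform: rotation about z followed by a translation."""
    c, s = np.cos(angle_z), np.sin(angle_z)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0],
                 [s,  c, 0],
                 [0,  0, 1]]
    T[:3, 3] = translation
    return T

def to_world(T, point):
    """Apply T to a body-relative point (x, y, z)."""
    p = np.append(point, 1.0)        # homogeneous coordinates
    return (T @ p)[:3]

# Example: a point one unit along the body x-axis, with the body rotated 90
# degrees about z and translated to (2, 0, 0) in world coordinates.
T = transform(np.pi / 2, [2.0, 0.0, 0.0])
print(to_world(T, [1.0, 0.0, 0.0]))   # approximately (2, 1, 0)
```
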
36
What is Inverse Kinematics?
  • Forward Kinematics

37
What is Inverse Kinematics?
  • Inverse Kinematics

38
Kinematic Chains
  • Solid links connected at movable joints
  • Fixed end: base
  • Movable end: tip or end effector
  • One degree of freedom (DOF) per joint
  • Open chain: one fixed end, one movable end
    (sketched below)
  • Closed chain: both ends fixed

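A minimal forward-kinematics sketch for such an open chain, simplified to the plane with one rotational DOF per joint: accumulating the joint rotations along the links gives the tip (end-effector) position. The planar simplification is an assumption made for brevity.

```python
import numpy as np

def forward_kinematics(angles, lengths):
    """Tip position of a planar open chain, base at the origin.

    angles  : joint angles in radians, each relative to the previous link
    lengths : link lengths, one per joint
    """
    x = y = total = 0.0
    for a, l in zip(angles, lengths):
        total += a                    # accumulate rotation along the chain
        x += l * np.cos(total)
        y += l * np.sin(total)
    return np.array([x, y])

# Example: two links of length 1 with both joints at 45 degrees.
print(forward_kinematics([np.pi / 4, np.pi / 4], [1.0, 1.0]))
```
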
39
Forward and Inverse Kinematics
40
Kinematic Redundancy
  • End-effector has 6 DoFs
  • - (x, y, z) position
  • - orientation (three angles, e.g. roll, pitch, yaw)
  • Non-redundant linkage has < 6 joints (DoFs)
  • Redundant linkage has > 6 joints (DoFs)
  • - Human arm has 7 DoFs
  • Shoulder: 3
  • Elbow: 1
  • Forearm: 1
  • Wrist: 2
  • - Redundancy enables multiple solutions

41
Inverse Kinematics (IK)
  • Non-redundant linkages
  • - Analytical solutions
  • Redundant linkages
  • - Many techniques
  • Pseudo-inverse (Jacobian), sketched below
  • Gradient
  • Others
  • IK commonly found in animation packages
  • - 3D Studio Max

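A sketch of the pseudo-inverse (Jacobian) technique listed above, for a planar two-link arm: the Jacobian is estimated by finite differences and the joints are updated with a pseudo-inverse step toward the goal. The two-link arm, the numerical Jacobian and the step size are simplifying assumptions, not the method of any particular package.

```python
import numpy as np

def fk_two_link(q, l1=1.0, l2=1.0):
    """End-effector position of a planar two-link arm."""
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def jacobian(fk, q, eps=1e-6):
    """Finite-difference Jacobian of the end-effector position w.r.t. the joints."""
    p0 = fk(q)
    J = np.zeros((len(p0), len(q)))
    for i in range(len(q)):
        dq = np.array(q, float)
        dq[i] += eps
        J[:, i] = (fk(dq) - p0) / eps
    return J

def ik_pseudo_inverse(fk, q, goal, iters=100, step=0.5):
    """Iterate q <- q + step * pinv(J) @ (goal - p) until the tip is near the goal."""
    q = np.array(q, float)
    for _ in range(iters):
        err = np.asarray(goal) - fk(q)
        if np.linalg.norm(err) < 1e-4:
            break
        q += step * np.linalg.pinv(jacobian(fk, q)) @ err
    return q

# Example: move the two-link arm's tip to (1.0, 1.0) from a bent start pose.
print(ik_pseudo_inverse(fk_two_link, [0.3, 1.0], [1.0, 1.0]))
```
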
42
Redundancy
  • A redundant system has an infinite number of
    solutions
  • The human skeleton has 70 DOF
  • Ultra-super redundant
  • How do we solve such a highly redundant system?

43
Iterative solution
  • Start at the end effector
  • Move each joint so that the end gets closer to the
    target
  • The angle of rotation for each joint is found by
    taking the dot product of the (normalised) vectors
    from the joint to the current end point and from
    the joint to the desired end point, then taking
    the arccosine of this dot product.
  • To find the sign of this angle (i.e. which
    direction to turn), take the cross product of
    these vectors and check the sign of its Z
    component (a sketch of this loop follows below).

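A sketch of the iterative loop described above (essentially cyclic coordinate descent) for a planar chain. The signed rotation at each joint is obtained with atan2 of the cross and dot products, which folds the arccosine magnitude and the cross-product sign test into one step; the data layout is an illustrative assumption.

```python
import math

def ccd_pass(joints, tip, goal):
    """One cyclic-coordinate-descent pass over a planar chain.

    joints : list of [x, y] joint positions, base first
    tip    : [x, y] end-effector position (modified in place)
    Each joint is rotated so the tip moves toward the goal.
    """
    for j in range(len(joints) - 1, -1, -1):       # start at the end effector
        jx, jy = joints[j]
        v1 = (tip[0] - jx, tip[1] - jy)            # joint -> current tip
        v2 = (goal[0] - jx, goal[1] - jy)          # joint -> goal
        angle = math.atan2(v1[0] * v2[1] - v1[1] * v2[0],   # cross (sign)
                           v1[0] * v2[0] + v1[1] * v2[1])   # dot   (magnitude)
        c, s = math.cos(angle), math.sin(angle)
        for p in [tip] + joints[j + 1:]:           # rotate everything beyond joint j
            px, py = p[0] - jx, p[1] - jy
            p[0], p[1] = jx + c * px - s * py, jy + s * px + c * py

# Example: a three-joint chain reaching for (1.5, 1.5); repeat passes to converge.
joints, tip = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]], [3.0, 0.0]
for _ in range(20):
    ccd_pass(joints, tip, (1.5, 1.5))
print(tip)    # close to (1.5, 1.5)
```
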
44
Goal Potential Function
  • Distance from the end effector to the goal
  • A function of the joint angles: G(q) = || p(q) - p_goal ||

45
Our Example
46
Quiz
  • Will G(q) always reach zero?
  • No: unreachable workspace
  • Will a solution always be found?
  • No: local minima / singular configurations
  • Will the solution always be unique?
  • No: redundancy

47
Conflict Between Goals
(Figure labels: ee 1, ee 2, base)
48
Conflict Between Goals
(Figure labels: Goal 1, ee 1, ee 2, base)
49
Conflict Between Goals
(Figure labels: Goal 2, ee 1, ee 2, base)
50
Conflict Between Goals
(Figure labels: Goal 1, Goal 2, ee 1, ee 2, base)
51
Conflict Between Goals
(Figure labels: Goal 1, Goal 2, ee 1, ee 2, base)
52
Figure Modeling
  • Many VE Applications Require Human, Animal or
    Robotic Actors
  • - Team training exercises
  • SIMNET, DIS
  • - Mission planning and rehearsal
  • - Human factors studies
  • Boeing 777
  • - Walkthroughs
  • Virtual Actors
  • - Computational models of real-world counterparts

53
Virtual Actors Autonomous or Guided
  • Guided Actors are Slaved to the Motions of a
    Human Participant Using Body Tracking
  • Optical, mechanical, . . .
  • A.K.A. Avatar
  • Autonomous Actors Are Controlled by Behavior
    Modeling Programs, and Can
  • - Augment or replace human participants
  • - Serve as surrogate instructors
  • - Act as guides in complex synthetic worlds
  • Hybrid Control Desirable
  • - VRLOCO uses interaction to invoke and control
    locomotion behaviors

54
The Weiss 6-Level Motor Organization Hierarchy
Organism Level
  6. Motor Behavior
  5. Motor Organ System
  4. Motor Organ
  3. Muscle Group
  2. Muscle
  1. Motor Unit
Neuron Level
  • 3. Muscle Group
  • - Coordinated action of several muscles
  • - Motion at one joint
  • 2. Muscle
  • - Muscle contraction
  • 1. Motor Unit
  • - A neuron and the muscle fibers it innervates
  • - Twitching, shivering

55
The Weiss 6-Level Motor Organization Hierarchy
Organism Level
  6. Motor Behavior
  5. Motor Organ System
  4. Motor Organ
  3. Muscle Group
  2. Muscle
  1. Motor Unit
Neuron Level
  • 6. Motor Behavior
  • - Movement of the whole organism
  • - e.g., goal-directed locomotion
  • - Task manager
  • 5. Motor Organ System
  • - Coordinated action of several limbs
  • - e.g., walking
  • - Motor programs, skills
  • 4. Motor Organ
  • - Coordinated action of several joints
  • - e.g., stepping motion of a limb
  • - Local motor programs

56
Motion and Reaction
  • Sensorimotor level
  • - Levels 1-5
  • - Peripheral and proprioceptive feedback
    associated with reflex arcs
  • - Motor programs and reflexes coordinate and
    control motion
  • - Executes behaviors
  • Reactive level
  • - Level 6 and higher
  • - Perception triggers and modulates behavior
  • - Organism responds to environmental stimuli to
    select and compose behaviors
  • - Selects behaviors

57
Organization of a Virtual Actor
Organism Level
  6. Motor Behavior
  5. Motor Organ System
  4. Motor Organ
  3. Muscle Group
  2. Muscle
  1. Motor Unit
Neuron Level
Level 6 and above: reactive level
Levels 1-5: sensorimotor level
58
Virtual Actor
59
(No Transcript)
60
Abstraction and Interaction
61
Representation and Abstraction
62
(No Transcript)
63
Finite State Machines for Walking
64
Control and Abstraction
65
Avatars
66
Static Balance
67
Weight
  • Bend
  • Non-weight-bearing motion
  • Traverse the subtree rooted at the rotating joint
  • Pivot
  • Weight-bearing motion
  • Traverse the entire tree starting at the root EXCEPT
    for the subtree rooted at the rotating joint
    (a traversal sketch follows below)
  • Critical element of realism
  • Is the character supported by its legs, or are
    the legs dangling in space as the character is
    translated along?

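A sketch of the bend/pivot distinction, assuming the figure is stored as a simple tree of joints: bend collects only the subtree rooted at the rotating joint, while pivot collects the rest of the tree. The Joint class and the toy figure are illustrative assumptions, not the original system's data structures.

```python
class Joint:
    """A node in the figure's joint tree."""
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)

def subtree(joint):
    """All joints in the subtree rooted at `joint`, the joint itself included."""
    out = [joint]
    for child in joint.children:
        out.extend(subtree(child))
    return out

def affected_joints(root, rotating, weight_bearing):
    """Joints moved by a rotation at `rotating`.

    Bend  (non-weight-bearing): only the subtree rooted at the rotating joint.
    Pivot (weight-bearing):     the whole tree except that subtree.
    """
    sub = subtree(rotating)
    if not weight_bearing:                                    # bend
        return sub
    return [j for j in subtree(root) if j not in sub]         # pivot

# Example: rotating at the right hip while the right leg bears weight moves
# everything except the right leg.
r_hip = Joint("r_hip", [Joint("r_knee", [Joint("r_ankle")])])
l_hip = Joint("l_hip", [Joint("l_knee", [Joint("l_ankle")])])
root = Joint("pelvis", [r_hip, l_hip, Joint("spine", [Joint("head")])])
print([j.name for j in affected_joints(root, r_hip, weight_bearing=True)])
```
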
68
Bend
Non-weight-bearing motion: traverse the
subtree rooted at the rotating joint
69
Pivot
Weight-bearing motion: traverse the entire tree
starting at the root EXCEPT for the subtree rooted
at the rotating joint
70
Gait Parameters
  • Gait Pattern
  • Sequence of lifting and placing the feet
  • Gait Cycle
  • One repetition of the gait pattern
  • Period
  • Duration of one gait cycle
  • Relative Phase of Leg i
  • Fraction of the gait cycle before leg i is lifted
  • Duty Factor
  • Fraction of the gait cycle period a given leg
    spends on the ground
  • Swing Time
  • Time a leg spends in the air
  • Stance Time
  • Time a leg spends on the ground
  • Stroke
  • Distance the body travels during a leg's stance
    time (a timing sketch follows below)

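The parameters above are related by simple arithmetic; a minimal sketch, assuming a constant body speed, with illustrative function names.

```python
def gait_timing(period, duty_factor, speed):
    """Derive per-leg timing from the gait parameters above.

    period      : duration of one gait cycle (s)
    duty_factor : fraction of the cycle a leg spends on the ground
    speed       : body speed (m/s), assumed constant
    """
    stance_time = duty_factor * period      # leg on the ground
    swing_time = period - stance_time       # leg in the air
    stroke = speed * stance_time            # distance the body travels per stance
    return {"stance_time": stance_time, "swing_time": swing_time, "stroke": stroke}

def lift_times(period, relative_phases):
    """Time within the cycle at which each leg i is lifted (the gait pattern)."""
    return [phase * period for phase in relative_phases]

# Example: a biped with a 1.2 s cycle, duty factor 0.6, legs half a cycle apart.
print(gait_timing(period=1.2, duty_factor=0.6, speed=1.4))
print(lift_times(1.2, [0.0, 0.5]))
```
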
71
Finite State Machines for Walking
72
(No Transcript)
73
Tele-Immersion
  • Goal: not just making these collaborations
    possible, but making them convenient

74
CAVERNsoft Application
Virtual Harlem
  • Bryan Carter, Bill Plummer - ATC (Advanced
    Technology Center at the Univ. of Missouri-Columbia)
  • SIGGRAPH 1999
  • Harlem is reconstructed for an African American
    Literature course at MU. Instead of just reading
    literary works from this era, this prototype will
    allow students to become immersed and engaged in
    an interactive literature course.
  • Jim Sosnoski, Jim Fletcher - English Dept., Univ.
    of Illinois at Chicago
  • Steve Jones - Communications Dept., Univ. of
    Illinois at Chicago

75
Elements of Tele-Immersion
76
Avatars
  • Tracking head and hand position and orientation
    gives good cues
  • Extendable pointing rays can be useful in large
    spaces
  • Exaggerated head and hand motions give better
    cues than hand motion alone

77
(No Transcript)
78
Shared Virtual Environments in Europe
  • Collaborative Virtual Environments (COVEN) - an ACTS project
  • Develops an integrated teleworking platform that
    supports multi-sensory presence for collaboration
    in shared virtual environments.
  • Services:
  • mechanisms to support the presence of users in
    shared virtual environments.
  • browsing and interaction facilities for large
    numbers of users accessing enormous quantities of
    remote information
  • synchronised multi-sensory interaction with
    dynamic representations of three-dimensional
    objects and actors
  • support for collaborative tasks requiring complex
    motor skills and shared information.

79
VR Applications
  • Augmented Reality
  • Placing data in the normal workspace
  • Data Visualisation
  • Explaining data through better representation
  • Training
  • For dangerous/expensive procedures
  • Conferencing
  • Social context for telecommunication
  • Health
  • Treatment of phobias/psychological disorders
  • Entertainment