Title: Display and Interaction Devices
1. User Interface II
- Display and Interaction Devices
(Based in part on previous lectures by Matt Ayers, Kenneth Herndon, and Scott Sona Snibbe. Updated by Fareed Behmaram-Mosavat)
2. Display and Interaction Devices
- Roadmap
- Distinguishing characteristics of devices
- Input devices
- standard
- research
- Output devices
- video
- other
- Virtual devices
- WIMP vs. Post-WIMP interfaces
- Where do we go from here?
3. Input Device Hardware (1/3)
- Hardware characteristics
- Absolute vs. relative
- Polled vs. interrupt-driven
- Discrete vs. continuous input
- Degrees of freedom (DOF)
- number of simultaneous, independent data values that arrive in one record; normally 1, 2, 3, or 6
- Potential problem areas
- Spatial resolution
- Registration and calibration
- Accuracy and repeatability
- Sample frequency (temporal resolution)
- Lag
- Data synchronization
- Abstractions
- Hardware level
- Logical level
4. Input Device Hardware (2/3)
- Device interface level
- Wired vs. wireless
- Connection
- IrDA (Infrared)
- Universal Serial Bus (USB)
- Firewire (IEEE1394)
- Bluetooth
- older: RS-232, parallel, mini-DIN 8
- Power source
- AC power supply
- batteries
- mechanical motion, solar
- connection to computer
- Type of data transferred
- binary or text
- floating point, integers, text, etc.
5. Input Device Hardware (3/3)
- Logical level
- Divides devices into task-oriented categories
- navigation in a scene
- object selection
- positioning of an object or camera in 1, 2, 3, or more dimensions
- orientation of an object or camera in 1, 2, 3, or more dimensions
- text input
- scalar value input
- ink, i.e. draw a line
- indication of complex shape contours
- Hides hardware issues such as absolute vs. relative values
- Can be remapped in software (a sketch follows below)
- Logical-level abstractions are easy for WIMP, but very hard for next-generation devices and VR applications
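Below is a minimal sketch (in Python, not the course's actual toolkit code) of what a logical-level abstraction might look like: a relative device (mouse) and an absolute device (tablet) are both adapted into the same logical 2D positioner record, so application code never has to care about the absolute-vs.-relative distinction and the mapping can be changed in software. All class and function names are illustrative.

```python
# Sketch of a logical-level input abstraction: hardware reports from
# different devices are normalized into one "logical positioner" record,
# hiding whether the underlying device is absolute or relative.
from dataclasses import dataclass

@dataclass
class LogicalPosition2D:
    x: float  # normalized to [0, 1]
    y: float  # normalized to [0, 1]

class RelativeDeviceAdapter:
    """Wraps a relative device (e.g. a mouse): accumulates deltas into an
    absolute logical position clamped to the unit square."""
    def __init__(self):
        self._pos = LogicalPosition2D(0.5, 0.5)

    def handle_report(self, dx: float, dy: float) -> LogicalPosition2D:
        self._pos.x = min(1.0, max(0.0, self._pos.x + dx))
        self._pos.y = min(1.0, max(0.0, self._pos.y + dy))
        return self._pos

class AbsoluteDeviceAdapter:
    """Wraps an absolute device (e.g. a tablet): rescales its native
    coordinates into the same logical record."""
    def __init__(self, width: float, height: float):
        self._w, self._h = width, height

    def handle_report(self, x: float, y: float) -> LogicalPosition2D:
        return LogicalPosition2D(x / self._w, y / self._h)

# Application code sees only LogicalPosition2D, so devices can be swapped
# or remapped in software without touching the UI logic.
mouse = RelativeDeviceAdapter()
tablet = AbsoluteDeviceAdapter(width=2048, height=1536)
print(mouse.handle_report(0.1, -0.05))
print(tablet.handle_report(1024, 768))
```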
6. Traditional Input Devices (1/5)
- Commonly used today
- Mouse-like devices
- mouse
- wheel mouse (up to 2 wheels offer extra DOF)
- trackball
- trackpad
- Keyboards
- QWERTY, Dvorak, Maltron
- one handed vs. two handed
- standard vs. ergonomic
- chording keyboards
- DataHand keyboard
7. Traditional Input Devices (2/5)
- Pen/Stylus (see slide 40)
- Data provided
- Absolute position
- Pressure, distance from surface
- Tablets for desktop computers
- Alternative to mouse
- Tablet PCs (can be used as laptops or slates)
- Toshiba Portege M200, M400
- HP Tablet PC tc1100, tc4200
- Acer TravelMate C200, C310
- and many more
- Palm-top devices
- HP iPaq Pocket PC
- Handspring, PalmOS
- Sony Clie, Treo
- BlackBerry
WACOM Intuos3 tablet
WACOM Cintiq 21UX
Nintendo DS
HP tc1100
8. Traditional Input Devices (4/5)
- Dial boxes
- number of dials (1 DOF per dial)
- Joysticks
- game pads
- flightsticks (2 or 3 DOF plus a myriad of buttons and sliders)
- Nintendo Wii controller
9. Traditional Input Devices (5/5)
- Touchscreens
- Microphones
- wireless vs. wired
- headset
- unencumbering
- Digital still and video cameras, scanners
- Sony EyeToy
- uses basic image recognition to track body movements as input to console games
- TrackIR by NaturalPoint
- MIDI devices
- input from electronic musical instruments
- more convenient than entering scores with just a
mouse/keyboard
10. 3D Input Devices (1/4)
- Use may become more common in future
- Electromagnetic trackers
- 3 or 6 DOF (position and orientation in space)
- can be attached to the head, hands, joints, or objects
- must deal with noise, calibration
- Polhemus FASTRAK (used in Brown's Cave)
- provides X, Y, Z position and Euler-angle orientation at 120 Hz, with 0.03 accuracy
- receivers attached to the user detect a field generated by a mounted transmitter
- Flock of Birds (used on the Graphics Lab Fakespace table)
- Acoustic-inertial trackers
- no interference from metal objects
- wider range, higher accuracy
- Intersense IS-900
- receivers attached to the user detect ultrasound from many inexpensive, small emitters to determine position
- also uses an inertial measurement unit to determine angular acceleration, integrated for orientation (see the sketch below)
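As an illustration of the "integrate for orientation" step, here is a minimal sketch that accumulates body-frame angular rate into a quaternion orientation at each sample. It assumes gyro-style angular-rate measurements and a fixed time step; a real tracker like the IS-900 additionally fuses the ultrasound range data to correct drift, and none of its actual filtering is shown here.

```python
# Sketch: integrate body-frame angular rate (rad/s) into an orientation
# quaternion q = (w, x, y, z).  Small-step integration only; real trackers
# also correct drift with the acoustic position data.
import math

def quat_mul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def integrate_angular_rate(q, omega, dt):
    """Advance orientation q by body-frame angular rate omega = (wx, wy, wz)
    over a time step dt, then renormalize to fight numerical drift."""
    wx, wy, wz = omega
    mag = math.sqrt(wx*wx + wy*wy + wz*wz)
    if mag < 1e-12:
        return q
    half = 0.5 * mag * dt
    s = math.sin(half) / mag
    dq = (math.cos(half), wx * s, wy * s, wz * s)  # incremental rotation
    qn = quat_mul(q, dq)
    norm = math.sqrt(sum(c * c for c in qn))
    return tuple(c / norm for c in qn)

# A 90-degree turn about Z accumulated over one second of 180 Hz samples:
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(180):
    q = integrate_angular_rate(q, (0.0, 0.0, math.pi / 2), 1.0 / 180)
print(q)   # approximately (0.707, 0, 0, 0.707)
```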
11. 3D Input Devices (2/4)
- Use may become more common in future
- Infrared trackers
- Short range (normally around 3 or 4 feet)
- high accuracy
- Nintendo Power Glove
- Optical trackers
- photogrammetric technique: space-resection by collinearity
- no EM interference to worry about
- self-calibration
- UNC's HiBall (commercialized by 3rdTech)
12. 3D Input Devices (3/4)
- Gloves
- attach electromagnetic tracker to the hand
- Pinch gloves
- contact between digits is a pinch gesture
- in the Cave, we extended Fakespace PINCH gloves with extra contacts
- Brown's FingerSleeve: a single-finger device that combines a tracker and pop-through buttons
13. 3D Input Devices (4/4)
- Mouselike
- relative 6 DOF, with multiple buttons
- 6 DOF trackers are easier to control and more versatile
- LogiCad Magellan controller; used in the CAVE early on, since replaced by an Intersense 6 DOF wand
- Hybrid
- Wand/Wanda (Murray Consulting)
- 6 DOF tracking, relative joystick and buttons
14. Unsuccessful 3D Input Devices
- Commercial failures
- Spaceball
- broke ground for the Magellan puck
- 6 DOF designed for easy navigation
- mostly used for 3D modeling
- Flymouse
- tracks motion of mouse held in mid-air
- limited range of motion
15. Products for specialized markets
- UI hardware for the disabled
- Animation/keyframing
- Full body and facial motion capture
16. Some Current Input Device Research
- Non-standard Input Devices
- Reconfigurable devices
- Brown's Lego toolkit
- Tool handles/props, with attached sensors
- phicons (physical icons; Hiroshi Ishii, MIT Media Lab)
- Passive input devices
- Premise: all devices are encumbering
- repetitive stress
- limited range of expression
- unsanitary
- Would like to separate user from devices
- Voice recognition without a headset
- not successful yet
- Image-based analysis
- video camera trained on user
- gaze tracking
- gesture tracking
- expression tracking
17. Multitouch
- iPhone, iPod touch, MacBook Pro
- Can use two fingers at once
- No need for buttons!
- UI elements are displayed on screen
- Frustrated Total Internal Reflection (FTIR)
- Jeff Han (NYU) 2005
- Infrared lights placed at the edges of an acrylic surface
- total internal reflection until the user touches the surface
- computer vision algorithms determine the points of contact (see the sketch below)
- multiple users can interact at the same time
18. Virtual Input Devices (1/8)
- a.k.a. gestures and 3D widgets
- Part of a windowing system, UI toolkit, or 3D environment
- Widgets: a combination of behavior and geometry
- Motivation
- Advanced hardware devices are expensive and not always available for all platforms
- Most users already know how to use traditional input devices (mouse, keyboard)
- It is inefficient to have to continuously switch devices
- try to keep hands on the mouse or the keyboard
- Would like to perform complicated inputs with simple gestures
- You will implement a virtual trackball and other virtual devices in the Modeler assignment
19. Virtual Input Devices (2/8)
- 2D widgets
- Windowing systems (e.g. X, Mac, Windows)
- window
- scrollbar
- UI toolkits (e.g. Java Swing/AWT, Motif, Windows Forms)
- button
- dialog box
- drawing area
- object handles
- Simulating hardware devices
- sliders as virtual dials
- windows as virtual tablet
20. Virtual Input Devices (3/8)
- 3D widgets
- Ambiguity of gestures
- 2D mouse gesture → 3D movement
- interface must make decisions
- complex geometry involved to make these decisions
- Fundamental differences between 2D and 3D graphics
- multiple coordinate systems
- hidden surfaces
- more complicated primitives (3D objects, not 2D windows)
- Combine geometry and behavior
- make sure that target users can infer the widget's functionality from its geometry
- reduce the cognitive distance between the function you are actually performing and the interaction you are doing
- virtual devices should show the affordances of the actions they are designed to perform
21. Virtual Input Devices (4/8)
- Disambiguating 2D gestures
- How do we interpret a 2D mouse gesture as a 3D translation?
- Axis-aligned
- Screen-aligned
- Surface-aligned
22. Virtual Input Devices (5/8)
- Gestural axis-aligned translation
- Compare the 2D mouse vector with the projected 3D object axes
- we choose the axis whose direction matches most closely
- mathematically, this is the axis whose screen-projected 2D dot product with the mouse vector has the largest magnitude (see the sketch below)
- in this case, we choose the X axis
- special cases crop up when the projected axes cannot be disambiguated
(Figure: the object's projected X, Y, and Z axes alongside the 2D mouse vector)
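A minimal sketch of the axis-selection rule above: with the object's axes already projected into 2D screen space, normalize everything and pick the axis whose dot product with the mouse-drag vector has the largest magnitude. The function name and example numbers are illustrative.

```python
# Sketch: choose the projected object axis that best matches a 2D mouse drag.
import math

def pick_axis(mouse_vec, projected_axes):
    """mouse_vec: (dx, dy) in screen space.
    projected_axes: dict mapping axis name -> projected 2D vector."""
    def normalize(v):
        length = math.hypot(v[0], v[1])
        return (v[0] / length, v[1] / length) if length > 1e-9 else (0.0, 0.0)

    m = normalize(mouse_vec)
    best_axis, best_score = None, -1.0
    for name, axis in projected_axes.items():
        a = normalize(axis)
        score = abs(m[0] * a[0] + m[1] * a[1])   # |cos| of the angle between them
        if score > best_score:
            best_axis, best_score = name, score
    return best_axis

# Dragging mostly to the right picks the axis that projects mostly horizontally:
axes = {'x': (0.9, 0.1), 'y': (0.0, -1.0), 'z': (-0.5, 0.6)}
print(pick_axis((30, 4), axes))   # -> 'x'
```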
23. Virtual Input Devices (6/8)
- Virtual sphere rotation (Chen '88)
- Project mouse motions onto the surface of a sphere surrounding the object (an object trackball)
- Construct two vectors from the center of the sphere to its surface
- first vector: sphere center to the beginning of the mouse motion
- second vector: sphere center to the end of the mouse motion
- The cross product of the two vectors gives the axis around which to rotate
- The normalized dot product gives the cosine of the angle to rotate the object through
- Used for the camera trackball as well
- You will implement this in Modeler! (A sketch follows below.)
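A sketch of one common formulation of the virtual sphere, assuming mouse coordinates already normalized to [-1, 1] and a unit sphere centered at the origin; points outside the sphere's silhouette are lifted onto a hyperbolic sheet so the motion stays smooth. This is illustrative and not necessarily the Modeler support code.

```python
# Sketch: virtual trackball -- lift two mouse points onto a sphere, then use
# the cross product for the rotation axis and the dot product for the angle.
import math

def project_to_sphere(x, y, radius=1.0):
    """Lift a 2D point onto the virtual sphere, or onto a hyperbolic sheet
    when the point falls outside the sphere's silhouette."""
    d2 = x * x + y * y
    r2 = radius * radius
    z = math.sqrt(r2 - d2) if d2 <= r2 / 2.0 else r2 / (2.0 * math.sqrt(d2))
    return (x, y, z)

def trackball_rotation(p0, p1):
    """p0, p1: mouse positions in [-1, 1]^2.  Returns (axis, angle_radians)."""
    v0 = project_to_sphere(*p0)
    v1 = project_to_sphere(*p1)
    axis = (v0[1]*v1[2] - v0[2]*v1[1],          # axis = v0 x v1
            v0[2]*v1[0] - v0[0]*v1[2],
            v0[0]*v1[1] - v0[1]*v1[0])
    n0 = math.sqrt(sum(c * c for c in v0))
    n1 = math.sqrt(sum(c * c for c in v1))
    cos_angle = sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)
    angle = math.acos(max(-1.0, min(1.0, cos_angle)))   # clamp for safety
    return axis, angle

# A small horizontal drag rotates about (roughly) the screen's vertical axis:
axis, angle = trackball_rotation((0.0, 0.0), (0.2, 0.0))
print(axis, math.degrees(angle))
```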
24. Virtual Input Devices (7/8)
- Inherent difficulties of 3D input
- Different coordinate systems
- world
- object
- camera
- UV coordinates on an object's surface
- screen
- More complex math
- 3D points, vectors, transformation matrices, quaternions
- ray casting, hidden surface calculations
- 2D view of 3D scene
- information is missing in a flat display
- objects obscured or off screen
- spatial relationships difficult to perceive
- need to be able to form an object hypothesis (James Gibson, perceptual psychologist)
25. Virtual Input Devices (8/8)
- Comparison of real and virtual devices
- Some systems (e.g., our Cave) have many physical devices
- 2D input devices: mouse, keypad, WACOM tablet
- Crystal Eyes shutter glasses for stereo output
- up to 3 Intersense or Polhemus trackers (6 DOF each)
- tracker on the shutter glasses
- tracker on each hand
- immediately accessible; all might work simultaneously
- Many users prefer mouse-based virtual devices
- not a lot of space on a physical desktop
- don't have to keep fumbling around the desk
- the time needed to reacquaint yourself with the devices can matter more than the actual 3D input
- feel is more important
- easier to adapt behavior as users transition from novice to expert
- Experimental results
- in an experiment, Marceli Wein presented users with an actual trackball directly beside a tablet with virtual sphere control
- all of his users abandoned the actual trackball in favor of the virtual sphere algorithm
- but don't assume! User testing is crucial!
26. Video Output Devices (1/5)
- Classifications
- Stereo
- considered necessary (better depth cues)
- demands extra hardware
- head-mounted displays
- shutter glasses (CrystalEyes glasses used in our Cave™)
- demands faster update rates
- no more than 300 ms lag
- at least 60 frames per second
- Degree of immersion
- conventional desktop screen
- walkup VR, semi-immersive displays
- immersive virtual reality
- augmented (mixed) reality with video or optical blending
- see slide 32
27. Video Output Devices (2/5)
- Desktop
- CRT
- LCD flatpanel
- Desktop displays
- (Sun Lab)
- PC and Mac laptops
- Tablet computers and palmtops
- Wacom Cintiq 12WX display tablet
- Semi-Immersive Desktop
- Rear projected
- Typically in stereo
- Fakespace M1 Desk
- Holografika's HoloVizio
- Fishtank (slide 29)
- Depth cube (slide 29)
- Semi-Immersive Wall
- Single projector, often DLP (Digital Light Processing, Texas Instruments) based
- Power Wall (see next slide)
- e.g. a 3x3 wall in the CCV at 180 George Street
Fakespace M1 Desk
28. Power Wall
- Significant registration and blending problems
- http://graphics.idav.ucdavis.edu/newsletter/oct04
- First created at SGI
- Mono or stereo
- Semi-immersive via head-tracked stereo
http://www.emercedesbenz.com/Apr06/18DesignOfThe2007MercedesSClass.html
29. Video Output Devices (3/5)
- Fishtank VR
- Cheap VR setup
- Uses stereo (a separate image to each eye) to create a 3D illusion
- Input through force-feedback haptic devices
- DepthCube
- Composed of 20 liquid-crystal scattering shutters
- At any one time, 19 of these screens are transparent, and 1 is in a scattering state
- Uses the z-buffer to determine the image displayed on each screen (see the sketch below)
- 3D anti-aliasing removes discontinuities between layers
- http://lightspacetech.com/
Daniel Keefe using the Fishtank
The LightSpace DepthCube
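As an illustration of the z-buffer slicing idea (not LightSpace's actual algorithm), the sketch below routes each pixel's color to whichever of the 20 layers contains its depth value; the 3D anti-aliasing pass that blends between adjacent layers is omitted.

```python
# Sketch: split a z-buffered image across a stack of 20 shutter layers by
# assigning each pixel to the layer whose depth bin contains its z value.
NUM_LAYERS = 20

def slice_into_layers(color, depth, z_near=0.0, z_far=1.0):
    """color: 2D list of pixel colors; depth: 2D list of z values in
    [z_near, z_far].  Returns NUM_LAYERS images; None marks transparency."""
    h, w = len(color), len(color[0])
    layers = [[[None] * w for _ in range(h)] for _ in range(NUM_LAYERS)]
    bin_size = (z_far - z_near) / NUM_LAYERS
    for y in range(h):
        for x in range(w):
            k = int((depth[y][x] - z_near) / bin_size)
            k = min(max(k, 0), NUM_LAYERS - 1)   # clamp onto the last layer
            layers[k][y][x] = color[y][x]
    return layers

# A 1x2 image whose left pixel is near and right pixel is far:
layers = slice_into_layers([["red", "blue"]], [[0.02, 0.97]])
print([i for i, layer in enumerate(layers) if any(p for p in layer[0])])
# -> [0, 19]
```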
30. Video Output Devices (4/5)
- Immersive
- Head-mounted displays (HMD)
- Cognitive Science Department has the VENLab for human navigation experiments
- uses a Kaiser Proview HMD and Intersense IS-900 trackers
- allows subjects to wander a 1600 sq. ft. room (nearly) freely
- working on making the HMD wireless
- CAVE™ (Cave Automatic Virtual Environment)
- invented at the University of Illinois Electronic Visualization Lab by Carolina Cruz-Neira, Daniel Sandin, and Tom DeFanti (SIGGRAPH 1992)
- projection onto 3 walls and the floor
- also 5- and even 6-sided CAVEs, and a RAVE, a reconfigurable CAVE
- FakeSpace RAVE
- reconfigurable large-screen stereoscopic display
31. Video Output Devices (5/5)
- Immersive
- Virtual Retinal Display (VRD)
- University of Washington HIT Lab
- VirtuSphere
- Fully immersive VR
- 360 degrees of motion
- Floor moves as you move
- Wireless
User with VRD
VirtuSphere
32. Augmented Reality
- Augmented reality devices
- Optical see-through or video-based
- research going on at UNC, University of Vienna, Columbia (Steven Feiner), Takemura Lab at Osaka University, Bauhaus University
- University of South Australia made ARQuake: you can play a first-person shooter around campus
Columbia's MARS
33. Other Output Devices
- Audio
- Do not underestimate the importance of sound!
- Speakers
- 3D spatial sound
- Headphones
- Printers
- Selectric-style impact printing
- Plotters
- Ink jet
- Thermal transfer
- Laser
- Braille
- Slides/film
- Dye-sublimation
- Holographs
- MIT Media Lab Spatial Imaging Group
- Rapid prototyping systems and 3D raster scan
devices
34. Haptic Devices (1/2)
- Haptic: relating to or based on the sense of touch
- Actively provides tactile feedback
- Caveat: almost all tactile output devices are also input devices
- Some examples
- piezoelectric gloves
- piezo pads apply pressure or vibration to the user's fingers
- solenoid mouse
- mouse vibrates via an electromagnetic solenoid
- SensAble's PHANToM in the Graphics Lab
- Also passive haptic devices
- prop or phicon based interaction
Phantom 3D force feedback haptic interface
Phicons
35. Haptic Devices (2/2)
- Kinetic devices
- Force-feedback joystick
- Sarcos Dextrous Arm
36. Computer vs. Human Performance
Goal: increase bandwidth to the brain
37. WIMP Pros
- WIMP encourages ease of learning, remembering, and transferring
- WIMP has become a standard GUI
- but not everyone can or wants to use a mouse
- Layers of support software → ease of implementation, maintainability
- Toolkits (Qt, Motif, etc.)
- interface builder
- User Interface Management Systems (UIMS)
- Lots of documentation about how to come up with a good GUI
- GUI Design for Dummies by Laura Arlov ('97)
38. WIMP Cons
- Imposes a ping-pong dialog model based on mouse and keyboard input and 2D graphics output
- deterministic and discrete
- hard to handle simultaneous input
- pure WIMP doesn't use other senses: hearing, touch
- 70% of our neurons are in the visual cortex, but try to communicate without speech or sound
- Not usable in immersive VR
- Does not support multiple, simultaneous users
39. Impedance-matching Limitations of WIMP GUI
- Limited vision (flat, 2D)
- No speech
- No gestures
- Limited audio
- Limited tactile
- One hand tied behind back
40. Post-WIMP interfaces
- Gestural interfaces
- Microsoft Center for Research in Pen-centric Computing at Brown
- Music Notepad
- MathPad
- ChemPad
- Diagrammer
- Multitouch
- Jeff Han (NYU), Perceptive Pixel
- Microsoft Surface
- iPhone, iPod touch
- FTIR in Graphics Lab
- based on Han's work
- Multimodal
- X+V (XHTML+Voice; IBM, Motorola, Opera)
- VR and AR interfaces
- CAVE, VENLab
41. Post-WIMP Characteristics
- Multiple channels, possibly multiple participants
- High bandwidth, continuous input
- body part tracking (head, hand)
- gesture and speech recognition → probabilistic disambiguation (e.g. handwriting recognition for PDAs, data gloves in VR)
- multimodal interfaces: mutually reinforcing parallel channels (a sketch follows below)
- perceptual interfaces: typically multimodal + passive sensing
- Autonomous objects in an active world
- MIT Media Lab's "Put That There" from the '80s
- MIT AI Lab's Intelligent Room
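A toy sketch of that probabilistic disambiguation across mutually reinforcing channels: each recognizer produces a distribution over candidate commands, and, assuming the channels are independent, the joint score is the product of the per-channel probabilities. The candidate strings and probabilities are made up for illustration.

```python
# Sketch: fuse per-channel recognition scores (speech, gesture, ...) into a
# joint distribution by multiplying probabilities and renormalizing.
def fuse_channels(channel_scores):
    """channel_scores: list of dicts mapping candidate command -> probability."""
    candidates = set().union(*channel_scores)
    joint = {}
    for cand in candidates:
        p = 1.0
        for scores in channel_scores:
            p *= scores.get(cand, 1e-6)   # small floor for unseen candidates
        joint[cand] = p
    total = sum(joint.values())
    return {cand: p / total for cand, p in joint.items()}

# Speech alone is ambiguous ("move that" vs. "remove that"), but a pointing
# gesture toward a movable object reinforces the "move" reading:
speech  = {"move that": 0.45, "remove that": 0.55}
gesture = {"move that": 0.80, "remove that": 0.20}
fused = fuse_channels([speech, gesture])
print(max(fused, key=fused.get), fused)   # -> "move that"
```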
42. Post-WIMP World: Push and Pull
- Push from new technology, from form factors
- PDAs
- flat panels
- wearables
- embedded computing, smart x
- Pull from new applications that both leverage and drive technology trends
- These interact to raise expectations continuously
43. WIMP GUIs Will Be Augmented, Not Replaced
- UI spectrum
- direct control (direct manipulation, drag-and-drop, 2D and 3D widgets)
- indirect control (agents, social interfaces)
- WIMP enhanced by
- speech and gesture recognition, passive sensing (video-based)
- agents/wizards
- 3D widgets (interface tools)
- From Human-Computer Interaction (HCI) to Human-Human Interaction (HHI)
44. Bill Buxton: Surface and Tangible Computing
- Acoustic transducers are bi-directional, e.g., microphone/speaker
- Displays soon will have pixels of (R, G, B, I), where I puns "eye" and is basically a photodiode
- Size matters (large-screen displays, the Cave)
- multiple technologies will make it possible to replace whiteboards cheaply with 100-200 DPI color screens (e.g., organic LEDs) that can be put on thin, flexible substrates
- Phicons/tangible objects become interesting when they have built-in intelligence; our e-gadgets all do (microchips, wireless)
- The surface becomes the connecting "ground" on which these "figures" interact and amplify their ability
- http://www.popularmechanics.com/technology/industry/4217348.html
45. From HCI to HHI (Human-Human Interaction)
Note: each human typically controls many devices and user interfaces
46. Multiple, Interconnected Devices and UIs per User (1/3)
- Office/Home
- wall displays, personal notepad
- video tracking for user ID, location, gaze, gesture
- continuous speech recognition, natural-language understanding, intelligent information processing
- Furniture: the chair is instrumented to help detect posture and adjust to the user's preferred position
- Health-care
- Prostheses (today: heart pacemakers, hearing aids, cochlear implants, voice boxes, artificial joints and organs)
- Electro-chemical monitors, probes (increasingly less obtrusive)
- Smart toilet to monitor bodily wastes
- Tele-collaboration for multi-disciplinary industrial design
- Immersive VR environment
- Emphasis on small-team collaboration
47. Where are We Today?
48. Further Resources
- Check out the newest research in the Brown CS Computer Graphics Lab
- Visualization Laboratory
- http://vis.cs.brown.edu/organization/people/dhl.html
- CAVE
- http://www.cs.brown.edu/research/graphics/research/cave/home.html
- Haptics
- http://www.cs.brown.edu/research/graphics/research/haptics/home.html
- Pen-centric Computing
- http://pen.cs.brown.edu/
- https://pcc.cs.brown.edu/wiki/