Course director: Robert Allison Fall 2011
T 19:00-22:00 FRQ 045 (this room is horrible, we'll meet in CSE 3033 after the first night)
Office: 3051 CSE; phone: x20192
email: allison@cse.yorku.ca; www: www.cse.yorku.ca/percept
course web page: www.cse.yorku.ca/course/6326
This course considers the role of human perception in human-computer interaction, particularly computer-generated graphics/sound and immersive virtual reality. Fundamental findings from sensory physiology and perceptual psychophysics are presented in the context of interface and display design.
Expanded Course Description (from calendar)
This course evaluates the role of human perception in the design and use of computer systems. The fields of visual, tactile and auditory psychophysics and physiology are surveyed. Fundamental findings on how we perceive tone, pitch, force, light, colour, pattern, motion, texture, shape, and depth are examined in the context of how they can be used in real applications of computer-generated displays and advanced interfaces. The current state of the art will be discussed in terms of the capabilities and limitations of the operator. Selected topics of interest to the instructor and class will be covered in detail, including material such as the following:
1. Principles of human perception and its relationship to the design of computer displays: basics of the physiology and psychophysics of audition, vision and proprioception
2. Limits of performance: perception, motor action, cognition, memory, workload and attention
3. Visual displays: realistic image synthesis, scientific and information visualization, virtual environments, and graphic design
4. Auditory, tactile and motion displays
5. Virtual environments: enabling technologies, characteristics, human factors, sensory integration, cybersickness, interaction, navigation, control
6. Training and fidelity in computer-generated simulations
7. Telepresence and teleoperation
8. Mixed reality and wearable computers
9. Presence, realism, suspension of disbelief and their relationship to the development of effective virtual environments
10. Interaction and collaboration between users in multi-user virtual environments
11. Measurement of physiological and psychophysical parameters: anthropometry, biofeedback, performance and usability evaluation
Preliminary Schedule
Here's my tentative plan, subject to the interests of the class and the demands of time. This year I plan to spend more time on stereoscopic 3D displays and applications than I have in the past.
Week | Dates | Topic |
Week 1 | Sept 13 | Introduction |
Week 2 | Sept 20 | VR, AR and advanced display applications, Enabling Technology. |
Week 3 | Sept 27 | Visual perception and Visual Displays |
Week 4 | Oct 4 | Visual perception and Visual Displays |
Week 5 | Oct 11 | No class (break week) |
Week 6 | Oct 18 | Visual perception and Visual Displays |
Week 7 | Oct 25 | Active Vision; Movement and tracking |
Week 8 | Nov 1 | Depth Perception and 3D Displays; Paper or project proposal Due |
Week 9 | Nov 8 | Depth Perception and 3D Displays |
Week 10 | Nov 15 | Depth Perception and 3D Displays |
Week 11 | Nov 22 | Audition and Spatial Auditory displays |
Week 12 | Nov 29 | Somatosensory perception, kinesthesia; Haptic displays |
Week 13 | Dec 6 | Sensory integration; Cue conflict; Side-effects: cybersickness, asthenopia, oscillopsia ...; Paper or project Due Dec 13 |
Evaluation
Seminars (individual) 50%
Seminar Participation 20%
Term Paper/Project (individual) 30%
The seminars will be critical reviews/analyses of papers drawn from the current literature. They will be presented to the class and evaluated in terms of presentation and content. You should cover background material, discuss the paper, critique it, and put it in context. Seminars will be scheduled regularly throughout the term. Each student will present two or three seminars (depending on class size). All students should read the papers and be prepared to discuss them.
A term paper reviewing a related subject must be submitted by the end of term. I am also happy to consider interesting practical projects. Students must submit a brief (one-paragraph) proposal of their paper or project topic for instructor approval.
Seminars and Papers for Discussion
Week 1: Introduction
Week 2: Introduction, enabling technology
Week 3: Vision and Visual Displays
seminars and readings on Historical VR: 1) Sutherland paper A and B 2) CAVE paper
Week 4: Vision and Visual Displays
seminars and readings on enabling technologies and applications: 1) AR Image-guided needle insertion 2) BCI interfaces (see also this introduction for an easier go)
Week 6:
seminars and readings on limits of vision and displays: 1) Visually significant edges 2) Tone Mapping
Week 7:
seminars and readings on limits of vision and displays: 1) Attention, Consciousness, and Data Display 2) Two Pathways
Week 8:
seminars and readings on limits of vision and displays: 1) Peephole pointing 2) Multiscale UI
Week 9:
seminars and readings on motion: 1) Motion in Graphs 2) Vection
Week 10:
seminars and readings on motion and interaction: 1) Steering law (pdf here) 2) Error Analysis in AR
Week 10:
seminars and readings on stereo displays: 1) stereo display tech 2) Motion in depth (Sidrah)
Week 11:
seminars and readings on stereo displays: 1) Zone of comfort 2) Microstereopsis
Week 12:
seminars and readings on stereo displays: 1) Fusion Control 2) Stereo Cut and Paste
Week 13:
seminars and readings on auditory displays: 1) Auditory Space
Bibliography
The recommended text is Gutiérrez, M. A., Vexo, F., & Thalmann, D. (2008). Stepping into Virtual Reality. Springer. The core readings will consist of research papers drawn from the recent literature.
Some Additional Readings:
Some Relevant Journals and Proceedings: