Title: Experience Retrieval in a Ubiquitous Home
Gamhewage C. de Silva, Byoungjun Oh, T. Yamasaki, K. Aizawa
Department of Frontier Informatics, The University of Tokyo, Japan
Abstract
Ubiquitous Home
- An electronic chronicle for a home-like ubiquitous environment
- Data acquisition using a large number of cameras and microphones
- Analysis of floor sensor data for efficient retrieval
- Content analysis for event detection
- Evaluation experiments for selecting the most suitable algorithms
- Located in NICT Keihanna Laboratory
- Video recorded at 5 frames per second
- Audio sampled at 44.1 kHz
- Floor sensors activated by footsteps
- Approximately 500 GB of data per day (a rough check follows)
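As a quick sanity check on the 500 GB/day figure, the Python arithmetic below is a back-of-envelope sketch; the camera and microphone counts, frame size, sample width, and compression ratio are assumptions for illustration, while the 5 fps and 44.1 kHz rates come from the setup above.

# Back-of-envelope check of the ~500 GB/day figure.
NUM_CAMERAS = 16              # assumed; not stated above
NUM_MICS = 25                 # assumed; not stated above
FRAME_BYTES = 320 * 240 * 3   # assumed 24-bit QVGA frames
FPS = 5                       # from the setup above
AUDIO_RATE = 44_100           # Hz, from the setup above
SAMPLE_BYTES = 2              # assumed 16-bit mono samples
COMPRESSION = 10              # assumed JPEG-like video compression ratio
SECONDS_PER_DAY = 86_400

video = NUM_CAMERAS * FRAME_BYTES * FPS * SECONDS_PER_DAY / COMPRESSION
audio = NUM_MICS * AUDIO_RATE * SAMPLE_BYTES * SECONDS_PER_DAY
print(f"video ~{video / 1e9:.0f} GB/day, audio ~{audio / 1e9:.0f} GB/day")
# With these assumptions: roughly 160 GB/day of video and 190 GB/day of
# audio, i.e. the same order of magnitude as the reported 500 GB/day.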
Data Collection
- Student experiments: daily life activities, plus data for specific actions and events
- Real-life experiment: a family of three living in the home for 10 days

Person-based Video Retrieval
- Goal: retrieve a single video clip that follows a moving person
- Segmentation of floor sensor data into footstep sequences of individuals (see the sketch after this list)
- Video handover: automatic camera selection for creating the video
- Audio handover: microphone selection for dubbing the video
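The sketch below illustrates one way the footstep segmentation could work, greedily assigning each floor sensor activation to the nearest recently active track and opening a new track (a new person) when none is close enough. The input format and both thresholds are assumptions for illustration, not details taken from this work.

from dataclasses import dataclass, field

MAX_STEP_GAP_S = 1.5   # assumed: max time between consecutive footsteps
MAX_STEP_DIST_M = 1.0  # assumed: max stride length

@dataclass
class Track:
    steps: list = field(default_factory=list)  # [(t_sec, x_m, y_m), ...]

def segment_footsteps(activations):
    """activations: time-ordered (t_sec, x_m, y_m) floor sensor events.
    Returns one Track per person found in the data."""
    tracks = []
    for t, x, y in activations:
        best, best_d = None, None
        for tr in tracks:
            lt, lx, ly = tr.steps[-1]
            if t - lt > MAX_STEP_GAP_S:
                continue  # track has gone quiet; not a candidate
            d = ((x - lx) ** 2 + (y - ly) ** 2) ** 0.5
            if d <= MAX_STEP_DIST_M and (best_d is None or d < best_d):
                best, best_d = tr, d
        if best is None:          # no nearby live track: a new person
            best = Track()
            tracks.append(best)
        best.steps.append((t, x, y))
    return tracks

Each resulting track is a time-stamped position sequence for one person; video handover can then pick, at each instant, the camera whose view covers the track's current position, and audio handover the nearest microphone.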
Key Frame Extraction
- Creates a complete and compact summary of the retrieved video
- Several techniques evaluated by an experiment
- Adaptive spatio-temporal sampling reached up to 80% accuracy (sketched below)
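The sketch below shows one plausible form of the temporal part of adaptive sampling, assuming mean absolute frame difference as the activity measure (the exact scoring used in this work is not spelled out here): key frames are placed so that each consecutive pair spans roughly equal cumulative activity, sampling busy intervals densely and static ones sparsely.

import numpy as np

def key_frames(frames, n_keys=10):
    """frames: (T, H, W) grayscale array; returns key frame indices."""
    diffs = np.abs(np.diff(frames.astype(np.float32), axis=0)).mean(axis=(1, 2))
    cum = np.concatenate([[0.0], np.cumsum(diffs)])  # cumulative activity
    targets = np.linspace(0.0, cum[-1], n_keys)      # equal activity quanta
    return np.unique(np.searchsorted(cum, targets))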
Interaction Detection
- Predefined regions of interest (ROI), using multiple views to minimize error due to occlusion
- Change detection in each ROI using color histograms (a sketch follows this list)
- Heuristics and context data for event detection
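A minimal sketch of the ROI change test, assuming RGB frames as numpy arrays; the bin count, L1 distance, and threshold are placeholder choices, not necessarily those used in this work.

import numpy as np

BINS = 8          # histogram bins per color channel (assumed)
THRESHOLD = 0.3   # L1 histogram distance that signals a change (assumed)

def roi_histogram(frame, roi):
    """frame: (H, W, 3) uint8 RGB image; roi: (y0, y1, x0, x1) bounds."""
    y0, y1, x0, x1 = roi
    pixels = frame[y0:y1, x0:x1].reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=(BINS,) * 3, range=((0, 256),) * 3)
    return hist.ravel() / hist.sum()

def changed(frame_a, frame_b, roi):
    """Flag a change when the ROI's color distribution shifts noticeably.
    Running the same test on the same object in several camera views,
    as described above, reduces false alarms caused by occlusion."""
    d = np.abs(roi_histogram(frame_a, roi) - roi_histogram(frame_b, roi)).sum()
    return d > THRESHOLD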
User Interaction
- Interactive querying via a graphical user interface
- Results grouped by persons and locations (illustrated below)
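For illustration only, assuming each retrieved clip carries person and location metadata, the grouping shown in the interface could be produced along these lines:

from collections import defaultdict

def group_results(results):
    """results: iterable of (person, location, clip_id) tuples.
    Returns {person: {location: [clip_id, ...]}} for display."""
    grouped = defaultdict(lambda: defaultdict(list))
    for person, location, clip_id in results:
        grouped[person][location].append(clip_id)
    return grouped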
Conclusion
- Video retrieval and summarization in a home environment
- Footstep segmentation, with video and audio handover, for video synthesis
- Key frame extraction for video summarization
- Image analysis for interaction detection
Demonstration
- Retrieval from a day of the real-life experiment
- Please have a look and provide feedback!
Email: (chamds, byon, yamasaki, aizawa)_at_hal.k.u-tokyo.ac.jp
This work is supported in part by the National Institute of Information and Communications Technology, Japan.