Slides 1-3: Panoramic Localization with a Sony Aibo
Jürgen Sturm
University of Amsterdam, Informatics Institute
Slide 4: Context: the RoboCup 4-Legged League
- Mobile robots: Sony Aibo robots
- Soccer games: 4 vs. 4 robots, playing fully autonomously
Slide 5: RoboCup@Home
- Real-world applications
- Human-machine interaction
- Fully autonomous robots have to master challenges in unknown, unstructured environments
- Follow a human, navigate, etc.
Slide 7: Traditional approaches
The problem: mobile robot localization (estimating the robot's position).
- The Aibos of the 4-Legged League use landmarks with
  - known positions,
  - known shape, and
  - known color (manual calibration taking hours)
- General solutions (SLAM) use better hardware:
  - laser range finders
  - omnidirectional cameras
  - robots with better odometry (wheels)
Slide 8: Features of the new approach
- Real-time localization on a Sony Aibo
- Takes advantage of the natural features of a room
- Independence from artificial landmarks
- Auto-calibration in new environments
- Idea: learn a panoramic model of the robot's surroundings and use it for localization
Slide 9: Color clustering
- Collect interesting colors (around the robot)
- Determine the 10 most characteristic colors, using an EM clustering algorithm (see the sketch below)
- Input: raw image (208x160, YCbCr)
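The color-clustering step can be pictured with the short Python sketch below. It treats sampled YCbCr pixels as a 10-component Gaussian mixture fitted with EM; the slides only say "an EM clustering algorithm", so the mixture model, the sample size, and the helper names (characteristic_colors, classify_pixels) are illustrative assumptions, not the original code.

```python
# Minimal sketch of the color-clustering step (assumptions noted above).
import numpy as np
from sklearn.mixture import GaussianMixture

def characteristic_colors(frame_ycbcr, n_colors=10, n_samples=2000, seed=0):
    """frame_ycbcr: (H, W, 3) uint8 YCbCr image -> (n_colors, 3) cluster centres."""
    rng = np.random.default_rng(seed)
    pixels = frame_ycbcr.reshape(-1, 3).astype(np.float64)
    sample = pixels[rng.choice(len(pixels), size=n_samples, replace=False)]
    gmm = GaussianMixture(n_components=n_colors, covariance_type="diag",
                          random_state=seed).fit(sample)
    return gmm.means_                   # one YCbCr centre per characteristic color

def classify_pixels(frame_ycbcr, centres):
    """Label every pixel with the index of its nearest characteristic color."""
    pixels = frame_ycbcr.reshape(-1, 3).astype(np.float64)
    dists = np.linalg.norm(pixels[:, None, :] - centres[None, :, :], axis=2)
    return dists.argmin(axis=1).reshape(frame_ycbcr.shape[:2])
```

classify_pixels then turns every 208x160 frame into a map of color-class indices, which is what the sector statistics on the next slide operate on.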
Slide 10: Sector appearance
- Approach: build a virtual panoramic wall around the robot
- Divide it into vertical slices, called sectors (360 degrees correspond to 80 sectors)
- Count the color transitions per sector, between the 10 most characteristic colors of the scene (sketched below)
- Input: raw image (208x160, YCbCr)
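A minimal sketch of the transition counting, continuing from the pixel labels above. The slides do not say in which direction transitions are counted; this sketch assumes vertically adjacent pixels inside equally wide image slices, and the function name and default values are assumptions.

```python
# Per-sector 10x10 color-transition counts for one camera frame.
import numpy as np

def sector_transition_counts(label_img, n_sectors_in_view=12, n_colors=10):
    """label_img: (H, W) color-class indices (e.g. from classify_pixels)."""
    h, w = label_img.shape
    bounds = np.linspace(0, w, n_sectors_in_view + 1).astype(int)
    counts = np.zeros((n_sectors_in_view, n_colors, n_colors), dtype=np.int32)
    for s in range(n_sectors_in_view):
        sec = label_img[:, bounds[s]:bounds[s + 1]]
        upper, lower = sec[:-1, :], sec[1:, :]   # vertically adjacent pixel pairs
        changed = upper != lower                 # ignore runs of the same color
        np.add.at(counts[s], (upper[changed], lower[changed]), 1)
    return counts
```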
Slide 11: Learning the panorama model
- Image features: 10-12 sectors per image, 10x10 transition frequencies per sector
- Learn the panorama model: estimate the frequency distributions per sector (see the sketch below)
- Panorama model: 80 sectors, 10x10 distributions, each defined by 5 bins
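The panorama model itself can be sketched as a table of 5-bin histograms, one per world sector and color pair. The bin edges and the odometry-based mapping from image sectors to world sectors are assumptions; the slides only specify the 80x10x10x5 shape.

```python
# Sketch of the panorama-model update under the assumptions stated above.
import numpy as np

N_SECTORS, N_COLORS, N_BINS = 80, 10, 5
BIN_EDGES = np.array([0, 1, 3, 7, 15, np.inf])   # assumed count ranges per bin

class PanoramaModel:
    def __init__(self):
        # histogram over transition-count bins, per world sector and color pair
        self.hist = np.zeros((N_SECTORS, N_COLORS, N_COLORS, N_BINS))

    def update(self, sector_counts, first_world_sector):
        """sector_counts: (k, 10, 10) counts from one frame; first_world_sector
        maps the frame's leftmost sector onto the 80-sector panorama."""
        ii, jj = np.meshgrid(range(N_COLORS), range(N_COLORS), indexing="ij")
        for k, counts in enumerate(sector_counts):
            world = (first_world_sector + k) % N_SECTORS
            bins = np.digitize(counts, BIN_EDGES) - 1    # (10, 10) bin indices
            self.hist[world, ii, jj, bins] += 1

    def distributions(self):
        """Normalised frequency distributions, shape (80, 10, 10, 5)."""
        total = self.hist.sum(axis=-1, keepdims=True)
        return self.hist / np.maximum(total, 1)
```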
Slide 12: Alignment and localization
Example: robot rotated 45 degrees to the left, after learning from 131 frames.
- Image features: 10-12 sectors per image, 10x10 transition frequencies per sector
- Align the frame with the stored panorama model by finding the shortest path (a simplified sketch follows below)
- Output: rotational estimate, signal-to-noise ratio, confidence range
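For illustration, the sketch below scores all 80 candidate headings against the learned distributions and returns the best one together with a crude signal-to-noise figure. The original alignment uses a shortest-path search, so this exhaustive-shift version is a simplified stand-in rather than the actual algorithm.

```python
# Simplified alignment: score every candidate heading by log-likelihood.
import numpy as np

def estimate_rotation(model_dist, frame_counts, bin_edges, n_colors=10):
    """model_dist: (80, 10, 10, 5) from PanoramaModel.distributions()
    frame_counts: (k, 10, 10) transition counts of the current frame."""
    n_sectors = model_dist.shape[0]
    k = frame_counts.shape[0]
    bins = np.digitize(frame_counts, bin_edges) - 1           # (k, 10, 10)
    ii, jj = np.meshgrid(range(n_colors), range(n_colors), indexing="ij")
    scores = np.empty(n_sectors)
    for shift in range(n_sectors):                            # candidate headings
        sectors = (shift + np.arange(k)) % n_sectors
        p = model_dist[sectors[:, None, None], ii, jj, bins]  # (k, 10, 10)
        scores[shift] = np.log(p + 1e-6).sum()                # frame log-likelihood
    best = int(scores.argmax())
    heading_deg = best * 360.0 / n_sectors
    snr = (scores.max() - scores.mean()) / (scores.std() + 1e-9)
    return heading_deg, snr            # rotational estimate + crude confidence
```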
Slide 13: Experiments in human environments
- Rotational test in a living room (at night)
- Result: the appearance of unknown, unstructured environments can be learned
Slide 14: Translational test on soccer fields
- Human soccer field, outdoors, single learned spot
- 4-Legged soccer field, indoors, single learned spot
Slide 15: Multi-spot learning
- Aibo trained on 3x3 different spots, yielding 9 different panoramas
- Aibo kidnapped and placed back at arbitrary positions on the field
- Aibo tries to walk back to the center spot (a spot-selection sketch follows below)
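One plausible reading of the multi-spot experiment is sketched below: learn one panorama model per training spot and take the robot's rough absolute position to be the training spot whose model matches the current frame best (reusing estimate_rotation from the alignment sketch above). The selection rule and the names used here are assumptions, not a description of the original code.

```python
# Rough absolute localization over several trained spots (assumed selection rule).
import numpy as np

def localize_over_spots(spot_models, spot_positions, frame_counts, bin_edges):
    """spot_models: list of (80, 10, 10, 5) distributions, one per trained spot
    spot_positions: list of (x, y) coordinates of the training spots."""
    best_spot, best_score, best_heading = None, -np.inf, None
    for idx, model_dist in enumerate(spot_models):
        heading, snr = estimate_rotation(model_dist, frame_counts, bin_edges)
        if snr > best_score:                 # best-matching panorama wins
            best_spot, best_score, best_heading = idx, snr, heading
    return spot_positions[best_spot], best_heading, best_score
```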
Slide 16: Possibilities for the 4-Legged League
- Getting rid of all artificial landmarks
- 11 vs. 11 games (on a bigger field)
- Outdoor demonstrations become possible

Conclusions
Slide 17: Possible usage in the RoboCup@Home league
- Distinguish the living room from the kitchen or the garden
- Rough but quick map building
- Find the relative position of the TV, stove, etc. on this map
Slide 18: Other applications
- CareBot navigation in a closed indoor environment
- Mobile applications (for example on cellular phones) for quick positional estimates, e.g. for tourism
Slide 19: Conclusions
- Accurate estimate of the rotation from a single learned spot (up to 40 meters)
- Good estimate of the relative distance from a single learned spot (up to 40 meters)
- Rough estimate of the absolute position from multiple trained spots
Slide 20: Panoramic Localization with a Sony Aibo
Jürgen Sturm, University of Amsterdam, Informatics Institute

User manual:
- The head button always resets the robot and triggers the autoshutter color clustering
- Press the front button to trigger color clustering manually
- In training mode:
  - Press the middle button to start learning the first spot
  - Press the middle button again to continue learning on 8 more spots
  - Press the back button to switch to localization mode
- In localization mode:
  - Press the front button to switch between rotational and translational mode
  - Press the middle button to reset the panorama and start learning
  - Press the back button to switch between find and set-reference mode