Title: SpARC
1. SpARC: Supplementary Assistance for Rowing Coaching
- Simon Fothergill
- Ph.D. student, Digital Technology Group, Computer Laboratory
DTG Monday Meeting, 9th November 2009
2. Overview
- Automated Physical Performance Assessment
- Basic real-time feedback
- Performance similarity
- Fault recognition
- Data annotation
- Questions
3. Automated Physical Performance Assessment
- Sense and optimise
- Automatically provide feedback to athletes on their technique, including how good they are and how they could improve.
- Combining coaches and surrogate coaches
- Analysis of the kinetics of a physical performance
- Focusing on (indoor) rowing
- Other relevant domains
4. Overview
- Automated Physical Performance Assessment
- Basic real-time feedback
- Data capture
- User interface
- Conclusions
- Performance similarity
- Fault recognition
- Data annotation
- Questions
5. Data Capture System
- Aim
- Record a data set.
- Allow annotation by expert, professional coaches
- Provide basic feedback: local real-time, and distributed post-session.
- Data set requirements
- Real
- Convincing, useful evaluation of algorithms
- Investigate domain
- Large
- Segmented
- Labelled
- Synchronised
- High fidelity
- Data capture system requirements
- Portable
- Cheap
- Physically robust
- Extensible platform
- Equipment augmentation
6. Data Capture System: Previous work
- Systems
- Bat system
- Inertial sensors
- Phasespace
- Coda
- Vicon
- Motion capture with Wii controllers
- StrideSense
- Concept 2 Performance Monitor
- Datasets
- Generic actions
- 2D videos of sports performances
7. Data Capture System: SpARC (0)
8. Data Capture System: WMCS
- 3D motion capture system with Wii controllers (the triangulation step is sketched after this list)
- Based on the work below, which showed that 3D motion capture with reasonable local precision is possible using 2 Wii controllers.
- Reference: Simon Hay, Joseph Newman and Robert Harle, "Optical tracking using commodity hardware", 7th IEEE and ACM International Symposium on Mixed and Augmented Reality, 2008
- Standalone from Matlab, improved performance
- Synchronisation of Wii controllers
- Robust software
- Command-line tool/library
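Underlying the WMCS is standard two-view triangulation: each IR camera reports a 2D blob position, and with both cameras' projection matrices known from stereo calibration, a linear (DLT) solve recovers the 3D point. A minimal Python sketch, with toy projection matrices standing in for the real calibration:

```python
# Minimal sketch: linear (DLT) triangulation of one IR blob seen by two
# calibrated cameras. P1 and P2 are toy 3x4 projection matrices, not the
# real values recovered by SpARC's stereo calibration.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its 2D projections (x1, x2) in two cameras.

    Builds the homogeneous system A X = 0 from x = P X and solves it by SVD.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]              # null-space vector = homogeneous 3D point
    return X[:3] / X[3]     # dehomogenise

# Toy cameras: identity camera and one translated along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
print(triangulate(P1, P2, (0.0, 0.0), (-0.5, 0.0)))  # ~ [0, 0, 2]
```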
9. Data Capture System: WMCS (architecture)
[Architecture diagram: a PC (Ubuntu) runs a C server whose Wii library interface pulls from a LIFO buffer filled by the Wii library, which runs one thread per Wii controller over a Bluetooth library, Bluetooth adapter and serial port. Two Nintendo Wii controllers, each with power control and a 1024x768 IR camera (100 Hz), connect over Bluetooth via a Wii controller bridge.]
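The LIFO buffer in the diagram suits real-time feedback: the consumer always wants the freshest sample and can afford to skip stale ones. A minimal Python sketch of the pattern (read_ir_sample is a hypothetical stand-in for the real Wii library call):

```python
import collections
import random
import threading
import time

# Bounded buffer: old samples fall off the far end automatically.
buffer = collections.deque(maxlen=256)

def read_ir_sample(controller_id):
    # Hypothetical stand-in for the real Wii library's 100 Hz IR blob read.
    return (controller_id, time.time(), random.random(), random.random())

def controller_thread(controller_id):
    # One producer thread per Wii controller, as in the diagram.
    while True:
        buffer.append(read_ir_sample(controller_id))
        time.sleep(0.01)  # ~100 Hz camera rate

for cid in (0, 1):
    threading.Thread(target=controller_thread, args=(cid,), daemon=True).start()

# Consumer: pop() takes the newest sample first, so feedback never lags.
time.sleep(0.1)
for _ in range(3):
    if buffer:
        print(buffer.pop())
    time.sleep(0.02)
```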
10. Data Capture System: C2PM3
[Architecture diagram: a PC (Ubuntu) runs a C server whose C2PM3 library interface talks via libUSB and the USB port to the buffer of a Concept 2 Performance Monitor v3, attached to the erg (with flywheel).]
11. Data Capture System: StrideSense
[Architecture diagram: a PC (Ubuntu) runs a Java client that connects via TCP/IP over the usb0 network interface and USB port (which also supplies power) to a Crossbow iMote2 running strideSenseADCServer.c; the iMote2's ADC samples two force-sensitive resistors (FSRs) through a power board.]
- ( http://sourceforge.net/projects/stridesense/ )
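On the PC side, the client boils down to reading a TCP stream of ADC samples. A hedged Python sketch of that stage; the address, port and two-uint16 frame format are assumptions rather than the actual strideSenseADCServer.c protocol, and the real client is written in Java:

```python
import socket
import struct

HOST, PORT = "192.168.0.2", 5000  # hypothetical usb0 address of the iMote2

with socket.create_connection((HOST, PORT)) as sock:
    stream = sock.makefile("rb")
    while True:
        raw = stream.read(4)  # assumed frame: two little-endian uint16 counts
        if len(raw) < 4:
            break
        left, right = struct.unpack("<HH", raw)
        print(f"footplate force (ADC counts): left={left} right={right}")
```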
12. Data Capture System: EMCS (1)
- EMCS needs to track the handle (1), the seat (1) and the erg position & orientation (4)
- WMCS currently limited to 4 LEDs
- Use 1 LED as a stationary point on the erg and 2 LEDs on the seat at different points in time
- Use PCA to extract the ECS axes (sketched below)
[Photos: the ECS (Erg Coordinate System); two LEDs attached to the seat; the erg clamped to the camera rig to minimise error.]
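Since the seat only travels along the slide, the first principal component of its positions over time gives the fore-aft ECS axis. A minimal sketch on synthetic data; the real input is the triangulated seat-LED track:

```python
# Sketch of using PCA to recover the fore-aft Erg Coordinate System axis
# from seat-LED positions recorded over time. Synthetic data stands in for
# the triangulated track.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
rail_dir = np.array([0.9, 0.1, 0.1])
rail_dir /= np.linalg.norm(rail_dir)
seat = np.outer(t, rail_dir) + rng.normal(scale=0.002, size=(200, 3))

centred = seat - seat.mean(axis=0)
_, _, Vt = np.linalg.svd(centred, full_matrices=False)
fore_aft = Vt[0]   # first principal axis = slide direction
print(fore_aft)    # close to rail_dir (up to sign)
```

The stationary erg LED can then anchor the ECS origin, with the remaining axes completed orthogonally.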
13. Data Capture System: EMCS (2)
[Pipeline diagram spanning server, client and storage. Calibration: stereo calibration and WMCS calibration (openCV); a calibration labeller tags the end LED, handle LED and seat LEDs; erg calibration derives the ECS. Live operation: 4 x 2D coordinates come in; markers are labelled; triangulation produces 4 x 3D coordinates; the ECS is updated if necessary and the points are transformed into it. Data from one Wii controller IR camera is used in computing the correspondence of LEDs between the cameras.]
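One standard way to compute that correspondence is the epipolar constraint from stereo calibration: a blob in one camera must lie near the epipolar line induced by its partner in the other. A sketch; the fundamental matrix here is a toy one for a rectified pair, not the calibrated SpARC value:

```python
# Sketch of labelling marker correspondences between the two IR cameras:
# score each candidate pairing by its epipolar error and keep the best
# match per blob. F is a toy fundamental matrix for a rectified pair.
import numpy as np

def epipolar_error(F, x1, x2):
    """Point-line distance of x2 from the epipolar line F @ x1."""
    l = F @ np.array([x1[0], x1[1], 1.0])
    return abs(l @ np.array([x2[0], x2[1], 1.0])) / np.hypot(l[0], l[1])

def match_blobs(F, blobs1, blobs2):
    return {i: int(np.argmin([epipolar_error(F, b1, b2) for b2 in blobs2]))
            for i, b1 in enumerate(blobs1)}

F = np.array([[0, 0, 0], [0, 0, -1], [0, 1, 0]], float)  # rectified pair
print(match_blobs(F, [(100, 50), (200, 300)], [(90, 298), (140, 52)]))
# -> {0: 1, 1: 0}: blobs pair up along shared epipolar lines (image rows here)
```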
14. Data Capture System: SpARC (1)
[Architecture diagram: a PC (Ubuntu) in the boathouse, with monitor, keyboard and clock, runs a C server (camera interface via libdc1394 over firewire) and a Java client connected by TCP/IP. Inputs: a camera (rower), the C2PM3 (handle force), StrideSense (footplate forces) and the WMCS (handle & seat motion). Data is uploaded over ssh to a database on a PC in the Computer Lab, where batch processing creates user-requested videos, fetched via scp and viewed in a web browser on the athlete's or coach's PC.]
15. Data Capture System: SpARC (2)
[Pipeline diagram spanning server, client and file server (CL). Live operation: detect strokes; create directories; turn the camera on/off; transmit data; record the user code; display handle & seat coordinates, handle force and stroke boundaries on the GUI; log the motion & force data and images. Post-session: split the data into strokes; augment and select; encode videos; create metadata; ship the data, videos and video metadata to the file server to create the user videos and update the database.]
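The stroke-detection stage can be as simple as finding catch positions as local minima of the handle's fore-aft coordinate and cutting the session between consecutive catches. A sketch on a synthetic 25 strokes-per-minute signal; the real input is the WMCS handle track, and the actual SpARC detector may differ:

```python
# Sketch of the "detect strokes" stage: find the catches as local minima of
# the handle's fore-aft coordinate, then split the session between them.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                     # Hz, WMCS camera rate
t = np.arange(0, 30, 1 / fs)
handle_x = -np.cos(2 * np.pi * (25 / 60) * t)  # synthetic 25 spm; catch at minima

# Catches are peaks of the negated signal; enforce a minimum stroke period.
catches, _ = find_peaks(-handle_x, distance=fs * 1.0)
strokes = [slice(a, b) for a, b in zip(catches[:-1], catches[1:])]
print(f"{len(strokes)} strokes detected")
```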
16. Data Capture System: SpARC (3)
17. User interface: SpARC (1)
18. User interface: SpARC (2)
- http://www-dyn.cl.cam.ac.uk/jsf29/
19. Conclusions (1)
- General
- It works!
- It is being used, and collects on average 1 session of 5 strokes every 2 or 3 days
- Users
- "Cool!", "Works", "Has potential"
- Some people are very apprehensive about using it (reassurance is required).
- Being able to see what you're doing, as well as feel it, has been observed to facilitate communication and understanding of correct technique
- Although rowing technique is complex, the system has a steep but short learning curve
- Athletes require a very simple interface: they won't even see half the screen, and definitely won't read anything
- Elite athletes will favour raw signal feedback; novices would be aided by hints and tips
- Force is as important as motion in the feedback.
20. Conclusions (2)
- Technical
- At the limit of WMCS range (accuracy and precision)
- WMCS won't work in bright sunlight
- Hand covering the LED on the handle
- Correspondence: unnecessarily vigorous rowing upsets the algorithms, which could be improved (domain-specific, e.g. scan; generic, e.g. epipolar constraints)
- ECS updated infrequently
- More force sensors on the heels of the feet
- openCV is buggy
- General
- Basic signals or sophisticated interpretation? Manually constructing rules is hard and unhelpful
- Every sport has specific requirements
- Even general system fidelity does show up interesting shapes and artefacts in rowing
- Sensor signals BECOME the ontology
21. Conclusions (3)
- General
- Developed a novel and functional system, and gained experience of deploying it and of what is possible to achieve.
- It enables further useful and convincing work to be done
- Useful dataset
- Platform for other systems to be built with
- Contributes to the age of useful data sets by setting a benchmark for dataset scale and detail.
- The experience is applicable in other domains of physical performance, and a lot of the work could be reused
22. Further work
- Use metrics to quantitatively measure the consistency of a performance under different levels (and modes) of real-time feedback (one candidate metric is sketched below)
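A hedged sketch of one candidate metric, an illustrative choice rather than anything taken from SpARC: resample every stroke's handle trajectory to a common length and report the mean pointwise standard deviation across strokes, so that lower scores mean more consistent rowing.

```python
import numpy as np

def consistency(strokes, n=100):
    """strokes: list of 1D arrays (e.g. handle x per stroke) of varying length."""
    resampled = np.stack([
        np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(s)), s)
        for s in strokes
    ])
    # Pointwise spread across strokes, averaged over the stroke cycle.
    return resampled.std(axis=0).mean()

rng = np.random.default_rng(1)
base = np.sin(np.linspace(0, np.pi, 120))
strokes = [base[: rng.integers(100, 120)] + rng.normal(0, 0.05) for _ in range(10)]
print(f"consistency score: {consistency(strokes):.4f}")
```

Comparing this score across feedback conditions would give the quantitative measure the slide proposes.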
23. Overview
- Automated Physical Performance Assessment
- Basic real-time feedback
- Performance similarity
- Fault recognition
- Data annotation
- Questions
24. Performance similarity (1)
- Overall quality
- Quantitatively measure the difference between an ideal and a given performance
- Motivation: closed sports, muscle memory, practising consistently good strokes
- Cue and motivate the athlete
- Problems include the definition of ideal (currently the coach) and anthropometric invariance
- Similarity is VERY hard to define.
25. Performance similarity (2)
- Approach
- Trajectories can differ in various ways
- Objective or subjective (from coaches) arguments for how similar pairs of trajectories differing in these ways are.
- Form hypotheses about likely algorithms (e.g. the DTW sketch below)
- Test on a dataset labelled with similarity defined by a coach
- See how necessary various aspects of the algorithms are
- Artefacts exist in the real dataset
- Correlations between shape & speed
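One such hypothesis is that dynamic time warping captures part of what coaches mean by similarity, since it separates differences in shape from differences in speed along the stroke; the coach-labelled dataset is what would confirm or refute it. A minimal sketch:

```python
# Sketch of one candidate similarity algorithm: dynamic time warping, which
# tolerates speed differences along the stroke while penalising shape changes.
import numpy as np

def dtw(a, b):
    """DTW distance between two trajectories (arrays of shape [n, d])."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, np.pi, 80)
warped = t ** 1.2 / np.pi ** 0.2   # same path, different timing
ideal = np.column_stack([np.sin(t), np.cos(t)])
slow = np.column_stack([np.sin(warped), np.cos(warped)])
print(dtw(ideal, slow))            # small: same shape, different speed
print(dtw(ideal, ideal[::-1]))     # large: reversed trajectory
```

DTW is deliberately invariant to timing differences, so a separate speed comparison would still be needed where timing itself matters (cf. the shape & speed correlations above).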
26. Overview
- Automated Physical Performance Assessment
- Basic real-time feedback
- Performance similarity
- Fault recognition
- Data annotation
- Questions
27. Fault recognition (ongoing work)
- Preliminary results have been obtained using a dataset of 6 rowers and the complete trajectory of the erg handle only. Binary classification over stroke quality was done using spatio-temporal features of the handle's trajectory and a neural network (sketched below). Two training methods were compared.
[Figure: classification accuracy across a given number of performers, for the quality of individual aspects of technique.]
- Progress: waiting for enough data to make it worth running more algorithms
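A hedged sketch of the general shape of this setup: derive a few spatio-temporal features from each handle trajectory and train a small neural network as a binary quality classifier. The features, network size and synthetic "wobbly = faulty" data below are stand-ins, not the real 6-rower dataset or the two training methods compared:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def features(traj):
    """traj: [n, 2] handle path. Toy features: path length, extents, mean speed."""
    step = np.diff(traj, axis=0)
    speed = np.linalg.norm(step, axis=1)
    return [speed.sum(), np.ptp(traj[:, 0]), np.ptp(traj[:, 1]), speed.mean()]

rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(100):
        t = np.linspace(0, np.pi, 60)
        wobble = (0.15 if label else 0.02) * rng.normal(size=60)  # label 1 = faulty
        X.append(features(np.column_stack([t, np.sin(t) + wobble])))
        y.append(label)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(Xtr, ytr)
print(f"held-out accuracy: {clf.score(Xte, yte):.2f}")
```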
28. Overview
- Automated Physical Performance Assessment
- Basic real-time feedback
- Performance similarity
- Fault recognition
- Data annotation
- Questions
29. Annotating data
- http://www-dyn.cl.cam.ac.uk/jsf29/
30. Concluding remarks
- Three generations of machine learning (taxonomy by Prof. Chris Bishop):
- Rules don't work
- Statistics is limited, especially for recognition across different people
- Structure (time, segment selection)
31. Acknowledgements
- Rob Harle (advice)
- Brian Jones (EMCS, driving the van)
- Marcelo Pias & Salman Taherian (StrideSense)
- SeSAME (context)
- Simon Hay (Wii controllers, van driving)
- Andrew Lewis (C2PM3 code advice)
- Jesus College Boat Club, fellows and IT department (environment and support)
- i-Teams (who are investigating any commercial potential of SpARC)
32. Questions
- Please come and use the system!
- Thank you!
- Any questions?