1
A Practical Guide to SVM
  • Yihua Liao
  • Dept. of Computer Science
  • 2/3/03

2
Outline
  • Support vector machine basics
  • GIST
  • LIBSVM (SVMLight)

3
Classification problems
  • Given: n training pairs (xi, yi), where
  • xi = (xi1, xi2, …, xil) is an input vector, and
    yi = +1/-1 is the corresponding classification (H+/H-)
  • Output: a label y for a new vector x

4
Support vector machines
Goal: to find the discriminator that maximizes the
margin
5
A little math
  • Primal problem
  • Decision function
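
The equations on this slide were images and did not survive extraction; the standard soft-margin primal problem and kernel decision function (a reconstruction in conventional notation, not necessarily the slide's exact formulas) are:

```latex
% Primal problem: maximize the margin (minimize \|w\|), with slack
% variables \xi_i and misclassification penalty C.
\min_{\mathbf{w},\,b,\,\xi}\;\; \frac{1}{2}\lVert \mathbf{w}\rVert^{2}
  + C \sum_{i=1}^{n} \xi_i
\quad\text{subject to}\quad
y_i\,(\mathbf{w}\cdot\mathbf{x}_i + b) \ge 1 - \xi_i,
\qquad \xi_i \ge 0,\; i = 1,\dots,n

% Decision function, in terms of the dual variables \alpha_i
% and a kernel K:
f(\mathbf{x}) = \operatorname{sign}\!\Big(
  \sum_{i=1}^{n} \alpha_i\, y_i\, K(\mathbf{x}_i, \mathbf{x}) + b \Big)
```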

6
Example
  • Functional classification of yeast genes based
    on DNA microarray expression data.
  • Training dataset
  • positive examples: genes that are known to have the function f
  • negative examples: genes that are known to have a different
    function than f

7
Gist
  • http://microarray.cpmc.columbia.edu/gist/
  • Developed by William Stafford Noble et al.
  • Contains tools for SVM classification, feature
    selection and kernel principal components
    analysis.
  • Linux/Solaris. Installation is straightforward.

8
Data files
  • sample.mtx (tab-delimited; the test file has the same format)
  • gene       alpha_0X   alpha_7X   alpha_14X   alpha_21X
  • YMR300C    -0.1       0.82       0.25        -0.51
  • YAL003W    0.01       -0.56      0.25        -0.17
  • YAL010C    -0.2       -0.01      -0.01       -0.36
  • sample.labels
  • gene       Respiration_chain_complexes.mipsfc
  • YMR300C    -1
  • YAL003W    1
  • YAL010C    -1

9
Usage of Gist
  • compute-weights -train sample.mtx -class
    sample.labels > sample.weights
  • classify -train sample.mtx -learned
    sample.weights -test test.mtx > test.predict
  • score-svm-results -test test.labels test.predict
    sample.weights

10
Test.predict
  • Generated by classify (Gist, version 2.0)
  • …
  • gene classification discriminant
  • YKL197C -1 -3.349
  • YGL022W -1 -4.682
  • YLR069C -1 -2.799
  • YJR121W 1 0.7072

11
Output of score-svm-results
  • Number of training examples: 1644 (24 positive,
    1620 negative)
  • Number of support vectors: 60 (14 positive, 46
    negative), 3.65%
  • Training results: FP=0 FN=3 TP=21 TN=1620
  • Training ROC: 0.99874
  • Test results: FP=12 FN=1 TP=9 TN=801
  • Test ROC: 0.99397

12
Parameters
  • compute-weights
  • -power <value>
  • -radial -widthfactor <value>
  • -posconstraint <value>
  • -negconstraint <value>

13
Rules of thumb
  • The radial basis kernel usually performs better.
  • Scale your data: scale each attribute to [0, 1] or
    [-1, 1] so that attributes in larger numeric ranges
    do not dominate.
  • Try different penalty parameters C for the two
    classes in case of unbalanced data.
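
The scaling rule above can be sketched in plain Python (a hypothetical helper, not part of Gist or LIBSVM): each attribute is mapped linearly to [-1, 1] using its training-set minimum and maximum, and the bounds are kept so the same transform can be reapplied to test data.

```python
def scale_columns(rows):
    """Linearly map each attribute (column) of `rows` to [-1, 1].

    rows: list of equal-length feature lists (the training set).
    Returns (scaled_rows, bounds) where bounds[j] = (min, max) of
    column j, for reuse on the test set.
    """
    cols = list(zip(*rows))
    bounds = [(min(c), max(c)) for c in cols]
    scaled = []
    for row in rows:
        new_row = []
        for v, (lo, hi) in zip(row, bounds):
            if hi == lo:
                new_row.append(0.0)  # constant attribute: map to 0
            else:
                new_row.append(2.0 * (v - lo) / (hi - lo) - 1.0)
        scaled.append(new_row)
    return scaled, bounds
```

Applying training-set bounds to the test set (rather than rescaling the test set independently) is what `svm-scale`'s save/restore options automate.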

14
LIBSVM
  • http://www.csie.ntu.edu.tw/~cjlin/libsvm/
  • Developed by Chih-Jen Lin et al.
  • Tools for (multi-class) SV classification and
    regression.
  • C/Java/Python/Matlab/Perl
  • Linux/UNIX/Windows
  • SMO implementation, fast!!!

15
Data files for LIBSVM
  • Training.dat (each line: label index:value index:value …)
  • 1 1:0.708333 2:1 3:1 4:-0.320755
  • -1 1:0.583333 2:-1 4:-0.603774 5:1
  • 1 1:0.166667 2:1 3:-0.333333 4:-0.433962
  • -1 1:0.458333 2:1 3:1 4:-0.358491 5:0.374429
  • Testing.dat has the same format.
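
The format is sparse: features absent from a line are implicitly zero. A minimal stdlib parser (a sketch, not LIBSVM's own reader) could look like:

```python
def parse_libsvm_line(line):
    """Parse one LIBSVM-format line into (label, {index: value}).

    Indices are 1-based; any index missing from the line is an
    implicit zero, since the format stores only nonzero features.
    """
    parts = line.split()
    label = float(parts[0])
    features = {}
    for item in parts[1:]:
        idx, val = item.split(":")
        features[int(idx)] = float(val)
    return label, features
```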

16
Usage of LIBSVM
  • svm-train -c 10 -w1 1 -w-1 5 Train.dat My.model
  • - trains a classifier with penalty 10 for class 1
    and penalty 50 for class -1, RBF kernel
  • svm-predict Test.dat My.model My.out
  • svm-scale Train_Test.dat > Scaled.dat

17
Output of LIBSVM
  • svm-train
  • optimization finished, #iter = 219
  • nu = 0.431030
  • obj = -100.877286, rho = 0.424632
  • nSV = 132, nBSV = 107
  • Total nSV = 132

18
Output of LIBSVM
  • svm-predict
  • Accuracy = 86.6667% (234/270) (classification)
  • Mean squared error = 0.533333 (regression)
  • Squared correlation coefficient = 0.532639
    (regression)
  • Calculate FP, FN, TP, TN from My.out
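
Tallying those counts takes only a few lines of stdlib Python; `svm-predict` writes one predicted label per line to its output file. The helper below is a sketch (the file names and the +1/-1 label convention are the ones used above; the function itself is not part of LIBSVM):

```python
def confusion_counts(true_labels, predicted_labels):
    """Return (TP, TN, FP, FN) for +1/-1 class labels."""
    tp = tn = fp = fn = 0
    for t, p in zip(true_labels, predicted_labels):
        if p == 1:
            if t == 1:
                tp += 1   # predicted positive, truly positive
            else:
                fp += 1   # predicted positive, truly negative
        else:
            if t == -1:
                tn += 1   # predicted negative, truly negative
            else:
                fn += 1   # predicted negative, truly positive
    return tp, tn, fp, fn

# Reading predictions, e.g.:
#   predicted = [int(line.split()[0]) for line in open("My.out")]
```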