Feature Selection using Mutual Information - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Feature Selection using Mutual Information
  • SYDE 676 Course Project
  • Eric Hui
  • November 28, 2002

2
Outline
  • Introduction: prostate cancer project
  • Definition of ROI and Features
  • Estimation of PDFs using Parzen Density
    Estimation
  • Feature Selection using MI-Based Feature
    Selection (MIFS)
  • Evaluation of Selection using Generalized
    Divergence
  • Conclusions

3
Ultrasound Image of Prostate
4
Prostate Outline
5
Guesstimated Cancerous Region
6
Regions of Interest (ROI)
7
Features as Mapping Functions
  • Mapping from image space to feature space

8
Parzen Density Estimation
  • Histogram bins give a poor estimate with the
    limited data available.
  • Parzen density estimation gives a reasonable
    approximation with limited data.
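The smoothing idea behind the Parzen estimate can be sketched in a few lines of NumPy (an illustrative sketch only; the function name `parzen_pdf` and the Gaussian kernel width `h` are my own choices, not taken from the presentation):

```python
import numpy as np

def parzen_pdf(samples, x, h=0.5):
    """Parzen (kernel) density estimate of a 1-D PDF.

    Each training sample contributes a Gaussian kernel of width h;
    averaging the kernels yields a smooth estimate that integrates
    to 1 even when only a handful of samples is available.
    """
    samples = np.asarray(samples, dtype=float)
    x = np.atleast_1d(np.asarray(x, dtype=float))
    # One Gaussian kernel per sample, evaluated at every query point.
    diffs = (x[:, None] - samples[None, :]) / h
    kernels = np.exp(-0.5 * diffs**2) / (h * np.sqrt(2.0 * np.pi))
    return kernels.mean(axis=1)

# Four samples are far too few for a stable histogram, but the Parzen
# estimate is already a smooth, integrable density.
data = np.array([1.0, 1.2, 2.8, 3.1])
print(parzen_pdf(data, np.linspace(0.0, 4.0, 5)))
```

With a histogram, four samples would land in a few bins and leave the rest empty; the kernel average spreads each observation out, which is the "reasonable approximation with limited data" the slide refers to.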

9
Features
  • Gray-Level Difference Matrix (GLDM)
    - Contrast
    - Mean
    - Entropy
    - Inverse Difference Moment (IDM)
    - Angular Second Moment (ASM)
  • Fractal Dimension
    - FD
  • Linearized Power Spectrum
    - Slope
    - Y-Intercept
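The five GLDM statistics above are summary measures of the histogram of absolute gray-level differences at a fixed displacement. A minimal sketch, assuming the common GLDM definitions and an 8-bit image (the function name, displacement, and normalization are my assumptions, not the presentation's code):

```python
import numpy as np

def gldm_features(img, dx=1, dy=0, levels=256):
    """GLDM texture features from the gray-level difference histogram.

    Assumes integer gray levels in [0, levels). The difference
    histogram p(i) is built from |img(r, c) - img(r + dy, c + dx)|,
    then summarized by the five standard statistics.
    """
    img = np.asarray(img, dtype=int)
    a = img[: img.shape[0] - dy, : img.shape[1] - dx]
    b = img[dy:, dx:]
    diffs = np.abs(a - b).ravel()          # gray-level differences
    p = np.bincount(diffs, minlength=levels).astype(float)
    p /= p.sum()                           # normalize to a PDF p(i)
    i = np.arange(levels)
    nz = p > 0
    return {
        "contrast": float(np.sum(i**2 * p)),
        "mean": float(np.sum(i * p)),
        "entropy": float(-np.sum(p[nz] * np.log2(p[nz]))),
        "idm": float(np.sum(p / (1 + i**2))),   # inverse difference moment
        "asm": float(np.sum(p**2)),             # angular second moment
    }
```

On a perfectly uniform region all differences are zero, so contrast and entropy vanish while ASM and IDM reach their maximum of 1; textured regions move each statistic in the opposite direction.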

10
P(X | C = Cancerous), P(X | C = Benign), and P(X)
11
Entropy and Mutual Information
  • Mutual Information I(C; X) measures the degree of
    interdependence between X and C.
  • Entropy H(C) measures the degree of uncertainty
    of C.
  • I(X; C) = H(C) - H(C|X).
  • I(X; C) ≤ H(C), so H(C) is the upper bound.
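These identities can be checked numerically. The sketch below estimates I(C; X) by discretizing X into histogram bins for simplicity, whereas the presentation uses Parzen estimates; the function names are hypothetical:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a flattened probability table."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(x, c, bins=16):
    """I(C; X) estimated from samples, with X discretized into bins.

    Computed as I = H(C) + H(X) - H(X, C), which is algebraically
    the same as H(C) - H(C|X).
    """
    edges = np.histogram_bin_edges(x, bins=bins)
    x_bin = np.digitize(x, edges[1:-1])        # bin index in 0..bins-1
    labels = {lab: j for j, lab in enumerate(np.unique(c))}
    joint = np.zeros((bins, len(labels)))
    for xi, ci in zip(x_bin, c):
        joint[xi, labels[ci]] += 1.0
    joint /= joint.sum()                       # joint PDF p(x, c)
    return entropy(joint.sum(axis=0)) + entropy(joint.sum(axis=1)) \
        - entropy(joint.ravel())
```

For a feature that determines the class exactly, I(C; X) reaches its upper bound H(C) (1 bit for two balanced classes); for a feature independent of the class it drops to 0.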

12
Results: Mutual Information I(C; X)
13
Feature Images - GLDM
14
Feature Images - Fractal Dim.
15
Feature Images - PSD
16
Interdependence between Features
  • Expensive to compute all features.
  • Some features might be similar to each other.
  • Thus, we need to measure the interdependence
    between features, I(Xi; Xj).

17
Results: Interdependence between Features
18
Mutual Information Based Feature Selection (MIFS)
  • Select the first feature with the highest I(C; X).
  • Select the next feature with the highest
    I(C; X) - β Σs I(X; Xs), where the sum runs over
    the already-selected features Xs.
  • Repeat until the desired number of features is
    selected.

19
Mutual Information Based Feature Selection (MIFS)
  • This method takes into account both
    - the interdependence between class and features, and
    - the interdependence between selected features.
  • The parameter β controls how strongly interdependence
    between selected features is penalized.
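The greedy loop described on these two slides can be sketched directly from precomputed MI values (an illustrative sketch: the function and argument names are hypothetical, and the score I(C; X) - β Σ I(X; Xs) is the standard Battiti MIFS criterion):

```python
import numpy as np

def mifs(mi_with_class, mi_between, beta=0.5, k=3):
    """Greedy MIFS over precomputed mutual-information values.

    mi_with_class[i] holds I(C; X_i); mi_between[i][j] holds I(X_i; X_j).
    At each step the feature maximizing
        I(C; X) - beta * sum(I(X; Xs) for already-selected Xs)
    is added, trading relevance against redundancy.
    """
    selected = [int(np.argmax(mi_with_class))]   # highest I(C; X) first
    remaining = [i for i in range(len(mi_with_class)) if i != selected[0]]
    while len(selected) < k and remaining:
        scores = [mi_with_class[i]
                  - beta * sum(mi_between[i][s] for s in selected)
                  for i in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

With β = 0 this reduces to ranking features by I(C; X) alone; larger β steers the selection away from features that are redundant with those already chosen, which is exactly the trade-off the next slide varies.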

20
Varying β in MIFS
21
Generalized Divergence J
  • If the features are biased towards a class, J
    is large.
  • A good set of features should have small J.

22
Results: J with respect to β
  • First feature selected: GLDM ASM
  • Second feature selected

23
Conclusions
  • Mutual Info. Based Feature Selection (MIFS)
  • Generalized Divergence

24
Questions and Comments