Title: Multitask Learning Using Dirichlet Process
Slide 1: Multitask Learning Using Dirichlet Process
Slide 2: Outline
- Task-defined: infinite mixture of priors
  - Multitask learning
  - Dirichlet process
- Task-undefined: expert network
  - Finite expert network
  - Infinite expert network
Slide 3: Multitask Learning - Common Prior Model
M classification tasks, each with its own weight vector w_m.
All tasks share a common prior over w.
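The common-prior setup can be sketched as follows (a minimal illustration; the dimensions, prior parameters, and `predict` helper are assumptions, not the deck's actual experiment): every task draws its weight vector from one shared Gaussian prior.

```python
import numpy as np

rng = np.random.default_rng(0)

M, d = 4, 2                # M classification tasks, d-dimensional weights (illustrative)
mu0 = np.zeros(d)          # shared prior mean
Sigma0 = np.eye(d)         # shared prior covariance

# Every task draws its weight vector w_m from the same Gaussian prior.
W = rng.multivariate_normal(mu0, Sigma0, size=M)   # shape (M, d)

def predict(m, x):
    """Linear classifier for task m: sign of w_m . x, returned as 0/1."""
    return int(np.dot(W[m], x) >= 0)
```

Because all w_m come from a single Gaussian, information is shared across tasks, but only through one common center.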
Slide 4: Drawback of This Model
Assume each w_m is a two-dimensional vector: a single Gaussian prior pulls every task toward one common mean, which is restrictive when the tasks fall into distinct groups.
Slide 5: Proposed Model
w is drawn from a Gaussian mixture model, so tasks can cluster around different mixture components.
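A minimal sketch of the mixture prior (the component means, covariance, and mixture weights here are illustrative assumptions): each task first picks a component, then draws its weights from that component, so tasks sharing a component form a cluster.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative Gaussian mixture prior over task weights w:
weights = np.array([0.5, 0.5])                # mixture weights
means = np.array([[-2.0, 0.0], [2.0, 0.0]])   # component means
cov = 0.1 * np.eye(2)                         # shared component covariance

def sample_w():
    """Draw a task weight vector: pick a component k, then sample w from it."""
    k = rng.choice(len(weights), p=weights)
    return k, rng.multivariate_normal(means[k], cov)

# Tasks whose weights come from the same component form a cluster.
samples = [sample_w() for _ in range(6)]
```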
Slide 6: Two Special Cases
- Common prior model: the mixture collapses to a single Gaussian (tasks are similar)
- Piecewise linear classifier: each component is a point-mass function (tasks within a cluster are identical)
Slide 7: Clustering
Unknown parameters: the mixture component parameters and weights.
Another uncertainty: the number of components K.
Conventional model selection picks K by computing the evidence (marginal likelihood) for each candidate.
Slide 8: Clustering with DP - No Model Selection
We rewrite the model in another form and define a Dirichlet process prior for the parameters, which avoids fixing K in advance.
Slide 9: Stick-Breaking View of DP
Draw v_k ~ Beta(1, α) for k = 1, 2, ...
π_k = v_k ∏_{j<k} (1 − v_j)
Finally we get G = Σ_{k=1}^∞ π_k δ_{θ_k}, with θ_k ~ G_0.
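The stick-breaking construction above can be sampled directly (a truncated sketch; the truncation level K and concentration α are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)

def stick_breaking(alpha, K):
    """Truncated stick-breaking weights:
    v_k ~ Beta(1, alpha), pi_k = v_k * prod_{j<k} (1 - v_j)."""
    v = rng.beta(1.0, alpha, size=K)
    # Length of stick remaining before each break: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)[:-1]])
    return v * remaining

pi = stick_breaking(alpha=2.0, K=50)
```

The weights are positive and sum to (just under) 1; the unbroken remainder of the stick accounts for the truncated tail k > K.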
Slide 10: Prediction Rule of DP for Posterior Inference
- θ_{M+1} is the parameter of a new data point.
- Assuming there are K distinct values θ*_1, ..., θ*_K among θ_1, ..., θ_M:
- θ_{M+1} belongs to an existing cluster k with probability n_k / (α + M), where n_k is the number of θ_i taking value θ*_k
- θ_{M+1} belongs to a new cluster with probability α / (α + M)
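The prediction rule above (the Pólya-urn / Chinese-restaurant-process probabilities) is easy to compute from the cluster counts; the example counts and α below are illustrative:

```python
import numpy as np

def crp_probs(counts, alpha):
    """Given cluster sizes n_1..n_K and concentration alpha, return
    P(join existing cluster k) = n_k / (alpha + M) for each k, and
    P(start a new cluster)     = alpha / (alpha + M), where M = sum n_k."""
    counts = np.asarray(counts, dtype=float)
    M = counts.sum()
    existing = counts / (alpha + M)
    new = alpha / (alpha + M)
    return existing, new

existing, new = crp_probs([3, 1, 2], alpha=1.0)
```

Note that larger clusters attract the new point with higher probability (the "rich get richer" behavior of the DP), and the probabilities sum to one.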
Slide 11: Toy Problem
Slide 12: (figure only, no transcript)
Slides 13-20: Task 1 through Task 8 (per-task results, figures only)
Slide 21: (figure only, no transcript)
Slide 22: Expert Network
Slide 23: Mathematical Model
Slide 24: Mathematical Model
The path from the root node to expert m is unique. The prediction mixes the experts, each weighted by the product of the gating probabilities along its root-to-expert path, where each gating node distributes probability over its children.
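A minimal sketch of this gating structure, assuming a depth-2 binary tree with logistic gates and logistic experts (the tree shape, gate form, and weights are illustrative assumptions, not the deck's exact model): each expert's weight is the product of the gating probabilities along its unique root-to-expert path.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative depth-2 binary tree: 3 gating nodes, 4 experts (leaves).
gate_w = rng.normal(size=(3, 2))     # gating weights: root, left child, right child
expert_w = rng.normal(size=(4, 2))   # one linear-logistic expert per leaf

def predict(x):
    g0 = sigmoid(gate_w[0] @ x)      # root gate: P(go left)
    g1 = sigmoid(gate_w[1] @ x)      # left-subtree gate
    g2 = sigmoid(gate_w[2] @ x)      # right-subtree gate
    # Product of gating probabilities along each root-to-expert path.
    path_probs = np.array([g0 * g1, g0 * (1 - g1),
                           (1 - g0) * g2, (1 - g0) * (1 - g2)])
    expert_out = sigmoid(expert_w @ x)           # each expert's prediction
    return path_probs, path_probs @ expert_out   # mixture of experts

path_probs, y = predict(np.array([0.5, -1.0]))
```

Since every input is routed down the tree with probabilities that sum to one at each node, the path probabilities over all experts also sum to one.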
Slide 25: Example
Slide 26: Infinite Expert Network
An infinite number of gating nodes.