Classification and Clustering via Dictionary Learning with Structured Incoherence and Shared Features

Ignacio Ramirez, Pablo Sprechmann, and Guillermo Sapiro
University of Minnesota

Presented by Pablo Sprechmann


Sparse Models

Learning a Sparse Model

(A*, D*) = argmin_{A,D} Σ_{x∈C} ‖x − Da‖₂² + λ‖a‖₁

where the first term is the data-fitting term and the second is the sparsity regularizer.

• Atoms are normalized: ‖dᵢ‖₁ = 1
• Usual sparsity-inducing regularizers:
  • ℓ₀ "norm": ψ(aⱼ) = ‖aⱼ‖₀
  • ℓ₁ norm: ψ(aⱼ) = ‖aⱼ‖₁

See also: the SPAMS software.
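The alternating minimization above can be sketched in a few lines of NumPy. This is a minimal illustration, not the SPAMS implementation the slide points to: it assumes ISTA for the sparse-coding step, a MOD-style least-squares dictionary update, and unit ℓ₂-normalized atoms (a common alternative to the ℓ₁ normalization stated on the slide).

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_code(X, D, lam, n_iter=200):
    """ISTA for min_A ||X - D A||_F^2 + lam * ||A||_1 (column-wise)."""
    step = 1.0 / (2.0 * np.linalg.norm(D, 2) ** 2)  # 1 / Lipschitz constant
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        A = soft_threshold(A - step * 2.0 * D.T @ (D @ A - X), step * lam)
    return A

def learn_dictionary(X, k, lam=0.01, n_iter=20, seed=0):
    """Alternate ISTA sparse coding with a MOD-style least-squares
    dictionary update; atoms are rescaled to unit l2 norm."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], k))
    D /= np.linalg.norm(D, axis=0)
    A = sparse_code(X, D, lam)
    for _ in range(n_iter):
        # MOD update: least-squares fit of D to the current codes,
        # with a small ridge term for numerical stability.
        D = X @ A.T @ np.linalg.inv(A @ A.T + 1e-8 * np.eye(k))
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
        A = sparse_code(X, D, lam)
    return A, D
```

Any lasso solver can replace the ISTA step; SPAMS provides much faster implementations of both stages.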


Sparse Models for Supervised Classification

• Classes: {C₁, C₂, …, C_c}
• Training samples: {x₁ⁱ, …, x_{nᵢ}ⁱ} ⊂ Cᵢ

Proposed Method
1. Learn (fit) a dictionary Dᵢ to represent samples from class Cᵢ.
2. Use the representation cost as the discriminant function:
   R(x, Dᵢ) = min_a ‖x − Dᵢa‖₂² + λ‖a‖₁
   (fitting term plus complexity term).
3. Assign each sample to the class with the smallest cost:
   Class(x) = argmin_i R(x, Dᵢ)

See also: [Mairal CVPR '08].
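The three steps above can be sketched as follows; the choice of ISTA as the solver for R is an assumption here (the slide does not fix one):

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def representation_cost(x, D, lam, n_iter=300):
    """R(x, D) = min_a ||x - D a||_2^2 + lam * ||a||_1, solved with ISTA."""
    step = 1.0 / (2.0 * np.linalg.norm(D, 2) ** 2)
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        a = soft_threshold(a - step * 2.0 * D.T @ (D @ a - x), step * lam)
    r = x - D @ a
    return r @ r + lam * np.abs(a).sum()

def classify(x, dictionaries, lam=0.1):
    """Step 3: assign x to the class whose dictionary fits it most cheaply."""
    return int(np.argmin([representation_cost(x, D, lam) for D in dictionaries]))
```

A sample lying in the span of one class's dictionary is fit cheaply by it and expensively by the others, which is exactly what the argmin exploits.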

Classification Results

Classification error rate (%):

Dataset   [Mairal et al. NIPS '08] Disc.   [Mairal et al. NIPS '08] Rec.   Proposed
MNIST     1.0                              3.4                             1.3
USPS      3.5                              4.4                             4.0

Promoting Cross-Incoherence

min_{{Dᵢ,Aᵢ}ᵢ₌₁…c} Σᵢ₌₁ᶜ [ Σ_{x∈Cᵢ} ( ‖x − Dᵢa‖₂² + λ‖a‖₁ ) + η Σ_{j≠i} ‖DᵢᵀDⱼ‖_F² ]

where the last sum is the cross-incoherence term.

• More incoherence leads to better discriminative power.
• Shared features.

See also: [Tropp S.P. 2006] and [Eldar et al. TIT, Nov. 2009].
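The cross-incoherence term and its gradient with respect to one dictionary are straightforward to write down. A small NumPy sketch (the update strategy around it is an assumption; the slide only specifies the penalty):

```python
import numpy as np

def cross_incoherence(dicts):
    """sum_{i != j} ||D_i^T D_j||_F^2 over a list of dictionaries."""
    return sum(np.linalg.norm(Di.T @ Dj, 'fro') ** 2
               for i, Di in enumerate(dicts)
               for j, Dj in enumerate(dicts) if i != j)

def incoherence_grad(dicts, i):
    """Gradient of the penalty w.r.t. D_i.  The (i, j) and (j, i)
    terms are equal, so each j != i contributes 4 * D_j D_j^T D_i."""
    g = np.zeros_like(dicts[i])
    for j, Dj in enumerate(dicts):
        if j != i:
            g += 4.0 * Dj @ (Dj.T @ dicts[i])
    return g
```

A gradient step on Dᵢ with this term, followed by renormalizing the atoms, pushes the dictionaries toward mutual incoherence during learning.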

Sparse Models for Clustering

See also: L1 graph [Cheng TIP, Apr. 2010] and Subspace Clustering [Elhamifar CVPR '09].

Sparse Models for Clustering

• Energy minimization problem
• Lloyd-type algorithm for minimizing:

min_{Cᵢ,{Dᵢ}} Σᵢ₌₁ᶜ Σ_{x∈Cᵢ} R(x, Dᵢ) + η Σ_{i≠j} ‖DᵢᵀDⱼ‖_F²

with R(x, Dᵢ) = min_a ‖x − Dᵢa‖₂² + λ‖a‖₁.
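The Lloyd-type alternation (refit one dictionary per cluster, then reassign each sample to its cheapest dictionary) can be sketched as below. This minimal NumPy version drops the incoherence term for brevity; the solver choices (ISTA, MOD-style updates) are assumptions, not the authors' implementation:

```python
import numpy as np

def _soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def _sparse_code(X, D, lam, n_iter=100):
    # ISTA sparse coding of the columns of X.
    step = 1.0 / (2.0 * np.linalg.norm(D, 2) ** 2)
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        A = _soft(A - step * 2.0 * D.T @ (D @ A - X), step * lam)
    return A

def _fit_dictionary(X, k, lam, n_iter=10, seed=0):
    # Alternate coding and MOD updates; unit-norm atoms.
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], k))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_iter):
        A = _sparse_code(X, D, lam)
        D = X @ A.T @ np.linalg.inv(A @ A.T + 1e-8 * np.eye(k))
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
    return D

def representation_cost(x, D, lam):
    a = _sparse_code(x[:, None], D, lam)[:, 0]
    r = x - D @ a
    return r @ r + lam * np.abs(a).sum()

def lloyd_cluster(X, c, k, lam=0.05, n_rounds=4, seed=0):
    """Alternate: refit one dictionary per current cluster, then
    reassign every sample to the cheapest dictionary."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, c, size=X.shape[1])
    dicts = [None] * c
    for _ in range(n_rounds):
        for i in range(c):
            members = X[:, labels == i]
            if members.shape[1] >= k:       # keep the old fit for tiny clusters
                dicts[i] = _fit_dictionary(members, k, lam, seed=seed + i)
            elif dicts[i] is None:          # fall back to a fit on all data
                dicts[i] = _fit_dictionary(X, k, lam, seed=seed + i)
        labels = np.array([
            np.argmin([representation_cost(X[:, t], D, lam) for D in dicts])
            for t in range(X.shape[1])
        ])
    return labels, dicts
```

As with k-means, the result depends on initialization; in practice one would restart several times and keep the lowest-energy solution.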

Object Detection

• Detection based on local descriptors.
• Learn dictionaries for SIFT feature vectors.

See also: [Mairal CVPR '08, Yang CVPR '09].

Texture Segmentation

See also: [Peyre JMIV, May 2008, Mairal et al. CVPR '08].

Extensions

Source Separation: Hierarchical Models

See also: [Yuan and Lin 2006, Jenatton arXiv, 2009].

Collaborative Source Separation

min_{A∈ℝ^{p×n}} ½‖X − DA‖_F² + λ₂ Σ_{g=1}^{c} ‖Aᵍ‖_F + λ₁ Σ_{k=1}^{n} ‖aₖ‖₁

where Aᵍ denotes the rows of A associated with source g and aₖ the k-th column of A.

P. Sprechmann, I. Ramirez, G. Sapiro and Y. Eldar, arXiv 2010.
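A sketch of this collaborative coding step via proximal gradient; for this hierarchical penalty the proximal operator factors into entry-wise soft-thresholding followed by group-wise shrinkage. The group layout and parameter values are illustrative assumptions, not the authors' code:

```python
import numpy as np

def hierarchical_prox(A, groups, t1, t2):
    """Prox of t1*||.||_1 + t2*sum_g ||A^g||_F: soft-threshold every
    entry, then shrink each row-group block toward zero."""
    B = np.sign(A) * np.maximum(np.abs(A) - t1, 0.0)
    for g in groups:
        nrm = np.linalg.norm(B[g, :])
        B[g, :] *= max(0.0, 1.0 - t2 / nrm) if nrm > 0 else 0.0
    return B

def collaborative_code(X, D, groups, lam1=0.05, lam2=0.2, n_iter=300):
    """Proximal gradient on (1/2)||X - D A||_F^2 plus the hierarchical
    penalty; the columns of X are coded jointly, so whole groups
    (sources) can be switched off collaboratively."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2   # 1 / Lipschitz constant
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        A = hierarchical_prox(A - step * D.T @ (D @ A - X),
                              groups, step * lam1, step * lam2)
    return A
```

The group term λ₂ Σ_g ‖Aᵍ‖_F zeroes out entire source blocks across all signals at once, while the λ₁ term keeps the within-block coefficients sparse.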

Collaborative Source Separation Results

• Recovery of (two) superimposed textures
• Recovery of superimposed numbers with missing information

Conclusions

• A framework for classification and clustering of rich data via dictionary learning
• A simple metric derived from sparse modeling
• Inclusion of incoherence and shared-feature detection
• Extension to source separation

Thank you!!