Denoising using Local ICA and a Generalized Eigendecomposition with Time-Delayed Signals

P. Gruber¹, K. Stadlthanner¹, A. M. Tomé², A. R. Teixeira², F. J. Theis¹, C. G. Puntonet³, E. W. Lang¹

¹ Institute of Biophysics, University of Regensburg
² Dept. de Electrónica e Telecomunicações/IEETA, Universidade de Aveiro
³ Dep. Arquitectura y Tecnología de Computadores, Universidad de Granada

ICA 2004, Granada, Spain

Outline

- Denoising Problems
- Automatic local projective denoising with ICA
  - ICA, MDL Algorithm
- Denoising using Generalized Eigendecomposition
  - GEVD, Matrix Pencil, GEVD-dAMUSE Algorithm
- Comparisons and applications
  - Analysis of 2D NOESY protein NMR spectra

Denoising in general

- Applications
  - Signal processing
  - Exploratory data analysis
  - Enhancement of the stability of statistical algorithms
- Prerequisites
  - Deterministic signals
  - Nondeterministic (random) additive noise
- Goal
  - Extraction of the degraded signals
  - Reconstruction of the original signals


Models for additive Noise

- Gaussian noise
  - Central limit theorem: the noise originates from many different sources
  - Thermal noise of the examined system or the sensors
  - Noise introduced by processing algorithms
- Other models
  - Noise with a bounded distribution due to limitations of the sensors
  - Variable Gaussian noise
- Possible solutions
  - Approximation through white noise in time patches
  - Denoising with local methods


ICA, MDL

- Independent Component Analysis (ICA)
  - Detection of independent sources in a mixture of data signals
  - Given observations X, find a transformation f with f(X) independent
  - Linear symmetric case: uniqueness except for scaling and permutation
- Minimum Description Length (MDL)
  - Can be used to estimate the dimensionality of the signal subspace (see the sketch below)
  - Robust even for short time series
  - Deals with additional noise
- Information-theoretical basis of the MDL estimator
  - MDL principle derived from Bayes' rule: P(H | D) = P(D | H) P(H) / P(D)
  - Optimisation of a log-likelihood function depending on the model and the covariance matrix of the data
  - Approximation of the noise distributions by Gaussians
  - Uses the variance as an estimate of the relative coding length
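The slides do not spell out the exact MDL criterion; as a concrete illustration, here is the classical Wax-Kailath style MDL estimate of the signal-subspace dimension from the eigenvalue spectrum of the data covariance matrix (a minimal sketch, function and variable names are ours, and the authors' criterion may differ in detail):

```python
# Minimal sketch of an MDL estimate of the signal-subspace dimension from the
# eigenvalue spectrum of a data covariance matrix (Wax/Kailath-style criterion).
# Illustration only; the criterion used on these slides may differ.
import numpy as np

def mdl_dimension(eigvals, n_samples):
    """Return the dimension k minimising the MDL criterion.

    eigvals   -- eigenvalues of the covariance matrix
    n_samples -- number of data vectors used to estimate the covariance
    """
    lam = np.sort(np.asarray(eigvals))[::-1]      # descending
    p = len(lam)
    mdl = np.empty(p)
    for k in range(p):
        noise = lam[k:]                           # assumed noise eigenvalues
        m = p - k
        geo = np.exp(np.mean(np.log(noise)))      # geometric mean
        ari = np.mean(noise)                      # arithmetic mean
        loglik = n_samples * m * np.log(geo / ari)         # log-likelihood term (<= 0)
        penalty = 0.5 * k * (2 * p - k) * np.log(n_samples)  # model complexity
        mdl[k] = -loglik + penalty
    return int(np.argmin(mdl))

# Example: 3 strong signal directions buried in noise
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10)) @ np.diag([5, 4, 3] + [0.5] * 7)
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))
print(mdl_dimension(eigvals, n_samples=X.shape[0]))   # typically prints 3
```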


GEVD, Matrix Pencil

- Generalized Eigenvalue Decomposition (GEVD)
  - Solves the generalized eigenvalue problem AV = BVD for the matrices A, B
  - Can be performed by two consecutive standard EVDs (see the sketch below)
  - Can also be reduced to a single standard EVD problem
- BSS with Matrix Pencil
  - Linear BSS model X = AS
  - GEVD of two covariance matrices obtained from the mixtures, using a transformation of the signal values for the second
  - A frequency filter (convolution with the frequency response function) is used as the transformation
  - The BSS is obtained since the GEVD simultaneously decorrelates both the original and the filtered signal, and hence recovers A
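To make the "two consecutive standard EVDs" route concrete, here is a minimal sketch (our own illustrative code, not the authors' implementation): whiten with respect to B, diagonalise A in the whitened coordinates, and apply the result to a toy two-source mixture with a moving-average filter standing in for the frequency filter.

```python
# Sketch: generalized eigendecomposition A V = B V D for two symmetric positive
# (semi-)definite matrices via two consecutive standard EVDs, as used by
# matrix-pencil BSS. Illustrative code only; variable names are ours.
import numpy as np

def gevd(A, B):
    """Return (V, D) with A @ V = B @ V @ np.diag(D) and V.T @ B @ V = I."""
    lam, U = np.linalg.eigh(B)            # B = U diag(lam) U^T
    Q = U / np.sqrt(lam)                  # whitening w.r.t. B:  Q^T B Q = I
    C = Q.T @ A @ Q                       # A in whitened coordinates
    D, E = np.linalg.eigh(C)              # C = E diag(D) E^T
    V = Q @ E                             # generalized eigenvectors
    return V, D

# Toy matrix-pencil BSS: X = A_mix S, pencil of (cov of filtered X, cov of X).
rng = np.random.default_rng(1)
t = np.arange(2000)
S = np.vstack([np.sin(0.05 * t), np.sign(np.sin(0.013 * t))])   # two sources
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S                       # mixtures
Xf = np.apply_along_axis(np.convolve, 1, X, np.ones(5) / 5, mode="same")  # FIR-filtered copy
V, _ = gevd(np.cov(Xf), np.cov(X))        # GEVD of the two covariance matrices
S_est = V.T @ X                           # recovered sources, up to scale and permutation
```

The transformation V simultaneously decorrelates the original and the filtered mixtures, which is exactly the property the matrix-pencil approach exploits to recover the mixing matrix.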



Delayed Coordinates, Local Projective Denoising

- Delayed coordinates are used as a high-dimensional feature space for the signal
- Clustering in the feature space by k-means
- Denoising is achieved by locally projecting onto the subspace containing the signal
- The delay dimension, the signal-subspace dimension and the number of clusters are estimated using an MDL criterion

Algorithm flow (see the sketch below): Data → delay embedding (delay dimension m) → k-means clustering (number of clusters k, chosen via MDL) → PCA/ICA per cluster (transformation A) → MDL-based projection onto the signal subspace → inverse transformation A⁻¹ → unclustering and averaging → denoised output

[Figure: construction of the delay-embedded data vectors from the original time series]
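A compact sketch of the local projective denoising loop described above (our own simplification: the delay dimension m, the number of clusters k and the signal-subspace dimension q are fixed by hand, and plain PCA supplies the local bases, whereas the slides use ICA and choose all three parameters with the MDL criterion):

```python
# Sketch of local projective denoising in delayed coordinates.
# Simplifications vs. the slides: m, k and q are fixed here, and PCA provides the
# local bases (the slides use ICA and estimate all parameters with MDL).
import numpy as np

def delay_embed(x, m):
    """Stack m delayed copies of x into row vectors (trajectory matrix)."""
    L = len(x) - m + 1
    return np.column_stack([x[i:i + L] for i in range(m)])   # shape (L, m)

def local_projective_denoise(x, m=10, k=8, q=2, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    Z = delay_embed(x, m)
    # --- naive k-means clustering in the feature space ---
    centers = Z[rng.choice(len(Z), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((Z[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([Z[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    # --- project each cluster onto its local q-dimensional signal subspace ---
    Zd = np.empty_like(Z)
    for j in range(k):
        idx = labels == j
        C = Z[idx] - centers[j]
        _, _, Vt = np.linalg.svd(C, full_matrices=False)
        P = Vt[:q].T @ Vt[:q]                    # projector onto leading directions
        Zd[idx] = C @ P + centers[j]
    # --- uncluster: average the overlapping delayed copies back into a series ---
    out = np.zeros(len(x))
    counts = np.zeros(len(x))
    for i in range(m):
        out[i:i + len(Zd)] += Zd[:, i]
        counts[i:i + len(Zd)] += 1
    return out / counts

# Example: noisy sine
t = np.linspace(0, 20 * np.pi, 4000)
noisy = np.sin(t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
clean = local_projective_denoise(noisy)
```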


GEVD-dAMUSE Algorithm

- Delayed coordinates (trajectory matrix) X
- Filtering yields Xf
- Denoising matrix pencil (a numerical sketch follows below)
  - EVD of the covariance matrix RX to get a whitening transformation of X
  - Apply the whitening and reduce the dimension of Xf using the l largest eigenvalues of X
  - EVD of the transformed Xf yields the sources
- Parameters: the number of retained eigenvalues l and the delay dimension M

Model X = AS: the mixtures (X, Xf) and the sources (S, Sf), where Xf and Sf denote the filtered signals, form a congruent pencil (RX, RXf) ≅ (RS, RSf).

x(t) = [x_0(t), ..., x_N(t)]^T,  t = 1, ..., L

Trajectory matrix of delayed coordinates (delay τ, delay dimension M):

X = \begin{pmatrix}
x_0((M-1)\tau) & x_0(1+(M-1)\tau) & \cdots & x_0(L+(M-1)\tau) \\
x_0((M-2)\tau) & x_0(1+(M-2)\tau) & \cdots & x_0(L+(M-2)\tau) \\
\vdots & \vdots & \ddots & \vdots \\
x_0(0) & x_0(1) & \cdots & x_0(L) \\
\vdots & \vdots & \ddots & \vdots \\
x_N((M-1)\tau) & x_N(1+(M-1)\tau) & \cdots & x_N(L+(M-1)\tau) \\
\vdots & \vdots & \ddots & \vdots \\
x_N(0) & x_N(1) & \cdots & x_N(L)
\end{pmatrix}

RX = U V U^⊤,   V^{-1/2} U^⊤ RXf U V^{-1/2} = E D E^⊤,   estimated sources via A ≃ U^⊤ V^{-1/2} E

[Figure: example time series before and after filtering (original signal → filter → filtered signal)]
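A minimal numerical sketch of the GEVD-dAMUSE steps listed above (our own illustrative code; the moving-average filter and all parameter values are placeholders, and the authors' implementation may differ in detail):

```python
# Illustrative sketch of the GEVD-dAMUSE steps: trajectory matrix of delayed
# coordinates, a filtered copy, and a matrix pencil solved by two EVDs with
# rank reduction to the l largest eigenvalues. Not the authors' code.
import numpy as np

def trajectory_matrix(x, M, tau):
    """Stack M delayed copies (delay tau) of each channel of x (channels x samples)."""
    N, T = x.shape
    L = T - (M - 1) * tau
    rows = [x[n, (M - 1 - m) * tau:(M - 1 - m) * tau + L]
            for n in range(N) for m in range(M)]
    return np.vstack(rows)                       # shape (N*M, L)

def gevd_damuse(x, M=2, tau=2, l=None, filt=np.ones(5) / 5):
    X = trajectory_matrix(x, M, tau)
    Xf = np.apply_along_axis(np.convolve, 1, X, filt, mode="same")  # filtered copy
    RX, RXf = np.cov(X), np.cov(Xf)
    lam, U = np.linalg.eigh(RX)                  # RX = U diag(lam) U^T
    order = np.argsort(lam)[::-1][:l]            # keep the l largest eigenvalues
    lam, U = lam[order], U[:, order]
    W = U / np.sqrt(lam)                         # whitening with dimension reduction
    D, E = np.linalg.eigh(W.T @ RXf @ W)         # EVD of whitened filtered covariance
    return E.T @ W.T @ X                         # estimated (denoised) source signals

# toy usage: 4 channels sharing a sine, plus noise
rng = np.random.default_rng(0)
x = np.sin(0.02 * np.arange(1000)) * np.ones((4, 1)) + 0.5 * rng.normal(size=(4, 1000))
sources = gevd_damuse(x, M=2, tau=2, l=4)
```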

Problem

- Stadlthanner et al. [2003] showed that blind source separation (BSS) techniques can remove the prominent water artifact in 2D NOESY protein NMR spectra
- Algorithms like GEVD-MP introduce unwanted noise into the independent components obtained
- We also compare our application of two different denoising techniques with a kernel-PCA based denoising algorithm introduced by Mika et al. [1998]

Water artifact separation from 2D NOESY NMR spectra of the P11 protein

- Water artifact removal with GEVD-MP
- Detection of the water sources by comparison with the strongest resonance signal
- Additional noise introduced by the algorithm; estimated SNR 17.3 dB
- Degradation of the signal occurs independently of the applied ICA algorithm

[Figure: original NMR spectrum of the P11 protein and spectrum after the water removal algorithm; signal [a.u.] vs. chemical shift [ppm]]

(NMR spectrum of P11 kindly provided by W. Gronwald and H. R. Kalbitzer)


Local ICA Results

- Water artifact separation with GEVD-MP, followed by denoising of the resulting spectrum with kernel-PCA or of the estimated water sources with local ICA
- Estimated SNR gain on a flat part of the spectrum: kernel-PCA 13.9 dB, local ICA 6.1 dB
- Estimated SNR gain on the full spectrum: kernel-PCA −2.5 dB, local ICA 4.3 dB (a generic SNR-gain computation is sketched below)

[Figure: differences between the original and the resulting spectrum for GEVD-MP, GEVD-MP with local ICA denoising, and GEVD-MP with kernel-PCA denoising (from top to bottom); signal [a.u.] vs. [ppm]]
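The SNR gains quoted above can be read as the ratio of residual noise power before and after denoising, evaluated either on a flat part of the spectrum or on the full spectrum. The slides do not specify the exact estimator; a generic version with toy arrays looks like this:

```python
# Sketch of an SNR-gain estimate in dB on a chosen region:
# gain = 10*log10( residual power before denoising / residual power after ).
# Arrays and region are placeholders; the estimator behind the quoted
# 13.9 / 6.1 / -2.5 / 4.3 dB values is not spelled out on the slides.
import numpy as np

def snr_gain_db(reference, noisy, denoised, region=slice(None)):
    err_before = noisy[region] - reference[region]
    err_after = denoised[region] - reference[region]
    return 10 * np.log10(np.mean(err_before ** 2) / np.mean(err_after ** 2))

# toy example
rng = np.random.default_rng(0)
ref = np.sin(np.linspace(0, 10, 500))
noisy = ref + 0.4 * rng.normal(size=ref.size)
denoised = ref + 0.1 * rng.normal(size=ref.size)     # pretend denoiser output
print(snr_gain_db(ref, noisy, denoised))              # roughly 12 dB
```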


GEVD-dAMUSE Results

- Application of GEVD-dAMUSE to 128 time-domain signals with delay dimension M = 2 and a step of τ = 2 (see the example call below)
- Only the largest l = 95 eigenvalues were kept after the first EVD
- GEVD-dAMUSE yields an SNR gain of 5.1 dB compared to GEVD-MP
  - Performance comparable to local ICA denoising
  - Far less computational effort than local ICA denoising

[Figure: spectrum denoised with GEVD-dAMUSE; signal [a.u.] vs. δ [ppm]]
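In terms of the gevd_damuse sketch given in the algorithm section, the parameter setting reported here corresponds roughly to the following hypothetical call (fid_signals merely stands in for the 128 time-domain NMR signals):

```python
# Hypothetical call of the gevd_damuse sketch with the parameters quoted on this
# slide (M = 2, tau = 2, l = 95); the random array is a placeholder for the
# 128 time-domain NMR signals (128 channels x number of samples).
import numpy as np
fid_signals = np.random.default_rng(0).normal(size=(128, 2048))   # placeholder data
denoised_sources = gevd_damuse(fid_signals, M=2, tau=2, l=95)
```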


Conclusions

- Noise introduced through statistical water artifact removal can be reduced by various means
- We presented two algorithms based on local projective methods in the space of delayed coordinates
- Both perform better than kernel-PCA based denoising while not distorting the peak amplitudes
- Local ICA is a slightly more general approach and can adjust to other spectra through automatic estimation of its parameters
- GEVD-dAMUSE provides both the artifact removal and the denoising, is computationally very efficient, and gives comparable results

Appendix

This research was supported by the DFG (graduate college "Nonlinearity and Nonequilibrium in Condensed Matter") and the BMBF (project "ModKog").

References

S. Mika, B. Schölkopf, A. Smola, K. Müller, M. Scholz, and G. Rätsch. Kernel PCA and denoising in feature spaces. Advances in Neural Information Processing Systems (NIPS 11), 1998.

K. Stadlthanner, A. M. Tomé, F. J. Theis, W. Gronwald, H. R. Kalbitzer, and E. W. Lang. Blind source separation of water artifacts in NMR spectra using a matrix pencil. In Fourth International Symposium on Independent Component Analysis and Blind Source Separation (ICA 2003), pages 167-172, Nara, Japan, 2003.