IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 50, NO. 5, MAY 2002


Sequential Blind Extraction of Instantaneously Mixed Sources

Yuanqing Li and Jun Wang, Senior Member, IEEE

Abstract—This paper presents a general approach to sequential blind extraction of instantaneously mixed sources for several major ill-conditioned cases as well as the regular case of full column rank mixing matrices. Four ill-conditioned cases are considered: The mixing matrix is square but singular; the number of sensors is less than that of sources; the number of sensors is larger than that of sources, but the column rank of the mixing matrix is deficient; and the number of sources is unknown and the column rank of the mixing matrix is deficient. First, a solvability analysis is presented for a general case. A necessary and sufficient condition for extractability is derived. A sequential blind extraction approach is then proposed to extract all theoretically separable sources. Next, a principle and a cost function based on fourth-order cumulants are presented for blind source extraction. By minimizing the cost function under a nonsingularity constraint of the extraction matrix, all theoretically separable sources can be extracted sequentially. Finally, simulation results are presented to demonstrate the validity and performance of the blind source extraction approach.

Index Terms—Blind extraction, cumulant, ill-conditioned case, independence, solvability.

Manuscript received November 16, 1999; revised January 30, 2002. This work was supported by the National Natural Science Foundation of China under Grants 60004004 and 69934030, by the Natural Science Foundation of Guangdong Province, China, under Grant 990584, and by the Hong Kong Research Grants Council under Grant CUHK4150/97E. The associate editor coordinating the review of this paper and approving it for publication was Prof. Arye Nehorai. Y. Q. Li is with the Department of Automatic Control Engineering, College of Electronic and Information Technology, South China University of Technology, Guangzhou, Guangdong, China. J. Wang is with the Department of Automation and Computer-Aided Engineering, The Chinese University of Hong Kong, Shatin, Hong Kong. Publisher Item Identifier S 1053-587X(02)03216-6.

I. INTRODUCTION

BLIND separation of independent sources from their mixtures has received considerable attention in recent years. Blind source separation techniques have widespread application potential in numerous technical areas such as communications [1], medical signal processing [2], speech signal processing [3], and image restoration [4], to name a few. The objective of blind source separation is to recover sources from their mixtures without prior knowledge of the sources or the mixing channels. The mixtures of sources can be divided into several categories, such as instantaneous mixtures and dynamical or convolutive mixtures. Independent component analysis (ICA) can be used to deal with the instantaneous mixture case (e.g., [1], [5]–[11]), and dynamical component analysis (DCA) can be used to deal with the convolutive mixture case [12]–[18].

In general, there are two classes of approaches for recovering original sources from instantaneous mixtures: the simultaneous separation approach [5], [6], [11], [19], [20], [28], [30] and the extraction approach [7], [9], [21]–[25], [31]. In the separation approach, all separable sources are separated simultaneously, whereas in the extraction approach the sources are extracted one by one. Simultaneous separation, if possible, is of course desirable. In some ill-conditioned cases, however, simultaneous blind separation cannot be achieved, while sequential blind extraction still can, because sequential blind extraction requires weaker solvability conditions than simultaneous blind separation, as will be shown in this paper. In addition, blind source extraction has some advantages over simultaneous blind source separation when only a few sources of interest are to be extracted according to some stochastic features (e.g., kurtosis) of the sources [22]–[24].

This paper focuses on the sequential blind extraction of linear instantaneous mixtures. Consider a general linear model of instantaneously mixed sources with observable mixtures

x(t) = A s(t)                                                        (1)

where s(t) = [s1(t), ..., sn(t)]^T is a vector of n mutually independent unknown sources with zero means, x(t) = [x1(t), ..., xm(t)]^T is a vector of m mixed signals, and A is an unknown constant m × n matrix known as the mixing matrix. The task of blind extraction is to recover the sources one by one from the available mixtures x(t).

Most existing studies are based on the assumptions that m ≥ n and that A is nonsingular or has full column rank. In practice, however, the number of sources may not be known a priori, and the mixing matrix may be rectangular or singular even though m ≥ n. In general, there are four ill-conditioned cases.

Case 1) The number of sensors equals that of sources, but the mixing matrix is singular.
Case 2) The number of sensors is less than that of sources.
Case 3) The number of sensors is larger than that of sources, but the column rank of A is deficient.
Case 4) The number of sources is unknown, and the column rank of A is deficient.

In fact, the common key problem in the four cases above is the column-rank deficiency of the mixing matrix A. In [8], general results on solvability analysis and a separation principle are presented for simultaneous blind separation, and they are suitable for the ill-conditioned cases above. In that paper, the concept of row decomposability of the mixing matrix is presented first. Next, the following result is obtained: the sources can be separated simultaneously into groups if and only if the mixing matrix is row decomposable with respect to the corresponding partition. Then, necessary and sufficient conditions for row decomposability are presented. In addition, two important open problems are raised: how to estimate the number of separable sources, and how to determine which separated component is a source signal and which one is still a mixture.
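As a concrete illustration of the mixing model (1) and of two of the ill-conditioned cases, the following NumPy sketch builds instantaneous mixtures for a regular, a singular, and an underdetermined mixing matrix. The matrices and sources here are our own illustrative choices and are not the ones used in the examples of Section IV.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 5000

# Three mutually independent, zero-mean sub-Gaussian sources (uniform white noise).
s = rng.uniform(-1.0, 1.0, size=(3, n_samples))

# Regular case: square, nonsingular mixing matrix (full column rank).
A_regular = np.array([[1.0, 0.5, 0.2],
                      [0.3, 1.0, 0.4],
                      [0.2, 0.6, 1.0]])

# Case 1): square but singular mixing matrix (third row is the sum of the first two).
A_singular = np.array([[1.0, 0.5, 0.2],
                       [0.3, 1.0, 0.4],
                       [1.3, 1.5, 0.6]])

# Case 2): fewer sensors than sources (2 x 3 mixing matrix).
A_wide = np.array([[1.0, 0.5, 0.2],
                   [0.3, 1.0, 0.4]])

for name, A in [("regular", A_regular), ("singular", A_singular), ("wide", A_wide)]:
    x = A @ s                       # observable mixtures x(t) = A s(t)
    print(name, "A shape:", A.shape, "rank:", np.linalg.matrix_rank(A), "x shape:", x.shape)
```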


In fact, the maximum partition number is unknown. Furthermore, since the partition is not unique, whether there exists an optimal partition and how to obtain it are also problems to be studied. Thus, it is not easy to obtain a satisfactory partition by means of blind separation. Although these ill-conditioned cases have received attention (e.g., [8], [31]), so far, no method can effectively deal with all four of the ill-conditioned cases above. Since the mixing matrix is required to be nonsingular or of full column rank for simultaneous separation to be solvable, it is difficult to use instantaneous blind separation in the four cases above. However, the solvability conditions of blind extraction may still hold in these ill-conditioned cases, as will be seen in this paper, and the number of sources need not be known for blind extraction. Therefore, sequential blind extraction of unknown sources provides a possible remedy for the above ill-conditioned cases. Generally, only one source signal can be obtained by a single-step blind extraction. By using sequential blind extraction, more than one source signal can be obtained one by one.

A single-step blind extraction in the existing references consists of two different parts. The first is extraction, and the second is deflation. An extraction model can be described as

y(t) = w x(t)                                                        (2)

where w is an m-dimensional row vector, and y(t) is the output of the extraction model. The extraction of an independent source is often achieved by maximizing the absolute value of the fourth-order cumulant (e.g., kurtosis) of the output of the extraction model subject to certain constraints [7], [9], [24], [25]. In [7], an adaptive approach is proposed for the blind extraction of independent sources. The approach includes convergent extraction and deflation algorithms that are implemented by maximizing several contrast functions in terms of fourth-order statistics. In [24], a neural network is presented with unconstrained extraction and deflation criteria that require neither prior knowledge of the source signals nor whitening of the mixed signals and can cope with a mixture of sources with positive and negative kurtosis. It is proven that the criteria have no spurious equilibria by showing that all spurious equilibria are unstable. However, as in the previous studies [7], [9], a necessary condition is that the mixing matrix have full column rank, which implies that m ≥ n. Otherwise, the set {w : wA = 0} is a nontrivial subspace, and spurious equilibria in this subspace may be as stable as the desired ones, which leads to spurious solutions and the failure of blind extraction because there is no criterion to differentiate true solutions from spurious solutions. Blind source extraction using the model (2) under the condition of a full column rank mixing matrix is also discussed in [25]. For the ill-conditioned cases, besides the potential existence of stable spurious equilibria as mentioned previously, the number of spurious equilibria would increase substantially as the number of sources increases. In [26], a recurrent neural network and its associated learning rule are presented, which can deal with the case in which A is nearly singular, but A must still be nonsingular. Recently, the blind extraction of singularly mixed sources has been discussed in [31], based on a recurrent neural network model with an adaptive learning algorithm.

However, that study is limited to a particular blind extraction model and to only one ill-conditioned case [Case 1)].

In this paper, we introduce the following general blind extraction model:

y(t) = W x(t)                                                        (3)

where y(t) = [y1(t), ..., ym(t)]^T is an m-dimensional output vector and W is an m × m blind extraction matrix. The task of blind source extraction is to determine W such that one component of y(t) corresponds to a source up to a scale. The common simultaneous blind separation model has the same form as (3), where W is called a separation matrix. Under the condition of at most one Gaussian source, the choice of W is generally based on the principle that all outputs are mutually independent, so that WA = PD, where D is a diagonal matrix and P is a permutation matrix. In this paper, the blind extraction model (3) is based on a different principle from that of blind source separation: only one output (the extracted signal) is required to be pairwise independent of the other outputs in each extraction step. The extraction and deflation can be carried out simultaneously by using (3).

This paper presents theoretical results on blind source extraction, including a solvability condition, a blind extraction principle, and a cost function, that are suitable for the above four ill-conditioned cases as well as the normal case of full column rank mixing matrices. Based on the solvability analysis, a sequential blind extraction approach that can extract all theoretically separable sources is proposed. By minimizing the cost function under a nonsingularity constraint, all theoretically separable sources can be extracted one by one, provided that all sources are sup-Gaussian or all are sub-Gaussian.

The remainder of this paper is organized as follows. The solvability analysis is presented in Section II. A blind extraction principle and a cost function based on higher order cumulants are introduced in Section III. Simulation results are discussed in Section IV. Concluding remarks in Section V summarize the approach of this paper and state the remaining tasks.

II. SOLVABILITY ANALYSIS

In this section, we analyze the solvability of blind source extraction based on the blind extraction model (3). Specifically, two theorems and two corollaries are provided.

Theorem 1: There exists a nonsingular m × m matrix W in (3) such that a mixture of k sources can be extracted in one component of y(t), and the other components do not contain these k sources, if and only if there exists an m × (n − k) submatrix composed of n − k columns of A whose rank equals rank(A) − 1 and the m × k submatrix composed of the remaining k columns of A has rank 1.

Proof: See the Appendix.

From the proof of Theorem 1 in the Appendix, we can see that the number of sources n is not necessarily known. Thus, Theorem 1 is also suitable for the case in which the number of sources is unknown.
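The extractability condition of Theorem 1 is straightforward to test numerically when A is known. The sketch below checks the two rank conditions for a candidate set of k columns; the function name, tolerance, and example matrix are our own. For the example matrix, whose last two columns are proportional, the first source is extractable on its own, while the last two sources can only be extracted together as an inseparable group.

```python
import numpy as np
from itertools import combinations

def extractable_subset(A, cols, tol=1e-10):
    """Check the Theorem 1 condition for the column index set `cols`:
    the submatrix of the remaining columns must have rank rank(A) - 1,
    and the submatrix of the selected columns must have rank 1."""
    A = np.asarray(A, dtype=float)
    rest = [j for j in range(A.shape[1]) if j not in cols]
    r_full = np.linalg.matrix_rank(A, tol=tol)
    r_rest = np.linalg.matrix_rank(A[:, rest], tol=tol)
    r_sel = np.linalg.matrix_rank(A[:, list(cols)], tol=tol)
    return r_rest == r_full - 1 and r_sel == 1

# Example: a singular 3 x 3 mixing matrix (columns 2 and 3 are proportional).
A = np.array([[1.0, 0.4, 0.8],
              [0.2, 1.0, 2.0],
              [0.5, 0.3, 0.6]])

for k in (1, 2):
    for cols in combinations(range(A.shape[1]), k):
        if extractable_subset(A, cols):
            print("extractable group of sources:", cols)
```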


In Theorem 1, the extraction matrix W is nonsingular. From the following analysis, we can see that if there is a row vector w such that a source is extracted based on (2), then there is a nonsingular W such that the same source can be extracted based on (3). Without loss of generality, suppose that the extracted source is s1(t), so that wA = [c, 0, ..., 0] with c ≠ 0; in particular, w a1 ≠ 0, where a1 denotes the first column of A. Consider the homogeneous linear equation with m unknowns

v a1 = 0.                                                        (4)

Obviously, there are m − 1 linearly independent row vectors that satisfy (4), and w is linearly independent of all solutions of (4). Thus, we can obtain a nonsingular m × m matrix W with its first row being w and its remaining rows being m − 1 linearly independent solutions of (4). Therefore, the first row and the first column of WA have only one nonzero entry, located at their junction. Thus, one source can be extracted by using the model (3) with the nonsingular extraction matrix W.

There is another new result in using model (3) with nonsingular W: if a signal is extracted, then it is either a single source or a mixture of several sources that cannot be separated by any other blind separation method, according to [8, Th. 2].

Remarks 1:
1) In Theorem 1, k = 1 implies that a single source can be extracted. If A is an n × n nonsingular matrix or an m × n rectangular matrix with full column rank, the conditions of Theorem 1 are obviously satisfied by setting k to be 1. Consequently, one of the sources can be extracted.
2) In Theorem 1, if k ≥ 2, then there exist theoretically inseparable sources, corresponding to the rank-1 m × k submatrix, in either the extracted component of y(t) or the other components of y(t).
3) If the mixing matrix does not satisfy the conditions in Theorem 1, then it is impossible to extract a single source or a mixture of several sources using any nonsingular extraction matrix or any rectangular extraction matrix, including the extraction model (2).
4) Once a single source or a mixture of several sources is extracted as one component of y(t), the remaining components of y(t) can be used as new mixtures, and sequential blind extraction can be continued.

The following theorem implies that we can obtain all theoretically separable sources using sequential blind extraction.

Theorem 2: If there exist at most q submatrices of A, each composed of n − 1 of its columns, whose ranks equal rank(A) − 1, then at most q sources can be extracted by means of sequential blind extraction based on (3).

Proof: Without loss of generality, assume that the q submatrices are the ones obtained by deleting the first q columns of A, respectively. Since the submatrix obtained by deleting the first column has rank rank(A) − 1, by Theorem 1 there is an m × m nonsingular matrix W1 such that the first row and the first column of W1 A have only one nonzero entry, located at their junction. That is, the first output extracts s1(t), and the remaining m − 1 outputs are new mixtures of s2(t), ..., sn(t) whose mixing matrix A1 consists of the last m − 1 rows and the last n − 1 columns of W1 A. Since W1 is nonsingular, rank(A1) = rank(A) − 1, and deleting from A1 the column associated with s2(t) again leaves a submatrix of rank rank(A1) − 1, while that column itself is nonzero. By Theorem 1, there is an (m − 1) × (m − 1) nonsingular blind extraction matrix W2 such that s2(t) can be extracted when the new mixtures from the previous extraction step are used. Repeating this process q times yields nonsingular blind extraction matrices W1, W2, ..., Wq, of dimensions m × m, (m − 1) × (m − 1), ..., (m − q + 1) × (m − q + 1), respectively, such that the sources s1(t), ..., sq(t) are extracted.

Now suppose that q + 1 sources could be extracted sequentially. Composing the extraction matrices, each padded with an identity block so that all of them act on m-dimensional vectors, yields a nonsingular m × m matrix W such that WA contains q + 1 submatrices, each composed of n − 1 of its columns, whose ranks equal rank(WA) − 1. Since W is nonsingular, A itself would then contain q + 1 submatrices of n − 1 columns whose ranks equal rank(A) − 1, which contradicts the condition of this theorem. Thus, at most q sources can be extracted.

According to [8, Th. 2], if the condition in Theorem 2 is satisfied, then there are q theoretically separable sources. Thus, the maximum number of sequentially extractable sources is equal to that of the theoretically separable sources. Theorem 2 and its proof outline the sequential blind extraction method of this paper, by which all extractable sources, that is, all theoretically separable sources, can be extracted one by one. On the contrary, it is impossible to obtain all the separable sources by using the general blind separation model in the ill-conditioned cases mentioned in Section I. For instance, when A is an n × n singular matrix, there is no separation matrix W such that WA = PD, where P is a permutation matrix and D is a diagonal matrix. If the blind partition method proposed in [8] is used to separate the theoretically separable sources, the two open problems of that reference cannot be avoided.

Corollary 1: Let q be the maximum number of signals that can be sequentially extracted based on (3) using any effective algorithm. If any two columns of A are linearly independent, then all the extracted signals are sources, and the maximum number of theoretically separable or sequentially extractable sources is q.

Proof: Since any two columns of A are linearly independent, no submatrix of two or more columns of A can have rank 1; hence, by Theorem 1, every extracted signal is a single source. Theorem 2 implies that the sequential blind extraction approach in this paper can extract all separable sources; thus, the result of this corollary holds.

Remarks 2:
1) The condition that any two columns of A are linearly independent is much weaker than the condition that A is of full column rank. The latter implies that any k columns of A are linearly independent (k ≤ n).
2) If the condition that any two columns of A are linearly independent is removed from Corollary 1, then the number of separable sources is at most q. The q extracted signals include all separable sources and may include several mixtures of sources that cannot be separated.

Theorem 2 and Corollary 1 imply that sequential blind extraction based on (3) has the ability to extract all theoretically separable sources. From Theorem 2 and Corollary 1, we immediately have the following corollary.

Corollary 2: If A is of full column rank or nonsingular, then all sources can be extracted sequentially.
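The constructive step used in the proofs above can be imitated numerically when A is known, which it never is in the blind setting, so the sketch below only illustrates the solvability analysis rather than a blind algorithm. Following the discussion after Theorem 1, the first row of W is taken from the left null space of the columns not being extracted, and the remaining rows span the hyperplane orthogonal to the extracted column. The helper names and the example matrix are our own.

```python
import numpy as np

def left_null_space(M, tol=1e-10):
    """Orthonormal basis (as rows) of {w : w M = 0}, via the SVD of M."""
    u, sv, _ = np.linalg.svd(M)
    rank = int(np.sum(sv > tol))
    return u[:, rank:].T          # each row w satisfies w @ M ~ 0

def oracle_extraction_matrix(A, j):
    """Nonsingular W such that (W x)[0] contains only source j and the other
    components of W x contain no source j (requires the Theorem 1 condition
    with k = 1 to hold for column j)."""
    rest = np.delete(A, j, axis=1)
    # first row: w with w @ rest = 0 and w @ A[:, j] != 0
    candidates = left_null_space(rest)
    w = next(c for c in candidates if abs(c @ A[:, j]) > 1e-8)
    # remaining rows: a basis of the hyperplane {v : v @ A[:, j] = 0}
    V = left_null_space(A[:, [j]])
    W = np.vstack([w, V])
    assert abs(np.linalg.det(W)) > 1e-10
    return W

# Known singular mixing matrix: columns 2 and 3 are proportional, so only the
# first source is extractable as a single source (cf. the check after Theorem 1).
A = np.array([[1.0, 0.4, 0.8],
              [0.2, 1.0, 2.0],
              [0.5, 0.3, 0.6]])
W = oracle_extraction_matrix(A, 0)
print(np.round(W @ A, 6))   # first row and first column are zero except at their junction
```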


III. BLIND EXTRACTION PRINCIPLE AND COST FUNCTION

First, we present a lemma known as the Darmois–Skitovich theorem [8], [27], which is the basis of the blind extraction principle in this paper.

Lemma 1: Let s = [s1, ..., sn]^T be an n-dimensional, componentwise mutually independent random vector, and let

y1 = a1 s1 + ... + an sn  and  y2 = b1 s1 + ... + bn sn.

Suppose that y1 and y2 are independent. Then aj bj = 0 for any j such that sj is non-Gaussian.

The following result on the blind extraction principle is derived from Lemma 1 directly.

Theorem 3: Suppose that s1(t), ..., sn(t) are mutually independent and that at most one of them is Gaussian. If one output of (3), say y1(t), is pairwise independent of the other outputs, then, for i = 2, ..., m and j = 1, ..., n, the (i, j) entry of WA is zero whenever the (1, j) entry of WA is nonzero.

The principle of simultaneous blind source separation is based on the pairwise independence of all outputs of a separation model. In contrast, the blind source extraction principle in Theorem 3 is based on the pairwise independence of one output with the other outputs of the model (3). Under the condition in Theorem 3, y1(t) corresponds to either a single source or a mixture of several sources, and the other components of y(t) do not contain that source or those sources. Note that the condition in Theorem 3 may hold only if the solvability condition in Theorem 1 is satisfied.

Although Theorem 3 provides a good blind extraction principle theoretically, it is not easy to check whether or not two signals are statistically independent. Next, we introduce a cost function as a criterion for blind extraction based on the idea in [28]. Without loss of generality, let y1(t) be the signal to be extracted. A cost function is defined using fourth-order cumulants as

J(W) = Σ_{i=2}^{m} [Cum(y1, y1, yi, yi)]^2.                          (9)

The properties of cumulants are discussed in many references, e.g., [17] and [20]. Obviously, if y1 and yi are pairwise independent for i = 2, ..., m, then J(W) = 0.

Theorem 4: Suppose that s1(t), ..., sn(t) are sup-Gaussian (or sub-Gaussian) and mutually independent stationary sources with zero means.
1) If there exists a nonsingular extraction matrix W in (3) such that J(W) = 0 and w1 A ≠ 0, where w1 is the first row of W, then y1(t) is an extracted signal.
2) Under the condition of a nonsingular extraction matrix W and a nonsingular mixing matrix A, all local minima of J(W) are global ones, and each of them leads to a single source.

Proof: 1) J(W) = 0 implies that Cum(y1, y1, yi, yi) = 0 for i = 2, ..., m. In (3)

y(t) = W A s(t) = P s(t), where P = (pij) = WA.                      (10)

Since s1, ..., sn are mutually independent, in view of (10), we have

Cum(y1, y1, yi, yi) = Σ_{j=1}^{n} p1j^2 pij^2 κj                     (11)

where κj = Cum(sj, sj, sj, sj). Since all sj are sup-Gaussian (or sub-Gaussian), κj > 0 (or κj < 0) for all j. In view of (9) and (11), J(W) = 0 implies that Cum(y1, y1, yi, yi) = 0 for each i; since all the terms p1j^2 pij^2 κj in (11) have the same sign, each of them must vanish, so that for every j either p1j = 0 or pij = 0 for all i = 2, ..., m. Since w1 A ≠ 0, at least one p1j is nonzero, and the corresponding sources appear only in y1(t). Thus, y1(t) is an extracted signal. If only one p1j is nonzero, then y1(t) is a single source. Obviously, all local minima of J(W) such that J(W) = 0 are global ones with local stability.

2) From (9) and (11), we have, for i = 2, ..., m and j = 1, ..., n,

∂J/∂pij = 4 κj p1j^2 pij Cum(y1, y1, yi, yi),   ∂J/∂p1j = 4 κj p1j Σ_{i=2}^{m} pij^2 Cum(y1, y1, yi, yi).

Setting these partial derivatives to zero, it is easy to find that all solutions can be divided into two classes: 1) p1j = 0 for j = 1, ..., n, with the pij arbitrary; 2) there exists at least one j such that p1j ≠ 0, and for every such j, pij = 0 for i = 2, ..., m. Both classes of equilibria satisfy J(W) = 0; thus, they are all global minima of J(W) with local stability. However, the equilibria in the first class can be excluded: since det W ≠ 0 and A is nonsingular, P = WA is nonsingular, so its first row cannot vanish. Moreover, at an equilibrium of the second class, only one p1j can be nonzero, for otherwise two columns of the nonsingular matrix P would be proportional. Hence, each equilibrium of the second class leads to a single source.

Remark 3:
1) If the conditions in Theorem 1 are not satisfied, then it is impossible to have J(W) = 0 unless p1j = 0 for j = 1, ..., n, i.e., unless w1 A = 0.
2) If J(W) = 0, w1 A ≠ 0, and W is nonsingular, then y1(t) is an extracted signal. We can then use y2(t), ..., ym(t) as new mixtures to extract the remaining sources sequentially.

In light of Theorem 4, blind extraction using (3) can be converted to solving the following constrained minimization problem:

min_W J(W)  subject to  det W ≠ 0.                                   (12)

In view of (10) and the properties of cumulants,

Cum(y1, y1, yi, yi) = Σ_{k,l,r,u=1}^{m} w1k w1l wir wiu Cum(xk, xl, xr, xu).   (13)

Since x(t) has zero mean (E[x(t)] = 0),

Cum(xk, xl, xr, xu) = E[xk xl xr xu] − E[xk xl]E[xr xu] − E[xk xr]E[xl xu] − E[xk xu]E[xl xr].   (14)

Since the mixtures xi(t) (i = 1, ..., m) are observed, we can obtain all Cum(xk, xl, xr, xu) by replacing the expectations in (14) with sample means. According to (9) and (13), we can then compute J(W) in terms of W. The remaining task is to find the nonsingular extraction matrix W by solving the problem (12). To ensure the nonsingularity constraint in (12), we should check frequently during the minimization process whether det W ≈ 0; if it is, we can add a disturbance to W and restart the iteration. In blind separation, several algorithms (e.g., [11, Alg. (A)]) have the feature of keeping the separation matrix from becoming singular. These methods may be applied to avoid singularity of W.
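Equations (13) and (14) reduce the evaluation of J(W) to second- and fourth-order sample moments of the observed mixtures. The sketch below illustrates the quantity that is driven to zero; the function names are our own, and the cross-cumulant is estimated directly from the outputs using the same zero-mean moment identity as (14), rather than through the mixture cumulants of (13).

```python
import numpy as np

def cum22(a, b):
    """Sample estimate of Cum(a, a, b, b) for zero-mean signals a and b,
    with expectations replaced by sample means."""
    return np.mean(a**2 * b**2) - np.mean(a**2) * np.mean(b**2) - 2.0 * np.mean(a * b)**2

def cost_J(W, x):
    """Sample-based value of the cost function (9) for the outputs y = W x."""
    y = W @ x
    return sum(cum22(y[0], y[i])**2 for i in range(1, y.shape[0]))

# Sanity check with synthetic zero-mean, sub-Gaussian signals.
rng = np.random.default_rng(1)
s = rng.uniform(-1.0, 1.0, size=(2, 200_000))
print(cum22(s[0], s[1]))            # ~0: the two signals are independent
print(cum22(s[0], s[0] + s[1]))     # clearly negative: the common term s[0] is sub-Gaussian
print(cost_J(np.eye(2), s))         # ~0: identity outputs are already independent
```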

. where Example 1: Consider three sources , , , where , and are independent uniform white noises with values . The kurtoses of and are , in , respectively. Obviously, These sources are suband Gaussian. The mixing matrix is assumed to be

(14)

Note that is singular. By checking the determinant of four 3 2 submatrices of , we can find that only the source can be extracted according to Theorem 1. It is impossible to deal with the case by using existing blind separation methods.

Cum Cum

Since

has zero mean (

)

Cum
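The exact form of the performance index (16) is not reproduced here. The sketch below shows one common way of building such a verification index when the mixing matrix is known in a simulation: for the row of P = WA that should carry the extracted source, the sum of the absolute entries is compared with the largest entry, so the index is zero exactly when that row has a single nonzero entry. The function name and the example matrices are our own.

```python
import numpy as np

def extraction_index(W, A, row=0):
    """Crosstalk-style index for the chosen output row of P = W A:
    0 when the row has a single nonzero entry (perfect extraction),
    larger values indicate residual mixing."""
    p = np.abs(W @ A)[row]
    return float(p.sum() / p.max() - 1.0)

# Perfect extraction of the first source (up to scale) versus a still-mixed row.
A = np.array([[1.0, 0.5],
              [0.3, 1.0]])
W_good = np.array([[2.0, -1.0],     # first row proportional to the first row of inv(A)
                   [0.0,  1.0]])
W_bad = np.eye(2)
print(extraction_index(W_good, A))   # ~0
print(extraction_index(W_bad, A))    # > 0
```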


Fig. 1. Blind extraction from mixtures of three sources in Example 1.

Example 1: Consider three mutually independent, zero-mean sources s1(t), s2(t), and s3(t) constructed from independent uniform white noises. Their kurtoses are negative; obviously, these sources are sub-Gaussian. The mixing matrix A is assumed to be a particular 3 × 3 singular matrix. By checking the ranks of the submatrices of A obtained by deleting one column at a time, we find that only one of the sources can be extracted according to Theorem 1. It is impossible to deal with this case by using existing blind separation methods.

The initial value of the extraction matrix W(0) is chosen randomly. By using the Gauss–Newton algorithm (15) with appropriately chosen parameter values, we obtain the blind extraction matrix W and, in turn, the product WA. In Fig. 1, the three sources are presented in the three subplots of the first row and the three observable mixtures in the second row. The extracted signal is shown in the first subplot of the third row. The new mixtures of the two remaining sources are shown in the second and third subplots, respectively, of the third row. The first subplot in the fourth row shows the calibrated deviation between the extracted signal and the corresponding source, which implies that this source is extracted. The performance index is depicted in the last subplot of Fig. 1, where k is the iteration counter.

Example 2: Consider ill-conditioned Case 2) with four sources and three observable mixtures. The first three sources are the same as those in Example 1, and the fourth source s4(t) is a uniform white noise independent of s1(t), s2(t), and s3(t). The kurtosis of s4(t) is negative; thus, s4(t) is also sub-Gaussian. The mixing matrix A is assumed to be a particular 3 × 4 matrix. By checking the determinants of the four 3 × 3 submatrices of A, we find that only one of the sources can be extracted according to Theorem 1. Existing blind separation methods cannot deal with this case.

The initial value of the extraction matrix W(0) is again chosen randomly. By using the Gauss–Newton algorithm (15), we obtain the resulting blind extraction matrix W. In Fig. 2, the four sources are presented in the four subplots of the first row and the three observable mixtures in the second row. The extracted signal is shown in the first subplot of the third row. The new mixtures of the remaining sources are shown in the second and third subplots, respectively, of the third row. The first subplot in the fourth row shows the calibrated deviation between the extracted signal and the corresponding source, which implies that this source is extracted. The performance index is depicted in the last subplot of Fig. 2.

Example 3: In this example, the three sources are the same as those in Example 1, and the mixing matrix A is assumed to be a nonsingular 3 × 3 matrix. Since A is nonsingular, all sources can be extracted one by one via sequential blind extraction according to Theorem 2. In the first-step blind extraction, the initial value of the extraction matrix W(0) is chosen randomly.

Fig. 2. Blind extraction from mixtures of four sources in Example 2.

Fig. 3. First-step blind extraction from mixtures of three sources in Example 3.

By using (15), we obtain the first-step extraction matrix W and the corresponding product WA. The three sources and the three mixtures are shown in the subplots of the first and second rows of Fig. 3, respectively. The extracted signal is shown in the first subplot of the third row, and the new mixtures of the two remaining sources are shown in the second and third subplots of the third row. The deviation between the extracted signal and the corresponding source is shown in the first subplot of the fourth row, which implies that this source is extracted. The performance index for this step of the extraction is shown in the second subplot of the last row.

In the second-step blind extraction, the mixtures are the remainders of the first-step extraction, i.e., the second and third outputs of the first step. The initial extraction matrix is again set randomly. By choosing the parameters in (15) as in the first-step extraction, we obtain the second-step extraction matrix, which is 2 × 2.

Fig. 4. Second-step blind extraction from mixtures of three sources in Example 3.

Denote the second-step extraction matrix by W2. The overall transformation applied to the original mixtures is then given by W2 W', where W' is the 2 × 3 matrix composed of the second and third rows of the first-step extraction matrix. Fig. 4 presents the result of the second-step blind extraction. In Fig. 4, the first output of the second step is the extracted signal, and the second output is the remainder of the second-step blind extraction. The first two subplots of the second row in Fig. 4 show the deviations of these two outputs from the corresponding sources, respectively. The third subplot of the second row shows the performance index in the second-step extraction, which is computed iteratively from the entries of the product of the current second-step extraction matrix and the reduced mixing matrix. Thus, the sources remaining after the first-step extraction are obtained in the second-step extraction.
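Example 3's two-step procedure amounts to a simple loop: extract one component, keep the remaining outputs as a smaller set of mixtures, and repeat. The sketch below mimics that bookkeeping with the oracle construction used earlier; it relies on the true mixing matrix to build each extraction matrix, so it illustrates the sequential structure of Theorem 2 rather than the blind optimization of (12). All names and matrices are our own.

```python
import numpy as np

def left_null_space(M, tol=1e-10):
    u, sv, _ = np.linalg.svd(M)
    return u[:, int(np.sum(sv > tol)):].T          # rows w with w @ M ~ 0

def one_step_W(A, j):
    """Nonsingular extraction matrix isolating column j of A in the first output."""
    w = next(c for c in left_null_space(np.delete(A, j, axis=1))
             if abs(c @ A[:, j]) > 1e-8)
    return np.vstack([w, left_null_space(A[:, [j]])])

rng = np.random.default_rng(3)
s = rng.uniform(-1.0, 1.0, size=(3, 1000))          # three sub-Gaussian sources
A = np.array([[1.0, 0.5, 0.2],                      # nonsingular mixing matrix
              [0.3, 1.0, 0.4],
              [0.2, 0.6, 1.0]])
x = A @ s
A_cur, x_cur, extracted = A.copy(), x.copy(), []

while A_cur.shape[0] > 1:
    W = one_step_W(A_cur, 0)                        # extract the source of column 0
    y = W @ x_cur
    extracted.append(y[0])                          # extracted signal (a scaled source)
    x_cur = y[1:]                                   # remaining outputs: new mixtures
    A_cur = (W @ A_cur)[1:, 1:]                     # their reduced mixing matrix
extracted.append(x_cur[0])                          # the last output is the last source

for i, sig in enumerate(extracted):
    corr = np.corrcoef(sig, s[i])[0, 1]
    print(f"output {i}: |correlation with source {i}| = {abs(corr):.4f}")
```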


V. CONCLUDING REMARKS

A general approach is proposed for the sequential blind extraction of instantaneously mixed sources in the normal case and in various ill-conditioned cases. Sequential blind extraction is shown to be more suitable than simultaneous blind separation for the ill-conditioned cases. A necessary and sufficient extractability condition is derived. In addition, under the weak condition of Corollary 1, all extracted signals are separable sources. Even if the condition in Corollary 1 is not satisfied, sequential blind extraction can still extract all theoretically separable sources, possibly along with new mixtures of inseparable sources. Sequential blind extraction thus can recover sources from instantaneously mixed signals to the greatest possible extent. A blind extraction principle, a cost function, and a corresponding algorithm implementation are also presented. The simulation results confirm the validity and demonstrate the characteristics of the blind source extraction approach in this paper. Further investigations may aim at the development of better cost functions and optimization algorithms for blind source extraction of arbitrary signals.

APPENDIX
PROOF OF THEOREM 1

Proof—Necessity: Suppose that there is an m × m nonsingular matrix W such that a mixture of k sources is extracted in one component of y(t) = W x(t) and the other outputs do not contain any of these k sources. Without loss of generality, let the extracted component be y1(t) and the k sources be s1(t), ..., sk(t), and partition A = [A1, A2], where A1 consists of the first k columns of A and A2 of the remaining n − k columns. Since y1(t) contains only s1(t), ..., sk(t), the first row of W A2 is zero; since the other outputs contain none of these sources, all rows of W A1 except the first are zero. Hence, in block form,

W A = [ b  0 ; 0  C ]

where b is the nonzero 1 × k first row of W A1 and C consists of the last m − 1 rows of W A2. Since W is nonsingular, rank(A1) = rank(W A1) = 1, so the submatrix composed of these k columns has rank 1. Moreover, rank(A) = rank(W A) = 1 + rank(C) and rank(A2) = rank(W A2) = rank(C); hence, rank(A2) = rank(A) − 1.

The necessity is obtained.

Sufficiency: Without loss of generality, suppose that the m × k submatrix A1 = [a1, ..., ak] composed of the first k columns of A has rank 1 and that the m × (n − k) submatrix A2 = [a_{k+1}, ..., an] composed of the remaining columns satisfies rank(A2) = rank(A) − 1. Since rank(A1) = 1, we can write

ai = ci a1,  i = 2, ..., k

where the ci are nonzero constants. Consider the set of row vectors V = {v : v A2 = 0}, which is a linear subspace of dimension m − rank(A2) = m − rank(A) + 1, and the subspace V0 = {v : v A = 0}, which has dimension m − rank(A). Since V0 is contained in V and dim(V) = dim(V0) + 1, there exists a row vector w in V that does not belong to V0. For such a w, we have w A2 = 0 and w A ≠ 0; therefore w a1 ≠ 0, because w aj = 0 for j = k + 1, ..., n and w ai = ci (w a1) for i = 2, ..., k.

Now consider the hyperplane H = {v : v a1 = 0}, which has dimension m − 1, and choose m − 1 linearly independent row vectors v2, ..., vm in H. Since w a1 ≠ 0, w does not belong to H, and hence the m × m matrix

W = [w; v2; ...; vm]

is nonsingular. By construction, the first row of W A is [w a1, c2 w a1, ..., ck w a1, 0, ..., 0], so the first output y1(t) = w x(t) is a mixture of the k sources s1(t), ..., sk(t) and contains no other source, whereas the remaining rows of W A have zero entries in the first k columns (since vi aj = cj vi a1 = 0 for j = 1, ..., k), so the other outputs contain none of these k sources. Thus, there exists a nonsingular blind extraction matrix W in (3) such that a mixture of the k sources can be extracted. The sufficiency is obtained.

ACKNOWLEDGMENT

The major part of this study was performed when Y. Q. Li visited the Chinese University of Hong Kong in 1999. The authors are grateful to the Associate Editor and the anonymous reviewers for their valuable comments.

REFERENCES

[1] P. Comon, "Independent component analysis, a new concept?," Signal Process., vol. 36, pp. 287–314, 1994.
[2] A. Hyvarinen and E. Oja, "Independent component analysis: algorithms and applications," Neural Networks, vol. 13, pp. 411–430, 2000.
[3] A. J. Bell and T. J. Sejnowski, "An information-maximization approach to blind separation and blind deconvolution," Neural Comput., vol. 7, pp. 1004–1034, 1995.
[4] D. Kundur and D. Hatzinakos, "A novel blind deconvolution scheme for image restoration using recursive filtering," IEEE Trans. Signal Processing, vol. 46, pp. 375–390, Feb. 1998.


[5] C. Jutten and J. Herault, "Blind separation of sources, Part I: An adaptive algorithm based on neuromimetic architecture," Signal Process., vol. 24, pp. 1–10, 1991.
[6] E. Moreau and O. Macchi, "Self-adaptive source separation—Part II: Comparison of the direct, feedback, and mixed linear network," IEEE Trans. Signal Processing, vol. 46, pp. 39–50, Jan. 1998.
[7] N. Delfosse and P. Loubaton, "Adaptive blind separation of independent sources: A deflation approach," Signal Process., vol. 45, pp. 59–83, 1995.
[8] X. R. Cao and R. W. Liu, "General approach to blind source separation," IEEE Trans. Signal Processing, vol. 44, pp. 562–571, Mar. 1996.
[9] A. Hyvärinen and E. Oja, "Simple neuron models for independent component analysis," Int. J. Neural Syst., vol. 7, no. 6, pp. 671–687, 1996.
[10] J. F. Cardoso and B. Laheld, "Equivariant adaptive source separation," IEEE Trans. Signal Processing, vol. 43, pp. 3017–3029, Nov. 1996.
[11] H. H. Yang and S. I. Amari, "Adaptive on-line learning algorithms for blind separation—Maximum entropy and minimum mutual information," Neural Comput., vol. 9, pp. 1457–1482, 1997.
[12] L. Tong, G. H. Xu, and T. Kailath, "Blind identification and equalization based on second-order statistics: A time domain approach," IEEE Trans. Inform. Theory, vol. 40, pp. 340–349, Apr. 1994.
[13] O. Shalvi and E. Weinstein, "Super-exponential methods for blind deconvolution," IEEE Trans. Inform. Theory, vol. 39, pp. 504–519, Apr. 1993.
[14] Y. Inouye and K. Hirano, "Cumulant-based blind identification of linear multi-input multi-output systems driven by colored inputs," IEEE Trans. Signal Processing, vol. 45, pp. 1543–1552, June 1997.
[15] H. L. N. Thi and C. Jutten, "Blind source separation for convolutive mixtures," Signal Process., vol. 45, pp. 209–229, 1995.
[16] E. Moreau and O. Macchi, "High-order contrasts for self-adaptive source separation," Int. J. Adaptive Contr. Signal Process., vol. 10, pp. 19–46, 1996.
[17] K. L. Yeung and S. F. Yau, "A cumulant-based super-exponential algorithm for blind deconvolution of multi-input multi-output systems," Signal Process., vol. 67, pp. 141–162, 1998.
[18] M. I. Gurelli and C. L. Nikias, "EVAM: An eigenvector-based algorithm for multi-channel blind deconvolution of input colored signals," IEEE Trans. Signal Processing, vol. 43, pp. 134–149, Jan. 1995.
[19] P. Comon, C. Jutten, and J. Herault, "Blind separation of sources, Part II: Problems statement," Signal Process., vol. 24, pp. 11–20, 1991.
[20] B. C. Ihm and D. J. Park, "Blind separation of sources using higher-order cumulants," Signal Process., vol. 73, pp. 267–276, 1999.
[21] A. Cichocki, S. I. Amari, and R. Thawonmas, "Blind signal extraction using self-adaptive nonlinear Hebbian learning rule," in Proc. Int. Symp. Nonlinear Theory Appl., Kochi, Japan, 1996, pp. 377–380.
[22] R. Thawonmas and A. Cichocki, "Blind extraction of source signals with specified stochastic features," in Proc. IEEE Int. Conf. Acoust., Speech, Signal Process., vol. 4, 1997, pp. 3353–3356.
[23] A. Cichocki, R. Thawonmas, and S. Amari, "Sequential blind signal extraction in order specified by stochastic properties," Electron. Lett., vol. 33, no. 1, pp. 64–65, 1997.
[24] R. Thawonmas, A. Cichocki, and S. Amari, "A cascade neural network for blind extraction without spurious equilibria," IEICE Trans. Fund., vol. E81-A, no. 9, pp. 1–14, 1998.
[25] Z. Malouche and O. Macchi, "Adaptive unsupervised extraction of one component of a linear mixture with a single neuron," IEEE Trans. Neural Networks, vol. 9, pp. 123–135, Jan. 1998.


[26] A. Cichocki, "Robust neural networks with on-line learning for blind identification and blind separation," IEEE Trans. Circuits Syst. I, vol. 43, pp. 894–906, Nov. 1996.
[27] A. M. Kagan, J. V. Linnik, and C. R. Rao, Characterization Problems in Mathematical Statistics. New York: Wiley, 1973.
[28] A. Mansour and C. Jutten, "Fourth-order criteria for blind sources separation," IEEE Trans. Signal Processing, vol. 43, pp. 2022–2025, Aug. 1995.
[29] S. C. Douglas and A. Cichocki, "Neural networks for blind decorrelation of signals," IEEE Trans. Signal Processing, vol. 45, pp. 2829–2840, Nov. 1997.
[30] U. A. Lindgren and H. Broman, "Source separation using a criterion based on second-order statistics," IEEE Trans. Signal Processing, vol. 46, pp. 1837–1850, July 1998.
[31] Y. Q. Li, J. Wang, and J. M. Zurada, "Blind extraction of singularly mixed source signals," IEEE Trans. Neural Networks, vol. 11, pp. 1413–1422, Nov. 2000.

Yuanqing Li was born in Hunan Province, China, in 1966. He received the B.S. degree in applied mathematics from Wuhan University, Wuhan, China, in 1988, the M.S. degree in applied mathematics from South China Normal University, Guangzhou, in 1994, and the Ph.D. degree in control theory and applications from South China University of Technology, Guangzhou, in 1997. Presently, he is an Associate Professor with the Automatic Control Engineering Department, College of Electronic and Information Technology, South China University of Technology. His research interests include applied mathematics, singular systems, neural networks, and blind signal processing.

Jun Wang (S’89–M’90–SM’93) received the B.S. degree in electrical engineering, the M.S. degree in systems engineering from Dalian University of Technology, Dalian, China, and the Ph.D. degree in systems engineering from Case Western Reserve University, Cleveland, OH. He is currently a Professor of automation and computer-aided engineering at the Chinese University of Hong Kong. He is also a guest professor at South China University of Technology, Guangzhou, among other adjunct professorships. Prior to coming to Hong Kong, he was an associate professor at the University of North Dakota, Grand Forks. His current research interests include neural networks and their engineering applications. Prof. Wang is an Associate Editor of the IEEE TRANSACTIONS ON NEURAL NETWORKS and the IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS.