Handbook of Blind Source Separation (eBook)

Independent Component Analysis and Applications
eBook download: PDF | EPUB
2010 | 1st edition
856 pages
Elsevier Science (publisher)
978-0-08-088494-3 (ISBN)
124.00 incl. VAT

Edited by the people who were forerunners in creating the field, together with contributions from 34 leading international experts, this handbook provides the definitive reference on Blind Source Separation, giving a broad and comprehensive description of all the core principles and methods, numerical algorithms, and major applications in the fields of telecommunications, biomedical engineering, and audio, acoustic and speech processing. Going beyond a machine-learning perspective, the book reflects recent results in signal processing and numerical analysis, and includes topics such as optimization criteria, mathematical tools, the design of numerical algorithms, convolutive mixtures, and time-frequency approaches. This handbook is an ideal reference for university researchers, R&D engineers and graduates wishing to learn the core principles, methods, algorithms, and applications of Blind Source Separation.


  • Covers the principles and major techniques and methods in one book
  • Edited by the pioneers in the field with contributions from 34 of the world's experts
  • Describes the main existing numerical algorithms and gives practical advice on their design
  • Covers the latest cutting-edge topics: second-order methods, algebraic identification of under-determined mixtures, time-frequency methods, Bayesian approaches, blind identification under non-negativity constraints, and semi-blind methods for communications
  • Shows the applications of the methods to key application areas such as telecommunications, biomedical engineering, speech, acoustic, audio and music processing, while also giving a general method for developing applications


Front cover 1
Half-title page 2
Title page 4
Copyright page 5
Contents 6
About the editors 20
Preface 22
Contributors 24
Chapter 1. Introduction 26
1.1. Genesis of blind source separation 26
1.2. Problem formalization 35
1.3. Source separation methods 36
1.4. Spatial whitening, noise reduction and PCA 38
1.5. Applications 40
1.6. Content of the handbook 40
References 44
Chapter 2. Information 48
2.1. Introduction 48
2.2. Methods based on mutual information 49
2.3. Methods based on mutual information rate 70
2.4. Conclusion and perspectives 86
References 87
Chapter 3. Contrasts 90
3.1. Introduction 90
3.2. Cumulants 92
3.3. MISO contrasts 94
3.4. MIMO contrasts for static mixtures 103
3.5. MIMO contrasts for dynamic mixtures 117
3.6. Constructing other contrast criteria 126
3.7. Conclusion 127
References 128
Chapter 4. Likelihood 132
4.1. Introduction: Models and likelihood 132
4.2. Transformation model and equivariance 134
4.3. Independence 141
4.4. Identifiability, stability, performance 147
4.5. Non-Gaussian models 156
4.6. Gaussian models 161
4.7. Noisy models 167
4.8. Conclusion: A general view 173
4.9. Appendix: Proofs 177
References 178
Chapter 5. Algebraic methods after prewhitening 180
5.1. Introduction 180
5.2. Independent component analysis 186
5.3. Diagonalization in least squares sense 190
5.4. Simultaneous diagonalization of matrix slices 195
5.5. Simultaneous diagonalization of third-order tensor slices 199
5.6. Maximization of the tensor trace 199
References 200
Chapter 6. Iterative algorithms 204
6.1. Introduction 204
6.2. Model and goal 205
6.3. Contrast functions for iterative BSS/ICA 206
6.4. Iterative search algorithms: Generalities 211
6.5. Iterative whitening 217
6.6. Classical adaptive algorithms 218
6.7. Relative (natural) gradient techniques 224
6.8. Adapting the nonlinearities 228
6.9. Iterative algorithms based on deflation 230
6.10. The FastICA algorithm 233
6.11. Iterative algorithms with optimal step size 241
6.12. Summary, conclusions and outlook 245
References 246
Chapter 7. Second-order methods based on color 252
7.1. Introduction 252
7.2. WSS processes 253
7.3. Problem formulation, identifiability and bounds 257
7.4. Separation based on joint diagonalization 270
7.5. Separation based on maximum likelihood 285
7.6. Additional issues 295
References 301
Chapter 8. Convolutive mixtures 306
8.1. Introduction and mixture model 306
8.2. Invertibility of convolutive MIMO mixtures 308
8.3. Assumptions 312
8.4. Joint separating methods 317
8.5. Iterative and deflation methods 326
8.6. Non-stationary context 334
References 347
Chapter 9. Algebraic identification of under-determined mixtures 350
9.1. Observation model 350
9.2. Intrinsic identifiability 351
9.3. Problem formulation 357
9.4. Higher-order tensors 362
9.5. Tensor-based algorithms 370
9.6. Appendix: expressions of complex cumulants 385
References 387
Chapter 10. Sparse component analysis 392
10.1. Introduction 392
10.2. Sparse signal representations 395
10.3. Joint sparse representation of mixtures 399
10.4. Estimating the mixing matrix by clustering 413
10.5. Square mixing matrix: Relative Newton method 421
10.6. Separation with a known mixing matrix 428
10.7. Conclusion 435
10.8. Outlook 437
References 439
Chapter 11. Quadratic time-frequency domain methods 446
11.1. Introduction 446
11.2. Problem statement 447
11.3. Spatial quadratic t-f spectra and representations 452
11.4. Time-frequency points selection 460
11.5. Separation algorithms 465
11.6. Practical and computer simulations 477
11.7. Summary and conclusion 487
References 489
Chapter 12. Bayesian approaches 492
12.1. Introduction 492
12.2. Source separation forward model and notations 493
12.3. General Bayesian scheme 495
12.4. Relation to PCA and ICA 496
12.5. Prior and likelihood assignments 502
12.6. Source modeling 507
12.7. Estimation schemes 518
12.8. Source separation applications 519
12.9. Source characterization 524
12.10. Conclusion 533
References 534
Chapter 13. Non-negative mixtures 540
13.1. Introduction 540
13.2. Non-negative matrix factorization 540
13.3. Extensions and modifications of NMF 546
13.4. Further non-negative algorithms 559
13.5. Applications 564
13.6. Conclusions 567
References 567
Chapter 14. Nonlinear mixtures 574
14.1. Introduction 574
14.2. Nonlinear ICA in the general case 575
14.3. ICA for constrained nonlinear mixtures 579
14.4. Priors on sources 592
14.5. Independence criteria 595
14.6. A Bayesian approach for general mixtures 600
14.7. Other methods and algorithms 605
14.8. A few applications 606
14.9. Conclusion 609
References 611
Chapter 15. Semi-blind methods for communications 618
15.1. Introduction 618
15.2. Training-based and blind equalization 620
15.3. Overcoming the limitations of blind methods 622
15.4. Mathematical formulation 624
15.5. Channel equalization criteria 626
15.6. Algebraic equalizers 629
15.7. Iterative equalizers 635
15.8. Performance analysis 641
15.9. Semi-blind channel estimation 653
15.10. Summary, conclusions and outlook 657
References 658
Chapter 16. Overview of source separation applications 664
16.1. Introduction 664
16.2. How to solve an actual source separation problem 667
16.3. Overfitting and robustness 670
16.4. Illustration with electromagnetic transmission systems 673
16.5. Example: Analysis of Mars hyperspectral images 683
16.6. Mono- vs multi-dimensional sources and mixtures 693
16.7. Using physical mixture models or not 697
16.8. Some conclusions and available tools 701
References 702
Chapter 17. Application to telecommunications 708
17.1. Introduction 708
17.2. Data model, statistics and problem formulation 712
17.3. Possible methods 721
17.4. Ultimate separators of instantaneous mixtures 737
17.5. Blind separators of instantaneous mixtures 741
17.6. Instantaneous approach versus convolutive approach: simulation results 751
17.7. Conclusion 754
References 755
Chapter 18. Biomedical applications 762
18.1. Introduction 762
18.2. One decade of ICA-based biomedical data processing 764
18.3. Numerical complexity of ICA algorithms 783
18.4. Performance analysis for biomedical signals 788
18.5. Conclusion 797
References 797
Chapter 19. Audio applications 804
19.1. Audio mixtures and separation objectives 804
19.2. Usable properties of audio sources 812
19.3. Audio applications of convolutive ICA 815
19.4. Audio applications of SCA 831
19.5. Conclusion 839
References 840
Glossary 846
Index 848

Chapter 2

Information


D.T. Pham

Publisher Summary


Blind source separation (BSS) aims at reconstructing the sources from the observations. In a blind context, the separation of sources can rely only on the basic knowledge that they are mutually independent. The mutual information is adopted as the independence criterion because it offers several benefits. First, it is invariant with respect to invertible transformations; in particular, it is scale invariant and thus avoids the prewhitening step needed in many other separation methods. Second, it is a very general and complete independence criterion: it is non-negative and vanishes if and only if the variables are independent. Some other criteria, such as the cumulants, are only partial: to ensure independence, one needs to check that all (cross-)cumulants vanish, but in practice only a finite number of them can be considered. Finally, the mutual information can be interpreted in terms of entropy and the Kullback-Leibler divergence, and is closely related to the expected log-likelihood. This chapter covers the use of the mutual information between the observations at a given time, which is suitable for instantaneous (linear) mixtures, in which the temporal dependence of the source sequences is ignored. It also covers the use of the information rate between stationary processes, which is necessary to treat the case of convolutive (linear) mixtures and can also be useful for instantaneous mixtures when there is strong temporal dependence of the source sequences.

Chapter Outline

Introduction

Methods based on mutual information

Methods based on mutual information rate

Conclusion and perspectives

2.1 Introduction


Blind source separation (BSS) deals typically with a mixing model of the form $x(\cdot) = \mathcal{A}\{s(\cdot)\}$, where $s(n)$ and $x(n)$ represent the source and observed vectors at time $n$ and $\mathcal{A}$ is a transformation, which can be instantaneous (operating on each $s(n)$ to produce $x(n)$) or global (operating on the whole sequence $s(\cdot)$ of source vectors). The goal of BSS is to reconstruct the sources from the observations. Clearly, for this task to be possible $\mathcal{A}$ should not be completely unknown: it should belong to some class of transformations given a priori. The most common classes are the class of linear (affine) instantaneous transformations and that of (linear) convolutions; they correspond to linear instantaneous mixtures and (linear) convolutive mixtures, respectively. Nonlinear transformations have also been considered. For example, $\mathcal{A}$ may consist of a linear instantaneous (or convolutive) transformation followed by a nonlinear instantaneous transformation operating component-wise; the corresponding mixtures are called post-nonlinear (or convolutive post-nonlinear). This chapter deals primarily with linear mixtures; nonlinear mixtures are treated elsewhere (see Chapter 14).
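As a concrete illustration of the instantaneous linear case, the following minimal sketch generates two independent non-Gaussian sources and mixes them with a fixed matrix. The source distributions, the matrix $A$ and all numerical values are illustrative assumptions, not taken from the chapter.

```python
# A minimal sketch of the instantaneous linear mixing model x(n) = A s(n).
# The sources (uniform and Laplacian) and the matrix A are illustrative
# choices, not taken from the chapter.
import numpy as np

rng = np.random.default_rng(0)
N = 10_000                                # number of time samples

# Two independent, non-Gaussian source sequences, stacked as rows of s.
s = np.vstack([rng.uniform(-1, 1, N),     # source 1: uniform
               rng.laplace(0, 1, N)])     # source 2: Laplacian

A = np.array([[1.0, 0.6],                 # instantaneous mixing matrix
              [0.4, 1.0]])

x = A @ s                                 # observations x(n) = A s(n)

# BSS: recover s from x without knowing A, i.e. find a demixing B such
# that the components of y = B x are mutually independent.
```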

In a blind context, the separation of sources can rely only on the basic knowledge that they are mutually independent. It is thus natural to try to achieve separation by minimizing an independence criterion between the components of $\mathcal{A}^{-1}\{x(\cdot)\}$ over all $\mathcal{A}$ in the given class, where $\mathcal{A}^{-1}$ denotes the inverse transformation of $\mathcal{A}$. We adopt here the mutual information as the independence criterion. This is a popular criterion with many appeals. Firstly, it is invariant with respect to invertible transformations (see Lemma 2.1 below and what follows); in particular, it is scale invariant and thus avoids the prewhitening step needed in many other separation methods. Secondly, it is a very general and complete independence criterion: it is non-negative and vanishes if and only if there is independence. Some other criteria, such as the cumulants, are only partial: to ensure independence, one needs to check that all (cross-)cumulants vanish, but in practice only a finite number of them can be considered. Finally, the mutual information can be interpreted in terms of entropy and the Kullback-Leibler divergence [3], and is closely related to the expected log-likelihood [1]. The downside is that this criterion requires knowledge of the joint density of the components of $\mathcal{A}^{-1}\{x(\cdot)\}$, which is unknown in practice and hence must be replaced by some nonparametric estimate. The estimation procedure can be computationally quite costly. We shall, however, introduce some methods that are not much costlier than using simpler criteria (such as the cumulants).

The rest of this chapter contains two parts. The first concerns the use of the mutual information between the observations at a given time, which is suitable for instantaneous (linear) mixtures, in which the temporal dependence of the source sequences is ignored (for simplicity, or because it is weak). The second concerns the use of the information rate between stationary processes, which is necessary to treat the case of convolutive (linear) mixtures, but can also be useful for instantaneous mixtures when the temporal dependence of the source sequences is strong. Note that for the convolutive mixture the sources can be recovered only up to a filtering (as will be seen later), and one may require temporal independence of the sources to lift this ambiguity. The problem may then be viewed as the multi-channel blind deconvolution problem, as it reduces to the well-known deconvolution problem when both the source and observed sequences are scalar. We nevertheless call this problem blind separation-deconvolution, as it aims both to recover the sources (separation) and to make them temporally independent (deconvolution).
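For the convolutive case, the sketch below builds a small $2 \times 2$ FIR mixture, in which each observation is a sum of filtered sources, $x_i(n) = \sum_j (a_{ij} * s_j)(n)$. The filter taps and source distributions are again illustrative assumptions, not taken from the chapter.

```python
# A minimal sketch of a 2x2 convolutive (FIR) mixture. The impulse
# responses a[i, j] are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(2)
N = 1_000
s = rng.laplace(size=(2, N))              # two i.i.d. source sequences

# FIR mixing filters a[i, j] of length 3 (impulse responses).
a = np.array([[[1.0, 0.5, 0.2], [0.3, 0.2, 0.1]],
              [[0.4, 0.1, 0.0], [1.0, 0.6, 0.3]]])

x = np.zeros((2, N))
for i in range(2):
    for j in range(2):
        x[i] += np.convolve(s[j], a[i, j], mode="full")[:N]

# Blind separation-deconvolution: find filters making the outputs both
# mutually and temporally independent; each source is then recovered
# only up to a filtering.
```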

2.2 Methods based on mutual information


We first define mutual information and provide its main properties.

2.2.1 Mutual information between random vectors


Let $y_1, \dots, y_P$ be random vectors with joint density $p_{y_1,\dots,y_P}$ and marginal densities $p_{y_1}, \dots, p_{y_P}$. The mutual information between these vectors is defined as the Kullback-Leibler divergence (or relative entropy) between the densities $\prod_{k=1}^P p_{y_k}$ and $p_{y_1,\dots,y_P}$:

$$I\{y_1,\dots,y_P\} = -E\log\frac{p_{y_1}(y_1)\cdots p_{y_P}(y_P)}{p_{y_1,\dots,y_P}(y_1,\dots,y_P)}.$$

This measure is non-negative (but can be infinite) and can vanish only if the random vectors are mutually independent [4]. It can also be written as:

$$I\{y_1,\dots,y_P\} = \sum_{k=1}^P H\{y_k\} - H\{y_1,\dots,y_P\} \tag{2.1}$$

where $H\{y_1,\dots,y_P\}$ and $H\{y_1\},\dots,H\{y_P\}$ are the joint entropy and the marginal entropies of $y_1,\dots,y_P$:

$$H\{y_1,\dots,y_P\} = -E\log p_{y_1,\dots,y_P}(y_1,\dots,y_P) \tag{2.2}$$

and $H\{y_k\}$ is defined in the same way, with $y_k$ in place of $y_1,\dots,y_P$. The notation $H\{y_1,\dots,y_P\}$ is the same as $H\{[y_1^T \cdots y_P^T]^T\}$, $^T$ denoting the transpose.
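To make the decomposition (2.1) concrete, the following sketch estimates the mutual information between two scalar variables by plugging histogram-based entropy estimates into (2.1). This is a rough illustrative estimator with its own bias; the methods of this chapter rely on more careful nonparametric density estimates.

```python
# Plug-in estimate of I{y1, y2} = H{y1} + H{y2} - H{y1, y2} (eq. 2.1),
# with all entropies estimated by histograms. Illustrative only.
import numpy as np

def entropy_hist(y, bins=50):
    """Histogram plug-in estimate of the differential entropy H{y}."""
    counts, edges = np.histogram(y, bins=bins)
    p = counts / counts.sum()                  # cell probabilities
    w = np.diff(edges)                         # cell widths
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz] / w[nz]))   # density ~ p / w

def mutual_info_hist(y1, y2, bins=50):
    """Estimate I{y1, y2} via the entropy decomposition (2.1)."""
    counts, ex, ey = np.histogram2d(y1, y2, bins=bins)
    p = counts / counts.sum()
    area = np.outer(np.diff(ex), np.diff(ey))  # cell areas
    nz = p > 0
    H12 = -np.sum(p[nz] * np.log(p[nz] / area[nz]))
    return entropy_hist(y1, bins) + entropy_hist(y2, bins) - H12

rng = np.random.default_rng(1)
s = rng.laplace(size=(2, 50_000))              # independent sources
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s     # mixed observations
print(mutual_info_hist(s[0], s[1]))            # close to 0: independent
print(mutual_info_hist(x[0], x[1]))            # clearly positive: dependent
```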

The entropy possesses the following interesting property of equivariance with respect to invertible transformations.

Lemma 2.1

Let $x$ be a random vector and $y = g(x)$, where $g$ is a differentiable invertible transformation with Jacobian (matrix of partial derivatives) $g'$. Then:

$$H\{y\} = H\{x\} + E\log|\det g'(x)|.$$

This result can be easily obtained from the definition (2.2) of entropy and the equality $p_x(x) = p_y[g(x)]\,|\det g'(x)|$. It follows immediately from (2.1) that if $g$ is a transformation operating component-wise (that is, the $k$-th component of $g(x)$ depends only on the $k$-th component of $x$), then the mutual information between the components of the transformed random vector $g(x)$ is the same as that between the components of the original random vector $x$. Thus the mutual information between the random variables $y_1,\dots,y_P$ remains unchanged when one applies an invertible transformation to each of them. Clearly, it is also unchanged if one permutes the random variables. This is consistent with the fact that independent random variables remain independent under the above operations. As a result, one can separate the sources from their linear mixtures only up to a scaling and a permutation (by exploiting their independence only).
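Lemma 2.1 can be verified numerically in the linear Gaussian case, where entropies are available in closed form: for $y = Bx$ one has $g'(x) = B$, so $E\log|\det g'(x)| = \log|\det B|$. The covariance and the matrix $B$ below are arbitrary illustrative choices.

```python
# Check of Lemma 2.1 for a linear map y = B x, using the closed-form
# entropy of a Gaussian vector, H{x} = 0.5 * log((2*pi*e)^d * det(Sigma)).
# Sigma and B are arbitrary illustrative choices.
import numpy as np

def gaussian_entropy(cov):
    d = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(cov))

Sigma = np.array([[2.0, 0.3],
                  [0.3, 1.0]])                 # covariance of x
B = np.array([[1.0, 0.5],
              [-0.2, 0.8]])                    # invertible linear map

H_x = gaussian_entropy(Sigma)
H_y = gaussian_entropy(B @ Sigma @ B.T)        # covariance of y = B x

# Lemma 2.1 predicts H{y} = H{x} + log|det B|; the two values agree.
print(H_y, H_x + np.log(abs(np.linalg.det(B))))
```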

The application of Lemma 2.1 is most interesting in the case of the linear (affine) mixture, since the expectation calculation can then be dispensed with: if...

Publication date (per publisher): 17 February 2010
Language: English
ISBN-10: 0-08-088494-6 / 0080884946
ISBN-13: 978-0-08-088494-3 / 9780080884943
PDF (Adobe DRM), size: 20.9 MB

EPUB (Adobe DRM), size: 49.4 MB

