Blind Source Separation Based on Joint Diagonalization in R: The Packages JADE and BSSasymp
Miettinen, J., Nordhausen, K., & Taskinen, S. (2017). Blind Source Separation Based on Joint Diagonalization in R: The Packages JADE and BSSasymp. Journal of Statistical Software, 76(2), 1-31. https://doi.org/10.18637/jss.v076.i02
Published in: Journal of Statistical Software
Date: 2017
Copyright: © the Authors, 2017. This is an open access article under the terms of the Creative Commons Attribution 3.0 Unported License.
Blind source separation (BSS) is a well-known signal processing tool used to solve practical data analysis problems in various fields of science. In BSS, the observed data are assumed to be linear mixtures of latent variables, where both the mixing system and the distributions of the latent variables are unknown. The aim is to estimate an unmixing matrix that transforms the observed data back to the latent sources. In this paper we present the R packages JADE and BSSasymp. The package JADE offers several BSS methods that are based on joint diagonalization. The package BSSasymp contains functions for computing the asymptotic covariance matrices, as well as their data-based estimates, for most of the BSS estimators included in the package JADE. Several simulated and real datasets are used to illustrate the functions in these two packages.
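As a rough illustration of the workflow described in the abstract, the sketch below simulates three independent non-Gaussian sources, mixes them with a random matrix, and recovers them with the JADE package. The source densities, mixing matrix, and sample size are illustrative choices rather than the paper's own examples, and the closing BSSasymp call assumes the ASCOV_JADE_est() interface described in the paper.

## Minimal sketch (not taken from the paper): recover latent sources with JADE.
library(JADE)

set.seed(1)
n <- 1000
## Three independent, non-Gaussian sources (illustrative choices).
S <- cbind(rexp(n) - 1,
           runif(n, -sqrt(3), sqrt(3)),
           rt(n, df = 5))
A <- matrix(rnorm(9), nrow = 3)   # unknown mixing matrix
X <- S %*% t(A)                   # observed mixtures, one row per observation

res   <- JADE(X, n.comp = 3)      # unmixing matrix estimated via joint diagonalization
W.hat <- coef(res)                # estimated unmixing matrix
S.hat <- bss.components(res)      # estimated sources

## The minimum distance (MD) index compares W.hat %*% A to a rescaled
## permutation matrix; values close to 0 indicate good separation.
MD(W.hat, A)

## With BSSasymp installed, ASCOV_JADE_est() (as described in the paper) gives
## a data-based estimate of the asymptotic covariance matrix of the JADE
## unmixing matrix estimate; str() is used here only to inspect the result.
library(BSSasymp)
str(ASCOV_JADE_est(X))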
Publisher: Foundation for Open Access Statistics
ISSN: 1548-7660
Publication in research information system: https://converis.jyu.fi/converis/portal/detail/Publication/26487836
Related items
Showing items with similar title or keywords.
- On the usage of joint diagonalization in multivariate statistics
  Nordhausen, Klaus; Ruiz-Gazen, Anne (Elsevier, 2022) Scatter matrices generalize the covariance matrix and are useful in many multivariate data analysis methods, including well-known principal component analysis (PCA), which is based on the diagonalization of the covariance ...
- KernelICA: Kernel Independent Component Analysis
  Koesner, Christoph L.; Nordhausen, Klaus (CRAN - The Comprehensive R Archive Network, 2021) Implements the kernel independent component analysis (kernel ICA) method introduced by Bach and Jordan (2003). The incomplete Cholesky decomposition used in kernel ICA is provided as a separate function.
- Asymptotic and bootstrap tests for subspace dimension
  Nordhausen, Klaus; Oja, Hannu; Tyler, David E. (Elsevier, 2022) Many linear dimension reduction methods proposed in the literature can be formulated using an appropriate pair of scatter matrices. The eigen-decomposition of one scatter matrix with respect to another is then often used ...
- Snowball ICA: A Model Order Free Independent Component Analysis Strategy for Functional Magnetic Resonance Imaging Data
  Hu, Guoqiang; Waters, Abigail B.; Aslan, Serdar; Frederick, Blaise; Cong, Fengyu; Nickerson, Lisa D. (Frontiers Media, 2020) In independent component analysis (ICA), the selection of model order (i.e., number of components to be extracted) has crucial effects on functional magnetic resonance imaging (fMRI) brain network analysis. Model order ...
- Generation of stimulus features for analysis of FMRI during natural auditory experiences
  Tsatsishvili, Valeri; Cong, Fengyu; Ristaniemi, Tapani; Toiviainen, Petri; Alluri, Vinoo; Brattico, Elvira; Nandi, Asoke (IEEE, 2014) In contrast to block and event-related designs for fMRI experiments, it becomes much more difficult to extract events of interest in the complex continuous stimulus for finding corresponding blood-oxygen-level ...