Xiangyu Kong, Changhua Hu and Zhansheng Duan - Principal Component Analysis Networks and Algorithms

Summary

Contents: Introduction; Eigenvalue and singular value decomposition; Principal component analysis neural networks; Minor component analysis neural networks; Dual-purpose methods for principal and minor component analysis; Deterministic discrete-time system for PCA or MCA methods; Generalized feature extraction method; Coupled principal component analysis; Singular feature extraction neural networks.

This book not only provides a comprehensive introduction to neural-based PCA methods in control science, but also presents many novel PCA algorithms and their extensions and generalizations, e.g., dual-purpose algorithms, coupled PCA, GED, and neural-based SVD algorithms. It also discusses in detail various methods for analyzing the convergence, stability, and self-stabilizing properties of these algorithms, and introduces the deterministic discrete-time system method for analyzing the convergence of PCA/MCA algorithms. Readers should be familiar with numerical analysis and the fundamentals of statistics, such as the basics of least squares and stochastic algorithms. Although the book focuses on neural networks, it presents only their learning laws, which are simply iterative algorithms; no a priori knowledge of neural networks is therefore required. The book will be of interest, and serve as a reference, to researchers and students in applied mathematics, statistics, engineering, and other related fields.

© Science Press, Beijing and Springer Nature Singapore Pte Ltd. 2017
Xiangyu Kong, Changhua Hu and Zhansheng Duan, Principal Component Analysis Networks and Algorithms, DOI 10.1007/978-981-10-2915-8_1
1. Introduction
Xiangyu Kong (1) (Corresponding author), Changhua Hu (2) and Zhansheng Duan (3)
(1) Department of Control Engineering, Xi'an Institute of Hi-tech, Xi'an, China
(2) Department of Control Engineering, Xi'an Institute of Hi-tech, Xi'an, China
(3) Center for Information Engineering Science Research, Xi'an Jiaotong University, Xi'an, Shaanxi, China
1.1 Feature Extraction
Pattern recognition and data compression are two applications that rely critically on efficient data representation []. In these applications, it is desirable to extract measurements that are invariant or insensitive to variations within each class. The process of extracting such measurements is called feature extraction. In other words, feature extraction is a data-processing operation that maps a high-dimensional space to a low-dimensional space with minimal information loss.
Principal component analysis (PCA) is a well-known feature extraction method, while minor component analysis (MCA) and independent component analysis (ICA) can be regarded as variants or generalizations of PCA. MCA is most useful for solving total least squares (TLS) problems, and ICA is usually used for blind signal separation (BSS).
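For concreteness, the following minimal NumPy sketch (our own illustration, not code from the book; all names and data are assumptions) performs PCA-based feature extraction, mapping n-dimensional samples to an r-dimensional feature space:

```python
import numpy as np

def pca_features(X, r):
    """Project the rows of X onto the r principal directions,
    i.e., map an n-dimensional space to an r-dimensional one."""
    Xc = X - X.mean(axis=0)           # center the data
    C = Xc.T @ Xc / len(Xc)           # sample covariance matrix
    vals, vecs = np.linalg.eigh(C)    # eigenvalues in ascending order
    W = vecs[:, ::-1][:, :r]          # eigenvectors of the r largest eigenvalues
    return Xc @ W                     # low-dimensional features

# Toy usage: compress 10-dimensional samples to 2 features
X = np.random.randn(500, 10)
Y = pca_features(X, 2)
print(Y.shape)                        # (500, 2)
```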
In the following, we briefly review PCA, PCA neural networks, and extensions or generalizations of PCA.
1.1.1 PCA and Subspace Tracking
The principal components (PCs) are the directions in which the data have the largest variances; they capture most of the information content of the data. They correspond to the eigenvectors associated with the largest eigenvalues of the autocorrelation matrix of the data vectors. Expressing data vectors in terms of the PCs is called PCA. Conversely, the eigenvectors associated with the smallest eigenvalues of the autocorrelation matrix are defined as the minor components (MCs), and the MCs are the directions in which the data have the smallest variances (they represent the noise in the data). Expressing data vectors in terms of the MCs is called MCA. PCA has been successfully applied to many data processing problems, such as high-resolution spectral estimation, system identification, image compression, and pattern recognition, while MCA has been applied to total least squares, moving target indication, clutter cancelation, curve and surface fitting, digital beamforming, and frequency estimation.
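As a quick illustration (our own example, not the book's), the batch route to the PC and MC is an eigendecomposition of the sample autocorrelation matrix:

```python
import numpy as np

# Synthetic data whose variance differs along each axis
X = np.random.randn(1000, 5) * np.array([3.0, 2.0, 1.0, 0.5, 0.1])
R = X.T @ X / len(X)                  # sample autocorrelation matrix
vals, vecs = np.linalg.eigh(R)        # eigenvalues in ascending order
pc = vecs[:, -1]                      # principal component (largest variance)
mc = vecs[:, 0]                       # minor component (smallest variance, "noise")
print(vals[-1], vals[0])              # largest vs. smallest data variance
```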
PCA or MCA as defined above is one-dimensional, but in real applications it is usually multi-dimensional. The eigenvectors associated with the r largest (or smallest) eigenvalues of the autocorrelation matrix of the data vectors are called the principal (or minor) components, and r is referred to as the number of principal (or minor) components. The eigenvector associated with the largest (or smallest) eigenvalue is called the largest (or smallest) component. The subspace spanned by the principal components is called the principal subspace (PS), and the subspace spanned by the minor components is called the minor subspace (MS). In some applications, we are only required to find the PS (or MS) spanned by r orthonormal eigenvectors. The PS is sometimes called the signal subspace and the MS the noise subspace. Principal and minor component analyzers of a symmetric matrix are matrix differential equations that converge to the PCs and MCs, respectively. Similarly, principal (PSA) and minor (MSA) subspace analyzers of a symmetric matrix are matrix differential equations that converge to a matrix whose column span is the PS or MS, respectively. PCA/PSA and MCA/MSA are powerful techniques in many information processing fields. For example, PCA/PSA is a useful tool in feature extraction, data compression, pattern recognition, and time series prediction [].
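Continuing the sketch above (again our own illustration), the r-dimensional PS and MS come from the same eigendecomposition, and their bases are orthonormal:

```python
import numpy as np

r = 2
X = np.random.randn(1000, 6) * np.array([3.0, 2.5, 2.0, 0.5, 0.2, 0.1])
R = X.T @ X / len(X)                  # sample autocorrelation matrix
vals, vecs = np.linalg.eigh(R)        # ascending eigenvalues
MS = vecs[:, :r]                      # orthonormal basis of the minor (noise) subspace
PS = vecs[:, -r:]                     # orthonormal basis of the principal (signal) subspace
print(np.allclose(PS.T @ PS, np.eye(r)))   # True: orthonormal columns
print(np.allclose(PS.T @ MS, 0))           # True: PS is orthogonal to MS
```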
As discussed above, the PC is the direction corresponding to the eigenvector associated with the largest eigenvalue of the autocorrelation matrix of the data vectors, and the MC is the direction corresponding to the eigenvector associated with the smallest eigenvalue. Thus, these techniques can be implemented via batch eigenvalue decomposition (ED) of the sample correlation matrix or singular value decomposition (SVD) of the data matrix. This approach is unsuitable for adaptive processing because it requires repeated ED/SVD, which is a very time-consuming task []. For this reason, attempts to develop adaptive algorithms continue even though the field has now been active for three decades.
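The equivalence of the two batch routes is easy to verify numerically; in the sketch below (our own illustration), the right singular vectors of the data matrix coincide, up to sign, with the eigenvectors of the sample correlation matrix:

```python
import numpy as np

X = np.random.randn(200, 4)                        # rows are data vectors
R = X.T @ X / len(X)                               # sample correlation matrix
evals, evecs = np.linalg.eigh(R)                   # batch ED, ascending eigenvalues
U, s, Vt = np.linalg.svd(X, full_matrices=False)   # batch SVD, descending singular values
print(np.allclose(s[::-1] ** 2 / len(X), evals))   # eigenvalues equal s^2 / N
print(np.allclose(np.abs(Vt[::-1] @ evecs), np.eye(4), atol=1e-8))  # same directions up to sign
```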
1.1.2 PCA Neural Networks
In order to overcome the difficulty faced by ED or SVD, a number of adaptive algorithms for subspace tracking were developed in the past. Most of these techniques can be grouped into three classes []; algorithms in each class have been proposed to track the signal or noise subspace.
Neural network approaches to PCA or MCA pursue an effective online scheme that updates the eigen direction after each presentation of a data point. They possess many obvious advantages over traditional algebraic approaches such as SVD, e.g., lower computational complexity. Neural network methods are especially suited to high-dimensional data, since the computation of the large covariance matrix can be avoided, and to the tracking of nonstationary data, where the covariance matrix changes slowly over time. Attempts to improve these methods and to suggest new approaches continue even though the field has now been active for two decades.
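The classic example of such an online update is Oja's learning rule for the first principal component. The sketch below (step size, data, and seed are our own choices) touches each sample once and never forms the covariance matrix:

```python
import numpy as np

def oja_pc(samples, eta=0.01):
    """Oja's rule: w <- w + eta * y * (x - y * w), where y = w^T x.
    The weight vector converges to the principal eigenvector."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal(samples.shape[1])
    w /= np.linalg.norm(w)
    for x in samples:
        y = w @ x                      # neuron output
        w += eta * y * (x - y * w)     # Hebbian term with built-in normalization
    return w / np.linalg.norm(w)

# Variance is largest along the first axis, so w -> ±[1, 0, 0]
X = np.random.randn(5000, 3) * np.array([3.0, 1.0, 0.5])
print(oja_pc(X))
```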
In recent decades, many neural network learning algorithms have been proposed to extract the PS []. These gradient-type algorithms can be claimed to be globally convergent.
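One well-known member of this gradient-type family is Oja's subspace rule, which tracks an orthonormal basis of the r-dimensional PS rather than individual eigenvectors. A minimal sketch under our own parameter choices:

```python
import numpy as np

def oja_subspace(samples, r, eta=0.005):
    """Oja's subspace rule: W <- W + eta * (x y^T - W y y^T), y = W^T x.
    The columns of W converge to an orthonormal basis of the PS."""
    rng = np.random.default_rng(1)
    W, _ = np.linalg.qr(rng.standard_normal((samples.shape[1], r)))
    for x in samples:
        y = W.T @ x
        W += eta * (np.outer(x, y) - W @ np.outer(y, y))
    return W

X = np.random.randn(10000, 4) * np.array([3.0, 2.0, 0.5, 0.1])
W = oja_subspace(X, 2)
print(W.T @ W)        # ≈ identity: the basis stays near-orthonormal
```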
In the class of MS tracking, many algorithms have been proposed [].
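MS tracking is more delicate than PS tracking: naively flipping the sign of Oja's rule makes the weight vector diverge, which is why many MS algorithms add stabilizing terms. Purely as an illustrative sketch (not one of the book's algorithms), an anti-Hebbian update kept stable by explicit renormalization:

```python
import numpy as np

def mca_sketch(samples, eta=0.001):
    """Sign-flipped Oja step followed by renormalization; a toy
    stabilized estimator of the minor component."""
    rng = np.random.default_rng(2)
    w = rng.standard_normal(samples.shape[1])
    w /= np.linalg.norm(w)
    for x in samples:
        y = w @ x
        w -= eta * y * (x - y * w)     # anti-Hebbian (descent) step
        w /= np.linalg.norm(w)         # keep w on the unit sphere
    return w

# Smallest variance lies along the last axis, so w -> ±[0, 0, 1]
X = np.random.randn(20000, 3) * np.array([3.0, 1.0, 0.2])
print(mca_sketch(X))
```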
1.1.3 Extension or Generalization of PCA
The above-mentioned algorithms focus only on eigenvector extraction or eigen-subspace tracking with noncoupled rules. However, a serious speed-stability problem exists in most noncoupled rules [].
It is well known that the generalized eigen decomposition (GED) plays a very important role in various signal processing applications, e.g., data compression, feature extraction, denoising, antenna array processing, and classification. Although PCA, which is a special case of the GED problem, has been widely studied, adaptive algorithms for the GED problem are scarce. Fortunately, a few efficient online adaptive algorithms for the GED problem that can be applied in real-time applications have been proposed [].
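In batch form, the GED problem R1 w = λ R2 w can be solved directly with a generalized symmetric eigensolver; the adaptive algorithms cited above exist precisely to avoid repeating this computation online. A quick SciPy sketch (matrix names and sizes are our own assumptions):

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(3)
M = rng.standard_normal((6, 6))
R1 = M @ M.T                       # e.g., a signal correlation matrix (symmetric)
P = rng.standard_normal((6, 6))
R2 = P @ P.T + 6 * np.eye(6)       # e.g., a noise correlation matrix (positive definite)
vals, vecs = eigh(R1, R2)          # generalized eigenpairs, ascending eigenvalues
w = vecs[:, -1]                    # maximizes the Rayleigh quotient w'R1w / w'R2w
print(np.allclose(R1 @ w, vals[-1] * (R2 @ w)))   # True: R1 w = lambda R2 w
```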
Other extensions of PCA also include dual-purpose algorithms [].
1.2 Basis for Subspace Tracking
In Sect. 1.1, we reviewed the PCA algorithm and its extensions and generalizations from the viewpoint of feature extraction. In this section, from the viewpoint of subspaces, we discuss the concept of a subspace and subspace tracking methods.
1.2.1 Concept of Subspace
Definition 1
If $\boldsymbol{x}_1, \boldsymbol{x}_2, \ldots, \boldsymbol{x}_m$ is a vector subset of the vector space $V$, then the set $W$ of all linear combinations of $\boldsymbol{x}_1, \boldsymbol{x}_2, \ldots, \boldsymbol{x}_m$ is called the subspace of $V$ spanned by $\boldsymbol{x}_1, \ldots, \boldsymbol{x}_m$.
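In symbols (our notation), Definition 1 says that

```latex
W \;=\; \operatorname{span}\{\boldsymbol{x}_1,\ldots,\boldsymbol{x}_m\}
  \;=\; \Bigl\{\, \textstyle\sum_{i=1}^{m} \alpha_i \boldsymbol{x}_i \;:\; \alpha_1,\ldots,\alpha_m \in \mathbb{R} \,\Bigr\}.
```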