dc.contributor.author | Aizenbud, Yariv | |
dc.date.accessioned | 2019-12-05T10:14:45Z | |
dc.date.available | 2019-12-05T10:14:45Z | |
dc.date.issued | 2019 | |
dc.identifier.isbn | 978-951-39-7965-2 | |
dc.identifier.uri | https://jyx.jyu.fi/handle/123456789/66657 | |
dc.description.abstract | The thesis focuses on problems related to the behavior of random variables in high-dimensional spaces. The main motivation is that many scientific challenges involve large amounts of high-dimensional data, and in many cases a small number of “hidden” parameters encode the “interesting” part of that data. The question is how to identify and extract these parameters. The thesis addresses two aspects of data analysis: numerical linear algebra and manifold learning.
Numerical linear algebra is a major component of data analysis. It includes matrix factorization algorithms such as SVD and LU. SVD is considered the single most important algorithm in numerical linear algebra; however, the computational complexity of classical SVD algorithms makes them impractical for huge datasets. One possible solution is to use low-rank methods, which exploit the fact that data often contain dependencies and redundancies. The data can therefore be well approximated by a low-rank representation and processed faster at a smaller size. This thesis presents low-rank SVD and LU approximation algorithms that trade accuracy for computational time, improving on the state-of-the-art algorithms for low-rank SVD and LU approximation (a generic randomized low-rank SVD template is sketched after this record). Since matrix factorization algorithms play a central role in almost any modern computation, this part of the thesis provides general tools for many modern big-data and data-analysis challenges.
Understanding high-dimensional data via manifold learning. Many data analysis problems are formulated in the language of manifold learning. A typical assumption is that the data lie on (or near) some unknown manifold embedded in a high-dimensional space, and the goal is to “understand” the structure of this manifold. The thesis presents two results on this subject. First, it establishes a connection between two of the most classical methods in manifold learning, PCA and least squares. Second, it presents a method for regression over manifolds, which interpolates a function defined on a manifold given only its values at sampled points, without knowing the manifold on which the function is defined (a simplified local-regression sketch is given after this record). The ability to solve regression problems over manifolds can yield new insights from complex sampled data.
Keywords: matrix decompositions, random projections, SVD, LU, manifold learning, regression over manifolds | en |
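To illustrate the low-rank idea referenced in the abstract, below is a minimal sketch of the generic randomized low-rank SVD template (in the spirit of randomized range-finder methods). It is not the specific algorithm of Articles I–II; the function name and parameters (oversample, n_iter) are illustrative assumptions.

```python
import numpy as np

def randomized_low_rank_svd(A, rank, oversample=10, n_iter=2, seed=0):
    """Generic randomized SVD sketch (illustrative, not Articles I-II):
    project A onto a random low-dimensional subspace, then factorize
    the small projected matrix."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = min(rank + oversample, min(m, n))
    Omega = rng.standard_normal((n, k))      # random test matrix
    Y = A @ Omega                            # sample the range of A
    for _ in range(n_iter):                  # power iterations sharpen the subspace
        Y, _ = np.linalg.qr(A @ (A.T @ Y))
    Q, _ = np.linalg.qr(Y)                   # orthonormal basis for the sampled range
    B = Q.T @ A                              # small k-by-n matrix; cheap to factorize
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank, :]

# Toy check: a 500x300 matrix of exact rank 5 is recovered almost exactly.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 300))
U, s, Vt = randomized_low_rank_svd(A, rank=5)
print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))  # ~1e-14
```

The trade-off described in the abstract is visible here: the expensive SVD is applied only to the small k-by-n matrix B, at the cost of an approximation error controlled by the oversampling and power-iteration parameters.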
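Similarly, a hedged, simplified sketch of one local step of regression over a manifold: Gaussian weights select nearby samples, a weighted local PCA estimates tangent coordinates, and a weighted least-squares fit predicts the function value. This is a stripped-down illustration of the moving least-squares idea of Article IV, not its actual procedure; the kernel choice and the bandwidth h are assumptions.

```python
import numpy as np

def local_manifold_regression(X, f, x0, dim, h):
    """Estimate f(x0) from samples (X[i], f[i]) lying near a dim-dimensional
    manifold (simplified illustration, not Article IV's method)."""
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / h ** 2)  # assumed Gaussian kernel
    mu = (w[:, None] * X).sum(axis=0) / w.sum()          # weighted local mean
    Xc = (X - mu) * np.sqrt(w)[:, None]
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)    # weighted local PCA
    T = Vt[:dim]                                         # approximate tangent basis
    t = (X - mu) @ T.T                                   # samples in local coordinates
    Phi = np.hstack([np.ones((len(X), 1)), t])           # affine model f(t) = c0 + c.t
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(sw * Phi, np.sqrt(w) * f, rcond=None)
    t0 = (x0 - mu) @ T.T                                 # query point in local coordinates
    return coef[0] + t0 @ coef[1:]

# Toy usage: a smooth function on a circle (a 1-D manifold embedded in the plane),
# interpolated from samples alone, without using the manifold explicitly.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])
f = np.sin(3 * theta)
print(local_manifold_regression(X, f, X[10], dim=1, h=0.3))  # ~ np.sin(3 * theta[10])
```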
dc.format.mimetype | application/pdf | |
dc.language.iso | eng | |
dc.publisher | Jyväskylän yliopisto | |
dc.relation.ispartofseries | JYU Dissertations | |
dc.relation.haspart | <b>Article I:</b> Shabat, G., Shmueli, Y., Aizenbud, Y., Averbuch, A. (2018). Randomized LU Decomposition. <i>Applied and Computational Harmonic Analysis, 44(2), 246-272.</i> <a href="https://doi.org/10.1016/j.acha.2016.04.006" target="_blank">DOI: 10.1016/j.acha.2016.04.006</a> | |
dc.relation.haspart | <b>Article II:</b> Aizenbud, Y., Averbuch, A. (2018). Matrix Decompositions Using sub-Gaussian Random Matrices. <i>Information and Inference: A Journal of the IMA, 8(3), 445-469.</i> <a href="https://doi.org/10.1093/imaiai/iay017" target="_blank">DOI: 10.1093/imaiai/iay017</a> | |
dc.relation.haspart | <b>Article III:</b> Aizenbud, Y., Sober, B. (2019). Approximating the Span of Principal Components via Iterative Least-Squares. <a href="https://arxiv.org/abs/1907.12159" target="_blank">arXiv:1907.12159</a> | |
dc.relation.haspart | <b>Article IV:</b> Sober, B., Aizenbud, Y., Levin, D. (2021). Approximation of functions over manifolds: A Moving Least-Squares approach. <i>Journal of Computational and Applied Mathematics, 383, 113140.</i> <a href="https://doi.org/10.1016/j.cam.2020.113140" target="_blank">DOI: 10.1016/j.cam.2020.113140</a> | |
dc.rights | In Copyright | |
dc.subject | tiedonlouhinta | |
dc.subject | koneoppiminen | |
dc.subject | algoritmit | |
dc.subject | matriisit | |
dc.subject | projektio | |
dc.subject | lineaarialgebra | |
dc.subject | monistot | |
dc.subject | regressioanalyysi | |
dc.subject | matrix decompositions | |
dc.subject | random projections | |
dc.subject | SVD | |
dc.subject | LU | |
dc.subject | manifold learning | |
dc.subject | regression over manifolds | |
dc.title | Random Projections for Matrix Decomposition and Manifold Learning | |
dc.type | Diss. | |
dc.identifier.urn | URN:ISBN:978-951-39-7965-2 | |
dc.contributor.tiedekunta | Faculty of Information Technology | en |
dc.contributor.tiedekunta | Informaatioteknologian tiedekunta | fi |
dc.contributor.yliopisto | University of Jyväskylä | en |
dc.contributor.yliopisto | Jyväskylän yliopisto | fi |
dc.relation.issn | 2489-9003 | |
dc.rights.copyright | © The Author & University of Jyväskylä | |
dc.rights.accesslevel | openAccess | |
dc.type.publication | doctoralThesis | |
dc.format.content | fulltext | |
dc.rights.url | https://rightsstatements.org/page/InC/1.0/ | |