Search results: records 1-15 of 31 matching "low rank" in the knowledge base (query time: 0.093 s)
In this work, we seek to extend the capabilities of the “core obfuscator” from the work of Garg, Gentry, Halevi, Raykova, Sahai, and Waters (FOCS 2013), and all subsequent works constructing general-p...
Learning Low Rank Matrices from O(n) Entries
random matrices, matrix reconstruction, stochastic matrices
2015/8/21
How many random entries of an n × nα, rank r matrix are necessary to reconstruct the matrix within an accuracy δ? We address this question in the case of a random matrix with bounded rank, whereby the...
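A minimal NumPy sketch of the kind of spectral reconstruction this question concerns: zero-fill the observed entries, rescale by the inverse sampling rate, and keep a rank-r truncated SVD. The sizes and sampling rate are invented, and any trimming or refinement steps are omitted; this illustrates the general idea, not the paper's algorithm.

    # Spectral sketch of low-rank matrix completion from a few random entries.
    import numpy as np

    rng = np.random.default_rng(0)
    n, r, p = 200, 3, 0.1                        # size, rank, sampling probability

    # ground-truth rank-r matrix; each entry observed independently with probability p
    M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
    mask = rng.random((n, n)) < p

    # zero-fill missing entries and rescale by 1/p so the expectation matches M
    M_obs = np.where(mask, M, 0.0) / p

    # rank-r truncated SVD of the rescaled observations
    U, s, Vt = np.linalg.svd(M_obs, full_matrices=False)
    M_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]

    rel_err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
    print(f"relative error of the spectral estimate: {rel_err:.3f}")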
Principal components analysis (PCA) is a well-known technique for approximating a data set represented by a matrix by a low rank matrix. Here, we extend the idea of PCA to handle arbitrary data sets c...
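For reference, the classical construction this abstract starts from, sketched in NumPy: by the Eckart-Young theorem, the truncated SVD of the centered data matrix gives the best rank-k approximation. The data sizes are arbitrary, and the extension to arbitrary data sets described in the abstract is not reproduced here.

    # Classical PCA as a low-rank approximation: truncated SVD of the centered data.
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 20))          # 100 samples, 20 features
    k = 5                                       # target rank / number of components

    Xc = X - X.mean(axis=0)                     # center the columns
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    scores = U[:, :k] * s[:k]                   # principal component scores
    components = Vt[:k]                         # principal directions
    X_lowrank = scores @ components             # best rank-k approximation of Xc

    print("approximation error:", np.linalg.norm(Xc - X_lowrank))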
Tight Oracle Bounds for Low-rank Matrix Recovery from a Minimal Number of Random Measurements
Matrix completion, the Dantzig selector, oracle inequalities, norm of random matrices, convex optimization and semidefinite programming
2015/6/17
This paper presents several novel theoretical results regarding the recovery of a low-rank matrix from just a few measurements consisting of linear combinations of the matrix entries. We show that pro...
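A common estimator in this setting is nuclear-norm penalized least squares; the toy NumPy loop below solves it by proximal gradient (singular-value thresholding) for a Gaussian measurement operator. The problem sizes, regularization weight, and iteration count are made-up illustrations, not the paper's choices.

    # Recover a low-rank matrix from a few random linear measurements via
    # nuclear-norm penalized least squares, solved by proximal gradient.
    import numpy as np

    rng = np.random.default_rng(2)
    n, r, m = 30, 2, 500                                # matrix size, rank, #measurements

    X_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
    A = rng.standard_normal((m, n * n)) / np.sqrt(m)    # Gaussian measurement operator
    b = A @ X_true.ravel()

    lam = 0.05
    step = 1.0 / np.linalg.norm(A, 2) ** 2              # step <= 1 / ||A||_2^2
    X = np.zeros((n, n))
    for _ in range(500):
        grad = (A.T @ (A @ X.ravel() - b)).reshape(n, n)
        Y = X - step * grad
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = (U * np.maximum(s - step * lam, 0.0)) @ Vt  # soft-threshold singular values

    print("relative error:", np.linalg.norm(X - X_true) / np.linalg.norm(X_true))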
Dense Error Correction for Low-Rank Matrices via Principal Component Pursuit
Dense Error Correction, Low-Rank Matrices, Principal Component Pursuit
2015/6/17
We consider the problem of recovering a low-rank matrix when some of its entries, whose locations are not known a priori, are corrupted by errors of arbitrarily large magnitude. It has recently been sh...
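Principal Component Pursuit itself is the convex program min ||L||_* + lam*||S||_1 subject to L + S = M; a bare-bones ADMM solver in NumPy, with a standard choice of lam and an arbitrary synthetic example, might look as follows. This is a generic sketch of the method, not the authors' code.

    # Principal Component Pursuit: split M into low-rank + sparse parts by ADMM.
    import numpy as np

    def svt(X, tau):
        """Singular-value soft-thresholding."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        return (U * np.maximum(s - tau, 0.0)) @ Vt

    def shrink(X, tau):
        """Entrywise soft-thresholding."""
        return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

    rng = np.random.default_rng(3)
    n, r = 100, 2
    L_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
    S_true = 10.0 * rng.standard_normal((n, n)) * (rng.random((n, n)) < 0.05)
    M = L_true + S_true                       # observed matrix with gross corruptions

    lam = 1.0 / np.sqrt(n)                    # standard trade-off weight
    mu = 0.25 * n * n / np.abs(M).sum()       # common heuristic for the ADMM penalty
    S = np.zeros((n, n))
    Y = np.zeros((n, n))
    for _ in range(200):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        Y = Y + mu * (M - L - S)

    print("low-rank error:", np.linalg.norm(L - L_true) / np.linalg.norm(L_true))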
Low-Rank and Sparse Matrix Decomposition for Accelerated Dynamic MRI with Separation of Background and Dynamic Components
compressed sensing, low-rank matrix completion, sparsity, dynamic MRI
2015/6/17
Purpose: To apply the low-rank plus sparse (L+S) matrix decomposition model to reconstruct undersampled dynamic MRI as a superposition of background and dynamic components in various problems of clini...
Randomized Algorithms for Low-Rank Matrix Factorizations: Sharp Performance Bounds
Randomized Algorithms, Low-Rank Matrix Factorizations, Sharp Performance Bounds
2015/6/17
The development of randomized algorithms for numerical linear algebra, e.g. for computing approximate QR and SVD factorizations, has recently become an intense area of research. This paper studies one...
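The basic randomized factorization studied in this line of work can be sketched in a few lines of NumPy: multiply by a Gaussian test matrix, orthonormalize the result, and run a small deterministic SVD on the projected matrix. The oversampling amount and target rank below are arbitrary illustrative choices.

    # Randomized low-rank factorization: Gaussian range sketch + small SVD.
    import numpy as np

    def randomized_svd(A, k, p=10, rng=None):
        rng = rng or np.random.default_rng()
        m, n = A.shape
        Omega = rng.standard_normal((n, k + p))     # random test matrix
        Q, _ = np.linalg.qr(A @ Omega)              # orthonormal basis for sampled range
        B = Q.T @ A                                 # small (k+p) x n matrix
        Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
        return (Q @ Ub)[:, :k], s[:k], Vt[:k]

    rng = np.random.default_rng(4)
    A = rng.standard_normal((2000, 50)) @ rng.standard_normal((50, 500))  # rank-50
    U, s, Vt = randomized_svd(A, k=50, rng=rng)
    print("error:", np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))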
Extracting Deep Neural Network Bottleneck Features using Low-Rank Matrix Factorization
DNN, bottleneck features
2014/11/27
In this paper, we investigate the use of deep neural networks (DNNs) to generate a stacked bottleneck (SBN) feature representation for low-resource speech recognition. We examine different SBN extract...
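One way to read "bottleneck via low-rank matrix factorization" is to replace a large weight matrix by the product of two thin matrices obtained from a truncated SVD, with the intermediate output acting as the bottleneck feature. The NumPy sketch below shows that structure and the resulting parameter reduction; the layer sizes and the SVD-based factorization are illustrative assumptions, not necessarily the recipe used in the paper.

    # Low-rank factorization of a large weight matrix into a linear bottleneck.
    import numpy as np

    rng = np.random.default_rng(5)
    h, o, k = 1024, 3000, 64                    # hidden size, output size, bottleneck size

    W = rng.standard_normal((h, o)) * 0.01      # stand-in for a trained weight matrix
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    W1 = U[:, :k] * s[:k]                       # h x k: produces the bottleneck features
    W2 = Vt[:k]                                 # k x o: maps the bottleneck to outputs

    x = rng.standard_normal(h)                  # one hidden-layer activation vector
    bottleneck = x @ W1                         # k-dimensional bottleneck feature
    approx_output = bottleneck @ W2
    print("params before:", W.size, "after:", W1.size + W2.size)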
Accurate scoring of syntactic structures such as head-modifier arcs in dependency parsing typically requires rich, high-dimensional feature representations.
Parallel Gaussian Process Regression with Low-Rank Covariance Matrix Approximations
Parallel Gaussian Process Regression, Low-Rank Covariance Matrix Approximations
2013/6/14
Gaussian processes (GP) are Bayesian non-parametric models that are widely used for probabilistic regression. Unfortunately, they cannot scale well to large data nor perform real-time predictions due ...
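The generic low-rank remedy is to route the covariance through m inducing inputs so that an m x m system replaces the n x n one, cutting the cost from cubic in n to O(n m^2). The subset-of-regressors-style NumPy sketch below (with an invented 1-D dataset and RBF kernel) illustrates only that cost reduction; it is not the parallel scheme proposed in the paper.

    # GP regression with a low-rank covariance approximation via inducing inputs.
    import numpy as np

    def rbf(A, B, ell=1.0):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ell**2)

    rng = np.random.default_rng(6)
    n, m, noise = 2000, 50, 0.1
    X = rng.uniform(-3, 3, size=(n, 1))
    y = np.sin(2 * X[:, 0]) + noise * rng.standard_normal(n)
    Z = np.linspace(-3, 3, m)[:, None]                 # inducing inputs
    Xs = np.linspace(-3, 3, 200)[:, None]              # test inputs

    Kuu = rbf(Z, Z) + 1e-8 * np.eye(m)
    Kuf = rbf(Z, X)
    Ksu = rbf(Xs, Z)
    Sigma = np.linalg.inv(Kuu + Kuf @ Kuf.T / noise**2)   # m x m, not n x n
    mean = Ksu @ Sigma @ (Kuf @ y) / noise**2             # predictive mean
    var = np.einsum("ij,jk,ik->i", Ksu, Sigma, Ksu)       # predictive variance

    print("test RMSE:", np.sqrt(np.mean((mean - np.sin(2 * Xs[:, 0])) ** 2)))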
A least-squares method for sparse low rank approximation of multivariate functions
least-squares method, sparse low rank approximation, multivariate functions
2013/6/14
In this paper, we propose a low-rank approximation method based on discrete least-squares for the approximation of a multivariate function from random, noise-free observations. Sparsity-inducing regul...
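As a toy version of this setting, the NumPy sketch below fits a rank-one separated approximation (a product of two univariate polynomial expansions) to random noise-free samples by alternating least squares. The target function, basis, and degrees are invented; the paper's sparsity-inducing regularization and higher-rank formats are omitted.

    # Rank-one separated least-squares approximation of a bivariate function.
    import numpy as np

    def basis(t, degree=5):
        return np.vander(t, degree + 1, increasing=True)   # 1, t, t^2, ...

    f = lambda x, y: np.exp(x) * np.cos(y)                  # separable (rank-one) target

    rng = np.random.default_rng(7)
    x, y = rng.uniform(-1, 1, 300), rng.uniform(-1, 1, 300)
    z = f(x, y)                                             # noise-free observations
    Px, Py = basis(x), basis(y)

    a = rng.standard_normal(Px.shape[1])
    b = rng.standard_normal(Py.shape[1])
    for _ in range(20):                                     # alternating least squares
        a, *_ = np.linalg.lstsq(Px * (Py @ b)[:, None], z, rcond=None)
        b, *_ = np.linalg.lstsq(Py * (Px @ a)[:, None], z, rcond=None)

    z_hat = (Px @ a) * (Py @ b)
    print("relative error:", np.linalg.norm(z_hat - z) / np.linalg.norm(z))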
We introduce a novel algorithm that computes the $k$-sparse principal component of a positive semidefinite matrix $A$. Our algorithm is combinatorial and operates by examining a discrete set of specia...
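The underlying problem is to maximize the Rayleigh quotient x^T A x over unit vectors with at most k nonzero entries. The NumPy sketch below solves it by exhaustive search over supports, which is exact but exponential; the paper's contribution is a combinatorial algorithm that examines a far smaller candidate set. Sizes here are tiny on purpose.

    # k-sparse principal component of a PSD matrix by brute-force support search.
    from itertools import combinations
    import numpy as np

    def sparse_pc_bruteforce(A, k):
        n = A.shape[0]
        best_val, best_x = -np.inf, None
        for support in combinations(range(n), k):
            idx = list(support)
            w, V = np.linalg.eigh(A[np.ix_(idx, idx)])    # eigenpairs of the k x k block
            if w[-1] > best_val:
                best_val, best_x = w[-1], np.zeros(n)
                best_x[idx] = V[:, -1]                    # top eigenvector on this support
        return best_val, best_x

    rng = np.random.default_rng(8)
    B = rng.standard_normal((10, 10))
    A = B @ B.T                                           # random PSD matrix
    val, x = sparse_pc_bruteforce(A, k=3)
    print("best k-sparse Rayleigh quotient:", val)
    print("support:", np.nonzero(x)[0])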
Sharp analysis of low-rank kernel matrix approximations
Sharp analysis, low-rank kernel matrix approximations
2012/9/18
We consider supervised learning problems within the positive-definite kernel framework, such as kernel ridge regression, kernel logistic regression or the support vector machine. With kernels leading t...
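A typical low-rank kernel approximation in this setting is column sampling (Nystrom): build explicit rank-m features from m sampled columns and run ridge regression on them instead of forming the full n x n kernel matrix. The NumPy sketch below is a generic illustration with arbitrary data, kernel bandwidth, and regularization, not the paper's experimental setup.

    # Kernel ridge regression with a Nystrom (column-sampling) low-rank approximation.
    import numpy as np

    def rbf(A, B, ell=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ell**2)

    rng = np.random.default_rng(9)
    n, m, lam = 3000, 100, 1e-3
    X = rng.uniform(-3, 3, (n, 1))
    y = np.sign(np.sin(2 * X[:, 0])) + 0.3 * rng.standard_normal(n)

    cols = rng.choice(n, size=m, replace=False)            # sampled columns / landmarks
    Kmm = rbf(X[cols], X[cols]) + 1e-8 * np.eye(m)
    Knm = rbf(X, X[cols])

    # Nystrom feature map: K is approximated by Phi @ Phi.T with Phi = Knm @ Kmm^(-1/2)
    w_eig, V = np.linalg.eigh(Kmm)
    Phi = Knm @ (V / np.sqrt(w_eig)) @ V.T

    alpha = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)   # ridge in m dims
    y_hat = Phi @ alpha
    print("training RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))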