Braun, Mikio Ludwig: Spectral Properties of the Kernel Matrix and their Relation to Kernel Methods in Machine Learning. - Bonn, 2005. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.
Online edition in bonndoc: https://nbn-resolving.org/urn:nbn:de:hbz:5N-06315
@phdthesis{handle:20.500.11811/2321,
urn = {https://nbn-resolving.org/urn:nbn:de:hbz:5N-06315},
author = {Braun, Mikio Ludwig},
title = {Spectral Properties of the Kernel Matrix and their Relation to Kernel Methods in Machine Learning},
school = {Rheinische Friedrich-Wilhelms-Universität Bonn},
year = 2005,
note = {Machine learning is an area of research concerned with the construction of algorithms that are able to learn from examples. Among these, kernel methods form an important family of algorithms that have proven powerful and versatile for a large number of problem areas. Central to these approaches is the kernel matrix, which summarizes the information contained in the training examples. The goal of this thesis is to analyze machine learning kernel methods based on properties of the kernel matrix. The algorithms considered are kernel principal component analysis and kernel ridge regression. The thesis is divided into two parts: a theoretical part devoted to studying the spectral properties of the kernel matrix, and an application part which analyzes kernel principal component analysis and kernel-based regression in light of these theoretical results.
In the theoretical part, convergence properties of the eigenvalues and eigenvectors of the kernel matrix are studied. We derive accurate bounds on the approximation error which scale with the magnitude of the eigenvalue, correctly predicting that the approximation error for small eigenvalues is much smaller than for large eigenvalues. In this respect, the results improve significantly on existing ones. A similar result is proven for scalar products with eigenvectors of the kernel matrix: the scalar products with eigenvectors corresponding to small eigenvalues are a priori small, independently of the degree of approximation.
In the application part, we discuss the following topics. For kernel principal component analysis, we show that the estimated eigenvalues approximate the true principal values with high precision. Next, we consider the general setting of kernel-based regression and show that the relevant information of the labels is contained in the first few coefficients of the label vector in the basis of eigenvectors of the kernel matrix, so that signal and noise can be separated much more easily in this representation. Finally, we show that kernel ridge regression works by suppressing all but the leading coefficients, thereby extracting the relevant information of the label vector. This interpretation suggests estimating the number of relevant coefficients in order to perform model selection. In an experimental evaluation, this approach proves competitive with state-of-the-art methods.},

url = {https://hdl.handle.net/20.500.11811/2321}
}
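
As an illustration of the eigenbasis view described in the abstract, the following minimal Python sketch expresses the label vector in the basis of eigenvectors of the kernel matrix and shows that kernel ridge regression shrinks each coefficient by the standard filter factor l_i / (l_i + lambda), thereby suppressing all but the leading coefficients. The Gaussian kernel, the synthetic one-dimensional data, and the regularization value are illustrative assumptions, not taken from the thesis.

import numpy as np

def rbf_kernel(X, width=1.0):
    """Gaussian (RBF) kernel matrix for a sample X of shape (n, d). Illustrative choice of kernel."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * width ** 2))

# Toy one-dimensional regression problem (synthetic data, not from the thesis).
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_normal(n)

K = rbf_kernel(X, width=0.5)

# Eigendecomposition of the symmetric kernel matrix, sorted by decreasing eigenvalue.
eigvals, U = np.linalg.eigh(K)
order = np.argsort(eigvals)[::-1]
eigvals, U = eigvals[order], U[:, order]

# Coefficients of the label vector in the eigenbasis of K: the abstract's claim is
# that the informative part of y concentrates in the first few of these coefficients.
coeffs = U.T @ y

# Kernel ridge regression: fitted values are
#   y_hat = K (K + lam I)^{-1} y = U diag(l_i / (l_i + lam)) U^T y,
# i.e. each eigenbasis coefficient is shrunk by the filter factor l_i / (l_i + lam),
# which is close to 1 for leading eigenvalues and close to 0 for small ones.
lam = 1.0
filter_factors = eigvals / (eigvals + lam)
y_hat = U @ (filter_factors * coeffs)

# Sanity check against the direct ridge-regression formula.
y_hat_direct = K @ np.linalg.solve(K + lam * np.eye(n), y)
print(np.allclose(y_hat, y_hat_direct))                      # True up to numerical error
print("largest 5 filter factors:", np.round(filter_factors[:5], 3))
print("smallest 5 filter factors:", np.round(filter_factors[-5:], 6))

Comparing the two print-outs of filter factors makes the suppression effect visible: coefficients along directions with large eigenvalues pass almost unchanged, while those along directions with small eigenvalues are damped towards zero.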

The following terms of use are associated with this resource:

InCopyright