Model Selection in Kernel Methods
You, Di

2011, Doctor of Philosophy, Ohio State University, Electrical and Computer Engineering.

Kernel methods have been extensively studied in pattern recognition and machine learning over the last decade, and they have been successfully used in a variety of applications. A main advantage of kernel methods is that nonlinear problems such as classification and regression can be efficiently solved using classical linear approaches. The performance of kernel methods greatly depends on the selected kernel model. The model is defined by the kernel mapping and its parameters. Different models result in different generalization performance. Hence, model selection in kernel methods is an important problem and remains a challenge in the literature. In this dissertation, we propose several approaches to address this problem. Our approaches can determine good learning models by optimizing both the kernels and all other parameters in the kernel-based algorithms.
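The core idea summarized above, solving a nonlinear problem with a classical linear method in the kernel space, can be illustrated with a minimal numpy sketch (not code from the dissertation). Here, kernel ridge regression with an RBF kernel separates XOR-style data that no linear classifier in the input space can handle; the kernel width `sigma` and regularizer `lam` are illustrative choices:

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# XOR-style data: not linearly separable in the input space
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])

lam = 1e-3                                    # ridge regularizer
K = rbf_kernel(X, X, sigma=0.5)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # linear solve in kernel space

pred = np.sign(rbf_kernel(X, X, sigma=0.5) @ alpha)
```

Note that the model here is fully determined by the kernel mapping (RBF) and its parameters (`sigma`, `lam`), which is exactly the model-selection problem the dissertation addresses.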

In classification, we develop an algorithm yielding class distributions that are linearly separable in the kernel space. The idea is to enforce the homoscedasticity and separability of the pairwise class distributions simultaneously in the kernel space. We show how this approach can be employed to optimize kernels in discriminant analysis. We then derive a criterion to search for a good kernel representation by directly minimizing the Bayes classification error over different kernel mappings.
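As a rough illustration of kernel optimization in discriminant analysis (not the homoscedasticity-based criterion the dissertation derives), the following sketch scores candidate RBF kernel widths with the standard kernel Fisher discriminant ratio, between-class over within-class scatter in the kernel space, and keeps the width that maximizes it. The data, kernel widths, and ridge term are assumptions for the example:

```python
import numpy as np

def rbf_kernel(X, Z, sigma):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(2)
# two Gaussian classes, offset by 3 in each dimension
X = np.r_[rng.standard_normal((30, 2)), rng.standard_normal((30, 2)) + 3]
y = np.r_[np.zeros(30), np.ones(30)]

def fisher_criterion(sigma):
    # kernel Fisher ratio: between-class over within-class scatter of the
    # kernel-space projection (1-D discriminant for two classes)
    K = rbf_kernel(X, X, sigma)
    m0, m1 = K[:, y == 0].mean(1), K[:, y == 1].mean(1)
    M = np.outer(m0 - m1, m0 - m1)
    N = sum(Kc @ (np.eye(Kc.shape[1]) - np.full((Kc.shape[1],) * 2, 1 / Kc.shape[1])) @ Kc.T
            for Kc in (K[:, y == 0], K[:, y == 1]))
    N = N + 1e-6 * np.eye(len(X))            # small ridge for stability
    alpha = np.linalg.solve(N, m0 - m1)      # leading discriminant direction
    return float(alpha @ M @ alpha) / float(alpha @ N @ alpha)

sigmas = [0.25, 0.5, 1.0, 2.0, 4.0]
best = max(sigmas, key=fisher_criterion)     # kernel selected by the criterion
```

The dissertation's criteria go further, enforcing homoscedastic pairwise class distributions and directly minimizing the Bayes error, rather than maximizing this classical scatter ratio.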

In regression, we derive a model selection approach to directly balance the model fit and model complexity using the framework of multiobjective optimization. We develop an algorithm to obtain the Pareto-optimal solutions which balance the trade-off between the model fit and model complexity. We show how the proposed method is related to minimizing the predicted generalization error of the learning function.
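The fit/complexity trade-off can be made concrete with a simple sketch (a scalarization sweep, not the dissertation's multiobjective algorithm): for kernel ridge regression, sweep the regularization strength, record training error as the fit objective and the RKHS norm of the function as the complexity objective, and keep the non-dominated (Pareto-optimal) pairs. The dataset and parameter grid are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

K = rbf_kernel(X, X)
candidates = []
for lam in np.logspace(-4, 2, 25):           # sweep regularization strength
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    fit = float(np.mean((K @ alpha - y) ** 2))   # model fit (training MSE)
    complexity = float(alpha @ K @ alpha)        # RKHS norm^2 of the function
    candidates.append((fit, complexity))

# keep only Pareto-optimal trade-offs: no other candidate is at least as
# good on both objectives and different
pareto = [p for p in candidates
          if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in candidates)]
```

Each point on the resulting front is a model that cannot improve its fit without growing more complex, which is the trade-off the proposed multiobjective method optimizes directly.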

In our final algorithm, the kernel matrix is recursively learned with genetic algorithms until the classification/prediction error falls below a threshold. We derive a family of adaptive kernels to better fit the data with various densities and show their superiority over the commonly used fixed-shape kernels.
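The evolutionary loop described above can be caricatured in a few lines (a toy mutation-and-selection scheme over a single RBF width, not the dissertation's adaptive-kernel genetic algorithm): candidate kernel parameters are scored by classification error and evolved until the error falls below a threshold. All data and hyperparameters here are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(X, Z, sigma):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# toy two-class data: inner disk vs. outer ring (radially separable)
n = 60
r = np.concatenate([rng.uniform(0, 1, n), rng.uniform(2, 3, n)])
t = rng.uniform(0, 2 * np.pi, 2 * n)
X = np.c_[r * np.cos(t), r * np.sin(t)]
y = np.r_[-np.ones(n), np.ones(n)]

def error(sigma, lam=1e-2):
    # training error of kernel ridge classification with width sigma
    K = rbf_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return np.mean(np.sign(K @ alpha) != y)

# minimal generational loop: select the fittest kernel widths, mutate them,
# and stop once the classification error falls below a threshold
pop = rng.uniform(0.1, 5.0, 8)               # population of candidate sigmas
for gen in range(20):
    fitness = np.array([error(s) for s in pop])
    if fitness.min() < 0.05:                 # error threshold reached
        break
    best = pop[np.argsort(fitness)[:4]]      # selection
    children = best * np.exp(0.2 * rng.standard_normal(4))  # log-normal mutation
    pop = np.concatenate([best, children])

best_sigma = pop[np.argmin([error(s) for s in pop])]
```

The dissertation's algorithm evolves the full kernel matrix and draws mutations from a family of adaptive kernels, which is what lets it fit regions of differing density better than a single fixed-shape kernel as in this sketch.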

Extensive experimental results demonstrate that the proposed approaches are superior to the state of the art.

Aleix Martinez (Advisor)
Yuan Zheng (Committee Member)
Yoonkyung Lee (Committee Member)
186 p.

Recommended Citations

APA Citation

You, D. (2011). Model Selection in Kernel Methods. (Electronic Thesis or Dissertation). Retrieved from

MLA Citation

You, Di. "Model Selection in Kernel Methods." Electronic Thesis or Dissertation. Ohio State University, 2011. OhioLINK Electronic Theses and Dissertations Center. 21 Apr 2018.

Chicago Citation

You, Di. "Model Selection in Kernel Methods." Electronic Thesis or Dissertation. Ohio State University, 2011.


osu1322581224.pdf (2.7 MB)