Dec 31, 2009 · Abstract: Learning linear combinations of multiple kernels is an appealing strategy when the right choice of features is unknown.
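To make the idea concrete, here is a minimal sketch (not the authors' code) of evaluating a fixed linear combination of base kernels with a standard SVM; the toy data, the kernel choices, and the helper name combined_kernel are illustrative assumptions.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics.pairwise import linear_kernel, rbf_kernel, polynomial_kernel

    rng = np.random.RandomState(0)
    X = rng.randn(60, 5)
    y = np.sign(X[:, 0] + 0.5 * X[:, 1])

    # One candidate kernel matrix per feature representation.
    kernels = [linear_kernel(X), rbf_kernel(X, gamma=0.5), polynomial_kernel(X, degree=2)]

    def combined_kernel(kernels, theta):
        """Weighted sum K = sum_m theta_m * K_m of the base kernels."""
        return sum(t * K for t, K in zip(theta, kernels))

    theta = np.ones(len(kernels)) / len(kernels)   # uniform mixture as a starting point
    K = combined_kernel(kernels, theta)
    clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
    print("training accuracy:", clf.score(K, y))

Multiple kernel learning replaces the hand-picked theta above with a vector learned jointly with the classifier.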
Feb 24, 2010 · Non-Sparse Regularization and Efficient Training with Multiple Kernels ... In learning with multiple kernels we aim at minimizing the loss on the ...
Non-Sparse Regularization with Multiple Kernels. Marius Kloft, Ulf Brefeld, Sören Sonnenburg, Alexander Zien. Electrical Engineering and Computer Sciences.
An experiment on controlled artificial data sheds light on the appropriateness of sparse, non-sparse, and ℓ∞-norm MKL in various scenarios. Application of ℓp ...
To allow for robust kernel mixtures, we generalize MKL to arbitrary norms. We derive new insights into the connections between several existing MKL formulations ...
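The papers cast this as an ℓp-norm constraint on the kernel coefficients. The sketch below illustrates the alternating scheme they describe, under simplifying assumptions: a scikit-learn SVM as the base learner, the closed-form update theta_m ∝ ||w_m||^(2/(p+1)) followed by rescaling to unit ℓp-norm, and a fixed iteration count instead of a proper convergence test; the function name lp_mkl is hypothetical.

    import numpy as np
    from sklearn.svm import SVC

    def lp_mkl(kernels, y, p=2.0, C=1.0, iters=20):
        """Alternate between an SVM on the weighted kernel sum and a theta update."""
        M = len(kernels)
        theta = np.full(M, M ** (-1.0 / p))            # feasible start: ||theta||_p = 1
        for _ in range(iters):
            K = sum(t * Km for t, Km in zip(theta, kernels))
            svm = SVC(kernel="precomputed", C=C).fit(K, y)
            d = svm.dual_coef_.ravel()                 # entries are alpha_i * y_i
            sv = svm.support_
            # ||w_m||^2 = theta_m^2 * sum_ij (alpha_i y_i) K_m[i, j] (alpha_j y_j)
            norms_sq = np.array([theta[m] ** 2 * d @ kernels[m][np.ix_(sv, sv)] @ d
                                 for m in range(M)])
            theta = norms_sq ** (1.0 / (p + 1))        # proportional to ||w_m||^(2/(p+1))
            theta /= np.linalg.norm(theta, ord=p)      # renormalize onto ||theta||_p = 1
        return theta, svm

With the toy kernels from the previous sketch, a call such as lp_mkl(kernels, y, p=1.1) should tend to produce a sparser weight vector than p=2, matching the sparse-versus-non-sparse contrast referred to above.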
Oct 26, 2010 · Non-sparse Regularization for Multiple Kernel Learning ... In learning with multiple kernels we aim at minimizing the loss on the training data ...
Marius Kloft, Ulf Brefeld, Sören Sonnenburg, and Alexander Zien. Non-sparse regularization and efficient training with multiple kernels. Technical report ...
Sparsity-inducing norms, along with non-sparse formulations, promote different degrees of sparsity at the level of the kernel coefficients, while at the same time permitting ...
Recent research on multiple kernel learning has led to a number of approaches for combining kernels in regularized risk minimization.
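For orientation, the shared objective can be written as follows (a reconstruction in the notation commonly used in these papers, with loss ℓ, feature maps φ_m, kernel weights θ, and regularization constant C):

\min_{\theta \ge 0,\; \|\theta\|_p \le 1} \;\; \min_{w, b} \;\;
C \sum_{i=1}^{n} \ell\Big( \sum_{m=1}^{M} \langle w_m, \phi_m(x_i) \rangle + b,\; y_i \Big)
\;+\; \frac{1}{2} \sum_{m=1}^{M} \frac{\|w_m\|_2^2}{\theta_m}

Setting p = 1 recovers the classical sparsity-inducing formulation, while larger p yields the non-sparse kernel mixtures studied in these papers.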
Non-sparse regularization and efficient training with multiple kernels. M. Kloft, U. Brefeld, S. Sonnenburg, A. Zien. arXiv preprint arXiv:1003.0079.