Approaches to multiple kernel learning (MKL) employ ℓ1-norm constraints on the mixing coefficients to promote sparse kernel combinations.
Building on recent advances in non-sparse multiple kernel learning (MKL), we propose a non-sparse version of MK-FDA, which imposes a general ℓp norm regularisation on the kernel weights ...
Feb 27, 2010 · Previous approaches to multiple kernel learning (MKL) promote sparse kernel combinations to support interpretability and scalability.
Abstract: We consider the problem of learning a linear combination of pre-specified kernel matrices in the Fisher discriminant analysis setting.
This paper gives a general theoretical tool for deriving fast learning rates, applicable to arbitrary monotone norm-type regularizations in a unifying ...
In this paper, we give a new generalization error bound of Multiple Kernel Learning (MKL) for a general class of regularizations, and discuss what kind of ...
We study non-sparse multiple kernel learning by imposing an ℓ2-norm constraint on the mixing coefficients. Empirically, ℓ2-MKL proves robust against noisy and ...
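The ℓp-norm constraint on the mixing coefficients described above has a well-known closed-form update in the literature on non-sparse MKL: each kernel weight is set proportional to ‖w_m‖^{2/(p+1)} and the weight vector is rescaled to satisfy ‖β‖_p = 1. A minimal sketch of that update, using hypothetical per-kernel block norms (the function name and inputs are illustrative, not from any of the cited papers):

```python
import numpy as np

def lp_mkl_weights(block_norms, p=2.0):
    """Closed-form lp-norm MKL mixing-coefficient update.

    block_norms: sequence of per-kernel norms ||w_m|| of the classifier blocks.
    p: the norm used to constrain the mixing coefficients (p=1 recovers
       sparse MKL; p=2 gives the robust non-sparse variant).
    Returns beta with ||beta||_p = 1.
    """
    norms = np.asarray(block_norms, dtype=float)
    # Each weight is proportional to ||w_m||^{2/(p+1)} ...
    beta = norms ** (2.0 / (p + 1.0))
    # ... rescaled so the lp-norm constraint ||beta||_p = 1 holds exactly.
    beta /= np.linalg.norm(beta, ord=p)
    return beta

# Illustrative usage: three kernels with decreasing block norms.
beta = lp_mkl_weights([1.0, 0.5, 0.1], p=2.0)
# The combined kernel would then be K = sum_m beta[m] * K_m.
```

With p = 2 all kernels keep a nonzero weight, which matches the empirical observation above that ℓ2-MKL is robust to noisy or weakly informative kernels, whereas driving p toward 1 concentrates the weight on the strongest kernels.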
Jun 9, 2021 · Non-sparse multiple kernel learning has seen many successful applications in multimodal data fusion, owing to its full utilisation of multiple kernels.
Abstract. In object classification tasks from digital photographs, multiple categories are considered for annotation. Some of these visual concepts may ...