Simultaneous Twin Kernel Learning Using Polynomial Transformations for Structured Prediction
Accepted manuscript   Open access   Peer reviewed


Chetan Tonde and Ahmed Elgammal
IEEE Conference on Computer Vision and Pattern Recognition. Proceedings, pp.995-1002
Columbus, Ohio, 06/2014
DOI: https://doi.org/10.7282/T36Q206N

Abstract

Keywords: Gaussian processes, Learning, Artificial intelligence, Polynomial approximation, Reproducing Kernel Hilbert Spaces (RKHS), Twin Gaussian Processes (TGP), Automatic kernel learning, Kernel methods, Polynomial kernel transformations, Positive definite kernel functions, Structured prediction, Twin kernel learning, Computer Vision
Many learning problems in computer vision can be posed as structured prediction problems, where the input and output instances are structured objects such as trees, graphs, or strings rather than single labels {+1, −1} or scalars. Kernel methods such as Structured Support Vector Machines, Twin Gaussian Processes (TGP), Structured Gaussian Processes, and vector-valued Reproducing Kernel Hilbert Spaces (RKHS) offer powerful ways to perform learning and inference over these domains. Positive definite kernel functions allow us to quantitatively capture the similarity between a pair of instances over these arbitrary domains. A poor choice of the kernel function, which determines the RKHS feature space, often results in poor performance. Automatic kernel selection methods have been developed, but they have focused only on kernels on the input domain (i.e., 'one-way'). In this work, we propose a novel and efficient algorithm for learning kernel functions simultaneously on both the input and output domains. We introduce the idea of learning polynomial kernel transformations and call this method Simultaneous Twin Kernel Learning (STKL). STKL can learn arbitrary, but continuous, kernel functions, including 'one-way' kernel learning as a special case. We formulate this problem for learning the covariance kernels of Twin Gaussian Processes. Our experimental evaluation using learned kernels on synthetic and several real-world datasets demonstrates consistent improvement in the performance of TGPs.
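The abstract's core idea of a polynomial kernel transformation can be illustrated with a minimal sketch. This is not the paper's STKL algorithm (which learns the coefficients jointly on input and output kernels); here the coefficients are fixed for illustration, and the base kernel and helper names are hypothetical. Applying a polynomial with nonnegative coefficients elementwise to a positive semidefinite kernel matrix yields another positive semidefinite matrix, by the Schur product theorem, so the transformed matrix is again a valid kernel.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Base RBF kernel between the rows of X and Y (illustrative choice).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def polynomial_kernel_transform(K, coeffs):
    """Apply p(K) = sum_i coeffs[i] * K**i, with elementwise (Hadamard) powers.

    Nonnegative coefficients keep the result positive semidefinite
    (Schur product theorem), so p(K) is still a valid kernel matrix.
    """
    return sum(c * K ** i for i, c in enumerate(coeffs))

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
K = rbf_kernel(X, X)

# Hypothetical coefficients; in STKL these would be learned from data.
coeffs = np.array([0.1, 0.5, 0.4])
K_t = polynomial_kernel_transform(K, coeffs)

# Sanity check: a valid kernel matrix is symmetric with nonnegative spectrum.
assert np.allclose(K_t, K_t.T)
assert np.linalg.eigvalsh(K_t).min() > -1e-10
```

In the paper's setting, one such transformation would be learned for the input-domain kernel and another for the output-domain kernel, simultaneously; the sketch above only shows the transformation itself.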
PDF: rutgers-lib-48304_PDF-1 (237.82 kB)
Accepted Manuscript (AM) Open Access
URL: https://dx.doi.org/10.1109/CVPR.2014.547
Version of Record (VoR) IEEE Conference on Computer Vision and Pattern Recognition. Proceedings

Metrics

228 file downloads
79 record views
