Kernel Feature Extraction in Signal Processing




Authors V. Laparra and G. Camps-Valls
Journal/Conference Name Digital Signal Processing with Kernel Methods (Chapter 12)
Paper Abstract Kernel-based feature extraction and dimensionality reduction are becoming increasingly important in advanced signal processing. This is particularly relevant in applications dealing with very high-dimensional data. Besides changing the data representation space via kernel feature extraction, another possibility is to correct for biases in the data distributions operating on the samples. This chapter reviews the main kernel feature extraction and dimensionality reduction methods, dealing with supervised, unsupervised and semi-supervised settings. It illustrates methods in toy examples, as well as real datasets. The chapter also analyzes the connections between the Hilbert-Schmidt independence criterion (HSIC) and classical feature extraction methods. The HSIC method measures cross-covariance in an adequate reproducing kernel Hilbert space (RKHS) by using the entire spectrum of the cross-covariance operator. Kernel dimensionality reduction (KDR) is a supervised feature extraction method that seeks a linear transformation of the data such that it maximizes the conditional HSIC on the labels.
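The HSIC mentioned in the abstract has a well-known biased empirical estimator, HSIC = tr(KHLH)/(n−1)², where K and L are kernel matrices computed on the two variables and H is the centering matrix. A minimal sketch of that estimator is below, written in Python rather than the paper's MATLAB, with an assumed RBF kernel and a fixed bandwidth; it is an illustration of the criterion, not the chapter's own implementation:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Toy check: a nonlinearly dependent pair should score higher
# than an independent pair (values here are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
Y_dep = X ** 2 + 0.1 * rng.normal(size=(200, 1))  # dependent on X
Y_ind = rng.normal(size=(200, 1))                 # independent of X
print(hsic(X, Y_dep), hsic(X, Y_ind))
```

Because the estimator uses the full kernel (cross-covariance) spectrum, it detects nonlinear dependence such as Y = X² that linear correlation would miss; the bandwidth `sigma` is an assumption and is often set by the median heuristic in practice.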
Date of publication 2017
Code Programming Language MATLAB
