Center for Mathematical Modeling and Data Science (MMDS), Osaka University

Sparse Hilbert--Schmidt Independence Criterion Regression

Benjamin Poignard (Osaka University)

Osaka University Mathematics and Data Science Seminar: Data Science Seminar Series No. 51


Feature selection is a fundamental problem in machine learning and statistics and has been widely studied over the past decades. However, the majority of feature selection algorithms are based on linear models, and nonlinear feature selection has not been studied as thoroughly, in particular in the high-dimensional case. In this paper, we propose sparse Hilbert-Schmidt Independence Criterion (SpHSIC) regression, a versatile nonlinear feature selection algorithm based on the HSIC; it is a continuous-optimization variant of the well-known minimum redundancy maximum relevance (mRMR) feature selection algorithm. More specifically, the SpHSIC objective consists of two parts: a convex HSIC loss function on the one hand and a regularization term on the other, where we consider the Lasso, Bridge, MCP, and SCAD penalties. We establish asymptotic properties of the sparsity-based HSIC regression estimator and provide conditions under which the support recovery property holds. On the basis of synthetic and real-world experiments, we illustrate these theoretical properties and show that the proposed algorithm performs well in the high-dimensional setting.
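To make the idea concrete, the following is a minimal sketch of an HSIC-based sparse estimator in the spirit of the abstract: a convex quadratic HSIC loss combined with an l1 (Lasso) penalty, minimized by projected proximal gradient. The data-generating model, the penalty level, and all function names here are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch (not the authors' code): convex HSIC loss + Lasso penalty,
# solved by projected proximal gradient over nonnegative feature weights.
import numpy as np

def centered_gram(x, sigma=1.0):
    # Gaussian Gram matrix of a 1-D sample, double-centered (H K H).
    d = x[:, None] - x[None, :]
    K = np.exp(-d ** 2 / (2 * sigma ** 2))
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)  # only feature 0 is relevant

Kbars = [centered_gram(X[:, j]) for j in range(p)]
Lbar = centered_gram(y)

# 0.5 * ||Lbar - sum_j w_j Kbar_j||_F^2  =  const - c'w + 0.5 * w' M w
M = np.array([[np.sum(Ki * Kj) for Kj in Kbars] for Ki in Kbars])
c = np.array([np.sum(Kj * Lbar) for Kj in Kbars])  # unnormalized HSIC scores

lam = 0.5 * c.max()                 # ad hoc l1 penalty level for illustration
w = np.zeros(p)
eta = 1.0 / np.linalg.norm(M, 2)    # step size from the spectral norm of M
for _ in range(500):
    grad = M @ w - c
    w = np.maximum(0.0, w - eta * (grad + lam))  # prox step, w kept >= 0

print(np.round(w, 4))
```

With a strong nonlinear signal on the first feature and independent noise features, the l1 penalty should drive the irrelevant weights to zero while the relevant weight stays positive, which is the support recovery behavior the abstract refers to. Nonconvex penalties such as MCP or SCAD would replace the soft-thresholding step with their own proximal maps.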

Speaker: Benjamin Poignard (Osaka University)
Theme: Osaka University Mathematics and Data Science Seminar, Data Science Seminar Series No. 51
Date and time: Monday, November 25, 2019, 10:45-12:15
Venue: Osaka University, School of Engineering Science, Building J, Room J617
Admission: Free
Registration: Not required
Access: For directions to the venue, please see the URL below.
http://www.es.osaka-u.ac.jp/ja/access.html
Contact: Please see the "Contact" page on this website.