Kazan Federal University Digital Repository

Generalized Embedding Regression: A Framework for Supervised Feature Extraction

Show simple item record

dc.contributor.author Lu J.
dc.contributor.author Lai Z.
dc.contributor.author Wang H.
dc.contributor.author Chen Y.
dc.contributor.author Zhou J.
dc.contributor.author Shen L.
dc.date.accessioned 2021-02-25T21:00:19Z
dc.date.available 2021-02-25T21:00:19Z
dc.date.issued 2020
dc.identifier.issn 2162-237X
dc.identifier.uri https://dspace.kpfu.ru/xmlui/handle/net/162861
dc.description.abstract Sparse discriminative projection learning has attracted much attention due to its good performance in recognition tasks. In this article, a framework called generalized embedding regression (GER) is proposed, which can simultaneously perform low-dimensional embedding and sparse projection learning in a joint objective function with a generalized orthogonal constraint. Moreover, the label information is integrated into the model to preserve the global structure of data, and a rank constraint is imposed on the regression matrix to explore the underlying correlation structure of classes. Theoretical analysis shows that GER can obtain the same or approximate solution as some related methods with special settings. By utilizing this framework as a general platform, we design a novel supervised feature extraction approach called jointly sparse embedding regression (JSER). In JSER, we construct an intrinsic graph to characterize the intraclass similarity and a penalty graph to indicate the interclass separability. Then, the penalty graph Laplacian is used as the constraint matrix in the generalized orthogonal constraint to deal with interclass marginal points. Moreover, the L2,1-norm is imposed on the regression terms for robustness to outliers and variations in the data, and on the regularization term for jointly sparse projection learning, leading to interesting semantic interpretability. An effective iterative algorithm is elaborately designed to solve the optimization problem of JSER. Theoretically, we prove that the subproblem of JSER is essentially an unbalanced Procrustes problem and can be solved iteratively. The convergence of the designed algorithm is also proved. Experimental results on six well-known data sets indicate the competitive performance and latent properties of JSER.
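The abstract's L2,1-norm regularizer, which yields jointly sparse (row-sparse) projections, can be sketched as follows. This is a minimal NumPy illustration of the norm itself, not code from the paper; the function name and example matrix are ours:

```python
import numpy as np

def l21_norm(W):
    """L2,1-norm: the sum of the Euclidean norms of the rows of W.

    Penalizing this norm drives entire rows of a projection matrix
    to zero. A zeroed row discards one input feature across all
    projection directions, which is the "joint sparsity" the
    abstract refers to.
    """
    return np.sum(np.linalg.norm(W, axis=1))

# A row-sparse projection matrix: rows 1 and 3 are all zero, so
# features 1 and 3 would be dropped by every projection direction.
W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 0.0],
              [0.0, 0.0]])
print(l21_norm(W))  # 5.0 + 0.0 + 1.0 + 0.0 = 6.0
```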
dc.relation.ispartofseries IEEE Transactions on Neural Networks and Learning Systems
dc.subject Feature extraction
dc.subject Generalized orthogonal constraint
dc.subject joint sparsity
dc.subject Laplace equations
dc.subject Linear programming
dc.subject low-dimensional embedding
dc.subject low-rank regression
dc.subject Manifolds
dc.subject Principal component analysis
dc.subject Robustness
dc.subject Training
dc.title Generalized Embedding Regression: A Framework for Supervised Feature Extraction
dc.type Article
dc.collection KFU Staff Publications (Публикации сотрудников КФУ)
dc.source.id SCOPUS2162237X-2020-SID85096865368

This item appears in the following Collection(s)

  • KFU Staff Publications, Scopus [24551]
    This collection contains publications by staff of Kazan Federal University (Kazan State University before 2010) indexed in the Scopus database, starting from 1970.
