dc.contributor.author | Lu J. | |
dc.contributor.author | Lin J. | |
dc.contributor.author | Lai Z. | |
dc.contributor.author | Wang H. | |
dc.contributor.author | Zhou J. | |
dc.date.accessioned | 2022-02-09T20:31:06Z | |
dc.date.available | 2022-02-09T20:31:06Z | |
dc.date.issued | 2021 | |
dc.identifier.issn | 0020-0255 | |
dc.identifier.uri | https://dspace.kpfu.ru/xmlui/handle/net/168698 | |
dc.description.abstract | Least squares regression (LSR) has attracted widespread attention in the fields of statistics, machine learning, and pattern recognition. However, it utilizes strict zero-one regression targets, which leads to inferior performance on classification tasks. Furthermore, LSR ignores the local manifold structures of data and lacks robustness. To address these issues, this paper proposes a general regression framework called RLRR, where a low-rank constraint is imposed on regression matrices to explore the underlying correlation structures of classes. Strict zero-one regression targets are redirected to more feasible variable matrices for the purpose of margin amplification of different classes. Additionally, rather than using a pre-constructed weighted graph, the proposed framework dynamically updates the neighborhood structures of data to preserve original manifold structures. By utilizing this framework as a general platform, we developed two dynamic neighborhood-structure-based regression models called RLRRM and RLRRP. RLRRM integrates a reconstruction error minimization term into the proposed RLRR framework, whereas RLRRP aims to preserve the local geometric structures of data in a low-dimensional subspace. Both RLRRM and RLRRP use the ℓ2,1-norm penalty to replace the traditional F-norm penalty for the projection matrix for the sake of self-adaptive feature selection. Instead of directly solving the resultant optimization problems with non-convex constraints, we adopt the variable-splitting and penalty techniques to derive an equivalent solution. Analysis of the corresponding convergence and computational complexity characteristics is also presented. Extensive experiments on several well-known datasets demonstrate the promising performance of the proposed models. | |
dc.relation.ispartofseries | Information Sciences | |
dc.subject | Dynamic neighbors | |
dc.subject | Joint sparsity | |
dc.subject | Least squares regression | |
dc.subject | Local and global structure preservation | |
dc.subject | Target redirected regression | |
dc.title | Target redirected regression with dynamic neighborhood structure | |
dc.type | Article | |
dc.relation.ispartofseries-volume | 544 | |
dc.collection | Publications of KFU staff | |
dc.relation.startpage | 564 | |
dc.source.id | SCOPUS00200255-2021-544-SID85092003879 |
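
The abstract's starting point is classical least squares regression with strict zero-one (one-hot) targets, the baseline that RLRR's target redirection improves on. The sketch below illustrates only that baseline, not the paper's RLRR/RLRRM/RLRRP models; all data, the ridge parameter `lam`, and the two-class setup are illustrative assumptions, not taken from the article.

```python
import numpy as np

# Minimal sketch of classical LSR for classification with strict
# zero-one (one-hot) targets -- the baseline the abstract critiques.
# Synthetic data and the ridge parameter lam are illustrative choices.

rng = np.random.default_rng(0)

# Two well-separated 2-D classes, 20 samples each.
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),
               rng.normal(+2.0, 0.5, (20, 2))])
y = np.repeat([0, 1], 20)

# Strict zero-one regression targets: one-hot matrix Y (n x c).
Y = np.eye(2)[y]

# Ridge-regularized closed form: W = (X^T X + lam * I)^{-1} X^T Y.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ Y)

# Classify by the column of X @ W with the largest response.
pred = np.argmax(X @ W, axis=1)
accuracy = np.mean(pred == y)
```

On such clearly separated classes the rigid one-hot targets suffice; the paper's point is that on harder tasks these fixed targets constrain the margins between classes, which motivates redirecting them to learnable target matrices.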