Abstract:
Least squares regression (LSR) has attracted widespread attention in statistics, machine learning, and pattern recognition. However, it uses strict zero-one regression targets, which leads to inferior performance on classification tasks. Furthermore, LSR ignores the local manifold structure of the data and lacks robustness. To address these issues, this paper proposes a general regression framework, RLRR, in which a low-rank constraint is imposed on the regression matrix to explore the underlying correlation structure among classes. The strict zero-one regression targets are relaxed into more flexible variable matrices so as to enlarge the margins between different classes. In addition, rather than relying on a pre-constructed weighted graph, the proposed framework dynamically updates the neighborhood structure of the data to preserve its original manifold structure. Using this framework as a general platform, we develop two dynamic neighborhood-structure-based regression models, RLRRM and RLRRP. RLRRM integrates a reconstruction-error minimization term into the RLRR framework, whereas RLRRP aims to preserve the local geometric structure of the data in a low-dimensional subspace. Both RLRRM and RLRRP replace the traditional F-norm penalty on the projection matrix with the ℓ2,1-norm penalty to enable self-adaptive feature selection. Instead of directly solving the resulting optimization problems with non-convex constraints, we adopt variable-splitting and penalty techniques to derive an equivalent solution. We also analyze the convergence and computational complexity of the proposed algorithms. Extensive experiments on several well-known datasets demonstrate the promising performance of the proposed models.