
Neural Network for A Class of Sparse Optimization in Machine Learning Problems

EasyChair Preprint 5224

10 pages
Date: March 29, 2021

Abstract

Sparse optimization involving the l0-norm in the objective function has wide application in machine learning. In this paper, we propose a projected neural network, modeled by a differential equation, to solve a class of these optimization problems in which the objective function is the sum of a nonsmooth convex loss function and a regularization term defined by the l0-norm. This optimization problem is not only nonconvex but also discontinuous. To simplify the structure of the proposed network and give it better convergence properties, we use a smoothing method, in which a newly constructed smoothing function for the regularization term plays a key role. We prove that the solution to the proposed network exists globally and is unique, and that any accumulation point of it is a critical point of the continuous relaxation model. Except for a special case, which can be easily identified, any critical point is a local minimizer of the considered sparse optimization problem. Interestingly, all critical points enjoy a promising lower bound property, which is satisfied by all global minimizers of the considered problem but not by all local minimizers. Finally, we present numerical experiments to illustrate the efficiency and good performance of the proposed method for solving this class of sparse optimization problems, which include the most widely used models in feature selection for classification learning.
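For a concrete picture of the ingredients the abstract describes, the following is a minimal Python sketch, not the authors' exact method: the paper's own smoothing function, loss, and network dynamics are defined only in the full text. The sketch replaces the l0 regularizer with a common smooth surrogate theta_mu(t) = t^2 / (t^2 + mu), uses a smooth least-squares loss as a stand-in for the paper's nonsmooth convex loss, and integrates a generic projection-type gradient flow dx/dt = P_Omega(x - grad F_mu(x)) - x by forward Euler over a box Omega. All function names, the surrogate, the constraint set, and the step size are assumptions made for illustration.

  import numpy as np

  # Hypothetical illustration (not the paper's construction): minimize
  #   f(x) + lam * ||x||_0   over a box Omega = [-r, r]^n,
  # with the l0 term smoothed by theta_mu(t) = t^2 / (t^2 + mu), and the
  # resulting objective driven to a critical point by the projected flow
  #   dx/dt = P_Omega(x - grad F_mu(x)) - x,   integrated by forward Euler.

  def smoothed_l0(x, mu):
      """Smooth surrogate of ||x||_0; tends to ||x||_0 entrywise as mu -> 0."""
      return np.sum(x**2 / (x**2 + mu))

  def grad_smoothed_l0(x, mu):
      """Gradient of the surrogate: d/dt [t^2/(t^2+mu)] = 2*mu*t/(t^2+mu)^2."""
      return 2.0 * mu * x / (x**2 + mu)**2

  def project_box(x, r):
      """Euclidean projection onto Omega = [-r, r]^n."""
      return np.clip(x, -r, r)

  def projected_flow(grad_f, x0, lam=0.1, mu=1e-2, r=10.0, h=1e-2, steps=5000):
      """Forward-Euler integration of dx/dt = P_Omega(x - grad F_mu(x)) - x."""
      x = x0.copy()
      for _ in range(steps):
          g = grad_f(x) + lam * grad_smoothed_l0(x, mu)
          x = x + h * (project_box(x - g, r) - x)
      return x

  if __name__ == "__main__":
      rng = np.random.default_rng(0)
      A = rng.standard_normal((40, 100))
      x_true = np.zeros(100)
      x_true[:5] = rng.standard_normal(5)       # 5-sparse ground truth
      b = A @ x_true
      # smooth least-squares loss as a stand-in for the nonsmooth loss
      grad_f = lambda x: A.T @ (A @ x - b) / len(b)
      x = projected_flow(grad_f, np.zeros(100))
      print("nonzeros (|x_i| > 1e-3):", np.sum(np.abs(x) > 1e-3))

In this toy run the flow typically drives most coordinates toward zero, which mirrors the sparsity-inducing effect of the l0 regularization discussed above; the paper's analysis (global existence, uniqueness, and convergence to critical points) concerns its own, more carefully constructed dynamics.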

Keyphrases: convergence analysis, critical point, machine learning, projected neural network, sparse optimization

BibTeX entry
BibTeX does not have a dedicated entry type for preprints. The following is a workaround that produces the correct reference:
@booklet{EasyChair:5224,
  author       = {Qingfa Li and Sitian Qin and Wei Bian},
  title        = {Neural Network for A Class of Sparse Optimization in Machine Learning Problems},
  howpublished = {EasyChair Preprint 5224},
  year         = {EasyChair, 2021}}