Inexact subgradient methods for quasi-convex optimization problems

Yaohua Hu, Xiaoqi Yang, Chee-Khian Sim

Research output: Contribution to journal › Article › peer-review



In this paper, we consider a generic inexact subgradient algorithm for solving a nondifferentiable quasi-convex constrained optimization problem. The inexactness stems from computation errors and noise, which arise in practical applications. Assuming that the computational errors and noise are deterministic and bounded, we study the effect of the inexactness on the subgradient method when the constraint set is compact or the objective function has a set of generalized weak sharp minima. In both cases, using the constant and diminishing stepsize rules, we establish convergence results in both objective values and iterates, as well as finite convergence to approximate optimality. We also investigate efficiency estimates of the iterates and apply the inexact subgradient algorithm to the Cobb–Douglas production efficiency problem. The numerical results verify our theoretical analysis and show the high efficiency of the proposed algorithm, especially for large-scale problems.
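The abstract describes a subgradient iteration driven by inexact (noise-corrupted) subgradients with a diminishing stepsize rule. The sketch below is an illustrative reconstruction under assumptions, not the paper's actual algorithm: it uses a normalized-subgradient step with projection onto the constraint set, models the bounded inexactness as additive noise, and applies it to a toy quasi-convex problem. All names (`inexact_subgradient`, the stepsize rule `1/sqrt(k)`, the noise model) are hypothetical choices for illustration.

```python
import numpy as np

def inexact_subgradient(subgrad, project, x0, steps, noise_level=0.0, seed=0):
    """Normalized subgradient method with bounded inexactness.

    Iterates x_{k+1} = P_C(x_k - alpha_k * g_k / ||g_k||), where g_k is a
    subgradient corrupted by additive noise of magnitude <= noise_level
    (a stand-in for the deterministic bounded errors studied in the paper).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        g = subgrad(x)
        # bounded inexactness: componentwise noise in [-noise_level, noise_level]
        g_tilde = g + rng.uniform(-1.0, 1.0, size=x.shape) * noise_level
        norm = np.linalg.norm(g_tilde)
        if norm == 0.0:
            break
        alpha = 1.0 / np.sqrt(k)  # diminishing stepsize rule
        x = project(x - alpha * g_tilde / norm)
    return x

# Toy quasi-convex problem: minimize f(x) = ||x - t||_2 over the box [0, 5]^2.
t = np.array([2.0, 3.0])
subgrad = lambda x: (x - t) / max(np.linalg.norm(x - t), 1e-12)
proj = lambda x: np.clip(x, 0.0, 5.0)
x_star = inexact_subgradient(subgrad, proj, x0=np.array([5.0, 0.0]),
                             steps=2000, noise_level=0.1)
```

With the diminishing stepsize, the iterates settle into a neighborhood of the minimizer whose radius is governed by the noise level, mirroring the convergence-to-approximate-optimality results described above.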
Original language: English
Pages (from-to): 315-327
Number of pages: 13
Journal: European Journal of Operational Research
Issue number: 2
Early online date: 20 May 2014
Publication status: Published - 16 Jan 2015


  • Subgradient method
  • Quasi-convex optimization
  • Noise
  • Weak sharp minima


