Monotonicity and Restart in Fast Gradient Methods

P. Giselsson and S. Boyd

IEEE Conference on Decision and Control, pages 5058-5063, December 2014.

Fast gradient methods are known to be non-monotone algorithms, and oscillations typically occur around the solution. To avoid this behavior, we propose in this paper a fast gradient method with restart and analyze its convergence rate. The proposed algorithm bears similarities to other algorithms in the literature, but differs on a key point that enables theoretical convergence rate results. The efficiency of the proposed method is demonstrated on two numerical examples.
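To illustrate the idea of restarting, the following is a minimal sketch of an accelerated (fast) gradient method with a generic function-value restart heuristic: whenever the objective increases, the momentum is reset. This is an assumption-laden illustration of the general restart principle, not the specific scheme analyzed in the paper; the function names and the test problem are invented for the example.

```python
import numpy as np

def fista_restart(grad, f, x0, L, iters=500):
    """Fast gradient method with a function-value restart (illustrative sketch).

    Whenever the objective increases, the momentum is reset to suppress the
    oscillations that plain fast gradient methods exhibit near the solution.
    This is a generic restart heuristic, not necessarily the paper's scheme.
    """
    x = y = x0.copy()
    t = 1.0                               # momentum parameter
    f_prev = f(x)
    for _ in range(iters):
        x_new = y - grad(y) / L           # gradient step from extrapolated point
        if f(x_new) > f_prev:             # monotonicity violated: restart momentum
            y = x.copy()
            t = 1.0
            x_new = y - grad(y) / L       # plain gradient step guarantees descent
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # Nesterov extrapolation
        x, t, f_prev = x_new, t_new, f(x_new)
    return x

# Ill-conditioned quadratic f(x) = 0.5 x^T A x, minimized at the origin
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = fista_restart(grad, f, np.array([1.0, 1.0]), L=100.0)
```

On this example the restart keeps the iterates from overshooting along the well-conditioned direction, and the iterates converge to the minimizer at the origin.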