Published January 1, 2019 | Version v1
Journal article Open

A framework for parallel second order incremental optimization algorithms for solving partially separable problems

  • 1. Sabanci Univ, Fac Engn & Nat Sci, TR-34956 Istanbul, Turkey
  • 2. Istanbul Bilgi Univ, Dept Ind Engn, TR-34060 Istanbul, Turkey
  • 3. Erasmus Univ, Inst Econometr, NL-3000 DR Rotterdam, Netherlands
  • 4. Bogazici Univ, Dept Comp Engn, TR-34342 Istanbul, Turkey
  • 5. Univ Paris Saclay, Telecom ParisTech, LTCI, F-75013 Paris, France

Description

We propose Hessian Approximated Multiple Subsets Iteration (HAMSI), a generic second order incremental algorithm for solving large-scale, partially separable convex and nonconvex optimization problems. The algorithm is based on a local quadratic approximation and hence allows incorporating curvature information to speed up convergence. HAMSI is inherently parallel and scales nicely with the number of processors. We prove the convergence properties of our algorithm when the subset selection step is deterministic. Combined with techniques for effectively utilizing modern parallel computer architectures, we illustrate that a particular implementation of the proposed method based on L-BFGS updates converges more rapidly than a parallel gradient descent when both methods are used to solve large-scale matrix factorization problems. This performance gain comes only at the expense of using memory that scales linearly with the total size of the optimization variables. We conclude that HAMSI may be considered a viable alternative in many large-scale problems where first order methods based on variants of gradient descent are applicable.
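The two ingredients named in the abstract, incremental passes over subsets of a partially separable objective and curvature information via limited-memory (L-BFGS) updates, can be sketched in a few lines. The following is a hypothetical toy on a synthetic partially separable least-squares problem, not the authors' HAMSI implementation: the problem sizes, fixed step size, and memory length are assumptions for illustration, and the paper's subset scheduling, parallelization, and convergence analysis are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy partially separable objective (assumption, not from the paper):
# f(x) = sum_k 0.5 * ||A_k x - b_k||^2, where each term k touches only
# its own block of data. The system is consistent by construction.
n, m, K = 20, 5, 8
A = [rng.standard_normal((m, n)) for _ in range(K)]
x_true = rng.standard_normal(n)
b = [Ak @ x_true for Ak in A]

def subset_grad(x, k):
    """Gradient of the k-th term only (the 'incremental' ingredient)."""
    return A[k].T @ (A[k] @ x - b[k])

def full_obj(x):
    return sum(0.5 * np.sum((Ak @ x - bk) ** 2) for Ak, bk in zip(A, b))

def lbfgs_direction(g, S, Y):
    """Two-loop recursion: approximate H^{-1} @ g from memory pairs (S, Y)."""
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(S), reversed(Y)):          # newest to oldest
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if S:
        s, y = S[-1], Y[-1]
        q *= (s @ y) / (y @ y)                          # initial Hessian scaling
    for (s, y), a in zip(zip(S, Y), reversed(alphas)):  # oldest to newest
        q += (a - (y @ q) / (y @ s)) * s
    return q

x = np.zeros(n)
S, Y, memory, step = [], [], 5, 0.02                    # assumed hyperparameters
f_start = full_obj(x)
for epoch in range(30):
    for k in rng.permutation(K):                        # one incremental sweep
        g = subset_grad(x, k)
        x_new = x - step * lbfgs_direction(g, S, Y)
        s_vec, y_vec = x_new - x, subset_grad(x_new, k) - g
        if s_vec @ y_vec > 1e-10:                       # curvature condition
            S.append(s_vec); Y.append(y_vec)
            if len(S) > memory:
                S.pop(0); Y.pop(0)
        x = x_new
f_final = full_obj(x)
print(f"objective: {f_start:.3e} -> {f_final:.3e}")
```

Each inner step sees only one subset's gradient, while the shared L-BFGS memory lets curvature learned on earlier subsets precondition later ones; the small fixed step stands in for the step-size safeguards a real implementation would need.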

Files

bib-53271eeb-14c1-403a-bce3-39e541655f75.txt (282 Bytes)
md5:4eca43ea23d40297f1be042f84a36cbf