
Non-asymptotic analysis of Stochastic approximation algorithms for streaming data

  • Additional information
    • Contributors:
      Laboratoire de Probabilités, Statistiques et Modélisations (LPSM (UMR_8001)); Sorbonne Université (SU)-Centre National de la Recherche Scientifique (CNRS)-Université de Paris (UP)
    • Publication details:
      Preprint
    • Publication details:
      EDP Sciences, 2023.
    • Subject:
      2023
    • Abstract:
      We introduce a streaming framework for analyzing stochastic approximation/optimization problems. This streaming framework is analogous to solving optimization problems using time-varying mini-batches that arrive sequentially. We provide non-asymptotic convergence rates of various gradient-based algorithms; this includes the famous Stochastic Gradient (SG) descent (a.k.a. Robbins-Monro algorithm), mini-batch SG, and time-varying mini-batch SG algorithms, as well as their iterated averages (a.k.a. Polyak-Ruppert averaging). We show (i) how to accelerate convergence by choosing the learning rate according to the time-varying mini-batches, (ii) that Polyak-Ruppert averaging achieves optimal convergence in terms of attaining the Cramér-Rao lower bound, and (iii) how time-varying mini-batches together with Polyak-Ruppert averaging can provide variance reduction and accelerate convergence simultaneously, which is advantageous for many learning problems, such as online, sequential, and large-scale learning. We further demonstrate these favorable effects for various time-varying mini-batches.
    • ISSN:
      1262-3318
    • Identifier:
      10.1051/ps/2023006
    • Identifier:
      10.48550/arxiv.2109.07117
    • Rights:
      CC BY
      arXiv Non-Exclusive Distribution
    • Identifier:
      edsair.doi.dedup.....12ccd4600391c6601b108e60bd0b7830
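To make the setting of the abstract concrete, the following is a minimal illustrative sketch (not the paper's code) of stochastic gradient descent with time-varying mini-batches and Polyak-Ruppert averaging, applied to a toy streaming linear-regression problem. The specific batch-size and learning-rate schedules, and the `sample_batch` helper, are assumptions chosen for illustration only.

```python
import numpy as np

# Illustrative sketch only: time-varying mini-batch SGD with
# Polyak-Ruppert averaging on a toy linear regression stream.
# The schedules below (n_t ~ sqrt(t), gamma_t ~ t^{-0.66}) are
# assumptions for demonstration, not the paper's prescriptions.

rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])

def sample_batch(n):
    """Draw n streaming observations (X, y) with y = X @ theta_true + noise."""
    X = rng.normal(size=(n, 2))
    y = X @ theta_true + 0.1 * rng.normal(size=n)
    return X, y

theta = np.zeros(2)       # current SGD iterate
theta_bar = np.zeros(2)   # Polyak-Ruppert running average of iterates
T = 2000
for t in range(1, T + 1):
    n_t = int(np.ceil(t ** 0.5))        # time-varying mini-batch size (assumed)
    gamma_t = 0.5 / t ** 0.66           # decaying learning rate (assumed)
    X, y = sample_batch(n_t)
    grad = X.T @ (X @ theta - y) / n_t  # mini-batch least-squares gradient
    theta -= gamma_t * grad
    theta_bar += (theta - theta_bar) / t  # incremental average of the iterates

print(theta_bar)  # should lie close to theta_true
```

The averaged iterate `theta_bar` is typically less noisy than the raw iterate `theta`, reflecting the variance-reduction effect of Polyak-Ruppert averaging that the abstract highlights.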