
PAC-Bayes Un-Expected Bernstein Inequality

  • Additional Information
    • Contributors:
      CNRS; Université de Lille; Laboratoire Paul Painlevé LPP
    • Subject:
      2020
    • Collection:
      LillOA (Lille Open Archive - Université de Lille)
    • Abstract:
      We present a new PAC-Bayesian generalization bound. Standard bounds contain a $\sqrt{L_n \cdot \mathrm{KL}/n}$ complexity term which dominates unless $L_n$, the empirical error of the learning algorithm's randomized predictions, vanishes. We manage to replace $L_n$ by a term which vanishes in many more situations, essentially whenever the employed learning algorithm is sufficiently stable on the dataset at hand. Our new bound consistently beats state-of-the-art bounds both on a toy example and on UCI datasets (with large enough $n$). Theoretically, unlike existing bounds, our new bound can be expected to converge to $0$ faster whenever a Bernstein/Tsybakov condition holds, thus connecting PAC-Bayesian generalization and {\em excess risk\/} bounds---for the latter it has long been known that faster convergence can be obtained under Bernstein conditions. Our main technical tool is a new concentration inequality which is like Bernstein's but with $X^2$ taken outside its expectation.
    • File Description:
      application/octet-stream
    • Relation:
      NeurIPS 2019; http://hdl.handle.net/20.500.12210/29181
    • Electronic Access:
      https://hdl.handle.net/20.500.12210/29181
    • Rights:
      info:eu-repo/semantics/openAccess
    • Accession Number:
      edsbas.E309F83
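
For context on the abstract's reference point: the classical Bernstein inequality (the standard textbook form, not taken from this record) bounds the tail of a sum in terms of the *expected* second moment. A minimal statement reads:

```latex
% Classical Bernstein inequality (standard form, shown for context).
% For independent zero-mean random variables $X_1,\dots,X_n$ with
% $|X_i| \le b$ almost surely and $\sum_{i=1}^n \mathbb{E}[X_i^2] \le v$,
\[
  \Pr\!\Bigl(\sum_{i=1}^{n} X_i \ge t\Bigr)
  \;\le\;
  \exp\!\Bigl(-\frac{t^2}{2\,(v + bt/3)}\Bigr),
  \qquad t \ge 0.
\]
```

Per the abstract, the paper's "un-expected" variant differs in that the second-moment term enters as the random quantity $X^2$ rather than its expectation $\mathbb{E}[X^2]$; the precise statement and constants are given in the paper itself.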