
Bayesian batch active learning as sparse subset approximation

  • Authors: Pinsler, R.; Gordon, J.; Nalisnick, E.; Hernández-Lobato, J.M.
  • Source:
    Pinsler, R, Gordon, J, Nalisnick, E & Hernández-Lobato, JM 2020, Bayesian batch active learning as sparse subset approximation. in H Wallach, H Larochelle, A Beygelzimer, F d'Alché-Buc, E Fox & R Garnett (eds), 32nd Conference on Neural Information Processing Systems (NeurIPS 2019): Vancouver, Canada, 8-14 December 2019. vol. 8, Advances in Neural Information Processing Systems, vol. 32, San Diego, CA, pp. 6327-6338, 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019, Vancouver, Canada, 8/12/19 ....
  • Record Type:
    article in journal/newspaper
  • Language:
    English
  • Additional Information:
    • Contributors:
      Wallach, H.; Larochelle, H.; Beygelzimer, A.; d'Alché-Buc, F.; Fox, E.; Garnett, R.
    • Subject:
      2020
    • Collection:
      Universiteit van Amsterdam: Digital Academic Repository (UvA DARE)
    • Abstract:
      Leveraging the wealth of unlabeled data produced in recent years provides great potential for improving supervised models. When the cost of acquiring labels is high, probabilistic active learning methods can be used to greedily select the most informative data points to be labeled. However, for many large-scale problems standard greedy procedures become computationally infeasible and suffer from negligible model change. In this paper, we introduce a novel Bayesian batch active learning approach that mitigates these issues. Our approach is motivated by approximating the complete data posterior of the model parameters. While naive batch construction methods result in correlated queries, our algorithm produces diverse batches that enable efficient active learning at scale. We derive interpretable closed-form solutions akin to existing active learning procedures for linear models, and generalize to arbitrary models using random projections. We demonstrate the benefits of our approach on several large-scale regression and classification tasks.
    • File Description:
      application/pdf
    • ISBN:
      978-1-71380-793-3
      1-71380-793-9
    • Relation:
      https://dare.uva.nl/personal/pure/en/publications/bayesian-batch-active-learning-as-sparse-subset-approximation(3570d498-610d-4f63-9b96-52e8e07d24dd).html; urn:ISBN:9781713807933
    • Electronic Access:
      https://dare.uva.nl/personal/pure/en/publications/bayesian-batch-active-learning-as-sparse-subset-approximation(3570d498-610d-4f63-9b96-52e8e07d24dd).html
      https://hdl.handle.net/11245.1/3570d498-610d-4f63-9b96-52e8e07d24dd
      https://pure.uva.nl/ws/files/97822109/NeurIPS_2019_bayesian_batch_active_learning_as_sparse_subset_approximation_Paper.pdf
      http://www.proceedings.com/53719.html
      http://www.scopus.com/inward/record.url?scp=85090174813&partnerID=8YFLogxK
      https://papers.nips.cc/paper/2019/hash/84c2d4860a0fc27bcf854c444fb8b400-Abstract.html
    • Rights:
      info:eu-repo/semantics/openAccess
    • Accession Number:
      edsbas.E07ABDBB