
Large scale Gaussian processes with Matheron's update rule and Karhunen-Loève expansion

  • Additional information
    • Contributors:
      Analyse, Géométrie et Modélisation (AGM - UMR 8088); Centre National de la Recherche Scientifique (CNRS)-CY Cergy Paris Université (CY); École des Mines de Saint-Étienne (Mines Saint-Étienne MSE); Institut Mines-Télécom Paris (IMT); Laboratoire d'Informatique, de Modélisation et d'Optimisation des Systèmes (LIMOS); Ecole Nationale Supérieure des Mines de St Etienne (ENSM ST-ETIENNE)-Centre National de la Recherche Scientifique (CNRS)-Université Clermont Auvergne (UCA)-Institut national polytechnique Clermont Auvergne (INP Clermont Auvergne); Université Clermont Auvergne (UCA)-Université Clermont Auvergne (UCA); Institut Henri Fayol (FAYOL-ENSMSE); Institut Mines-Télécom Paris (IMT)-Institut Mines-Télécom Paris (IMT); Département Génie mathématique et industriel (FAYOL-ENSMSE); Ecole Nationale Supérieure des Mines de St Etienne (ENSM ST-ETIENNE)-Institut Henri Fayol
    • Publication Information:
      HAL CCSD
    • Subject:
      2023
    • Collection:
      HAL Clermont Auvergne (Université Blaise Pascal Clermont-Ferrand / Université d'Auvergne)
    • Abstract:
      Gaussian processes have become essential for non-parametric function estimation and are widely used in many fields, such as machine learning. In this paper, large-scale Gaussian process regression (GPR) is investigated. This problem is related to the simulation of high-dimensional Gaussian vectors truncated on the intersection of a set of hyperplanes. The main idea is to combine Matheron's update rule (MUR) with the Karhunen-Loève expansion (KLE). First, using the MUR, we show how samples can be drawn from the posterior distribution without computing the posterior covariance matrix or its decomposition. Second, by splitting the input domain into smaller nonoverlapping subdomains, the KLE coefficients are conditioned so as to preserve the correlation structure over the entire domain. A parallelization of this technique is developed and its advantages are highlighted; as a result, the computational complexity is drastically reduced. The mean-square block error is computed, and the approach provides accurate results when a family of compactly supported covariance functions is used. Numerical examples studying the performance of the proposed approach are included. (An illustrative sketch of the MUR sampling step follows this record.)
    • Relation:
      hal-03909542; https://hal.science/hal-03909542; https://hal.science/hal-03909542v2/document; https://hal.science/hal-03909542v2/file/MAATOUK-Large-scale%20%281%29.pdf
    • Rights:
      info:eu-repo/semantics/OpenAccess
    • Identifier:
      edsbas.32278B8E
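
The abstract above describes two ingredients: Matheron's update rule (MUR), which yields posterior samples without forming or decomposing the posterior covariance, and a block-wise Karhunen-Loève expansion (KLE) for scalable prior sampling. Below is a minimal, illustrative sketch of the MUR step only, written in Python/NumPy; it is not the authors' implementation. The squared-exponential kernel, noise level, and function names are placeholder assumptions, and the dense joint prior draw merely stands in for the paper's KLE-based construction.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance. The paper favours compactly supported
    covariance functions; this common choice just keeps the sketch short."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def matheron_posterior_sample(X_train, y_train, X_test, noise_var=1e-4, rng=None):
    """Draw one GP posterior sample via Matheron's update rule:
        f_post(x) = f_prior(x) + k(x, X) [K(X, X) + s^2 I]^{-1} (y - f_prior(X) - eps),
    so only the n x n training covariance is solved against; the posterior
    covariance at the test points is never formed or decomposed."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(X_train)

    # Joint prior sample at training and test locations (a dense Cholesky here;
    # in the paper this step is handled by a block-wise Karhunen-Loève expansion).
    X_all = np.vstack([X_train, X_test])
    K_all = rbf_kernel(X_all, X_all)
    L = np.linalg.cholesky(K_all + 1e-10 * np.eye(len(X_all)))
    f_prior = L @ rng.standard_normal(len(X_all))
    f_prior_train, f_prior_test = f_prior[:n], f_prior[n:]

    # Matheron correction: shift the prior sample toward the noisy observations.
    eps = rng.normal(scale=np.sqrt(noise_var), size=n)
    K_train = rbf_kernel(X_train, X_train) + noise_var * np.eye(n)
    K_cross = rbf_kernel(X_test, X_train)
    update = K_cross @ np.linalg.solve(K_train, y_train - f_prior_train - eps)
    return f_prior_test + update

# Hypothetical 1-D usage.
rng = np.random.default_rng(0)
X_train = np.linspace(0.0, 1.0, 20)[:, None]
y_train = np.sin(6.0 * X_train[:, 0]) + 0.05 * rng.standard_normal(20)
X_test = np.linspace(0.0, 1.0, 200)[:, None]
sample = matheron_posterior_sample(X_train, y_train, X_test, noise_var=0.05 ** 2, rng=rng)
```

In the paper, the dense joint prior draw shown above is the part replaced by the KLE on nonoverlapping subdomains with conditioned coefficients, which, together with parallelization, is what reduces the computational complexity; the MUR correction itself only requires solving against the n x n training covariance.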