
Regularized distributionally robust optimization with applications in finance

  • Authors: Zhao, Leyang
  • Record Type:
    Electronic Resource
  • Electronic Access:
    http://hdl.handle.net/1959.4/104131
    https://unsworks.unsw.edu.au/bitstreams/61c82c9c-9b2f-489b-acfe-1d623f6ef8d4/download
    https://doi.org/10.26190/unsworks/30867
  • Additional Information
    • Publisher Information:
      UNSW, Sydney 2025
    • Abstract:
      Model mis-specification occurs when the test data are generated from a different distribution than the training data. This can happen when latent random variables are not directly observed by the decision-maker; subpopulations are one example, where the value of a latent variable controls the parameters of the population distributions. This situation often arises in real-world applications, such as index tracking and portfolio selection, where the historical data may contain only partial information about the data-generating distribution. When future data are generated from a different distribution, or according to different model parameters, some statistical estimators become inefficient, resulting in large prediction errors. Distributionally robust optimization (DRO) is a successful approach to handling distributional uncertainty: it optimizes the worst-case scenario over an ambiguity set of probability distributions whose size is quantified by a discrepancy measure between distributions. On the other hand, in models with many features, it is common that some features are irrelevant to the output. Ideally, we want to keep only the relevant features in the final model and remove the redundant ones. Assuming no model mis-specification, methods such as LASSO and SCAD succeed at feature selection by adding a constraint on the size of the coefficients. When there is collinearity in the data, some estimators, such as the least squares estimator, can have large variance due to the instability of their solutions; ridge regression addresses this by adding a 2-norm constraint on the coefficients. However, there has been little research on models that combine both properties: distributional uncertainty and sparsity.
      In this thesis, we make the following key contributions: • We consider new optimization mode
    • Subject:
    • Availability:
      Open access content
      open access
      https://purl.org/coar/access_right/c_abf2
      CC BY 4.0
      https://creativecommons.org/licenses/by/4.0
      free_to_read
    • Note:
      application/pdf
      English
    • Other Numbers:
      LJ1 oai:unsworks.library.unsw.edu.au:1959.4/104131
      1503178470
    • Contributing Source:
      UNIV OF NEW S WALES
      From OAIster®, provided by the OCLC Cooperative.
    • Identifier:
      edsoai.on1503178470
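The contrast the abstract draws between LASSO-style sparsity and ridge-style stabilization can be illustrated with a short sketch. The following Python example is not from the thesis; it uses plain NumPy with a textbook coordinate-descent LASSO solver and the closed-form ridge estimator, on synthetic data with a near-collinear feature pair. The 1-norm penalty drives irrelevant coefficients exactly to zero (feature selection), while the 2-norm penalty shrinks all coefficients without zeroing them, stabilizing the solution under collinearity.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the 1-norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, alpha, iters=300):
    """Coordinate descent for (1/2n)||y - Xb||^2 + alpha * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(iters):
        for j in range(p):
            # Partial residual with feature j's contribution removed
            r_j = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r_j, n * alpha) / col_sq[j]
    return b

def ridge(X, y, alpha):
    """Closed-form ridge estimator: (X'X + n*alpha*I)^{-1} X'y."""
    n, p = X.shape
    return np.linalg.solve(X.T @ X + n * alpha * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)  # near-collinear pair
beta = np.zeros(p)
beta[0], beta[2] = 2.0, -1.5                   # only two relevant features
y = X @ beta + 0.1 * rng.normal(size=n)

b_lasso = lasso_cd(X, y, alpha=0.1)
b_ridge = ridge(X, y, alpha=0.1)

print("LASSO zero coefficients:", int((b_lasso == 0).sum()))
print("ridge zero coefficients:", int((b_ridge == 0).sum()))
```

Running this, the LASSO solution contains exact zeros on the irrelevant features, while the ridge solution has none; instead, ridge splits the signal of the collinear pair across both columns, keeping their sum near the true coefficient.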