
Measuring Privacy with Distinguishability Metrics: Definitions, Mechanisms and Application to Location Privacy

  • Additional Information
    • Contributors:
      Laboratoire d'informatique de l'École polytechnique, Palaiseau (LIX), École polytechnique (X) - Centre National de la Recherche Scientifique (CNRS); Concurrency, Mobility and Transactions (COMETE), Inria Saclay - Ile de France, Institut National de Recherche en Informatique et en Automatique (Inria); École Polytechnique; Catuscia Palamidessi (catuscia@lix.polytechnique.fr); Konstantinos Chatzikokolakis (kostas@lix.polytechnique.fr)
    • Publication Information:
      HAL CCSD
    • Publication Date:
      2014
    • Collection:
      École Polytechnique, Université Paris-Saclay: HAL
    • Abstract:
      The increasing availability of smartphones and tablets has given rise to a broad new class of applications that collect and analyze large amounts of information about their users, for purposes such as offering a personalized service, serving targeted advertisements, or providing accurate aggregated data for research and analysis. However, serious privacy concerns have been raised about the kind and quantity of data being collected: this data is generally private by nature, and it can often be linked to other kinds of sensitive information. Moreover, in most cases the information is made available to an untrusted entity, either because the service provider itself is not trustworthy, or because some aggregated information is publicly released. To address these concerns, some form of privacy guarantee is needed.

      Differential privacy is one of the most prominent frameworks for disclosure prevention in statistical databases. It provides a formal privacy guarantee, ensuring that sensitive information about individuals cannot be easily inferred from the disclosed answers to aggregate queries. If two databases are adjacent, i.e. differ only in the value of a single individual, then the query mechanism should not allow them to be told apart by more than a certain factor. This also induces a bound on the distinguishability of two arbitrary databases, determined by their distance in the Hamming graph of the adjacency relation. When the sensitive information to be protected is something other than the value of a single individual, or when the secrets themselves are not databases at all, it is common to consider other notions of distinguishability, depending on the application at hand and the privacy guarantees one wishes to express.

      In the first part of this thesis we explore the implications of differential privacy when the indistinguishability requirement depends on an arbitrary notion of distance. We show that we can naturally express, in this way, (protection against) privacy threats that cannot be ... (a formal sketch of the guarantee follows the record fields below).
    • Relation:
      tel-01098088; https://pastel.hal.science/tel-01098088; https://pastel.hal.science/tel-01098088/document; https://pastel.hal.science/tel-01098088/file/thesis%20%281%29.pdf
    • Rights:
      info:eu-repo/semantics/OpenAccess
    • Accession Number:
      edsbas.5212A498
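
For context, the distinguishability guarantee sketched in the abstract can be written out formally. This is the standard formulation of differential privacy and of its metric generalization, not text quoted from the thesis: a mechanism K satisfies ε-differential privacy if, for all adjacent databases x, x' and every measurable set S of outputs, K(x)(S) <= e^ε K(x')(S). Iterating this along the Hamming graph of the adjacency relation bounds the distinguishability of two arbitrary databases:

    \[
    K(x)(S) \;\le\; e^{\varepsilon\, d_h(x,\,x')}\, K(x')(S),
    \]

where d_h is the distance in the Hamming graph (adjacent databases have d_h = 1). The generalization explored in the thesis replaces d_h with an arbitrary distance d on the set of secrets:

    \[
    K(x)(S) \;\le\; e^{\varepsilon\, d(x,\,x')}\, K(x')(S).
    \]

For the application to location privacy named in the title, taking d to be the Euclidean distance on the plane yields geo-indistinguishability, and a standard mechanism satisfying it is the planar Laplace mechanism (Andrés, Bordenabe, Chatzikokolakis, Palamidessi, CCS 2013). Below is a minimal Python sketch of that mechanism, not code from the thesis; the function name and parameters are illustrative.

    import math
    import random

    from scipy.special import lambertw  # Lambert W, branch -1, for the radial inverse CDF

    def planar_laplace(x, y, eps, rng=random):
        """Sample a noisy location around (x, y) with privacy parameter eps.

        The noise density is proportional to exp(-eps * d((x, y), z)) with d
        the Euclidean distance, so reporting the sample satisfies
        eps*d-privacy (geo-indistinguishability).
        """
        theta = rng.uniform(0.0, 2.0 * math.pi)  # uniform direction
        p = rng.random()                         # uniform in [0, 1)
        # Inverse CDF of the radial component: p = 0 gives r = 0,
        # p -> 1 gives r -> infinity.
        r = -(1.0 / eps) * (lambertw((p - 1.0) / math.e, k=-1).real + 1.0)
        return x + r * math.cos(theta), y + r * math.sin(theta)

    # Example: perturb a location with eps = 0.1 per unit of distance.
    # noisy_x, noisy_y = planar_laplace(48.7136, 2.2085, eps=0.1)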