Protecting Locations of Individual Movement under Temporal Correlations

Xiao, Yonghui (2017)

Permanent URL: https://etd.library.emory.edu/concern/etds/tt44pn501?locale=it
Published

Abstract

Concerns about location privacy arise frequently with the rapid proliferation of GPS-enabled devices and location-based applications. In this dissertation, we study how to protect the locations of individual movement under temporal correlations. First, we propose two types of privacy notions: location privacy and customizable privacy. Location privacy protects the true location of a user at each timestamp; customizable privacy lets a user configure personalized privacy notions to match different demands. Second, we investigate how to preserve these privacy notions. We show that the traditional L1-norm sensitivity in differential privacy exaggerates the real sensitivity and thus injects too much noise into the released data. Hence we study the real sensitivity, called the sensitivity hull, for the data release mechanism, and then design the optimal location release mechanism for location privacy. We show that the data release mechanism must be dynamically updated for customizable privacy to guarantee that the privacy is protectable, which is measured by a notion called degree of protection. Third, we implement these algorithms on real-world datasets to demonstrate their efficiency and effectiveness.
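To make the over-noising claim concrete, here is a minimal sketch of the standard Laplace mechanism applied to a 2D location. The function names and parameters are illustrative, not from the dissertation; the point is that the noise scale grows with the L1-sensitivity bound, so a coarse bound over the whole domain (rather than the tighter sensitivity hull the dissertation develops) directly inflates the perturbation.

```python
import random
import math

def laplace_noise(scale):
    """Sample zero-mean Laplace noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def release_location(x, y, epsilon, l1_sensitivity):
    """Perturb a 2D coordinate with the standard Laplace mechanism.

    The noise scale is l1_sensitivity / epsilon, so an exaggerated
    L1-sensitivity bound proportionally inflates the noise added to
    the released location.
    """
    scale = l1_sensitivity / epsilon
    return x + laplace_noise(scale), y + laplace_noise(scale)
```

Doubling the sensitivity bound while holding epsilon fixed doubles the noise scale, which is exactly the utility loss a tighter sensitivity measure avoids.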

Table of Contents

1 Introduction
1.1 Motivation
1.1.1 Location Privacy
1.1.2 Customizable Privacy
1.2 Overview
1.2.1 Contributions
1.3 Publications from this dissertation

2 Related Works
2.1 Location Privacy
2.2 Inferences on Location
2.3 Differential Privacy

3 Differential Privacy on Location Set
3.1 Preliminaries
3.1.1 Two Coordinate Systems
3.1.2 Mobility and Inference Model
3.1.3 Differential Privacy and Laplace Mechanism
3.1.4 Utility Metrics
3.1.5 Convex Hull
3.2 Privacy Definition
3.2.1 δ-Location Set
3.2.2 Differential Privacy on δ-Location Set
3.2.3 Adversarial Knowledge
3.2.4 Comparison with Other Definitions
3.2.5 Discussion
3.3 Sensitivity Hull
3.3.1 Sensitivity Hull
3.3.2 Error Bound of Differential Privacy
3.4 Location Release Algorithm
3.4.1 Framework
3.4.2 Planar Isotropic Mechanism
3.4.3 Location Inference
3.5 Experimental Evaluation
3.5.1 Performance over Time
3.5.2 Impact of Parameters
3.5.3 Utility for Location-Based Queries
3.6 Concluding Remarks

4 Customizable Privacy in Hidden Markov Model
4.1 Preliminaries
4.1.1 Blowfish Privacy
4.1.2 Hidden Markov Model
4.2 Problem Statement
4.2.1 Probabilistic Constraint
4.2.2 Problem Statement
4.3 Privacy Definition
4.3.1 Policy Graph
4.3.2 DPHMM
4.3.3 Comparison with Other Definitions
4.4 Privacy Risk
4.4.1 Blowfish Analysis
4.4.2 Degree of Protection
4.5 Data Release Mechanism
4.5.1 Minimum Protectable Graph
4.5.2 Data Release Mechanism
4.5.3 Adversarial Knowledge
4.6 Privacy Composition
4.7 Empirical Evaluation
4.7.1 Runtime
4.7.2 Performance over Time
4.7.3 Impact of Privacy Budget ε
4.7.4 Tuning Privacy and Utility by Graphs
4.8 Concluding Remarks

5 Conclusion

About this Dissertation

Rights statement
  • Permission granted by the author to include this thesis or dissertation in this repository. All rights reserved by the author. Please contact the author for information regarding the reproduction and use of this thesis or dissertation.

Language
  • English