1
Securing Location Privacy in Vehicular Applications and Communications
DISSERTATION DEFENSE
GEORGE CORSER
NOVEMBER 6, 2015
2
Sections
0. Preliminaries
1. Introduction and Background
2. Properties
3. Metrics
4. Random Rotation
5. Endpoint Protection
6. Privacy By Decoy
7. Safety-Silence Tradeoff
8. Evaluation Using New Metrics
9. Conclusion
3
0. Preliminaries
Acknowledgements
Biographical Highlights
Publications
Contributions
4
Acknowledgements Mom: Maureen Corser
Advisor and doctoral committee: Dr Huirong Fu, Dr Jia Li, Dr Jie Yang, Dr Dan Steffy
Colleagues: Richard Bassous, Nahed Alnahash, Nasser Bani Hani, Ahmad Mansour, Eralda Caushaj, Jared Oluoch, and many anonymous peer reviewers
REU Students: Warren Ma, Patrick D’Errico, Lars Kivari, Mathias Masasabi, Ontik Hoque, Johnathan Cox
SVSU Students: Dustyn Tubbs, Anthony Ventura, Aaron Hooper, Alejandro Arenas, Kevin Kargula
Many more, personal and professional: SVSU faculty and administration, everyone in my family who put up with my rantings (esp. my wife)
The present company: you!
5
Biographical Highlights
EDUCATION
Year University and Degree
Current Oakland University, PhD (Candidate), Computer Science and Informatics
2011 University of Michigan – Flint, MS, Computer and Information Science
1985 Princeton University, BSE, Civil Engineering
EMPLOYMENT, ASSOCIATIONS AND AWARDS
CSIS Faculty, SVSU, Taught 32 credits in 2014-15
Organizer, Faculty Adviser, TEDxSVSU 2015
Herbert H and Grace Dow Professor Award
Member Candidate, Automotive Security Review Board, Intel Corporation
Member, ACM, Association for Computing Machinery
Member, IEEE Intelligent Transportation Systems Society, Internet of Things Community, Computer Society, Vehicular Technology Society
Member, Michigan Infragard
Member, SAE, Society of Automotive Engineers
9 Conference publications (6 as 1st author)
4 Journal publications (3 as 1st author)
13 Total publications (2012-2015) *
* not including this dissertation
6
Publications (2015*)
Corser, G., Fu, H., Banihani, A., Evaluating Location Privacy in Vehicular Communications and
Applications, IEEE Transactions on Intelligent Transportation Systems. Accepted, pending minor revisions, October 2015. Impact Factor: 3.0.
Corser, G., A. Arenas, Fu, H., Effect on Vehicle Safety from Nonexistent or Silenced Basic Safety Messages, IEEE ICNC. Accepted October 2015.
Corser, G., Masasabi, M., Kivari, L., Fu, H., Properties of Vehicle Network Privacy. Michigan Academician. Accepted June 2015.
Gurary, J., Alnahash, N., Corser, G., Oluoch, J., Fu, H., MAPS: A Multi-Dimensional Password Scheme for Mobile Authentication, ACM International Conference on Interactive Tabletops and Surfaces (ITS) 2015. Accepted September 2015.
Journal publications in bold
* not including this dissertation
7
Publications (2014)
Corser, G., Fu, H., Shu, T., D'Errico, P., Ma, W., Leng, S., Zhu, Y. (2014, June). Privacy-by-Decoy:
Protecting Location Privacy Against Collusion and Deanonymization in Vehicular Location Based Services. In 2014 IEEE Intelligent Vehicles Symposium. IEEE. Dearborn, MI.
Alrajei, N., Corser, G., Fu, H., Zhu, Y. (2014, February). Energy Prediction Based Intrusion Detection In Wireless Sensor Networks. International Journal of Emerging Technology and Advanced Engineering, Volume 4, Issue 2. IJETAE.
Alnahash, N., Corser, G., Fu, H. (2014, April). Protecting Vehicle Privacy using Dummy Events. In 2014 American Society For Engineering Education North Central Section Conference. ASEE NCS 2014.
Oluoch, J., Corser, G., Fu, H., Zhu, Y. (2014, April). Simulation Evaluation of Existing Trust Models in Vehicular Ad Hoc Networks. In 2014 American Society For Engineering Education North Central Section Conference. ASEE NCS 2014.
Journal publications in bold
8
Publications (2013, 2012)
Corser, G., Fu, H., Shu, T., D'Errico, P., Ma, W. (2013, December). Endpoint Protection Zone (EPZ):
Protecting LBS User Location Privacy Against Deanonymization and Collusion in Vehicular Networks. In Second International Conference on Connected Vehicles & Expo. IEEE. Las Vegas, NV.
Corser, G., Arslanturk, S., Oluoch, J., Fu, H., & Corser, G. E. (2013). Knowing the enemy at the gates: measuring attacker motivation. International Journal of Interdisciplinary Telecommunications and Networking 5(2), 83-95. IJITN.
Corser, G. (2012, October). A tale of two CTs: IP packets rejected by a firewall. In Proceedings of the 2012 Information Security Curriculum Development Conference (pp. 16-20). ACM. InfoSecCD 2012, Kennesaw, GA. Best Paper Runner-up.
Corser, G. (2012, October). Professional association membership of computer and information security professionals. In Proceedings of the 2012 Information Security Curriculum Development Conference (pp. 9-15). ACM. InfoSecCD 2012, Kennesaw, GA. Best Student Paper Runner-up.
Corser, George, Entropy as an Estimate of Image Steganography. Accepted, but never paid fee for publication, CAINE 2012.
Journal publications in bold
9
Contributions Definitions: Distinctions between privacy, location privacy and network privacy
Properties: Twelve fundamental properties must be preserved as privacy functionality is added to VANETs
SSTE: Analysis of Impact on Safety
Metrics and Theoretical Estimates: New ways to measure privacy in CPLQ contexts
Software: To measure and visualize
RRVT: Vehicle mobility patterns present special opportunities for dummy event privacy techniques
EPZ: Endpoint mix techniques provide high levels of protections against deanonymization and no additional overhead, but limited application functionality
PBD: Active decoys protect against deanonymization and give high decentralization and user control, but possibly at a high cost in overhead
Comparison using KDT: Variability in mix zones creates more clumping, which increases anonymity levels
10
1. Introduction and Background
What is a Vehicular Ad-hoc Network?
Privacy Problem
Importance
Complexity
Prior Solutions
Background◦ Location Privacy◦ Vehicle Networks◦ Vehicle Mobility◦ Applications
Coming Network Traffic Control
Philosophies of Privacy
Dummy Events
11
What is a Vehicular Ad-hoc Network?
VANET – Vehicular Ad-hoc Network
GPS – Global Positioning System
V2V – Vehicle to Vehicle
V2I – Vehicle to Infrastructure
RSU – Roadside Unit
LBS – Location Based Service
TMS – Traffic Management System (a type of LBS)
12
Privacy Problem Vehicle network privacy research should identify measurable relationships regarding:
a. vehicle safety, i.e. active transmission of heartbeat messages, without silent period,
b. network efficiency, i.e. low latency, low congestion and low computational and storage overhead,
c. application availability, i.e. access to location based services (LBS), and
d. desired level of privacy (or transparency).
13
Privacy Problem (Continued) A vehicular privacy system should:
1. accommodate differing vehicular mobility patterns, esp. low vehicle densities, notably during the transition period between system initialization and full saturation,
2. provide for multi-layer defenses, e.g. MAC and APP layer attacks,
3. permit user choice so motorists need not depend on cooperation by large numbers of other motorists or by trusted third parties (TTPs),
4. provide protection even when LBS applications require frequent precise location queries (FPLQs), also called continuous precise location queries (CPLQs), and
5. provide protection over a wide geographical range, beyond any particular vehicle’s communications range, and for a long duration of time.
14
Importance Privacy is a form of confidentiality. If you do not believe privacy is important, ask yourself: would you post your online banking username and password on your Facebook page?
Vehicle location is confidential information. If you do not believe vehicle location privacy is important, ask yourself: would you place a $35 GPS tracker in your vehicle, and post its credentials on your Facebook page?
The US Supreme Court affirmed that car tracking constitutes a search and requires a warrant. See: Torrey Dale Grady v. North Carolina (March 30, 2015). Unanimous decision.
US DOT desires privacy protection in connected vehicle systems. See: US DOT ITS JPO. However, no consensus has emerged on how to define vehicular privacy, let alone how to implement it.
Principle of Least Privilege: Just because one bad actor cannot be prevented from accessing private information does not mean security measures should not be implemented against other bad actors. Case in point: E911.
15
Complexity: DSRC/WAVE Protocol Stack
DSRC – Dedicated Short Range Communications
WAVE – Wireless Access in Vehicular Environments
WSMP – WAVE Short Message Protocol
BSM – Basic Safety Message (SAE J2735)
Privacy via temporary MAC address, public/private keys based on pseudo-IDs
16
Prior Solution: PseudoID Changing
Does not protect against map deanonymization
17
Prior Solution: Group Model
Reduces location protection range to communication range of group
Requires complex handoff schemes
Certificate Revocation List (CRL) requirements considered too challenging for group model
18
Prior Solution: Spatial Cloaking
Spatial cloaking in a series of three snapshots: Vehicle 5 maintains k,s‑privacy at each snapshot but all three snapshots analyzed together may reveal the vehicle
19
Background
Location Privacy
Vehicle Networks
Vehicle Mobility
Applications
Scope of Research
20
Location Privacy: Theory
Image source: Shokri, 2010
21
Location Privacy: Threat Model Global passive adversary (GPA)
MEANS: access to LBS, RSU, DSRC communications only, not cell phone data, and perhaps license plate reader (LPR) or other camera data; knowledge of geography (road maps / road topology), traffic conditions (blocked / slow roads), home owner names and addresses and geographical coordinates, and perhaps the target’s name, address, license plate number, and mobility profile.
ACTIONS: passive (eavesdrop only); global geographical scope, the entire area covered by the TMS; temporal scope may be long term, hours, days, months, or even longer periods of time.
GOALS: The goal of the GPA would be real-time tracking, to determine whether a specific individual (target) is at a given place at a given time, or tracing, to pinpoint the target’s position in the past.
22
Vehicle Networks: Threat Model
23
Vehicle Mobility
Pedestrian Mobility Patterns
Image source: You, Peng and Lee, 2007
Vehicle Mobility Patterns
24
Applications: Telematics
Pioneer Networked Entertainment eXperience (NEX)
Image source: MacRumors
25
Applications: DSRC
Image source: ITS International
Image source: ETSI
26
Coming Vehicle Network Traffic Control
27
Philosophies of Privacy
Charles Fried (Control Theory)
James Moor (Restricted Access Theory)
George Corser(Active Decoy Theory)
28
Dummy Events What if it were possible to create dummies with realistic movements?
Source: Location Privacy in Pervasive Computing, Beresford & Stajano, 2003
29
Dummy Events: Toward Realism

Authors | Methods | Category
You, Peng and Lee (2007) | Random trajectory | Spatial shift
Lu, Jensen and Yiu (2008) | Virtual grid, virtual circle | Spatial shift
Chow & Golle (2009) | Google Maps polyline | Trajectory database
Kido, Yanagisawa, Satoh (2009) | Moving in a neighborhood | Spatial shift
Krumm (2009) | GPS data modified with noise | Trajectory database
Alnahash, Corser, Fu, Zhu (2014) | Random trajectory confined to roads | Spatial shift
Corser, et al. (IEEE, 2014) | “Live” dummies from active vehicles | Active decoy
30
Summary What is a Vehicular Ad-hoc Network?
Privacy Problem
Importance
Complexity
Prior Solutions
Background◦ Location Privacy◦ Vehicle Networks◦ Vehicle Mobility◦ Applications
Coming Network Traffic Control
Philosophies of Privacy
Dummy Events
31
2. Properties
1. Collision Avoidance
2. Authentication
3. Pseudonymity
4. Untrackability
5. Untraceability
6. Accountability
7. Revocability
8. Anonymity
9. Decentralization
10. Undeanonymizability
11. Efficiency
12. User Control
32
Property 1: Collision Avoidance Basic Safety Messages (BSMs) must maximize safety.
To achieve collision avoidance, vehicles inform each other regarding whether or not they are on a trajectory to collide. In VANETs this is accomplished using BSMs.
Privacy protocols must minimize silent periods.
Image source: Clemson.edu
33
Property 2: Authentication BSMs must be trustworthy.
Digital signatures, but no encryption
Privacy protocols must maximize public key infrastructure (PKI) efficiency.
Image source: www.sit.fraunhofer.de
34
Property 3: Pseudonymity BSMs must not reveal identities of vehicles.
To achieve pseudonymity, a type of identity privacy, BSMs must use pseudonyms, or pseudoIDs, each of which has a corresponding digital certificate. Except in circumstances requiring accountability or revocability, described below, pseudoIDs and their certificates are unlinkable to the vehicle identification number (VIN) of the vehicle and to the personally identifiable information (PII) of the vehicle owner.
Moot with map deanonymization, CPLQ.
Image source: slideshare
35
Property 4: Untrackability PseudoIDs in currently transmitted BSMs must not be linkable to pseudoIDs in immediately preceding BSMs from the same vehicle, except by proper authorities.
If a vehicle were identified (marked), and its pseudoID linked to PII even a single time, then the vehicle could be monitored as long as its BSM used that same pseudoID. Hence this property may be appropriately referred to as real-time location privacy.
36
Property 5: Untraceability PseudoIDs in current or past BSMs must not be linkable to other pseudoIDs from the same vehicle, except by proper authorities.
To achieve untraceability, BSMs must be transmitted using multiple pseudoIDs, switching between them, as in untrackability, above. However, the property of untraceability is distinct from untrackability. By this paper's definition, “tracking” a vehicle would be performed in real-time, while the vehicle is in motion. “Tracing” the vehicle would be a matter of historical investigation, to determine what vehicle was at what location at what time. Hence this property may be appropriately referred to as historical location privacy.
37
Property 6: Accountability PseudoIDs must be linkable to PII by proper authorities.
Sometimes it is beneficial to link a vehicle to its owner’s identity and/or its location, such as when a vehicle may have been used in a crime or involved in an accident. It may be argued that a privacy protocol without the property of accountability would introduce more risk to the public by concealing criminals than it would introduce security to the public by protecting people’s privacy.
38
Property 7: Revocability PseudoIDs and digital certificates must be rescindable.
It is possible that valid digital certificates could be stolen and used maliciously. If this is detected the certificate should be revoked.
To achieve revocability, a CA or other TTP must provide valid digital certificates for pseudoIDs while maintaining the capability of rescinding certificates by updating and distributing a certificate revocation list (CRL) if requested by a proper authority.
39
Property 8: Anonymity Location privacy models must maximize indistinguishability between pseudoIDs.
Privacy protocols can be evaluated by anonymity, which we define as the quantifiable amount of privacy the vehicle’s pseudoID enjoys by using the protocol.
Location privacy models are effective to the extent a previously identified entity becomes unidentified/ambiguous with a number of other entities, i.e. it could be in any of a number of positions
40
Property 9: Decentralization BSMs must not be traceable by a single TTP.
Decentralized protocols by definition involve multiple independent TTPs. The purpose of decentralization is to prevent a single TTP from being able to deanonymize vehicles.
41
Property 10: Undeanonymizability
Location privacy models must minimize cross-referenceability.
If vehicles require LBS query results based on precise location data, then eavesdroppers with access to the content of that data could use map databases or other cross-references to deanonymize motorists and vehicles. For example, if a driver started her vehicle at coordinates (x,y), her home address, then eavesdroppers could match (x,y) with the longitude and latitude of the address and possibly identify the driver.
42
Property 11: Efficiency Location privacy protocols must minimize overhead.
In order to maximize efficiency, location privacy protocols must minimize overhead: unnecessary consumption of bandwidth, memory and computational resources.
43
Property 12: User Control Location privacy models must maximize individual customization settings.
Not all motorists prefer privacy. Some prefer transparency. Some may prefer to switch between transparency and privacy. The capability of a location privacy model to permit end user customization or even the capability to shut off anonymization altogether, indicates a superior privacy protocol compared to one that does not offer the feature.
44
3. Metrics
Traditional Anonymity Metrics◦ Anonymity Set Size (k=|AS|)◦ Entropy of Anonymity Set Size (H[k=|AS|])◦ Tracking Probability (Prob[k=|AS|=1])
Traditional Distance and Time of Anonymity Metrics◦ Short Term Disclosure (SD)◦ Long Term Disclosure (LD)◦ Distance Deviation (dst)
Proposed Composite Metric (KDT) and Theoretical Estimates◦ Continuous/Cumulative Anonymity Set Size (K)◦ Continuous/Cumulative Distance Deviation (D)◦ Duration/Time of Anonymity (T)
A New Approach Toward Metrics
45
Traditional Anonymity Metrics The anonymity set, ASi, of target user, i, is the collection of all users, j, including i, within the set of all userIDs, ID, whose trajectories, Tj, are indistinguishable from Ti. That is, the probability of confusion of i and j, p(i,j)>0. The anonymity set size is the number of elements in the set.
Information entropy expresses the level of uncertainty in the correlations between Ti and Tj.
Tracking probability, Pti, is defined as the probability that the anonymity set equals 1. This metric is important because average Pt tells what percentage of vehicles have some privacy, and what percentage have no privacy at all, not just how much privacy exists in the overall system.
$AS_i = \{\, j \in ID \mid p(i,j) > 0 \,\}$

$H = -\sum_{j \in AS_i} p(i,j) \log_2 p(i,j)$

$Pt_i = P(|AS_i| = 1)$
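These three traditional metrics can be sketched in a few lines of Python (function names are illustrative, not from the dissertation):

```python
import math

def anonymity_set_size(probs):
    # k = |AS|: candidates j with confusion probability p(i,j) > 0
    return sum(1 for p in probs if p > 0)

def entropy(probs):
    # H = -sum of p(i,j) * log2 p(i,j) over the anonymity set
    return -sum(p * math.log2(p) for p in probs if p > 0)

def tracking_probability(set_sizes):
    # Pt: fraction of vehicles whose anonymity set collapses to 1
    return sum(1 for k in set_sizes if k == 1) / len(set_sizes)

# Four equally plausible candidates: k = 4, H = 2 bits
print(anonymity_set_size([0.25, 0.25, 0.25, 0.25]))  # → 4
print(entropy([0.25, 0.25, 0.25, 0.25]))             # → 2.0
print(tracking_probability([1, 4, 2, 1]))            # → 0.5
```

Note how Pt captures what the average k does not: the last example has healthy average anonymity, yet half the vehicles have no privacy at all.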
46
Traditional Anonymity Distance Metric
dst is the average distance between the trajectories of the dummies and the true user
dsti : the distance deviation of user i
PLji : the location of true user i at the jth time slot
Ljdk : the location of the kth dummy at the jth time slot
dist() expresses the distance between the true user location and the dummy location
n dummies
m time slots
Source: You, Peng and Lee, 2007
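Assembling the symbol definitions above, the distance-deviation metric plausibly takes this form (a reconstruction, since the original equation image did not survive):

```latex
% Average distance deviation of user i over m time slots and n dummies
dst_i = \frac{1}{m \cdot n} \sum_{j=1}^{m} \sum_{k=1}^{n}
        \operatorname{dist}\!\left(PL_{ji},\, L_{jd_k}\right)
```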
47
Traditional Anonymity Time Metrics
Short-term Disclosure (SD): the probability of an eavesdropper successfully identifying a true trajectory given a set of true and dummy POSITIONS over a short period of time.
Long-term Disclosure (LD): the probability of an eavesdropper successfully identifying a true trajectory given a set of true and dummy TRAJECTORIES over a longer period of time.
m: time slices
Di : set of true and dummy locations at time slot i
n total trajectories
k trajectories that overlap
n – k trajectories that do not overlap
Tk is the number of possible trajectories amongst the overlapping trajectories
More overlap means more privacy
Source: You, Peng and Lee, 2007
48
Proposed Composite Metric: KDT Time of Anonymity, or Anonymity Duration (T)
Average Anonymity Set Size (K)
Average Distance Deviation (D) KDT is presented in more detailin section 8
49
A New Approach Toward Metrics
50
4. Random Rotation of Vehicle Trajectories
Random Rotation of Vehicle Trajectories (RRVT) ◦ Description◦ Mobility Patterns◦ Overlap Improves Privacy◦ Simulation Setup◦ Results◦ Recall: Threat Model
51
Random Rotation of Vehicle Trajectories (RRVT): Description
If a privacy-seeker knew the possible trajectories of other entities, she could send genuine and false requests to an LBS.
Vehicles and pedestrians move in different patterns.
Left image source: You, Peng and Lee, 2007
Overlap Improves Privacy
Example: 3 trajectories, 8 possible paths
Image source: You, Peng and Lee, 2007
Simulation Setup Sim. time: 20 time slots
Speed: ~3 squares/slot
Dummies: sets of 5 to 25
Manhattan grid 50x50
Trajectories constrained to roadways every 10 grid squares
Ran simulation nine times per dummy set
Data presented: median number of trajectory intersection overlaps
Example real trajectory in redExample dummy trajectories in black
54
Results Random Rotation may be more effective in vehicular settings than in pedestrian ones because overlapping trajectories are more likely (though of very short duration)
55
Recall: Threat Model
56
5. Endpoint Protection Zone
Endpoint Protection Zone (EPZ)◦ Description◦ Simulation Setup◦ Performance Evaluation◦ Results
57
Endpoint Protection Zone (EPZ) Description
Divide region into squares or rectangles containing residence and workplace locations
Examples◦ Apartment Complex, Housing Subdivision
Vulnerabilities Defended◦ Map deanonymization◦ LBS-RSU collusion
58
Simulation Setup Realistic mobility patterns
◦ City◦ Urban◦ Rural
Parameters (independent variables)◦ V: number of vehicles in region, R◦ λ: ratio of LBS user vehicles to V◦ A: area of R (constant 3000m x 3000m)◦ w, h: width, height of EPZ (EPZ Size)◦ T: 2000 seconds
Computed metrics (dependent variables)◦ Metrics: |AS|, H(|AS|), Pt
Theoretical Estimate:
E{ |AS_EPZ| } = λVwh / (WH)
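The theoretical estimate is simple enough to check numerically (illustrative function; the example parameter values are made-up inputs):

```python
def expected_epz_as(lam, V, w, h, W, H):
    # E{|AS_EPZ|} = lambda * V * w * h / (W * H): expected number of
    # LBS-user vehicles sharing a w x h EPZ within the W x H region
    return lam * V * w * h / (W * H)

# e.g. 10% LBS users, 1000 vehicles, 600 m EPZ inside a 3000 m region
print(expected_epz_as(0.1, 1000, 600, 600, 3000, 3000))  # → 4.0
```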
59
Performance Evaluation: |AS|
10% LBS USERS (λ=0.1), 20% LBS USERS (λ=0.2)
60
Performance Evaluation: H(|AS|)
10% LBS USERS (λ=0.1), 20% LBS USERS (λ=0.2)
61
Performance Evaluation: Pt
10% LBS USERS (λ=0.1), 20% LBS USERS (λ=0.2)
62
Results Traffic management or other LBS with privacy is possible using EPZ
◦ CPLQ replies sent to IP address, or could have random code to decrypt location specific message
Minimum vehicle density threshold is required to prevent Pt=1
◦ May not work in remote rural areas, unless VERY large EPZ
◦ If no other cars in area, what is the point of V2V safety?
Advantages◦ No additional overhead (if no CPLQ)◦ Controllable anonymity set size◦ “Edward Snowden” effect◦ Very realistic, because queries are from real vehicle positions
Limitations◦ If attacker hates everyone in a particular neighborhood, the system may fail◦ User can’t use LBS within EPZ◦ Attacker could put license plate reader (LPR) at ingress/egress points, then track
63
6. Privacy by Decoy
Privacy By Decoy (PBD)◦ Description◦ Simulation Setup◦ Results
64
Privacy by Decoy (PBD)
Description
65
Simulation Setup Enhancement of EPZ model
Grid: ◦ 3000 m x 3000 m (1.864 mi x 1.864 mi)
Mobility patterns:◦ rural, urban and city
Simulation time◦ 2000 seconds (33.3 minutes)
Endpoint Protection Zones (EPZs)◦ 600 m x 600 m (25 EPZs): few large EPZs◦ 300 m x 300 m (100 EPZs): many small EPZs
Expected number of LBS connections in EPZ, no parroting:
◦ E(k=|AS|) = λVwh/WH
If group parroting:◦ E(k=|AS|) = (λ + ρ) Vwh/WH
λ = LBS users; ρ = potential parrots
Pirates: Vehicles desiring privacy
Parrots: Vehicles willing to be active decoys
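The effect of group parroting on the expected anonymity set can be sketched as follows (illustrative code; the vehicle counts are made-up inputs):

```python
def expected_as(lam, rho, V, w, h, W, H):
    # Without parroting pass rho=0: E(k) = lam * V * w * h / (W * H).
    # With group parroting, parrots (ratio rho) join the LBS user pool:
    # E(k) = (lam + rho) * V * w * h / (W * H).
    return (lam + rho) * V * w * h / (W * H)

base = expected_as(0.1, 0.0, 1000, 600, 600, 3000, 3000)          # no parrots
with_parrots = expected_as(0.1, 0.1, 1000, 600, 600, 3000, 3000)  # rho = lam
print(base, with_parrots)  # → 4.0 8.0
```

With as many potential parrots as genuine LBS users, the expected anonymity set doubles.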
66
Results: E(|AS|)
67
Results: E(H[|AS|])
68
Results: Pt
69
Results Advantages
◦ Attacker can’t put LPRs in enough places to track everyone◦ User choice! User can choose whether or not to be private◦ Better protection in low density areas
Disadvantages◦ Overhead: exponential increase in LBS requests (May be mitigated if technique used sparingly)
70
7. Safety-Silence Tradeoff Equation
Safety-Silence Tradeoff Equation (SSTE)◦ Description◦ Proof◦ Major Assumption◦ Estimate Using Real-World Data◦ Implications
If the US Department of Transportation mandates the implementation of VANETs, as they are expected to do, not all vehicles will transmit basic safety messages (BSMs). At some point in time, at a given intersection or other potential collision point, perhaps only half of the vehicles will have VANET equipment installed. Even after full deployment vehicles will go radio silent when implementing privacy protocols. What’s the risk?
71
Safety-Silence Tradeoff Equation (SSTE) Description
With four vehicles at a four-way stop, there are 28 possible impact points but 4-choose-2 = 6 possible collisions between two vehicles. The safety-silence tradeoff equation predicts the probability of at least one collision given n=4 vehicles and b=2 is
1 - (1 - pb)(1 - pu)^5, where pb and pu are as defined in PROOF.
72
Proof Let pb be the probability of a collision between two vehicles both transmitting BSMs. The probability of the two vehicles not colliding is (1 - pb). By definition of combination, the total number of potential collisions between two vehicles is b-choose-2, where b is the number of vehicles in R transmitting BSMs during ∆t. This can be reduced by a constant, cb, to eliminate certain impossible scenarios.
Let pu be the probability of a collision between two vehicles where at least one is not transmitting BSMs. The probability of two such vehicles not colliding is (1 - pu) and the number of potential collisions is (n-choose-2) minus (b-choose-2), where n is the total number of vehicles and b is the number of vehicles transmitting BSMs in R during ∆t. This can be reduced by a constant, cu, to eliminate certain impossible scenarios.
The probability of zero collisions is the product of (1 - pb) ^ (b-choose-2) and (1 - pu) ^ [ (n-choose-2) minus (b-choose-2) ], which is the probability that each and every potential collision fails to occur. The probability of at least one collision in R during ∆t is 1 minus that product, as the equation states. QED
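The equation in the proof can be expressed directly in Python (a sketch; the reduction constants cb and cu default to zero):

```python
from math import comb

def p_at_least_one_collision(n, b, pb, pu, cb=0, cu=0):
    # Potential collisions among BSM-transmitting pairs: C(b,2) - cb
    # Pairs where at least one vehicle is silent: C(n,2) - C(b,2) - cu
    protected = comb(b, 2) - cb
    unprotected = comb(n, 2) - comb(b, 2) - cu
    return 1 - (1 - pb) ** protected * (1 - pu) ** unprotected

# Four-way stop, n=4 vehicles, b=2 transmitting: 1 protected pair and
# 5 unprotected pairs, matching 1 - (1 - pb)(1 - pu)^5.
# Implications slide: n=10, b=5 leaves C(10,2) - C(5,2) pairs unprotected
print(comb(10, 2) - comb(5, 2))  # → 35
```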
73
Major Assumption The probability that three vehicles will collide at the same time is so small that we can assume it to be zero.
In practice there are many multi-car pileups. However, it is not clear how many of these are three-car collisions and how many are subsequent collisions due to initial collisions between two vehicles.
74
Estimate Using Real-World Data About 40% of all accidents in the US occurred at intersections in 2006, 8,500 of which were fatal and 900,000 of which were injurious [17]. In Japan, in 1997, 60% of accidents occurred at intersections. Based on a study of 150 four-legged intersections in the Tokyo metropolitan area, researchers observed that the average probability of a vehicle encountering an obstacle vehicle was 0.339 and the average probability of a following vehicle driver’s failure was 0.0000002 [18]. These vehicles were not outfitted with on-board units (OBUs), the devices that transmit BSMs.
No data are available regarding accident probabilities of vehicles with OBUs installed, however one recent NHTSA report estimated that V2V could one day address 79% of vehicle crashes [19]. From this we can roughly estimate that the probability of a crash at an intersection without any BSMs is 0.339 times 0.0000002, or 0.0000000678, and the probability of a crash at an intersection with all vehicles transmitting BSMs is (1-0.79) times 0.0000000678, or 0.00000001423.
75
Implications Under given assumptions, if 5 of 10 (50%) vehicles transmit BSMs, 35 out of 45 (78%) collisions remain possible.
If a few go silent, they might as well all go silent.
Bottom line: Any successful privacy protocol will require some nonzero cost in terms of safety.
76
8. Evaluation Using New Metrics
Definitions of Privacy
Time of Anonymity (T)
Average Anonymity Set Size (K)
Average Distance Deviation (D)
Expected Value of D
Comparison of K vs. k
Comparison of H[K] vs. H[k]
77
Definitions of Privacy Definition 1. Privacy: the degree to which an entity cannot be linked to its identity.
Definition 2. Location privacy: the degree to which a spatial characteristic of an entity cannot be linked to its identity.
Definition 3. Continuous location privacy: the degree to which, over a contiguous series of time intervals, a spatial characteristic of an entity cannot be linked to its identity.
Definition 4. Network privacy: the degree to which an entity cannot be linked to its identity while it is connected to a communications system.
Definition 5. Network location privacy: the degree to which a spatial characteristic of an entity cannot be linked to its identity while it is connected to a communications system.
Definition 6. Continuous network location privacy: the degree to which, over a contiguous series of time intervals, a spatial characteristic of an entity cannot be linked to its identity while it is connected to a communications system.
78
Time of Anonymity (T) Time of Anonymity, or Anonymity Duration (T):
The set of all contiguous time intervals during which an entity is network-connected and indistinguishable from at least one other entity
79
Average Anonymity Set Size (K) Average Anonymity Set Size (K)
Anonymity set size, kj, is the number of entities that might be confused with one another at time, tj, that is, kj = | ASj |. Average anonymity set size, K, is the sum of all kj from t0 to tj divided by the time of anonymity, | T | + 1
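In symbols, the definition above reads (reconstructed from the text, since the slide's equation image is missing):

```latex
K = \frac{1}{|T| + 1} \sum_{j=0}^{|T|} k_j , \qquad k_j = |AS_j|
```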
80
Average Distance Deviation (D)
The distance, dsij, is between two entities, s and i, in time interval tj. Let psij be the probability that an attacker will guess that entity i is the target, given that entity s is the actual target, in time interval tj. The quantity, kj, is the anonymity set size of the target vehicle at time j.
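A plausible reconstruction of the missing equations, based on the symbol definitions above (the uniform-guessing probability is an assumption, not stated on the slide):

```latex
D = \frac{1}{|T| + 1} \sum_{j=0}^{|T|} \; \sum_{i \in AS_j} p_{sij}\, d_{sij},
\qquad p_{sij} = \frac{1}{k_j} \ \text{(uniform guessing)}
```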
81
Expected Value of D
[Figure: unit square with corners (0,0) and (1,1), center (0.5,0.5), regions B and C; uniform distribution]
Result: 0.5739
82
Expected Value of D (continued)
E[d] evaluates to approximately 0.1988
83
Comparison of Metrics
84
Example: Comparison of K vs. k
K vs. trajectory k-anonymity: the latter does not incorporate the effect of higher anonymity in earlier time intervals and consequently reports lower values of k for clumpy protocols and higher values for uniform protocols. The chart at right shows average maximum anonymity set size, kmax.
85
9. Conclusion
Contributions of this Research
Definitions: Location Privacy, notably in CPLQ context
Properties: Twelve fundamental properties must be preserved as privacy functionality is added to VANETs
Metrics: New metrics are required to measure privacy in CPLQ contexts
RRVT: Vehicle mobility patterns present special opportunities for dummy event privacy techniques
EPZ: Endpoint mix techniques provide high levels of protections against deanonymization and no additional overhead, but limited application functionality
PBD: Active decoys protect against deanonymization and give high decentralization and user control, but possibly at a high cost in overhead
Comparison of Protocols Using KDT: Variability in mix zones creates more clumping, which increases anonymity levels
86
Closing Perspectives and Future Work
Digital privacy seems so difficult because we think about it in outdated terms
We may need to re-think privacy to be an occasional, continuous attribute of security, not a static political right
The technical means by which we implement privacy may come at high costs which would only be appropriate in relatively rare circumstances
Future Work: Closer examination of overhead