
Privacy Challenges in Pervasive Spaces



  1. Privacy Challenges in Pervasive Spaces RESCUE All Hands Meeting – June 2007

  2. Outline • Pervasive spaces • Examples • Privacy challenges • What can lead to loss of privacy • Towards privacy-preserving pervasive spaces • Different components, different trust models • Event detection in the untrusted model

  3. Pervasive Spaces

  4. Example: Incident Level Awareness

  5. Example: Multimodal Surveillance [Diagram: nearby sensors report events of interest, e.g., Event: shooter on campus; Shooter location: UCI#outdoors/(300,506)]

  6. Other Pervasive Space Applications • Patient/health monitoring systems • In hospitals • At home for the elderly • Smart offices • Smart demand-response systems • Smart meters that dynamically monitor consumer behavior to optimize power usage • Cyber-physical systems in general…

  7. Outline • Pervasive spaces • Examples • Privacy challenges • What can lead to loss of privacy • Towards privacy-preserving pervasive spaces • Event detection in the untrusted model

  8. Privacy Challenges Whenever sensor data captures human activity, there is a potential for privacy breach…

  9. Disclosure Risk in Observation • Sensing • Sensor readings could disclose identity • E.g., video data capture, RFID data capture, etc. • Obvious fix: anonymization/obfuscation

  10. How trajectories identify people [Diagram: a trajectory through the Calit2 4th-floor faculty-office hallway, the Bren Hall ICS faculty-office hallway, and the Calit2 4th-floor kitchen]

  11. Event Detection can lead to Privacy Loss • Besides location, other information contained in an event could also lead to disclosure of identity • "The server room has been accessed" + knowledge that only 3 people have access to the server room → reveals the identity of the individual to an unsafe degree.

  12. Detecting composite events can lead to inference • A simple event can affect multiple composite events (rules) • R1: Only a faculty member or an admin may log in to the IBM server • R2: Individuals need to swipe their cards to enter the server room (identified only up to their group, e.g., student, staff, faculty) • Knowledge: some student swiped a card to enter the server room at night + there was one login to the server at night • Bob is the only student who is also an admin → the person who logged into the server must be Bob

  13. Inference via composite events (example 2) • Consider two events • E1: a CS 295 student enters the room • E2: an ISG student enters the room • Knowledge: • If we know E1 was detected → 1 out of ~8 • If we know E2 was detected → 1 out of ~40 • If we know some event was detected (but not which one) → 1 out of ~45 • If we know both events were detected → 1 out of ~4! • Replace ISG by Sconce → we know it is Ali!!! • The example shows that knowing which events, or how many, were detected can lead to disclosure (see the sketch below)
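
As a rough illustration of this arithmetic, here is a minimal Python sketch; the rosters (8 CS 295 students, 40 ISG students, 4 in both groups) are illustrative assumptions chosen to mirror the slide's approximate numbers.

```python
# Sketch: how knowledge of which composite events fired shrinks the
# anonymity set. Group memberships are made up to match the slide.

cs295 = {f"s{i}" for i in range(8)}      # s0..s7: the ~8 CS 295 students
isg = {f"s{i}" for i in range(4, 44)}    # s4..s43: the ~40 ISG students
                                         # (s4..s7 belong to both groups)

def anonymity_set(knowledge):
    """Return the set of people consistent with what the snooper knows."""
    if knowledge == "E1":          # a CS 295 student entered
        return cs295
    if knowledge == "E2":          # an ISG student entered
        return isg
    if knowledge == "some event":  # one of E1/E2 fired, unknown which
        return cs295 | isg
    if knowledge == "both":        # both fired; assume one person triggered both
        return cs295 & isg
    raise ValueError(knowledge)

for k in ("E1", "E2", "some event", "both"):
    print(f"{k:10} -> 1 out of {len(anonymity_set(k))}")
# E1 -> 8, E2 -> 40, some event -> 44 (~45), both -> 4
```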

  14. Cornell Study of Electric Usage in Dorms • [Mulligan et al.]: data mining over a few months of electricity usage in a dorm was able to determine what activities students were involved in: • Eating • Sleeping • …

  15. Inference in Pervasive Spaces • Pervasive spaces can be viewed as dynamically evolving systems • Sensors capture the state of the system at any given instant of time • Knowledge of what state the system is in can result in a privacy violation • A given state, or a state transition, may be a distinctive pattern that identifies a person's presence/absence and/or an activity/event the person is involved in

  16. Outline • Pervasive spaces • Examples • Privacy challenges • What can lead to loss of privacy • Towards privacy-preserving pervasive spaces • Different components, different trust models • Event detection in the untrusted model

  17. Towards Pervasive but NOT Invasive Spaces • Before we study privacy-preserving pervasive systems we need to address two things: • 1) Components of a pervasive system • Subjects – who are immersed in the space • System – the infrastructure owner and/or the operators who manage the infrastructure • Observers – who can communicate with the system to get a view of the state of the system and/or the subjects (assuming they have the access privileges for such information) • This is not a comprehensive model; it may differ from application to application • 2) Trust model • Are the infrastructure and its operators trusted? • If yes, privacy policies and/or access-control mechanisms can be used • Else, as in outsourcing scenarios, new techniques for implementing pervasive functionality need to be designed

  18. Disclosure risk in the Untrusted Model • Pervasive systems can be viewed as consisting of the following steps: • Sensing: • Diverse types of sensors are used to track objects, entities, and the environment • Event detection: • Sensor data is used to detect events of interest to the application • Action execution: • A detected event can lead to action execution • Each of the above steps poses disclosure risks!! • NEXT: event detection in untrusted pervasive environments • But before we do so…

  19. UCI Responsphere Testbed • Campus-scale sensing, communication, storage, and computing infrastructure • 200+ video cameras, motes, Sun SPOTs, RFID, mobile cameras, gas sensors • Mesh routers, WiFi, power-line network, ZigBee • Storage & compute clusters

  20. SATrecorder User Authentication

  21. SATrecorder Outdoor GUI

  22. SATrecorder 4th Floor Sensors

  23. SATrecorder Camera Viewer

  24. SATrecorder Event Detection

  25. SATrecorder with two cameras' output

  26. Scrubbing Sensor Streams in SATware

  27. Privacy Preserving Events in SATware • Sample events • A person leaves the coffee room dirty • A person drinks the last cup of coffee in the pot but forgets to switch the machine off • A person drinks 3 cups of coffee • … • Privacy-preserving event detection • The system can detect the above events but does not learn any intermediate information about them • E.g., if a person drinks 3 cups of coffee the system is able to determine that; however, the system does not know whether the person has had 0, 1, or 2 cups.

  28. Composite events • Composite event templates (a matching sketch follows below) • Detect the event: "A student drinks more than 3 cups of coffee" • e1 ≡ <u ∈ STUDENT, coffee_room, coffee_cup, dispense> • Detect the event: "A student tries to access the IBM server in the server room" • e1 ≡ <u ∈ STUDENT, server_room, *, entry> • e2 ≡ <ū, server_room, *, exit> • e3 ≡ <ū, server_room, IBM-server, login-attempt>
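
The templates above can be read as matches over (user, location, object, action) tuples, where ū binds to the same user as an earlier sub-event. Below is a hedged Python sketch of such a matcher; the encoding of the constraints ("u in STUDENT", "bound_u" for ū) and all event values are hypothetical illustrations, not the system's actual representation.

```python
# Sketch: matching basic events against the slide's template notation.
# An event is (user, location, object, action); a template term is a
# literal, "*" (wildcard), a group constraint, or a back-reference (ū).

STUDENT = {"alice", "bob"}  # hypothetical group roster

def matches(event, template, binding):
    user, loc, obj, action = event
    t_user, t_loc, t_obj, t_action = template
    if t_user == "u in STUDENT":
        ok = user in STUDENT
    elif t_user == "bound_u":              # ū: same user bound earlier
        ok = user == binding.get("u")
    else:
        ok = t_user in ("*", user)
    for got, want in ((loc, t_loc), (obj, t_obj), (action, t_action)):
        ok = ok and want in ("*", got)
    if ok and t_user == "u in STUDENT":
        binding["u"] = user                # bind u for later sub-events
    return ok

# e1 then e3 from the slide: a student enters the server room, then the
# SAME student attempts to log in to the IBM server.
binding = {}
print(matches(("bob", "server_room", "door", "entry"),
              ("u in STUDENT", "server_room", "*", "entry"), binding))
print(matches(("bob", "server_room", "IBM-server", "login-attempt"),
              ("bound_u", "server_room", "IBM-server", "login-attempt"),
              binding))
# True, True -> the composite event is detected for user "bob"
```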

  29. Automata & State Information • Rule automaton template • An instance of a template for a (rule, individual) pair is an automaton object • E.g., rule R applies to {X, Y, Z} → 3 automata A_RX, A_RY, A_RZ that implement R for X, Y, and Z respectively • The number of automata in the state table is proportional to the number of individuals who interact with the space (see the sketch below)
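
A minimal sketch of the state table this slide describes, assuming a simple dict keyed by (rule, individual); class and variable names are illustrative.

```python
# Sketch: one automaton object per (rule, individual) pair, as described
# on the slide. The table grows with #rules x #individuals.

class Automaton:
    def __init__(self, rule_id, individual):
        self.rule_id, self.individual = rule_id, individual
        self.state = 0                     # start state of the rule's FSM

state_table = {}                           # (rule_id, person) -> Automaton

def instantiate(rule_id, individuals):
    for person in individuals:
        state_table[(rule_id, person)] = Automaton(rule_id, person)

instantiate("R", ["X", "Y", "Z"])          # A_RX, A_RY, A_RZ from the slide
print(len(state_table))                    # 3: proportional to #individuals
```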

  30. System architecture & adversary • [Diagram: Secure Sensor Nodes (SSNs) send events through a thin trusted middleware, which obfuscates their origin, to a server holding the rules DB and the (encrypted) state information] • Basic assumptions about SSNs: • Secure data capture (sensors are tamper-proof) • Secure generation of basic events by the SSN • Trusted, with computation power + limited storage; can carry out encryption/decryption with a secret key common to all SSNs

  31. System architecture & adversary (cont.) • Adversary: a server-side snooper who wants to deduce the identity of the individual associated with a basic event • Minimum requirement for security: state information is always encrypted on the server • Recall: the goal is to ensure a level k of anonymity for each individual

  32. Basic protocol • SSN: generate basic event e; query the server for the set of (encrypted) automata that match e • Server: return the automata that (possibly) match e (encrypted match) • SSN: decrypt the automata, advance their states if necessary, associate an encrypted label with each new state, and write back the encrypted automata • Server: store the updated automata • Question: does encryption ensure complete anonymity? NO! The SSNs' pattern of automata access may cause identity disclosure (a toy round-trip sketch follows below)
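
A toy sketch of this round trip. The encrypt/decrypt pair is a deliberately trivial stand-in (string reversal) for the SSNs' shared-key cipher, and the data layout is assumed for illustration; the point is that the server only ever stores and returns opaque blobs under opaque labels.

```python
# Sketch of the basic protocol: the server is an untrusted key-value
# store; only SSNs can decrypt, advance, and re-encrypt automata.

import json

def encrypt(obj):    # stand-in: a real SSN would use a proper shared-key cipher
    return json.dumps(obj)[::-1]

def decrypt(blob):
    return json.loads(blob[::-1])

server = {}                                  # label -> encrypted automaton

def server_query(labels):                    # "return automata matching e"
    return [(l, server[l]) for l in labels if l in server]

def ssn_handle_event(event, labels_for_event, new_label):
    for label, blob in server_query(labels_for_event):
        automaton = decrypt(blob)            # only the SSN holds the key
        automaton["state"] += 1              # advance the FSM for event e
        del server[label]
        server[new_label] = encrypt(automaton)  # write back, fresh label

server["lbl-1"] = encrypt({"rule": "R1", "user": "Tom", "state": 0})
ssn_handle_event("enters_kitchen", ["lbl-1"], "lbl-2")
print(decrypt(server["lbl-2"]))  # {'rule': 'R1', 'user': 'Tom', 'state': 1}
```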

  33. Example • R1: U enters kitchen; U takes coffee • R2: U enters kitchen; U opens fridge • R3: U enters kitchen; U opens microwave • R1, R2, R3 apply to Tom → Tom enters the kitchen → 3 firings • Only R1 and R2 apply to Bill → Bill enters the kitchen → 2 firings • On an event, the number of rows retrieved from the state table can disclose the identity of the individual (see the sketch below)
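
A few lines of Python make the leak concrete: the snooper never decrypts anything, yet the fetch size alone singles out the individual. The rule assignments follow the slide's example.

```python
# Sketch: a server-side snooper deduces identity purely from how many
# encrypted rows an event fetch touches (3 rules fire for Tom, 2 for Bill).

rules = {"Tom": {"R1", "R2", "R3"}, "Bill": {"R1", "R2"}}

def snoop(num_rows_fetched):
    return [p for p, rs in rules.items() if len(rs) == num_rows_fetched]

print(snoop(3))  # ['Tom']  -- unique fetch size defeats the encryption
print(snoop(2))  # ['Bill']
```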

  34. Characteristic access patterns of automata • The characteristic access patterns of rows can potentially reveal the identity of an automaton in spite of encryption • The set of rules applicable to an individual may be unique → potentially identifies the individual • Rules applicable to Tom: • x: Tom enters kitchen; Tom takes coffee • y: Tom enters kitchen; Tom takes coffee; Tom leaves the coffee pot empty (or: Tom opens fridge; Tom leaves the fridge open) • z: Tom enters kitchen; Tom opens fridge • Characteristic pattern of x is P1: {x,y,z} {x,y} • Characteristic patterns of y are P2: {x,y,z} {x,y} {y} and P3: {x,y,z} {y,z} {y} • Characteristic pattern of z is P4: {x,y,z} {y,z} (see the sketch below)
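
The following sketch derives such patterns for a simplified, linear version of the slide's x, y, z rules (one branch per rule, so z's final set comes out as {z} rather than the slide's {y,z}); any automaton whose pattern is unique is identifying despite encryption.

```python
# Sketch: an automaton's characteristic access pattern is the sequence of
# row-sets retrieved as its own events arrive; unique patterns identify it.

rules = {                       # automaton -> its (linearized) event sequence
    "x": ["enters_kitchen", "takes_coffee"],
    "y": ["enters_kitchen", "takes_coffee", "leaves_pot_empty"],
    "z": ["enters_kitchen", "opens_fridge"],
}

def retrieved(event):           # all automata whose rule contains this event
    return frozenset(a for a, evs in rules.items() if event in evs)

patterns = {a: tuple(retrieved(e) for e in evs) for a, evs in rules.items()}

for a, pat in patterns.items():
    unique = list(patterns.values()).count(pat) == 1
    print(a, [sorted(s) for s in pat], "<- identifying" if unique else "")
# x [['x','y','z'], ['x','y']]        <- identifying   (the slide's P1)
# y [['x','y','z'], ['x','y'], ['y']] <- identifying   (the slide's P2)
# z [['x','y','z'], ['z']]            <- identifying
```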

  35. Partitioning events (unrestricted) • Goal: make the set of characteristic patterns associated with each automaton non-identifying (k-anonymous) • Candidate solution: • Partition events into k-diverse groups • Index automata (rows of the table) by the event's group-id instead of the event label • [Diagram: 3-diverse event clusters C1 = {Tom enters kitchen, Bill enters kitchen, Kate leaves microwave open}, C2 = {Tom opens fridge, Kate enters kitchen, Bill takes coffee}, C3 = {Tom leaves microwave open, Kate leaves fridge open, Bill leaves microwave open}; this does not guarantee 3-anonymity] • Theorem: checking whether an event-partitioning scheme for a given set of automata is k-anonymous is NP-complete (the problem of checking the existence of a fixed-point-free automorphism in a graph can be reduced to this problem)

  36. Event clustering (restricted) • Assign all events in an automaton to a single group • If two automata have a common event, assign them to the same group → connected groups of automata • Combine connected groups into k-diverse partitions → guarantees k-anonymity • All automata in a cluster are associated with the same access pattern → k-anonymity (see the sketch below)
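
A sketch of this restricted scheme under assumed data structures: union-find merges automata that share an event into connected groups, and a greedy pass packs groups into partitions covering at least k individuals.

```python
# Sketch: build connected groups of automata, then combine them into
# k-diverse partitions. Data layout is illustrative.

from collections import defaultdict

def connected_groups(automata):
    """automata: dict name -> (individual, set_of_events)."""
    parent = {a: a for a in automata}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    by_event = defaultdict(list)
    for a, (_, events) in automata.items():
        for e in events:
            by_event[e].append(a)
    for members in by_event.values():       # share an event -> same group
        for other in members[1:]:
            parent[find(other)] = find(members[0])
    groups = defaultdict(set)
    for a in automata:
        groups[find(a)].add(a)
    return list(groups.values())

def k_diverse_partitions(groups, automata, k):
    partitions, current, colors = [], set(), set()
    for g in groups:                        # greedy: close a partition once
        current |= g                        # it covers k individuals
        colors |= {automata[a][0] for a in g}
        if len(colors) >= k:
            partitions.append(current)
            current, colors = set(), set()
    if current:                             # fold any remainder into the last
        if partitions:
            partitions[-1] |= current
        else:
            partitions.append(current)      # fewer than k individuals overall
    return partitions

automata = {
    "A_R1_Tom":  ("Tom",  {"enters_kitchen", "takes_coffee"}),
    "A_R2_Bill": ("Bill", {"enters_kitchen", "opens_fridge"}),
    "A_R3_Kate": ("Kate", {"leaves_microwave_open"}),
}
groups = connected_groups(automata)  # Tom & Bill share "enters_kitchen"
print(k_diverse_partitions(groups, automata, k=2))
```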

  37. Final partition-based protocol • SSN: generate basic event e; determine Partition(e) (encrypted query) • Server: return all automata belonging to Partition(e) • SSN: decrypt the automata, advance their states if necessary, and write back all automata in Partition(e) • Server: store the updated automata (sketched below)
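
A compact sketch of the uniform access pattern this buys, with hypothetical group-ids and blobs: both kitchen events map to the same partition C1, so the server sees identical fetches for them.

```python
# Sketch: indexing by group-id makes every event in a partition trigger
# the identical fetch (same rows, same count).

partition_of = {"enters_kitchen": "C1", "opens_fridge": "C1",
                "leaves_microwave_open": "C2"}     # event -> group-id
server = {"C1": ["enc_blob_a", "enc_blob_b"], "C2": ["enc_blob_c"]}

def handle_event(event):
    gid = partition_of[event]       # in the real protocol: encrypted lookup
    blobs = server[gid]             # server returns *all* rows in the group
    # ...SSN decrypts every blob, advances those that match, re-encrypts,
    # and writes the whole partition back, keeping access patterns uniform.
    return gid, len(blobs)

print(handle_event("enters_kitchen"))  # ('C1', 2)
print(handle_event("opens_fridge"))    # ('C1', 2) -- indistinguishable
```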

  38. Minimum-cost clustering • Each connected group of automata is represented by a ball • Each ball has a "weight" (how frequently it is accessed) • Each ball has a "price" (transmission overhead) • Each ball has a "color" (denoting the individual) • Optimization problem: partition the set of balls into as many bins as required so as to minimize ∑_i (∑_{b ∈ bin_i} b.price) × (∑_{b ∈ bin_i} b.weight), subject to each bin containing balls of at least k distinct colors • (The problem is NP-hard: reduction from the sum-of-squares problem) • (A cost-evaluation sketch follows below)
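
The objective is easy to state in code. This sketch evaluates the cost and the k-diversity constraint for a candidate assignment, with balls as (price, weight, color) tuples; the numbers are made up.

```python
# Sketch: cost of a bin = (sum of prices) * (sum of weights); a bin is
# feasible only if it holds balls of at least k distinct colors.

def cost(bins):
    return sum(sum(p for p, _, _ in b) * sum(w for _, w, _ in b)
               for b in bins)

def feasible(bins, k):
    return all(len({c for _, _, c in b}) >= k for b in bins)

bins = [
    [(2, 5, "Tom"), (1, 3, "Bill")],   # (price, weight, color=individual)
    [(4, 1, "Kate"), (2, 2, "Bob")],
]
print(feasible(bins, k=2), cost(bins))  # True 42  -> (3*8) + (6*3)
```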

  39. Solution to the optimization problem • We give a simple heuristic that works well in practice (a hill-climbing sketch follows below): • Start with a random feasible partition meeting the k-anonymity constraint • Iterate: determine the best set of "non-conflicting" ball transfers between bins (i.e., those which reduce the cost by the largest amount) and execute these transfers • Iterate: determine the best set of non-conflicting ball exchanges between bins and execute these exchanges • Stop when no further cost reduction is possible
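
A hedged sketch of such a heuristic, simplified to random single-ball transfers accepted only when they reduce cost and keep the source bin k-diverse; the slide's batched "best non-conflicting transfers" and the exchange step are omitted for brevity.

```python
# Sketch: hill-climbing over ball transfers, preserving k-diversity.

import random

def colors(b):
    return {c for _, _, c in b}

def bin_cost(b):
    return sum(p for p, _, _ in b) * sum(w for _, w, _ in b)

def local_search(bins, k, iters=1000):
    for _ in range(iters):
        src, dst = random.sample(range(len(bins)), 2)
        if not bins[src]:
            continue
        i = random.randrange(len(bins[src]))
        ball = bins[src][i]
        before = bin_cost(bins[src]) + bin_cost(bins[dst])
        trial_src = bins[src][:i] + bins[src][i + 1:]
        trial_dst = bins[dst] + [ball]
        # accept only if the move lowers cost and source stays k-diverse
        if len(colors(trial_src)) >= k and \
           bin_cost(trial_src) + bin_cost(trial_dst) < before:
            bins[src], bins[dst] = trial_src, trial_dst
    return bins

random.seed(0)
bins = [[(2, 5, "Tom"), (1, 3, "Bill"), (4, 1, "Kate")],
        [(2, 2, "Bob"), (1, 1, "Ann")]]
print(local_search(bins, k=2))
```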

  40. Experiments • Prototype built on the SATware-Responsphere framework • Responsphere: communication, storage, and computing framework consisting of approx. 200 sensors • SATware: middleware for deploying pervasive-space applications • Dataset for simulation • Events generated based on real activities in an office building • 4 groups of people: STUDENT, FACULTY, STAFF, VISITOR (300 in all) • 3 regions: KITCHEN, SERVER_ROOM, FACILITIES_ROOM • 15 rules belonging to 2 classes of activities: (i) protection of resources; (ii) suspicious activity

  41. Sample rules

  42. Evaluation using realistic dataset • Simulated a sequence of 1000 events & measured the communication cost between the server and the SSNs • Compared the following 2 partitioning algorithms: • k-individual partitioning: all automata of an individual in a single group • k-connected-group partitioning: removes the above constraint
