Quantification of Integrity


Presentation Transcript


  1. Quantification of Integrity. Michael Clarkson and Fred B. Schneider, Cornell University. MIT Systems Security and Cryptography Discussion Group, October 15, 2010.

  2. Goal. Information-theoretic quantification of programs’ impact on the integrity of information [Denning 1982].

  3. What is Integrity? Common Criteria: protection of assets from unauthorized modification. Biba (1977): guarantee that a subsystem will perform as it was intended; isolation is necessary for protection from subversion; dual to confidentiality. Databases: constraints that relations must satisfy; provenance of data; utility of anonymized data. …no universal definition.

  4. Our Notions of Integrity. Corruption: damage to integrity.

  5. Our Notions of Integrity. Corruption: damage to integrity. Contamination: bad information present in output. Suppression: good information lost from output. …distinct, but they interact.

  6. Contamination. Goal: model taint analysis. [Diagram: an untrusted attacker and a trusted user both send inputs to the program and receive its outputs.]

  7. Contamination. Goal: model taint analysis. Untrusted input contaminates trusted output. [Diagram: as before, with the attacker’s untrusted input flowing into the user’s trusted output.]

  8. Contamination. o := (t, u): u contaminates o.

  9. Contamination. o := (t, u): u contaminates o. (Can’t u be filtered from o?)

  10. Quantification of Contamination. Use information theory: information is surprise. X, Y, Z: distributions. I(X, Y): mutual information between X and Y (in bits). I(X, Y | Z): conditional mutual information.

  11. Quantification of Contamination. [Diagram: the untrusted attacker and the trusted user send inputs to the program and observe its outputs.]

  12. Quantification of Contamination. [Diagram: as before, with the attacker’s untrusted input labeled U_in, the user’s trusted input T_in, and the user’s trusted output T_out.]

  13. Quantification of Contamination. Contamination = I(U_in, T_out | T_in) [Newsome et al. 2009]; dual of [Clark et al. 2005, 2007]. [Diagram: as before.]

  14. Example of Contamination. For o := (t, u): Contamination = I(U, O | T) = k bits, if U is uniform on [0, 2^k − 1].
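
To make the numbers concrete, here is a minimal Python sketch (ours, not from the slides) that computes the contamination I(U, O | T) of o := (t, u) by brute force, assuming independent uniform k-bit inputs with k = 2:

    import itertools
    import math
    from collections import Counter

    k = 2
    vals = range(2 ** k)

    # Joint distribution over (t, u, o) for the program o := (t, u),
    # with T and U independent and uniform on [0, 2^k - 1].
    joint = Counter()
    for t, u in itertools.product(vals, repeat=2):
        joint[(t, u, (t, u))] += 1 / len(vals) ** 2

    def H(dist):
        # Shannon entropy, in bits, of a {outcome: probability} map.
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def marginal(idx):
        m = Counter()
        for outcome, p in joint.items():
            m[tuple(outcome[i] for i in idx)] += p
        return m

    # Standard identity: I(U, O | T) = H(U,T) + H(O,T) - H(T) - H(U,O,T).
    contamination = (H(marginal((1, 0))) + H(marginal((2, 0)))
                     - H(marginal((0,))) - H(joint))
    print(contamination)  # 2.0 bits = k, as claimed on the slide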

  15. Our Notions of Integrity. Corruption: damage to integrity. Contamination: bad information present in output. Suppression: good information lost from output.

  16. Program Suppression. Goal: model program (in)correctness. [Diagram: the trusted sender’s input flows through the specification to the receiver as the correct output.] (Specification must be deterministic.)

  17. Program Suppression. Goal: model program (in)correctness. [Diagram: the specification produces the correct output from the trusted sender’s input; the implementation, which also receives the attacker’s untrusted input, produces the real output.]

  18. Program Suppression. Goal: model program (in)correctness. [Diagram: as before.] The implementation might suppress information about the correct output from the real output.

  19. Example of Program Suppression. Spec.: for (i=0; i<m; i++) { s := s + a[i]; }, where a[0..m-1] is trusted.

  20. Example of Program Suppression. Spec.: for (i=0; i<m; i++) { s := s + a[i]; }, where a[0..m-1] is trusted. Impl. 1: for (i=1; i<m; i++) { s := s + a[i]; } (suppression: a[0] missing; no contamination).

  21. Example of Program Suppression. Spec.: for (i=0; i<m; i++) { s := s + a[i]; }, where a[0..m-1] is trusted. Impl. 1: for (i=1; i<m; i++) { s := s + a[i]; } (suppression: a[0] missing; no contamination). Impl. 2: for (i=0; i<=m; i++) { s := s + a[i]; }, where a[m] is untrusted (suppression: a[m] added; contamination). (A runnable rendering follows.)
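
For readers who want to execute the example, a small Python rendering of the spec and the two buggy implementations (our sketch; the slides use pseudocode, and the parameter `untrusted` stands in for the attacker-controlled a[m]):

    def spec(a):
        # s := a[0] + ... + a[m-1], all elements trusted.
        return sum(a)

    def impl1(a):
        # Skips a[0]: information about the correct sum is lost
        # (suppression), but no untrusted data enters (no contamination).
        return sum(a[1:])

    def impl2(a, untrusted):
        # Reads one element past the trusted array, adding the
        # attacker's value: both suppression and contamination.
        return sum(a) + untrusted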

  22. Suppression vs. Contamination. [Diagram: for the echo program output := input, attacker-controlled values appearing in the output are contamination; trusted input values missing from the output are suppression.]

  23. Suppression vs. Contamination (a runnable check follows the table):
  • o := t: baseline echo, no contamination and no suppression.
  • o := (t, u): o contaminated by u; no suppression.
  • n := rnd(); o := t xor n: t suppressed by noise; no contamination.
  • o := t xor u: t suppressed by u; o contaminated by u.
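
As a sanity check, the following Python sketch (ours; it assumes t, u, and the noise bit n are independent uniform single bits) computes suppression H(T | O) and contamination I(U, O | T) for each program in the table above:

    import itertools
    import math
    from collections import Counter

    def H(d):
        # Shannon entropy, in bits, of a {outcome: probability} map.
        return -sum(p * math.log2(p) for p in d.values() if p > 0)

    def measures(prog):
        # Joint distribution over (t, u, o), averaging out the noise bit n.
        joint = Counter()
        for t, u, n in itertools.product((0, 1), repeat=3):
            joint[(t, u, prog(t, u, n))] += 1 / 8

        def marg(idx):
            m = Counter()
            for outcome, p in joint.items():
                m[tuple(outcome[i] for i in idx)] += p
            return m

        suppression = H(marg((0, 2))) - H(marg((2,)))        # H(T | O)
        contamination = (H(marg((1, 0))) + H(marg((2, 0)))   # I(U, O | T)
                         - H(marg((0,))) - H(joint))
        return suppression, contamination

    programs = {
        'o := t':       lambda t, u, n: t,
        'o := (t, u)':  lambda t, u, n: (t, u),
        'o := t xor n': lambda t, u, n: t ^ n,
        'o := t xor u': lambda t, u, n: t ^ u,
    }
    for name, prog in programs.items():
        print(name, measures(prog))
    # Prints (suppression, contamination) of (0, 0), (0, 1), (1, 0),
    # and (1, 1) bits respectively, matching the table row by row.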

  24. Quantification of Program Suppression. [Diagram: the specification maps the trusted sender’s input to the receiver’s correct output; the implementation additionally receives the attacker’s untrusted input.]

  25. Quantification of Program Suppression. [Diagram: as before, with the specification’s input labeled In and its output labeled Spec.]

  26. Quantification of Program Suppression. [Diagram: as before; the implementation takes trusted input T_in and untrusted input U_in and produces output Impl.]

  27. Quantification of Program Suppression. [Diagram: as before.] Program transmission = I(Spec, Impl).

  28. Quantification of Program Suppression. H(X): entropy (uncertainty) of X. H(X | Y): conditional entropy of X given Y. Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl).

  29. Quantification of Program Suppression. H(X): entropy (uncertainty) of X. H(X | Y): conditional entropy of X given Y. Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl), where H(Spec) is the total info to learn about Spec.

  30. Quantification of Program Suppression. H(X): entropy (uncertainty) of X. H(X | Y): conditional entropy of X given Y. Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl), where H(Spec) is the total info to learn about Spec and the difference is the info actually learned about Spec by observing Impl.

  31. Quantification of Program Suppression. H(X): entropy (uncertainty) of X. H(X | Y): conditional entropy of X given Y. Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl): H(Spec) is the total info to learn about Spec; the difference is the info actually learned about Spec by observing Impl; H(Spec | Impl) is the info NOT learned about Spec by observing Impl.

  32. Quantification of Program Suppression. H(X): entropy (uncertainty) of X. H(X | Y): conditional entropy of X given Y. Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl). Program suppression = H(Spec | Impl).

  33. Example of Program Suppression. Spec.: for (i=0; i<m; i++) { s := s + a[i]; }. Impl. 1: for (i=1; i<m; i++) { s := s + a[i]; } (Suppression = H(A)). Impl. 2: for (i=0; i<=m; i++) { s := s + a[i]; } (Suppression ≤ H(A)). A = distribution of individual array elements.
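
The Impl. 1 claim is easy to verify empirically. A Python sketch under assumed parameters (m = 3 elements, each independently uniform on {0, 1, 2, 3}, so H(A) = 2 bits):

    import itertools
    import math
    from collections import Counter

    m, vals = 3, range(4)

    # Joint distribution over (Spec output, Impl. 1 output).
    joint = Counter()
    for a in itertools.product(vals, repeat=m):
        joint[(sum(a), sum(a[1:]))] += 1 / len(vals) ** m

    def H(d):
        return -sum(p * math.log2(p) for p in d.values() if p > 0)

    impl_marginal = Counter()
    for (s, i), p in joint.items():
        impl_marginal[i] += p

    # H(Spec | Impl1) = H(Spec, Impl1) - H(Impl1)
    print(H(joint) - H(impl_marginal))   # 2.0 bits = H(A)

Intuitively, Spec = Impl1 + a[0] and a[0] is independent of Impl1, so exactly H(A) bits about the correct output are never learned from the real output.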

  34. Echo Specification. [Diagram: the trusted sender’s input T_in passes unchanged (output := input) to the receiver.]

  35. Echo Specification. [Diagram: the specification echoes T_in; the implementation receives trusted input T_in and untrusted input U_in and produces output T_out.]

  36. Echo Specification. [Diagram: as before.] This simplifies to the information-theoretic model of channels, with an attacker.

  37. Channel Suppression. [Diagram: a channel carries the trusted sender’s input T_in to the receiver’s output T_out, subject to the attacker’s untrusted input U_in.] Channel transmission = I(T_in, T_out). Channel suppression = H(T_in | T_out). (T_out depends on U_in.)
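
(A standard illustration, not on the slide: a binary symmetric channel that flips a uniform one-bit T_in with probability p has channel transmission I(T_in, T_out) = 1 − H(p) and channel suppression H(T_in | T_out) = H(p), where H(p) is the binary entropy function.)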

  38. Belief-based Metrics. What if the user’s/receiver’s distribution on unobservable inputs is wrong? Belief-based information flow [Clarkson et al. 2005]. Belief-based generalizes information-theoretic: • on single executions, the same • in expectation, the same …if the user’s/receiver’s distribution is correct.

  39. Suppression and Confidentiality. Declassifier: a program that reveals (leaks) some information and suppresses the rest. Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]. Thm. Leakage + Suppression is a constant: what isn’t leaked is suppressed.
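
(A worked instance, ours: for a uniform k-bit secret S declassified as o := parity(S), leakage I(S, O) = 1 bit and suppression H(S | O) = k − 1 bits; the chain rule I(S, O) + H(S | O) = H(S) makes their sum the constant k.)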

  40. Database Privacy. A statistical database anonymizes query results: …sacrifices utility for privacy’s sake. [Diagram: the user’s query goes to the database; the database’s response passes through an anonymizer; the user receives the anonymized response.]

  41. Database Privacy. A statistical database anonymizes query results: …sacrifices utility for privacy’s sake …suppresses to avoid leakage. [Diagram: as before, annotated with anon. resp. := resp.]

  42. Database Privacy. A statistical database anonymizes query results: …sacrifices utility for privacy’s sake …suppresses to avoid leakage …sacrifices integrity for confidentiality’s sake. [Diagram: as before.]

  43. K-anonymity. DB. Every individual must be anonymous within a set of size k [Sweeney 2002]. Iflow. Every output corresponds to k inputs. …no bound on leakage or suppression.

  44. L-diversity. DB. Every individual’s sensitive information should appear to have L (roughly) equally likely values [Machanavajjhala et al. 2007]. DB. Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]. Iflow. H(T_in | t_out) ≥ log L …implies suppression ≥ log L.
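
(For example, with L = 4 each anonymized block must satisfy H(anon. block) ≥ log 4 = 2 bits, so at least 2 bits of each individual’s sensitive value are suppressed.)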

  45. Differential Privacy. DB. [Dwork et al. 2006, Dwork 2006] • No individual loses privacy by including data in the database • The anonymized data set reveals no information about an individual beyond what other individuals already reveal. Iflow. The output reveals almost no information about an individual input beyond what other inputs already reveal. …implies almost all information about the individual is suppressed …quite similar to noninterference.

  46. Summary. Measures of information corruption: • Contamination (generalizes taint analysis; dual to leakage) • Suppression (generalizes program correctness; no dual). Application: database privacy (model anonymizers; relate utility and privacy; security conditions).

  47. More Integrity Measures. • Attacker- and program-controlled suppression. Granularity: • average over all executions • single executions • sequences of executions …interaction of attacker with program. Applications: error-correcting codes, malware.

  48. Quantification of Integrity. Michael Clarkson and Fred B. Schneider, Cornell University. MIT Systems Security and Cryptography Discussion Group, October 15, 2010.
