Computer Security 463.10 Formal Methods


  1. Computer Security 463.10 Formal Methods Fall 2005

  2. Required • Reading • J. M. Wing. A symbiotic relationship between formal methods and security. Proceedings NSF Workshop on Computer Security, Fault Tolerance, and Software Assurance: From Needs to Solution. December 1998. • L. C. Paulson. Proving properties of security protocols by induction. Proceedings of the 10th Computer Security Foundations Workshop, 1997, pages 70-83.

  3. Introduction • The National Security Agency was the major source of funding for formal methods research and development in the 70s and early 80s • Formal security models • Tools for reasoning about security • Applications of these tools to prove systems secure • The rise of the Internet has brought security to the attention of the masses • What kinds of problems can formal methods help solve in security? • What problems will formal methods never help solve?

  4. The Limits of Formal Methods • Systems will never be made 100% secure • Formal methods will not break this axiom • Assumptions about the system’s environment • They are hard to state explicitly • The system may be deployed in an environment it was not originally designed for • For convenience, or for lack of an alternative • Clever intruders find out how to violate these assumptions • Security is not an either/or property • Pay more, gain more • E.g., passwords, certificates, and biometrics provide different degrees of security for authentication

  5. What Formal Methods Can Do • Delimit the system’s boundary: the system and its environment • Characterize a system’s behavior more precisely • Define the system’s desired properties precisely • Prove a system meets its specification • Tell under what circumstances a system cannot meet its specification

  6. How They Can Help • These capabilities of formal methods help practitioners in two ways • Through specification, by focusing the designer’s attention • What is the interface? • What are the assumptions about the system? • What is the system supposed to do under this condition and that condition? • What are the system’s invariant properties? • Through verification • Prove a system meets its security goals • Find the weaknesses of the system

  7. History-Past • Early formal methods research, funded by the National Security Agency, centered on proving systems secure • Bell-LaPadula model • Biba integrity model • Clark-Wilson model • The systems of interest to prove secure were operating systems, more specifically, kernels

  8. Process of Proving • The process of proving entails three parts • Formal specification • State the property of the system • For example, the *-property • Model the system so that one can formally prove the property • The model might be a semantic structure like a state machine, or a syntactic structure like a logical expression • Proof • Methods • Rely on induction over traces of the state-machine model • Or rely on deduction to show that an implication holds • Proofs may be fully automatic or require the interactive guidance of a human

  9. The Orange Book • US Trusted Computer System Evaluation Criteria – The Orange Book • Produced by the NCSC (National Computer Security Center) in 1985 • Provides a standard metric for the NCSC to compare the security of different computer systems • Guides computer system vendors in the design and development of secure systems • Provides a means for specifying security requirements in Government contracts • Levels: D, C1, C2, B1, B2, B3, A1 • Certified A1 means that one formally specifies the system’s security requirements, formally models the system, and formally proves that the model meets its specification

  10. Major Results of Early 80s • The major results of the early 80s were general-purpose theorem-proving tools • Affirm (USC) • Support for reasoning about equational logic • Formal Development Methodology (FDM) System (System Development Corporation) • Support for a non-deterministic state-machine model • Gypsy (U. Texas Austin) • Support for program verification of a subset of Pascal, including a verification condition generator • (Enhanced) Hierarchical Development Methodology (HDM) (SRI) • SPECIAL specification language • Collection of decision procedures for propositional logic

  11. Tools Specific for Security • Tools built specifically for reasoning about security • The user specifies an insecure state, and the tool searches backwards to determine whether that state is reachable • Interrogator • Based on Prolog; exhaustive search; fully automatic • NRL Protocol Analyzer • Based on Dolev and Yao’s algebraic term-rewriting model for two-party cryptographic protocols • Less automatic • BAN logic – a logic of authentication • Reasons in terms of belief logic • Accumulates beliefs during a run of the protocol • E.g., the freshness of a message

  12. Two Different Styles of Formal Methods • Two different styles of formal methods • The UK and European style • Focus was on specification, on the system’s high level design and on the paper-and-pencil analysis • The US and Canadian style • Focus was on verification, from the system’s high level design through its code-level implementation down to its bit-level representation in hardware, and on machine-assisted analysis

  13. History-Present • Since the early 90s there has been an explosion of new developments • Three threads in the development of formal methods • Model checking • Hardware verification • Theorem proving • Software specification

  14. Model Checking • In 1996, the model checker FDR was used to exhibit a flaw in the Needham-Schroeder protocol • Since then, many other model-checking and proving tools have shown the same thing • Other protocols have been or are being checked • SSL, by Stanford • IKE and SET, by Meadows • The NetBill electronic payment protocol

  15. Theorem Proving • Theorem-proving approaches • The general-purpose theorem prover Isabelle was used to reason about five classic authentication protocols and their variants • Coq was used to analyze the Needham-Schroeder protocol and SET • PVS was used to verify authentication protocols

  16. Hybrid Approaches • Hybrid approaches embody both model-checking and theorem-proving functionality • The improved NRL Protocol Analyzer analyzed the IKE and SET protocols • It embodies both model checking (e.g., brute-force search) and theorem proving (e.g., lemma generation)

  17. Practice of Building Secure Systems • Building secure systems requires • A solid crypto base • Encryption, hashing, signatures • Protocols • Authentication and key-exchange protocols • Standard reliable network protocols • Systems and general-purpose programming languages • Kerberos authentication • Applications • Online banking • Online shopping • Bill payment [Figure: system layers, bottom to top – Crypto, Protocols, Systems/PL, Applications]

  18. Security Guarantees • Security guarantees weaken as we move up the layers • The crypto base tells us precisely • What is guaranteed • What is not guaranteed • What is still an open problem • Protocols • Formal methods provide some guarantees • Systems/PL • Commercial technologies like Authenticode, ActiveX, and Java provide varying degrees of security • Applications • Not very much at all [Figure: security guarantees per layer, bottom to top – Crypto, Protocols, Systems/PL, Applications]

  19. A Case Study: Proving Properties of Security Protocols by Induction • Cryptographic protocols let agents communicate securely over an insecure network • Secrecy: a spy cannot read the contents of messages intended for others • Authenticity: agents can identify themselves to others, but nobody can masquerade as somebody else

  20. Two Popular Approaches • Formal methods to analyze security protocols • State exploration • Model checking • Model the protocol as a finite-state system • Verify by exhaustively searching all reachable states • Pros: finds attacks quickly • Cons: requires drastic simplifying assumptions to keep the state space small • Belief logics • BAN logic • Formally express what an agent may infer from the messages received • Pros: short proofs expressed at a high level of abstraction • Cons: do not address secrecy

  21. An Inductive Approach • Borrows elements from both approaches • From state exploration: a concrete notion of events • From belief logics: deriving guarantees from each protocol message • The inductive approach • Traces of events: A sends X to B • A protocol is formalized as the set of all possible traces of events • An agent may extend a trace in any way permitted by the protocol, given what it can see in the current trace • Any number of interleaved runs • Models accidents and a general attacker • Mechanized proofs • Properties are proved by induction on traces • Using the theorem prover Isabelle

  22. Why an Inductive Approach • The necessary formal tool is the inductive definition • An inductive definition lists the possible actions an agent or system can perform • The corresponding induction rule helps to reason about the consequences of an arbitrary finite sequence of such actions • Highly general: inductive definitions have long been used to specify the semantics of programming languages • Each security property is proved by induction over the protocol

  23. Overview • The method includes • A theory of message analysis • A theory describing standard features of protocols • Individual protocol descriptions rest on these two common bases • The formalization assumes that an encrypted message can neither be altered nor read without the appropriate key

  24. Messages • Message items include • Agent names: A, B, … • Nonces: Na, Nb, … • Keys: Ka, Kb, Kab, … • Compound messages: {| X, X′ |} • Encrypted messages: Crypt K X • The structure of each message is explicit • Different types of components cannot be confused • Note: (K⁻¹)⁻¹ = K

  25. Message Analysis • H: an agent’s initial knowledge plus the history of all messages sent in a trace • parts H: obtained from H by repeatedly adding components of compound messages and the bodies of encrypted messages • Crypt K X ∈ parts H ⇒ X ∈ parts H • analz H: obtained from H by repeatedly adding components of compound messages and by decrypting messages whose keys are in analz H • Represents the maximum knowledge a spy can learn without breaking ciphers • Crypt K X ∈ analz H, K⁻¹ ∈ analz H ⇒ X ∈ analz H • synth H: obtained by repeatedly adding agent names, forming compound messages, and encrypting with keys obtained from H • Models the messages a spy could build up from elements of H • X ∈ synth H, K ∈ H ⇒ Crypt K X ∈ synth H
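The closure operators parts and analz can be sketched in a few lines of Python. This is a hypothetical model for illustration only (the paper's actual definitions are inductive sets in Isabelle/HOL), and it assumes symmetric shared keys, so K⁻¹ = K.

```python
# Messages are nested tuples:
#   ('Agent', a), ('Nonce', n), ('Key', k),
#   ('Pair', x, y)    -- the compound message {| x, y |}
#   ('Crypt', k, x)   -- x encrypted under key k

def inv(k):
    # Symmetric keys assumed: K^-1 = K, so (K^-1)^-1 = K holds trivially.
    return k

def parts(H):
    """Close H under components of pairs and bodies of encryptions
    (no decryption keys needed)."""
    out, todo = set(H), list(H)
    while todo:
        m = todo.pop()
        kids = ([m[1], m[2]] if m[0] == 'Pair'
                else [m[2]] if m[0] == 'Crypt' else [])
        for x in kids:
            if x not in out:
                out.add(x)
                todo.append(x)
    return out

def analz(H):
    """Like parts, but an encrypted body is exposed only when the inverse
    key is already derivable: what a spy learns without breaking ciphers."""
    out = set(H)
    changed = True
    while changed:
        changed = False
        for m in list(out):
            kids = []
            if m[0] == 'Pair':
                kids = [m[1], m[2]]
            elif m[0] == 'Crypt' and ('Key', inv(m[1])) in out:
                kids = [m[2]]
            for x in kids:
                if x not in out:
                    out.add(x)
                    changed = True
    return out
```

For example, with H = {Crypt Kb {| Na, Key Kab |}}, the nonce Na is in parts H but not in analz H; adding Key Kb to H puts it in analz H as well.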

  26. Model the Attacker • Observes all traffic, that is, H • Sends fraudulent messages from synth (analz H) • May hold a set of “lost” long-term keys of compromised agents • May obtain session keys that are accidentally lost • May itself be an agent, so it can send normal protocol messages as well as fraudulent ones

  27. Modeling a Protocol • Each event in a trace has the form Says A B X • Each agent’s state is represented by • Its initial knowledge • Its key shared with the server • The list of events it can see • Honest agents only read messages addressed to themselves

  28. Example Protocol • A variant of the Otway-Rees protocol • A → B: Na, A, B, {| Na, A, B |}Ka • B → S: Na, A, B, {| Na, A, B |}Ka, Nb, {| Na, A, B |}Kb • S → B: Na, {| Na, Kab |}Ka, {| Nb, Kab |}Kb • B → A: Na, {| Na, Kab |}Ka

  29. Model its Steps • Protocol steps are modeled as possible extensions of a trace with new events • If evs is a trace and Na is unused, we may add • Says A B {| Na, A, B, {| Na, A, B |}Ka |} • If evs contains Says A′ B {| Na, A, B, X |} and Nb is unused, we may add • Says B S {| Na, A, B, X, Nb, {| Na, A, B |}Kb |} • A′ appears instead of A because B does not know the true sender • X appears because B cannot read {| Na, A, B |}Ka

  30. • If evs contains Says B′ S {| Na, A, B, {| Na, A, B |}Ka, Nb, {| Na, A, B |}Kb |} and Kab is unused, we may add • Says S B {| Na, {| Na, Kab |}Ka, {| Nb, Kab |}Kb |} • If evs contains Says B S {| Na, A, B, X, Nb, {| Na, A, B |}Kb |} and Says S′ B {| Na, X, {| Nb, Kab |}Kb |}, we may add • Says B A {| Na, X |} • An agent may take part in several protocol runs concurrently; the trace represents its state in all of them • Agents may respond to past events, no matter how old they are; they may respond any number of times, or never
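The trace-extension reading of step 1 can be illustrated with a small Python sketch. All names here are invented for illustration; the paper expresses the same idea as inductive rules in Isabelle, and the freshness test below is a crude stand-in for the paper's "used evs".

```python
# Events are tuples ('Says', sender, receiver, msg); a trace is a list of
# events, and each protocol rule may extend a trace with one new event.

def used_nonces(trace):
    # Collect every nonce occurring anywhere in a message already sent.
    seen = set()
    def walk(m):
        if m[0] == 'Nonce':
            seen.add(m[1])
        elif m[0] == 'Pair':
            walk(m[1]); walk(m[2])
        elif m[0] == 'Crypt':
            walk(m[2])
    for (_, _, _, msg) in trace:
        walk(msg)
    return seen

def pair(*items):
    # Right-nested encoding of the compound message {| x1, ..., xn |}.
    out = items[-1]
    for x in reversed(items[:-1]):
        out = ('Pair', x, out)
    return out

def or1(trace, a, b, na, shrk_a):
    # Step 1: A -> B : Na, A, B, {| Na, A, B |}Ka, allowed only if Na is fresh.
    if na in used_nonces(trace):
        raise ValueError('nonce already used')
    body = pair(('Nonce', na), ('Agent', a), ('Agent', b))
    msg = pair(('Nonce', na), ('Agent', a), ('Agent', b),
               ('Crypt', shrk_a, body))
    return trace + [('Says', a, b, msg)]
```

Step 2 would be analogous: it fires on any trace containing a message addressed to B of the right shape, forwarding the opaque component X unchanged, since B cannot read it.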

  31. Three Additional Rules for Protocols • Protocol descriptions usually require three additional rules: • Nil: the empty list [ ] is a trace • Fake: if evs is a trace, X ∈ synth (analz H) is a fraudulent message, and B ≠ Spy, then add • Says Spy B X • Models attacks • Oops: if evs is a trace and S distributed key K to A, with nonces Na and Nb, then add • Says A Spy {| Na, Nb, K |} • Models the accidental loss of session keys • The nonces identify the protocol run

  32. Attack on the Example Protocol • An attack on the protocol (✗ marks a message intercepted by C; C_A is C masquerading as A): • 1. A → B ✗: Na, A, B, {| Na, A, B |}Ka • 1′. C → A: Nc, C, A, {| Nc, C, A |}Kc • 2′. A → S ✗: Nc, C, A, {| Nc, C, A |}Kc, Na′, {| Nc, C, A |}Ka • 2″. C_A → S: Nc, C, A, {| Nc, C, A |}Kc, Na, {| Nc, C, A |}Ka • 3′. S → A ✗: Nc, {| Nc, Kca |}Kc, {| Na, Kca |}Ka • 4. C_B → A: Na, {| Na, Kca |}Ka • Replacing Na′ by A’s original nonce Na eventually causes A to accept the key Kca as a key shared with B

  33. A Mechanized Theory of Messages • Mechanized using Isabelle/HOL • An instantiation of the generic theorem prover Isabelle to higher-order logic • Why Isabelle? • Its support for inductively defined sets • Its automatic proof tools

  34. Brief Introduction to Isabelle • Working with Isabelle means creating theories • A theory is a named collection of types, functions, and theorems:

theory T = B1 + … + Bn:
  declarations, definitions, proofs
end

  35. exampleTheory

theory exampleTheory = Main:

datatype 'a list = Nil | Cons 'a "'a list"

consts append :: "'a list => 'a list => 'a list"
primrec
  "append Nil ys = ys"
  "append (Cons x xs) ys = Cons x (append xs ys)"

lemma [simp]: "append xs Nil = xs"
apply (induct xs)
apply auto
done

end

• datatype: a datatype declaration • consts: a function declaration • primrec: recursive definition of a function • lemma: states a theorem followed by its proof script • simp: an attribute that adds the lemma to the simplifier

  36. Main Datatype Definitions

datatype agent = Server | Friend nat | Spy

datatype msg = Agent agent
             | Nonce nat
             | Key key
             | MPair msg msg
             | Crypt key msg

  37. Inductive Definition of parts H • Introduction rules for parts H: • X ∈ H ⇒ X ∈ parts H • Crypt K X ∈ parts H ⇒ X ∈ parts H • {| X, Y |} ∈ parts H ⇒ X ∈ parts H • {| X, Y |} ∈ parts H ⇒ Y ∈ parts H

  38. Inductive Definition of analz H • Introduction rules for analz H: • X ∈ H ⇒ X ∈ analz H • Crypt K X ∈ analz H, K⁻¹ ∈ analz H ⇒ X ∈ analz H • {| X, Y |} ∈ analz H ⇒ X ∈ analz H • {| X, Y |} ∈ analz H ⇒ Y ∈ analz H

  39. Inductive Definition of synth H • Introduction rules for synth H: • X ∈ H ⇒ X ∈ synth H • Agent A ∈ synth H • X ∈ synth H, Y ∈ synth H ⇒ {| X, Y |} ∈ synth H • X ∈ synth H, Key K ∈ H ⇒ Crypt K X ∈ synth H

  40. Isabelle Syntax for analz H

consts analz :: "msg set => msg set"
inductive "analz H"
  intros
    Inj:     "X ∈ H ==> X ∈ analz H"
    Fst:     "{|X,Y|} ∈ analz H ==> X ∈ analz H"
    Snd:     "{|X,Y|} ∈ analz H ==> Y ∈ analz H"
    Decrypt: "[| Crypt K X ∈ analz H; Key (invKey K) ∈ analz H |] ==> X ∈ analz H"

• An inductive definition of analz H • Inj, Fst, Snd, and Decrypt are introduction rules

  41. Simplification Rules • They are monotonic: if G ⊆ H, then • parts G ⊆ parts H • analz G ⊆ analz H • synth G ⊆ synth H • They are idempotent • parts (parts H) = parts H • analz (analz H) = analz H • synth (synth H) = synth H • Other equations • parts (analz H) = parts H • analz (parts H) = parts H • parts (synth H) = parts H ∪ synth H • analz (synth H) = analz H ∪ synth H • synth (parts H): irreducible • synth (analz H): irreducible
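These laws can be spot-checked on a concrete H with a small Python stand-in for parts (illustrative only, and no substitute for the once-and-for-all Isabelle proofs):

```python
# A minimal stand-in for parts, used to spot-check the monotonicity and
# idempotence laws on one example (a sanity check, not a proof).

def parts(H):
    out, todo = set(H), list(H)
    while todo:
        m = todo.pop()
        kids = ([m[1], m[2]] if m[0] == 'Pair'
                else [m[2]] if m[0] == 'Crypt'
                else [])
        for x in kids:
            if x not in out:
                out.add(x)
                todo.append(x)
    return out

na, kab = ('Nonce', 1), ('Key', 'Kab')
H = {('Crypt', 'Kb', ('Pair', na, kab)), ('Agent', 'A')}
G = {('Agent', 'A')}                  # G is a subset of H

assert parts(parts(H)) == parts(H)    # idempotence
assert parts(G) <= parts(H)           # monotonicity, since G ⊆ H
```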

  42. Symbolic Evaluation: parts H • Let ins X H = {X} ∪ H • parts (ins (Agent A) H) = ins (Agent A) (parts H) • parts (ins (Nonce N) H) = ins (Nonce N) (parts H) • parts (ins (Key K) H) = ins (Key K) (parts H) • parts (ins {|X,Y|} H) = ins {|X,Y|} (parts (ins X (ins Y H))) • parts (ins (Crypt K X) H) = ins (Crypt K X) (parts (ins X H))

  43. Symbolic Evaluation: analz H • analz (ins (Agent A) H) = ins (Agent A) (analz H) • analz (ins (Nonce N) H) = ins (Nonce N) (analz H) • analz (ins {|X,Y|} H) = ins {|X,Y|} (analz (ins X (ins Y H)))

  44. • keysFor H = {K⁻¹ | ∃X. Crypt K X ∈ H} • If K ∉ keysFor (analz H), then analz (ins (Key K) H) = ins (Key K) (analz H) • analz (ins (Crypt K X) H) = • ins (Crypt K X) (analz (ins X H)) if K⁻¹ ∈ analz H • ins (Crypt K X) (analz H) otherwise • Nested encryptions give rise to nested if-then-else expressions

  45. Symbolic Evaluation: synth H • Symbolic evaluation of synth is impossible, since its result is infinite • Nor is it necessary • Instead, we simplify assumptions of the form X ∈ synth H, which arise when considering whether a certain message might be fraudulent • Nonce N ∈ synth H ⇒ Nonce N ∈ H • Key K ∈ synth H ⇒ Key K ∈ H (keys and nonces are unguessable) • Crypt K X ∈ synth H ⇒ • Crypt K X ∈ H, or • X ∈ synth H and K ∈ H • If we know K ∉ H, then Crypt K X is a replay rather than a spoof
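The membership rules above suggest a recursive decision procedure for X ∈ synth H. The Python sketch below is a hypothetical illustration of that idea (symmetric keys assumed; not the paper's mechanization):

```python
def in_synth(X, H):
    """Decide X ∈ synth H by the simplification rules: agent names are
    freely synthesizable, nonces and keys are unguessable, and an
    encryption is synthesizable only from a known body and a known key
    (or was already present in H, i.e. a replay)."""
    if X in H:
        return True
    tag = X[0]
    if tag == 'Agent':
        return True                    # agent names can always be formed
    if tag in ('Nonce', 'Key'):
        return False                   # unguessable, and X is not in H
    if tag == 'Pair':
        return in_synth(X[1], H) and in_synth(X[2], H)
    if tag == 'Crypt':
        # Crypt K X: the key K must already be known, the body synthesizable.
        return ('Key', X[1]) in H and in_synth(X[2], H)
    return False
```

So with K ∉ H, `in_synth(('Crypt', K, X), H)` can only succeed if the whole ciphertext is already in H, matching the replay-versus-spoof observation above.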

  46. Statistics for Simplification Rules • Over 90 theorems are proved about parts, analz, synth, and keysFor • Most are proved by induction with respect to the relevant inductive definition, needing only three commands each • Most of them can later be used by other proofs for automatic simplification

  47. Events and Agent Knowledge • datatype event = Says agent agent msg • The function sees models the set of messages an agent receives from a trace: • sees lost A ((Says A′ B X) # evs) = • ins X (sees lost A evs) if A ∈ {B, Spy} • sees lost A evs otherwise • lost: the set of agents whose keys have been lost to the spy • A: the agent observing the trace

  48. Initial Knowledge • From the empty trace, an agent sees only its initial state: sees lost A [ ] = initState lost A • Each agent shares a long-term key with the Server, e.g. shrK A • initState lost Server = the set of all long-term keys • initState lost (Friend i) = {Key (shrK (Friend i))} • initState lost Spy = {Key (shrK A) | A ∈ lost}
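The sees and initState definitions can be sketched together in Python (hypothetical names and a made-up finite agent population, purely for illustration):

```python
ALL_AGENTS = ('A', 'B', 'C')   # assumed finite agent population

def init_state(lost, agent):
    # Long-term shared keys are modeled as strings 'shrA', 'shrB', ...;
    # the Server knows them all, and the Spy starts with the keys of the
    # compromised ("lost") agents.
    if agent == 'Server':
        return {('Key', 'shr' + a) for a in ALL_AGENTS}
    if agent == 'Spy':
        return {('Key', 'shr' + a) for a in lost}
    return {('Key', 'shr' + agent)}

def sees(lost, agent, trace):
    # sees lost A [] = initState lost A; each event contributes its
    # message to the receiver and to the Spy, who observes all traffic.
    out = set(init_state(lost, agent))
    for (_, sender, receiver, msg) in trace:
        if agent == 'Spy' or agent == receiver:
            out.add(msg)
    return out
```

Note that, as in the slide's definition, the sender does not automatically see its own message; only the receiver and the Spy do.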

  49. The Formal Protocol Specification • A variant of the Otway-Rees protocol • A → B: Na, A, B, {| Na, A, B |}Ka • B → S: Na, A, B, {| Na, A, B |}Ka, Nb, {| Na, A, B |}Kb • S → B: Na, {| Na, Kab |}Ka, {| Nb, Kab |}Kb • B → A: Na, {| Na, Kab |}Ka

  50. Rules for Modeling the Protocol

Nil:  [ ] ∈ otway

Fake: [| evs ∈ otway; B ≠ Spy; X ∈ synth (analz (sees lost Spy evs)) |]
      ⇒ Says Spy B X # evs ∈ otway

OR1:  [| evs ∈ otway; A ≠ B; B ≠ Server; Nonce NA ∉ used evs |]
      ⇒ Says A B {| Nonce NA, Agent A, Agent B,
                    Crypt (ShrK A) {| Nonce NA, Agent A, Agent B |} |} # evs ∈ otway

OR2:  [| evs ∈ otway; B ≠ Server; Nonce NB ∉ used evs;
         Says A′ B {| Nonce NA, Agent A, Agent B, X |} ∈ set_of_list evs |]
      ⇒ Says B Server {| Nonce NA, Agent A, Agent B, X, Nonce NB,
                         Crypt (ShrK B) {| Nonce NA, Agent A, Agent B |} |} # evs ∈ otway

OR3:  [| evs ∈ otway; B ≠ Server; Key Kab ∉ used evs;
         Says B′ Server {| Nonce NA, Agent A, Agent B,
                           Crypt (ShrK A) {| Nonce NA, Agent A, Agent B |},
                           Nonce NB,
                           Crypt (ShrK B) {| Nonce NA, Agent A, Agent B |} |} ∈ set_of_list evs |]
      ⇒ Says Server B {| Nonce NA,
                         Crypt (ShrK A) {| Nonce NA, Key Kab |},
                         Crypt (ShrK B) {| Nonce NB, Key Kab |} |} # evs ∈ otway
