Open universes and nuclear weapons


  1. Open universes and nuclear weapons

  2. Outline • Why we need expressive probabilistic languages • BLOG combines probability and first-order logic • Application to global seismic monitoring for the Comprehensive Nuclear-Test-Ban Treaty (CTBT)

  3. The world has things in it!
  • Expressive language => concise models => fast learning, sometimes fast reasoning
  • E.g., the rules of chess:
    • ~1 page in first-order logic: On(color, piece, x, y, t)
    • ~100,000 pages in propositional logic: WhiteKingOnC4Move12, ...
    • ~10^38 pages as an atomic-state model: R.B.KB.RPPP..PPP..N..N…..PP….q.pp..Q..n..n..ppp..pppr.b.kb.r
  • [Note: chess is a tiny problem compared to the real world]
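  To make the blow-up concrete, here is a back-of-envelope Python sketch (illustrative arithmetic, not from the slides beyond the encodings they name): grounding the single first-order predicate On(color, piece, x, y, t) already costs hundreds of propositional symbols per time step, while the atomic view must distinguish every raw board configuration.

    # Back-of-envelope comparison of the three encodings of a chess position.
    colors, pieces, files, ranks = 2, 6, 8, 8

    # Propositional: grounding On(color, piece, x, y, t) gives one symbol
    # per argument combination, at every time step.
    ground_atoms_per_step = colors * pieces * files * ranks
    print(ground_atoms_per_step)          # 768

    # Atomic: each of the 64 squares holds one of 13 values (6 white pieces,
    # 6 black pieces, or empty), ignoring legality constraints.
    atomic_states = 13 ** 64
    print(f"{atomic_states:.1e}")         # ~1.9e+71 raw configurations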

  4–10. Brief history of expressiveness (shown as an animated build; consolidated here as a table)

                    atomic      propositional    first-order/relational
     probability    17th C      20th C           21st C (be patient!)
     logic                      5th C B.C.       19th C

  11. First-order probabilistic languages
  • Gaifman [1964]: possible worlds with objects and relations, probabilities attached to (infinitely many) sentences
  • Halpern [1990]: probabilities within sentences, constraints on distributions over first-order possible worlds
  • Poole [1993], Sato [1997], Koller & Pfeffer [1998], various others: KB defines the distribution exactly (cf. Bayes nets); assumes unique names and domain closure, like Prolog and databases (Herbrand semantics)

  12–14. Herbrand vs full first-order
  Given Father(Bill, William) and Father(Bill, Junior), how many children does Bill have?
  • Herbrand semantics: 2 (every ground term denotes a distinct object, and no other objects exist)
  • First-order logical semantics: between 1 and ∞ (William and Junior may denote the same child, and unmentioned children may exist)

  15–17. Possible worlds
  • Propositional: worlds are truth assignments to a fixed set of variables
  • First-order + unique names, domain closure: worlds are relational structures over exactly the named objects
  • First-order open-universe: worlds may identify named objects with each other, and may contain additional unnamed objects
  [Figure, built over three slides: diagrams of the possible worlds over objects A, B, C, D for each case]

  18. Open-universe models
  • Essential for learning about what exists, e.g., vision, NLP, information integration, tracking, life
  • [Note the GOFAI Gap: logic-based systems going back to Shakey assumed that perceived objects would be named correctly]
  • Key question: how to define distributions over an infinite, heterogeneous set of worlds?

  19–22. Bayes nets build propositional worlds
  [Figure, built over four slides: network Burglary → Alarm ← Earthquake]
  A world is assembled by sampling each variable in topological order: Burglary = true, then Earthquake = false, then Alarm = true.
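  As a minimal sketch of this construction (Python, not BLOG; the CPT numbers are the textbook alarm-network values, used here as placeholders):

    import random

    def sample_world():
        # Sample parents first, then children, in topological order.
        burglary = random.random() < 0.001
        earthquake = random.random() < 0.002
        # P(Alarm | Burglary, Earthquake)
        p_alarm = {(True, True): 0.95, (True, False): 0.94,
                   (False, True): 0.29, (False, False): 0.001}
        alarm = random.random() < p_alarm[(burglary, earthquake)]
        return {"Burglary": burglary, "Earthquake": earthquake, "Alarm": alarm}

    print(sample_world())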

  23–24. Open-universe models in BLOG
  • Construct worlds using two kinds of steps, proceeding in topological order:
    • Dependency statements: set the value of a function or relation on a tuple of (quantified) arguments, conditioned on parent values
    • Number statements: add some objects to the world, conditioned on what objects and relations exist so far

  25. Semantics
  Every well-formed* BLOG model specifies a unique proper probability distribution over open-universe possible worlds; equivalent to an infinite contingent Bayes net.
  * No infinite receding ancestor chains, no conditioned cycles, all expressions finitely evaluable

  26. Example: Citation Matching [Lashkari et al 94]
  (1) Collaborative Interface Agents, Yezdi Lashkari, Max Metral, and Pattie Maes, Proceedings of the Twelfth National Conference on Articial Intelligence, MIT Press, Cambridge, MA, 1994.
  (2) Metral M. Lashkari, Y. and P. Maes. Collaborative interface agents. In Conference of the American Association for Artificial Intelligence, Seattle, WA, August 1994.
  Are these descriptions of the same object? Core task in CiteSeer, Google Scholar, and over 300 companies in the record-linkage industry.

  27–32. (Simplified) BLOG model (stepped through over six slides)

    #Researcher ~ NumResearchersPrior();                     // how many researchers exist
    Name(r) ~ NamePrior();                                   // each researcher's name
    #Paper(FirstAuthor = r) ~ NumPapersPrior(Position(r));   // papers per researcher
    Title(p) ~ TitlePrior();                                 // each paper's title
    PubCited(c) ~ Uniform({Paper p});                        // which paper a citation refers to
    Text(c) ~ NoisyCitationGrammar(                          // observed citation string
        Name(FirstAuthor(PubCited(c))), Title(PubCited(c)));
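  The model above is BLOG; as an illustration only, the following Python sketch mimics its generative story, with toy stand-ins for NumResearchersPrior, NamePrior, NumPapersPrior, TitlePrior, and NoisyCitationGrammar:

    import random

    def sample_citations(num_citations=3):
        n_researchers = random.randint(1, 5)                      # NumResearchersPrior
        names = [f"Researcher{i}" for i in range(n_researchers)]  # NamePrior
        papers = []                                               # #Paper number statement
        for r in range(n_researchers):
            for _ in range(random.randint(1, 3)):                 # NumPapersPrior
                papers.append((r, f"Title{len(papers)}"))         # TitlePrior
        citations = []
        for _ in range(num_citations):
            author, title = random.choice(papers)                 # PubCited ~ Uniform
            # NoisyCitationGrammar stand-in: truncate the name, lowercase the title
            citations.append(f"{names[author][:8]}. {title.lower()}.")
        return citations

    print(sample_citations())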

  33. Citation Matching Results Four data sets of ~300-500 citations, referring to ~150-300 papers

  34. Example: Sybil attacks
  • Typically between 100 and 10,000 real people
  • About 90% are honest and have one login ID
  • Dishonest people own between 10 and 1000 logins
  • Transactions may occur between logins:
    • If two logins are owned by the same person (sybils), a transaction is highly likely;
    • Otherwise, a transaction is less likely (depending on the honesty of each login's owner)
  • A login may recommend another after a transaction:
    • Sybils with the same owner usually recommend each other;
    • Otherwise, the probability of recommendation depends on the honesty of the two owners

  35–39. (Stepped through over five slides)

    #Person ~ LogNormal[6.9, 2.3]();
    Honest(x) ~ Boolean[0.9]();
    #Login(Owner = x) ~ if Honest(x) then 1 else LogNormal[4.6, 2.3]();
    Transaction(x, y) ~
        if Owner(x) = Owner(y) then SibylPrior()
        else TransactionPrior(Honest(Owner(x)), Honest(Owner(y)));
    Recommends(x, y) ~
        if Transaction(x, y) then
            if Owner(x) = Owner(y) then Boolean[0.99]()
            else RecPrior(Honest(Owner(x)), Honest(Owner(y)));

  Evidence: lots of transactions and recommendations, maybe some Honest(.) assertions
  Query: Honest(x)
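  A Python analogue of the number statements above (a sketch under the slide's stated parameters, with counts clamped to the slide's "100 to 10,000 people" and "10 to 1000 logins" ranges; random.lognormvariate takes the underlying normal's mean and standard deviation):

    import random

    def sample_population():
        # #Person ~ LogNormal[6.9, 2.3]: median exp(6.9) ~ 1000 people
        n_people = min(10_000, max(100, round(random.lognormvariate(6.9, 2.3))))
        logins = {}
        for p in range(n_people):
            honest = random.random() < 0.9        # Honest(x) ~ Boolean[0.9]
            # #Login(Owner = x): 1 if honest, else LogNormal[4.6, 2.3]
            n_logins = 1 if honest else min(1000, max(10, round(
                random.lognormvariate(4.6, 2.3))))
            logins[p] = (honest, n_logins)
        return logins

    pop = sample_population()
    print(sum(1 for h, _ in pop.values() if not h), "dishonest of", len(pop))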

  40–45. Example: classical data association
  [Figure-only slides: animation of radar blips appearing over time and being associated with aircraft tracks]

  46.
    #Aircraft(EntryTime = t) ~ NumAircraftPrior();

    Exits(a, t)
        if InFlight(a, t) then ~ Bernoulli(0.1);

    InFlight(a, t)
        if t < EntryTime(a) then = false
        elseif t = EntryTime(a) then = true
        else = (InFlight(a, t-1) & !Exits(a, t-1));

    State(a, t)
        if t = EntryTime(a) then ~ InitState()
        elseif InFlight(a, t) then ~ StateTransition(State(a, t-1));

    #Blip(Source = a, Time = t)
        if InFlight(a, t) then ~ NumDetectionsCPD(State(a, t));

    #Blip(Time = t) ~ NumFalseAlarmsPrior();

    ApparentPos(r)
        if (Source(r) = null) then ~ FalseAlarmDistrib()
        else ~ ObsCPD(State(Source(r), Time(r)));
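  One time step of that generative process, sketched in Python (not BLOG; the detection probability, observation noise, and false-alarm rate are illustrative stand-ins for NumDetectionsCPD, ObsCPD, and FalseAlarmDistrib):

    import random

    def blips_at(aircraft_positions):
        """Map each in-flight aircraft's position to zero or one noisy blips,
        plus occasional false-alarm blips with no source."""
        blips = []
        for a, pos in aircraft_positions.items():
            if random.random() < 0.9:                          # NumDetectionsCPD stand-in
                blips.append((a, pos + random.gauss(0.0, 1.0)))  # ObsCPD stand-in
        if random.random() < 0.05:                             # NumFalseAlarmsPrior stand-in
            blips.append((None, random.uniform(-50.0, 50.0)))  # FalseAlarmDistrib stand-in
        return blips

    print(blips_at({"a1": 3.2, "a2": -7.5}))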

  47. Inference
  Theorem: BLOG inference algorithms (rejection sampling, importance sampling, MCMC) converge to correct posteriors for any well-formed* model, for any first-order query.
  Current generic MCMC engine is quite slow:
  • Applying compiler technology
  • Developing user-friendly methods for specifying piecemeal MCMC proposals
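  For intuition, here is a rejection-sampling loop of the kind the theorem covers, run on the earlier alarm example (Python sketch; evidence Alarm = true, query Burglary): sample whole worlds from the prior, keep those matching the evidence, and report the query's frequency among the kept worlds.

    import random

    def estimate_p_burglary_given_alarm(n=200_000):
        p_alarm = {(True, True): 0.95, (True, False): 0.94,
                   (False, True): 0.29, (False, False): 0.001}
        hits = kept = 0
        for _ in range(n):
            b = random.random() < 0.001
            e = random.random() < 0.002
            a = random.random() < p_alarm[(b, e)]
            if a:              # keep only worlds matching the evidence
                kept += 1
                hits += b
        return hits / kept if kept else float("nan")

    print(estimate_p_burglary_given_alarm())   # ~0.37 for these CPTs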

  48. CTBT
  • Bans testing of nuclear weapons on earth
  • Allows for outside inspection of 1,000 km²
  • 182/195 states have signed; 153/195 have ratified
  • Needs 9 more ratifications, including US and China
  • US Senate refused to ratify in 1998: "too hard to monitor"

  49. 2053 nuclear explosions
