
Chapter 12. Basic Probability



Presentation Transcript


  1. Chapter 12. Basic Probability 2013/05/08

  2. How Do You Use Probability in Games? • Bayesian analysis for decision making under uncertainty is fundamentally tied to probability. • Genetic algorithms also use probability to some extent, for example to determine mutation rates. • Even neural networks can be coupled with probabilistic methods.

  3. Randomness • rand() returns an integer in the range 0 to RAND_MAX • rand() % N maps the result into the range 0 to N − 1 • srand(seed) seeds the generator so the sequence is reproducible • Example: a unit, when confronted, will move left with 25% probability, move right with 25% probability, or back up with 50% probability • Proportion method (roulette-wheel selection)

  4. Hit Probabilities, Character Abilities • e.g., a character has a 60% probability of striking his opponent

  5. State Transitions • Assign individual creatures different transition probabilities, giving each one its own distinct personality

  6. Adaptability • Update certain probabilities as the game is played so that computer-controlled units can learn and adapt • e.g., track the number and outcomes of confrontations between a certain type of creature and a certain class of player

  7. 12.2 What Is Probability? • Classical Probability • Frequency Interpretation • Subjective Interpretation • Odds • Expectation

  8. Techniques for Assigning Subjective Probability • Odds • betting metaphor: a "put up or shut up" approach • estimates can be biased by the risk the bettor perceives • Expectation • fair price

  9. Probability Rules • P(A) must be a real number between 0 and 1 • If S represents the entire sample space for the experiment, the probability of S equals 1 • The probability of the complement of A, P(A′), is 1 − P(A) • If two events A and B are mutually exclusive, only one of them can occur at a given time

  10. Rule 5 • For non-mutually exclusive events: P(A or B) = P(A) + P(B) − P(A and B)

  11. Rule 6 • If two events, A and B, are independent: P(A and B) = P(A) · P(B)

  12. Random variables

  13. Prior probability

  14. Joint Probability Distribution

  15. Conditional Probability

  16. Conditional Probability (cont.)

  17. Inference by Enumeration

  18. Inference by Enumeration

  19. Inference by Enumeration

  20. Inference by Enumeration

  21. Normalization

  22. Inference by Enumeration

  23. Independence

  24. Conditional Independence

  25. Conditional Independence

  26. Chapter 13. Bayesian Networks

  27. Bayes’ Rule • P(A|B) = P(B|A) P(A) / P(B)

  28. Bayes’ Rule and Conditional Independence

  29. Bayesian networks • A simple, graphical notation for conditional independence assertions and hence for compact specification of full joint distributions • Syntax: • a set of nodes, one per variable • a directed, acyclic graph (link ≈ "directly influences") • a conditional distribution for each node given its parents: P(Xi | Parents(Xi)) • In the simplest case, the conditional distribution is represented as a conditional probability table (CPT) giving the distribution over Xi for each combination of parent values

  30. Example • Topology of network encodes conditional independence assertions: • Weather is independent of the other variables • Toothache and Catch are conditionally independent given Cavity

  31. Example • I'm at work, neighbor John calls to say my alarm is ringing, but neighbor Mary doesn't call. Sometimes it's set off by minor earthquakes. Is there a burglar? • Variables: Burglary, Earthquake, Alarm, JohnCalls, MaryCalls • Network topology reflects "causal" knowledge: • A burglar can set the alarm off • An earthquake can set the alarm off • The alarm can cause Mary to call • The alarm can cause John to call

  32. Example contd.

  33. Compactness • A CPT for Boolean Xi with k Boolean parents has 2^k rows for the combinations of parent values • Each row requires one number p for Xi = true (the number for Xi = false is just 1 − p) • If each variable has no more than k parents, the complete network requires O(n · 2^k) numbers • I.e., grows linearly with n, vs. O(2^n) for the full joint distribution • For the burglary net, 1 + 1 + 4 + 2 + 2 = 10 numbers (vs. 2^5 − 1 = 31)

  34. Semantics • The full joint distribution is defined as the product of the local conditional distributions: P(X1, …, Xn) = ∏i=1..n P(Xi | Parents(Xi)) • e.g., P(j ∧ m ∧ a ∧ ¬b ∧ ¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)

  35. Constructing Bayesian Networks

  36. Example

  37. Example

  38. Example

  39. Example

  40. Example

  41. Example

  42. Bayesian Networks in Games • Bayesian networks provide a natural representation for (causally induced) conditional independence • Topology + CPTs = compact representation of the joint distribution • Generally easy for domain experts to construct

  43. Example Bayesian network • The nodes labeled T, Tr, and L represent random variables. The arrows connecting the nodes represent causal relationships.

  44. Example conditional probability tables • P(T) • P(Tr) • P(L|T, Tr)

  45. Decision-making problems • Use Bayesian methods for very specific decision-making problems and leave the other AI tasks to methods that are better suited for them

  46. Inference • Predictive reasoning (upper-right figure): P(A|B,C) = α P(B,C|A) P(A) = α P(B|A) P(C|A) P(A), where α is the normalization constant and the second step uses conditional independence of B and C given A • Diagnostic reasoning (left figure) • Explaining away

  47. Trapped? • P(L|T) • P(T)

  48. Tree Diagram
