
Once again about the science-policy interface


Presentation Transcript


  1. Once again about the science-policy interface

  2. Open risk management: overview (diagram with elements labelled Q, R, A)

  3. Human-human interface
  • There are really interesting new interfaces for transmitting information from person to person:
    • Facebook: How are you?
    • Wikipedia: What is thing X?
    • Opasnet: What should we do?
  • A universal interface for communication about decisions and decision support is urgently needed.
  • When that problem is solved, the communication problem between science and policy is solved as well. There is no need for a separate science-policy interface.

  4. Introduction to probability theory (Jouni Tuomisto, THL)

  5. Probability of a red ball
  • P(x|K) = R/N, where R is the number of red balls and N is the total number of balls in the urn
  • x = the event that a red ball is picked
  • K = your knowledge about the situation

  6. Probability of an event x
  • Decision 1: a ball is drawn from the urn; the prize is 100 € for a red ball (probability p) and 0 € for a white ball (probability 1−p).
  • Decision 2: the prize is 100 € if x happens and 0 € if x does not happen.
  • If you are indifferent between decisions 1 and 2, then your probability of x is p = R/N.

  7. The meaning of uncertainty
  • Uncertainty is that which disappears when we become certain.
  • We become certain of a declarative sentence when (a) truth conditions exist and (b) the conditions for the value ‘true’ hold. (Bedford and Cooke 2001)
  • Truth conditions: it is possible to design a setting where it can be observed whether the truth conditions are met or not.

  8. Different kinds of uncertainty
  • Aleatory (variability, irreducible)
  • Epistemic (reducible)
  • In practice the distinction depends only on the purpose of the quantity in a model: the weights of individuals are aleatory if we are interested in each person, but epistemic if we are interested in a random person in the population.
  • Parameter (in a model): should be observable!
  • Model uncertainty: several models can be treated as parameters in a meta-model.

  9. Different kinds of uncertainty: not really uncertainty
  • Ambiguity: not uncertainty but fuzziness of description.
  • Volitional uncertainty: “The probability that I will clean up the basement next weekend.” Uncertainties about one’s own actions cannot be measured by probabilities.

  10. What is probability?
  • Frequentists talk about probabilities only when dealing with experiments that are random and well-defined. The probability of a random event denotes the relative frequency of occurrence of an experiment's outcome when the experiment is repeated. Frequentists consider probability to be the relative frequency "in the long run" of outcomes. [1]
  • Bayesians, however, assign probabilities to any statement whatsoever, even when no random process is involved. Probability, for a Bayesian, is a way to represent an individual's degree of belief in a statement, or an objective degree of rational belief, given the evidence.
  • Source: Wikipedia
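As a minimal sketch of the frequentist reading above (not part of the slides), the following Python snippet repeatedly draws from an urn where the probability of a red ball is 0.4, the same proportion used in slides 14 and 15, and prints the observed relative frequency of red at a few checkpoints; in the long run it settles near 0.4.

```python
import random

# Long-run relative frequency: draw repeatedly (with replacement) from an urn
# where the probability of a red ball is 0.4, and print the observed frequency
# of red at a few checkpoints. It settles near 0.4 as the number of draws grows.
random.seed(1)
p_red = 0.4
reds = 0
for trial in range(1, 100_001):
    reds += random.random() < p_red          # one simulated draw
    if trial in (10, 100, 1_000, 10_000, 100_000):
        print(f"{trial:>7} draws: frequency of red = {reds / trial:.3f}")
```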

  11. Probability rules
  • Rule 1 (convexity): for all A and B, 0 ≤ P(A|B) ≤ 1 and P(A|A) = 1.
    • Cromwell’s rule: P(A|B) = 1 if and only if A is a logical consequence of B.
  • Rule 2 (addition): if A and B are exclusive given C, P(A ∪ B|C) = P(A|C) + P(B|C). If they are not exclusive, P(A ∪ B|C) = P(A|C) + P(B|C) − P(A ∩ B|C).
  • Rule 3 (multiplication): for all A, B, and C, P(AB|C) = P(A|BC) P(B|C).
  • Rule 4 (conglomerability): if {Bn} is a partition, possibly infinite, of C and P(A|BnC) = k, the same value for all n, then P(A|C) = k.
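As a quick numerical check of rules 1 to 3 (our own illustration, not from the slides), the sketch below counts equally likely outcomes of one roll of a fair six-sided die. The events A = "even" and B = "greater than 3" are arbitrary choices, and the conditioning event C is taken to be the whole sample space.

```python
from fractions import Fraction

# One roll of a fair six-sided die: a finite, equally likely sample space.
omega = {1, 2, 3, 4, 5, 6}

def P(event, given=omega):
    """P(event | given), computed by counting equally likely outcomes."""
    return Fraction(len(event & given), len(given))

A = {2, 4, 6}   # "the roll is even"
B = {4, 5, 6}   # "the roll is greater than 3"

# Rule 1 (convexity): 0 <= P(A|B) <= 1 and P(A|A) = 1
assert 0 <= P(A, given=B) <= 1 and P(A, given=A) == 1

# Rule 2 (addition, non-exclusive form): P(A U B) = P(A) + P(B) - P(A n B)
assert P(A.union(B)) == P(A) + P(B) - P(A.intersection(B))

# Rule 3 (multiplication): P(A n B) = P(A|B) P(B)
assert P(A.intersection(B)) == P(A, given=B) * P(B)

print("rules 1-3 hold for this example")
```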

  12. Conditional probabilities
  • (Venn diagram: the totality of possible states of the world, with overlapping events A and B)

  13. Binomial distribution
  • You make n trials with success probability p. The number of successful trials k follows the binomial distribution.
  • Like drawing n balls (with replacement) from an urn and counting how many (k) are red.
  • P(n,k|p) = n! / (k! (n−k)!) × p^k × (1−p)^(n−k)
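To connect the formula above to something executable, here is a minimal Python sketch (ours, not from the slides) that implements the binomial probability with math.comb, where comb(n, k) = n!/(k!(n−k)!), and checks that the probabilities over k = 0..n sum to one for the n = 3, p = 0.4 case used in slides 14 and 15.

```python
from math import comb   # comb(n, k) = n! / (k! (n-k)!)

def binom_pmf(k, n, p):
    """Probability of exactly k successes in n independent trials with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 3, 0.4   # 3 draws, 40 % of the balls are red (the example in slides 14-15)
probs = [binom_pmf(k, n, p) for k in range(n + 1)]
assert abs(sum(probs) - 1.0) < 1e-12      # a probability distribution sums to 1
print([round(q, 3) for q in probs])       # [0.216, 0.432, 0.288, 0.064]
```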

  14. Example
  • You randomly draw (with replacement) 3 balls from an urn with 40 red and 60 white balls. What is the probability distribution for the number of red balls?

  15. Answer
  • P(n,k|p) = n! / (k! (n−k)!) × p^k × (1−p)^(n−k), with n = 3 and p = 0.4:
  • 0 red: 3!/(0! 3!) × 0.4^0 × 0.6^3 = 1 × 1 × 0.216 = 0.216
  • 1 red: 3!/(1! 2!) × 0.4^1 × 0.6^2 = 3 × 0.4 × 0.36 = 0.432
  • 2 red: 3!/(2! 1!) × 0.4^2 × 0.6^1 = 3 × 0.16 × 0.6 = 0.288
  • 3 red: 3!/(3! 0!) × 0.4^3 × 0.6^0 = 1 × 0.064 × 1 = 0.064
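The same numbers can be reproduced in a few lines of Python; this is just a cross-check of the hand calculation above, using math.comb for the binomial coefficient.

```python
from math import comb

n, p = 3, 0.4                 # 3 draws, probability 0.4 of a red ball per draw
for k in range(n + 1):
    prob = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"{k} red: {prob:.3f}")
# Prints 0.216, 0.432, 0.288 and 0.064, matching slide 15.
```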

  16. Binomial distribution (n=6)
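The slide shows the distribution for n = 6 as a figure; a rough text-mode substitute is sketched below. The value p = 0.4 is an assumption made for illustration, since the slide title does not state which success probability is plotted.

```python
from math import comb

n, p = 6, 0.4     # NOTE: p = 0.4 is an assumed value; the slide does not give it
for k in range(n + 1):
    prob = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"k={k}  P={prob:.3f}  " + "#" * round(prob * 100))
```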

  17. Binomial distribution: likelihoods
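Slide 17's figure is not reproduced here, but the idea of a binomial likelihood can be sketched as follows: fix an observed outcome and read the same formula as a function of p. The observation k = 2 red in n = 6 draws is an arbitrary illustrative assumption, not something stated in the slides.

```python
from math import comb

n, k = 6, 2        # assumed observation: 2 red balls in 6 draws (illustrative only)
for tenths in range(11):
    p = tenths / 10
    likelihood = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"p = {p:.1f}   likelihood = {likelihood:.3f}")
# The likelihood is maximised at p = k/n = 1/3, between the printed 0.3 and 0.4.
```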
