Lecture 9 – Bayesian Network Yeni Herdiyeni, Dept. Ilmu Komputer
Probabilities • Probability distribution P(X|x) • X is a random variable • Discrete • Continuous • x is the background state of information
Discrete Random Variables • Finite set of possible outcomes, with P(X = xᵢ) ≥ 0 and Σᵢ P(X = xᵢ) = 1 • X binary: P(x) + P(¬x) = 1
Continuous Random Variable • Probability distribution (density function) over continuous values • e.g., the probability that X lies between 5 and 7 is the area under the density function from 5 to 7
More Probabilities • Joint • Probability that both X=x and Y=y • Conditional • Probability that X=x given we know that Y=y
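In standard notation, the two quantities above are:

\[ P(x, y) \equiv P(X{=}x \wedge Y{=}y), \qquad P(x \mid y) \equiv \frac{P(x, y)}{P(y)} \]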
Rules of Probability • Product Rule • Marginalization (binary case shown below)
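In standard form:

\[ P(x, y) = P(x \mid y)\,P(y) \quad \text{(product rule)}, \qquad P(x) = \sum_{y} P(x, y) \quad \text{(marginalization)} \]

For a binary variable the marginalization sum has just two terms: \( P(x) = P(x, y) + P(x, \neg y) \).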
Conditional Probability • If E and F are independent, then P(E ∩ F) = P(E) P(F), equivalently P(E | F) = P(E) • Law of Total Probability
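In symbols, for events F₁, …, Fₙ that partition the sample space:

\[ P(E) = \sum_{i=1}^{n} P(E \mid F_i)\, P(F_i) \]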
Conditional Probability • Bayes’ Theorem • Chain Rule
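The two rules named above, in standard form:

\[ P(E \mid F) = \frac{P(F \mid E)\, P(E)}{P(F)} \qquad \text{(Bayes' theorem)} \]
\[ P(X_1, \ldots, X_n) = \prod_{i=1}^{n} P(X_i \mid X_1, \ldots, X_{i-1}) \qquad \text{(chain rule)} \]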
Conditional Independence • Let E, F, and G be events. E and F are conditionally independent given G if • An equivalent definition is:
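In symbols:

\[ P(E \cap F \mid G) = P(E \mid G)\, P(F \mid G) \]

and, equivalently,

\[ P(E \mid F \cap G) = P(E \mid G). \]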
Naïve Bayes Classifier • In Naïve Bayes, the input variables are assumed to be mutually independent given the class!
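Concretely, the classifier picks the class c that maximizes the product of the class prior and the per-attribute conditionals:

\[ \hat{c} = \arg\max_{c} \; P(c) \prod_{i=1}^{n} P(x_i \mid c) \]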
Naïve Bayes vs Bayesian Network • Naïve Bayes ignores correlations between variables • it assumes the attributes used are conditionally independent of one another (independence). • A Bayesian network, in contrast, allows the input variables to depend on one another (dependence).
Bayesian Networks • A Bayesian network (BN) is a probabilistic graphical model that represents a set of variables and their independencies • Formally, a BN is a directed acyclic graph (DAG) whose nodes represent variables, and whose arcs encode the conditional independencies between the variables
Bayesian Network - Example (from Charniak) • Nodes: family-out (fo), bowel-problem (bp), light-on (lo), dog-out (do), hear-bark (hb) • Arcs: fo → lo, fo → do, bp → do, do → hb
Bayesian Networks • “Over the last few years, a method of reasoning using probabilities, variously called belief networks, Bayesian networks, knowledge maps, probabilistic causal networks, and so on, has become popular within the AI community” - from Charniak • Applications include medical diagnosis, map learning, language understanding, vision, and heuristic search. • In particular, this method is playing an increasingly important role in the design and analysis of machine learning algorithms.
Bayesian Networks Two interpretations • Causal • BNs are used to model situations where causality plays a role, but our understanding is incomplete, so that we must describe things probabilistically • Probabilistic • BNs allow us to calculate the conditional probabilities of the nodes in a network given that some of the values have been observed.
Probabilities in BNs • Specifying the probability distribution for a BN requires: • The prior probabilities of all the root nodes (nodes without parents) • The conditional probabilities of all non-root nodes given all possible combinations of their direct parents • BN representation can yield significant savings in the number of values needed to specify the probability distribution • If variables are binary, then 2ⁿ − 1 values are required for the complete joint distribution, where n is the number of variables
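As a worked count for the five-variable, all-binary network on the next slide: the full joint distribution needs 2⁵ − 1 = 31 numbers, while the BN needs only the 10 shown there (1 each for P(fo) and P(bp), 4 for dog-out, 2 for light-on, 2 for hear-bark).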
Probabilities in BNs - Example (from Charniak)
Priors: P(fo) = .15, P(bp) = .01
dog-out: P(do | fo, bp) = .99, P(do | fo, ¬bp) = .90, P(do | ¬fo, bp) = .97, P(do | ¬fo, ¬bp) = .3
light-on: P(lo | fo) = .6, P(lo | ¬fo) = .05
hear-bark: P(hb | do) = .7, P(hb | ¬do) = .01
Calculating Probabilities - Example What is the probability that the lights are out? P(lo) = P(lo | fo) P(fo) + P(lo | ¬fo) P(¬fo) = .6 (.15) + .05 (.85) = 0.1325
Calculating Probabilities - Example What is the probability that the dog is out? P(do) = P(do | fo, bp) P(fo, bp) + P(do | fo, ¬bp) P(fo, ¬bp) + P(do | ¬fo, bp) P(¬fo, bp) + P(do | ¬fo, ¬bp) P(¬fo, ¬bp). Since fo and bp are independent, each joint term factors, e.g. P(fo, bp) = P(fo) P(bp), so P(do) = .99 (.15)(.01) + .90 (.15)(.99) + .97 (.85)(.01) + .3 (.85)(.99) ≈ 0.396 ≈ 0.4
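To sanity-check both of these computations, a minimal Python sketch (the variable names are mine) that builds the full joint distribution from the CPTs listed above and sums over it:

```python
from itertools import product

# CPTs from the Charniak example above
P_fo, P_bp = 0.15, 0.01
P_do = {(True, True): 0.99, (True, False): 0.90,
        (False, True): 0.97, (False, False): 0.30}  # keyed by (fo, bp)
P_lo = {True: 0.60, False: 0.05}                     # keyed by fo
P_hb = {True: 0.70, False: 0.01}                     # keyed by do

def joint(fo, bp, lo, do, hb):
    """P(fo, bp, lo, do, hb), factored along the network structure."""
    p = (P_fo if fo else 1 - P_fo) * (P_bp if bp else 1 - P_bp)
    p *= P_lo[fo] if lo else 1 - P_lo[fo]
    p *= P_do[(fo, bp)] if do else 1 - P_do[(fo, bp)]
    p *= P_hb[do] if hb else 1 - P_hb[do]
    return p

def marginal(var_index):
    """Sum the joint over all worlds where the indexed variable is true."""
    return sum(joint(*w) for w in product([True, False], repeat=5)
               if w[var_index])

print(f"P(lo) = {marginal(2):.4f}")  # 0.1325
print(f"P(do) = {marginal(3):.4f}")  # 0.3958
```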
Types of Connections in BNs • Linear: a → b → c • Converging: a → b ← c • Diverging: a ← b → c
Independence Assumptions • Linear connection: the two end variables are usually dependent on each other; observing the middle variable renders them independent. • Converging connection: the two end variables are usually independent of each other; observing the middle variable renders them dependent. • Diverging connection: the two end variables are usually dependent on each other; observing the middle variable renders them independent.
Inference in Bayesian Networks • A basic task for BNs is to compute the posterior probability distribution for a set of query variables, given values for some evidence variables. This is called inference or belief updating. • The input to BN inference is a set of evidence values: e.g., E = { hear-bark = true, light-on = true } • The outputs of BN inference are the conditional probabilities P(Xi = v | E), where Xi is a variable in the network.
Inference in Bayesian Networks • Types of inference: • Diagnostic • Causal • Intercausal (Explaining Away) • Mixed
A Bayesian Network • Age and Gender at the top, feeding Exposure to Toxics and Smoking; those two are the parents of Cancer; Cancer in turn is the parent of Serum Calcium and Lung Tumor
Independence • Age and Gender are independent: P(A, G) = P(A) P(G) • Equivalently, P(A | G) = P(A), i.e. A ⊥ G, and P(G | A) = P(G), i.e. G ⊥ A, since P(A, G) = P(A | G) P(G) = P(G | A) P(A)
Conditional Independence • Cancer is independent of Age and Gender given Smoking: P(C | A, G, S) = P(C | S), i.e. C ⊥ {A, G} | S
More Conditional Independence: Naïve Bayes • Serum Calcium is independent of Lung Tumor, given Cancer: P(L | SC, C) = P(L | C) • Marginally, Serum Calcium and Lung Tumor are dependent
More Conditional Independence: Explaining Away • Exposure to Toxics and Smoking are (marginally) independent: E ⊥ S • Given Cancer, Exposure to Toxics becomes dependent on Smoking: P(E = heavy | C = malignant) > P(E = heavy | C = malignant, S = heavy)
Put it all together • The full network over Age, Gender, Exposure to Toxics, Smoking, Cancer, Serum Calcium, and Lung Tumor combines all of the independencies above
Diagnostic Inference • Inferring the probability of a cause (query Q) based on evidence (E) of an effect • Also known as “bottom up” reasoning
Example: Diagnostic Inference (family-out → dog-out ← bowel-problem) • Given that the dog is out, what is the probability that the family is out? That the dog has a bowel problem? • What is the probable cause of the dog being out?
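A short sketch answering these diagnostic queries with Bayes’ theorem, reusing the Charniak CPT numbers and the marginal P(do) ≈ 0.3958 computed earlier:

```python
# Numbers from the Charniak example above
P_fo, P_bp = 0.15, 0.01
P_do = 0.3958                                  # marginal computed earlier
P_do_given_fo = 0.99 * 0.01 + 0.90 * 0.99      # = 0.9009, marginalizing bp
P_do_given_bp = 0.99 * 0.15 + 0.97 * 0.85      # = 0.9730, marginalizing fo

# Diagnostic (bottom-up) inference via Bayes' theorem
P_fo_given_do = P_do_given_fo * P_fo / P_do    # ≈ 0.341
P_bp_given_do = P_do_given_bp * P_bp / P_do    # ≈ 0.025
print(P_fo_given_do, P_bp_given_do)  # family-out is the more probable cause
```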
Causal Inference • Inferring the probability of an effect (query Q) based on evidence (E) of a cause • Also known as “top down” reasoning
Example: Causal Inference • What is the probability that the dog is out given that the family is out? P(do | fo) = P(do | fo, bp) P(bp) + P(do | fo, ¬bp) P(¬bp) = .99 (.01) + .90 (.99) = 0.90 • What is the probability that the dog is out given that he has a bowel problem? P(do | bp) = P(do | bp, fo) P(fo) + P(do | bp, ¬fo) P(¬fo) = .99 (.15) + .97 (.85) = 0.973
Intercausal Inference (Explaining Away) • Involves two causes that “compete” to “explain” an effect • The causes become conditionally dependent given that their common effect is observed, even though they are marginally independent.
Explaining Away - Example • What is the probability that the family is out, given that the dog is out and has a bowel problem? • Evidence of the bowel problem “explains away” the fact that the dog is out.
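A quick numeric check of this claim with the same numbers (fo and bp are marginally independent, so the joint over the causes factors):

```python
# P(fo | do, bp) = P(do | fo, bp) P(fo) P(bp) / P(do, bp)
num = 0.99 * 0.15 * 0.01               # fo = true
den = num + 0.97 * 0.85 * 0.01         # + fo = false
print(num / den)  # ≈ 0.153, down from P(fo | do) ≈ 0.341: bp explains fo away
```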
Mixed Inference • Combines two or more diagnostic, causal, or intercausal inferences
Predictive Inference • How likely are elderly males to get malignant cancer? • P(C = malignant | Age > 60, Gender = male)
Combined • How likely is an elderly male patient with high Serum Calcium to have malignant cancer? • P(C = malignant | Age > 60, Gender = male, Serum Calcium = high)
Explaining Away • If we see a lung tumor, the probability of heavy smoking and of exposure to toxics both go up. • If we then observe heavy smoking, the probability of exposure to toxics goes back down.
Bayesian Network - Exercise Yeni Herdiyeni
Bayesian Network • From the figure, the joint probability P(R, W) can be read off. If P(R) = 0.4, then P(¬R) = 0.6, and we are given P(¬W | ¬R) = 0.8, so P(W | ¬R) = 0.2. • Bayes’ rule can be used to make a diagnosis.
Bayesian Network For example, if the grass is known to be wet, the probability that it rained can be computed as follows:
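The slide’s own numbers give P(W | ¬R) = 1 − 0.8 = 0.2 but not P(W | R); assuming a hypothetical P(W | R) = 0.9 purely for illustration, Bayes’ rule gives:

\[ P(R \mid W) = \frac{P(W \mid R)\,P(R)}{P(W \mid R)\,P(R) + P(W \mid \neg R)\,P(\neg R)} = \frac{0.9 \times 0.4}{0.9 \times 0.4 + 0.2 \times 0.6} = \frac{0.36}{0.48} = 0.75 \]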
Bayesian Network • What is the probability that the grass is wet if the sprinkler is on (and it is unknown whether it rained)?
Bayesian Network • What is the probability that the sprinkler is on given that the grass is wet, P(S|W)?
Bayesian Network • If it is known to be raining, what is the probability that the sprinkler is on?
Bayesian Network • What if we add the assumption: if the weather is cloudy, then the sprinkler is most likely not on?
Bayesian Network • What is the probability that the grass is wet given that it is cloudy?
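A minimal enumeration sketch for these exercises over the classic four-node sprinkler network (Cloudy → Sprinkler, Cloudy → Rain, Sprinkler and Rain → WetGrass). The CPT values below are hypothetical stand-ins, since only P(R) and P(¬W | ¬R) are given above; plug in the exercise figure’s actual numbers to reproduce its answers:

```python
from itertools import product

# Hypothetical CPTs for the classic sprinkler network (illustrative only)
P_c = 0.5
P_s = {True: 0.1, False: 0.5}                    # P(S | C)
P_r = {True: 0.8, False: 0.2}                    # P(R | C)
P_w = {(True, True): 0.99, (True, False): 0.90,  # P(W | S, R)
       (False, True): 0.90, (False, False): 0.00}

def joint(c, s, r, w):
    """P(c, s, r, w), factored along the network structure."""
    p = P_c if c else 1 - P_c
    p *= P_s[c] if s else 1 - P_s[c]
    p *= P_r[c] if r else 1 - P_r[c]
    p *= P_w[(s, r)] if w else 1 - P_w[(s, r)]
    return p

def query(target, given):
    """P(target | given); each argument is a dict like {'w': True}."""
    idx = {'c': 0, 's': 1, 'r': 2, 'w': 3}
    worlds = list(product([True, False], repeat=4))
    match = lambda world, cond: all(world[idx[v]] == val
                                    for v, val in cond.items())
    num = sum(joint(*w) for w in worlds if match(w, {**given, **target}))
    den = sum(joint(*w) for w in worlds if match(w, given))
    return num / den

print(query({'w': True}, {'s': True}))   # P(W | S): grass wet given sprinkler
print(query({'s': True}, {'w': True}))   # P(S | W): sprinkler given wet grass
print(query({'s': True}, {'r': True}))   # P(S | R): sprinkler given rain
print(query({'w': True}, {'c': True}))   # P(W | C): grass wet given cloudy
```

Note that the Cloudy → Sprinkler arc already encodes the assumption on the previous slide: with these CPTs, the sprinkler is unlikely to be on when it is cloudy.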