Network Theory and Dynamic Systems Information Cascades - Bayes

Presentation Transcript


  1. Network Theory and Dynamic Systems: Information Cascades - Bayes. Prof. Dr. Steffen Staab

  2. Bayes' rule
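The formula on this slide was an image in the original; for reference, Bayes' rule in the notation of the next slide, together with the law of total probability used for the denominator:

```latex
P[A \mid B] \;=\; \frac{P[B \mid A]\,P[A]}{P[B]},
\qquad
P[B] \;=\; \sum_i P[B \mid A_i]\,P[A_i]
```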

  3. Terminology • P[A]: prior probability (also called the marginal probability) • P[A|B]: posterior probability

  4. Example for Bayes' rule • Eyewitness to an accident involving a taxi • 80% of taxis are yellow: P[true=Y] = 0.8 • 20% of taxis are black: P[true=B] = 0.2 • The eyewitness is unreliable: • P[report=Y|true=Y] = 0.8, which implies P[report=B|true=Y] = 0.2 • P[report=B|true=B] = 0.8, which implies P[report=Y|true=B] = 0.2

  5.-7. Putting it together • Computing the marginal probability: by the law of total probability, P[report=Y] = P[report=Y|true=Y]·P[true=Y] + P[report=Y|true=B]·P[true=B] = 0.8·0.8 + 0.2·0.2 = 0.68, and likewise P[report=B] = 0.2·0.8 + 0.8·0.2 = 0.32 • Putting everything together for a witness who reports "black": P[true=B|report=B] = P[report=B|true=B]·P[true=B] / P[report=B] = (0.8·0.2) / 0.32 = 0.5
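A minimal Python sketch that reproduces these numbers (the dictionary encoding is mine, not from the slides):

```python
# Priors over the true taxi colour (Y = yellow, B = black).
prior = {"Y": 0.8, "B": 0.2}
# Witness reliability: likelihood[(report, true)] = P[report | true].
likelihood = {("Y", "Y"): 0.8, ("B", "Y"): 0.2,
              ("B", "B"): 0.8, ("Y", "B"): 0.2}

def posterior(report):
    """P[true = t | report] for every colour t, via Bayes' rule."""
    # Marginal P[report] by the law of total probability.
    marginal = sum(likelihood[(report, t)] * prior[t] for t in prior)
    return {t: likelihood[(report, t)] * prior[t] / marginal for t in prior}

print(posterior("B"))  # {'Y': 0.5, 'B': 0.5} -> a "black" report is only 50% reliable
print(posterior("Y"))  # {'Y': ~0.94, 'B': ~0.06}
```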

  8. Side remarks • P[true=B | report=B] is only 0.5 • False positives heavily influence this result, and most people do not expect such a heavy influence from false positives • Especially in medical treatment this has been shown to be highly problematic, because doctors are equally bad at handling such conditional probabilities (cf. research by the psychologist Gerd Gigerenzer and his team)
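Gigerenzer's suggested remedy is to restate such problems in natural frequencies, which makes the 0.5 easy to see; a quick check with the numbers above:

```latex
\text{Out of 1000 taxis: } 800 \text{ yellow},\ 200 \text{ black.}\\
\text{Reported as black: } 0.8 \cdot 200 = 160 \text{ (correct)}
  \;+\; 0.2 \cdot 800 = 160 \text{ (false positives)}\\
P[\text{true}=B \mid \text{report}=B] \;=\; \frac{160}{160 + 160} \;=\; 0.5
```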

  9. Bayes' rule in the herding experiment • Each individual's objective is to be rewarded • Guess "blue" if and only if P[majority-blue | what was seen and heard] > ½ • How? • Priors: P[majority-blue] = P[majority-red] = 0.5 • Signal probabilities (likelihoods): P[blue|majority-blue] = P[red|majority-red] = 2/3

  10. First student • P[majority-blue | blue] = P[blue|majority-blue]·P[majority-blue] / P[blue] = (2/3 · 1/2) / (1/2) = 2/3 • Prior: 1/2; marginal: P[blue] = 2/3·1/2 + 1/3·1/2 = 1/2; posterior: 2/3

  11. Second student, assuming the first said "blue" • Trusting that student 1 behaves rationally • New priors: P[majority-blue] = 2/3, P[majority-red] = 1/3 • The signal probabilities remain unchanged: P[blue|majority-blue] = P[red|majority-red] = 2/3 • Marginal: P[blue] = 2/3·2/3 + 1/3·1/3 = 5/9 • Posterior: P[majority-blue | blue] = (2/3·2/3) / (5/9) = (4/9) / (5/9) = 4/5 = 0.8

  12.-14. Second student, assuming the first reported their true signal • An alternative way of modeling the same problem • Looking for P[majority-blue | blue, blue] • Given the urn, the two signals are independent events: P[blue, blue | majority-blue] = (2/3)² = 4/9, so the joint probability is P[blue, blue, majority-blue] = 4/9 · 1/2 = 2/9 • Marginal: P[blue, blue] = (2/3)²·1/2 + (1/3)²·1/2 = 5/18 • Posterior: P[majority-blue | blue, blue] = (2/9) / (5/18) = 4/5 = 0.8, the same result as before

  15. Third student, assuming a red signal (after two blue) • Looking for P[majority-blue | blue, blue, red] • P[blue, blue, red | majority-blue] = (2/3)²·(1/3) = 4/27 and P[blue, blue, red | majority-red] = (1/3)²·(2/3) = 2/27, so with equal priors P[majority-blue | blue, blue, red] = (4/27) / (4/27 + 2/27) = 2/3 • Since 2/3 > ½, the third student guesses "blue" and ignores the private red signal
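All three students can be checked with one small Python function (function and variable names are mine; the signals are treated as conditionally independent given the urn):

```python
def p_majority_blue(signals, p_blue_given_mb=2/3, prior_mb=0.5):
    """P[majority-blue | sequence of private signals]."""
    like_mb = like_mr = 1.0
    for s in signals:
        # Likelihood of each signal under the two urns.
        like_mb *= p_blue_given_mb if s == "blue" else 1 - p_blue_given_mb
        like_mr *= (1 - p_blue_given_mb) if s == "blue" else p_blue_given_mb
    joint_mb = like_mb * prior_mb
    return joint_mb / (joint_mb + like_mr * (1 - prior_mb))

print(p_majority_blue(["blue"]))                 # 2/3  (first student)
print(p_majority_blue(["blue", "blue"]))         # 4/5  (second student)
print(p_majority_blue(["blue", "blue", "red"]))  # 2/3  (third student: guess blue anyway)
```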

  16. Simple, general cascade model • A group of people (numbered 1, 2, 3, ...) sequentially making decisions • Each person accepts or rejects an option, e.g.: • Adopt a technology • Wear a new fashion • Eat in a specific restaurant • Commit a crime • Vote for a political party • Choose a holiday destination • ...

  17. Simple, general cascade model: ingredients • State of the world: an initial random, unobservable event determines whether accepting or rejecting is better • G: accepting is good • B: accepting is bad • Priors: P[G] = p, P[B] = 1 - p

  18. Simple, general cascade model: ingredients • State of the world: G, B; priors: P[G] = p, P[B] = 1 - p • Payoffs: • Payoff for rejecting: 0 • Payoff for accepting: • If G, the payoff is vg, where vg > 0 • If B, the payoff is vb, where vb < 0 • The model assumes the expected payoff of accepting is initially 0 (indifference): p·vg + (1-p)·vb = 0

  19. Simple, general cascade model: ingredients • State of the world: G, B; priors: P[G] = p, P[B] = 1 - p • Payoffs: vg, vb with p·vg + (1-p)·vb = 0 • Signals: modeling private but uncertain information • High signal H, suggesting that accepting is good • Low signal L, suggesting that accepting is bad • If G, high signals are more frequent than low signals: P[H|G] = q > ½ and P[L|G] = 1 - q < ½ • If B, high signals are less frequent than low signals: P[L|B] = q > ½ and P[H|B] = 1 - q < ½ • Probability matrix:

           H       L
      G    q       1-q
      B    1-q     q

  20. Individual decisions, general • State of the world: G, B, P[G] = p • Payoffs: vg, vb • Signals: H, L • Individual decision after the first signal:
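The slide's derivation was an image; a reconstruction from the stated model:

```latex
P[G \mid H] \;=\; \frac{P[H \mid G]\,P[G]}{P[H]}
  \;=\; \frac{qp}{qp + (1-q)(1-p)} \;>\; p
\quad\text{since } q > \tfrac{1}{2} > 1-q
```

So after a high signal the expected payoff of accepting, P[G|H]·vg + (1 - P[G|H])·vb, is positive and the person accepts; after a low signal the posterior drops below p, the expected payoff is negative, and rejecting (payoff 0) is better.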

  21. Individual decisions, multiple signals • State of the world: G, B, P[G] = p • Payoffs: vg, vb • Signals: H, L • S: a sequence with a many H signals and b many L signals • Hypotheses to be verified: P[G|S] > P[G] if a > b, P[G|S] < P[G] if a < b, and P[G|S] = P[G] if a = b

  22. Individual decisions, multiple signals • State of the world: G, B, P[G] = p • Payoffs: vg, vb • Signals: H, L • S: a sequence with a many H signals and b many L signals • Because of conditional independence, the signal probabilities multiply: P[S|G] = q^a·(1-q)^b and P[S|B] = (1-q)^a·q^b

  23. Individual decisions, multiple signals • State of the world: G, B, P[G] = p • Payoffs: vg, vb • Signals: H, L • S: a sequence with a many H signals and b many L signals • Looking for P[G|S], using the previous slide and Bayes' rule (written out below)
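Written out, reconstructing the slide's displayed formula from the definitions above:

```latex
P[G \mid S] \;=\; \frac{P[S \mid G]\,P[G]}{P[S]}
  \;=\; \frac{p\,q^{a}(1-q)^{b}}{p\,q^{a}(1-q)^{b} + (1-p)\,(1-q)^{a}q^{b}}
```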

  24. Individual decisions, multiple signals • State of the world: G, B, P[G] = p • Payoffs: vg, vb • S with a many H signals and b many L signals • Comparing P[G|S] with p = P[G]: • If a > b, then q^a·(1-q)^b > (1-q)^a·q^b because q > ½ > 1-q, implying P[G|S] > p = P[G] • If a < b, the inequality reverses, implying P[G|S] < p = P[G] • If a = b, then P[G|S] = p = P[G]
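A quick numerical check of the three cases (function and parameter names are mine):

```python
def p_good_given_signals(a, b, p=0.5, q=2/3):
    """P[G | a high signals and b low signals], by Bayes' rule with
    conditionally independent signals."""
    like_g = q**a * (1 - q)**b   # P[S | G]
    like_b = (1 - q)**a * q**b   # P[S | B]
    return p * like_g / (p * like_g + (1 - p) * like_b)

print(p_good_given_signals(2, 1))  # a > b: 0.666... > 0.5 = prior
print(p_good_given_signals(1, 2))  # a < b: 0.333... < 0.5
print(p_good_given_signals(2, 2))  # a = b: exactly 0.5
```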

  25. Sequential decision making and cascades • Person 1 follows their private signal • Person 2 gets two signals: • a clear one inferred from person 1's choice • their own private one • Person 3 has three independent, clear signals • Person 3 will follow the majority vote among them
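A compact simulation of this sequential rule; a sketch under two assumptions of mine that the slides leave implicit: p = 1/2, and an indifferent person follows their own signal, so outside a cascade every choice reveals the chooser's private signal:

```python
import random

def run_cascade(true_state="G", n=20, q=2/3, seed=1):
    """Sequential decisions in the simple cascade model.
    Each person sees all previous choices plus one private signal."""
    rng = random.Random(seed)
    a = b = 0                    # inferred H / L signals so far
    choices = []
    for _ in range(n):
        own_high = rng.random() < (q if true_state == "G" else 1 - q)
        if abs(a - b) >= 2:      # cascade: majority outweighs any single signal
            accept = a > b
        else:                    # own signal decides; the choice is informative
            accept = own_high
            a, b = (a + 1, b) if own_high else (a, b + 1)
        choices.append("accept" if accept else "reject")
    return choices

print(run_cascade())  # a few informative choices, then everyone herds
```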

  26. Starting a cascade • As soon as the inferred signals of one kind outnumber the other by two, the next person's posterior stays on the same side of ½ regardless of their own signal, so they imitate the majority, and so does everyone after them

  27. Long-term implications • In order not to start a cascade at all, there may never be three identical signals in a row • However, the probability of having three identical signals in a row goes to 1 as there are more and more decisions • For three people in a row, the probability of all three having the same signal is q³ + (1-q)³ • For 3N people, the probability of never having three identical signals in a row is at most (1 - q³ - (1-q)³)^N, which gets as small as you want if you make N large enough
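Plugging in q = 2/3 (the urn value) shows how fast this bound shrinks; a small numeric sketch:

```python
q = 2 / 3
block_uniform = q**3 + (1 - q)**3   # three identical signals in a row
for n_blocks in (1, 5, 10, 50):
    no_uniform_block = (1 - block_uniform) ** n_blocks
    print(f"3N = {3 * n_blocks:3d} people: "
          f"P[no uniform block] <= {no_uniform_block:.4f}")
# 0.6667, 0.1317, 0.0173, ... -> a cascade eventually starts almost surely
```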

  28. Lessons from cascades • Cascades can be wrong: wrong choices made initially because of randomly incorrect signals may start a cascade • Cascades can be based on very little information: people ignore their private information once a cascade starts • Cascades are fragile: as they start with little information, they can also be stopped with little information • E.g., someone receiving two private signals may decide to let them overrule the two other signals that started the cascade
