
Scrubbing Query Results from Probabilistic Databases



Presentation Transcript


  1. Scrubbing Query Results from Probabilistic Databases. Jianwen Chen, Ling Feng, Wenwei Xue

  2. A skeleton of scrubbing probabilistic database query results

  3. Three probabilistic relation examples

  4. Query 1: look for the year(s) in which at least one movie was liked by people from northern regions. The user gets the following answer from the probabilistic database:
  User: Where is the probability derived from?
  System: It is based on two assumptions: Pr(x4) = 0.9 and Pr(x5) = 0.2.
  User: I think the movie with MovieID = 4 is not actually liked by people from northern regions. Pr(x4) should be 0.1, not 0.9!
  System: The new probability is 0.28!
  Two questions arise: How to identify the top-k uncertain assumptions for user clarification? How to recompute the probability?

  5. Before scrubbing: Pr(ee) = Pr(x4 ∨ x5) = Pr(x4) + Pr(x5) − Pr(x4) * Pr(x5) = 0.9 + 0.2 − 0.9 * 0.2 = 0.92.
  After the top-ranked assumption is clarified and Pr(x4) is changed to 0.1: Pr(ee) = Pr(x4 ∨ x5) = Pr(x4) + Pr(x5) − Pr(x4) * Pr(x5) = 0.1 + 0.2 − 0.1 * 0.2 = 0.28.
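A minimal sketch of this recomputation, assuming x4 and x5 are independent basic events; the function name prob_or is an illustrative choice, not from the paper:

```python
def prob_or(p1: float, p2: float) -> float:
    """Probability of the disjunction of two independent events."""
    return p1 + p2 - p1 * p2

# Original assumptions from the slide.
pr_x4, pr_x5 = 0.9, 0.2
print(round(prob_or(pr_x4, pr_x5), 2))   # 0.92

# After the user scrubs the assumption on x4.
pr_x4 = 0.1
print(round(prob_or(pr_x4, pr_x5), 2))   # 0.28
```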

  6. Basic algorithm to compute top-k assumptions. For an event expression ee, to compute its probability Pr(ee), one can first convert it into an equivalent disjunctive normal form and then apply the inclusion-exclusion formula.
  Disjunctive normal form: ee = C1 ∨ C2 ∨ … ∨ Cm, where each conjunct Ci = ei1 ∧ ei2 ∧ … ∧ eisi, with m ≥ 1 and s1, s2, …, sm ≥ 1.
  Inclusion-exclusion formula: Pr(ee) = Σ_i Pr(Ci) − Σ_{i<j} Pr(Ci ∧ Cj) + Σ_{i<j<k} Pr(Ci ∧ Cj ∧ Ck) − … + (−1)^(m+1) Pr(C1 ∧ C2 ∧ … ∧ Cm).

  7. Basic algorithm to compute top-k assumptions. To measure how strongly a basic event ei influences the result, one can rewrite Pr(ee) as Pr(ee) = α * Pr(ei) + β, where α and β are two sub-expressions that do not depend on Pr(ei); the coefficient α captures the influence of ei, and the assumptions with the largest influence form the top-k candidates for clarification. The time complexity is O(2^m), where m is the number of conjuncts in the disjunctive normal form of ee.
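A rough Python sketch of this basic algorithm under the assumption that the basic events are independent. The DNF is represented as a list of conjuncts, each mapping an event name to True (the event itself) or False (its negation); the helper names and the trick of obtaining α by recomputing Pr(ee) at Pr(ei) = 1 and Pr(ei) = 0 are illustrative choices, not the paper's code:

```python
from itertools import combinations

def prob_conjunction(conjuncts, probs):
    """Pr of the AND of several conjuncts, for independent basic events.
    Returns 0 if some event is required both positively and negatively."""
    literals = {}
    for c in conjuncts:
        for e, positive in c.items():
            if e in literals and literals[e] != positive:
                return 0.0                      # contradictory literals
            literals[e] = positive
    p = 1.0
    for e, positive in literals.items():
        p *= probs[e] if positive else 1.0 - probs[e]
    return p

def prob_dnf(dnf, probs):
    """Inclusion-exclusion over the m conjuncts: enumerates all 2^m - 1
    non-empty subsets, matching the O(2^m) complexity noted above."""
    total = 0.0
    for r in range(1, len(dnf) + 1):
        for subset in combinations(dnf, r):
            total += (-1) ** (r + 1) * prob_conjunction(subset, probs)
    return total

def top_k_assumptions(dnf, probs, k):
    """Rank basic events by |alpha|, where Pr(ee) = alpha*Pr(ei) + beta.
    Pr(ee) is linear in each Pr(ei), so alpha = Pr(ee)|Pr(ei)=1 - Pr(ee)|Pr(ei)=0."""
    influence = {}
    for e in probs:
        hi = prob_dnf(dnf, {**probs, e: 1.0})
        lo = prob_dnf(dnf, {**probs, e: 0.0})
        influence[e] = abs(hi - lo)
    return sorted(influence, key=influence.get, reverse=True)[:k]

# Query 1 from the slides: ee = x4 ∨ x5.
dnf = [{"x4": True}, {"x5": True}]
probs = {"x4": 0.9, "x5": 0.2}
print(round(prob_dnf(dnf, probs), 2))    # 0.92
print(top_k_assumptions(dnf, probs, 1))  # ['x4']
```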

  8. Optimization. We restrict the event expression ee to the situation where the basic events e1, e2, …, en are independent and, moreover, do not occur repeatedly in ee. This holds for most queries (80% of the TPC-H queries) by using the well-researched optimization technique adopted in Dalvi, N., Suciu, D.: Efficient query evaluation on probabilistic databases. VLDB Journal 16(4) (2007) 523–544.
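As a quick illustration of why non-repetition matters (an example of ours, not from the slides): if a basic event appears twice, the simple composition rules used on slide 11 no longer apply directly. For ee = e1 ∨ (e1 ∧ e2) with Pr(e1) = 0.5 and Pr(e2) = 0.5, the true probability is Pr(e1) = 0.5, but blindly applying Pr(ee1 ∨ ee2) = Pr(ee1) + Pr(ee2) − Pr(ee1) * Pr(ee2) to the two disjuncts as if they were independent would give 0.5 + 0.25 − 0.5 * 0.25 = 0.625.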

  9. Three probabilistic relation examples

  10. Query 2: look for the year(s) in which at least one movie was liked by people from northern regions but not by people from southern regions. The user gets the following answer from the uncertain database:

  11. ee = (e1 ∧ ~e2) ∨ (e3 ∧ ~e4) ∨ (e5 ∧ ~e6), with Pr(e1) = 0.2, Pr(e2) = 0.7, Pr(e3) = 0.1, Pr(e4) = 0.9, Pr(e5) = 0.7, Pr(e6) = 0.2. What is Pr(ee)?
  Since the basic events are independent and none occurs repeatedly, the probability can be composed bottom-up:
  Pr(~ee) = 1 − Pr(ee)
  Pr(ee1 ∧ ee2) = Pr(ee1) * Pr(ee2)
  Pr(ee1 ∨ ee2) = Pr(ee1) + Pr(ee2) − Pr(ee1) * Pr(ee2)
  Thus Pr(ee) = f(Pr(e1), Pr(e2), …, Pr(e6)).
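A compact sketch of this bottom-up evaluation over the expression tree, assuming independent, non-repeating basic events; the Node class and helper names are illustrative, not the paper's:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """A node of the event-expression tree: 'leaf', 'not', 'and', or 'or'."""
    kind: str
    prob: float = 0.0                 # used only by leaves
    left: Optional["Node"] = None
    right: Optional["Node"] = None    # unused by 'not' nodes

def pr(n: Node) -> float:
    """Pr(ee(N)), computed bottom-up with the independence rules above."""
    if n.kind == "leaf":
        return n.prob
    if n.kind == "not":
        return 1.0 - pr(n.left)
    if n.kind == "and":
        return pr(n.left) * pr(n.right)
    if n.kind == "or":
        l, r = pr(n.left), pr(n.right)
        return l + r - l * r
    raise ValueError(n.kind)

def leaf(p): return Node("leaf", prob=p)
def nott(a): return Node("not", left=a)
def andd(a, b): return Node("and", left=a, right=b)
def orr(a, b): return Node("or", left=a, right=b)

# ee = (e1 ∧ ~e2) ∨ (e3 ∧ ~e4) ∨ (e5 ∧ ~e6) from Query 2.
ee = orr(orr(andd(leaf(0.2), nott(leaf(0.7))),
             andd(leaf(0.1), nott(leaf(0.9)))),
         andd(leaf(0.7), nott(leaf(0.2))))
print(round(pr(ee), 4))   # 0.5905, i.e. the ~0.591 shown on the next slides
```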

  12. Evaluating the expression tree for (e1 ∧ ~e2) ∨ (e3 ∧ ~e4) ∨ (e5 ∧ ~e6) bottom-up, with Pr(e1) = 0.2, Pr(e2) = 0.7, Pr(e3) = 0.1, Pr(e4) = 0.9, Pr(e5) = 0.7, Pr(e6) = 0.2:
  ~e2: Pr(ee(N)) = 1 − Pr(ee(leftChild(N))) = 1 − 0.7 = 0.3
  ~e4: Pr(ee(N)) = 1 − Pr(ee(leftChild(N))) = 1 − 0.9 = 0.1
  ~e6: Pr(ee(N)) = 1 − Pr(ee(leftChild(N))) = 1 − 0.2 = 0.8
  e1 ∧ ~e2: Pr(ee(N)) = Pr(ee(leftChild(N))) * Pr(ee(rightChild(N))) = 0.2 * 0.3 = 0.06
  e3 ∧ ~e4: Pr(ee(N)) = Pr(ee(leftChild(N))) * Pr(ee(rightChild(N))) = 0.1 * 0.1 = 0.01
  e5 ∧ ~e6: Pr(ee(N)) = Pr(ee(leftChild(N))) * Pr(ee(rightChild(N))) = 0.7 * 0.8 = 0.56
  (e1 ∧ ~e2) ∨ (e3 ∧ ~e4): Pr(ee(N)) = Pr(ee(leftChild(N))) + Pr(ee(rightChild(N))) − Pr(ee(leftChild(N))) * Pr(ee(rightChild(N))) = 0.06 + 0.01 − 0.06 * 0.01 = 0.0694
  Root: Pr(ee(N)) = Pr(ee(leftChild(N))) + Pr(ee(rightChild(N))) − Pr(ee(leftChild(N))) * Pr(ee(rightChild(N))) = 0.0694 + 0.56 − 0.0694 * 0.56 = 0.591

  13. (Same expression tree as slide 12, with all node probabilities annotated.)

  14. (Expression tree with the internal-node probabilities only; leaf probabilities omitted.)

  15. (Expression tree with only the conjunction and disjunction nodes annotated: 0.06, 0.01, 0.56, 0.0694, 0.591.)

  16. (Expression tree with only the two disjunction nodes annotated: 0.0694 and 0.591.)

  17. (Expression tree with only the root annotated: Pr(ee) = 0.591.)

  18. (The bare expression tree for (e1 ∧ ~e2) ∨ (e3 ∧ ~e4) ∨ (e5 ∧ ~e6).)

  19. Second Optimization

  20. Top-2 assumptions for (e1 ∧ ~e2) ∨ (e3 ∧ ~e4) ∨ (e5 ∧ ~e6)

  21. Scrub the query result: recompute Pr((e1 ∧ ~e2) ∨ (e3 ∧ ~e4) ∨ (e5 ∧ ~e6)) with the modified Pr(e2) and Pr(e5).
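A small illustration of this recomputation using the independence rules from slide 11. The corrected values Pr(e2) = 0.2 and Pr(e5) = 0.9 below are hypothetical user inputs for the sake of the example; the slides do not give the corrected values:

```python
def prob_query2(p1, p2, p3, p4, p5, p6):
    """Pr((e1∧~e2) ∨ (e3∧~e4) ∨ (e5∧~e6)) for independent basic events."""
    a = p1 * (1 - p2)          # Pr(e1 ∧ ~e2)
    b = p3 * (1 - p4)          # Pr(e3 ∧ ~e4)
    c = p5 * (1 - p6)          # Pr(e5 ∧ ~e6)
    return 1 - (1 - a) * (1 - b) * (1 - c)   # disjunction of the three conjuncts

# Original assumptions (slide 11).
print(round(prob_query2(0.2, 0.7, 0.1, 0.9, 0.7, 0.2), 3))   # 0.591

# Hypothetical corrections to the top-2 assumptions Pr(e2) and Pr(e5).
print(round(prob_query2(0.2, 0.2, 0.1, 0.9, 0.9, 0.2), 3))   # ~0.767 with these values
```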

  22. Performance Study

  23. Performance Study

  24. Conclusion
