
TDC 369 / TDC 432


Presentation Transcript


  1. TDC 369 / TDC 432 April 2, 2003 Greg Brewster

  2. Topics • Math Review • Probability • Distributions • Random Variables • Expected Values

  3. Math Review • Simple integrals and differentials • Sums • Permutations • Combinations • Probability

  4. Math Review: Sums

  5. Math Review: Permutations • Given N objects, there are N! = N(N-1)…1 different ways to arrange them • Example: Given 3 balls, colored Red, White and Blue, there are 3! = 6 ways to order them • RWB, RBW, BWR, BRW, WBR, WRB

  6. Math Review: Combinations • The number of ways to select K unique objects from a set of N objects without replacement is C(N,K) = N! / (K! (N-K)!) • Example: Given 3 balls, RWB, there are C(3,2) = 3 ways to uniquely choose 2 balls • RB, RW, BW
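
A minimal Python sketch (standard library only) that reproduces the counts on slides 5 and 6:

```python
# Checks the permutation and combination counts from slides 5 and 6.
import itertools
import math

balls = ["R", "W", "B"]

# Permutations: N! orderings of N objects
perms = list(itertools.permutations(balls))
print(len(perms), math.factorial(len(balls)))   # 6 6
print(["".join(p) for p in perms])              # RWB, RBW, ...

# Combinations: C(N, K) = N! / (K! (N-K)!) ways to choose K of N
combos = list(itertools.combinations(balls, 2))
print(len(combos), math.comb(3, 2))             # 3 3
print(["".join(c) for c in combos])             # RW, RB, WB
```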

  7. Probability • Probability theory is concerned with the likelihood of observable outcomes (“events”) of some experiment. • Let Ω be the set of all outcomes and let E ⊆ Ω be some event in Ω; then the probability of E occurring, Pr[E], is the fraction of times E will occur if the experiment is repeated infinitely often.

  8. Probability • Example: • Experiment = tossing a 6-sided die • Observable outcomes = {1, 2, 3, 4, 5, 6} • For a fair die, • Pr{die = 1} = 1/6 • Pr{die = 2} = 1/6 • Pr{die = 3} = 1/6 • Pr{die = 4} = 1/6 • Pr{die = 5} = 1/6 • Pr{die = 6} = 1/6

  9. Probability Pie

  10. Valid Probability Measure • A probability measure, Pr, on an event space {Ei} must satisfy the following: • For all Ei, 0 <= Pr[Ei] <= 1 • Each pair of events, Ei and Ek (i ≠ k), are mutually exclusive, that is, Pr[Ei and Ek] = 0 • All event probabilities sum to 1, that is, Pr[E1] + Pr[E2] + … = 1

  11. Probability Mass Function [figure: Pr(Die = x)]

  12. Mass Function = Histogram • If you are starting with some repeatable events, then the probability mass function is like a histogram of outcomes for those events. • The difference is that a histogram indicates how many times an event happened (out of some total number of attempts), while a mass function shows the fraction of the time an event happens (number of times / total attempts).

  13. Dice Roll Histogram (1200 attempts) [figure: number of times Die = x]
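
A minimal Python sketch of the histogram-versus-mass-function idea on slides 12-13, using 1200 simulated fair-die rolls (the count shown on slide 13):

```python
# Histogram counts outcomes; the mass function shows fractions of attempts.
import random
from collections import Counter

random.seed(1)                 # fixed seed so the run is repeatable
attempts = 1200
rolls = [random.randint(1, 6) for _ in range(attempts)]

histogram = Counter(rolls)                                 # counts per face
mass = {x: histogram[x] / attempts for x in range(1, 7)}   # fractions per face

for x in range(1, 7):
    print(x, histogram[x], round(mass[x], 3))              # fractions near 1/6
```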

  14. Probability Distribution Function (Cumulative Distribution Function) [figure: Pr(Die <= x)]

  15. Combining Events • Probability of event E not happening: Pr[not E] = 1 - Pr[E] • Probability of both E and F happening: Pr[E and F] = Pr[E] · Pr[F] IF events E and F are independent • Probability of either E or F happening: Pr[E or F] = Pr[E] + Pr[F] - Pr[E and F]
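
A minimal Python check of the combination rules on slide 15 by enumerating two independent fair-die rolls. The events E and F below are chosen here for illustration; they are not from the original slides:

```python
# E = "first die is even", F = "second die is greater than 4"
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs

def pr(event):
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

E = lambda o: o[0] % 2 == 0
F = lambda o: o[1] > 4

print(pr(lambda o: not E(o)), 1 - pr(E))                          # complement rule
print(pr(lambda o: E(o) and F(o)), pr(E) * pr(F))                 # product rule (independent events)
print(pr(lambda o: E(o) or F(o)), pr(E) + pr(F) - pr(E) * pr(F))  # inclusion-exclusion
```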

  16. Conditional Probabilities • The conditional probability that E occurs, given that F occurs, written Pr[E | F], is defined as Pr[E | F] = Pr[E and F] / Pr[F]

  17. Conditional Probabilities • Example: The conditional probability that the value of a die is 6, given that the value is greater than 3, is Pr[die=6 | die>3] = Pr[die=6 and die>3] / Pr[die>3] = (1/6) / (1/2) = 1/3

  18. Probability Pie

  19. Conditional Probability Pie
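
A minimal Python sketch that checks the conditional probability from slide 17, both from the definition and by simulation:

```python
# Pr[die = 6 | die > 3] should come out to 1/3.
import random

# Exact, from the definition Pr[E and F] / Pr[F]
pr_E_and_F = 1 / 6            # die = 6 (which also implies die > 3)
pr_F = 3 / 6                  # die in {4, 5, 6}
print(pr_E_and_F / pr_F)      # 0.333...

# Simulation: keep only the rolls where the condition (die > 3) holds
random.seed(2)
rolls = [random.randint(1, 6) for _ in range(100_000)]
conditioned = [r for r in rolls if r > 3]
print(sum(1 for r in conditioned if r == 6) / len(conditioned))  # near 1/3
```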

  20. Independence • Two events E and F are independent if the probability of E conditioned on F is equal to the unconditional probability of E. That is, Pr[E | F] = Pr[E]. • In other words, the occurrence of F has no effect on the occurrence of E.

  21. Random Variables • A random variable, R, represents the outcome of some random event. Example: R = the roll of a die. • The probability distribution of a random variable, Pr[R], is a probability measure mapping each possible value of R into its associated probability.

  22. Sum of Two Dice • Example: • R = the sum of the values of 2 dice • Probability Distribution: due to independence, Pr[R = k] = Σi Pr[die1 = i] · Pr[die2 = k - i]

  23. Sum of Two Dice

  24. Probability Mass Function: R = Sum of 2 dice [figure: Pr(R = x)]
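
A minimal Python sketch that builds the mass function of R = sum of two dice (slides 22-24) by enumerating the 36 equally likely ordered pairs:

```python
# Tabulates Pr[R = total] for the sum of two fair dice.
from itertools import product
from collections import Counter

counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
for total in range(2, 13):
    print(total, counts[total], counts[total] / 36)   # e.g. Pr[R = 7] = 6/36
```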

  25. Continuous Random Variables • So far, we have only considered discrete random variables, which can take on a countable number of distinct values. • Continuous random variables can take on any real value over some (possibly infinite) range. • Example: R = Inter-packet-arrival times at a router.

  26. Continuous Density Functions • There is no probability mass function for a continuous random variable, since, typically, Pr[R = x] = 0 for any fixed value of x because there are infinitely many possible values for R. • Instead, we can generate density functions by starting with histograms split into small intervals and smoothing them (letting interval size go to zero).

  27. Example: Bus Waiting Time • Example: I arrive at a bus stop at a random time. I know that buses arrive exactly once every 10 minutes. How long do I have to wait? • Answer: My waiting time is uniformly distributed between 0 and 10 minutes. That is, I am equally likely to wait for any time between 0 and 10 minutes

  28. Bus Wait Histogram (2000 attempts, histogram interval = 2 min) [figure: waiting times using 2-minute ‘buckets’]

  29. Bus Wait Histogram (2000 attempts, histogram interval = 1 min) [figure: waiting times using 1-minute ‘buckets’]

  30. Bus Waiting Time: Uniform Density Function

  31. Value for Density Function • The histograms show the shape that the density function should have, but what are the values for the density function? • Answer: The density function must be set so that it integrates to 1. For a wait uniform on 0 to 10 minutes, this means fR(x) = 1/10 for 0 <= x <= 10 (and 0 elsewhere).

  32. Continuous Density Functions • To determine the probability that the random value lies in any interval (a, b), we integrate the density function over that interval. • So, the probability that you wait between 3 and 5 minutes for the bus is 20%: Pr[3 <= R <= 5] = ∫ from 3 to 5 of (1/10) dx = 2/10 = 20%

  33. Cumulative Distribution Function • For every probability density function, fR(x), there is a corresponding cumulative distribution function, FR(x), which gives the probability that the random value is less than or equal to a fixed value, x: FR(x) = Pr[R <= x] = ∫ from -∞ to x of fR(t) dt

  34. Example: Bus Waiting Time • For the bus waiting time described earlier, the cumulative distribution function is FR(x) = 0 for x < 0, FR(x) = x/10 for 0 <= x <= 10, and FR(x) = 1 for x > 10

  35. Bus Waiting Time: Cumulative Distribution Function [figure: Pr(R <= x)]

  36. Cumulative Distribution Functions • The probability that the random value lies in any interval (a, b) can also easily be calculated using the cumulative distribution function: Pr[a < R <= b] = FR(b) - FR(a) • So, the probability that you wait between 3 and 5 minutes for the bus is 20%: FR(5) - FR(3) = 5/10 - 3/10 = 2/10 = 20%
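
A minimal Python sketch of the bus-waiting calculation (slides 27-36): with waits uniform on [0, 10] minutes, Pr[3 <= R <= 5] = 0.2 whether computed from the density, from the CDF, or by simulation:

```python
import random

def F(x):                       # CDF of a Uniform(0, 10) waiting time
    return min(max(x / 10, 0.0), 1.0)

print(F(5) - F(3))              # 0.2, from the CDF
print((5 - 3) * (1 / 10))       # 0.2, the integral of the constant density 1/10

random.seed(3)
waits = [random.uniform(0, 10) for _ in range(100_000)]
print(sum(1 for w in waits if 3 <= w <= 5) / len(waits))   # near 0.2
```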

  37. Expectation • The expected value of a random variable, E[R], is the mean value of that random variable. This may also be called the average value of the random variable.

  38. Calculating E[R] • Discrete R.V.: E[R] = Σx x · Pr[R = x] • Continuous R.V.: E[R] = ∫ x · fR(x) dx

  39. E[R] examples • Expected sum of 2 dice: E[R] = 2 · 3.5 = 7 • Expected bus waiting time: E[R] = ∫ from 0 to 10 of x · (1/10) dx = 5 minutes

  40. Moments • The nth moment of R is defined to be the expected value of R^n • Discrete: E[R^n] = Σx x^n · Pr[R = x] • Continuous: E[R^n] = ∫ x^n · fR(x) dx

  41. Standard Deviation • The standard deviation of R, σ(R), can be defined using the 2nd moment of R: σ(R) = sqrt( E[R^2] - (E[R])^2 )

  42. Coefficient of Variation • The coefficient of variation, CV(R), is a common measure of the variability of R which is independent of the mean value of R: CV(R) = σ(R) / E[R]
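
A minimal Python sketch that computes the quantities from slides 38-42 for a single fair die:

```python
# Mean, second moment, standard deviation, and coefficient of variation
# of one fair die roll, straight from the discrete formulas above.
from math import sqrt

values = range(1, 7)
pmf = {x: 1 / 6 for x in values}

mean = sum(x * pmf[x] for x in values)               # E[R] = 3.5
second_moment = sum(x**2 * pmf[x] for x in values)   # E[R^2] = 91/6
std_dev = sqrt(second_moment - mean**2)              # sigma(R) ~ 1.708
cv = std_dev / mean                                  # CV(R) ~ 0.488

print(mean, second_moment, std_dev, cv)
print(2 * mean)   # expected sum of two dice = 7 (slide 39), by linearity
```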

  43. Coefficient of Variation • The coefficient of variation for the exponential random variable is always equal to 1. • Random variables with CV greater than 1 are sometimes called hyperexponential variables. • Random variables with CV less than 1 are sometimes called hypoexponential variables.

  44. Common Discrete R.V.s: Bernoulli random variable • A Bernoulli random variable w/ parameter p reflects a 2-valued experiment with results of success (R = 1) w/ probability p and failure (R = 0) w/ probability 1 - p

  45. Common Discrete R.V.s: Geometric random variable • A Geometric random variable reflects the number of Bernoulli trials required up to and including the first success: Pr[R = k] = (1 - p)^(k-1) · p

  46. Geometric Mass Function: # Die Rolls until a 6 is rolled [figure: Pr(R = x)]

  47. Geometric Cumulative Function: # Die Rolls until a 6 is rolled [figure: Pr(R <= x)]
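
A minimal Python sketch of the geometric example on slides 45-47: the number of rolls up to and including the first 6, so p = 1/6:

```python
# Geometric pmf Pr[R = k] = (1 - p)^(k - 1) * p and CDF Pr[R <= k].
p = 1 / 6
for k in range(1, 11):
    pmf = (1 - p) ** (k - 1) * p
    cdf = 1 - (1 - p) ** k            # Pr[R <= k]
    print(k, round(pmf, 4), round(cdf, 4))
# The mean of a geometric variable is 1/p, i.e. 6 rolls on average here.
```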

  48. Common Discrete R.V.s: Binomial random variable • A Binomial random variable w/ parameters (n, p) is the number of successes found in a sequence of n Bernoulli trials w/ parameter p: Pr[R = k] = C(n, k) · p^k · (1 - p)^(n-k)

  49. Binomial Mass Function: # 6’s rolled in 12 die rolls [figure: Pr(R = x)]
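
A minimal Python sketch of the binomial example on slides 48-49: the number of 6's in n = 12 rolls, with p = 1/6:

```python
# Binomial pmf Pr[R = k] = C(n, k) * p^k * (1 - p)^(n - k).
from math import comb

n, p = 12, 1 / 6
for k in range(n + 1):
    print(k, round(comb(n, k) * p**k * (1 - p) ** (n - k), 4))
# The probabilities sum to 1 and the mean is n * p = 2 sixes on average.
```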

  50. Common Discrete R.V.s: Poisson random variable • A Poisson random variable w/ parameter λ models the number of arrivals during 1 time unit for a random system whose mean arrival rate is λ arrivals per time unit
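
A minimal Python sketch of the Poisson mass function Pr[R = k] = e^(-λ) · λ^k / k!; the rate λ = 3 arrivals per time unit below is an assumed example value, not one from the original slides:

```python
# Poisson pmf for an assumed example rate lam = 3 arrivals per time unit.
from math import exp, factorial

lam = 3.0
for k in range(11):
    print(k, round(exp(-lam) * lam**k / factorial(k), 4))
# The mean (and also the variance) of a Poisson variable equals lam.
```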
