
Mathematical Statistics

Random Sampling, Point Estimation and Maximum Likelihood. The field of statistical inference consists of those methods used to make decisions or to draw conclusions about a population.



  1. Mathematical Statistics: Random Sampling, Point Estimation and Maximum Likelihood

  2. Statistical Inference
  • The field of statistical inference consists of those methods used to make decisions or to draw conclusions about a population.
  • These methods use the information contained in a sample from the population to draw conclusions.
  • Statistical inference may be divided into two major areas:
    • Parameter Estimation
    • Hypothesis Testing

  3. Sampling
  • If all members of a population are identical, the population is considered to be homogeneous.
  • When individual members of a population differ from each other, the population is considered to be heterogeneous (having significant variation among individuals).

  4. What is Sampling?
  • [Diagram: Population (what you want to talk about) → Sampling Process → Sample (what you actually observe in the data) → Inference back to the Population, all within a Sampling Frame.]
  • Sampling means using data to say something (make an inference), with confidence, about a whole (population) based on the study of only a few (sample).

  5. Estimation of Parameters
  • The sample mean x̄ and the sample variance s² are the two most important parameters of a sample.
  • x̄ measures the central location of the sample values and s² their spread (their variability).
  • A small s² may indicate high quality of production, high accuracy of measurement, etc.
  • Note that x̄ and s² will generally vary from sample to sample taken from the same population.

  6. Point Estimation of Parameters A point estimate of a parameter is a number (a point on the real line: p in the Binomial distribution, µ and σ in the Normal distribution) which is computed from a given sample and serves as an approximation of the unknown exact value of the population parameter.

  7. Interval Estimate
  • A point estimate is a statistic computed from a sample and used to estimate a population parameter.
  • An interval estimate is a range of values, a confidence interval, that is likely to contain the population parameter.

  8. Point Estimates Estimate population parameters … with sample statistics:
  • Mean µ … x̄
  • Proportion p … p̂
  • Variance σ² … s²
  • Difference µ₁ − µ₂ … x̄₁ − x̄₂

  9. Approximation of the Mean: Method of Moments The kth moment of a sample x1, x2, x3, …, xn is mk = (1/n) Σ xjᵏ. The method of moments equates sample moments to the corresponding population moments and solves for the parameters; for the mean this gives µ̂ = m1 = x̄.
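A minimal sketch of the method of moments in Python (the data and the helper name below are illustrative, not from the slides): compute the kth sample moment and match it to the corresponding population moment.

```python
import numpy as np

def sample_moment(x, k):
    """k-th sample moment: (1/n) * sum of x_j**k."""
    x = np.asarray(x, dtype=float)
    return np.mean(x ** k)

# Illustrative data (not from the slides)
x = [2.1, 1.8, 2.4, 2.0, 1.7]

mu_hat = sample_moment(x, 1)                 # first moment -> estimate of the mean (x-bar = 2.0)
var_hat = sample_moment(x, 2) - mu_hat ** 2  # second moment -> moment estimate of the variance
print(mu_hat, var_hat)
```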

  10. Likelihood Function
  • Consider a random variable X whose probability function / density f(x) depends on a single parameter θ.
  • In the discrete case, the probability of observing the n sample values x1, x2, x3, …, xn is l = f(x1) f(x2) f(x3) … f(xn); in the continuous case, the probability that each observation lies in a small interval xj ≤ x ≤ xj + ∆x, j = 1, 2, …, n, is f(x1)∆x · f(x2)∆x ··· f(xn)∆x.
  • Since the f(xj) depend on θ, the function l depends on x1, x2, x3, …, xn (given and fixed) and on θ.

  11. Likelihood Function
  • The likelihood function is l = l(θ) = f(x1; θ) f(x2; θ) … f(xn; θ).
  • A maximum likelihood estimate of θ is a value θ̂ for which l is as large as possible; it serves as an approximation of the unknown value of θ.
  • If l is a differentiable function of θ, a necessary condition for l to have a maximum in the interior of an interval is ∂l/∂θ = 0 (equivalently ∂(ln l)/∂θ = 0, since ln is monotone).
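As a hedged numerical illustration of "choose θ so that l is as large as possible" (Python, made-up data, σ assumed known and equal to 1): evaluate the log-likelihood of a Normal sample on a grid of candidate values of µ and take the maximizer; it lands at the sample mean, as Problem 1 below derives analytically.

```python
import numpy as np

x = np.array([4.2, 5.1, 3.8, 4.9, 4.5])   # illustrative sample

def log_likelihood(mu, x, sigma=1.0):
    # ln l(mu) = sum over j of ln f(x_j; mu) for the Normal density
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2))

grid = np.linspace(3.0, 6.0, 601)
ll = [log_likelihood(m, x) for m in grid]
mu_hat = grid[np.argmax(ll)]
print(mu_hat, x.mean())   # both are approximately 4.5
```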

  12. Problem 1 Find the maximum likelihood estimate for the parameter µ of a Normal distribution with known variance σ² = σ₀².
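One standard route to the solution (a sketch, not necessarily the slides' own): with σ² = σ₀² known, ln l(µ) = −(n/2) ln(2πσ₀²) − (1/(2σ₀²)) Σ (xj − µ)², so ∂(ln l)/∂µ = (1/σ₀²) Σ (xj − µ) = 0 gives Σ xj = nµ, hence µ̂ = x̄, the sample mean.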

  13. Problem 3 Derive the maximum likelihood estimate for the parameter p of a Binomial distribution.
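A sketch of one standard derivation (treating the data as a single observed count x of successes in n trials): l(p) = C(n, x) pˣ (1 − p)ⁿ⁻ˣ, so ln l = const + x ln p + (n − x) ln(1 − p); setting d(ln l)/dp = x/p − (n − x)/(1 − p) = 0 gives x(1 − p) = (n − x)p, hence p̂ = x/n.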

  14. Problem 5 Suppose that 4 sets of 5 trials each were made, and that A happened 2, 1, 4, 4 times in these sets, respectively. Estimate p.
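A hedged worked estimate, assuming the result p̂ = x/n from Problem 3 and pooling the four sets of trials: A occurred 2 + 1 + 4 + 4 = 11 times in 4 · 5 = 20 trials, so p̂ = 11/20 = 0.55.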

  15. Problem 7 Consider X = the number of independent trials until an event A occurs. Show that X has the probability function f(x) = p qˣ⁻¹, x = 1, 2, …, where p is the probability of A in a single trial and q = 1 − p. Find the maximum likelihood estimate for the parameter p corresponding to a single observation x of X.
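One standard derivation sketch: for a single observation x, l(p) = p(1 − p)ˣ⁻¹ and ln l = ln p + (x − 1) ln(1 − p); setting d(ln l)/dp = 1/p − (x − 1)/(1 − p) = 0 gives 1 − p = p(x − 1), i.e. px = 1, so p̂ = 1/x.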

  16. Problem 9 Apply the maximum likelihood method to the Poisson distribution.
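A sketch of the standard calculation: with f(x; µ) = e⁻µ µˣ / x!, the log-likelihood of a sample x1, …, xn is ln l = −nµ + (Σ xj) ln µ − Σ ln(xj!); setting d(ln l)/dµ = −n + (Σ xj)/µ = 0 gives µ̂ = x̄, the sample mean.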

  17. Problem 11 Find the maximum likelihood estimate of θ in the density f(x) = θ e⁻ᶿˣ, if x ≥ 0, and f(x) = 0, if x < 0.
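One standard derivation sketch: for a sample x1, …, xn ≥ 0, l(θ) = θⁿ exp(−θ Σ xj) and ln l = n ln θ − θ Σ xj; setting d(ln l)/dθ = n/θ − Σ xj = 0 gives θ̂ = n / Σ xj = 1/x̄.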

  18. Problem 13 Compute θ̂ from the sample 1.8, 0.4, 0.8, 0.6, 1.4. Graph the sample distribution function F̂(x) and the distribution function F(x), with θ = θ̂, on the same axes. Do they agree reasonably well?
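A minimal Python sketch of this computation, assuming the exponential density of Problem 11 (so θ̂ = 1/x̄ = 1.0 for this sample); the plotting details are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt

sample = np.array([1.8, 0.4, 0.8, 0.6, 1.4])
theta_hat = 1.0 / sample.mean()                 # MLE from Problem 11: 1 / x-bar = 1.0

# Sample (empirical) distribution function: steps of height 1/n at the ordered observations
xs = np.sort(sample)
Fhat = np.arange(1, len(xs) + 1) / len(xs)

grid = np.linspace(0.0, 2.5, 200)
plt.step(xs, Fhat, where="post", label="sample distribution function")
plt.plot(grid, 1.0 - np.exp(-theta_hat * grid), label="F(x) with theta = theta-hat")
plt.xlabel("x")
plt.legend()
plt.show()
```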

  19. 7-2 General Concepts of Point Estimation 7-2.1 Unbiased Estimators Definition
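The definition itself is on the slide image; informally, Θ̂ is an unbiased estimator of θ when E(Θ̂) = θ. A hedged simulation sketch in Python (all numbers illustrative) suggesting that the sample mean is unbiased for µ, while the variance estimator that divides by n (instead of n − 1) is biased low:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 10.0, 2.0, 5, 20_000

means, var_div_n, var_div_n1 = [], [], []
for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    means.append(x.mean())
    var_div_n.append(((x - x.mean()) ** 2).sum() / n)         # divides by n
    var_div_n1.append(((x - x.mean()) ** 2).sum() / (n - 1))  # divides by n - 1

print(np.mean(means))       # close to mu = 10.0      -> x-bar looks unbiased
print(np.mean(var_div_n))   # close to 3.2, not 4.0   -> biased low for sigma^2
print(np.mean(var_div_n1))  # close to sigma^2 = 4.0  -> looks unbiased
```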

  20. 7-2 General Concepts of Point Estimation Example 7-1

  21. 7-2 General Concepts of Point Estimation Example 7-1 (continued)

  22. 7-2 General Concepts of Point Estimation 7-2.3 Variance of a Point Estimator Definition Figure 7-1 The sampling distributions of two unbiased estimators

  23. 7-2 General Concepts of Point Estimation 7-2.3 Variance of a Point Estimator Theorem 7-1

  24. 7-2 General Concepts of Point Estimation 7-2.4 Standard Error: Reporting a Point Estimate Definition

  25. 7-2 General Concepts of Point Estimation 7-2.4 Standard Error: Reporting a Point Estimate
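The standard error of an estimator is the standard deviation of its sampling distribution; for the sample mean it is σ/√n, estimated by s/√n when σ is unknown. A minimal Python sketch with illustrative measurements:

```python
import numpy as np

x = np.array([12.1, 11.8, 12.4, 12.0, 11.7, 12.3])   # illustrative measurements
n = len(x)
s = x.std(ddof=1)             # sample standard deviation
se_xbar = s / np.sqrt(n)      # estimated standard error of the sample mean
print(x.mean(), se_xbar)
```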

  26. 7-2 General Concepts of Point Estimation Example 7-2

  27. 7-2 General Concepts of Point Estimation Example 7-2 (continued)

  28. 7-2 General Concepts of Point Estimation 7-2.6 Mean Square Error of an Estimator Definition

  29. 7-2 General Concepts of Point Estimation 7-2.6 Mean Square Error of an Estimator
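The key relation here is the standard decomposition MSE(Θ̂) = E(Θ̂ − θ)² = V(Θ̂) + [E(Θ̂) − θ]², i.e. variance plus squared bias; it explains why a slightly biased estimator can still have a smaller mean square error than an unbiased one when its variance is much smaller, which is what Figure 7-2 below illustrates.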

  30. 7-2 General Concepts of Point Estimation 7-2.6 Mean Square Error of an Estimator Figure 7-2 A biased estimator that has smaller variance than the unbiased estimator

  31. 7-3 Methods of Point Estimation Definition Definition

  32. 7-3 Methods of Point Estimation Example 7-4

  33. 7-3 Methods of Point Estimation 7-3.2 Method of Maximum Likelihood Definition

  34. 7-3 Methods of Point Estimation Example 7-6

  35. 7-3 Methods of Point Estimation Example 7-6 (continued)

  36. 7-3 Methods of Point Estimation Figure 7-3 Log likelihood for the exponential distribution, using the failure time data. (a) Log likelihood with n = 8 (original data). (b) Log likelihood if n = 8, 20, and 40.

  37. 7-3 Methods of Point Estimation Example 7-9

  38. 7-3 Methods of Point Estimation Example 7-9 (continued)

  39. 7-3 Methods of Point Estimation Properties of the Maximum Likelihood Estimator

  40. 7-3 Methods of Point Estimation The Invariance Property

  41. 7-3 Methods of Point Estimation Example 7-10

  42. 7-3 Methods of Point Estimation Complications in Using Maximum Likelihood Estimation
  • It is not always easy to maximize the likelihood function, because the equation(s) obtained from dL(θ)/dθ = 0 may be difficult to solve.
  • It may not always be possible to use calculus methods directly to determine the maximum of L(θ).
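A hedged Python sketch of the usual workaround, maximizing the log-likelihood numerically; the gamma-shape example and all data below are illustrative, not the textbook's examples.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

# Illustrative data; suppose X has a gamma density with unknown shape r and scale 1:
# f(x) = x**(r - 1) * exp(-x) / Gamma(r)
x = np.array([1.2, 0.7, 2.5, 1.9, 0.4, 1.1, 3.0, 0.9])

def neg_log_likelihood(r):
    return -((r - 1) * np.log(x).sum() - x.sum() - len(x) * gammaln(r))

# dL(r)/dr = 0 has no closed-form solution, so maximize ln L numerically instead
res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 50.0), method="bounded")
print("numerical maximum likelihood estimate of r:", res.x)
```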

  43. 7-3 Methods of Point Estimation Example 7-11

  44. 7-3 Methods of Point Estimation Figure 7-4 The likelihood function for the uniform distribution in Example 7-11.
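Assuming Example 7-11 treats the uniform distribution on [0, a], the likelihood is L(a) = a⁻ⁿ for a ≥ max(xᵢ) and L(a) = 0 otherwise, so L is maximized at the boundary value â = max(xᵢ) rather than at a point where dL/da = 0; this is the situation Figure 7-4 depicts and the reason calculus methods cannot be applied directly.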

  45. 7-4 Sampling Distributions Statistical inference is concerned with making decisions about a population based on the information contained in a random sample from that population. Definition

  46. 7-5 Sampling Distributions of Means Theorem 7-2: The Central Limit Theorem

  47. 7-5 Sampling Distributions of Means Figure 7-6 Distributions of average scores from throwing dice. [Adapted with permission from Box, Hunter, and Hunter (1978).]
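A hedged simulation sketch in Python that mirrors the dice experiment of Figure 7-6 (the sample sizes and repetition count are illustrative): as the number of dice averaged grows, the distribution of the average score becomes approximately normal, as the Central Limit Theorem predicts.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
fig, axes = plt.subplots(1, 3, figsize=(10, 3))

for ax, n in zip(axes, [1, 2, 10]):
    # average score of n fair dice, repeated many times
    means = rng.integers(1, 7, size=(20_000, n)).mean(axis=1)
    ax.hist(means, bins=30, density=True)
    ax.set_title(f"average of {n} dice")

plt.tight_layout()
plt.show()
```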

  48. Example 7-13

  49. 7-5 Sampling Distributions of Means Figure 7-7 Probability for Example 7-13.

  50. 7-5 Sampling Distributions of Means Definition
