

  1. Chapter 1 Introduction to Statistical Methods

  2. Pure math for a while! • The Math of Probability & Statistics. “The true logic of this world is in the calculus of probabilities.” James Clerk Maxwell

  3. Relevance of Probability to Physics • In this course, we’ll discuss The Physics of systems containing HUGE numbers (>> 10^23) of particles: Solids, Liquids, Gases, EM Radiation (photons & other quantum particles), …

  4. Relevance of Probability to Physics: The Challenge • Describe a system’s Macroscopic characteristics starting from a Microscopic theory (model).

  5. GOAL: Formulate a theory to describe a system’s Macroscopic characteristics starting from a Microscopic theory. • Classical Mechanics: Newton’s Laws. We’d need to solve >> 10^23 coupled Newton’s 2nd Law differential equations of motion! (ABSURD!!) • Quantum Mechanics: Schrödinger’s Equation. We’d need a solution for >> 10^23 particles! (ABSURD!!)

  6. Historically, this led to the use of a Statistical description of such a system. So, we’ll talk about Probabilities & Average System Properties. • We AREN’T concerned with the detailed behavior of individual particles.

  7. Definitions: • Microscopic: ~ Atomic dimensions, or ≤ a few Å. • Macroscopic: Large enough to be “visible” in the “ordinary” sense (Optical Microscopes are “ordinary”!)

  8. Definitions: • An Isolated System is in Equilibrium when its Macroscopic parameters are time-independent. • This is the usual case in this course! • But, note! Even if its Macroscopic parameters are time-independent, a system’s Microscopic parameters can & probably will still vary with time!

  9. Now, some Basic (Simple) Math of Probability & Statistics

  10. Random WalkBinomial Distribution Section 1.1 Elementary Statistical Concepts & Examples • Math preliminaries (methods) for a few lectures. To treat statistical physics problems, we must first know something about the mathematics of Probability & Statistics

  11. Probability & Statistics • The following should be a review! (?) • Keep in mind: Whenever we want to describe a situation using probability & statistics, we must consider an assembly of a large number N (in principle, N → ∞) of “similarly prepared systems”.

  12. This assembly is called an ENSEMBLE ← the French word for Assembly! • The Probability of occurrence of a particular event is DEFINED with respect to this particular ensemble & is given by the fraction of systems in the ensemble characterized by the occurrence of this event.

  13. Example • In throwing a pair of dice, we can give a statistical description by considering that a very large number N of similar pairs of dice are thrown under similar circumstances. • Alternatively, we could imagine the same pair of dice thrown N times under similar circumstances. • The probability of obtaining two 1’s is then given by the fraction of these experiments in which the outcome is two 1’s.
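As a minimal sketch of this ensemble idea (assuming Python & only its standard library; the function name is just illustrative), simulate many throws of a pair of dice & take the fraction that come up two 1’s. As the ensemble grows, this fraction approaches 1/36 ≈ 0.028:

```python
import random

def estimate_two_ones(n_trials: int) -> float:
    """Estimate P(two 1's) as the fraction of an ensemble of
    n_trials similarly prepared pair-of-dice throws giving (1, 1)."""
    hits = 0
    for _ in range(n_trials):
        die1 = random.randint(1, 6)
        die2 = random.randint(1, 6)
        if die1 == die2 == 1:
            hits += 1
    return hits / n_trials

print(estimate_two_ones(1_000_000))   # ~0.0278 = 1/36
```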

  14. Note that this probability depends strongly on the type of ensemble to which we are referring. • See Reif’s flower seed example (p. 5). • To quantitatively introduce probability concepts, we use a specific, simple example, which is actually much more general than you might first think. • The example is called “The Random Walk Problem”.

  15. The 1-Dimensional Random Walk • “The most important questions of life are indeed, for the most part, really only problems of probability.” Pierre Simon Laplace, “Théorie Analytique des Probabilités”, 1812

  16. The 1-Dimensional Random Walk • In its simplest, crudest, most idealized form, the random walk problem can be viewed as in the figure. • A story about this is of a drunk who starts out from a lamp post on a street. Obviously, he wants to move down the sidewalk to get somewhere!!

  17. So the drunk starts out from a lamp post on a street. • Each step he takes is of equal length ℓ. He is SO DRUNK that the direction of each step (right or left) is completely independent of the preceding step. • The (assumed known) probability of stepping to the right is p & that of stepping to the left is q = 1 – p. In general, q ≠ p. • The x axis is along the sidewalk & the lamp post is at x = 0. • Each step is of length ℓ, so his location on the x axis must be x = mℓ, where m = a positive or negative integer.

  18. Question: After N steps, what is the probability that the man is at a specific location x = mℓ (m specified)? • To answer, we first consider an ensemble of a large number N of drunk men starting from similar lamp posts!! • Or, repeat this with the same drunk man walking on the sidewalk N times!!
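A hedged sketch of this ensemble in code (Python, standard library only; the step length ℓ is set to 1 & the walker count is a free parameter I chose): simulate many independent N-step walks & tabulate the fraction ending at each x = m:

```python
import random
from collections import Counter

def walk_endpoints(n_walkers: int, n_steps: int, p: float = 0.5) -> Counter:
    """Final positions m (in units of the step length l) for an
    ensemble of n_walkers independent n_steps-step random walks."""
    ends = Counter()
    for _ in range(n_walkers):
        m = sum(1 if random.random() < p else -1 for _ in range(n_steps))
        ends[m] += 1
    return ends

# Fraction of the ensemble found back at the lamp post (m = 0)
# after N = 20 steps with p = q = 1/2:
ends = walk_endpoints(100_000, 20)
print(ends[0] / 100_000)   # ~0.18 (cf. the exact result on slide 37)
```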

  19. This is “easily generalized” to 2 dimensions, as shown schematically in the figure. • The 2-dimensional random walk corresponds to a PHYSICS problem of adding N 2-dimensional vectors of equal length (figure) & random directions & asking: “What is the probability that the resultant has a certain magnitude & a certain direction?”

  20. Physical Examples to which the Random Walk Problem applies: 1. Magnetism (Quantum Treatment) • N atoms, each with magnetic moment μ. Each has spin ½. By Quantum Mechanics, each magnetic moment can point either “up” or “down”. If these are equally likely, what is the Net magnetic moment of the N atoms?

  21. Physical Examples to which the Random Walk Problem applies: 2. Diffusion of a Molecule in a Gas (Classical Treatment) • A molecule travels in 3 dimensions with a mean distance ℓ between collisions. How far is it likely to have traveled after N collisions? Answer using Classical Mechanics. (A quick simulation of this is sketched below.)
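Both physical examples map onto the random walk. As a rough illustrative sketch of the diffusion case (Python, standard library; setting ℓ = 1 & assuming isotropic random directions, both assumptions mine), the RMS distance after N collisions comes out close to ℓ√N:

```python
import math
import random

def rms_distance(n_molecules: int, n_collisions: int) -> float:
    """RMS displacement of n_molecules molecules, each making
    n_collisions straight flights of length 1 in random 3-D directions."""
    total_r2 = 0.0
    for _ in range(n_molecules):
        x = y = z = 0.0
        for _ in range(n_collisions):
            # Isotropic direction: cos(theta) uniform on [-1, 1].
            cos_t = random.uniform(-1.0, 1.0)
            sin_t = math.sqrt(1.0 - cos_t * cos_t)
            phi = random.uniform(0.0, 2.0 * math.pi)
            x += sin_t * math.cos(phi)
            y += sin_t * math.sin(phi)
            z += cos_t
        total_r2 += x * x + y * y + z * z
    return math.sqrt(total_r2 / n_molecules)

# For N = 100 collisions the RMS distance is close to sqrt(100) = 10
# (in units of the mean free path l).
print(rms_distance(5_000, 100))
```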

  22. The Random Walk Problem: • Is a simple example which illustrates some fundamental results of Probability Theory. • The techniques used are Powerful & General. • They are used repeatedly throughout Statistical Mechanics. • So, it’s very important to spend some time on this problem & to understand it!

  23. Section 1.2: 1-Dimensional Random Walk • Forget the drunk, let’s get back to Physics! • Think of a particle moving in 1 dimension in steps of length ℓ, with 1. Probability p of stepping to the right & 2. Probability q = 1 – p of stepping to the left. Simplest case: p = q = ½ • After N steps, the particle is at position x = mℓ (-N ≤ m ≤ N). Let n1 ≡ # of steps to the right (out of N) & let n2 ≡ # of steps to the left.

  24. After N steps, x = mℓ (-N ≤ m ≤ N). • Let n1 ≡ # of steps to the right (out of N). • Let n2 ≡ # of steps to the left. • Clearly, N = n1 + n2 (1) • Clearly also, x ≡ mℓ = (n1 - n2)ℓ, or m = n1 - n2 (2) • Combining (1) & (2) gives: m = 2n1 – N (3) • So, by (3), if N is odd, so is m & if N is even, so is m: m always has the same parity as N.

  25. A Fundamental Assumption is that Successive Steps are Statistically Independent. • Let p ≡ the probability of stepping to the right & q = 1 – p ≡ the probability of stepping to the left. • Since each step is statistically independent, the probability of a given sequence of n1 steps to the right followed by n2 steps to the left is given by multiplying the respective probabilities for each step:

  26. p ≡ Probability of stepping to the right & q = 1 – p ≡ Probability of stepping to the left. • Since each step is statistically independent, the probability of a given sequence of n1 steps to the right followed by n2 steps to the left is given by multiplying the respective probabilities for each step: p·p·p···p (n1 factors) × q·q·q···q (n2 factors) ≡ p^n1 q^n2 • But, also, clearly, there are MANY different possible ways of taking N steps so that n1 are to the right & n2 are to the left!

  27. The # of distinct possibilities is the SAME as the # of distinct ways we can place N objects, n1 of one type & n2 of another type, in N = n1 + n2 places: • 1st place: Can be occupied in any one of N ways. • 2nd place: Can be occupied in any one of N - 1 ways. • 3rd place: Can be occupied in any one of N - 2 ways. • … • (N – 1)th place: Can be occupied in only 2 ways. • Nth place: Can be occupied in only 1 way.

  28. ⇒ All available places can be occupied in N(N-1)(N-2)(N-3)···(3)(2)(1) ≡ N! ways. Here, N! ≡ “N factorial”.

  29. Note, however! • This analysis doesn’t take into account the fact that there are only 2 distinguishable kinds of objects: n1 of the 1st type & n2 of the 2nd type. All n1! possible permutations of the objects of the 1st type among themselves lead to exactly the same arrangement of the N objects. Similarly, all n2! possible permutations of the objects of the 2nd type also lead to exactly the same arrangement. ⇒ So, we need to divide the N! result by n1!n2!

  30. ⇒ So, the # of distinct ways in which N objects can be arranged, with n1 of the 1st type & n2 of the 2nd type, is N!/(n1!n2!). • This is the same as the # of distinct ways of taking N steps, with n1 to the right & n2 to the left.
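A quick check of this count (Python standard library; math.comb(N, n1) evaluates N!/(n1!(N - n1)!) directly), compared against brute-force enumeration of all 2^N step sequences for a small N:

```python
import math
from itertools import product

N, n1 = 6, 2   # small example: 6 steps, 2 of them to the right

# Closed form: N! / (n1! * n2!) with n2 = N - n1.
closed_form = math.comb(N, n1)

# Brute force: count the N-step sequences with exactly n1 'R' steps.
brute_force = sum(1 for seq in product("RL", repeat=N) if seq.count("R") == n1)

print(closed_form, brute_force)   # both print 15
```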

  31. Summary: • The probability W_N(n1) of taking N steps, n1 to the right & n2 (= N - n1) to the left, is: W_N(n1) = [N!/(n1!n2!)] p^n1 q^n2, or W_N(n1) = [N!/(n1!(N – n1)!)] p^n1 (1-p)^(N-n1) • Often, this is written with the binomial coefficient C(N, n1) ≡ N!/(n1!(N – n1)!) as W_N(n1) = C(N, n1) p^n1 q^n2. Remember that q = 1 - p.

  32. q = 1 - p, so W_N(n1) = [N!/(n1!(N – n1)!)] p^n1 (1-p)^(N-n1) • Often, this is written as W_N(n1) = C(N, n1) p^n1 q^n2 • This probability distribution is called the Binomial Distribution, because the Binomial Expansion has the form: (p + q)^N = Σ (from n1 = 0 to N) [N!/(n1!(N – n1)!)] p^n1 q^(N - n1)
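A small sketch of W_N(n1) in code (Python, standard library; the function name is mine), with a normalization check: since q = 1 - p, the binomial expansion above gives Σ W_N(n1) = (p + q)^N = 1:

```python
import math

def W(N: int, n1: int, p: float) -> float:
    """Binomial probability W_N(n1) of n1 right steps out of N."""
    return math.comb(N, n1) * p**n1 * (1 - p) ** (N - n1)

N, p = 20, 0.3   # any p works; q = 1 - p is implicit
total = sum(W(N, n1, p) for n1 in range(N + 1))
print(total)   # 1.0 (up to rounding): the distribution is normalized
```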

  33. We really want the probability P_N(m) that x = mℓ after N steps. This is really the same as W_N(n1) if we change notation: • P_N(m) = W_N(n1). But m = 2n1 – N, so n1 = (½)(N + m) & n2 = N - n1 = (½)(N - m). • So the probability P_N(m) that x = mℓ after N steps is: P_N(m) = {N!/([(½)(N + m)]! [(½)(N - m)]!)} p^[(½)(N+m)] (1-p)^[(½)(N-m)] • For the common case of p = q = ½, this is: P_N(m) = {N!/([(½)(N + m)]! [(½)(N - m)]!)} (½)^N
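The same change of variables in code (a sketch in Python; note n1 = (½)(N + m) must be an integer, so P_N(m) vanishes when m & N have opposite parity):

```python
import math

def P(N: int, m: int, p: float = 0.5) -> float:
    """Probability P_N(m) that x = m*l after N steps: W_N(n1)
    evaluated at n1 = (N + m)/2."""
    if abs(m) > N or (N + m) % 2 != 0:
        return 0.0   # m must lie in [-N, N] with the same parity as N
    n1 = (N + m) // 2
    return math.comb(N, n1) * p**n1 * (1 - p) ** (N - n1)

print(P(4, 0))   # 6/16 = 0.375 for p = q = 1/2
```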

  34. Probability P_N(m) that x = mℓ after N steps: P_N(m) = {N!/([(½)(N + m)]! [(½)(N - m)]!)} p^[(½)(N+m)] (1-p)^[(½)(N-m)] • For the common case of p = q = ½, this is: P_N(m) = {N!/([(½)(N + m)]! [(½)(N - m)]!)} (½)^N

  35. P_N(m) = {N!/([(½)(N + m)]! [(½)(N - m)]!)} p^[(½)(N+m)] (1-p)^[(½)(N-m)] • For the common case of p = q = ½, this is: P_N(m) = {N!/([(½)(N + m)]! [(½)(N - m)]!)} (½)^N • This is the usual form of the Binomial Distribution, which is probably the most elementary (discrete) probability distribution.

  36. As a trivial example, suppose that p = q = ½ & N = 3 steps. • This gives: P_3(m) = {3!/([(½)(3 + m)]! [(½)(3 - m)]!)} (½)^3 • So P_3(3) = P_3(-3) = [3!/(3!0!)](⅛) = ⅛ & P_3(1) = P_3(-1) = [3!/(2!1!)](⅛) = ⅜ • Table of Possible Step Sequences
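The same numbers fall out of directly enumerating all 2^3 = 8 equally likely step sequences, in the spirit of the slide’s table (a sketch in Python; a right step is +1, a left step is -1):

```python
from collections import Counter
from itertools import product

# All 8 equally likely 3-step sequences; m = (# right) - (# left).
counts = Counter(sum(steps) for steps in product((+1, -1), repeat=3))
for m in sorted(counts):
    print(f"P_3({m:+d}) = {counts[m]}/8")
# P_3(-3) = 1/8, P_3(-1) = 3/8, P_3(+1) = 3/8, P_3(+3) = 1/8
```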

  37. Another example: let p = q = ½ & N = 20. • This gives: P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 - m)]!)} (½)^20 • Calculation gives the histogram results in the figure: P_20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10^-7, P_20(0) = [20!/(10!)^2](½)^20 ≈ 1.8 × 10^-1, P_20(10) = [20!/(15!5!)](½)^20 ≈ 1.5 × 10^-2
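These histogram values can be reproduced in a couple of lines (Python, standard library; the helper name is mine):

```python
import math

def P20(m: int) -> float:
    """P_20(m) for p = q = 1/2; nonzero only for even m."""
    return math.comb(20, (20 + m) // 2) * 0.5**20

for m in (20, 10, 0):
    print(f"P_20({m}) = {P20(m):.2g}")
# P_20(20) = 9.5e-07, P_20(10) = 0.015, P_20(0) = 0.18
```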

  38. Notice: The “envelope” of the histogram is a “bell-shaped” curve. The significance of this is that, after N random steps, the probability of the particle being a distance of N steps away from the start is very small, while the probability of it being at or near the origin is relatively large. • For this same case: P_20(m) = {20!/([(½)(20 + m)]! [(½)(20 - m)]!)} (½)^20, P_20(20) = [20!/(20!0!)](½)^20 ≈ 9.5 × 10^-7, P_20(0) = [20!/(10!)^2](½)^20 ≈ 1.8 × 10^-1

  39. “It is remarkable that a science which began with the consideration of games of chance should have become the most important object of human knowledge.” Pierre Simon Laplace, “Théorie Analytique des Probabilités”, 1812
