A Markov Chain Model of Baseball

  1. A Markov Chain Model of Baseball Eric Kuennen, Department of Mathematics, University of Wisconsin Oshkosh, kuennene@uwosh.edu. Used as a project for an undergraduate Stochastic Modeling course. Presented at the Joint Mathematics Meetings, Washington, D.C., January 6, 2009.

  2. Markov Chain Model for Baseball • View an inning of baseball as a stochastic process with 25 possible states. • There are 8 different arrangements of runners on the bases: (bases empty, runner on 1st, runner on 2nd, runner on 3rd, runners on 1st and 2nd, runners on 1st and 3rd, runners on 2nd and 3rd, bases loaded) and three possibilities for the number of outs (0 outs, 1 out, 2 outs), for a total of 24 non-absorbing states. • The 25th state (3 outs) is an absorbing state for the inning.
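
A quick way to see the 24 + 1 count is to enumerate the states directly. A minimal Python sketch (encoding a state as a (1st, 2nd, 3rd, outs) tuple is my choice, not the slide's):

```python
from itertools import product

# Non-absorbing states: which of 1st/2nd/3rd is occupied, and 0-2 outs.
states = [(first, second, third, outs)
          for outs in range(3)
          for first, second, third in product((0, 1), repeat=3)]
print(len(states))  # 24; the absorbing "3 outs" state brings the total to 25
```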

  3. Transition Probabilities • A Markov Chain is a stochastic process in which the next state depends only on the present state: given the present state, future states are independent of past states. • Let Pij denote the probability that the next state is j, given that the current state is i. • Form the Transition Matrix T = [Pij], where w = probability of a walk, s = probability of a single, d = probability of a double, t = probability of a triple, h = probability of a home run, and out = probability of an out.

  4. Transition Matrix
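
The matrix itself is not reproduced in this transcript, but one row can be written out unambiguously. From the state (bases empty, 0 outs), the six events give

\[
P_{(\text{empty},0)\to(\text{1st},0)} = w + s,\quad
P_{(\text{empty},0)\to(\text{2nd},0)} = d,\quad
P_{(\text{empty},0)\to(\text{3rd},0)} = t,\quad
P_{(\text{empty},0)\to(\text{empty},0)} = h,\quad
P_{(\text{empty},0)\to(\text{empty},1)} = \text{out}.
\]

Rows for states with runners on base additionally depend on the baserunner-advancement rules the model assumes.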

  5. Run Matrix
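
The run matrix pairs each transition with the runs that score on it. Under the usual simplification that only the batter is ever put out (an assumption here; the slide's matrix is not shown), the entries follow from conservation of baserunners:

\[
R_{ij} \;=\; \bigl(\text{runners on base in } i\bigr) + 1 - \bigl(\text{runners on base in } j\bigr)
\]

for a plate appearance in which the batter reaches base, and \(R_{ij} = 0\) when the batter is out.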

  6. Methods of Analysis: Theoretical Calculations with Maple • Expected Run Values for each state • Steady State Probability Vector • Expected Value of a given play in a given state or in general

  7. Expected Run Values • Let vi be the expected number of runs scored in the remainder of the inning, starting in state i • Students use Maple’s linear algebra package to solve the resulting linear system for the vector v
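
Written out, this is a standard first-step analysis (the formula does not appear on the slide): with \(R_{ij}\) the run matrix entries and v = 0 in the absorbing three-out state,

\[
v_i \;=\; \sum_{j} P_{ij}\,\bigl(R_{ij} + v_j\bigr),
\]

or in matrix form, with Q the 24 x 24 block of T over the non-absorbing states and \(r_i = \sum_j P_{ij} R_{ij}\),

\[
(I - Q)\,v = r ,
\]

which is the linear system handed to Maple.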

  8. Expected Run Values • From 2005 MLB data: w = .094, s = .157, d = .049, t = .005, h = .029, out = .661
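
A self-contained Python/NumPy sketch of the whole calculation with these 2005 values. The baserunner-advancement rules coded in advance() are my assumptions (the slide's transition matrix is not reproduced above), so the printed numbers will only approximate the expected run values from the talk:

```python
import numpy as np
from itertools import product

# Event probabilities per plate appearance, from the slide (2005 MLB).
probs = {"walk": 0.094, "single": 0.157, "double": 0.049,
         "triple": 0.005, "homer": 0.029, "out": 0.661}

# Non-absorbing states: (runner on 1st?, on 2nd?, on 3rd?, outs).
states = [(b1, b2, b3, o) for o in range(3)
          for b1, b2, b3 in product((0, 1), repeat=3)]
index = {st: i for i, st in enumerate(states)}
END = "3 outs"  # absorbing state

def advance(state, event):
    """Next state and runs scored for one plate appearance.
    Advancement rules are an assumption: on a hit every runner takes
    as many bases as the batter, a walk only forces runners, and an
    out advances nobody."""
    b1, b2, b3, o = state
    if event == "out":
        return (END if o == 2 else (b1, b2, b3, o + 1)), 0
    if event == "walk":
        return (1, b1 | b2, b3 | (b1 & b2), o), b1 & b2 & b3
    if event == "single":
        return (1, b1, b2, o), b3
    if event == "double":
        return (0, 1, b1, o), b2 + b3
    if event == "triple":
        return (0, 0, 1, o), b1 + b2 + b3
    return (0, 0, 0, o), 1 + b1 + b2 + b3  # home run

# Q = transitions among the 24 transient states; r = expected runs per plate appearance.
Q = np.zeros((24, 24))
r = np.zeros(24)
for st, i in index.items():
    for event, p in probs.items():
        nxt, runs = advance(st, event)
        r[i] += p * runs
        if nxt != END:
            Q[i, index[nxt]] += p

# Expected runs for the rest of the inning: (I - Q) v = r  (slide 7).
v = np.linalg.solve(np.eye(24) - Q, r)
print("bases empty, 0 outs  :", round(v[index[(0, 0, 0, 0)]], 3))
print("runner on 1st, 0 outs:", round(v[index[(1, 0, 0, 0)]], 3))
```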

  9. Sacrifice Bunting Is it ever advantageous to sacrifice bunt?
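
One way to frame the question in the expected-runs model of slide 7 (the inequality is my formulation, not the slide's): a successful sacrifice trades an out for one base of advancement, so with a runner on 1st and no outs it produces a runner on 2nd with one out, and (ignoring failed or beaten-out bunts) it gains expected runs only if

\[
v_{\text{2nd, 1 out}} \;>\; v_{\text{1st, 0 outs}} .
\]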

  10. Stealing Bases How successful does a base-stealer need to be, on average, in order for it to be worthwhile to attempt to steal second base with a runner on first and no outs?
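
The break-even point can be read off the expected run values (my notation; the slide poses only the question). A successful steal turns (runner on 1st, 0 outs) into (runner on 2nd, 0 outs); a caught stealing leaves (bases empty, 1 out). With success probability p, attempting the steal gains expected runs only if

\[
p\,v_{\text{2nd},0} + (1-p)\,v_{\text{empty},1} \;\ge\; v_{\text{1st},0}
\qquad\Longleftrightarrow\qquad
p \;\ge\; \frac{v_{\text{1st},0} - v_{\text{empty},1}}{v_{\text{2nd},0} - v_{\text{empty},1}} .
\]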

  11. Methods of Analysis: Experimental Simulations with Minitab • Students write a Minitab macro that uses a random number generator to simulate the step-by-step evolution of the Markov Chain • Large-scale simulations are used to estimate Expected Run Values and perform situational strategy analyses
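
The Minitab macro itself is not reproduced in this transcript. The following Python sketch shows the same idea, reusing the advance() helper, probs dictionary, and END sentinel from the sketch after slide 8 (so it inherits those assumptions):

```python
import random

# Monte Carlo estimate of expected runs per inning: draw one event per
# plate appearance with the given probabilities and follow the chain
# until the absorbing three-out state.
EVENTS, WEIGHTS = zip(*probs.items())

def simulate_inning(start=(0, 0, 0, 0)):
    state, runs = start, 0
    while state != END:
        event = random.choices(EVENTS, weights=WEIGHTS)[0]
        state, scored = advance(state, event)
        runs += scored
    return runs

trials = 100_000
estimate = sum(simulate_inning() for _ in range(trials)) / trials
print(f"estimated expected runs per inning: {estimate:.3f}")
```

Starting the simulation from other states (for example, a runner on first with no outs) gives the situational estimates used for the strategy questions on slides 9, 10, and 13.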

  12. Two Simulated Innings First Inning: 1. Single, 2. Out, 3. Double, 4. Single, 5. Out, 6. Single, 7. Out. Second Inning: 8. Single, 9. Home run, 10. Out, 11. Out, 12. Single, 13. Out.

  13. Sacrificing with the game on the line In the ninth inning, your team needs one run to win or tie, and the first batter reaches first. Should you bunt? Swinging away: mean number of runs scored 0.909, probability of scoring at least one run 0.390. Sacrifice bunting: mean number of runs scored 0.665, probability of scoring at least one run 0.406. The bunt gives up expected runs but slightly improves the chance of scoring the one run needed.

  14. Reference • Sokol, J. S. (2004). “An Intuitive Markov Chain Lesson from Baseball,” INFORMS Transactions on Education, 5, pp. 47-55.

  15. Please contact me for: • Sample Maple Worksheet • Sample Minitab Macro • Project Assignment Handout kuennene@uwosh.edu
