
Power Nineteen



Presentation Transcript


  1. Power Nineteen Econ 240C

  2. Outline • Forecast Sources • Ideas that are transcending • Symbolic Summary

  3. Outline • Forecasting • Federal: Federal Reserve Bank of Philadelphia • State: CA Department of Finance • Local • UCSB: tri-counties • Chapman College: Orange County • UCLA: National, CA

  4. http://www.ucsb-efp.com

  5. Review • 2. Ideas That Are Transcending

  6. Use the Past to Predict the Future • A. Applications • Trend Analysis • linear trend • quadratic trend • exponential trend • ARIMA Models • autoregressive models • moving average models • autoregressive moving average models
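As a concrete illustration of these tools, here is a minimal Python sketch (not from the lecture) that fits a linear trend by OLS and an ARIMA model with statsmodels. The series, sample size, and the (1,0,1) order are made-up stand-ins for whatever series is being forecast.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.arima.model import ARIMA

# Made-up series standing in for whatever series is being forecast
rng = np.random.default_rng(0)
t = np.arange(100)
y = pd.Series(5 + 0.3 * t + rng.normal(scale=2, size=100))

# Trend analysis: linear trend fit by OLS (a quadratic trend would add t**2)
trend_fit = sm.OLS(y, sm.add_constant(t)).fit()

# ARIMA: an autoregressive moving average model for the same series
arima_fit = ARIMA(y, order=(1, 0, 1), trend="ct").fit()

print(trend_fit.params)              # intercept and slope of the linear trend
print(arima_fit.forecast(steps=8))   # use the past to predict the future
```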

  7. Use Assumptions To Cope With Constraints • A. Applications • 1. Limited number of observations: simple exponential smoothing • assume the model: (p, d, q) = (0, 1, 1) • 2. No or insufficient identifying exogenous variables: interpreting VAR impulse response functions • assume the error structure is dominated by one pure error or the other, e.g. assume b1 = 0, then e1 = edcapu
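The smoothing-as-ARIMA(0,1,1) point can be checked directly. The sketch below, with a made-up short series, fits both simple exponential smoothing and ARIMA(0,1,1) in statsmodels; the exact mapping between the smoothing weight and the MA coefficient depends on the sign convention, so treat the comparison as illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

# Short made-up series: few observations, so assume the model (p, d, q) = (0, 1, 1)
rng = np.random.default_rng(1)
y = pd.Series(np.cumsum(rng.normal(size=40)))

ses_fit = SimpleExpSmoothing(y).fit()          # simple exponential smoothing
arima_fit = ARIMA(y, order=(0, 1, 1)).fit()    # the assumed ARIMA(0,1,1)

print(ses_fit.params["smoothing_level"])   # smoothing weight alpha
print(arima_fit.params)                    # MA(1) coefficient (related to alpha)
print(ses_fit.forecast(4).values, arima_fit.forecast(4).values)
```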

  8. Standard VAR (lecture 17) • dcapu(t) = (a1 + b1 a2)/(1 - b1 b2) + [(g11 + b1 g21)/(1 - b1 b2)] dcapu(t-1) + [(g12 + b1 g22)/(1 - b1 b2)] dffr(t-1) + [(d1 + b1 d2)/(1 - b1 b2)] x(t) + (edcapu(t) + b1 edffr(t))/(1 - b1 b2) • But if we assume b1 = 0, then • dcapu(t) = a1 + g11 dcapu(t-1) + g12 dffr(t-1) + d1 x(t) + edcapu(t)
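A hedged sketch of how this reduced-form VAR and its impulse responses might be computed in statsmodels; the data here are random placeholders for dcapu and dffr, and the Cholesky ordering used for the orthogonalized responses is the software counterpart of the b1 = 0 assumption.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Random placeholders for dcapu (change in capacity utilization) and
# dffr (change in the federal funds rate)
rng = np.random.default_rng(2)
data = pd.DataFrame(rng.normal(size=(200, 2)), columns=["dcapu", "dffr"])

var_fit = VAR(data).fit(1)   # standard reduced-form VAR with one lag
irf = var_fit.irf(10)        # impulse responses out ten periods

# The orthogonalized responses use a Cholesky factorization with dcapu ordered
# first: no contemporaneous effect of the dffr shock on dcapu, i.e. b1 = 0.
print(irf.orth_irfs.shape)   # (11, 2, 2): horizon x responding variable x shock
irf.plot(orth=True)
```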

  9. Use Assumptions To Cope With Constraints • A. Applications • 3. No or insufficient identifying exogenous variables: simultaneous equations • assume the error structure is dominated by one error or the other, tracing out the other curve

  10. Simultaneity • There are two relations that show the dependence of price on quantity and vice versa • demand: p = a - b*q + c*y + ep • supply: q = d + e*p + f*w + eq
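A small simulation (not from the lecture, with made-up coefficients) of this two-equation system shows the identification logic: when income y moves the demand curve while the supply shifter w is held fixed, the observed equilibria line up along the supply curve.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500

# Made-up structural coefficients (illustrative only)
a, b, c = 10.0, 0.5, 1.0     # demand: p = a - b*q + c*y + ep
d, e, f = 2.0, 0.8, 0.5      # supply: q = d + e*p + f*w + eq

y = rng.normal(scale=3.0, size=n)     # income shifts the demand curve
w = np.zeros(n)                       # supply shifter held fixed
ep = rng.normal(scale=0.2, size=n)
eq = rng.normal(scale=0.2, size=n)

# Solve the two equations simultaneously for equilibrium quantity and price
q = (d + e * (a + c * y + ep) + f * w + eq) / (1 + e * b)
p = a - b * q + c * y + ep

# Because demand shifts and supply is stable, the scatter of (p, q) lies
# along the supply curve: the fitted slope of q on p is close to e = 0.8.
print(np.polyfit(p, q, 1)[0])
```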

  11. A shift in demand from increased income may trace out, i.e. identify or reveal, the supply curve • [Figure: price-quantity diagram with supply and demand curves]

  12. Review • 2. Ideas That Are Transcending

  13. Reduce the unexplained sum of squares to increase the significance of results • A. Applications • 1. 2-way ANOVA: using a randomized block design • example: minutes of rock music listened to on the radio by teenagers (Lecture 1 Notes, 240C) • we are interested in the variation from day to day • to get better results, we control for variation across teenagers
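A sketch of this randomized block design in Python, with fabricated listening-minutes data: day is the factor of interest and teenager is the block, so the teen-to-teen variation is removed from the error sum of squares.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(4)
days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
teens = [f"teen{i}" for i in range(1, 7)]

# Fabricated minutes of rock music, one observation per (teenager, day) cell
rows = [{"teen": t, "day": d,
         "minutes": 60 + 5 * days.index(d) + 10 * teens.index(t)
                    + rng.normal(scale=4)}
        for t in teens for d in days]
df = pd.DataFrame(rows)

# Randomized block design: blocking on teen sharpens the day-to-day comparison
model = ols("minutes ~ C(day) + C(teen)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```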

  14. Reduce the unexplained sum of squares to increase the significance of results • A. Applications • 2. Distributed lag models: model the dependence of y(t) on a distributed lag of x(t), and model the residual using ARMA
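As a rough illustration (not the lab itself), the following sketch generates a series that depends on current and lagged x with an AR(1) residual, and estimates it with SARIMAX, which handles the exogenous distributed lag and the ARMA error together.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(5)
n = 300
x = pd.Series(rng.normal(size=n), name="x")

# Made-up process: y depends on x at lags 0 and 1, and the residual is AR(1)
resid = np.zeros(n)
for t in range(1, n):
    resid[t] = 0.6 * resid[t - 1] + rng.normal(scale=0.5)
y = 1.0 * x + 0.5 * x.shift(1).fillna(0.0) + resid

# Distributed lag of x as exogenous regressors, ARMA (here AR(1)) residual
exog = pd.concat([x, x.shift(1)], axis=1, keys=["x", "x_lag1"]).fillna(0.0)
fit = SARIMAX(y, exog=exog, order=(1, 0, 0)).fit(disp=False)
print(fit.params)
```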

  15. Lab 7 240 C

  16. Reduce the unexplained sum of squares to increase the significance of results • A. Applications • 3. Intervention Models: model known changes (policy, legal, etc.) by using dummy variables, e.g. a step function or pulse function
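A minimal sketch of an intervention model: a step dummy that switches from 0 to 1 at a hypothetical break date enters as an exogenous variable in an ARIMA-type model. The data, break date, and model order are all made up for illustration; a pulse dummy would be handled the same way.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
n = 120
break_t = 80   # hypothetical date of the known policy or legal change

# Step function: 0 before the intervention, 1 from the intervention onward
step = pd.Series((np.arange(n) >= break_t).astype(float), name="step")

# Made-up series whose level shifts up by about 5 after the intervention
y = pd.Series(0.1 * np.cumsum(rng.normal(size=n)) + 5.0 * step
              + rng.normal(scale=0.5, size=n))

# Intervention model: the step dummy enters as an exogenous variable
fit = SARIMAX(y, exog=step, order=(1, 0, 0)).fit(disp=False)
print(fit.params)   # includes the estimated size of the level shift

# A pulse intervention would instead use: (np.arange(n) == break_t).astype(float)
```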

  17. Lab 8 240 C

  18. Model with no Intervention Variable

  19. Add seasonal difference of differenced step function

  20. Review • Symbolic Summary

  21. Autoregressive Models • AR(t) = b1 AR(t-1) + b2 AR(t-2) + … + bp AR(t-p) + WN(t) • AR(t) - b1 AR(t-1) - b2 AR(t-2) - … - bp AR(t-p) = WN(t) • [1 - b1 Z - b2 Z^2 - … - bp Z^p] AR(t) = WN(t) • B(Z) AR(t) = WN(t) • AR(t) = [1/B(Z)]*WN(t) • [Block diagram: WN(t) → 1/B(Z) → AR(t)]

  22. Moving Average Models • MA(t) = WN(t) + a1 WN(t-1) + a2 WN(t-2) + … + aq WN(t-q) • MA(t) = WN(t) + a1 Z WN(t) + a2 Z^2 WN(t) + … + aq Z^q WN(t) • MA(t) = [1 + a1 Z + a2 Z^2 + … + aq Z^q] WN(t) • MA(t) = A(Z)*WN(t) • [Block diagram: WN(t) → A(Z) → MA(t)]

  23. ARMA Models • ARMA(p,q): ARMA(t) = [Aq(Z)/Bp(Z)]*WN(t) • [Block diagram: WN(t) → A(Z)/B(Z) → ARMA(t)]
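The lag-polynomial notation of slides 21-23 maps directly onto statsmodels' ArmaProcess, which takes the coefficients of B(Z) and A(Z); the sketch below uses made-up coefficients.

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# Made-up coefficients for B(Z) = 1 - b1*Z - b2*Z^2 and A(Z) = 1 + a1*Z
b1, b2 = 0.5, 0.3
a1 = 0.4

# ArmaProcess takes the lag-polynomial coefficients directly, so the AR side
# carries the minus signs: B(Z) ARMA(t) = A(Z) WN(t), i.e. ARMA(t) = [A(Z)/B(Z)] WN(t)
arma = ArmaProcess(ar=np.array([1.0, -b1, -b2]), ma=np.array([1.0, a1]))

print(arma.isstationary, arma.isinvertible)   # checks the roots of B(Z) and A(Z)
sample = arma.generate_sample(nsample=200)    # white noise passed through A(Z)/B(Z)
print(sample[:5])
```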

  24. Distributed Lag Models • y(t) = h0 x(t) + h1 x(t-1) + … + hn x(t-n) + resid(t) • y(t) = h0 x(t) + h1 Z x(t) + … + hn Z^n x(t) + resid(t) • y(t) = [h0 + h1 Z + … + hn Z^n] x(t) + resid(t) • y(t) = h(Z)*x(t) + resid(t) • note x(t) = [Ax(Z)/Bx(Z)]*WNx(t), or [Bx(Z)/Ax(Z)]*x(t) = WNx(t), so • [Bx(Z)/Ax(Z)]*y(t) = h(Z)*[Bx(Z)/Ax(Z)]*x(t) + [Bx(Z)/Ax(Z)]*resid(t), or • w(t) = h(Z)*WNx(t) + resid*(t)

  25. Distributed Lag Models • where w(t) = [Bx(Z)/Ax(Z)]*y(t) • and resid*(t) = [Bx(Z)/Ax(Z)]*resid(t) • cross-correlation of the orthogonal WNx(t) with w(t) will reveal the number of lags n in h(Z), and the signs of the parameters h0, h1, etc., for modeling the regression of w(t) on a distributed lag of the residual, WNx(t), from the ARMA model for x(t)
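A hedged sketch of this prewhitening recipe with simulated data: fit an ARMA model to x, take its residuals as the estimate of WNx(t), filter y by the same Bx(Z)/Ax(Z) to obtain w(t), and inspect the early cross-correlations. The lag convention of ccf should be checked before reading off signs.

```python
import numpy as np
from scipy.signal import lfilter
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import ccf

rng = np.random.default_rng(7)
n = 500

# Made-up input x with ARMA(1,1) structure; y depends on x at lags 0 and 1
x = lfilter([1.0, 0.4], [1.0, -0.7], rng.normal(size=n))
y = 2.0 * x + 1.0 * np.concatenate(([0.0], x[:-1])) + rng.normal(scale=0.5, size=n)

# Fit an ARMA model to x; its residuals estimate the orthogonal WNx(t)
x_fit = ARIMA(x, order=(1, 0, 1)).fit()
wnx = x_fit.resid

# Filter y by the same Bx(Z)/Ax(Z) to obtain w(t)
b_poly = np.r_[1.0, -x_fit.arparams]   # Bx(Z)
a_poly = np.r_[1.0, x_fit.maparams]    # Ax(Z)
w = lfilter(b_poly, a_poly, y)

# The early cross-correlations suggest the lag length n and the signs of h0, h1, ...
print(ccf(w, wnx)[:6])
```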

  26. Distributed Lag Model • [Block diagram: X(t) → h(Z) → (+) → Y(t), with Residual(t) feeding into the sum] • Remember to model the residual!

  27. VAR Model • Y1(t) = h1(Z) Y1(t) + h2(Z) Y2(t) + e1(t) • Y2(t) = h3(Z) Y1(t) + h4(Z) Y2(t) + e2(t) • [Block diagram: Y1(t) → h1(Z) and Y2(t) → h2(Z), summed with e1(t) to give Y1(t); a similar schematic holds for Y2(t)] • Note: e1(t) and e2(t) are each compound errors, i.e. composed of the pure shock ey1 to Y1 and the pure shock ey2 to Y2

  28. Crime in California

  29. 1952-2004

  30. Use the California Experience • Crime rates have fallen. Why haven't imprisonment rates? • Apply the conceptual tools • Criminal justice system schematic • crime control technology

  31. Schematic of the Criminal Justice System • [Schematic labels: Coordinating the CJS; “The Driving Force”; Causes?; Weak Link; Crime Generation; Offense Rate Per Capita; Expected Cost of Punishment (detention, deterrence); Expenditures; Crime Control]

  32. Jobs and Crime

  33. Model Schematic • Causality: California Misery Index • Crime Generation: California Index Offenses Per Capita • Crime Control: California Prisoners Per Capita

  34. CA Crime Index Per Capita (t) = 0.039 + 0.00034*Misery Index (t) - 3.701*Prisoners Per Capita (t) + e(t), where e(t) = 0.954*e(t-1) + wn(t)

  35. Ln CA Crime Index Per Capita (t) = -5.25 + 0.17*ln Misery Index (t) - 0.22*ln Prisoners Per Capita (t) + e(t), where e(t) = 0.93*e(t-1) + wn(t)
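Neither the California data nor the original estimation output is reproduced here; the sketch below only shows, with a hypothetical file name and column names, how a regression with exogenous regressors and an AR(1) error of this form can be estimated in Python.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical file and column names standing in for the annual California series
df = pd.read_csv("california_crime.csv")

# Crime index per capita regressed on the misery index and prisoners per
# capita, with a constant and an AR(1) error, matching the form above
fit = SARIMAX(df["crime_per_capita"],
              exog=df[["misery_index", "prisoners_per_capita"]],
              order=(1, 0, 0), trend="c").fit(disp=False)
print(fit.summary())
```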
