
Multiple Random Variables and Joint Distributions


Presentation Transcript


  1. Multiple Random Variables and Joint Distributions • The conditional dependence between random variables serves as a foundation for time series analysis. • When multiple random variables are related, they are described by their joint distribution and density functions.

  2. Conditional and Joint Probability • Definition of conditional probability • Bayes rule • Simplification when D and E are independent • Partition of the domain into non-overlapping sets D1, D2, D3 with an event E • Larger (total probability) form of Bayes rule
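
The equations this slide refers to, in standard notation:

```latex
P(D \mid E) = \frac{P(D \cap E)}{P(E)}
\qquad\qquad
P(D \mid E) = \frac{P(E \mid D)\,P(D)}{P(E)} \quad \text{(Bayes rule)}
```

If D and E are independent, P(D ∩ E) = P(D)P(E), so P(D | E) = P(D). For a partition D1, D2, D3, … of the domain and an event E, the larger (total probability) form of Bayes rule is:

```latex
P(D_i \mid E) = \frac{P(E \mid D_i)\,P(D_i)}{\sum_j P(E \mid D_j)\,P(D_j)}
```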

  3. Conditional and joint density functions • Conditional density function • Marginal density function • Simplification if X and Y are independent
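
In standard notation, with f(x, y) the joint density:

```latex
f(x \mid y) = \frac{f(x, y)}{f_Y(y)}
\qquad\qquad
f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy
```

If X and Y are independent, f(x, y) = fX(x) fY(y) and f(x | y) = fX(x).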

  4. Marginal Distribution
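
The marginal distribution of X obtained from the joint density:

```latex
F_X(x) = P(X \le x) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f(u, v)\, dv\, du
```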

  5. Conditional Distribution
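
The conditional distribution of X given Y = y:

```latex
F_{X \mid Y}(x \mid y) = \int_{-\infty}^{x} f(u \mid y)\, du
= \frac{1}{f_Y(y)} \int_{-\infty}^{x} f(u, y)\, du
```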

  6. Expectation and moments of multivariate random variables
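
For a function g of jointly distributed X and Y:

```latex
E[g(X, Y)] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy,
\qquad
\mu_X = E[X], \quad \sigma_X^2 = E[(X - \mu_X)^2]
```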

  7. Covariance and Correlation are measures of Linear Dependence
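
In standard notation:

```latex
\operatorname{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E[XY] - \mu_X \mu_Y,
\qquad
\rho = \frac{\operatorname{Cov}(X, Y)}{\sigma_X\, \sigma_Y}, \quad -1 \le \rho \le 1
```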

  8. Mutual Information • Is there a relationship between the two variables plotted? • Correlation, the linear measure of dependence, is 0. A relationship can still exist: for example, Y = X2 with X symmetric about zero has zero correlation despite complete dependence. • How can we quantify that such a relationship exists?

  9. Entropy • Entropy is a measure of randomness. The more random a variable is, the more entropy it has.
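
For a continuous variable with density f(x), the (differential) entropy is:

```latex
H(X) = -\int f(x) \ln f(x)\, dx
```

For a discrete variable with probabilities p_i, H(X) = −Σ p_i ln p_i.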

  10. Mutual Information • Mutual information is a general information theoretic measure of the dependence between two variables. • The amount of information gained about X when Y is learned, and vice versa. • I(X,Y) = 0 if and only if X and Y are independent
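
In terms of the joint and marginal densities:

```latex
I(X, Y) = \int\!\!\int f(x, y)\, \ln \frac{f(x, y)}{f_X(x)\, f_Y(y)}\, dx\, dy \ \ge\ 0
```

Equivalently, I(X, Y) = H(X) + H(Y) − H(X, Y), which is zero exactly when the joint density factors into the marginals.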

  11. Mutual Information Sample Statistic • Requires a Monte Carlo procedure to determine significance (see the sketch below).
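
As an illustration, here is a minimal sketch in Python, assuming a simple histogram (plug-in) estimator of mutual information and a permutation test for the Monte Carlo significance step; the function names and bin count are illustrative choices, not from the original slides.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    # Plug-in estimate of I(X,Y) in nats from a 2-D histogram.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                    # joint probabilities
    px = pxy.sum(axis=1)                     # marginal of X
    py = pxy.sum(axis=0)                     # marginal of Y
    nz = pxy > 0                             # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

def mi_significance(x, y, n_perm=1000, bins=10, seed=0):
    # Monte Carlo permutation test: shuffling y destroys any dependence,
    # giving a null distribution for the MI statistic.
    rng = np.random.default_rng(seed)
    observed = mutual_information(x, y, bins)
    null = np.array([mutual_information(x, rng.permutation(y), bins)
                     for _ in range(n_perm)])
    return observed, (null >= observed).mean()   # statistic, p-value
```

A small p-value indicates the observed mutual information is unlikely under independence.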

  12. The theoretical basis for time series models • A random process is a sequence of random variables indexed in time • A random process is fully described by defining the (infinite) joint probability distribution of the random process at all times
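
That is, the process {Xt} is characterized by:

```latex
F(x_{t_1}, \ldots, x_{t_N}) = P\!\left(X_{t_1} \le x_{t_1}, \ldots, X_{t_N} \le x_{t_N}\right)
\quad \text{for all } N \text{ and all } t_1, \ldots, t_N
```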

  13. Random Processes • A sequence of random variables indexed in time • Infinite joint probability distribution • xt+1 = g(xt, xt−1, …) + random innovation (errors or unknown random inputs)

  14. Classification of Random Quantities

  15. A time series constitutes a possible realization of a random process, which is completely described by the full (infinite) joint probability distribution. Bras, R. L. and I. Rodriguez-Iturbe (1985), Random Functions and Hydrology, Addison-Wesley, Reading, MA, 559 p.

  16. The infinite set of all possible realizations is called the Ensemble. Bras, R. L. and I. Rodriguez-Iturbe, (1985), Random Functions and Hydrology, Addison-Wesley, Reading, MA, 559 p.

  17. Random process properties are formally defined with respect to the ensemble. The first order marginal density function f(x(t)) is the density from which the mean and variance can be evaluated.
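
From the first order marginal density:

```latex
\mu(t) = E[X(t)] = \int x\, f(x(t))\, dx,
\qquad
\sigma^2(t) = \int \big(x - \mu(t)\big)^2 f(x(t))\, dx
```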

  18. Stationarity A strictly stationary stochastic process {xt1, xt2, xt3, …} has the same joint distribution as the shifted series {xt1+h, xt2+h, xt3+h, …} for any given value of h. This applies for all values of N, i.e. for all orders of the joint distribution function.
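
Written for the joint density of any N sampling times:

```latex
f\big(x(t_1), \ldots, x(t_N)\big) = f\big(x(t_1 + h), \ldots, x(t_N + h)\big)
\quad \text{for all } h
```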

  19. Stationarity of a specific order • 1st order: a random process is first-order stationary if its first-order probability density function is unchanged by a shift of the time origin, i.e. f(x(t1)) = f(x(t1+h)) for any value of h. • 2nd order: a random process is second-order stationary if its second-order probability density function is unchanged by a time shift applied to both times, i.e. f(x(t1), x(t2)) = f(x(t1+h), x(t2+h)) for any value of h. The joint distribution is then not a function of the absolute values of t1 and t2 but only of the lag τ = t2 − t1.

  20. Stationarity of moments • First order stationarity: f(x(t1)) = f(x(t2)) for all t1, t2, so the mean and variance are constant in time: µ(t) = µ, σ2(t) = σ2.

  21. Second order density function f(x(t1), x(t2)), from which the second order moments and the correlation are evaluated.
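
In standard notation:

```latex
\operatorname{Cov}\big(X(t_1), X(t_2)\big) = E\big[(X(t_1) - \mu(t_1))(X(t_2) - \mu(t_2))\big],
\qquad
\rho(t_1, t_2) = \frac{\operatorname{Cov}(X(t_1), X(t_2))}{\sigma(t_1)\, \sigma(t_2)}
```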

  22. Second order stationarity • f(x(t1), x(t2)) is not a function of the absolute values of t1 and t2 but only of the lag τ = t2 − t1 • This implies second moment stationarity

  23. Stationarity of the moments (weak or wide sense stationarity) • 2nd moment: a random process is 2nd moment stationary if its first and second moments are not functions of the specific time: mean µ(t) = µ, variance σ2(t) = σ2, and covariance Cov(X(t1), X(t2)) = Cov(X(t1+h), X(t2+h)). This means that the covariance is not a function of the absolute values of t1 and t2 but only of the lag τ = t2 − t1. • A weaker condition than 2nd order stationarity (implied by it) • For a Gaussian process, equivalent to 2nd order stationarity

  24. Periodic Stationarity In hydrology it is common to work with data subject to a seasonal cycle, i.e. data that is formally non-stationary but is stationary once the period is recognized. • Periodic variable x(y,m), y = year, m = month • Periodic first order stationarity: f(x(y1,m)) = f(x(y2,m)) for all y1, y2 and for each m • Periodic second moment stationarity: Cov(X(y,m1), X(y+τ,m2)) = Cov(m1, m2, τ), a function of the months and the lag only

  25. Ergodicity • The definitions given are with respect to the ensemble • It is often possible to observe only one realization • How can statistics be estimated from one realization? • The ergodicity assumption for stationary processes asserts that averaging over the ensemble is equivalent to averaging over a realization
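
For an ergodic stationary process the time average converges to the ensemble mean:

```latex
\lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} x(t)\, dt = E[X(t)] = \mu
```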

  26. Discrete representation • A continuous random process can only be observed at discrete intervals over a finite domain • Zt may be an average from t−1 to t (e.g. rainfall) or an instantaneous measurement at t (e.g. streamflow)

  27. Markov Property • The infinite joint PDF construct is not practical. • A process is Markov of order d if the joint PDF characterizing the dependence structure has dimension no more than d+1, i.e. the conditional distribution of the current value given the entire past depends only on the d most recent values (joint and conditional distributions below). • Assumption of the Markov property is the basis for simulation of time series as sequences of later values conditioned on earlier values
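
The Markov property of order d, as joint and conditional distributions:

```latex
f(x_t \mid x_{t-1}, x_{t-2}, \ldots) = f(x_t \mid x_{t-1}, \ldots, x_{t-d})
= \frac{f(x_t, x_{t-1}, \ldots, x_{t-d})}{f(x_{t-1}, \ldots, x_{t-d})}
```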

  28. Linear approach to time series modeling, e.g. the AR(1) model Xt = φXt−1 + Wt (a simulation sketch follows below) • Model structure and parameters identified to match second moment properties • Skewness accommodated using skewed residuals or a normalizing transformation (e.g. log, Box-Cox) • Seasonality through seasonally varying parameters
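
A minimal simulation sketch of the AR(1) model above, assuming normally distributed innovations Wt; the parameter values and function name are illustrative.

```python
import numpy as np

def simulate_ar1(n, phi=0.7, sigma=1.0, seed=0):
    # Simulate X_t = phi * X_{t-1} + W_t with W_t ~ N(0, sigma^2).
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

series = simulate_ar1(1000)   # one realization of the process
```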

  29. Nonparametric/Nonlinear approach to time series modeling, e.g. NP(1): a multivariate nonparametric density estimated directly from the data, then used to obtain the conditional density • 2nd moments and skewness are inherited from the fitted distribution • Seasonality through a separate distribution for each season • Other variants: the conditional relationship estimated directly using the nearest neighbor method (KNN, sketched below); a local polynomial trend function plus residual
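
A sketch of the KNN variant mentioned above, in the spirit of the k-nearest-neighbor bootstrap: the next value is resampled from the successors of past states similar to the current one. The function name, weighting scheme, and parameters are illustrative assumptions.

```python
import numpy as np

def knn_next(current, series, k=5, rng=None):
    # Resample x_{t+1} conditioned on x_t = current: find the k past
    # values nearest to the current state and pick one of their
    # successors, weighting closer neighbors more heavily.
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:-1] - current)       # distance to every past state
    nearest = np.argsort(dist)[:k]        # indices of k nearest states
    w = 1.0 / np.arange(1, k + 1)
    w /= w.sum()                          # decreasing resampling weights
    pick = rng.choice(nearest, p=w)
    return x[pick + 1]                    # successor of chosen neighbor
```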
