Multiple Random Variables and Joint Distributions • The conditional dependence between random variables serves as a foundation for time series analysis. • When multiple random variables are related, they are described by their joint distribution and density functions.
Conditional and Joint Probability

Definition: P(D|E) = P(D∩E) / P(E)

Bayes Rule: P(D|E) = P(E|D) P(D) / P(E)

If D and E are independent: P(D∩E) = P(D) P(E), so P(D|E) = P(D)

Partition of the domain into non-overlapping sets D1, D2, D3, with event E overlapping the partition. Larger form of Bayes Rule:

P(Di|E) = P(E|Di) P(Di) / [ P(E|D1) P(D1) + P(E|D2) P(D2) + P(E|D3) P(D3) ]
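As a quick worked example with hypothetical numbers (not from the original slide): suppose P(D1) = 0.5, P(D2) = 0.3, P(D3) = 0.2 and P(E|D1) = 0.2, P(E|D2) = 0.5, P(E|D3) = 0.4. The larger form of Bayes Rule then gives

P(D1|E) = (0.2)(0.5) / [(0.2)(0.5) + (0.5)(0.3) + (0.4)(0.2)] = 0.10 / 0.33 ≈ 0.30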
Conditional and joint density functions

Conditional density function: f(x|y) = f(x,y) / f(y)

Marginal density function: f(x) = ∫ f(x,y) dy

If X and Y are independent: f(x,y) = f(x) f(y)
Covariance and Correlation are measures of Linear Dependence
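For two random variables X and Y, the standard definitions are

Cov(X,Y) = E[(X − µX)(Y − µY)]
ρ(X,Y) = Cov(X,Y) / (σX σY), with −1 ≤ ρ ≤ 1

Both are zero when X and Y are linearly unrelated, even if a nonlinear relationship exists.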
Mutual Information • Is there a relationship between the two variables plotted? • Correlation, the linear measure of dependence, is 0. • How can we quantify that a relationship exists?
Entropy • Entropy is a measure of randomness: the more random a variable is, the more entropy it has.
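For a continuous random variable with density f(x), entropy is defined as

H(X) = −∫ f(x) ln f(x) dx

(for a discrete variable, H(X) = −Σ p(x) ln p(x)).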
Mutual Information • Mutual information is a general information theoretic measure of the dependence between two variables. • The amount of information gained about X when Y is learned, and vice versa. • I(X,Y) = 0 if and only if X and Y are independent
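Formally, mutual information is the expected logarithm of the ratio of the joint density to the product of the marginals:

I(X,Y) = ∫∫ f(x,y) ln[ f(x,y) / (f(x) f(y)) ] dx dy = H(X) + H(Y) − H(X,Y)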
Mutual Information Sample Statistic • A plug-in sample statistic replaces the densities with estimates from the data: Î(X,Y) = (1/n) Σᵢ ln[ f̂(xᵢ,yᵢ) / (f̂(xᵢ) f̂(yᵢ)) ] • Requires a Monte Carlo procedure to determine significance (see later).
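A minimal sketch of the sample statistic and its Monte Carlo significance test, assuming a simple histogram (plug-in) estimator and a permutation test; the bin count, function names, and example data are illustrative, not from the source:

```python
# Sketch: plug-in estimate of mutual information from a 2-D histogram,
# with a Monte Carlo permutation test for significance.
import numpy as np

def mutual_information(x, y, bins=10):
    """Histogram (plug-in) estimate of I(X,Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                  # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)    # marginal of X
    py = pxy.sum(axis=0, keepdims=True)    # marginal of Y
    nz = pxy > 0                           # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def mi_significance(x, y, n_perm=1000, seed=0):
    """Monte Carlo permutation test: shuffle y to destroy dependence."""
    rng = np.random.default_rng(seed)
    observed = mutual_information(x, y)
    null = [mutual_information(x, rng.permutation(y)) for _ in range(n_perm)]
    p_value = (np.sum(np.array(null) >= observed) + 1) / (n_perm + 1)
    return observed, p_value

# Example: a nonlinear (quadratic) relationship with near-zero correlation
rng = np.random.default_rng(42)
x = rng.normal(size=500)
y = x**2 + 0.5 * rng.normal(size=500)
mi, p = mi_significance(x, y)
print(f"I(X,Y) = {mi:.3f} nats, permutation p-value = {p:.3f}")
```

Shuffling y destroys any dependence while preserving the marginals, so the permuted values trace out the null distribution of the statistic.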
The theoretical basis for time series models • A random process is a sequence of random variables indexed in time • A random process is fully described by the (infinite) joint probability distribution of its values at all times
Random Processes
• A sequence of random variables indexed in time
• Infinite joint probability distribution

Xt+1 = g(Xt, Xt−1, …) + random innovation (errors or unknown random inputs)
A time series constitutes one possible realization of a random process, which is completely described by the full (infinite) joint probability distribution. Bras, R. L. and I. Rodriguez-Iturbe (1985), Random Functions and Hydrology, Addison-Wesley, Reading, MA, 559 p.
The infinite set of all possible realizations is called the ensemble.
Random process properties are formally defined with respect to the ensemble. The first-order marginal density function f(x(t)) describes X(t) across the ensemble; from it the mean and variance can be evaluated:

µ(t) = ∫ x f(x(t)) dx
σ²(t) = ∫ (x − µ(t))² f(x(t)) dx
Stationarity

A strictly stationary stochastic process {X(t1), X(t2), X(t3), …} has the same joint distribution as the shifted series {X(t1+h), X(t2+h), X(t3+h), …} for any value of h. This holds for any number N of time points, i.e. for all orders of the joint distribution function.
Stationarity of a specific order

• 1st Order. A random process is first-order stationary if its first-order probability density function is unchanged by a shift of the time origin:

f(x(t1)) =d f(x(t1+h)) for any value of h (=d denotes equality in distribution)

• 2nd Order. A random process is second-order stationary if its second-order probability density function is unchanged when the same time shift is applied to both times:

f(x(t1), x(t2)) =d f(x(t1+h), x(t2+h)) for any value of h

This means that the joint distribution is not a function of the absolute values of t1 and t2 but only of the lag τ = t2 − t1.
Stationarity of moments

First-order stationarity, f(x(t1)) =d f(x(t2)) for all t1, t2, implies that the moments of the first-order density, the mean and the variance, are constant in time.
Second-order density function: f(x(t1), x(t2))

Second-order moments: Cov(X(t1), X(t2)) = E[(X(t1) − µ(t1))(X(t2) − µ(t2))]

Correlation: ρ(t1, t2) = Cov(X(t1), X(t2)) / (σ(t1) σ(t2))
Second-order stationarity: f(x(t1), x(t2)) is not a function of the absolute values of t1 and t2 but only of the lag τ = t2 − t1. This implies second-moment stationarity: Cov(X(t1), X(t2)) depends only on τ.
Stationarity of the moments (weak or wide-sense stationarity)

2nd Moment. A random process is 2nd-moment stationary if its first and second moments are not functions of the specific time:

mean: µ(t) = µ
variance: σ²(t) = σ²
covariance: Cov(X(t1), X(t2)) = Cov(X(t1+h), X(t2+h))

This means that the covariance is not a function of the absolute values of t1 and t2 but only of the lag τ = t2 − t1.

• Implied by (weaker than) 2nd-order stationarity
• For a Gaussian process, equivalent to 2nd-order stationarity
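As a concrete example (anticipating the AR(1) model introduced later): for Xt = φXt−1 + Wt with |φ| < 1 and Wt white noise with variance σw², the process is 2nd-moment stationary with

µ = 0,  σ² = σw² / (1 − φ²),  Cov(Xt, Xt+τ) = σ² φ^|τ|

so the covariance indeed depends only on the lag τ.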
Periodic Stationarity

In hydrology it is common to work with data subject to a seasonal cycle, i.e. data that is formally non-stationary but is stationary once the period is recognized. Periodic variable X(y,m), with y = year and m = month.

Periodic first-order stationarity: f(x(y1,m)) =d f(x(y2,m)) for all y1, y2, for each m

Periodic second-moment stationarity: Cov(X(y,m1), X(y+τ,m2)) = Cov(m1, m2, τ)
Ergodicity
• The definitions given are with respect to the ensemble
• It is often possible to observe only one realization
• How can statistics be estimated from one realization?
• The ergodicity assumption for stationary processes asserts that averaging over the ensemble is equivalent to averaging over a realization (see the sketch below)
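A minimal sketch of what ergodicity buys, using a stationary AR(1) process as a stand-in (the coefficient and sample sizes are illustrative, not from the source): the average over many realizations at a fixed time and the average along one long realization both converge to the same ensemble mean.

```python
# Compare ensemble and time averages for a stationary AR(1) process:
# X[t] = phi * X[t-1] + W[t], W[t] ~ N(0,1). Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
phi, n_realizations, n_steps = 0.7, 5_000, 50_000

# Ensemble average: the value at one fixed time across many realizations,
# iterating the recursion vectorized over realizations (with burn-in)
x = np.zeros(n_realizations)
for _ in range(200):
    x = phi * x + rng.normal(size=n_realizations)
ensemble_mean = x.mean()

# Time average: the mean along a single long realization
z = np.zeros(n_steps)
for t in range(1, n_steps):
    z[t] = phi * z[t - 1] + rng.normal()
time_mean = z.mean()

print(f"ensemble mean ~ {ensemble_mean:+.3f}")  # both should be near 0,
print(f"time mean     ~ {time_mean:+.3f}")      # the true ensemble mean
```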
Discrete representation
• A continuous random process can only be observed at discrete intervals over a finite domain
• Zt may be an average over the interval from t−1 to t (e.g. rainfall) or an instantaneous measurement at t (e.g. streamflow)
Markov Property
• The infinite joint PDF construct is not practical.
• A process is Markov of order d if the joint PDF characterizing the dependence structure is of dimension no more than d+1.

Joint distribution: f(x(t), x(t−1), …, x(t−d))
Conditional distribution: f(x(t) | x(t−1), x(t−2), …) = f(x(t) | x(t−1), …, x(t−d))

The assumption of the Markov property is the basis for simulation of time series as sequences of later values conditioned on earlier values.
Linear approach to time series modeling, e.g. AR(1): Xt = φXt−1 + Wt (a fitting and simulation sketch follows below)
• Model structure and parameters identified to match second-moment properties
• Skewness accommodated using:
  • Skewed residuals
  • Normalizing transformation (e.g. log, Box-Cox)
• Seasonality through seasonally varying parameters
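A minimal sketch of second-moment fitting and simulation of an AR(1) model, assuming method-of-moments estimates (function names and parameter values are illustrative, not from the source); a skewed series would first be normalized, e.g. by taking logs:

```python
# Sketch: fit X[t] = mu + phi*(X[t-1] - mu) + W[t] by matching the
# mean, variance, and lag-1 autocorrelation, then simulate from it.
import numpy as np

def fit_ar1(x):
    """Estimate phi from the lag-1 autocorrelation, sigma_w from variance."""
    x = np.asarray(x, dtype=float)
    xm = x - x.mean()
    phi = (xm[1:] @ xm[:-1]) / (xm @ xm)           # lag-1 autocorrelation
    sigma_w = np.std(x, ddof=1) * np.sqrt(1 - phi**2)
    return x.mean(), phi, sigma_w

def simulate_ar1(mu, phi, sigma_w, n, rng):
    x = np.empty(n)
    x[0] = mu
    for t in range(1, n):
        x[t] = mu + phi * (x[t - 1] - mu) + rng.normal(0, sigma_w)
    return x

rng = np.random.default_rng(1)
observed = simulate_ar1(10.0, 0.6, 2.0, 5_000, rng)   # synthetic "data"
mu, phi, sigma_w = fit_ar1(observed)
print(f"mu = {mu:.2f}, phi = {phi:.2f}, sigma_w = {sigma_w:.2f}")
```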
Nonparametric/nonlinear approach to time series modeling, e.g. NP(1): the multivariate nonparametric density f(xt, xt−1) is estimated directly from the data, then used to obtain the conditional density f(xt | xt−1)
• 2nd moments and skewness are inherited from the estimated distribution
• Seasonality through a separate distribution for each season
Other variants (see the resampling sketch below):
• Conditional density estimated directly using the nearest neighbor (KNN) method
• Local polynomial trend function plus residual
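A minimal sketch of the KNN idea under illustrative assumptions (the 1/j neighbor weighting is a common convention in the KNN resampling literature; the data, k, and function names are not from the source): condition on the current value, find its k nearest historical analogs, and sample the successor of one of them.

```python
# Sketch: k-nearest-neighbor (KNN) conditional resampling for a lag-1
# series. Condition on x[t]; resample a successor from the successors
# of the k nearest historical values.
import numpy as np

def knn_simulate(history, n_sim, k=10, seed=0):
    rng = np.random.default_rng(seed)
    predecessors = history[:-1]
    successors = history[1:]
    weights = 1.0 / np.arange(1, k + 1)            # common 1/j weighting
    weights /= weights.sum()
    x = [history[-1]]                              # start from last value
    for _ in range(n_sim):
        dist = np.abs(predecessors - x[-1])
        nearest = np.argsort(dist)[:k]             # k closest analogs
        pick = rng.choice(nearest, p=weights)      # favor closer neighbors
        x.append(successors[pick])
    return np.array(x[1:])

rng = np.random.default_rng(2)
history = rng.gamma(shape=2.0, scale=5.0, size=2_000)  # skewed synthetic data
sim = knn_simulate(history, n_sim=1_000)
print(f"history mean {history.mean():.2f}, simulated mean {sim.mean():.2f}")
```

Because simulated values are resampled from the data itself, the second moments and skewness of the historical record are inherited without assuming a distributional form.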