
FIN 685: Risk Management


Presentation Transcript


  1. FIN 685: Risk Management Topic 4: Dependencies Larry Schrenk, Instructor

  2. Topics • Dependency • Rank Order Statistics • Spearman’s rho • Kendall’s tau • Correlation • Copulae

  3. Dependency

  4. Bivariate Dependencies • Require that you ask three questions • Does a dependency exist? • If a dependency exists, then how strong is it? • What is the pattern or direction of the dependency? • When we consider these questions, we begin to explain the nature (if there is one) of the relationship between our variables • When we do this, we cross a threshold into a higher form of scientific inquiry

  5. Does a Dependency Exist? • In our example we looked at the independence of two variables • When the test rejected the null of independence, it revealed that there was evidence of some association between our variables • It is important to recognize that the statistics do not prove a causal relationship, but they do give evidence that the relationship is likely to exist • This can have far-reaching implications because it opens up the possibility for modeling and prediction

  6. How Strong Is the Dependency? • A huge Chi-Square value is some indication that there is a strong association, but this “impressionistic” approach is somewhat limited • As we progress in this subject, we will investigate specific indices for describing the strength of an association • These indices are generally scaled from 0 to 1 or from –1 to +1 • A zero value generally means no association, while a value of 1 means that there is nearly a perfect association

  7. What Is the Pattern or Direction of the Dependency? • Because of the pseudo-ordinal nature of the data in our example, attending lecture is clearly associated with higher exam scores • In the simplest sense, more lectures attended translates into a higher score • This means that the dependency is not only present, but also positive in its effect (big yields big and small yields small)

  8. What Is the Pattern or Direction of the Dependency ? • A silly example: • Note that the dependency here is negative • It appears that getting so drunk that you forget where you are on Saturday night can have an adverse effect on how you feel on Sunday • This is negative association: big boozing provides little contentment on Sunday morning, while little alcohol will yield big contentment on Sunday morning

  9. Indices of Dependency • If a dependency exists, then we should measure the strength of the relationship in a standard manner via an index • We do this so that we can compare associations between many variables and thereby determine which have the strongest influence over others

  10. Scatter plot • The relationship between any two variables can be portrayed graphically on an x- and y-axis • Each subject i has a pair of scores (xi, yi) • When the scores for an entire sample are plotted, the result is called a scatter plot

  11. Scatter plot

  12. Direction of the relationship • Variables can be positively or negatively correlated • Positive correlation: as the value of one variable increases, the value of the other variable increases • Negative correlation: as the value of one variable increases, the value of the other variable decreases

  13. Examples

  14. Strength of the relationship • The magnitude of the correlation (its numerical value, ignoring the sign) expresses the strength of the linear relationship between the variables

  15. [Four example scatter plots: r = 1.00, r = .42, r = .17, r = .85]

  16. Rank Order Statistics: Spearman’s rho

  17. Spearman’s rho • Non-Parametric • Range: –1.0 to +1.0 • Like a correlation between ranked variables • Ordinal

  18. Ordinal By Degrees • Ordinal data is defined as data that has a clear hierarchy • This form of data can often appear similar to nominal data (in categories) or interval data (ranked from 1 to N) • However, there is more information in ordinal categories than in nominal categories • And there is less in ranks than there is in real data at an interval level of measure

  19. Interpreting Spearman’s rs • 1.00 means that the rankings are in perfect agreement • –1.00 means that they are in perfect disagreement • 0 signifies that there is no relationship

  20. Example 1

  21. Example 2

  22. Example 3

  23. Calculation • Convert data to ranks: xi, yi • Excel: RANK function • Assuming no tied ranks: rs = 1 − 6Σdi² / (n(n² − 1)), where di is the difference between the ranks of xi and yi
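For reference, a minimal Python sketch of this calculation (the `rank` helper is an illustrative stand-in for Excel's RANK function, and like the slide it assumes no tied ranks):

```python
# Spearman's rho for untied ranks: r_s = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)).
def rank(values):
    # Assign ranks 1..n, smallest value gets rank 1 (assumes no ties).
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman_rho(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

Perfect agreement of the rankings gives 1.0 and perfect disagreement gives –1.0, matching the interpretation slide.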

  24. Example

  25. Rank Order Statistics: Kendall’s tau

  26. Kendall’s Tau • Non-Parametric • Range: –1.0 to +1.0 • ‘Pairs’ Oriented • Ordinal

  27. Introduction to Kendall’s tau • The basic premise behind Kendall’s tau is that for observations with two pieces of information (two variables) you can rank the value of each and treat it as a pair to be compared to all other pairs • Each pair will have an X (dependent variable) value and a Y (independent variable) value • If we order the X values, we would expect the Y values to have a similar order if there is a strong positive correlation between X and Y • Kendall’s tau has a range from –1 to +1, with large positive values denoting positive associations and large negative values denoting negative associations; 0 denotes no association

  28. Kendall’s tau • This series of tests works off of the comparison of pairs to all other pairs • Any comparison of pairs can have only three possible results • Concordant – Ordinally correct • Discordant – Ordinally incorrect • Tied – Exactly the same • Note that for n pairs there are n(n−1)/2 comparisons, hence the equation from your book

  29. Calculation • This series of tests works off of the comparison of pairs to all other pairs • Any comparison of pairs can have only three possible results • Concordant (Nc) – Ordinally correct • Discordant (Nd) – Ordinally incorrect • Tied – Exactly the same

  30. Calculation • Note that for n pairs there are n(n−1)/2 comparisons, hence the equation τ = (Nc − Nd) / (n(n − 1)/2)
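A minimal Python sketch of the pair-counting calculation above (assuming no tied values, so every comparison is either concordant or discordant):

```python
from itertools import combinations

def kendall_tau(x, y):
    # tau = (Nc - Nd) / (n(n-1)/2), comparing every pair to every other pair.
    n = len(x)
    nc = nd = 0
    for i, j in combinations(range(n), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            nc += 1   # concordant: ordinally correct
        elif s < 0:
            nd += 1   # discordant: ordinally incorrect
    return (nc - nd) / (n * (n - 1) / 2)
```

All pairs concordant gives +1, all discordant gives –1, as the range slide states.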

  31. Example

  32. Correlation

  33. Pearson’s rho • Pearson’s Product Moment Correlation • Devised by Francis Galton • The coefficient is essentially the sum of the products of the z-scores for each variable divided by the degrees of freedom • Its computation can take on a number of forms depending on your resources

  34. Pearson’s rho • Parametric: Elliptical • Linear • Range: –1.0 to +1.0 • Cardinal

  35. Equations and Covariation • The sample covariance is Pearson’s r without the sample standard deviations in the denominator: cov(x, y) = Σ(xi − x̄)(yi − ȳ) / (n − 1) • Covariance measures how two variables covary, and it is this measure that serves as the numerator in Pearson’s r: r = cov(x, y) / (sx · sy)
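A small Python sketch of these two equations, with the sample covariance serving as the numerator of Pearson's r (note that sx = √cov(x, x), so the n − 1 denominators cancel inside r):

```python
import math

def sample_cov(x, y):
    # Sample covariance: the numerator of Pearson's r.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)

def pearson_r(x, y):
    # r = cov(x, y) / (s_x * s_y), since s_x = sqrt(cov(x, x)).
    return sample_cov(x, y) / math.sqrt(sample_cov(x, x) * sample_cov(y, y))
```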

  36. Covariation • How it works graphically: [scatter plot with r = 0.89, cov = 788.6944; mean lines at x̄ and ȳ divide the plot into (+,+) and (−,−) quadrants]

  37. Correlation via rho • So we now understand covariance • Standard deviation is also a comfortable term by now • So we can calculate Pearson’s r, but what does it mean? • r is scaled from –1 to +1; its magnitude gives the strength of association, while its sign shows how the variables covary

  38. Example • Correlation = 0.58

  39. Correlation Works When • Multivariate Normal • Elliptical • Linear Relationships

  40. Some Concerns 1. Correlation represents a linear relationship • Correlation tells you how much two variables are linearly related, not necessarily how much they are related in general • There are cases in which two variables have a strong, even perfect, relationship that is not linear. For example, there can be a curvilinear relationship.

  41. NonLinearity

  42. NonLinearity • An Extreme Example
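One way to make this extreme case concrete (a hypothetical illustration, not from the slides): y = x² on a symmetric range is a perfect curvilinear relationship, yet Pearson's r is exactly zero, because the positive and negative cross-products cancel.

```python
# Perfect curvilinear relationship with zero linear correlation.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

xs = [-3, -2, -1, 0, 1, 2, 3]
ys = [v * v for v in xs]   # y = x^2: fully determined by x, yet r = 0
```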

  43. Anscombe’s Quartet (r=0.816)

  44. Some Concerns 2. Restricted range • Correlation can be deceiving if the full information about each of the variables is not available. A correlation between two variables is smaller if the range of one or both variables is truncated. • Because the full variation of one variable is not available, there is not enough information to see how the two variables covary together.
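The restricted-range effect can be illustrated with made-up data: the same linear trend plus a fixed ±1 wiggle yields a lower r when the x range is cut in half, because the signal shrinks relative to the noise.

```python
# Illustrative data (not from the slides): y follows x plus a fixed +/-1 wiggle.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

x_full = list(range(10))                               # x = 0..9
y_full = [x + w for x, w in zip(x_full, [1, -1] * 5)]  # same trend, fixed wiggle

r_full = pearson_r(x_full, y_full)                 # full range
r_restricted = pearson_r(x_full[:5], y_full[:5])   # range truncated to x < 5
```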

  45. Some Concerns 3. Outliers • Outliers are scores that deviate markedly from the remainder of the data • On-line outliers artificially inflate the correlation coefficient • Off-line outliers artificially deflate the correlation coefficient

  46. On-line outlier • An outlier which falls near where the regression line would normally fall would necessarily increase the size of the correlation coefficient, as seen below. • r = .457

  47. Off-line outliers • An outlier that falls some distance away from the original regression line would decrease the size of the correlation coefficient, as seen below: • r = .336
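Both outlier effects can be illustrated with small made-up samples (hypothetical data, not the slides' examples): one off-line point pulls r down sharply, while one on-line point far along the trend pulls the r of a weakly related sample toward 1.

```python
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Off-line outlier deflates r: a perfectly linear sample plus one point far below the line.
x_lin, y_lin = [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]
r_clean = pearson_r(x_lin, y_lin)                    # exactly 1
r_offline = pearson_r(x_lin + [6], y_lin + [0])      # much smaller

# On-line outlier inflates r: weakly related data plus one point far along the trend.
x_wk, y_wk = [0, 1, 2, 3, 4], [1, 0, 3, 2, 5]
r_base = pearson_r(x_wk, y_wk)
r_online = pearson_r(x_wk + [100], y_wk + [100])     # pulled toward 1
```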

  48. Some Concerns 4. Distributional Assumptions • Multivariate Normal • Assets not Normal • Combining Distributions

  49. Some Concerns 5. Time Stability • Higher Correlation in Bad Markets

  50. Correlation and Causation • Two things that go together do not necessarily mean that there is causation • One variable can be strongly related to another, yet not cause it. Correlation does not imply causality. • When there is a correlation between X and Y, does X cause Y, does Y cause X, or both? • Or is there a third variable Z causing both X and Y, and therefore X and Y are correlated?
