Advanced Research Methods II, 03/25/2009
General Linear Models (GLM)
Topic Overview
• The Basic Equation for GLM
• Analysis Methods Subsumed Under GLM
  • ANOVA (One-way and Factorial)
  • ANCOVA
  • Regression
  • MANOVA and Discriminant Analysis
  • Repeated Measures ANOVA
  • Multivariate Regression
  • Canonical Correlation
• Conducting GLM by SPSS
• GLM versus Generalized Linear Models
The General Linear Model (GLM)
Basic Equation: YM = XB + E
Notes:
n: Number of subjects (observations/cases)
p: Number of dependent variables (DVs)
p': Number of linear composites formed by the DVs
k: Number of independent variables (IVs)
• Y: Dependent variables, n x p matrix
• M: Coefficients determining the linear combinations of Y, p x p' matrix
• X: Independent variables, n x (k+1) matrix
• B: Regression coefficients, (k+1) x p' matrix
• E: Error, n x p' matrix
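To make the dimensions concrete, here is a minimal NumPy sketch of the basic equation; the sizes and data are illustrative values, not anything from the slides, and B is estimated by ordinary least squares.

import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 2, 3                          # illustrative sizes only

Y = rng.normal(size=(n, p))                  # DVs:                    n x p
M = np.eye(p)                                # linear composites of Y: p x p' (identity here, so p' = p)
X = np.column_stack([np.ones(n),             # intercept plus k IVs:   n x (k+1)
                     rng.normal(size=(n, k))])

# Ordinary least squares estimate of B: solves X B = Y M in the least-squares sense
B_hat, *_ = np.linalg.lstsq(X, Y @ M, rcond=None)
E_hat = Y @ M - X @ B_hat                    # residuals: n x p'
print(B_hat.shape, E_hat.shape)              # (k+1, p') and (n, p')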
The General Linear Model (GLM)
• One-Way ANOVA: YM = XB + E
n: Number of subjects (observations/cases)
p: Number of dependent variables = 1
k: Number of independent variables = a - 1 (a = number of categories for factor A)
• Y: n x 1 matrix
• M: 1 x 1 matrix = scalar (1)
• B: (k+1) x 1 matrix (k = a - 1)
• X: n x (k+1) matrix (k = a - 1)
• E: n x 1 matrix
The General Linear Model (GLM)
• One-Way ANOVA Example: ACT(n x 1) = (Race)(n x 3) * B(3 x 1) + E(n x 1)
ACT (n x 1) = [y1, y2, ..., yn]'
X (n x 3), effect-coded Race (one block of rows per group):
  Race 1:  1  1  0
  Race 2:  1  0  1
  Race 3:  1 -1 -1
B (3 x 1) = [β0, β1, β2]'
E (n x 1) = [ε1, ε2, ..., εn]'
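A small NumPy sketch of how this effect-coded design matrix can be built and fit; the ACT scores and race labels are made-up illustration values.

import numpy as np

# Made-up data: ACT scores and a race factor with 3 categories
act  = np.array([21., 24., 18., 30., 27., 19.])
race = np.array([0, 0, 1, 1, 2, 2])                 # group index 0, 1, 2

def effect_code(groups, n_levels):
    """Effect (deviation) coding: the last category gets -1 on every coded column."""
    X = np.ones((len(groups), n_levels))            # first column = intercept
    for j in range(1, n_levels):
        X[:, j] = np.where(groups == j - 1, 1.0,
                  np.where(groups == n_levels - 1, -1.0, 0.0))
    return X

X = effect_code(race, 3)                            # n x 3, same pattern as the slide's X
B_hat, *_ = np.linalg.lstsq(X, act, rcond=None)     # [β0, β1, β2]
print(X)
print(B_hat.round(2))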
The General Linear Model (GLM)
• Factorial ANOVA: YM = XB + E
n: Number of subjects (observations/cases)
p: Number of dependent variables = 1
k: Number of independent variables = ab - 1 (a = number of categories for factor A; b = number of categories for factor B)
• Y: n x 1 matrix
• M: 1 x 1 matrix = scalar (1)
• B: (k+1) x 1 matrix (k = ab - 1)
• X: n x (k+1) matrix (k = ab - 1)
• E: n x 1 matrix
The General Linear Model (GLM)
• Factorial ANOVA Example: ACT(n x 1) = (Race, College, Race*College)(n x 6) * B(6 x 1) + E(n x 1)
ACT (n x 1) = [y1, y2, ..., yn]'
X (n x 6), columns: intercept, race1, race2, college, race1 x college, race2 x college; example rows:
  1  1  0  1  1  0
  1  1  0 -1 -1  0
  ...
  1  0  1  1  0  1
  1  0  1 -1  0 -1
  ...
  1 -1 -1  1 -1 -1
  1 -1 -1 -1  1  1
B (6 x 1) = [β0, β1 (race1), β2 (race2), β3 (college), β4 (race1 x college), β5 (race2 x college)]'
E (n x 1)
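A NumPy sketch of the same layout: the interaction columns are simply element-wise products of the effect-coded main-effect columns. The data values are made up.

import numpy as np

# Made-up factors: race (3 levels) and college (2 levels), both effect coded
race1   = np.array([ 1,  1,  0,  0, -1, -1])
race2   = np.array([ 0,  0,  1,  1, -1, -1])
college = np.array([ 1, -1,  1, -1,  1, -1])
act     = np.array([22., 20., 25., 19., 28., 21.])

# Interaction columns = products of the corresponding main-effect columns
X = np.column_stack([np.ones(6), race1, race2, college,
                     race1 * college, race2 * college])    # n x 6, as on the slide

B_hat, *_ = np.linalg.lstsq(X, act, rcond=None)            # [β0, ..., β5]
print(B_hat.round(2))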
The General Linear Model (GLM)
• ANCOVA: YM = XB + E
n: Number of subjects (observations/cases)
p: Number of dependent variables = 1
k: Number of independent variables = ab + c - 1 (a = number of categories for factor A; b = number of categories for factor B; c = number of covariates)
• Y: n x 1 matrix
• M: 1 x 1 matrix = scalar (1)
• B: (k+1) x 1 matrix (k = ab + c - 1)
• X: n x (k+1) matrix (k = ab + c - 1)
• E: n x 1 matrix
The General Linear Model (GLM)
• ANCOVA Example: ACT(n x 1) = (Race, College, Race*College, Father_edu)(n x 7) * B(7 x 1) + E(n x 1)
ACT (n x 1) = [y1, y2, ..., yn]'
X (n x 7), columns: intercept, race1, race2, college, race1 x college, race2 x college, Father_edu; example rows:
  1  1  0  1  1  0  4
  1  1  0 -1 -1  0  7
  ...
  1  0  1  1  0  1  3
  1  0  1 -1  0 -1  2
  ...
  1 -1 -1  1 -1 -1  5
  1 -1 -1 -1  1  1  3
B (7 x 1) = [β0, β1 (race1), β2 (race2), β3 (college), β4 (race1 x college), β5 (race2 x college), β6 (Father_edu)]'
E (n x 1)
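Continuing the made-up factorial data, the NumPy sketch below appends the covariate Father_edu as one more column of X; the last element of the estimated B is then the covariate slope β6.

import numpy as np

race1      = np.tile([ 1,  1,  0,  0, -1, -1], 2)
race2      = np.tile([ 0,  0,  1,  1, -1, -1], 2)
college    = np.tile([ 1, -1,  1, -1,  1, -1], 2)
father_edu = np.array([4., 7., 3., 2., 5., 3., 6., 2., 4., 5., 3., 4.])   # covariate, original scale
act        = np.array([22., 20., 25., 19., 28., 21., 24., 18., 26., 22., 27., 23.])

X = np.column_stack([np.ones(12), race1, race2, college,
                     race1 * college, race2 * college, father_edu])       # n x 7

B_hat, *_ = np.linalg.lstsq(X, act, rcond=None)     # β6 = B_hat[-1], the covariate slope
print(B_hat.round(2))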
The General Linear Model (GLM)
• Regression: YM = XB + E
n: Number of subjects (observations/cases)
p: Number of dependent variables = 1
k: Number of independent variables (k may include variables reflecting interaction effects and/or curvilinear effects; see the sketch below)
• Y: n x 1 matrix
• M: 1 x 1 matrix = scalar (1)
• B: (k+1) x 1 matrix
• X: n x (k+1) matrix
• E: n x 1 matrix
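A NumPy sketch of how interaction and curvilinear effects enter as extra columns of X; the predictors, coefficients, and noise level are arbitrary illustration values.

import numpy as np

rng = np.random.default_rng(1)
x1, x2 = rng.normal(size=50), rng.normal(size=50)
y = 1.0 + 0.5*x1 - 0.3*x2 + 0.8*x1*x2 + 0.2*x1**2 + rng.normal(scale=0.1, size=50)

# Interaction (x1*x2) and curvilinear (x1**2) terms are just additional IV columns
X = np.column_stack([np.ones(50), x1, x2, x1*x2, x1**2])   # k = 4, so X is n x (k+1) = 50 x 5
B_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(B_hat.round(2))    # roughly recovers [1.0, 0.5, -0.3, 0.8, 0.2]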
The General Linear Model (GLM)
• MANOVA: YM = XB + E
n: Number of subjects (observations/cases)
p: Number of dependent variables (e.g., p = 2)
k: Number of independent variables = ab - 1 (a = number of categories for factor A; b = number of categories for factor B)
• Y: n x p matrix
• M: p x p identity matrix
• B: (k+1) x p matrix (k = ab - 1)
• X: n x (k+1) matrix (k = ab - 1)
• E: n x p matrix
The General Linear Model (GLM)
• MANOVA Example: (GPA, ACT)(n x 2) = (Race, College, Race*College)(n x 6) * B(6 x 2) + E(n x 2)
Y (n x 2) = [y11 y12; y21 y22; ...; yn1 yn2]
X (n x 6): the same effect-coded design as in the factorial ANOVA example:
  1  1  0  1  1  0
  1  1  0 -1 -1  0
  ...
  1  0  1  1  0  1
  1  0  1 -1  0 -1
  ...
  1 -1 -1  1 -1 -1
  1 -1 -1 -1  1  1
B (6 x 2) = [β01 β02; β11 β12; β21 β22; β31 β32; β41 β42; β51 β52]
M (2 x 2) = identity = [1 0; 0 1]
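A NumPy sketch of the two-DV layout: with M equal to the identity, B has one column of coefficients per dependent variable, and the multivariate test statistics (Wilks' lambda and the like) are built from the matrix of residuals. The GPA/ACT values are made up.

import numpy as np

race1   = np.tile([ 1,  1,  0,  0, -1, -1], 2)
race2   = np.tile([ 0,  0,  1,  1, -1, -1], 2)
college = np.tile([ 1, -1,  1, -1,  1, -1], 2)
Y = np.array([[3.1, 22.], [2.8, 20.], [3.5, 25.], [2.9, 19.], [3.7, 28.], [3.0, 21.],
              [3.3, 23.], [2.6, 18.], [3.4, 26.], [3.0, 20.], [3.6, 27.], [2.9, 22.]])  # n x 2: GPA, ACT

X = np.column_stack([np.ones(12), race1, race2, college,
                     race1 * college, race2 * college])    # n x 6
M = np.eye(2)                                               # p x p identity

B_hat, *_ = np.linalg.lstsq(X, Y @ M, rcond=None)           # 6 x 2: one column per DV
E_hat = Y @ M - X @ B_hat                                   # n x 2 residual matrix
print(B_hat.shape, E_hat.shape)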
The General Linear Model (GLM)
• Repeated Measures ANOVA (within-subject factor): YM = XB + E
n: Number of subjects (observations/cases)
p: Number of dependent variables = t (t = number of levels/times at which the DV is measured); M forms t - 1 within-subject contrasts
k = 0 (no between-subject IVs)
• Y: n x p matrix
• M: p x (p - 1) contrast matrix
• B: 1 x (p - 1) matrix
• X: n x 1 matrix (the unit vector: all cells = 1)
• E: n x (p - 1) matrix
E.g., p = 2
The General Linear Model (GLM)
• Repeated Measures ANOVA Example: ACT(n x 2) * M(2 x 1) = (ACT1 - ACT2)(n x 1) = X(n x 1) * B(1 x 1) + E(n x 1)
ACT (n x 2) = [y11 y12; y21 y22; ...; yn1 yn2]
M (2 x 1) = [1, -1]'
ACT*M (n x 1) = [y11 - y12, y21 - y22, ..., yn1 - yn2]'
X (n x 1) = [1, 1, ..., 1]'
B (1 x 1) = [β0]
E (n x 1)
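A NumPy sketch of the contrast: multiplying the n x 2 score matrix by M = [1, -1]' turns the two repeated measurements into one difference score per subject, and the intercept-only fit estimates the mean difference. The scores are made-up values.

import numpy as np

# Made-up repeated measures: ACT at two times per subject (columns = time 1, time 2)
ACT = np.array([[20., 23.], [18., 19.], [25., 27.], [22., 22.], [19., 24.]])   # n x 2

M = np.array([[1.], [-1.]])          # 2 x 1 within-subject contrast: time 1 - time 2
X = np.ones((ACT.shape[0], 1))       # n x 1 column of ones (k = 0, intercept only)

B_hat, *_ = np.linalg.lstsq(X, ACT @ M, rcond=None)
print(B_hat)                          # β0 = mean of the within-subject differences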
The General Linear Model (GLM)
• Multivariate Regression: YM = XB + E
n: Number of subjects (observations/cases)
p: Number of dependent variables
k: Number of independent variables (k may include variables reflecting interaction effects and/or curvilinear effects)
• Y: n x p matrix
• M: p x p identity matrix
• B: (k+1) x p matrix
• X: n x (k+1) matrix
• E: n x p matrix
The General Linear Model (GLM)
• Multivariate Regression Example: (GPA, ACT)(n x 2) = (Father_Edu, Mother_Educ)(n x 2) * B(2 x 2) + E(n x 2)
Y (n x 2) = [y11 y12; y21 y22; ...; yn1 yn2]
X (n x 2), example rows (Father_Edu, Mother_Educ):
  5 4
  7 6
  ...
  3 2
  1 4
  ...
  1 3
  2 2
B (2 x 2) = [β01 β02; β11 β12]
M (2 x 2) = identity = [1 0; 0 1]
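A NumPy sketch mirroring this layout (made-up GPA/ACT values, and no intercept column, matching the slide's n x 2 X): with M equal to the identity, multivariate regression amounts to one ordinary least-squares fit per column of Y, which a single lstsq call returns as the columns of B.

import numpy as np

X = np.array([[5., 4.], [7., 6.], [3., 2.], [1., 4.], [1., 3.], [2., 2.]])              # Father_Edu, Mother_Educ
Y = np.array([[3.2, 24.], [3.6, 27.], [2.9, 20.], [2.7, 21.], [2.8, 19.], [2.6, 20.]])  # GPA, ACT

B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)    # 2 x 2: column j holds the coefficients for DV j
print(B_hat.round(2))

# Column-by-column check: the same coefficients come from fitting each DV separately
for j in range(Y.shape[1]):
    b_j, *_ = np.linalg.lstsq(X, Y[:, j], rcond=None)
    print(b_j.round(2))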
Conducting GLM by SPSS
• Fixed Factors = categorical variables
• Covariates = continuous variables
• Full Factorial Model (default) = all main and interaction effects for fixed factors, plus main effects for covariates
• Matrix M (default) = identity matrix. M can only be specified by syntax.
E.g., dependent variable of interest: ACT - 10*HS_GPA

GLM ACT HS_GPA By Race College With Father_education Mother_education
  /MMatrix = ACT 1 HS_GPA -10
  /Intercept = Include
  /Design = Race College Race*College Father_education Mother_education
            Father_education*Mother_education.
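The /MMatrix line above weights the two dependent variables to form the composite ACT - 10*HS_GPA. A minimal NumPy sketch of that same M-matrix transformation (made-up scores; not the SPSS implementation itself):

import numpy as np

# Made-up scores: columns are ACT and HS_GPA
Y = np.array([[24., 3.2], [20., 2.9], [27., 3.6], [19., 2.7]])

M = np.array([[1.], [-10.]])      # same weights as /MMatrix = ACT 1 HS_GPA -10
composite = Y @ M                 # n x 1 column of ACT - 10*HS_GPA values
print(composite.ravel())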
General Linear Models (GLM) vs. Generalized Linear Models
Assumptions for GLM:
+ Normality: the distribution of the DV Y is multivariate normal.
+ Linearity: Y = XB + E, i.e., E(Y) = XB.
When these assumptions cannot be satisfied (e.g., Y is a dichotomous variable), use Generalized Linear Models:
+ Normality: Y is assumed to follow a distribution from the exponential family, which includes the binomial, Poisson, exponential, and gamma distributions in addition to the normal distribution.
General Linear Models (GLM) vs. Generalized Linear Models
+ Linearity: E(Y) = XB.
Generalized Linear Models extend GLM through a link function g such that g[E(Y)] = XB.
For logistic regression (Y = 0, 1): g[E(Y)] = ln{E(Y) / [1 - E(Y)]} (the logit link).
For general linear models (Y ~ normal): g[E(Y)] = E(Y) (the identity link).
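As an illustration of the link-function idea, here is a minimal NumPy sketch that fits a logistic regression by iteratively reweighted least squares; the data, the true coefficients, and the iteration count are all made-up illustration choices.

import numpy as np

# Made-up dichotomous outcome and one predictor
rng = np.random.default_rng(2)
x = rng.normal(size=200)
p_true = 1.0 / (1.0 + np.exp(-(-0.5 + 1.5 * x)))    # inverse of the logit link
y = rng.binomial(1, p_true)

X = np.column_stack([np.ones_like(x), x])
beta = np.zeros(2)

# Iteratively reweighted least squares: each step is a weighted normal-equation solve
for _ in range(25):
    eta = X @ beta                                   # linear predictor XB
    mu = 1.0 / (1.0 + np.exp(-eta))                  # E(Y) = g^{-1}(XB)
    W = mu * (1.0 - mu)                              # working weights
    z = eta + (y - mu) / W                           # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(beta.round(2))   # close to the generating values [-0.5, 1.5]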