# Solved – Explained Sums of Squares in matrix notation

I am currently reading Appendix C from Gujarati, *Basic Econometrics*, 5e.
It deals with the matrix approach to the linear regression model.

I am unable to decipher how the author went from equation 7.4.19 to C.3.17.


In short, the author is not going from 7.4.19 to C.3.17.

C.3.17 is just a definition, from which we can construct 7.4.19.

The total sum of squares in pure matrix form is the following:

\begin{align} y^T M_\iota y = y^T\left(I - \iota(\iota^T\iota)^{-1}\iota^T\right)y = y^T y - n\bar{y}^2 = \sum_{i=1}^{n}(y_i - \bar{y})^2 \end{align}

Where $M_\iota$ is an orthogonal projection (centering) matrix, $\iota$ is a column of ones, and $I$ is the identity matrix of size $n$. The simplification to $n\bar{y}^2$ follows from $\iota^T\iota = n$ and $\iota^T y = n\bar{y}$.
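As a quick sanity check, the identity $y^T M_\iota y = \sum_i (y_i - \bar{y})^2$ can be verified numerically. This is just a sketch with a small random vector (the sample size and data here are illustrative, not from the book):

```python
import numpy as np

# Verify y' M_iota y == sum((y_i - ybar)^2) on a small random sample
rng = np.random.default_rng(0)
n = 8
y = rng.normal(size=n)

iota = np.ones((n, 1))   # column of ones
I = np.eye(n)            # n x n identity
# Centering (orthogonal projection) matrix M_iota
M = I - iota @ np.linalg.inv(iota.T @ iota) @ iota.T

tss_matrix = float(y @ M @ y)
tss_direct = float(np.sum((y - y.mean()) ** 2))
assert np.isclose(tss_matrix, tss_direct)
```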

The Explained Sum of Squares is defined in C.3.17, but I will start from a more familiar definition so it is clearer how the author ended up there.

\begin{align} \sum_{i=1}^n(\hat{y}_i - \bar{y})^2 &= \hat{y}^T M_\iota \hat{y}\\ &= (X\hat{\beta})^T M_\iota (X\hat{\beta})\\ &= \hat{\beta}^T X^T\left(I - \iota(\iota^T\iota)^{-1}\iota^T\right)X\hat{\beta}\\ &= \hat{\beta}^T X^T X\hat{\beta} - \hat{\beta}^T X^T \iota(\iota^T\iota)^{-1}\iota^T X\hat{\beta}\\ &= \hat{\beta}^T X^T X\left((X^T X)^{-1} X^T y\right) - n\bar{\hat{y}}^2\\ &= \hat{\beta}^T X^T y - n\bar{y}^2 \end{align}

In the last step, $X^T X (X^T X)^{-1} = I$, and $\bar{\hat{y}} = \bar{y}$ because the residuals sum to zero whenever the regression includes an intercept. The final line is exactly C.3.17.
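The derivation above can also be checked numerically: fit OLS on simulated data with an intercept and compare $\sum_i(\hat{y}_i - \bar{y})^2$ against $\hat{\beta}^T X^T y - n\bar{y}^2$. The data-generating values below are illustrative assumptions, not from the book:

```python
import numpy as np

# Check that sum((yhat - ybar)^2) == betahat' X' y - n*ybar^2
# for OLS with an intercept column in X.
rng = np.random.default_rng(1)
n, k = 30, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # intercept + regressors
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)     # illustrative DGP

betahat = np.linalg.solve(X.T @ X, X.T @ y)  # OLS estimate
yhat = X @ betahat
ybar = y.mean()

ess_direct = float(np.sum((yhat - ybar) ** 2))
ess_matrix = float(betahat @ X.T @ y - n * ybar**2)
assert np.isclose(ess_direct, ess_matrix)
```

Note that `yhat.mean()` equals `ybar` here, which is the step that lets $\bar{\hat{y}}^2$ be replaced by $\bar{y}^2$ in the last line of the derivation.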
