We start with an $m \times n$ matrix. After computing the SVD, we have three matrices of sizes $m \times m$, $n \times n$ and $m \times n$. How do we save space, then, if we now have three matrices instead of one and more numbers to store? Why are we talking about dimensionality reduction here? What, then, are the benefits of SVD?
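Part of the answer is that we rarely keep the full factors: truncating to the top $k$ singular triplets stores only $k(m + n + 1)$ numbers, and by the Eckart–Young theorem this truncation is the best possible rank-$k$ approximation. A minimal sketch in Python with NumPy (an assumption; the article's own code is MATLAB) on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 40))

# economy-size SVD: U is 60x40, s has 40 entries, Vt is 40x40
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# rank-k truncation: keep only the top k singular triplets
k = 10
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# in the spectral norm, the approximation error is exactly the
# (k+1)-th singular value -- no rank-k matrix can do better
err = np.linalg.norm(A - A_k, 2)
print(err, s[k])
```

Storing `U[:, :k]`, `s[:k]`, and `Vt[:k, :]` takes $60 \cdot 10 + 10 + 40 \cdot 10 = 1010$ numbers instead of $2400$.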
For example, here's a $512 \times 512$ B&W image of Lena:
We compute the SVD of Lena. Keeping only the singular values larger than $1\%$ of the maximum singular value, we are left with just $53$ singular values. Reconstructing Lena from these singular values and the corresponding left and right singular vectors, we obtain a low-rank approximation of Lena:
Instead of storing $512^2$ values (each taking $8$ bits), we can store $2 \cdot (512 \cdot 53) + 53 = 54325$ values (two $512 \times 53$ matrices of singular vectors plus $53$ singular values), which is approximately $20\%$ of the original size. This is one example of how the SVD can be used for lossy compression.
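The arithmetic is easy to check; a quick sketch in Python:

```python
# storage count for a rank-53 approximation of a 512x512 image
m = n = 512
k = 53

full_count = m * n               # values in the original image
reduced_count = 2 * (m * k) + k  # two 512x53 factor matrices plus 53 singular values
ratio = reduced_count / full_count

print(full_count, reduced_count, round(ratio, 3))  # 262144 54325 0.207
```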
Here's the MATLAB code:
```
% open Lena image and convert from uint8 to double
Lena = double(imread('LenaBW.bmp'));

% perform SVD on Lena
[U,S,V] = svd(Lena);

% extract singular values
singvals = diag(S);

% find out where to truncate the U, S, V matrices
indices = find(singvals >= 0.01 * singvals(1));

% reduce SVD matrices
U_red = U(:,indices);
S_red = S(indices,indices);
V_red = V(:,indices);

% construct low-rank approximation of Lena
Lena_red = U_red * S_red * V_red';

% print results to command window
r = num2str(length(indices));
m = num2str(length(singvals));
disp(['Low-rank approximation used ',r,' of ',m,' singular values']);

% save reduced Lena
imwrite(uint8(Lena_red),'Reduced Lena.bmp');
```