This is a nice intro to linear algebra but doesn't really get into how it's related to deep learning. A good read is Vector, Matrix, and Tensor Derivatives[0], which covers the linear algebra needed in multi-layer neural networks, specifically during the backpropagation phase.
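To give a concrete flavor of the kind of matrix calculus those notes cover, here's a minimal NumPy sketch (hypothetical shapes, not from either article) of the gradient of a single linear layer y = xW, checked against a finite-difference approximation:

```python
import numpy as np

# Forward pass through one linear layer: y = x @ W
# x: (batch, n_in), W: (n_in, n_out), y: (batch, n_out)
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2))
y = x @ W

# Backward pass: given the upstream gradient dL/dy, the matrix
# derivative rules give dL/dW = x^T @ dL/dy and dL/dx = dL/dy @ W^T.
dy = np.ones_like(y)   # upstream gradient for the toy loss L = y.sum()
dW = x.T @ dy          # same shape as W: (3, 2)
dx = dy @ W.T          # same shape as x: (4, 3)

# Sanity check dW numerically with central differences.
eps = 1e-6
dW_num = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp = W.copy(); Wp[i, j] += eps
        Wm = W.copy(); Wm[i, j] -= eps
        dW_num[i, j] = ((x @ Wp).sum() - (x @ Wm).sum()) / (2 * eps)

print(np.allclose(dW, dW_num, atol=1e-4))  # True
```

The shape bookkeeping (which operand gets transposed, and on which side it multiplies) is exactly what the notes work through carefully.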
You're correct - it doesn't get into the deep learning aspect yet.
This article is in fact the second part of a larger planned series. The idea is to present more depth than the "quick refresher" common to many blog posts, but far less material than would be found in a 10-week (or single-semester) undergraduate course.
[0] http://cs231n.stanford.edu/vecDerivs.pdf