We’re quite eager to get to applications of algebraic topology to things like machine learning (in particular, persistent homology). Even though there’s a massive amount of theory behind it (and we do plan to cover some of the theory), a lot of the actual computations boil down to working with matrices. Of course, this means we’re in the land of linear algebra; for a refresher on the terminology, see our primers on linear algebra.
In addition to applications of algebraic topology, our work with matrices in this post will allow us to solve important optimization problems, including linear programming. We will investigate these along the way.
Matrices Make the World Go Round
Fix two vector spaces $latex V, W$ of finite dimensions $latex m, n$ (resp.), and fix bases for both spaces. Recall that an $latex n \times m$ matrix uniquely represents a linear map $latex V \to W$…
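To make this concrete, here is a minimal numpy sketch (the specific matrix and vectors are illustrative, not from the post): a $latex 2 \times 3$ matrix represents a linear map from a 3-dimensional $latex V$ to a 2-dimensional $latex W$, and applying the map is just matrix-vector multiplication in the chosen bases.

```python
import numpy as np

# An n x m matrix with n = 2, m = 3: a linear map from
# 3-dimensional V to 2-dimensional W, in the chosen bases.
A = np.array([[1, 0, 2],
              [0, 3, 1]])

v = np.array([1, 1, 1])   # a vector in V, written in V's basis
w = A @ v                 # its image in W, written in W's basis
print(w)                  # [3 4]

# Linearity: the map respects sums and scalar multiples.
v1, v2 = np.array([1, 0, 0]), np.array([0, 2, -1])
assert np.array_equal(A @ (2 * v1 + v2), 2 * (A @ v1) + A @ v2)
```

Composition of linear maps corresponds to matrix multiplication, which is why so much of the computational side reduces to matrix manipulation.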