• Vipul Vaibhaw

In the world of non-linearities, why do we use linear algebra?

People are wired to think that mathematics is super hard and redundant. Well, they might not think mathematics is redundant per se, but finding uses for mathematical tools in the real world is definitely not obvious. Mathematics comes from nature, and that is precisely why we speak of discoveries in mathematics rather than inventions (a debatable topic). Anyway, here is the question puzzling me: in the non-linear world of neural networks, why is linear algebra so important, and why not calculus?

"Study hard what interests you the most in the most undisciplined, irreverent and original manner possible." - Feynman

When we start learning mathematics, we usually meet calculus first; linear algebra is introduced later in our academic life. We are rarely taught the relationship between linear algebra and calculus, so they appear to be two distinct tools.


It is a bad habit of mine; I digress from the topic a lot. Anyway, let's come back and think about the question again: why did linear algebra come to dominate the AI/ML world?

Calculus is a brilliant tool for dealing with infinities! You can estimate and integrate over infinitely many points to get approximations. The primary challenge with calculus is that the computation becomes clumsy once you are working in many dimensions.
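To make the "clumsy in many dimensions" point concrete, here is a rough back-of-the-envelope sketch (the grid resolution is made up for illustration): if you approximate an integral on a grid with 100 points per axis, the number of function evaluations explodes with the dimension.

```python
import numpy as np

# Rough illustration: approximating an integral on a grid.
# With 100 points per axis, the number of grid points (and hence
# function evaluations) grows as 100**d -- it explodes with dimension.
points_per_axis = 100
for d in (1, 2, 3, 6, 10):
    print(f"{d:>2} dimensions -> {points_per_axis ** d:.1e} grid points")
```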




Calculus came along much earlier than linear algebra, and it is inspired by curves, whereas linear algebra is about flat surfaces.


Linear algebra gives you the tools to compute in 10 dimensions just as easily as in two. It is also easier to parallelize matrix computations than calculus. A rectangle of numbers, i.e. a "matrix", is the key object of linear algebra. The power of matrices is that they can represent multi-dimensional data such as images as flat rectangles of numbers, and they can capture context really well.
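Here is a minimal NumPy sketch of that idea (the matrix sizes are made up for illustration): a single matrix multiplication transforms a whole batch of 10-dimensional vectors at once, and NumPy hands the work to an optimized, parallel BLAS routine under the hood.

```python
import numpy as np

# A "rectangle of numbers": a weight matrix mapping 10-dimensional
# inputs to 4-dimensional outputs.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 10))     # 4 x 10 matrix
X = rng.standard_normal((10, 1000))  # 1000 vectors, each in 10 dimensions

# One matrix multiplication transforms all 1000 vectors at once.
Y = W @ X
print(Y.shape)  # (4, 1000)
```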


The numbers in a matrix are related to one another. Think about word2vec: we want a matrix (or vector) representation of a word that captures its context. Or auto-encoders: simply put, we compress an image into a smaller-dimensional, meaningful matrix that retains the image's important features.
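As a toy illustration (the vectors below are hand-made, not learned; real word2vec learns them from large text corpora), words that appear in similar contexts end up with similar vectors, which we can check with cosine similarity:

```python
import numpy as np

# Toy, hand-made "word vectors" purely for illustration.
# Words used in similar contexts end up with similar vectors.
vectors = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.82, 0.15]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vectors["king"], vectors["queen"]))  # close to 1: similar context
print(cosine(vectors["king"], vectors["apple"]))  # much smaller
```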


Every matrix can be written as the product of three matrices, i.e. rotation × stretch × rotation.
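That decomposition is the singular value decomposition (SVD). Here is a minimal NumPy check (the example matrix is arbitrary; the two outer factors are orthogonal matrices, i.e. rotations possibly combined with reflections):

```python
import numpy as np

# Singular value decomposition: A = U @ diag(S) @ Vt,
# i.e. rotation (Vt) -> stretch (S) -> rotation (U).
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
U, S, Vt = np.linalg.svd(A)

print(S)                                    # the stretch factors
print(np.allclose(A, U @ np.diag(S) @ Vt))  # True: the product recovers A
```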



Non-linear functions used in deep learning, such as ReLU, can be broken down into linear pieces. That is precisely why linear algebra works so well in the deep learning world.


Let us look at the graph of ReLU -



Note - You can generate a similar graph; check out the code here - https://github.com/Chanakya-School-of-AI/help-the-beginners/blob/master/tutorial-codes/relu_graph_gen.py
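For reference, here is a minimal sketch of how such a plot could be generated with NumPy and Matplotlib (the linked script may differ in its details):

```python
import numpy as np
import matplotlib.pyplot as plt

# ReLU: max(0, x), plotted over a small range.
x = np.linspace(-10, 10, 200)
y = np.maximum(0, x)

plt.plot(x, y)
plt.title("ReLU")
plt.xlabel("x")
plt.ylabel("ReLU(x)")
plt.grid(True)
plt.show()
```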


Although the graph is non-linear, it can be broken into two pieces, and linear algebra works on each piece.
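Here is a small sketch of that idea (the input values are arbitrary): once you know which piece each entry falls on, applying ReLU is just multiplication by a diagonal matrix of 0s and 1s, i.e. plain linear algebra.

```python
import numpy as np

# On each "piece", ReLU acts as an ordinary linear map:
# multiplying by a diagonal matrix of 0s and 1s chosen by the signs of x.
x = np.array([-2.0, 3.0, -0.5, 4.0])

relu_x = np.maximum(0, x)            # the usual ReLU
D = np.diag((x > 0).astype(float))   # picks which piece each entry is on
print(np.allclose(relu_x, D @ x))    # True: same result, now pure linear algebra
```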

I really hope this "piece" of a blog post has helped you understand the importance of linear algebra and why we use it. This post certainly doesn't intend to belittle other fields such as topology, real analysis, game theory, and differential equations, which are all actively used in deep learning; we just wanted to stress linear algebra this time. If you still have doubts, feel free to reach out!


If you like what we are doing then please subscribe to this blog and share it with your friends. :)
