Day One: Linear Algebra for AI/ML
Today I covered the basics of linear algebra for AI. First off, let's start with the key terms.
Scalar: A single numerical value with no order or direction. For example, age is a single value that represents how old you are.
Vector: A one-dimensional array of elements, which can be written as a single row or a single column. Unlike a scalar, which holds only one value, a vector can hold multiple values.
Matrix: A two-dimensional array of elements arranged in rows and columns.
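The three terms above can be sketched in a few lines of NumPy. This is just a minimal illustration; the variable names (`age`, `v`, `M`) are my own:

```python
import numpy as np

age = 42                        # scalar: a single value, zero dimensions
v = np.array([1.0, 2.0, 3.0])   # vector: a one-dimensional array
M = np.array([[1, 2],
              [3, 4]])          # matrix: a two-dimensional array

# The number of dimensions distinguishes the three.
print(np.ndim(age), v.ndim, M.ndim)  # → 0 1 2
```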
Next, we move on to a class of operations that can be performed on vectors and matrices, called element-wise operations. These operations act on each individual element of the vector or matrix.
I covered a few of these element-wise operations, such as addition, subtraction, multiplication, and division (they work the same way they do on integers and floats). Two operations were new to me: Sigmoid and ReLU (Rectified Linear Unit). Unlike the arithmetic operations, these two take a single vector or matrix as input.
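Here is a quick sketch of the arithmetic element-wise operations using NumPy, where the standard operators already work element by element on arrays:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Each operation pairs up elements at the same position.
print(a + b)  # [5. 7. 9.]
print(a - b)  # [-3. -3. -3.]
print(a * b)  # [ 4. 10. 18.]
print(a / b)  # [0.25 0.4  0.5 ]
```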
Sigmoid
The sigmoid (σ) function takes any value and squashes it into the range between 0 and 1, using the formula σ(x) = 1 / (1 + e^(−x)). Even negative inputs are mapped into this range. The output only reaches the extremes in the limits: σ(−∞) = 0 and σ(+∞) = 1.
The sigmoid is especially useful when you are working with scores or trying to predict a probability/likelihood.
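The formula translates directly into a one-line NumPy function, which applies element-wise to a whole vector. The function name `sigmoid` is my own; this is a sketch, not a library call:

```python
import numpy as np

def sigmoid(x):
    # σ(x) = 1 / (1 + e^(-x)), applied to every element of x
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-5.0, 0.0, 5.0])
# Every output lands strictly between 0 and 1; σ(0) is exactly 0.5.
print(sigmoid(x))  # ≈ [0.0067 0.5    0.9933]
```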
ReLU(Rectified Linear Unit)
ReLU, on the other hand, acts as a filter. It takes a vector or matrix as input, sets every negative element to zero, and leaves the other elements unchanged.
This function is commonly used in neural networks, for example in models built for tasks like object detection.
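The filtering behaviour described above is a one-liner with `np.maximum`, which compares each element against 0. Again, the name `relu` is just my own label for the sketch:

```python
import numpy as np

def relu(x):
    # Negative elements become 0; everything else passes through unchanged.
    return np.maximum(x, 0)

v = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(v))  # [0.  0.  0.  1.5 3. ]
```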