You can find the source on the GitHub project.
The notes are collected using my interpretation of the Zettelkasten method.
-
-
Deep Learning for Coders (2020)
Notes from the Deep Learning for Coders (2020) video series by Jeremy Howard and Sylvain Gugger (fast.ai)
-
Matrix Multiplication
Matrix multiplication is a mathematical operation between two matrices that returns a matrix.
For each row in the first matrix, take the Dot Product …
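A minimal sketch of the row-by-column rule described above, in plain Python with lists of lists (the function name `matmul` is mine, not from the notes):

```python
def matmul(a, b):
    # For each row of a and each column of b, take the dot product.
    rows_a, cols_a = len(a), len(a[0])
    rows_b, cols_b = len(b), len(b[0])
    assert cols_a == rows_b, "inner dimensions must match"
    return [[sum(a[i][k] * b[k][j] for k in range(cols_a))
             for j in range(cols_b)]
            for i in range(rows_a)]

# matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]) → [[19, 22], [43, 50]]
```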
-
-
Binary Cross-Entropy Loss
Binary Cross-Entropy (BCE), also known as log loss, is a loss function used when training binary or multi-label classification models.
It's nearly identical to Negative …
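A hedged sketch of the BCE formula, `-(y·log(p) + (1-y)·log(1-p))` averaged over predictions; the function name and the `eps` clipping are my additions, used to avoid `log(0)`:

```python
import math

def bce_loss(preds, targets, eps=1e-7):
    # Mean of -(y*log(p) + (1-y)*log(1-p)) over all (prediction, label) pairs.
    total = 0.0
    for p, y in zip(preds, targets):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(preds)

# A maximally uncertain prediction (p = 0.5) costs log 2 ≈ 0.693 per example.
```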
-
-
Categorical Cross-Entropy Loss
Categorical Cross-Entropy Loss, also known as Softmax Loss, is a Loss Function used in multiclass classification model training. It applies the Softmax Function to …
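One way to sketch the softmax-then-negative-log-likelihood pipeline in plain Python (function names are mine; the max-subtraction is a standard numerical-stability trick, not something the note states):

```python
import math

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def categorical_ce(logits, target_idx):
    # Softmax the logits, then take -log of the true class's probability.
    probs = softmax(logits)
    return -math.log(probs[target_idx])
```

With two equal logits the model assigns probability 0.5 to each class, so the loss is log 2 regardless of which class is correct.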
-
Cross-Entropy
Cross-entropy measures the average number of bits required to identify an event drawn from one probability distribution when using a coding scheme optimised for another probability distribution.
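The "average number of bits" reading can be sketched directly as `H(p, q) = -Σ pᵢ log₂ qᵢ` (the function name is mine):

```python
import math

def cross_entropy(p, q):
    # Average bits to encode events drawn from p with a code optimised for q:
    # H(p, q) = -sum_i p_i * log2(q_i); zero-probability events contribute nothing.
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)
```

When `q` matches `p` exactly, cross-entropy equals the entropy of `p`; any mismatch can only increase it.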
-
Information Entropy
Entropy is a measure of uncertainty of a random variable's possible outcomes.
It's highest when there are many equally likely outcomes. As you introduce more …
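A small sketch of the Shannon entropy formula `H(X) = -Σ p log₂ p` in bits (function name is mine), which shows the "highest when outcomes are equally likely" claim numerically:

```python
import math

def entropy(probs):
    # H(X) = -sum p * log2(p); terms with p == 0 contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin has 1 bit of entropy; a certain outcome has 0;
# a biased coin falls somewhere in between.
```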
-