The Math For Gradient Descent and Backpropagation

After improving and updating my neural network library, I think I understand the popular backpropagation algorithm even better. I also discovered that \LaTeX is usable on WordPress, so I want to present my thoughts on gradient descent and backpropagation with cool math equations and nice intuitions. Of course, this post will be math intensive, since it involves some calculus.
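As a small preview (with $L$ standing in for a generic loss, $\theta$ for the network's weights, and $\eta$ for the learning rate, placeholder symbols rather than the post's exact notation), the core gradient descent update is

$\theta \leftarrow \theta - \eta \, \nabla_{\theta} L(\theta)$

and backpropagation is just the chain rule applied layer by layer to compute $\nabla_{\theta} L(\theta)$ efficiently.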

Continue reading “The Math For Gradient Descent and Backpropagation”


Java Machine Learning Library: Conv Nets, GRU Recurrent Nets, and Text Generation

For the past few weeks (in addition to the school grind), I've been improving my Java Machine Learning library. After learning from online blog posts and papers, I've overhauled the existing code and added support for convolutional layers and GRU recurrent layers.
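For context on what a GRU layer computes, here is a minimal, self-contained sketch of one GRU forward step in plain Java (biases omitted for brevity). The class, method, and parameter names are illustrative only and are not the library's actual API:

```java
import java.util.Arrays;
import java.util.Random;

// Minimal sketch of one GRU forward step, to illustrate the gate math only.
// Names here are illustrative; this is not the Java Machine Learning library's API.
public class GruSketch {

    // Matrix-vector product: (rows x cols) matrix times a vector of length cols.
    static double[] matVec(double[][] m, double[] v) {
        double[] out = new double[m.length];
        for (int i = 0; i < m.length; i++)
            for (int j = 0; j < v.length; j++)
                out[i] += m[i][j] * v[j];
        return out;
    }

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // One GRU step: from input x and previous hidden state h, compute the
    // update gate z, reset gate r, candidate state hTilde, and the new state.
    static double[] gruStep(double[] x, double[] h,
                            double[][] Wz, double[][] Uz,
                            double[][] Wr, double[][] Ur,
                            double[][] Wh, double[][] Uh) {
        int n = h.length;
        double[] wzx = matVec(Wz, x), uzh = matVec(Uz, h);
        double[] wrx = matVec(Wr, x), urh = matVec(Ur, h);
        double[] z = new double[n], r = new double[n], rh = new double[n];
        for (int i = 0; i < n; i++) {
            z[i] = sigmoid(wzx[i] + uzh[i]); // update gate: how much of the candidate to use
            r[i] = sigmoid(wrx[i] + urh[i]); // reset gate: how much of the old state to expose
            rh[i] = r[i] * h[i];
        }
        double[] whx = matVec(Wh, x), uhrh = matVec(Uh, rh);
        double[] hNew = new double[n];
        for (int i = 0; i < n; i++) {
            double hTilde = Math.tanh(whx[i] + uhrh[i]); // candidate state
            hNew[i] = (1 - z[i]) * h[i] + z[i] * hTilde; // blend old state and candidate
        }
        return hNew;
    }

    static double[][] randMat(int rows, int cols, Random rng) {
        double[][] m = new double[rows][cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++)
                m[i][j] = rng.nextGaussian() * 0.1;
        return m;
    }

    public static void main(String[] args) {
        Random rng = new Random(0);
        int inputSize = 3, hiddenSize = 4;
        double[] x = {0.5, -0.1, 0.3};
        double[] h = new double[hiddenSize]; // initial hidden state of zeros
        double[] hNext = gruStep(x, h,
                randMat(hiddenSize, inputSize, rng), randMat(hiddenSize, hiddenSize, rng),
                randMat(hiddenSize, inputSize, rng), randMat(hiddenSize, hiddenSize, rng),
                randMat(hiddenSize, inputSize, rng), randMat(hiddenSize, hiddenSize, rng));
        System.out.println(Arrays.toString(hNext));
    }
}
```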

Continue reading “Java Machine Learning Library: Conv Nets, GRU Recurrent Nets, and Text Generation”