After improving and updating my neural network library, I think I understand the popular backpropagation algorithm even better. I also discovered that LaTeX is usable on WordPress, so I want to present my thoughts on gradient descent and backpropagation using cool math equations and nice intuitions. Of course, this post will be math intensive, since it involves some calculus concepts.
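As a quick preview (standard textbook notation, not an excerpt from the full post): gradient descent nudges each weight w against the gradient of the loss L, with learning rate \eta,

w \;\leftarrow\; w - \eta\,\frac{\partial L}{\partial w},

and backpropagation is the chain-rule bookkeeping that computes \partial L / \partial w for every weight in the network.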
For the past few weeks (in addition to the school grind), I’ve been improving my Java Machine Learning library. After learning from online blog posts and papers, I’ve completed an overhaul of the existing code and added support for convolutional layers and GRU recurrent layers.
Recently, I’ve been working on some reinforcement learning. I learned about DDDQNs (Double Dueling Deep Q-Networks), which are neural networks that learn how to play games. Continue reading “Double Dueling DQN with Prioritized Experience Replay”
This uses my Java neural network library, which can be found here. The trained weights are also in the GitHub repository. Continue reading “Handwriting Recognition With MNIST Data in Java”
My Java machine learning library is now on GitHub. It contains a basic neural network that can be trained using backpropagation and gradient descent (Adam, Adagrad, or SGD). Continue reading “My Java Machine Learning Library and Other Source Codes are Now on GitHub!”
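To give a flavor of what “trained using backpropagation and gradient descent” means, here is a minimal sketch of a vanilla SGD update step in Java. The class and method names are illustrative only, not the library’s actual API:

// Minimal illustrative sketch of a vanilla SGD update step.
// NOT the library's actual API; the names here are made up.
public class SgdSketch {
    // Moves each weight a small step against its gradient.
    // Backpropagation is what computes the gradients array.
    static void step(double[] weights, double[] gradients, double learningRate) {
        for (int i = 0; i < weights.length; i++) {
            weights[i] -= learningRate * gradients[i];
        }
    }
}

Adam and Adagrad replace the fixed learning rate with per-weight step sizes adapted from past gradients, but the core update is the same idea.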
That’s my new project.
I’m starting to learn some Machine Learning (I just learned Python). My first project is a simple machine learning addition program. Continue reading “Addition! (Python + Machine Learning)”
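For a sense of what an “addition program” can look like (the original project was in Python; this is just an illustrative Java sketch of the same idea, not the actual code): a linear model pred = w1*a + w2*b trained by gradient descent on random sums should drive both weights toward 1.

// Illustrative sketch: learn addition with gradient descent.
// Model: pred = w1*a + w2*b; training drives w1 and w2 toward 1.
public class LearnAddition {
    public static void main(String[] args) {
        double w1 = 0.0, w2 = 0.0; // start knowing nothing about addition
        double lr = 0.1;           // learning rate
        java.util.Random rng = new java.util.Random(42);
        for (int step = 0; step < 10000; step++) {
            double a = rng.nextDouble(), b = rng.nextDouble();
            double err = (w1 * a + w2 * b) - (a + b); // prediction minus true sum
            // Gradient of the squared error 0.5*err^2 for each weight.
            w1 -= lr * err * a;
            w2 -= lr * err * b;
        }
        // Because the model is linear, weights near 1 mean it can add
        // any two numbers, not just the ones it was trained on.
        System.out.printf("w1 = %.3f, w2 = %.3f%n", w1, w2);
    }
}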