My Java machine learning library is now on GitHub. It contains a basic neural network that can be trained using backpropagation and gradient descent (Adam, Adagrad, or SGD). Regularization techniques such as dropout and L2 regularization are also implemented. Classification is possible with softmax activation on the output layer and cross-entropy loss. More features, such as LSTM recurrent neural networks and convolutional neural networks, are planned, but there is no guarantee that they will be implemented.
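To give a feel for the classification setup, here is a minimal sketch of softmax over the output layer's raw scores followed by cross-entropy loss. This is illustrative Java, not the library's actual API; the class and method names are made up for the example.

```java
// Hypothetical sketch (not the library's real API): softmax turns raw
// output-layer scores into probabilities, and cross-entropy measures how
// well those probabilities match the true class.
public class SoftmaxDemo {
    // Numerically stable softmax: subtract the max logit before exponentiating.
    static double[] softmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) max = Math.max(max, v);
        double sum = 0.0;
        double[] out = new double[logits.length];
        for (int i = 0; i < logits.length; i++) {
            out[i] = Math.exp(logits[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    // Cross-entropy loss for one example whose true class index is `label`.
    static double crossEntropy(double[] probs, int label) {
        return -Math.log(probs[label]);
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.1};
        double[] probs = softmax(logits);
        System.out.printf("p = [%.3f, %.3f, %.3f], loss = %.3f%n",
                probs[0], probs[1], probs[2], crossEntropy(probs, 0));
    }
}
```

A nice property of this pairing is that the gradient of the loss with respect to the logits simplifies to `probs[i] - target[i]`, which makes the output-layer part of backpropagation very cheap.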
To be honest, implementing backpropagation and the other features was a real pain, since I did not know ANY calculus before I started creating the library. At least I knew some matrix multiplication, so I could follow the tutorials online. The process was very interesting and fun, even though debugging code when I didn’t understand most of what I had written was hard. After working through many tutorials, I learned enough calculus to implement a simple machine learning library in, like, one to two weeks (actually, I was on vacation for half of that time, so I could only code at night)! I want to implement LSTMs and CNNs, but they are much harder to implement than the vanilla, “regular” neural network.
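The core idea that took me a while to get is small enough to show on a toy model. Below is a hedged sketch (not code from the library) of backpropagation via the chain rule on the smallest possible network: a single weight with squared-error loss, trained with plain SGD. With loss L = (w·x − y)², the chain rule gives dL/dw = 2·(w·x − y)·x.

```java
// Illustrative sketch, not the library's implementation: one-weight linear
// model fit to y = 2x with squared-error loss and plain SGD.
public class TinyBackprop {
    static double trainWeight(double[] xs, double[] ys, double lr, int epochs) {
        double w = 0.0;                                   // start from zero
        for (int e = 0; e < epochs; e++) {
            for (int i = 0; i < xs.length; i++) {
                double pred = w * xs[i];                  // forward pass
                double grad = 2 * (pred - ys[i]) * xs[i]; // backward pass (chain rule)
                w -= lr * grad;                           // SGD update
            }
        }
        return w;
    }

    public static void main(String[] args) {
        double[] xs = {1, 2, 3};
        double[] ys = {2, 4, 6};                          // target function: y = 2x
        double w = trainWeight(xs, ys, 0.05, 200);
        System.out.printf("learned w = %.4f (target 2.0)%n", w);
    }
}
```

A real network is this same pattern repeated layer by layer, with matrices instead of scalars, which is where the matrix multiplication I already knew came in handy.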
You can find the source code here.
One of my many game projects is also on GitHub. It is unfinished, and I probably won’t ever finish it, but it is an example of using low-level OpenGL commands through LWJGL to render graphics.
The source code is available here.