Dynamic Programming Tricks

Yay, another entry in my series on competitive programming! This time I am going to write about a few tricks I have picked up from solving dynamic programming problems. The main goal of this post is to introduce the idea behind dynamic programming and summarize a few tricks that I find useful. Hopefully, the post does not turn out too long.

Continue reading “Dynamic Programming Tricks”


The Math For Gradient Descent and Backpropagation

After improving and updating my neural network library, I think I understand the popular backpropagation algorithm even better. I also discovered that LaTeX works on WordPress, so I want to present my thoughts on gradient descent and backpropagation using cool math equations and nice intuitions. Of course, this post will be math intensive, since it involves some calculus concepts.

Continue reading “The Math For Gradient Descent and Backpropagation”

The Venice Technique and Caching Snapshots of States

I have not written a blog post about algorithms for competitive programming for a while, because I have been doing tons of artificial neural network stuff! However, I am resuming the series on some interesting algorithms. This time, I am going to discuss the Venice Technique, which I read about here. It is an interesting data structure trick that accompanies a hash map or a sorted map.

Continue reading “The Venice Technique and Caching Snapshots of States”

All About Ranges/Intervals (Part 2)

If you haven’t read part 1, please do so, or else some things in this blog post will not make sense! As always, if you do not understand something, just search it up. There are many sources online with in-depth explanations much better than mine. Also, my code is available here on GitHub.

Let’s get started with some cool data structures and algorithms!

Continue reading “All About Ranges/Intervals (Part 2)”