3 January 2019

So what did I do last year? I understood backprop.

In a nutshell, 2018 was probably not the best year. Or was it?

I had been excited to pursue a PhD in Artificial Intelligence. It was like a dream come true. Except that, as it turns out, things tend to be harder than expected, and a lot of people have a lot of expectations.
Given my technical background as a software engineer, I was able to toy with various object detectors such as Faster R-CNN and YOLO, and quickly came up with an idea, a contribution. While I think it is taking a bit longer than it should, probably because I do not have a high-end deep learning computer, the approach seems promising. Meanwhile, I have been working through the back-propagation algorithm, technically speaking: computation graphs, the chain rule, and the derivatives of basic mathematical functions and activation functions. I implemented all of it in C for my CGraph project.
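
To make that concrete, here is a minimal sketch of the chain-rule bookkeeping involved, in C since that is the language CGraph is written in. It assumes a single neuron y = sigmoid(w*x + b) with a squared-error loss; the names and numbers are mine for illustration, not code from CGraph itself.

```c
/* Minimal sketch: backprop through y = sigmoid(w*x + b) with
 * loss L = (y - t)^2. Illustrative only, not CGraph source. */
#include <stdio.h>
#include <math.h>

static double sigmoid(double z)      { return 1.0 / (1.0 + exp(-z)); }
static double sigmoid_grad(double y) { return y * (1.0 - y); } /* expects y = sigmoid(z) */

int main(void)
{
    double x = 1.5, t = 0.0;   /* input and target */
    double w = 0.8, b = -0.1;  /* parameters */

    /* Forward pass: each intermediate is a node in the computation graph. */
    double z = w * x + b;
    double y = sigmoid(z);
    double L = (y - t) * (y - t);

    /* Backward pass: apply the chain rule node by node. */
    double dL_dy = 2.0 * (y - t);
    double dL_dz = dL_dy * sigmoid_grad(y);
    double dL_dw = dL_dz * x;  /* dz/dw = x */
    double dL_db = dL_dz;      /* dz/db = 1 */

    printf("L = %f, dL/dw = %f, dL/db = %f\n", L, dL_dw, dL_db);
    return 0;
}
```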

Boy, what a great journey. And it is still going. I had never realized how wonderful deep learning is until I coded everything from scratch. There will be a future post about CGraph, and a dedicated website (probably under this domain), but in the meantime I want to share a few references that I had to read to understand computational graphs and automatic differentiation (a toy example follows the list):
http://outlace.com/on-chain-rule-computational-graphs-and-backpropagation.html
http://cs231n.stanford.edu/vecDerivs.pdf
http://cs231n.stanford.edu/slides/2017/cs231n_2017_lecture4.pdf
https://explained.ai/matrix-calculus/index.html
https://medium.com/@14prakash/back-propagation-is-very-simple-who-made-it-complicated-97b794c97e5c
http://www.deepideas.net/deep-learning-from-scratch-theory-and-implementation/
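
And here is a taste of what those references build up to: a toy reverse-mode automatic differentiation sketch in C. The node layout and function names are my own illustration, not CGraph's actual API; it differentiates f(a, b) = (a + b) * b by walking the graph backwards with the chain rule.

```c
/* Toy reverse-mode autodiff over an explicit computation graph.
 * Illustrative only; not CGraph's actual data structures. */
#include <stdio.h>

typedef enum { OP_LEAF, OP_ADD, OP_MUL } Op;

typedef struct Node {
    Op op;
    double value;           /* filled in by the forward pass */
    double grad;            /* accumulated by the backward pass */
    struct Node *lhs, *rhs;
} Node;

static double forward(Node *n)
{
    switch (n->op) {
    case OP_LEAF: break;
    case OP_ADD:  n->value = forward(n->lhs) + forward(n->rhs); break;
    case OP_MUL:  n->value = forward(n->lhs) * forward(n->rhs); break;
    }
    return n->value;
}

/* Propagate g = d(output)/d(n) down the graph via the chain rule. */
static void backward(Node *n, double g)
{
    n->grad += g;
    switch (n->op) {
    case OP_LEAF: break;
    case OP_ADD:  backward(n->lhs, g); backward(n->rhs, g); break;
    case OP_MUL:  backward(n->lhs, g * n->rhs->value);
                  backward(n->rhs, g * n->lhs->value); break;
    }
}

int main(void)
{
    /* f(a, b) = (a + b) * b, so df/da = b and df/db = a + 2b. */
    Node a   = { OP_LEAF, 3.0, 0.0, NULL, NULL };
    Node b   = { OP_LEAF, 4.0, 0.0, NULL, NULL };
    Node sum = { OP_ADD,  0.0, 0.0, &a, &b };
    Node f   = { OP_MUL,  0.0, 0.0, &sum, &b };

    printf("f = %f\n", forward(&f));                    /* 28 */
    backward(&f, 1.0);
    printf("df/da = %f, df/db = %f\n", a.grad, b.grad); /* 4 and 11 */
    return 0;
}
```

Note that grad is accumulated with +=: the node b feeds the graph twice (once through the sum and once directly into the product), so its gradient collects contributions from both paths, which is exactly what the chain rule prescribes for shared nodes.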
