
Weekly report 2

irenenikk edited this page Mar 23, 2018 · 3 revisions


Hours used: 8

I started by implementing my own naive matrix multiplication. In hindsight that wasn't the best idea, since the instructions encouraged starting with the frame of the algorithm and replacing library calls with my own implementations later on. I'm going to have to extend the class to cover matrix addition and the "normal" product (meaning not the dot product). I decided to leave optimizing the matrix multiplication for later.
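As a sketch, the kind of matrix class described above might look something like this. The class and method names are my own illustration, not the project's actual API, and I'm assuming the "normal" product means element-wise multiplication:

```python
class Matrix:
    """Naive matrix on top of a list of row lists (hypothetical sketch)."""

    def __init__(self, rows):
        self.rows = rows  # row-major list of lists

    def multiply(self, other):
        # Naive O(n^3) matrix product; zip(*other.rows) iterates columns.
        return Matrix([[sum(a * b for a, b in zip(row, col))
                        for col in zip(*other.rows)]
                       for row in self.rows])

    def add(self, other):
        # Element-wise addition, assuming equal shapes.
        return Matrix([[a + b for a, b in zip(r1, r2)]
                       for r1, r2 in zip(self.rows, other.rows)])

    def elementwise(self, other):
        # The "normal" (element-wise) product, as opposed to the dot product.
        return Matrix([[a * b for a, b in zip(r1, r2)]
                       for r1, r2 in zip(self.rows, other.rows)])
```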

I spent my time studying how matrices are used in gradient descent, using this and this tutorial, for example. I struggled to define what the API of the network should look like, and got stuck on that for a surprisingly long time. For example, I'm not sure whether I should use something like PyTorch's Variable to store information in the feed forward pass. I also fought with Python's modules, and wasn't able to add test coverage reporting to Codecov.
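To make the matrix form of gradient descent from those tutorials concrete, here is a hedged sketch of one update step for a single linear layer with mean squared error. All names, shapes, and the learning rate are illustrative, not taken from the project:

```python
import numpy as np

def gradient_descent_step(W, X, y, lr=0.1):
    """One gradient descent update of weights W (features x 1)
    on a batch X (samples x features) with targets y (samples x 1)."""
    y_hat = X @ W                 # forward pass: predictions
    error = y_hat - y             # residuals per sample
    grad = X.T @ error / len(X)   # gradient of mean squared error / 2
    return W - lr * grad          # step against the gradient

# Toy usage: identity inputs, so the weights should converge to y.
X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([[2.0], [3.0]])
W = np.zeros((2, 1))
for _ in range(200):
    W = gradient_descent_step(W, X, y)
```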

I took some notes on the functionality of neural networks in general and on gradient descent, in order to sort out my head when it comes to feed forward neural networks.
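For reference, the feed forward pass itself reduces to a few lines in matrix form. This is only a sketch; the sigmoid activation and the way layers are stored here are assumptions, not decisions from the project:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feed_forward(x, weights, biases):
    """Propagate input x through each (W, b) layer with a sigmoid activation."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)  # affine transform, then nonlinearity
    return a
```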

The biggest problem at the moment seems to be backpropagation: how to store the loss for backpropagation, and how exactly the derivative with respect to each weight is computed. I'm hoping to have that finished by next week.
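One common answer to both questions, sketched here under assumptions (sigmoid activations, squared error loss, no biases, and entirely hypothetical names): cache every layer's activations during the forward pass, then walk the layers backwards with the chain rule to get the derivative with respect to each weight matrix:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_with_cache(x, weights):
    """Feed forward, storing the input to every layer for later use."""
    activations = [x]
    for W in weights:
        activations.append(sigmoid(W @ activations[-1]))
    return activations

def backprop(activations, weights, target):
    """Gradients of 0.5 * ||output - target||^2 w.r.t. each weight matrix."""
    a = activations[-1]
    # delta = dLoss/d(pre-activation); sigmoid'(z) = a * (1 - a)
    delta = (a - target) * a * (1 - a)
    grads = []
    for W, a_prev in zip(reversed(weights), reversed(activations[:-1])):
        grads.append(np.outer(delta, a_prev))          # dLoss/dW for this layer
        delta = (W.T @ delta) * a_prev * (1 - a_prev)  # propagate error back
    return list(reversed(grads))
```

A cheap way to gain confidence in a backprop implementation like this is to compare its output against a finite-difference estimate of the gradient on a tiny network.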
