Implementations of gradient-descent feed-forward and recurrent neural networks in different languages, using only a vector / linear-algebra library.
Artificial neural networks are relatively easy once you really understand them!
- feed-forward iris
- recurrent generator
- recurrent forecasting
- Go to any language folder.
- Run `install.sh`.
- Run the program.
- Feed-forward Neural Network to predict the Iris dataset.
- 3 layers, including the input and output layers
- the first 2 layers are squashed through the sigmoid function
- the last layer is squashed through the softmax function
- the loss function is cross-entropy
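The setup above can be sketched as follows. This is a minimal illustration assuming numpy as the linear-algebra library and synthetic 4-feature / 3-class data standing in for Iris; the hidden size, learning rate, and iteration count are illustrative, not the repo's actual values.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))  # subtract max for stability
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))            # 30 samples, 4 features (like Iris)
y = rng.integers(0, 3, size=30)         # 3 classes
Y = np.eye(3)[y]                        # one-hot targets

W1 = rng.normal(scale=0.1, size=(4, 8))  # hidden size 8 is illustrative
W2 = rng.normal(scale=0.1, size=(8, 3))
lr = 0.1

for _ in range(200):
    # forward: sigmoid hidden layer, softmax output layer
    h = sigmoid(X @ W1)
    p = softmax(h @ W2)
    # cross-entropy loss; combined with softmax it has a simple gradient
    loss = -np.mean(np.sum(Y * np.log(p + 1e-12), axis=1))
    d_out = (p - Y) / len(X)            # softmax + cross-entropy derivative
    dW2 = h.T @ d_out
    d_h = (d_out @ W2.T) * h * (1 - h)  # sigmoid derivative
    dW1 = X.T @ d_h
    W1 -= lr * dW1
    W2 -= lr * dW2

print(round(float(loss), 4))
```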
- Vanilla Recurrent Neural Network to generate text.
- 1 hidden layer
- tanh as the activation function
- combined softmax and cross-entropy derivative
- sequence length = 15
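A minimal sketch of this character-level vanilla RNN, assuming numpy; the corpus, hidden size, and learning rate are made up for illustration, while the tanh activation, combined softmax / cross-entropy derivative, and sequence length of 15 come from the description above.

```python
import numpy as np

text = "hello world, hello recurrent network"
chars = sorted(set(text))
idx = {c: i for i, c in enumerate(chars)}
V, H, T = len(chars), 16, 15            # vocab, hidden size, sequence length

rng = np.random.default_rng(1)
Wxh = rng.normal(scale=0.1, size=(V, H))
Whh = rng.normal(scale=0.1, size=(H, H))
Why = rng.normal(scale=0.1, size=(H, V))
lr = 0.1

def step(inputs, targets, h):
    xs, hs, ps = {}, {-1: h}, {}
    loss = 0.0
    for t in range(len(inputs)):
        xs[t] = np.zeros(V); xs[t][inputs[t]] = 1        # one-hot input
        hs[t] = np.tanh(xs[t] @ Wxh + hs[t-1] @ Whh)     # tanh activation
        y = hs[t] @ Why
        e = np.exp(y - y.max()); ps[t] = e / e.sum()     # softmax
        loss -= np.log(ps[t][targets[t]] + 1e-12)        # cross-entropy
    # backward through time; softmax + cross-entropy derivative is p - onehot
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dh_next = np.zeros(H)
    for t in reversed(range(len(inputs))):
        dy = ps[t].copy(); dy[targets[t]] -= 1
        dWhy += np.outer(hs[t], dy)
        dh = dy @ Why.T + dh_next
        draw = (1 - hs[t] ** 2) * dh                     # tanh derivative
        dWxh += np.outer(xs[t], draw)
        dWhh += np.outer(hs[t-1], draw)
        dh_next = draw @ Whh
    return loss, dWxh, dWhh, dWhy, hs[len(inputs) - 1]

data = [idx[c] for c in text]
for epoch in range(20):
    h = np.zeros(H)
    for p0 in range(0, len(data) - T, T):
        inputs, targets = data[p0:p0+T], data[p0+1:p0+T+1]
        loss, dWxh, dWhh, dWhy, h = step(inputs, targets, h)
        for W, dW in ((Wxh, dWxh), (Whh, dWhh), (Why, dWhy)):
            W -= lr * np.clip(dW, -5, 5)                 # clip exploding grads
print(round(float(loss), 3))
```

Gradient clipping is added here because vanilla RNN gradients can explode over a 15-step sequence; the actual implementations may handle this differently.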
- Vanilla Recurrent Neural Network to predict the TESLA stock market.
- 1 hidden layer
- tanh as the activation function
- mean squared error for the derivative
- sequence length = 5
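The forecasting variant can be sketched the same way, again assuming numpy and using a synthetic sine series standing in for the TESLA prices; only the tanh activation, mean-squared-error loss, and sequence length of 5 come from the description above.

```python
import numpy as np

rng = np.random.default_rng(2)
series = np.sin(np.linspace(0, 6 * np.pi, 120))  # stand-in for price data
H, T, lr = 8, 5, 0.05                            # hidden size, seq length, lr

Wxh = rng.normal(scale=0.1, size=(1, H))
Whh = rng.normal(scale=0.1, size=(H, H))
Why = rng.normal(scale=0.1, size=(H, 1))

for epoch in range(50):
    total = 0.0
    for p0 in range(0, len(series) - T - 1, T):
        xs = series[p0:p0+T]
        ys = series[p0+1:p0+T+1]                 # target: the next value
        hs, preds = {-1: np.zeros(H)}, {}
        for t in range(T):
            hs[t] = np.tanh(xs[t] * Wxh[0] + hs[t-1] @ Whh)  # tanh activation
            preds[t] = (hs[t] @ Why)[0]
        # mean squared error; its derivative drives backprop through time
        total += np.mean([(preds[t] - ys[t]) ** 2 for t in range(T)])
        dWxh, dWhh, dWhy = map(np.zeros_like, (Wxh, Whh, Why))
        dh_next = np.zeros(H)
        for t in reversed(range(T)):
            dy = 2 * (preds[t] - ys[t]) / T      # MSE derivative
            dWhy[:, 0] += dy * hs[t]
            dh = dy * Why[:, 0] + dh_next
            draw = (1 - hs[t] ** 2) * dh         # tanh derivative
            dWxh[0] += xs[t] * draw
            dWhh += np.outer(hs[t-1], draw)
            dh_next = draw @ Whh
        for W, dW in ((Wxh, dWxh), (Whh, dWhh), (Why, dWhy)):
            W -= lr * dW

print(round(float(total), 4))
```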
All implementations such as max(), mean(), softmax(), cross_entropy(), and sigmoid() are hand-coded; no other libraries are used.
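As a sketch, such helpers might be hand-coded in plain Python roughly like this; the exact signatures in each language folder may differ.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    m = max(xs)                                  # subtract max for stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(probs, label):
    # negative log-likelihood of the true class index
    return -math.log(probs[label] + 1e-12)

print(softmax([1.0, 2.0, 3.0]))
```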
Will be updated over time.
You may not see high accuracy in languages that do not natively use float64. During backpropagation the weight updates are very small, and float32 rounds them away.
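This rounding effect is easy to demonstrate with numpy: a tiny update that survives in float64 disappears entirely in float32.

```python
import numpy as np

update = 1e-9                          # a typical tiny backprop change
w64 = np.float64(1.0) + np.float64(update)
w32 = np.float32(1.0) + np.float32(update)

print(w64 > 1.0)                       # True: float64 keeps the update
print(w32 == np.float32(1.0))          # True: float32 rounds it away
```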
- Husein Zolkepli - Initial work - huseinzol05