
Commit

Added README.md and other fixes.
izzat5233 committed Nov 27, 2023
1 parent 09f8560 commit 4c7c3fa
Showing 5 changed files with 87 additions and 15 deletions.
63 changes: 63 additions & 0 deletions README.md
@@ -0,0 +1,63 @@
# Fruit Classifier Neural Network in WASM (FCNN)

The FCNN is a lightweight, flexible neural network library written in C++, designed for the specific task of classifying
different types of fruits. This library forms the core part of the larger Fruit Classifier project.

## Features

- **Customizable Neural Network**: Create neural networks with varying structures, including different numbers of layers,
neurons per layer, and activation functions.
- **Multiple Activation Functions**: Includes several built-in activation functions, such as step, sign, linear, ReLU, sigmoid,
and tanh, each with its corresponding derivative for backpropagation.
- **Efficient Computation**: Optimized for performance; compiles to WebAssembly so the C++ core runs at near-native speed.
- **Gradient Descent Training**: Trains the network with the gradient descent algorithm.

## Components

- **```Neuron```**: Represents a single neuron, encapsulating its weights and bias.
Provides functionality for computing the neuron's output.

- **Layer Types**:
- **```Layer```**: Base class for layers in the neural network.
- **```HiddenLayer```**: Represents a hidden layer in the network.
- **```OutputLayer```**: Special layer type using Softmax activation.

- **```Network```**: Represents the entire neural network as a collection of layers.
Implements forward and backward propagation methods for network training.

- **Activation Functions**: Defined in the ```act``` namespace with built-in functions for use in network layers.
Includes a special softmax function for output layers.

- **Factory Functions**:
Located in the ```make``` namespace, these functions allow for the creation of Neurons, Layers, and Networks with
specific configurations.

## Usage

- To create a neural network:
- Define Network Structure: Determine the number of layers and neurons in each layer.
- Select Activation Functions: Choose activation functions for each hidden layer.
- Create Network: Use the ```make::network``` function to create a network instance.
  - Train the Network: Provide training data and use the ```Network::train``` method to train the network.

- Example

```c++
#include "nn/network.h"

int main() {
    // Create a neural network with specific dimensions and activation functions.
    // The following network has 3 inputs, 2 hidden layers with [6, 4] neurons respectively,
    // and an output layer with 2 neurons, using an initial learning rate of 0.01.
    // The first hidden layer uses the ReLU activation function, the second one uses sigmoid.
    auto network = nn::make::network({3, 6, 4, 2}, {nn::act::relu, nn::act::sigmoid}, 0.01);

    // Train the network with your data...
    // For input {1, 2, 3} we expect an output {1, 0}
    network.train({1, 2, 3}, {1, 0});

    // Now use it to predict outputs...
    auto output = network.predict({2, 3, 1});

    return 0;
}
```
23 changes: 15 additions & 8 deletions main.cpp
@@ -1,12 +1,19 @@
```diff
 #include "nn/network.h"
 
-#include <iostream>
-#include <vector>
-
-using std::cout, std::cin, std::vector;
-
 int main() {
-    auto network = nn::make::network({2, 3, 2}, nn::act::tanh, 1);
-    vector<double> input = {1, 2}, output = {1, 0};
-    network.train(input, output);
-}
+    // Create a neural network with specific dimensions and activation functions
+    // The following network has 3 inputs and 2 hidden layers with [6, 4] neurons respectively,
+    // and output layer with 2 neurons. With initial learning rate 0.01.
+    // First hidden layer uses ReLU activation function, second one uses Sigmoid.
+    nn::Network network = nn::make::network({3, 6, 4, 2}, {nn::act::relu, nn::act::sigmoid}, 0.01);
+
+    // Train the network with your data...
+    // For input {1, 2, 3} we expect an output {1, 0}
+    network.train({1, 2, 3}, {1, 0});
+
+    // Now use it to predict outputs...
+    // vd_t is a vector of double values.
+    nn::vd_t output = network.predict({2, 3, 1});
+
+    return 0;
+}
```
5 changes: 3 additions & 2 deletions nn/nn.h
@@ -117,14 +117,15 @@ namespace nn {
```diff
  * Creates a Layer with the specified options
  * which can be used to create a HiddenLayer or an OutputLayer.
  * Neurons weights and bias are given random values in the range:
- * [-numNeurons / 2.4, +numNeurons / 2.4].
+ * [-numNeurons / rangeFactor, +numNeurons / rangeFactor].
  * Use this function only if specific network functionality is needed.
  *
  * @param numInputs Number of inputs for the neurons.
  * @param numNeurons Number of neurons for the layer.
+ * @param rangeFactor Random values range factor. Default is 2.4.
  * @return A Layer object configured as per the provided options.
  */
-Layer layer(const ui_t &numInputs, const ui_t &numNeurons);
+Layer layer(const ui_t &numInputs, const ui_t &numNeurons, double rangeFactor = 2.4);
 
 /**
  * Creates a Network with the specified options.
```
7 changes: 4 additions & 3 deletions src/make.cpp
@@ -24,9 +24,10 @@ Neuron make::neuron(const ui_t &numInputs, double lowBound, double highBound) {
```diff
     return Neuron(weights, dist(gen));
 }
 
-Layer make::layer(const ui_t &numInputs, const ui_t &numNeurons) {
-    auto lowBound = -numNeurons / 2.4;
-    auto highBound = numNeurons / 2.4;
+Layer make::layer(const ui_t &numInputs, const ui_t &numNeurons, double rangeFactor) {
+    ASSERT(rangeFactor > 0)
+    auto lowBound = -numNeurons / rangeFactor;
+    auto highBound = numNeurons / rangeFactor;
 
     vn_t neurons;
     neurons.reserve(numNeurons);
```
4 changes: 2 additions & 2 deletions src/network.cpp
@@ -15,7 +15,7 @@ Network::Network(vl_t layers, OutputLayer outputLayer, double alpha)
```diff
     ASSERT(!this->layers.empty())
     ASSERT([this] {
         for (auto i = std::next(this->layers.begin()); i != this->layers.end(); ++i) {
-            if (i[0].size() != std::prev(i)->size()) { return false; }
+            if ((*i)[0].size() != std::prev(i)->size()) { return false; }
         }
         return this->outputLayer[0].size() == this->layers.rbegin()->size();
     }())
```
@@ -41,7 +41,7 @@ void Network::backwardPropagate(const vd_t &output) {
```diff
     if (size <= 2) { return; }
 
     // Rest of layers
-    for (std::size_t j = 1; j <= size - 3; ++j) {
+    for (std::size_t j = 0; j < size - 2; ++j) {
         auto i = size - 3 - j;
         e_cash[i] = layers[i + 1].backPropagate(e_cash[i + 1], layers[i]);
     }
```
