Commit

Updated README.md
izzat5233 committed Nov 27, 2023
1 parent 4c7c3fa commit 0037d10
Showing 1 changed file with 22 additions and 16 deletions.

## Features:

- *Customizable Neural Network*: Create neural networks with varying structures, including different numbers of layers,
  neurons per layer, and activation functions.
- *Multiple Activation Functions*: Includes several built-in activation functions like step, sign, linear, ReLU, sigmoid,
  and tanh, each with its corresponding derivative for backpropagation.
- *Efficient Computation*: Optimized for performance; compiles C++ to WebAssembly for near-native speed.
- *Gradient Descent Training*: Uses the gradient descent algorithm for network training.

## Components

- **[```nn```](nn/nn.h)**: Main namespace containing all components.
  Check [this](nn/nn.h) header for documentation.

- **[```Neuron```](nn/neuron.h)**: Represents a single neuron, encapsulating its weights and bias.
  Provides functionality for computing the neuron's output.

- **Layer Types**:
    - **[```Layer```](nn/layer.h)**: Base class for layers in the neural network.
    - **[```HiddenLayer```](nn/hidden_layer.h)**: Represents a hidden layer in the network.
    - **[```OutputLayer```](nn/output_layer.h)**: Special layer type that uses Softmax activation.

- **[```Network```](nn/network.h)**: Represents the entire neural network as a collection of layers.
  Implements forward and backward propagation methods for network training.

- **Activation Functions**: Defined in the ```act``` namespace with built-in functions for use in network layers.
- To create a neural network:
- Define Network Structure: Determine the number of layers and neurons in each layer.
- Select Activation Functions: Choose activation functions for each hidden layer.
- Create Network: Use the ```nn::make::network``` function to create a network instance.
- Train the Network: Provide training data and use the ```nn::Network::train``` method to train the network.

- Example


```cpp
#include "nn/nn.h"

int main() {
    // Create a neural network with specific dimensions and activation functions.
    // The following network has 3 inputs, 2 hidden layers with [6, 4] neurons respectively,
    // and an output layer with 2 neurons, with an initial learning rate of 0.01.
    // The first hidden layer uses the ReLU activation function; the second uses sigmoid.
    nn::Network network = nn::make::network({3, 6, 4, 2}, {nn::act::relu, nn::act::sigmoid}, 0.01);

    // Train the network with your data...
    // For input {1, 2, 3} we expect the output {1, 0}.
    network.train({1, 2, 3}, {1, 0});

    // Now use it to predict outputs...
    // nn::vd_t is a vector of double values.
    nn::vd_t output = network.predict({2, 3, 1});

    return 0;
}
```
