A multi-layer perceptron (A.K.A neural network 🙇) from scratch in Java ☕.


pc9795/mlp


Import this project as a Maven project into your preferred IDE.

Project Structure

  • experiments - Code for the experiments run: XOR, Sin, and letter recognition
  • experiments.utils - Utility methods used to evaluate the experiments
  • mlp - The multilayer perceptron implementation
  • mlp.activations - Available activation functions: ReLU, Leaky ReLU, Sigmoid, Linear, Tanh, Softmax
  • mlp.exceptions - Custom exceptions for this project
  • mlp.loss_functions - Available loss functions: squared loss, cross-entropy, binary cross-entropy
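As a rough illustration of what an entry in mlp.activations computes, here is a minimal sketch of the Sigmoid activation and its derivative. The class and method names below are assumptions for illustration only; see the mlp.activations package for the actual implementations.

```java
// Hypothetical sketch of a sigmoid activation (names are assumptions,
// not the project's actual API).
public class SigmoidSketch {
    // sigmoid(x) = 1 / (1 + e^-x), squashes any real input into (0, 1)
    public static double activate(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Derivative expressed in terms of the activation's output, as used
    // during backpropagation: s'(x) = s(x) * (1 - s(x))
    public static double derivative(double activated) {
        return activated * (1.0 - activated);
    }

    public static void main(String[] args) {
        double y = activate(0.0);
        System.out.println(y);             // 0.5
        System.out.println(derivative(y)); // 0.25
    }
}
```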

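Similarly, the loss functions in mlp.loss_functions can be sketched as follows. The class and method names here are hypothetical; consult the package for the real signatures.

```java
// Hypothetical sketch of squared loss and binary cross-entropy
// (names are assumptions, not the project's actual API).
public class LossSketch {
    // Mean squared loss: average of (target - predicted)^2,
    // typically used for regression tasks such as the Sin experiment
    public static double squaredLoss(double[] target, double[] predicted) {
        double sum = 0.0;
        for (int i = 0; i < target.length; i++) {
            double diff = target[i] - predicted[i];
            sum += diff * diff;
        }
        return sum / target.length;
    }

    // Binary cross-entropy: -mean of t*log(p) + (1-t)*log(1-p),
    // typically used for binary classification tasks such as XOR
    public static double binaryCrossEntropy(double[] target, double[] predicted) {
        double sum = 0.0;
        for (int i = 0; i < target.length; i++) {
            sum += target[i] * Math.log(predicted[i])
                 + (1.0 - target[i]) * Math.log(1.0 - predicted[i]);
        }
        return -sum / target.length;
    }

    public static void main(String[] args) {
        double[] t = {1.0, 0.0};
        double[] p = {0.9, 0.2};
        // Both losses are small when predictions are close to the targets
        System.out.println(squaredLoss(t, p));
        System.out.println(binaryCrossEntropy(t, p));
    }
}
```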
Sample Training and Testing Example

    int ni = ...
    int nh = ...
    int no = ...
    int randomState = ...
    double learningRate = ...
    int epochs = ...
    ActivationType type = ...
    boolean isClassification = ...
    boolean isMulticlass = ...
    int batchSize = ...
    
    // Create a multilayer perceptron object
    MultilayerPerceptron mlp = new MultilayerPerceptron(ni, nh, no, randomState, learningRate, epochs, type,
        isClassification, isMulticlass, batchSize);

    // Train the MLP on the given inputs and expected outputs
    mlp.fit(input, output);

    // Get the predictions of the MLP
    double[][] predicted = mlp.predict(input);
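To make concrete what fit and predict do internally, here is a self-contained forward-pass sketch for a 2-2-1 network (ni=2, nh=2, no=1) with sigmoid units. The weights below are hand-picked to realise XOR for illustration; the real MultilayerPerceptron initialises its weights randomly and learns them via backpropagation.

```java
// Self-contained forward-pass sketch; class name and fixed weights are
// illustrative assumptions, not part of the project's API.
public class ForwardPassSketch {
    // Hand-picked weights realising XOR (hidden unit 1 ~ OR, hidden unit 2 ~ NAND)
    static final double[][] W1 = {{20, 20}, {-20, -20}};
    static final double[]   B1 = {-10, 30};
    static final double[][] W2 = {{20, 20}};
    static final double[]   B2 = {-30};

    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }

    // One fully connected layer: out[j] = sigmoid(b[j] + sum_i w[j][i] * in[i])
    static double[] layer(double[] in, double[][] w, double[] b) {
        double[] out = new double[w.length];
        for (int j = 0; j < w.length; j++) {
            double sum = b[j];
            for (int i = 0; i < in.length; i++) sum += w[j][i] * in[i];
            out[j] = sigmoid(sum);
        }
        return out;
    }

    // Forward pass: input -> hidden layer -> output layer
    static double xorNet(double[] x) {
        return layer(layer(x, W1, B1), W2, B2)[0];
    }

    public static void main(String[] args) {
        for (double[] x : new double[][]{{0, 0}, {0, 1}, {1, 0}, {1, 1}}) {
            // Outputs are close to 0, 1, 1, 0 respectively
            System.out.printf("%.0f XOR %.0f -> %.3f%n", x[0], x[1], xorNet(x));
        }
    }
}
```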
