
SortingHat - A Neural Network Approach

This Artificial Neural Network was developed in IntelliJ IDEA 2018.1 using Java 9 to recognize, based on its inputs, the specialization fields of ESPM Information Systems (TECH) students.


Inputs

The input layer has 15 neurons and accepts the answers to the following yes/no questions:

  1. Do you find the development courses easy?
  2. Do you find the management courses easy?
  3. Do you find the quantitative courses easy?
  4. Do you find the database courses easy?
  5. Do you find the design courses easy?
  6. Do you like the development courses?
  7. Do you like the management courses?
  8. Do you like the quantitative courses?
  9. Do you like the database courses?
  10. Do you like the design courses?
  11. Are you good at logical reasoning?
  12. Are you interested in solving problems through system modeling or development?
  13. Are you interested in understanding how game mechanics work?
  14. Are you interested in modeling or developing games/animations?
  15. Are you interested in solving problems through the extraction, manipulation, and analysis of information?

Note: you can find the input values in the data folder.
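
For illustration, here is a minimal sketch of how the 15 yes/no answers could be encoded as the network's input vector; the class name, the array layout, and the 1.0/0.0 encoding are assumptions and are not taken from the repository.

```java
// Hypothetical encoding: each yes/no answer becomes one input neuron (1.0 = yes, 0.0 = no).
public final class InputEncoder {

    // answers[i] corresponds to question i+1 of the list above.
    public static double[] encode(boolean[] answers) {
        if (answers.length != 15) {
            throw new IllegalArgumentException("Expected 15 answers, got " + answers.length);
        }
        double[] inputs = new double[15];
        for (int i = 0; i < answers.length; i++) {
            inputs[i] = answers[i] ? 1.0 : 0.0;
        }
        return inputs;
    }
}
```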


Structure

The packages paes.training.c1 and paes.training.c2 contain, respectively, the ANNs with one and two hidden layers. These files contain the propagation and backpropagation (training) algorithms. In paes.test.c1 and paes.test.c2 you can find the classes responsible for measuring the accuracy rate.

Variables

  • int n = 3 - controls the number of neurons in the hidden layer;
  • int minValue = 0 - controls the input matrix;
  • int age = 1 - counts the number of ages (epochs);

Note: one age (epoch) equals 360 iterations.
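
As a rough illustration of how these variables relate, the sketch below shows a training loop in which one age (epoch) corresponds to 360 iterations, one per training sample; the stopping criterion and all names other than n, minValue, and age are assumptions.

```java
// Hypothetical training loop: one "age" (epoch) = 360 iterations, one per training sample.
public final class TrainingLoopSketch {
    public static void main(String[] args) {
        int n = 3;          // number of neurons in the hidden layer
        int minValue = 0;   // lower bound used to walk the input matrix (assumed meaning)
        int age = 1;        // epoch counter
        int samplesPerAge = 360;
        int maxAges = 1000; // stopping criterion chosen arbitrarily for this sketch

        while (age <= maxAges) {
            for (int sample = minValue; sample < samplesPerAge; sample++) {
                // one iteration: propagate the sample and backpropagate the error
            }
            age++;
        }
        System.out.println("Finished " + (age - 1) + " ages with " + n + " hidden neurons.");
    }
}
```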

Methods

Propagation

  • ponderationL1();
  • activationL1();
  • ponderationL2();
  • activationL2();
  • ponderationL3() (only in the c2 classes);
  • activationL3() (only in the c2 classes);
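
Each ponderation/activation pair corresponds to the usual weighted-sum and activation steps of a forward pass. Below is a hedged sketch of such a pair for the first layer, assuming a sigmoid activation and a row-major weight matrix; the signatures and field names are illustrative, not the repository's actual methods.

```java
// Hypothetical forward-pass step: ponderation = weighted sum + bias, activation = sigmoid.
public final class ForwardPassSketch {

    // weights[j][i] connects input i to hidden neuron j; bias[j] is that neuron's bias.
    static double[] ponderationL1(double[] inputs, double[][] weights, double[] bias) {
        double[] sums = new double[weights.length];
        for (int j = 0; j < weights.length; j++) {
            double s = bias[j];
            for (int i = 0; i < inputs.length; i++) {
                s += weights[j][i] * inputs[i];
            }
            sums[j] = s;
        }
        return sums;
    }

    static double[] activationL1(double[] sums) {
        double[] out = new double[sums.length];
        for (int j = 0; j < sums.length; j++) {
            out[j] = 1.0 / (1.0 + Math.exp(-sums[j])); // sigmoid
        }
        return out;
    }
}
```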

Error Calculation

  • errorCalculation();
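
The error is typically a comparison between the expected and the produced output vectors. The sketch below uses the mean squared error as an assumed metric; the metric actually used by the repository is not documented here.

```java
// Hypothetical error metric: mean squared error between expected and actual outputs.
public final class ErrorSketch {
    static double errorCalculation(double[] expected, double[] actual) {
        double sum = 0.0;
        for (int k = 0; k < expected.length; k++) {
            double diff = expected[k] - actual[k];
            sum += diff * diff;
        }
        return sum / expected.length;
    }
}
```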

BackPropagation

  • gradientCalculationL3() (only in the c2 classes);
  • gradientCalculationL2();
  • gradientCalculationL1();
  • weightsUpdateL3() (only in the c2 classes);
  • weightsUpdateL2();
  • weightsUpdateL1();
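
As an illustration of one gradientCalculation/weightsUpdate pair, the sketch below applies the delta rule to the output layer of a sigmoid network; the learning rate, parameter names, and signatures are assumptions, not the repository's actual code.

```java
// Hypothetical backpropagation step for the output layer of a sigmoid network.
public final class BackpropSketch {

    // Delta rule: gradient_k = (expected_k - actual_k) * actual_k * (1 - actual_k).
    static double[] gradientCalculationL2(double[] expected, double[] actual) {
        double[] gradients = new double[actual.length];
        for (int k = 0; k < actual.length; k++) {
            gradients[k] = (expected[k] - actual[k]) * actual[k] * (1.0 - actual[k]);
        }
        return gradients;
    }

    // weights[k][j] connects hidden neuron j to output neuron k.
    static void weightsUpdateL2(double[][] weights, double[] bias,
                                double[] gradients, double[] hiddenOutputs,
                                double learningRate) {
        for (int k = 0; k < weights.length; k++) {
            for (int j = 0; j < hiddenOutputs.length; j++) {
                weights[k][j] += learningRate * gradients[k] * hiddenOutputs[j];
            }
            bias[k] += learningRate * gradients[k];
        }
    }
}
```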

Outputs

The output layer has 3 neurons. Each of the following combinations represents one of the possible outputs that the ANN can print:

  • Business Intelligence - BI: 0 0 1;
  • Mobile Development - DEV: 0 1 0;
  • Games: 1 0 0;
  • Unidentified Spec Field detected!: 0 0 0;
  • Unidentified Spec Field detected!: 1 1 1.
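
For illustration, a minimal sketch that maps the 3-neuron output vector to the labels above, assuming each neuron is thresholded at 0.5 (an assumed cut-off, not taken from the repository).

```java
// Hypothetical decoding of the 3-neuron output vector into a specialization label.
public final class OutputDecoderSketch {

    static String decode(double[] output) {
        // Threshold each output neuron at 0.5 (assumed cut-off).
        int a = output[0] >= 0.5 ? 1 : 0;
        int b = output[1] >= 0.5 ? 1 : 0;
        int c = output[2] >= 0.5 ? 1 : 0;

        if (a == 0 && b == 0 && c == 1) return "Business Intelligence - BI";
        if (a == 0 && b == 1 && c == 0) return "Mobile Development - DEV";
        if (a == 1 && b == 0 && c == 0) return "Games";
        return "Unidentified Spec Field detected!";
    }
}
```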

Results - c1 (one hidden layer)

Sigmoid Function

  • 2 Neurons:

  • 3 Neurons:

  • 8 Neurons:

Hyperbolic Tangent (tanh)

  • 2 Neurons: not evaluated;

  • 3 Neurons: not evaluated;

  • 8 Neurons: not evaluated;

ReLU (Rectified linear unit)

  • 2 Neurons: not evaluated;

  • 3 Neurons: not evaluated;

  • 8 Neurons: not evaluated;

Note:

  • NaN: the network failed to reach a numeric value, probably because the floating-point values overflowed (exploded);
  • Time Exception: the network failed to converge in polynomial time, or at least in a time comparable to the sigmoid or tanh runs.

Results - c2 (two hidden layers)

Sigmoid Function

  • 2 Neurons:

  • 3 Neurons:

  • 8 Neurons:

Hyperbolic Tangent (tanh)

  • 2 Neurons: not evaluated;

  • 3 Neurons: not evaluated;

  • 8 Neurons: not evaluated;

ReLU (Rectified linear unit)

  • 2 Neurons: not evaluated;

  • 3 Neurons: not evaluated;

  • 8 Neurons: not evaluated;

Note:

  • NaN: the network failed to reach a numeric value, probably because the floating-point values overflowed (exploded);
  • Time Exception: the network failed to converge in polynomial time, or at least in a time comparable to the sigmoid or tanh runs.

Reference Links

Activation Functions:

FACURE, Matheus:

HAYKIN, Simon:

  • Neural Networks. 3rd Edition. Artmed, 2008.

Rectifier Nonlinearities Improve Neural Network Acoustic Models:

Sample repository: