cnum

MNIST number recognition using ANSI C.

Credits to @mmlind for inspiring this project and to the creators of the MNIST database. Also, thanks to Apple for maintaining some incredibly nice-to-read BLAS documentation.

Perceptron network

Aside from build-essential, compilation requires a BLAS library. On Debian-based systems, run sudo apt-get install libopenblas-dev to get it.

Afterwards, make will build the cnum binary. Run it with ./cnum. A pre-trained network is already included; answer 'Y' at the prompt to use it.

Accuracy

Using a single-layer perceptron network of 10 neurons, with inputs scaled to the range 0 to 1 inclusive, I obtain an accuracy of 85% after the fourth round of training.
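
For reference, the forward pass of such a network is just one matrix-vector multiply. The sketch below shows roughly how it might look with CBLAS; the names (forward, NUM_INPUTS, NUM_OUTPUTS) and the argmax-style output are illustrative assumptions, not code taken from cnum.

```c
/* Hypothetical sketch of the forward pass for a 10-neuron single-layer
 * perceptron, using CBLAS as the README describes. Names are illustrative,
 * not taken from cnum's sources. Compile with e.g. -lopenblas. */
#include <cblas.h>

#define NUM_INPUTS  784   /* 28x28 pixels, scaled to [0, 1] */
#define NUM_OUTPUTS 10    /* one neuron per digit */

/* weights: NUM_OUTPUTS x NUM_INPUTS, row-major; bias: NUM_OUTPUTS;
 * pixels: NUM_INPUTS; out: NUM_OUTPUTS. Returns the predicted digit. */
int forward(const float *weights, const float *bias,
            const float *pixels, float *out)
{
    int i, best = 0;

    /* out = bias, then out += weights * pixels via a single BLAS call */
    for (i = 0; i < NUM_OUTPUTS; i++) out[i] = bias[i];
    cblas_sgemv(CblasRowMajor, CblasNoTrans,
                NUM_OUTPUTS, NUM_INPUTS,
                1.0f, weights, NUM_INPUTS,
                pixels, 1,
                1.0f, out, 1);

    /* the predicted digit is simply the neuron with the largest output */
    for (i = 1; i < NUM_OUTPUTS; i++)
        if (out[i] > out[best]) best = i;
    return best;
}
```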

Digits the network misclassifies include #11, #24, and #943 (all from the t10k test set).

Speed

When compiled with -O3, training on the 60k set takes around 0.8 seconds on my Intel(R) Xeon(R) CPU E5-2640 v2 @ 2.00GHz; I train the network on the 60k set twice, so the total is under 2 seconds. In debug mode (-O0 -g -pg), training takes around 1 second per pass and about 2 seconds total. Thanks, BLAS :)
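
Much of that BLAS win comes from the weight update, which for a single-layer network can be written as a single rank-1 update. The sketch below is a hypothetical version of that step using cblas_sger; the delta-rule error term, the learning rate, and all names here are assumptions rather than cnum's actual training code.

```c
/* Hypothetical sketch of one training update (a delta-rule-style step)
 * expressed as a rank-1 BLAS update. cnum's real update may differ;
 * LEARNING_RATE and the names here are assumed, not from the repo. */
#include <cblas.h>

#define NUM_INPUTS    784
#define NUM_OUTPUTS   10
#define LEARNING_RATE 0.05f  /* assumed value, not from the repo */

/* weights: NUM_OUTPUTS x NUM_INPUTS row-major; pixels: the input image;
 * out: the activations from the forward pass; label: the correct digit */
void train_step(float *weights, float *bias,
                const float *pixels, const float *out, int label)
{
    float err[NUM_OUTPUTS];
    int i;

    /* error = target - output, where the target is 1 for the correct
     * digit's neuron and 0 for every other neuron */
    for (i = 0; i < NUM_OUTPUTS; i++)
        err[i] = (i == label ? 1.0f : 0.0f) - out[i];

    /* weights += LEARNING_RATE * err * pixels^T as one rank-1 update;
     * this outer product is where BLAS saves most of the time */
    cblas_sger(CblasRowMajor, NUM_OUTPUTS, NUM_INPUTS,
               LEARNING_RATE, err, 1, pixels, 1,
               weights, NUM_INPUTS);

    /* the bias update is small enough to do by hand */
    for (i = 0; i < NUM_OUTPUTS; i++)
        bias[i] += LEARNING_RATE * err[i];
}
```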

Convolutional

Planned; not yet implemented.

Roadmap

  1. Optimize matrix multiplication using BLAS and better memory management (done on June 18, 2018!)
    • Target is < 1s train time; actual train time is ~0.7s when optimized
  2. Integrate with tigr (a rough sketch follows this list)
    • Enable a user to draw their own numbers and have them recognized
  3. Add support for a multi-layer perceptron network
  4. Implement a convolutional network
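
For roadmap item 2, a minimal tigr-based drawing canvas might look something like the sketch below. Nothing like this exists in cnum yet; the 280x280 canvas, the downsampling to a 28x28 input, and the forward() call from the earlier sketch are all assumptions.

```c
/* Hypothetical sketch of the planned tigr integration: a 280x280 canvas
 * the user paints in, downsampled to the 28x28 input the network expects.
 * Link with tigr.c and the forward-pass code from the earlier sketch. */
#include "tigr.h"

#define SCALE 10                   /* 280x280 window -> 28x28 image */

int main(void)
{
    float image[28 * 28] = {0};    /* network input, values in [0, 1] */
    Tigr *win = tigrWindow(28 * SCALE, 28 * SCALE, "draw a digit", TIGR_FIXED);

    while (!tigrClosed(win)) {
        int mx, my, buttons;

        tigrMouse(win, &mx, &my, &buttons);
        if ((buttons & 1) && mx >= 0 && my >= 0 &&
            mx < 28 * SCALE && my < 28 * SCALE) {
            /* paint: mark the 28x28 cell under the cursor and show it */
            image[(my / SCALE) * 28 + (mx / SCALE)] = 1.0f;
            tigrFill(win, (mx / SCALE) * SCALE, (my / SCALE) * SCALE,
                     SCALE, SCALE, tigrRGB(255, 255, 255));
        }
        /* a real version would run the network on image here, e.g. by
         * calling the hypothetical forward(weights, bias, image, out) */
        tigrUpdate(win);
    }

    tigrFree(win);
    return 0;
}
```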
