bwalshe/simple_nnet

Basic implementation of a multi-layer feed-forward ANN using the Eigen library.

Background

If you're looking for a great deep learning framework, then this isn't it. This project started out as a bit of an experiment in using the Eigen linear algebra library and in writing a non-trivial program in C++, a language I don't use much in my day-to-day work.

What does this do?

Once compiled, this program will load MNIST digit data (or any files that use the same format), one-hot encode the labels, and then train a single-hidden-layer ANN on these data using back-propagation.

It'll display the error rate, and then attempt to classify the first 20 entries of the test set.
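For illustration, here is a minimal sketch of how the one-hot encoding step can be written with Eigen. It is not code from this repository; the function name, the float element type, and the one-column-per-example layout are all assumptions.

#include <Eigen/Dense>
#include <cstdint>
#include <vector>

// Turn a vector of digit labels (0-9) into a num_classes x num_examples
// matrix that has a single 1 in each column.
Eigen::MatrixXf one_hot(const std::vector<std::uint8_t> &labels, int num_classes) {
    const int num_examples = static_cast<int>(labels.size());
    Eigen::MatrixXf encoded = Eigen::MatrixXf::Zero(num_classes, num_examples);
    for (int i = 0; i < num_examples; ++i) {
        encoded(labels[i], i) = 1.0f;
    }
    return encoded;
}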

Compiling and Running

Requirements

Currently this works with g++ on Linux and WSL. I've tried getting it to work with Visual Studio 2017 on Windows 10, but zlib and the endianness conversions misbehave there. clang should be fine, but I haven't tried it yet.

Compiling

In the directory where you have cloned the repo, create a sub-directory named build. Inside this sub-directory, run cmake to generate the build files and then build:

$ mkdir build
$ cd build
$ cmake ..
$ make

This should produce an executable named train.

Running

To run this, you will need the MNIST data set. Assuming you have downloaded this into ~/data, run the training process as follows:

./train ~/data/t10k-images-idx3-ubyte.gz ~/data/t10k-labels-idx1-ubyte.gz

Things that could be improved

  • Add a softmax output layer (a rough sketch of one possibility follows this list).
  • Give more control over the learning rate, number of layers, etc.
  • Make it easy to choose between ReLU and sigmoid activation functions.
  • Annealing, adaptive learning rate, all the things that actually make neural nets viable.
  • General refactor to allow the components to be tested.
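As a starting point for the first item, below is a hedged sketch of a numerically stable softmax written against Eigen. It is not taken from this project, just one way the proposed output layer could compute its activations; the function name and the column-per-example convention are assumptions.

#include <Eigen/Dense>

// Column-wise softmax: each column of `logits` holds the raw outputs for one
// example, and each column of the result sums to one.
Eigen::MatrixXf softmax(const Eigen::MatrixXf &logits) {
    // Subtract each column's maximum before exponentiating to avoid overflow.
    Eigen::MatrixXf shifted = logits.rowwise() - logits.colwise().maxCoeff();
    Eigen::MatrixXf exps = shifted.array().exp().matrix();
    // Normalise each column so it forms a probability distribution.
    Eigen::RowVectorXf sums = exps.colwise().sum();
    for (Eigen::Index j = 0; j < exps.cols(); ++j) {
        exps.col(j) /= sums(j);
    }
    return exps;
}

A full implementation would also need the corresponding gradient in the backward pass, which works out to a particularly simple form when softmax is paired with a cross-entropy loss.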
