NNevermind

⚠️ Work is still in progress; this is a temporary version.

🏅 What's NNevermind?

NNevermind is a customizable Neural Network with parallelised grid search and results analysis. Originally developed for a Machine Learning exam, this project achieved the best loss performance among all Type A projects (building a neural network from scratch) and ranked second in the overall combined leaderboard (including Type B projects using pre-built neural networks). This motivated us to turn it into a fast, user-friendly library.

🏃 A quick look

Many options are available for training algorithms, activation functions, and loss functions, allowing full customisation of the neural network behaviour.

🔧 Configuration

The user can specify:

Training algorithm by passing a string to the constructor:

// eta, alpha, and lambda are read from the command line; the last argument is the number of epochs
NN NeuralNetwork("BP", stod(argv[1]), stod(argv[2]), stod(argv[3]), atoi(argv[4]));

Available options:

  • "BP" (Backpropagation)
  • "Adam"
  • "Random"

Loss function during training:

Loss TrainingLoss("MSE", tr_loss_path);

Available options:

  • "MSE" (Mean Squared Error)
  • "MEE" (Mean Euclidean Error)
  • "BCE" (Binary Cross-Entropy)
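These names correspond to the standard loss definitions. As an illustrative sketch of the formulas only (not NNevermind's internal code), with one target/output vector per sample:

```cpp
#include <cmath>
#include <vector>

using Matrix = std::vector<std::vector<double>>;  // one row per sample

// Mean Squared Error: squared error components, averaged over samples
double mse(const Matrix& out, const Matrix& target) {
    double sum = 0.0;
    for (size_t n = 0; n < out.size(); ++n)
        for (size_t k = 0; k < out[n].size(); ++k) {
            double d = out[n][k] - target[n][k];
            sum += d * d;
        }
    return sum / out.size();
}

// Mean Euclidean Error: Euclidean norm of each sample's error vector,
// averaged over samples
double mee(const Matrix& out, const Matrix& target) {
    double sum = 0.0;
    for (size_t n = 0; n < out.size(); ++n) {
        double sq = 0.0;
        for (size_t k = 0; k < out[n].size(); ++k) {
            double d = out[n][k] - target[n][k];
            sq += d * d;
        }
        sum += std::sqrt(sq);
    }
    return sum / out.size();
}

// Binary Cross-Entropy: for targets in {0,1} and outputs in (0,1)
double bce(const Matrix& out, const Matrix& target) {
    double sum = 0.0;
    for (size_t n = 0; n < out.size(); ++n)
        for (size_t k = 0; k < out[n].size(); ++k)
            sum -= target[n][k] * std::log(out[n][k]) +
                   (1.0 - target[n][k]) * std::log(1.0 - out[n][k]);
    return sum / out.size();
}
```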

Activation function for each layer:

Hidden_Layer first_hidden, second_hidden, output_layer;

// each layer takes an activation function and its position in the network
first_hidden.create("relu", 1);
second_hidden.create("relu", 2);
output_layer.create("sigmoid", 3);

Available options:

  • "relu"
  • "sigmoid"
  • "leaky_relu"
  • "linear"
  • "tanh"
  • "threshold"
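These names usually denote the formulas below; the sketch is illustrative only (the leaky slope of 0.01 and the Heaviside-step behaviour of "threshold" are assumed defaults, not confirmed from the source):

```cpp
#include <cmath>

// Standard formulas these activation names usually denote (illustrative;
// NNevermind's exact definitions may differ).
double relu(double x)       { return x > 0.0 ? x : 0.0; }
double sigmoid(double x)    { return 1.0 / (1.0 + std::exp(-x)); }
double leaky_relu(double x) { return x > 0.0 ? x : 0.01 * x; }  // 0.01 is an assumed slope
double linear(double x)     { return x; }
double tanh_act(double x)   { return std::tanh(x); }
double threshold(double x)  { return x > 0.0 ? 1.0 : 0.0; }     // assumed Heaviside step
```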

More info can be found in the Wiki (not yet available).

⬇️ Installation

The package can be downloaded via GitHub or by running

git clone https://github.com/FraLiturri/NNevermind.git

in the terminal. NNevermind also requires Eigen, which can be installed from the official Eigen website or by running

git clone https://gitlab.com/libeigen/eigen.git

as usual.

🚀 Starting engines

Before starting, you need to set up the file paths: open a terminal and run

python copilot.py change_path

then type Eigen's path, e.g. C:/User/.../your_path_to_eigen/Eigen/Dense (remember to append Eigen/Dense and to use / to separate sub-directories).
Then compile with

g++ main.cpp -fopenmp -O3 -o build/main.exe

Parameters can be passed via the terminal

./build/main.exe eta_value alpha_value lambda_value epochs

or using the interface (see next section).
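A minimal sketch of how main.cpp can turn those four arguments into the values the constructor above expects (the struct and function names here are illustrative, not part of NNevermind's API):

```cpp
#include <cstdlib>
#include <iostream>
#include <string>

// Illustrative only: Hyperparams and parse_args are not NNevermind names.
struct Hyperparams {
    double eta, alpha, lambda;  // training hyperparameters from the command line
    int epochs;
};

Hyperparams parse_args(int argc, char* argv[]) {
    if (argc < 5) {
        std::cerr << "usage: main.exe eta_value alpha_value lambda_value epochs\n";
        std::exit(1);
    }
    return {std::stod(argv[1]), std::stod(argv[2]),
            std::stod(argv[3]), std::atoi(argv[4])};
}
```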

🤖 Using copilot.py

Use copilot.py to plot results with

python copilot.py plot

or to open the interface

python copilot.py search

through which the hyperparameters are passed for a grid search or a single run. When launched, the interface automatically compiles the code.

⚠️ Name your source file main.cpp when performing the grid search; otherwise the script won't work.
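Under the hood, a parallelised grid search over eta, alpha, and lambda amounts to evaluating every combination concurrently. A sketch of the idea with OpenMP (this is not copilot.py's actual code, and the objective below is a stand-in for a full training run):

```cpp
#include <cmath>
#include <vector>

struct Result { double eta, alpha, lambda, loss; };

// Evaluate every (eta, alpha, lambda) combination in parallel and keep the
// best one. The loss expression is a toy stand-in for training the network.
Result grid_search(const std::vector<double>& etas,
                   const std::vector<double>& alphas,
                   const std::vector<double>& lambdas) {
    std::vector<Result> results(etas.size() * alphas.size() * lambdas.size());
    // collapse the three nested loops into one parallel loop
    #pragma omp parallel for collapse(3)
    for (int i = 0; i < (int)etas.size(); ++i)
        for (int j = 0; j < (int)alphas.size(); ++j)
            for (int k = 0; k < (int)lambdas.size(); ++k) {
                // stand-in objective: pretend a training run returned this loss
                double loss = std::pow(etas[i] - 0.1, 2) +
                              std::pow(alphas[j] - 0.5, 2) + lambdas[k];
                results[(i * alphas.size() + j) * lambdas.size() + k] =
                    {etas[i], alphas[j], lambdas[k], loss};
            }
    Result best = results[0];
    for (const auto& r : results)
        if (r.loss < best.loss) best = r;
    return best;
}
```

Compile with -fopenmp (as in the command above) to get the parallel version; without it the pragma is ignored and the search runs serially.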

📖 Benchmark

The tests were executed on 500 samples (trained with Backpropagation), using an 11th-generation Intel(R) Core(TM) i7-1165G7 @ 2.80GHz. Times may vary on different devices.

Size (hidden units only)  Epochs  Time (seconds)
20x20                     1k      1.1
20x20                     10k     10.4
50x50                     1k      3.3
50x50                     10k     30.8
100x100                   1k      16.3
100x100                   10k     82.1
200x200                   1k      26.9
200x200x200               1k      52.6

🔜 Coming soon

  • Adam optimisation
  • Grid search usage improvements

🧱 Dependencies

This project uses the Eigen library under the MPL 2.0 license. Logo created with Logo.com.
