BrabeNetz is a supervised neural network written in C++, aiming to be as fast as possible¹ by using C arrays and raw values instead of objects (which may make the code harder to read).
BrabeNetz performs no bounds- or error-checking for performance reasons, so be careful what you feed it.
I've currently trained it to solve XOR.
You can save the network's state to disk with the `network::save(string)` function and load it on the next startup. An example of this can be found here.
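At its core, a state file only needs to persist the raw numeric values of the network. A minimal sketch of that idea follows; note this is not BrabeNetz's actual `.nn` format, and `save_weights`/`load_weights` are made-up names for illustration:

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Dump raw double values to a binary file: a count header, then the values.
bool save_weights(const char* path, const std::vector<double>& weights)
{
    FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    std::size_t count = weights.size();
    std::fwrite(&count, sizeof(count), 1, f);              // header: value count
    std::fwrite(weights.data(), sizeof(double), count, f); // raw values
    return std::fclose(f) == 0;
}

// Read the values back on the next startup.
std::vector<double> load_weights(const char* path)
{
    std::vector<double> weights;
    FILE* f = std::fopen(path, "rb");
    if (!f) return weights;
    std::size_t count = 0;
    if (std::fread(&count, sizeof(count), 1, f) == 1) {
        weights.resize(count);
        std::fread(weights.data(), sizeof(double), count, f);
    }
    std::fclose(f);
    return weights;
}
```

Storing raw binary values (rather than text) keeps save/load fast, which matches the library's performance-first design.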
It's pretty fast (TODO)
- Fast Feed-Forward algorithm
- Fast Backwards-Propagation algorithm
- Easy to use (Inputs, outputs)
- C arrays > `std::vector`
- Fast network state saving via `state.nn` file (Weights, Biases, Sizes)
- Multithreaded if worth the spawn-overhead (`std::thread` or NVIDIA CUDA)
- Scalability (Neuron size, Layer count) - only limited by hardware
- Randomly generated values to begin with
- Easily save/load with `network::save(string)`/`network::load(string)`
- Sigmoid squashing function (TODO: ReLU?)
- Biases for each neuron
- `network_topology` helper objects for loading/saving state (user friendly)
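The sigmoid squashing function mentioned above maps any activation into the open interval (0, 1), and its derivative (used during backwards-propagation) can be expressed through the output itself. A self-contained sketch of the standard formulas, independent of BrabeNetz's internals:

```cpp
#include <cmath>

// Sigmoid: squashes any real activation x into (0, 1).
double sigmoid(double x)
{
    return 1.0 / (1.0 + std::exp(-x));
}

// Derivative of sigmoid, expressed in terms of its output y = sigmoid(x).
// This form is cheap during backprop because y is already computed.
double sigmoid_prime(double y)
{
    return y * (1.0 - y);
}
```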
### Constructors
- `network(initializer_list<int>)`: Create a new neural network with the given topology vector and fill it with random values (`{ 2, 3, 4, 1 }` = 2 input, 3 hidden, 4 hidden, 1 output neurons)
- `network(network_topology&)`: Create a new neural network with the given network topology and load its values
- `network(string)`: Create a new neural network with the given path to the `state.nn` file and load it (see `network_topology::load(string)` for more info)
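To make the topology notation concrete: assuming every neuron is fully connected to the next layer (standard for a feed-forward network like this), a topology such as `{ 2, 3, 4, 1 }` implies a fixed number of weights. The `count_weights` helper below is made up for illustration and is not part of BrabeNetz:

```cpp
#include <cstddef>
#include <initializer_list>
#include <vector>

// Count the connections implied by a topology list, assuming each neuron
// in layer i connects to every neuron in layer i + 1.
int count_weights(std::initializer_list<int> topology)
{
    std::vector<int> layers(topology);
    int weights = 0;
    for (std::size_t i = 0; i + 1 < layers.size(); ++i)
        weights += layers[i] * layers[i + 1]; // one weight per connection
    return weights;
}
```

For `{ 2, 3, 4, 1 }` this gives 2·3 + 3·4 + 4·1 = 22 weights, which is what the random initialization has to fill.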
### Functions
- `double* train(double* input_values, int length, int& out_length)`: Feed the network `input_values` and return an array of output values (the size of the output layer in the topology); `out_length` is set to the length of the returned array
- `double train(double* input_values, int length, double* expected_output)`: Feed the network `input_values`, compare the predicted output with `expected_output`, and backwards-propagate (adjust weights/biases) if needed. Returns the total error of the output layer
- `void save(string path)`: Save the current network state (topology, weights, biases) to disk at the given path (default: `state.nn`)
- `void set_learnrate(double value)`: Set the learn rate of the network (used by the `train(..)` function). Should always be `1 / (total train times + 1)`
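The suggested learn rate of `1 / (total train times + 1)` means the rate decays as training progresses, so early samples move the weights a lot and later samples only fine-tune. A sketch of that schedule on its own (the `set_learnrate` call belongs to BrabeNetz; `learnrate_for` here only shows the math):

```cpp
// Learn rate after a given number of completed train(..) calls,
// following the 1 / (total train times + 1) rule from the README.
double learnrate_for(int total_train_times)
{
    return 1.0 / (total_train_times + 1);
}
```

In a training loop you would call `set_learnrate(learnrate_for(i))` before the i-th `train(..)` call, so the first call uses 1.0, the second 0.5, and so on.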
¹: I'm new to C/C++, so: as fast as I can make it