






David Álvarez Rosa

Mathematics and
Industrial Engineering graduate.


Implementing a Neural Network from scratch – Part 1

  6 to 9 minutes to read

  David Álvarez Rosa

  Neural Network - AI - Deep Learning

Machine Learning, C++, Implementation, Scratch

  May 3, 2020

Abstract. This first entry in the blog series on implementing a Neural Network in C++ covers the mathematical theory behind fully connected, layered artificial neural networks. We will start by defining the network's topology and its core components. Then we will discuss how a neural network computes its output (namely, forward propagation). The entry will finish by reformulating the learning problem from a mathematical optimization point of view and deriving the well-known backward propagation formulas.

#include <fstream>
#include <iostream>
#include "Defs.hh"
#include "Math.hh"
#include "NeuralNetwork.hh"
#include "Data.hh"

int main() {
  std::cout.setf(std::ios::fixed);
  std::cout.precision(5);

  // Read training data (trainDataset is declared in the included headers).
  const int sizeTrainDataset = 1;
  std::ifstream fileTrain("data/train.dat");
  for (int i = 0; i < sizeTrainDataset; ++i)
    trainDataset.push_back(Data(fileTrain));

  // Read test data.
  const int sizeTestDataset = 100;
  std::ifstream fileTest("data/test.dat");
  for (int i = 0; i < sizeTestDataset; ++i)
    testDataset.push_back(Data(fileTest));

  // Choose model and initialize Neural Network.
  NeuralNetwork neuralNetwork(neuronsPerLayer);

  // Train Neural Network.
  neuralNetwork.train(trainDataset, 5);

  // Test Neural Network.
  neuralNetwork.test(testDataset);
}
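Since forward propagation is the centerpiece of this first part, here is a minimal, self-contained sketch of one fully connected layer's forward pass with a sigmoid activation: each output is the activation of a bias plus a weighted sum of the inputs. The names (`forwardLayer`, `sigmoid`) are illustrative assumptions, not the actual API of the series' NeuralNetwork class.

```cpp
#include <cmath>
#include <vector>

// Logistic sigmoid activation: maps any real number into (0, 1).
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// Forward pass through one fully connected layer:
//   out[j] = sigmoid(b[j] + sum_i w[j][i] * in[i])
// w has one row of weights per output neuron.
std::vector<double> forwardLayer(const std::vector<double>& in,
                                 const std::vector<std::vector<double>>& w,
                                 const std::vector<double>& b) {
  std::vector<double> out(b.size());
  for (std::size_t j = 0; j < b.size(); ++j) {
    double z = b[j];  // Start from the neuron's bias.
    for (std::size_t i = 0; i < in.size(); ++i)
      z += w[j][i] * in[i];  // Accumulate the weighted inputs.
    out[j] = sigmoid(z);
  }
  return out;
}
```

Chaining such calls layer by layer, feeding each layer's output into the next, is exactly the forward propagation the abstract refers to.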

Except where otherwise noted, content on this website is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.

The source code of this website is licensed under the GNU General Public License.


Broken links and other corrections or suggestions can be sent to <webmaster@alvarezrosa.com>.

Last update: June 25, 2022