This is a multi-class classification problem, meaning there are more than two classes to predict; in this case, there are 7 categories.

- You can download the source code from GitHub
- If you would like to see how to code a neural network from scratch, check this article
- Download the dataset we will be using from Kaggle
- A very good article on multi class concepts which I reference below

This article will focus on:

1. Import the classes and functions

2. Train and save the model

2.1 Load our data

2.2 Prepare our features

2.3 Split train and test data

2.4 One-hot encoding Y

2.5…
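As a rough sketch of steps 2.1 to 2.4, here is what the pipeline can look like. A tiny inline DataFrame with made-up column names stands in for the Kaggle download, so the file path, columns, and class labels below are illustrative only:

```python
# Sketch of steps 2.1-2.4; the DataFrame stands in for pd.read_csv
# on the Kaggle file (column and class names here are hypothetical).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# 2.1 Load our data
df = pd.DataFrame({
    "age":      [22, 35, 58, 41, 30, 26, 63, 48],
    "income":   [25000, 48000, 72000, 51000, 39000, 28000, 80000, 60000],
    "category": ["A", "B", "C", "B", "A", "A", "C", "B"],  # 3 of the classes
})

# 2.2 Prepare our features: scale the numeric columns
X = StandardScaler().fit_transform(df[["age", "income"]])

# 2.4 One-hot encode Y: each class becomes its own 0/1 column
y_hot = pd.get_dummies(df["category"]).to_numpy()

# 2.3 Split train and test data
X_train, X_test, y_train, y_test = train_test_split(
    X, y_hot, test_size=0.25, random_state=42)

print(X_train.shape, y_train.shape)  # (6, 2) (6, 3)
```

One-hot encoding matters because the network's output layer produces one score per class, so the labels must be vectors of the same width, not a single string column.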

Neural networks reflect the behavior of the human brain. They allow programs to recognise patterns and solve common problems in machine learning. This gives us another option for performing classification, as an alternative to logistic regression. At Rapidtrade, we use neural networks to classify data and run regression scenarios. The source code for this article is available on GitHub.

We will be working with a dataset from Kaggle, which you can download here. To **visualise** the data we will be working with in this article, see below. We will use this to train the network to categorise our customers according…

So, understanding what happens in linear regression is valuable from a conceptual point of view. Here is a deep dive **without** using Python libraries, along with a link to the source code for this article on GitHub.

If you do need an intro to gradient descent, have a look at my 5-part YouTube series first.

In step 1, we will write gradient descent from scratch, while in step 2 we will use sklearn’s linear regression.
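Here is a minimal sketch of both steps on synthetic data (not the article's dataset): batch gradient descent written by hand, then the same fit with sklearn for comparison.

```python
# Step 1: gradient descent on y = w*x + b from scratch.
# Step 2: sklearn's LinearRegression on the same synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3.0 * x + 5.0 + rng.normal(0, 0.5, 100)   # true slope 3, intercept 5

# Step 1: minimise mean squared error by following its gradient
w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    error = (w * x + b) - y
    w -= lr * (2 / len(x)) * np.dot(error, x)  # dMSE/dw
    b -= lr * (2 / len(x)) * error.sum()       # dMSE/db

# Step 2: sklearn arrives at the same answer in closed form
model = LinearRegression().fit(x.reshape(-1, 1), y)
print(round(w, 2), round(model.coef_[0], 2))   # both ≈ 3.0
print(round(b, 2), round(model.intercept_, 2)) # both ≈ 5.0
```

The point of writing step 1 by hand is that the two update lines are the entire algorithm; sklearn simply solves the same minimisation directly.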

Let’s download our dataset from Kaggle. The Global Health Observatory (GHO) data repository under the World Health Organization (WHO) keeps track of the health…

Need a brief and easy introduction to polynomials? Read ahead or watch this tutorial on YouTube. You can also download the source code for this article from GitHub.

The problem with linear regression is that data does not often follow a straight line. If we look at the dataset below, a linear function makes perfect sense.
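To sketch the contrast, here is a straight line and a quadratic fitted to deliberately curved synthetic data (not the article's dataset), using numpy's `polyfit`:

```python
# Fit a degree-1 (linear) and a degree-2 (polynomial) model
# to data that clearly curves, and compare mean squared errors.
import numpy as np

x = np.linspace(-3, 3, 50)
y = x**2 + 1.0                        # not remotely a straight line

line = np.polyfit(x, y, deg=1)        # best straight line
quad = np.polyfit(x, y, deg=2)        # best quadratic

line_err = np.mean((np.polyval(line, x) - y) ** 2)
quad_err = np.mean((np.polyval(quad, x) - y) ** 2)
print(line_err > quad_err)            # True: the polynomial wins easily
```

When the data really is straight, the two fits give essentially the same error; the polynomial only earns its extra degree when the data curves.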

Using PCA, we can reduce our features from (n) down to either 2 or 3 dimensions, which can then be plotted. We will start by looking at our dataset as downloaded from Kaggle.

We can see 4 attributes, which is super for predictions, but does not allow us to plot a visualisation.

If you have arrived here and do not have a good understanding of SVM, then check this article first.

The Iris flower data set is a multivariate data set introduced by the British statistician and biologist Ronald Fisher in his 1936 paper The use of multiple measurements in taxonomic problems. It is sometimes called Anderson’s Iris data set because Edgar Anderson collected the data to quantify the morphologic variation of Iris flowers of three related species. The data set consists of 50 samples from each of three species of Iris (Iris Setosa, Iris virginica, and Iris versicolor). Four features were…

The important job that SVMs perform is to find a decision boundary to classify our data. This decision boundary is also called the **hyperplane**.

Let's start with an example to explain it. **Visually**, if you look at figure 1, you will see that it makes sense for the purple line to be a better hyperplane than the black line. The black line would also do the job, but it skates a little too close to one of the red points to make it a good decision line.

Visually, this is quite easy to spot.
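In code, sklearn's `SVC` with a linear kernel finds that maximum-margin hyperplane for us. Here is a small sketch on two synthetic clusters (not the article's figure-1 data):

```python
# A linear SVM separating two well-spaced clusters; the fitted
# hyperplane is w.x + b = 0, and the margin width is 2/||w||.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
red  = rng.normal(loc=[0, 0], scale=0.3, size=(20, 2))
blue = rng.normal(loc=[3, 3], scale=0.3, size=(20, 2))
X = np.vstack([red, blue])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="linear").fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]   # the hyperplane parameters
print(clf.predict([[0, 0], [3, 3]]))     # [0 1]
```

The support vectors (in `clf.support_vectors_`) are exactly the points the margin touches, which is why a point that "skates too close" to the boundary drags the hyperplane towards it.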

Let's implement a neural network to classify customers according to their key features. Running neural networks in MATLAB is quite understandable once you understand the equations.

This is part 5 in my series on neural networks. You are welcome to start at part 1. Here are the previous articles explaining the cost function.

Source code used here can be downloaded from GitHub. Spoiler alert: we will only get a 66% accuracy, but that's the data we have ;-)

Just as a refresher, here is the dataset we downloaded from Kaggle…

After understanding forward and backward propagation, let's move on to **calculating cost and gradient**. This is a vital component of neural networks.

This is part 2 in my series on neural networks. You are welcome to start at part 1 or skip to part 5 if you just want the code.

So, to perform gradient descent or cost optimisation, we need to write a cost function which performs:

In this article we will deal with (3) and (4). You can click on the links above for a deep dive on forward/back prop.

So, just…
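As a sketch of the cost side, here is a cross-entropy cost for one-hot labels Y and sigmoid outputs H. The shapes and names are illustrative, not the article's exact code:

```python
# Mean cross-entropy cost over m examples; H and Y are (m, classes)
# matrices of predicted probabilities and one-hot labels.
import numpy as np

def cost(H, Y, eps=1e-12):
    m = Y.shape[0]
    H = np.clip(H, eps, 1 - eps)  # avoid log(0)
    return -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m

Y = np.array([[1, 0], [0, 1]])
print(cost(Y.astype(float), Y))       # ≈ 0: perfect predictions cost nothing
print(cost(np.full((2, 2), 0.5), Y))  # ≈ 1.386: a 50/50 guess is penalised
```

The cost is near zero only when the network assigns probability near 1 to the correct class and near 0 to the rest, which is exactly what gradient descent will push it towards.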

Backward propagation is a tricky subject to explain. Let's give it a go here, showing the code and output data as we go.

This is part 3 in my series on neural networks. You are welcome to start at part 1 or skip to part 5 if you just want the code.

So, to perform gradient descent or cost optimisation, we need to write a cost function which performs:

**This article concentrates on (2), backward propagation.**

So, we have simplified our neural network in figure 1 to only show the details to…
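To make the idea concrete, here is one backward-propagation step through a single sigmoid layer on toy data, checked against a numerical gradient. The shapes here are illustrative, not the simplified network in figure 1:

```python
# Backprop through sigmoid + cross-entropy: dL/dz reduces to (H - y),
# and we verify one gradient entry with a central finite difference.
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))                       # 5 examples, 3 inputs
y = rng.integers(0, 2, size=(5, 1)).astype(float)  # binary labels
W = rng.normal(size=(3, 1))                       # layer weights

def loss(W):
    H = sigmoid(X @ W)
    return -np.mean(y * np.log(H) + (1 - y) * np.log(1 - H))

# Backward pass: with this loss/activation pair, the chain rule collapses
H = sigmoid(X @ W)
grad = X.T @ (H - y) / len(X)

# Numerical check of one gradient entry
eps = 1e-6
W_plus, W_minus = W.copy(), W.copy()
W_plus[0, 0] += eps
W_minus[0, 0] -= eps
numeric = (loss(W_plus) - loss(W_minus)) / (2 * eps)
print(abs(grad[0, 0] - numeric) < 1e-6)  # True: backprop matches
```

This numerical check is a handy habit whenever you write backprop by hand: if the analytic gradient and the finite difference disagree, the bug is in your derivation, not your data.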

Coding, technology and data are my passions. Oh, and some crypto trading with lots of cycling on the side. https://www.linkedin.com/in/shaun-enslin-4984bb14b/