Neural network design

Today I continue my neural network post series with some considerations on neural network implementation. So far we have covered what a neural network is and how it works, but we are still left with numerous choices regarding its design: how many layers should we use, how many units (neurons) in each layer, which activation functions, […]

Tensorflow introduction

Following my previous post on neural networks, I thought it would be nice to see how to implement these concepts with TensorFlow. TensorFlow is a new library developed by Google, aimed at building fast and efficient machine learning pipelines. It is based on the computation graph that we discussed earlier. It provides […]
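As a small taste of the computation graph idea, here is a minimal sketch assuming a TensorFlow 2.x install, where tf.function traces ordinary Python code into a graph; the weighted-sum function and the input values are purely illustrative and not taken from the post.

```python
import tensorflow as tf

@tf.function  # traces this Python function into a TensorFlow computation graph
def weighted_sum(x, w):
    # Element-wise multiply then sum: each operation becomes a node in the graph.
    return tf.reduce_sum(x * w)

# Illustrative inputs.
x = tf.constant([1.0, 2.0, 3.0])
w = tf.constant([0.5, 0.5, 0.5])
print(weighted_sum(x, w).numpy())  # 3.0
```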

Neural Network

Machine learning applications spread more widely every day across many domains. One of today’s most powerful techniques is the neural network. This technique is employed in many applications such as image recognition, speech analysis and translation, self-driving cars, etc. In fact, such learning algorithms have been known for decades, but only recently has the technique become mainstream, supported by […]

k-means clustering

k-means is a clustering algorithm which divides the space into k different clusters. Each cluster is represented by its centre of mass (i.e. its barycentre), and data points are assigned to the cluster with the nearest barycentre. The learning algorithm starts by choosing k random points; each of these is the centre of mass of a […]
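A minimal NumPy sketch of the procedure described above (random initial barycentres, assignment to the nearest barycentre, recomputation of the centres of mass); the toy data, the seed and the iteration count are illustrative, and empty clusters are not handled.

```python
import numpy as np

def kmeans(points, k, n_iter=100, seed=0):
    """Minimal k-means sketch: points is an (n, d) array, k the number of clusters."""
    rng = np.random.default_rng(seed)
    # Start from k randomly chosen data points as the initial barycentres.
    centres = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each point to the cluster with the nearest barycentre.
        dists = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each barycentre as the centre of mass of its assigned points.
        centres = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return centres, labels

# Example: two obvious blobs in 2D.
data = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])
centres, labels = kmeans(data, k=2)
print(centres, labels)
```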

Confusion matrix

When you train several models on a dataset you need a way to compare their performance and choose the one that best suits your needs. As we will see, there are different ways to compare the results and then pick the best one. Let’s start with the scores we can get out of the training […]
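As a rough illustration of the kind of scores involved, here is a small sketch that builds a binary confusion matrix by hand and derives a few common metrics from it; the labels and predictions are made-up values, not results from the post.

```python
import numpy as np

# Hypothetical binary labels and model predictions (1 = positive class).
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

# Confusion matrix entries.
tp = np.sum((y_true == 1) & (y_pred == 1))
tn = np.sum((y_true == 0) & (y_pred == 0))
fp = np.sum((y_true == 0) & (y_pred == 1))
fn = np.sum((y_true == 1) & (y_pred == 0))

print(np.array([[tn, fp], [fn, tp]]))          # the confusion matrix
print("accuracy :", (tp + tn) / len(y_true))   # 0.75
print("precision:", tp / (tp + fp))            # 0.75
print("recall   :", tp / (tp + fn))            # 0.75
```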

k-Nearest Neighbours

k-Nearest Neighbours is based on a simple idea: similar points tend to have similar outcomes. The algorithm therefore memorises all the points in the dataset. A prediction for a new entry is made by finding the closest point in the dataset; the prediction is then simply the same outcome as the […]
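A minimal sketch of this idea, assuming Euclidean distance and a majority vote over the k closest stored points (with k=1 it reduces to the single "closest point" described above); the toy dataset is illustrative.

```python
import numpy as np
from collections import Counter

def knn_predict(train_x, train_y, query, k=1):
    """Predict the label of `query` from its k nearest training points."""
    dists = np.linalg.norm(train_x - query, axis=1)       # distance to every stored point
    nearest = np.argsort(dists)[:k]                        # indices of the k closest points
    return Counter(train_y[nearest]).most_common(1)[0][0]  # majority vote on their labels

# Toy dataset: two labelled groups in 2D.
train_x = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
train_y = np.array(["blue", "blue", "red", "red"])

print(knn_predict(train_x, train_y, np.array([0.2, 0.1]), k=3))  # blue
```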

Frequentists vs Bayesians

How to split a dataset

In machine learning it is pretty obvious to me that you need to split your dataset into two parts: a training set that you use to train your model and find the optimal parameters, and a test set that you use to test your trained model and see how well it generalises. It is important […]
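A minimal sketch of such a split, assuming a simple random shuffle and an illustrative 80/20 ratio; the function name and the example data are hypothetical.

```python
import numpy as np

def train_test_split(x, y, test_fraction=0.2, seed=0):
    """Shuffle the dataset and split it into a training part and a test part."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(x))        # random shuffle of the indices
    n_test = int(len(x) * test_fraction)   # size of the held-out test set
    test_idx, train_idx = order[:n_test], order[n_test:]
    return x[train_idx], y[train_idx], x[test_idx], y[test_idx]

# Example: 10 points, 80/20 split.
x = np.arange(10).reshape(-1, 1)
y = np.arange(10)
x_train, y_train, x_test, y_test = train_test_split(x, y)
print(len(x_train), len(x_test))  # 8 2
```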

Weight decay regularisation

Most machine learning techniques follow a similar strategy: get the best possible model on the training dataset, then generalise by testing the model on the test dataset. The test dataset consists of data that are never used during training, and it allows us to test how the algorithm will perform on “not seen before” data.
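The post's topic, weight decay, amounts to adding an L2 penalty on the weights to the training loss, which pushes the weights towards zero and trades a little training error for better generalisation. Below is a minimal sketch with a hypothetical linear model and an illustrative penalty strength lam; none of the values come from the post.

```python
import numpy as np

def ridge_loss(w, X, y, lam=0.1):
    """Squared-error loss plus an L2 weight-decay penalty lam * ||w||^2."""
    residuals = X @ w - y
    return np.mean(residuals ** 2) + lam * np.sum(w ** 2)

# Hypothetical linear data: y = X . true_w + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=20)
print(ridge_loss(true_w, X, y, lam=0.1))
```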

Stochastic gradient descent

With gradient descent we try to optimise a function that runs over the entire dataset; this function represents the “cost” over the entire dataset. When working with big datasets this leads to a complex optimisation and slow computation times. It is also a problem when dealing with streaming data, as we need to wait for the stream to end […]
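A minimal sketch of the stochastic variant for least-squares regression: instead of computing the gradient of the cost over the whole dataset, the weights are updated from the gradient of a single example at a time. The learning rate, epoch count and toy data are all illustrative choices, not values from the post.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=50, seed=0):
    """Stochastic gradient descent for least squares: one cheap update per example."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(X)):         # visit examples in random order
            grad = 2 * (X[i] @ w - y[i]) * X[i]   # gradient of (x_i . w - y_i)^2
            w -= lr * grad                        # update from this single example
    return w

# Toy data generated from known weights.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0]) + rng.normal(scale=0.05, size=200)
print(sgd_linear_regression(X, y))  # close to [2.0, -1.0]
```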