Build a Neural Net in 4 Minutes

Hello world, welcome to Sirajology. Today we’re going to build a neural net in four minutes, so let’s get started. There are like a million and one machine learning models out there, but neural nets in particular have gotten really popular recently because of two things: faster computers and more data. They’ve helped produce some amazing breakthroughs in everything from image recognition to generating rap songs. There are really just three steps involved in machine learning: build it, train it, and test it. Once we build our model, we can train it against our input and output data to make it better and better at pattern recognition.

So, let’s build our model: a three-layer neural network in Python. We’ll want to start off by importing NumPy, which is my go-to library for scientific computing in Python. Then we’ll create a function that maps any value to a value between zero and one. This is called a sigmoid. This function will be run in every neuron of our network when data hits it, and it’s useful for creating probabilities out of numbers.
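In code, that sigmoid might look like the sketch below. The `deriv` flag, which returns the slope instead of the value, is a common convention in this style of demo and an addition of mine here; it assumes `x` is already a sigmoid output.

```python
import numpy as np

# Sigmoid: squashes any value into the range (0, 1).
# With deriv=True it returns the derivative, assuming x is
# already a sigmoid output (a shortcut this style of demo uses).
def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))
```

For example, `nonlin(0)` gives 0.5, right in the middle of the range.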
Once we’ve created that, let’s initialize our input data set as a matrix. Each row is a different training example, and each column represents a different input neuron, so we have four training examples with three input neurons each. Then we’ll create our output data set: four examples, one output neuron each.
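The transcript doesn’t spell out the numbers, but a toy data set like this fits the description (four rows, three input columns, four single-value outputs); the specific values are my assumption:

```python
import numpy as np

# Input: 4 training examples x 3 input neurons (example values, assumed).
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])

# Output: 4 training examples x 1 output neuron each.
y = np.array([[0], [1], [1], [0]])
```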
Since we’ll be generating random numbers in a second, let’s seed them to make them deterministic. This just means giving the generated random numbers the same starting point, or seed, so that we get the same sequence of numbers every time we run our program. That’s useful for debugging. Next, we’ll create our synapse matrices. Synapses are the connections from each neuron in one layer to every neuron in the next layer. Since we’ll have three layers in our network, we need two synapse matrices, and each synapse gets a random weight assigned to it.
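Here’s one way that could look. The hidden-layer width (four neurons) isn’t stated in the transcript, so it’s an assumption on my part; the weights are drawn with mean zero:

```python
import numpy as np

np.random.seed(1)  # same "random" numbers on every run, handy for debugging

# Synapse 0 connects the 3 input neurons to a hidden layer
# (4 neurons here, an assumed width); synapse 1 connects the
# hidden layer to the single output neuron.
# 2 * random - 1 shifts weights from [0, 1) to [-1, 1), mean zero.
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1
```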
After that, we’ll begin the training code: a for-loop that runs many iterations to optimize the network for the given data set. We’ll start off by creating our first layer; it’s just our input data. Now comes the prediction step. We’ll perform matrix multiplication between each layer and its synapse, then run our sigmoid function on all the values in the resulting matrix to create the next layer, which contains a prediction of the output data. Then we do the same thing on that layer to get our next layer, a more refined prediction.
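Putting the forward pass together, using the sigmoid, toy data, and hidden width assumed above:

```python
import numpy as np

def nonlin(x):
    return 1 / (1 + np.exp(-x))

np.random.seed(1)
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])  # toy inputs (assumed values)
syn0 = 2 * np.random.random((3, 4)) - 1  # input -> hidden (width 4 assumed)
syn1 = 2 * np.random.random((4, 1)) - 1  # hidden -> output

l0 = X                          # layer 0 is just the input data
l1 = nonlin(np.dot(l0, syn0))   # matrix multiply, then sigmoid: hidden layer
l2 = nonlin(np.dot(l1, syn1))   # same again: refined prediction of the output
```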
So now that we have a prediction of the output value in layer two, let’s compare it to the expected output data using subtraction to get the error. We’ll also want to print out the average error at a set interval, to make sure it goes down over time. Next, we’ll multiply the error by the derivative of our sigmoid function evaluated at the layer-two prediction. This gives us a delta, which we’ll use to reduce the error of our predictions when we update our synapses every iteration.
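In code, the output-layer error and delta might look like this, continuing the assumed setup:

```python
import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)  # assumes x is already a sigmoid output
    return 1 / (1 + np.exp(-x))

np.random.seed(1)
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])  # assumed toy data
y = np.array([[0], [1], [1], [0]])
syn0 = 2 * np.random.random((3, 4)) - 1
syn1 = 2 * np.random.random((4, 1)) - 1

l1 = nonlin(np.dot(X, syn0))
l2 = nonlin(np.dot(l1, syn1))

l2_error = y - l2                             # how far off is the prediction?
print("Error:", np.mean(np.abs(l2_error)))    # the average error we print at intervals
l2_delta = l2_error * nonlin(l2, deriv=True)  # error scaled by the sigmoid's slope
```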
Then we’ll want to see how much layer one contributed to the error in layer two. This is called backpropagation. We get this error by multiplying layer two’s delta by synapse one’s transpose. Then we get layer one’s delta by multiplying that error by the derivative of the sigmoid evaluated at layer one.
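Those two backpropagation lines, shown on small made-up arrays so the shapes are easy to follow (all values are assumed, purely for illustration):

```python
import numpy as np

# Toy values, assumed purely for illustration.
l1 = np.array([[0.6, 0.4, 0.7, 0.2]])           # layer-1 activations: 1 example, 4 hidden neurons
syn1 = np.array([[0.1], [-0.3], [0.5], [0.2]])  # hidden -> output weights (4x1)
l2_delta = np.array([[0.05]])                   # delta from the output layer

# How much did each layer-1 neuron contribute to layer 2's error?
l1_error = l2_delta.dot(syn1.T)        # (1x1) times (1x4) -> 1x4
# Scale by the sigmoid derivative at layer 1 to get layer 1's delta.
l1_delta = l1_error * (l1 * (1 - l1))
```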
Now that we have deltas for each of our layers, we can use them to update the synapse weights, reducing the error a little more every iteration. This is an algorithm called gradient descent. To do the update, we multiply each layer’s transpose by the delta of the layer after it, and add the result to the corresponding synapse. Finally, let’s print the predicted output. And there you have it.
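Putting every step together, the whole program might read like this; the data values, hidden width, iteration count, and print interval are all my assumptions, since the transcript doesn’t pin them down:

```python
import numpy as np

def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)  # x is assumed to already be a sigmoid output
    return 1 / (1 + np.exp(-x))

# Toy data set (values assumed): 4 examples, 3 input neurons, 1 output neuron.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
y = np.array([[0], [1], [1], [0]])

np.random.seed(1)
syn0 = 2 * np.random.random((3, 4)) - 1  # input -> hidden (width 4 assumed)
syn1 = 2 * np.random.random((4, 1)) - 1  # hidden -> output

for j in range(60000):
    # Forward pass: predict.
    l0 = X
    l1 = nonlin(np.dot(l0, syn0))
    l2 = nonlin(np.dot(l1, syn1))

    # How wrong were we?
    l2_error = y - l2
    if j % 10000 == 0:
        print("Error:", np.mean(np.abs(l2_error)))

    # Backward pass: compute deltas.
    l2_delta = l2_error * nonlin(l2, deriv=True)
    l1_error = l2_delta.dot(syn1.T)
    l1_delta = l1_error * nonlin(l1, deriv=True)

    # Gradient-descent-style update.
    syn1 += l1.T.dot(l2_delta)
    syn0 += l0.T.dot(l1_delta)

print("Output after training:")
print(l2)
```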
Let’s run this in the terminal and see what we get. Awesome, we can see that our error decreases every iteration, and the predicted output is very, very close to the actual output. There’s so much more we can do to improve our neural network. For more information, check out the links in the description below, and please subscribe for more technology videos. Thanks for watching.

