Neural Network In Python: Introduction, Structure And Trading Strategies – Part III

Posted September 18, 2019 at 10:20 am
Devang Singh

See the first and second installments of this series, where Devang introduces perceptrons and the structure of neural networks, before continuing with this tutorial.

Training the Neural Network

To simplify things in this neural network tutorial, we can say that there are two ways to code a program for performing a specific task. 

  • Define all the rules the program needs to compute the result for a given input.
  • Develop a framework within which the code learns to perform the task by training itself on a dataset, adjusting the results it computes to be as close as possible to the actual observed results.

The second approach is called training the model, and it is what we will focus on.

The neural network will be given the dataset, which consists of the OHLCV data as the input and the Close price of the next day as the output. The actual value of the output will be represented by ‘y’ and the estimated value by ‘ŷ’ (y hat).
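As an illustration, here is one way the inputs and targets might be assembled, assuming the OHLCV data lives in a pandas DataFrame. The data below is synthetic and purely for demonstration:

```python
import numpy as np
import pandas as pd

# Synthetic OHLCV data; in practice this would come from a data provider.
rng = np.random.default_rng(0)
close = 100 + rng.normal(0, 1, 6).cumsum()
df = pd.DataFrame({
    "Open": close + rng.normal(0, 0.5, 6),
    "High": close + 1.0,
    "Low": close - 1.0,
    "Close": close,
    "Volume": rng.integers(1_000, 5_000, 6),
})

# Inputs X: today's OHLCV row. Target y: the next day's Close.
X = df.iloc[:-1].to_numpy()                    # drop the last row (no next-day close)
y = df["Close"].shift(-1).dropna().to_numpy()  # next-day close, aligned with X

print(X.shape, y.shape)  # (5, 5) (5,)
```

Each row of `X` is paired with the following day's Close in `y`, which is the quantity the network will try to estimate.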

The training of the model involves adjusting the weights of the variables for all the different neurons present in the neural network. This is done by minimizing the ‘Cost Function’.

Many cost functions are used in practice; the most popular one is computed as half of the sum of squared differences between the actual and estimated values over the training dataset.

C = ½ Σᵢ (ŷᵢ − yᵢ)², where the sum runs over the examples in the training dataset.
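In code, this cost function can be sketched in a few lines; the prices below are made up for illustration:

```python
import numpy as np

def cost(y_hat, y):
    """Half the sum of squared differences between estimates and actuals."""
    return 0.5 * np.sum((y_hat - y) ** 2)

y = np.array([101.0, 102.5, 99.8])      # actual next-day closes
y_hat = np.array([100.5, 103.0, 100.0]) # model estimates

print(cost(y_hat, y))  # approximately 0.27
```

The closer the estimates are to the actual values, the smaller the cost; training is the search for weights that drive this number down.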

The neural network trains itself by first computing the cost function for the training dataset with a given set of neuron weights. It then goes back and adjusts the weights, and computes the cost function for the training dataset again with the new weights. This process of sending the errors back through the network to adjust the weights is called backpropagation.

This cycle is repeated until the cost function has been minimized. Next, we will look in more detail at how the weights are adjusted and the cost function is minimized.
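To make the loop concrete, here is a toy sketch with a single weight. It does not use backpropagation (which needs gradients, covered in the next part); instead it simply tries a small adjustment in each direction and keeps whichever weight gives the lowest cost. The model and data are invented for illustration:

```python
import numpy as np

# Toy single-weight model: y_hat = w * x.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.1, 3.9, 6.2])  # roughly y = 2x

def cost(w):
    return 0.5 * np.sum((w * x - y) ** 2)

# Compute the cost, adjust the weight, recompute -- repeated many times.
w, step = 0.0, 0.1
for _ in range(200):
    candidates = (w - step, w, w + step)
    w = min(candidates, key=cost)  # keep the weight with the lowest cost

print(round(w, 1))  # settles near 2.0
```

This captures the structure of training (evaluate cost, adjust weights, repeat), but trying adjustments blindly scales poorly, which is exactly the problem the next sections discuss.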

The weights are adjusted to minimize the cost function. One way to do this is through brute force: suppose we take 1,000 candidate values for a weight and evaluate the cost function at each of them. Plotting the cost against the weight produces a graph like the one shown below.

The best value for the weight is the one corresponding to the minimum of this graph.

[Figure: the cost function plotted against candidate weight values, showing a single minimum]
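The brute-force search can be sketched in a few lines, again with a toy single-weight model and made-up data:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.1, 3.9, 6.2])

# Brute force: evaluate the cost at 1000 candidate weight values
# and pick the one with the smallest cost.
weights = np.linspace(-5, 5, 1000)
costs = np.array([0.5 * np.sum((w * x - y) ** 2) for w in weights])

best_w = weights[np.argmin(costs)]
print(round(best_w, 2))  # ~2.04 (the analytical optimum is sum(xy)/sum(x^2) ~ 2.036)
```

For one weight this works fine; the grid simply traces out the curve in the figure and reads off its minimum.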

This approach could work for a neural network with a single weight to optimize. However, as the number of weights to be adjusted and the number of hidden layers increase, the number of required computations grows drastically.
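A quick back-of-the-envelope calculation shows how fast an exhaustive grid search blows up as weights are added, using the same 1,000 candidate values per weight as above:

```python
# An exhaustive grid search over k weights, with 1000 candidate
# values per weight, must evaluate the cost 1000**k times.
for k in (1, 2, 3, 10):
    print(f"{k} weight(s): {1000 ** k:,} cost evaluations")
```

Even a modest network has far more than ten weights, so the search space quickly exceeds what any computer could enumerate.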

The time required to train such a model would be enormous even on the world’s fastest supercomputer. For this reason, it is essential to have a better, faster method for computing the weights. That method is called gradient descent, and we will look into it in the next part of the neural network tutorial.

Visit QuantInsti website to download the sample code.

Disclosure: Interactive Brokers

Information posted on IBKR Campus that is provided by third-parties does NOT constitute a recommendation that you should contract for the services of that third party. Third-party participants who contribute to IBKR Campus are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.

This material is from QuantInsti and is being posted with its permission. The views expressed in this material are solely those of the author and/or QuantInsti and Interactive Brokers is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to buy or sell any security. It should not be construed as research or investment advice or a recommendation to buy, sell or hold any security or commodity. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.
