Neural Network In Python: Types, Structure And Trading Strategies – Part III


Posted June 2, 2023
Chainika Thakar
QuantInsti

Get a general introduction to a neuron with Part I and learn how to train a neural network with Part II.

Neural network in trading

Neural networks help you develop strategies based on your overall investment approach ⁽¹⁾, whether it is high-risk and growth-focused (short-term trades) or conservative and oriented toward long-term investment.

For example, suppose you wish to find a few stocks with a particular growth performance or an upward price trend over a period of one year. A neural network can identify such stocks for your portfolio and hence make your work easier.

You can also supply a number of such parameters or criteria to the neural network so that you get exactly what you are looking for with less effort and time.


Neural network strategy in Python

Let us now see a strategy representation with neural networks in Python.

First of all, we will import the libraries.

Step 1: Importing Libraries

We will start by importing a few libraries, the others will be imported as and when they are used in the program at different stages. For now, we will import the libraries which will help us in importing and preparing the dataset for training and testing the model.

import numpy as np
import pandas as pd
import talib


NumPy is a fundamental package for scientific computing; we will use it for computations on our dataset. The library is imported using the alias np.

Pandas will let us use the powerful DataFrame object, which is employed throughout the code for building the artificial neural network in Python.

TA-Lib is a technical analysis library, which we will use to compute the RSI and Williams %R. These will serve as features to train our artificial neural network (ANN). We could add more features using this library.
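TA-Lib computes these indicators in optimized C code. Purely as an illustration of what talib.RSI produces, here is a pandas-only sketch of Wilder's RSI; the function name and the ewm-based smoothing are our own approximation, not part of the original code:

```python
import numpy as np
import pandas as pd

def wilder_rsi(close: pd.Series, timeperiod: int = 9) -> pd.Series:
    """Approximate talib.RSI: ratio of Wilder-smoothed average gain to average loss."""
    delta = close.diff()
    gain = delta.clip(lower=0)    # up-moves only
    loss = -delta.clip(upper=0)   # down-moves only, as positive numbers
    # Wilder's smoothing is an exponential average with alpha = 1 / timeperiod
    avg_gain = gain.ewm(alpha=1 / timeperiod, min_periods=timeperiod, adjust=False).mean()
    avg_loss = loss.ewm(alpha=1 / timeperiod, min_periods=timeperiod, adjust=False).mean()
    rs = avg_gain / avg_loss
    return 100 - 100 / (1 + rs)

prices = pd.Series([100, 101, 103, 102, 104, 106, 105, 107, 108, 110, 109, 111.0])
rsi_values = wilder_rsi(prices, timeperiod=9)
```

The resulting values are bounded between 0 and 100; a run of persistent up-moves pushes the RSI toward 100, which is why it is commonly read as an overbought/oversold gauge.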

Step 2: Fetching data from Yahoo Finance

# Setting the random seed to a fixed number
import random
random.seed(42)

The random module is used to initialise the seed to a fixed number, so that every time we run the code we start from the same state.
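Note that random.seed only fixes Python's built-in generator. If you want reproducibility in NumPy-based steps as well, you would also seed NumPy's generator; this is a small supplementary sketch, not part of the original code:

```python
import random
import numpy as np

random.seed(42)     # Python's built-in RNG
np.random.seed(42)  # NumPy's legacy global RNG, which many ML libraries draw from

first_draw = np.random.rand()  # this value is now the same on every run
```

Seeding both generators up front means that any later shuffling or weight initialisation that relies on them produces identical results across runs.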

# Download the price data of Apple from November 2017 to January 2023
# Set the ticker as 'AAPL' and specify the start and end dates
import yfinance as yf

price_AAPL = yf.download('AAPL', start='2017-11-06', end='2023-01-03', auto_adjust=True)


We have taken Apple’s data for the time period 6th November 2017 to 3rd January 2023.

Step 3: Preparing the dataset

We will be building our input features by using only the OHLC values. This dataset will help us to specify the features for training our neural network in the next step.

# Preparing the dataset
price_AAPL['H-L'] = price_AAPL['High'] - price_AAPL['Low']
price_AAPL['O-C'] = price_AAPL['Close'] - price_AAPL['Open']
price_AAPL['3day MA'] = price_AAPL['Close'].shift(1).rolling(window = 3).mean()
price_AAPL['10day MA'] = price_AAPL['Close'].shift(1).rolling(window = 10).mean()
price_AAPL['30day MA'] = price_AAPL['Close'].shift(1).rolling(window = 30).mean()
price_AAPL['Std_dev']= price_AAPL['Close'].rolling(5).std()
price_AAPL['RSI'] = talib.RSI(price_AAPL['Close'].values, timeperiod = 9)
price_AAPL['Williams %R'] = talib.WILLR(price_AAPL['High'].values, price_AAPL['Low'].values, price_AAPL['Close'].values, timeperiod = 7)


Step 4: Defining input features from dataset

We then prepare the various input features which will be used by the artificial neural network learning for making the predictions. We define the following input features:

  • High minus Low price
  • Close minus Open price
  • 3-day moving average
  • 10-day moving average
  • 30-day moving average
  • Standard deviation over a 5-day period
  • Relative Strength Index (RSI)
  • Williams %R

We then define the output value as price rise, a binary variable storing 1 when tomorrow's closing price is greater than today's, and 0 otherwise.

price_AAPL['Price_Rise'] = np.where(price_AAPL['Close'].shift(-1) > price_AAPL['Close'], 1, 0)

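To make the labelling transparent, here is a toy example with made-up closing prices showing how shift(-1) pairs each day with the next day's close:

```python
import numpy as np
import pandas as pd

close = pd.Series([100.0, 102.0, 102.0, 101.0])

# shift(-1) aligns tomorrow's close with today's row: [102, 102, 101, NaN]
label = np.where(close.shift(-1) > close, 1, 0)

# NaN comparisons evaluate to False, so the final row (which has no tomorrow)
# is labelled 0 rather than raising an error
print(label)  # [1 0 0 0]
```

Only the first day is labelled 1, because 102 > 100; the flat day and the down day both get 0.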

Next, we drop all the rows storing NaN values by using the dropna() function.

price_AAPL = price_AAPL.dropna()


We then create two objects to store the input and the output variables. The dataframe 'X' stores the input features: the columns from the fifth column of the dataset up to the second-last column. The last column is stored in 'y', the prediction target, which is the price rise.

X = price_AAPL.iloc[:, 4:-1]
y = price_AAPL.iloc[:, -1]


In this part of the code, we will split our input and output variables to create the test and train datasets. This is done by creating a variable called split, which is defined to be the integer value of 0.8 times the length of the dataset.

We then slice the X and y variables into four separate datasets: X_train, X_test, y_train and y_test. This split is an essential part of any machine learning workflow: the training data is used by the model to arrive at its weights.

# Splitting the dataset
split = int(len(price_AAPL)*0.8)
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

The test dataset is used to see how the model will perform on new data which would be fed into the model. The test dataset also has the actual value for the output, which helps us in understanding how efficient the model is.

We will look at the confusion matrix later in the code, which essentially is a measure of how accurate the predictions made by the model are.
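As a preview of that evaluation step, here is a minimal sketch using scikit-learn's confusion_matrix on made-up labels (not the model's actual output):

```python
from sklearn.metrics import confusion_matrix

y_test = [1, 0, 1, 1, 0, 1]   # actual price rises (hypothetical)
y_pred = [1, 0, 0, 1, 1, 1]   # model predictions (hypothetical)

# Rows are actual classes (0, 1); columns are predicted classes (0, 1)
cm = confusion_matrix(y_test, y_pred)
print(cm)
# [[1 1]
#  [1 3]]
```

The diagonal entries (1 correct "no rise", 3 correct "rise") are the correct predictions, while the off-diagonal entries count the false positives and false negatives.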

Step 5: Standardise the dataset (Data preprocessing)

Another important step in data preprocessing is to standardise the dataset. This process makes the mean of all the input features equal to zero and also converts their variance to 1. This ensures that there is no bias while training the model due to the different scales of all input features.

If this is not done, the neural network might assign higher weights to features that simply have a larger average value than the others.

We implement this step by importing the StandardScaler class from the sklearn.preprocessing module. We instantiate the variable sc with the StandardScaler() constructor.

# Feature Scaling
from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)

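A small sketch with made-up numbers verifies what the scaler does: after fit_transform, each training column has mean 0 and unit variance, and because the scaler is fitted on the training set only, no test-set statistics leak into training:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X_train = np.array([[1.0, 10.0], [3.0, 20.0], [5.0, 30.0]])
X_test = np.array([[2.0, 15.0]])

sc = StandardScaler()
X_train_s = sc.fit_transform(X_train)   # learn mean/std from the training data only
X_test_s = sc.transform(X_test)         # reuse the training statistics on the test data

print(X_train_s.mean(axis=0))  # approximately [0. 0.]
print(X_train_s.std(axis=0))   # approximately [1. 1.]
```

Calling transform (rather than fit_transform) on the test set is deliberate: the test data is scaled with the training means and standard deviations, mimicking how unseen data would be handled in live use.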

Stay tuned for the next installment to learn how to build the artificial neural network model.

Originally posted on QuantInsti. Visit their blog for additional insights on this topic.


Disclosure: Interactive Brokers

Information posted on IBKR Campus that is provided by third-parties does NOT constitute a recommendation that you should contract for the services of that third party. Third-party participants who contribute to IBKR Campus are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.

This material is from QuantInsti and is being posted with its permission. The views expressed in this material are solely those of the author and/or QuantInsti and Interactive Brokers is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to buy or sell any security. It should not be construed as research or investment advice or a recommendation to buy, sell or hold any security or commodity. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.

Disclosure: API Examples Discussed

Throughout the lesson, please keep in mind that the examples discussed are purely for technical demonstration purposes, and do not constitute trading advice. Also, it is important to remember that placing trades in a paper account is recommended before any live trading.
