DeepTrading With TensorFlow 3 - TodoTrader
BLACK BELT
We are now closer to applying our knowledge of neural networks (NN) to our trading systems, but we still have to tune up our TensorFlow fundamentals a bit.
If you are not yet familiar with our supervised machine learning flowchart, take a look at the first two posts in this series.
As usual, the calculations contained in this post are part of a Jupyter notebook that is in our Github repository:
https://github.com/parrondo/deeptrading
We will use the iris dataset for this exercise. We will build a one-hidden-layer neural network to predict the fourth attribute, Petal Width, from the other three (Sepal Length, Sepal Width, Petal Length).
Load configuration
In [1]:
import numpy as np
import tensorflow as tf
from sklearn.datasets import load_iris

# Load the iris data: we predict petal width from the other three attributes
data = load_iris()
X_raw = data.data[:, 0:3]  # sepal length, sepal width, petal length
y_raw = data.data[:, 3]    # petal width (target)

# Dimensions of dataset
print("Dimensions of dataset")
n = X_raw.shape[0]
p = X_raw.shape[1]
print("n=", n, "p=", p)
Dimensions of dataset
n= 150 p= 3
In [4]:
X_raw.shape # Array 150x3. Each element is a 3-dimensional data point: sepal length, sepal width, petal length
Out[4]:
(150, 3)
In [5]:
y_raw.shape # Vector 150. Each element is a 1-dimensional (scalar) data point: petal width
Out[5]:
(150,)
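The next cell displays the raw data as a pandas DataFrame named df. Building df is not shown in this extract; a minimal sketch of how it could be constructed from the loaded data (assuming pandas is available) is:

import pandas as pd
df = pd.DataFrame(data.data, columns=data.feature_names)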
In [6]:
df
Out[6]:
     sepal length (cm)   sepal width (cm)   petal length (cm)   petal width (cm)
…    …                   …                  …                   …
In [7]:
#
# Left blank intentionally
#
Split data
In [8]:
# split into train and test sets
# Total samples
nsamples = n
# Index at which to split: the first jindex samples go to the train set, the rest to the test set
# (the exact value used is not shown here; an 80/20 split is assumed)
jindex = int(0.8 * nsamples)
# Samples in train
nsamples_train = jindex
# Samples in test
nsamples_test = nsamples - nsamples_train
print("Total number of samples: ", nsamples, "\nSamples in train set: ", nsamples_train,
      "\nSamples in test set: ", nsamples_test)
X_train = X_raw[:jindex, :]
y_train = y_raw[:jindex]
X_test = X_raw[jindex:, :]
y_test = y_raw[jindex:]
Transform features
Important Note
Be careful not to write X_test_std = sc.fit_transform(X_test) instead of X_test_std = sc.transform(X_test). In this case it would not make much difference, since the mean and standard deviation of the test set should be quite similar to those of the training set. However, this is not always true for Forex market data, as is well established in the literature. The correct way is to re-use the parameters fitted on the training set for any kind of transformation, so that the test set truly stands for “new, unseen” data.
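To make the point concrete, here is a minimal sketch with made-up numbers (not the iris data) showing how re-fitting the scaler on the test set hides a genuine shift between train and test:

import numpy as np
from sklearn.preprocessing import StandardScaler

X_tr = np.array([[1.0], [2.0], [3.0]])   # toy training column, mean 2
X_te = np.array([[10.0], [11.0]])        # toy test column with a very different mean

sc = StandardScaler().fit(X_tr)                      # parameters learned from the training set only
print(sc.transform(X_te).ravel())                    # correct: large z-scores, roughly [9.8, 11.0]
print(StandardScaler().fit_transform(X_te).ravel())  # wrong: [-1., 1.], the shift disappears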
In [9]:
# Scale data
from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train_std = sc.fit_transform(X_train)
X_test_std = sc.transform(X_test)

# Clears the default graph stack and resets the global default graph
from tensorflow.python.framework import ops
ops.reset_default_graph()
In [11]:
# make results reproducible
seed = 2
tf.set_random_seed(seed)
np.random.seed(seed)
# Initialize hyperparameters
n_features = X_train.shape[1]# Number of features in training data
print("Number of featuress in training data: ", n_features)
batch_size = 50
# Placeholders
print("Placeholders")
X = tf.placeholder(dtype=tf.float32, shape=[None, n_features], name="X")
y = tf.placeholder(dtype=tf.float32, shape=[None,1], name="y")
# Initializers
print("Initializers")
sigma = 1
weight_initializer = tf.variance_scaling_initializer(mode="fan_avg", distribution="uniform", scale=sigma)
bias_initializer = tf.zeros_initializer()
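Note that tf.variance_scaling_initializer with mode="fan_avg", a uniform distribution and scale=1 is equivalent to the Glorot (Xavier) uniform initializer, a common default choice for small fully connected networks.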
In [12]:
# Dimensions of the layers (aka layer nodes, neurons) (See figure of the model)
d0 = D = n_features  # Layer 0 (Input layer)
d1 = 10              # Layer 1 (Hidden layer 1). Selected 10 for this example
d2 = C = 1           # Layer 2 (Output layer)
print("d0 =", d0, " d1 =", d1, " d2 =", d2)

# Model parameters: weights and biases of the hidden and output layers
W1 = tf.Variable(weight_initializer([d0, d1]), name="W1")
bias1 = tf.Variable(bias_initializer([d1]), name="bias1")
W2 = tf.Variable(weight_initializer([d1, d2]), name="W2")
bias2 = tf.Variable(bias_initializer([d2]), name="bias2")

# Construct model
hidden_output = tf.nn.relu(tf.add(tf.matmul(X, W1), bias1))
final_output = tf.nn.relu(tf.add(tf.matmul(hidden_output, W2), bias2))

# Define loss (MSE, as plotted below) and optimizer
loss = tf.reduce_mean(tf.square(final_output - y))
my_opt = tf.train.GradientDescentOptimizer(0.005)
train_step = my_opt.minimize(loss)

# Initialize variables
init = tf.global_variables_initializer()
d0 = 3 d1 = 10 d2 = 1
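As a sanity check on the dimensions involved, here is a minimal NumPy-only sketch of the same forward pass (the _np names are illustrative and not part of the notebook): a batch of shape [batch_size, d0] times W1 of shape [d0, d1] gives hidden activations of shape [batch_size, d1], which times W2 of shape [d1, d2] gives predictions of shape [batch_size, 1].

import numpy as np

batch = np.random.rand(50, 3)                        # [batch_size, d0]
W1_np, b1_np = np.random.rand(3, 10), np.zeros(10)   # [d0, d1] weights, [d1] biases
W2_np, b2_np = np.random.rand(10, 1), np.zeros(1)    # [d1, d2] weights, [d2] biases

hidden = np.maximum(batch @ W1_np + b1_np, 0)        # ReLU -> shape (50, 10)
pred = np.maximum(hidden @ W2_np + b2_np, 0)         # ReLU -> shape (50, 1)
print(hidden.shape, pred.shape)                      # (50, 10) (50, 1)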
In [13]:
W1
Out[13]:
In [14]:
# Create the session and a writer to record the graph for display in TensorBoard
sess = tf.Session()
writer = tf.summary.FileWriter("/tmp/tensorflow_logs", sess.graph)
sess.run(init)

# Training loop (feeding the standardized features computed above)
train_loss = []
test_loss = []
for i in range(1000):
    rand_index = np.random.choice(len(X_train_std), size=batch_size)
    X_rand = X_train_std[rand_index]
    y_rand = np.transpose([y_train[rand_index]])
    sess.run(train_step, feed_dict={X: X_rand, y: y_rand})
    # Record train and test loss at each step for the plot below
    train_loss.append(sess.run(loss, feed_dict={X: X_train_std, y: np.transpose([y_train])}))
    test_loss.append(sess.run(loss, feed_dict={X: X_test_std, y: np.transpose([y_test])}))

writer.flush()
writer.close()
In [15]:
%matplotlib inline
import matplotlib.pyplot as plt
# Plot loss (MSE) over time
plt.plot(train_loss, 'k-', label='Train Loss')
plt.plot(test_loss, 'r--', label='Test Loss')
plt.title('Loss (MSE) per Generation')
plt.legend(loc='upper right')
plt.xlabel('Generation')
plt.ylabel('Loss')
plt.show()
Tensorboard Graph
What follows is the graph we have executed, together with the data recorded about it.
#
# Left blank intentionally
#
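If you run the notebook yourself, you can inspect this graph by launching TensorBoard against the log directory used by the FileWriter above (tensorboard --logdir /tmp/tensorflow_logs) and opening the URL it prints in a browser.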
We have reached the end of this post, but don't worry, I will continue it very soon. In the meantime, I propose that you put in place the infrastructure needed to create trading systems. Take a look at the following posts, which are aimed at helping you acquire good practices so that your work is reproducible.
Remember all these calculations are included in a Jupyter notebook in my Github repository:
https://github.com/parrondo/deeptrading
parrondo