Understanding Deep Learning Forecasts over Time

Photo by Thomas Bormans on Unsplash

In this post we’ll talk about a form of time travel. Forecasting is all about trying to predict what the future holds, but to do that we first need to understand the past: not just a still snapshot of it, but a continuous period of time. We’ll show how we visualize our Deep Learning models’ “vision” over time, almost like a movie.

In a previous post, we gave a brief overview of how Predicto is able to provide explanations for its daily generated financial forecasts. As it turns out, this is a powerful tool. It allows us to go back in time and revisit Deep Learning (DL) models’ performance over time and extract valuable information. Although in our case we deal with financial time series forecasting, the same approach can be used for any type of time series forecasting. In this post, we are going to describe how we can take advantage of Deep Learning explainability over time with examples.

In Predicto, we maintain (and continuously retrain and iterate on) several DL models for a number of stocks. To keep a clear view, we track every single model’s forecasting performance over time (we’ll cover this area in more detail in a future article), and when a DL model makes it to a top list like the one below, it attracts our attention.

Walmart stock DL model forecasts over a 2-month period

Even better, thanks to our Explanation framework we can go back in time and understand why a model has been performing well or, to put it even more simply, visualize what kind of patterns the trained deep neural network “saw”. If what we discover makes sense, that is a promising signal to start paying closer attention and take further action.

The purpose of this article is to present one such occasion in action.

We present a model that has been consistently accurate over the last 2 months: a trained Deep Learning model for the Walmart stock (WMT). It has been trained on several features from financial time series that we collect (some simple, some more complex), going back several years.

But let’s have a closer look. Take some time to study the daily generated forecasts in the moving graph below, along with the features and time periods that most influence each prediction. Every forecast of our models generates stock price predictions for the 15 days ahead.

To better understand how explanation heat maps are generated for our forecasts, have a look at the previous article about Explaining Financial Deep Learning Forecasts. In short, the lighter a box is in the heat map, the more influence that specific feature/time-period pair appears to have on the generated forecast.
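The post doesn’t spell out exactly how Predicto computes these heat maps, but an occlusion-style sketch illustrates the general idea: zero out one feature/time-period block at a time and measure how much the forecast moves. Everything here (the function names, the block layout, the zero baseline) is an illustrative assumption, not Predicto’s actual implementation.

```python
import numpy as np

def occlusion_heatmap(model_fn, window, n_periods=6):
    """Rough occlusion-style influence map.

    window:   (timesteps, features) input array for one forecast.
    model_fn: maps a window to a forecast (e.g. 15 daily prices).
    Returns a (features, n_periods) grid; larger values mean the
    forecast moved more when that block was occluded.
    """
    timesteps, n_features = window.shape
    base = model_fn(window)              # baseline forecast
    period_len = timesteps // n_periods
    heat = np.zeros((n_features, n_periods))
    for f in range(n_features):
        for p in range(n_periods):
            occluded = window.copy()
            occluded[p * period_len:(p + 1) * period_len, f] = 0.0
            # bigger forecast change => more influence (a lighter box)
            heat[f, p] = np.abs(model_fn(occluded) - base).mean()
    return heat
```

Plotted as an image, such a grid would resemble the heat maps shown here: rows are features, columns are time periods, brightness is influence.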

Walmart model performance & feature influence per period heat map during the last 2 months (15 days ahead forecasting)

By visualizing the generated explanation heat maps for every single daily forecast, we are able to understand what this model “sees” over time and whether its “vision” is consistent. Features used are obfuscated but their order is constant.

So what information do we extract from the above?

  1. We identify 4 features with consistently high importance in almost every forecast (gray square in the heat map above). Other features influence forecasts as well, but most of the influence comes from these 4 features, which appear in the central rows of the explanation heat maps. They relate to complex options time series data for the stock.
  2. By following these 4 features as time passes, we notice that the influential time periods start at the most recent 1–2 weeks and gradually extend to 1–4 weeks (light green square in the heat map above). This indicates that the deep neural network identified a data pattern that helped it form accurate forecasts.
Time series of the 4 features identified as highly important during the influential time periods

To dig even further, we generate a graph of the features identified as highly important and observe the time periods that appear to influence our forecasts (blue square in the chart above). One hypothesis we might form is that this small coordinated trough was recognized as a pattern that allowed our model to project consistently accurate forecasts, in conjunction, of course, with other features. More experienced traders might form their own hypotheses from these graphs, and that is the real value of having an explainability framework.
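The cross-forecast analysis described above, spotting features that stay important day after day, can be sketched as a simple aggregation over daily heat maps. This is a minimal, hypothetical version (the function name and array shapes are illustrative, not Predicto’s actual code):

```python
import numpy as np

def consistently_important(daily_heatmaps, top_k=4):
    """Given one (features x periods) influence map per daily
    forecast, rank features by their average total influence
    across all days and return the indices of the top_k."""
    stacked = np.stack(daily_heatmaps)               # (days, features, periods)
    per_feature = stacked.sum(axis=2).mean(axis=0)   # mean influence per feature
    return np.argsort(per_feature)[::-1][:top_k]
```

Applied to two months of daily heat maps, this kind of ranking would surface the 4 options-related features highlighted in the gray square.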

To give some insight to our Deep Learning enthusiast readers, we model our forecasting problem as a sequence-to-sequence problem: given the last 90 days of several time series, predict the daily prices for the next 15 days. The actual DL model architecture used varies depending on the model.
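As a rough sketch of such a sequence-to-sequence setup (Predicto’s real architectures vary per model, and the hidden size and feature count here are made up), a minimal PyTorch version might look like this:

```python
import torch
import torch.nn as nn

LOOKBACK, HORIZON, N_FEATURES = 90, 15, 12  # feature count is illustrative

class Seq2SeqForecaster(nn.Module):
    """Minimal sketch: an LSTM encodes the 90-day multivariate
    window into a hidden state, and a linear head decodes it into
    15 daily prices in one shot."""
    def __init__(self, n_features=N_FEATURES, hidden=64, horizon=HORIZON):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):               # x: (batch, 90, n_features)
        _, (h_n, _) = self.encoder(x)   # h_n: (layers, batch, hidden)
        return self.head(h_n[-1])       # (batch, 15) daily prices
```

A production variant could just as well use an autoregressive decoder, attention, or a Transformer; the point is only the 90-in / 15-out shape of the problem.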

And that concludes our brief analysis!

Overall, our goal is to take advantage of the latest advances in Deep Learning in order to build a fully autonomous AI for short-term stock trading.

One step at a time.

Our process goes like this:

  1. Mine/Collect/Buy data that help us constantly improve.
  2. Train/Retrain/Iterate on Deep Learning models, architectures, and features.
  3. Generate daily forecasts using our trained models.
  4. Generate daily actionable trades based on the latest forecasts and models’ recent performance.
  5. Maintain an explainability & monitoring framework for all our models for human supervision.
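The five steps above form a daily loop, which can be sketched in Python as follows. Every name here is a hypothetical stand-in, not Predicto’s actual API:

```python
def daily_cycle(models, fetch_data, make_trades):
    """Hypothetical sketch of one pass through the five steps."""
    data = fetch_data()                     # 1. mine/collect/buy fresh data
    for m in models:
        m.maybe_retrain(data)               # 2. retrain/iterate on models
    forecasts = {m.ticker: m.forecast(data)
                 for m in models}           # 3. daily forecasts per stock
    trades = make_trades(forecasts)         # 4. actionable trades from forecasts
    for m in models:
        m.log_explanation(data)             # 5. explainability & monitoring
    return trades
```

The explainability step runs last on purpose: every trade suggestion stays paired with the heat maps a human supervisor can review.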

If you want to learn more about our work, feel free to have a look at our previous blog posts or experiment with our platform at Predicto.

Thank you for reading — Stay tuned for our next article!

Web https://predic.to — Twitter @ThePredicto — GitHub ThePredicto




Stock & Cryptocurrency Forecasting AI. Based on News and Options Data. Powered by Intelligible Deep Learning models. https://predic.to
