A thorough analysis of the Random Walk Theory and the subsequent efficacy of using LSTMs for stock price prediction.
One of my previous articles concerned the futility of the common practice of attempting to use LSTMs for stock forecasting, due to the random walk nature of market prices over time. There was some criticism regarding the depth and argumentation of the post, so this planned 3 part series is dedicated to giving a relatively complete and more in-depth look into the topic.
The series will cover:
The Holt-Winters method — also known as triple exponential smoothing — is an incredibly popular and relatively simple method for time series forecasting. This article will be a somewhat thorough introduction to the math and theory of the Holt-Winters method, complete with a Python implementation from scratch.
The Holt-Winters method is a very common time series forecasting procedure capable of modelling both trend and seasonality. The Holt-Winters method itself is a combination of three much simpler components, all of which are smoothing methods:
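To make the smoothing components concrete, here is a minimal from-scratch sketch of the additive Holt-Winters method. The function name, arguments and initialisation scheme are illustrative choices rather than the article's own implementation; `alpha`, `beta` and `gamma` are the smoothing factors for the level, trend and seasonal components respectively.

```python
import numpy as np

def holt_winters_additive(series, season_len, alpha, beta, gamma, n_forecast):
    """Additive Holt-Winters (triple exponential smoothing), from scratch."""
    # Initialise level, trend and seasonal components from the first two seasons
    level = np.mean(series[:season_len])
    trend = (np.mean(series[season_len:2 * season_len]) -
             np.mean(series[:season_len])) / season_len
    seasonals = [series[i] - level for i in range(season_len)]

    fitted = []
    for i, value in enumerate(series):
        last_level = level
        s = seasonals[i % season_len]
        # Level: smooth the deseasonalised observation
        level = alpha * (value - s) + (1 - alpha) * (level + trend)
        # Trend: smooth the change in level
        trend = beta * (level - last_level) + (1 - beta) * trend
        # Seasonality: smooth the detrended observation
        seasonals[i % season_len] = gamma * (value - level) + (1 - gamma) * s
        fitted.append(level + trend + s)

    # Extrapolate level + trend forward, reusing the learned seasonal pattern
    forecast = [level + (h + 1) * trend + seasonals[(len(series) + h) % season_len]
                for h in range(n_forecast)]
    return fitted, forecast
```

Each update smooths one component while holding the others fixed, which is what keeps the method simple despite handling both trend and seasonality.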
Despite being an incredibly popular approach, LSTMs are an inherently terrible way of estimating stock prices
Long Short-Term Memory (LSTM) networks are an incredibly popular type of recurrent neural network, used primarily for learning sequences and order dependence. They are therefore generally a pretty solid pick when it comes to learning audio, language and, of course, time series data — anything with a temporal dimension. It is only natural that people would attempt to apply this temporal learning capability to arguably the most popular time series data of them all: the stock market. …
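The random-walk argument behind this claim can be illustrated with a quick simulation (synthetic data, not real prices): if a series is a random walk, the best forecast for tomorrow is simply today's value, so a naive last-value baseline already sets the bar any model must beat.

```python
import numpy as np

rng = np.random.default_rng(42)

# A random walk: each "price" is the previous one plus unpredictable noise
steps = rng.normal(loc=0.0, scale=1.0, size=1000)
prices = 100 + np.cumsum(steps)

# Under a random walk, the "tomorrow equals today" forecast is optimal in
# expectation; its error is just the noise itself, and no model can do
# systematically better on this data.
naive_forecast = prices[:-1]
actual = prices[1:]
mae = np.mean(np.abs(actual - naive_forecast))
```

The mean absolute error of the naive forecast here is simply the mean absolute size of the noise steps, which is the floor any learned model is competing against.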
Oftentimes, it can be incredibly useful to know the probability density function for a given set of observations. Unfortunately, most random samples of data will have unknown density functions, and thus the probability density will need to be estimated.
Enter Kernel Density Estimation: a non-parametric way of estimating the density function of a random variable.
In this post, we will be covering a theoretical and mathematical explanation of Kernel Density Estimation, as well as a Python implementation from scratch!
Kernel Density Estimation (KDE) is essentially a data smoothing method that fits a smooth line over a distribution…
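As a preview of the from-scratch implementation, a minimal KDE with a Gaussian kernel fits in a few lines. The function names and the bandwidth value below are illustrative choices, not the article's exact code.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard normal density, used as the smoothing kernel."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def kde(x_grid, samples, bandwidth):
    """Kernel density estimate: average a scaled kernel centred on each sample."""
    # Pairwise scaled distances between each grid point and each observation
    u = (x_grid[:, None] - samples[None, :]) / bandwidth
    return gaussian_kernel(u).sum(axis=1) / (len(samples) * bandwidth)
```

The bandwidth controls the degree of smoothing: too small and the estimate is spiky, too large and real structure in the data gets smoothed away.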
The ARIMA model and its variants are some of the most established models for time series forecasting. This article will be a somewhat thorough introduction to ARIMA/ARMA modelling, as well as the math behind how these models work.
The ARIMA (Auto Regressive Integrated Moving Average) model is a very common time series forecasting model. It is a more sophisticated extension of the simpler ARMA (Auto Regressive Moving Average) model, which in itself is just a merger of two even simpler components:
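Before fitting such a model, it helps to see what an ARMA process actually looks like. Below is an illustrative from-scratch simulation of an ARMA(1,1) process, where `phi` weights the auto-regressive part (dependence on the past value) and `theta` weights the moving-average part (dependence on the past shock); the function and parameter names are my own, not the article's.

```python
import numpy as np

def simulate_arma(phi, theta, n, sigma=1.0, seed=0):
    """Simulate an ARMA(1,1) process:

    x[t] = phi * x[t-1] + eps[t] + theta * eps[t-1]
    """
    rng = np.random.default_rng(seed)
    eps = rng.normal(scale=sigma, size=n)  # white-noise shocks
    x = np.zeros(n)
    for t in range(1, n):
        # AR part: phi * previous value; MA part: theta * previous shock
        x[t] = phi * x[t - 1] + eps[t] + theta * eps[t - 1]
    return x
```

The "I" (Integrated) in ARIMA adds differencing on top of this, so that a non-stationary series can be reduced to a stationary ARMA one before modelling.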
Knowing when something has gone terribly wrong is crucial in a wide variety of applications. Whether it is monitoring power draw for machinery, financial transactions or server metrics, when things go weird, that usually means trouble.
An anomaly is an unexpected deviation from normal behaviour, and thus generally indicates some kind of problem.
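As a minimal illustration of this definition, a simple z-score detector flags points that deviate too far from the mean of the series. The function name and the three-sigma default are illustrative choices; real monitoring systems typically use more robust, often seasonal-aware methods.

```python
import numpy as np

def zscore_anomalies(series, threshold=3.0):
    """Flag points more than `threshold` standard deviations from the mean."""
    series = np.asarray(series, dtype=float)
    z = (series - series.mean()) / series.std()
    return np.abs(z) > threshold
```

Anything flagged here is, by construction, an unexpected deviation from the normal behaviour of the data — exactly the working definition of an anomaly above.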
An introduction to using Cerybra’s Isaac AI and Cortecx library for simple Python NLP.
With recent AI milestones like Google’s BERT and OpenAI’s GPT-2, it seems as if NLP is booming more than ever. However, without much time, resources, manpower or at least intermediate Python experience, implementing certain NLP techniques in one’s code can be difficult.
Cortecx by Cerybra is a prototype NLP library that facilitates the implementation and use of NLP and other linguistic tools within your Python code. Currently, Cortecx supports the following, with more on the way!
Electronic hobbyist and AI enthusiast