Time series Analysis and Prediction
- Time series Analysis and Prediction
- Considerations, Methods and Factors in TS evaluation
- Sequential Data examples
- Traditional ANNs are NOT suitable for sequential data
- Human Brains learn from sequences of data, and are very capable of detecting new patterns.
- Cannot restrict length of TS data
- Seasonal Adjustments of TS data - compare YoY, QoQ
- Hidden Markov Models (HMMs) traditionally used for state modeling
- Recurrent Neural Networks (RNNs) for Sequential Data
- Sequential Data and Deep Learning
- LSTM for Time Series, Sequences
Time series Analysis and Prediction
Considerations, Methods and Factors in TS evaluation
Sequential Data examples
Time series data is perhaps the most common form of sequential data; other examples include video, speech, stock market prices, and sensor readings.
- For example, stock price charting is a kind of time series.
- Another set of data is humidity, temperature, pressure, etc. over time, which can be used to predict upcoming rain.
Traditional ANNs are NOT suitable for sequential data
Traditional feed-forward NNs take a set of inputs and produce outputs. These are very powerful, e.g., given an 8x8 image of a handwritten digit they can classify it very accurately, or they can do recognition, e.g., identifying the kind of animal.
But there is no "MEMORY" in these networks. In the rain example, if you simply take a snapshot of the data and try to predict 12 hours or a few days out, it will likely fail. However, if we see trends such as a rising barometer, increasing humidity, etc., we can forecast better.
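To make the point concrete, here is a minimal sketch (with made-up pressure readings) of the trend information a single snapshot throws away - exactly what a feed-forward net never sees unless it is handed as an explicit feature:

```python
import numpy as np

# Hypothetical hourly barometric-pressure readings (hPa).
pressure = np.array([1012.0, 1010.5, 1008.8, 1006.9, 1004.7, 1002.2])

# A single snapshot carries no trend information ...
snapshot = pressure[-1]

# ... but the first differences expose the steadily falling trend,
# a classic precursor of rain.
deltas = np.diff(pressure)          # change per hour
falling = bool(np.all(deltas < 0))  # pressure dropping every hour

print(snapshot)  # 1002.2
print(falling)   # True
```

The snapshot alone (1002.2 hPa) is ambiguous; the sequence of deltas is what makes the forecast possible.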
Human Brains learn from sequences of data, and are very capable of detecting new patterns.
Our brain is trained to learn from sequences of events. For example, the human brain has seen many examples of a wide variety of cars, so when it comes across a "space-ship or flying car" it can still recognize it as a car.
- Basically, context and memory allow the brain to use past knowledge to predict and to manage its interaction with the world.
Cannot restrict length of TS data
- In predicting stock prices, while it is true that recent technical patterns matter more than multi-year ones, the older patterns still have value. For example, in July 2020 gold breached its prior 2011 high of $1,940 an oz. That gap is nearly 10 years, yet very meaningful.
One of the main issues with sequential data is that we don't know which past time instants affect future outcomes.
So when we build a model, it should be able to handle sequences of arbitrary length - though we may adjust for the problem at hand.
Seasonal Adjustments of TS data - compare YoY, QoQ
- People drive more in summer and go on more vacations, so time series projections need to make seasonal adjustments. That is why a June 2020 YoY comparison to June 2019 may be more meaningful than a comparison to spring 2020 data when predicting how much gas to stock up on.
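A small sketch of this idea, using a fabricated monthly gasoline-sales series with a summer peak: the month-over-month change is dominated by the seasonal swing, while the year-over-year change compares like months (June vs. the prior June) and removes it.

```python
import pandas as pd

# Hypothetical monthly gasoline sales (arbitrary units): 120 in
# summer months (Jun-Aug), 100 otherwise.
idx = pd.date_range("2018-01-01", periods=30, freq="MS")
sales = pd.Series([100 + 20 * (d.month in (6, 7, 8)) for d in idx], index=idx)

# Month-over-month: jumps +20% every June purely from seasonality.
mom = sales.pct_change(1)

# Year-over-year: compares each month to the same month a year
# earlier, so the seasonal swing cancels out.
yoy = sales.pct_change(12)

print(mom["2018-06-01"])  # 0.2  (seasonal jump, not real growth)
print(yoy["2019-06-01"])  # 0.0  (no underlying change)
```

With real data, the YoY series is the one that reveals genuine growth or decline rather than the calendar.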
Hidden Markov Models (HMMs) traditionally used for state modeling
Traditionally, people have used Hidden Markov Models (HMMs) to analyze sequential data. HMMs have been applied to many tasks such as speech recognition, gesture recognition, part-of-speech tagging, etc.
Constraint 1 of HMMs for sequential data: no dependence on the full data sequence
The key assumption - and the source of HMMs' power - is the core Markov property: "the current state depends only on the previous state". Transition probabilities depend ONLY on the prior state, ignoring everything that happened before it. HMMs are generative models by nature: they model the joint distribution of outputs and hidden states.
NOTE: This is technically a first-order HMM; we can design a higher-order HMM that looks further back. But even then, the next state depends on a fixed number of previous instances. As discussed earlier, we may need to go back thousands of samples, as in the price of gold. Moreover, the number of prior samples needed may differ completely depending on the pattern in the TS.
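The first-order Markov property can be seen in a tiny simulation (the two-state transition matrix below is made up for illustration): any two histories that end in the same state yield identical next-state distributions, no matter what came before.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-state transition matrix (0 = "down", 1 = "up").
# Row i is P(next state | current state i) -- nothing earlier matters.
T = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def step(state):
    """Sample the next state; it depends only on the current state."""
    return int(rng.choice(2, p=T[state]))

# Histories "...down, up" and "...up, up" both end in state 1,
# so a first-order model gives them the SAME next-state distribution.
p_after_down_up = T[1]
p_after_up_up = T[1]
print(np.array_equal(p_after_down_up, p_after_up_up))  # True
```

This is exactly the limitation the note above describes: to make a pattern from ten years ago matter, a first-order model would need that pattern encoded into the current state, which the formalism does not provide.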
Constraint 2: Transition probabilities cannot change with time
In the real world the dynamics themselves change: predicting future prices in a rising stock market is quite different from doing so in a falling one (even allowing for reversion to the mean, the series will likely remain in a rising or falling trend).
HMMs do not allow the state transition matrix to evolve over time, which is a big restriction because real data tends to change with time.
Constraint 3: HMMs assume that the current output is independent of the previous outputs
HMMs assume that each output depends only on the current hidden state, so successive outputs are conditionally independent. Humans, however, often show confirmation bias: e.g., if they just said they prefer Item A over B, they will tend to stick to that choice - so the next output does depend on the previous one.
Recurrent Neural Networks (RNNs) for Sequential Data
RNNs know how to operate on sequences of data and are more powerful than conventional feed-forward neural networks: they can use the same neurons, but the connectivity pattern between layers is different. Unlike feed-forward networks (including CNNs), RNNs allow directed cycles in the connections among neurons.
The primary difference is that in an RNN a neuron can feed its own output back to itself.
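A minimal sketch of that feedback loop, as a vanilla RNN cell in NumPy (weights are random, purely for illustration): the hidden state h is fed back in at every step, so the final h depends on the entire sequence, of arbitrary length.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_hidden = 3, 4

W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (the cycle)
b = np.zeros(n_hidden)

def rnn_forward(xs):
    """Run the cell over a sequence of any length."""
    h = np.zeros(n_hidden)
    for x in xs:
        # h is both an output of this step and an input to the next:
        # this recurrence is the "memory" feed-forward nets lack.
        h = np.tanh(W_xh @ x + W_hh @ h + b)
    return h

seq = rng.normal(size=(10, n_in))  # a length-10 sequence
h_final = rnn_forward(seq)
print(h_final.shape)  # (4,)
```

Note that `rnn_forward` places no restriction on sequence length - the same weights process 10 steps or 10,000 - which is precisely the "arbitrary length" requirement raised earlier.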
Sequential Data and Deep Learning
- Estimating The Predictability Of Time Series Data – Part I | PERPETUAL ENIGMA
- Estimating The Predictability Of Time Series Data – Part II | PERPETUAL ENIGMA
- What Is Long Memory In Time Series Analysis | PERPETUAL ENIGMA
- Measuring The Memory Of Time Series Data | PERPETUAL ENIGMA