Episode 1: Multi-output forecasting

This week I read the paper “Deep Multi-Output Forecasting - Learning to Accurately Predict Blood Glucose Trajectories” [2], and this is everything I learned from it:

[Disclaimer: The paper is somewhat disappointing as an applications paper on diabetes, since it omits crucial details about study design, computational cost, demographics, and offline/online training and testing speed. Still, I was able to salvage some important ideas from the wreckage.]

Vocabulary:

Single-step forecasting problem: estimate the signal's value at a single future time instant from its past values, e.g. the next blood glucose measurement.
Multi-step forecasting problem: estimate multiple values over a future time horizon recursively, feeding each prediction back in as input for the next step.
Multi-output forecasting: estimate multiple values over a future time horizon all at once, in a single shot, instead of recursively.
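The difference between the last two definitions is easiest to see in code. Below is a minimal sketch contrasting recursive multi-step forecasting with direct multi-output forecasting, using a simple linear autoregressive model on a toy sine-wave signal as a hypothetical stand-in for the paper's RNNs and glucose traces:

```python
import numpy as np

# Toy signal: a noisy sine wave standing in for a glucose trace.
rng = np.random.default_rng(0)
t = np.arange(200)
signal = np.sin(0.1 * t) + 0.05 * rng.standard_normal(200)

p, horizon = 10, 5  # lookback window and forecast horizon (assumed values)

# Past windows of length p, used as model inputs.
X = np.stack([signal[i:i + p] for i in range(len(signal) - p)])

# Single-step model: predict one step ahead from the window.
y_next = signal[p:]
w_single, *_ = np.linalg.lstsq(X[:-horizon], y_next[:-horizon], rcond=None)

# Multi-step (recursive): apply the single-step model repeatedly,
# feeding each prediction back in as if it were an observation.
window = signal[-p:].copy()
recursive = []
for _ in range(horizon):
    nxt = window @ w_single
    recursive.append(nxt)
    window = np.append(window[1:], nxt)  # prediction re-enters the input

# Multi-output (direct): one model maps the window to all horizon
# values at once, so errors are not fed back recursively.
Y_multi = np.stack([signal[i + p:i + p + horizon]
                    for i in range(len(signal) - p - horizon + 1)])
W_multi, *_ = np.linalg.lstsq(X[:len(Y_multi)], Y_multi, rcond=None)
multi_output = signal[-p:] @ W_multi  # all horizon values in one shot

print(len(recursive), multi_output.shape)
```

The recursive approach compounds its own prediction errors over the horizon, which is one motivation the paper gives for preferring multi-output forecasting.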

Summary:

Concepts:

RNN architecture [7]
ARN architecture [6]

Code of the Week:

This week I used an LSTM model in Keras to predict the daily minimum temperature in Melbourne, Australia, and observed the effect of different activation functions on the model's performance. Feel free to play with other parameters such as the optimizer, epochs, batch_size, etc. The data comes from the Australian Bureau of Meteorology and can be downloaded from [8]. Check out the coding exercise on my GitHub profile.
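For anyone who wants to try this before opening the exercise, here is a minimal sketch of the setup, with a synthetic sinusoid standing in for the Melbourne temperature series (the real CSV is linked in [8]); the lookback of 14 days and the layer sizes are my own assumptions:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in data: a smooth seasonal-looking series in place of
# the daily minimum temperatures from [8].
series = np.sin(0.05 * np.arange(400)).astype("float32")

lookback = 14  # days of history fed to the model (assumed)
X = np.stack([series[i:i + lookback] for i in range(len(series) - lookback)])
y = series[lookback:]
X = X[..., None]  # Keras LSTMs expect (samples, timesteps, features)

def build_model(activation="tanh"):
    # Small LSTM regressor; swap `activation` (e.g. 'tanh' vs 'relu')
    # to compare performance, as discussed above.
    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(16, activation=activation,
                             input_shape=(lookback, 1)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

model = build_model("tanh")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
pred = model.predict(X[:1], verbose=0)
print(pred.shape)
```

Swapping the activation string in `build_model` and re-fitting is enough to reproduce the comparison; the optimizer, epochs, and batch_size can be varied the same way.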

Thought of the Week:

The two major steps in learning the basics of any new algorithm: first study a concise bit of theory, then work through a quick coding exercise.

I think Jason at Machine Learning Mastery [1] is an excellent resource that follows the same approach, a concise piece of theory followed by a quick coding exercise, in its ML tutorials. I love reading it!

See you next week!

References:

[1] Machine Learning Mastery
[2] Deep Multi-output Forecasting
[3] Recurrent Neural Networks
[4] Autoregressive Models
[5] ARIMA
[6] Deepmind
[7] LSTM
[8] 7 Time Series Datasets for Machine Learning
