# Bayesian Time Series Analysis

In time series analysis we are often interested in understanding how various real-life circumstances impact our quantity of interest. These can be, for instance, the season, the day of the week, or the time of day. To analyse this it is useful to decompose the time series into simpler components (corresponding to the relevant circumstances) and infer their relevance. In this tutorial we are going to use Turing for time series analysis and learn about useful ways to decompose time series.

# Modelling time series

Before we start coding, let us talk about what exactly we mean by time series decomposition. In a nutshell, it is a divide-and-conquer approach where we express a time series as a sum or a product of simpler series. For instance, the time series \(f(t)\) can be decomposed into a sum of \(n\) components

\[f(t) = \sum_{i=1}^n f_i(t),\]

or we can decompose \(g(t)\) into a product of \(m\) components

\[g(t) = \prod_{i=1}^m g_i(t).\]

We refer to these as *additive* and *multiplicative* decompositions, respectively. This type of decomposition is great since it lets us reason about individual components, which makes encoding prior information and interpreting model predictions very easy. Two common components are *trends*, which represent the overall change of the time series (often assumed to be linear), and *cyclic effects*, which contribute oscillating effects around the trend. Let us simulate some data with an additive linear trend and oscillating effects.

```
using Turing
using FillArrays
using StatsPlots
using LinearAlgebra
using Random
using Statistics

Random.seed!(12345)

true_sin_freq = 2
true_sin_amp = 5
true_cos_freq = 7
true_cos_amp = 2.5
tmax = 10
β_true = 2
α_true = -1
tt = 0:0.05:tmax

# True components: a linear trend plus one sine and one cosine effect.
f₁(t) = α_true + β_true * t
f₂(t) = true_sin_amp * sinpi(2 * t * true_sin_freq / tmax)
f₃(t) = true_cos_amp * cospi(2 * t * true_cos_freq / tmax)
f(t) = f₁(t) + f₂(t) + f₃(t)

plot(f, tt; label="f(t)", title="Observed time series", legend=:topleft, linewidth=3)
plot!(
    [f₁, f₂, f₃],
    tt;
    label=["f₁(t)" "f₂(t)" "f₃(t)"],
    style=[:dot :dash :dashdot],
    linewidth=1,
)
```

Even though we use simple components, combining them can give rise to fairly complex time series. In this time series, the cyclic effects are just added on top of the trend. If we instead multiply the components, the cyclic effects cause the series to oscillate between larger and larger values, since they get scaled by the trend.
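
The claim that the oscillation envelope grows with the trend can be checked numerically. A minimal sketch, using a simplified two-factor product (just a linear trend times one sine component, with hypothetical parameter values):

```julia
# A simplified multiplicative series: linear trend times one sine component
# (hypothetical parameter choices for illustration).
trend(t) = -1 + 2t
cyc(t) = 5 * sinpi(2 * t * 2 / 10)
g_toy(t) = trend(t) * cyc(t)

# The oscillation envelope grows with the trend:
early = maximum(abs, g_toy.(0.5:0.01:2.5))
late = maximum(abs, g_toy.(7.5:0.01:9.5))
println(late > early)  # true: later oscillations are larger
```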

```
g(t) = f₁(t) * f₂(t) * f₃(t)
plot(g, tt; label="g(t)", title="Observed time series", legend=:topleft, linewidth=3)
plot!([f₁, f₂, f₃], tt; label=["f₁(t)" "f₂(t)" "f₃(t)"], linewidth=1)
```

Unlike \(f\), \(g\) oscillates around \(0\) since it is being multiplied by sines and cosines. To let a multiplicative decomposition oscillate around the trend we could instead define it as \(\tilde{g}(t) = f_1(t) \cdot (1 + f_2(t)) \cdot (1 + f_3(t)),\) but for convenience we will leave it as is. The inference machinery is the same for both cases.
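
As a quick sanity check on the rescaled form, \(\tilde{g}\) reduces to the bare trend when both cyclic amplitudes are zero. A minimal sketch with hypothetical component definitions:

```julia
# Hypothetical components with adjustable cyclic amplitudes.
f1(t) = -1 + 2t
f2(t, amp) = amp * sinpi(2 * t * 2 / 10)
f3(t, amp) = amp * cospi(2 * t * 7 / 10)
gtilde(t; amp2=5, amp3=2.5) = f1(t) * (1 + f2(t, amp2)) * (1 + f3(t, amp3))

# With zero amplitudes the cyclic factors are identically 1, leaving the trend:
println(gtilde(0.3; amp2=0, amp3=0) == f1(0.3))  # true
```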

# Model fitting

Having discussed time series decomposition, let us fit a model to the time series above and recover the true parameters. Before building our model, we standardise the time axis to \([0, 1]\) and subtract the max of the time series. This helps convergence while maintaining interpretability and the correct scales for the cyclic components.
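
The two transformations can be sketched on toy data (hypothetical values): after them, the time axis spans exactly \([0, 1]\) and the series has its maximum at \(0\):

```julia
# Hypothetical toy data standing in for the simulated series.
t_toy = [2.0, 4.0, 6.0, 8.0]
y_toy = [1.0, 3.0, 2.0, 5.0]

t_lo, t_hi = extrema(t_toy)
x_toy = (t_toy .- t_lo) ./ (t_hi - t_lo)  # min-max scale the time axis
y_std = y_toy .- maximum(y_toy)           # shift the series so its max is 0

println(extrema(x_toy) == (0.0, 1.0))  # true
println(maximum(y_std) == 0.0)         # true
```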

```
σ_true = 0.35
t = collect(tt[begin:3:end])
t_min, t_max = extrema(t)
x = (t .- t_min) ./ (t_max - t_min)
yf = f.(t) .+ σ_true .* randn(size(t))
yf_max = maximum(yf)
yf = yf .- yf_max

scatter(x, yf; title="Standardised data", legend=false)
```

Let us now build our model. We want to assume a linear trend and cyclic effects. Encoding a linear trend is easy enough, but what about cyclical effects? We will take a scattergun approach: create multiple cyclical features using both sine and cosine functions and let our inference machinery figure out which to keep. To do this, we define how long one period should be and create features in reference to said period. How long a period should be is problem dependent, but as an example let us say it is \(1\) year. If we then find evidence for a cyclic effect with a frequency of 2, that would mean a biannual effect. A frequency of 4 would mean a quarterly effect, and so on. Since we are using synthetic data, we are simply going to let the period be 1, which is the entire length of the time series.
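
To make the period interpretation concrete, here is a small sketch (hypothetical frequency choice): a sine feature with frequency \(k\) repeats every \(1/k\)-th of the period, so a frequency of 4 over a yearly period is a quarterly effect.

```julia
period = 1.0
k = 4  # hypothetical: a quarterly effect when the period is one year
feature(x) = sinpi(2 * k * x / period)

# Shifting the input by one cycle (period / k) leaves the feature unchanged:
x0 = 0.123
println(feature(x0 + period / k) ≈ feature(x0))  # true
```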

```
freqs = 1:10
num_freqs = length(freqs)
period = 1
cyclic_features = [sinpi.(2 .* freqs' .* x ./ period) cospi.(2 .* freqs' .* x ./ period)]

plot_freqs = [1, 3, 5]
freq_ptl = plot(
    cyclic_features[:, plot_freqs];
    label=permutedims(["sin(2π$(f)x)" for f in plot_freqs]),
    title="Cyclical features subset",
)
```

Having constructed the cyclical features, we can finally build our model. The model we will implement looks like this

\[ f(t) = \alpha + \beta_t t + \sum_{i=1}^F \beta_{\sin,i} \sin(2\pi f_i t) + \sum_{i=1}^F \beta_{\cos,i} \cos(2\pi f_i t), \]

with a Gaussian likelihood \(y \sim \mathcal{N}(f(t), \sigma^2)\). For convenience we treat the cyclical feature weights \(\beta_{\sin,i}\) and \(\beta_{\cos,i}\) the same in code, collecting them into a single weight vector \(\beta_c\). And just because it is so easy, we parameterise our model with the operation with which to apply the cyclic effects. This lets us use the exact same code for both additive and multiplicative models. Finally, we plot prior predictive samples to make sure our priors make sense.
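
To see that the matrix formulation used in the model matches the double sum in the equation above, here is a small check with hypothetical weights (three frequencies, period 1):

```julia
# Feature matrix: sine columns followed by cosine columns, as in the model.
freqs = 1:3
xs = [0.1, 0.5, 0.9]
c = [sinpi.(2 .* freqs' .* xs) cospi.(2 .* freqs' .* xs)]
βc = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]  # hypothetical weights
βsin, βcos = βc[1:3], βc[4:6]

# The product c * βc equals the explicit double sum over sine and cosine terms:
explicit = [
    sum(βsin[i] * sinpi(2 * freqs[i] * t) + βcos[i] * cospi(2 * freqs[i] * t) for i in 1:3)
    for t in xs
]
println(c * βc ≈ explicit)  # true
```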

```
@model function decomp_model(t, c, op)
    α ~ Normal(0, 10)
    βt ~ Normal(0, 2)
    βc ~ MvNormal(Zeros(size(c, 2)), I)
    σ ~ truncated(Normal(0, 0.1); lower=0)

    cyclic = c * βc
    trend = α .+ βt .* t
    μ = op(trend, cyclic)
    y ~ MvNormal(μ, σ^2 * I)
    return (; trend, cyclic)
end

y_prior_samples = mapreduce(hcat, 1:100) do _
    rand(decomp_model(x, cyclic_features, +)).y
end

plot(t, y_prior_samples; linewidth=1, alpha=0.5, color=1, label="", title="Prior samples")
scatter!(t, yf; color=2, label="Data")
```

With the model specified and with a reasonable prior we can now let Turing decompose the time series for us!

```
function mean_ribbon(samples)
    qs = quantile(samples)
    low = qs[:, Symbol("2.5%")]
    up = qs[:, Symbol("97.5%")]
    m = mean(samples)[:, :mean]
    return m, (m - low, up - m)
end

function get_decomposition(model, x, cyclic_features, chain, op)
    chain_params = Turing.MCMCChains.get_sections(chain, :parameters)
    return generated_quantities(model(x, cyclic_features, op), chain_params)
end

function plot_fit(x, y, decomp, ymax)
    trend = mapreduce(nt -> nt.trend, hcat, decomp)
    cyclic = mapreduce(nt -> nt.cyclic, hcat, decomp)

    trend_plt = plot(
        x,
        trend .+ ymax;
        color=1,
        label=nothing,
        alpha=0.2,
        title="Trend",
        xlabel="Time",
        ylabel="f₁(t)",
    )
    ls = [ones(length(t)) t] \ y
    α̂, β̂ = ls[1], ls[2:end]
    plot!(
        trend_plt,
        t,
        α̂ .+ t .* β̂ .+ ymax;
        label="Least squares trend",
        color=5,
        linewidth=4,
    )
    scatter!(trend_plt, x, y .+ ymax; label=nothing, color=2, legend=:topleft)

    cyclic_plt = plot(
        x,
        cyclic;
        color=1,
        label=nothing,
        alpha=0.2,
        title="Cyclic effect",
        xlabel="Time",
        ylabel="f₂(t)",
    )
    return trend_plt, cyclic_plt
end

chain = sample(decomp_model(x, cyclic_features, +) | (; y=yf), NUTS(), 2000; progress=false)
yf_samples = predict(decomp_model(x, cyclic_features, +), chain)
m, conf = mean_ribbon(yf_samples)
predictive_plt = plot(
    t,
    m .+ yf_max;
    ribbon=conf,
    label="Posterior density",
    title="Posterior decomposition",
    xlabel="Time",
    ylabel="f(t)",
)
scatter!(predictive_plt, t, yf .+ yf_max; color=2, label="Data", legend=:topleft)

decomp = get_decomposition(decomp_model, x, cyclic_features, chain, +)
decomposed_plt = plot_fit(t, yf, decomp, yf_max)
plot(predictive_plt, decomposed_plt...; layout=(3, 1), size=(700, 1000))
```

```
┌ Info: Found initial step size
└ ϵ = 0.025
```