Added week 2 materials
wmutschl committed Oct 17, 2024
1 parent 9b6f062 commit fbc74af
Showing 11 changed files with 163 additions and 91 deletions.
11 changes: 11 additions & 0 deletions .github/workflows/dynare-6.2-matlab-r2024b-macos.yml
@@ -42,3 +42,14 @@ jobs:
addpath("Dynare-6.2-arm64/matlab");
cd("progs/matlab");
quickTourMatlab;
- name: Run week 2 codes
uses: matlab-actions/run-command@v2
with:
command: |
addpath("Dynare-6.2-arm64/matlab");
cd("progs/matlab");
visualizingTimeSeriesDataNorway;
definitionFrequenciesTimeSeriesData;
whiteNoisePlots;
plotsAR1;
11 changes: 11 additions & 0 deletions .github/workflows/dynare-6.2-matlab-r2024b-ubuntu.yml
@@ -71,3 +71,14 @@ jobs:
addpath("dynare/matlab");
cd("progs/matlab");
quickTourMatlab;
- name: Run week 2 codes
uses: matlab-actions/run-command@v2
with:
command: |
addpath("dynare/matlab");
cd("progs/matlab");
visualizingTimeSeriesDataNorway;
definitionFrequenciesTimeSeriesData;
whiteNoisePlots;
plotsAR1;
10 changes: 10 additions & 0 deletions .github/workflows/dynare-6.2-matlab-r2024b-windows.yml
@@ -34,3 +34,13 @@ jobs:
cd("progs/matlab");
quickTourMatlab;
- name: Run week 2 codes
uses: matlab-actions/run-command@v2
with:
command: |
addpath("D:\hostedtoolcache\windows\dynare-6.0\matlab");
cd("progs/matlab");
visualizingTimeSeriesDataNorway;
definitionFrequenciesTimeSeriesData;
whiteNoisePlots;
plotsAR1;
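
The three workflow steps added above differ only in where each runner's Dynare MATLAB folder lives. As a rough guide for running the same week 2 scripts locally, a minimal sketch — the Dynare installation path below is a placeholder assumption, not a path from this repository:

% Local reproduction of the "Run week 2 codes" step (path is a placeholder).
addpath("/path/to/dynare-6.2/matlab");  % assumption: adjust to your own Dynare 6.2 installation
cd("progs/matlab");                     % repository folder holding the week 2 scripts
visualizingTimeSeriesDataNorway;
definitionFrequenciesTimeSeriesData;
whiteNoisePlots;
plotsAR1;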
4 changes: 3 additions & 1 deletion README.md
@@ -42,7 +42,6 @@ Please feel free to use this for teaching or learning purposes; however, taking

</details>

<!---

<details>
<summary>Week 2: Time series data and fundamental concepts</summary>
@@ -67,6 +66,9 @@ Please feel free to use this for teaching or learning purposes; however, taking

</details>

<!---
<details>
<summary>Week 3: Dependent time series data and the autoregressive process</summary>
11 changes: 5 additions & 6 deletions exercises/definition_frequencies_time_series_data.tex
@@ -13,20 +13,19 @@
\item \texttt{NorwayUnemploymentRate.xls}
\end{itemize}
Open the individual files and make note of the structure and source of the data.
Import the data and replicate figure \ref{fig:NorwayData}.
Import the data and replicate figure~\ref{fig:NorwayData}.
\begin{figure}[htbp]
\centering
\includegraphics[width=\linewidth]{plots/NorwayDataOverviewMatlab.pdf}
\caption{Various Time Series For Norway}
\label{fig:NorwayData}
\caption{Various Time Series For Norway}\label{fig:NorwayData}
\end{figure}
\item What are the data frequencies for each time series?
For what kind of economic analysis would you use these frequencies?
\item What are the sample sizes?
From an economic and/or statistical point of view, is it always better to have a larger sample size?
\item Roughly speaking, a time series consists of four components: a trend, a cycle, a season, and noise.
To what extent do you find these features in figure \ref{fig:NorwayData}?
\item Consider the plots in figure \ref{fig:NorwayData} jointly.
To what extent do you find these features in figure~\ref{fig:NorwayData}?
\item Consider the plots in figure~\ref{fig:NorwayData} jointly.
What are possible macroeconomic issues that could be analyzed?
\item How do you aggregate time series of stock variables (like capital or debt) and of flow variables (like GDP)?
For example, if you have monthly data, how do you get a quarterly time series?
@@ -38,7 +37,7 @@
\end{itemize}

\begin{solution}\textbf{Solution to \nameref{ex:DefinitionFrequenciesTimeSeriesData}}
\ifDisplaySolutions
\ifDisplaySolutions%
\input{exercises/definition_frequencies_time_series_data_solution.tex}
\fi
\newpage
18 changes: 10 additions & 8 deletions exercises/definition_frequencies_time_series_data_solution.tex
@@ -1,9 +1,9 @@
\begin{enumerate}
\item A time series is a collection of observations $Y_t$ indexed by the date of each observation $t$.
For simplicity often one denotes $t=0,1,2,...,T$,
then $\{Y_t\}_0^T = \{Y_0,Y_1,Y_2,...,Y_T\}$ is a sequence of random variables ordered in time
(each $Y_t$ is a random variable), which we call a stochastic process.
Sometimes we rely on the concept of an infinite sample and consider $\{Y_t\}_{t=-\infty}^\infty$ or simply $\{Y_t\}$.
\item A time series is a collection of observations \(Y_t\) indexed by the date of each observation \(t\).
For simplicity, one often denotes \(t=0,1,2,\ldots ,T\),
then \( {\{Y_t\}}_0^T = \{Y_0,Y_1,Y_2,\ldots ,Y_T\} \) is a sequence of random variables ordered in time
(each \(Y_t\) is a random variable), which we call a stochastic process.
Sometimes we rely on the concept of an infinite sample and consider \( {\{Y_t\}}_{t=-\infty}^\infty \) or simply \( \{Y_t\} \).
A stochastic process can have many outcomes, due to its randomness,
and a single outcome of a stochastic process is called a sample function or realization.
A time series model assigns a joint probability distribution to the stochastic process.
@@ -56,7 +56,9 @@
\item Are house prices in line with their fundamentals? Is there a bubble?
\end{itemize}
\item Aggregation of higher frequencies to lower frequencies is straightforward;
that is, for stock variables (such as capital or debt) we simply take the value observed, i.e. $k_t^{Q1} = k_t^{m3}$,
whereas for flow variables (such as GDP) we can take the mean: $y_t^{Q1} = 1/3 (y_t^{m1}+y_t^{m2}+y_t^{m3})$.
Disaggregation is much more difficult and we need to use tools like interpolation or spline functions etc. $\rightarrow$ not straightforward!
that is, for stock variables (such as capital or debt) we simply take the value observed, i.e.\
\(k_t^{Q1} = k_t^{m3}\),
whereas for flow variables (such as GDP) we can take the mean: \(y_t^{Q1} = 1/3 (y_t^{m1}+y_t^{m2}+y_t^{m3})\).
Disaggregation is much more difficult and we need to use tools like interpolation or spline functions etc.\
\(\rightarrow \) not straightforward!
\end{enumerate}
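
As an illustration of the aggregation rule in the last item above (stock variables keep the observation from the last month of the quarter, flow variables are averaged over the quarter), a minimal MATLAB sketch with hypothetical variable names, not taken from the course codes:

% Aggregate 24 months of hypothetical data to 8 quarters.
T     = 24;
k_m   = cumsum(randn(T,1)) + 100;   % monthly stock variable (e.g. capital)
y_m   = randn(T,1) + 2;             % monthly flow variable (e.g. GDP growth)
k_mat = reshape(k_m, 3, []);        % each column holds one quarter (m1,m2,m3)
y_mat = reshape(y_m, 3, []);
k_q   = k_mat(3, :)';               % stock: value observed in the last month
y_q   = mean(y_mat, 1)';            % flow: mean over the three months
disp([k_q y_q]);                    % 8 quarterly observations of each series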
36 changes: 22 additions & 14 deletions exercises/fundamental_concepts_univariate_time_series.tex
@@ -1,27 +1,35 @@
\section[Some Fundamental Concepts Of Univariate Time Series Analysis]{Some Fundamental Concepts Of Univariate Time Series Analysis\label{ex:FundamentalConceptsUnivariateTimeSeriesAnalysis}}

\begin{enumerate}
\item Define the \enquote{White Noise Process} labeled shortly as $\varepsilon_t \sim WN(0,\sigma_\varepsilon^2)$.
\item Plot 200 observations of
\begin{align*}
(i)~y_t &= \varepsilon_t
\\ (ii)~y_t &= \frac{1}{5}(\varepsilon_{t-2}+\varepsilon_{t-1}+\varepsilon_{t}+\varepsilon_{t+1}+\varepsilon_{t+2})
\end{align*}
with $\varepsilon_{t} \sim N(0,1)$. What are the differences?
\item Briefly explain the concepts of (i) weak stationarity and (ii) strict stationarity.
\item Define the autocovariance and autocorrelation function for a covariance-stationary stochastic process $\{Y_t\}$.
\item Consider the linear first-order difference equation $$y_t=\phi y_{t-1}+\varepsilon_t$$ with $\varepsilon_{t} \sim N(0,1)$. Simulate and plot 200 observations of $(i) |\phi|<1$, $(ii) \phi=1$, and $(iii) |\phi| >1$. What does this imply in terms of stationarity of the process?
\item Briefly explain the Lag-operator and Lag-polynomials. How can we check whether an AR(p) process $$y_t - \phi_1 y_{t-1} -\phi_2 y_{t-2} - ... - \phi_p y_{t-p} = (1-\phi_1 L-\phi_2 L^2 -... - \phi_p L^p)y_t = \varepsilon_t$$ is weakly stationary?
\item Define the \enquote{White Noise Process} labeled shortly as \(\varepsilon_t \sim WN(0,\sigma_\varepsilon^2)\).
\item Plot 200 observations of
\begin{align*}
(i)~y_t &= \varepsilon_t
\\
(ii)~y_t &= \frac{1}{5}(\varepsilon_{t-2}+\varepsilon_{t-1}+\varepsilon_{t}+\varepsilon_{t+1}+\varepsilon_{t+2})
\end{align*}
with \(\varepsilon_{t} \sim N(0,1)\). What are the differences?
\item Briefly explain the concepts of (i) weak stationarity and (ii) strict stationarity.
\item Define the autocovariance and autocorrelation function for a covariance-stationary stochastic process \( \{Y_t\} \).
\item Consider the linear first-order difference equation
\[y_t=\phi y_{t-1}+\varepsilon_t\]
with \(\varepsilon_{t} \sim N(0,1)\).
Simulate and plot 200 observations of \((i) |\phi|<1\), \((ii) \phi=1\), and \((iii) |\phi| >1\).
What does this imply in terms of stationarity of the process?
\item Briefly explain the Lag-operator and Lag-polynomials.
How can we check whether an {AR{(p)}} process
\[y_t - \phi_1 y_{t-1} -\phi_2 y_{t-2} - \cdots - \phi_p y_{t-p} = (1-\phi_1 L-\phi_2 L^2 -\cdots - \phi_p L^p)y_t = \varepsilon_t\]
is weakly stationary?
\end{enumerate}

\paragraph{Readings}
\begin{itemize}
\item \textcite[Ch.2]{Bjornland.Thorsrud_2015_AppliedTimeSeries}
\item \textcite{Lutkepohl_2004_UnivariateTimeSeries}
\item \textcite[Ch.2]{Bjornland.Thorsrud_2015_AppliedTimeSeries}
\item \textcite{Lutkepohl_2004_UnivariateTimeSeries}
\end{itemize}

\begin{solution}\textbf{Solution to \nameref{ex:FundamentalConceptsUnivariateTimeSeriesAnalysis}}
\ifDisplaySolutions
\ifDisplaySolutions%
\input{exercises/fundamental_concepts_univariate_time_series_solution.tex}
\fi
\newpage
68 changes: 35 additions & 33 deletions exercises/fundamental_concepts_univariate_time_series_solution.tex
@@ -1,62 +1,64 @@
\begin{enumerate}
\item A white noise has mean zero, a constant variance and all other second-order moments (i.e. autocovariances/autocorrelations) are zero:
\item A white noise process has mean zero and a constant variance, and all other second-order moments (i.e.\ autocovariances/autocorrelations) are zero:
\begin{align*}
E[\varepsilon_t]&=0\\
Var[\varepsilon_t]&=E[\varepsilon_t^2] - E[\varepsilon_t]E[\varepsilon_t] = \sigma_\varepsilon^2\\
Cov(\varepsilon_{t},\varepsilon_s) &= E[\varepsilon_t \varepsilon_s] - E[\varepsilon_t]E[\varepsilon_s] = 0 \text{ for $s \neq t$}
Cov(\varepsilon_{t},\varepsilon_s) &= E[\varepsilon_t \varepsilon_s] - E[\varepsilon_t]E[\varepsilon_s] = 0~\text{for}~s \neq t
\end{align*}
\item \lstinputlisting[style=Matlab-editor,basicstyle=\mlttfamily,title=\lstname]{progs/matlab/whiteNoisePlots.m}
Every simulation is different, model can thus generate an infinite set of realizations over the period $t=1,...,200$.
Every simulation is different; the model can thus generate an infinite set of realizations over the period \(t=1,\ldots ,200\).
The processes do differ in their persistence.
(i) is the white noise process, which is not persistent.
(ii) is a 5-point-moving-average, which is a linear combination of white noise processes.
It is smoother and more persistent and very different from just noise.
Linear combinations of white noise processes form the basis of many models in time series analysis.
\item A process is said to be \textbf{$N$-order weakly stationary} if all its joint moments up to order $N$ exist and are time invariant.
We are particularly interested in $N=2$, i.e. \textbf{covariance stationarity}:
\item A process is said to be \textbf{\(N\)-order weakly stationary} if all its joint moments up to order \(N\) exist and are time invariant.
We are particularly interested in \(N=2\), i.e.\ \textbf{covariance stationarity}:
\begin{align*}
E[Y_t]&=\mu \text{ (constant for all t)}
E[Y_t]&=\mu~\text{(constant for all t)}
\\
Var[Y_t]&=E[(Y_t - \mu)(Y_t-\mu)]=\gamma_0 \text{ (constant for all t)}
Var[Y_t]&=E[(Y_t - \mu)(Y_t-\mu)]=\gamma_0~\text{(constant for all t)}
\\
Cov[Y_{t_1},Y_{t_1-k}] &= E[(Y_{t_1}-\mu)(Y_{t_1-k}-\mu)] = Cov[Y_{t_2},Y_{t_2-k}] = \gamma_k \text{ (only dependent on $k$)}
Cov[Y_{t_1},Y_{t_1-k}] &= E[(Y_{t_1}-\mu)(Y_{t_1-k}-\mu)] = Cov[Y_{t_2},Y_{t_2-k}] = \gamma_k~\text{(only dependent on \(k\))}
\end{align*}
That is the first two moments are not dependent on $t$.
Particularly, the autocovariance is only dependent on the time difference $k$, but not on the actual point in time $t$.
That is, the first two moments do not depend on \(t\).
In particular, the autocovariance depends only on the time difference \(k\), not on the actual point in time \(t\).
\\
\textbf{Strict stationarity}: for all $k$ and $h$: $$f(Y_t,Y_{t-1},...,Y_{t-k})=f(Y_{t-h},Y_{t-h-1},...,Y_{t-h-k})$$
That is, not only the first two moments but the whole distribution is not dependent on the point in time $t$,
but on the time difference $k$.
\textbf{Strict stationarity}: for all \(k\) and \(h\):
\[f(Y_t,Y_{t-1},\ldots ,Y_{t-k})=f(Y_{t-h},Y_{t-h-1},\ldots ,Y_{t-h-k})\]
That is, not only the first two moments but the whole distribution does not depend on the point in time \(t\),
only on the time difference \(k\).
\item Autocovariance function for a covariance-stationary process:
$$\gamma_k = E[(Y_t - \mu)(Y_{t-k}-\mu)]$$
where $\gamma_0$ is the variance. Autocorrelation function: $$\rho_k = \gamma_k/\gamma_0$$
\\
\[\gamma_k = E[(Y_t - \mu)(Y_{t-k}-\mu)]\]
where \(\gamma_0\) is the variance. Autocorrelation function: \[\rho_k = \gamma_k/\gamma_0\]
\\
We can estimate this by using:
\begin{align*}
\hat{\gamma}_k = c_k &= \frac{1}{T} \sum_{t=k+1}^T(y_t -\bar{y})(y_{t-k}-\bar{y})\\
\hat{\rho}_k = r_k & = c_k/c_0
\end{align*}
\hat{\gamma}_k = c_k &= \frac{1}{T} \sum_{t=k+1}^T(y_t -\bar{y})(y_{t-k}-\bar{y})
\\
\hat{\rho}_k = r_k & = c_k/c_0
\end{align*}
Note: In most applications we don't correct the degrees of freedom for numerical reasons
(e.g. to avoid singularity of autocovariance matrices in the multivariate case),
i.e. the sums are not divided by $T-k-1$ but simply by $T$.
For $T>100$ this does not really matter as the expressions are very close to each other.
(e.g.\ to avoid singularity of autocovariance matrices in the multivariate case),
i.e.\ the sums are not divided by \(T-k-1\) but simply by \(T\).
For \(T>100\) this does not really matter as the expressions are very close to each other.

\item \lstinputlisting[style=Matlab-editor,basicstyle=\mlttfamily,title=\lstname]{progs/matlab/plotsAR1.m}
Remarks: If $|\phi|<1$ the series returns to the mean, i.e. it is stable and stationary.
If $|\phi>1|$ then it explodes, i.e. it is unstable and not stationary.
$\phi=1$ is a so-called random walk,
Remarks: If \(|\phi|<1\) the series returns to the mean, i.e.\ it is stable and stationary.
If \(|\phi|>1\) then it explodes, i.e.\ it is unstable and not stationary.
\(\phi=1\) is a so-called random walk,
it is the key model when working with non-stationary models.
Note that the random walk can take many different shapes; in macroeconomic forecasts we often want to \textbf{beat} the random walk model.
\item It is a special LINEAR operator, similar to the expectation operator, and very useful when working with time series.
The operator transforms one time series into another by shifting the observation from period $t$ to period $t-1$:
$Ly_t = y_{t-1}$ or $L^{-1} y_t =y_{t+1}$.
More general: $L^k y_t = L^{k-1} L y_t = L^{k-1} y_{t-1} = ... = y_{t-k}$.
The operator transforms one time series into another by shifting the observation from period \(t\) to period \(t-1\):
\(Ly_t = y_{t-1}\) or \(L^{-1} y_t =y_{t+1}\).
More generally: \(L^k y_t = L^{k-1} L y_t = L^{k-1} y_{t-1} = \cdots = y_{t-k}\).
Convenient use:
$$(1-L)y_t = y_t - y_{t-1}= \Delta y_t$$
\[(1-L)y_t = y_t - y_{t-1}= \Delta y_t\]
We can also work with lag-polynomials:
$$ \phi(L) = (1-\phi_1 L-\phi_2 L^2 -... - \phi_p L^p)$$
\[ \phi(L) = (1-\phi_1 L-\phi_2 L^2 -\cdots - \phi_p L^p)\]
where we call \(p\) the lag order. So:
$$ \phi(L) y_t = (1-\phi_1 L-\phi_2 L^2 -... - \phi_p L^p)y_t = y_t - \phi_1 y_{t-1} -\phi_2 y_{t-2} - ... - \phi_p y_{t-p}$$
To check whether an AR(p) model is covariance stationarity, we need to check whether the roots of the lag-polynomial lie outside the unit circle.
That is, we treat $L$ as a complex number $z\in \mathbb{C}$ and compute the roots of $(1-\phi_1 z-\phi_2 z^2 -... - \phi_p z^p)=0$ (using a computer in most cases).
\[ \phi(L) y_t = (1-\phi_1 L-\phi_2 L^2 -\cdots - \phi_p L^p)y_t = y_t - \phi_1 y_{t-1} -\phi_2 y_{t-2} - \cdots - \phi_p y_{t-p}\]
To check whether an {AR{(p)}} model is covariance stationary, we need to check whether the roots of the lag-polynomial lie outside the unit circle.
That is, we treat \(L\) as a complex number \(z\in \mathbb{C}\) and compute the roots of \((1-\phi_1 z-\phi_2 z^2 -\cdots - \phi_p z^p)=0\) (using a computer in most cases).
\end{enumerate}
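
The solution above references whiteNoisePlots.m, which is not displayed in this diff view. A minimal sketch of such a simulation, assuming Gaussian white noise and the two-sided 5-point moving average from the exercise; this is an illustration under those assumptions, not the repository's script:

% White noise versus a two-sided 5-point moving average of the same draws.
T  = 200;
e  = randn(T+4, 1);                     % extra draws to cover the leads and lags
y1 = e(3:T+2);                          % (i)  y_t = e_t
y2 = zeros(T, 1);
for t = 3:T+2                           % (ii) centered 5-point moving average
    y2(t-2) = mean(e(t-2:t+2));
end
figure;
subplot(2,1,1); plot(y1); title('(i) white noise');
subplot(2,1,2); plot(y2); title('(ii) 5-point moving average');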
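For the sample autocovariance and autocorrelation estimators written out above (dividing by T rather than correcting the degrees of freedom), a minimal MATLAB sketch with hypothetical variable names:

% Sample autocovariances c_k and autocorrelations r_k for lags 0..K,
% dividing by T, as in the solution text.
y    = randn(200, 1);                 % any covariance-stationary series would do
T    = numel(y);
K    = 10;
ybar = mean(y);
c    = zeros(K+1, 1);
for k = 0:K
    c(k+1) = sum((y(k+1:T) - ybar) .* (y(1:T-k) - ybar)) / T;
end
r = c ./ c(1);                        % r_k = c_k / c_0, so r(1) = 1
disp([(0:K)' c r]);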
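The solution also references plotsAR1.m, likewise not displayed here. A minimal sketch of an AR(1) simulation for the cases |phi|<1, phi=1 and |phi|>1; an illustration under these assumptions, not the repository's script:

% Simulate y_t = phi*y_{t-1} + e_t for stationary, unit-root and explosive phi.
T    = 200;
phis = [0.8, 1.0, 1.05];              % (i) |phi|<1, (ii) phi=1, (iii) |phi|>1
figure;
for j = 1:numel(phis)
    e = randn(T, 1);
    y = zeros(T, 1);                  % start from y_0 = 0
    for t = 2:T
        y(t) = phis(j) * y(t-1) + e(t);
    end
    subplot(3, 1, j);
    plot(y); title(sprintf('\\phi = %.2f', phis(j)));
end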
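The stationarity check in the last item — roots of the lag polynomial outside the unit circle — can be carried out with MATLAB's roots function. A minimal sketch with arbitrary example coefficients:

% Covariance stationarity of an AR(p): the roots of
% 1 - phi_1*z - ... - phi_p*z^p must lie outside the unit circle.
phi = [0.5, 0.3];                       % example AR(2) coefficients
z   = roots([-fliplr(phi), 1]);         % coefficients in descending powers of z
if all(abs(z) > 1)
    disp('all roots outside the unit circle: covariance stationary');
else
    disp('at least one root on or inside the unit circle: not stationary');
end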
18 changes: 9 additions & 9 deletions exercises/visualizing_time_series_data.tex
@@ -27,22 +27,22 @@

\paragraph{Readings}
\begin{itemize}
\item \url{https://de.mathworks.com/help/matlab/import_export/import-data-interactively.html}
\item \url{https://de.mathworks.com/help/matlab/import_export/select-spreadsheet-data-interactively.html}
\item \textcite[Ch.2]{Bjornland.Thorsrud_2015_AppliedTimeSeries}
\item {\footnotesize\url{https://de.mathworks.com/help/matlab/import_export/import-data-interactively.html}}
\item {\footnotesize\url{https://de.mathworks.com/help/matlab/import_export/select-spreadsheet-data-interactively.html}}
\item \textcite[Ch.2]{Bjornland.Thorsrud_2015_AppliedTimeSeries}
\end{itemize}

\paragraph{Useful resources}
\begin{itemize}
\item Economic Data Resources (\url{https://libguides.umn.edu/c.php?g=843682&p=6527336})
\item Gould Library - The Data Search (\url{https://gouldguides.carleton.edu/c.php?g=147179&p=965273})
\item DBNomics Providers (\url{https://db.nomics.world/providers})
\item Our World in Data (\url{https://ourworldindata.org})
\item FRED (\url{https://fred.stlouisfed.org})
\item Economic Data Resources (\url{https://libguides.umn.edu/c.php?g=843682&p=6527336})
\item Gould Library {-} The Data Search (\url{https://gouldguides.carleton.edu/c.php?g=147179&p=965273})
\item DBNomics Providers (\url{https://db.nomics.world/providers})
\item Our World in Data (\url{https://ourworldindata.org})
\item FRED (\url{https://fred.stlouisfed.org})
\end{itemize}

\begin{solution}\textbf{Solution to \nameref{ex:VisualizingTimeSeriesData}}
\ifDisplaySolutions
\ifDisplaySolutions%
\input{exercises/visualizing_time_series_data_solution.tex}
\fi
\newpage
