Add missing equation
Tarang74 committed Mar 2, 2024
1 parent 516c188 commit d7029c4
Showing 3 changed files with 5 additions and 1 deletion.
Binary file modified (not shown): MXB107 Exam Notes.pdf
Binary file modified (not shown): MXB107 Lecture Notes.pdf
6 changes: 5 additions & 1 deletion MXB107 Lecture Notes.tex
@@ -5,6 +5,7 @@

% Additional packages & macros
\DeclareMathOperator*{\argmax}{arg\,max}
+\DeclareMathOperator*{\argmin}{arg\,min}

% Header and footer
\newcommand{\unitName}{Introduction to Statistical Modelling}
@@ -1647,7 +1648,10 @@ \section{Linear Regression}
y_i \sim \mathrm{N}\left( \beta_0 + \beta_1 x_i,\: \sigma^2 \right)
\end{equation*}
where \(y_i\) is normal for fixed values of \(x_i\).
-The regression coefficients \(\beta_0\) and \(\beta_1\) are estimated by minimizing the sum of squared residuals:
+The regression coefficients \(\beta_0\) and \(\beta_1\) are estimated by minimising the sum of squared residuals:
+\begin{equation*}
+\left( \hat{\beta}_0,\: \hat{\beta}_1 \right) = \argmin_{\beta_0,\: \beta_1} \sum_{i = 1}^n \epsilon_i^2 = \argmin_{\beta_0,\: \beta_1} \sum_{i = 1}^n \left( y_i - \left( \beta_0 + \beta_1 x_i \right) \right)^2.
+\end{equation*}
\subsection{Estimation and Inference}
By using the method of maximum likelihood for the two parameters \(\beta_0\) and \(\beta_1\), or using the least squares solution by minimising the
squared error, we obtain the following estimators:
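The minimisation in the equation added by this commit has the standard closed-form solution (slope = sample covariance over sample variance of \(x\), intercept from the means). A minimal numerical sketch, not part of the notes themselves; the function name `ols` and the example data are illustrative:

```python
def ols(x, y):
    """Fit y = b0 + b1*x by minimising the sum of squared residuals."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # S_xy and S_xx from the least-squares normal equations
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx          # slope estimate, beta_1 hat
    b0 = ybar - b1 * xbar   # intercept estimate, beta_0 hat
    return b0, b1

# Data lying exactly on y = 2 + 3x, so the estimates recover (2, 3)
x = [1.0, 2.0, 3.0, 4.0]
y = [5.0, 8.0, 11.0, 14.0]
b0, b1 = ols(x, y)
# b0 ≈ 2.0, b1 ≈ 3.0
```

These are the same estimators the following subsection derives via maximum likelihood, since maximising the normal likelihood in \(\beta_0, \beta_1\) is equivalent to minimising the squared error.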
