OLS provides unbiased and consistent estimates for $\beta$, $\sigma^2_{\epsilon}$ and $\epsilon$ if the data at hand satisfies four characteristics: weak exogeneity, linearity, white-noise errors $\epsilon$, and no multicollinearity.

Objective Function of OLS

$$ \begin{align*} \hat{\beta}_{ols} &:= \underset{\beta}{\arg\min} \; (y-X\beta)' \, (y-X \beta)\end{align*} $$

Explanation: We're trying to find the $\beta$ (a.k.a. the vector of parameters) that results in the smallest sum of squared errors (fewer errors = a more accurate fit). So OLS is a minimization problem.
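To make this concrete, here is a minimal sketch of that minimization using a generic numerical optimizer (the synthetic data, seed, and coefficient values are assumptions for illustration only; in practice you would use the closed-form solution given next):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data: T observations, p regressors plus an intercept column.
T, p = 200, 2
X = np.column_stack([np.ones(T), rng.normal(size=(T, p))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=T)

# Objective: the sum of squared errors (y - X beta)'(y - X beta).
def sse(beta):
    resid = y - X @ beta
    return resid @ resid

# Treat OLS as a generic minimization problem over beta.
result = minimize(sse, x0=np.zeros(p + 1))
print(result.x)  # should be close to beta_true
```

The optimizer recovers (up to numerical tolerance) the same $\hat{\beta}$ that the closed-form formula below produces directly.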

Parametric Solution

$$ \begin{align*} \hat{\beta}_{ols} &:= (X' X)^{-1} X' y \\\\ \hat{\epsilon}_{ols} &:= y - X\hat{\beta}_{ols} \\\\ \hat{\sigma}^2_{\epsilon, ols} &:= \frac{1}{T-p-1} \; \hat{\epsilon}_{ols}' \, \hat{\epsilon}_{ols} \\\\ \hat{var}_{ols}[\beta] &:= \hat{\sigma}^2_{\epsilon,ols} \, \times \, (X' X)^{-1} \\\\ \hat{s.e.}_{ols}[\beta_i] &:= \sqrt{[\hat{var}_{ols}[\beta]]_{[i,i]}} \qquad \text{ for } i \in [0,1,...,p] \\\\ \hat{t}_{ols}[\beta_i] &:= \frac{[\hat{\beta}_{ols}]_{[i,1]}}{\hat{s.e.}_{ols}[\beta_i]} \\\\ \hat{R}^2_{ols} &:= 1 - \frac{(T-p-1) \times \hat{\sigma}^2_{\epsilon,ols} }{(T-1) \times \sigma^2_y }, \\&\text{where } \sigma^2_y := \frac{\sum_{t=1}^T (y_t-\bar{y})^2}{T-1} \text{ and } \bar{y} \text{ is the sample mean of } y \\\\ \hat{\bar{R}}^2_{ols} &:= 1 - \frac{\hat{\sigma}^2_{\epsilon,ols}}{\sigma^2_y} \\\\\end{align*} $$

Explanation

If we have a linear regression model that fulfills the 4 characteristics (weak exogeneity, linearity, $\epsilon$ is white noise, no multicollinearity), then we can use the Ordinary Least Squares method to fit the model.

$\hat{x}$ ("$x$ hat") denotes an estimated (or predicted) value of $x$ in a regression equation, as opposed to the true, unobserved value.

In short: use these formulas to calculate all the OLS estimates.
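As a minimal sketch of how these formulas translate to code (assumptions: numpy arrays, a design matrix `X` that already includes the intercept column, and synthetic data; `ols_fit` is a hypothetical helper name, not a library function):

```python
import numpy as np

def ols_fit(y, X):
    """Closed-form OLS estimates following the formulas above.

    Assumes X already contains an intercept column, so X has p + 1 columns.
    """
    T, k = X.shape                                # k = p + 1
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y                  # (X'X)^{-1} X'y
    resid = y - X @ beta_hat                      # epsilon_hat
    sigma2_eps = (resid @ resid) / (T - k)        # e'e / (T - p - 1)
    var_beta = sigma2_eps * XtX_inv               # var_hat[beta]
    se_beta = np.sqrt(np.diag(var_beta))          # s.e. of each beta_i
    t_stats = beta_hat / se_beta                  # t statistics
    sigma2_y = y.var(ddof=1)                      # sample variance of y
    r2 = 1 - (T - k) * sigma2_eps / ((T - 1) * sigma2_y)
    r2_adj = 1 - sigma2_eps / sigma2_y            # adjusted R^2
    return beta_hat, se_beta, t_stats, r2, r2_adj

# Example usage on synthetic data (coefficient values are illustrative).
rng = np.random.default_rng(1)
T = 500
X = np.column_stack([np.ones(T), rng.normal(size=(T, 2))])
y = X @ np.array([0.5, 1.5, -2.0]) + rng.normal(size=T)
beta_hat, se, t, r2, r2_adj = ols_fit(y, X)
print(beta_hat, r2, r2_adj)
```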

Handling Measurement Error in OLS

Classical errors-in-variables model: