Summaries

Class 5 (Module II)

28 March 2022, 09:30 Jorge Filipe Campinos Landerset Cadima

[Slides 139-153] and Linear Model Exercise 18 a), b), c), e). The problem with using the Normal distribution for beta-hat_j: the unknown variance sigma^2 of the random errors. Estimating sigma^2 with the Residual Mean Square (QMRE) and its effect on the pivotal quantity for inference on the model parameters beta_j. Confidence intervals for individual beta_j. Hypothesis tests for individual beta_j. An example: Exercise 18 (videiras dataset).
Note: This class was taught on Monday, 4.4.22, from 9:45 to 12:25, in Room S1, simultaneously in person and via Zoom.
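
A minimal R sketch of the inference described above, assuming a data frame named videiras with a response y and two predictors x1 and x2 (these variable names are hypothetical, since the actual variables of Exercise 18 are not listed here):

    # Fit a multiple linear regression (hypothetical data frame and variable names)
    fit <- lm(y ~ x1 + x2, data = videiras)

    # QMRE (Residual Mean Square): sigma(fit) returns its square root
    sigma(fit)^2

    # t-tests for each individual beta_j (H0: beta_j = 0)
    summary(fit)

    # Confidence intervals for each individual beta_j, based on the t distribution
    confint(fit, level = 0.95)

The point is that, once sigma^2 is replaced by QMRE, the pivotal quantity follows a t distribution with n-(p+1) degrees of freedom, which is the distribution used by summary() and confint().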


Class 4 (Module II)

23 March 2022, 15:00 Jorge Filipe Campinos Landerset Cadima

[Slides 112-138] The Linear Model in an inferential context: introduction, additional assumptions, the model in a multiple linear regression context. The matrix/vector notation for the Model: the Model equation and the random vector of estimators of the model parameters (beta-hat). Tools for working with random vectors: the vector of expected values and its properties; the (co)variance matrix and its properties; the Multinormal distribution and its properties; the Linear Model in matrix/vector notation. First consequences of the model: the distribution of the random vector Y of observations of the response variable; the distribution of the vector of estimators beta-hat (with proofs and interpretations).
Note: This class was taught on Wednesday, 30.3.22, from 15:00 to 17:35, in Room S3, simultaneously in person and via Zoom.
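
A minimal R sketch of the matrix/vector quantities above, assuming a generic data frame named dados with a response y and two predictors x1 and x2 (hypothetical names):

    # Model matrix X: a column of ones followed by the predictor columns
    X <- model.matrix(~ x1 + x2, data = dados)
    y <- dados$y

    # Vector of estimators: beta-hat = (X'X)^{-1} X'y
    beta.hat <- solve(t(X) %*% X) %*% t(X) %*% y

    # The same result via lm()
    fit <- lm(y ~ x1 + x2, data = dados)
    coef(fit)

    # Estimated (co)variance matrix of beta-hat: QMRE * (X'X)^{-1}
    vcov(fit)

Under the model assumptions, beta-hat has a Multinormal distribution with expected vector beta and (co)variance matrix sigma^2 (X'X)^{-1}; vcov() estimates the latter by replacing sigma^2 with QMRE.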


Class 3 (Module II)

21 March 2022, 09:30 Jorge Filipe Campinos Landerset Cadima

[Slides 77-111] Multiple linear regressions in a descriptive context. The alternative representation in the space of variables. The column-space of matrix X, C(X). The orthogonal projection matrix H (the 'hat matrix'). Properties of orthogonal projections. The orthogonal projection of the response vector y onto C(X) and the Least Squares formula for the vector of parameters, b. The three Sums of Squares and the Fundamental Formula of Linear Regression as an application of the Pythagorean Theorem to the right triangle defined by the projection of vector y onto C(X). An alternative right triangle defined by the centred vector y^c. The definition of the Coefficient of Determination and its properties in a Multiple Linear Regression, with a geometric interpretation in the space of variables. Multiple Linear Regressions in R: an example with the iris data. Models and submodels. Some properties of submodels. Polynomial regressions as a special case of multiple regressions: general idea and an example with the vineleaves dataset.
Note: This class was taught on Monday, 28.3.22, from 9:45 to 12:25, in Room S1, simultaneously in person and via Zoom.
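
A short R sketch of the quantities above, using the iris data mentioned in the summary (the choice of response and predictors below is only illustrative and need not match the one used in class):

    # Multiple linear regression with the iris data
    fit <- lm(Petal.Width ~ Petal.Length + Sepal.Length, data = iris)

    # Hat matrix H = X (X'X)^{-1} X': symmetric and idempotent
    X <- model.matrix(fit)
    H <- X %*% solve(t(X) %*% X) %*% t(X)

    # Fitted values as the orthogonal projection of y onto C(X)
    y <- iris$Petal.Width
    y.hat <- H %*% y

    # The three Sums of Squares and the Fundamental Formula: SQT = SQR + SQRE
    SQT  <- sum((y - mean(y))^2)
    SQR  <- sum((fitted(fit) - mean(y))^2)
    SQRE <- sum(residuals(fit)^2)
    c(SQT, SQR + SQRE)

    # Coefficient of Determination: R^2 = SQR / SQT
    SQR / SQT
    summary(fit)$r.squared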


Extra practical class

18 March 2022, 09:30 Manuela Neves

Class for solving exercises and clarifying doubts.


Class 2 (Module II)

16 March 2022, 15:00 Jorge Filipe Campinos Landerset Cadima

[Slides 43-76] Nonlinear relations that can be linearized with suitable transformations of one or both variables. Five important cases with a single predictor: the exponential relation, the (2-parameter) logistic relation, the power law, hyperbolic-type relations, and the Michaelis-Menten curve. For each case the appropriate linearizing transformations are given, together with the differential equations underlying the nonlinear relation. Multiple Linear Regression in a descriptive context. The anthocyanins example once again: visualizations of a scatterplot in 3D, when there are only two predictors (package rggobi). The Least Squares criterion in a multiple regression context. The impossibility of visualization with more than two predictors. An alternative representation of the data in the space of variables (R^n), where geometric and statistical concepts merge. The vectors of n observations, the vector of n ones, and the vector y-hat of the n fitted values of y, which is a linear combination of the vectors of predictors and the vector of n ones. The model matrix X. Linear combinations of the columns of matrix X.
Note: This class was taught on Wednesday, 23.3.22, from 15:00 to 17:35, in Room S3, simultaneously in person and via Zoom.
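
As an illustration of the linearizing transformations above, a minimal R sketch for two of the five cases, assuming a data frame named dados with variables x and y (hypothetical names):

    # Exponential relation y = b0 * exp(b1*x)  <=>  log(y) = log(b0) + b1*x
    fit.exp <- lm(log(y) ~ x, data = dados)
    b1 <- coef(fit.exp)[2]         # slope of the linearized relation is b1
    b0 <- exp(coef(fit.exp)[1])    # intercept is log(b0), so back-transform

    # Michaelis-Menten curve y = a*x/(b+x)  <=>  1/y = 1/a + (b/a)*(1/x)
    fit.mm <- lm(I(1/y) ~ I(1/x), data = dados)
    a <- 1 / coef(fit.mm)[1]       # intercept of the linearized relation is 1/a
    b <- coef(fit.mm)[2] * a       # slope is b/a

In both cases the fit is an ordinary simple linear regression on the transformed variables, which is what makes these nonlinear relations tractable in a descriptive (least squares) context.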