Contents
Econometrics (M.Sc.)
Prof. Dr. Dominik Liebl
2021-02-07
Contents 1
Preface 5
Organization of the Course . . . . . . . . . . . . . . . . . . . . . . 5
Literature . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1 Review: Probability and Statistics 7
1.1 Probability Theory . . . . . . . . . . . . . . . . . . . . . . . 7
1.2 Random Variables . . . . . . . . . . . . . . . . . . . . . . . 16
2 Review: Simple Linear Regression 43
2.1 The Simple Linear Regression Model . . . . . . . . . . . . . 43
2.2 Ordinary Least Squares Estimation . . . . . . . . . . . . . . 47
2.3 Properties of the OLS Estimator . . . . . . . . . . . . . . . 59
3 Multiple Linear Regression 67
3.1 Assumptions . . . . . . . . . . . . . . . . . . . . . . . . . . 67
3.2 Deriving the Expression of the OLS Estimator . . . . . . . . 72
3.3 Some Quantities of Interest . . . . . . . . . . . . . . . . . . 74
3.4 Method of Moments Estimator . . . . . . . . . . . . . . . . 78
3.5 Unbiasedness of β̂|X . . . . . . . . . . . . . . . . . . . . . . 80
3.6 Variance of β̂|X . . . . . . . . . . . . . . . . . . . . . . . . 81
3.7 The Gauss-Markov Theorem . . . . . . . . . . . . . . . . . 86
4 Small Sample Inference 89
4.1 Hypothesis Tests about Multiple Parameters . . . . . . . . . 90
4.2 Tests about One Parameter . . . . . . . . . . . . . . . . . . 93
4.3 Test Theory . . . . . . . . . . . . . . . . . . . . . . . . . . 95
4.4 Type II Error and Power . . . . . . . . . . . . . . . . . . . 102
4.5 p-Value . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
4.6 Confidence Intervals . . . . . . . . . . . . . . . . . . . . . . 105
4.7 Practice: Small Sample Inference . . . . . . . . . . . . . . . 107
5 Large Sample Inference 123
5.1 Tools for Asymptotic Statistics . . . . . . . . . . . . . . . . 123
5.2 Asymptotics under the Classic Regression Model . . . . . . . 131
5.3 Robust Confidence Intervals . . . . . . . . . . . . . . . . . . 136
5.4 Practice: Large Sample Inference . . . . . . . . . . . . . . . 137
6 Maximum Likelihood 149
6.1 Likelihood Principle . . . . . . . . . . . . . . . . . . . . . . 149
6.2 Properties of Maximum Likelihood Estimators . . . . . . . . 149
6.3 The (Log-)Likelihood Function . . . . . . . . . . . . . . . . 150
6.4 Optimization: Non-Analytical Solutions . . . . . . . . . . . . 152
6.5 OLS-Estimation as ML-Estimation . . . . . . . . . . . . . . 155
6.6 Variance of ML-Estimators β̂_ML and s²_ML . . . . . . . . . 157
6.7 Consistency of β̂_ML and s²_ML . . . . . . . . . . . . . . . 158
6.8 Asymptotic Theory of Maximum-Likelihood Estimators . . . 159
7 Instrumental Variables Regression 165
7.1 The IV Estimator with a Single Regressor and a Single Instrument . . . 166
7.2 The General IV Regression Model . . . . . . . . . . . . . . 174
7.3 Checking Instrument Validity . . . . . . . . . . . . . . . . . 180
7.4 Application to the Demand for Cigarettes . . . . . . . . . . 182
Bibliography 193
Preface
Organization of the Course
• Etherpad: Throughout the course, I will use the following etherpad for collecting important links and further information (Zoom-meeting link etc.): https://etherpad.wikimedia.org/p/8e2yF6X6FCbqep2AxW1b
• eWhiteboard: Besides this script, I will use an eWhiteboard during the lecture. This material will be saved as PDF files and provided as accompanying lecture materials.
• Lecture Materials: You can find all lecture materials at eCampus or directly at sciebo: https://uni-bonn.sciebo.de/s/iOtkEDajrqWKT8v
Literature
• A guide to modern econometrics, by M. Verbeek
• Introduction to econometrics, by J. Stock and M.W. Watson
– E-Book: https://bonnus.ulb.uni-bonn.de/SummonRecord/FET CH-bonn_catalog_45089983
• Econometric theory and methods, by R. Davidson and J.G. MacKinnon
• A primer in econometric theory, by J. Stachurski
• Econometrics, by F. Hayashi
Chapter 1
Review: Probability and Statistics
1.1 Probability Theory
Probability is the mathematical language for quantifying uncertainty. We can apply probability theory to a diverse set of problems, from coin flipping to the analysis of econometric problems. The starting point is to specify the sample space, that is, the set of possible outcomes.
1.1.1 Sample Spaces and Events
The sample space Ω is the set of possible outcomes of an experiment. Points ω in Ω are called sample outcomes or realizations. Events are subsets of Ω.

Example: If we toss a coin twice, then Ω = {HH, HT, TH, TT}. The event that the first toss is heads is A = {HH, HT}.
Example: Let ω be the outcome of a measurement of some physical quantity, for example, temperature. Then Ω = ℝ = (−∞, ∞). The event that the measurement is larger than 10 but less than or equal to 23 is A = (10, 23].
Example: If we toss a coin forever, then the sample space is the infinite set Ω = {ω = (ω_1, ω_2, ω_3, …) : ω_i ∈ {H, T}}. Let A be the event that the first head appears on the third toss. Then A = {(ω_1, ω_2, ω_3, …) : ω_1 = T, ω_2 = T, ω_3 = H, ω_i ∈ {H, T} for i > 3}.
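The finite two-toss example above is small enough to work through explicitly. The following Python snippet (an illustration added here, not part of the original script; the outcome labels follow the example) represents the sample space and an event as sets:

```python
# Sample space for two coin tosses: each outcome is a string of two results.
Omega = {"HH", "HT", "TH", "TT"}

# The event "the first toss is heads" is the subset of outcomes starting with H.
A = {w for w in Omega if w[0] == "H"}

print(sorted(A))    # the outcomes that make up the event A
print(A <= Omega)   # every event is a subset of the sample space
```

Representing events as subsets of a common sample space is exactly what makes the set operations defined below (complement, union, intersection) meaningful for probability.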
Given an event A, let A^c = {ω ∈ Ω : ω ∉ A} denote the complement of A. Informally, A^c can be read as “not A.” The complement of Ω is the empty set ∅. The union of events A and B is defined as
AfiB={ʜ |ʜAorʜBorʜ both}
which can be thought of as “A or B.” If A_1, A_2, … is a sequence of sets, then
⋃_{i=1}^∞ A_i = {ω ∈ Ω : ω ∈ A_i for at least one i}.
The intersection of A and B is defined as
A ∩ B = {ω ∈ Ω : ω ∈ A and ω ∈ B}
which reads as “A and B.” Sometimes A ∩ B is also written shortly as AB. If A_1, A_2, … is a sequence of sets, then
⋂_{i=1}^∞ A_i = {ω ∈ Ω : ω ∈ A_i for all i}.
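For finite sets, complement, union, and intersection map directly onto Python's set operators. A small sketch (the particular sets are chosen arbitrarily for illustration and are not from the text):

```python
# A small finite sample space and two events.
Omega = set(range(10))   # {0, 1, ..., 9}
A = {0, 1, 2, 3}
B = {3, 4, 5}

complement_A = Omega - A   # A^c: "not A"
union = A | B              # A ∪ B: "A or B"
intersection = A & B       # A ∩ B: "A and B"

print(complement_A)   # {4, 5, 6, 7, 8, 9}
print(union)          # {0, 1, 2, 3, 4, 5}
print(intersection)   # {3}
```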
If every element of A is also contained in B, we write A ⊂ B or, equivalently, B ⊃ A. If A is a finite set, let |A| denote the number of elements in A. We say that A_1, A_2, … are disjoint or mutually exclusive if A_i ∩ A_j = ∅ whenever i ≠ j. For example, A_1 = [0, 1), A_2 = [1, 2), A_3 = [2, 3), … are disjoint. A partition of Ω is a sequence of disjoint sets A_1, A_2, … such that ⋃_{i=1}^∞ A_i = Ω.
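The two defining properties of a partition, pairwise disjointness and covering Ω, can be checked mechanically for a finite sample space. A minimal sketch (the sets are invented for illustration, not taken from the text):

```python
# Check that the pieces form a partition of a finite sample space:
# (i) pairwise disjoint, (ii) their union equals Omega.
Omega = set(range(10))
parts = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8, 9}]

pairwise_disjoint = all(
    not (parts[i] & parts[j])                     # empty intersection
    for i in range(len(parts))
    for j in range(i + 1, len(parts))
)
covers_Omega = set().union(*parts) == Omega

print(pairwise_disjoint and covers_Omega)   # True: this is a partition
```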
Summary: Sample space and events