. use "fev1.dta"
. drop if (id==197)
(1 observation deleted)
. gen y=logfev1 - 2*(log(ht))
. gen logbht=log(baseht)
. reg y age i.id
      Source |       SS       df       MS           Number of obs =    1993
-------------+------------------------------        F(299, 1693)  =   31.31
       Model |  39.2499642   299  .131270783        Prob > F      =  0.0000
    Residual |  7.09835489  1693  .004192767        R-squared     =  0.8468
-------------+------------------------------        Adj R-squared =  0.8198
       Total |  46.3483191  1992  .023267228        Root MSE      =  .06475
------------------------------------------------------------------------------
y | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
age | .0298226 .0004798 62.15 0.000 .0288815 .0307637
|
id |
2 | .0965422 .033513 2.88 0.004 .0308111 .1622734
3 | .1556518 .0326371 4.77 0.000 .0916384 .2196651
4 | -.0064548 .0319146 -0.20 0.840 -.069051 .0561415
5 | -.0253331 .0346119 -0.73 0.464 -.0932198 .0425536
6 | .0691409 .0313087 2.21 0.027 .007733 .1305487
7 | -.1271935 .0346117 -3.67 0.000 -.1950797 -.0593072
8 | .0410112 .032632 1.26 0.209 -.022992 .1050144
9 | -.0103553 .0326418 -0.32 0.751 -.0743778 .0536673
10 | .2839699 .0319103 8.90 0.000 .2213822 .3465576
11 | .0943892 .0360269 2.62 0.009 .0237272 .1650511
12 | .1867598 .0692592 2.70 0.007 .0509171 .3226025
13 | .0213495 .044692 0.48 0.633 -.0663079 .1090069
14 | .07083 .0692686 1.02 0.307 -.0650312 .2066911
15 | .066741 .0379278 1.76 0.079 -.0076493 .1411312
16 | .1592216 .0313089 5.09 0.000 .0978133 .2206299
17 | .1711773 .0692829 2.47 0.014 .0352881 .3070665
18 | .0866562 .0307956 2.81 0.005 .0262548 .1470576
19 | .1619655 .0692811 2.34 0.020 .0260798 .2978512
20 | .2201291 .03191 6.90 0.000 .1575418 .2827163
<output deleted>
290 | .130696 .0360336 3.63 0.000 .0600209 .2013712
291 | -.0963241 .0405864 -2.37 0.018 -.1759288 -.0167194
292 | -.0307582 .0405853 -0.76 0.449 -.1103609 .0488445
293 | -.1928679 .0405861 -4.75 0.000 -.272472 -.1132637
294 | -.2094488 .0692363 -3.03 0.003 -.3452465 -.0736511
295 | -.2095296 .0692232 -3.03 0.003 -.3453015 -.0737576
296 | .1941328 .0346271 5.61 0.000 .1262164 .2620493
297 | .1167684 .0692236 1.69 0.092 -.0190045 .2525412
298 | .234128 .0446867 5.24 0.000 .1464809 .321775
299 | -.0164066 .0360486 -0.46 0.649 -.0871111 .054298
300 | .0307542 .034628 0.89 0.375 -.0371639 .0986723
|
_cons | -.3987398 .0252212 -15.81 0.000 -.4482078 -.3492718
------------------------------------------------------------------------------
. areg y age, absorb(id)
Linear regression, absorbing indicators Number of obs = 1993
F( 1, 1693) = 3863.17
Prob > F = 0.0000
R-squared = 0.8468
Adj R-squared = 0.8198
Root MSE = .06475
------------------------------------------------------------------------------
y | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
age | .0298226 .0004798 62.15 0.000 .0288815 .0307637
_cons | -.3555599 .0062023 -57.33 0.000 -.3677248 -.343395
-------------+----------------------------------------------------------------
id | F(298, 1693) = 15.526 0.000 (299 categories)
Linear Mixed Effects Model (Random Intercept)
. xtmixed y age || id: , variance reml
Performing EM optimization:
Performing gradient-based optimization:
Iteration 0: log restricted-likelihood = 2238.9485
Iteration 1: log restricted-likelihood = 2238.9485
Computing standard errors:
Mixed-effects REML regression Number of obs = 1993
Group variable: id Number of groups = 299
Obs per group: min = 1
avg = 6.7
max = 12
Wald chi2(1) = 3967.87
Log restricted-likelihood = 2238.9485 Prob > chi2 = 0.0000
------------------------------------------------------------------------------
y | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
age | .029807 .0004732 62.99 0.000 .0288795 .0307344
_cons | -.3551712 .0081792 -43.42 0.000 -.3712022 -.3391402
------------------------------------------------------------------------------
------------------------------------------------------------------------------
Random-effects Parameters | Estimate Std. Err. [95% Conf. Interval]
-----------------------------+------------------------------------------------
id: Identity |
var(_cons) | .0093054 .0008486 .0077823 .0111266
-----------------------------+------------------------------------------------
var(Residual) | .0041896 .0001438 .0039171 .0044811
------------------------------------------------------------------------------
LR test vs. linear regression: chibar2(01) = 1545.86 Prob >= chibar2 = 0.0000
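The two REML variance components above give the intraclass correlation, the share of residual variation attributable to between-subject heterogeneity. A quick check using the estimates printed in the table:

```python
# Intraclass correlation from the reported variance components:
#   rho = var(_cons) / (var(_cons) + var(Residual))
var_id = 0.0093054     # var(_cons): between-subject variance (from output above)
var_eps = 0.0041896    # var(Residual): within-subject variance (from output above)
icc = var_id / (var_id + var_eps)
print(round(icc, 3))   # about 0.69: roughly 69% of the variation is between subjects
```

This large ICC is consistent with the highly significant LR test against pooled linear regression.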
Linear Mixed Effects Model (Random Intercept)
Decomposing Between- and Within-Subject Effects
. egen m_age=mean(age), by(id)
. gen c_age=age - m_age
. xtmixed y m_age c_age || id: , variance reml
Performing EM optimization:
Performing gradient-based optimization:
Iteration 0: log restricted-likelihood = 2234.0577
Iteration 1: log restricted-likelihood = 2234.0577
Computing standard errors:
Mixed-effects REML regression Number of obs = 1993
Group variable: id Number of groups = 299
Obs per group: min = 1
avg = 6.7
max = 12
Wald chi2(2) = 3967.44
Log restricted-likelihood = 2234.0577 Prob > chi2 = 0.0000
------------------------------------------------------------------------------
y | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
m_age | .0292337 .0029014 10.08 0.000 .023547 .0349204
c_age | .0298226 .0004796 62.18 0.000 .0288826 .0307627
_cons | -.3483212 .0351743 -9.90 0.000 -.4172617 -.2793808
------------------------------------------------------------------------------
------------------------------------------------------------------------------
Random-effects Parameters | Estimate Std. Err. [95% Conf. Interval]
-----------------------------+------------------------------------------------
id: Identity |
var(_cons) | .0093363 .0008525 .0078063 .0111661
-----------------------------+------------------------------------------------
var(Residual) | .0041898 .0001438 .0039172 .0044813
------------------------------------------------------------------------------
LR test vs. linear regression: chibar2(01) = 1546.23 Prob >= chibar2 = 0.0000
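The `egen`/`gen` step above splits age into a subject-level mean (m_age, the between-subject component) and a within-subject deviation (c_age). A minimal pandas sketch of the same construction, on a toy two-subject example (data and values are illustrative, not from fev1.dta):

```python
# Sketch of the between/within split done above with egen/gen.
import pandas as pd

df = pd.DataFrame({
    "id":  [1, 1, 1, 2, 2, 2],
    "age": [6.0, 7.0, 8.0, 10.0, 11.0, 12.0],
})
df["m_age"] = df.groupby("id")["age"].transform("mean")  # egen m_age=mean(age), by(id)
df["c_age"] = df["age"] - df["m_age"]                    # gen c_age=age - m_age
print(df)
```

Entering m_age and c_age as separate regressors lets the between-subject (cross-sectional) and within-subject (longitudinal) age effects differ; the single age coefficient in the previous model constrains them to be equal. Here the two estimates (.0292 between, .0298 within) are close, so that constraint looks reasonable for these data.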