Stanislav Anatolyev. Intermediate and advanced econometrics: problems and solutions
Stanislav Anatolyev. Intermediate and advanced econometrics: problems and solutions. Third edition. KL/9/8. Moscow, 9.
Anatolyev, S.A. Problems and solutions in econometrics. #KL/9/8. Moscow: New Economic School, 9. 78 pp. (In English.) This manual is a collection of problems that the author has used when teaching intermediate and advanced level econometrics at the New Economic School over the last several years. All problems are accompanied by solutions. Key words: asymptotic theory, bootstrap, linear regression, least squares, nonlinear regression, nonparametric regression, extremum estimation, maximum likelihood, instrumental variables, generalized method of moments, empirical likelihood, panel data analysis, conditional moment restrictions, alternative asymptotics, higher-order asymptotics.

Anatolyev, Stanislav A. Intermediate and advanced econometrics: problems and solutions. #KL/9/8. Moscow, New Economic School, 9. 78 pp. (Eng.) This manual is a collection of problems that the author has been using in teaching intermediate and advanced level econometrics courses at the New Economic School during the last several years. All problems are accompanied by sample solutions. Key words: asymptotic theory, bootstrap, linear regression, ordinary and generalized least squares, nonlinear regression, nonparametric regression, extremum estimation, maximum likelihood, instrumental variables, generalized method of moments, empirical likelihood, panel data analysis, conditional moment restrictions, alternative asymptotics, higher-order asymptotics.

ISBN. © Anatolyev S.A., 9. © New Economic School, 9.
CONTENTS

Part I. Problems

1. Asymptotic theory: general and independent data: Asymptotics of transformations; Asymptotics of rotated logarithms; Escaping probability mass; Asymptotics of t-ratios; Creeping bug on simplex; Asymptotics of sample variance; Asymptotics of roots; Second-order Delta-method; Asymptotics with shrinking regressor; Power trends.
2. Asymptotic theory: time series: Trended vs. differenced regression; Long run variance for AR(1); Asymptotics of averages of AR(1) and MA(1); Asymptotics for impulse response functions.
3. Bootstrap: Brief and exhaustive; Bootstrapping t-ratio; Bootstrap bias correction; Bootstrap in linear model; Bootstrap for impulse response functions.
4. Regression and projection: Regressing and projecting dice; Mixture of normals; Bernoulli regressor; Best polynomial approximation; Handling conditional expectations.
5. Linear regression and OLS: Fixed and random regressors; Consistency of OLS under serially correlated errors; Estimation of linear combination; Incomplete regression; Generated coefficient; OLS in nonlinear model; Long and short regressions; Ridge regression; Inconsistency under alternative; Returns to schooling.
6. Heteroskedasticity and GLS: Conditional variance estimation; Exponential heteroskedasticity; OLS and GLS are identical; OLS and GLS are equivalent; Equicorrelated observations; Unbiasedness of certain FGLS estimators.
7. Variance estimation: White estimator; HAC estimation under homoskedasticity; Expectations of White and Newey-West estimators in IID setting.
8. Nonlinear regression: Local and global identification; Identification when regressor is nonrandom; Cobb-Douglas production function; Exponential regression; Power regression; Transition regression; Nonlinear consumption function.
9. Extremum estimators: Regression on constant; Quadratic regression; Nonlinearity at left hand side; Least fourth powers; Asymmetric loss.
10. Maximum likelihood estimation: Normal distribution; Pareto distribution; Comparison of ML tests; Invariance of ML tests to reparameterizations of null; Misspecified maximum likelihood; Individual effects; Irregular confidence interval; Trivial parameter space; Nuisance parameter in density; MLE versus OLS; MLE versus GLS; MLE in heteroskedastic time series regression; Does the link matter?; Maximum likelihood and binary variables; Maximum likelihood and binary dependent variable; Poisson regression; Bootstrapping ML tests.
11. Instrumental variables: Invalid 2SLS; Consumption function; Optimal combination of instruments; Trade and growth.
12. Generalized method of moments: Nonlinear simultaneous equations; Improved GMM; Minimum Distance estimation; Formation of moment conditions; What CMM estimates; Trinity for GMM; All about J; Interest rates and future inflation; Spot and forward exchange rates; Returns from financial market; Instrumental variables in ARMA models; Hausman may not work; Testing moment conditions; Bootstrapping OLS; Bootstrapping DD.
13. Panel data: Alternating individual effects; Time invariant regressors; Within and Between; Panels and instruments; Differencing transformations; Nonlinear panel data model; Durbin-Watson statistic and panel data; Higher-order dynamic panel.
14. Nonparametric estimation: Nonparametric regression with discrete regressor; Nonparametric density estimation; Nadaraya-Watson density estimator; First difference transformation and nonparametric regression; Unbiasedness of kernel estimates; Shape restriction; Nonparametric hazard rate; Nonparametrics and perfect fit; Nonparametrics and extreme observations.
15. Conditional moment restrictions: Usefulness of skedastic function; Symmetric regression error; Optimal instrumentation of consumption function; Optimal instrument in AR-ARCH model; Optimal instrument in AR with nonlinear error; Optimal IV estimation of a constant; Negative binomial distribution and PML; Nesting and PML; Misspecification in variance; Modified Poisson regression and PML estimators; Optimal instrument and regression on constant.
16. Empirical Likelihood: Common mean; Kullback-Leibler Information Criterion; Empirical likelihood as IV estimation.
17. Advanced asymptotic theory: Maximum likelihood and asymptotic bias; Empirical likelihood and asymptotic bias; Asymptotically irrelevant instruments; Weakly endogenous regressors; Weakly invalid instruments.

Part II. Solutions

Chapters 1-17: solutions to the problems of Part I, in the same order and under the same section titles.
PREFACE

This manual is the third edition of the collection of problems that I have been using in teaching intermediate and advanced level econometrics courses at the New Economic School (NES), Moscow, for already a decade. All problems are accompanied by sample solutions.

Approximately, chapters 1-8 and 14 of the collection belong to a course in intermediate level econometrics ("Econometrics III" in the NES internal course structure); chapters 9-13 to a course in advanced level econometrics ("Econometrics IV", respectively). The problems in chapters 15-17 require knowledge of more advanced and special material. They have been used in the NES course "Topics in Econometrics".

Many of the problems are not new. Some are inspired by my former teachers of econometrics at PhD studies: Hyungtaik Ahn, Mahmoud El-Gamal, Bruce Hansen, Yuichi Kitamura, Charles Manski, Gautam Tripathi, Kenneth West. Some problems are borrowed from their problem sets, as well as problem sets of other leading econometrics scholars or their textbooks. Some originate from the "Problems and Solutions" section of the journal Econometric Theory, where the author has published several problems.

The release of this collection would be hard without the valuable help of my teaching assistants during various years: Andrey Vasnev, Viktor Subbotin, Semyon Polbennikov, Alexander Vaschilko, Denis Sokolov, Oleg Itskhoki, Andrey Shabalin, Stanislav Kolenikov, Anna Mikusheva, Dmitry Shakin, Oleg Shibanov, Vadim Cherepanov, Pavel Stetsenko, Ivan Lazarev, Yulia Shkurat, Dmitry Muravyev, Artem Shamgunov, Danila Deliya, Viktoria Stepanova, Boris Gershman, Alexander Migita, Ivan Mirgorodsky, Roman Chikoller, Andrey Savochkin, Alexander Knobel, Ekaterina Lavrenko, Yulia Vakhrutdinova, Elena Pikulina, to whom go my deepest thanks. My thanks also go to my students and assistants who spotted errors and typos that crept into the first and second editions of this manual, especially Dmitry Shakin, Denis Sokolov, Pavel Stetsenko, Georgy Kartashov, and Roman Chikoller.

Preparation of this manual was supported in part by the Swedish Professorship ( 3) from the Economics Education and Research Consortium, with funds provided by the Government of Sweden through the Eurasia Foundation, and by the Access Industries Professorship (3 9) from Access Industries.

I will be grateful to everyone who finds errors, mistakes and typos in this collection and reports them to [email protected].
NOTATION AND ABBREVIATIONS

ID - identification
FOC/SOC - first/second order condition(s)
CDF - cumulative distribution function, typically denoted as F
PDF - probability density function, typically denoted as f
LIME - law of iterated (mathematical) expectations
LLN - law of large numbers
CLT - central limit theorem
I{A} - indicator function equalling unity when A holds and zero otherwise
Pr{A} - probability of A
E[y|x] - mathematical expectation (mean) of y conditional on x
V[y|x] - variance of y conditional on x
C[x, y] - covariance between x and y
BLP, BLP[y|x] - best linear predictor
I_k - k x k identity matrix
plim - probability limit
~ - typically means "distributed as"
N - normal (Gaussian) distribution
χ²_k - chi-squared distribution with k degrees of freedom
χ²_k(δ) - non-central chi-squared distribution with k degrees of freedom and non-centrality parameter δ
B(p) - Bernoulli distribution with success probability p
IID - independently and identically distributed
n - typically sample size in cross-sections
T - typically sample size in time series
k - typically number of parameters in parametric models
ℓ - typically number of instruments or moment conditions
X, Y, Z, E, Ê - data matrices of regressors, dependent variables, instruments, errors, residuals
L(·) - (conditional) likelihood function
ℓ(·) - (conditional) loglikelihood function
s(·) - (conditional) score function
m(·) - moment function
Q_f - typically E[f]; for example, Q_xx = E[xx′], Q_xxe² = E[xx′e²], Q_∂m = E[∂m/∂θ′], etc.
I - information matrix
W - Wald test statistic
LR - likelihood ratio test statistic
LM - Lagrange multiplier (score) test statistic
J - Hansen's J test statistic
Part I. Problems
1 ASYMPTOTIC THEORY: GENERAL AND INDEPENDENT DATA

1.1 Asymptotics of transformations

1. Suppose that √T(θ̂ − θ) →d N(0, 1). Find the limiting distribution of T(1 − cos(θ̂ − θ)).
2. Suppose that T(θ̂ − θ) →d N(0, 1). Find the limiting distribution of T sin(θ̂ − θ).
3. Suppose that T(θ̂ − 1) →d N(0, 1). Find the limiting distribution of T log θ̂.

1.2 Asymptotics of rotated logarithms

Let the positive random vector (Uₙ, Vₙ)′ be such that

√n ( (Uₙ, Vₙ)′ − (u, v)′ ) →d N( 0, ( ω_uu  ω_uv ; ω_uv  ω_vv ) )

as n → ∞. Find the joint asymptotic distribution of (ln Uₙ − ln Vₙ, ln Uₙ + ln Vₙ)′. What is the condition under which ln Uₙ − ln Vₙ and ln Uₙ + ln Vₙ are asymptotically independent?

1.3 Escaping probability mass

Let Xₙ = {x₁, …, xₙ} be a random sample from some population of x with E[x] = μ and V[x] = σ². Let Aₙ denote an event such that P{Aₙ} = 1 − 1/n, and let the distribution of Aₙ be independent of the distribution of x. Now construct the following randomized estimator of μ:

μ̂ₙ = x̄ₙ if Aₙ happens, and μ̂ₙ = n otherwise.

(i) Find the bias, variance, and MSE of μ̂ₙ. Show how they behave as n → ∞.
(ii) Is μ̂ₙ a consistent estimator of μ? Find the asymptotic distribution of √n(μ̂ₙ − μ).
(iii) Use this distribution to construct an approximately (1 − α)·100% confidence interval for μ. Compare this CI with the one obtained by using x̄ₙ as an estimator of μ.
1.4 Asymptotics of t-ratios

Let {xᵢ}ᵢ₌₁ⁿ be a random sample of a scalar random variable x with E[x] = μ, V[x] = σ², E[(x − μ)³] = ν, E[(x − μ)⁴] = τ, where all parameters are finite.

(a) Define Tₙ ≡ x̄ₙ/σ̂ₙ, where x̄ₙ ≡ n⁻¹ Σᵢ₌₁ⁿ xᵢ and σ̂ₙ² ≡ n⁻¹ Σᵢ₌₁ⁿ (xᵢ − x̄ₙ)². Derive the limiting distribution of √n Tₙ under the assumption μ = 0.

(b) Now suppose it is not assumed that μ = 0. Derive the limiting distribution of

√n ( Tₙ − plim_{n→∞} Tₙ ).

Be sure your answer reduces to the result of part (a) when μ = 0.

(c) Define Rₙ ≡ x̄ₙ/σ̃ₙ, where σ̃ₙ² ≡ n⁻¹ Σᵢ₌₁ⁿ xᵢ² is the constrained estimator of σ² under the (possibly incorrect) assumption μ = 0. Derive the limiting distribution of

√n ( Rₙ − plim_{n→∞} Rₙ )

for arbitrary μ and σ² > 0. Under what conditions on μ and σ² will this asymptotic distribution be the same as in part (b)?

1.5 Creeping bug on simplex

Consider the positive (x, y) orthant, i.e. R²₊, and its unit simplex, i.e. the line segment x + y = 1, x ≥ 0, y ≥ 0. Take an arbitrary natural number k ∈ N. Imagine a bug starting creeping from the origin (x, y) = (0, 0). Each second the bug goes either in the positive x direction with probability p, or in the positive y direction with probability 1 − p, each time covering distance 1/k. Evidently, in this way the bug reaches the simplex in k seconds. Suppose it arrives there at point (x_k, y_k). Now let k → ∞, i.e. as if the bug shrinks in size and physical abilities per second. Determine (a) the probability limit of (x_k, y_k); (b) the rate of convergence; (c) the asymptotic distribution of (x_k, y_k).
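A quick numerical check of Problem 1.5 can be done by simulation. The sketch below (Python with numpy; the value p = 0.3, the grid of k, and the number of replications are arbitrary choices of mine, not part of the problem) exploits the fact that the number of x-steps among the k steps is Binomial(k, p), and compares the variance of √k(x_k − p) with p(1 − p).

import numpy as np

rng = np.random.default_rng(0)
p = 0.3                     # probability of a step in the x direction (arbitrary)
reps = 20_000               # Monte Carlo replications

for k in (10, 100, 1000):
    # number of x-steps out of k is Binomial(k, p); each step has length 1/k
    x_k = rng.binomial(k, p, size=reps) / k
    y_k = 1.0 - x_k         # the bug always ends on the simplex x + y = 1
    dev = np.sqrt(k) * (x_k - p)
    print(k, round(x_k.mean(), 4), round(dev.var(), 4), p * (1 - p))

The output shows x_k concentrating at p with √k-deviations whose variance approaches p(1 − p), in line with a degenerate bivariate normal limit for √k((x_k, y_k) − (p, 1 − p)).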
1.6 Asymptotics of sample variance

Let x₁, …, xₙ be a random sample from a population of x with finite fourth moments. Let x̄ₙ and x̄²ₙ be the sample averages of xᵢ and xᵢ², respectively. Find constants a and b and a function c(n) such that the vector sequence

c(n) ( x̄ₙ − a, x̄²ₙ − b )′

converges to a nontrivial distribution, and determine this limiting distribution. Derive the asymptotic distribution of the sample variance x̄²ₙ − (x̄ₙ)².

1.7 Asymptotics of roots

Suppose we are interested in the inference about the root θ of the nonlinear system F(a, θ) = 0, where F : Rᵖ × Rᵏ → Rᵏ and a is a vector of constants. Let available be â, a consistent and asymptotically normal estimator of a. Assuming that θ is the unique solution of the above system, and θ̂ is the unique solution of the system F(â, θ̂) = 0, derive the asymptotic distribution of θ̂. Assume that all needed smoothness conditions are satisfied.

1.8 Second-order Delta-method

Let Sₙ = n⁻¹ Σᵢ₌₁ⁿ Xᵢ, where Xᵢ, i = 1, …, n, is a random sample of scalar random variables with E[Xᵢ] = μ and V[Xᵢ] = 1. It is easy to show that √n(Sₙ² − μ²) →d N(0, 4μ²) when μ ≠ 0.

(a) Find the asymptotic distribution of Sₙ² when μ = 0, by taking a square of the asymptotic distribution of Sₙ.
(b) Find the asymptotic distribution of cos(Sₙ). Hint: take a higher-order Taylor expansion applied to cos(Sₙ).
(c) Using the technique of part (b), formulate and prove an analog of the Delta-method for the case when the function is scalar-valued, has zero first derivative and nonzero second derivative (when the derivatives are evaluated at the probability limit). For simplicity, let all involved random variables be scalars.

1.9 Asymptotics with shrinking regressor

Suppose that

yᵢ = α + βxᵢ + uᵢ,
where {uᵢ} are IID with E[uᵢ] = 0, E[uᵢ²] = σ² and E[uᵢ³] = ν, while the regressor xᵢ is deterministically shrinking: xᵢ = ρⁱ with ρ ∈ (0, 1). Let the sample size be n. Discuss as fully as you can the asymptotic behavior of the OLS estimates (α̂, β̂, σ̂²) of (α, β, σ²) as n → ∞.

1.10 Power trends

Suppose that

yᵢ = βxᵢ + σᵢεᵢ, i = 1, …, n,

where εᵢ ~ IID(0, 1), while xᵢ = iᵃ for some known a, and σᵢ = iᵇ for some known b.

1. Under what conditions on a and b is the OLS estimator of β consistent? Derive its asymptotic distribution when it is consistent.
2. Under what conditions on a and b is the GLS estimator of β consistent? Derive its asymptotic distribution when it is consistent.
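Problem 1.8(a) is easy to probe numerically. A minimal sketch (numpy; the underlying standard normal population and the sample sizes are my own choices) shows that when μ = 0 the right scaling for Sₙ² is n rather than √n, and that n·Sₙ² has mean and variance close to those of a chi-squared variable with one degree of freedom.

import numpy as np

rng = np.random.default_rng(1)
reps, n = 50_000, 400

x = rng.standard_normal((reps, n))      # X_i with mean 0 and variance 1
s_n = x.mean(axis=1)
stat = n * s_n**2                       # second-order scaling when mu = 0
print(round(stat.mean(), 3), round(stat.var(), 3))   # chi-squared(1) has mean 1, variance 2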
2 ASYMPTOTIC THEORY: TIME SERIES

2.1 Trended vs. differenced regression

Consider a linear model with a linearly trending regressor:

y_t = α + βt + ε_t,

where the sequence ε_t is independently and identically distributed according to some distribution D with mean zero and variance σ². The object of interest is β.

1. Write out the OLS estimator β̂ of β in deviations form and find its asymptotic distribution.
2. A researcher suggests removing the trending regressor by taking differences to obtain

y_t − y_{t−1} = β + ε_t − ε_{t−1}

and then estimating β by OLS. Write out the OLS estimator β̃ of β and find its asymptotic distribution.
3. Compare the estimators β̂ and β̃ in terms of asymptotic efficiency.

2.2 Long run variance for AR(1)

Often one needs to estimate the long-run variance

V_ze ≡ lim_{T→∞} V( T^{−1/2} Σ_{t=1}^T z_t e_t )

of a stationary sequence z_t e_t that satisfies the restriction E[e_t | z_t] = 0. Derive a compact expression for V_ze in the case when e_t and z_t follow independent scalar AR(1) processes. For this example, propose a method to consistently estimate V_ze, and show your estimator's consistency.

2.3 Asymptotics of averages of AR(1) and MA(1)

Let x_t be a martingale difference sequence with respect to its own past, and let all conditions for the CLT be satisfied: √T x̄_T = T^{−1/2} Σ_{t=1}^T x_t →d N(0, σ²). Let now y_t = ρy_{t−1} + x_t and z_t = x_t + θx_{t−1}, where |ρ| < 1 and |θ| < 1. Consider the time averages ȳ_T = T^{−1} Σ_{t=1}^T y_t and z̄_T = T^{−1} Σ_{t=1}^T z_t.

1. Are y_t and z_t martingale difference sequences relative to their own past?
2. Find the asymptotic distributions of ȳ_T and z̄_T.
3. How would you estimate the asymptotic variances of ȳ_T and z̄_T?
4. Repeat what you did in parts 1-3 when x_t is a k × 1 vector, and we have √T x̄_T = T^{−1/2} Σ_{t=1}^T x_t →d N(0, Σ), y_t = P y_{t−1} + x_t, z_t = x_t + Θ x_{t−1}, where P and Θ are k × k matrices with eigenvalues inside the unit circle.

2.4 Asymptotics for impulse response functions

A stationary and ergodic process z_t that admits the representation

z_t = μ + Σ_{j=0}^∞ φ_j ε_{t−j},

where Σ_{j=0}^∞ |φ_j| < ∞ and ε_t is zero mean IID, is called linear. The function IRF(j) ≡ φ_j is called the impulse response function of z_t, reflecting the fact that φ_j = ∂z_t/∂ε_{t−j}, a response of z_t to its unit shock j periods ago.

1. Show that the strong zero mean AR(1) and ARMA(1,1) processes

y_t = ρy_{t−1} + ε_t, |ρ| < 1,

and

z_t = ρz_{t−1} + ε_t − θε_{t−1}, |ρ| < 1, |θ| < 1, ρ ≠ θ,

are linear, and derive their impulse response functions.
2. Suppose a sample z_1, …, z_T is given. For the AR(1) process, construct an estimator of the IRF on the basis of the OLS estimator of ρ. Derive the asymptotic distribution of your IRF estimator for fixed horizon j as the sample size T → ∞.
3. Suppose that for the ARMA(1,1) process one estimates ρ from the sample z_1, …, z_T by

ρ̂ = Σ_{t=3}^T z_t z_{t−2} / Σ_{t=3}^T z_{t−1} z_{t−2},

and θ by an appropriate root of the quadratic equation

θ̂/(1 + θ̂²) = − Σ_{t=2}^T ê_t ê_{t−1} / Σ_{t=1}^T ê_t²,  where ê_t = z_t − ρ̂z_{t−1}.

On the basis of these estimates, construct an estimator of the impulse response function you derived. Outline the steps (no need to show all math) which you would undertake in order to derive its asymptotic distribution for fixed j as T → ∞.
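One possible implementation of part 2 of Problem 2.4 is sketched below (numpy only; the simulated ρ = 0.6 and the sample length are illustrative and not taken from the text): estimate ρ by OLS of z_t on z_{t−1}, set the IRF estimate at horizon j to ρ̂ʲ, and apply the Delta-method with derivative jρ̂^{j−1} to obtain standard errors.

import numpy as np

rng = np.random.default_rng(2)
T, rho = 500, 0.6
eps = rng.standard_normal(T)
z = np.zeros(T)
for t in range(1, T):
    z[t] = rho * z[t - 1] + eps[t]

y, x = z[1:], z[:-1]
rho_hat = (x @ y) / (x @ x)             # OLS without constant for a zero-mean AR(1)
e_hat = y - rho_hat * x
avar_rho = e_hat.var() / x.var()        # asymptotic variance of sqrt(T)(rho_hat - rho)

for j in (1, 2, 5):
    irf_j = rho_hat**j
    se_j = abs(j * rho_hat**(j - 1)) * np.sqrt(avar_rho / T)   # Delta-method standard error
    print(j, round(irf_j, 3), round(se_j, 3))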
3 BOOTSTRAP

3.1 Brief and exhaustive

Evaluate the following claims.

1. The only difference between Monte Carlo and the bootstrap is the possibility and impossibility, respectively, of sampling from the true population.
2. When one does bootstrap, there is no reason to raise the number of bootstrap repetitions too high: there is a level beyond which making it larger does not yield any improvement in precision.
3. The bootstrap estimator of the parameter of interest is preferable to the asymptotic one, since its rate of convergence to the true parameter is often larger.

3.2 Bootstrapping t-ratio

Consider the following bootstrap procedure. Using the nonparametric bootstrap, generate bootstrap samples and calculate

(θ̂*_b − θ̂)/s(θ̂)

at each bootstrap repetition. Find the quantiles q*_{α/2} and q*_{1−α/2} from this bootstrap distribution, and construct

CI = [θ̂ − s(θ̂)q*_{1−α/2}, θ̂ − s(θ̂)q*_{α/2}].

Show that CI is exactly the same as the percentile interval, and not the percentile-t interval.

3.3 Bootstrap bias correction

1. Consider a random variable x with mean μ. A random sample {xᵢ}ᵢ₌₁ⁿ is available. One estimates μ by x̄ₙ and μ² by x̄ₙ². Find out what the bootstrap bias corrected estimators of μ and μ² are.
2. Suppose we have a sample of two independent observations z₁ and z₂ = 3 from the same distribution. Let us be interested in E[z²] and (E[z])², which are natural to estimate by z̄² = ½(z₁² + z₂²) and (z̄)² = ¼(z₁ + z₂)². Compute the bootstrap-bias-corrected estimates of the quantities of interest.
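For part 1 of Problem 3.3, the nonparametric bootstrap bias-corrected estimator of μ² can be computed as in the following sketch (numpy; the exponential sample and the number of bootstrap draws are arbitrary choices of mine): the bias estimate is the average of (x̄*)² over resamples minus (x̄)², and the corrected estimator is the plug-in estimator minus that bias estimate.

import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=50)   # some sample; here the true mu^2 equals 4
n, B = x.size, 5_000

theta_hat = x.mean()**2                   # plug-in estimator of mu^2, biased upward by V[x]/n
boot = np.empty(B)
for b in range(B):
    xb = rng.choice(x, size=n, replace=True)
    boot[b] = xb.mean()**2

bias_hat = boot.mean() - theta_hat        # bootstrap estimate of the bias
theta_bc = theta_hat - bias_hat           # equals 2*theta_hat - boot.mean()
print(round(theta_hat, 3), round(theta_bc, 3))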
3.4 Bootstrap in linear model

1. Suppose one has a random sample of n observations from the linear regression model y = x′β + e, E[e|x] = 0. Is the nonparametric bootstrap valid or invalid in the presence of heteroskedasticity? Explain.
2. Let the model be y = x′β + e, but E[ex] ≠ 0, i.e. the regressors are endogenous. The OLS estimator β̂ of the parameter β is biased. We know that the bootstrap is a good way to estimate bias, so the idea is to estimate the bias of β̂ and construct a bias-adjusted estimate of β. Explain whether or not the non-parametric bootstrap can be used to implement this idea.
3. Take the linear regression y = x′β + e, E[e|x] = 0. For a particular value of x, the object of interest is the conditional mean g(x) = E[y|x]. Describe how you would use the percentile-t bootstrap to construct a confidence interval for g(x).

3.5 Bootstrap for impulse response functions

Recall the formulation of Problem 2.4.

1. Describe in detail how to construct 95% error bands around the IRF estimates for the AR(1) process using the bootstrap that attains asymptotic refinement.
2. It is well known that in spite of their asymptotic unbiasedness, usual estimates of impulse response functions are significantly biased in samples typically encountered in practice. Propose a bootstrap algorithm to construct a bias corrected impulse response function for the above ARMA(1,1) process.
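A concrete way to carry out part 3 of Problem 3.4 is sketched below (numpy; the simulated design, the evaluation point x0, and the use of White standard errors inside the t-statistic are my own assumptions): resample (x, y) pairs, studentize ĝ*(x0) − ĝ(x0) by its own robust standard error in each resample, and invert the bootstrap quantiles of this t-statistic.

import numpy as np

rng = np.random.default_rng(4)
n, B, alpha = 200, 999, 0.05
x = np.column_stack([np.ones(n), rng.standard_normal(n)])
y = x @ np.array([1.0, 2.0]) + rng.standard_normal(n) * (1 + 0.5 * np.abs(x[:, 1]))
x0 = np.array([1.0, 0.5])                 # point at which g(x) = E[y|x] is wanted

def fit(xm, yv):
    b = np.linalg.solve(xm.T @ xm, xm.T @ yv)
    e = yv - xm @ b
    bread = np.linalg.inv(xm.T @ xm)
    meat = (xm * e[:, None]**2).T @ xm    # White sandwich filling
    v = bread @ meat @ bread              # robust covariance matrix of b
    return x0 @ b, np.sqrt(x0 @ v @ x0)

g_hat, se_hat = fit(x, y)
t_boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)           # pairs (nonparametric) bootstrap
    g_b, se_b = fit(x[idx], y[idx])
    t_boot[b] = (g_b - g_hat) / se_b

q_hi, q_lo = np.quantile(t_boot, [1 - alpha / 2, alpha / 2])
ci = (g_hat - se_hat * q_hi, g_hat - se_hat * q_lo)   # percentile-t interval
print(round(g_hat, 3), round(ci[0], 3), round(ci[1], 3))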
4 REGRESSION AND PROJECTION

4.1 Regressing and projecting dice

Let y be a random variable that denotes the number of dots obtained when a fair six-sided die is rolled. Let

x = y if y is even, and x = 0 otherwise.

(i) Find the joint distribution of (x, y).
(ii) Find the best predictor of y given x.
(iii) Find the best linear predictor, BLP[y|x], of y conditional on x.
(iv) Calculate E[U²_BP] and E[U²_BLP], the mean square prediction errors for cases (ii) and (iii) respectively, and show that E[U²_BP] ≤ E[U²_BLP].

4.2 Mixture of normals

Suppose that pairs (xᵢ, yᵢ), i = 1, …, n, are independently drawn from the following mixture of normals distribution:

(x, y)′ ~ N(μ₁, Σ₁) with probability p, and (x, y)′ ~ N(μ₂, Σ₂) with probability 1 − p,

where 0 < p < 1.

1. Derive the best linear predictor BLP[y|x] of y given x.
2. Argue that the conditional expectation function E[y|x] is nonlinear. Provide a step-by-step algorithm allowing one to derive E[y|x], and derive it if you can.

4.3 Bernoulli regressor

Let x be distributed Bernoulli, and, conditional on x, let y be distributed as N(μ₁, σ₁²) if x = 1 and as N(μ₀, σ₀²) if x = 0. Write out E[y|x] and E[y²|x] as linear functions of x. Why are these expectations linear in x?
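Problem 4.1 can be checked by brute force over the six equally likely outcomes. The snippet below (numpy; it assumes the reading x = y for even y and x = 0 otherwise) tabulates the best predictor E[y|x], the best linear predictor a + bx with population least squares coefficients, and the two mean squared prediction errors.

import numpy as np

y = np.arange(1, 7)                    # fair die outcomes, each with probability 1/6
x = np.where(y % 2 == 0, y, 0)         # x = y if y is even, 0 otherwise

bp = np.array([y[x == xi].mean() for xi in x])          # best predictor E[y|x]
b = np.cov(x, y, bias=True)[0, 1] / x.var()             # BLP slope
a = y.mean() - b * x.mean()                             # BLP intercept
blp = a + b * x

print(bp, np.round(blp, 3))
print(round(((y - bp)**2).mean(), 3), round(((y - blp)**2).mean(), 3))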
4.4 Best polynomial approximation

Given jointly distributed random variables x and y, a best k-th order polynomial approximation BPA_k[y|x] to E[y|x], in the MSE sense, is a solution to the problem

min over (α₀, α₁, …, α_k) of E[ (E[y|x] − α₀ − α₁x − … − α_k xᵏ)² ].

Assuming that BPA_k[y|x] exists, find its characterization and derive the properties of the associated prediction error U_k = y − BPA_k[y|x].

4.5 Handling conditional expectations

1. Consider the following situation. The vector (y, x, z, w) is a random quadruple. It is known that E[y|x, z, w] = α + βx + γz. It is also known that C[x, z] = 0 and that C[w, z] > 0. The parameters α, β and γ are not known. A random sample of observations on (y, x, w) is available; z is not observable. In this setting, a researcher weighs two options for estimating β. One is a linear least squares fit of y on x. The other is a linear least squares fit of y on (x, w). Compare these options.
2. Let (x, y, z) be a random triple. For a given real constant δ, a researcher wants to estimate E[y | E[x|z] = δ]. The researcher knows that E[x|z] and E[y|z] are strictly increasing and continuous functions of z, and is given consistent estimates of these functions. Show how the researcher can use them to obtain a consistent estimate of the quantity of interest.
27 5. LINEAR REGRESSION AND OLS 5. Fxed ad radom regressors. Commet o: Treatg regressors x a mea regresso as radom varables rather tha xed umbers smpl es further aalyss, sce the the observatos (x ; y ) may be treated as IID across.. A labor ecoomst argues: It s more plausble to thk of my regressors as radom rather tha xed. Look at educato, for example. A perso chooses her level of educato, thus t s radom. Age may be msreported, so t s radom too. Eve geder s radom, because oe ca get a sex chage operato doe. Commet o ths pearl. 3. Cosder a lear mea regresso y = x + e; E [ejx] = ; where x; stead of beg IID across ; depeds o through a ukow fucto ' as x = '() + u ; where u are IID depedet of e : Show that the OLS estmator of s stll ubased. 5. Cosstecy of OLS uder serally correlated errors Let fy t g + varace. t= be a strctly statoary ad ergodc stochastc process wth zero mea ad te () De e so that we ca wrte = C [y t; y t ] ; u t = y t y t ; V [y t ] y t = y t + u t : Show that the error u t sats es E [u t ] = ad C [u t ; y t ] = : () Show that the OLS estmator ^ from the regresso of y t o y t s cosstet for : () Show that, wthout further assumptos, u t s serally correlated. Costruct a example wth serally correlated u t. (v) A 994 paper the Joural of Ecoometrcs leads wth the statemet: It s well kow that lear regresso models wth lagged depedet varables, ordary least squares (OLS) estmators are cosstet f the errors are autocorrelated. Ths statemet, or a slght varato of t, appears vrtually all ecoometrcs textbooks. Recocle ths statemet wth your dgs from parts () ad (). Ths problem closely follows J.M. Wooldrdge (998) Cosstecy of OLS the Presece of Lagged Depedet Varable ad Serally Correlated Errors. Ecoometrc Theory 4, Problem LINEAR REGRESSION AND OLS 7
28 5.3 Estmato of lear combato Suppose oe has a radom sample of observatos from the lear regresso model y = + x + z + e; where e has mea zero ad varace ad s depedet of (x; z) :. What s the codtoal varace of the best lear codtoally (o the x ad z samples) ubased estmator ^ of where c x ad c z are some gve costats? = + c x + c z ;. Obta the lmtg dstrbuto of p ^ : Wrte your aswer as a fucto of the meas, varaces ad correlatos of x, z ad e ad of the costats ; ; ; c x ; c z ; assumg that all momets are te. 3. For whch value of the correlato coe cet betwee x ad z s the asymptotc varace mmzed for gve varaces of e ad x? 4. Dscuss the relatoshp of the result of part 3 wth the problem of multcollearty. 5.4 Icomplete regresso Cosder the lear regresso y = x + e; E [ejx] = ; E e jx = ; where x s k : Suppose that some compoet of the error e s observable, so that e = z + ; where z s a k vector of observables such that E [jz] = ad E [xz ] 6= : A researcher wats to estmate ad ad cosders two alteratves:. Ru the regresso of y o x ad z to d the OLS estmates ^ ad ^ of ad :. Ru the regresso of y o x to get the OLS estmate ^ of, compute the OLS resduals ^e = y x ^ ad ru the regresso of ^e o z to retreve the OLS estmate ^ of : Whch of the two methods would you recommed from the pot of vew of cosstecy of ^ ad ^? For the method(s) that yeld(s) cosstet estmates, d the lmtg dstrbuto of p (^ ) : 8 LINEAR REGRESSION AND OLS
29 5.5 Geerated coe cet Cosder the followg regresso model: y = x + z + u; where ad are scalar ukow parameters, u has zero mea ad ut varace, par (x; z) are depedet of u wth E x = x 6= ; E z = z 6= ; E [xz] = xz 6=. A collecto of trples f(x ; z ; y )g = s a radom sample. Suppose we are gve a estmator ^ of depedet of all u s, ad the lmtg dstrbuto of p (^ ) s N (; ) as! : De e the estmator ^ of as! ^ = x (y ^z ) : x = = Obta the asymptotc dstrbuto of ^ as! : 5.6 OLS olear model Cosder the equato y = ( + x)e, where y ad x are scalar observables, e s uobservable. Let E [ejx] = ad V [ejx] =. How would you estmate (; ) by OLS? How would you costruct stadard errors? 5.7 Log ad short regressos Take the true model y = x + x + e, E [ejx ; x ] = ; ad assume radom samplg. Suppose that s estmated by regressg y o x oly. Fd the probablty lmt of ths estmator. What are the codtos whe t s cosstet for? 5.8 Rdge regresso I the stadard lear mea regresso model, oe estmates k parameter by ~ = X X + I k X Y; where > s a xed scalar, I k s a k k detty matrx, X s k ad Y s matrces of data.. Fd E[ ~ jx ]. Is ~ codtoally ubased? Is t ubased?. Fd the probablty lmt of ~ as!. Is ~ cosstet? 3. Fd the asymptotc dstrbuto of ~. 4. From your vewpot, why may oe wat to use ~ stead of the OLS estmator ^? Gve codtos uder whch ~ s preferable to ^ accordg to your crtero, ad vce versa. GENERATED COEFFICIENT 9
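Problem 5.8 above concerns the estimator β̃ = (X′X + λI_k)⁻¹X′Y. A small simulation sketch (numpy; the design, the value of λ and the true β are my own choices) makes its behavior easy to see: for fixed λ the difference from OLS is of order 1/n, since X′X grows with n while λI_k does not.

import numpy as np

rng = np.random.default_rng(5)
k, lam, beta = 3, 10.0, np.array([1.0, -2.0, 0.5])

for n in (50, 500, 5000):
    x = rng.standard_normal((n, k))
    y = x @ beta + rng.standard_normal(n)
    ridge = np.linalg.solve(x.T @ x + lam * np.eye(k), x.T @ y)
    ols = np.linalg.solve(x.T @ x, x.T @ y)
    print(n, np.round(ridge - ols, 4))   # shrinkage difference dies out as n grows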
30 5.9 Icosstecy uder alteratve Suppose that y = + x + u; where u s dstrbuted N (; ) depedetly of x: The varable x s uobserved. Istead we observe z = x + v; where v s dstrbuted N (; ) depedetly of x ad u: Gve a sample of sze ; t s proposed to ru the lear regresso of y o z ad use a covetoal t-test to test the ull hypothess = : Crtcally evaluate ths proposal. 5. Returs to schoolg A researcher presets hs research o returs to schoolg at a luchtme semar. He rus OLS, usg a radom sample of dvduals, o a Mcer-type lear regresso, where the left sde varable s a logarthm of hourly wage rate. The results are show the table. Regressor Pot estmate Stadard error t-statstc Costat :3 :6 :4 Male (g) :39 :7 5:54 Age (a) :4 : :6 Experece (e) :9 :36 :5 Completed schoolg (s) :7 :3 :5 Ablty (f) :8 :4 :65 Schoolg-ablty teracto (sf) : :4 :6. The preseter says: Our model yelds a :7 percetage pot retur per addtoal year of schoolg (I allow returs to schoolg to vary by ablty by troducg ablty-schoolg teracto, but the correspodg estmate s essetally zero). At the same tme, the estmated coe cet o ablty s 8: (although t s statstcally sg cat). Ths mples that oe would have to acqure three addtoal years of educato to compesate for oe stadard devato lower ate ablty terms of labor market returs. A perso from the audece argues: So you have just dvded oe sg cat estmate by aother sg cat estmate. Ths s lke dvdg zero by zero. You ca get ay aswer by dvdg zero by zero, so your umber 3 s as good as ay other umber. How would you professoally respod to ths argumet?. Aother perso from the audece argues: Your dummy varable Male eters the regresso oly as a separate varable, so the geder ueces oly the tercept. But the correspodg estmate s statstcally very sg cat ( fact, t s the oly sg cat varable your regresso). Ths makes me thk that t must eter the regresso also teractos wth the other varables. If I were you, I would ru two regressos, oe for males ad oe for females, ad test for d ereces coe cets across the two usg a sup-wald test. I ay case, I would compute bootstrap stadard errors to replace your asymptotc stadard errors hopg that most of parameters would become statstcally sg cat wth more precse stadard errors. How would you professoally respod to these argumets? 3 LINEAR REGRESSION AND OLS
31 6. HETEROSKEDASTICITY AND GLS 6. Codtoal varace estmato Ecoometrca A clams: I a IID cotext, to ru OLS ad GLS I do t eed to kow the skedastc fucto. See, I ca estmate the codtoal varace matrx of the error vector by ^ = dag ^e = ; where ^e for = ; : : : ; are OLS resduals. Whe I ru OLS, I ca estmate the varace matrx by (X X ) X ^X (X X ) ; whe I ru feasble GLS, I use the formula = (X ^ X ) X ^ Y: Ecoometca B argues: That a t rght. I both cases you are usg oly oe observato, ^e, to estmate the value of the skedastc fucto, (x ): Hece, your estmates wll be cosstet ad ferece wrog. Resolve ths dspute. 6. Expoetal heteroskedastcty Let y be scalar ad x be k vector radom varables. Observatos (y ; x ) are draw at radom from the populato of (y; x). You are told that E [yjx] = x ad that V [yjx] = exp(x + ), wth (; ) ukow. You are asked to estmate.. Propose a estmato method that s asymptotcally equvalet to GLS that would be computable were V [yjx] fully kow.. I what sese s the feasble GLS estmator of part e cet? I whch sese s t e cet? 6.3 OLS ad GLS are detcal Let Y = X ( + v) + U, where X s k, Y ad U are, ad ad v are k. The parameter of terest s. The propertes of (Y; X ; U; v) are: E [UjX ] =, E [vjx ] =, E [UU jx ] = I, E [vv jx ] =, E [Uv jx ] =. Y ad X are observable, whle U ad v are ot.. What are E [YjX ] ad V [YjX ]? Deote the latter by. Is the evromet homo- or heteroskedastc?. Wrte out the OLS ad GLS estmators ^ ad ~ of. Prove that ths model they are detcal. Ht: Frst prove that X b E =, where ^e s the vector of OLS resduals. Next prove that X b E =. The coclude. Alteratvely, use formulae for the verse of a sum of two matrces. The rst method s preferable, beg more ecoometrc. 3. Dscuss bee ts of usg both estmators ths model. HETEROSKEDASTICITY AND GLS 3
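One feasible route for Problem 6.2 above is sketched here (numpy; the simulated design and the auxiliary regression of log squared OLS residuals on the regressors are my own choices, and the intercept of that auxiliary regression is estimated only up to a constant, which is harmless because any scale factor in the weights cancels in weighted least squares): run OLS, estimate the exponential skedastic function from the residuals, then reweight.

import numpy as np

rng = np.random.default_rng(6)
n, beta = 1000, 1.5
x = rng.standard_normal(n)
y = beta * x + np.sqrt(np.exp(0.8 * x + 0.2)) * rng.standard_normal(n)

X = np.column_stack([np.ones(n), x])

b_ols = np.linalg.solve(X.T @ X, X.T @ y)               # step 1: OLS
e = y - X @ b_ols

gamma = np.linalg.solve(X.T @ X, X.T @ np.log(e**2))    # step 2: variance parameters
w = np.exp(-X @ gamma)                                  # weights proportional to 1/variance

Xw = X * w[:, None]                                     # step 3: weighted least squares = FGLS
b_fgls = np.linalg.solve(Xw.T @ X, Xw.T @ y)
print(np.round(b_ols, 3), np.round(b_fgls, 3))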
32 6.4 OLS ad GLS are equvalet Let us have a regresso wrtte a matrx form: Y = X + U, where X s k, Y ad U are, ad s k. The parameter of terest s. The propertes of u are: E [UjX ] =, E [UU jx ] =. Let t be also kow that X = X for some k k osgular matrx :. Prove that ths model the OLS ad GLS estmators ^ ad ~ of have the same te sample codtoal varace.. Apply ths result to the followg regresso o a costat: y = + u ; where the dsturbaces are equcorrelated, that s, E [u ] =, V [u ] = ad C [u ; u j ] = for 6= j: 6.5 Equcorrelated observatos Suppose x = + u ; where E [u ] = ad E [u u j ] = f = j f 6= j wth ; j = ; : : : ; : Is x = (x + : : : + x ) the best lear ubased estmator of? Ivestgate x for cosstecy. 6.6 Ubasedess of certa FGLS estmators Show that (a) for a radom varable z; f z ad z have the same dstrbuto, the E [z] = ; (b) for a radom vector " ad a vector fucto q (") of "; f " ad ad q ( ") = q (") for all ", the E [q (")] = : " have the same dstrbuto Cosder the lear regresso model wrtte matrx form: Y = X + E; E [EjX ] = ; E EE jx = : Let ^ be a estmate of whch s a fucto of products of least squares resduals,.e. ^ = F (MEE M) = H (EE ) for M = I X (X X ) X : Show that f E ad E have the same codtoal dstrbuto (e.g. f E s codtoally ormal), the the feasble GLS estmator s ubased. ~ F = X ^ X X ^ Y 3 HETEROSKEDASTICITY AND GLS
33 7. VARIANCE ESTIMATION 7. Whte estmator Evaluate the followg clams.. Whe oe suspects heteroskedastcty, oe should use the Whte formula Qxx Q xxe Qxx stead of good old Q xx, sce uder heteroskedastcty the latter does ot make sese, because s d eret for each observato.. Sce for the OLS estmator we have ad ^ = X X X Y E[^jX ] = V[^jX ] = X X X X X X ; we ca estmate the te sample varace by \ V[^jX ] = X X X = x x ^e X X (whch, apart from the factor ; s the same as the Whte estmator of the asymptotc varace) ad costruct t ad Wald statstcs usg t. Thus, we do ot eed asymptotc theory to do OLS estmato ad ferece. 7. HAC estmato uder homoskedastcty We look for a smpl cato of HAC varace estmators uder codtoal homoskedastcty. Suppose that the regressors x t ad left sde varable y t a lear tme seres regresso y t = x t + e t ; E [e t jx t ; x t ; : : :] = are jotly statoary ad ergodc. The error e t s serally correlated of ukow order, but let t be kow that t s codtoally homoskedastc,.e. E [e t e t j jx t ; x t ; : : :] = j s costat (.e. does ot deped o x t ; x t ; : : :) for all j : Develop a Newey West-type HAC estmator of the log-ru varace of x t e t that would take advatage of codtoal homoskedastcty. VARIANCE ESTIMATION 33
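As a reference point for the claims in Problem 7.1 above, the following sketch (numpy; the heteroskedastic design is simulated by me) computes the White sandwich estimator Q̂xx⁻¹ Q̂xxe² Q̂xx⁻¹ of the asymptotic variance of the OLS estimator next to the naive estimator σ̂² Q̂xx⁻¹ that is valid only under conditional homoskedasticity.

import numpy as np

rng = np.random.default_rng(7)
n = 500
x = np.column_stack([np.ones(n), rng.standard_normal(n)])
e = rng.standard_normal(n) * (0.5 + np.abs(x[:, 1]))     # heteroskedastic errors
y = x @ np.array([1.0, 2.0]) + e

b = np.linalg.solve(x.T @ x, x.T @ y)
ehat = y - x @ b

q_xx = x.T @ x / n
q_xxe = (x * ehat[:, None]**2).T @ x / n
v_white = np.linalg.inv(q_xx) @ q_xxe @ np.linalg.inv(q_xx)   # robust asymptotic variance
v_naive = ehat.var() * np.linalg.inv(q_xx)                    # homoskedasticity-only version
print(np.round(np.sqrt(np.diag(v_white) / n), 4))             # robust standard errors
print(np.round(np.sqrt(np.diag(v_naive) / n), 4))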
34 7.3 Expectatos of Whte ad Newey West estmators IID settg Suppose oe has a radom sample of observatos from the lear codtoally homoskedastc regresso model y = x + e ; E [e jx ] = ; E e jx = : Let ^ be the OLS estmator of, ad let ^V^ ad V^ be the Whte ad Newey West estmators of the asymptotc varace matrx of ^: Fd E[ ^V^jX ] ad E[ V^jX ]; where X s the matrx of stacked regressors for all observatos. 34 VARIANCE ESTIMATION
35 8. NONLINEAR REGRESSION 8. Local ad global det cato Cosder the olear regresso E [yjx] = + x; where 6= ad V [x] 6= : Whch det - cato codto for ( ; ) fals ad whch does ot? 8. Idet cato whe regressor s oradom Suppose we regress y o scalar x, but x s dstrbuted oly at oe pot (that s, Pr fx = ag = for some a). Whe does the det cato codto hold ad whe does t fal f the regresso s lear ad has o tercept? If the regresso s olear? Provde both algebrac ad tutve/graphcal explaatos. 8.3 Cobb Douglas producto fucto Suppose we have a radom sample of rms wth data o output Q; captal K ad labor L; ad wat to estmate the Cobb Douglas producto fucto Q = K L "; where " has the property E ["jk; L] = : Evaluate the followg suggestos of estmato of :. Ru a lear regresso of log Q log L o a costat ad log K log L. For varous values of o a grd, ru a lear regresso of Q o K L wthout a costat, ad select the value of that mmzes a sum of squared OLS errors. 8.4 Expoetal regresso Suppose you have the homoskedastc olear regresso y = exp ( + x) + e; E[ejx] = ; E[e jx] = ad radom sample f(x ; y )g = : Let the true be, ad x be dstrbuted as stadard ormal. Ivestgate the problem for local det ablty, ad derve the asymptotc dstrbuto of the NLLS estmator of (; ): Descrbe a cocetrato method algorthm gvg all formulas (cludg stadard errors that you would use practce) explct forms. NONLINEAR REGRESSION 35
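For the concentration step in Problem 8.4 above, note that exp(α + βx) = c·exp(βx) with c = exp(α), so for any trial β the implied c is just an OLS coefficient from regressing y on exp(βx) without a constant. The rough sketch below (numpy; the simulated data, the grid range and the grid search itself are my own simplifications, and in practice one would polish the grid minimizer with a proper optimizer and compute NLLS standard errors) illustrates the idea.

import numpy as np

rng = np.random.default_rng(8)
n, alpha, beta = 400, 0.5, 1.0
x = rng.standard_normal(n)
y = np.exp(alpha + beta * x) + rng.standard_normal(n)

def concentrated_ssr(b):
    z = np.exp(b * x)
    c = (z @ y) / (z @ z)              # optimal c = exp(alpha) given b, by OLS of y on exp(b*x)
    return ((y - c * z)**2).sum(), c

grid = np.linspace(-2.0, 3.0, 501)
ssr = [concentrated_ssr(b)[0] for b in grid]
b_hat = grid[int(np.argmin(ssr))]
a_hat = np.log(concentrated_ssr(b_hat)[1])
print(round(b_hat, 3), round(a_hat, 3))   # estimates of beta and alpha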
36 8.5 Power regresso Suppose you have the olear regresso y = ( + x ) + e; E[ejx] = ad IID data f(x ; y )g = : How would you test H : = properly? 8.6 Trasto regresso Gve the radom sample f(x ; y )g = ; cosder the olear regresso y = + + e; E[ejx] = : + 3 x. Descrbe how to test usg the t-statstc f the margal uece of x o the codtoal mea of y, evaluated at x = ; equals.. Descrbe how to test usg the Wald statstc that the regresso fucto does ot deped o x. 8.7 Nolear cosumpto fucto Cosder the model E [c t jy t ; y t ; y t 3 ; : : :] = + I fy t > g + y t ; where c t s cosumpto at t ad y t s come at t: The par (c t ; y t ) s cotuously dstrbuted, statoary ad ergodc. The parameter represets a ormal come level, ad s kow. Suppose you are gve a log quarterly seres of legth T o c t ad y t.. Descrbe at least three d eret stuatos whe parameter det cato wll fal.. Descrbe detal how you wll ru the NLLS estmato employg the cocetrato method, cludg costructo of stadard errors for model parameters. 3. Descrbe how you wll test the hypothess H : = agast H a : < (a) by employg the asymptotc approach, (b) by employg the bootstrap approach wthout usg stadard errors. 36 NONLINEAR REGRESSION
37 9. EXTREMUM ESTIMATORS 9. Regresso o costat Cosder the followg model: y = + e; where all varables are scalars. Assume that fy g = s a radom sample, ad E[e] =, E[e ] =, E[e 3 ] = ad E[e 4 ] =. Cosder the followg three estmators of : ^ = arg m b ( ^ = y ; = log b + b ^ 3 = arg m b = y b ) (y b) ; Derve the asymptotc dstrbutos of these three estmators. Whch of them would you prefer most o the asymptotc bass? What s the dea behd each of the three estmators? = : 9. Quadratc regresso Cosder a olear regresso model where we assume: (A) The parameter space s B = ; +. y = ( + x) + u; (B) The error u has propertes E [u] =, V [u] =. (C) The regressor x has s dstrbuted uformly over [; ] depedetly of u. I partcular, ths mples E x = l ad E [x r ] = +r (r+ ) for teger r 6=. A radom sample f(x ; y )g = s avalable. De e two estmators of :. ^ mmzes S () = P = y ( + x ) over B.. ~ mmzes W () = P = y ( + x ) + l ( + x ) over B. For the case =, obta asymptotc dstrbutos of ^ ad ~. Whch oe of the two do you prefer o the asymptotc bass? EXTREMUM ESTIMATORS 37
38 9.3 Nolearty at left had sde A radom sample f(x ; y )g = s avalable for the olear model where the parameters ad are scalars. (y + ) = x + e; E[ejx] = ; E[e jx] = ;. Show that the NLLS estmator of ad ^^ = arg m a;b (y + a) bx = s geeral cosstet. What feature makes the model d er from a olear regresso where the NLLS estmator s cosstet?. Propose a cosstet CMM estmator of ad ad derve ts asymptotc dstrbuto. 9.4 Least fourth powers Suppose y = x + e; where all varables are scalars, x ad e are depedet, ad the dstrbuto of e s symmetrc aroud. For a radom sample f(x ; y )g = ; cosder the followg extremum estmator of : ^ = arg m (y bx ) 4 : b = Derve the asymptotc propertes of ^; payg specal atteto to the det cato codto. Compare ths estmator wth the OLS estmator terms of asymptotc e cecy for the case whe x ad e are ormally dstrbuted. 9.5 Asymmetrc loss Suppose that f(x ; y )g = s a radom sample from a populato satsfyg y = + x + e; where e s depedet of x; a k vector. Suppose also that all momets of x ad e are te ad that E [xx ] s osgular. Suppose that ^ ad ^ are de ed to be the values of ad that mmze y x over some set R k+ ; where for some < < u 3 f u ; (u) = ( )u 3 f u < : = Descrbe the asymptotc behavor of the estmators ^ ad ^ as! : If you eed to make addtoal assumptos be sure to specfy what these are ad why they are eeded. 38 EXTREMUM ESTIMATORS
39 . MAXIMUM LIKELIHOOD ESTIMATION. Normal dstrbuto Let x ; : : : ; x be a radom sample from N (; ): Derve the ML estmator ^ of ad prove ts cosstecy.. Pareto dstrbuto A radom varable X s sad to have a Pareto dstrbuto wth parameter, deoted X Pareto(), f t s cotuously dstrbuted wth desty x f X (xj) = (+) ; f x > ; ; otherwse. A radom sample x ; : : : ; x from the Pareto() populato s avalable. (a) Derve the ML estmator ^ of ; prove ts cosstecy ad d ts asymptotc dstrbuto. (b) Derve the Wald, Lkelhood Rato ad Lagrage Multpler test statstcs for testg the ull hypothess H : = agast the alteratve hypothess H a : 6=. Do ay of these statstcs cocde?.3 Comparso of ML tests Berdt ad Sav 977 showed that W LR LM for the case of a multvarate regresso model wth ormal dsturbaces. Ullah ad Zde-Walsh 984 showed that ths equalty s ot robust to o-ormalty of the dsturbaces. I the sprt of the latter artcle, ths problem cosders smple examples from o-ormal dstrbutos ad llustrates how ths co ct amog crtera s a ected.. Cosder a radom sample x ; : : : ; x from a Posso dstrbuto wth parameter : Show that testg = 3 versus 6= 3 yelds W LM for x 3 ad W LM for x 3.. Cosder a radom sample x ; : : : ; x from a expoetal dstrbuto wth parameter : Show that testg = 3 versus 6= 3 yelds W LM for < x 3 ad W LM for x Cosder a radom sample x ; : : : ; x from a Beroull dstrbuto wth parameter : Show that for testg = versus 6= ; we always get W LM: Show also that for testg = 3 versus 6= 3 ; we get W LM for 3 x 3 ad W LM for < x 3 or 3 x : Ths problem closely follows Bad H. Baltag () Co ct Amog Crtera for Testg Hypotheses: Examples from No-Normal Dstrbutos. Ecoometrc Theory 6, Problem..4. MAXIMUM LIKELIHOOD ESTIMATION 39
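For the Pareto problem above, the loglikelihood built from the density λx^(−(λ+1)) on x > 1 yields the closed form λ̂ = n / Σ ln xᵢ, with information n/λ². The sketch below (numpy; the data are simulated by inverting the Pareto CDF, and the null value used in the Wald statistic is my own choice) computes the estimate, its standard error and a Wald statistic.

import numpy as np

rng = np.random.default_rng(9)
n, lam_true, lam0 = 300, 2.0, 3.0
x = (1.0 - rng.random(n)) ** (-1.0 / lam_true)   # inverse-CDF draws from a Pareto on (1, inf)

lam_hat = n / np.log(x).sum()          # ML estimator
se = lam_hat / np.sqrt(n)              # from the information n / lambda^2
wald = ((lam_hat - lam0) / se) ** 2    # asymptotically chi-squared(1) under the null
print(round(lam_hat, 3), round(se, 3), round(wald, 3))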
40 .4 Ivarace of ML tests to reparameterzatos of ull Cosder the hypothess H : h() = ; where h : R k! R q : It s possble to recast the hypothess H a equvalet form H : g() = ; where g : R k! R q s such that g() = f(h()) f() for some oe-to-oe fucto f : R q! R q :. Show that the LR statstc s varat to such reparameterzato.. Show that the W statstc s varat to such reparameterzato whe f s lear, but may ot be whe f s olear. 3. Suppose that R ad reparameterze H : = as ( ) = ( ) = for some : Show that the W statstc may be made as close to zero as desred by mapulatg : What value of gves the largest possble value to the W statstc?.5 Msspec ed maxmum lkelhood. Suppose that the olear regresso model E[yjx] = g (x; ) s estmated by maxmum lkelhood based o the codtoal homoskedastc ormal dstrbuto, although the true codtoal dstrbuto s from a d eret famly. Provde a smple argumet why the ML estmator of s evertheless cosstet.. Suppose we kow the true desty f(zj) up to the parameter ; but stead of usg log f(zjq) the objectve fucto of the extremum problem whch would gve the ML estmate, we use f(zjq) tself. What asymptotc propertes do you expect from the resultg estmator of? Wll t be cosstet? Wll t be asymptotcally ormal? 3. Suppose that a statoary ad ergodc codtoally heteroskedastc tme seres regresso E [y t ji t ] = x t; where x t cotas varous lagged y t s, s estmated by maxmum lkelhood based o a codtoally ormal dstrbuto N (x t; t ); wth t depedg o past data whose parametrc form, however, does ot match the true codtoal varace V [y t ji t ]. Determe whether or ot the resultg estmator provdes a cosstet estmate of : Ths problem closely follows dscusso the book Ruud, Paul () A Itroducto to Classcal Ecoometrc Theory; Oxford Uversty Press. 4 MAXIMUM LIKELIHOOD ESTIMATION
41 .6 Idvdual e ects Suppose f(x ; y )g = s a serally depedet sample from a sequece of jotly ormal dstrbutos wth E [x ] = E [y ] =, V [x ] = V [y ] =, ad C [x ; y ] = (.e., x ad y are depedet wth commo but varyg meas ad a costat commo varace). All parameters are ukow. Derve the maxmum lkelhood estmate of ad show that t s cosstet. Expla why. Fd a estmator of whch would be cosstet..7 Irregular co dece terval Let x ; : : : ; x be a radom sample from a populato of x dstrbuted uformly o [; ]: Costruct a asymptotc co dece terval for wth sg cace level 5% by employg a maxmum lkelhood approach..8 Trval parameter space Cosder a parametrc model wth desty f(xj ), kow up to a parameter, but wth = f g,.e. the parameter space s reduced to oly oe elemet. What s a ML estmator of, ad what are ts asymptotc propertes?.9 Nusace parameter desty Let radom vector Z (Y; X ) have a jot desty of the form f(zj ) = f c (Y jx; ; )f m (Xj ); where ( ; ), both ad are scalar parameters, ad f c ad f m deote the codtoal ad margal dstrbutos, respectvely. Let ^ c (^ c ; ^ c ) be the codtoal ML estmators of ad, ad ^ m be the margal ML estmator of. Now de e ~ arg max X l f c (y jx ; ; ^ m ); a two-step estmator of subparameter whch uses margal ML to obta a prelmary estmator of the usace parameter. Fd the asymptotc dstrbuto of ~. How does t compare to that for ^ c? You may assume all the eeded regularty codtos for cosstecy ad asymptotc ormalty to hold. INDIVIDUAL EFFECTS 4
42 . MLE versus OLS Cosder the model where y s regressed oly o a costat: y = + e; where e codtoed o x s dstrbuted as N (; x ); the radom varable x s ot preset the regresso, s ukow, y ad x are observable, e s uobservable. The collecto of pars f(y ; x )g = s a radom sample.. Fd the OLS estmator ^ OLS of. Is t ubased? Cosstet? Obta ts asymptotc dstrbuto. Is ^ OLS the best lear ubased estmator for?. Fd the ML estmator ^ ML of ad derve ts asymptotc dstrbuto. Is ^ ML ubased? Is ^ ML asymptotcally more e cet tha ^ OLS? Does your cocluso cotradcts your aswer to the last questo of part? Why or why ot?. MLE versus GLS Cosder the followg ormal lear regresso model wth codtoal heteroskedastcty of kow form. Codtoal o x; the depedet varable y s ormally dstrbuted wth E [yjx] = x ; V [yjx] = x : Avalable s a radom sample (x ; y ); : : : ; (x ; y ): Descrbe a feasble geeralzed least squares estmator for based o the OLS estmator for. Show that ths GLS estmator s asymptotcally less e cet tha the maxmum lkelhood estmator. Expla the source of e cecy.. MLE heteroskedastc tme seres regresso Assume that data (y t ; x t ), t = ; ; : : : ; T; are statoary ad ergodc ad geerated by y t = + x t + u t ; where u t jx t ; y t ; x t ; y t ; : : : N (; t ) ad x t jy t ; x t ; y t ; x t ; : : : N (; v): Expla, wthout gog to deep math, how to d estmates ad ther stadard errors for all parameters whe:. The etre t as a fucto of x t s fully kow.. The values of t at t = ; ; : : : ; T are kow. 3. It s kow that t = ( + x t ) ; but the parameters ad are ukow. 4. It s kow that t = + u t ; but the parameters ad are ukow. 5. It s oly kow that t s statoary. 4 MAXIMUM LIKELIHOOD ESTIMATION
43 .3 Does the lk matter? 3 Cosder a bary radom varable y ad a scalar radom varable x such that P fy = jxg = F ( + x) ; where the lk F () s a cotuous dstrbuto fucto. Show that whe x assumes oly two d eret values, the value of the log-lkelhood fucto evaluated at the maxmum lkelhood estmates of ad s depedet of the form of the lk fucto. What are the maxmum lkelhood estmates of ad?.4 Maxmum lkelhood ad bary varables Suppose z ad y are dscrete radom varables takg values or. The dstrbuto of z ad y s gve by Pfz = g = ; Pfy = jzg = ez ; z = ; : + ez Here ad are scalar parameters of terest. Fd the ML estmator of (; ) from a radom sample gvg explct formulas wheever possble, ad derve ts asymptotc dstrbuto..5 Maxmum lkelhood ad bary depedet varable Suppose y s a dscrete radom varable takg values or represetg some choce of a dvdual. The dstrbuto of y gve the dvdual s characterstc x s Pfy = jxg = ex + e x ; where s the scalar parameter of terest. The data f(y ; x )g = are a radom sample. Whe dervg varous estmators, try to make the formulas as explct as possble.. Derve the ML estmator of ad ts asymptotc dstrbuto.. Fd the (olear) regresso fucto by regressg y o x: Derve the NLLS estmator of ad ts asymptotc dstrbuto. 3. Show that the regresso you obtaed part s heteroskedastc. Settg weghts!(x) equal to the varace of y codtoal o x; derve the WNLLS estmator of ad ts asymptotc dstrbuto. 4. Wrte out the systems of momet codtos mpled by the ML, NLLS ad WNLLS problems of parts Rak the three estmators terms of asymptotc e cecy. Do ay of your dgs appear uexpected? Gve tutve explaato for aythg uusual. 3 Ths problem closely follows Joao M.C. Satos Slva (999) Does the lk matter? Ecoometrc Theory 5, Problem DOES THE LINK MATTER? 43
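For part 1 of the binary dependent variable problem above, the ML estimator has no closed form, but Newton-Raphson on the logit loglikelihood converges in a few steps. A compact sketch (numpy; simulated data with a scalar regressor and no intercept, as I read the problem statement):

import numpy as np

rng = np.random.default_rng(10)
n, beta = 500, 1.0
x = rng.standard_normal(n)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-beta * x))).astype(float)

b = 0.0
for _ in range(25):                        # Newton-Raphson iterations
    p = 1.0 / (1.0 + np.exp(-b * x))
    score = np.sum(x * (y - p))            # d loglik / d b
    hess = -np.sum(x**2 * p * (1 - p))     # d2 loglik / d b2
    b -= score / hess

p = 1.0 / (1.0 + np.exp(-b * x))
se = 1.0 / np.sqrt(np.sum(x**2 * p * (1 - p)))   # inverse information
print(round(b, 3), round(se, 3))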
44 .6 Posso regresso Some radom varables called couts (lke umber of patet applcatos by a rm, or umber of doctor vsts by a patet) take o-egatve teger values ; ; ; : : :. Suppose that y s a cout varable, ad that, codtoal o scalar radom varable x; ts desty s Posso: f(yjx; ; ) = exp( (x; ; ))(x; ; )y ; where (x; ; ) = exp ( + x) : y! Here ad are ukow scalar parameters. Assume that x has a odegeerate dstrbuto (.e., s ot a costat). The data f(x ; y )g = s a radom sample.. Derve the three asymptotc ML tests (W, LR, LM) for the ull hypothess H : = ; gvg fully mplemetable formulas ( explct form wheever possble) depedg oly o the data.. Suppose the researcher has msspec ed the codtoal mea ad uses (x; ; ) = + exp (x) : Wll the coe cet be cosstetly estmated?.7 Bootstrappg ML tests. For the lkelhood rato test of H : g() = ; we use the statstc LR = max `(q) max `(q) : q q;g(q)= Wrte out the formula for the bootstrap statstc LR.. For the Lagrage Multpler test of H : g() = ; we use the statstc LM = X s z ; ^ R ML bi X Wrte out the formula for the bootstrap statstc LM. s z ; ^ R ML : 44 MAXIMUM LIKELIHOOD ESTIMATION
45 . INSTRUMENTAL VARIABLES. Ivald SLS Cosder the model y = z + u; z = x + v; where E (u; v) jx = ad V (u; v) jx = ; wth ukow. The trples f(x ; z ; y )g = costtute a radom sample.. Show that, ad are det ed. Suggest aalog estmators for these parameters.. Cosder the followg two stage estmato method. I the rst stage, regress z o x ad de e ^z = ^x, where ^ s the OLS estmator. I the secod stage, regress y ^z to obta the least squares estmate of. Show that the resultg estmator of s cosstet. 3. Suggest a method the sprt of SLS for estmatg cosstetly.. Cosumpto fucto Cosder the cosumpto fucto C t = + Y t + e t ; (.) where C t s aggregate cosumpto at t, ad Y t s aggregate come at t: The ordary least squares (OLS) estmato appled to (.) may gve a cosstet estmate of the margal propesty to cosume (MPC) : The remedy suggested by Haavelmo les treatg the aggregate come as edogeous: Y t = C t + I t + G t ; (.) where I t s aggregate vestmet at t, ad G t s govermet cosumpto at t; ad both varables are exogeous. Assume that the shock e t s mea zero IID across tme, ad all varables are jotly statoary ad ergodc. A sample of sze T cotag Y t ; C t ; I t ; ad G t s avalable.. Show that the OLS estmator of s deed cosstet. Compute the amout ad drecto of ths cosstecy.. Ecoometrca A teds to estmate (; ) by rug SLS o (.) usg the strumetal vector (; I t ; G t ) : Ecoometrca B argues that t s ot ecessary to use ths relatvely complcated estmator sce rug smple IV o (.) usg the strumetal vector (; I t + G t ) wll do the same. Is ecoometrca B rght? 3. Ecoometrca C regresses Y t o a costat ad C t ; ad obtas correspodg OLS estmates (^ ; ^ C ) : Ecoometrca D regresses Y t o a costat, C t ; I t ; ad G t ad obtas correspodg OLS estmates (^ ; ^ C ; ^ I ; ^ G ) : What values do parameters ^ C ad ^ C cosstetly estmate? INSTRUMENTAL VARIABLES 45
46 .3 Optmal combato of strumets Suppose you have the followg spec cato, where e may be correlated wth x: y = x + e:. You have strumets z ad whch are mutually ucorrelated. What are ther ecessary propertes to provde cosstet IV estmators ^ z ad ^? Derve the asymptotc dstrbutos of these estmators.. Calculate the optmal IV estmator as a lear combato of ^ z ad ^. 3. You otce that ^ z ad ^ are ot that close together. Gve a test statstc whch allows you to decde f they are estmatg the same parameter. If the test rejects, what assumptos are you rejectg?.4 Trade ad growth I the paper Does Trade Cause Growth? (Amerca Ecoomc Revew, Jue 999), Je rey Frakel ad Davd Romer study the e ect of trade o come. Ther smple spec cato s log Y = + T + W + " ; (.3) where Y s per capta come, T s teratoal trade, W s wth-coutry trade, ad " re ects other ueces o come. Sce the latter s lkely to be correlated wth the trade varables, Frakel ad Romer decde to use strumetal varables to estmate the coe cets (.3). As strumets, they use a coutry s proxmty to other coutres P ad ts sze S ; so that ad where ad are the best lear predcto errors. T = + P + (.4) W = + S + ; (.5). As the key detfyg assumpto, the authors use the fact that coutres geographcal characterstcs P ad S are ucorrelated wth the error term (.3). Provde a ecoomc ratoale for ths assumpto ad a detaled explaato how to estmate (.3) whe oe has data o Y; T; W; P ad S for a lst of coutres.. Ufortuately, data o wth-coutry trade are ot avalable. Determe f t s possble to estmate ay of the coe cets (.3) wthout further assumptos. If t s, provde all the detals o how to do t. 3. I order to be able to estmate key coe cets (.3), the authors add aother detfyg assumpto that P s ucorrelated wth the error term (.5). Provde a detaled explaato how to estmate (.3) whe oe has data o Y; T; P ad S for a lst of coutres. 4. The authors estmated a equato smlar to (.3) by OLS ad IV ad foud out that the IV estmates are greater tha the OLS estmates. Oe explaato may be that the dscrepacy s due to a samplg error. Provde aother, more ecoometrc, explaato why there s a dscrepacy ad what the reaso s that the IV estmates are larger. 46 INSTRUMENTAL VARIABLES
47 . GENERALIZED METHOD OF MOMENTS. Nolear smultaeous equatos Let y = x + u; x = y + v; where x ad y are observable, but u ad v are ot. The data f(x ; y )g = s a radom sample.. Suppose we kow that E [u] = E [v] =. Whe are ad det ed? Propose aalog estmators for these parameters.. Let also be kow that E [uv] =. Propose a method to estmate ad as e cetly as possble gve the above formato. What s the asymptotc dstrbuto of your estmator?. Improved GMM Cosder GMM estmato wth the use of the momet fucto x q m(x; y; q) = : y Determe uder what codtos the secod restrcto helps reducg the asymptotc varace of the GMM estmator of :.3 Mmum Dstace estmato Cosder a smlar to GMM procedure called the Mmum Dstace (MD) estmato. Suppose we wat to estmate a parameter mplctly de ed by = s( ); where s : R k! R` wth ` k; ad avalable s a estmator ^ of wth asymptotc propertes ^ p! ; p d! ^ N ; V^ : Also suppose that avalable s a symmetrc ad postve de te estmator ^V^ estmator s de ed as ^ MD = arg m ^ s() ^W ^ s() ; of V^: The MD where ^W s some symmetrc postve de te data-depedet matrx cosstet for a symmetrc postve de te weght matrx W: Assume that s compact, s() s cotuously d eretable wth full rak matrx of dervatves S() o ; s uque ad all eeded momets exst. GENERALIZED METHOD OF MOMENTS 47
48 . Gve a formal argumet for cosstecy of ^ MD. Derve the asymptotc dstrbuto of ^ MD :. Fd the optmal choce for the weght matrx W ad suggest ts cosstet estmator. 3. Develop a spec cato test,.e. of the hypothess H : 9 such that = s( ): 4. Apply parts 3 to the followg problem. Suppose that we have a autoregresso of order wthout a costat term: ( L) y t = " t ; where jj < ; L s the lag operator, ad " t s IID(; ): Wrtte aother form, the model s y t = y t + y t + " t ; ad ( ; ) may be e cetly estmated by OLS. The target, however, s to estmate ad verfy that both autoregressve roots are deed equal..4 Formato of momet codtos. Let z be a scalar radom varable. Let t be kow that z has mea ad that ts fourth cetral momet equals three tmes ts squared varace (lke for a ormal radom varable). Formulate a system of momet codtos for GMM estmato of.. Cosder the AR() ARCH() model y t = y t + e t ; E [e t ji t ] = ; E e t ji t =! + e t ; where I t embeds formato o the hstory from y t to the past. Such models are typcally estmated by ML, but may also be estmated by GMM, f desred. Suppose you have data o y t for t = ; : : : ; T: Costruct a overdetfyg set of momet codtos to be used by GMM..5 What CMM estmates Let g(z; q) be a fucto such that dmesos of g ad q are detcal, ad let z ; : : : ; z be a radom sample. Note that othg s sad about momet codtos. De e ^ as the soluto to g(z ; q) = : What s the probablty lmt of ^? What s the asymptotc dstrbuto of ^? = 48 GENERALIZED METHOD OF MOMENTS
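For part 1 of the "Formation of moment conditions" problem above, one natural system treats the mean and the variance as parameters and uses the three conditions E[z − μ] = 0, E[(z − μ)² − σ²] = 0 and E[(z − μ)⁴ − 3σ⁴] = 0, which is overidentified by one. The sketch below (Python with numpy and scipy; the simulated normal data and the use of a generic numerical minimizer are my own choices) runs two-step GMM with the efficient weight matrix and reports Hansen's J statistic.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)
z = rng.normal(loc=2.0, scale=1.5, size=400)
n = z.size

def moments(theta):
    mu, s2 = theta
    return np.column_stack([z - mu, (z - mu)**2 - s2, (z - mu)**4 - 3 * s2**2])

def objective(theta, w):
    gbar = moments(theta).mean(axis=0)
    return gbar @ w @ gbar

start = np.array([z.mean(), z.var()])
step1 = minimize(objective, start, args=(np.eye(3),), method="Nelder-Mead")   # identity weights
m1 = moments(step1.x)
w_opt = np.linalg.inv(m1.T @ m1 / n)                                          # efficient weights
step2 = minimize(objective, step1.x, args=(w_opt,), method="Nelder-Mead")

j_stat = n * objective(step2.x, w_opt)   # asymptotically chi-squared(1) if the conditions are valid
print(np.round(step2.x, 3), round(j_stat, 3))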
49 .6 Trty for GMM Derve the three classcal tests (W, LR, LM) for the composte ull H : f : h() = g; where h : R k! R q, for the e cet GMM case. The aalog for the Lkelhood Rato test wll be called the Dstace D erece test. Ht: treat the GMM objectve fucto as the ormalzed loglkelhood, ad ts dervatve as the sample score..7 All about J. Show that the J-test statstc dverges to ty whe the system of momet codtos s msspec ed.. Provde a example showg that the J-test statstc eed ot be asymptotcally ch-squared wth degrees of freedom equallg the degree of overdet cato uder vald momet restrctos, f oe uses a o-e cet GMM. 3. Suppose a ecoometrca estmates parameters of a tme seres regresso by GMM after havg chose a overdetfyg vector of strumetal varables. He performs the overdet cato test ad clams: A bg value of the J -statstc s a evdece agast valdty of the chose strumets. Commet o ths clam..8 Iterest rates ad future ato Frederc Mshk early 9 s vestgated whether the term structure of curret omal terest rates ca gve formato about future path of ato. He spec ed the followg ecoometrc model: m t t = m; + m; ( m t t ) + m; t ; E t [ m; t ] = ; (.) where k t s k-perods-to-the-future ato rate, k t s the curret omal terest rate for k- perods-ahead maturty, ad m; t s the predcto error.. Show how (.) ca be obtaed from the covetoal ecoometrc model that tests the hypothess of codtoal ubasedess of terest rates as predctors of ato. What restrcto o the parameters (.) mples that the term structure provdes o formato about future shfts ato? Determe the autocorrelato structure of m; t.. Descrbe detal how you would test the hypothess that the term structure provdes o formato about future shfts ato, by usg overdetfyg GMM ad asymptotc theory. Make sure that you dscuss such ssues as selecto of strumets, costructo of the optmal weghtg matrx, costructo of the GMM objectve fucto, estmato of asymptotc varace, etc. TRINITY FOR GMM 49
50 3. Mshk obtaed the followg results (stadard errors are paretheses): m, m; m; t-test of t-test of (moths) m; = m; = 3; :4 :37 :7 :9 (:85) (:4498) 6; 3 :379 :83 :33 :49 (:47) (:5499) 9; 6 :86 :4 : 3:7 (:647) (:695) Dscuss ad terpret the estmates ad results of hypotheses tests..9 Spot ad forward exchage rates Cosder a smple problem of predcto of spot exchage rates by forward rates: s t+ s t = + (f t s t ) + e t+ ; E t [e t+ ] = ; E t e t+ = ; where s t s the spot rate at t; f t s the forward rate for oe-moth forwards at t, ad E t deotes expectato codtoal o tme t formato. The curret spot rate s subtracted to acheve statoarty. Suppose the researcher decdes to use ordary least squares to estmate ad : Recall that the momet codtos used by the OLS estmator are E [e t+ ] = ; E [(f t s t ) e t+ ] = : (.). Besde (.), there are other momet codtos that ca be used estmato: E [(f t k s t k ) e t+ ] = ; because f t k s t k belogs to formato at tme t for ay k : Cosder the case k = ad show that such momet codto s redudat.. Besde (.), there s aother momet codto that ca be used estmato: E [(f t s t ) (f t+ f t )] = ; because formato at tme t should be uable to predct future movemets forward rates. Although ths momet codto does ot volve or, ts use may mprove e cecy of estmato. Uder what codto s the e cet GMM estmator usg both momet codtos as e cet as the OLS estmator? Is ths codto lkely to be sats ed practce?. Returs from acal market Suppose you have the followg parametrc model supported by oe of acal theores, for the daly dyamcs of 3-moth Treasury blls retur r t : r t+ r t = + r t + r t + 3 r t + " t+ ; 5 GENERALIZED METHOD OF MOMENTS
51 where " t+ ji t N ; r t where I t cotas formato o r t ; r t ; : : : : Such model arses from a dscretzato of a cotuous tme process de ed by a d uso model for returs. Suppose you have a log statoary ad ergodc sample fr t g T t=. I aswerg the followg questos, rst of all wrte out the metoed estmators, gvg explct formulas wheever possble.. Derve the asymptotc dstrbutos of the OLS ad feasble GLS estmators of the regresso coe cets. Wrte out momet codtos that correspod to these estmators.. Derve the asymptotc dstrbutos of the ML estmator of all parameters. Wrte out a system of momet codtos that correspods to ths estmator. 3. Costruct a exactly detfyg system of momet codtos from (a) the momet codtos mplctly used by OLS estmato of the regresso fucto, ad (b) the momet codtos mplctly used by NLLS estmato of the skedastc fucto. Derve the asymptotc dstrbuto of the resultg method of momets estmator of all parameters. 4. Compare the asymptotc e cecy of the estmators from parts 3 for the regresso parameters, ad of the estmators from parts 3 for ad. ;. Istrumetal varables ARMA models. Cosder a AR() model x t = x t + e t wth E [e t ji t ] = ; E e t ji t = ; ad jj < : We ca look at ths as a strumetal varables regresso that mples, amog others, strumets x t ; x t ; : : : : Fd the asymptotc varace of the strumetal varables estmator that uses strumet x t j ; where j = ; ; : : : : What does your result suggest o what the optmal strumet must be?. Cosder a ARMA(; ) model y t = y t +e t e t wth jj <, jj < ad E [e t ji t ] =. Suppose you wat to estmate by just-detfyg IV. What strumet would you use ad why?. Hausma may ot work. Suppose we wat to perform a Hausma test for valdty of strumets z for a lear codtoally homoskedastc mea regresso model. To ths ed, we compute OLS estmator ^ OLS ad SLS estmator ^ SLS of usg x ad z as strumets. Expla carefully why the Hausma test based o comparso of ^ OLS ad ^ SLS wll ot work.. Suppose that a cotext of a lear model wth (overdetfyg) strumetal varables a researcher teds to test for codtoal homoskedastcty usg a Hausma test based o the d erece betwee the SLS ad GMM estmators of parameters. Expla carefully why ths s ot a good dea. INSTRUMENTAL VARIABLES IN ARMA MODELS 5
12.13 Testing moment conditions

In the linear model $y = x'\beta + u$ under random sampling and the unconditional moment restriction $E[xu]=0$, suppose you wanted to test the additional moment restriction $E[xu^3]=0$, which might be implied by conditional symmetry of the error terms $u$. A natural way to test for the validity of this extra moment condition would be to efficiently estimate the parameter vector $\beta$ both with and without the additional restriction, and then to check whether the corresponding estimates differ significantly. Devise such a test and give step-by-step instructions for carrying it out.

12.14 Bootstrapping OLS

We know that one should use recentering when bootstrapping a GMM estimator. We also know that the OLS estimator is one of GMM estimators. However, when we bootstrap the OLS estimator, we calculate $\hat\beta^* = (X^{*\prime}X^*)^{-1}X^{*\prime}Y^*$ at each bootstrap repetition, and do not recenter. Resolve the contradiction.

12.15 Bootstrapping DD

The Distance Difference test statistic for testing the composite null $H_0: h(\theta) = 0$ is defined as
$$DD = n\left(\min_{q:\,h(q)=0} Q_n(q) - \min_{q} Q_n(q)\right),$$
where $Q_n(q)$ is the GMM objective function
$$Q_n(q) = \left(\frac{1}{n}\sum_{i=1}^{n} m(z_i, q)\right)'\hat Q_{mm}^{-1}\left(\frac{1}{n}\sum_{i=1}^{n} m(z_i, q)\right),$$
where $\hat Q_{mm}$ consistently estimates $Q_{mm} = E[m(z,\theta_0)m(z,\theta_0)']$. It is known that, as the sample size $n$ tends to infinity, $DD \xrightarrow{d} \chi^2_{\dim h}$. Write out a detailed formula for the bootstrap statistic $DD^*$.
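The recentering issue raised in the two bootstrap problems above can be made concrete: in the bootstrap world the population counterpart of the moment condition is the full-sample moment evaluated at the full-sample estimate, so bootstrap moment functions are recentered by subtracting that quantity. The following Python sketch illustrates this for the same two-moment example as before; the data-generating process, the grid minimizer and the number of bootstrap repetitions are assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta0 = 500, 2.0
x = theta0 + rng.normal(size=n)
y = theta0 ** 2 + rng.normal(size=n)
grid = np.linspace(0.0, 4.0, 801)

def moments(x, y, q):
    return np.column_stack((x - q, y - q ** 2))

def efficient_gmm(x, y, center=None):
    """Two-step GMM; in the bootstrap world the moments are recentered by
    the full-sample moment evaluated at the full-sample estimate."""
    def mbar(q):
        m = moments(x, y, q)
        if center is not None:
            m = m - center
        return m.mean(axis=0), m
    q1 = grid[int(np.argmin([mbar(q)[0] @ mbar(q)[0] for q in grid]))]
    _, m1 = mbar(q1)
    W = np.linalg.inv(m1.T @ m1 / len(x))
    obj = [len(x) * mbar(q)[0] @ W @ mbar(q)[0] for q in grid]
    j = int(np.argmin(obj))
    return grid[j], obj[j]                      # estimate and J-statistic

theta_hat, J = efficient_gmm(x, y)
center = moments(x, y, theta_hat).mean(axis=0)  # recentering term

J_star = []
for _ in range(99):
    idx = rng.integers(0, n, n)
    J_star.append(efficient_gmm(x[idx], y[idx], center=center)[1])

print(theta_hat, J, np.quantile(J_star, 0.95))  # bootstrap critical value for J
```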
53 3. PANEL DATA 3. Alteratg dvdual e ects Suppose that the uobservable dvdual e ects a oe-way error compoet model are d eret across odd ad eve perods: y t = O + x t + v t for odd t; y t = E + x t + v t for eve t; () where t = ; ; : : : ; T; = ; : : : : Note that there are T observatos for each dvdual. We wll call () alteratg e ects spec cato. As usual, we assume that v t are IID(; v) depedet of x s.. There are two ways to arrage the observatos: (a) the usual way, rst by dvdual, the by tme for each dvdual; (b) rst all odd observatos the usual order, the all eve observatos, so t s as though there are N dvduals each havg T observatos. Fd out the Q-matrces that wpe out dvdual e ects for both arragemets ad expla how they trasform the orgal equatos. For the rest of the problem, choose the Q-matrx to your lkg.. Treatg dvdual e ects as xed, descrbe the Wth estmator ad ts propertes. Develop a F -test for dvdual e ects, allowg heterogeety across odd ad eve perods. 3. Treatg dvdual e ects as radom ad assumg ther depedece of x s, v s ad each other, propose a feasble GLS procedure. Cosder two cases: (a) whe the varace of alteratg e ects s the same: V O = V E =, (b) whe the varace of alteratg e ects s d eret: V O = O ; V E = E, O 6= E : 3. Tme varat regressors Cosder a pael data model y t = x t + z + + v t ; = ; ; : : : ; ; t = ; ; : : : ; T; where s large ad T s small. Oe wats to estmate ad :. Expla how to e cetly estmate ad uder (a) xed e ects, (b) radom e ects, wheever t s possble. State clearly all assumptos that you wll eed.. Cosder the followg proposal to estmate. At the rst step, estmate the model y t = x t + + v t by the least squares dummy varables approach. At the secod step, take these estmates ^ ad estmate the coe cet of the regresso of ^ o z : Ivestgate the resultg estmator of for cosstecy. Ca you suggest a better estmator of? PANEL DATA 53
54 3.3 Wth ad Betwee Recall the stadard oe-way statc error compoet model. For smplcty, assume that all data are devatos from ther meas so that there s o tercept. Deote by ^ W ad ^ B the Wth ad Betwee estmators of structural parameters, respectvely. Deote the matrx of rght sde varables by X: Show that uder radom e ects, ^ W ad ^ B are ucorrelated codtoally o X: Develop a asymptotc test for radom e ects based o the d erece betwee ^ W ad ^ B : I partcular, derve the asymptotc dstrbuto of your test statstc whe the radom e ects model s vald, ad show that t asymptotcally dverges to ty whe the radom e ects are approprate. 3.4 Paels ad strumets Recall the stadard oe-way statc error compoet model wth radom e ects where the regressors are deoted by x t. Suppose that these regressors are edogeous. Addtoal data z t for each dvdual at each tme perod are gve, ad let these addtoal varables be depedet of the errors ad correlated wth the regressors. Hece, oe ca use these varables as strumetal varables (IV). Usg kowledge of lear pael regressos ad IV theory, aswer the followg questos about varous estmators, ot worryg about stadard errors.. Develop the IV-Wth estmator ad show that t wll be detcal rrespectve of whether oe uses orgal strumets or Wth-trasformed strumets.. Develop the IV-Betwee estmator ad show that t wll be detcal rrespectve of whether oe uses orgal strumets or Betwee-trasformed strumets. 3. What wll happe f we use Wth-trasformed strumets the Betwee regresso? If we use Betwee-trasformed strumets the Wth regresso? 4. Develop the IV-GLS estmator ad show that ow t matters whether oe uses orgal strumets or GLS-trasformed strumets. Itutvely, whch way would you prefer? 5. What wll happe f we use Wth-trasformed strumets the GLS-trasformed regresso? If we use Betwee-trasformed strumets the GLS-trasformed regresso? 3.5 D erecg trasformatos Evaluate the followg proposals.. I a oe-way error compoet model wth xed e ects, stead of usg dvdual dummes, oe ca alteratvely elmate dvdual e ects by takg the rst d erecg (FD) trasformato. After ths procedure oe has (T ) equatos wthout dvdual e ects, so the vector of structural parameters ca be estmated by OLS.. Recall the stadard dyamc pael data model. The dvdual heterogeety may be removed ot oly by rst d erecg, but also by (for example) subtractg the equato correspodg to t = from each other equato for the same dvdual. 54 PANEL DATA
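For readers who want to experiment with the Within and Between estimators discussed in the problems above, here is a minimal Python sketch for a balanced one-way error component panel with the time index running fastest; the simulated random-effects design is an illustrative assumption, not part of the problems.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, beta = 200, 5, 1.5

# Simulated one-way error-component (random effects) panel; index t runs fastest
mu = np.repeat(rng.normal(size=N), T)              # random individual effects
x = rng.normal(size=N * T)                         # regressor, independent of mu
y = beta * x + mu + rng.normal(size=N * T)         # no intercept, as in the problem

def within(z):
    """Deviations from individual means (the Q transformation)."""
    z = z.reshape(N, T)
    return (z - z.mean(axis=1, keepdims=True)).ravel()

def between(z):
    """Individual means (the P transformation), one value per individual."""
    return z.reshape(N, T).mean(axis=1)

# Within estimator: OLS on Q-transformed data
xw, yw = within(x), within(y)
beta_within = (xw @ yw) / (xw @ xw)

# Between estimator: OLS on individual means
xb, yb = between(x), between(y)
beta_between = (xb @ yb) / (xb @ xb)

print(beta_within, beta_between)   # both close to beta under random effects
```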
55 3.6 Nolear pael data model We kow (see Problem 9.3) that the NLLS estmator of ad the olear model ^^ = arg m a;b (y + a) bx = (y + ) = x + e; where e s depedet of x; s geeral cosstet. Suppose that there s a pael fx t ; y t g = T t= ; where s large ad T s small, so that there s a opportuty to cotrol dvdual heterogeety. Wrte out a oe-way error compoet model assumg the same fuctoal form but allowg for dvdual heterogeety the form of radom e ects. Usg a aalogy wth the theory of a lear pael regresso, propose a multstep procedure of estmatg ad adaptg the estmator you used Problem 9.3 to a pael data evromet. 3.7 Durb Watso statstc ad pael data Cosder the stadard oe-way error compoet model wth radom e ects: y t = x t + + v t ; = ; : : : ; ; t = ; : : : ; T; (3.) where s k ; are radom dvdual e ects, IID(; ); v t are dosycratc shocks, v t IID(; v); ad ad v t are depedet of x t for all ad t ad mutually. The equatos are arraged so that the dex t s faster tha the dex : Cosder rug OLS o the orgal regresso (3.) ad rug OLS o the GLS-trasformed regresso y t ^y = (x t ^x ) + ( ^) + v t ^v ; = ; : : : ; ; t = ; : : : ; T; (3.) q where ^ s a cosstet (as! ad T stays xed) estmate of = v = v + T : Whe each OLS estmate s obtaed usg a typcal regresso package, the Durb Watso (DW) statstc s provded amog the regresso output. Recall that f ^e ; ^e ; : : : ; ^e N ; ^e N s a seres of regresso resduals, the the DW statstc s DW = NP (^e j ^e j ) j= NP ^e j j= :. Derve the probablty lmts of the two DW statstcs, as! ad T stays xed.. Usg the obtaed result, propose a asymptotc test for dvdual e ects based o the DW statstc [Ht: That the errors are estmated does ot a ect the asymptotc dstrbuto of the DW statstc. Take ths for grated.] Ths problem s a part of S. Aatolyev (, 3) Durb Watso statstc ad radom dvdual e ects. Ecoometrc Theory 8, Problem.5., 73 74, ad 9, Soluto.5., NONLINEAR PANEL DATA MODEL 55
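The Durbin–Watson statistic used in the panel-data problem above is straightforward to compute from any residual series; the small Python helper below is a sketch (the white-noise example at the end is only an illustration of the familiar value of about 2).

```python
import numpy as np

def durbin_watson(e):
    """Durbin-Watson statistic: sum of squared first differences of the
    residuals divided by their sum of squares."""
    e = np.asarray(e, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Example: residuals stacked with the time index running fastest within each
# individual, as in the arrangement described in the problem above.
rng = np.random.default_rng(3)
print(durbin_watson(rng.normal(size=1000)))   # approximately 2 for white noise
```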
13.8 Higher-order dynamic panel

Formulate a linear dynamic panel regression with a single weakly exogenous regressor, and AR(2) feedback in place of AR(1) feedback (i.e. when the two most recent lags of the left side variable are present at the right side). Describe the algorithm of estimation of this model.
57 4. NONPARAMETRIC ESTIMATION 4. Noparametrc regresso wth dscrete regressor Let (x ; y ); = ; : : : ; be a radom sample from the populato of (x; y), where x has a dscrete dstrbuto wth the support a () ; : : : ; a (k), where a () < : : : < a (k). Havg wrtte the codtoal expectato E yjx = a (j) the form that allows to apply the aalogy prcple, propose a aalog estmator ^g j of g j = E yjx = a (j) ad derve ts asymptotc dstrbuto. 4. Noparametrc desty estmato Suppose we have a radom sample fx g = ad let ^F (x) = I [x x] = deote the emprcal dstrbuto fucto f x ; where I() s a dcator fucto. Cosder two desty estmators: oe-sded estmator: ^f (x) = ^F (x + h) ^F (x) h two-sded estmator: Show that: ^f (x) = ^F (x + h=) ^F (x h=) h (a) ^F (x) s a ubased estmator of F (x). Ht: recall that F (x) = Pfx xg = E [I [x x]] : (b) The bas of ^f (x) s O (h a ) : Fd the value of a. Ht: take a secod-order Taylor seres expaso of F (x + h) aroud x. (c) The bas of ^f (x) s O h b : Fd the value of b. expaso of F x + h ad F x h aroud x. Ht: take a secod-order Taylor seres 4.3 Nadaraya Watso desty estmator Derve the asymptotc dstrbuto of the Nadaraya Watso estmator of the desty of a scalar radom varable x havg a cotuous dstrbuto, smlarly to how the asymptotc dstrbuto of the Nadaraya Watso estmator of the regresso fucto s derved, uder smlar codtos. Gve terpretato to how your expressos for asymptotc bas ad asymptotc varace deped o the shape of the desty. NONPARAMETRIC ESTIMATION 57
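The one-sided and two-sided density estimators built from the empirical distribution function in the density estimation problem above are easy to code and to compare numerically. The Python sketch below does exactly that at a single evaluation point; the simulated normal sample, the evaluation point and the bandwidths are illustrative assumptions.

```python
import numpy as np

def edf(sample):
    """Empirical distribution function F_hat(x) = (1/n) sum 1{x_i <= x}."""
    sample = np.sort(np.asarray(sample, dtype=float))
    return lambda x: np.searchsorted(sample, x, side="right") / sample.size

def density_one_sided(sample, x, h):
    F = edf(sample)
    return (F(x + h) - F(x)) / h

def density_two_sided(sample, x, h):
    F = edf(sample)
    return (F(x + h / 2) - F(x - h / 2)) / h

rng = np.random.default_rng(4)
z = rng.normal(size=10_000)
true = np.exp(-0.5) / np.sqrt(2 * np.pi)     # standard normal density at x = 1
for h in (0.5, 0.2, 0.05):
    # the two-sided estimator typically shows smaller bias for the same h
    print(h, density_one_sided(z, 1.0, h), density_two_sided(z, 1.0, h), true)
```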
58 4.4 Frst d erece trasformato ad oparametrc regresso Ths problem llustrates the use of a d erece operator oparametrc estmato wth IID data. Suppose that there s a scalar varable z that takes values o a bouded support. For smplcty, let z be determstc ad compose a uform grd o the ut terval [; ]: The other varables are IID. Assume that for the fucto g () below the followg Lpschtz codto s sats ed: for some costat G: jg(u) g(v)j Gju vj. Cosder a oparametrc regresso of y o z: y = g(z ) + e ; = ; : : : ; ; (4.) where E [e jz ] = : Let the data f(z ; y )g = be ordered so that the z s are creasg order. A rst d erece trasformato results the followg set of equatos: y y = g(z ) g(z ) + e e ; = ; : : : ; : (4.) The target s to estmate E e : Propose ts cosstet estmator based o the FDtrasformed regresso (4.). Prove cosstecy of your estmator.. Cosder the followg partally lear regresso of y o x ad z: y = x + g(z ) + e ; = ; : : : ; ; (4.3) where E [e jx ; z ] = : Let the data f(x ; z ; y )g = be ordered so that the z s are creasg order. The target s to oparametrcally estmate g: Propose ts cosstet estmator usg the FD-trasformato of (4.3). [Ht: o the rst step, cosstetly estmate from the FD-trasformed regresso.] Prove cosstecy of your estmator. 4.5 Ubasedess of kerel estmates Recall the Nadaraya Watso kerel estmator ^g (x) of the codtoal mea g (x) E [yjx] costructed for a radom sample. Show that f g (x) = c, where c s some costat, the ^g (x) s ubased, ad provde tuto behd ths result. Fd out uder what crcumstace wll the local lear estmator of g (x) be ubased uder radom samplg. Fally, vestgate the kerel estmator of the desty f (x) of x for ubasedess uder radom samplg. 4.6 Shape restrcto Frms produce some product usg techology f(l; k). The fuctoal form of f s ukow, although we kow that t exhbts costat returs to scale. For a rm ; we observe labor l ; captal k ; ad output y ; ad the data geeratg process takes the form y = f(l ; k ) + " ; where E [" ] = ad " s depedet of (l ; k ). Usg radom sample fy ; l ; k g =, suggest a oparametrc estmator of f(l; k) whch also exhbts costat returs to scale. 58 NONPARAMETRIC ESTIMATION
14.7 Nonparametric hazard rate

Let $z_1, \dots, z_n$ be scalar IID random variables with unknown PDF $f(\cdot)$ and CDF $F(\cdot)$. Assume that the distribution of $z$ has support $\mathbb{R}$. Pick $t \in \mathbb{R}$ such that $0 < F(t) < 1$. The objective is to estimate the hazard rate $H(t) = f(t)/(1 - F(t))$.

(i) Suggest a nonparametric estimator for $F(t)$. Denote this estimator by $\hat F(t)$.

(ii) Let
$$\hat f(t) = \frac{1}{nh}\sum_{j=1}^{n} k\!\left(\frac{z_j - t}{h}\right)$$
denote the Nadaraya–Watson estimate of $f(t)$, where the bandwidth $h$ is chosen so that $nh^5 \to 0$, and $k(\cdot)$ is a symmetric kernel. Find the asymptotic distribution of $\hat f(t)$. Do not worry about regularity conditions.

(iii) Use $\hat f(t)$ and $\hat F(t)$ to suggest an estimator for $H(t)$. Denote this estimator by $\hat H(t)$. Find the asymptotic distribution of $\hat H(t)$.

14.8 Nonparametrics and perfect fit

Analyze carefully the asymptotic properties of the Nadaraya–Watson estimator of a regression function with perfect fit, i.e. when the variance of the error is zero.

14.9 Nonparametrics and extreme observations

Discuss the behavior of a Nadaraya–Watson mean regression estimate when one of the IID observations, say $(x_1, y_1)$, tends to assume extreme values. Specifically, discuss (a) the case when $x_1$ is very big, (b) the case when $y_1$ is very big.
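Since several problems in this chapter revolve around the Nadaraya–Watson estimator and the hazard rate built from a kernel density and the EDF, a compact reference implementation may be helpful. The Python sketch below uses a Gaussian kernel and fixed bandwidths purely as illustrative assumptions; it is not the unique estimator the problems have in mind.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)

def nw_regression(x, y, grid, h):
    """Nadaraya-Watson estimate of E[y|x] on a grid of points."""
    w = gaussian_kernel((grid[:, None] - x[None, :]) / h)
    return (w @ y) / w.sum(axis=1)

def kernel_density(z, grid, h):
    return gaussian_kernel((grid[:, None] - z[None, :]) / h).mean(axis=1) / h

def hazard_rate(z, t, h):
    """H_hat(t) = f_hat(t) / (1 - F_hat(t)), with F_hat the EDF."""
    t = np.atleast_1d(t)
    f_hat = kernel_density(z, t, h)
    F_hat = np.mean(z[None, :] <= t[:, None], axis=1)
    return f_hat / (1.0 - F_hat)

rng = np.random.default_rng(5)
x = rng.uniform(-2, 2, size=2000)
y = np.sin(x) + 0.3 * rng.normal(size=2000)
grid = np.linspace(-1.5, 1.5, 7)
print(nw_regression(x, y, grid, h=0.2))                 # close to sin(grid)
print(hazard_rate(rng.normal(size=5000), 0.0, h=0.3))   # about 0.798 for N(0,1) at 0
```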
61 5. CONDITIONAL MOMENT RESTRICTIONS 5. Usefuless of skedastc fucto Suppose that for the followg lear regresso model the form of a skedastc fucto s y = x + e; E [ejx] = E e jx = h(x; ; ); where h() s a kow smooth fucto, ad s a addtoal parameter vector. Compare asymptotc varaces of optmal GMM estmators of whe oly the rst restrcto or both restrctos are employed. Uder what codtos does cludg the secod restrcto to a set of momet restrctos reduce asymptotc varace? What f the fucto h() does ot deped o? What f addto the dstrbuto of e codtoal o x s symmetrc? 5. Symmetrc regresso error Cosder the regresso y = x + e; E [ejx] = ; where all varables are scalars. The radom sample fy ; x g = s avalable.. The researcher also suspects that y; codtoal o x; s dstrbuted symmetrcally aroud the codtoal mea. Devse a Hausma spec cato test for ths symmetry. Be spec c ad gve all detals at all stages whe costructg the test.. Suppose that eve though the Hausma test rejects symmetry, the researcher uses the assumpto that ejx N (; ). Derve the asymptotc propertes of the QML estmator of. 5.3 Optmal strumetato of cosumpto fucto Cosder the model c t = + y t + e t; where c t s cosumpto at t; y t s come at t; ad all varables are jotly statoary. There s edogeety come, however, so that e t s ot mea depedet of y t : However, the lagged values of come are predetermed, so e t s mea depedet of the past hstory of y t ad c t : E [e t jy t ; c t ; y t ; c t ; : : :] = : Suppose a log tme seres o c t ad y t s avalable. Outle how you would estmate the model parameters most e cetly. CONDITIONAL MOMENT RESTRICTIONS 6
62 5.4 Optmal strumet AR-ARCH model Cosder a AR() ARCH() model: x t = x t + " t where the dstrbuto of " t codtoal o I t s symmetrc aroud wth E " t ji t = ( ) + " t ; where < ; < ad I t = fx t ; x t ; : : :g :. Let the space of admssble strumets for estmato of the AR() part be ( X Z t = x t = ; s.t. ) X < : Usg the optmalty codto, d the optmal strumet as a fucto of the model parameters ad : Outle how to costruct ts feasble verso.. Use your tuto to speculate o relatve e cecy of the optmal strumet you foud part versus the optmal strumet based o the codtoal momet restrcto E [" t ji t ] =. = 5.5 Optmal strumet AR wth olear error Cosder a AR() model x t = x t + " t ; where the dsturbace " t s geerated as " t = t t ; where t s a IID sequece wth mea zero ad varace oe.. Show that E [" t x t j ] = for all j : Fd the optmal strumet based o ths system of ucodtoal momet restrctos. How may lags of x t does t employ? Outle how to costruct ts feasble verso.. Show that E [" t ji t ] = ; where I t = fx t ; x t ; : : :g : Fd the optmal strumet based o ths codtoal momet restrcto. Outle how to costruct ts feasble verso, or, f that s mpossble, expla why. 5.6 Optmal IV estmato of a costat Cosder the followg MA(p) data geeratg mechasm: y t = + (L)" t ; where " t s a mea zero IID sequece, ad (L) s lag polyomal of te order p. Derve the optmal strumet for estmato of based o the codtoal momet restrcto E [y t jy t p ; y t p ; : : :] = : 6 CONDITIONAL MOMENT RESTRICTIONS
63 5.7 Negatve bomal dstrbuto ad PML Determe f usg the egatve bomal dstrbuto havg the desty f (u; m) = (a + u) m u + m (a+u) ; (a) ( + u) a a where m s ts mea ad a s a arbtrary kow costat, leads to cosstet estmato of the mea regresso E [yjx] = m (x; ) whe the true codtoal dstrbuto s heteroskedastc ormal, wth the skedastc fucto V [yjx] = m (x; ) : 5.8 Nestg ad PML Cosder the regresso model E [yjx] = m (x; ) : Suppose that a PML estmator based o the desty f (z; ) parameterzed by the mea cosstetly estmates the true parameter : Cosder aother desty h (z; ; &) parameterzed by two parameters, the mea ad some other parameter &; whch ests f (z; ) (.e., f s a specal case of h). Use the example of the Webull dstrbuto havg the desty h (z; ; &) = & + &! & z & exp + &! &! z I [z ] ; & > ; to show that the PML estmator based o h (z; ; &) does ot ecessarly cosstetly estmates : What s the ecoometrc explaato of ths perhaps couter-tutve result? Ht: you may use the formato that E z = + & = + & ; (x) = (x ) (x ) ; ad () = : 5.9 Msspec cato varace Cosder the regresso model E [yjx] = m (x; ) : Suppose that ths regresso s codtoally ormal ad homoskedastc. A researcher, however, uses the followg codtoal desty to costruct a PML estmator of : (yjx; ) N m (x; ) ; m (x; ) : Establsh f such estmator s cosstet for : NEGATIVE BINOMIAL DISTRIBUTION AND PML 63
64 5. Mod ed Posso regresso ad PML estmators Let the observable radom varable y be dstrbuted, codtoally o observable x ad uobservable " as Posso wth the parameter (x) = exp(x +"); where E[exp "jx] = ad V[exp "jx] = : Suppose that vector x s dstrbuted as multvarate stadard ormal.. Fd the regresso ad skedastc fuctos, where the codtoal formato volves oly x.. Fd the asymptotc varaces of the Nolear Least Squares (NLLS) ad Weghted Nolear Least Squares (WNLLS) estmators of. 3. Fd the asymptotc varaces of the Pseudo-Maxmum Lkelhood (PML) estmators of based o (a) the ormal dstrbuto; (b) the Posso dstrbuto; (c) the Gamma dstrbuto. 4. Rak the ve estmators terms of asymptotc e cecy. 5. Optmal strumet ad regresso o costat Cosder the followg model: y = + e ; = ; : : : ; ; where uobservable e codtoally o x s dstrbuted symmetrcally wth mea zero ad varace x wth ukow. The data (y ; x ) are IID.. Costruct a par of codtoal momet restrctos from the formato about the codtoal mea ad codtoal varace. Derve the optmal ucodtoal momet restrctos, correspodg to (a) the codtoal restrcto assocated wth the codtoal mea; (b) the codtoal restrctos assocated wth both the codtoal mea ad codtoal varace.. Descrbe detal the GMM estmators that correspod to the two optmal sets of ucodtoal momet restrctos of part. Note that part (a) the parameter s ot det ed, therefore propose your ow estmator of that d ers from the oe mpled by part (b). All estmators that you costruct should be fully feasble. If you use oparametrc estmato, gve all the detals. Your descrpto should also cota estmato of asymptotc varaces. 3. Compare the asymptotc propertes of the GMM estmators that you desged. 4. Derve the Pseudo-Maxmum Lkelhood estmator of ad of order (PML) that s based o the ormal dstrbuto. Derve ts asymptotc propertes. How does ths estmator relate to the GMM estmators you obtaed part? The dea of ths problem s borrowed from Goureroux, C. ad Mofort, A. Statstcs ad Ecoometrc Models, Cambrdge Uversty Press, CONDITIONAL MOMENT RESTRICTIONS
16. EMPIRICAL LIKELIHOOD

16.1 Common mean

Suppose we have the following moment restrictions: $E[x] = E[y] = \theta$.

1. Find the system of equations that yields the empirical likelihood (EL) estimator $\hat\theta$ of $\theta$, the associated Lagrange multipliers $\hat\lambda$ and the implied probabilities $\hat p_i$. Derive the asymptotic variances of $\hat\theta$ and $\hat\lambda$ and show how to estimate them.

2. Reduce the number of parameters by eliminating the redundant ones. Then linearize the system of equations with respect to the Lagrange multipliers that are left, around their population counterparts of zero. This will help to find an approximate, but explicit solution for $\hat\theta$, $\hat\lambda$ and $\hat p_i$. Derive that solution and interpret it.

3. Instead of defining the objective function $\sum_{i=1}^{n}\log p_i$ as in the EL approach, let the objective function be $-\sum_{i=1}^{n} p_i\log p_i$. This gives rise to the exponential tilting (ET) estimator of $\theta$. Find the system of equations that yields the ET estimator $\hat\theta$ of $\theta$, the associated Lagrange multipliers $\hat\lambda$ and the implied probabilities $\hat p_i$. Derive the asymptotic variances of $\hat\theta$ and $\hat\lambda$ and show how to estimate them.

16.2 Kullback–Leibler Information Criterion

The Kullback–Leibler Information Criterion (KLIC) measures the distance between distributions, say $g(z)$ and $h(z)$:
$$KLIC(g:h) = E_g\left[\log\frac{g(z)}{h(z)}\right],$$
where $E_g[\cdot]$ denotes mathematical expectation according to $g(z)$. Suppose we have the following moment condition:
$$E[m(z_i,\theta_0)] = 0,$$
where $m$ is $\ell\times 1$, $\theta$ is $k\times 1$, $\ell \geq k$, and a random sample $z_1,\dots,z_n$ with no elements equal to each other is available. Denote by $e$ the empirical distribution function (EDF), i.e. the one that assigns probability $\frac{1}{n}$ to each sample point. Denote by $\pi$ a discrete distribution that assigns probability $\pi_i$ to the sample point $z_i$, $i = 1,\dots,n$.
1. Show that minimization of $KLIC(e:\pi)$ subject to $\sum_{i=1}^{n}\pi_i = 1$ and $\sum_{i=1}^{n}\pi_i m(z_i,\theta) = 0$ yields the Empirical Likelihood (EL) value of $\theta$ and the corresponding implied probabilities.

2. Now we switch the roles of $e$ and $\pi$ and consider minimization of $KLIC(\pi:e)$ subject to the same constraints. What familiar estimator emerges as the solution to this optimization problem?

3. Now suppose that we have a priori knowledge about the distribution of the data. So, instead of using the EDF, we use the distribution $p$ that assigns known probability $p_i$ to the sample point $z_i$, $i = 1,\dots,n$ (of course, $\sum_{i=1}^{n} p_i = 1$). Analyze how the solutions to the optimization problems in parts 1 and 2 change.

4. Now suppose that we have postulated a family of densities $f(z,\theta)$ which is compatible with the moment condition. Interpret the value of $\theta$ that minimizes $KLIC(e:f)$.

16.3 Empirical likelihood as IV estimation

Consider a linear model with instrumental variables:
$$y = x'\beta + e, \qquad E[ze] = 0,$$
where $x$ is $k\times 1$, $z$ is $\ell\times 1$, and $\ell \geq k$. Write down the EL estimator of $\beta$ in a matrix form of a (not completely feasible) instrumental variables estimator. Also write down the efficient GMM estimator, and explain intuitively why the former is expected to exhibit better finite sample properties than the latter.
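For the common mean problem that opens this chapter, the EL estimator can be computed by profiling out the Lagrange multipliers: for each candidate $\theta$ one maximizes $\sum_i\log(1+\lambda'm_i(\theta))$ over $\lambda$, and then minimizes the resulting profile over $\theta$; the implied probabilities are $\hat p_i = 1/\big(n(1+\hat\lambda'm_i(\hat\theta))\big)$. The Python sketch below follows this recipe; the simulated data, the grid for $\theta$ and the Nelder–Mead inner optimizer are assumptions made only for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n, theta0 = 300, 1.0
x = theta0 + rng.normal(size=n)
y = theta0 + 2.0 * rng.normal(size=n)        # same mean, different variance

def m(theta):
    # n x 2 moment matrix for the common-mean restrictions E[x] = E[y] = theta
    return np.column_stack((x - theta, y - theta))

def inner(theta):
    """Maximum over lambda of sum log(1 + lambda'm_i(theta))."""
    mi = m(theta)
    def neg_dual(lam):
        a = 1.0 + mi @ lam
        if np.any(a <= 1e-8):
            return np.inf                    # outside the feasible region
        return -np.sum(np.log(a))
    res = minimize(neg_dual, np.zeros(2), method="Nelder-Mead")
    return -res.fun, res.x

# Outer step: the EL estimator minimizes the inner maximum over theta
grid = np.linspace(0.5, 1.5, 201)
values = [inner(t)[0] for t in grid]
theta_el = grid[int(np.argmin(values))]
_, lam = inner(theta_el)
p = 1.0 / (n * (1.0 + m(theta_el) @ lam))    # implied probabilities, sum to 1
print(theta_el, lam, p.sum())
```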
67 7. ADVANCED ASYMPTOTIC THEORY 7. Maxmum lkelhood ad asymptotc bas Derve the secod order bas of the Maxmm Lkelhood (ML) estmator ^ of the parameter > of the expoetal dstrbuto f(y; ) = obtaed from radom sample y ; : : : ; y T ; (a) usg a explct formula for ^; exp( y); y ; y < (b) usg the expresso for the secod order bas of extremum estmators. Costruct the bas corrected ML estmator of. 7. Emprcal lkelhood ad asymptotc bas Cosder estmato of a scalar parameter o the bass of the momet fucto x m(x; y; ) = y ad IID data (x ; y ); = ; : : : ; : Show that the secod order asymptotc bas of the emprcal lkelhood estmator of equals. 7.3 Asymptotcally rrelevat strumets Cosder the lear model y = x + e; where scalar radom varables x ad e are correlated wth the correlato coe cet. Avalable are data for a ` vector of strumets z. These strumets, however, are asymptotcally rrelevat,.e. E [zx] = : The data (x ; y ; z ); = ; : : : ; ; are IID. You may addtoally assume that both x ad e are homoskedastc codtoal o z; ad that they are homocorrelated codtoal o z:. Fd the probablty lmt of the SLS estmator of from the rst prcples (.e. wthout usg the weak strumets theory). ADVANCED ASYMPTOTIC THEORY 67
68 . Verfy that your result part coforms to the weak strumets theory beg ts specal case. 3. Fd the expected value of the probablty lmt of the SLS estmator. How does t relate to the probablty lmt of the OLS estmator? 7.4 Weakly edogeous regressors Cosder the regresso Y = X + E; where the regressors X are correlated wth the error E; but ths correlato s weak. Cosder the decomposto of E to ts projecto o X ad the orthogoal compoet U: E = X + U: Assume that X X ; = X U p! (Q; ) ; where N ; u Q ad Q has full rak. Show that uder the assumpto of the drftg parameter DGP = c= p ; where s the sample sze ad c s xed, the OLS estmator of s cosstet ad asymptotcally ocetral ormal, ad derve the asymptotc dstrbuto of the Wald test statstc for testg the set of lear restrcto R = r; where R has full rak q: 7.5 Weakly vald strumets Cosder a lear model wth IID data where all varables are scalars. y = x + e;. Suppose that x ad e are correlated, but there s a ` strog strumet z weakly correlated wth e: Derve the asymptotc (as! ) dstrbutos of the SLS estmator of, ts t rato, ad the overdet cato test statstc U J = b Z(Z Z) Z U b bu U b ; where U b Y ^X are the vector of SLS resduals ad Z s the matrx of strumets, uder the drftg DGP! = c! = p ; where! s the vector of coe cets o z the lear projecto of e o z: Also, specalze to the case ` =.. Suppose that x ad e are correlated, but there s a ` weak strumet z weakly correlated wth e: Derve the asymptotc (as! ) dstrbutos of the SLS estmator of, ts t rato, ad the overdet cato test statstc J ; uder the drftg DGP! = c! = p ad = c = p ; where! s the vector of coe cets o z the lear projecto of e o z; ad s the vector of coe cets o z the lear projecto of x o z: Also, specalze to the case ` =. 68 ADVANCED ASYMPTOTIC THEORY
Part II

Solutions
71 . ASYMPTOTIC THEORY: GENERAL AND INDEPENDENT DATA. Asymptotcs of trasformatos. There are at least two ways to solve the problem. A easer way s usg the secod-order Delta-method, see Problem.8. The secod way s usg trgoometrc dettes ad the regular Delta-method: T ( cos ^) = T s ^ = p T s ^! d! cos N (; ) :. Recallg how the Delta-method s proved, T s ^ = T (s ^ s ) = (^ )! d N (; ) : p! 3. By the Ma Wald theorem, log T + log ^ d! log ) log ^ d! ) T log ^ d! :. Asymptotcs of rotated logarthms Use the Delta-method for ad We have so where It follows that l U p U x =x =y y) y =x =y u v d! N ; x l x l y g = : y l x + l y ; y) p l U l V l u l v l U + l V l u + l v GG = uu u! uv +! vv u v v! uu! vv u v! uu u u v = =u = v = u = v d! N ; GG ;! uu u! vv v +! uv u v +! vv v C A : l V ad l U + l V are asymptotcally depedet whe! uu u ; =! vv. v ASYMPTOTIC THEORY: GENERAL AND INDEPENDENT DATA 7
72 .3 Escapg probablty mass () The expectato s E [^ ] = E [x ja ] P fa g + E ja P A so the bas s E [^ ] = +! = + ; as!. The expectato of the square s E ^ = E x ja P fa g + E j A P A = so the varace s V [^ ] = E ^ (E [^ ]) = as! : As a cosequece, the MSE of ^ s MSE [^ ] = V [^ ] + (E [^ ] ad also teds to ty as!. ) = + ( ) +! ( ) + () Despte the MSE of ^ goes to ty, ^ s cosstet: for ay " > ; P fj^ j > "g = P fjx j > "g ; + ; + P fj j > "g! by cosstecy of x ad boudedess of probabltes. The CDF of p (^ ) s F p (^ ) (t) = P p (^ ) t = P p (x ) t! F N (; ) (t) + P p ( ) t by the asymptotc ormalty of x ad boudedess of probabltes. () Sce the asymptotc dstrbutos of x ad ^ are the same, the approxmate co dece tervals for wll be detcal except that they ceter at x ad ^ ; respectvely..4 Asymptotcs of t-ratos The soluto s straghtforward oce we determe to whch vector the LLN or CLT should be appled. (a) Whe =, we have: x p!, p x d! N (; ), ad ^ p!. Therefore p T = p x ^ d! N (; ) N (; ): 7 ASYMPTOTIC THEORY: GENERAL AND INDEPENDENT DATA
73 (b) Cosder the vector W x ^ = x (x ) = (x ) : By the LLN, the last term goes probablty to the zero vector, whle the rst term, ad thus the whole W, coverges probablty to plm W =! : Moreover, because p (x )! d N (; ), we have p (x ) d!. Next, let W x ; (x ) p d. The W plm W! N (; V ), where V V [W ].! Let us calculate V. Frst, V [x ] = ad V (x ) = E ((x ) Secod, C x ; (x ) = E (x )((x ) ) =. Therefore, ) = 4. p d W plm W! N ;! 4 Now use the Delta-method wth the trasformato t g = t p ) g t t to get t p T Ideed, ths reduces to N (; ) whe =. (c) Smlarly, cosder the vector t = t t t d plm T! N ; + ( 4 )! 4 6 : W By the LLN, W coverges probablty to x = plm W =! = x x + : Next, p d W plm W! N (; V ), where V V [W ], W x ; x. Let us calculate! V. Frst, V [x ] = ad V x = E (x ) = Secod, C x ; x = E (x )(x ) =. Therefore, p d W plm W! N ;! Now use the Delta-method wth to get p R g t t d plm R! N! = t p t : ; ( + ) 3 : A : Ths reduces to the aswer part (b) f ad oly f =. Uder ths codto, T ad R are asymptotcally equvalet. ASYMPTOTICS OF T -RATIOS 73
74 .5 Creepg bug o smplex Sce x k ad y k are perfectly correlated, t su ces to cosder ether oe, x k say. Note that at each step x k creases by k wth probablty p, or stays the same. That s, x k = x k + k k, where k s IID B (p). Ths meas that x k = P k k = whch by the LLN coverges probablty to E [ ] = p as k!. Therefore, plm (x k ; y k ) = (p; p). Next, due to the CLT, p (xk plm x k ) d! N (; p( p)) : Therefore, the rate of covergece s p, as usual, ad p xk xk d plm! N y k y k ; p( p) p( p) p( p) p( p) :.6 Asymptotcs of sample varace Accordg to the LLN, a must be E[x]; ad b must be E[x ]: Accordg to the CLT, c() must be p ; ad p x E[x] d V[x] C[x; x x E[x! N ; ] ] C[x; x ] V[x : ] Usg the Delta-method wth whose dervatve s we get or g G u u u u = u u = u ; p x (x ) E[x ] E[x] d! E[x] V[x] C[x; x N ; ] C[x; x ] V[x ] E[x] ; p x (x ) V[x] d! N ; V[x ] + 4E[x] V[x] 4E[x]C[x; x ] :.7 Asymptotcs of roots More precsely, we eed to assume that F s cotuously d eretable, at least at (a; ), ad that t possesses a cotuously d eretable verse at (a; ). Suppose that p (^a a) d! N (; V^a ) : The cosstecy of ^ follows from applcato of the Ma Wald theorem. Next, the rst-order Taylor expaso of F ^a; ^ = 74 ASYMPTOTIC THEORY: GENERAL AND INDEPENDENT DATA
75 aroud (a; ) yelds where (a ; ) les betwee (a; ) ad Because F (a; ) = ; we have or p ^ F (a; ) (a ; (^a a) (a ; ^ = ; = ^a; ^ compoetwse, ad hece s cosstet for (a; ) (a ; p (^a a) (a ; p ^ = (a ; (a ; ) p (a; (a; (a; (a; )!! where t s assumed (a; ) =@ s of full rak. Alteratvely, oe could use the theorem o d eretato of a mplct fucto..8 Secod-order Delta-method (a) From the CLT, p S d! N (; ). Usg the Ma Wald theorem for g(x) = x we get d! : S (b) The Taylor expaso aroud cos() = yelds cos(s ) = cos(s )S, where S [; S ]. From the LLN ad Ma Wald theorem, cos(s ) p!, ad from the Slutsky theorem, ( cos(s )) d! : (c) Let z p! z = cost ad p (z z) d! N ; : Let g be twce cotuously d eretable at z wth g (z) = ad g (z) 6= : The g(z ) g(z) g (z) d! : Proof. Ideed, as g (z) = ; from the secod-order Taylor expaso, g(z ) = g(z) + g (z )(z z) ; ad, sce g (z ) p! g (z) ad p (z z) d! N (; ) ; we have g(z ) g(z) g (z) p (z z) = d! : QED SECOND-ORDER DELTA-METHOD 75
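The second-order Delta-method result above is easy to verify by simulation: with $S_n$ the sample mean of $n$ IID variables with zero mean and unit variance, $n(1-\cos S_n)$ should behave like one half of a $\chi^2_1$ draw. Below is a small Monte Carlo sketch in Python; the sample size, replication count and the uniform design are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)
n, reps = 500, 20_000

# n * (1 - cos(S_n)) for S_n the mean of n IID(0,1) variables
s = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(reps, n)).mean(axis=1)
stat = n * (1.0 - np.cos(s))

# Compare quantiles with those of (1/2) * chi-square(1)
half_chi2 = 0.5 * rng.chisquare(1, size=reps)
for q in (0.5, 0.9, 0.95, 0.99):
    print(q, np.quantile(stat, q), np.quantile(half_chi2, q))
```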
76 .9 Asymptotcs wth shrkg regressor The OLS estmates are ^ = P y x P y P x P x ( P x ) ; ^ = X X y ^ x (.) ad ^ = X ^e : Let us cosder ^ rst. From (.) t follows that ^ = P ( + x + u )x P ( + x + u ) P x P x ( P x ) whch coverges to = + P u P u P P ( P ) = + + plm! u ; = P u ( ) P u ; ( ) ( ) f plm P u exsts ad s a well-de ed radom varable. Its momets are E [] =, E = ad E 3 = 3. Hece, 3 Now let us look at ^. Aga, from (.) we see that ^ = + ( ^) ^ d! : (.) X + X p u! ; where we used (.) ad the LLN for a average of u. Next, p ( (^ ) = p ( ^) + ) + X p u = U + V : Because of (.), U p! : From the CLT t follows that V d! N (; ): Take together, p (^ ) d! N (; ): Lastly, let us look at ^ : ^ = X ^e = X ( ^) + ( ^)x + u : (.3) Usg the facts that: () ( ^)! p, () ( ^) =! p, (3) P u (5) = P p u!, we coclude that ^ p! : p!, (4) P u p!, 76 ASYMPTOTIC THEORY: GENERAL AND INDEPENDENT DATA
77 The rest of ths soluto s optoal ad s usually ot meat whe the asymptotcs of ^ s cocered. Before proceedg to dervg ts asymptotc dstrbuto, we would lke to mark out that ( ^) p! ad P u p! for ay >. Usg the same algebra as before we get p (^ ) A p X (u ); sce the other terms coverge probablty to zero. Usg the CLT, we get where m 4 = E u 4 p (^ 4 ; provded that t s te. ) d! N (; m 4 );. Power treds. The OLS estmator sats es ^ = We ca see that E[^ x = ] = ad! x " = p =! += " : = = V[^ ] =! + : = = If V[^ ]! ; the estmator ^ wll be cosstet. Ths wll occur whe < + ( ths R T case the cotuous aalog t dt T (+) of the rst sum squared dverges faster or coverges slowler tha the cotuous aalog R T t + dt T ++ of the secod sum, as ( + ) < + + f ad oly f < + ). I ths case the asymptotc dstrbuto s +( )= (^ ) = p! +( )= += " = =! d! lm! + = = + by Lyapuov s CLT for depedet heterogeeous observatos, provded that P = 3(+)= =3 ( P = + ) =! A as!, whch s sats ed as R T = t + dt T (++)= dverges faster or coverges R T =3 slowler tha t 3(+)= dt T (3(+)=+)=3. POWER TRENDS 77
78 . The GLS estmator sats es ~ = = x! = x " = p! = " : = = Aga, E[ ~ ] = ad V[ ~ ] =! : = = The estmator ~ wll be cosstet uder the same codto,.e. whe < +. I ths case the asymptotc dstrbuto s +( )= ( ~ ) = p! +( )= = " = =! d! lm! + = = A by Lyapuov s CLT for depedet heterogeeous observatos. 78 ASYMPTOTIC THEORY: GENERAL AND INDEPENDENT DATA
79 . ASYMPTOTIC THEORY: TIME SERIES. Treded vs. d ereced regresso. The OLS estmator ^ ths case s ^ = P T t= (y t y)(t t) P T t= (t t) : Now, Because ^ = T 3 P t t (T P ; T P! T t t 3 P t t) T 3 P t t (T P t " tt t t) T P t " : t TX t = t= T (T + ) ; TX t = t= T (T + )(T + ) ; 6 t s easy to see that the rst vector coverges to (; 6). Therefore, T 3= (^ ) = (; 6) p X t T " t : T Assumg that all codtos for the CLT for heterogeous martgale d erece sequeces (e.g., Hamlto (994) Tme seres aalyss, proposto 7.8) hold, we d that because Cosequetly, lm T p T lm T T X t= TX t= lm T t T " t " t t V T " t d! N = lm T TX V [" t ] = ; t= TX t C T " t; " t t= T 3= (^ )! (; 6) N = lm T t ; 3 ; 3. For the regresso d ereces, the OLS estmator s = T So, T ( ) = " T " D(; ): TX t= " t TX t= TX t= ; t = T 3 ; t T = : (y t y t ) = + " T " : T = N (; ): ASYMPTOTIC THEORY: TIME SERIES 79
80 3. Whe T s su cetly large, ^ A N ; ;, ad D : It s easy to see that T 3 for large T; the (approxmate) varace of the rst estmator s less tha that of the secod. Itutvely, t s much easer to detfy the tred of a tredg sequece tha the drft of a drftg sequece. T. Log ru varace for AR() The log ru varace s V ze = P + j= C [z te t ; z t j e t j ] : Because e t ad z t are scalars, depedet at all lags ad leads, ad E [e t ] = ; we have C [z t e t ; z t j e t j ] = E [z t z t j ] E [e t e t j ] : Let for smplcty z t also have zero mea. The for j ; E [z t z t j ] = j z z z ad E [e t e t j ] = j e e e ; where z ; z; e ; e are AR() parameters. To sum up, V ze = z e z e +X j= jjj z jjj e = + z e ( z e ) ( z) ( e) z e: To estmate V ze ; d the OLS estmates ^ z ; ^ z; ^ e ; ^ e from AR() regressos ad plug them. The resultg ^V ze wll be cosstet by the Cotuous Mappg Theorem..3 Asymptotcs of averages of AR() ad MA() Note that y t ca be rewrtte as y t = P + j= j x t j :. () y t s ot a MDS relatve to ow past fy t ; y t ; : : :g as t s correlated wth older y t s; () z t s a MDS relatve to fx t ; x t 3 ; : : :g, but s ot a MDS relatve to ow past fz t ; z t ; : : :g, because z t ad z t are correlated through x t.. () By the CLT for the geeral statoary ad ergodc case, p d T y T! N (; qyy ), where q yy = P + j= C[y t; y t j ]: It ca be show that for a AR() process, {z } =, j = j j = j. Therefore, q yy = P + j= j = : () By the CLT for the geeral ( ) statoary ad ergodc case, p d T z T! N (; qzz ), where q zz = + + P + j= j {z} = ( + ) + = ( + ). 3. If we have cosstet estmates ^, ^, ^ of,,, we ca estmate q yy ad q zz cosstetly by ^ ( ^) ad ^ ( + ^) ; respectvely. Note that these are postve umbers by costructo. Alteratvely, we could use robust estmators, lke the Newey West oparametrc estmator, gorg addtoal formato that we have. But uder the crcumstaces ths seems to be less e cet. = 8 ASYMPTOTIC THEORY: TIME SERIES
81 4. For vectors, () p d T y T! N (; Qyy ), where Q yy = P + j= C[y t; y t j ] {z } : j But = P + j= Pj P j, j = P j f j >, ad j = j = P jjj f j <. Hece Q yy = + P + j= Pj + P + j= P j = + (I P) P + P (I P ) = (I P) (I P ) ; () p T z T d! N (; Q zz ), where Q zz = + + = = (I + )(I + ). As for estmato of asymptotc varaces, t s evdetly possble to costruct cosstet estmators of Q yy ad Q zz that are postve de te by costructo..4 Asymptotcs for mpulse respose fuctos. For the AR() process, we get by repeated substtuto X y t = j " t j : j= Sce the weghts decle expoetally, ther sum absolutely coverges to a te costat. The IRF s IRF (j) = j ; j : The ARMA(,) process wrtte va lag polyomals s of whch the MA() represetato s z t = L L " t; z t = " t + ( ) X " t : Sce the weghts decle expoetally, ther sum absolutely coverges to a te costat. The IRF s IRF () = ; IRF (j) = ( ) j ; j > :. The estmate based o the OLS estmator ^ of, s [IRF (j) = ^ j : Sce p T (^ ) N ; ; we ca derve usg the Delta-method that p T IRF [ d! (j) IRF (j) N ; j (j ) = d! as T! whe j ; ad [IRF () IRF () = : 3. Deote e t = " t " t : Sce ^! p ad ^! p (ths wll be show below), we ca costruct cosstet estmators as [IRF () = ; [IRF (j) = ^ ^ ^ j ; j > : Evdetly, [IRF () has a degeerate dstrbuto. To derve the asymptotc dstrbuto of [IRF ^ (j) for j > ; let us rst derve the asymptotc dstrbuto of ^; : Cosstecy ca be show easly: ^ = + T P T t=3 z t e t T P p T t=3 z! + E [z t e t ] t z t E [z t z t ] = ; ASYMPTOTICS FOR IMPULSE RESPONSE FUNCTIONS 8
82 P ^ T + ^ = t= ^e t^e t P T t= ^e t = : : : expaso of ^e t : : : p! + : Sce the soluto of a quadratc equato s a cotuous fucto of ts coe cets, cosstecy of ^ obtas. To derve the asymptotc dstrbuto, we eed the followg compoet: X T e t e t E [e t e t ] e t E e A t! d N (; ) ; T t=3 z t e t where s a 3 3 varace matrx whch s a fucto of ; ; " ad = E " 4 t (dervato s very tedous; oe should accout for seral correlato summads). Next, from examg the formula de g ^; droppg the terms that do ot cotrbute to the asymptotc dstrbuto, we ca d that p T ^ + ^! A p = + T (^ ) + p T X (et e t E [e t e t ]) + 3 p T X e t E e t for certa costats ; ; 3 : It follows by the Delta-method that p A= + T ^ p ^! T + ^ + A p X X = T (^ ) + p (et e t E [e t e t ]) + 3 p e t T T E e t for certa costats ; ; 3 : It follows that p ^ T ^ A = X T z t e t e t E e t e t e t E [e t e t ] A d! N ; : for certa 3 matrx : Fally, applyg the Delta-method aga, we get p T IRF [ A= (j) IRF (j) p ^ d T! N ; ^ ; for certa vector : 8 ASYMPTOTIC THEORY: TIME SERIES
3. BOOTSTRAP

3.1 Brief and exhaustive

1. The mentioned difference indeed exists, but it is not the principal one. The two methods have some common features like computer simulations, sampling, etc., but they serve completely different goals. The bootstrap is an alternative to analytical asymptotic theory for making inferences, while Monte Carlo is used for studying finite-sample properties of estimators or tests.

2. After some point, raising $B$, the number of bootstrap repetitions, does not help, since the bootstrap distribution is intrinsically discrete, and raising $B$ cannot smooth things out. Even more than that: if we are interested in quantiles (we usually are), then a quantile for a discrete distribution is an interval, and the uncertainty about which point to choose to be a quantile does not disappear when we raise $B$.

3. There is no such thing as a "bootstrap estimator". Bootstrapping is a method of inference, not of estimation. The same goes for an "asymptotic estimator".

3.2 Bootstrapping t-ratio

The percentile interval is $CI_{\%} = \big[\hat\theta - \tilde q^*_{1-\alpha/2},\ \hat\theta - \tilde q^*_{\alpha/2}\big]$, where $\tilde q^*_\alpha$ is a bootstrap $\alpha$-quantile of $\hat\theta^* - \hat\theta$, i.e. $\alpha = P^*\{\hat\theta^* - \hat\theta \leq \tilde q^*_\alpha\}$. Then $\tilde q^*_\alpha/s(\hat\theta)$ is the $\alpha$-quantile of $(\hat\theta^* - \hat\theta)/s(\hat\theta) = T^*$, since
$$P^*\left\{\frac{\hat\theta^* - \hat\theta}{s(\hat\theta)} \leq \frac{\tilde q^*_\alpha}{s(\hat\theta)}\right\} = \alpha.$$
But by construction, the $\alpha$-quantile of $T^*$ is $q^*_\alpha$, hence $\tilde q^*_\alpha = s(\hat\theta)q^*_\alpha$. Substituting this into $CI_{\%}$ we get the confidence interval as given in the problem.

3.3 Bootstrap bias correction

1. The bootstrap version $\bar x^*$ of $\bar x$ has mean $\bar x$ with respect to the EDF: $E^*[\bar x^*] = \bar x$. Thus the bootstrap version of the bias (which is itself zero) is
$$Bias^*(\bar x^*) = E^*[\bar x^*] - \bar x = 0.$$
Therefore, the bootstrap bias corrected estimator of $\mu$ is $\bar x - Bias^*(\bar x^*) = \bar x$.

Now consider the bias of $\bar x^2$:
$$Bias(\bar x^2) = E[\bar x^2] - \mu^2 = V[\bar x] = \frac{V[x]}{n}.$$
Thus the bootstrap version of the bias is the sample analog of this quantity:
$$Bias^*(\bar x^{*2}) = \frac{\widehat{V}[x]}{n} = \frac{1}{n^2}\sum_{i=1}^{n}(x_i - \bar x)^2.$$
Therefore, the bootstrap bias corrected estimator of $\mu^2$ is
$$\bar x^2 - Bias^*(\bar x^{*2}) = \left(1 + \frac{1}{n}\right)\bar x^2 - \frac{1}{n^2}\sum_{i=1}^{n} x_i^2.$$

2. Since the sample average is an unbiased estimator of the population mean for any distribution, the bootstrap bias correction for $\bar z$ will be zero, and thus the bias-corrected estimator for $E[z]$ will be $\bar z$ (cf. part 1). Next note that the bootstrap distribution of $z^*$ is 0 with probability $\frac12$ and 3 with probability $\frac12$, so the bootstrap distribution of $\bar z^{*2} = \frac14(z_1^* + z_2^*)^2$ is 0 with probability $\frac14$, $\frac94$ with probability $\frac12$, and 9 with probability $\frac14$. Thus the bootstrap bias estimate is $\frac{27}{8} - \frac94 = \frac98$, and the bias corrected version is $\frac14(z_1 + z_2)^2 - \frac98$.

3.4 Bootstrap in linear model

1. Due to the assumption of random sampling, there cannot be unconditional heteroskedasticity. If conditional heteroskedasticity is present, it does not invalidate the nonparametric bootstrap. The dependence of the conditional variance on the regressors is not destroyed by bootstrap resampling, as the data $(x_i, y_i)$ are resampled in pairs.

2. When we bootstrap an inconsistent estimator, its bootstrap analogs are concentrated more and more around the probability limit of the estimator, and thus the estimate of the bias becomes smaller and smaller as the sample size grows. That is, bootstrapping is able to correct the bias caused by finite sample nonsymmetry of the distribution, but not the asymptotic bias (the difference between the probability limit of the estimator and the true parameter value).

3. As a point estimate we take $\hat g(x) = x'\hat\beta$, where $\hat\beta$ is the OLS estimator for $\beta$. To pivotize $\hat g(x)$, observe that
$$\sqrt{n}\,x'(\hat\beta - \beta) \xrightarrow{d} N\!\left(0,\ x'E[xx']^{-1}E[xx'e^2]E[xx']^{-1}x\right),$$
so the appropriate statistic to bootstrap is
$$t_g = \frac{x'(\hat\beta - \beta)}{se(\hat g(x))}, \qquad se(\hat g(x)) = \sqrt{x'\Big(\sum_i x_i x_i'\Big)^{-1}\sum_i x_i x_i'\hat e_i^2\Big(\sum_i x_i x_i'\Big)^{-1}x}.$$
The bootstrap version is
$$t_g^* = \frac{x'(\hat\beta^* - \hat\beta)}{se^*(\hat g^*(x))}, \qquad se^*(\hat g^*(x)) = \sqrt{x'\Big(\sum_i x_i^* x_i^{*\prime}\Big)^{-1}\sum_i x_i^* x_i^{*\prime}\hat e_i^{*2}\Big(\sum_i x_i^* x_i^{*\prime}\Big)^{-1}x}.$$
The rest is standard, and the confidence interval is
$$CI_t = \Big[x'\hat\beta - q^*_{1-\alpha/2}\,se(\hat g(x)),\ \ x'\hat\beta - q^*_{\alpha/2}\,se(\hat g(x))\Big],$$
where $q^*_{\alpha/2}$ and $q^*_{1-\alpha/2}$ are appropriate bootstrap quantiles for $t_g^*$.
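Below is a hedged Python sketch of the pairs (nonparametric) bootstrap percentile-t interval for $x'\beta$ described in part 3 above; the heteroskedastic data-generating process, the evaluation point and the number of repetitions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)
n, B, alpha = 400, 999, 0.10
X = np.column_stack((np.ones(n), rng.normal(size=n)))
beta = np.array([1.0, 2.0])
y = X @ beta + (1 + 0.5 * np.abs(X[:, 1])) * rng.normal(size=n)  # heteroskedastic errors
x0 = np.array([1.0, 1.0])                    # point at which g(x) = x'beta is evaluated

def ols_and_se(X, y, x0):
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y
    e = y - X @ b
    meat = X.T @ (X * e[:, None] ** 2)       # sum of x_i x_i' e_i^2
    se = np.sqrt(x0 @ XtX_inv @ meat @ XtX_inv @ x0)   # White-type standard error
    return b, se

b_hat, se_hat = ols_and_se(X, y, x0)
t_star = np.empty(B)
for j in range(B):
    idx = rng.integers(0, n, n)              # resample (x_i, y_i) pairs
    b_s, se_s = ols_and_se(X[idx], y[idx], x0)
    t_star[j] = (x0 @ (b_s - b_hat)) / se_s

q_lo, q_hi = np.quantile(t_star, [alpha / 2, 1 - alpha / 2])
ci = (x0 @ b_hat - q_hi * se_hat, x0 @ b_hat - q_lo * se_hat)
print(x0 @ beta, ci)
```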
3.5 Bootstrap for impulse response functions

1. For each $j$ simulate the bootstrap distribution of the absolute value of the t-statistic
$$|t_j| = \frac{\sqrt{T}\,\big|\hat\rho^{\,j} - \rho^j\big|}{j\,|\hat\rho|^{\,j-1}\sqrt{1-\hat\rho^2}},$$
the bootstrap analog of which is
$$|t_j^*| = \frac{\sqrt{T}\,\big|\hat\rho^{*j} - \hat\rho^{\,j}\big|}{j\,|\hat\rho^*|^{\,j-1}\sqrt{1-\hat\rho^{*2}}};$$
read off the bootstrap quantiles $q^*_{j,1-\alpha}$ and construct the symmetric percentile-t confidence interval
$$\left[\hat\rho^{\,j} \pm q^*_{j,1-\alpha}\,j\,|\hat\rho|^{\,j-1}\sqrt{\frac{1-\hat\rho^2}{T}}\right].$$

2. Most appropriate is the residual bootstrap, when bootstrap samples are generated by resampling estimates of the innovations $\varepsilon_t$. The corrected estimates of the IRFs are
$$\widetilde{IRF}(j) = 2\,(\hat\rho - \hat\theta)\,\hat\rho^{\,j-1} - \frac{1}{B}\sum_{b=1}^{B}(\hat\rho_b^* - \hat\theta_b^*)\,\hat\rho_b^{*\,j-1},$$
where $\hat\rho_b^*,\hat\theta_b^*$ are obtained in the $b$-th bootstrap repetition by using the same formulae as used for $\hat\rho,\hat\theta$, but computed from the corresponding bootstrap sample.
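A minimal sketch of the residual-bootstrap bias correction of the kind used in part 2, specialized for brevity to the AR(1) impulse response $\rho^j$ (the ARMA(1,1) case works the same way with the corresponding formulas); the simulated sample, the horizon $j$ and the number of repetitions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
T, rho, j, B = 200, 0.7, 5, 999

# Simulate an AR(1) sample
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + rng.normal()

def ar1_ols(y):
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

rho_hat = ar1_ols(y)
eps_hat = y[1:] - rho_hat * y[:-1]
eps_hat = eps_hat - eps_hat.mean()            # center the estimated innovations

irf_hat = rho_hat ** j
irf_boot = np.empty(B)
for b in range(B):
    e_star = rng.choice(eps_hat, size=T)      # residual (not pairs) resampling
    y_star = np.zeros(T)
    for t in range(1, T):
        y_star[t] = rho_hat * y_star[t - 1] + e_star[t]
    irf_boot[b] = ar1_ols(y_star) ** j

irf_corrected = 2 * irf_hat - irf_boot.mean()  # bootstrap bias correction
print(rho ** j, irf_hat, irf_corrected)
```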
4. REGRESSION AND PROJECTION

4.1 Regressing and projecting dice

(i) The joint distribution is
$$(x, y) = \begin{cases} (0, 1) & \text{with probability } 1/6, \\ (2, 2) & \text{with probability } 1/6, \\ (0, 3) & \text{with probability } 1/6, \\ (4, 4) & \text{with probability } 1/6, \\ (0, 5) & \text{with probability } 1/6, \\ (6, 6) & \text{with probability } 1/6. \end{cases}$$

(ii) The best predictor is
$$E[y|x] = \begin{cases} 3 & x = 0, \\ 2 & x = 2, \\ 4 & x = 4, \\ 6 & x = 6, \end{cases}$$
and undefined for all other $x$.

(iii) To find the best linear predictor, we need $E[x] = 2$, $E[y] = \frac72$, $E[xy] = \frac{28}{3}$, $V[x] = \frac{16}{3}$, so $\beta = C[x,y]/V[x] = \frac{7}{16}$, $\alpha = \frac{21}{8}$, and $BLP[y|x] = \frac{21}{8} + \frac{7}{16}x$.

(iv)
$$U_{BP} = \begin{cases} -2 & \text{with probability } 1/6, \\ 0 & \text{with probability } 2/3, \\ 2 & \text{with probability } 1/6, \end{cases} \qquad U_{BLP} = \begin{cases} -13/8 & \text{with probability } 1/6, \\ -12/8 & \text{with probability } 1/6, \\ 3/8 & \text{with probability } 1/6, \\ -3/8 & \text{with probability } 1/6, \\ 19/8 & \text{with probability } 1/6, \\ 6/8 & \text{with probability } 1/6, \end{cases}$$
so $E[U_{BP}^2] = \frac43$ and $E[U_{BLP}^2] \approx 1.9$. Indeed, $E[U_{BP}^2] < E[U_{BLP}^2]$.
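The comparison in part (iv) can be double-checked by brute force over the six equally likely outcomes; the short Python check below reproduces $\alpha = 21/8$, $\beta = 7/16$, $E[U_{BP}^2] = 4/3$ and $E[U_{BLP}^2] = 91/48 \approx 1.9$ exactly with rational arithmetic (it presumes the joint distribution as reconstructed above).

```python
from fractions import Fraction as F

# Six equally likely outcomes of (x, y) as in part (i)
support = [(0, 1), (2, 2), (0, 3), (4, 4), (0, 5), (6, 6)]
p = F(1, 6)

Ex  = sum(p * x for x, _ in support)
Ey  = sum(p * y for _, y in support)
Exy = sum(p * x * y for x, y in support)
Vx  = sum(p * x * x for x, _ in support) - Ex ** 2
beta = (Exy - Ex * Ey) / Vx
alpha = Ey - beta * Ex                       # BLP[y|x] = alpha + beta * x

# Best predictor: conditional mean of y given x
def bp(x0):
    pts = [y for x, y in support if x == x0]
    return F(sum(pts), len(pts))

mse_bp  = sum(p * (y - bp(x)) ** 2 for x, y in support)
mse_blp = sum(p * (y - alpha - beta * x) ** 2 for x, y in support)
print(alpha, beta, mse_bp, mse_blp)          # 21/8, 7/16, 4/3, 91/48
```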
88 4. Mxture of ormals. We eed to compute E [y] ; E [x] ; C [x; y] ; V [x] : Let us deote the rst evet A; the secod evet A: The E [y] = p E [yja] + ( p) E yj A = p 4 + ( p) = 4p; E [x] = p E [xja] + ( p) E xj A = p + ( p) 4 = 4 4p; E [xy] = p (E [xja] E [yja] + C [x; yja]) + ( p) E xj A E yj A + C x; yj A = ; C [x; y] = E [xy] E [x] E [y] = 6p ( p) ; E x = p E [xja] + V [xja] + ( p) E xja + V xja = 7 6p; V [x] = E x E [x] = + 6p 6p : Summarzg, C [x; y] = = V [x] + 6p 6p ; = E [y] E [x] = 4 p ) BLP [yjx] = 4 + 6p 6p + + 6p 6p p + 6p 6p x; ;. If E [yjx] was lear, t would cocde wth BLP [yjx] : But take, for example, the value of BLP [yjx] for some bg x: Ths value wll be egatve. But, gve ths large x; the mea of y caot be egatve, as t should clearly be betwee ad 4. Cotradcto. To derve E [yjx] ; oe have to wrte that f x y = p exp f (x) = p p exp x (y 4) x + ( p) + ( p) p exp exp (x 4) (x 4) ; y ; so f (yjx) = p p exp x (y 4) + ( p) exp (x 4) y : p exp x + ( p) exp (x 4) The codtoal expectato s the tegral E [yjx] = p Z+ y Oe ca compute ths Maple R to get p exp x (y 4) + ( p) exp (x 4) y dy (x 4) E [yjx] = p exp x + ( p) exp 4 + (p ) exp(4x 8) : It has a form of a smooth trasto regresso fucto, ad has horzotal asymptotes 4 ad as x! ad x! +; respectvely. 88 REGRESSION AND PROJECTION
89 4.3 Beroull regressor Note that E [yjx] = ; x = ; ; x = ; = ( x) + x = + ( ) x ad E y jx = + ; x = ; + ; x = ; = x: These expectatos are lear x because the support of x has oly two pots, ad oe ca always draw a straght le through two pots. The reaso s ot codtoal ormalty! 4.4 Best polyomal approxmato The FOC are h E E [yjx] x : : : k x k ( ) h E E [yjx] x : : : k x k ( x) = ; = ; h. E E [yjx] x : : : k x k x k = ; or, usg the LIME, h E y x : : : k x k = ; h E y x : : : k x k x = ; h. E y x : : : k x k x k = ; from where = ( ; ; : : : ; k ) = E zz E [zy] ; where z = ; x; : : : ; x k : The best k th order polyomal approxmato s the BPA k [yjx] = z = z E zz E [zy] : The assocated predcto error u k has the propertes that follow from the FOC: E [zu k ] = ;.e. t has zero mea ad s ucorrelated wth x; x ; : : : ; x k : BERNOULLI REGRESSOR 89
90 4.5 Hadlg codtoal expectatos. By the LIME, E [yjx; z] = + x + z: Thus we kow that the lear predcto y = + x + z + e y, the predcto error e y s ucorrelated wth the predctors,.e. C [e y ; x] = C [e y ; z] =. Cosder the lear predcto of z by x: z = + x + e z, C [e z ; x] =. But sce C [z; x] =, we kow that = : Now, f we learly predct y oly by x, we wll have y = + x + ( + e z ) + e y = + + x + e z + e y : Here the composte error e z + e y s ucorrelated wth x ad thus s the best lear predcto error. As a result, the OLS estmator of s cosstet. Checkg the propertes of the secod opto s more volved. Notce that the OLS coe cets the lear predcto of y by x ad w coverge probablty to plm ^ = x ^! xw xw w xy = x wy xw xw w so we ca see that plm ^ = + xw wz x w : xw Thus geeral the secod opto gves a cosstet estmator. x xw + wz. Because ^E [xjz] = g(z) s a strctly creasg ad cotuous fucto, g () exsts ad E [xjz] = s equvalet to z = g (): If ^E[yjz] = f(z); the ^E [yje [xjz] = ] = f(g ()): ; 9 REGRESSION AND PROJECTION
5. LINEAR REGRESSION AND OLS

5.1 Fixed and random regressors

1. It simplifies a lot. First, we can use simpler versions of LLNs and CLTs; second, we do not need additional conditions apart from existence of some moments. For example, for consistency of the OLS estimator in the linear mean regression model $y = x'\beta + e$, $E[e|x] = 0$, only existence of moments is needed, while in the case of fixed regressors we (i) have to use the LLN for heterogeneous sequences, (ii) have to add the condition $\frac{1}{n}\sum_{i=1}^{n} x_i x_i' \to M$ as $n \to \infty$.

2. The economist is probably right about treating the regressors as random if he/she has a random sampling experiment. But the reasoning is ridiculous. For a sampled individual, his/her characteristics (whether true or false) are fixed; randomness arises from the fact that this individual is randomly selected.

3. The OLS estimator is unbiased conditional on the whole sample of x-variables, irrespective of how the x's are generated. The conditional unbiasedness property implies unbiasedness.

5.2 Consistency of OLS under serially correlated errors

(i) Indeed,
$$E[u_t] = E[y_t - \beta y_{t-1}] = E[y_t] - \beta E[y_{t-1}] = 0$$
and
$$C[u_t, y_{t-1}] = C[y_t - \beta y_{t-1}, y_{t-1}] = C[y_t, y_{t-1}] - \beta V[y_{t-1}] = 0.$$

(ii) Now let us show that $\hat\beta$ is consistent. Because $E[y_t] = 0$, it immediately follows that
$$\hat\beta = \frac{\frac{1}{T}\sum_{t=2}^{T} y_t y_{t-1}}{\frac{1}{T}\sum_{t=2}^{T} y_{t-1}^2} = \beta + \frac{\frac{1}{T}\sum_{t=2}^{T} u_t y_{t-1}}{\frac{1}{T}\sum_{t=2}^{T} y_{t-1}^2} \xrightarrow{p} \beta + \frac{E[u_t y_{t-1}]}{E[y_{t-1}^2]} = \beta.$$

(iii) To show that $u_t$ is serially correlated, consider
$$C[u_t, u_{t-1}] = C[y_t - \beta y_{t-1}, y_{t-1} - \beta y_{t-2}] = \beta\,\big(\beta\,C[y_t, y_{t-1}] - C[y_t, y_{t-2}]\big),$$
which is generally not zero unless $\beta = 0$ or $\beta = \dfrac{C[y_t, y_{t-2}]}{C[y_t, y_{t-1}]}$. As an example of a serially correlated $u_t$ take the AR(2)-type process $y_t = \rho y_{t-2} + \varepsilon_t$, where $\varepsilon_t$ is a strict white noise. Then $\beta = 0$ and thus $u_t = y_t$, serially correlated.

(iv) The OLS estimator is inconsistent if the error term is correlated with the right-hand-side variables. This is not the same as serial correlatedness of the error term.
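The message of the solution above — the projection coefficient is estimated consistently even though the projection errors are serially correlated — shows up clearly in a simulation of the lag-two example from part (iii), where $\beta = 0$. The Python sketch below is illustrative only; the AR coefficient, burn-in and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(10)

def beta_hat(T, a=0.8):
    """OLS slope of y_t on y_{t-1} when y_t = a*y_{t-2} + eps_t (so beta = 0)."""
    y = np.zeros(T + 200)
    eps = rng.normal(size=T + 200)
    for t in range(2, T + 200):
        y[t] = a * y[t - 2] + eps[t]
    y = y[200:]                                    # drop burn-in
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

for T in (100, 1000, 10000, 100000):
    print(T, beta_hat(T))                          # approaches beta = 0

# The projection errors u_t = y_t - beta*y_{t-1} = y_t are serially correlated
# (at lag 2), yet OLS converges to the projection coefficient.
```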
92 5.3 Estmato of lear combato. Cosder the class of lear estmators,.e. oe havg the form ~ = AY; where A depeds oly o data X = ((; x ; z ) : : : (; x ; z ) ) : The codtoal ubasedess requremet yelds the codto AX = (; c x ; c z ); where = (; ; ) : The best lear ubased estmator s ^ = (; cx ; c z )^; where ^ s the OLS estmator. Ideed, ths estmator belogs to the class cosdered, sce ^ = (; c x ; c z ) (X X ) X Y = A Y for A = (; c x ; c z ) (X X ) X ad A X = (; c x ; c z ): Besdes, V h^jx = (; c x ; c z ) X X (; cx ; c z ) ad s mmal the class because the key relatoshp (A A ) A = holds.. Observe that p ^ = (; c x ; c z ) p d! ^ N ; V^ ; where V^ = + x + z x z ; x = (E[x] c x ) = p V[x]; z = (E[z] c z ) = p V[z]; ad s the correlato coe cet betwee x ad z: 3. Mmzato of V^ wth respect to yelds 8 >< x opt = z >: z x f f x z x z < ; : 4. Multcollearty betwee x ad z meas that = ad ad are udet ed. A mplcato s that the asymptotc varace of ^ s te. 5.4 Icomplete regresso. Note that y = x + z + : We kow that E [jz] = ; so E [z] = : However, E [x] 6= uless = ; because = E[xe] = E [x (z + )] = E [xz ] + E [x] ; ad we kow that E [xz ] 6= : The regresso of y o x ad z yelds the OLS estmates wth the probablty lmt where p lm ^ ^ Q = E [x] = + Q ; E[xx ] E[xz ] E[zx ] E[zz ] We ca see that the estmators ^ ad ^ are geeral cosstet. To be more precse, the cosstecy of both ^ ad ^ s proportoal to E [x] ; so that uless = (or, more subtly, uless les the ull space of E [xz ]), the estmators are cosstet. : 9 LINEAR REGRESSION AND OLS
93 . The rst step yelds a cosstet OLS estmate ^ of because the OLS estmator s cosstet a lear mea regresso. At the secod step, we get the OLS estmate X X X X X ^ = z z z^e = z z z e z x ^ = = + X z z X z X z x ^ : p Sce P z z! E [zz ] ; P z x have that ^ s cosstet for : p! E [zx ] ; P z p! E [z] = ; ^ p! ; we Therefore, from the pot of vew of cosstecy of ^ ad ^; we recommed the secod method. The lmtg dstrbuto of p (^ ) ca be deduced by usg the Delta-method. Observe that p X! (^ ) = z X = z X z x X! x x X = x e A ad X z p d! N x e E[zz ; ] E[zx e] E[xz e] E[xx ] : Havg appled the Delta-method ad the Cotuous Mappg Theorems, we get p d (^ )! N ; E[zz ] V E[zz ] ; where V = E[zz ] + E[zx ] E[xx ] E[xz ] E[zx ] E[xx ] E[xz e] E[zx e] E[xx ] E[xz ]: 5.5 Geerated coe cet Observe that p ^ = = x! p = x u p (^ )! x z : = Now, by the LLN, ad = x p! x; p x u = p (^ ) d! N x z = p! xz ; x by the CLT. We ca assert that the covergece here s jot (.e., as of a vector sequece) because of depedece of the compoets. Because of ther depedece, ther jot CDF s just a product of margal CDFs, ad potwse covergece of these margal CDFs mples potwse GENERATED COEFFICIENT 93
94 covergece of the jot CDF. Ths s mportat, sce geerally weak covergece of compoets of a vector sequece separately does ot mply jot weak covergece. Now, by the Slutsky theorem, p = x u p (^ ) Applyg the Slutsky theorem aga, we d: d x z! N ; x + xz : = p d! ^ x N ; x + xz = N ; + xz x 4 x Note how the prelmary estmato step a ects the precso estmato of other parameters: the asymptotc varace s blow up. The mplcato s that sequetal estmato makes ave (.e. whch gore prelmary estmato steps) stadard errors vald. : 5.6 OLS olear model Observe that E [yjx] = + x; V [yjx] = ( + x) : Cosequetly, we ca use the usual OLS estmator ad Whte s stadard errors. By the way, the model y = ( + x)e ca be vewed as y = + x + u; where u = ( + x)(e ); E [ujx] = ; V [ujx] = ( + x) : 5.7 Log ad short regressos Let us deote ths estmator by : We have = XX X Y = XX X (X + X + E) = = + X X X X + X X X E : Sce E [ex ] = ; we have that X E! p by the LLN. Also, by the LLN, X X p! E [x x ] ad X X p! E [x x ] : Therefore, p! + E x x E x x : So, geeral, s cosstet. It wll be cosstet f les the ull space of E [x x ] : Two specal cases of ths are: () whe = ;.e. whe the true model s Y = X + e; () whe E [x x ] = : 5.8 Rdge regresso. There s codtoal bas: E[ jx ~ ] = (X X + I k ) X E [YjX ] = (X X + I k ), uless =. Next, E[ ] ~ = E[(X X + I k ) ] 6= uless = : Therefore, estmator s geeral based. 94 LINEAR REGRESSION AND OLS
95 . Observe that ~ = (X X + I k ) X X + (X X + I k ) X E = X x x + I k! X x x +! X x x + I X k x " : Sce P x x that s, ~ s cosstet. 3. The math s straghtforward: p! E [xx ] ; P x " p! E [x"] = ; =! ; we have: ~ p! E xx E xx + E xx = ;! p ( ~ X ) = x x + I k p + {z } {z} # p # (E [xx ]) d! N ; Q xx Q xxe Q xx :! X x x + I k {z } # p (E [xx ]) h 4. The codtoal mea squared error crtero E ( ~ ) jx estmator, h E (^ N ca be used. ) jx = V[^] = (X X ) X X (X X ) : For the rdge estmator, h E ( ~ ) jx = X X + I k X X + X X + I k : X p x " {z } # d ; E xx " For the OLS By the rst order approxmato, f s small, (X X + I k ) (X X ) (I k (X X ) ). Hece, h E ( ~ ) jx (X X ) (I (X X ) )(X X )(I (X X ) )(X X ) h That s E (^ E[(^ ) ] (X X ) [X X (X X ) + (X X ) X X ](X X ) : h ) jx E ( ~ ) jx = A; where A s postve de te (exercse: show ths). Thus for small ; ~ s a preferable estmator to ^ accordg to the mea squared error crtero, despte ts basedess. 5.9 Icosstecy uder alteratve We are terestg the questo whether the t-statstcs ca be used to check H : = : I order to aswer ths questo we have to vestgate the asymptotc propertes of ^. Frst of all, uder the ull ^ p! C[z; y] V[z] = V[x] V[z] = : INCONSISTENCY UNDER ALTERNATIVE 95
96 It s straghtforward to show that uder the ull the covetoal stadard error correctly estmates (.e. f correctly ormalzed, s cosstet for) the asymptotc varace of ^. That s, uder the ull t d! N (; ) ; whch meas that we ca use the covetoal t-statstcs for testg H. 5. Returs to schoolg. There s othg wrog dvdg sg cat estmates by each other. I fact, ths s the correct way to get a pot estmate for the rato of parameters, accordg to the Delta-method. Of course, the Delta-method wth g u u = u u should be used full to get a asymptotc co dece terval for ths rato, f eeded. But the co dece terval wll ecessarly be bouded ad cetered at 3 ths example.. The perso from the audece s probably rght hs feelgs ad tetos. But the suggested way of mplemetg those deas are full of mstakes. Frst, to compare coe cets a separate regresso, oe eeds to combe them to oe regresso usg dummy varables. But what s eve more mportat there s o place for the sup-wald test here, because the dummy varables are totally kow ( d eret words, the threshold s kow ad eed ot be estmates as threshold regressos). Secod, computg stadard errors usg the bootstrap wll ot chage the approxmato prcple because the perso wll stll use asymptotc crtcal values. So othg wll essetally chage, uless full scale bootstrappg s used. To say othg about the fact that practce hardly the sg cace of may coe cets wll chage after swtchg from asymptotcs to bootstrap. 96 LINEAR REGRESSION AND OLS
6. HETEROSKEDASTICITY AND GLS

6.1 Conditional variance estimation

In the OLS case, the method works not because each $\sigma^2(x_i)$ is estimated by $\hat e_i^2$, but because
$$\frac1n X'\hat\Sigma X=\frac1n\sum_{i}x_ix_i'\hat e_i^2$$
consistently estimates $E[xx'e^2]=E[xx'\sigma^2(x)]$. In the GLS case, the same trick does not work:
$$\frac1n X'\hat\Sigma^{-1}X=\frac1n\sum_{i}\frac{x_ix_i'}{\hat e_i^2}$$
can potentially consistently estimate $E[xx'/e^2]$, but this is not the same as $E[xx'/\sigma^2(x)]$. Of course, $\hat e_i^2$ cannot consistently estimate $\sigma^2(x_i)$; econometrician B is right about this, but the trick in the OLS case works for a completely different reason.

6.2 Exponential heteroskedasticity

1. At the first step, get $\hat\beta$, a consistent estimate of $\beta$ (for example, by OLS). Then construct $\hat\sigma_i^2=\exp(x_i'\hat\beta)$ for all $i$ (we do not need $\exp(\alpha)$ since it is a multiplicative scalar that eventually cancels out) and use these weights at the second step to construct a feasible GLS estimator of $\beta$:
$$\tilde\beta=\left(\sum_{i}\hat\sigma_i^{-2}x_ix_i'\right)^{-1}\sum_{i}\hat\sigma_i^{-2}x_iy_i.$$

2. The feasible GLS estimator is asymptotically efficient, since it is asymptotically equivalent to GLS. It is finite-sample inefficient, since we changed the weights from what GLS presumes.

6.3 OLS and GLS are identical

1. Evidently, $E[Y|X]=X\beta$ and $\Omega=V[Y|X]=X\Gamma X'+\sigma^2I_n$. Since the latter depends on $X$, we are in a heteroskedastic environment.

2. The OLS estimator is
$$\hat\beta=(X'X)^{-1}X'Y,$$
and the GLS estimator is
$$\tilde\beta=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}Y.$$
98 Frst, X E b = X Y X X X X Y = X Y X X X X X Y = X Y X Y = : Premultply ths by X : X X b E = : Add b E to both sdes ad combe the terms o the left-had sde: X X + I be b E = b E: Now predvdg by matrx gves be = b E: Premultply oce aga by X to get or just X b E =. Recall ow what b E s: = X b E = X b E; X Y = X X X X X Y whch mples ^ =. ~ The fact that the two estmators are detcal mples that all the statstcs based o the two wll be detcal ad thus have the same dstrbuto. 3. Evdetly, ths model the cocdece of the two estmators gves uambguous superorty of the OLS estmator. I spte of heteroskedastcty, t s e cet the class of lear ubased estmators, sce t cocdes wth GLS. The GLS estmator s worse sce ts feasble verso requres estmato of, whle the OLS estmator does ot. Addtoal estmato of adds ose whch may spol te sample performace of the GLS estmator. But all ths s ot typcal for rakg OLS ad GLS estmators ad s a result of a specal form of matrx. 6.4 OLS ad GLS are equvalet. Whe X = X, we have X X = X X ad X = X, so that V h^jx = X X X X X X = X X ad h V ~jx = X X = X X = X X :. I ths example, ad X = ( + ( : : : = : : : : : : )) (; ; : : : ; ) = X ; where ; = ( + ( )): Thus oe does ot eed to use GLS but stead do OLS to acheve the same te-sample e cecy. 98 HETEROSKEDASTICITY AND GLS
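The coincidence of OLS and GLS under $\Omega=X\Gamma X'+\sigma^2I_n$ can be checked numerically with a quick simulation sketch; the particular $\Gamma$, $\sigma^2$ and $\beta$ below are arbitrary and not part of the original solution.

import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))
Gamma = np.diag([1.0, 2.0, 0.5])            # any symmetric positive semidefinite Gamma
sigma2 = 1.5
Omega = X @ Gamma @ X.T + sigma2 * np.eye(n)
y = rng.multivariate_normal(X @ np.ones(k), Omega)

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
Oinv = np.linalg.inv(Omega)
beta_gls = np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)

print(np.max(np.abs(beta_ols - beta_gls)))   # numerically zero up to rounding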
99 6.5 Equcorrelated observatos Ths s essetally a repetto of the secod part of the prevous problem, from whch t follows that uder the crcumstaces x the best lear codtoally (o a costat whch s the same as ucodtoally) ubased estmator of because of cocdece of ts varace wth that of the GLS estmator. Appealg to the case whe jj > (whch s temptg because the the varace of x s larger tha that of, say, x ) s vald, because t s ruled out by the Cauchy Schwartz equalty. Oe caot appeal to the usual LLNs because x s o-ergodc. The varace of x s V [x ] = +! as! ; so the estmator x s geeral cosstet (except the case whe = ). For a example of cosstet x, assume that > ad cosder the followg costruct: u = " + & ; where & IID(; ) ad " (; ) depedet of & for all : The the correlato structure s exactly as the problem, ad P p u! "; a radom ozero lmt. 6.6 Ubasedess of certa FGLS estmators (a) = E [z z] = E [z] + E [ z] = E [z] + E [z] = E [z]. It follows that E [z] = : (b) E [q (")] = E [ q ( ")] = E [ q (")] = E [q (")] : It follows that E [q (")] = : Cosder ~ F = X ^ X X ^ E: Let ^ be a estmate of whch s a fucto of products of least squares resduals,.e. for M = I ad E ad ^ = F MEE M = H EE X (X X ) X : Codtoal o X, the expresso E have the same codtoal dstrbuto. Hece by (b), h E ~F = : X ^ X X ^ E s odd E, EQUICORRELATED OBSERVATIONS 99
101 7. VARIANCE ESTIMATION 7. Whte estmator. Yes, oe should use the Whte formula, but ot because Q xx does ot make sese. It does make sese, but s poorly related to the OLS asymptotc varace, whch geeral takes the sadwch form. It s ot true that vares from observato to observato, f by we mea ucodtoal varace of the error term.. The rst part of the clam s totally correct. But avalablty of the t or Wald statstcs s ot eough to do ferece. We eed crtcal values for these statstcs, ad they ca be obtaed oly from some dstrbuto theory, asymptotc partcular. 7. HAC estmato uder homoskedastcty Uder codtoal homoskedastcty, E x t x t je t e t j = E E x t x t je t e t j jx t ; x t ; : : : = E x t x t je [e t e t j jx t ; x t ; : : :] = j E x t x t j : Usg ths, the log-ru varace of x t e t wll be +X E x t x X + t je t e t j = j E x t x t j ; j= j= ad ts Newey West-type HAC estmator wll be +mx j= m jjj \ ^ m + j E hx t x t j ; where h \ E x t x t ^ j = T j = T m(t;t +j) X t=max(;+j) m(t;t +j) X t=max(;+j) y t x t x t j: x t^ y t j x ^ t j ; It s ot clear whether the estmate s postve de te. VARIANCE ESTIMATION
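The HAC estimator imposing conditional homoskedasticity might be coded as in the sketch below. It is one possible reading of the formulas above: the function assumes a vector of OLS residuals e, a Bartlett-type weight, and a truncation lag m; all of these are choices of this illustration rather than part of the original solution.

import numpy as np

def hac_homoskedastic(X, e, m):
    # Newey-West-type estimate of the long-run variance of x_t * e_t
    # imposing conditional homoskedasticity: the j-th term multiplies a
    # separate estimate of E[e_t e_{t-j}] by an estimate of E[x_t x_{t-j}'].
    T, k = X.shape
    S = np.zeros((k, k))
    for j in range(-m, m + 1):
        w = 1.0 - abs(j) / (m + 1.0)              # Bartlett kernel weight
        lo, hi = max(0, j), T + min(0, j)         # valid t-range for lag j
        sigma_j = np.mean(e[lo:hi] * e[lo - j:hi - j])
        Exx_j = X[lo:hi].T @ X[lo - j:hi - j] / (hi - lo)
        S += w * sigma_j * Exx_j
    return S

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
e = rng.normal(size=200)                          # i.i.d. errors for the demo
print(hac_homoskedastic(X, e, m=4))

As noted above, nothing in this construction guarantees that the resulting matrix is positive definite.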
102 7.3 Expectatos of Whte ad Newey West estmators IID settg The Whte formula (apart from the factor ) s ^V^ = X X X = x x ^e X X : Because ^e = e x (^ ); we have ^e = e x (^ )e + x (^ )(^ ) x : Also recall that ^ = (X X ) P j= x je j : Hece, " # " # " # E x x ^e jx = E x x e jx E x x x (^ )e jx = = = " # +E x x x (^ )(^ ) x jx = = x x = x X X x ; because E[(^ )e jx ] = (X X ) X E [e EjX ] = (X X ) x ad E[(^ )(^ ) jx ] = (X X ) : Fally, # E[ ^V^jX ] = X X E " = x x ^e jx X X = X X X x x x X X x = X X : Let! j = jjj=(m + ): The Newey West estmator of the asymptotc varace matrx of ^ wth lag trucato parameter m s V^ = (X X ) ^S (X X ) ; where ^S = +mx j= m m(;+j) X! j =max(;+j) x x j e x (^ ) e j x j(^ ) : Thus E[ ^SjX ] = = +mx j= m +mx j= m m(;+j) X! j =max(;+j) m(;+j) X! j =max(;+j) = X X +m X j= m x x x x h je e x (^ ) e j x j(^ ) jx m(;+j) X! j =max(;+j) h I fj=g + x E (^ )(^ ) jx x j x h(^ E )e j jx x j he E (^ )jx x X X x j x x j: A Fally, E[ V^jX ] = X X X X +m X j= m m(;+j) X! j =max(;+j) x X X x j x x j X X : VARIANCE ESTIMATION
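The finite-sample expectation derived above can be eyeballed in a small Monte Carlo sketch: with homoskedastic errors and a fixed design, the average of the White estimator across replications stays close to, but slightly below, $\sigma^2(X'X)^{-1}$ because of the leverage terms. The design, the true coefficients and the number of replications below are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(1)
n, k, sigma2, R = 40, 2, 1.0, 20000
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # design held fixed across replications
XtX_inv = np.linalg.inv(X.T @ X)
target = sigma2 * XtX_inv                                # sigma^2 (X'X)^{-1}

acc = np.zeros((k, k))
for _ in range(R):
    e = rng.normal(scale=np.sqrt(sigma2), size=n)
    y = X @ np.array([1.0, 2.0]) + e
    b = XtX_inv @ X.T @ y
    ehat = y - X @ b
    meat = (X * ehat[:, None] ** 2).T @ X                # sum of x_i x_i' ehat_i^2
    acc += XtX_inv @ meat @ XtX_inv

print(acc / R)     # diagonal entries slightly below those of target
print(target)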
8. NONLINEAR REGRESSION

8.1 Local and global identification

The quasi-regressor is $g_\theta=(1,\ 2\beta x)'$. The local ID condition that $E[g_\theta g_\theta']$ is of full rank is satisfied, since it is equivalent to $\det E[g_\theta g_\theta']=4V[\beta x]\ne0$, which holds due to $\beta\ne0$ and $V[x]\ne0$. But the global ID condition fails because the sign of $\beta$ is not identified: together with the true pair $(\alpha,\beta)$, another pair $(\alpha,-\beta)$ also minimizes the population least squares criterion.

8.2 Identification when regressor is nonrandom

In the linear case, $Q_{xx}=E[x^2]=a^2$, a scalar. Its rank (i.e. it itself) equals zero if and only if $\Pr\{x=0\}=1$, i.e. when $a=0$; then the identification condition fails. When $a\ne0$, the identification condition is satisfied. Graphically, when all points lie on a vertical line, we can unambiguously draw a line from the origin through them, except when all points are lying on the ordinate axis.

In the nonlinear case, $Q_{gg}=E[g_\theta(x,\theta)g_\theta(x,\theta)']=g_\theta(a,\theta)g_\theta(a,\theta)'$, a $k\times k$ matrix. This matrix is a square of a vector having rank 1, hence its rank can be only one or zero. Hence, if $k>1$ (there is more than one parameter), this matrix cannot be of full rank, and identification fails. Graphically, there is an infinite number of curves passing through a set of points on a vertical line. If $k=1$ and $g_\theta(a,\theta)\ne0$, there is identification; if $k=1$ and $g_\theta(a,\theta)=0$, there is identification failure (see the linear case). Intuition in the case $k=1$: if marginal changes in $\theta$ shift the only regression value $g(a,\theta)$, it can be identified; if they do not shift it, many values of $\theta$ are consistent with the same value of $g(a,\theta)$.

8.3 Cobb-Douglas production function

1. This idea must have come from a temptation to take logs of the production function:
$$\log Q-\log L=\log\theta+\alpha(\log K-\log L)+\log\varepsilon.$$
But the error in this equation, $e=\log\varepsilon-E[\log\varepsilon]$, even after centering, may not have the property $E[e|K,L]=0$, or even $E[(K,L)'e]=0$. As a result, OLS will generally give an inconsistent estimate of $\alpha$.

2. Note that the regression function is
$$E[Q|K,L]=\theta K^\alpha L^{1-\alpha}E[\varepsilon|K,L]=\theta K^\alpha L^{1-\alpha}.$$
The regression is nonlinear, but using the concentration method leads to the algorithm described in the problem. Hence, this suggestion gives a consistent estimate of $\alpha$. [Remark: in practice, though, a researcher would probably also include an intercept to be more confident about the validity of the specification.]
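One way to implement the concentration idea for the regression $E[Q|K,L]=\theta K^\alpha L^{1-\alpha}$ is sketched below: grid over $\alpha$, estimate $\theta$ by no-intercept OLS for each grid point, and minimize the sum of squared residuals. The data-generating process and the grid are made up for illustration and may differ in details from the algorithm stated in the problem.

import numpy as np

def fit_cobb_douglas(Q, K, L, grid=np.linspace(0.01, 0.99, 99)):
    # Concentration sketch for E[Q|K,L] = theta * K^a * L^(1-a):
    # for each a on the grid, theta is the no-intercept OLS slope of Q on
    # K^a * L^(1-a); a is chosen to minimize the sum of squared residuals.
    best = None
    for a in grid:
        w = K ** a * L ** (1 - a)
        theta = (w @ Q) / (w @ w)
        ssr = np.sum((Q - theta * w) ** 2)
        if best is None or ssr < best[0]:
            best = (ssr, a, theta)
    return best[1], best[2]

# tiny simulated example with true (alpha, theta) = (0.3, 2.0)
rng = np.random.default_rng(2)
K, L = rng.lognormal(size=200), rng.lognormal(size=200)
Q = 2.0 * K ** 0.3 * L ** 0.7 + rng.normal(scale=0.1, size=200)
print(fit_cobb_douglas(Q, K, L))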
104 8.4 Expoetal regresso The local IC s sats ed: the matrx Q gg = E exp ( + exp ( + x) 4 3 = = exp () x E x x s vertable. The asymptotc dstrbuto s ormal wth varace matrx V NLLS = exp () I : = exp () I The cocetrato algorthm uses the grd o : For each o ths grd, we ca estmate exp ( ()) by OLS from the regresso of y o exp (x) ; so the estmate ad sum of squared resduals are P = ^ () = log exp (x ) y P = exp (x ) ; SSR () = (y exp (^ () + x )) : = Choose such ^ that yelds mmum value of SSR () o the grd. Set ^ = ^(^): The stadard errors se (^) ad se(^) ca be computed as square roots of the dagoal elemets of SSR ^! x exp ^ + ^x : = Note that we caot use the above expresso for V NLLS sce practce we do ot kow the dstrbuto of x ad that = : x x 8.5 Power regresso Uder H : = ; the parameter s ot det ed. Therefore, the Wald (or t) statstc does ot have a usual asymptotc dstrbuto, ad we should use the sup-wald statstc sup W = supw(); where W() s the Wald statstc for = whe the udet ed parameter s xed at value. The asymptotc dstrbuto s o-stadard ad ca be obtaed va smulatos. 8.6 Smple trasto regresso. The margal uece ( + =( + 3 = x= 3 ( + 3 x) = 3 : x= 4 NONLINEAR REGRESSION
105 So the ull s H : 3 + = : The t-statstc s t = ^ ^3 + se(^ ^3 ) ; where ^ ad ^ 3 are elemets of the NLLS estmator, ad se(^ ^3 ) s a stadard error for ^ ^3 whch ca be computed from the NLLS asymptotcs ad Delta-method. The test rejects whe jtj > q N(;) = :. The regresso fucto does ot deped o x whe, for example, H : = : As uder H the parameter ^ 3 s ot det ed, ferece s ostadard. The Wald statstc for a partcular value of 3 s ad the test statstc s ; where the lmtg dstrbuto D s obtaed va smu- The test rejects whe sup W > q D latos. W ( 3 ) =! ^ ; se(^ ) sup W = sup 3 W ( 3 ) : 8.7 Nolear cosumpto fucto. Whe = ; the parameter s ot det ed, ad whe = ; the parameters ad are ot separately det ed. A thrd stuato occurs f the support of y t les etrely above or etrely below ; the parameters ad are ot separately det ed (ote: a separato of ths to two d eret stuatos s ufar).. Ru the cocetrato method wth respect to the parameter : The quas-regressor s 3. ; I fy t > g ; y t ; y t log y t : Whe costructg the stadard errors, besde usg the quasregressor we have to apply HAC estmato of asymptotc varace, because the regresso error does ot have a martgale d erece property. Ideed, ote that the formato y t ; y t ; y t 3 ; : : : does ot clude c t ; c t ; c t 3 ; : : : ad therefore, for j > ad e t = c t E [c t jy t ; y t ; y t 3 ; : : :] E [e t e t j ] = E [E [e t e t j jy t ; y t ; y t 3 ; : : :]] 6= E [E [e t jy t ; y t ; y t 3 ; : : :] e t j ] = ; as e t j s ot ecessarly measurable relatve to y t ; y t ; y t 3 ; : : : : (a) Asymptotcally, t^= = ^ se(^) s dstrbuted as N (; ). Thus, reject f t^= < q N(;) ; or, equvaletly, ^ < + se(^)q N(;) : (b) Bootstrap ^ whose bootstrap aalog s ^ ^; get the left -quatle q; ad reject f ^ < + q: NONLINEAR CONSUMPTION FUNCTION 5
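Returning to the concentration algorithm for the exponential regression problem above, $y=\exp(\alpha+\beta x)+e$, a Python sketch of the grid search could look as follows; the simulated parameter values and the grid are arbitrary choices for illustration.

import numpy as np

def exp_reg_concentrate(y, x, beta_grid):
    # Grid/concentration sketch for y = exp(alpha + beta*x) + e: for each
    # beta, exp(alpha(beta)) is the no-intercept OLS slope of y on exp(beta*x);
    # beta is chosen to minimize the concentrated sum of squared residuals.
    records = []
    for b in beta_grid:
        ebx = np.exp(b * x)
        alpha = np.log((ebx @ y) / (ebx @ ebx))
        ssr = np.sum((y - np.exp(alpha + b * x)) ** 2)
        records.append((ssr, b, alpha))
    return min(records)[1:]                      # (beta_hat, alpha_hat)

rng = np.random.default_rng(3)
x = rng.normal(size=300)
y = np.exp(0.5 + 0.8 * x) + rng.normal(scale=0.2, size=300)
print(exp_reg_concentrate(y, x, np.linspace(0.0, 1.5, 151)))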
107 9. EXTREMUM ESTIMATORS 9. Regresso o costat For the rst estmator use stadard LLN ad CLT: ^ = p y! E[y] = (cosstecy), Deote Cosder p (^ ) = p = d e! N (; V[e]) = N ; (asymptotc ormalty). = ^ = arg m b y = ( log b + b y ; = y = The FOC for ths problem gves after rearragemet: ) (y b) : (9.) = y : = q ^ + ^y y =, ^ = y y + 4y : The two values ^ correspod to the two d eret solutos of a local mmzato problem populato: E log b + b (y b)! m, b = E[y] p E[y] b + 4E[y ] = ; : (9.) Note that the SOC are sats ed at both b ; so both are local mma. Drect comparso yelds that b + = s a global mmum. Hece, the extremum estmator ^ = ^ + s cosstet for. The asymptotcs of ^ ca be foud usg the theory of extremum estmators. For f(y; b) = log b + b (y b) b) = (y b) # (y b) b b 3 b ) E = 6 ; Cosequetly, Cosder f(y; b) 6(y b) 8(y f(y; = b 4 + b 3 ) = 6 : p (^ ^ 3 = ) d! N arg m b ; 9 : f(y ; b); = EXTREMUM ESTIMATORS 7
108 where y f(y; b) = b : Note that The FOC s whch = y b 3 + y b f(y; = 6y b 4 ; = ; 4y b 3 : ^b = y y ; ad the estmate s ^ 3 = ^b = y y p! E[y ] E[y] = : To d the asymptotc varace calculate # ) E 6 6 ; E f(y; = 4 : The dervatves are take at pot b = because ; ad ot ; s the soluto of the extremum problem m b E[f(y; b)]. As follows from our dscusso, p d (^b )! N ; 4 ) p (^ 3 )! d N ; 4 4 : A safer way to obta ths asymptotcs s probably to chage varable the mmzato problem from the begg: y ^ 3 = arg m ; b b = ad proceed as above. No oe of these estmators s a pror asymptotcally better tha the others. The dea behd these estmators s: ^ s just the usual OLS estmator, ^ s the ML estmator usg codtoal ormalty: yjx N (; ): The thrd estmator may be thought of as the WNLLS estmator wth the codtoal varace fucto (x; b) = b ; though t s ot exactly that: oe should dvde by (x; ) the costructo of WNLLS. 9. Quadratc regresso Note that we have codtoal homoskedastcty. The regresso fucto s g(x; ) = ( +x). The estmator ) h s NLLS, wth = ( + x). The Q gg = E (@g(x; )=@) = 3. Therefore, p d ^! N (; 3 8 ). The estmator ~ s a extremum oe, wth h(x; y; ) = y ( + x) l( + x) : 8 EXTREMUM ESTIMATORS
109 Frst we check the det cato codto. y; = y ( + x) 3 + x ; so the FOC to the populato problem y; ) + x = E ( + x) 3 ; whch equals zero f ad oly f =. It ca be ver ed that the Hessa s egatve o all B, hece we have a global maxmum. Note that the ID codto would ot be sats ed f the true parameter was d eret from zero. Thus, ~ works oly for =. Next, The = h(x; y; = " y # x 3 x 6Y ( + x) 4 + ( + x) : = 3 4 ; = E Therefore, p ~ d! N (; 3 6 ). We ca see that ^ asymptotcally domates ~. 6y x 4 + x = : 9.3 Nolearty at left had sde. The FOCs for the NLLS problem are = P = (y + P = (y + ^) = 4 = = (y + ^) ^x (y + ^) ; = (y + ^) ^x x : Cosder the rst of these. The assocated populato aalog s = E [e (y + )] ; ad t does ot follow from the model structure. The model mples that ay fucto of x s ucorrelated wth the error e; but y + = p x + e s geerally correlated wth e. The valdty of populato codtos o whch the estmator s based leads to cosstecy. The model d ers from a olear regresso that the dervatve of e wth respect to parameters s ot oly a fucto of x; the codtog varable, but also of y; whle a olear regresso t s (t equals mus the quas-regressor).. Let us select a just detfyg set of strumets whch oe would use f the left had sde of the equato was ot squared. Ths correspods to the use the followg momet codtos mpled by the model: e = E : ex NONLINEARITY AT LEFT HAND SIDE 9
110 The correspodg CMM estmator results from applyg the aalogy prcple: (y + ~) ~ x = (y + ~) : x ~ x = Accordg to the GMM asymptotc theory, (~; ) ~ s cosstet for (; ) ad asymptotcally ormal wth asymptotc varace V = (E [y] + ) E [x] (E [yx] + E [x]) E E [x] x E [x] E x (E [y] + ) (E [yx] + E [x]) E [x] E x : 9.4 Least fourth powers Cosder the populato level objectve fucto h E (y bx) 4 = E h(e + ( b) x) 4 = E he 4 + 4e 3 ( b) x + 6e ( b) x + 4e ( b) 3 x 3 + ( b) 4 x 4 = E e ( b) E e x + ( b) 4 E x 4 ; where some of the terms dsappear because of the depedece of x ad e ad symmetry of the dstrbuto of e. The last two terms the objectve fucto are oegatve, ad are zero f ad oly f (assumg that x has a odegeerate dstrbuto) b = : Thus the (global) ID codto s sats ed. The squared score ad secod (y bx)4 A = 6e 6 (y bx) = e x ; b= b= wth expectatos 6E e 6 E x ad E e E x : Accordg to the propertes of extremum estmators, ^ s cosstet ad asymptotcally ormally dstrbuted wth asymptotc varace V^ = E e E x 6E e 6 E x E e E x E e 6 = 9 (E [e ]) E [x ] : Whe x ad e are ormally dstrbuted, V^ = 5 3 The OLS estmator s also cosstet ad asymptotcally ormally dstrbuted wth asymptotc varace (ote that there s codtoal homoskedastcty) V OLS = E e E [x ] : Whe x ad e are ormally dstrbuted, whch s smaller tha V^: e x : V OLS = e ; x EXTREMUM ESTIMATORS
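The variance ranking in the least fourth powers problem is easy to check by simulation: with normal $x$ and $e$, the scaled variance of the least-fourth-powers estimator should be about $5/3$ of that of OLS, as derived above. The sketch below uses scipy's bounded scalar minimizer, which is only one of many ways to minimize the quartic criterion; the sample size and number of replications are arbitrary.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
n, beta, R = 500, 1.0, 500
b_ols, b_l4 = [], []
for _ in range(R):
    x = rng.normal(size=n)
    e = rng.normal(size=n)
    y = beta * x + e
    b_ols.append((x @ y) / (x @ x))                      # OLS without intercept
    b_l4.append(minimize_scalar(lambda b: np.sum((y - b * x) ** 4),
                                bounds=(-5.0, 5.0), method="bounded").x)

print(n * np.var(b_ols))   # approx 1   (= sigma_e^2 / sigma_x^2)
print(n * np.var(b_l4))    # approx 5/3, in line with the asymptotic variances above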
111 9.5 Asymmetrc loss We rst eed to make sure that we are cosstetly estmatg the rght thg. Assume coveetly that E [e] = to x the scale of : Let F ad f deote the CDF ad PDF of e; respectvely. Assume that these are cotuous. Note that y a x b 3 = e + a + x ( b) 3 = e 3 + 3e a + x ( b) +3e a + x ( b) + a + x ( b) 3 : Now, E y a x b Z = h E (y a x b) 3 jy a x b Prfy a x b g h ( )E (y a x b) 3 jy a x b < Prfy a x b < g df x Z e+ a+x ( E F ( a) x ( b) Z ( ) df x Z e+ a+x ( b)< E F ( a) x ( b) : e 3 + 3e ( a + x ( b)) +3e ( a + x ( b)) + ( a + x ( b)) A df ejx e 3 + 3e ( a + x ( b)) +3e ( a + x ( b)) + ( a + x ( b)) 3 A A df ejx Is ths mmzed at ad? The questo about global mmum s very hard to aswer. Let us restrct ourselves to the local optmum aalyss. Take the dervatves ad evaluate them at ad [ (y a x b)] b ; = 3 E e je ( F ()) + ( )E e je < F () ; E [x] where we used that tesmal chage of a ad b aroud ad does ot chage the sg of e + a + x ( b) : For cosstecy, we eed these dervatves to be zero. Ths holds f the expresso roud brackets s zero, whch s true whe E e je < E [e je ] = F () F () : Whe there s cosstecy, the asymptotc ormalty follows from the theory of extremum estmators. = 3u from the rght, ( ) from the (u)=@u from the rght, = 6u ( ) from the left, ASYMMETRIC LOSS
112 the expected dervatves of the extremum fucto are 3 E h h = E (y a (y a x b) 4 a a b b ; = 9 E e 4 je Prfe g + 9( ) E e 4 je < Prfe < g x x x x = 9E E e 4 je ( F ()) + ( ) E e 4 je < F () ; x x (y a x b) E [h ] = E 6 4 a a 7 b b ; = 6E e je Prfe g 6( )E e je < Prfe < g x x x x = 6E (E [eje ] ( F ()) ( )E [eje < ] F ()) : x x If the last expresso roud brackets s o-zero, ad o elemet of x s determstcally costat, the the local ID codto s sats ed. The formula for the asymptotc varace easly follows. EXTREMUM ESTIMATORS
113 . MAXIMUM LIKELIHOOD ESTIMATION. Normal dstrbuto Sce x ; : : : ; x are from N (; ); the loglkelhood fucto s ` = cost l jj (x ) = cost l jj = = x! x + : = The equato for the ML estmator s + x x =. The equato has two solutos > ; < : = x + qx + 4x ; = x qx + 4x : Note that ` s a symmetrc fucto of except for the term P = x : Ths term determes the soluto. If x > the the global maxmum of ` wll be at ; otherwse at : That s, the ML estmator s ^ ML = x + sg(x) qx + 4x : It s cosstet because, f 6= ; sg(x) p! sg() ad p ^ ML! E [x] + sg(e [x]) p (E [x]) + 4E [x ] = + sg() p + 8 = :. Pareto dstrbuto The loglkelhood fucto s ` = l ( + ) l x : = (a) The ML estmator ^ of s the soluto =. That s, ^ ML = =l x; whch s cosstet for ; because =l x! p =E [l x] = : The asymptotcs s p d! ^ML N ; I ; where the formato matrx s I = E [@s=@] = E = = = : (b) The Wald test for a smple hypothess s W = (^ ) I(^)(^ ) = (^ ) ^ d! : MAXIMUM LIKELIHOOD ESTIMATION 3
114 The Lkelhood Rato test statstc for a smple hypothess s LR = `(^) `( ) = l ^ (^ + ) l x l + ( + ) = = l ^ (^ )! d l x! : The Lagrage Multpler test statstc for a smple hypothess s " LM = s(x ; ) I( ) s(x ; ) = = = (^ ) ^ d! : = = Observe that W ad LM are umercally equal. =! l x = l x #.3 Comparso of ML tests Recall that for the ML estmator ^ ad the smple hypothess H : = ; ad W = (^ ) I(^)(^ ) LM = X s(x ; ) I( ) X. The desty of a Posso dstrbuto wth parameter s f(xj) = x x! e ; s(x ; ): so ^ ML = x; I() = =: For the smple hypothess wth = 3 the test statstcs are W = (x 3) ; LM = x ad W LM for x 3 ad W LM for x 3. X x 3! 3 = (x 3) ; 3. The desty of a expoetal dstrbuto wth parameter s f(xj) = e x ; so ^ ML = x; I() = = : For the smple hypothess wth = 3 the test statstcs are W = (x 3) x ; LM = X ad W LM for < x 3 ad W LM for x 3.! x 3 3 = 3 (x 3) ; 9 4 MAXIMUM LIKELIHOOD ESTIMATION
115 3. The desty of a Beroull dstrbuto wth parameter s f(xj) = x ( ) x ; so ^ ML = x; I() = ( ) : For the smple hypothess wth = the test statstcs are W = x x( x) ; LM = P x P! x = 4 x ; ad W LM (sce x( x) =4). For the smple hypothess wth = 3 the test statstcs are W = x 3 x( x) ; LM = P x P! x 3 3 = 9 x ; therefore W LM whe =9 x( x) ad W LM whe =9 x( x): Equvaletly, W LM for 3 x 3 ad W LM for < x 3 or 3 x :.4 Ivarace of ML tests to reparameterzatos of ull. Deote by the set of s that satsfy the ull. Sce f s oe-to-oe, s the same uder both parameterzatos of the ull. The the restrcted ad urestrcted ML estmators are varat to how H s formulated, ad so s the LR statstc.. Whe f s lear, f(h()) f() = F h() F = F h(); ad the matrx of dervatves of h traslates learly to the matrx of dervatves of g: G = F H; where F does ot deped o ts argumet x; ad thus eed ot be estmated. The W g = = =! g(^) = = G ^V^G! g(^) =! F h(^) F H ^V^H F F h(^) =!! h(^) H ^V^H h(^) = W h ; = but ths sequece of equaltes does ot work whe f s olear. 3. The W statstc for the reparameterzed ull equals W = ^ ^ ^ ^ ^! = C B ^ ^ ^! C A = ^ ^ ^ ^ + ^ ^! ; INVARIANCE OF ML TESTS TO REPARAMETERIZATIONS OF NULL 5
116 where bi = ; I b = : By choosg close to ^ ; we ca make W as close to zero as desred. The value of equal to (^ ^ = )=( = ) gves the largest possble value to the W statstc equal to ^ ^ ( ) = :.5 Msspec ed maxmum lkelhood. Method. It s straghtforward to derve the loglkelhood fucto ad see that the problem of ts maxmzato mples mmzato of the sum of squares of devatos of y from g (x; b) over b;.e. the NLLS problem. But we kow that the NLLS estmator s cosstet. Method. It s straghtforward to see that the populato aalog of the FOC for the ML problem s that the expected product of quas-regressor ad devato of y from g (x; ) equals zero, but ths system of momet codtos follows from the regresso model.. By costructo, t s a extremum estmator. It wll be cosstet for the value that solves the aalogous extremum problem populato: ^ p! arg max q E [f(zjq)] ; provded that ths s uque (f t s ot uque, o ce asymptotc propertes are expected). It s ulkely that ths lmt wll be at true : As a extremum estmator, ^ wll be asymptotcally ormal, although cetered aroud wrog value of the parameter: p ^ d! N ; V^ : 3. The parameter vector s q = (b ; c ) ;where c eters the assumed form of t : The codtoal desty s log f (y t ji t ; b; c) = log (y t x t (b; c) tb) : t (b; c) The codtoal score s s (y t ji t ; b; c) = (y t x tb) x t (y t x t (b; c) + tb) t (b; c) t (b; c) (y t x tb) t (b; c) t (b; t t (b; The ML estmator wll estmate such b ad c that make E [s (y t ji t ; b; c)] = : Wll ths b be equal to true? It s easy to see that f we evaluate the codtoal score at b =, ts expectato wll be V [yt ji t E B t (; c) (; c) t (; C V [yt ji t (; c) t (; c) t (; c) A t C A : 6 MAXIMUM LIKELIHOOD ESTIMATION
117 Ths may ot be zero whe t (; c) does ot cocde wth V [y t ji t ] for ay c; so t does ot ecessarly follow that the ML estmator of wll be cosstet. However, f t (; c) does ot deped o ; t s clear that the -part of ths wll be zero, ad the c-part wll be zero too for some c (ths codto wll de e the probablty lmt of ^c; the ML estmator of c). I ths case, the ML estmator of wll be cosstet. I fact, t s easy to see that t wll equal to a weghted least squares estmator wth weghts t (^c) ^ = X t x t x t t (^c)! X t x t y t t (^c):.6 Idvdual e ects The loglkelhood s ` ; : : : ; ; = cost log( ) (x ) + (y ) : = FOC gve so that Because ^ ML = 4 ^ ML = x + y ; ML = ^ ML = 4 (x ^ ML ) + (y ^ ML ) ; = (x y ) : = (x ) + (y ) (x )(y ) = p! = ; the ML estmator s cosstet. Why? The maxmum lkelhood method presumes a parameter vector of xed dmeso. I our case the dmeso stead creases wth the umber of observatos. The formato from ew observatos goes to estmato of ew parameters stead of ehacg precso of the already exstg oes. To costruct a cosstet estmator, just multply ^ ML by. There are also other possbltes..7 Irregular co dece terval The maxmum lkelhood estmator of s ^ML = x () maxfx ; : : : ; x g whose asymptotc dstrbuto s expoetal: F (^ML ) (t)! exp(t=) I ftg + I ft>g : INDIVIDUAL EFFECTS 7
118 The most elegat way to proceed s by pvotzg ths dstrbuto rst: F (^ML )= (t)! exp(t) I ftg + I ft>g : The left 5%-quatle for the lmtg dstrbuto s log(:5): Thus, wth probablty 95%, log(:5) (^ ML )= ; so the co dece terval for s x() ; ( + log(:5)=) x () :.8 Trval parameter space Sce the parameter space cotas oly oe pot, the latter s the optmzer. If = ; the the estmator ^ ML = s cosstet for ad has te rate of covergece. If 6= ; the the ML estmator s cosstet..9 Nusace parameter desty We eed to apply the Taylor s expaso twce,.e. for both stages of estmato. The FOC for the secod stage of estmato s s c (y ; x ; ~; ^ m ) = ; = where s c (y; x; ; log f c(yjx; ; ) s the codtoal score. Taylor s expaso wth respect the -argumet aroud yelds s c (y ; x ; ; ^ m ) + = = where les betwee ~ ad compoetwse. Now Taylor-expad the rst term aroud : s c (y ; x ; ; ^ m ) = = s c (y ; x ; ; ) + = where les betwee ^ m ad compoetwse. Combg the two peces, we get: p (~ ) = = c (y ; x ; ; ^ c (y ; x ; ; ^ m (~ ) = ;! s c (y ; x ; ; ) + = = c (y ; x ; ; (^ m c (y ; x ; ; ) (^m ) : Now let!. Uder ULLN for the secod dervatve of the log of the codtoal desty, the rst factor coverges probablty to (Ic ), where log f c (yjx; E ; There 8 MAXIMUM LIKELIHOOD ESTIMATION
119 are two terms sde the brackets that have otrval dstrbutos. We wll compute asymptotc varace of each ad asymptotc covarace betwee them. The rst term behaves as follows: p = s c (y ; x ; ; ) d! N (; I c ) due to the CLT (recall that the score has zero expectato ad the formato matrx equalty). Tur to the secod term. Uder the ULLN, ; ) c (y ; x ; coverges to Ic log f c (yjx; E ; Next, we kow from the MLE theory that p (^ m )! d N ; (Im where Im log f m (xj ) Fally, the asymptotc covarace term s zero because of the margal/codtoal relatoshp betwee the two terms, the Law of Iterated Expectatos ad zero expected score. Collectg the peces, we d: p (~ )! d N ; (Ic ) Ic + Ic (Im ) Ic (Ic ) : It s easy to see that the asymptotc varace s larger ( matrx sese) tha (Ic ) that would be the asymptotc varace f we ew the usace parameter. But t s mpossble to compare to the asymptotc varace for ^ c, whch s ot (Ic ).. MLE versus OLS. Observe that ^ OLS = P = y ; E [^ OLS ] = P = E [y] = ; so ^ OLS s ubased. Next, P = y p! E [y] =, so ^ OLS s cosstet. Yes, as we kow from the theory, ^ OLS s the best lear ubased estmator. Note that the members of ths class are allowed to be of the form fay s.t. AX = Ig ; where A s a costat matrx, sce there are o regressors besde the costat. There s o heteroskedastcty, sce there are o regressors to codto o (more precsely, we should codto o a costat,.e. the trval - eld, whch gves just a ucodtoal varace whch s costat by the radom samplg assumpto). The asymptotc dstrbuto s p (^OLS ) = p d e! N ; E x ; sce the varace of e s E e = E E e jx = E x :. The codtoal lkelhood fucto s L(y ; : : : ; y ; x ; : : : ; x ; ; ) = The codtoal loglkelhood s = Y = qx exp (y ) x : `(y ; : : : ; y ; x ; : : : ; x ; ; ) = cost = (y ) x log! max ; : MLE VERSUS OLS 9
120 From the FOC the ML = ^ ML = y x = = ; P = y =x P = =x : Note: t as equal to the OLS estmator of The asymptotc dstrbuto s p (^ML ) = p P = e =x P = =x y x = x + e x : d!! E x N ; E x = N ; E x : Note that ^ ML s ubased ad more e cet tha ^ OLS sce E x < E x ; but t s ot the class of lear ubased estmators, sce the weghts A ML deped o extraeous x s. The ^ ML s e cet a much larger class. Thus there s o cotradcto.. MLE versus GLS The feasble GLS estmator ~ s costructed by ~ = = x x (x ^)! = x y (x ^) : The asymptotc varace matrx s xx V ~ = E (x ) : The codtoal logdesty s ` x; y; b; s = cost log s log(x b) (y x b) s (x b) ; so the codtoal score s s x; y; b; s = s x; y; b; s = x x b + y x b xy (x b) 3 s ; s + s 4 (y x b) (x b) ; MAXIMUM LIKELIHOOD ESTIMATION
121 whose dervatves are s x; y; b; s = s x; y; b; s = xx (x b) 3y x b y xx (x b) 4 s ; y x b xy (x b) 3 s 4 ; s x; y; b; s = s 4 s 6 (y x b) (x b) : After takg expectatos, we d that the formato matrx s I = + xx E (x ) ; I = x E x ; I = 4 : By vertg a parttoed matrx, we d that the asymptotc varace of the ML estmator of s V ML = I I I I + xx x x = E (x ) E x E x : Now, V ML = xx xx x x E (x ) + E (x ) E x E x xx E (x ) = V~ ; where the equalty follows from E [aa ] E [a] E [a ] = E (a E [a]) (a E [a]). Therefore, V ~ V ML ;.e. the GLS estmator s less asymptotcally e cet tha the ML estmator. Ths s because gures both to the codtoal mea ad codtoal varace, but the GLS estmator gores ths formato.. MLE heteroskedastc tme seres regresso Observe that the jot desty factorzes: f (y t ; x t ; y t ; x t ; y t ; x t ; : : :) = f c (y t jx t ; y t ; x t ; y t ; x t ; : : :) f m (x t jy t ; x t ; y t ; x t ; : : :) f (y t ; x t ; y t ; x t ; : : :) : Assume that data (y t ; x t ), t = ; ; : : : ; T; are statoary ad ergodc ad geerated by y t = + x t + u t ; where u t jx t ; y t ; x t ; y t ; : : : N (; t ) ad x t jy t ; x t ; y t ; x t ; : : : N (; v): Expla, wthout gog to deep math, Sce the parameter v s ot preset the codtoal desty f c ; t ca be e cetly estmated from the margal desty f m ; whch yelds The stadard error may be costructed va ^v = T ^V = T TX x t : t= TX x 4 t ^v : t= MLE IN HETEROSKEDASTIC TIME SERIES REGRESSION
122 . If the etre fucto t = (x t ) s fully kow, the codtoal ML estmator of ad s the same as the GLS estmator: ^^ ML = TX t= t The stadard errors may be costructed va ^V ML = T TX x t t= t xt x t x t! xt x t X T y t t= t! : :. If the values of t at t = ; ; : : : ; T are kow, we ca use the same procedure as part, sce t does ot use values of (x t ) other tha those at x ; x ; : : : ; x T : 3. If t s kow that t = ( + x t ) ; we have addto parameters ad to be estmated jotly from the codtoal dstrbuto The loglkelhood fucto s ad ^; ^; ^; ^ ML y t jx t ; y t ; x t ; y t ; x t ; : : : N ( + x t ; ( + x t ) ): ` (; ; ; ) = cost ^^ TX t= log( + x t ) = arg max ` (; ; ; ). Note that (;;;) ML = TX t= (^ + ^x t ) x t xt x t! The stadard errors may be costructed va ^; ^; ^; ^; ^; ^; ^ ^V ML = (; ; ; (; ; ; ) t= TX t= T X t= x t (y t x t ) ( + x t ) ; y t (^ + ^x : t ) x t 4. Smlarly to part 3, f t s kow that t = + u t ; we have addto parameters ad to be estmated jotly from the codtoal dstrbuto y t jx t ; y t ; x t ; y t ; x t ; : : : N + x t ; + (y t x t ) : 5. If t s oly kow that t s statoary, codtoal maxmum lkelhood fucto s uavalable, so we have to use sube cet methods, for example, OLS estmato ^^ OLS = TX t= The stadard errors may be costructed va ^V OLS = T TX t= x t xt x t where ^e t = y t ^ OLS ^OLS x t :! x t xt x t TX t= x t! xt x t T X t= ^e t y t x t : : TX t= x t xt x t! ; MAXIMUM LIKELIHOOD ESTIMATION
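For part 3, a sketch of the joint conditional ML estimation on simulated data is given below; the skedastic function is the square of a linear function of x_t as specified above, but the parameter names (b1, b2, d1, d2), the optimizer, and the data-generating values are choices of this illustration, and the scale parameters are identified only up to a common sign change.

import numpy as np
from scipy.optimize import minimize

def neg_loglik(theta, y, x):
    # Negative conditional log-likelihood for y_t = b1 + b2*x_t + u_t,
    # u_t | past ~ N(0, (d1 + d2*x_t)^2), up to an additive constant.
    b1, b2, d1, d2 = theta
    s = d1 + d2 * x
    if np.any(np.abs(s) < 1e-8):
        return 1e10                      # keep the scale away from zero
    r = y - b1 - b2 * x
    return np.sum(np.log(s ** 2) + r ** 2 / s ** 2) / 2.0

rng = np.random.default_rng(5)
T = 1000
x = rng.normal(size=T)
u = (1.0 + 0.5 * x) * rng.normal(size=T)
y = 0.3 + 0.7 * x + u

res = minimize(neg_loglik, x0=np.array([0.0, 0.0, 1.0, 0.0]),
               args=(y, x), method="Nelder-Mead", options={"maxiter": 5000})
print(res.x)        # roughly (0.3, 0.7, 1.0, 0.5) up to the sign of (d1, d2)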
123 .3 Does the lk matter? Let the x varable assume two d eret values x ad x, u a = +x a ad ab = #fx = x a ; y = bg; for a; b = ; (.e., a;b s the umber of observatos for whch x = x a ; y = b). The log-lkelhood fucto s l(x ; ::x ; y ; : : : ; y ; ; ) = log Q = F ( + x ) y ( F ( + x )) y = = log F (u ) + log( F (u )) + log F (u ) + log( F (u )): (.) The FOC for the problem of maxmzato of l(: : : ; ; ) w.r.t. ad are: F (^u ) F (^u ) F (^u ) F (^u ) F (^u ) F (^u + ) F (^u ) F (^u ) x F (^u ) F (^u ) F (^u ) F (^u + x F (^u ) F (^u ) ) F (^u ) F (^u ) = ; = As x 6= x ; oe obtas for a = ; a F (^u a ) a F (^u a ) =, F (^ua ) = a, ^u a ^ + a + ^x a = F a a a + a (.) uder the assumpto that F (^u a ) 6= : Comparg (.) ad (.) oe sees that l(: : : ; ^; ^) does ot deped o the form of the lk fucto F (): The estmates ^ ad ^ ca be foud from (.): x F + x F + F + F + ^ = x x ; ^ = x x :.4 Maxmum lkelhood ad bary varables Sce the parameters the codtoal ad margal destes do ot overlap, we ca separate the problem. The codtoal lkelhood fucto s Y e z y e z y L(y ; : : : ; y ; z ; : : : ; z ; ) = + e z + e z ; ad the codtoal loglkelhood The rst order codto = `(y ; : : : ; y ; z ; : : : ; z ; = = y z fy z l( + e z )g : = z e z + e z = gves the soluto ^ = log ; where = #fz = ; y = g; = #fz = ; y = g: The margal lkelhood fucto s L(z ; : : : ; z ; ) = Y z ( ) z ; = DOES THE LINK MATTER? 3
124 ad the margal loglkelhood The rst order codto `(z ; : : : ; z ; ) = fz l + ( z ) l( )g : = = z P = ( z ) = gves the soluto ^ = P = z : From the asymptotc theory for ML, ^^ p d! ( ) ( + e ) e AA :.5 Maxmum lkelhood ad bary depedet varable. The codtoal ML estmator s The score s ^ ML = arg max c = arg max c ad the formato matrx s e cx y log + e cx + ( y ) log + e cx fcy x log ( + e cx )g : = = s(y; x; (yx log ( + ex )) = y I x; ) = E so the asymptotc dstrbuto of ^ ML s N ; I. e x ( + e x ) x e x + e x x;. The regresso s E [yjx] = Pfy = jxg + Pfy = jxg = ex : The NLLS estmator s + ex e cx ^ NLLS = arg m y c + e cx : = The asymptotc dstrbuto of ^ NLLS s N ; Qgg Q gge Qgg. Now, sce E e jx = V [yjx] = e x ( + e x ; we have ) e x Q gg = E ( + e x ) 4 x ; Q gge = E e x e x ( + e x ) 4 x E e jx ; e 3x = E ( + e x ) 6 x : 3. We kow that V [yjx] = ( + e x ; whch s a fucto of x: The WNLLS estmator of s ) ( + e x ) e cx ^ W NLLS = arg m c e x y + e cx : = 4 MAXIMUM LIKELIHOOD ESTIMATION
125 Note that there should be the true the weghtg fucto (or ts cosstet estmate a feasbleverso), but ot the parameter of choce c! The asymptotc dstrbuto s N ; Q ; where gg= e x e x Q gg= = E V [yjx] ( + e x ) 4 x = ( + e x ) x : 4. For the ML problem, the momet codto s zero expected score e x E y + e x x = : For the NLLS problem, the momet codto s the FOC (or o correlato betwee the error ad the quasregressor ) e x e x E y + e x ( + e x ) x = : For the WNLLS problem, the momet codto s smlar: e x E y + e x x = ; whch s magcally the same as for the ML problem. No woder that the two estmators are asymptotcally equvalet (see part 5). 5. Of course, from the geeral theory we have V MLE V W NLLS V NLLS. We see a strct equalty V W NLLS < V NLLS ; except maybe for specal cases of the dstrbuto of x, ad ths s ot surprsg. Surprsg may seem the fact that V MLE = V W NLLS : It may be surprsg because usually the MLE uses dstrbutoal assumptos, ad the NLLSE does ot, so usually we have V MLE < V W NLLS : I ths problem, however, the dstrbutoal formato s used by all estmators, that s, t s ot a addtoal assumpto made exclusvely for ML estmato..6 Posso regresso. The codtoal logdesty s log f(yjx; ; ) = (x; ; ) + y log (x; ; ) log y! = cost exp ( + x) + y ( + x) ; where cost does ot deped o parameters. Therefore, the urestrcted ML estmator s (^; ^) solvg (^; ^) = arg max fy (a + bx ) exp (a + bx )g ; (a;b) or equvaletly solvg the system = x y = y = = exp ^ + ^x x exp ^ + ^x = = ; = : POISSON REGRESSION 5
126 The loglkelhood fucto value s ` = = cost exp ^ + ^x + y ^ + ^x o = cost + (^ ) y + ^xy: The restrcted MLE estmator s (^ R ; ) where ^ R = arg max a wth the loglkelhood fucto value fy a = exp (a)g = log y; `R = cost exp ^ R + y ^ R = cost + y (log y ) : = O ths bass, LR = ` `R = (^ log y) y + ^xy : Now, the codtoal score log f(yjx; ; )=@ s(y; x; ; ) = = (y exp ( + x)) log f(yjx; ; )=@ x ad the true Iformato matrx s I = E exp ( + x) ; x x cosstetly estmated by (most easly, albet ot qute le wth prescrptos of the theory) x bi = y x x ; the bi = x x y x x x ; Therefore, W = ^ y x x : If we used ucostraed estmated b I ( le wth prescrptos of the theory), ths would be more complex. Fally, LM = = x (y (xy xy) = : y x x! x x y) y x x x = x (y y)!. The logdesty s log f(yjx; ; ) = cost exp (x) + y log ( + exp (x)) : 6 MAXIMUM LIKELIHOOD ESTIMATION
127 Therefore, the estmator used by the researcher s a extremum oe wth h(y; x; ; ) = exp (x) + y log ( + exp (x)) : The (msspec ed) codtoal score s h (y; x; a; b) = y (a + exp (bx)) : exp (bx) x Ths extremum estmator wll provde cosstet estmato of ( ; ) oly f = E [h (y; x; ; )] : Here may be ay because we care oly about cosstecy of estmate of : But, usg the law of terated expectatos, E [h (y; x; ; )] = E y ( + exp (x)) exp (x) x (exp () ) exp (x) = E ; + exp (x) exp (x) x whch s ot lkely to be zero. Ths llustrates that msspec cato of the codtoal mea leads to wrog ferece..7 Bootstrappg ML tests. I the bootstrap world, the costrat s g(q) = g(^ ML ); so LR = max q! max `(q) q; g(q)=g(^ ML ) ; where ` s the loglkelhood calculated from the bootstrap sample.. I the bootstrap world, the costrat s g(q) = g(^ ML ); so LM = =! s(z R ; ^ ML) bi =! s(z R ; ^ ML) ; R where ^ ML s the restrcted (subject to g(q) = g(^ ML )) ML bootstrap estmate ad I b s the bootstrap estmate of the formato matrx, both calculated from the bootstrap sample. No addtoal receterg s eeded because the zero expected score rule s exactly sats ed at the sample. BOOTSTRAPPING ML TESTS 7
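A sketch of the unrestricted and restricted Poisson ML fits and of the LR statistic on simulated data is given below. The numerical optimizer is just one way to solve the first-order conditions above; the simulated parameter values are arbitrary, and the restricted fit uses the closed form derived above, the log of the sample mean.

import numpy as np
from scipy.optimize import minimize

def neg_loglik(theta, y, x):
    # Poisson conditional log-likelihood (up to the log y! constant)
    # with lambda(x) = exp(a + b*x).
    a, b = theta
    lin = np.clip(a + b * x, -30.0, 30.0)    # guard against overflow in exp
    return -np.sum(y * lin - np.exp(lin))

rng = np.random.default_rng(6)
n = 2000
x = rng.normal(size=n)
y = rng.poisson(np.exp(0.2 + 0.5 * x))

unres = minimize(neg_loglik, x0=np.zeros(2), args=(y, x), method="BFGS")
a_r = np.log(np.mean(y))                     # restricted MLE under b = 0
LR = 2.0 * (-unres.fun + neg_loglik(np.array([a_r, 0.0]), y, x))
print(unres.x, LR)                           # LR is huge here since the true b is far from 0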
129 . INSTRUMENTAL VARIABLES. Ivald SLS. Sce E[u] =, we have E [y] = E z, so s det ed as log as z s ot determstc zero. The aalog estmator s ^ = X z! X y : Sce E[v] =, we have E[z] = E[x], so s det ed as log as x s ot cetered aroud zero. The aalog estmator s ^ = Sce does ot deped o x, we have = V s! X X x z : u, so s det ed. The aalog estmator v ^u ^u ; ^ = = ^v ^v where ^u = y ^z ad ^v = z ^x.. The estmator sats es ~ = X ^z 4! X ^z y = ^ 4 X We kow that P x u p! E x 4 + E x v, ad ^! p. Therefore, P x4 x 4! ^ X x y : p! E x 4, P x y = P x4 + P x3 v + P ~ p! + E x v E [x 4 ] 6= : x v + 3. Evdetly, we should t the estmate of the square of z, stead of the square of the estmate. To do ths, ote that the secod equato ad propertes of the model mply E z jx = E (x + v) jx = x + E [xvjx] + E v jx = x + v: That s, we have a lear mea regresso of z o x ad a costat. Therefore, the rst stage we should regress z o x ad a costat ad costruct ^z = ^ x + ^ v, ad the secod stage, we should regress y o ^z. Cosstecy of ths estmator follows from the theory of SLS, whe we treat z as a rght had sde varable, ot z. INSTRUMENTAL VARIABLES 9
130 . Cosumpto fucto The data are geerated by C t = Y t = + A t + e t; (.) + A t + e t; (.) where A t = I t +G t s exogeous ad thus ucorrelated wth e t : Deote e = V [e t ] ad A = V [A t] :. The probablty lmt of the OLS estmator of s p lm ^ = C [Y t; C t ] V [Y t ] = + C [Y t; e t ] V [Y t ] = + e A + e = + ( ) A + : e e The amout of cosstecy s ( ) e= A + e : Sce the MPC les betwee zero ad oe, the OLS estmator of s based upward.. Ecoometrca B s correct oe sese, but correct aother. Both strumetal vectors wll gve rse to estmators that have detcal asymptotc propertes. Ths ca be see by otg that populato the projectos of the rght had sde varable Y t o both strumets z; where = E[xz ] (E [zz ]), are detcal. Ideed, because (.) the I t ad G t eter through ther sum oly, projectg o (; I t ; G t ) ad o (; A t ) gves detcal tted values + A t: Cosequetly, the matrx Q xz Q zz Q xz that gures to the asymptotc varace wll be the same sce t equals E z ( z) whch s the same across the two strumetal vectors. However, ths does ot mea that the umercal values of the two estmates of (; ) wll be the same. Ideed, the -sample predcted values (that are used as regressors or strumets at the secod stage of the SLS procedure ) are ^x = ^z = X Z (Z Z) z ; ad these values eed ot be the same for the log ad short strumetal vectors. 3. Ecoometrca C estmates the lear projecto of Y t o ad C t ; so the coe cet at C t estmated by ^ C s p lm ^ C = C [Y t; C t ] V [C t ] = A + A + e e = A + e A + e : Ecoometrca D estmates the lear projecto of Y t o ; C t ; I t ; ad G t ; so the coe cet at C t estmated by ^ C s because of the perfect t the equato Y t = +C t +I t +G t : Moreover, because of the perfect t, the umercal value of (^ ; ^ C ; ^ I ; ^ G ) wll be exactly (; ; ; ) : There exst a specal case, however, whe the umercal values wll be equal (whch s ot the case the problem at had) whe the t at the rst stage s perfect. 3 INSTRUMENTAL VARIABLES
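The point of part 2, that the long instrument vector containing investment and government spending separately and the short one containing only their sum deliver asymptotically equivalent but generally numerically different 2SLS estimates, can be seen in a small simulation sketch; the data-generating process below (with a hypothetical MPC of 0.6) is made up for illustration.

import numpy as np

def tsls(y, X, Z):
    # Two-stage least squares: project the right-hand-side variables on Z,
    # then regress y on the fitted values.
    Xhat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
    return np.linalg.solve(Xhat.T @ X, Xhat.T @ y)

rng = np.random.default_rng(7)
n, alpha, beta = 500, 1.0, 0.6
I, G = rng.lognormal(size=n), rng.lognormal(size=n)
A = I + G                                   # exogenous expenditure
e = rng.normal(size=n)
Y = (alpha + A + e) / (1.0 - beta)          # reduced form of the two-equation system
C = Y - A                                   # accounting identity, so C = alpha + beta*Y + e

X = np.column_stack([np.ones(n), Y])        # regressors in the consumption equation
Z_long = np.column_stack([np.ones(n), I, G])
Z_short = np.column_stack([np.ones(n), A])

print(tsls(C, X, Z_long))                   # the two estimates differ numerically...
print(tsls(C, X, Z_short))                  # ...but share the same asymptotic distribution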
131 .3 Optmal combato of strumets. The ecessary propertes are valdty ad relevace: E [ze] = E [e] = ad E [zx] 6= ; E [x] 6= : The asymptotc dstrbutos of ^ z ad ^ are p ^z ^! d! N ; E [zx] E z e E [xz] E [x] E ze E [xz] E [x] E ze E [x] E e (we wll eed jot dstrbuto part 3).. The optmal strumet ca be derved from the FOC for the GMM problem for the momet codtos z E [m (y; x; z; ; )] = E (y x) = : The Q mm = E z z e ; = E From the FOC for the (feasble) e cet GMM populato, the optmal weghg of momet codtos ad thus of strumets s the That s, the optmal strumet s x z : z z z Q mm _ E x E e z _ E x E e z z E [xz] E e E [x] E ze _ E [x] E [z e ] E [xz] E [ze : ] E [xz] E e E [x] E ze z + E [x] E z e E [xz] E ze z z + : Ths meas that the optmally combed momet codtos mply E z z + (y x) =, = E z z + x E z z + y = E z z + x z E [zx] z + E [x] ; where z ad are determed from the strumets separately. Thus the optmal IV estmator s the followg lear combato of ^ z ad ^ : E [xz] E e E [x] E ze E [xz] E [xz] E e ^ E [xz] E [x] E [ze ] + E [x] E [z e z ] E [x] E z e E [xz] E ze +E [x] E [xz] E e ^ E [xz] E [x] E [ze ] + E [x] E [z e : ] OPTIMAL COMBINATION OF INSTRUMENTS 3
132 3. Because of the jot covergece part, the t-type test statstc ca be costructed as p ^z ^ T = q be [xz] E b e E b [xz] E b [x] E b [ze ] + E b ; [x] E b [z e ] where E b deoted a sample aalog of a expectato. The test rejects f jt j exceeds a approprate quatle of the stadard ormal dstrbuto. If the test rejects, oe or both of z ad may ot be vald..4 Trade ad growth. The ecoomc ratoale for ucorrelatedess s that the varables P ad S are exogeous ad are ua ected by what s gog o the ecoomy, ad o the other had, hardly ca they a ect the come other ways tha through the trade. To estmate (.3), we ca use just-detfyg IV estmato, where the vector of rght-had-sde varables s x = (; T; W ) ad the strumet vector s z = (; P; S) :. Whe data o wth-coutry trade are ot avalable, oe of the coe cets (.3) s det able wthout further assumptos. I geeral, ether of the avalable varables ca serve as strumets for T (.3) where the composte error term s W + " : 3. We ca explot the assumpto that P s ucorrelated wth the error term (.5). Substtute (.5) to (.3) to get log Y = ( + ) + T + S + ( + ") : Now we see that S ad P are ucorrelated wth the composte error term + " due to ther exogeety ad due to ther ucorrelatedess wth whch follows from the addtoal assumpto ad because s the best lear predcto error (.5). As for the coe cets of (.3), oly wll be cosstetly estmated, but ot or : 4. I geeral, for ths model the OLS s cosstet, ad the IV method s cosstet. Thus, the p dscrepacy may be due to the d eret probablty lmts of the two estmators. Let IV! p ad OLS! + a; a < : The for large samples, IV ad OLS + a: The d erece s a whch s (E [xx ]) E[xe]: Sce (E [xx ]) s postve de te, a < meas that the regressors ted to be egatvely correlated wth the error term. I the preset cotext ths meas that the trade varables are egatvely correlated wth other ueces o come. 3 INSTRUMENTAL VARIABLES
133 . GENERALIZED METHOD OF MOMENTS. Nolear smultaeous equatos y x x. Sce E[u] = E[v] =, m(w; ) = x y, where w = ; = ; ca be used as y a momet fucto. The true ad solve E[m(w; )] = ; therefore E[y] = E[x] ad E[x] = E y, ad they are det ed as log as E[x] 6= ad E y 6= : The aalog of the populato mea s the sample mea, so the aalog estmators are P ^ = P y P ; ^ = P x : x. If we add E [uv] =, the momet fucto s m(w; ) y y x x y (y x)(x y ) ad GMM ca be used. The feasble e cet GMM estmator s!! ^GMM = arg m m(w ; q) ^Q mm m(w ; q) ; q = where ^Q mm = P = m(w ; ^)m(w ; ^) ad ^ s cosstet estmator of that ca be take from part. The asymptotc dstrbuto of ths estmator s A = p (^GMM ) d! N (; VGMM ); where V GMM = Q ) : The complete aswer presumes expressg ths matrx terms of momets of observable varables.. Improved GMM The rst momet restrcto gves GMM estmator ^ = x wth asymptotc varace V CMM = V [x] : The GMM estmato of the full set of momet codtos gves estmator ^ GMM wth asymptotc varace V GMM = Q ) ; where = E [@m(x; y; )=@q] = ( ; ) ad Q mm = E m(x; y; )m(x; y; ) V(x) C(x; y) = : C(x; y) V(y) Hece, V GMM = V [x] (C [x; y]) V [y] ad thus e cet GMM estmato reduces the asymptotc varace whe C [x; y] 6= : GENERALIZED METHOD OF MOMENTS 33
134 .3 Mmum Dstace estmato. Sce the equato s( ) = ca be uquely solved for ; we have = arg m ( s()) W ( s()) : For large, ^ s cocetrated aroud, ad ^W s cocetrated aroud W: Therefore, ^ MD wll be cocetrated aroud : To derve the asymptotc dstrbuto of ^ MD ; let us take the rst order Taylor expaso of the last factor the ormalzed sample FOC = S(^ MD ) p ^W ^ s(^md ) aroud : = S(^ MD ) ^W p ^ S(^ MD ) ^W S() p (^MD ) ; where les betwee ^ MD ad compoetwse, hece p! : The p (^MD ) = S(^ MD ) ^W S() S(^MD ) ^W p ^ d! S( ) W S( ) S( ) W N ; V^ = N ; S( ) W S( ) S( ) W V^W S( ) S( ) W S( ) :. By aalogy wth e cet GMM estmato, the optmal choce for the weght matrx W s V : The ^ p (^MD )! d N ; S( ) V S( ^ ) : The obvous cosstet estmator s ^V. Note that t may be freely reormalzed by a ^ costat ad ths wll ot a ect the result umercally. 3. Uder H ; the sample objectve fucto s close to zero for large ; whle uder the alteratve, t s far from zero. Let us take the rst order Taylor expaso of the root of the optmal (.e., whe W = V ) sample objectve fucto ormalzed by aroud ^ : ^ s(^md ) ^W ^ s(^md ) = ; p ^W ^ = s(^md ) ; = p ^W ^ = p ^W = S() (^ MD ) A = I` V = S( ^ ) S( ) V S( ^ ) S( ) V = V = p ^ ^ ^ A = I` V = S( ^ ) S( ) V S( ^ ) S( ) V = N (; I`) : ^ Thus uder H d! ^ s(^md ) ^W ^ s(^md ) ` k : 34 GENERALIZED METHOD OF MOMENTS
135 4. The parameter of terest s mplctly de ed by the system = s () : The matrx of dervatves = : The OLS estmator of ( ; ) s cosstet ad asymptotcally ormal wth asymptotc varace matrx V^ = E yt E [y t y t ] E [y t y t ] E yt = ; because A optmal MD estmator of s ^ MD = arg m :jj< E yt = + ( ) 3 ; E [y t y t ] E yt = + : ^ ^! t= y t y t y t y t y t y t ad s cosstet ad asymptotcally ormal wth asymptotc varace + V^MD = ( ) 3 + = ^ MD t= ^ ^! : 4 To verfy that both autoregressve roots are deed equal, we ca test the hypothess of correct spec cato. Let ^ be the estmated resdual varace. The test statstc s ^! ^MD y ^ t y t y t ^! ^MD ^ y t y t yt ^ ad s asymptotcally dstrbuted as : ^ MD.4 Formato of momet codtos. We kow that E [w] = ad E[(w ) 4 ] = 3 E[(w ) ] : It s trval to take care of the former. To take care of the latter, troduce a costat = E[(w E[(w ) 4 ] = 3 : Together, the system of momet codtos s 3 w E 4@ (w ) A5 (w ) 4 3 = : ) ]; the we have. To have overdet cato, we eed to arrve at least at 4 momet codtos (whch must be formatve about parameters!). Let, for example, z t = ; y t ; yt ; y t ; yt : The momet codtos are E z t y t y t (y t y t )! (y t y t ) = : FORMATION OF MOMENT CONDITIONS 35
136 A much better way s to thk ad to use d eret strumets for the mea ad varace equatos. For example, let z t = (y t ; y t ) ; z t = ; yt ; y t so that the momet codtos are E 4 z t (y t y t ) z t (y t y t )! (y t y t ) 3 5 = :.5 What CMM estmates The populato aalog of g(z; q) = = s E [g(z; q)] = : Thus, f the latter equato has a uque soluto ; t s a probablty lmt of ^: The asymptotc dstrbuto of ^ s that of the CMM estmator of based o the momet codto E [g(z; )] = : p ^ d! N ; )=@ E g(z; )g(z; ) ) =@ :.6 Trty for GMM The Wald test s the same up to a chage the varace matrx: W = h(^ GMM ) h H(^ GMM )( ^Q mm ) H(^ GMM ) h(^gmm ) d! q; where ^ GMM s the urestrcted GMM estmator, ad ^Q mm are cosstet estmators of ad Q mm, relatvely, ad H() The Dstace D erece test s smlar to LR, but wthout factor, ^Q =@@ p! Q h DD = Q (^ R d! GMM) Q (^ GMM ) q : The LM test s a lttle bt harder, sce the aalog of the average score s () = It s straghtforward to d that ; ) m(z ; ) : = LM = 4 (^ R GMM) ( ^Q mm ) (^ R GMM) d! q: I the mddle oe may use ether restrcted or urestrcted estmators of ad Q mm. A more detaled dervato ca be foud, for example, secto 7.4 of Ecoometrcs by Fumo Hayash (, Prceto Uversty Press). : 36 GENERALIZED METHOD OF MOMENTS
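A self-contained sketch of two-step efficient GMM and of the Distance Difference statistic in a linear instrumental variables model is given below. The moment function, instruments and true parameter values are hypothetical, and the restricted fit keeps the same second-step weight matrix, in line with the DD construction above.

import numpy as np

def lin_gmm(y, X, Z, W):
    # Linear GMM: minimize the quadratic form in the averaged moments
    # z*(y - x'b) with weight matrix W; closed form since m is linear in b.
    A = X.T @ Z @ W @ Z.T @ X
    c = X.T @ Z @ W @ Z.T @ y
    return np.linalg.solve(A, c)

def gmm_objective(b, y, X, Z, W):
    g = Z.T @ (y - X @ b) / len(y)
    return g @ W @ g

rng = np.random.default_rng(8)
n = 1000
z = rng.normal(size=(n, 3))                        # three excluded instruments
v = rng.normal(size=n)
x2 = z @ np.array([0.8, 0.5, 0.3]) + v             # endogenous regressor
e = 0.7 * v + rng.normal(size=n)                   # error correlated with x2
y = 1.0 + 0.0 * x2 + e                             # true coefficients (1, 0): H0 below is true
X = np.column_stack([np.ones(n), x2])
Z = np.column_stack([np.ones(n), z])

# step 1: identity weight; step 2: efficient weight from step-1 residuals
b1 = lin_gmm(y, X, Z, np.eye(Z.shape[1]))
u = y - X @ b1
W = np.linalg.inv((Z * u[:, None] ** 2).T @ Z / n)
b2 = lin_gmm(y, X, Z, W)

# Distance Difference test of H0: beta2 = 0, keeping the same weight matrix
b_r = lin_gmm(y, X[:, [0]], Z, W)                  # restricted fit uses only the constant
DD = n * (gmm_objective(np.array([b_r[0], 0.0]), y, X, Z, W)
          - gmm_objective(b2, y, X, Z, W))
print(b2, DD)                                      # DD ~ chi-squared(1) under H0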
12.7 All about J

1. For the system of moment conditions $E[m(z,\theta)]=0$, the J-test statistic is
$$J=n\,\bar m(z,\hat\theta)'\hat Q_{mm}^{-1}\bar m(z,\hat\theta).$$
Observe that
$$\sqrt n\,\bar m(z,\hat\theta)=\sqrt n\,\bar m(z,\theta_*)+\frac{\partial\bar m(z,\tilde\theta)}{\partial\theta'}\sqrt n\,(\hat\theta-\theta_*),$$
where $\theta_*=\operatorname{plim}\hat\theta$. By the CLT, $\sqrt n\,(\bar m(z,\theta_*)-E[m(z,\theta_*)])$ converges to a normal distribution. However, this means that
$$\sqrt n\,\bar m(z,\theta_*)=\sqrt n\,\bigl(\bar m(z,\theta_*)-E[m(z,\theta_*)]\bigr)+\sqrt n\,E[m(z,\theta_*)]$$
diverges to infinity because of the second term, which does not equal zero (the system of moment conditions is misspecified). Hence, J is a quadratic form in something diverging to infinity, and thus diverges to infinity too.

2. Let the moment function be just $x$, so that $\ell=1$ and $k=0$. Then the J-test statistic corresponding to weight matrix $\hat w$ equals $J=n\hat w\bar x^2$. Of course, when $\hat w\xrightarrow{p}V[x]^{-1}$ corresponds to efficient weighting, we have $J\xrightarrow{d}\chi_1^2$, but when $\hat w\xrightarrow{p}w\ne V[x]^{-1}$, we have $J\xrightarrow{d}wV[x]\chi_1^2\ne\chi_1^2$.

3. The argument would be valid if the model for the conditional mean were known to be correctly specified. Then one could blame the instruments for a high value of the J-statistic. But in our time series regression of the type $E_t[y_{t+1}]=g(x_t)$, if this regression were correctly specified, then the variables from the time-$t$ information set must be valid instruments! The failure of the model may be associated with an incorrect functional form of $g(\cdot)$, or with the specification of the conditioning information. Lastly, asymptotic theory may give a poor approximation to the exact distribution of the J-statistic.

12.8 Interest rates and future inflation

1. The conventional econometric model that tests the hypothesis of conditional unbiasedness of interest rates as predictors of inflation is
$$\pi_t^k=\alpha_k+\beta_k i_t^k+\eta_t^k,\qquad E_t[\eta_t^k]=0.$$
Under the null, $\alpha_k=0$, $\beta_k=1$. Setting $k=m$ in one case, $k=n$ in the other case, and subtracting one equation from the other, we can get
$$\pi_t^m-\pi_t^n=\alpha_m-\alpha_n+\beta_m i_t^m-\beta_n i_t^n+\eta_t^m-\eta_t^n,\qquad E_t[\eta_t^m-\eta_t^n]=0.$$
Under the null $\alpha_m=\alpha_n=0$, $\beta_m=\beta_n=1$, this specification coincides with Mishkin's under the null $\alpha_{m,n}=0$, $\beta_{m,n}=1$. The restriction $\beta_{m,n}=0$ implies that the term structure provides no information about future shifts in inflation. The prediction error $\eta_t^{m,n}$ is serially correlated of the order that is the farthest prediction horizon, i.e., $\max(m,n)$.
138 . Selecto of strumets: there s a varety of choces, for stace, ; m t t ; m t t ; m t t ; m t max(m;) t max(m;) ; or ; m t ; t ; m t ; t ; m t max(m;) ; t max(m;) ; etc. Costructo of the optmal weghtg matrx demads a HAC procedure, ad so does estmato of asymptotc varace. The rest s more or less stadard. 3. Most terestg are the results of the test m; = whch tell us that there s o formato the term structure about future path of ato. Testg m; = the seems excessve. Ths hypothess would correspod to the codtoal bas cotag oly a systematc compoet (.e. a costat upredctable by the term structure). It also looks lke there s o systematc compoet ato (the hypothess m; = ca be accepted)..9 Spot ad forward exchage rates. Ths s ot the oly way to proceed, but t s straghtforward. The OLS estmator uses the strumet zt OLS = ( x t ) ; where x t = f t s t : The addtoal momet codto adds f t s t to the lst of strumets: z t = ( x t x t ) : Let us look at the optmal strumet. If t s proportoal to zt OLS ; the the strumet x t ; ad hece the addtoal momet codto, s redudat. The optmal strumet takes the form t = Q mmz t : But E[x t ] E[x t ] E[x t ] E[x t ] E[x t ] A ; Q mm E[x t ] E[x t ] E[x t x t ] A : E[x t ] E[x t x t ] E[x t ] E[x t x t ] E[x t ] It s easy to see that Q mm = whch ca ver ed by postmultplyg ths equato by Q mm. Hece, t = zt OLS : But the most elegat way to solve ths problem goes as follows. Uder codtoal homoskedastcty, the GMM estmator s asymptotcally equvalet to the SLS estmator, f both use the same vector of strumets. But f the strumetal vector cludes the regressors (z t does clude ), the SLS estmator s detcal to the OLS estmator. I total, GMM s asymptotcally equvalet to OLS ad thus the addtoal momet codto s redudat. z OLS t. We ca expect asymptotc equvalece of the OLS ad e cet GMM estmators whe the addtoal momet fucto s ucorrelated wth the ma momet fucto. Ideed, let us compare the orthwester block of V GMM = Q wth asymptotc varace of the OLS estmator V OLS = E[xt ] E[x t ] E[x : t ] Deote f t+ = f t+ f t : For the full set of momet codtos, E[x t ] E[x t ] E[x t e t+ f t+ ] E[x t ] E[x t ] A ; Q mm E[x t ] E[x t ] E[x t e t+ f t+ ] A : E[x t e t+ f t+ ] E[x t e t+ f t+ ] E[x t (f t+ ) ] ; 38 GENERALIZED METHOD OF MOMENTS
139 It s easy to see that whe E[x t e t+ f t+ ] = E[x t e t+ f t+ ] = ; Q mm s block-dagoal ad the orthwest block of V GMM s the same as V OLS : A su cet codto for these two equaltes s E[e t+ f t+ ji t ] =,. e. that codtoally o the past, uexpected movemets spot rates are ucorrelated wth uexpected movemets forward rates. Ths s hardly sats ed practce.. Returs from acal market Let us deote y t = r t+ r t ; = ( ; ; ; 3 ) ad x t = ; r t ; rt ; rt : The the model may be rewrtte as y t = x t + " t+ :. The OLS momet codto s E[(y t x t) x t ] = ; wth the OLS asymptotc dstrbuto N ; E x t x t E[xt x tr ]E x t x t : The GLS momet codto s E[(y t x t) x t r t ] = ; wth the GLS asymptotc dstrbuto N ; E[x t x tr t ] :. The parameter vector s = ; ; : The codtoal logdesty s log f y t ji t ; ; ; = log log r t s y t ji t ; ; ; = t (y t x t) r t (ote that t s llegtmate to take a log of r t ) The codtoal score s (y t x t) x t r t! C Hece, the ML momet codto s I = E (y t x t) r (y t x t) x t r t (y t x t) r t (y t x t) r t t (y t x t) r t! log r t! = : C A log r t Note that ts rst part s the GLS momet codto. The formato matrx s " # E x t x t r t C 4 E log r t E log rt E log r t : C A : C A RETURNS FROM FINANCIAL MARKET 39
140 The asymptotc dstrbuto for the ML estmates of the regresso parameters s N ; E[x t x tr t ] ; ad for the ML estmates of ad t s depedet of the former, ad s N ; V log r E log r t E log r! t t E log rt : 3. The method of momets wll use the momet codtos (y t x t) x t E B (y t t) r t r t (y t x t) r t r t log rt C A = ; where the former oe s from Part, whle the secod ad the thrd are the expected products of the error ad quas-regressors the skedastc regresso. Note that the resultg CMM estmator wll have the usual OLS estmator for the -part because of exact det cato. Therefore, the -part wll have the same asymptotc dstrbuto: N ; E x t x t E[xt x tr ]E x t x t : As for the other two parts, t s more messy. Ideed, E[x t x t] E[r 4 t ] E[r 4 t log rt ] A ; E[r 4 t log rt ] 4 E[r 4 t log rt ] E[x t x tr t ] Q mm 4 E[r 8 t ] 6 E[r 8 t log rt ] A : 6 E[r 8 t log rt ] 8 E[r 8 t log rt ] The asymptotc varaces follow from The sadwch does ot collapse, but oe ca see that the par ^ CMM ; ^ CMM s asymptotcally depedet of ^CMM : Q mmq 4. We have already establshed that the OLS estmator s detcal to the -part of the CMM estmator. From geeral prcples, the GLS estmator s asymptotcally more e cet, ad s detcal to the -part of the ML estmator. As for the other two parameters, the ML estmator s asymptotcally e cet, ad hece s asymptotcally more e cet tha the approprate part of the CMM estmator whch, by costructo, s the same as mpled by NLLS appled to the skedastc fucto. To summarze, for we have OLS = CMM GLS = ML; whle for ad we have CMM ML:. Istrumetal varables ARMA models. The strumet x t j s scalar, the parameter s scalar, so there s exact det cato. The strumet s obvously vald. The asymptotc varace of the just detfyg IV estmator of a scalar parameter uder homoskedastcty s V xt j = Q xz Q zz : Let us calculate all peces: Q zz = E[x t j] = V [x t ] = ; Q xz = E [x t x t j ] = C [x t ; x t j ] = j V [x t ] = j : 4 GENERALIZED METHOD OF MOMENTS
Thus $V_{x_{t-j}}$ grows geometrically in $j$ (at the rate $\rho^{-2}$), so this suggests that the optimal instrument must be $x_{t-1}$. Although this is not a proof of the fact, the optimal instrument is indeed $x_{t-1}$. The result makes sense, since the last observation is most informative and embeds all information in all the other instruments.

2. It is possible to use as instruments lags of $y_t$ starting from $y_{t-2}$ back to the past. The regressor $y_{t-1}$ will not do, as it is correlated with the error term through $e_{t-1}$. Among $y_{t-2}, y_{t-3},\ldots$ the first one deserves more attention since, intuitively, it contains more information than older values of $y_t$.

12.12 Hausman may not work

1. Because the instruments include the regressors, $\hat\beta_{2SLS}$ will be identical to $\hat\beta_{OLS}$: since the instrument vector contains $x$, the fitted values from the first-stage projection will be equal to $x$. Hence $\hat\beta_{OLS}-\hat\beta_{2SLS}$ will be zero irrespective of the validity of $z$, and so will be the Hausman test statistic.

2. Under the null of conditional homoskedasticity both estimators are asymptotically equivalent and hence equally asymptotically efficient. Hence the difference in asymptotic variances is identically zero. Note that here the estimators are not identical.

12.13 Testing moment conditions

Consider the unrestricted ($\hat\beta_u$) and restricted ($\hat\beta_r$) estimates of the parameter $\beta\in\mathbb{R}^k$. The first is the CMM estimate:
$$\frac1n\sum_{i=1}^n x_i\bigl(y_i-x_i'\hat\beta_u\bigr)=0\quad\Rightarrow\quad \hat\beta_u=\Bigl(\sum_{i=1}^n x_ix_i'\Bigr)^{-1}\sum_{i=1}^n x_iy_i.$$
The second is a feasible efficient GMM estimate:
$$\hat\beta_r=\arg\min_b\ \Bigl(\frac1n\sum_{i=1}^n m_i(b)\Bigr)'\hat Q_{mm}^{-1}\Bigl(\frac1n\sum_{i=1}^n m_i(b)\Bigr),\qquad(12.1)$$
where $m_i(b)=\bigl(x_iu_i(b),\ x_iu_i(b)^3\bigr)$, $u_i(b)=y_i-x_i'b$, $u_i\equiv u_i(\beta)$, and $\hat Q_{mm}$ is a consistent estimator of
$$Q_{mm}=E\bigl[m_i(\beta)m_i(\beta)'\bigr]=E\begin{pmatrix}xx'u^2 & xx'u^4\\ xx'u^4 & xx'u^6\end{pmatrix}.$$
Denote also $\Lambda\equiv-E\bigl[\partial m_i(\beta)/\partial b'\bigr]$, whose blocks are $E[xx']$ and $3E[xx'u^2]$. Writing out the FOC for (12.1) and expanding $\bar m(\hat\beta_r)$ around $\beta$ gives, after rearrangement,
$$\sqrt n\bigl(\hat\beta_r-\beta\bigr)\ \overset{A}{=}\ \bigl(\Lambda'Q_{mm}^{-1}\Lambda\bigr)^{-1}\Lambda'Q_{mm}^{-1}\,\frac1{\sqrt n}\sum_{i=1}^n m_i(\beta).$$
Here $\overset{A}{=}$ means that we substitute the probability limits for their sample analogues. The last equation holds under the null hypothesis $H_0:\ E\bigl[xu^3\bigr]=0$. Note that the unrestricted estimate can be rewritten as
$$\sqrt n\bigl(\hat\beta_u-\beta\bigr)\ \overset{A}{=}\ \bigl(E[xx']\bigr)^{-1}\bigl(I_k\ \ O_k\bigr)\,\frac1{\sqrt n}\sum_{i=1}^n m_i(\beta).$$
Therefore,
$$\sqrt n\bigl(\hat\beta_u-\hat\beta_r\bigr)\ \overset{A}{=}\ \Bigl[\bigl(E[xx']\bigr)^{-1}\bigl(I_k\ \ O_k\bigr)-\bigl(\Lambda'Q_{mm}^{-1}\Lambda\bigr)^{-1}\Lambda'Q_{mm}^{-1}\Bigr]\frac1{\sqrt n}\sum_{i=1}^n m_i(\beta)\ \overset{d}{\to}\ N(0,V),$$
where (after some algebra)
$$V=\bigl(E[xx']\bigr)^{-1}E\bigl[xx'u^2\bigr]\bigl(E[xx']\bigr)^{-1}-\bigl(\Lambda'Q_{mm}^{-1}\Lambda\bigr)^{-1}.$$
Note that $V$ is a $k\times k$ matrix. It can be shown that this matrix is non-degenerate (and thus has full rank $k$). Let $\hat V$ be a consistent estimate of $V$. By the Slutsky and Mann–Wald theorems,
$$W=n\bigl(\hat\beta_u-\hat\beta_r\bigr)'\hat V^{-1}\bigl(\hat\beta_u-\hat\beta_r\bigr)\ \overset{d}{\to}\ \chi^2_k.$$
The test may be implemented as follows. First find the (consistent) estimate $\hat\beta_u$. Then compute $\hat Q_{mm}=n^{-1}\sum_{i=1}^n m_i(\hat\beta_u)m_i(\hat\beta_u)'$, use it to carry out feasible GMM and obtain $\hat\beta_r$. Use $\hat\beta_u$ or $\hat\beta_r$ to compute $\hat V$, the sample analog of $V$. Finally, compute the Wald statistic $W$, compare it with the 95% quantile $q_{0.95}$ of the $\chi^2_k$ distribution, and reject the null hypothesis if $W>q_{0.95}$, or accept it otherwise.

12.14 Bootstrapping OLS

Indeed, we are supposed to recenter, but only when there is overidentification. When the parameter is just identified, as in the case of the OLS estimator, the moment conditions hold exactly in the sample, so the center is zero anyway.

12.15 Bootstrapping DD

Let $\hat\theta$ denote the GMM estimator. Then the bootstrap DD test statistic is
$$DD^{*}=\min_{q:\,h(q)=h(\hat\theta)}Q^{*}(q)-\min_{q}Q^{*}(q),$$
where $Q^{*}(q)$ is the bootstrap GMM objective function
$$Q^{*}(q)=\Bigl(\frac1n\sum_{i=1}^n\Bigl(m(z_i^{*},q)-\frac1n\sum_{j=1}^n m(z_j,\hat\theta)\Bigr)\Bigr)'\hat Q_{mm}^{*\,-1}\Bigl(\frac1n\sum_{i=1}^n\Bigl(m(z_i^{*},q)-\frac1n\sum_{j=1}^n m(z_j,\hat\theta)\Bigr)\Bigr),$$
where $\hat Q_{mm}^{*}$ uses the formula for $\hat Q_{mm}$ and the bootstrap sample. Note two instances of recentering.
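To make the recentering concrete, here is a schematic sketch for a scalar mean identified by two moment conditions: the bootstrap objective is evaluated at moments net of the full-sample moment at the estimate. This illustrates the recentering idea only, not the DD statistic itself, and the data-generating process and names are invented for the illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
z = rng.normal(loc=2.0, scale=[1.0, 2.0], size=(n, 2))   # two noisy measurements of one mean

def gmm_min(sample, W, center=0.0):
    """Minimize (mbar(q) - center)' W (mbar(q) - center) for m(z, q) = z - q (closed form).
    Returns the minimizer and the minimized objective scaled by the sample size."""
    a = sample.mean(axis=0) - center
    one = np.ones(sample.shape[1])
    q = (one @ W @ a) / (one @ W @ one)
    g = a - q * one
    return q, len(sample) * g @ W @ g

W = np.linalg.inv(np.cov(z.T))                            # crude efficient weighting
theta_hat, J = gmm_min(z, W)
center = (z - theta_hat).mean(axis=0)                     # full-sample moment at the estimate

# bootstrap distribution of the overidentification statistic with recentered moments
J_boot = []
for _ in range(499):
    zb = z[rng.integers(0, n, n)]
    J_boot.append(gmm_min(zb, W, center)[1])
print(J, np.quantile(J_boot, 0.95))                       # compare J with its bootstrap critical value
```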
143 3. PANEL DATA 3. Alteratg dvdual e ects It s coveet to use three dces stead of two dexg the data. Namely, let t = (s ) + q; where q f; g; s f; : : : ; T g: The q = correspods to odd perods, whle q = correspods to eve perods. The dummy varables wll have the form of the Kroecker product of three matrces, whch s de ed recursvely as A B C = A (B C): Part. (a) I ths case we rearrage the data colum as follows: y sq = y t ; y s = ys y s ; Y y : : : y T A ; Y ad = ( O E : : : O E ) : The regressors ad errors are rearraged the same maer as y s. The the regresso ca be rewrtte as where D = I T I ; ad T = ( : : : ) (T vector). Clearly, Y : : : Y A ; Y = D + X + v; (3.) D D = I T T I = T I ; D(D D) D = T I T T I = T I J T I ; where J T = T T : I other words, D(D D) D s block-dagoal wth blocks of sze T T of the form: T : : : T T : : : T B : : : : : : : : : : : : : : : T : : : T A : T : : : T The Q-matrx s the Q = I T T I J T I : Note that Q s a orthogoal projecto ad QD = : Thus we have from (3.) QY = QX + Qv: (3.) Note that T J T s the operator of takg the mea over the s-dex (.e. over odd or eve perods depedg o the value of q). Therefore, the trasformed regresso s: y sq y q = (x sq x q ) + v ; (3.3) where y q = P T s= y sq: (b) Ths tme the data are rearraged the followg maer: y qs = y t ; Y q y : : : y T A ; Y q Y q : : : Y q A Y ; Y = ; Y PANEL DATA 43
144 = ( O : : : O E : : : E ) : I matrx form the regresso s aga (3.) wth D = I I T ; ad D(D D) D = T I I T t = T I I J T : Ths matrx cossts of blocks o the ma dagoal, each of them beg T J T : The Q-matrx s Q = I T T I J T : The rest s as part (b) wth the trasformed regresso y qs y q = (x qs x q ) + v ; (3.4) wth y q = P T s= y qs; whch s essetally the same as (3.3). Part. Take the Q-matrx as Part (b). The Wth estmator s the OLS estmator (3.4),.e. ^ = (X QX ) X QY; or ^ X qs x q )(x qs x q ) q;;s(x A X q;;s (x qs x q )(y qs y q ): Clearly, E[^] =, ^ p! ad ^ s asymptotcally ormal as! ; T xed. For ormally dstrbuted errors v qs the stadard F-test for hypothess s H : O = O = : : : = O ad E = E = : : : = E F = (RSSR RSS U )=( ) H RSS U ~ F ( ; T k) =(T k) (we have restrctos the hypothess), where RSS U = P sq (y qs y q (x qs x q ) ) ; ad RSS R s the sum of squared resduals the restrcted regresso. Part 3. Here we start wth y qs = x qs + u qs ; u qs := q + v qs ; (3.5) where = O ad = E ; E[ q] = : Let = O ; = E : We have E u qs u q s = E (q + v qs )( q + v q s ) = q qq ss + v qq ss ; where aa = f f a = a, ad f a 6= a g, ss = for all s, s : Cosequetly, = E[uu ] = I J T + vi T = (T + v) I T J T + +(T + v) I T J T + vi I (I T T J T ): The last expresso s the spectral decomposto of sce all operators t are dempotet symmetrc matrces (orthogoal projectos), whch are orthogoal to each other ad gve detty sum. Therefore, = = (T + v) = The GLS estmator of s + v I I (I T T J T ): I T J T + (T + v) = ^ = (X X ) X Y: I T J T + 44 PANEL DATA
145 To put t d eretly, ^ s the OLS estmator the trasformed regresso The latter may be rewrtte as v = Y = v = X + u : y qs ( p q )y q = (x qs ( p q )x q ) + u ; where q = v=( v + T q): To make ^ feasble, we should cosstetly estmate parameter q : I the case = we may apply the result obtaed class (we have d eret objects ad T observatos for each of them see part (b)): k ^u ^ Q^u = (T ) k + ^u P ^u ; where ^u are OLS-resduals for (3.4), ad Q = I T T I J T ; P = I T 6= : Usg equatos ad repeatg what was doe class, we have wth Q = P = E [u qs ] = v + q; E [u s ] = T v + q; k ^u Q q ^u ^q = (T ) k + ^u P q ^u ; I (I T T J T ); Q = I T J T : I (I T T J T ); P = Q. Suppose ow that I T J T ; 3. Tme varat regressors. (a) Uder xed e ects, the z varable s collear wth the dummy for : Thus, s udet able.. The Wth trasformato wpes out the term z together wth dvdual e ects ; so the trasformed equato looks exactly lke t looks f o term z s preset the model. Uder usual assumptos about depedece of v t ad X, the Wth estmator of s e cet. (b) Uder radom e ects ad mutual depedece of ad v t, as well as ther depedece of X ad Z; the GLS estmator s e cet, ad the feasble GLS estmator s asymptotcally e cet as! :. Recall that the rst-step ^ s cosstet but ^ s are cosstet as T stays xed ad! : However, the estmator of so costructed s cosstet uder assumptos of radom e ects (see part (b)). Observe that ^ = y x ^: If we regress ^ o z ; we get the OLS coe cet P P = ^ = P z ^ = z y x P ^ = = P z x + z + + v x ^ = P = z = z = z = + P = z P + = z P = z v P + = z P = z x P = z ^ : TIME INVARIANT REGRESSORS 45
Now, as $n\to\infty$,
$$\frac1n\sum_{i=1}^n z_i^2\ \overset{p}{\to}\ E\bigl[z^2\bigr]\neq0,\qquad
\frac1n\sum_{i=1}^n z_i\alpha_i\ \overset{p}{\to}\ E[z\alpha]=E[z]\,E[\alpha]=0,$$
$$\frac1n\sum_{i=1}^n z_i\bar v_i\ \overset{p}{\to}\ E[z\bar v]=E[z]\,E[\bar v]=0,\qquad
\frac1n\sum_{i=1}^n z_i\bar x_i\ \overset{p}{\to}\ E[z\bar x],\qquad
\hat\beta\ \overset{p}{\to}\ \beta.$$
In total, $\hat\gamma\overset{p}{\to}\gamma$. However, the so constructed estimator of $\gamma$ is asymptotically inefficient. A better estimator is the feasible GLS estimator of part (b).

13.3 Within and Between

Recall that $\hat\beta_W-\beta=(X'QX)^{-1}X'QU$ and $\hat\beta_B-\beta=(X'PX)^{-1}X'PU$, where $U$ is a vector of two-component errors, and that $V[U|X]=\Omega=\sigma_Q^2Q+\sigma_P^2P$ for certain weights $\sigma_Q^2$ and $\sigma_P^2$ that depend on the variance components. Then
$$C\bigl[\hat\beta_W,\hat\beta_B\big|X\bigr]=(X'QX)^{-1}X'Q\,V[U|X]\,PX(X'PX)^{-1}
=(X'QX)^{-1}X'Q\bigl(\sigma_Q^2Q+\sigma_P^2P\bigr)PX(X'PX)^{-1}$$
$$=\sigma_Q^2(X'QX)^{-1}X'QQPX(X'PX)^{-1}+\sigma_P^2(X'QX)^{-1}X'QPPX(X'PX)^{-1}=0,$$
because $QP=PQ=0$. Using this result,
$$V\bigl[\hat\beta_W-\hat\beta_B\big|X\bigr]=V\bigl[\hat\beta_W\big|X\bigr]+V\bigl[\hat\beta_B\big|X\bigr]
=(X'QX)^{-1}X'Q\Omega QX(X'QX)^{-1}+(X'PX)^{-1}X'P\Omega PX(X'PX)^{-1}$$
$$=\sigma_Q^2(X'QX)^{-1}+\sigma_P^2(X'PX)^{-1}.$$
We know that under random effects both $\hat\beta_W$ and $\hat\beta_B$ are consistent and asymptotically normal, hence the test statistic
$$R=\bigl(\hat\beta_W-\hat\beta_B\bigr)'\Bigl(\hat\sigma_Q^2(X'QX)^{-1}+\hat\sigma_P^2(X'PX)^{-1}\Bigr)^{-1}\bigl(\hat\beta_W-\hat\beta_B\bigr)$$
is asymptotically chi-squared with $k=\dim(\beta)$ degrees of freedom under random effects. Here, as usual, $\hat\sigma_Q^2=RSS_W/(n(T-1)-k)$ and $\hat\sigma_P^2=RSS_B/(n-k)$. When random effects are inappropriate, $\hat\beta_W$ is still consistent but $\hat\beta_B$ is inconsistent, so $\hat\beta_W-\hat\beta_B$ converges to a non-zero limit, while $X'QX$ and $X'PX$ diverge to infinity, making $R$ diverge to infinity.

13.4 Panels and instruments

Stack the regressors into matrix $X$, the instruments into matrix $Z$, the dependent variables into vector $Y$. The Between transformation is implied by the transformation matrix $P=I_n\otimes T^{-1}J_T$, where $J_T$ is a $T\times T$ matrix of ones; the Within transformation is implied by the transformation matrix $Q=I_{nT}-P$; the GLS transformation is implied by the transformation matrix $\Omega^{-1/2}=\omega_QQ+\omega_PP$ for certain weights $\omega_Q$ and $\omega_P$ that depend on the variance components. Recall that $P$ and $Q$ are mutually orthogonal symmetric idempotent matrices.

1. When in the Within regression $QY=QX\beta+QU$ the original instruments $Z$ are used, the IV estimator is $(Z'QX)^{-1}Z'QY$. When the Within-transformed instruments $QZ$ are used, the IV estimator is $\bigl((QZ)'QX\bigr)^{-1}(QZ)'QY=(Z'QX)^{-1}Z'QY$. These two are identical.

2. The same as in part 1, with $P$ in place of $Q$ everywhere.

3. When in the Between regression $PY=PX\beta+PU$ the Within-transformed instruments $QZ$ are used, $(QZ)'PX=Z'QPX=0$, a zero matrix. Such an IV estimator does not exist. The same result holds when one uses the Between-transformed instruments in the Within regression.

4. When in the GLS-transformed regression $\Omega^{-1/2}Y=\Omega^{-1/2}X\beta+\Omega^{-1/2}U$ the original instruments $Z$ are used, the IV estimator is $\bigl(Z'\Omega^{-1/2}X\bigr)^{-1}Z'\Omega^{-1/2}Y$. When the GLS-transformed instruments $\Omega^{-1/2}Z$ are used, the IV estimator is $\bigl(Z'\Omega^{-1}X\bigr)^{-1}Z'\Omega^{-1}Y$. These two are different. The second one is more natural to do: when the instruments coincide with the regressors, the efficient GLS estimator results, while the former estimator is inefficient and weird.

5. When in the GLS-transformed regression the Within-transformed instruments are used, the IV estimator is
$$\bigl((QZ)'\Omega^{-1/2}X\bigr)^{-1}(QZ)'\Omega^{-1/2}Y
=\bigl(Z'Q(\omega_QQ+\omega_PP)X\bigr)^{-1}Z'Q(\omega_QQ+\omega_PP)Y
=\bigl(\omega_QZ'QX\bigr)^{-1}\omega_QZ'QY=(Z'QX)^{-1}Z'QY,$$
the estimator from part 1. Similarly, when in the GLS-transformed regression one uses the Between-transformed instruments, the IV estimator is that of part 2.

13.5 Differencing transformations

1. OLS on FD-transformed equations is unbiased and consistent as $n\to\infty$ since the differenced error has mean zero conditional on the matrix of differenced regressors under the standard FE assumptions. However, OLS is inefficient, as the conditional variance matrix is not diagonal. The efficient estimator of the structural parameters is the LSDV estimator, which is the OLS estimator in Within-transformed equations.

2. The proposal leads to a consistent, but not very efficient, GMM estimator. The resulting error term $v_{i,t}-v_{i,1}$ is uncorrelated only with $y_{i,0}$ among all $y_{i,0},\ldots,y_{i,T}$, so that for all equations we can find much fewer instruments than in the FD approach, and the same is true for the regular regressors if they are predetermined but not strictly exogenous. As a result, we lose efficiency but get nothing in return.
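As a numerical companion to the Within/Between comparison in 13.3, the sketch below computes both estimators on a simulated balanced panel with random effects (the data-generating process and names are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, beta = 200, 5, 1.5
alpha = rng.normal(size=(N, 1))                       # individual effects
x = rng.normal(size=(N, T))                           # regressor independent of alpha (random effects)
y = beta * x + alpha + rng.normal(size=(N, T))

# Within estimator: OLS on deviations from individual means (Q-transformed data)
xw = x - x.mean(axis=1, keepdims=True)
yw = y - y.mean(axis=1, keepdims=True)
beta_within = (xw * yw).sum() / (xw ** 2).sum()

# Between estimator: OLS on individual means (P-transformed data)
xb, yb = x.mean(axis=1), y.mean(axis=1)
beta_between = (xb * yb).sum() / (xb ** 2).sum()

print(beta_within, beta_between)   # both consistent under random effects; under fixed effects
                                   # correlated with x, only the Within estimator would remain so
```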
148 3.6 Nolear pael data model The olear oe-way ECM wth radom e ects s (y t + ) = x t + + v t ; IID(; ); v t IID(; v); where dvdual e ects ad dosycratc shocks v t are mutually depedet ad depedet of x t : The latter assumpto s uecessarly strog ad may be relaxed. The estmator of Problem 9.3 obtaed from the pooled sample s e cet sce t gores odagoalty of the varace matrx of the error vector. We have to costruct a aalog of the GLS estmator a lear oeway ECM wth radom e ects. To get prelmary cosstet estmates of varace compoets, we ru aalogs of Wth ad Betwee regressos (we call the resultg estmators Wth-CMM ad Betwee-CMM):! (y t + ) TX (y t + ) TX TX = x t x t + v t v t T T T T t= TX (y t + ) = T t= t= TX x t + + T Numercally estmates ca be obtaed by cocetrato as descrbed Problem 9.3. The estmated varace compoets ad the GLS-CMM parameter ca be foud from ^ v = t= RSS W T ; ^ + T ^ v = RSS B T ( ) TX t= v t t= ) ^ = RSS W RSS B T : Note that RSS W ad RSS B are sums of squared resduals the Wth-CMM ad Betwee-CMM systems, ot the values of CMM objectve fuctos. The we cosder the FGLS-trasformed system where the varace matrx of the error vector s (asymptotcally) dagoalzed:! (y t + ) p^ TX (y t + ) TX = x t p^ x t + error T T term : t= t= 3.7 Durb Watso statstc ad pael data. I both regressos, the resduals cosstetly estmate correspodg regresso errors. Therefore, to d a probablty lmt of the Durb Watso statstc, t su ces to compute the varace ad rst-order autocovarace of the errors across the stacked equatos: where % = p lm! T p lm DW =! TX t= = u t; % % % = p lm! ; T TX t= = u t u ;t ; ad u t s deote regresso errors. Note that the errors are ucorrelated where the dex swtches betwee dvduals, hece summato from t = % : Cosder the orgal regresso y t = x t + u t ; = ; : : : ; ; t = ; : : : ; T: 48 PANEL DATA
149 where u t = + v t : The % = v + ad Thus % = T TX t= p lm! = p lm DW OLS =! ( + v t ) ( + v ;t ) = T T : T T v + The GLS-trasformato orthogoalzes the errors, therefore p lm DW GLS = :!! = T v + : T v +. Sce all computed probablty lmts except that for DW OLS do ot deped o the varace compoets, the oly way to costruct a asymptotc test of H : = vs. H A : > s by usg DW OLS : Uder H ; p T (DW OLS )! N (; 4) as!. Uder H A ; p lm DW OLS < : Hece a oe-sded asymptotc test for = for a gve level s:! Reject f DW OLS < + z p ; T where z s the -quatle of the stadard ormal dstrbuto. d 3.8 Hgher-order dyamc pael The orgal system s y t = y ;t + y ;t + x t + + v t ; = ; : : : ; ; t = 3; : : : ; T; where IID(; ) ad v IID(; v) are depedet of each other ad of x s. The FDtrasformed system s y t y ;t = (y ;t y ;t ) + (y ;t y ;t 3 ) + (x t x ;t ) + v t v ;t ; or the matrx form, = ; : : : ; ; t = 4; : : : ; T; Y = Y + Y + X + V; where the vertcal dmeso s (T 3) : The GMM estmator of the 3-dmesoal parameter vector = ( ; ; ) s ^GMM = (Y Y X ) W W (I G) W W (Y Y X ) (Y Y X ) W W (I G) W W Y; where the (T 3) ((T + ) (T 3)) matrx of strumets fro the th dvdual s 3 y ; y ; x ; x ; x ;3 y ; y ; y ;3 x ; x ; x ;3 x ;4 W = dag y ; : : : y ;T x ; : : : x ;T HIGHER-ORDER DYNAMIC PANEL 49
150 The stadard errors ca be computed usg dav ^GMM = ^ v (Y Y X ) W W (I G) W W (Y Y X ) ; where, for example, ^ v = dv dv; (T 3) 3 ad d V are resduals from the FD-trasformed system. 5 PANEL DATA
14. NONPARAMETRIC ESTIMATION

14.1 Nonparametric regression with discrete regressor

Fix $a_{(j)}$, $j=1,\ldots,k$. Observe that
$$g(a_{(j)})=E\bigl[y\,\big|\,x=a_{(j)}\bigr]=\frac{E\bigl(y\,I[x=a_{(j)}]\bigr)}{E\bigl(I[x=a_{(j)}]\bigr)}$$
because of the following equalities:
$$E\bigl[I[x=a_{(j)}]\bigr]=1\cdot P\{x=a_{(j)}\}+0\cdot P\{x\neq a_{(j)}\}=P\{x=a_{(j)}\},$$
$$E\bigl[y\,I[x=a_{(j)}]\bigr]=E\bigl[y\,I[x=a_{(j)}]\,\big|\,x=a_{(j)}\bigr]P\{x=a_{(j)}\}=E\bigl[y\,\big|\,x=a_{(j)}\bigr]P\{x=a_{(j)}\}.$$
According to the analogy principle we can construct $\hat g(a_{(j)})$ as
$$\hat g(a_{(j)})=\frac{\sum_{i=1}^n y_i\,I[x_i=a_{(j)}]}{\sum_{i=1}^n I[x_i=a_{(j)}]}.$$
Now let us find its properties. First, according to the LLN,
$$\hat g(a_{(j)})=\frac{n^{-1}\sum_{i=1}^n y_i\,I[x_i=a_{(j)}]}{n^{-1}\sum_{i=1}^n I[x_i=a_{(j)}]}\ \overset{p}{\to}\ \frac{E\bigl[y\,I[x=a_{(j)}]\bigr]}{E\bigl[I[x=a_{(j)}]\bigr]}=g(a_{(j)}).$$
Second,
$$\sqrt n\bigl(\hat g(a_{(j)})-g(a_{(j)})\bigr)=\frac{n^{-1/2}\sum_{i=1}^n\bigl(y_i-E[y|x=a_{(j)}]\bigr)I[x_i=a_{(j)}]}{n^{-1}\sum_{i=1}^n I[x_i=a_{(j)}]}.$$
According to the CLT,
$$\frac1{\sqrt n}\sum_{i=1}^n\bigl(y_i-E[y|x=a_{(j)}]\bigr)I[x_i=a_{(j)}]\ \overset{d}{\to}\ N(0,\omega),$$
where
$$\omega=V\bigl[\bigl(y-E[y|x=a_{(j)}]\bigr)I[x=a_{(j)}]\bigr]
=E\bigl[\bigl(y-E[y|x=a_{(j)}]\bigr)^2\,\big|\,x=a_{(j)}\bigr]P\{x=a_{(j)}\}
=V\bigl[y|x=a_{(j)}\bigr]P\{x=a_{(j)}\}.$$
Thus
$$\sqrt n\bigl(\hat g(a_{(j)})-g(a_{(j)})\bigr)\ \overset{d}{\to}\ N\!\left(0,\ \frac{V\bigl[y|x=a_{(j)}\bigr]}{P\{x=a_{(j)}\}}\right).$$

14.2 Nonparametric density estimation

(a) Use the hint that $E\bigl[I[x_i\le x]\bigr]=F(x)$ to prove the unbiasedness of the estimator:
$$E\bigl[\hat F(x)\bigr]=E\Bigl[\frac1n\sum_{i=1}^n I[x_i\le x]\Bigr]=\frac1n\sum_{i=1}^n E\bigl[I[x_i\le x]\bigr]=\frac1n\sum_{i=1}^n F(x)=F(x).$$
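The analogy-principle estimator above is simply a cell mean, and its standard error follows from the asymptotic variance $V[y|x=a_{(j)}]/P\{x=a_{(j)}\}$. A minimal sketch on simulated data (the regression function and support points are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
support = np.array([0.0, 1.0, 2.0])                    # the support points a_(1), ..., a_(k)
x = rng.choice(support, size=n, p=[0.5, 0.3, 0.2])
y = np.sin(x) + rng.normal(scale=0.5, size=n)          # g(a) = sin(a), for illustration

for a in support:
    mask = (x == a)
    g_hat = y[mask].mean()                             # sum of y*1[x=a] over sum of 1[x=a]
    se = y[mask].std(ddof=1) / np.sqrt(mask.sum())     # implied by N(0, V[y|x=a]/P{x=a}) at rate sqrt(n)
    print(a, g_hat, se, np.sin(a))
```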
152 (b) Use the Taylor expaso F (x + h) = F (x) + hf(x) + h f (x) + o(h ) to see that the bas of ^f (x) s h E ^f (x) f(x) = h (F (x + h) F (x)) f(x) Therefore, a = : = h (F (x) + hf(x) + h f (x) + o(h ) F (x)) f(x) = hf (x) + o(h): (c) Use the Taylor expasos F x + h = F (x) + h f(x) + h f (x) + h 3 6 f (x) + o(h 3 ) h ad F x = F (x) h f(x) + h f h 3 (x) 6 f (x) + o(h 3 ) to see that the bas of ^f (x) s h E ^f (x) Therefore, b = : f(x) = h (F (x + h=) F (x h=)) f(x) = 4 h f (x) + o(h ): 4.3 Nadaraya Watso desty estmator Recall that It s easy to derve that p ^f(x) f(x) = p (K h (x x) f(x)) : = E [K h (x x) f(x)] = O(h ); ad t ca smlarly be show that h V [K h (x x) f(x)] = E K h (x x) + f(x) f(x)e [K h (x x)] E [K h (x x) f(x)] Z = h K(u) (f(x) + O(h)) du + O() O(h ) = h f(x)r K + O(): To get o-trval asymptotc bas, we should work o the bas term further: Z E [K h (x x) f(x)] = K(u) f(x) + huf (x) + (hu) f (x) + O(h 3 ) du f(x) = h f (x) K + O(h 3 ): Summarzg, by (some varat of) CLT we have, as! ad h! ; h!, that p d! h ^f(x) f(x) N f (x) K; f(x)r K ; p provded that lm h 5 exsts ad s te.! 5 NONPARAMETRIC ESTIMATION
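The second-order bias of the kernel density estimator is easy to see in a small simulation: averaging over replications isolates the bias, which falls by roughly a factor of four each time the bandwidth is halved. This is a sketch only; the N(0,1) target and the Gaussian kernel are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def kde(x0, data, h):
    """Kernel density estimator (nh)^{-1} * sum_i K((x0 - x_i)/h) with a Gaussian kernel."""
    u = (x0 - data) / h
    return np.exp(-0.5 * u ** 2).mean() / (h * np.sqrt(2 * np.pi))

x0 = 0.5
true_f = np.exp(-0.5 * x0 ** 2) / np.sqrt(2 * np.pi)    # N(0,1) density at x0
for h in (0.8, 0.4, 0.2):
    # average many replications to separate the bias from the sampling noise
    bias = np.mean([kde(x0, rng.normal(size=2000), h) - true_f for _ in range(300)])
    print(h, bias)   # shrinks roughly by a factor of 4 as h is halved, matching the h^2 leading term
```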
The asymptotic bias is proportional to $f''(x)$, the amount of which indicates how the density in the neighborhood of $x$ differs from the density at $x$, which is being estimated. Note that the asymptotic bias does not depend on $f(x)$, i.e. how often observations fall into this region, and on $f'(x)$, i.e. how the density to the left and that to the right of $x$ differ; indeed, both are equally irrelevant to estimation of the density (unlike in the regression). The asymptotic variance is proportional to $f(x)$, the density at $x$, which may seem counterintuitive (the higher the frequency of observations, the poorer the estimation quality). Recall, however, that we are estimating $f(x)$, so its higher value implies more dispersion of its estimate around that value, and this effect prevails (the frequency effect gives a factor proportional to $f(x)^{-1}$; the scale effect gives a factor proportional to $f(x)^2$).

14.4 First difference transformation and nonparametric regression

1. Let us consider the following average that can be decomposed into three terms:
$$\frac1{2n}\sum_{i=2}^n(y_i-y_{i-1})^2
=\frac1{2n}\sum_{i=2}^n\bigl(g(z_i)-g(z_{i-1})\bigr)^2
+\frac1{2n}\sum_{i=2}^n(e_i-e_{i-1})^2
+\frac1n\sum_{i=2}^n\bigl(g(z_i)-g(z_{i-1})\bigr)(e_i-e_{i-1}).$$
Since the $z_i$ compose a uniform grid and are in increasing order, i.e. $z_i-z_{i-1}=1/n$, we can find the limit of the first term using the Lipschitz condition:
$$\frac1{2n}\sum_{i=2}^n\bigl(g(z_i)-g(z_{i-1})\bigr)^2\le\frac1{2n}\sum_{i=2}^nG^2(z_i-z_{i-1})^2=\frac{G^2(n-1)}{2n^3}\ \underset{n\to\infty}{\longrightarrow}\ 0.$$
Using the Lipschitz condition again we can find the probability limit of the third term:
$$\Bigl|\frac1n\sum_{i=2}^n\bigl(g(z_i)-g(z_{i-1})\bigr)(e_i-e_{i-1})\Bigr|
\le\frac{G}{n}\sum_{i=2}^n\frac1n|e_i-e_{i-1}|
\le\frac{G}{n}\cdot\frac1n\sum_{i=2}^n\bigl(|e_i|+|e_{i-1}|\bigr)\ \overset{p}{\to}\ 0$$
since $G/n\to0$ and $n^{-1}\sum_{i=2}^n(|e_i|+|e_{i-1}|)\overset{p}{\to}2E|e_i|<\infty$. The second term has the following probability limit:
$$\frac1{2n}\sum_{i=2}^n(e_i-e_{i-1})^2=\frac1{2n}\sum_{i=2}^n\bigl(e_i^2-2e_ie_{i-1}+e_{i-1}^2\bigr)\ \overset{p}{\to}\ E\bigl[e^2\bigr]=\sigma^2.$$
Thus the estimator of $\sigma^2$ whose consistency is proved by the previous manipulations is
$$\hat\sigma^2=\frac1{2n}\sum_{i=2}^n(y_i-y_{i-1})^2.$$
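The consistency of $\hat\sigma^2=(2n)^{-1}\sum_{i\ge2}(y_i-y_{i-1})^2$ is easy to confirm numerically on a fixed uniform design. A sketch, with an illustrative smooth $g$ and noise level:

```python
import numpy as np

rng = np.random.default_rng(5)
n, sigma = 1000, 0.7
z = np.linspace(0.0, 1.0, n)                          # uniform grid in increasing order
y = np.sin(3 * z) + rng.normal(scale=sigma, size=n)   # smooth g(z) plus iid noise

sigma2_hat = np.sum(np.diff(y) ** 2) / (2 * n)        # (1/2n) * sum of squared first differences
print(sigma2_hat, sigma ** 2)                         # the differenced g contributes only O(1/n^2)
```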
154 whch ca be rewrtte as y = x + g(z ) + e : The cosstecy of the followg estmator for ^ = ca be proved the stadard way: ^ = x x = x x =!!! x y =! x (g(z ) + e ) P Here = x x has some o-zero probablty lmt, P = x p e! sce! E[e jx ; z ] =, ad P = x g(z ) G P = jx p j!. Now we ca use! stadard oparametrc tools for the regresso = y x ^ = g(z ) + e ; where e = e + x ( algebrac smplcty): ^): Cosder the followg estmator (we use the uform kerel for P = dg(z) = (y x ^)I [jz zj h] P = I [jz : zj h] It ca be decomposed to three terms: dg(z) = P = g(z ) + x ( ^) + e I [jz zj h] P = I [jz zj h] The rst term gves g(z) the lmt. To show ths, use Lpschtz codto: P = (g(z ) g(z))i [jz zj h] P = I [jz zj h] Gh; ad troduce the asymptotcs for the smoothg parameter: h!. The P = g(z )I [jz zj h] P = I [jz zj h] = P = (g(z) + g(z ) g(z))i [jz zj h] P = I [jz zj h] = = g(z) + P = (g(z ) g(z))i [jz zj h] P = I [jz zj h]! g(z):! The secod ad the thrd terms have zero probablty lmt f the codto h! s sats ed P = x I [jz zj h] p P = I [jz ( ^)! zj h] {z }! {z } # p E [x ] # p ad P = e I [jz zj h] P = I [jz zj h] p!! E [e ] = : Therefore, d g(z) s cosstet whe! ; h! ; h! : 54 NONPARAMETRIC ESTIMATION
155 4.5 Ubasedess of kerel estmates Recall that ^g (x) = P = y K h (x x) P = K h (x x) ; so E [^g (x)] = E = E P = y K h (x x) P = K h (x E P = E [y jx ] K h (x x) P = K h (x x) x) jx ; : : : ; x = E P = ck h (x x) P = K = c; h (x x).e. ^g (x) s ubased for c = g (x) : The reaso s smple: all pots the sample are equally relevat estmato of ths trval codtoal mea, so bas s ot duced whe pots far from x are used estmato. The local lear estmator wll be ubased f g (x) = a + bx: The all pots the sample are equally relevat estmato sce t s a lear regresso, albet locally, s ru. Ideed, P = ^g (x) = y + (y y) (x x) K h (x x) P = (x x) (x x) ; K h (x x) so so E [^g (x)] = E [E [yjx ; : : : ; x ]] " " P # # = +E E (y y) (x x) K h (x x) P = (x x) jx ; : : : ; x (x x) K h (x x) = E [a + bx] " " P # # = +E E (a + bx a bx) (x x) K h (x x) P = (x x) jx ; : : : ; x (x x) K h (x x) = E [a + bx + b (x x)] = a + bx: As far as the desty s cocered, ubasedess s ulkely. Ideed, recall that ^f (x) = K h (x x) ; = h E ^f (x) = E [K h (x x)] = Z h K x h x f (x) dx: Ths expectato heavly depeds o the badwdth ad kerel fucto, ad barely wll t equal f (x) ; except uder specal crcumstaces (e.g., uform f (x), x far from boudares, etc.). 4.6 Shape restrcto The CRS techology has the property that f (l; k) = kf l k ; : UNBIASEDNESS OF KERNEL ESTIMATES 55
The regression in terms of the rescaled variables is
$$\frac{y_i}{k_i}=f\Bigl(\frac{l_i}{k_i},1\Bigr)+\frac{\varepsilon_i}{k_i}.$$
Therefore, we can construct the (one-dimensional!) kernel estimate of $f(l,k)$ as
$$\hat f(l,k)=k\,\frac{\sum_{i=1}^n\dfrac{y_i}{k_i}K_h\Bigl(\dfrac{l_i}{k_i}-\dfrac{l}{k}\Bigr)}{\sum_{i=1}^nK_h\Bigl(\dfrac{l_i}{k_i}-\dfrac{l}{k}\Bigr)}.$$
In effect, we are using the sample points, giving higher weight to those that are close to the ray $l/k$.

14.7 Nonparametric hazard rate

(i) A simple nonparametric estimator for $F(t)\equiv\Pr\{z\le t\}$ is the sample frequency
$$\hat F(t)=\frac1n\sum_{j=1}^n I[z_j\le t].$$
By the law of large numbers, it is consistent for $F(t)$. By the central limit theorem, its rate of convergence is $\sqrt n$. This will be helpful later.

(ii) We derived in class that
$$E\Bigl[\frac1h\,k\Bigl(\frac{z_j-t}{h}\Bigr)\Bigr]-f(t)=O\bigl(h^2\bigr)
\quad\text{and}\quad
V\Bigl[\frac1h\,k\Bigl(\frac{z_j-t}{h}\Bigr)\Bigr]=\frac1hR_kf(t)+O(1),$$
where $R_k=\int k(u)^2\,du$. By the central limit theorem applied to
$$\sqrt{nh}\bigl(\hat f(t)-f(t)\bigr)=\sqrt{nh}\,\frac1n\sum_{j=1}^n\Bigl(\frac1h\,k\Bigl(\frac{z_j-t}{h}\Bigr)-f(t)\Bigr),$$
we get
$$\sqrt{nh}\bigl(\hat f(t)-f(t)\bigr)\ \overset{d}{\to}\ N\bigl(0,R_kf(t)\bigr).$$

(iii) By the analogy principle,
$$\hat H(t)=\frac{\hat f(t)}{1-\hat F(t)}.$$
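The shape-restricted estimator above reduces a two-dimensional problem to a one-dimensional kernel regression in the ratio $l/k$. A sketch of that construction (the CRS technology, noise level, and the Gaussian kernel are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1500
l, k = rng.uniform(1, 5, n), rng.uniform(1, 5, n)
y = k * (l / k) ** 0.6 + rng.normal(scale=0.2, size=n)   # CRS technology f(l,k) = k*(l/k)^0.6 plus noise

def f_hat(l0, k0, h=0.15):
    """CRS-restricted kernel estimator: one-dimensional smoothing in the ratio l/k."""
    r, r0 = l / k, l0 / k0
    w = np.exp(-0.5 * ((r - r0) / h) ** 2)               # Gaussian kernel weights along the ray
    return k0 * np.sum(w * y / k) / np.sum(w)

print(f_hat(2.0, 3.0), 3.0 * (2.0 / 3.0) ** 0.6)         # estimate versus the true value
```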
157 It s cosstet for H(t) ad p h ^H(t) H(t) = p! ^f(t) f(t) h ^F (t) F (t) = p! ^f(t)( F (t)) f(t)( ^F (t)) h ( ^F (t))( F (t)) p h ( = ^f(t) f(t)) + p p ( ^F (t) F (t)) h f(t) ^F (t) ( ^F (t))( F (t)) d! F (t) N (; R kf(t)) + f(t) = N ; R k ( F (t)) : The reaso of the fact that ucertaty ^F (t) does ot a ect the asymptotc dstrbuto of ^H(t) s that ^F (t) coverges wth faster rate tha ^f(t) does. 4.8 Noparametrcs ad perfect t Whe the varace of the error s zero, ^g (x) g(x) = (h) P = (g(x x ) g(x)) K (h) P = K x x h x h : There s o usual source of varace (regresso errors), so the varace should come from the varace of x s. The deomator coverges to f(x) whe! ; h! : Cosder the umerator, whch we deote by ^q(x); ad whch s a average of IID radom varables, say. We derved class that E [^q(x)] = h B(x)f(x) + o(h ); V [^q(x)] = o((h) ): Now we eed to look closer at the varace. Now, E x Z = h (g(x ) g(x)) x K f(x )dx h Z = h (g(x + hu) g(x)) K (u) f(x + hu)du = Z h g (x)hu + o(h) K (u) (f(x) + o(h)) du Z = hg (x) f(x) u K (u) du + o(h); so V [ ] = E E [ ] = hg (x) f(x) K + o(h); NONPARAMETRICS AND PERFECT FIT 57
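The estimator $\hat H(t)=\hat f(t)/(1-\hat F(t))$ combines a kernel density estimate with the empirical cdf, and, as noted above, the cdf part does not affect the limiting distribution because it converges at the faster $\sqrt n$ rate. A small sketch with an illustrative exponential sample, whose true hazard is constant:

```python
import numpy as np

rng = np.random.default_rng(7)
z = rng.exponential(scale=2.0, size=3000)       # true hazard is constant at 1/2

def hazard_hat(t, h=0.3):
    """H_hat(t) = f_hat(t) / (1 - F_hat(t)): Gaussian-kernel density over one minus the empirical cdf."""
    u = (t - z) / h
    f_hat = np.exp(-0.5 * u ** 2).mean() / (h * np.sqrt(2 * np.pi))
    F_hat = (z <= t).mean()
    return f_hat / (1.0 - F_hat)

for t in (0.5, 1.0, 2.0, 4.0):
    print(t, hazard_hat(t))                     # values hover around 0.5
```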
158 where K = R u K (u) du: Hece, by some CLT, p h (h) (g(x ) = d! N ; g (x) f(x) K : g(x)) K x h x h B(x)f(x) + o(h )! Let = p lm h 3 ; assumg < : The,!;h! p h (^g (x) g(x))! d N B(x); g (x) f(x) K : 4.9 Noparametrcs ad extreme observatos (a) Whe x s very bg, we wll have a bg uece of ths observato whe estmatg the regresso the rght rego of the support. Wth bouded kerels, we may d that o observatos fall to the wdow. Wth ubouded kerels, the estmated regresso le wll go almost through (x ; y ): (b) Whe y s very bg, the estmated regresso le wll have a hump the rego cetered at x : 58 NONPARAMETRIC ESTIMATION
159 5. CONDITIONAL MOMENT RESTRICTIONS 5. Usefuless of skedastc fucto Deote = ad e = y x : The momet fucto s m y x m(y; x; ) = = (y x ) h(x; ; ) m The geeral theory for the codtoal momet restrcto E [m(w; )jx] = gves the optmal restrcto E D(x) (x) m(w; = ; where D(x) = jx ad (x) = E[mm jx]: The varace of the optmal estmator s V = E D(x) (x) D(x) : For the problem at D(x) = jx x x = E ex + h h jx = h h ; (x) = E[mm e jx] = E e(e h) e e e(e h) (e h) jx = E 3 e 3 (e h) jx ; sce E[exjx] = ad E[ehjx] = : Let (x) det (x) = E[e jx]e[(e h) jx] (E[e 3 jx]) : The verse of s (x) = (e (x) E h) e 3 e 3 e jx ; ad the asymptotc varace of the e cet GMM estmator s V = E D(x) (x) D(x) A B = B C ; where " (e h) xx e 3 (xh A = E + h # x ) + e h h ; (x) " # e 3 h x + e h h e h h B = E ; C = E : (x) (x) Usg the formula for verso of the parttoed matrces, d that (A B V = C B) ; where s deote submatrces whch are ot of terest. xx To aswer the problem we eed to compare V = (A B C B) wth V = E ; h the varace of the optmal GMM estmator costructed wth the use of m oly. We eed to show that V V ; or, alteratvely, V V : Note that V V = ~ A B C B; CONDITIONAL MOMENT RESTRICTIONS 59
160 where A ~ = A V ca be smpl ed to " xx ~A E e 3 jx = E (x) E [e jx] e 3 (xh + h x ) + e h h!# : Next, we ca use the followg represetato: ~A B C B = E[ww ]; where w = E e 3 jx x E e jx h p E [e jx] p (x) + B C h s E [e jx] (x) : Ths represetato cocludes that V V ad gves the codto uder whch V = V : Ths codto s w(x) = almost surely. It ca be wrtte as E e 3 jx E [e jx] x = h B C h almost surely. Cosder the specal cases.. h =. The the codto mod es to E e 3 jx E [e jx] x = e 3 E h x (x) E e h h h almost surely. (x). h = ad the dstrbuto of e codtoal o x s symmetrc. The the prevous codto s sats ed automatcally sce E e 3 jx = : 5. Symmetrc regresso error Part. The mataed hypothess s E [ejx] = : We ca use the ull hypothess H : E e 3 jx = to test for the codtoal symmetry. We could addto use more codtoal momet restrctos (e.g., volvg hgher odd powers) to crease the power of the test, but te samples that would probably lead to more dstorted test szes. The alteratve hypothess s H : E e 3 jx 6= : A estmator that s cosstet uder both H ad H s, for example, the OLS estmator ^ OLS : The estmator that s cosstet ad asymptotcally e cet ( the same class where ^ OLS belogs) uder H ad (hopefully) cosstet uder H s the strumetal varables (GMM) estmator ^ OIV that uses the optmal strumet for the system E [ejx] = ; E e 3 jx =. We derved class that the optmal ucodtoal momet restrcto s h E a (x) (y x) + a (x) (y x) 3 = ; where a (x) = a (x) x 6 (x) (x) 6 (x) 4 (x) 3 (x) 4 (x) 3 (x) 4 (x) ad r (x) = E [(y x) r jx] ; r = ; 4; 6. To costruct a feasble ^ OIV ; oe eeds to rst estmate r (x) at the pots x of the sample. Ths may be doe oparametrcally usg earest eghbor, 6 CONDITIONAL MOMENT RESTRICTIONS
161 seres expaso or other approaches. Deote the resultg estmates by ^ r (x ); = ; : : : ; ; r = ; 4; 6 ad compute ^a (x ) ad ^a (x ); = ; : : : ;. The ^ OIV s a soluto of the equato = ^a (x ) (y ^ OIV x ) + ^a (x ) (y ^ OIV x ) 3 = ; whch ca be tured to a optmzato problem, f coveet. The Hausma test statstc s the where ^V OLS e cecy boud H = (^ OLS ^ OIV ) ^V OLS ^VOIV d! ; = P = x P = x (y ^ OLS x ) ad ^V OIV s a cosstet estmate of the x ( V OIV = E 6 (x) 6 (x) 4 (x) (x)) (x) 6 (x) 4 (x) : Note that the costructed Hausma test wll ot work f ^ OLS s also asymptotcally e cet, whch may happe f the thrd-momet restrcto s redudat ad the error s codtoally homoskedastc so that the optmal strumet reduces to the oe mpled by OLS. Also, the test may be cosstet (.e., asymptotcally have power less tha ) f ^ OIV happes to be cosstet uder codtoal o-symmetry too. Part. Uder the assumpto that ejx N (; ); rrespectve of whether s kow or ot, the QML estmator ^ QML cocdes wth the OLS estmator ad thus has the same asymptotc dstrbuto p E hx (y x) (^QML )! d A (E [x ]) : 5.3 Optmal strumetato of cosumpto fucto As the error has a martgale d erece structure, the optmal strumet s (up to a factor of proportoalty) t _ E e t jy t ; c t ; : : E [y t jy t ; c t ; : : :] A : E [y t l y tjy t ; c t ; : : :] Oe could rst estmate ; ad usg the strumet vector (for example) (; y t ; y t ): Usg these, oe could get feasble versos of e t ad y t : The oe could do oe s best to parametrcally t feasble e t ; y t ad y t l y t to recet varables fy t ; c t ; y t ; c t ; : : :g ; employg autoregressve structures (e.g., facy GARCH for e t ). The tted values the could be used to form feasble t to be evetually used to IV-estmate the parameters. Naturally, because auxlary parameterzatos are used, such strumet wll be oly early optmal. I partcular, the asymptotc varace has to be computed a sadwch form. OPTIMAL INSTRUMENTATION OF CONSUMPTION FUNCTION 6
162 5.4 Optmal strumet AR-ARCH model Let us for coveece vew a typcal elemet of Z t as P =! " t ; ad let the optmal strumet be t = P = a " t :The optmalty codto s E[v t x t ] = E v t t " t for all v t Z t : Sce t should hold for ay v t Z t ; let us make t hold for v t = " t j ; j = ; ; : : : : The we get a system of equatos of the type E[" t j x t X ] = E h" t j = a " t " t : The left-had sde s just j because x t = P = " t ad because E[" h t ] = : I the rghthad sde, all terms are zeros due to codtoal symmetry of " t, except a j E : Therefore, a j = j + j ( ) ; where = E " 4 t : Ths follows from the ARCH() structure: " t j " t E " t j" t = E " t j E[" t ji t ] = E " t j ( ) + " t = ( ) + E " t j+ " t ; so that we ca recursvely obta Thus the optmal strumet s t = = X = + ( ) " t = E " t j" t = j + j : x t + ( ) + ( )( ) X () ( + ( )) ( + ( )) x t : = To costruct a feasble estmator, set ^ to be the OLS estmator of, ^ to be the OLS estmator of the model ^" t = (^" t ) + t ; ad compute ^ = T P T t= ^"4 t : The optmal strumet based o E[" t ji t ] = uses a large set of allowable strumets, relatve to whch our Z t s extremely th. Therefore, we ca expect bg losses e cecy comparso wth what we could get. I fact, calculatos for emprcally relevat sets of parameter values reveal that ths tuto s correct. Weghtg by the skedastc fucto s much more powerful tha tryg to capture heteroskedastcty by usg a te hstory of the basc strumet a lear fasho. 5.5 Optmal strumet AR wth olear error. We look for a optmal combato of " t ; " t ; : : : ; say t = X " t ; = 6 CONDITIONAL MOMENT RESTRICTIONS
163 whe the class of allowable strumets s ( Z t = & t = ) X " t : = The optmalty codto for each " t r ; r ; s or 8 r E [" t r x t ] = E " t r t " t ; "! # X 8 r r = E " t r " t " t : Whe r > ; ths mples r = r; whle whe r = ; we have = E t t t t = E 4 : The optmal strumet the s = t = E 4 "t + = E 4 X r " t = = E 4 " t + X r " t = (x t x t ) + x t = E 4 xt + E 4 x t ; ad employs oly two lags of x t : To costruct a feasble verso, oe eeds to rst cosstetly estmate (say, by OLS), ad E 4 (say, by a sample aalog to E " t " t ).. The optmal strumet based o ths codtoal momet restrcto s Chamberla s oe: t _ x t E " t ji ; t where E " t ji t = E t t jx t ; x t ; : : : = E E t j t ; x t ; x t ; : : : t jx t ; x t ; : : : = E t jx t ; x t ; : : : " = E t jx t ; x t ; : : : = : : : = " t " t 3 " t 5 : : : t " t " t 4 " t 6 : : :: Ths, apparetly, requres kowledge of all past errors, ad ot oly of ther te umber. Hece, t s problematc to costruct a feasble verso from a te sample. 5.6 Optmal IV estmato of a costat From the DGP t follows that the momet fucto s codtoally (o y t p ; y t p ; : : :) homoskedastc. Therefore, the optmal strumet s Hase s (985) (L) t = E (L ) jy t p ; y t p ; : : : ; or (L) t = () : Ths s a determstc recurso. Sce the strumet we are lookg for should be statoary, t has to be a costat. Sce the value of the costat does ot matter, the optmal strumet may be take as uty. OPTIMAL IV ESTIMATION OF A CONSTANT 63
164 5.7 Negatve bomal dstrbuto ad PML Yes, t does. The desty belogs to the lear expoetal class, wth C(m) = log m The form of the skedastc fucto s mmateral for the cosstecy property to hold. log(a + m): 5.8 Nestg ad PML Ideed, the example, h (z; ; &) reduces to expoetal dstrbuto whe & = ; ad the expoetal pseudodesty cosstetly estmates : Whe & s estmated, the vector of parameters s (; &) ; ad the pseudoscore log h (z; ; (; &) = log & + & log + & + & z= & : : : & log + (& ) log z + & z= (; &) &= : Observe that the pseudoscore for has zero expectato f ad oly f E + & z= & = : Ths obvously holds whe & = ; but may ot hold for other &: For stace, whe & = ; we kow that E z = (:5) = (:5) = :5 = (:5) ; whch cotradcts the zero expected pseudoscore. The pseudotrue value & of & ca be obtaed by solvg the system of zero expected pseudoscore, but t s very ulkely that t equals. The result ca be explaed by the presece of a extraeous parameter whose pseudotrue value has othg to do wth the problem ad whose estmato adversely a ects estmato of the quatty of terest. 5.9 Msspec cato varace The log pseudodesty o whch the proposed PML estmator reles has the form log (y; m) = log p log (y m m) + m ; whch does ot belog to the lear expoetal famly of destes (the term y =m does ot t). Therefore, the PML estmator s ot cosstet except by mprobable chace. The cosstecy ca be show drectly. Cosder a specal case of o regressors ad estmato of mea: y N ; ; whle the pseudodesty s The the pseudotrue value of s " = arg max E log (y ) y N ; : # = arg max It s easy to see by d eretatg that s ot utl by chace = : ( ) log + ( ) : 64 CONDITIONAL MOMENT RESTRICTIONS
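The consistency logic running through these PML answers, namely that a linear-exponential-family pseudodensity recovers the conditional mean parameters however badly the rest of the distribution is misspecified, can be illustrated with the Poisson pseudoscore. This is a sketch under an invented, deliberately non-Poisson data-generating process:

```python
import numpy as np

rng = np.random.default_rng(8)
n, beta = 5000, np.array([0.5, -1.0])
x = np.column_stack([np.ones(n), rng.normal(size=n)])
m = np.exp(x @ beta)                                              # conditional mean exp(x'beta)
y = m * rng.lognormal(sigma=0.8, size=n) / np.exp(0.8 ** 2 / 2)   # E[y|x] = m, but y is continuous, not Poisson

b = np.zeros(2)
for _ in range(25):                        # Newton steps on the Poisson pseudoscore sum x*(y - exp(x'b))
    mu = np.exp(x @ b)
    score = x.T @ (y - mu)
    hess = -(x * mu[:, None]).T @ x
    b = b - np.linalg.solve(hess, score)
print(b, beta)                             # Poisson PML recovers beta despite the misspecified density
```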
165 5. Mod ed Posso regresso ad PML estmators Part. The mea regresso fucto s E[yjx] = E[E[yjx; "]jx] = E[exp(x + ")jx] = exp(x ): The skedastc fucto s V[yjx] = E[(y E[yjx]) jx] = E[y jx] E[yjx] : Because E y jx = E E y jx; " jx = E exp(x + ") + exp(x + ")jx = exp(x )E (exp ") jx + exp(x ) = ( + ) exp(x ) + exp(x ); we have V [yjx] = exp(x ) + exp(x ): Part. Use the formula for asymptotc varace of NLLS estmator V NLLS = Q gg Q gge Q gg ; where Q gg = )=@ ad Q gge = )=@ (y g(x; )) : I our problem g(x; ) = exp(x ) ad Q gg = E [xx exp(x )] ; Q gge = E xx exp(x )(y exp(x )) = E xx exp(x )V[yjx] = E xx exp(x )( exp(x ) + exp(x )) = E xx exp(3x ) + E xx exp(4x ) : To d the expectatos we use the formula E[xx exp(x )] = exp( )(I + ): Now, we have Q gg = exp( )(I + 4 ) ad Q gge = exp( 9 )(I + 9 ) + exp(8 )(I + 6 ): Fally, V NLLS = (I + 4 ) exp( )(I + 9 ) + exp(4 )(I + 6 ) (I + 4 ) : The formula for asymptotc varace of WNLLS estmator s V W NLLS = Q gg= ; where Q gg= = E )=@ : I ths problem whch ca be rearraged as Q gg= = E xx exp(x )( exp(x ) + exp(x )) ; V W NLLS = I xx E + exp(x : ) Part 3. We use the formula for asymptotc varace of PML estmator: V P ML = J IJ ; where " J = ; I = E 3 : m(x; ) I ths problem m(x; ) = exp(x ) ad (x; ) = exp(x ) + exp(x ): (a) For the ormal dstrbuto C(m) = m; = ad V NP ML = V NLLS : (b) For the Posso dstrbuto C(m) = log m; = =m, J = E[exp( x )xx exp(x )] = exp( )(I + ); I = E[exp( x )( exp(x ) + exp(x ))xx exp(x )] = exp( )(I + ) + exp( )(I + 4 ): MODIFIED POISSON REGRESSION AND PML ESTIMATORS 65
166 Fally, V P P ML = (I + ) exp( )(I + ) + exp( )(I + 4 ) (I + ) : (c) For the Gamma dstrbuto C(m) = =m; = =m ; J = E[ exp( x )xx exp(x )] = I; I = E[exp( 4x )( exp(x ) + exp(x ))xx exp(x )] = I + exp( )(I + ): Fally, V GP ML = I + exp( )(I + ): Part 4. We have the followg varaces: V NLLS = (I + 4 ) exp( )(I + 9 ) + exp(4 )(I + 6 ) (I + 4 ) ; V W NLLS = I xx E + exp(x ; ) V NP ML = V NLLS ; V P P ML = (I + ) exp( V GP ML = I + exp( )(I + ): )(I + ) + exp( )(I + 4 ) (I + ) ; From the theory we kow that V W NLLS V NLLS : Next, we kow that the class of PML estmators the e cecy boud s acheved m(x; ) s proportoal to (x; ); the the boud V[yjx] whch s equal to V W NLLS our case. So, we have V W NLLS V P P ML ad V W NLLS V GP ML : The comparso of other varaces s ot straghtforward. Cosder the oe-dmesoal case. The we have V NLLS = e = ( + 9 ) + e 4 ( + 6 ) ( + 4 ; ) V W NLLS = x E + ; exp(x) V NP ML = V NLLS ; V P P ML = e = ( + ) + e ( + 4 ) ( + ) ; V GP ML = + e = ( + ): We ca calculate these (except V W NLLS ) for varous parameter sets. For example, for = : ad = :4 V NLLS < V P P ML < V GP ML ; for = : ad = : V P P ML < V NLLS < V GP ML ; for = ad = :4 V GP ML < V P P ML < V NLLS ; for = :5 ad = :4 V P P ML < V GP ML < V NLLS : However, t appears mpossble to make V NLLS < V GP ML < V P P ML or V GP ML < V NLLS < V P P ML : 66 CONDITIONAL MOMENT RESTRICTIONS
167 5. Optmal strumet ad regresso o costat Part. We have the followg momet fucto: m(x; y; ) = (y ; (y ) x ) wth = ;. The optmal ucodtoal momet restrcto s E[A (x)m(x; y; )] = ; where A (x) = D (x)(x) ; D(x) = y; )=@ jx ; (x) = E[m(x; y; )m(x; y; ) jx]: (a) For the rst momet restrcto m (x; y; ) = y we have D(x) = ad (x) = E (y ) jx = x ; therefore the optmal momet restrcto s y E x = : (b) For the momet fucto m(x; y; ) we have D(x) = x ; (x) = x 4 (x) x 4 4 ; where 4 (x) = E (y ) 4 jx : The optmal weghtg matrx s A B (x) x C x A : 4 (x) x 4 4 The optmal momet restrcto s E 6B 4@ y x (y ) x 4 (x) x 4 4 x Part. (a) The GMM estmator s the soluto of 3 C7 A5 = : X y x ^ = ) ^ = X y x, X : x The estmator for ca be draw from the sample aalog of the codto E (y ) = E x : ~ = X (y ^), X x : (b) The GMM estmator s the soluto of We have the same estmator for : X y x ^ (y ^) ^ x ^ 4 (x ) x 4 ^4 x C A = : ^ = X y x, X ; x OPTIMAL INSTRUMENT AND REGRESSION ON CONSTANT 67
168 ^ s the soluto of X (y ^) ^ x ^ 4 (x ) x 4 ^4 x = ; where ^ 4 (x) s o-parametrc estmator for 4 (x) = E (y or a seres estmator. Part 3. The geeral formula for the varace of the optmal estmator s V = E D (x)(x) D(x) : ) 4 jx ; for example, a earest eghbor (a) V^ = E x : Use stadard asymptotc techques to d V ~ = E (y ) 4 (E [x ]) 4 : (b) V (^;^ ) = B 4@ x x 4 4 (x) x C7C A5A = E x E x 4 4 (x) x 4 4 C A : Whe we use the optmal strumet, our estmator s more e cet, therefore V ~ > V^ : Estmators of asymptotc varace ca be foud through sample aalogs: ^V^ = ^ X x! ; V ~ = P (y ^) 4 P x ~ 4 ; V^ = X x 4 ^ 4 (x ) x 4 ^4! : Part 4. The ormal dstrbuto PML estmator s the soluto of the followg problem: d ( ) = arg max cost ; log X (y ) : P ML Solvg gves ^ P ML = ^ = X, y X x x ; ^ P ML = X (y ^) x Sce we have the same estmator for, we have the same varace V^ = E x. It ca be show that 4 (x) V^ = E 4 : x 4 x 68 CONDITIONAL MOMENT RESTRICTIONS
169 6. EMPIRICAL LIKELIHOOD 6. Commo mea. We have the followg momet fucto: m(x; y; ) = (x ; y ). The EL estmator s the soluto of the followg optmzato problem. X log p! max p ; subject to X p m(x ; y ; ) = ; X p = : Let be a Lagrage multpler for the restrcto P p m(x ; y ; ) =, the the soluto of the problem sats es p = + m(x ; y ; ) ; = X + m(x ; y ; ) m(x ; y ; ); = ; y ; ) + m(x ; y ; : I our case, = ( ; ) ; ad the system s p = + (x ) + (y ) ; = X + (x ) + (y ) = X + (x ) + (y ) : The asymptotc dstrbuto of the estmators s p (^EL ) d! N(; V ); where V = Q ; U = Q ad Q mm = x xy ; therefore xy y mm Q V = x y xy y + x xy ; U = x y ; p d! N (; U); V Q mm: I our case = y + x xy : Estmators for V ad U based o cosstet estmators for x, y ad xy ca be costructed from sample momets. EMPIRICAL LIKELIHOOD 69
2. The last equation of the system gives $\lambda_2=-\lambda_1\equiv-\lambda$, so we have
$$p_i=\frac1n\,\frac1{1+\lambda(x_i-y_i)},\qquad
0=\frac1n\sum_{i=1}^n\frac{x_i-y_i}{1+\lambda(x_i-y_i)}.$$
The EL estimator is
$$\hat\theta_{EL}=\frac1n\sum_{i=1}^n\frac{x_i}{1+\hat\lambda_{EL}(x_i-y_i)}
=\frac1n\sum_{i=1}^n\frac{y_i}{1+\hat\lambda_{EL}(x_i-y_i)},$$
where $\hat\lambda_{EL}$ is the solution of
$$\sum_{i=1}^n\frac{x_i-y_i}{1+\lambda(x_i-y_i)}=0.$$
Consider the linearized EL estimator. Linearization with respect to $\lambda$ around $0$ gives
$$p_i=\frac1n\bigl(1-\lambda(x_i-y_i)\bigr),\qquad
0=\frac1n\sum_{i=1}^n(x_i-y_i)\bigl(1-\lambda(x_i-y_i)\bigr),$$
and helps to find an approximate but explicit solution
$$\tilde\lambda_{AEL}=\frac{\sum_{i=1}^n(x_i-y_i)}{\sum_{i=1}^n(x_i-y_i)^2},\qquad
\tilde\theta_{AEL}=\frac{\sum_{i=1}^n\bigl(1-\tilde\lambda_{AEL}(x_i-y_i)\bigr)x_i}{\sum_{i=1}^n\bigl(1-\tilde\lambda_{AEL}(x_i-y_i)\bigr)}
=\frac{\sum_{i=1}^n\bigl(1-\tilde\lambda_{AEL}(x_i-y_i)\bigr)y_i}{\sum_{i=1}^n\bigl(1-\tilde\lambda_{AEL}(x_i-y_i)\bigr)}.$$
Observe that $\tilde\lambda_{AEL}$ is a normalized distance between the sample means of the $x$'s and the $y$'s, and $\tilde\theta_{AEL}$ is a weighted sample mean. The weights are such that the weighted mean of the $x$'s equals the weighted mean of the $y$'s. So, the moment restriction is satisfied in the sample. Moreover, the weight of observation $i$ depends on the distance between $x_i$ and $y_i$ and on how the signs of $x_i-y_i$ and $\bar x-\bar y$ relate to each other. If they have the same sign, then such an observation speaks against the hypothesis that the means are equal, thus the weight corresponding to this observation is relatively small. If they have opposite signs, such an observation supports the hypothesis that the means are equal, thus the weight corresponding to this observation is relatively large.

3. The technique is the same as in the EL problem. The Lagrangian is
$$\mathcal L=\sum_{i=1}^n p_i\log p_i+\mu\Bigl(1-\sum_{i=1}^n p_i\Bigr)-\lambda'\sum_{i=1}^n p_i\,m(x_i,y_i,\theta).$$
The first-order conditions are
$$\bigl(\log p_i+1\bigr)-\mu-\lambda'm(x_i,y_i,\theta)=0,\qquad
\sum_{i=1}^n p_i\,\lambda'\frac{\partial m(x_i,y_i,\theta)}{\partial\theta}=0.$$
The first equation together with the condition $\sum_ip_i=1$ gives
$$p_i=\frac{e^{\lambda'm(x_i,y_i,\theta)}}{\sum_{j=1}^n e^{\lambda'm(x_j,y_j,\theta)}}.$$
Also, we have $0=\sum_ip_i\,m(x_i,y_i,\theta)$. The system for $\lambda$ and $\theta$ that gives the ET estimator is
$$0=\sum_{i=1}^n e^{\lambda'm(x_i,y_i,\theta)}m(x_i,y_i,\theta),\qquad
0=\sum_{i=1}^n e^{\lambda'm(x_i,y_i,\theta)}\,\lambda'\frac{\partial m(x_i,y_i,\theta)}{\partial\theta}.$$
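The linearized (AEL) solution above is explicit and therefore trivial to compute; by construction its weighted means of the $x$'s and $y$'s coincide. A sketch on simulated data with a common mean (the distributions chosen are illustrative):

```python
import numpy as np

rng = np.random.default_rng(9)
n, theta = 400, 1.0
x = theta + rng.normal(scale=1.0, size=n)
y = theta + rng.normal(scale=2.0, size=n)       # same mean, different variances

d = x - y
lam = d.sum() / (d ** 2).sum()                  # linearized Lagrange multiplier
w = 1.0 - lam * d                               # approximate EL weights (up to the 1/n factor)
theta_ael = (w * x).sum() / w.sum()
print(theta_ael, (w * y).sum() / w.sum())       # the two weighted means coincide by construction
```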
171 I our smple case, ths system s = X e (x )+ (y ) x y ; = X e (x )+ (y ) ( + ): Here we have = = aga. The ET estmator s ^ET = P x e (x y ) P e(x y ) = P y e (x y ) P e(x y ) ; where s the soluto of X (x y )e (x y ) = : Note, that learzato of ths system gves the same result as EL case. Sce ET estmators are asymptotcally equvalet to EL estmators (the proof of ths fact s trval: the rst-order Taylor expaso of the ET system gves the same result as that of the EL system), there s o eed to calculate the asymptotc varaces, they are the same as part. 6. Kullback Lebler Iformato Crtero. Mmzato of h KLIC(e : ) = E e log e = X log s equvalet to maxmzato of P log whch gves the EL estmator.. Mmzato of gves the ET estmator. h KLIC( : e) = E log = X e log = 3. The kowledge of probabltes p gves the followg mod cato of EL problem: X p log p! m ; s.t. X = ; X m(z ; ) = : The soluto of ths problem sats es the followg system: p = + m(z ; ) ; = X p + m(z ; ) m(z ; ); = X p + m(z ; ; : The kowledge of probabltes p gves the followg mod cato of ET problem X log! m p ; s.t. X = ; X m(z ; ) = : KULLBACK LEIBLER INFORMATION CRITERION 7
172 The soluto of ths problem sats es the followg system = p e m(z ;) Pj p je m(z j ;) ; = X = X p e m(z ;) m(z ; ); p e m(z ; : 4. The problem s equvalet to KLIC(e : f) = E e log e = X f X whch gves the Maxmum Lkelhood estmator. log f(z ; )! max; log = f(z ; )! m 6.3 Emprcal lkelhood as IV estmato The e cet GMM estmator s GMM = Z GMMX Z GMM Y; where Z GMM cotas mpled GMM strumets Z GMM = X Z Z ^Z Z ; ad ^ cotas squared resduals o the ma dagoal. The FOC to the EL problem are = X z (y x EL) + EL z (y x EL) X z y x EL ; = X + EL z (y x z x EL X x z EL : EL) From the rst equato after premultplcato by P x z EL = Z ELX Z EL Y; Z ^Z t follows that where Z EL cotas ofeasble (because they deped o yet ukow parameters EL ad EL ) EL strumets Z EL = X ^Z Z ^Z Z ^; where ^ dag ( ; : : : ; ) : If we compare the expressos for GMM ad EL ; we see that the costructo of EL some expectatos are estmated usg EL probablty weghts rather tha the emprcal dstrbuto. Usg probablty weghts yelds more e cet estmates, hece EL s expected to exhbt better te sample propertes tha GMM. 7 EMPIRICAL LIKELIHOOD
173 7. ADVANCED ASYMPTOTIC THEORY 7. Maxmum lkelhood ad asymptotc bas (a) The ML estmator s ^ = y T for whch the secod order expaso s ^ = E [y] + (E [y]) P T T t= (y t E [y]) p X T p y t + T T T t= p T T X t= y t! + o p T A : Therefore, the secod order bas of ^ s B ^ = T 3 V [y] = T : (b) We ca use the geeral formula for the secod order bas of extremum estmators. For the ML, (y; ) = log f(y; ) = log y; so ; = ; 3 = ; = so the secod order bas of ^ s B ^ = T = T : The bas corrected ML estmator of s ^ = ^ T ^ = T : T y T 7. Emprcal lkelhood ad asymptotc bas Solvg the stadard EL problem, we ed up wth the system a most coveet form ^EL = = = = x + ^(x y ) ; x y + ^(x y ) ; ADVANCED ASYMPTOTIC THEORY 73
174 where ^ s oe of orgal Largrage multples (the other equals secod order expaso for ^ EL s ^). From the rst equato, the ^EL = = = + p p x ( ^(x y ) + ^ (x y ) ) + o p (x ) = p ^ p : + (p ^) E x(x y) + o p x (x y ) = We eed a rst order expaso for ^, whch from the secod equato s P p ^ = p = (x y ) P = (x y ) + o p () = E [(x y) p ] (x y ) + o p () : = The, cotug, ^EL = + p p +! (x ) E [(x y) p (x y ) ] = =! E [(x y) p (x y ) E x(x y) + o p ] = The secod order bas of ^ EL the s " B ^EL = E E [(x y) p (x y ) p x (x y ) ] = =! 3 + E [(x y) p (x y ) E x(x y) 5 ] = = E x(x y) E [(x y) + E (x y) E x(x y)! ] (E [(x y) ]) = : p : x (x y ) = 7.3 Asymptotcally rrelevat strumets. The formula for the SLS estmator s ^ SLS = + P x z (P z z ) P z e P x z (P z z ) P x z : Accordg to the LLN, P z z X z x p p! z e p! Q zz = E [zz ] : Accordg to the CLT, N ; x x x Q zz ; 74 ADVANCED ASYMPTOTIC THEORY
175 where x = E x ad = E e (t s addtoally assumed that there s covergece probablty). Hece, we have ^ SLS = + = P x z P z z = P z e = P x z ( P zz ) = P xz! p + Q zz Qzz :. Uder weak strumets, p (Q zz c + ^ SLS! + zv ) Qzz zu (Q zz c + zv ) Qzz (Q zz c + zv ) ; where c s a costat the weak strumet assumpto. If c = ; ths formula cocdes wth the prevous oe, wth zv = ad zu =. 3. The expected value of the probablty lmt of the SLS estmator s E hp lm ^ SLS Q zz = + E Qzz = + x x = + Q E x = p lm ^ OLS ; where we use jot ormalty to deduce that E [j] = (= x ) : zz Q zz 7.4 Weakly edogeous regressors The OLS estmator sats es ^ = X X X E = c p + X X X U p! ; ad p ^ = c + X X = X U! p c + Q N c; uq : The Wald test statstc sats es W = R^ r R (X X ) R Y X ^ Y X ^ R^ r p! c + Q R RQ R R c + Q q () ; u where s the ocetralty parameter. = c R RQ R Rc u WEAKLY ENDOGENOUS REGRESSORS 75
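The conclusion of part 3, namely that with asymptotically irrelevant instruments the 2SLS limit is centered at the OLS probability limit, shows up clearly in a small simulation. Since the 2SLS limit is a ratio of normals and heavy-tailed, the median is reported; the data-generating process below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(10)
beta, n, reps = 1.0, 2000, 500
est_2sls, est_ols = [], []
for _ in range(reps):
    u = rng.normal(size=n)
    x = 0.8 * u + rng.normal(size=n)            # endogenous regressor: E[xu] = 0.8, E[x^2] = 1.64
    z = rng.normal(size=n)                      # instrument unrelated to both x and u
    y = beta * x + u
    est_ols.append(x @ y / (x @ x))
    x_fitted = z * (z @ x) / (z @ z)            # first-stage fitted values
    est_2sls.append(x_fitted @ y / (x_fitted @ x))

print(np.mean(est_ols), np.median(est_2sls))    # both concentrate near beta + E[xu]/E[x^2] = 1.49
```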
176 7.5 Weakly vald strumets. Cosder the projecto of e o z: where v s orthogoal to z: Thus e = z! + v; = Z E = = Z (Z! + V) = Z Z c! + = Z V: Let us assume that Z Z; Z X ; = Z V p! (Qzz ; Q zx ; ) ; where Q zz E [zz ] ; Q zx E [zx ] ad N ; vq zz : The SLS estmator sats es p ^ = X Z Z Z = Z E X Z ( Z Z) Z X p! Q zx c! + Qzz N Q zxqzz Q zx Q zx c! Q zxq zz Q zx ; v Q zxqzz : Q zx Sce ^ s cosstet, ^ e U b U b = (Y X ) (Y X ) ^ = (Z! + V) (Z! + V) ^ p! v: X (Y X ) + ^ X X X Z! + X V + ^ X X Thus the t rato sats es t r ^ e ^ p! X Z (Z Z) Z X Q zx c! + Q zz Q zxq zz Q zx q v Q zxq zz Q zx N ( t ; ) ; where t = Q zxc! v q Q zxq zz Q zx s the ocetralty parameter. Fally ote that so, = Z U b = = Z p E ^ Z X p! Q zz c! + Qzz Q zx c! + Qzz N Q zz c! Q zxc! Q zx Q zxq zz Q zx ; v J = = b U Z( Z Z) = Z b U ^ e Q zxqzz Q zx Q zz Q zx Q zx Q zx Q zxqzz ; Q zx d! ` ( J) ; 76 ADVANCED ASYMPTOTIC THEORY
177 where J = Q zz c! = v c! Q zz Q zxc! Q zx Qzz Q zxqzz Q zx Q zx Q zx Q zxq zz Q zx v c! Q zz c! Q zxc! Q zx Q zxqzz Q zx s the ocetralty parameter. I partcular, whe ` =, p ^ p! c! + Q Q t p! c! + Q u. Cosder also the projecto of x o z: where w s orthogoal to z: Thus We have where where zz zz Q zx N zz Q zz Qzz Q v zx c! ; Q zz Q zx c! Q zz ; v = Z b U p! ) J p! : x = z + w; ; = Z X = = Z (Z + W) = Z Z c + = Z W: N ; w Z Z; = Z W; = Z V p! (Qzz ; ; ) ; wv wv v Q zz : The SLS estmator sats es ^ = = X Z Z Z = Z E = X Z ( Z Z) = Z X p! (Q zzc + ) Qzz (Q zz c! + ) (Q zz c + ) Qzz (Q zz c + ) v ( + z w ) (! + z v ) w ( + z w ) ( + z w ) v ; w zw z v N ; ^ e U b U b = (Y X ) (Y X ) ^ = (Z! + V) (Z! + V) ^ p! v v w Thus the t rato sats es wv + I` ad wv : Note that w ^ s cosstet. Thus, v v w X (Y X ) + ^ X X (Z + W) (Z! + V) + ^ X X w = v! + : t r ^ e ^ p = p! : X Z (Z Z) Z X q = + ( = ) WEAKLY INVALID INSTRUMENTS 77
178 Fally ote that so, = Z b U = = Z E p! Q = = v Q = zz ^ zz v (! + z v ) (! + z v ) = Z X v Qzz = w ( + z w ) w ( + z w ) ; J = = b U Z( Z Z) = Z b U p! (! + z v ) ^ e ( + z w ) (! + z v ) + ( + z w ) = 3 + ; where 3 (! + z v ) (! + z v ) : I partcular, whe ` =, ^ ^ e p! v w! + z v + z w ; p! v! + z v + z w +!! + z v ; + z w! + z v p t! + z s w ;! + z v! + z v + + z w + z w = ) J p! : 78 ADVANCED ASYMPTOTIC THEORY
Forecasting Trend and Stock Price with Adaptive Extended Kalman Filter Data Fusion
2011 Iteratoal Coferece o Ecoomcs ad Face Research IPEDR vol.4 (2011 (2011 IACSIT Press, Sgapore Forecastg Tred ad Stoc Prce wth Adaptve Exteded alma Flter Data Fuso Betollah Abar Moghaddam Faculty of
Measuring the Quality of Credit Scoring Models
Measur the Qualty of Credt cor Models Mart Řezáč Dept. of Matheatcs ad tatstcs, Faculty of cece, Masaryk Uversty CCC XI, Edurh Auust 009 Cotet. Itroducto 3. Good/ad clet defto 4 3. Measur the qualty 6
Fractal-Structured Karatsuba`s Algorithm for Binary Field Multiplication: FK
Fractal-Structured Karatsuba`s Algorthm for Bary Feld Multplcato: FK *The authors are worg at the Isttute of Mathematcs The Academy of Sceces of DPR Korea. **Address : U Jog dstrct Kwahadog Number Pyogyag
Load Balancing Control for Parallel Systems
Proc IEEE Med Symposum o New drectos Cotrol ad Automato, Chaa (Grèce),994, pp66-73 Load Balacg Cotrol for Parallel Systems Jea-Claude Heet LAAS-CNRS, 7 aveue du Coloel Roche, 3077 Toulouse, Frace E-mal
Loss Distribution Generation in Credit Portfolio Modeling
Loss Dstrbuto Geerato Credt Portfolo Modelg Igor Jouravlev, MMF, Walde Uversty, USA Ruth A. Maurer, Ph.D., Professor Emertus of Mathematcal ad Computer Sceces, Colorado School of Mes, USA Key words: Loss
Speeding up k-means Clustering by Bootstrap Averaging
Speedg up -meas Clusterg by Bootstrap Averagg Ia Davdso ad Ashw Satyaarayaa Computer Scece Dept, SUNY Albay, NY, USA,. {davdso, ashw}@cs.albay.edu Abstract K-meas clusterg s oe of the most popular clusterg
An Approach to Evaluating the Computer Network Security with Hesitant Fuzzy Information
A Approach to Evaluatg the Computer Network Securty wth Hestat Fuzzy Iformato Jafeg Dog A Approach to Evaluatg the Computer Network Securty wth Hestat Fuzzy Iformato Jafeg Dog, Frst ad Correspodg Author
Near Neighbor Distribution in Sets of Fractal Nature
Iteratoal Joural of Computer Iformato Systems ad Idustral Maagemet Applcatos. ISS 250-7988 Volume 5 (202) 3 pp. 59-66 MIR Labs, www.mrlabs.et/jcsm/dex.html ear eghbor Dstrbuto Sets of Fractal ature Marcel
Performance Attribution. Methodology Overview
erformace Attrbuto Methodology Overvew Faba SUAREZ March 2004 erformace Attrbuto Methodology 1.1 Itroducto erformace Attrbuto s a set of techques that performace aalysts use to expla why a portfolo's performace
Integrating Production Scheduling and Maintenance: Practical Implications
Proceedgs of the 2012 Iteratoal Coferece o Idustral Egeerg ad Operatos Maagemet Istabul, Turkey, uly 3 6, 2012 Itegratg Producto Schedulg ad Mateace: Practcal Implcatos Lath A. Hadd ad Umar M. Al-Turk
Analysis of one-dimensional consolidation of soft soils with non-darcian flow caused by non-newtonian liquid
Joural of Rock Mechacs ad Geotechcal Egeerg., 4 (3): 5 57 Aalyss of oe-dmesoal cosoldato of soft sols wth o-darca flow caused by o-newtoa lqud Kaghe Xe, Chuaxu L, *, Xgwag Lu 3, Yul Wag Isttute of Geotechcal
Bayesian Network Representation
Readgs: K&F 3., 3.2, 3.3, 3.4. Bayesa Network Represetato Lecture 2 Mar 30, 20 CSE 55, Statstcal Methods, Sprg 20 Istructor: Su-I Lee Uversty of Washgto, Seattle Last tme & today Last tme Probablty theory
arxiv:math/0510414v1 [math.pr] 19 Oct 2005
A MODEL FOR THE BUS SYSTEM IN CUERNEVACA MEXICO) JINHO BAIK ALEXEI BORODIN PERCY DEIFT AND TOUFIC SUIDAN arxv:math/05044v [mathpr 9 Oct 2005 Itroducto The bus trasportato system Cuerevaca Mexco has certa
Chapter 3 0.06 = 3000 ( 1.015 ( 1 ) Present Value of an Annuity. Section 4 Present Value of an Annuity; Amortization
Chapter 3 Mathematcs of Face Secto 4 Preset Value of a Auty; Amortzato Preset Value of a Auty I ths secto, we wll address the problem of determg the amout that should be deposted to a accout ow at a gve
CHAPTER 13. Simple Linear Regression LEARNING OBJECTIVES. USING STATISTICS @ Sunflowers Apparel
CHAPTER 3 Smple Lear Regresso USING STATISTICS @ Suflowers Apparel 3 TYPES OF REGRESSION MODELS 3 DETERMINING THE SIMPLE LINEAR REGRESSION EQUATION The Least-Squares Method Vsual Exploratos: Explorg Smple
A New Bayesian Network Method for Computing Bottom Event's Structural Importance Degree using Jointree
, pp.277-288 http://dx.do.org/10.14257/juesst.2015.8.1.25 A New Bayesa Network Method for Computg Bottom Evet's Structural Importace Degree usg Jotree Wag Yao ad Su Q School of Aeroautcs, Northwester Polytechcal
MODELLING OF STOCK PRICES BY THE MARKOV CHAIN MONTE CARLO METHOD
ISSN 8-80 (prt) ISSN 8-8038 (ole) INTELEKTINĖ EKONOMIKA INTELLECTUAL ECONOMICS 0, Vol. 5, No. (0), p. 44 56 MODELLING OF STOCK PRICES BY THE MARKOV CHAIN MONTE CARLO METHOD Matas LANDAUSKAS Kauas Uversty
ISyE 512 Chapter 7. Control Charts for Attributes. Instructor: Prof. Kaibo Liu. Department of Industrial and Systems Engineering UW-Madison
ISyE 512 Chapter 7 Cotrol Charts for Attrbutes Istructor: Prof. Kabo Lu Departmet of Idustral ad Systems Egeerg UW-Madso Emal: [email protected] Offce: Room 3017 (Mechacal Egeerg Buldg) 1 Lst of Topcs Chapter
Compressive Sensing over Strongly Connected Digraph and Its Application in Traffic Monitoring
Compressve Sesg over Strogly Coected Dgraph ad Its Applcato Traffc Motorg Xao Q, Yogca Wag, Yuexua Wag, Lwe Xu Isttute for Iterdscplary Iformato Sceces, Tsghua Uversty, Bejg, Cha {qxao3, kyo.c}@gmal.com,
Dynamic Two-phase Truncated Rayleigh Model for Release Date Prediction of Software
J. Software Egeerg & Applcatos 3 63-69 do:.436/jsea..367 Publshed Ole Jue (http://www.scrp.org/joural/jsea) Dyamc Two-phase Trucated Raylegh Model for Release Date Predcto of Software Lafe Qa Qgchua Yao
10/19/2011. Financial Mathematics. Lecture 24 Annuities. Ana NoraEvans 403 Kerchof [email protected] http://people.virginia.
Math 40 Lecture 24 Autes Facal Mathematcs How ready do you feel for the quz o Frday: A) Brg t o B) I wll be by Frday C) I eed aother week D) I eed aother moth Aa NoraEvas 403 Kerchof [email protected] http://people.vrga.edu/~as5k/
Aggregation Functions and Personal Utility Functions in General Insurance
Acta Polytechca Huarca Vol. 7, No. 4, 00 Areato Fuctos ad Persoal Utlty Fuctos Geeral Isurace Jaa Šprková Departmet of Quattatve Methods ad Iformato Systems, Faculty of Ecoomcs, Matej Bel Uversty Tajovského
Capacitated Production Planning and Inventory Control when Demand is Unpredictable for Most Items: The No B/C Strategy
SCHOOL OF OPERATIONS RESEARCH AND INDUSTRIAL ENGINEERING COLLEGE OF ENGINEERING CORNELL UNIVERSITY ITHACA, NY 4853-380 TECHNICAL REPORT Jue 200 Capactated Producto Plag ad Ivetory Cotrol whe Demad s Upredctable
The paper presents Constant Rebalanced Portfolio first introduced by Thomas
Itroducto The paper presets Costat Rebalaced Portfolo frst troduced by Thomas Cover. There are several weakesses of ths approach. Oe s that t s extremely hard to fd the optmal weghts ad the secod weakess
A NON-PARAMETRIC COPULA ANALYSIS ON ESTIMATING RETURN DISTRIBUTION FOR PORTFOLIO MANAGEMENT: AN APPLICATION WITH THE US AND BRAZILIAN STOCK MARKETS 1
Ivestmet Maagemet ad Facal Iovatos, Volume 4, Issue 3, 007 57 A NON-PARAMETRIC COPULA ANALYSIS ON ESTIMATING RETURN DISTRIBUTION FOR PORTFOLIO MANAGEMENT: AN APPLICATION WITH THE US AND BRAZILIAN STOCK
Efficient Compensation for Regulatory Takings. and Oregon s Measure 37
Effcet Compesato for Regulatory Takgs ad Orego s Measure 37 Jack Scheffer Ph.D. Studet Dept. of Agrcultural, Evrometal ad Developmet Ecoomcs The Oho State Uversty 2120 Fyffe Road Columbus, OH 43210-1067
FINANCIAL MATHEMATICS 12 MARCH 2014
FINNCIL MTHEMTICS 12 MRCH 2014 I ths lesso we: Lesso Descrpto Make use of logarthms to calculate the value of, the tme perod, the equato P1 or P1. Solve problems volvg preset value ad future value autes.
How To Make A Supply Chain System Work
Iteratoal Joural of Iformato Techology ad Kowledge Maagemet July-December 200, Volume 2, No. 2, pp. 3-35 LATERAL TRANSHIPMENT-A TECHNIQUE FOR INVENTORY CONTROL IN MULTI RETAILER SUPPLY CHAIN SYSTEM Dharamvr
RUSSIAN ROULETTE AND PARTICLE SPLITTING
RUSSAN ROULETTE AND PARTCLE SPLTTNG M. Ragheb 3/7/203 NTRODUCTON To stuatos are ecoutered partcle trasport smulatos:. a multplyg medum, a partcle such as a eutro a cosmc ray partcle or a photo may geerate
Conversion of Non-Linear Strength Envelopes into Generalized Hoek-Brown Envelopes
Covero of No-Lear Stregth Evelope to Geeralzed Hoek-Brow Evelope Itroducto The power curve crtero commoly ued lmt-equlbrum lope tablty aaly to defe a o-lear tregth evelope (relatohp betwee hear tre, τ,
Statistical Intrusion Detector with Instance-Based Learning
Iformatca 5 (00) xxx yyy Statstcal Itruso Detector wth Istace-Based Learg Iva Verdo, Boja Nova Faulteta za eletroteho raualštvo Uverza v Marboru Smetaova 7, 000 Marbor, Sloveja [email protected] eywords:
Applications of Support Vector Machine Based on Boolean Kernel to Spam Filtering
Moder Appled Scece October, 2009 Applcatos of Support Vector Mache Based o Boolea Kerel to Spam Flterg Shugag Lu & Keb Cu School of Computer scece ad techology, North Cha Electrc Power Uversty Hebe 071003,
Sequences and Series
Secto 9. Sequeces d Seres You c thk of sequece s fucto whose dom s the set of postve tegers. f ( ), f (), f (),... f ( ),... Defto of Sequece A fte sequece s fucto whose dom s the set of postve tegers.
Load and Resistance Factor Design (LRFD)
53:134 Structural Desg II Load ad Resstace Factor Desg (LRFD) Specfcatos ad Buldg Codes: Structural steel desg of buldgs the US s prcpally based o the specfcatos of the Amerca Isttute of Steel Costructo
Session 4: Descriptive statistics and exporting Stata results
Itrduct t Stata Jrd Muñz (UAB) Sess 4: Descrptve statstcs ad exprtg Stata results I ths sess we are gg t wrk wth descrptve statstcs Stata. Frst, we preset a shrt trduct t the very basc statstcal ctets
MDM 4U PRACTICE EXAMINATION
MDM 4U RCTICE EXMINTION Ths s a ractce eam. It does ot cover all the materal ths course ad should ot be the oly revew that you do rearato for your fal eam. Your eam may cota questos that do ot aear o ths
