Template-Type: ReDIF-Article 1.0
Author-Name: P. Glewwe
Author-X-Name-First: P.
Author-X-Name-Last: Glewwe
Title: A test of the normality assumption in the ordered probit model
Abstract:
This paper presents a Lagrange multiplier test of the normality
assumption underlying the ordered probit model. The test is presented both
for the standard ordered probit model and a version in which censoring is
present in the dependent variable. The test proposed here compares
favorably to normality tests based on artificial regression techniques.
Journal: Econometric Reviews
Pages: 1-19
Issue: 1
Volume: 16
Year: 1997
Keywords: Ordered Probit, Normality, Specification Tests, Lagrange Multiplier Test, Monte Carlo Simulations,
X-DOI: 10.1080/07474939708800369
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800369
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:1-19
Template-Type: ReDIF-Article 1.0
Author-Name: H. Peter Boswijk
Author-X-Name-First: H. Peter
Author-X-Name-Last: Boswijk
Author-Name: Jean-Pierre Urbain
Author-X-Name-First: Jean-Pierre
Author-X-Name-Last: Urbain
Title: Lagrange-multiplier tests for weak exogeneity: a synthesis
Abstract:
This paper unifies two seemingly separate approaches to test weak
exogeneity in dynamic regression models with Lagrange-multiplier
statistics. The first class of tests focuses on the orthogonality between
innovations and conditioning variables, and thus is related to the
Durbin-Wu-Hausman specification test. The second approach has been
developed more recently in the context of cointegration and
error correction models, and concentrates on the question whether the
conditioning variables display error correction behaviour. It is shown
that the vital difference between the two approaches stems from the choice
of the parameters of interest. A new test is derived, which encompasses
both its predecessors. The test is applied to an error correction model of
the demand for money in Switzerland.
Journal: Econometric Reviews
Pages: 21-38
Issue: 1
Volume: 16
Year: 1997
Keywords: error correction models, exogeneity, Lagrange-multiplier test, money demand,
X-DOI: 10.1080/07474939708800370
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800370
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:21-38
Template-Type: ReDIF-Article 1.0
Author-Name: Francisco Cribari-Neto
Author-X-Name-First: Francisco
Author-X-Name-Last: Cribari-Neto
Title: On the corrections to information matrix tests
Abstract:
This paper addresses the issue of designing finite-sample corrections to
information matrix tests. We review a Cornish-Fisher correction that has
been proposed elsewhere and propose an alternative, Bartlett-type
correction. Simulation results for skewness, excess kurtosis, normality
and heteroskedasticity tests are given.
Journal: Econometric Reviews
Pages: 39-53
Issue: 1
Volume: 16
Year: 1997
Keywords: Bartlett correction, Cornish-Fisher expansion, Edgeworth expansion, heteroskedasticity test, information matrix test, normality test, size-correction,
X-DOI: 10.1080/07474939708800371
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800371
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:39-53
Template-Type: ReDIF-Article 1.0
Author-Name: Gloria Gonzalez-Rivera
Author-X-Name-First: Gloria
Author-X-Name-Last: Gonzalez-Rivera
Title: A note on adaptation in GARCH models
Abstract:
In the framework of the Engle-type (G)ARCH models, I demonstrate that
there is a family of symmetric and asymmetric density functions for which
the asymptotic efficiency of the semiparametric estimator is equal to the
asymptotic efficiency of the maximum likelihood estimator. This family of
densities is bimodal (except for the normal). I also characterize the
solution to the problem of minimizing the mean squared distance between
the parametric score and the semiparametric score in order to search for
unimodal densities for which the semiparametric estimator is likely to
perform well. The Laplace density function emerges as one of these cases.
Journal: Econometric Reviews
Pages: 55-68
Issue: 1
Volume: 16
Year: 1997
Keywords: Adaptation, Generalized Autoregressive Conditional Heteroscedasticity (GARCH), maximum likelihood, semiparametric estimator,
X-DOI: 10.1080/07474939708800372
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800372
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:55-68
Template-Type: ReDIF-Article 1.0
Author-Name: Christopher Skeels
Author-X-Name-First: Christopher
Author-X-Name-Last: Skeels
Author-Name: Francis Vella
Author-X-Name-First: Francis
Author-X-Name-Last: Vella
Title: Monte Carlo evidence on the robustness of conditional moment tests in tobit and probit models
Abstract:
This paper numerically examines the size robustness of various
conditional moment tests in misspecified tobit and probit models. The
misspecifications considered include the incorrect exclusion of
regressors, ignored heteroskedasticity and false distributional
assumptions. An important feature of the experimental design is that it is
based on an existing empirical study and is more realistic than many
simulation studies. The tests are seen to have mixed performance depending
on both the original null hypothesis being tested and the type of
misspecification encountered.
Journal: Econometric Reviews
Pages: 69-92
Issue: 1
Volume: 16
Year: 1997
Keywords: probit models, tobit models, conditional moment tests, omitted variables, heteroskedasticity, non-normality,
X-DOI: 10.1080/07474939708800373
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800373
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:69-92
Template-Type: ReDIF-Article 1.0
Author-Name: Pene Kalulumia
Author-X-Name-First: Pene
Author-X-Name-Last: Kalulumia
Author-Name: Denis Bolduc
Author-X-Name-First: Denis
Author-X-Name-Last: Bolduc
Title: Generalized mixed estimator for nonlinear models: a maximum likelihood approach
Abstract:
This paper considers the problem of estimating a nonlinear statistical
model subject to stochastic linear constraints among unknown parameters.
These constraints represent prior information which originates from a
previous estimation of the same model using an alternative database. One
feature of this specification allows for the design matrix of stochastic
linear restrictions to be estimated. The mixed regression technique and
the maximum likelihood approach are used to derive the estimator for both
the model coefficients and the unknown elements of this design matrix. The
proposed estimator, whose asymptotic properties are studied, contains as a
special case the conventional mixed regression estimator based on a fixed
design matrix. A new test of compatibility between prior and sample
information is also introduced. The suggested estimator is tested
empirically with both simulated and actual marketing data.
Journal: Econometric Reviews
Pages: 93-107
Issue: 1
Volume: 16
Year: 1997
Keywords: nonlinear models, mixed regression, maximum likelihood, stochastic linear constraints,
X-DOI: 10.1080/07474939708800374
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800374
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:93-107
Template-Type: ReDIF-Article 1.0
Author-Name: Larry Taylor
Author-X-Name-First: Larry
Author-X-Name-Last: Taylor
Title: An R2 criterion based on optimal predictors
Abstract:
The predictor that minimizes mean-squared prediction error is used to
derive a goodness-of-fit measure that offers an asymptotically valid model
selection criterion for a wide variety of regression models. In
particular, a new goodness-of-fit criterion (cr2) is proposed for censored
or otherwise limited dependent variables. The new goodness-of-fit measure
is then applied to the analysis of duration.
Journal: Econometric Reviews
Pages: 109-118
Issue: 1
Volume: 16
Year: 1997
Keywords: goodness-of-fit, optimal predictor, nonlinear, multivariate, instrumental variables, duration, JEL Classification Numbers: C50, C52, C41,
X-DOI: 10.1080/07474939708800375
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800375
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:109-118
Template-Type: ReDIF-Article 1.0
Author-Name: Kazuhiro Ohtani
Author-X-Name-First: Kazuhiro
Author-X-Name-Last: Ohtani
Author-Name: David Giles
Author-X-Name-First: David
Author-X-Name-Last: Giles
Author-Name: Judith Giles
Author-X-Name-First: Judith
Author-X-Name-Last: Giles
Title: The exact risk performance of a pre-test estimator in a heteroskedastic linear regression model under the balanced loss function
Abstract:
We examine the risk of a pre-test estimator for regression coefficients
after a pre-test for homoskedasticity under the Balanced Loss Function
(BLF). We show analytically that the two stage Aitken estimator is
dominated by the pre-test estimator with the critical value of unity, even
if the BLF is used. We also show numerically that both the two stage
Aitken estimator and the pre-test estimator can be dominated by the
ordinary least squares estimator when “goodness of fit” is
regarded as more important than precision of estimation.
Journal: Econometric Reviews
Pages: 119-130
Issue: 1
Volume: 16
Year: 1997
Keywords: balanced loss, heteroskedasticity, sequential estimator, goodness of fit,
X-DOI: 10.1080/07474939708800376
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800376
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:1:p:119-130
Template-Type: ReDIF-Article 1.0
Author-Name: Maxwell King
Author-X-Name-First: Maxwell
Author-X-Name-Last: King
Author-Name: Ping Wu
Author-X-Name-First: Ping
Author-X-Name-Last: Wu
Title: Locally optimal one-sided tests for multiparameter hypotheses
Abstract:
Journal: Econometric Reviews
Pages: 131-156
Issue: 2
Volume: 16
Year: 1997
Keywords: autoregressive disturbances, heteroscedasticity, Lagrange multiplier test, linear regression, locally most mean powerful test, variance components,
X-DOI: 10.1080/07474939708800379
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800379
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:2:p:131-156
Template-Type: ReDIF-Article 1.0
Author-Name: Kosuke Oya
Author-X-Name-First: Kosuke
Author-X-Name-Last: Oya
Title: Wald, LM and LR test statistics of linear hypotheses in a structural equation model
Abstract:
For the linear hypothesis in a structural equation model, the properties
of test statistics based on the two stage least squares estimator (2SLSE)
have been examined since these test statistics are easily derived in the
instrumental variable estimation framework. Savin (1976) has shown that
inequalities exist among the test statistics for the linear hypothesis,
but it is well known that there is no systematic inequality among these
statistics based on 2SLSE for the linear hypothesis in a structural
equation model. Morimune and Oya (1994) derived the constrained limited
information maximum likelihood estimator (LIMLE) subject to general linear
constraints on the coefficients of the structural equation, as well as
Wald, LM and LR test statistics for the adequacy of the linear
constraints. In this paper, we derive the inequalities among these three
test statistics based on LIMLE and the local power functions based on
LIMLE and 2SLSE to show that there is no test statistic which is uniformly
most powerful, and that the LR test statistic based on LIMLE is locally unbiased
and the other test statistics are not. Monte Carlo simulations are used to
examine the actual sizes of these test statistics and some numerical
examples of the power differences among these test statistics are given.
It is found that the actual sizes of these test statistics are greater
than the nominal sizes, the differences between the actual and nominal
sizes of Wald test statistics are generally the greatest, those of LM test
statistics are the smallest, and the power functions depend on the
correlations between the endogenous explanatory variables and the error
term of the structural equation, the asymptotic variance of the estimator
of the coefficients of the structural equation and the number of restrictions
imposed on the coefficients.
Journal: Econometric Reviews
Pages: 157-178
Issue: 2
Volume: 16
Year: 1997
X-DOI: 10.1080/07474939708800380
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800380
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:2:p:157-178
Template-Type: ReDIF-Article 1.0
Author-Name: Mark Jensen
Author-X-Name-First: Mark
Author-X-Name-Last: Jensen
Title: Revisiting the flexibility and regularity properties of the asymptotically ideal production model
Abstract:
In this paper we estimate the flexibility properties and regular regions
for the first three orders of the seminonparametric Asymptotically Ideal
Model (AIM) under four different types of constant returns to scale
production technologies. The AIM model's parameters are estimated from a
Monte Carlo simulation where the data is generated from a three input, one
output, globally regular Constant Differences of Elasticity of
Substitution function. The Monte Carlo's input quantity and elasticity of
substitution estimates at the unit vector are graphed along with the area
of the relative price space where the AIM model is monotonically
increasing and quasi-concave.
Journal: Econometric Reviews
Pages: 179-203
Issue: 2
Volume: 16
Year: 1997
Keywords: flexible functional forms, seminonparametric models, regular regions,
X-DOI: 10.1080/07474939708800381
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800381
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:2:p:179-203
Template-Type: ReDIF-Article 1.0
Author-Name: Fabio Fornari
Author-X-Name-First: Fabio
Author-X-Name-Last: Fornari
Author-Name: Antonio Mele
Author-X-Name-First: Antonio
Author-X-Name-Last: Mele
Title: Weak convergence and distributional assumptions for a general class of nonlinear ARCH models
Abstract:
Journal: Econometric Reviews
Pages: 205-227
Issue: 2
Volume: 16
Year: 1997
Keywords: nonlinear ARCH, continuous record asymptotics, stochastic volatility, option pricing theory,
X-DOI: 10.1080/07474939708800382
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800382
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:2:p:205-227
Template-Type: ReDIF-Article 1.0
Author-Name: Ana Fernandez
Author-X-Name-First: Ana
Author-X-Name-Last: Fernandez
Author-Name: Juan Rodriguez-Poo
Author-X-Name-First: Juan
Author-X-Name-Last: Rodriguez-Poo
Title: Estimation and specification testing in female labor participation models: parametric and semiparametric methods
Abstract:
Female labor participation models have usually been studied through
probit and logit specifications. Little attention has been paid to verifying
the assumptions that are used in these sorts of models, basically
distributional assumptions and homoskedasticity. In this paper we apply
semiparametric methods in order to test these hypotheses. We also
estimate a Spanish female labor participation model using both parametric
and semiparametric approaches. The parametric model includes fixed and
random coefficients probit specifications. The estimation procedures are
parametric maximum likelihood for both probit and logit models, and
semiparametric quasi maximum likelihood following Klein and Spady (1993).
The results depend crucially on the assumed model.
Journal: Econometric Reviews
Pages: 229-247
Issue: 2
Volume: 16
Year: 1997
Keywords: Female labor participation models, Homoskedasticity test, Distributional assumptions, Semiparametric estimation,
X-DOI: 10.1080/07474939708800383
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800383
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:2:p:229-247
Template-Type: ReDIF-Article 1.0
Author-Name: James Davidson
Author-X-Name-First: James
Author-X-Name-Last: Davidson
Author-Name: Robert de Jong
Author-X-Name-First: Robert
Author-X-Name-Last: de Jong
Title: Strong laws of large numbers for dependent heterogeneous processes: a synthesis of recent and new results
Abstract:
This paper surveys recent developments in the strong law of large numbers
for dependent heterogeneous processes. We prove a generalised version of a
recent strong law for L1-mixingales, and also a new strong law for
Lp-mixingales. These results greatly relax the dependence and heterogeneity
conditions relative to those currently cited, and introduce explicit
trade-offs between dependence and heterogeneity. The results are applied
to proving strong laws for near-epoch dependent functions of mixing
processes. We contrast several methods for obtaining these results,
including mapping directly to the mixingale properties, and applying a
truncation argument.
Journal: Econometric Reviews
Pages: 251-279
Issue: 3
Volume: 16
Year: 1997
Keywords: Strong law of large numbers, mixing, mixingales, near-epoch dependence, JEL Classification: C19,
X-DOI: 10.1080/07474939708800387
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800387
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:3:p:251-279
Template-Type: ReDIF-Article 1.0
Author-Name: Jammie H. Penm
Author-X-Name-First: Jammie H.
Author-X-Name-Last: Penm
Author-Name: Jack H. W. Penm
Author-X-Name-First: Jack H. W.
Author-X-Name-Last: Penm
Author-Name: R. D. Terrell
Author-X-Name-First: R. D.
Author-X-Name-Last: Terrell
Title: The selection of zero-non-zero patterned cointegrating vectors in error-correction modelling
Abstract:
An effective and efficient search algorithm has been developed to select
from an I(1) system zero-non-zero patterned cointegrating and loading
vectors in a subset VECM, Bq(L)y(t-1) + Bq-1(L)Δy(t) = ε(t),
where the long term impact matrix Bq(L) contains zero entries. The
algorithm can be applied to higher order integrated systems. The Finnish
money-output model presented by Johansen and Juselius (1990) and the
United States balanced growth model presented by King, Plosser, Stock and
Watson (1991) are used to demonstrate the usefulness of this algorithm in
examining the cointegrating relationships in vector time series.
Journal: Econometric Reviews
Pages: 281-314
Issue: 3
Volume: 16
Year: 1997
Keywords: cointegration, vector error correction modelling,
X-DOI: 10.1080/07474939708800388
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800388
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:3:p:281-314
Template-Type: ReDIF-Article 1.0
Author-Name: Jose-Mari Sarabia
Author-X-Name-First: Jose-Mari
Author-X-Name-Last: Sarabia
Title: A hierarchy of Lorenz curves based on the generalized Tukey's lambda distribution
Abstract:
A hierarchy of Lorenz curves based on the generalized Tukey's Lambda
distribution is proposed. Representations of the corresponding
distribution and density function are also provided, together with popular
inequality measures. Estimation methods are suggested. Finally, a
comparison with other parametric families of Lorenz curves is established.
Journal: Econometric Reviews
Pages: 305-320
Issue: 3
Volume: 16
Year: 1997
Keywords: Generalized Tukey's Lambda distribution, Classical Pareto Lorenz curve, Gini index, Pietra index,
X-DOI: 10.1080/07474939708800389
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800389
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:3:p:305-320
Template-Type: ReDIF-Article 1.0
Author-Name: Kenneth Stewart
Author-X-Name-First: Kenneth
Author-X-Name-Last: Stewart
Title: Exact testing in multivariate regression
Abstract:
An F statistic due to Rao (1951, 1973) tests uniform mixed linear
restrictions in the multivariate regression model. In combination with a
generalization of the Bera-Evans-Savin exact functional relationship
between the W, LR, and LM statistics, Rao's F serves to unify a number of
exact test procedures commonly applied in disparate empirical literatures.
Examples in demand analysis and asset pricing are provided. The
availability of exact tests of restrictions in certain nonlinear models
when the model is linear under the null, originally explored by
Milliken-Graybill (1970), is extended to multivariate regression.
Generalized RESET, J-, and Hausman-Wu tests are presented. As an extension
of Dufour (1989), bounds tests exist for nonlinear and inequality
restrictions. Applications include conservative bound tests for symmetry
or negativity of the substitution matrix in demand systems.
Journal: Econometric Reviews
Pages: 321-352
Issue: 3
Volume: 16
Year: 1997
Keywords: exact test, multivariate regression,
X-DOI: 10.1080/07474939708800390
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800390
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:3:p:321-352
Template-Type: ReDIF-Article 1.0
Author-Name: Chia-Shang James Chu
Author-X-Name-First: Chia-Shang James
Author-X-Name-Last: Chu
Title: Multiple hypothesis test for parameter constancy based on recursive residuals
Abstract:
This article presents a multiple hypothesis test procedure that combines
two well known tests for structural change in the linear regression model,
the CUSUM test and the recursive t test. The CUSUM test is run through the
sequence of recursive residuals as usual; if the CUSUM plot does not
violate the critical lines, one more step is taken to perform the t test
for hypothesis of zero mean based on all recursive residuals. The
asymptotic size of this multiple hypothesis test is derived; power
simulation results suggest that it outperforms the traditional CUSUM test
and complements other tests that are currently stressed in econometrics.
Journal: Econometric Reviews
Pages: 353-360
Issue: 3
Volume: 16
Year: 1997
Keywords: CUSUM test, multiple hypothesis testing, structural change,
X-DOI: 10.1080/07474939708800391
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800391
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:3:p:353-360
Template-Type: ReDIF-Article 1.0
Author-Name: Andrew Weiss
Author-X-Name-First: Andrew
Author-X-Name-Last: Weiss
Title: Specification tests in ordered logit and probit models
Abstract:
In this paper, I study the application of various specification tests to
ordered logit and probit models with heteroskedastic errors, with the
primary focus on the ordered probit model. The tests are Lagrange
multiplier tests, information matrix tests, and chi-squared goodness of
fit tests. The alternatives are omitted variables in the regression
equation, omitted variables in the equation describing the
heteroskedasticity, and non-logistic/non-normal errors. The alternative
error distributions include a generalized logistic distribution in the
ordered logit model and the Pearson family in the ordered probit model.
Journal: Econometric Reviews
Pages: 361-391
Issue: 4
Volume: 16
Year: 1997
Keywords: Lagrange multiplier, Information matrix, Chi-squared,
X-DOI: 10.1080/07474939708800394
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800394
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:4:p:361-391
Template-Type: ReDIF-Article 1.0
Author-Name: Douglas Steigerwald
Author-X-Name-First: Douglas
Author-X-Name-Last: Steigerwald
Title: Uniformly adaptive estimation for models with ARMA errors
Abstract:
A semiparametric estimator based on an unknown density is uniformly
adaptive if the expected loss of the estimator converges to the asymptotic
expected loss of the maximum likelihood estimator based on the true density
(MLE), and if convergence does not depend on either the parameter values
or the form of the unknown density. Without uniform adaptivity, the
asymptotic expected loss of the MLE need not approximate the expected loss
of a semiparametric estimator for any finite sample. I show that a two step
semiparametric estimator is uniformly adaptive for the parameters of
nonlinear regression models with autoregressive moving average errors.
Journal: Econometric Reviews
Pages: 393-409
Issue: 4
Volume: 16
Year: 1997
Keywords: adaptive, ARMA, semiparametric, uniform convergence,
X-DOI: 10.1080/07474939708800395
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800395
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:4:p:393-409
Template-Type: ReDIF-Article 1.0
Author-Name: Zhenjuan Liu
Author-X-Name-First: Zhenjuan
Author-X-Name-Last: Liu
Author-Name: Xuewen Lu
Author-X-Name-First: Xuewen
Author-X-Name-Last: Lu
Title: Root-n-consistent semiparametric estimation of partially linear models based on k-nn method
Abstract:
In the context of the partially linear semiparametric model examined by
Robinson (1988), we show that root-n-consistent estimation results
established using kernel and series methods can also be obtained by using
the k-nearest-neighbor (k-nn) method.
Journal: Econometric Reviews
Pages: 411-420
Issue: 4
Volume: 16
Year: 1997
Keywords: Semiparametric regression, nearest neighbor nonparametric regression, partially linear model,
X-DOI: 10.1080/07474939708800396
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800396
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:4:p:411-420
Template-Type: ReDIF-Article 1.0
Author-Name: Zacharias Psaradakis
Author-X-Name-First: Zacharias
Author-X-Name-Last: Psaradakis
Title: Testing for unit roots in time series with nearly deterministic seasonal variation
Abstract:
This paper addresses the problem of testing for the presence of unit
autoregressive roots in seasonal time series with negatively correlated
moving average components. For such cases, many of the commonly used tests
are known to have exact sizes much higher than their nominal significance
level. We propose modifications of available test procedures that are
based on suitably prewhitened data and feasible generalized least squares
estimators. Monte Carlo experiments show that such modifications are
successful in reducing size distortions in samples of moderate size.
Journal: Econometric Reviews
Pages: 421-439
Issue: 4
Volume: 16
Year: 1997
Keywords: Generalized Least Squares, Monte Carlo Experiments, Moving Average, Prewhitening, Seasonality, Unit Roots,
X-DOI: 10.1080/07474939708800397
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800397
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:4:p:421-439
Template-Type: ReDIF-Article 1.0
Author-Name: Hailong Qian
Author-X-Name-First: Hailong
Author-X-Name-Last: Qian
Author-Name: Peter Schmidt
Author-X-Name-First: Peter
Author-X-Name-Last: Schmidt
Title: The asymptotic equivalence between the iterated improved 2SLS estimator and the 3SLS estimator
Abstract:
In this paper we show that the 3SLS estimator of a system of equations is
asymptotically equivalent to an iterative 2SLS estimator applied to each
equation, augmented with the residuals from the other equations. This
result is a natural extension of Telser (1964).
Journal: Econometric Reviews
Pages: 441-457
Issue: 4
Volume: 16
Year: 1997
Keywords: 2SLS(IV) estimators, Improved 2SLS(IV) estimators, Iterated improved 2SLS(IV) estimators, 3SLS estimators,
X-DOI: 10.1080/07474939708800398
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939708800398
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:16:y:1997:i:4:p:441-457
Template-Type: ReDIF-Article 1.0
Author-Name: Lutz Kilian
Author-X-Name-First: Lutz
Author-X-Name-Last: Kilian
Title: Confidence intervals for impulse responses under departures from normality
Abstract:
Monte Carlo evidence shows that in structural VAR models with fat-tailed
or skewed innovations the coverage accuracy of impulse response confidence
intervals may deteriorate substantially compared to the same model with
Gaussian innovations. Empirical evidence suggests that such departures
from normality are quite plausible for economic time series. The
simulation results suggest that applied researchers are best off using
nonparametric bootstrap intervals for impulse responses, regardless of
whether or not there is evidence of fat tails or skewness in the error
distribution. Allowing for departures from normality is shown to
considerably weaken the evidence of the delayed overshooting puzzle in
Eichenbaum and Evans (1995).
Journal: Econometric Reviews
Pages: 1-29
Issue: 1
Volume: 17
Year: 1998
Keywords: structural VAR model, normality assumption, bootstrap, impulse response intervals, delayed overshooting puzzle,
X-DOI: 10.1080/07474939808800401
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800401
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:1:p:1-29
Template-Type: ReDIF-Article 1.0
Author-Name: Paramsothy Silvapulle
Author-X-Name-First: Paramsothy
Author-X-Name-Last: Silvapulle
Author-Name: Merran Evans
Author-X-Name-First: Merran
Author-X-Name-Last: Evans
Title: Testing for serial correlation in the presence of dynamic heteroscedasticity
Abstract:
Standard serial correlation tests are derived assuming that the
disturbances are homoscedastic, but this study shows that asymptotic
critical values are not accurate when this assumption is violated.
Asymptotic critical values for the ARCH(2)-corrected LM, BP and BL tests
are valid only when the underlying ARCH process is strictly stationary,
whereas Wooldridge's robust LM test has good properties overall. These
tests exhibit similar behaviour even when the underlying process is GARCH
(1,1). When the regressors include lagged dependent variables, the
rejection frequencies under both the null and alternative hypotheses
depend on the coefficients of the lagged dependent variables and the other
model parameters. They appear to be robust across various disturbance
distributions under the null hypothesis.
Journal: Econometric Reviews
Pages: 31-55
Issue: 1
Volume: 17
Year: 1998
Keywords: serial correlation tests, ARCH-corrected tests, ARMA-ARCH models,
X-DOI: 10.1080/07474939808800402
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800402
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:1:p:31-55
Template-Type: ReDIF-Article 1.0
Author-Name: Suzanne McCoskey
Author-X-Name-First: Suzanne
Author-X-Name-Last: McCoskey
Author-Name: Chihwa Kao
Author-X-Name-First: Chihwa
Author-X-Name-Last: Kao
Title: A residual-based test of the null of cointegration in panel data
Abstract:
This paper proposes a residual-based Lagrange Multiplier (LM) test for
the null of cointegration in panel data. The test is analogous to the
locally best unbiased invariant (LBUI) test for a moving average (MA) unit
root. The asymptotic distribution of the test is derived under the null.
Monte Carlo simulations are performed to study the size and power
properties of the proposed test. Overall, the empirical sizes of the LM-FM
and LM-DOLS tests are close to the true size even in small samples. The
power is quite good for panels where T ≥ 50, and decent for panels with
fewer observations in T. In our fixed sample of N = 50 and T = 50, in the
presence of a moving average component and correlation, the LM-DOLS test
seems to be better at correcting these effects, although in some cases the
LM-FM test is more powerful. Although much of the non-stationary time
series econometrics has been criticized for having more to do with the
specific properties of the data set than with underlying economic
models, the recent development of the cointegration literature has allowed
for a concrete bridge between economic long-run theory and time series
methods. Our test now allows for the testing of the null of cointegration
in a panel setting and should be of considerable interest to economists in
a wide variety of fields.
Journal: Econometric Reviews
Pages: 57-84
Issue: 1
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800403
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800403
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:1:p:57-84
Template-Type: ReDIF-Article 1.0
Author-Name: Giorgio Calzolari
Author-X-Name-First: Giorgio
Author-X-Name-Last: Calzolari
Author-Name: Gabriele Fiorentini
Author-X-Name-First: Gabriele
Author-X-Name-Last: Fiorentini
Title: A tobit model with garch errors
Abstract:
In the context of time series regression, we extend the standard Tobit
model to allow for the possibility of conditional heteroskedastic error
processes of the GARCH type. We discuss the likelihood function of the
Tobit model in the presence of conditionally heteroskedastic errors.
Expressing the exact likelihood function turns out to be infeasible, and
we propose an approximation by treating the model as being conditionally
Gaussian. The performance of the estimator is investigated by means of
Monte Carlo simulations. We find that, when the error terms follow a GARCH
process, the proposed estimator considerably outperforms the standard
Tobit quasi maximum likelihood estimator. The efficiency loss due to the
approximation of the likelihood is finally evaluated.
Journal: Econometric Reviews
Pages: 85-104
Issue: 1
Volume: 17
Year: 1998
Keywords: censored regressions, conditional heteroskedasticity, Monte Carlo simulations,
X-DOI: 10.1080/07474939808800404
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800404
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:1:p:85-104
Template-Type: ReDIF-Article 1.0
Author-Name: Julia Campos
Author-X-Name-First: Julia
Author-X-Name-Last: Campos
Title: Book review
Abstract:
Modelling Nonlinear Economic Relationships by Clive W. J. Granger and
Timo Teräsvirta. Pp. x+187. Oxford: Oxford University Press, 1993. ($US
21.00 paper) Web Information: www.oup-usa.org/gcdocs/gc-019877320x.h.
Journal: Econometric Reviews
Pages: 105-108
Issue: 1
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800405
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800405
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:1:p:105-108
Template-Type: ReDIF-Article 1.0
Author-Name: Mark Steel
Author-X-Name-First: Mark
Author-X-Name-Last: Steel
Title: Bayesian analysis of stochastic volatility models with flexible tails
Abstract:
An alternative distributional assumption is proposed for the stochastic
volatility model. This results in extremely flexible tail behaviour of the
sampling distribution for the observables, as well as in the availability
of a simple Markov Chain Monte Carlo strategy for posterior analysis. By
allowing the tail behaviour to be determined by a separate parameter, we
reserve the parameters of the volatility process to dictate the degree of
volatility clustering. Treatment of a mean function is formally integrated
in the analysis. Some empirical examples on both stock prices and exchange
rates clearly indicate the presence of fat tails, in combination with high
levels of volatility clustering. In addition, predictive distributions
indicate a good fit with these typical financial data sets.
Journal: Econometric Reviews
Pages: 109-143
Issue: 2
Volume: 17
Year: 1998
Keywords: financial time series, leptokurtic distributions, Markov Chain Monte Carlo, Skewed Exponential Power distribution,
X-DOI: 10.1080/07474939808800408
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800408
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:2:p:109-143
Template-Type: ReDIF-Article 1.0
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Title: Estimating partially linear panel data models with one-way error components
Abstract:
We consider the problem of estimating a partially linear panel data model
when the error follows a one-way error components structure. We propose a
feasible semiparametric generalized least squares (GLS) type estimator for
estimating the coefficient of the linear component and show that it is
asymptotically more efficient than a semiparametric ordinary least squares
(OLS) type estimator. We also discuss the case when the regressor of the
parametric component is correlated with the error, and propose an
instrumental variable GLS-type semiparametric estimator.
Journal: Econometric Reviews
Pages: 145-166
Issue: 2
Volume: 17
Year: 1998
Keywords: Partially linear model, individual effects, semiparametric estimation, generalized least squares method, instrumental variable,
X-DOI: 10.1080/07474939808800409
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800409
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:2:p:145-166
Template-Type: ReDIF-Article 1.0
Author-Name: Kien Tran
Author-X-Name-First: Kien
Author-X-Name-Last: Tran
Title: Estimating mixtures of normal distributions via empirical characteristic function
Abstract:
This paper uses the empirical characteristic function (ECF) procedure to
estimate the parameters of mixtures of normal distributions. Since the
characteristic function is uniformly bounded, the procedure gives
estimates that are numerically stable. It is shown, using Monte Carlo
simulation, that the finite sample properties of the ECF estimator are
very good, even in the case where the popular maximum likelihood estimator
fails to exist. An empirical application is illustrated using the monthly
excess return of the NYSE value-weighted index.
Journal: Econometric Reviews
Pages: 167-183
Issue: 2
Volume: 17
Year: 1998
Keywords: constrained Maximum-likelihood, empirical characteristic function, grid points, mixtures of normal distributions, moment generating function, Monte Carlo simulation,
X-DOI: 10.1080/07474939808800410
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800410
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:2:p:167-183
Template-Type: ReDIF-Article 1.0
Author-Name: Andre Lucas
Author-X-Name-First: Andre
Author-X-Name-Last: Lucas
Title: Inference on cointegrating ranks using lr and lm tests based on pseudo-likelihoods
Abstract:
This paper considers Lagrange Multiplier (LM) and Likelihood Ratio (LR)
tests for determining the cointegrating rank of a vector autoregressive
system. In order to deal with outliers and possible fat-tailedness of the
error process, non-Gaussian likelihoods are used to carry out the
estimation. The limiting distributions of the tests based on these
non-Gaussian (pseudo-)likelihoods are derived. These distributions depend
on nuisance parameters. An operational procedure is proposed to perform
inference. It appears that the tests based on non-Gaussian
pseudo-likelihoods are much more powerful than their Gaussian counterparts
if the errors are fat-tailed. Moreover, the operational LM-type test has a
better overall performance than the LR-type test. Copyright © 1998 by
Marcel Dekker, Inc.
Journal: Econometric Reviews
Pages: 185-214
Issue: 2
Volume: 17
Year: 1998
Keywords: cointegration, Lagrange multiplier test, likelihood ratio test, outlier robustness, fat tails, GARCH, pseudo-likelihood,
X-DOI: 10.1080/07474939808800411
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800411
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:2:p:185-214
Template-Type: ReDIF-Article 1.0
Author-Name: Charles Nelson
Author-X-Name-First: Charles
Author-X-Name-Last: Nelson
Title: Book reviews
Abstract:
Dynamic Econometrics by David F. Hendry. Pp. xxxiv+869. Oxford: Oxford
University Press, 1995. ($US 85.00 cloth, $US 45.00 paper) WEB
INFORMATION: www.oup-usa.org/gcdocs/gc_O198283172.ht.
Journal: Econometric Reviews
Pages: 215-220
Issue: 2
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800412
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800412
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:2:p:215-220
Template-Type: ReDIF-Article 1.0
Author-Name: Norman Swanson
Author-X-Name-First: Norman
Author-X-Name-Last: Swanson
Title: Book reviews
Abstract:
Statistical Foundations for Econometric Techniques by Asad Zaman. Pp.
xxvi+570. London: Academic Press, 1996. ($US 44.95 paper) Web Information:
www.apnet.com/textbook/sbe/new9596/zaman.htm.
Journal: Econometric Reviews
Pages: 221-225
Issue: 2
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800413
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800413
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:2:p:221-225
Template-Type: ReDIF-Article 1.0
Author-Name: Pascal Lavergne
Author-X-Name-First: Pascal
Author-X-Name-Last: Lavergne
Title: Selection of regressors in econometrics: parametric and nonparametric methods
Abstract:
The present paper addresses the selection-of-regressors issue within a
general discrimination framework. We show how this framework is useful in
unifying various procedures for selecting regressors and helpful in
understanding the different strategies underlying these procedures. We
review selection of regressors in linear, nonlinear and nonparametric
regression models. In each case we successively consider model selection
criteria and hypothesis testing procedures.
Journal: Econometric Reviews
Pages: 227-273
Issue: 3
Volume: 17
Year: 1998
Keywords: Selection of regressors, Discrimination, JEL Classification: Primary C52: Secondary C20,
X-DOI: 10.1080/07474939808800415
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800415
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:3:p:227-273
Template-Type: ReDIF-Article 1.0
Author-Name: Zacharias Psaradakis
Author-X-Name-First: Zacharias
Author-X-Name-Last: Psaradakis
Title: Bootstrap-based evaluation of markov-switching time series models
Abstract:
This paper explores the possibility of evaluating the adequacy of
Markov-switching time series models by comparing selected functionals
(such as the spectral density function and moving empirical moments)
obtained from the data with those of the fitted model using a bootstrap
algorithm. The proposed model checking procedure is easy to implement and
flexible enough to be adapted to a wide variety of models with parameters
subject to Markov regime-switching. Examples with real and artificial data
illustrate the potential of the methodology.
Journal: Econometric Reviews
Pages: 275-288
Issue: 3
Volume: 17
Year: 1998
Keywords: Markov Chain, Moving Estimates, Parametric Bootstrap, Regime Switching, Spectral Density Function, JEL Classification: C15: C22: C52,
X-DOI: 10.1080/07474939808800416
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800416
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:3:p:275-288
Template-Type: ReDIF-Article 1.0
Author-Name: David Crawford
Author-X-Name-First: David
Author-X-Name-Last: Crawford
Author-Name: Robert Pollak
Author-X-Name-First: Robert
Author-X-Name-Last: Pollak
Author-Name: Francis Vella
Author-X-Name-First: Francis
Author-X-Name-Last: Vella
Title: Simple inference in multinomial and ordered logit
Abstract:
This paper provides some simple methods of interpreting the coefficients
in multinomial logit and ordered logit models. These methods are
summarized in Propositions concerning the magnitudes, signs, and patterns
of partial derivatives of the outcome probabilities with respect to the
exogenous variables. The paper also provides an empirical example
illustrating the use of these Propositions.
Journal: Econometric Reviews
Pages: 289-299
Issue: 3
Volume: 17
Year: 1998
Keywords: Inference, Multinomial Logit, Ordered Logit, JEL Classification: C12,
X-DOI: 10.1080/07474939808800417
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800417
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:3:p:289-299
Template-Type: ReDIF-Article 1.0
Author-Name: Yoon-Jae Whang
Author-X-Name-First: Yoon-Jae
Author-X-Name-Last: Whang
Title: A test of normality using nonparametric residuals
Abstract:
In this paper, we develop a test of the normality assumption of the
errors using the residuals from a nonparametric kernel regression.
Contrary to the existing tests based on the residuals from a parametric
regression, our test is thus robust to misspecification of the regression
function. The test statistic proposed here is a Bera-Jarque type test of
skewness and kurtosis. We show that the test statistic has the usual χ2(2)
limit distribution under the null hypothesis. In contrast to the results
of Rilstone (1992), we provide a set of primitive assumptions that allow
weakly dependent observations and data dependent bandwidth parameters. We
also establish consistency property of the test. Monte Carlo experiments
show that our test has reasonably good size and power performance in small
samples and performs better than some of the alternative tests in various
situations.
Journal: Econometric Reviews
Pages: 301-327
Issue: 3
Volume: 17
Year: 1998
Keywords: Nonparametric kernel estimator, Normality test, Skewness, Kurtosis, Empirical process,
X-DOI: 10.1080/07474939808800418
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800418
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:3:p:301-327
Template-Type: ReDIF-Article 1.0
Author-Name: H. Peter Boswijk
Author-X-Name-First: H. Peter
Author-X-Name-Last: Boswijk
Title: Book reviews
Abstract:
Elements of Modern Asymptotic Theory with Statistical Applications by
Brendan McCabe and Andrew Tremayne. Pp. xi+264. Manchester: Manchester
University Press, 1993. (£50 cloth, £17.99 paper) WEB
INFORMATION: www.man.ac.uk/mup/
Journal: Econometric Reviews
Pages: 329-334
Issue: 3
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800419
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800419
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:3:p:329-334
Template-Type: ReDIF-Article 1.0
Author-Name: Jon Faust
Author-X-Name-First: Jon
Author-X-Name-Last: Faust
Title: Book reviews
Abstract:
Periodicity and Stochastic Trends in Economic Time Series by Philip Hans
Franses. Pp. xii+230. Oxford: Oxford University Press, 1996. ($US 65.00
cloth, $US 32.50 paper) WEB INFORMATION: www.oupusa.org/docs/0198774540.h.
Journal: Econometric Reviews
Pages: 335-338
Issue: 3
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800420
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800420
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:3:p:335-338
Template-Type: ReDIF-Article 1.0
Author-Name: R. Winkelmann
Author-X-Name-First: R.
Author-X-Name-Last: Winkelmann
Title: Count data models with selectivity
Abstract:
This paper shows how truncated, censored, hurdle, zero inflated and
underreported count models can be interpreted as models with selectivity.
Until recently, users of such count data models have commonly imposed
independence between the count generating mechanism and the selection
mechanism. Such an assumption is unrealistic in most applications, and
various models with endogenous selectivity (correlation between the count
and the selection equations) are presented. The methods are illustrated in
an application to labor mobility where the dependent variable is the
number of individual job changes during a ten year period.
Journal: Econometric Reviews
Pages: 339-359
Issue: 4
Volume: 17
Year: 1998
Keywords: Poisson distribution, sample selection, underreporting, labor mobility, JEL Classification:C25,C42,
X-DOI: 10.1080/07474939808800422
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800422
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:4:p:339-359
Template-Type: ReDIF-Article 1.0
Author-Name: W. Tsay
Author-X-Name-First: W.
Author-X-Name-Last: Tsay
Title: On the power of durbin-watson statistic against fractionally integrated processes
Abstract:
This paper provides the theoretical explanation and Monte Carlo
experiments of using a modified version of the Durbin-Watson (DW)
statistic to test an I(1) process against I(d) alternatives, that is, an
integrated process of order d, where d is a fractional number. We provide
the exact order of magnitude of the modified DW test when the data
generating process is an I(d) process with d ∈ (0, 1.5). Moreover, the
consistency of the modified DW statistic as a unit root test against I(d)
alternatives with d ∈ (0, 1) ∪ (1, 1.5) is proved in this paper. In
addition to the theoretical analysis, Monte Carlo experiments on the
performance of the modified DW statistic show that it can be used as a
unit root test against I(d) alternatives.
Journal: Econometric Reviews
Pages: 361-386
Issue: 4
Volume: 17
Year: 1998
Keywords: Durbin-Watson statistic, unit root, fractional Brownian motion, JEL classification:C22,
X-DOI: 10.1080/07474939808800423
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800423
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:4:p:361-386
Template-Type: ReDIF-Article 1.0
Author-Name: K. Maekawa
Author-X-Name-First: K.
Author-X-Name-Last: Maekawa
Author-Name: J. L. Knight
Author-X-Name-First: J. L.
Author-X-Name-Last: Knight
Author-Name: H. Hisamatsu
Author-X-Name-First: H.
Author-X-Name-Last: Hisamatsu
Title: Finite sample comparisons of the distributions of the ols and gls estimators in regression with an integrated regressor and correlated errors
Abstract:
We compare the finite sample distributional properties of the OLS and
GLS estimators in a regression with an integrated regressor and
correlated errors of the form of AR(1) and MA(1) processes. The approach
is one of first deriving the joint characteristic function of the
quadratic forms in the definition of the estimators and then
numerically inverting these to find the distributions. When the
characteristic functions are intractable, Monte Carlo integration is
employed. We find substantial differences in the finite sample
distributions of OLS and GLS although asymptotically these distributions
are equivalent.
Journal: Econometric Reviews
Pages: 387-413
Issue: 4
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800424
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800424
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:4:p:387-413
Template-Type: ReDIF-Article 1.0
Author-Name: Kuan Xu
Author-X-Name-First: Kuan
Author-X-Name-Last: Xu
Author-Name: L. Osberg
Author-X-Name-First: L.
Author-X-Name-Last: Osberg
Title: A distribution-free test for deprivation dominance
Abstract:
The Rawlsian perspective on social policy pays particular attention to the
least advantaged members of society, but how should "the least advantaged"
be identified? The concept of deprivation dominance operationalizes in
part the Rawlsian evaluation of the welfare of the least advantaged
members of society, but a statistical procedure for testing deprivation
dominance is needed. In this paper, we construct a new distribution-free
test for deprivation dominance and apply it to Canadian income survey
data.
Journal: Econometric Reviews
Pages: 415-429
Issue: 4
Volume: 17
Year: 1998
Keywords: deprivation profile, deprivation dominance, poverty, welfare, asymptotic distribution, statistical test, JEL Classification:C12,I32,
X-DOI: 10.1080/07474939808800425
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800425
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:4:p:415-429
Template-Type: ReDIF-Article 1.0
Author-Name: Ragnar Nymoen
Author-X-Name-First: Ragnar
Author-X-Name-Last: Nymoen
Title: Book reviews
Abstract:
Journal: Econometric Reviews
Pages: 431-442
Issue: 4
Volume: 17
Year: 1998
X-DOI: 10.1080/07474939808800426
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939808800426
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:17:y:1998:i:4:p:431-442
Template-Type: ReDIF-Article 1.0
Author-Name: John Geweke
Author-X-Name-First: John
Author-X-Name-Last: Geweke
Title: Using simulation methods for bayesian econometric models: inference, development, and communication
Abstract:
This paper surveys the fundamental principles of subjective Bayesian
inference in econometrics and the implementation of those principles using
posterior simulation methods. The emphasis is on the combination of models
and the development of predictive distributions. Moving beyond
conditioning on a fixed number of completely specified models, the paper
introduces subjective Bayesian tools for formal comparison of these models
with as yet incompletely specified models. The paper then shows how
posterior simulators can facilitate communication between investigators
(for example, econometricians) on the one hand and remote clients (for
example, decision makers) on the other, enabling clients to vary the prior
distributions and functions of interest employed by investigators. A theme
of the paper is the practicality of subjective Bayesian methods. To this
end, the paper describes publicly available software for Bayesian
inference, model development, and communication and provides illustrations
using two simple econometric models.
Journal: Econometric Reviews
Pages: 1-73
Issue: 1
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800428
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800428
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:1:p:1-73
Template-Type: ReDIF-Article 1.0
Author-Name: W. E. Griffiths
Author-X-Name-First: W. E.
Author-X-Name-Last: Griffiths
Title: Estimating consumer surplus comments on "using simulation methods for bayesian econometric models: inference development and communication"
Abstract:
Journal: Econometric Reviews
Pages: 75-87
Issue: 1
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800429
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800429
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:1:p:75-87
Template-Type: ReDIF-Article 1.0
Author-Name: C. Fernandez
Author-X-Name-First: C.
Author-X-Name-Last: Fernandez
Author-Name: M. F. J. Steel
Author-X-Name-First: M. F. J.
Author-X-Name-Last: Steel
Title: Some comments on model development and posterior existence
Abstract:
We wish to congratulate John Geweke on producing such an interesting and
complete paper. We are delighted to see that serious attempts to make
Bayesian methods more generally understood and available are being
undertaken. In addition, a number of quite challenging issues is addressed
here. Whereas we agree with most of what is stated in the paper, it is our
(perceived) duty to single out those things that we feel are more
contentious. However, we hope that this can lead to a stimulating
discussion of general interest and hopefully to our better understanding
of these issues.
Journal: Econometric Reviews
Pages: 89-96
Issue: 1
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800430
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800430
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:1:p:89-96
Template-Type: ReDIF-Article 1.0
Author-Name: G. Koop
Author-X-Name-First: G.
Author-X-Name-Last: Koop
Author-Name: D. J. Poirier
Author-X-Name-First: D. J.
Author-X-Name-Last: Poirier
Title: Incomplete models and reweighting
Abstract:
Journal: Econometric Reviews
Pages: 97-104
Issue: 1
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800431
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800431
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:1:p:97-104
Template-Type: ReDIF-Article 1.0
Author-Name: H. K. Van Dijk
Author-X-Name-First: H. K.
Author-X-Name-Last: Van Dijk
Title: Some remarks on the simulation revolution in bayesian econometric inference
Abstract:
Journal: Econometric Reviews
Pages: 105-112
Issue: 1
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800432
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800432
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:1:p:105-112
Template-Type: ReDIF-Article 1.0
Author-Name: G. M. Martin
Author-X-Name-First: G. M.
Author-X-Name-Last: Martin
Author-Name: C. S. Forbes
Author-X-Name-First: C. S.
Author-X-Name-Last: Forbes
Title: Using simulation methods for bayesian econometric models: inference, development and communication: some comments
Abstract:
Journal: Econometric Reviews
Pages: 113-118
Issue: 1
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800433
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800433
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:1:p:113-118
Template-Type: ReDIF-Article 1.0
Author-Name: J. Geweke
Author-X-Name-First: J.
Author-X-Name-Last: Geweke
Title: Reply
Abstract:
Thanks and congratulations to all of the discussants for covering a wide
array of important topics. Since space does not permit attention to all of
the points that have been raised, this reply will concentrate on trying to
dispel confusion on important matters that might remain.
Journal: Econometric Reviews
Pages: 119-126
Issue: 1
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800434
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800434
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:1:p:119-126
Template-Type: ReDIF-Article 1.0
Author-Name: D. Ormoneit
Author-X-Name-First: D.
Author-X-Name-Last: Ormoneit
Author-Name: H. White
Author-X-Name-First: H.
Author-X-Name-Last: White
Title: An efficient algorithm to compute maximum entropy densities
Abstract:
We describe an algorithm to efficiently compute maximum entropy
densities, i.e. densities maximizing the Shannon entropy - [image
omitted] under a set of constraints [image omitted] . Our
method is based on an algorithm by Zellner and Highfield, which has been
found not to converge under a variety of circumstances. To demonstrate
that our method overcomes these difficulties, we conduct numerous
experiments for the special case gi(x) = xi, n = 4. An extensive table of
results for this case and computer code are available on the World Wide
Web.
Journal: Econometric Reviews
Pages: 127-140
Issue: 2
Volume: 18
Year: 1999
Keywords: Density Estimation, Maximum Entropy Principle, Shannon Entropy, JEL Classification:C61,C63,C87,
X-DOI: 10.1080/07474939908800436
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800436
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:2:p:127-140
Template-Type: ReDIF-Article 1.0
Author-Name: R. F. Phillips
Author-X-Name-First: R. F.
Author-X-Name-Last: Phillips
Title: Partially adaptive estimation of nonlinear models via a normal mixture
Abstract:
This paper extends the partially adaptive method Phillips (1994) provided
for linear models to nonlinear models. Asymptotic results are established
under conditions general enough that they cover both cross-sectional and time
series applications. The sampling efficiency of the new estimator is
illustrated in a small Monte Carlo study in which the parameters of an
autoregressive moving average are estimated. The study indicates that, for
non-normal distributions, the new estimator improves on the nonlinear
least squares estimator in terms of efficiency.
Journal: Econometric Reviews
Pages: 141-167
Issue: 2
Volume: 18
Year: 1999
Keywords: ARMA process, nonlinear regression model, quasi maximum likelihood, JEL Classifications:C13,C20,
X-DOI: 10.1080/07474939908800437
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800437
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:2:p:141-167
Template-Type: ReDIF-Article 1.0
Author-Name: L. G. Godfrey
Author-X-Name-First: L. G.
Author-X-Name-Last: Godfrey
Author-Name: C. D. Orme
Author-X-Name-First: C. D.
Author-X-Name-Last: Orme
Title: The robustness, reliability and power of heteroskedasticity tests
Abstract:
Several tests for heteroskedasticity in linear regression models are
examined. Asymptotic robustness to heterokurticity, nonnormality and
skewness is discussed. The finite sample reliability of asymptotically
valid tests is investigated using Monte Carlo experiments. It is found
that asymptotic critical values cannot, in general, be relied upon to give
good agreement between nominal and actual finite sample significance
levels. The use of the bootstrap overcomes this problem for general
approaches that lead to asymptotically pivotal test statistics. Power
comparisons are made for bootstrap tests, and modified Glejser and Koenker
tests are recommended.
Journal: Econometric Reviews
Pages: 169-194
Issue: 2
Volume: 18
Year: 1999
Keywords: heteroskedasticity, robustness, nonnormality, bootstrap, JEL Classification:C12,C52,
X-DOI: 10.1080/07474939908800438
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800438
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:2:p:169-194
Template-Type: ReDIF-Article 1.0
Author-Name: N. Coulibaly
Author-X-Name-First: N.
Author-X-Name-Last: Coulibaly
Author-Name: B. Wade Brorsen
Author-X-Name-First: B. Wade
Author-X-Name-Last: Brorsen
Title: Monte Carlo sampling approach to testing nonnested hypotheses: Monte Carlo results
Abstract:
Alternative ways of using Monte Carlo methods to implement a Cox-type
test for separate families of hypotheses are considered. Monte Carlo
experiments are designed to compare the finite sample performances of
Pesaran and Pesaran's test, a RESET test, and two Monte Carlo hypothesis
test procedures. One of the Monte Carlo tests is based on the distribution
of the log-likelihood ratio and the other is based on an asymptotically
pivotal statistic. The Monte Carlo results provide strong evidence that
the size of the Pesaran and Pesaran test is generally incorrect, except
for very large sample sizes. The RESET test has lower power than the other
tests. The two Monte Carlo tests perform equally well for all sample sizes
and are both clearly preferred to the Pesaran and Pesaran test, even in
large samples. Since the Monte Carlo test based on the log-likelihood
ratio is the simplest to calculate, we recommend using it.
Journal: Econometric Reviews
Pages: 195-209
Issue: 2
Volume: 18
Year: 1999
Keywords: Cox test, Monte Carlo test, Nonnested hypotheses, JEL Classification:C12,C15,
X-DOI: 10.1080/07474939908800439
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800439
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:2:p:195-209
Template-Type: ReDIF-Article 1.0
Author-Name: F. Cribari-Neto
Author-X-Name-First: F.
Author-X-Name-Last: Cribari-Neto
Author-Name: S. G. Zarkos
Author-X-Name-First: S. G.
Author-X-Name-Last: Zarkos
Title: Bootstrap methods for heteroskedastic regression models: evidence on estimation and testing
Abstract:
This paper uses Monte Carlo simulation analysis to study the
finite-sample behavior of bootstrap estimators and tests in the linear
heteroskedastic model. We consider four different bootstrapping schemes,
three of them specifically tailored to handle heteroskedasticity. Our
results show that weighted bootstrap methods can be successfully used to
estimate the variances of the least squares estimators of the linear
parameters both under normality and under nonnormality. Simulation results
are also given comparing the size and power of the bootstrapped
Breusch-Pagan test with that of the original test and of Bartlett and
Edgeworth-corrected tests. The bootstrap test was found to be robust
against unfavorable regression designs.
Journal: Econometric Reviews
Pages: 211-228
Issue: 2
Volume: 18
Year: 1999
Keywords: Bartlett-type correction, bootstrap, Edgeworth expansion, heteroskedasticity, Lagrange multiplier test, score test, weighted bootstrap, JEL CLASSIFICATION:C12,C13,C15,
X-DOI: 10.1080/07474939908800440
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800440
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:2:p:211-228
Template-Type: ReDIF-Article 1.0
Author-Name: Charles Goodhart
Author-X-Name-First: Charles
Author-X-Name-Last: Goodhart
Title: Book reviews
Abstract:
The Economics of Seasonal Cycles by Jeffrey A. Miron. Pp. xviii+225.
Cambridge, Massachusetts: MIT Press, 1996. ($US30.00 cloth) WEB
INFORMATION: http://mitpress.mit.edu/book-home.tcl?isbn=0262133237.
Journal: Econometric Reviews
Pages: 229-230
Issue: 2
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800441
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800441
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:2:p:229-230
Template-Type: ReDIF-Article 1.0
Author-Name: Pentti Saikkonen
Author-X-Name-First: Pentti
Author-X-Name-Last: Saikkonen
Title: Testing normalization and overidentification of cointegrating vectors in vector autoregressive processes
Abstract:
This paper develops test procedures for testing the validity of general
linear identifying restrictions imposed on cointegrating vectors in the
context of a vector autoregressive model. In addition to overidentifying
restrictions the considered restrictions may also involve normalizing
restrictions. Tests for both types of restrictions are developed and their
asymptotic properties are obtained. Under the null hypothesis tests for
normalizing restrictions have an asymptotic "multivariate unit root
distribution", similar to that obtained for the likelihood ratio test for
cointegration, while tests for overidentifying restrictions have a
standard chi-square limiting distribution. Since these two types of tests
are asymptotically independent they are easy to combine into an overall
test for the specified identifying restrictions. An overall test of this
kind can consistently reveal the failure of the identifying restrictions
in a wider class of cases than previous tests which only test for
overidentifying restrictions.
Journal: Econometric Reviews
Pages: 235-257
Issue: 3
Volume: 18
Year: 1999
Keywords: cointegration, normalizing restrictions, overidentifying restrictions, vector autoregressive process,
X-DOI: 10.1080/07474939908800444
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800444
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:3:p:235-257
Template-Type: ReDIF-Article 1.0
Author-Name: Clive Granger
Author-X-Name-First: Clive
Author-X-Name-Last: Granger
Author-Name: Tae-Hwy Lee
Author-X-Name-First: Tae-Hwy
Author-X-Name-Last: Lee
Title: The effect of aggregation on nonlinearity
Abstract:
This paper investigates the interaction between aggregation and
nonlinearity through a Monte Carlo study. Various tests for neglected
nonlinearity are used to compare the power of the tests for different
nonlinear models at different levels of aggregation. Three types of
aggregation, namely cross-sectional aggregation, temporal aggregation and
systematic sampling, are considered. Aggregation tends to simplify
nonlinearity. The degree to which nonlinearity is reduced depends on the
importance of the common factor and the extent of the aggregation. The effect is
larger when the size of the common factor is smaller and when the extent of
the aggregation is larger.
Journal: Econometric Reviews
Pages: 259-269
Issue: 3
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800445
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800445
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:3:p:259-269
Template-Type: ReDIF-Article 1.0
Author-Name: Richard Paap
Author-X-Name-First: Richard
Author-X-Name-Last: Paap
Author-Name: Philip Hans Franses
Author-X-Name-First: Philip Hans
Author-X-Name-Last: Franses
Title: On trends and constants in periodic autoregressions
Abstract:
Periodic autoregressions are characterised by autoregressive structures
that vary with the season. If a time series is periodically integrated,
one needs a seasonally varying differencing filter to remove the
stochastic trend. When the periodic regression model contains constants
and trends with unrestricted parameters, the data can show diverging
seasonal deterministic trends. In this paper we derive explicit
expressions for parameter restrictions that result in common deterministic
trends under periodic trend stationarity and periodic integration.
Journal: Econometric Reviews
Pages: 271-286
Issue: 3
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800446
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800446
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:3:p:271-286
Template-Type: ReDIF-Article 1.0
Author-Name: Karim Abadir
Author-X-Name-First: Karim
Author-X-Name-Last: Abadir
Title: An introduction to hypergeometric functions for economists
Abstract:
Hypergeometric functions are a generalization of exponential functions.
They are explicit, computable functions that can also be manipulated
analytically. The functions and series we use in quantitative economics
are all special cases of them. In this paper, a unified approach to
hypergeometric functions is given. As a result, some potentially useful
general applications emerge in a number of areas such as in econometrics
and economic theory. The greatest benefit from using these functions stems
from the fact that they provide parsimonious explicit (and interpretable)
solutions to a wide range of general problems.
Journal: Econometric Reviews
Pages: 287-330
Issue: 3
Volume: 18
Year: 1999
Keywords: Hypergeometric functions, distribution theory, non-linear models and discontinuities, differential equations, economic theory, utility, production and cost functions,
X-DOI: 10.1080/07474939908800447
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800447
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:3:p:287-330
Template-Type: ReDIF-Article 1.0
Author-Name: Zacharias Psaradakis
Author-X-Name-First: Zacharias
Author-X-Name-Last: Psaradakis
Title: A note on super exogeneity in linear regression models
Abstract:
This note considers how hypotheses of invariance and super exogeneity may
be formulated and tested in elliptical linear regression models. It is
demonstrated that for jointly elliptical random variables super exogeneity
will only hold under normality.
Journal: Econometric Reviews
Pages: 331-336
Issue: 3
Volume: 18
Year: 1999
Keywords: Elliptically contoured distribution, Invariance, Normality, Regression model, Super exogeneity,
X-DOI: 10.1080/07474939908800448
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800448
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:3:p:331-336
Template-Type: ReDIF-Article 1.0
Author-Name: Kevin Hoover
Author-X-Name-First: Kevin
Author-X-Name-Last: Hoover
Title: Book review
Abstract:
The Foundations of Econometric Analysis, edited by David F. Hendry and
Mary S. Morgan. Pp. xvi+558. Cambridge: Cambridge University Press, 1995.
(£19.95 paper, £45.00 cloth).
Journal: Econometric Reviews
Pages: 337-342
Issue: 3
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800449
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800449
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:3:p:337-342
Template-Type: ReDIF-Article 1.0
Author-Name: David Edgerton
Author-X-Name-First: David
Author-X-Name-Last: Edgerton
Author-Name: Ghazi Shukur
Author-X-Name-First: Ghazi
Author-X-Name-Last: Shukur
Title: Testing autocorrelation in a system perspective
Abstract:
The Breusch-Godfrey test for autocorrelated errors is generalised to
cover systems of equations, and the properties of 18 versions of the test
are studied using Monte Carlo methods. We show that only one group of
tests regularly has actual size close to the nominal size; namely the
likelihood ratio tests of the auxiliary regression system that are
corrected in some manner for degrees of freedom. The Rao F test exhibits
the best performance, whilst the commonly used TR2 test behaves badly even
in single equations. However, the size and power properties of all tests
deteriorate sharply as the number of equations increases, the system
becomes more dynamic, the exogenous variables become more autocorrelated
and the sample size decreases. This performance has, in general, an
unknown degree since the interaction amongst these factors does not permit
a predictive summary, as might be hoped for by response surface-type
approaches.
Journal: Econometric Reviews
Pages: 343-386
Issue: 4
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800351
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800351
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:4:p:343-386
Template-Type: ReDIF-Article 1.0
Author-Name: Myoung-jae Lee
Author-X-Name-First: Myoung-jae
Author-X-Name-Last: Lee
Title: Probability inequalities in multivariate distributions
Abstract:
For a bivariate binary response model yj = 1(xj βj + uj > 0),
j=1,2, we propose to estimate nonparametrically the quadrant correlation
E{sgn(u1)*sgn(u2)} between the two error terms u1 and u2 without
specifying the error term distribution. The quadrant correlation accounts
for the relationship between y1 and y2 that is not explained by x1 and x2,
and can be used in testing for the specification of endogenous dummy
variable models. The quadrant correlation is further generalized into
orthant dependence allowing unknown regression functions, unknown error
term distribution and arbitrary forms of heteroskedasticity. A simulation
study is provided, followed by a brief application to a real data set.
Journal: Econometric Reviews
Pages: 387-415
Issue: 4
Volume: 18
Year: 1999
Keywords: binary response, endogenous dummy variable, quadrant correlation, orthant dependence,
X-DOI: 10.1080/07474939908800352
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800352
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:4:p:387-415
Template-Type: ReDIF-Article 1.0
Author-Name: Steven Wei
Author-X-Name-First: Steven
Author-X-Name-Last: Wei
Title: A bayesian approach to dynamic tobit models
Abstract:
This paper develops a posterior simulation method for a dynamic Tobit
model. The major obstacle rooted in such a problem lies in high
dimensional integrals, induced by dependence among censored observations,
in the likelihood function. The primary contribution of this study is to
develop a practical and efficient sampling scheme for the conditional
posterior distributions of the censored (i.e., unobserved) data, so that
the Gibbs sampler with the data augmentation algorithm is successfully
applied. The substantial differences between this approach and some
existing methods are highlighted. The proposed simulation method is
investigated by means of a Monte Carlo study and applied to a regression
model of Japanese exports of passenger cars to the U.S. subject to a
non-tariff trade barrier.
Journal: Econometric Reviews
Pages: 417-439
Issue: 4
Volume: 18
Year: 1999
Keywords: Bayesian inference, Dynamic Tobit model, Gibbs sampler with data augmentation, Monte Carlo simulation, truncated normal distribution,
X-DOI: 10.1080/07474939908800353
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800353
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:4:p:417-439
Template-Type: ReDIF-Article 1.0
Author-Name: Zacharias Psaradakis
Author-X-Name-First: Zacharias
Author-X-Name-Last: Psaradakis
Author-Name: Elias Tzavalis
Author-X-Name-First: Elias
Author-X-Name-Last: Tzavalis
Title: On regression-based tests for persistence in logarithmic volatility models
Abstract:
Building on the work of Pantula (1986), this paper discusses how the
hypothesis of conditional variance nonstationarity in the logarithmic
family of generalized autoregressive conditional heteroskedasticity
(GARCH) and stochastic volatility processes may be tested using
regression-based tests. The latter are easy to implement, have
well-defined large-sample distributions, and are less sensitive to
structural changes than tests based on the quasimaximum likelihood
estimator.
Journal: Econometric Reviews
Pages: 441-448
Issue: 4
Volume: 18
Year: 1999
Keywords: conditional heteroskedasticity, nonlinear Garch, persistence, stochastic volatility, regime changes, unit root,
X-DOI: 10.1080/07474939908800354
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800354
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:4:p:441-448
Template-Type: ReDIF-Article 1.0
Author-Name: Pietro Balestra
Author-X-Name-First: Pietro
Author-X-Name-Last: Balestra
Author-Name: Jaya Krishnakumar
Author-X-Name-First: Jaya
Author-X-Name-Last: Krishnakumar
Title: Announcement and call for papers for the ninth international conference on panel data
Abstract:
Journal: Econometric Reviews
Pages: 449-450
Issue: 4
Volume: 18
Year: 1999
X-DOI: 10.1080/07474939908800355
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474939908800355
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:18:y:1999:i:4:p:449-450
Template-Type: ReDIF-Article 1.0
Author-Name: Jeremy Berkowitz
Author-X-Name-First: Jeremy
Author-X-Name-Last: Berkowitz
Author-Name: Lutz Kilian
Author-X-Name-First: Lutz
Author-X-Name-Last: Kilian
Title: Recent developments in bootstrapping time series
Abstract:
Journal: Econometric Reviews
Pages: 1-48
Issue: 1
Volume: 19
Year: 2000
Keywords: Bootstrap, ARMA, Frequency Domain, Blocks,
X-DOI: 10.1080/07474930008800457
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800457
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:1:p:1-48
Template-Type: ReDIF-Article 1.0
Author-Name: Russell Davidson
Author-X-Name-First: Russell
Author-X-Name-Last: Davidson
Author-Name: James MacKinnon
Author-X-Name-First: James
Author-X-Name-Last: MacKinnon
Title: Bootstrap tests: how many bootstraps?
Abstract:
In practice, bootstrap tests must use a finite number of bootstrap
samples. This means that the outcome of the test will depend on the
sequence of random numbers used to generate the bootstrap samples, and it
necessarily results in some loss of power. We examine the extent of this
power loss and propose a simple pretest procedure for choosing the number
of bootstrap samples so as to minimize experimental randomness. Simulation
experiments suggest that this procedure will work very well in practice.
Journal: Econometric Reviews
Pages: 55-68
Issue: 1
Volume: 19
Year: 2000
Keywords: bootstrap test, test power, pretest,
X-DOI: 10.1080/07474930008800459
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800459
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:1:p:55-68
Template-Type: ReDIF-Article 1.0
Author-Name: Alexander Benkwitz
Author-X-Name-First: Alexander
Author-X-Name-Last: Benkwitz
Author-Name: Michael Neumann
Author-X-Name-First: Michael
Author-X-Name-Last: Neumann
Author-Name: Helmut Lütkepohl
Author-X-Name-First: Helmut
Author-X-Name-Last: Lütkepohl
Title: Problems related to confidence intervals for impulse responses of autoregressive processes
Abstract:
Confidence intervals for impulse responses computed from autoregressive
processes are considered. A detailed analysis of the methods in current
use shows that they are not very reliable in some cases. In particular,
there are theoretical reasons for them to have actual coverage
probabilities which deviate considerably from the nominal level in some
situations of practical importance. For a simple case alternative
bootstrap methods are proposed which provide correct results
asymptotically.
Journal: Econometric Reviews
Pages: 69-103
Issue: 1
Volume: 19
Year: 2000
Keywords: impulse response, bootstrap, autoregressive process, asymptotic inference, nonparametric inference,
X-DOI: 10.1080/07474930008800460
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800460
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:1:p:69-103
Template-Type: ReDIF-Article 1.0
Author-Name: Denzil Fiebig
Author-X-Name-First: Denzil
Author-X-Name-Last: Fiebig
Author-Name: Jae Kim
Author-X-Name-First: Jae
Author-X-Name-Last: Kim
Title: Estimation and inference in sur models when the number of equations is large
Abstract:
There is a tendency for the true variability of feasible GLS estimators
to be understated by asymptotic standard errors. For estimation of SUR
models, this tendency becomes more severe in large equation systems when
estimation of the error covariance matrix, C, becomes problematic. We
explore a number of potential solutions involving the use of improved
estimators for the disturbance covariance matrix and bootstrapping. In
particular, Ullah and Racine (1992) have recently introduced a new class
of estimators for SUR models that use nonparametric kernel density
estimation techniques. The proposed estimators have the same structure as
the feasible GLS estimator of Zellner (1962) differing only in the choice
of estimator for C. Ullah and Racine (1992) prove that their nonparametric
density estimator of C can be expressed as Zellner's original estimator
plus a positive definite matrix that depends on the smoothing parameter
chosen for the density estimation. It is this structure of the estimator
that most interests us as it has the potential to be especially useful in
large equation systems. Atkinson and Wilson (1992) investigated the bias
in the conventional and bootstrap estimators of coefficient standard
errors in SUR models. They demonstrated that under certain conditions the
former were superior, but they caution that neither estimator uniformly
dominated and hence bootstrapping provides little improvement in the
estimation of standard errors for the regression coefficients. Rilstone
and Veall (1996) argue that an important qualification needs to be made to
this somewhat negative conclusion. They demonstrated that bootstrapping
can result in improvements in inferences if the procedures are applied to
the t-ratios rather than to the standard errors. These issues are explored
for the case of large equation systems and when bootstrapping is combined
with improved covariance estimation.
Journal: Econometric Reviews
Pages: 105-130
Issue: 1
Volume: 19
Year: 2000
Keywords: seemingly unrelated regression models, improved covariance estimation, bootstrapping, large equation systems,
X-DOI: 10.1080/07474930008800461
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800461
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:1:p:105-130
Template-Type: ReDIF-Article 1.0
Author-Name: Jürgen Groß
Author-X-Name-First: Jürgen
Author-X-Name-Last: Groß
Author-Name: Simo Puntanen
Author-X-Name-First: Simo
Author-X-Name-Last: Puntanen
Title: Remark on pseudo-generalized least squares
Abstract:
We briefly discuss the so-called pseudo-GLS estimator in a standard
linear regression model with nonspherical disturbances, and conclude that
the potential for applications is greater than originally assumed by
Fiebig, Bartels and Krämer (1996).
Journal: Econometric Reviews
Pages: 139-144
Issue: 1
Volume: 19
Year: 2000
Keywords: C13, C20,
X-DOI: 10.1080/07474930008800462
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800462
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:1:p:139-144
Template-Type: ReDIF-Article 1.0
Author-Name: James Hamilton
Author-X-Name-First: James
Author-X-Name-Last: Hamilton
Title: Book review
Abstract:
State-Space Models with Regime Switching by Chang-Jin Kim and Charles R.
Nelson. Pp. 250. Cambridge, Massachusetts: MIT Press, 1999. ($40.00 cloth)
WEB INFORMATION: http://mitpress.mit.edu/book-home.tcl?isbn=0262l123.
Journal: Econometric Reviews
Pages: 135-137
Issue: 1
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800463
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800463
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:1:p:135-137
Template-Type: ReDIF-Article 1.0
Author-Name: Oliver Linton
Author-X-Name-First: Oliver
Author-X-Name-Last: Linton
Author-Name: Douglas Steigerwald
Author-X-Name-First: Douglas
Author-X-Name-Last: Steigerwald
Title: Adaptive testing in arch models
Abstract:
Specification tests for conditional heteroskedasticity that are derived
under the assumption that the density of the innovation is Gaussian may
not be powerful in light of the recent empirical results that the density
is not Gaussian. We obtain specification tests for conditional
heteroskedasticity under the assumption that the innovation density is a
member of a general family of densities. Our test statistics maximize
asymptotic local power and weighted average power criteria for the general
family of densities. We establish both first-order and second-order theory
for our procedures. Simulations indicate that asymptotic power gains are
achievable in finite samples.
Journal: Econometric Reviews
Pages: 145-174
Issue: 2
Volume: 19
Year: 2000
Keywords: adaptive testing, ARCH, conditional heteroskedasticity, semiparametric,
X-DOI: 10.1080/07474930008800466
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800466
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:2:p:145-174
Template-Type: ReDIF-Article 1.0
Author-Name: Douglas Hodgson
Author-X-Name-First: Douglas
Author-X-Name-Last: Hodgson
Title: Unconditional pseudo-maximum likelihood and adaptive estimation in the presence of conditional heterogeneity of unknown form
Abstract:
We consider parametric non-linear regression models with additive
innovations which are serially uncorrelated but not necessarily
independent, and consider the consequences of maximum likelihood and
related one-step iterative estimation when the innovations are treated as
being iid from their unconditional density. We find that the estimators'
asymptotic covariance matrices will generally differ from those that would
obtain if the errors actually were iid, except for the special case of
strictly exogenous regressors. One important application of these results
is to analysis of the properties of adaptive estimators, which employ
nonparametric kernel estimates of the unconditional density of the
disturbances in the construction of one-step iterative estimators. In the
presence of strictly exogenous regressors, adaptive estimators are found
to be asymptotically equivalent to the one-step iterative estimators that
use the correct unconditional density. We illustrate our results through a
brief Monte Carlo study.
Journal: Econometric Reviews
Pages: 175-206
Issue: 2
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800467
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800467
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:2:p:175-206
Template-Type: ReDIF-Article 1.0
Author-Name: Eiji Kurozumi
Author-X-Name-First: Eiji
Author-X-Name-Last: Kurozumi
Author-Name: Taku Yamamoto
Author-X-Name-First: Taku
Author-X-Name-Last: Yamamoto
Title: Modified lag augmented vector autoregressions
Abstract:
This paper proposes an inference procedure for a possibly integrated
vector autoregression (VAR) model. We modify the lag augmented VAR
(LA-VAR) estimator to exclude the quasi-asymptotic bias, which is
associated with the term Op(T^-1), using the jackknife method. The new
estimator has an asymptotic normal distribution, and the Wald
statistic to test for parameter restrictions has an asymptotic
chi-square distribution. We investigate the finite sample properties of
this approach by comparing it with the LA-VAR approach. We find that our
modified LA-VAR (MLA-VAR) approach outperforms the LA-VAR approach in terms of
the accuracy of the empirical size and robustness to
misspecification of the lag length. The MLA-VAR approach may be used when
researchers place importance on the accuracy of the size, and may also be
used to complement other testing procedures that may suffer from serious
size distortion.
Journal: Econometric Reviews
Pages: 207-231
Issue: 2
Volume: 19
Year: 2000
Keywords: Vector autoregressions, Integration, Cointegration, Bias correction, Hypothesis testing,
X-DOI: 10.1080/07474930008800468
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800468
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:2:p:207-231
Template-Type: ReDIF-Article 1.0
Author-Name: Kenneth Stewart
Author-X-Name-First: Kenneth
Author-X-Name-Last: Stewart
Title: GNR, MGR, and exact misspecification testing
Abstract:
The Gauss-Newton regression (GNR) is widely used to compute Lagrange
multiplier statistics. A regression described by Milliken and Graybill
yields an exact F test in a certain class of nonlinear models which are
linear under the null. This paper shows that the Milliken-Graybill
regression is a GNR. Hence one interpretation of Milliken-Graybill is that
they identified a class of nonlinear models for which the GNR yields an
exact test.
Journal: Econometric Reviews
Pages: 233-240
Issue: 2
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800469
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800469
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:2:p:233-240
Template-Type: ReDIF-Article 1.0
Author-Name: L. G. Godfrey
Author-X-Name-First: L. G.
Author-X-Name-Last: Godfrey
Author-Name: M. R. Veall
Author-X-Name-First: M. R.
Author-X-Name-Last: Veall
Title: Alternative approaches to testing by variable addition
Abstract:
Journal: Econometric Reviews
Pages: 241-261
Issue: 2
Volume: 19
Year: 2000
Keywords: specification errors, model specification, variable addition tests, bootstrap critical values,
X-DOI: 10.1080/07474930008800470
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800470
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:2:p:241-261
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Almas Heshmati
Author-X-Name-First: Almas
Author-X-Name-Last: Heshmati
Title: Introduction
Abstract:
Journal: Econometric Reviews
Pages: 5-5
Issue: 3
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800472
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800472
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:3:p:5-5
Template-Type: ReDIF-Article 1.0
Author-Name: Peter Phillips
Author-X-Name-First: Peter
Author-X-Name-Last: Phillips
Author-Name: Hyungsik Moon
Author-X-Name-First: Hyungsik
Author-X-Name-Last: Moon
Title: Nonstationary panel data analysis: an overview of some recent developments
Abstract:
This paper overviews some recent developments in panel data asymptotics,
concentrating on the nonstationary panel case and gives a new result for
models with individual effects. Underlying recent theory are asymptotics
for multi-indexed processes in which both indexes may pass to infinity. We
review some of the new limit theory that has been developed, show how it
can be applied and give a new interpretation of individual effects in
nonstationary panel data. Fundamental to the interpretation of much of the
asymptotics is the concept of a panel regression coefficient which
measures the long run average relation across a section of the panel. This
concept is analogous to the statistical interpretation of the coefficient
in a classical regression relation. A variety of nonstationary panel data
models are discussed and the paper reviews the asymptotic properties of
estimators in these various models. Some recent developments in panel unit
root tests and stationary dynamic panel regression models are also
reviewed.
Journal: Econometric Reviews
Pages: 263-286
Issue: 3
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800473
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800473
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:3:p:263-286
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Almas Heshmati
Author-X-Name-First: Almas
Author-X-Name-Last: Heshmati
Title: Stochastic dominance amongst swedish income distributions
Abstract:
Sweden's income distribution for the whole population and for subgroups,
including its immigrants, has been extensively studied. The interest in
this area has grown with increasing availability of data, including
panels. The previous studies are based on indices of inequality or
mobility. While indices are useful for complete ordering and have an air
of "decisiveness" about them, they lack universal acceptance of the value
judgements inherent in the welfare functions that underlie any index. In
contrast, this paper studies uniform partial order relations which
rank welfare situations over very wide classes of welfare functions. We
conduct bootstrap tests for the existence of first and second order
stochastic dominance amongst Sweden's income distributions over time and
for several subgroups of immigrants. Analysis of immigrants' incomes is
motivated by the fact that the development of income for immigrants has
been different and strongly affected by their length of residence and
countries of origin. We consider several non-consecutive waves of a panel
of incomes in Sweden. Two income definitions are developed: one is
pre-transfer-and-tax gross income, the other post-transfer-and-tax
disposable income. The comparison of the distributions of these two
variables affords a partial view of Sweden's welfare system. We have
focused on the incomes of Swedes and immigrant groups of single
individuals identified by country of origin, length of residence, age,
education, gender, marital status and other relevant characteristics. We
find that first order dominance is rare, but second order relations hold
in several cases, especially amongst disposable income distributions.
Sweden's incomes and welfare policies favor the elderly, females, larger
families, and longer periods of residency. We find that, in general, the
higher the educational credentials, the higher the burden of this
equalization policy.
Journal: Econometric Reviews
Pages: 287-320
Issue: 3
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800474
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800474
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:3:p:287-320
Template-Type: ReDIF-Article 1.0
Author-Name: Richard Blundell
Author-X-Name-First: Richard
Author-X-Name-Last: Blundell
Author-Name: Stephen Bond
Author-X-Name-First: Stephen
Author-X-Name-Last: Bond
Title: GMM Estimation with persistent panel data: an application to production functions
Abstract:
This paper considers the estimation of Cobb-Douglas production functions
using panel data covering a large sample of companies observed for a small
number of time periods. GMM estimators have been found to produce large
finite-sample biases when using the standard first-differenced estimator.
These biases can be dramatically reduced by exploiting reasonable
stationarity restrictions on the initial conditions process. Using data
for a panel of R&D-performing US manufacturing companies we find that the
additional instruments used in our extended GMM estimator yield much more
reasonable parameter estimates.
Journal: Econometric Reviews
Pages: 321-340
Issue: 3
Volume: 19
Year: 2000
Keywords: panel data, GMM, production functions,
X-DOI: 10.1080/07474930008800475
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800475
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:3:p:321-340
Template-Type: ReDIF-Article 1.0
Author-Name: Bo Honore
Author-X-Name-First: Bo
Author-X-Name-Last: Honore
Author-Name: Ekaterini Kyriazidou
Author-X-Name-First: Ekaterini
Author-X-Name-Last: Kyriazidou
Author-Name: J. L. Powell
Author-X-Name-First: J. L.
Author-X-Name-Last: Powell
Title: Estimation of tobit-type models with individual specific effects
Abstract:
The aim of this paper is two-fold. First, we review recent estimators for
censored regression and sample selection panel data models with
unobservable individual specific effects, and show how the idea behind
these estimators can be used to construct estimators for a variety of
other Tobit-type models. The estimators presented in this paper are
semiparametric, in the sense that they do not require the parametrization
of the distribution of the unobservables. The second aim of the paper is
to introduce a new class of estimators for the censored regression model.
The advantage of the new estimators is that they can be applied under a
stationarity assumption on the transitory error terms, which is weaker
than the exchangeability assumption that is usually made in this
literature. A similar generalization does not seem feasible for the
estimators of the other models that are considered.
Journal: Econometric Reviews
Pages: 341-366
Issue: 3
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800476
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800476
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:3:p:341-366
Template-Type: ReDIF-Article 1.0
Author-Name: Ragnar Tveterås
Author-X-Name-First: Ragnar
Author-X-Name-Last: Tveterås
Author-Name: G. H. Wan
Author-X-Name-First: G. H.
Author-X-Name-Last: Wan
Title: Flexible panel data models for risky production technologies with an application to salmon aquaculture
Abstract:
Primal panel data models of production risk are estimated, using more
flexible specifications than has previously been the practice. Production
risk has important implications for the analysis of technology adoption
and technical efficiency, since risk averse producers will take into
account both the mean and variance of output when ranking alternative
technologies. Hence, one should estimate technical change separately for
the deterministic part and the risk part of the technology.
Journal: Econometric Reviews
Pages: 367-389
Issue: 3
Volume: 19
Year: 2000
Keywords: production risk, technical change, stochastic dominance, salmon aquaculture,
X-DOI: 10.1080/07474930008800477
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800477
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:3:p:367-389
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Almas Heshmati
Author-X-Name-First: Almas
Author-X-Name-Last: Heshmati
Title: Introduction
Abstract:
Journal: Econometric Reviews
Pages: 5-5
Issue: 4
Volume: 19
Year: 2000
X-DOI: 10.1080/07474930008800479
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800479
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:4:p:5-5
Template-Type: ReDIF-Article 1.0
Author-Name: Erik Biørn
Author-X-Name-First: Erik
Author-X-Name-Last: Biørn
Title: Panel Data With Measurement Errors: Instrumental Variables And GMM Procedures Combining Levels And Differences
Abstract:
Journal: Econometric Reviews
Pages: 391-424
Issue: 4
Volume: 19
Year: 2000
Keywords: Panel Data, Errors-in-Variables, Repeated Measurement, Moment Conditions, GMM Estimation, Returns to scale,
X-DOI: 10.1080/07474930008800480
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800480
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:4:p:391-424
Template-Type: ReDIF-Article 1.0
Author-Name: Subal Kumbhakar
Author-X-Name-First: Subal
Author-X-Name-Last: Kumbhakar
Author-Name: M. Denny
Author-X-Name-First: M.
Author-X-Name-Last: Denny
Author-Name: M. Fuss
Author-X-Name-First: M.
Author-X-Name-Last: Fuss
Title: Estimation and decomposition of productivity change when production is not efficient: a panel data approach
Abstract:
This paper addresses estimation and decomposition of productivity change,
which is mostly identified as technical change under constant (unitary)
returns to scale (CRS). If the CRS assumption is not made, productivity
change is decomposed into technical change and scale effects. Furthermore,
if inefficiency exists, it also contributes to productivity change. Here
we decompose productivity change into efficiency change, technical change,
and scale effects. Three alternative approaches using parametric
production, cost, and profit functions, which differ in terms of
behavioral assumptions on the producers and data requirements, are
considered.
Journal: Econometric Reviews
Pages: 312-320
Issue: 4
Volume: 19
Year: 2000
Keywords: total factor productivity, technical change, returns to scale, scale effects, technical inefficiency, allocative inefficiency, production function, cost function, profit function,
X-DOI: 10.1080/07474930008800481
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800481
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:4:p:312-320
Template-Type: ReDIF-Article 1.0
Author-Name: Seung Ahn
Author-X-Name-First: Seung
Author-X-Name-Last: Ahn
Author-Name: Robin Sickles
Author-X-Name-First: Robin
Author-X-Name-Last: Sickles
Title: Estimation of long-run inefficiency levels: a dynamic frontier approach
Abstract:
Cornwell, Schmidt, and Sickles (1990) and Kumbhakar (1990), among others,
developed stochastic frontier production models which allow firm-specific
inefficiency levels to change over time. These studies assumed arbitrary
restrictions on the short-run dynamics of efficiency levels which have
little theoretical justification. Further, the models are inappropriate
for estimation of long-run efficiencies. We consider estimation of an
alternative frontier model in which firm-specific technical inefficiency
levels are autoregressive. This model is particularly useful to examine a
potential dynamic link between technical innovations and production
inefficiency levels. We apply our methodology to a panel of US airlines.
Journal: Econometric Reviews
Pages: 461-492
Issue: 4
Volume: 19
Year: 2000
Keywords: panel data, long-run inefficiency, frontier production function, generalized method of moments,
X-DOI: 10.1080/07474930008800482
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800482
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:4:p:461-492
Template-Type: ReDIF-Article 1.0
Author-Name: Subal Kumbhakar
Author-X-Name-First: Subal
Author-X-Name-Last: Kumbhakar
Author-Name: Shinichiro Nakamura
Author-X-Name-First: Shinichiro
Author-X-Name-Last: Nakamura
Author-Name: Almas Heshmati
Author-X-Name-First: Almas
Author-X-Name-Last: Heshmati
Title: Estimation of firm-specific technological bias, technical change and total factor productivity growth: a dual approach
Abstract:
This paper deals with modeling firm-specific technical change (TC), and
technological biases (inputs and scale) in estimating total factor
productivity (TFP) growth. Several dual parametric econometric models are
used for this purpose. We examine robustness of TFP growth and TC among
competing models. These models include the traditional time trend (TT)
model and the general index (GI) model. The TT and the GI models are
generalized to accommodate firm-specific TC and technological bias (in
inputs and output). Both nested and non-nested tests are used to select
the appropriate models. Firm-level panel data from the Japanese chemical
industry during 1968-1987 is used as an application.
Journal: Econometric Reviews
Pages: 162-173
Issue: 4
Volume: 19
Year: 2000
Keywords: returns to scale, time trend, general index, cost function,
X-DOI: 10.1080/07474930008800483
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930008800483
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:19:y:2000:i:4:p:162-173
Template-Type: ReDIF-Article 1.0
Author-Name: Gordon Anderson
Author-X-Name-First: Gordon
Author-X-Name-Last: Anderson
Title: THE POWER AND SIZE OF NONPARAMETRIC TESTS FOR COMMON DISTRIBUTIONAL CHARACTERISTICS
Abstract:
This paper considers the power and size properties of some well known
nonparametric linear rank tests for location and scale as well as the
Kolmogorov-Smirnov omnibus test and proposed alternatives to it.
Independence between some classes of linear rank tests is established
facilitating their joint application. A Monte Carlo study confirms the
asymptotic power properties of the linear rank tests but raises concerns
about their application in more general and practically relevant
circumstances. It also indicates that the new omnibus tests constitute
viable alternatives with superior properties to the Kolmogorov-Smirnov
test in certain circumstances.
Journal: Econometric Reviews
Pages: 1-30
Issue: 1
Volume: 20
Year: 2001
Keywords: Two sample linear rank tests, Omnibus tests, JEL Classification: C12, C14,
X-DOI: 10.1081/ETC-100104077
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104077
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:1:p:1-30
Template-Type: ReDIF-Article 1.0
Author-Name: Badi Baltagi
Author-X-Name-First: Badi
Author-X-Name-Last: Baltagi
Author-Name: Dong Li
Author-X-Name-First: Dong
Author-X-Name-Last: Li
Title: DOUBLE LENGTH ARTIFICIAL REGRESSIONS FOR TESTING SPATIAL DEPENDENCE
Abstract:
This paper derives two simple artificial Double Length Regressions (DLR)
to test for spatial dependence. The first DLR tests for spatial lag
dependence while the second DLR tests for spatial error dependence. Both
artificial regressions utilize only least squares residuals of the
restricted model and are therefore easy to compute. These tests are
illustrated using two simple examples. In addition, Monte Carlo
experiments are performed to study the small sample performance of these
tests. As expected, these DLR tests have similar performance to their
corresponding LM counterparts.
Journal: Econometric Reviews
Pages: 31-40
Issue: 1
Volume: 20
Year: 2001
Keywords: Double length regressions, Spatial dependence, Lagrange multiplier, JEL Classification: C12, C21, R15,
X-DOI: 10.1081/ETC-100104078
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104078
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:1:p:31-40
Template-Type: ReDIF-Article 1.0
Author-Name: Thanasis Stengos
Author-X-Name-First: Thanasis
Author-X-Name-Last: Stengos
Author-Name: Yiguo Sun
Author-X-Name-First: Yiguo
Author-X-Name-Last: Sun
Title: A CONSISTENT MODEL SPECIFICATION TEST FOR A REGRESSION FUNCTION BASED ON NONPARAMETRIC WAVELET ESTIMATION
Abstract:
In this paper we present a consistent specification test of a parametric
regression function against a general nonparametric alternative. The
proposed test is based on wavelet estimation and it is shown to have
similar rates of convergence to the more commonly used kernel based tests.
Monte Carlo simulations show that this test statistic has adequate size
and high power and that it compares favorably with its kernel based
counterparts in small samples.
Journal: Econometric Reviews
Pages: 41-60
Issue: 1
Volume: 20
Year: 2001
Keywords: Wavelets, Consistent specification test, Nonparametric regression, JEL Classification: C12, C14, C52,
X-DOI: 10.1081/ETC-100104079
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104079
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:1:p:41-60
Template-Type: ReDIF-Article 1.0
Author-Name: Nunzio Cappuccio
Author-X-Name-First: Nunzio
Author-X-Name-Last: Cappuccio
Author-Name: Diego Lubian
Author-X-Name-First: Diego
Author-X-Name-Last: Lubian
Title: ESTIMATION AND INFERENCE ON LONG-RUN EQUILIBRIA: A SIMULATION STUDY
Abstract:
In this paper we study the finite sample properties of some
asymptotically equivalent estimators of cointegrating relationships and
related test statistics: the Fully Modified Least Squares estimator
proposed by Phillips and Hansen (1990), the Dynamic OLS estimator of
Saikkonen (1991) and Stock and Watson (1993), the maximum likelihood
estimator (reduced rank regression estimator) of Johansen (1988). On the
basis of previous Monte Carlo results on this topic, the main objective of
our simulation experiments is to study the sensitivity of the finite
sample distribution of estimators and test statistics to three features of
the DGP of the observable variables, namely, the degree of serial
correlation of the cointegrating relationship, the condition of weak
exogeneity and the signal-to-noise ratio. To this end, we consider 100
different DGPs and four increasing sample sizes. Besides the usual
descriptive statistics, further information about the empirical
distributions of interest is provided by means of graphical and
statistical methods. In particular, we study size distortion of test statistics
using P-value discrepancy plots and estimate the maximal moment exponent
of the empirical distribution of estimators.
Journal: Econometric Reviews
Pages: 61-84
Issue: 1
Volume: 20
Year: 2001
Keywords: Cointegration, Monte Carlo experiment, Recursive variance, P-value discrepancy plots, Maximal moment exponent, JEL Classification: C13, C15,
X-DOI: 10.1081/ETC-100104080
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104080
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:1:p:61-84
Template-Type: ReDIF-Article 1.0
Author-Name: Paramsothy Silvapulle
Author-X-Name-First: Paramsothy
Author-X-Name-Last: Silvapulle
Title: A SCORE TEST FOR SEASONAL FRACTIONAL INTEGRATION AND COINTEGRATION
Abstract:
This paper develops a time domain score statistic for testing fractional
integration at zero and seasonal frequencies in quarterly time series
models. Further, it introduces the notion of fractional cointegration at
different frequencies between two seasonally integrated, I(1) series. In
testing problems involving seasonal fractional cointegration, it is argued
that the alternative hypothesis is one-sided for which the usual score
test may not be appropriate. Therefore, based on ideas in Silvapulle and
Silvapulle (1995), a one-sided score statistic is constructed. A
simulation study finds that the score statistic generally has desirable
size and power properties in moderately sized samples. The score test is
applied to the quarterly Australian consumption function. The income and
consumption series are found to be I(1) at zero and seasonal frequencies
and these two series are not cointegrated at any frequency.
Journal: Econometric Reviews
Pages: 85-104
Issue: 1
Volume: 20
Year: 2001
Keywords: Seasonal fractional roots, Long-memory, Fractional cointegration, One-sided alternatives, JEL Classification: C12, C22 and C32,
X-DOI: 10.1081/ETC-100104081
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104081
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:1:p:85-104
Template-Type: ReDIF-Article 1.0
Author-Name: Kazumitsu Nawata
Author-X-Name-First: Kazumitsu
Author-X-Name-Last: Nawata
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: SIZE CHARACTERISTICS OF TESTS FOR SAMPLE SELECTION BIAS: A MONTE CARLO COMPARISON AND EMPIRICAL EXAMPLE
Abstract:
The t-test of an individual coefficient is used widely in models of
qualitative choice. However, it is well known that the t-test can yield
misleading results when the sample size is small. This paper provides some
experimental evidence on the finite sample properties of the t-test in
models with sample selection biases, through a comparison of the t-test
with the likelihood ratio and Lagrange multiplier tests, which are
asymptotically equivalent to the squared t-test. The finite sample
problems with the t-test are shown to be alarming, and much more serious
than in models such as binary choice models. An empirical example is also
presented to highlight the differences in the calculated test statistics.
Journal: Econometric Reviews
Pages: 105-112
Issue: 1
Volume: 20
Year: 2001
Keywords: Sample selection bias, t-test, Wald test, JEL Classification: C12, C24,
X-DOI: 10.1081/ETC-100104082
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104082
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:1:p:105-112
Template-Type: ReDIF-Article 1.0
Author-Name: Chunrong Ai
Author-X-Name-First: Chunrong
Author-X-Name-Last: Ai
Title: A MODIFIED AVERAGE DERIVATIVES ESTIMATOR
Abstract:
We extend the average derivatives estimator to the case of functionally
dependent regressors. We show that the proposed estimator is consistent
and has a limiting normal distribution. A consistent covariance matrix
estimator for the proposed estimator is provided.
Journal: Econometric Reviews
Pages: 113-131
Issue: 1
Volume: 20
Year: 2001
Keywords: Nonparametric estimation, Functional dependency, Average derivatives estimator, JEL Classification: C2, C4,
X-DOI: 10.1081/ETC-100104083
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104083
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:1:p:113-131
Template-Type: ReDIF-Article 1.0
Author-Name: J. Denis Sargan
Author-X-Name-First: J. Denis
Author-X-Name-Last: Sargan
Title: MODEL BUILDING AND DATA MINING
Abstract:
This paper defines the phenomenon of data mining in econometrics and
discusses various outcomes of and solutions to data mining. Both classical
and Bayesian approaches are considered, each with notable advantages and
disadvantages, and with the choice of loss function affecting critical
values. Illustrative examples include variable addition and exclusion in a
standard linear regression model, the choice of lag structure in a dynamic
single equation, and specification in a simultaneous equations model.
Journal: Econometric Reviews
Pages: 159-170
Issue: 2
Volume: 20
Year: 2001
Keywords: Bayes, Loss function, Pre-test estimation, Specification searches, Stein-James estimator, JEL Classification: C44, C51,
X-DOI: 10.1081/ETC-100103820
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100103820
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:2:p:159-170
Template-Type: ReDIF-Article 1.0
Author-Name: J. Denis Sargan
Author-X-Name-First: J. Denis
Author-X-Name-Last: Sargan
Title: THE CHOICE BETWEEN SETS OF REGRESSORS
Abstract:
This paper examines the choice of critical values for testing both
non-sequential and nested sequential sets of constraints in the standard
linear regression model. Modest increases in (e.g.) t-ratio critical
values relative to their one-off values are often sufficient to maintain
proper size. A Bayesian decision-theoretic approach, highlighted by the
Schwarz (1978) criterion, provides a framework for deriving consistency
and asymptotic local power properties of both forms of testing (data
mining) algorithms.
Journal: Econometric Reviews
Pages: 171-186
Issue: 2
Volume: 20
Year: 2001
Keywords: Bayes, Bonferroni, Critical values, Data mining, Multiple testing, Scheffe, Schwarz criterion, Specification searches, JEL Classification: C44, C51,
X-DOI: 10.1081/ETC-100103821
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100103821
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:2:p:171-186
Template-Type: ReDIF-Article 1.0
Author-Name: Akira Tokihisa
Author-X-Name-First: Akira
Author-X-Name-Last: Tokihisa
Author-Name: Shigeyuki Hamori
Author-X-Name-First: Shigeyuki
Author-X-Name-Last: Hamori
Title: SEASONAL INTEGRATION FOR DAILY DATA
Abstract:
This paper has two purposes: it introduces econometric methods for
analyzing time series data of general frequency and presents a framework
for analyzing economic variables that are measured daily; this special
case is then applied to the trading volume of stock markets.
Journal: Econometric Reviews
Pages: 187-200
Issue: 2
Volume: 20
Year: 2001
Keywords: Seasonal unit roots, Asymptotic distribution, Stock markets, JEL Classification Number: C12, C15, and C22,
X-DOI: 10.1081/ETC-100103822
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100103822
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:2:p:187-200
Template-Type: ReDIF-Article 1.0
Author-Name: Gianluca Cubadda
Author-X-Name-First: Gianluca
Author-X-Name-Last: Cubadda
Title: COMMON FEATURES IN TIME SERIES WITH BOTH DETERMINISTIC AND STOCHASTIC SEASONALITY
Abstract:
This paper extends the notions of common cycles and common seasonal
features to time series having deterministic and stochastic seasonality at
different frequencies. The conditions under which quarterly time series
with these characteristics have common features are investigated, various
representations are presented and statistical inference is discussed.
Finally, the analysis is applied to study comovements between different
components of consumption and income using UK data.
Journal: Econometric Reviews
Pages: 201-216
Issue: 2
Volume: 20
Year: 2001
Keywords: Common features, Seasonality, Codependence, JEL Classification: C32, C52,
X-DOI: 10.1081/ETC-100103823
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100103823
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:2:p:201-216
Template-Type: ReDIF-Article 1.0
Author-Name: Gael Martin
Author-X-Name-First: Gael
Author-X-Name-Last: Martin
Title: BAYESIAN ANALYSIS OF A FRACTIONAL COINTEGRATION MODEL
Abstract:
The concept of fractional cointegration, whereby deviations from an
equilibrium relationship follow a fractionally integrated process, has
attracted some attention of late. The extended concept allows
cointegration to be associated with mean reversion in the error, rather
than requiring the more stringent condition of stationarity. This paper
presents a Bayesian method for conducting inference about fractional
cointegration. The method is based on an approximation of the exact
likelihood, with a Jeffreys prior being used to offset identification
problems. Numerical results are produced via a combination of Markov chain
Monte Carlo algorithms. The procedure is applied to several purchasing
power parity relations, with substantial evidence found in favor of parity
reversion.
Journal: Econometric Reviews
Pages: 217-234
Issue: 2
Volume: 20
Year: 2001
Keywords: Fractional cointegration, Bayesian inference, Jeffreys prior, Markov chain Monte Carlo, JEL Classification: C11; C32,
X-DOI: 10.1081/ETC-100103824
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100103824
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:2:p:217-234
Template-Type: ReDIF-Article 1.0
Author-Name: Mahmoud El-Gamal
Author-X-Name-First: Mahmoud
Author-X-Name-Last: El-Gamal
Title: A BAYESIAN INTERPRETATION OF MULTIPLE POINT ESTIMATES
Abstract:
Consider a large number of econometric investigations using different
estimation techniques and/or different subsets of all available data to
estimate a fixed set of parameters. The resulting empirical distribution
of point estimates can be shown - under suitable conditions - to coincide
with a Bayesian posterior measure on the parameter space induced by a
minimum information procedure. This Bayesian interpretation makes it
easier to combine the results of various empirical exercises for
statistical decision making. The collection of estimators may be generated
by one investigator to ensure the satisfaction of our conditions, or they
may be collected from published works, where behavioral assumptions need
to be made regarding the dependence structure of econometric studies.
Journal: Econometric Reviews
Pages: 235-245
Issue: 2
Volume: 20
Year: 2001
Keywords: Bayesian statistics and econometrics, Decision theory, Literature surveys, Meta-analysis, Markov random fields, Gibbs random fields, Point estimation, JEL Classification: C11, C13, C42, C44, and C51,
X-DOI: 10.1081/ETC-100103825
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100103825
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:2:p:235-245
Template-Type: ReDIF-Article 1.0
Author-Name: Kirstin Hubrich
Author-X-Name-First: Kirstin
Author-X-Name-Last: Hubrich
Author-Name: Helmut Lütkepohl
Author-X-Name-First: Helmut
Author-X-Name-Last: Lütkepohl
Author-Name: Pentti Saikkonen
Author-X-Name-First: Pentti
Author-X-Name-Last: Saikkonen
Title: A REVIEW OF SYSTEMS COINTEGRATION TESTS
Abstract:
The literature on systems cointegration tests is reviewed and the various
sets of assumptions for the asymptotic validity of the tests are compared
within a general unifying framework. The comparison includes likelihood
ratio tests, Lagrange multiplier and Wald type tests, lag augmentation
tests, tests based on canonical correlations, the Stock-Watson tests and
Bierens' nonparametric tests. Asymptotic results regarding the power of
these tests and previous small sample simulation studies are discussed.
Further issues and proposals in the context of systems cointegration tests
are also considered briefly. New simulations are presented to compare the
tests under uniform conditions. Special emphasis is given to the
sensitivity of the test performance with respect to the trending
properties of the DGP.
Journal: Econometric Reviews
Pages: 247-318
Issue: 3
Volume: 20
Year: 2001
Keywords: Systems cointegration tests, LR tests, Nonparametric tests, Asymptotic power, Small sample simulations,
X-DOI: 10.1081/ETC-100104936
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104936
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:3:p:247-318
Template-Type: ReDIF-Article 1.0
Author-Name: Shigeru Iwata
Author-X-Name-First: Shigeru
Author-X-Name-Last: Iwata
Title: RECENTERED AND RESCALED INSTRUMENTAL VARIABLE ESTIMATION OF TOBIT AND PROBIT MODELS WITH ERRORS IN VARIABLES
Abstract:
Since Durbin (1954) and Sargan (1958), the instrumental variable (IV)
method has long been one of the most popular procedures among economists
and other social scientists to handle linear models with errors-in-variables.
A direct application of this method to nonlinear errors-in-variables
models, however, fails to yield consistent estimators. This article
restricts attention to Tobit and Probit models and shows that simple
recentering and rescaling of the observed dependent variable may restore
consistency of the standard IV estimator if the true dependent variable
and the IVs are jointly normally distributed. Although the required
condition seems rarely to be satisfied by real data, our Monte Carlo
experiment suggests that the proposed estimator may be quite robust to the
possible deviation from normality.
Journal: Econometric Reviews
Pages: 319-335
Issue: 3
Volume: 20
Year: 2001
Keywords: Instrumental variables, GMM estimator, Nonlinear errors in variables, Elliptically symmetric distribution,
X-DOI: 10.1081/ETC-100104937
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104937
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:3:p:319-335
Template-Type: ReDIF-Article 1.0
Author-Name: Evzen Kocenda
Author-X-Name-First: Evzen
Author-X-Name-Last: Kocenda
Title: AN ALTERNATIVE TO THE BDS TEST: INTEGRATION ACROSS THE CORRELATION INTEGRAL
Abstract:
This paper extends and generalizes the BDS test presented by Brock,
Dechert, Scheinkman, and LeBaron (1996). In doing so it aims to remove the
limitation of having to arbitrarily select a proximity parameter by
integrating across the correlation integral. Monte Carlo simulation is
used to tabulate critical values of the alternative statistic. Previously
published empirical studies are replicated, and power tests are executed,
in order to evaluate the relative performance of the suggested alternative
to the BDS test. The results are favorable for the suggested alternative.
Journal: Econometric Reviews
Pages: 337-351
Issue: 3
Volume: 20
Year: 2001
Keywords: Chaos, Nonlinear dynamics, Correlation integral, Monte Carlo, Exchange rates,
X-DOI: 10.1081/ETC-100104938
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104938
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:3:p:337-351
Template-Type: ReDIF-Article 1.0
Author-Name: Robert Breunig
Author-X-Name-First: Robert
Author-X-Name-Last: Breunig
Title: DENSITY ESTIMATION FOR CLUSTERED DATA
Abstract:
The commonly used survey technique of clustering introduces dependence
into sample data. Such data is frequently used in economic analysis,
though the dependence induced by the sample structure of the data is often
ignored. In this paper, the effect of clustering on the non-parametric,
kernel estimate of the density, f(x), is examined. The window width
commonly used for density estimation for the case of i.i.d. data is shown
to no longer be optimal. A new optimal bandwidth using a higher-order
kernel is proposed and is shown to give a smaller integrated mean squared
error than two window widths which are widely used for the case of i.i.d.
data. Several illustrations from simulation are provided.
Journal: Econometric Reviews
Pages: 353-367
Issue: 3
Volume: 20
Year: 2001
Keywords: Bandwidth choice, Cluster sampling, Dependent data, Kernel density estimation,
X-DOI: 10.1081/ETC-100104939
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104939
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:3:p:353-367
Template-Type: ReDIF-Article 1.0
Author-Name: Offer Lieberman
Author-X-Name-First: Offer
Author-X-Name-Last: Lieberman
Title: THE EXACT BIAS OF THE LOG-PERIODOGRAM REGRESSION ESTIMATOR
Abstract:
The paper makes two contributions. First, we provide a formula for the
exact distribution of the periodogram evaluated at any arbitrary
frequency, when the sample is taken from any zero-mean stationary Gaussian
process. The inadequacy of the asymptotic distribution is demonstrated
through an example in which the observations are generated by a fractional
Gaussian noise process. The results are then applied in deriving the exact
bias of the log-periodogram regression estimator (Geweke and Porter-Hudak
(1983), Robinson (1995)). The formula is computable. Practical bounds on
this bias are developed and their arithmetic mean is shown to be accurate
and useful.
Journal: Econometric Reviews
Pages: 369-383
Issue: 3
Volume: 20
Year: 2001
Keywords: ARFIMA, Chi-square distribution, Log-periodogram regression,
X-DOI: 10.1081/ETC-100104940
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100104940
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:3:p:369-383
Template-Type: ReDIF-Article 1.0
Author-Name: Christian Gourieroux
Author-X-Name-First: Christian
Author-X-Name-Last: Gourieroux
Author-Name: Joann Jasiak
Author-X-Name-First: Joann
Author-X-Name-Last: Jasiak
Title: DYNAMIC FACTOR MODELS
Abstract:
This paper introduces nonlinear dynamic factor models for various
applications related to risk analysis. Traditional factor models represent
the dynamics of processes driven by movements of latent variables, called
the factors. Our approach extends this setup by introducing factors
defined as random dynamic parameters and stochastic autocorrelated
simulators. This class of factor models can represent processes with time
varying conditional mean, variance, skewness and excess kurtosis.
Applications discussed in the paper include dynamic risk analysis, such as
risk in price variations (models with stochastic mean and volatility),
extreme risks (models with stochastic tails), risk on asset liquidity
(stochastic volatility duration models), and moral hazard in insurance
analysis. We propose estimation procedures for models with the marginal
density of the series and factor dynamics parameterized by distinct
subsets of parameters. Such a partitioning of the parameter vector, found
in many applications, simplifies statistical inference considerably. We
develop a two-stage Maximum Likelihood method, called the Finite Memory
Maximum Likelihood, which is easy to implement in the
presence of multiple factors. We also discuss simulation based estimation,
testing, prediction and filtering.
Journal: Econometric Reviews
Pages: 385-424
Issue: 4
Volume: 20
Year: 2001
Keywords: Nonlinear dynamics, Factor models, Stochastic volatility, Moral hazard, Stable distributions, JEL Number: C22, C32, G10, G12,
X-DOI: 10.1081/ETC-100106997
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100106997
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:4:p:385-424
Template-Type: ReDIF-Article 1.0
Author-Name: Kurt Brannas
Author-X-Name-First: Kurt
Author-X-Name-Last: Brannas
Author-Name: Jorgen Hellstrom
Author-X-Name-First: Jorgen
Author-X-Name-Last: Hellstrom
Title: GENERALIZED INTEGER-VALUED AUTOREGRESSION
Abstract:
The integer-valued AR(1) model is generalized to encompass some of the
more likely features of economic time series of count data. The
generalizations come at the price of losing exact distributional
properties. For most specifications both the conditional and unconditional
first- and second-order moments can be obtained. Hence estimation, testing and
forecasting are feasible and can be based on least squares or GMM
techniques. An illustration based on the number of plants within an
industrial sector is considered.
Journal: Econometric Reviews
Pages: 425-443
Issue: 4
Volume: 20
Year: 2001
Keywords: Characterization, Dependence, Time series model, Estimation, Forecasting, Entry and exit, JEL Classification: C12, C13, C22, C25, C51,
X-DOI: 10.1081/ETC-100106998
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100106998
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:4:p:425-443
Template-Type: ReDIF-Article 1.0
Author-Name: Badi Baltagi
Author-X-Name-First: Badi
Author-X-Name-Last: Baltagi
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Title: ESTIMATION OF ECONOMETRIC MODELS WITH NONPARAMETRICALLY SPECIFIED RISK TERMS
Abstract:
This paper studies the asymptotic properties of the semiparametric
estimator considered by Pagan and Ullah (1988) and Pagan and Hong (1991)
for models with risk terms. We show that when the risk term is
nonparametrically specified, the estimator with generated regressors
suggested by Pagan and Ullah (1988) and Pagan and Hong (1991) is
√n-consistent and has an asymptotic normal distribution. The result
is then applied to analyzing the risk premium for the U.S. dollar against the
British pound, the French franc and the Japanese yen exchange markets for
monthly data covering the period 1976:1 to 1992:8.
Journal: Econometric Reviews
Pages: 445-460
Issue: 4
Volume: 20
Year: 2001
Keywords: Risk premium, Exchange market, Semiparametric estimation, √n-consistency, Asymptotic normality, JEL Classification: C14; C12,
X-DOI: 10.1081/ETC-100106999
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100106999
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:4:p:445-460
Template-Type: ReDIF-Article 1.0
Author-Name: Sung Ahn
Author-X-Name-First: Sung
Author-X-Name-Last: Ahn
Author-Name: Stergios Fotopoulos
Author-X-Name-First: Stergios
Author-X-Name-Last: Fotopoulos
Author-Name: Lijian He
Author-X-Name-First: Lijian
Author-X-Name-Last: He
Title: UNIT ROOT TESTS WITH INFINITE VARIANCE ERRORS
Abstract:
This paper considers the asymptotic properties of some unit root test
statistics with the errors belonging to the domain of attraction of a
symmetric α-stable law with 0 < α < 2. The results
obtained can be viewed as a parallel extension of the asymptotic results
for the finite-variance case. The test statistics considered are the
Dickey-Fuller, the Lagrange multiplier, the Durbin-Watson, and the
Phillips-type modified statistics. Their asymptotic distributions are expressed as
functionals of a standard symmetric α-stable Levy motion.
Percentiles of these test statistics are obtained by computer simulation.
Asymptotic distributions of sample moments that are part of the test
statistics are found to have explicit densities. A small Monte Carlo
simulation study is performed to assess small-sample performance of these
test statistics for heavy-tailed errors.
Journal: Econometric Reviews
Pages: 461-483
Issue: 4
Volume: 20
Year: 2001
Keywords: Stable distributions, Invariance principles, Partial sum processes,
X-DOI: 10.1081/ETC-100107000
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100107000
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:4:p:461-483
Template-Type: ReDIF-Article 1.0
Author-Name: David Mandy
Author-X-Name-First: David
Author-X-Name-Last: Mandy
Author-Name: Carlos Martins-Filho
Author-X-Name-First: Carlos
Author-X-Name-Last: Martins-Filho
Title: OPTIMAL IV ESTIMATION OF SYSTEMS WITH STOCHASTIC REGRESSORS AND VAR DISTURBANCES WITH APPLICATIONS TO DYNAMIC SYSTEMS
Abstract:
This paper considers the general problem of Feasible Generalized Least
Squares Instrumental Variables (FGLS IV) estimation using optimal
instruments. First we summarize the sufficient conditions for the FGLS IV
estimator to be asymptotically equivalent to an optimal GLS IV estimator.
Then we specialize to stationary dynamic systems with stationary VAR
errors, and use the sufficient conditions to derive new moment conditions
for these models. These moment conditions produce useful IVs from the
lagged endogenous variables, despite the correlation between errors and
endogenous variables. This use of the information contained in the lagged
endogenous variables expands the class of IV estimators under
consideration and thereby potentially improves both asymptotic and
small-sample efficiency of the optimal IV estimator in the class. Some
Monte Carlo experiments compare the new methods with those of Hatanaka
(1976). For the DGP used in the Monte Carlo experiments, asymptotic
efficiency is strictly improved by the new IVs, and experimental
small-sample efficiency is improved as well.
Journal: Econometric Reviews
Pages: 485-505
Issue: 4
Volume: 20
Year: 2001
Keywords: Dynamic models, IV estimation, VAR errors, JEL Classification: C30,
X-DOI: 10.1081/ETC-100107001
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-100107001
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:20:y:2001:i:4:p:485-505
Template-Type: ReDIF-Article 1.0
Author-Name: Dick van Dijk
Author-X-Name-First: Dick
Author-X-Name-Last: van Dijk
Author-Name: Timo Terasvirta
Author-X-Name-First: Timo
Author-X-Name-Last: Terasvirta
Author-Name: Philip Hans Franses
Author-X-Name-First: Philip Hans
Author-X-Name-Last: Franses
Title: SMOOTH TRANSITION AUTOREGRESSIVE MODELS — A SURVEY OF RECENT DEVELOPMENTS
Abstract:
This paper surveys recent developments related to the smooth transition
autoregressive (STAR) time series model and several of its variants. We
put emphasis on new methods for testing for STAR nonlinearity, model
evaluation, and forecasting. Several useful extensions of the basic STAR
model, which concern multiple regimes, time-varying non-linear properties,
and models for vector time series, are also reviewed.
Journal: Econometric Reviews
Pages: 1-47
Issue: 1
Volume: 21
Year: 2002
Keywords: Regime-switching models, Time series model specification, Model evaluation, Forecasting, Impulse response analysis, JEL Classification: C22, C52, E24,
X-DOI: 10.1081/ETC-120008723
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120008723
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:1:p:1-47
Template-Type: ReDIF-Article 1.0
Author-Name: M. Hashem Pesaran
Author-X-Name-First: M. Hashem
Author-X-Name-Last: Pesaran
Author-Name: Yongcheol Shin
Author-X-Name-First: Yongcheol
Author-X-Name-Last: Shin
Title: LONG-RUN STRUCTURAL MODELLING
Abstract:
The paper develops a general framework for identification, estimation,
and hypothesis testing in cointegrated systems when the cointegrating
coefficients are subject to (possibly) non-linear and cross-equation
restrictions, obtained from economic theory or other relevant a priori
information. It provides a proof of the consistency of the quasi maximum
likelihood estimators (QMLE), establishes the relative rates of
convergence of the QMLE of the short-run and the long-run parameters, and
derives their asymptotic distributions, thus generalizing the results
already available in the literature for the linear case. The paper also
develops tests of the over-identifying (possibly) non-linear restrictions
on the cointegrating vectors. The estimation and hypothesis testing
procedures are applied to an Almost Ideal Demand System estimated on U.K.
quarterly observations. Unlike many other studies of consumer demand, this
application does not treat relative prices and real per capita
expenditures as exogenously given.
Journal: Econometric Reviews
Pages: 49-87
Issue: 1
Volume: 21
Year: 2002
Keywords: Cointegration, Identification, QMLE, Consistency, Asymptotic distribution, testing non-linear restrictions, Almost Ideal Demand Systems, JEL Classifications: C1, C3, D1, E1,
X-DOI: 10.1081/ETC-120008724
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120008724
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:1:p:49-87
Template-Type: ReDIF-Article 1.0
Author-Name: Mukhtar Ali
Author-X-Name-First: Mukhtar
Author-X-Name-Last: Ali
Title: DISTRIBUTION OF THE LEAST SQUARES ESTIMATOR IN A FIRST-ORDER AUTOREGRESSIVE MODEL
Abstract:
This paper investigates the finite sample distribution of the least
squares estimator of the autoregressive parameter in a first-order
autoregressive model. A uniform asymptotic expansion for the distribution
applicable to both stationary and nonstationary cases is obtained.
Accuracy of the approximation to the distribution by the first few terms
of this expansion is then investigated. It is found that the leading term
of this expansion approximates the distribution well. The approximation
is, in almost all cases, accurate to the second decimal place throughout
the distribution. In the literature, there exist a number of
approximations to this distribution which are specifically designed to
apply in some special cases of this model. The present approximation
compares favorably with those approximations and, in fact, its accuracy
is, with almost no exception, as good as or better than that of the other
approximations. Convenience of numerical computation also seems to favor
the present approximation over the others. An application of the finding is
illustrated with examples.
Journal: Econometric Reviews
Pages: 89-119
Issue: 1
Volume: 21
Year: 2002
Keywords: Unit root, Saddlepoint approximation, Asymptotic expansion, JEL Classification: C13, C22,
X-DOI: 10.1081/ETC-120008725
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120008725
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:1:p:89-119
Template-Type: ReDIF-Article 1.0
Author-Name: Kazuhiro Ohtani
Author-X-Name-First: Kazuhiro
Author-X-Name-Last: Ohtani
Author-Name: Alan Wan
Author-X-Name-First: Alan
Author-X-Name-Last: Wan
Title: ON THE USE OF THE STEIN VARIANCE ESTIMATOR IN THE DOUBLE k-CLASS ESTIMATOR IN REGRESSION
Abstract:
This paper investigates the predictive mean squared error performance of
a modified double k-class estimator by incorporating the Stein variance
estimator. Recent studies show that the performance of the Stein rule
estimator can be improved by using the Stein variance estimator. However,
as we demonstrate below, this conclusion does not hold in general for all
members of the double k-class estimators. On the other hand, an estimator
is found to have smaller predictive mean squared error than the Stein
variance-Stein rule estimator, over quite large parts of the parameter
space.
Journal: Econometric Reviews
Pages: 121-134
Issue: 1
Volume: 21
Year: 2002
Keywords: Ad-hoc, Double k-class, Predictive mean squared error, Pre-test, Stein rule, JEL Classification: primary C13; secondary C20,
X-DOI: 10.1081/ETC-120008726
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120008726
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:1:p:121-134
Template-Type: ReDIF-Article 1.0
Author-Name: Gerd Ronning
Author-X-Name-First: Gerd
Author-X-Name-Last: Ronning
Title: ESTIMATION OF DISCRETE CHOICE MODELS WITH MINIMAL VARIATION OF ALTERNATIVE-SPECIFIC VARIABLES
Abstract:
The paper states conditions for minimal variation within the explanatory
variables such that the maximum likelihood estimate of the coefficient
vector in the discrete choice logit model is unique. Special emphasis is
given to the case in which (almost) all individuals observe the same set of
alternative-specific explanatory variables. The aspect of 'experimental
design' in discrete choice models is discussed.
Journal: Econometric Reviews
Pages: 135-146
Issue: 1
Volume: 21
Year: 2002
Keywords: Experimental design, Maximum likelihood, Multinomial logit, JEL Classification: C13, C25,
X-DOI: 10.1081/ETC-120008727
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120008727
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:1:p:135-146
Template-Type: ReDIF-Article 1.0
Author-Name: Zeng-Hua Lu
Author-X-Name-First: Zeng-Hua
Author-X-Name-Last: Lu
Author-Name: Maxwell King
Author-X-Name-First: Maxwell
Author-X-Name-Last: King
Title: IMPROVING THE NUMERICAL TECHNIQUE FOR COMPUTING THE ACCUMULATED DISTRIBUTION OF A QUADRATIC FORM IN NORMAL VARIABLES
Abstract:
This paper is concerned with the technique of numerically evaluating the
cumulative distribution function of a quadratic form in normal variables.
The efficiency of two new truncation bounds and of all existing
truncation bounds is investigated. We also find that the suggestion in the
literature for further splitting truncation errors might reduce
computational efficiency, and the optimum splitting rate could be
different in different situations. A practical solution is provided. The
paper also discusses a modified secant algorithm for finding the critical
value of the distribution at any given significance level.
Journal: Econometric Reviews
Pages: 149-165
Issue: 2
Volume: 21
Year: 2002
Keywords: Quadratic form in normal variables, Numerical inversion of characteristic function, Truncation error, Newton's method, Secant method, JEL Classification: C19, C63,
X-DOI: 10.1081/ETC-120014346
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120014346
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:2:p:149-165
Template-Type: ReDIF-Article 1.0
Author-Name: Badi Baltagi
Author-X-Name-First: Badi
Author-X-Name-Last: Baltagi
Author-Name: Seuck Heun Song
Author-X-Name-First: Seuck Heun
Author-X-Name-Last: Song
Author-Name: Byoung Cheol Jung
Author-X-Name-First: Byoung Cheol
Author-X-Name-Last: Jung
Title: SIMPLE LM TESTS FOR THE UNBALANCED NESTED ERROR COMPONENT REGRESSION MODEL
Abstract:
This paper derives several Lagrange Multiplier tests for the unbalanced
nested error component model. Economic data with a natural nested grouping
include firms grouped by industry or students grouped by school. The LM
tests derived include the joint test for both effects as well as the test
for one effect conditional on the presence of the other. The paper also
derives the standardized versions of these tests, their asymptotically
locally mean most powerful versions, as well as their versions robust to
local misspecification. Monte Carlo experiments are conducted to study
the performance of these LM tests.
Journal: Econometric Reviews
Pages: 167-187
Issue: 2
Volume: 21
Year: 2002
Keywords: Panel data, Nested error component, Unbalanced data, LM tests,
X-DOI: 10.1081/ETC-120014347
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120014347
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:2:p:167-187
Template-Type: ReDIF-Article 1.0
Author-Name: Nilanjana Roy
Author-X-Name-First: Nilanjana
Author-X-Name-Last: Roy
Title: IS ADAPTIVE ESTIMATION USEFUL FOR PANEL MODELS WITH HETEROSKEDASTICITY IN THE INDIVIDUAL SPECIFIC ERROR COMPONENT? SOME MONTE CARLO EVIDENCE
Abstract:
This paper first derives an adaptive estimator when heteroskedasticity is
present in the individual specific error in an error component model and
then compares the finite sample performance of the proposed estimator with
various other estimators. While the Monte Carlo results show that the
proposed estimator performs adequately in terms of relative efficiency,
its performance on the basis of empirical size is quite similar to the
other estimators considered.
Journal: Econometric Reviews
Pages: 189-203
Issue: 2
Volume: 21
Year: 2002
Keywords: Heteroskedasticity, Kernel estimation, Error component model, JEL Classification: C14, C23,
X-DOI: 10.1081/ETC-120014348
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120014348
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:2:p:189-203
Template-Type: ReDIF-Article 1.0
Author-Name: John Galbraith
Author-X-Name-First: John
Author-X-Name-Last: Galbraith
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Author-Name: Victoria Zinde-Walsh
Author-X-Name-First: Victoria
Author-X-Name-Last: Zinde-Walsh
Title: ESTIMATION OF THE VECTOR MOVING AVERAGE MODEL BY VECTOR AUTOREGRESSION
Abstract:
We examine a simple estimator for the multivariate moving average model
based on vector autoregressive approximation. In finite samples the
estimator has a bias which is low where roots of the characteristic
equation are well away from the unit circle, and more substantial where
one or more roots have modulus near unity. We show that the representation
estimated by this multivariate technique is consistent and asymptotically
invertible. This estimator has significant computational advantages over
Maximum Likelihood, and more importantly may be more robust than ML to
mis-specification of the vector moving average model. The estimation
method is applied to a VMA model of wholesale and retail inventories,
using Canadian data on inventory investment, and allows us to examine the
propagation of shocks between the two classes of inventory.
Journal: Econometric Reviews
Pages: 205-219
Issue: 2
Volume: 21
Year: 2002
Keywords: Vector autoregression, Vector moving average, JEL Classification: C12, C22,
X-DOI: 10.1081/ETC-120014349
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120014349
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:2:p:205-219
Template-Type: ReDIF-Article 1.0
Author-Name: Denise Osborn
Author-X-Name-First: Denise
Author-X-Name-Last: Osborn
Author-Name: Paulo Rodrigues
Author-X-Name-First: Paulo
Author-X-Name-Last: Rodrigues
Title: ASYMPTOTIC DISTRIBUTIONS OF SEASONAL UNIT ROOT TESTS: A UNIFYING APPROACH
Abstract:
This paper adopts a unified approach to the derivation of the asymptotic
distributions of various seasonal unit root tests. The procedures
considered are those of Dickey et al. [DHF], Kunst, Hylleberg et al.
[HEGY], Osborn et al. [OCSB], Ghysels et al. [GHL] and Franses. This
unified approach shows that the asymptotic distributions of all these test
statistics are functions of the same vector of Brownian motions. The Kunst
test and the overall HEGY F-test are, indeed, equivalent both
asymptotically and in finite samples, while the Franses and GHL tests are
shown to have equivalent parameterizations. The OCSB and DHF test
regressions are viewed as restricted forms of the Kunst-HEGY regressions,
and these restrictions may have non-trivial asymptotic implications.
Journal: Econometric Reviews
Pages: 221-241
Issue: 2
Volume: 21
Year: 2002
Keywords: Seasonal unit roots, Asymptotic distributions, Unit root tests, Brownian motions, JEL Classification: C12, C22,
X-DOI: 10.1081/ETC-120014350
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120014350
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:2:p:221-241
Template-Type: ReDIF-Article 1.0
Author-Name: Eiji Kurozumi
Author-X-Name-First: Eiji
Author-X-Name-Last: Kurozumi
Title: TESTING FOR PERIODIC STATIONARITY
Abstract:
This paper proposes a test for the null hypothesis of periodic
stationarity against the alternative hypothesis of periodic integration.
We derive the limiting distribution of the test statistic and its
characteristic function, which are the same as those of the test developed
in Kwiatkowski, Phillips, Schmidt and Shin [15]. We find that some
parameters, which we must assume under the alternative, have an important
effect on the limiting power, so we should choose such parameters
carefully. A Monte Carlo simulation reveals that the test has reasonable
power but may be affected by the lag truncation parameter that is used for
the correction of nuisance parameters.
Journal: Econometric Reviews
Pages: 243-270
Issue: 2
Volume: 21
Year: 2002
Keywords: Periodic stationarity, Periodic integration, Hypothesis testing, JEL Classification: C22, C32,
X-DOI: 10.1081/ETC-120014351
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120014351
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:2:p:243-270
Template-Type: ReDIF-Article 1.0
Author-Name: Alain Hecq
Author-X-Name-First: Alain
Author-X-Name-Last: Hecq
Author-Name: Franz Palm
Author-X-Name-First: Franz
Author-X-Name-Last: Palm
Author-Name: Jean-Pierre Urbain
Author-X-Name-First: Jean-Pierre
Author-X-Name-Last: Urbain
Title: SEPARATION, WEAK EXOGENEITY, AND P-T DECOMPOSITION IN COINTEGRATED VAR SYSTEMS WITH COMMON FEATURES
Abstract:
The aim of this paper is to study the concept of separability in multiple
nonstationary time series displaying both common stochastic trends and
common stochastic cycles. When modeling the dynamics of multiple time
series for a panel of several entities such as countries, sectors, or firms,
imposing some form of separability and commonalities is often required to
restrict the dimension of the parameter space. For this purpose we
introduce the concept of common feature separation and investigate the
relationships between separation in cointegration and separation in serial
correlation common features. Loosely speaking we investigate whether a set
of time series can be partitioned into subsets such that there are serial
correlation common features within the sub-groups only. The paper
investigates three issues. First, it provides conditions for separating
joint cointegrating vectors into marginal cointegrating vectors as well as
separating joint short-term dynamics into marginal short-term dynamics.
Second, conditions for making permanent-transitory decompositions based on
marginal systems are given. Third, issues of weak exogeneity are
considered. Likelihood ratio type tests for the different hypotheses under
study are proposed. An empirical analysis of the link between economic
fluctuations in the United States and Canada shows the practical relevance
of the approach proposed in this paper.
Journal: Econometric Reviews
Pages: 273-307
Issue: 3
Volume: 21
Year: 2002
Keywords: Separation, Cointegration, Common features, Weak exogeneity, P-T Decomposition, Consumption function,
X-DOI: 10.1081/ETC-120015785
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015785
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:3:p:273-307
Template-Type: ReDIF-Article 1.0
Author-Name: Jinyong Hahn
Author-X-Name-First: Jinyong
Author-X-Name-Last: Hahn
Author-Name: Atsushi Inoue
Author-X-Name-First: Atsushi
Author-X-Name-Last: Inoue
Title: A MONTE CARLO COMPARISON OF VARIOUS ASYMPTOTIC APPROXIMATIONS TO THE DISTRIBUTION OF INSTRUMENTAL VARIABLES ESTIMATORS
Abstract:
We examine the empirical relevance of three alternative asymptotic
approximations to the distribution of instrumental variables estimators by
Monte Carlo experiments. We find that conventional asymptotics provides a
reasonable approximation to the actual distribution of instrumental
variables estimators when the sample size is reasonably large. For most
sample sizes, we find that Bekker [11] asymptotics provides a reasonably
good approximation even when the first-stage R2 is very small. We conclude
that reporting the Bekker [11] confidence interval would suffice for most
microeconometric (cross-sectional) applications, and the comparative
advantage of the Staiger and Stock [5] asymptotic approximation is in
applications with sample sizes typical in macroeconometric (time series)
applications.
Journal: Econometric Reviews
Pages: 309-336
Issue: 3
Volume: 21
Year: 2002
Keywords: Many instruments, Weak instruments, JEL Classification: C31,
X-DOI: 10.1081/ETC-120015786
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015786
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:3:p:309-336
Template-Type: ReDIF-Article 1.0
Author-Name: Yanqin Fan
Author-X-Name-First: Yanqin
Author-X-Name-Last: Fan
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Title: A CONSISTENT MODEL SPECIFICATION TEST BASED ON THE KERNEL SUM OF SQUARES OF RESIDUALS
Abstract:
This paper constructs a consistent model specification test based on the
difference between the nonparametric kernel sum of squares of residuals
and the sum of squares of residuals from a parametric null model. We
establish the asymptotic normality of the proposed test statistic under
the null hypothesis of correct parametric specification and show that the
wild bootstrap method can be used to approximate the null distribution of
the test statistic. Results from a small simulation study are reported to
examine the finite sample performance of the proposed tests.
Journal: Econometric Reviews
Pages: 337-352
Issue: 3
Volume: 21
Year: 2002
Keywords: Consistent test, Kernel method, Sum of squares of residuals, Asymptotic normality, Wild bootstrap, Simulation, JEL Classification Number: C12, C14,
X-DOI: 10.1081/ETC-120015787
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015787
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:3:p:337-352
Template-Type: ReDIF-Article 1.0
Author-Name: Serena Ng
Author-X-Name-First: Serena
Author-X-Name-Last: Ng
Author-Name: Timothy Vogelsang
Author-X-Name-First: Timothy
Author-X-Name-Last: Vogelsang
Title: ANALYSIS OF VECTOR AUTOREGRESSIONS IN THE PRESENCE OF SHIFTS IN MEAN
Abstract:
This paper considers the implications of mean shifts in a multivariate
setting. It is shown that under the additive outlier type mean shift
specification, the intercept in each equation of the vector autoregression
(VAR) will be subject to multiple shifts when the break dates of the mean
shifts to the univariate series do not coincide. Conversely, under the
innovative outlier type mean shift specification, both the univariate and
the multivariate time series are subject to multiple shifts when mean
shifts to the innovation processes occur at different dates. We consider
two procedures, the first removes the shifts series by series before
forming the VAR, and the second removes intercept shifts in the VAR
directly. The pros and cons of both methods are discussed.
Journal: Econometric Reviews
Pages: 353-381
Issue: 3
Volume: 21
Year: 2002
Keywords: Trend break, Structural change, Causality tests, Forecasting,
X-DOI: 10.1081/ETC-120015788
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015788
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:3:p:353-381
Template-Type: ReDIF-Article 1.0
Author-Name: Jason Abrevaya
Author-X-Name-First: Jason
Author-X-Name-Last: Abrevaya
Title: COMPUTING MARGINAL EFFECTS IN THE BOX-COX MODEL
Abstract:
This paper considers computation of fitted values and marginal effects in
the Box-Cox regression model. Two methods, (1) the “smearing”
technique suggested by Duan (see Ref. [10]) and (2) direct numerical
integration, are examined and compared with the “naive”
method often used in econometrics.
Journal: Econometric Reviews
Pages: 383-393
Issue: 3
Volume: 21
Year: 2002
Keywords: Marginal effects, Box-Cox model, JEL Classification: C13, C21,
X-DOI: 10.1081/ETC-120015789
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015789
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:3:p:383-393
Template-Type: ReDIF-Article 1.0
Author-Name: Jonathan Wright
Author-X-Name-First: Jonathan
Author-X-Name-Last: Wright
Title: LOG-PERIODOGRAM ESTIMATION OF LONG MEMORY VOLATILITY DEPENDENCIES WITH CONDITIONALLY HEAVY TAILED RETURNS
Abstract:
Many recent papers have used semiparametric methods, especially the
log-periodogram regression, to detect and estimate long memory in the
volatility of asset returns. In these papers, the volatility is proxied by
measures such as squared, log-squared, and absolute returns. While the
evidence for the existence of long memory is strong using any of these
measures, the actual long memory parameter estimates can be sensitive to
which measure is used. In Monte-Carlo simulations, I find that if the data
is conditionally leptokurtic, the log-periodogram regression estimator
using squared returns has a large downward bias, which is avoided by using
other volatility measures. In United States stock return data, I find that
squared returns give much lower estimates of the long memory parameter
than the alternative volatility measures, which is consistent with the
simulation results. I conclude that researchers should avoid using the
squared returns in the semiparametric estimation of long memory volatility
dependencies.
Journal: Econometric Reviews
Pages: 397-417
Issue: 4
Volume: 21
Year: 2002
Keywords: Semiparametric methods, Fractional integration, Stochastic volatility, Stock returns, Heavy tails, JEL Classification: C22, G10,
X-DOI: 10.1081/ETC-120015382
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015382
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:4:p:397-417
Template-Type: ReDIF-Article 1.0
Author-Name: Ionel Birgean
Author-X-Name-First: Ionel
Author-X-Name-Last: Birgean
Author-Name: Lutz Kilian
Author-X-Name-First: Lutz
Author-X-Name-Last: Kilian
Title: DATA-DRIVEN NONPARAMETRIC SPECTRAL DENSITY ESTIMATORS FOR ECONOMIC TIME SERIES: A MONTE CARLO STUDY
Abstract:
Spectral analysis at frequencies other than zero plays an increasingly
important role in econometrics. A number of alternative automated
data-driven procedures for nonparametric spectral density estimation have
been suggested in the literature, but little is known about their
finite-sample accuracy. We compare five such procedures in terms of their
mean-squared percentage error across frequencies. Our data generating
processes (DGP) include autoregressive-moving average (ARMA) models,
fractionally integrated ARMA models and nonparametric models based on 16
commonly used macroeconomic time series. We find that for both quarterly
and monthly data the autoregressive sieve estimator is the most reliable
method overall.
Journal: Econometric Reviews
Pages: 449-476
Issue: 4
Volume: 21
Year: 2002
Keywords: Business cycle measurement, Model identification, Periodogram smoothing, Autocovariance smoothing, Autoregressive sieve, Bandwidth selection,
X-DOI: 10.1081/ETC-120015386
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015386
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:4:p:449-476
Template-Type: ReDIF-Article 1.0
Author-Name: Edoardo Otranto
Author-X-Name-First: Edoardo
Author-X-Name-Last: Otranto
Author-Name: Giampiero Gallo
Author-X-Name-First: Giampiero
Author-X-Name-Last: Gallo
Title: A NONPARAMETRIC BAYESIAN APPROACH TO DETECT THE NUMBER OF REGIMES IN MARKOV SWITCHING MODELS
Abstract:
Journal: Econometric Reviews
Pages: 477-496
Issue: 4
Volume: 21
Year: 2002
Keywords: Markov switching models, Nuisance parameters, Specification testing, Exchange rate determination, JEL Classification: C2, C5, F3,
X-DOI: 10.1081/ETC-120015387
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015387
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:4:p:477-496
Template-Type: ReDIF-Article 1.0
Author-Name: Russell Davidson
Author-X-Name-First: Russell
Author-X-Name-Last: Davidson
Author-Name: James MacKinnon
Author-X-Name-First: James
Author-X-Name-Last: MacKinnon
Title: FAST DOUBLE BOOTSTRAP TESTS OF NONNESTED LINEAR REGRESSION MODELS
Abstract:
It has been shown in previous work that bootstrapping the J test for
nonnested linear regression models dramatically improves its finite-sample
performance. We provide evidence that a more sophisticated bootstrap
procedure, which we call the fast double bootstrap, produces a very
substantial further improvement in cases where the ordinary bootstrap does
not work as well as it might. This FDB procedure is only about twice as
expensive as the usual single bootstrap.
Journal: Econometric Reviews
Pages: 419-429
Issue: 4
Volume: 21
Year: 2002
Keywords: Nonnested test, Bootstrap test, J test, JEL Classification: C12, C15, C20,
X-DOI: 10.1081/ETC-120015384
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015384
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:4:p:419-429
Template-Type: ReDIF-Article 1.0
Author-Name: Yoosoon Chang
Author-X-Name-First: Yoosoon
Author-X-Name-Last: Chang
Author-Name: Joon Park
Author-X-Name-First: Joon
Author-X-Name-Last: Park
Title: ON THE ASYMPTOTICS OF ADF TESTS FOR UNIT ROOTS
Abstract:
In this paper, we derive the asymptotic distributions of
Augmented Dickey-Fuller (ADF) tests under very mild conditions. The tests
were originally proposed and investigated by Said and Dickey (1984) for
testing unit roots in finite-order ARMA models with i.i.d. innovations,
and are based on a finite AR process of order increasing with the sample
size. Our conditions are significantly weaker than theirs. In particular,
we allow for general linear processes with martingale difference
innovations, possibly having conditional heteroskedasticities. The linear
processes driven by ARCH type innovations are thus permitted. The range
for the permissible increasing rates for the AR approximation order is
also much wider. For the usual t-type test, we only require that it
increase at rate o(n^{1/2}), while they assume that it is of order
o(n^{κ}) for some κ satisfying 0 < κ ≤ 1/3.
Journal: Econometric Reviews
Pages: 431-447
Issue: 4
Volume: 21
Year: 2002
Keywords: ADF tests, Unit roots, Asymptotics, Linear process, Autoregressive approximation,
X-DOI: 10.1081/ETC-120015385
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120015385
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:21:y:2002:i:4:p:431-447
Template-Type: ReDIF-Article 1.0
Author-Name: François Laisney
Author-X-Name-First: François
Author-X-Name-Last: Laisney
Author-Name: Michael Lechner
Author-X-Name-First: Michael
Author-X-Name-Last: Lechner
Title: Almost Consistent Estimation of Panel Probit Models with "Small" Fixed Effects
Abstract:
We propose four different GMM estimators that
allow almost consistent estimation of the structural parameters of panel
probit models with fixed effects for the case of small
T and large N. The moments used are
derived for each period from a first order approximation of the mean of
the dependent variable conditional on explanatory variables and on the
fixed effect. The estimators differ w.r.t. the choice of instruments and
whether they use trimming to reduce the bias or not. In a Monte Carlo
study, we compare these estimators with pooled probit and conditional
logit estimators for different data generating processes. The results show
that the proposed estimators outperform these competitors in several
situations.
Journal: Econometric Reviews
Pages: 1-28
Issue: 1
Volume: 22
Year: 2003
Month: 2
X-DOI: 10.1081/ETC-120017972
File-URL: http://hdl.handle.net/10.1081/ETC-120017972
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:1:p:1-28
Template-Type: ReDIF-Article 1.0
Author-Name: Maurice J. G. Bun
Author-X-Name-First: Maurice J. G.
Author-X-Name-Last: Bun
Title: Bias Correction in the Dynamic Panel Data Model with a Nonscalar Disturbance Covariance Matrix
Abstract:
Approximation formulae are developed for the bias
of ordinary and generalized Least Squares Dummy Variable (LSDV) estimators
in dynamic panel data models. Results from Kiviet [Kiviet, J. F. (1995).
On bias, inconsistency, and efficiency of various estimators in dynamic
panel data models. J. Econometrics 68:53-78; Kiviet, J. F.
(1999). Expectations of expansions for estimators in a dynamic panel data
model: some results for weakly exogenous regressors. In: Hsiao, C.,
Lahiri, K., Lee, L-F., Pesaran, M. H., eds., Analysis of Panels
and Limited Dependent Variables. Cambridge: Cambridge University
Press, pp. 199-225] are extended to higher-order dynamic panel data models
with general covariance structure. The focus is on estimation of both
short- and long-run coefficients. The results show that proper modelling
of the disturbance covariance structure is indispensable. The bias
approximations are used to construct bias corrected estimators which are
then applied to quarterly data from 14 European Union countries. Money
demand functions for M1, M2 and
M3 are estimated for the EU area as a whole for the
period 1991:I-1995:IV. Significant spillovers between countries are
found reflecting the dependence of domestic money demand on foreign
developments. The empirical results show that in general plausible
long-run effects are obtained by the bias corrected estimators. Moreover,
finite sample bias, although of moderate magnitude, is present underlining
the importance of more refined estimation techniques. Also the efficiency
gains by exploiting the heteroscedasticity and cross-correlation patterns
between countries are sometimes considerable.
Journal: Econometric Reviews
Pages: 29-58
Issue: 1
Volume: 22
Year: 2003
Month: 2
X-DOI: 10.1081/ETC-120017973
File-URL: http://hdl.handle.net/10.1081/ETC-120017973
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:1:p:29-58
Template-Type: ReDIF-Article 1.0
Author-Name: Melvyn Weeks
Author-X-Name-First: Melvyn
Author-X-Name-Last: Weeks
Author-Name: James Yudong Yao
Author-X-Name-First: James
Author-X-Name-Last: Yudong Yao
Title: Provincial Conditional Income Convergence in China, 1953-1997: A Panel Data Approach
Abstract:
This paper examines the tendency towards income
convergence among China's main provinces during the two periods: the
pre-reform period 1953-1977 and the reform period 1978-1997 using the
framework of the Solow growth model. The panel data method accounts for
not only province-specific initial technology level but also the
heterogeneity of the technological progress rate between the fast-growing
coastal and interior provinces. Estimation problems of weak instruments
and endogeneity are addressed by the use of a system generalized method of
moments (GMM) estimator. The main empirical finding is that there is a
system-wide income divergence during the reform period because the coastal
provinces do not share a common technology progress rate with the interior
provinces.
Journal: Econometric Reviews
Pages: 59-77
Issue: 1
Volume: 22
Year: 2003
Month: 2
X-DOI: 10.1081/ETC-120017974
File-URL: http://hdl.handle.net/10.1081/ETC-120017974
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:1:p:59-77
Template-Type: ReDIF-Article 1.0
Author-Name: Rafik Baccouche
Author-X-Name-First: Rafik
Author-X-Name-Last: Baccouche
Author-Name: Mokhtar Kouki
Author-X-Name-First: Mokhtar
Author-X-Name-Last: Kouki
Title: Stochastic Production Frontier and Technical Inefficiency: A Sensitivity Analysis
Abstract:
The present paper focuses attention on the
sensitivity of technical inefficiency to most commonly used one-sided
distributions of the inefficiency error term, namely the truncated normal,
the half-normal, and the exponential distributions. A generalized version
of the half-normal, which does not embody the zero-mean restriction, is
also explored. For each distribution, the likelihood function and the
counterpart of the estimator of technical efficiency are explicitly stated
(Jondrow, J., Lovell, C. A. K., Materov, I. S., Schmidt, P. (1982). On
estimation of technical inefficiency in the stochastic frontier production
function model. J. Econometrics 19:233-238). Based on our
panel data set, related to Tunisian manufacturing firms over the period
1983-1993, formal tests lead to a strong rejection of the zero-mean
restriction embodied in the half-normal distribution. Our main conclusion
is that the degree of measured inefficiency is very sensitive to the
postulated assumptions about the distribution of the one-sided error term.
The estimated inefficiency indices are, however, unaffected by the choice
of the functional form for the production function.
Journal: Econometric Reviews
Pages: 79-91
Issue: 1
Volume: 22
Year: 2003
Month: 2
X-DOI: 10.1081/ETC-120017975
File-URL: http://hdl.handle.net/10.1081/ETC-120017975
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:1:p:79-91
Template-Type: ReDIF-Article 1.0
Author-Name: Rim Ben Ayed-Mouelhi
Author-X-Name-First: Rim Ben
Author-X-Name-Last: Ayed-Mouelhi
Author-Name: Mohamed Goaïed
Author-X-Name-First: Mohamed
Author-X-Name-Last: Goaïed
Title: Efficiency Measure from Dynamic Stochastic Production Frontier: Application to Tunisian Textile, Clothing, and Leather Industries
Abstract:
This paper addresses the measurement of technical
efficiency of textile, clothing, and leather (TCL) industries in Tunisia
through a panel data estimation of a dynamic translog production frontier.
It provides a perspective on productivity and efficiency that should be
instructive to a developing economy which will face substantial
competitive pressure along the gradual economic liberalisation process.
The importance of TCL industries in the Tunisian manufacturing sector is
a reason for obtaining more knowledge of productivity and efficiency in
this key industry. Dynamics are introduced to reflect the production
consequences of the adjustment costs, which are associated with changes in
factor inputs. Estimation of a dynamic error components model is
considered using the system generalized method of moments (GMM) estimator
suggested by Arellano and Bover (Arellano, M., Bover, O. (1995). Another
look at the instrumental-variable estimation of error-components models.
J. Econometrics 68:29-51) and Blundell and Bond (Blundell, R., Bond,
S. (1998a). Initial conditions and moment restrictions in dynamic panel
data models. J. Econometrics 87:115-143; Blundell, R.,
Bond, S. (1998b). GMM estimation with persistent panel data: an
application to production functions. Paper presented at the Eighth
International Conference on Panel Data, Goteborg University). Our study
evaluates the sensitivity of the results, particularly of the efficiency
measures, to different specifications. Firm-specific time-invariant
technical efficiency is obtained using the Schmidt and Sickles (Schmidt,
P., Sickles, R. C. (1984). Production frontiers and panel data. J.
Bus. Econ. Stat. 2:367-374) approach after estimating the dynamic
frontier. We stress the importance of allowing for lags in adjustment of
output to inputs and of controlling for time-invariant variables when
estimating firm-specific efficiency. The results suggest that the system
GMM estimation of the dynamic specification produces the most accurate
parameter estimates and technical efficiency measures. The mean
efficiency score is 68%. Policy implications of the results are outlined.
Journal: Econometric Reviews
Pages: 93-111
Issue: 1
Volume: 22
Year: 2003
Month: 2
X-DOI: 10.1081/ETC-120017976
File-URL: http://hdl.handle.net/10.1081/ETC-120017976
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:1:p:93-111
Template-Type: ReDIF-Article 1.0
Author-Name: Cheng Hsiao
Author-X-Name-First: Cheng
Author-X-Name-Last: Hsiao
Title: In Memoriam: G. S. Maddala
Journal: Econometric Reviews
Pages: vii-ix
Issue: 1
Volume: 22
Year: 2003
Month: 2
X-DOI: 10.1081/ETC-120017977
File-URL: http://hdl.handle.net/10.1081/ETC-120017977
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:1:p:vii-ix
Template-Type: ReDIF-Article 1.0
Author-Name: Jacques Mairesse
Author-X-Name-First: Jacques
Author-X-Name-Last: Mairesse
Title: In Memoriam: Zvi Griliches
Journal: Econometric Reviews
Pages: xi-xv
Issue: 1
Volume: 22
Year: 2003
Month: 2
X-DOI: 10.1081/ETC-120017978
File-URL: http://hdl.handle.net/10.1081/ETC-120017978
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:1:p:xi-xv
Template-Type: ReDIF-Article 1.0
Author-Name: Brian Boyer
Author-X-Name-First: Brian
Author-X-Name-Last: Boyer
Author-Name: James McDonald
Author-X-Name-First: James
Author-X-Name-Last: McDonald
Author-Name: Whitney Newey
Author-X-Name-First: Whitney
Author-X-Name-Last: Newey
Title: A Comparison of Partially Adaptive and Reweighted Least Squares Estimation
Abstract:
The small sample performance of least median of squares, reweighted least
squares (RLS), least squares, least absolute deviations, and three partially
adaptive estimators are compared using Monte Carlo simulations. Two data
problems are addressed in the paper: (1) data generated from non-normal
error distributions and (2) contaminated data. Breakdown plots are used to
investigate the sensitivity of partially adaptive estimators to data
contamination relative to RLS. One partially adaptive estimator performs
especially well when the errors are skewed, while another partially
adaptive estimator and RLS perform particularly well when the errors are
extremely leptokurtic. In comparison with RLS, partially adaptive
estimators are only moderately effective in resisting data contamination;
however, they outperform least squares and least absolute deviation
estimators.
Journal: Econometric Reviews
Pages: 115-134
Issue: 2
Volume: 22
Year: 2003
Keywords: Least median of squares, Reweighted least squares, Partially adaptive estimation,
X-DOI: 10.1081/ETC-120020459
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120020459
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:2:p:115-134
Template-Type: ReDIF-Article 1.0
Author-Name: William Barnett
Author-X-Name-First: William
Author-X-Name-Last: Barnett
Author-Name: Meenakshi Pasupathy
Author-X-Name-First: Meenakshi
Author-X-Name-Last: Pasupathy
Title: Regularity of the Generalized Quadratic Production Model: A Counterexample
Abstract:
Recently there has been a growing tendency to impose curvature, but not
monotonicity, on specifications of technology. But regularity requires
satisfaction of both curvature and monotonicity conditions. Without both
satisfied, the second order conditions for optimizing behavior fail and
duality theory fails. When neither curvature nor monotonicity are imposed,
estimated flexible specifications of technology are much more likely to
violate curvature than monotonicity. Hence it has been argued that there
is no need to impose or check for monotonicity, when curvature has been
imposed globally. But imposition of curvature may induce violations of
monotonicity that otherwise would not have occurred. We explore the
regularity properties of our earlier results with a multiproduct financial
technology specified to be generalized quadratic. In our earlier work, we
used the usual approach and accepted the usual view. We now find that
imposition of curvature globally and monotonicity locally does not assure
monotonicity within the region of the data. Our purpose is to alert
researchers to the kinds of problems that we encountered and which we
believe are largely being overlooked in the production modelling
literature, as we had been overlooking them.
Journal: Econometric Reviews
Pages: 135-154
Issue: 2
Volume: 22
Year: 2003
Keywords: Technology, Regularity, Curvature, Production, Flexibility,
X-DOI: 10.1081/ETC-120020460
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120020460
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:2:p:135-154
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Daniel Slottje
Author-X-Name-First: Daniel
Author-X-Name-Last: Slottje
Title: Dynamics of Market Power and Concentration Profiles
Abstract:
This paper examines some of the economic and econometric issues that
arise in attempting to measure the degree of concentration in an industry
and its dynamic evolution. A general axiomatic basis is developed. We
offer new measures of concentration over aggregated periods of time and
provide a sound statistical basis for inferences. Concentration is one
aspect of the problem of measuring “market power” within an
industry. Modern economic analysis of antitrust issues does not focus only
on the level of concentration, but still must examine the issue carefully.
We contrast concentration at a point in time with a dynamic profile of
change in the distribution of shares in a given market. Our methods are
demonstrated with an application to the US steel industry.
Journal: Econometric Reviews
Pages: 155-177
Issue: 2
Volume: 22
Year: 2003
Keywords: Market power, Concentration, Mobility, Statistical inference, US Steel industry, Industrial concentration, Antitrust, Steel, Tests, Dynamic profiles,
X-DOI: 10.1081/ETC-120020461
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120020461
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:2:p:155-177
Template-Type: ReDIF-Article 1.0
Author-Name: Shiqing Ling
Author-X-Name-First: Shiqing
Author-X-Name-Last: Ling
Author-Name: W. K. Li
Author-X-Name-First: W. K.
Author-X-Name-Last: Li
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: Estimation and Testing for Unit Root Processes with GARCH (1, 1) Errors: Theory and Monte Carlo Evidence
Abstract:
Least squares (LS) and maximum likelihood (ML) estimation are considered
for unit root processes with GARCH (1, 1) errors. The asymptotic
distributions of LS and ML estimators are derived under the condition
α + β < 1. The former has the usual
unit root distribution and the latter is a functional of a bivariate
Brownian motion, as in Ling and Li [Ling, S., Li, W. K. (1998). Limiting
distributions of maximum likelihood estimators for unstable autoregressive
moving-average time series with GARCH errors. Ann. Statist. 26:84-125].
Several unit root tests based on LS estimators, ML estimators, and mixing
LS and ML estimators, are constructed. Simulation results show that tests
based on mixing LS and ML estimators perform better than Dickey-Fuller
tests which are based on LS estimators, and that tests based on the ML
estimators perform better than the mixed estimators.
Journal: Econometric Reviews
Pages: 179-202
Issue: 2
Volume: 22
Year: 2003
Keywords: Asymptotic distribution, Brownian motion, GARCH model, Least squares estimator, Maximum likelihood estimator, Unit root,
X-DOI: 10.1081/ETC-120020462
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120020462
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:2:p:179-202
Template-Type: ReDIF-Article 1.0
Author-Name: Arnold Zellner
Author-X-Name-First: Arnold
Author-X-Name-Last: Zellner
Title: Some Recent Developments in Econometric Inference
Abstract:
Recent results in information theory, see Soofi (1996; 2001) for a
review, include derivations of optimal information processing rules,
including Bayes' theorem, for learning from data based on minimizing a
criterion functional, namely output information minus input information as
shown in Zellner (1988; 1991; 1997; 2002). Herein, solution post data
densities for parameters are obtained and studied for cases in which the
input information is that in (1) a likelihood function and a prior
density; (2) only a likelihood function; and (3) neither a prior nor a
likelihood function but only input information in the form of post data
moments of parameters, as in the Bayesian method of moments approach. Then
it is shown how optimal output densities can be employed to obtain
predictive densities and optimal, finite sample structural coefficient
estimates using three alternative loss functions. Such optimal estimates
are compared with usual estimates, e.g., maximum likelihood, two-stage
least squares, ordinary least squares, etc. Some Monte Carlo experimental
results in the literature are discussed and implications for the future
are provided.
Journal: Econometric Reviews
Pages: 203-215
Issue: 2
Volume: 22
Year: 2003
Keywords: Econometric inference, Bayes' theorem, Information theory, Learning, Optimal estimation,
X-DOI: 10.1081/ETC-120020463
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120020463
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:2:p:203-215
Template-Type: ReDIF-Article 1.0
Author-Name: Elena Andreou
Author-X-Name-First: Elena
Author-X-Name-Last: Andreou
Author-Name: Aris Spanos
Author-X-Name-First: Aris
Author-X-Name-Last: Spanos
Title: Statistical Adequacy and the Testing of Trend Versus Difference Stationarity
Abstract:
The debate on whether macroeconomic series are
trend or difference stationary, initiated by Nelson and Plosser [Nelson,
C. R.; Plosser, C. I. (1982). Trends and random walks in macroeconomic
time series: some evidence and implications. Journal of Monetary
Economics 10:139-162] remains unresolved. The main objective of
the paper is to contribute toward a resolution of this issue by bringing
into the discussion the problem of statistical adequacy.
The paper revisits the empirical results of Nelson and Plosser [Nelson, C.
R.; Plosser, C. I. (1982). Trends and random walks in macroeconomic time
series: some evidence and implications. Journal of Monetary
Economics 10:139-162] and Perron [Perron, P. (1989). The great
crash, the oil price shock, and the unit root hypothesis.
Econometrica 57:1361-1401] and shows that several of their
estimated models are misspecified. Respecification with a view to ensuring
statistical adequacy gives rise to heteroskedastic AR(k)
models for some of the price series. Based on estimated models which are
statistically adequate, the main conclusion of the paper is that the
majority of the data series are trend stationary.
Journal: Econometric Reviews
Pages: 217-237
Issue: 3
Volume: 22
Year: 2003
Month: 1
X-DOI: 10.1081/ETC-120023897
File-URL: http://hdl.handle.net/10.1081/ETC-120023897
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:3:p:217-237
Template-Type: ReDIF-Article 1.0
Author-Name: Pierre Perron
Author-X-Name-First: Pierre
Author-X-Name-Last: Perron
Title: Comment on "Statistical Adequacy and the Testing of Trend Versus Difference Stationarity" by Andreou and Spanos (Number 1)
Journal: Econometric Reviews
Pages: 239-245
Issue: 3
Volume: 22
Year: 2003
Month: 1
X-DOI: 10.1081/ETC-120023900
File-URL: http://hdl.handle.net/10.1081/ETC-120023900
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:3:p:239-245
Template-Type: ReDIF-Article 1.0
Author-Name: Robin L. Lumsdaine
Author-X-Name-First: Robin L.
Author-X-Name-Last: Lumsdaine
Title: Comment on "Statistical Adequacy and the Testing of Trend Versus Difference Stationarity" by Andreou and Spanos (Number 2)
Abstract:
The opinions expressed are the author's and do not
represent those of Deutsche Bank or its affiliates.
Journal: Econometric Reviews
Pages: 247-252
Issue: 3
Volume: 22
Year: 2003
Month: 1
X-DOI: 10.1081/ETC-120023903
File-URL: http://hdl.handle.net/10.1081/ETC-120023903
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:3:p:247-252
Template-Type: ReDIF-Article 1.0
Author-Name: Ragnar Nymoen
Author-X-Name-First: Ragnar
Author-X-Name-Last: Nymoen
Title: Comment on "Statistical Adequacy and the Testing of Trend Versus Difference Stationarity" by Andreou and Spanos (Number 3)
Journal: Econometric Reviews
Pages: 253-260
Issue: 3
Volume: 22
Year: 2003
Month: 1
X-DOI: 10.1081/ETC-120023906
File-URL: http://hdl.handle.net/10.1081/ETC-120023906
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:3:p:253-260
Template-Type: ReDIF-Article 1.0
Author-Name: Maozu Lu
Author-X-Name-First: Maozu
Author-X-Name-Last: Lu
Author-Name: Jan M. Podivinsky
Author-X-Name-First: Jan M.
Author-X-Name-Last: Podivinsky
Title: The Robustness of Trend Stationarity: An Illustration with the Extended Nelson-Plosser Dataset
Abstract:
We
re-evaluate Andreou and Spanos's findings in favour of trend stationarity
by considering the extended Nelson-Plosser data set. This expanded (to
1988) data set includes a period of rather different behaviour compared
with the original Nelson-Plosser data used by Andreou and Spanos. We find
that Andreou and Spanos's models (with only minor adjustments) exhibit
remarkable stability over this extended period, and indicate that their
conclusions are more robust than they themselves showed.
Journal: Econometric Reviews
Pages: 261-267
Issue: 3
Volume: 22
Year: 2003
Month: 1
X-DOI: 10.1081/ETC-120024075
File-URL: http://hdl.handle.net/10.1081/ETC-120024075
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:3:p:261-267
Template-Type: ReDIF-Article 1.0
Author-Name: Alastair R. Hall
Author-X-Name-First: Alastair R.
Author-X-Name-Last: Hall
Author-Name: Fernanda P. M. Peixe
Author-X-Name-First: Fernanda P. M.
Author-X-Name-Last: Peixe
Title: A Consistent Method for the Selection of Relevant Instruments
Abstract:
In many
applications, a researcher must select an instrument vector from a
candidate set of instruments. If the ultimate objective is to perform
inference about the unknown parameters using conventional asymptotic
theory, then we argue that it is desirable for the chosen instrument
vector to satisfy four conditions which we refer to as orthogonality,
identification, efficiency, and non-redundancy. It is impossible to verify
a priori which elements of the candidate set satisfy these conditions;
this can only be done using the data. However, once the data are used in
this fashion it is important that the selection process does not
contaminate the limiting distribution of the parameter estimator. We refer
to this requirement as the inference condition. In a recent paper, Andrews
[Andrews, D. W. K. (1999). Consistent moment selection procedures for
generalized method of moments estimation.
Econometrica 67:543-564] has proposed a method of moment
selection based on an information criterion involving the overidentifying
restrictions test. This method can be shown to select an instrument vector
which satisfies the orthogonality condition with probability one in the
limit. In this paper, we consider the problem of instrument selection
based on a combination of the efficiency and non-redundancy conditions
which we refer to as the relevance condition. It is shown that, within a
particular class of models, certain canonical correlations form the
natural metric for relevancy, and this leads us to propose a canonical
correlations information criterion (CCIC) for instrument selection. We
establish conditions under which our method satisfies the inference
condition. We also consider the properties of an instrument selection
method based on the sequential application of the method of Andrews
[Andrews, D. W. K. (1999). Consistent moment selection procedures for
generalized method of moments estimation. Econometrica 67:543-564] and
CCIC.
Journal: Econometric Reviews
Pages: 269-287
Issue: 3
Volume: 22
Year: 2003
Month: 1
X-DOI: 10.1081/ETC-120024752
File-URL: http://hdl.handle.net/10.1081/ETC-120024752
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:3:p:269-287
Template-Type: ReDIF-Article 1.0
Author-Name: Rodolfo Cermeño
Author-X-Name-First: Rodolfo
Author-X-Name-Last: Cermeño
Author-Name: G. S. Maddala
Author-X-Name-First: G. S.
Author-X-Name-Last: Maddala
Author-Name: Michael A. Trueblood
Author-X-Name-First: Michael A.
Author-X-Name-Last: Trueblood
Title: Modeling Technology as a Dynamic Error Components Process: The Case of the Inter-country Agricultural Production Function†
Abstract:
In this paper, we propose a dynamic
error-components model to represent the unobserved level of technology.
This specification implies a well-defined common factor dynamic model for
per capita output that can be tested explicitly. The model is applied to
data on aggregates of agricultural inputs and outputs for groups of
countries from the OECD, Africa (AF), Latin America (LA) as well as
centrally planned countries, over a period of 31 years. We find that the
proposed model fits the data better than alternative static specifications
and satisfies the implied common factor restrictions in two of the
samples. The results suggest that although technological change seems to
have been a faster process for less developed countries relative to the
OECD countries, it has not been fast enough to reduce appreciably the
enormous differences in average technological levels that still persist
between them.
-super-†Dedicated to the memory of G. S. Maddala.
Journal: Econometric Reviews
Pages: 289-306
Issue: 3
Volume: 22
Year: 2003
Month: 1
X-DOI: 10.1081/ETC-120024753
File-URL: http://hdl.handle.net/10.1081/ETC-120024753
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:3:p:289-306
Template-Type: ReDIF-Article 1.0
Author-Name: Lung-fei Lee
Author-X-Name-First: Lung-fei
Author-X-Name-Last: Lee
Title: Best Spatial Two-Stage Least Squares Estimators for a Spatial Autoregressive Model with Autoregressive Disturbances
Abstract:
Estimation of a cross-sectional spatial model containing both a spatial
lag of the dependent variable and spatially autoregressive disturbances
is considered. Kelejian and Prucha (1998) described a generalized
two-stage least squares procedure for estimating such a spatial model.
Their estimator is, however, not asymptotically optimal. We propose best
spatial 2SLS estimators that are asymptotically optimal instrumental
variable (IV) estimators. An associated goodness-of-fit (or
overidentification) test is available. We suggest computationally simple and
tractable numerical procedures for constructing the optimal instruments.
Journal: Econometric Reviews
Pages: 307-335
Issue: 4
Volume: 22
Year: 2003
Keywords: Spatial autoregressive model, Two-stage least squares, Asymptotic efficiency, Best two-stage least squares, Cholesky decomposition, Contracting mapping,
X-DOI: 10.1081/ETC-120025891
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120025891
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:4:p:307-335
Template-Type: ReDIF-Article 1.0
Author-Name: Jorge Belaire-Franch
Author-X-Name-First: Jorge
Author-X-Name-Last: Belaire-Franch
Title: A Note on Resampling the Integration Across the Correlation Integral with Alternative Ranges
Abstract:
This paper reconsiders the nonlinearity test proposed by
Kočenda (Kočenda, E. (2001). An alternative to
the BDS test: integration across the correlation integral. Econometric
Reviews 20:337-351). When the analyzed series is non-Gaussian, the
empirical rejection rates can be much larger than the nominal size. In
this context, the necessity of tabulating the empirical distribution of
the statistic each time the test is computed is stressed. To that end,
simple random permutation works reasonably well. This paper also shows,
through Monte Carlo experiments, that Kočenda's test can be
more powerful than the Brock et al. (Brock, W., Dechert, D., Scheinkman,
J., LeBaron, B. (1996). A test for independence based on the correlation
dimension. Econometric Reviews 15:197-235) procedure. However, more than
one range of values for the proximity parameter should be used. Finally,
empirical evidence on exchange rates is reassessed.
Journal: Econometric Reviews
Pages: 337-349
Issue: 4
Volume: 22
Year: 2003
Keywords: Chaos, Nonlinear dynamics, Kočenda's test, Random permutation, Exchange rates,
X-DOI: 10.1081/ETC-120025892
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120025892
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:4:p:337-349
Template-Type: ReDIF-Article 1.0
Author-Name: Manuel Dominguez
Author-X-Name-First: Manuel
Author-X-Name-Last: Dominguez
Author-Name: Ignacio Lobato
Author-X-Name-First: Ignacio
Author-X-Name-Last: Lobato
Title: Testing the Martingale Difference Hypothesis
Abstract:
In this paper we consider testing that an economic time series follows a
martingale difference process. The martingale difference hypothesis has
typically been tested using information contained in the second moments of
a process, that is, using test statistics based on the sample
autocovariances or periodograms. Tests based on these statistics are
inconsistent since they cannot detect nonlinear alternatives. In this
paper we consider tests that detect linear and nonlinear alternatives.
Given that the asymptotic distributions of the considered test statistics
depend on the data generating process, we propose to implement the tests
using a modified wild bootstrap procedure. The paper theoretically
justifies the proposed tests and examines their finite sample behavior by
means of Monte Carlo experiments.
Journal: Econometric Reviews
Pages: 351-377
Issue: 4
Volume: 22
Year: 2003
Keywords: Nonlinear dependence, Nonparametric, Correlation, Bootstrap,
X-DOI: 10.1081/ETC-120025895
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120025895
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:4:p:351-377
Template-Type: ReDIF-Article 1.0
Author-Name: Alain Guay
Author-X-Name-First: Alain
Author-X-Name-Last: Guay
Title: Optimal Predictive Tests
Abstract:
This paper develops optimal tests based on sequential predictive moment
conditions. We show that an appropriate weighting version of the
predictive test achieves the same power as the optimal structural change
tests proposed by Sowell [Sowell, F. (1996a). Optimal tests for parameter
instability in the generalized method of moments framework. Econometrica
64:1085-1107; Sowell, F. (1996b). Tests for violations of moment
conditions. Manuscript, Graduate School of Industrial Administration,
Carnegie Mellon University]. Consequently, we can apply Sowell's results
directly. Optimal predictive
tests for parameter instability and overidentifying restriction stability
are proposed. The finite sample properties of LM, Wald, LR-type and
predictive tests for parameter instability are studied via a simulation
study.
Journal: Econometric Reviews
Pages: 379-410
Issue: 4
Volume: 22
Year: 2003
Keywords: Predictive test, Optimal test, Moment conditions,
X-DOI: 10.1081/ETC-120025896
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120025896
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:4:p:379-410
Template-Type: ReDIF-Article 1.0
Author-Name: Vasco Gabriel
Author-X-Name-First: Vasco
Author-X-Name-Last: Gabriel
Title: Tests for the Null Hypothesis of Cointegration: A Monte Carlo Comparison
Abstract:
The aim of this paper is to compare the relative performance of several
tests for the null hypothesis of cointegration, in terms of size and power
in finite samples. This is carried out using Monte Carlo simulations for a
range of plausible data-generating processes. We also analyze the impact
on size and power of choosing different procedures to estimate the long
run variance of the errors. We found that the parametrically adjusted test
of McCabe et al. (1997) is the most well-balanced test, displaying good
power and relatively few size distortions.
Journal: Econometric Reviews
Pages: 411-435
Issue: 4
Volume: 22
Year: 2003
Keywords: Cointegration, Tests, Monte Carlo,
X-DOI: 10.1081/ETC-120025897
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120025897
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:22:y:2003:i:4:p:411-435
Template-Type: ReDIF-Article 1.0
Author-Name: Bent Nielsen
Author-X-Name-First: Bent
Author-X-Name-Last: Nielsen
Title: On the Distribution of Likelihood Ratio Test Statistics for Cointegration Rank
Abstract:
This paper analyses the likelihood ratio test for the hypothesis of
reduced cointegration rank in a Gaussian vector autoregressive model. The
usual asymptotic distribution typically gives rather large size
distortions. This is explained by the fact that the asymptotic
distribution of the likelihood ratio test statistic varies across the
parameter space. A much improved distribution approximation can be
obtained using local asymptotic theory. The idea is discussed for some low
dimensional examples.
Journal: Econometric Reviews
Pages: 1-23
Issue: 1
Volume: 23
Year: 2004
Keywords: Bartlett corrections, Cointegration, Finite sample results, Lack of similarity, Local asymptotics,
X-DOI: 10.1081/ETC-120028834
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120028834
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:1:p:1-23
Template-Type: ReDIF-Article 1.0
Author-Name: Guglielmo Maria Caporale
Author-X-Name-First: Guglielmo Maria
Author-X-Name-Last: Caporale
Author-Name: Nikitas Pittis
Author-X-Name-First: Nikitas
Author-X-Name-Last: Pittis
Title: Estimator Choice and Fisher's Paradox: A Monte Carlo Study
Abstract:
This paper argues that Fisher's paradox can be explained away in terms of
estimator choice. We analyse by means of Monte Carlo experiments the small
sample properties of a large set of estimators (including virtually all
available single-equation estimators), and compute the critical values
based on the empirical distributions of the t-statistics, for a variety of
Data Generation Processes (DGPs), allowing for structural breaks, ARCH
effects etc. We show that precisely the estimators most commonly used in
the literature, namely OLS, Dynamic OLS (DOLS) and non-prewhitened FMLS,
have the worst performance in small samples, and produce rejections of the
Fisher hypothesis. If one employs the estimators with the most desirable
properties (i.e., the smallest downward bias and the minimum shift in the
distribution of the associated t-statistics), or if one uses the empirical
critical values, the evidence based on US data is strongly supportive of
the Fisher relation, consistent with many theoretical models.
Journal: Econometric Reviews
Pages: 25-52
Issue: 1
Volume: 23
Year: 2004
Keywords: Fisher's paradox, Cointegration, Single-equation estimators, Monte Carlo analysis,
X-DOI: 10.1081/ETC-120028835
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120028835
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:1:p:25-52
Template-Type: ReDIF-Article 1.0
Author-Name: Dimitris Politis
Author-X-Name-First: Dimitris
Author-X-Name-Last: Politis
Author-Name: Halbert White
Author-X-Name-First: Halbert
Author-X-Name-Last: White
Title: Automatic Block-Length Selection for the Dependent Bootstrap
Abstract:
We review the different block bootstrap methods for time series, and
present them in a unified framework. We then revisit a recent result of
Lahiri [Lahiri, S. N. (1999b). Theoretical comparisons of block bootstrap
methods. Ann. Statist. 27:386-404] comparing the different methods and
give a corrected bound on their asymptotic relative efficiency; we also
introduce a new notion of finite-sample “attainable”
relative efficiency. Finally, based on the notion of spectral estimation
via the flat-top lag-windows of Politis and Romano [Politis, D. N.,
Romano, J. P. (1995). Bias-corrected nonparametric spectral estimation. J.
Time Series Anal. 16:67-103], we propose practically useful estimators of
the optimal block size for the aforementioned block bootstrap methods. Our
estimators are characterized by the fastest possible rate of convergence
which is adaptive on the strength of the correlation of the time series as
measured by the correlogram.
Journal: Econometric Reviews
Pages: 53-70
Issue: 1
Volume: 23
Year: 2004
Keywords: Bandwidth choice, Block bootstrap, Resampling, Subsampling, Time series, Variance estimation,
X-DOI: 10.1081/ETC-120028836
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120028836
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:1:p:53-70
Template-Type: ReDIF-Article 1.0
Author-Name: Amos Golan
Author-X-Name-First: Amos
Author-X-Name-Last: Golan
Author-Name: Enrico Moretti
Author-X-Name-First: Enrico
Author-X-Name-Last: Moretti
Author-Name: Jeffrey M. Perloff
Author-X-Name-First: Jeffrey M.
Author-X-Name-Last: Perloff
Title: A Small-Sample Estimator for the Sample-Selection Model
Abstract:
A semiparametric estimator for evaluating the parameters of data
generated under a sample selection process is developed. This estimator is
based on the generalized maximum entropy estimator and performs well for
small and ill-posed samples. Theoretical and sampling comparisons with
parametric and semiparametric estimators are given. This method and
standard ones are applied to three small-sample empirical applications of
the wage-participation model for female teenage heads of households,
immigrants, and Native Americans.
Journal: Econometric Reviews
Pages: 71-91
Issue: 1
Volume: 23
Year: 2004
Keywords: Maximum entropy, Sample selection, Monte Carlo experiments,
X-DOI: 10.1081/ETC-120028837
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120028837
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:1:p:71-91
Template-Type: ReDIF-Article 1.0
Author-Name: Jun Yu
Author-X-Name-First: Jun
Author-X-Name-Last: Yu
Title: Empirical Characteristic Function Estimation and Its Applications
Abstract:
This paper reviews the method of model-fitting via the empirical
characteristic function. The advantage of using this procedure is that one
can avoid difficulties inherent in calculating or maximizing the
likelihood function. Thus it is a desirable estimation method when the
maximum likelihood approach encounters difficulties but the characteristic
function has a tractable expression. The basic idea of the empirical
characteristic function method is to match the characteristic function
derived from the model and the empirical characteristic function obtained
from data. Ideas are illustrated by using the methodology to estimate a
diffusion model that includes a self-exciting jump component. A Monte
Carlo study shows that the finite sample performance of the proposed
procedure offers an improvement over a GMM procedure. An application using
over 72 years of DJIA daily returns reveals evidence of jump clustering.
Journal: Econometric Reviews
Pages: 93-123
Issue: 2
Volume: 23
Year: 2004
Keywords: Diffusion process, Poisson jump, Self-exciting, GMM, Jump clustering,
X-DOI: 10.1081/ETC-120039605
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120039605
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:2:p:93-123
Template-Type: ReDIF-Article 1.0
Author-Name: William Greene
Author-X-Name-First: William
Author-X-Name-Last: Greene
Title: Fixed Effects and Bias Due to the Incidental Parameters Problem in the Tobit Model
Abstract:
The maximum likelihood estimator (MLE) in nonlinear panel data models
with fixed effects is widely understood (with a few exceptions) to be
biased and inconsistent when T, the length of the panel, is small and
fixed. However, there is surprisingly little theoretical or empirical
evidence on the behavior of the estimator on which to base this
conclusion. The received studies have focused almost exclusively on
coefficient estimation in two binary choice models, the probit and logit
models. In this note, we use Monte Carlo methods to examine the behavior
of the MLE of the fixed effects tobit model. We find that the estimator's
behavior is quite unlike that of the estimators of the binary choice
models. Among our findings are that the location coefficients in the tobit
model, unlike those in the probit and logit models, are unaffected by the
“incidental parameters problem.” But, a surprising result
related to the disturbance variance emerges instead - the finite sample
bias appears here rather than in the slopes. This has implications for
estimation of marginal effects and asymptotic standard errors, which are
also examined in this paper. The effects are also examined for the probit
and truncated regression models, extending the range of received results
in the first of these beyond the widely cited biases in the coefficient
estimators.
Journal: Econometric Reviews
Pages: 125-147
Issue: 2
Volume: 23
Year: 2004
Keywords: Panel data, Fixed effects, Computation, Monte Carlo, Tobit, Finite sample, Incidental parameters problem, Marginal effects,
X-DOI: 10.1081/ETC-120039606
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120039606
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:2:p:125-147
Template-Type: ReDIF-Article 1.0
Author-Name: Richard Harris
Author-X-Name-First: Richard
Author-X-Name-Last: Harris
Author-Name: Elias Tzavalis
Author-X-Name-First: Elias
Author-X-Name-Last: Tzavalis
Title: Testing for Unit Roots in Dynamic Panels in the Presence of a Deterministic Trend: Re-examining the Unit Root Hypothesis for Real Stock Prices and Dividends
Abstract:
In this paper, we suggest a similar unit root test statistic for dynamic
panel data with fixed effects. The test is based on the LM, or score,
principle and is derived under the assumption that the time dimension of
the panel is fixed, which is typical in many panel data studies. It is
shown that the limiting distribution of the test statistic is standard
normal. The similarity of the test with respect to both the initial
conditions of the panel and the fixed effects is achieved by allowing for
a trend in the model using a parameterisation that has the same
interpretation under both the null and alternative hypotheses. This
parameterisation can be expected to increase the power of the test
statistic. Simulation evidence suggests that the proposed test has
empirical size that is very close to the nominal level and considerably
more power than other panel unit root tests that assume that the time
dimension of the panel is large. As an application of the test, we
re-examine the stationarity of real stock prices and dividends using
disaggregated panel data over a relatively short period of time. Our
results suggest that while real stock prices contain a unit root, real
dividends are trend stationary.
Journal: Econometric Reviews
Pages: 149-166
Issue: 2
Volume: 23
Year: 2004
Keywords: Panel data, Unit roots, Fixed effects, Central limit theorem, Score vector, Real dividends, Stock prices,
X-DOI: 10.1081/ETC-120039607
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120039607
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:2:p:149-166
Template-Type: ReDIF-Article 1.0
Author-Name: Markus Frolich
Author-X-Name-First: Markus
Author-X-Name-Last: Frolich
Title: A Note on the Role of the Propensity Score for Estimating Average Treatment Effects
Abstract:
Hahn [Hahn, J. (1998). On the role of the propensity score in efficient
semiparametric estimation of average treatment effects. Econometrica
66:315-331] derived the semiparametric efficiency bounds for estimating
the average treatment effect (ATE) and the average treatment effect on the
treated (ATET). The variance of ATET depends on whether the propensity
score is known or unknown. Hahn attributes this to “dimension
reduction.” In this paper, an alternative explanation is given:
Knowledge of the propensity score improves upon the estimation of the
distribution of the confounding variables.
Journal: Econometric Reviews
Pages: 167-174
Issue: 2
Volume: 23
Year: 2004
Keywords: Evaluation, Matching, Causal effect, Semiparametric efficiency bound,
X-DOI: 10.1081/ETC-120039608
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-120039608
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2004:i:2:p:167-174
Template-Type: ReDIF-Article 1.0
Author-Name: George Christodoulakis
Author-X-Name-First: George
Author-X-Name-Last: Christodoulakis
Author-Name: Stephen Satchell
Author-X-Name-First: Stephen
Author-X-Name-Last: Satchell
Title: Forecast Evaluation in the Presence of Unobserved Volatility
Abstract:
A number of volatility forecasting studies have led to the perception
that the ARCH- and Stochastic Volatility-type models provide poor
out-of-sample forecasts of volatility. This is primarily based on the use
of traditional forecast evaluation criteria concerning the accuracy and
the unbiasedness of forecasts. In this paper we provide an analytical
assessment of volatility forecasting performance. We use the volatility
and log volatility framework to prove how the inherent noise in the
approximation of the true (and unobservable) volatility by the squared
return results in a misleading forecast evaluation, inflating the
observed mean squared forecast error and invalidating the Diebold-Mariano
statistic. We analytically characterize this noise and explicitly quantify
its effects assuming normal errors. We extend our results using more
general error structures such as the Compound Normal and the Gram-Charlier
classes of distributions. We argue that evaluation problems are likely to
be exacerbated by non-normality of the shocks and that non-linear and
utility-based criteria can be more suitable for the evaluation of
volatility forecasts.
Journal: Econometric Reviews
Pages: 175-198
Issue: 3
Volume: 23
Year: 2005
Keywords: Compound normal, Expected utility, Forecasting, Gram-Charlier, Mean squared error, Non-normality, Simulation, Stochastic volatility,
X-DOI: 10.1081/ETC-200028199
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200028199
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:3:p:175-198
Template-Type: ReDIF-Article 1.0
Author-Name: Peter Egger
Author-X-Name-First: Peter
Author-X-Name-Last: Egger
Author-Name: Michael Pfaffermayr
Author-X-Name-First: Michael
Author-X-Name-Last: Pfaffermayr
Title: Estimating Long and Short Run Effects in Static Panel Models
Abstract:
This paper assesses the biases of four different estimators with respect
to the short run and the long run parameters if a static panel model is
used, although the data generating process is a dynamic error components
model. We analytically derive the associated biases and provide a
discussion of the determinants thereof. Our analytical and numerical
results as well as Monte Carlo simulations illustrate that the asymptotic
bias of both the within and the between parameter with respect to the
short run and long run impact can be substantial, depending on the memory
of the data generating process, the length of the time series and the
importance of the cross-sectional variation in the explanatory variables.
Journal: Econometric Reviews
Pages: 199-214
Issue: 3
Volume: 23
Year: 2005
Keywords: Short run effects, Long run effects, Small sample bias, Panel econometrics,
X-DOI: 10.1081/ETC-200028201
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200028201
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:3:p:199-214
Template-Type: ReDIF-Article 1.0
Author-Name: Manuel Dominguez
Author-X-Name-First: Manuel
Author-X-Name-Last: Dominguez
Title: On the Power of Bootstrapped Specification Tests
Abstract:
Decisions based on econometric model estimates may not have the expected
effect if the model is misspecified. Thus, specification tests should
precede any analysis. Bierens' specification test is consistent and has
optimality properties against some local alternatives. A shortcoming is
that the test statistic is not distribution free, even asymptotically.
This makes the test infeasible. There have been many suggestions to
circumvent this problem, including the use of upper bounds for the
critical values. However, these suggestions lead to tests that lose power
and optimality against local alternatives. In this paper we show that
bootstrap methods allow us to recover power and optimality of Bierens'
original test. Bootstrap also provides reliable p-values, which have a
central role in Fisher's theory of hypothesis testing. The paper also
includes a discussion of the properties of the bootstrap Nonlinear Least
Squares Estimator under local alternatives.
Journal: Econometric Reviews
Pages: 215-228
Issue: 3
Volume: 23
Year: 2005
Keywords: Regression model, Local alternative, Specification test, Stochastic process, Wild bootstrap,
X-DOI: 10.1081/ETC-200028205
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200028205
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:3:p:215-228
Template-Type: ReDIF-Article 1.0
Author-Name: Douglas Hodgson
Author-X-Name-First: Douglas
Author-X-Name-Last: Hodgson
Title: Semiparametric Efficient Estimation of the Mean of a Time Series in the Presence of Conditional Heterogeneity of Unknown Form
Abstract:
We obtain semiparametric efficiency bounds for estimation of a location
parameter in a time series model where the innovations are stationary and
ergodic conditionally symmetric martingale differences but otherwise
possess general dependence and distributions of unknown form. We then
describe an iterative estimator that achieves this bound when the
conditional density functions of the sample are known. Finally, we develop
a “semi-adaptive” estimator that achieves the bound when
these densities are unknown to the investigator. This estimator employs
nonparametric kernel estimates of the densities. Monte Carlo results are
reported.
Journal: Econometric Reviews
Pages: 229-257
Issue: 3
Volume: 23
Year: 2005
Keywords: Semiparametric efficiency bounds, Conditional heteroskedasticity, Time series,
X-DOI: 10.1081/ETC-200028211
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200028211
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:3:p:229-257
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Title: Unit Root Tests under Time-Varying Variances
Abstract:
The paper provides a general framework for investigating the effects of
permanent changes in the variance of the errors of an autoregressive
process on unit root tests. Such a framework - which is based on a novel
asymptotic theory for integrated and near integrated processes with
heteroskedastic errors - allows one to evaluate how the variance dynamics
affect the size and the power function of unit root tests. Contrary to
previous studies, it is shown that non-constant variances can both inflate
and deflate the rejection frequency of the commonly used unit root tests,
both under the null and under the alternative, with early negative and
late positive variance changes having the strongest impact on size and
power. It is also shown that shifts smoothed across the sample have
smaller impacts than shifts occurring as a single abrupt jump, while
periodic variances have a negligible effect even when a small number of
cycles take place over a given sample. Finally, it is proved that the
locally best invariant (LBI) test of a unit root against level
stationarity is robust to heteroskedasticity of any form under the null
hypothesis.
Journal: Econometric Reviews
Pages: 259-292
Issue: 3
Volume: 23
Year: 2005
Keywords: Unit root tests, Integrated processes, Structural breaks, Heteroskedasticity,
X-DOI: 10.1081/ETC-200028215
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200028215
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:3:p:259-292
Template-Type: ReDIF-Article 1.0
Author-Name: Hyungsik Roger Moon
Author-X-Name-First: Hyungsik Roger
Author-X-Name-Last: Moon
Author-Name: Benoit Perron
Author-X-Name-First: Benoit
Author-X-Name-Last: Perron
Title: Efficient Estimation of the Seemingly Unrelated Regression Cointegration Model and Testing for Purchasing Power Parity
Abstract:
This paper studies the efficient estimation of seemingly unrelated linear
models with integrated regressors and stationary errors. We consider two
cases. The first one has no common regressor among the equations. In this
case, we show that by adding leads and lags of the first differences of
the regressors and estimating this augmented dynamic regression model by
generalized least squares using the long-run covariance matrix, we obtain
an efficient estimator of the cointegrating vector that has a limiting
mixed normal distribution. In the second case we consider, there is a
common regressor to all equations, and we discuss efficient minimum
distance estimation in this context. Simulation results suggest that our
new estimator compares favorably with others already proposed in the
literature. We apply these new estimators to the testing of the
proportionality and symmetry conditions implied by purchasing power parity
(PPP) among the G-7 countries. The tests based on the efficient estimates
easily reject the joint hypotheses of proportionality and symmetry for all
countries with either the United States or Germany as numeraire. Based on
individual tests, our results suggest that Canada and Germany are the most
likely countries for which the proportionality condition holds, and that
Italy and Japan are the most likely for which the symmetry condition holds
relative to the United States.
Journal: Econometric Reviews
Pages: 293-323
Issue: 4
Volume: 23
Year: 2005
Keywords: Seemingly unrelated regressions, Efficient estimation, Purchasing power parity, Minimum distance,
X-DOI: 10.1081/ETC-200040777
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200040777
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:4:p:293-323
Template-Type: ReDIF-Article 1.0
Author-Name: L. G. Godfrey
Author-X-Name-First: L. G.
Author-X-Name-Last: Godfrey
Author-Name: J. M. C. Santos Silva
Author-X-Name-First: J. M. C. Santos
Author-X-Name-Last: Silva
Title: Bootstrap Tests of Nonnested Hypotheses: Some Further Results
Abstract:
Nonnested models are sometimes tested using a simulated reference
distribution for the uncentred log likelihood ratio statistic. This
approach has been recommended for the specific problem of testing linear
and logarithmic regression models. The general asymptotic validity of the
reference distribution test under correct choice of error distributions is
questioned. The asymptotic behaviour of the test under incorrect
assumptions about error distributions is also examined. In order to
complement these analyses, Monte Carlo results for the case of linear and
logarithmic regression models are provided. The finite sample properties
of several standard tests for testing these alternative functional forms
are also studied, under normal and nonnormal error distributions. These
regression-based variable-addition tests are implemented using asymptotic
and bootstrap critical values.
Journal: Econometric Reviews
Pages: 325-340
Issue: 4
Volume: 23
Year: 2005
Keywords: Bootstrap, Nonnested hypotheses, Nonnormality,
X-DOI: 10.1081/ETC-200040780
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200040780
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:4:p:325-340
Template-Type: ReDIF-Article 1.0
Author-Name: Gianna Boero
Author-X-Name-First: Gianna
Author-X-Name-Last: Boero
Author-Name: Jeremy Smith
Author-X-Name-First: Jeremy
Author-X-Name-Last: Smith
Author-Name: Kenneth Wallis
Author-X-Name-First: Kenneth
Author-X-Name-Last: Wallis
Title: The Sensitivity of Chi-Squared Goodness-of-Fit Tests to the Partitioning of Data
Abstract:
The power of Pearson's overall goodness-of-fit test and the
components-of-chi-squared or “Pearson analog” tests of
Anderson [Anderson, G. (1994). Simple tests of distributional form. J.
Econometrics 62:265-276] to detect rejections due to shifts in location,
scale, skewness and kurtosis is studied, as the number and position of the
partition points is varied. Simulations are conducted for small and
moderate sample sizes. It is found that smaller numbers of classes than
are used in practice may be appropriate, and that the choice of
non-equiprobable classes can result in substantial gains in power.
Journal: Econometric Reviews
Pages: 341-370
Issue: 4
Volume: 23
Year: 2005
Keywords: Pearson's goodness-of-fit test, Component tests, Monte Carlo, Number of classes, Partitions,
X-DOI: 10.1081/ETC-200040782
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200040782
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:4:p:341-370
Template-Type: ReDIF-Article 1.0
Author-Name: Atsushi Inoue
Author-X-Name-First: Atsushi
Author-X-Name-Last: Inoue
Author-Name: Lutz Kilian
Author-X-Name-First: Lutz
Author-X-Name-Last: Kilian
Title: In-Sample or Out-of-Sample Tests of Predictability: Which One Should We Use?
Abstract:
It is widely known that significant in-sample evidence of predictability
does not guarantee significant out-of-sample predictability. This is often
interpreted as an indication that in-sample evidence is likely to be
spurious and should be discounted. In this paper, we question this
interpretation. Our analysis shows that neither data mining nor dynamic
misspecification of the model under the null nor unmodelled structural
change under the null are plausible explanations of the observed tendency
of in-sample tests to reject the no-predictability null more often than
out-of-sample tests. We provide an alternative explanation based on the
higher power of in-sample tests of predictability in many situations. We
conclude that results of in-sample tests of predictability will typically
be more credible than results of out-of-sample tests.
Journal: Econometric Reviews
Pages: 371-402
Issue: 4
Volume: 23
Year: 2005
Keywords: Predictive ability, Spurious inference, Data mining, Model instability,
X-DOI: 10.1081/ETC-200040785
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200040785
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:23:y:2005:i:4:p:371-402
Template-Type: ReDIF-Article 1.0
Author-Name: Stephen Bond
Author-X-Name-First: Stephen
Author-X-Name-Last: Bond
Author-Name: Frank Windmeijer
Author-X-Name-First: Frank
Author-X-Name-Last: Windmeijer
Title: RELIABLE INFERENCE FOR GMM ESTIMATORS? FINITE SAMPLE PROPERTIES OF ALTERNATIVE TEST PROCEDURES IN LINEAR PANEL DATA MODELS
Abstract:
We compare the finite sample performance of a range of tests of linear
restrictions for linear panel data models estimated using the generalized
method of moments (GMM). These include standard asymptotic Wald tests
based on one-step and two-step GMM estimators; two bootstrapped versions
of these Wald tests; a version of the two-step Wald test that uses a
finite sample corrected estimate of the variance of the two-step GMM
estimator; the LM test; and three criterion-based tests that have recently
been proposed. We consider both the AR(1) panel model and a design with
predetermined regressors. The corrected two-step Wald test performs
similarly to the standard one-step Wald test, whilst the bootstrapped
one-step Wald test, the LM test, and a simple criterion-difference test
can provide more reliable finite sample inference in some cases.
Journal: Econometric Reviews
Pages: 1-37
Issue: 1
Volume: 24
Year: 2005
Keywords: Finite sample inference, Generalized method of moments, Hypothesis testing,
X-DOI: 10.1081/ETC-200049126
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200049126
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:1:p:1-37
Template-Type: ReDIF-Article 1.0
Author-Name: Badi Baltagi
Author-X-Name-First: Badi
Author-X-Name-Last: Baltagi
Author-Name: Georges Bresson
Author-X-Name-First: Georges
Author-X-Name-Last: Bresson
Author-Name: Alain Pirotte
Author-X-Name-First: Alain
Author-X-Name-Last: Pirotte
Title: ADAPTIVE ESTIMATION OF HETEROSKEDASTIC ERROR COMPONENT MODELS
Abstract:
This paper checks the sensitivity of two adaptive heteroskedastic
estimators suggested by Li and Stengos (1994) and Roy (2002) for an error
component regression model to misspecification of the form of
heteroskedasticity. In particular, we run Monte Carlo experiments using
the heteroskedasticity setup by Li and Stengos (1994) to see how the
misspecified Roy (2002) estimator performs. Next, we use the
heteroskedasticity setup by Roy (2002) to see how the misspecified Li and
Stengos (1994) estimator performs. We also check the sensitivity of these
results to the choice of the smoothing parameters, the sample size, and
the degree of heteroskedasticity. We find that the Li and Stengos (1994)
estimator performs better under this type of misspecification than the
corresponding estimator of Roy (2002). However, the former estimator is
sensitive to the choice of the bandwidth.
Journal: Econometric Reviews
Pages: 39-58
Issue: 1
Volume: 24
Year: 2005
Keywords: Adaptive estimation, Bandwidth, Error components, Heteroskedasticity, Panel data,
X-DOI: 10.1081/ETC-200049131
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200049131
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:1:p:39-58
Template-Type: ReDIF-Article 1.0
Author-Name: Nikolay Gospodinov
Author-X-Name-First: Nikolay
Author-X-Name-Last: Gospodinov
Title: ROBUST ASYMPTOTIC INFERENCE IN AUTOREGRESSIVE MODELS WITH MARTINGALE DIFFERENCE ERRORS
Abstract:
This paper proposes a GMM-based method for asymptotic confidence interval
construction in stationary autoregressive models, which is robust to the
presence of conditional heteroskedasticity of unknown form. The confidence
regions are obtained by inverting the asymptotic acceptance region of the
distance metric test for the continuously updated GMM (CU-GMM) estimator.
Unlike the predetermined symmetric shape of the Wald confidence intervals,
the shape of the proposed confidence intervals is data-driven owing to an
estimated sequence of nonuniform weights. It appears that the flexibility
of the CU-GMM estimator in downweighting certain observations proves
advantageous for confidence interval construction. This stands in contrast
to some other generalized empirical likelihood estimators with appealing
optimality properties such as the empirical likelihood estimator whose
objective function prevents such downweighting. A Monte Carlo simulation
study illustrates the excellent small-sample properties of the method for
AR models with ARCH errors. The procedure is applied to study the dynamics
of the federal funds rate.
Journal: Econometric Reviews
Pages: 59-81
Issue: 1
Volume: 24
Year: 2005
Keywords: Conditional heteroskedasticity, Confidence intervals, Generalized empirical likelihood, GMM, Test inversion,
X-DOI: 10.1081/ETC-200049135
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200049135
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:1:p:59-81
Template-Type: ReDIF-Article 1.0
Author-Name: Artur C. B. da Silva Lopes
Author-X-Name-First: Artur C. B. da Silva
Author-X-Name-Last: Lopes
Author-Name: Antonio Montanes
Author-X-Name-First: Antonio
Author-X-Name-Last: Montanes
Title: THE BEHAVIOR OF HEGY TESTS FOR QUARTERLY TIME SERIES WITH SEASONAL MEAN SHIFTS
Abstract:
This paper studies the behavior of the HEGY statistics for quarterly
data, for seasonal autoregressive unit roots, when the analyzed time
series is deterministic seasonal stationary but exhibits a change in the
seasonal pattern. We also analyze the HEGY test for the nonseasonal unit
root, the data generation process being trend stationary too. Our results
show that when the break magnitudes are finite, the HEGY test statistics
are not asymptotically biased toward the nonrejection of the seasonal and
nonseasonal unit root hypotheses. However, the finite sample power
properties may be substantially affected, the behavior of the tests
depending on the type of the break.
Journal: Econometric Reviews
Pages: 83-108
Issue: 1
Volume: 24
Year: 2005
Keywords: HEGY tests, Seasonality, Structural breaks, Unit roots,
X-DOI: 10.1081/ECR-200049141
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ECR-200049141
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:1:p:83-108
Template-Type: ReDIF-Article 1.0
Author-Name: Sourafel Girma
Author-X-Name-First: Sourafel
Author-X-Name-Last: Girma
Title: Book Review: Panel Data Econometrics
Abstract:
Journal: Econometric Reviews
Pages: 109-111
Issue: 1
Volume: 24
Year: 2005
X-DOI: 10.1081/ETC-200049145
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200049145
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:1:p:109-111
Template-Type: ReDIF-Article 1.0
Author-Name: Dominique Guegan
Author-X-Name-First: Dominique
Author-X-Name-Last: Guegan
Title: How can we Define the Concept of Long Memory? An Econometric Survey
Abstract:
In this paper we discuss different aspects of long memory behavior and
applicable parametric models. We discuss the confusion that can arise when
the empirical autocorrelation function decreases in a hyperbolic way.
Journal: Econometric Reviews
Pages: 113-149
Issue: 2
Volume: 24
Year: 2005
Keywords: Estimation theory, Long memory, Returns, Spectral domain, Switching,
X-DOI: 10.1081/ETC-200067887
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200067887
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:2:p:113-149
Template-Type: ReDIF-Article 1.0
Author-Name: Jorg Breitung
Author-X-Name-First: Jorg
Author-X-Name-Last: Breitung
Title: A Parametric approach to the Estimation of Cointegration Vectors in Panel Data
Abstract:
In this article, a parametric framework for estimation and inference in
cointegrated panel data models is considered that is based on a
cointegrated VAR(p) model. A convenient two-step estimator is suggested
where, in the first step, all individual specific parameters are
estimated, and in the second step, the long-run parameters are estimated
from a pooled least-squares regression. The two-step estimator and related
test procedures can easily be modified to account for contemporaneously
correlated errors, a feature that is often encountered in multi-country
studies. Monte Carlo simulations suggest that the two-step estimator and
related test procedures outperform semiparametric alternatives such as the
fully modified OLS approach, especially if the number of time periods is
small.
Journal: Econometric Reviews
Pages: 151-173
Issue: 2
Volume: 24
Year: 2005
Keywords: Cointegrated systems, Estimation, Inference, Panel data,
X-DOI: 10.1081/ETC-200067895
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200067895
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:2:p:151-173
Template-Type: ReDIF-Article 1.0
Author-Name: Myoung-Jae Lee
Author-X-Name-First: Myoung-Jae
Author-X-Name-Last: Lee
Title: Monotonicity Conditions and Inequality Imputation for Sample-Selection and Non-Response Problems
Abstract:
Under a sample selection or non-response problem, where a response
variable y is observed only when a condition δ = 1 is met,
the identified mean E(y|δ = 1) is not equal to the
desired mean E(y). But the monotonicity condition
E(y|δ = 1) ≤ E(y|δ = 0) yields an informative bound
E(y|δ = 1) ≤ E(y), which is enough for
certain inferences. For example, in a majority voting with δ being
the vote-turnout, it is enough to know if E(y) > 0.5 or
not, for which E(y|δ = 1) > 0.5 is
sufficient under the monotonicity. The main question is then whether the
monotonicity condition is testable, and if not, when it is plausible.
Answering these queries, when there is a 'proxy' variable z related to
y but fully observed, we provide a test for the monotonicity; when z is
not available, we provide primitive conditions and plausible models for
the monotonicity. Going further, when both y and z are binary, bivariate
monotonicities of the type P(y, z|δ = 1) ≤ P(y, z|δ = 0)
are considered, which can lead to sharper
bounds for P(y). As an empirical example, a data set on the 1996 U.S.
presidential election is analyzed to see if the Republican candidate could
have won had everybody voted, i.e., to see if P(y) > 0.5,
where y = 1 is voting for the Republican candidate.
Journal: Econometric Reviews
Pages: 175-194
Issue: 2
Volume: 24
Year: 2005
Keywords: Imputation, Monotonicity, Non-response, Orthant dependence, Sample selection,
X-DOI: 10.1081/ETC-200067910
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200067910
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:2:p:175-194
Template-Type: ReDIF-Article 1.0
Author-Name: Wendelin Schnedler
Author-X-Name-First: Wendelin
Author-X-Name-Last: Schnedler
Title: Likelihood Estimation for Censored Random Vectors
Abstract:
This article shows how to construct a likelihood for a general class of
censoring problems. This likelihood is proven to be valid, i.e. its
maximizer is consistent and the respective root-n estimator is
asymptotically efficient and normally distributed under regularity
conditions. The method generalizes ordinary maximum likelihood estimation
as well as several standard estimators for censoring problems (e.g. tobit
type I-tobit type V).
Journal: Econometric Reviews
Pages: 195-217
Issue: 2
Volume: 24
Year: 2005
Keywords: Censored variables, Likelihood, Limited dependent variables, Multivariate methods, Random censoring,
X-DOI: 10.1081/ETC-200067925
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200067925
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:2:p:195-217
Template-Type: ReDIF-Article 1.0
Author-Name: Emmanuel Flachaire
Author-X-Name-First: Emmanuel
Author-X-Name-Last: Flachaire
Title: More Efficient Tests Robust to Heteroskedasticity of Unknown Form
Abstract:
In the presence of heteroskedasticity of unknown form, the Ordinary Least
Squares parameter estimator becomes inefficient, and its covariance matrix
estimator inconsistent. Eicker (1963) and White (1980) were the first to
propose a robust consistent covariance matrix estimator, that permits
asymptotically correct inference. This estimator is widely used in
practice. Cragg (1983) proposed a more efficient estimator, but concluded
that tests based on it are unreliable. Thus, this last estimator has not
been used in practice. This article is concerned with finite sample
properties of tests robust to heteroskedasticity of unknown form. Our
results suggest that reliable and more efficient tests can be obtained
with the Cragg estimators in small samples.
Journal: Econometric Reviews
Pages: 219-241
Issue: 2
Volume: 24
Year: 2005
Keywords: Heteroskedasticity-robust test, Regression model, Wild bootstrap,
X-DOI: 10.1081/ETC-200067942
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200067942
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:2:p:219-241
Template-Type: ReDIF-Article 1.0
Author-Name: Scott Atkinson
Author-X-Name-First: Scott
Author-X-Name-Last: Atkinson
Title: A Review of: “Stochastic Frontier Analysis”
Abstract:
Journal: Econometric Reviews
Pages: 243-245
Issue: 2
Volume: 24
Year: 2005
X-DOI: 10.1081/ETC-200067955
File-URL: http://www.tandfonline.com/doi/abs/10.1081/ETC-200067955
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:2:p:243-245
Template-Type: ReDIF-Article 1.0
Author-Name: Patrik Guggenberger
Author-X-Name-First: Patrik
Author-X-Name-Last: Guggenberger
Author-Name: Jinyong Hahn
Author-X-Name-First: Jinyong
Author-X-Name-Last: Hahn
Title: Finite Sample Properties of the Two-Step Empirical Likelihood Estimator
Abstract:
We investigate the finite sample properties of two-step empirical
likelihood (EL) estimators. These estimators are shown to have the same
third-order bias properties as EL itself. The Monte Carlo study provides
evidence that (i) higher order asymptotics fails to provide a good
approximation in the sense that the bias of the two-step EL estimators can
be substantial and sensitive to the number of moment restrictions and (ii)
the two-step EL estimators may have heavy tails.
Journal: Econometric Reviews
Pages: 247-263
Issue: 3
Volume: 24
Year: 2005
Keywords: Empirical likelihood estimator, Finite sample performance, High order bias, Two-step empirical likelihood estimator,
X-DOI: 10.1080/07474930500242987
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500242987
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:3:p:247-263
Template-Type: ReDIF-Article 1.0
Author-Name: Evzen Kocenda
Author-X-Name-First: Evzen
Author-X-Name-Last: Kocenda
Author-Name: Lubos Briatka
Author-X-Name-First: Lubos
Author-X-Name-Last: Briatka
Title: Optimal Range for the iid Test Based on Integration Across the Correlation Integral
Abstract:
This paper builds on Kocenda (2001) and extends it in three ways. First,
new intervals of the proximity parameter ε (over which the
correlation integral is calculated) are specified. For these
ε-ranges new critical values for various lengths of the data sets
are introduced, and through Monte Carlo studies it is shown that within
new ε-ranges the test is even more powerful than within the original
ε-range. The range that maximizes the power of the test is suggested
as the optimal range. Second, an extensive comparison with existing
results of the controlled competition of Barnett et al. (1997) as well as
broad power tests on various nonlinear and chaotic data are provided. Test
performance with real (exchange rate) data is provided as well. The
results of the comparison strongly favor our robust procedure and confirm
the ability of the test in finding nonlinear dependencies as well as its
function as a specification test. Finally, new user-friendly and fast
software is introduced.
Journal: Econometric Reviews
Pages: 265-296
Issue: 3
Volume: 24
Year: 2005
Keywords: Chaos, Correlation integral, High-frequency economic and financial data, Monte Carlo, Nonlinear dynamics, Power tests, Single-blind competition,
X-DOI: 10.1080/07474930500243001
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500243001
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:3:p:265-296
Template-Type: ReDIF-Article 1.0
Author-Name: Joakim Westerlund
Author-X-Name-First: Joakim
Author-X-Name-Last: Westerlund
Title: New Simple Tests for Panel Cointegration
Abstract:
In this paper, two new simple residual-based panel data tests are
proposed for the null of no cointegration. The tests are simple because
they do not require any correction for the temporal dependencies of the
data. Yet they are able to accommodate individual specific short-run
dynamics, individual specific intercept and trend terms, and individual
specific slope parameters. The limiting distributions of the tests are
derived and are shown to be free of nuisance parameters. The Monte Carlo
results in this paper suggest that the asymptotic results are borne out
well even in very small samples.
Journal: Econometric Reviews
Pages: 297-316
Issue: 3
Volume: 24
Year: 2005
Keywords: Monte Carlo simulation, Panel cointegration, Residual-based tests,
X-DOI: 10.1080/07474930500243019
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500243019
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:3:p:297-316
Template-Type: ReDIF-Article 1.0
Author-Name: Manabu Asai
Author-X-Name-First: Manabu
Author-X-Name-Last: Asai
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: Dynamic Asymmetric Leverage in Stochastic Volatility Models
Abstract:
In the class of stochastic volatility (SV) models, leverage effects are
typically specified through the direct correlation between the innovations
in both returns and volatility, resulting in the dynamic leverage (DL)
model. Recently, two asymmetric SV models based on threshold effects have
been proposed in the literature. As such models consider only the sign of
the previous return and neglect its magnitude, this paper proposes a
dynamic asymmetric leverage (DAL) model that accommodates the direct
correlation as well as the sign and magnitude of the threshold effects. A
special case of the DAL model with zero direct correlation between the
innovations is the asymmetric leverage (AL) model. The dynamic asymmetric
leverage models are estimated by the Monte Carlo likelihood (MCL) method.
Monte Carlo experiments are presented to examine the finite sample
properties of the estimator. For a sample size of T = 2000 with
500 replications, the sample means, standard deviations, and root mean
squared errors of the MCL estimators indicate only a small finite sample
bias. The empirical estimates for S&P 500 and TOPIX financial returns, and
USD/AUD and YEN/USD exchange rates, indicate that the DAL class, including
the DL and AL models, is generally superior to threshold SV models with
respect to AIC and BIC, with AL typically providing the best fit to the
data.
Journal: Econometric Reviews
Pages: 317-332
Issue: 3
Volume: 24
Year: 2005
Keywords: Asymmetric effects, Monte Carlo likelihood, Stochastic volatility, Threshold effects,
X-DOI: 10.1080/07474930500243035
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500243035
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:3:p:317-332
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Daniel Millimet
Author-X-Name-First: Daniel
Author-X-Name-Last: Millimet
Author-Name: Vasudha Rangaprasad
Author-X-Name-First: Vasudha
Author-X-Name-Last: Rangaprasad
Title: Class Size and Educational Policy: Who Benefits from Smaller Classes?
Abstract:
The impact of class size on student achievement remains an open question
despite hundreds of empirical studies and the perception among parents,
teachers, and policymakers that larger classes are a significant detriment
to student development. This study sheds new light on this ambiguity by
utilizing nonparametric tests for stochastic dominance to analyze
unconditional and conditional test score distributions across students
facing different class sizes. Analyzing the conditional distributions of
test scores (purged of observables using class-size specific returns), we
find that there is little causal effect of marginal reductions in class
size on test scores within the range of 20 or more students. However,
reductions in class size from above 20 students to below 20 students, as
well as marginal reductions in classes with fewer than 20 students,
increase test scores for students below the median, but decrease test
scores above the median. This nonuniform impact of class size suggests
that compensatory school policies, whereby lower-performing students are
placed in smaller classes and higher-performing students are placed in
larger classes, improve the academic achievement of not just the
lower-performing students but also the higher-performing students.
Journal: Econometric Reviews
Pages: 333-368
Issue: 4
Volume: 24
Year: 2005
Keywords: Class size, Program evaluation, Quantile treatment effects, School quality, Stochastic dominance, Student achievement,
X-DOI: 10.1080/07474930500405485
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500405485
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:4:p:333-368
Template-Type: ReDIF-Article 1.0
Author-Name: Todd Clark
Author-X-Name-First: Todd
Author-X-Name-Last: Clark
Author-Name: Michael McCracken
Author-X-Name-First: Michael
Author-X-Name-Last: McCracken
Title: Evaluating Direct Multistep Forecasts
Abstract:
This paper examines the asymptotic and finite-sample properties of tests
of equal forecast accuracy and encompassing applied to direct, multistep
predictions from nested regression models. We first derive asymptotic
distributions; these nonstandard distributions depend on the parameters of
the data-generating process. We then use Monte Carlo simulations to
examine finite-sample size and power. Our asymptotic approximation yields
good size and power properties for some, but not all, of the tests; a
bootstrap works reasonably well for all tests. The paper concludes with a
reexamination of the predictive content of capacity utilization for
inflation.
Journal: Econometric Reviews
Pages: 369-404
Issue: 4
Volume: 24
Year: 2005
Keywords: Causality, Long horizon, Prediction,
X-DOI: 10.1080/07474930500405683
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500405683
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:4:p:369-404
Template-Type: ReDIF-Article 1.0
Author-Name: Morten Ørregaard Nielsen
Author-X-Name-First: Morten Ørregaard
Author-X-Name-Last: Nielsen
Author-Name: Per Houmann Frederiksen
Author-X-Name-First: Per Houmann
Author-X-Name-Last: Frederiksen
Title: Finite Sample Comparison of Parametric, Semiparametric, and Wavelet Estimators of Fractional Integration
Abstract:
In this paper we compare through Monte Carlo simulations the finite
sample properties of estimators of the fractional differencing parameter,
d. This involves frequency domain, time domain, and wavelet-based
approaches, and we consider both parametric and semiparametric estimation
methods. The estimators are briefly introduced and compared, and the
criteria adopted for measuring finite sample performance are bias and root
mean squared error. Most importantly, the simulations reveal that (1) the
frequency domain maximum likelihood procedure is superior to the time
domain parametric methods, (2) all the estimators are fairly robust to
conditionally heteroscedastic errors, (3) the local polynomial Whittle and
bias-reduced log-periodogram regression estimators are shown to be more
robust to short-run dynamics than other semiparametric (frequency domain
and wavelet) estimators and in some cases even outperform the time domain
parametric methods, and (4) without sufficient trimming of scales the
wavelet-based estimators are heavily biased.
Journal: Econometric Reviews
Pages: 405-443
Issue: 4
Volume: 24
Year: 2005
Keywords: Bias, Finite sample distribution, Fractional integration, Maximum likelihood, Monte Carlo simulation, Parametric estimation, Semiparametric estimation, Wavelet,
X-DOI: 10.1080/07474930500405790
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500405790
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:4:p:405-443
Template-Type: ReDIF-Article 1.0
Author-Name: Achim Zeileis
Author-X-Name-First: Achim
Author-X-Name-Last: Zeileis
Title: A Unified Approach to Structural Change Tests Based on ML Scores, F Statistics, and OLS Residuals
Abstract:
Three classes of structural change tests (or tests for parameter
instability) that have been receiving much attention in both the
statistics and the econometrics communities but have been developed in
rather loosely connected lines of research are unified by embedding them
into the framework of generalized M-fluctuation tests (Zeileis and Hornik,
2003). These classes are tests based on maximum likelihood scores
(including the Nyblom-Hansen test), on F statistics (sup F,
ave F, exp F tests), and on OLS residuals (OLS-based CUSUM and
MOSUM tests). We show that (representatives from) these classes are
special cases of the generalized M-fluctuation tests, based on the same
functional central limit theorem but employing different functionals for
capturing excessive fluctuations. After embedding these tests into the
same framework and thus understanding the relationship between these
procedures for testing in historical samples, it is shown how the tests
can also be extended to a monitoring situation. This is achieved by
establishing a general M-fluctuation monitoring procedure and then
applying the different functionals corresponding to monitoring with ML
scores, F statistics, and OLS residuals. In particular, an extension of
the sup F test to a monitoring scenario is suggested and illustrated
on a real-world data set.
Journal: Econometric Reviews
Pages: 445-466
Issue: 4
Volume: 24
Year: 2005
Keywords: Aggregation functional, Fluctuation test, Functional central limit theorem, Monitoring, Nyblom-Hansen test, OLS-based CUSUM test, Parameter instability, Structural change, sup F test,
X-DOI: 10.1080/07474930500406053
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500406053
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:4:p:445-466
Template-Type: ReDIF-Article 1.0
Author-Name: Takashi Yamagata
Author-X-Name-First: Takashi
Author-X-Name-Last: Yamagata
Author-Name: Chris Orme
Author-X-Name-First: Chris
Author-X-Name-Last: Orme
Title: On Testing Sample Selection Bias Under the Multicollinearity Problem
Abstract:
This paper reviews and extends the literature on the finite sample
behavior of tests for sample selection bias. Monte Carlo results show
that, when the “multicollinearity problem” identified by
Nawata (1993) is severe, (i) the t-test based on the Heckman-Greene
variance estimator can be unreliable, (ii) the Likelihood Ratio test
remains powerful, and (iii) nonnormality can be interpreted as severe
sample selection bias by Maximum Likelihood methods, leading to negative
Wald statistics. We also confirm previous findings (Leung and Yu, 1996)
that the standard regression-based t-test (Heckman, 1979) and the
asymptotically efficient Lagrange Multiplier test (Melino, 1982), are
robust to nonnormality but have very little power.
Journal: Econometric Reviews
Pages: 467-481
Issue: 4
Volume: 24
Year: 2005
Keywords: Lagrange multiplier test, Likelihood ratio test, Sample selection bias, t-test, Wald test,
X-DOI: 10.1080/02770900500406132
File-URL: http://www.tandfonline.com/doi/abs/10.1080/02770900500406132
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:24:y:2005:i:4:p:467-481
Template-Type: ReDIF-Article 1.0
Author-Name: Justin Tobias
Author-X-Name-First: Justin
Author-X-Name-Last: Tobias
Title: Estimation, Learning and Parameters of Interest in a Multiple Outcome Selection Model
Abstract:
We describe estimation, learning, and prediction in a treatment-response
model with two outcomes. Allowing for potential outcomes in this
model introduces four cross-regime correlation parameters that are not
contained in the likelihood for the observed data and thus are not
identified. Despite this inescapable identification problem, we build upon
the results of Koop and Poirier (1997) to describe how learning takes
place about the four nonidentified correlations through the imposed
positive definiteness of the covariance matrix. We then derive bivariate
distributions associated with commonly estimated “treatment
parameters” (including the Average Treatment Effect and effect of
Treatment on the Treated), and use the learning that takes place about the
nonidentified correlations to calculate these densities. We illustrate our
points in several generated data experiments and apply our methods to
estimate the joint impact of child labor on achievement scores in language
and mathematics.
Journal: Econometric Reviews
Pages: 1-40
Issue: 1
Volume: 25
Year: 2006
Keywords: Bayesian econometrics, Treatment effects,
X-DOI: 10.1080/07474930500545421
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500545421
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:1:p:1-40
Template-Type: ReDIF-Article 1.0
Author-Name: Pieter Omtzigt
Author-X-Name-First: Pieter
Author-X-Name-Last: Omtzigt
Author-Name: Stefano Fachin
Author-X-Name-First: Stefano
Author-X-Name-Last: Fachin
Title: The Size and Power of Bootstrap and Bartlett-Corrected Tests of Hypotheses on the Cointegrating Vectors
Abstract:
In this paper we compare Bartlett-corrected, bootstrap, and fast double
bootstrap tests on maximum likelihood estimates of cointegration
parameters. The key result is that both the bootstrap and the
Bartlett-corrected tests must be based on the unrestricted estimates of
the cointegrating vectors: procedures based on the restricted estimates
have almost no power. The small sample size bias of the asymptotic test
appears so severe as to advise strongly against its use with the sample
sizes commonly available; the fast double bootstrap test minimizes size
bias, while the Bartlett-corrected test is somewhat more powerful.
Journal: Econometric Reviews
Pages: 41-60
Issue: 1
Volume: 25
Year: 2006
Keywords: Bartlett correction, Bootstrap, Cointegration, Fast double bootstrap,
X-DOI: 10.1080/07474930500545439
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500545439
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:1:p:41-60
Template-Type: ReDIF-Article 1.0
Author-Name: Tommaso Proietti
Author-X-Name-First: Tommaso
Author-X-Name-Last: Proietti
Title: Trend-Cycle Decompositions with Correlated Components
Abstract:
This paper raises some interpretative issues that arise from univariate
trend-cycle decompositions with correlated disturbances. In particular, it
discusses whether the interpretation of a negative correlation as
providing evidence for the prominence of real, or supply, shocks, can be
supported. For this purpose it determines the conditions under which
correlated components may originate from the underestimation of the
cyclical component in an orthogonal decomposition; from the presence of a
growth rate cycle, rather than a deviation cycle; or alternatively, as a
consequence of the hysteresis phenomenon. Finally, it considers
interpreting correlated components in terms of permanent-transitory
decompositions, where the permanent component has richer dynamics than a
pure random walk. The consequences for smoothing and signal extraction are
discussed: in particular, it is documented that a negative correlation
implies that future observations carry most of the information needed to
assess cyclical stance. As a result, the components will be subject to
underestimation in real time and thus to high revisions. The overall
conclusion is that the characterization of economic fluctuations in
macroeconomic time series largely remains an open issue.
Journal: Econometric Reviews
Pages: 61-84
Issue: 1
Volume: 25
Year: 2006
Keywords: Hysteresis, Permanent-transitory decomposition, Revisions, Signal extraction,
X-DOI: 10.1080/07474930500545496
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500545496
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:1:p:61-84
Template-Type: ReDIF-Article 1.0
Author-Name: Jaroslava Hlouskova
Author-X-Name-First: Jaroslava
Author-X-Name-Last: Hlouskova
Author-Name: Martin Wagner
Author-X-Name-First: Martin
Author-X-Name-Last: Wagner
Title: The Performance of Panel Unit Root and Stationarity Tests: Results from a Large Scale Simulation Study
Abstract:
This paper presents results on the size and power of first generation
panel unit root and stationarity tests obtained from a large scale
simulation study. The tests developed in the following papers are
included: Levin et al. (2002), Harris and Tzavalis (1999), Breitung
(2000), Im et al. (1997, 2003), Maddala and Wu (1999), Hadri (2000), and
Hadri and Larsson (2005). Our simulation set-up is designed to address
inter alia the following issues. First, we assess the performance as a
function of the time and the cross-section dimensions. Second, we analyze
the impact of serial correlation introduced by positive MA roots, known to
have a detrimental impact on time series unit root tests, on the
performance. Third, we investigate the power of the panel unit root tests
(and the size of the stationarity tests) for a variety of first order
autoregressive coefficients. Fourth, we consider both of the usual
specifications of deterministic variables in the unit root literature.
Journal: Econometric Reviews
Pages: 85-116
Issue: 1
Volume: 25
Year: 2006
Keywords: Panel stationarity test, Panel unit root test, Power, Simulation study, Size,
X-DOI: 10.1080/07474930500545504
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500545504
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:1:p:85-116
Template-Type: ReDIF-Article 1.0
Author-Name: Fernanda Peixe
Author-X-Name-First: Fernanda
Author-X-Name-Last: Peixe
Author-Name: Alastair Hall
Author-X-Name-First: Alastair
Author-X-Name-Last: Hall
Author-Name: Kostas Kyriakoulis
Author-X-Name-First: Kostas
Author-X-Name-Last: Kyriakoulis
Title: The Mean Squared Error of the Instrumental Variables Estimator When the Disturbance Has an Elliptical Distribution
Abstract:
This paper generalizes Nagar's (1959) approximation to the finite sample
mean squared error (MSE) of the instrumental variables (IV) estimator to
the case in which the errors possess an elliptical distribution whose
moments exist up to infinite order. This allows for types of excess
kurtosis exhibited by some financial data series. This approximation is
compared numerically to Knight's (1985) formulae for the exact moments of
the IV estimator under nonnormality. We use the results to explore two
questions on instrument selection. First, we complement Buse's (1992)
analysis by considering the impact of additional instruments on both bias
and MSE. Second, we evaluate the properties of Andrews's (1999) selection
method in terms of the bias and MSE of the resulting IV estimator.
Journal: Econometric Reviews
Pages: 117-138
Issue: 1
Volume: 25
Year: 2006
X-DOI: 10.1080/07474930500545488
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930500545488
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:1:p:117-138
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: Multivariate Stochastic Volatility: An Overview
Abstract:
Journal: Econometric Reviews
Pages: 139-144
Issue: 2-3
Volume: 25
Year: 2006
X-DOI: 10.1080/07474930600712806
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600712806
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:139-144
Template-Type: ReDIF-Article 1.0
Author-Name: Manabu Asai
Author-X-Name-First: Manabu
Author-X-Name-Last: Asai
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Author-Name: Jun Yu
Author-X-Name-First: Jun
Author-X-Name-Last: Yu
Title: Multivariate Stochastic Volatility: A Review
Abstract:
The literature on multivariate stochastic volatility (MSV) models has
developed significantly over the last few years. This paper reviews the
substantial literature on specification, estimation, and evaluation of MSV
models. A wide range of MSV models is presented according to various
categories, namely, (i) asymmetric models, (ii) factor models, (iii)
time-varying correlation models, and (iv) alternative MSV specifications,
including models based on the matrix exponential transformation, the
Cholesky decomposition, and the Wishart autoregressive process.
Alternative methods of estimation, including quasi-maximum likelihood,
simulated maximum likelihood, and Markov chain Monte Carlo methods, are
discussed and compared. Various methods of diagnostic checking and model
comparison are also reviewed.
Journal: Econometric Reviews
Pages: 145-175
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Asymmetry, Diagnostic checking, Estimation, Factor models, Leverage, Model comparison, Multivariate stochastic volatility, Thresholds, Time-varying correlations, Transformations,
X-DOI: 10.1080/07474930600713564
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713564
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:145-175
Template-Type: ReDIF-Article 1.0
Author-Name: C. Gourieroux
Author-X-Name-First: C.
Author-X-Name-Last: Gourieroux
Title: Continuous Time Wishart Process for Stochastic Risk
Abstract:
Risks are usually represented and measured by volatility-covolatility
matrices. Wishart processes are models for a dynamic analysis of
multivariate risk and describe the evolution of stochastic
volatility-covolatility matrices, constrained to be symmetric positive
definite. The autoregressive Wishart process (WAR) is the multivariate
extension of the Cox, Ingersoll, Ross (CIR) process introduced for scalar
stochastic volatility. Like the CIR process, it allows for closed-form
solutions for a number of financial problems, such as term structure of
T-bonds and corporate bonds, derivative pricing in a multivariate
stochastic volatility model, and the structural model for credit risk.
Moreover, the Wishart dynamics are very flexible and are serious
competitors to less structural multivariate ARCH models.
Journal: Econometric Reviews
Pages: 177-217
Issue: 2-3
Volume: 25
Year: 2006
Keywords: JEL Number, G12, G13,
X-DOI: 10.1080/07474930600713234
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713234
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:177-217
Template-Type: ReDIF-Article 1.0
Author-Name: Charles Bos
Author-X-Name-First: Charles
Author-X-Name-Last: Bos
Author-Name: Neil Shephard
Author-X-Name-First: Neil
Author-X-Name-Last: Shephard
Title: Inference for Adaptive Time Series Models: Stochastic Volatility and Conditionally Gaussian State Space Form
Abstract:
In this paper we model the Gaussian errors in the standard Gaussian
linear state space model as stochastic volatility processes. We show that
conventional MCMC algorithms for this class of models are ineffective, but
that the problem can be alleviated by reparameterizing the model. Instead
of sampling the unobserved variance series directly, we sample in the
space of the disturbances, which proves to lower correlation in the
sampler and thus increases the quality of the Markov chain. Using our
reparameterized MCMC sampler, it is possible to estimate an unobserved
factor model for exchange rates between a group of n countries. The
underlying n + 1 country-specific currency strength factors and
the n + 1 currency volatility factors can be extracted using the
new methodology. With the factors, a more detailed image of the events
around the 1992 EMS crisis is obtained. We assess the fit of competing
models on the panels of exchange rates with an effective particle filter
and find that indeed the factor model is strongly preferred by the data.
Journal: Econometric Reviews
Pages: 219-244
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Markov chain Monte Carlo, Particle filter, State space form, Stochastic volatility,
X-DOI: 10.1080/07474930600713275
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713275
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:219-244
Template-Type: ReDIF-Article 1.0
Author-Name: David Chan
Author-X-Name-First: David
Author-X-Name-Last: Chan
Author-Name: Robert Kohn
Author-X-Name-First: Robert
Author-X-Name-Last: Kohn
Author-Name: Chris Kirby
Author-X-Name-First: Chris
Author-X-Name-Last: Kirby
Title: Multivariate Stochastic Volatility Models with Correlated Errors
Abstract:
We develop a Bayesian approach for parsimoniously estimating the
correlation structure of the errors in a multivariate stochastic
volatility model. Since the number of parameters in the joint correlation
matrix of the return and volatility errors is potentially very large, we
impose a prior that allows the off-diagonal elements of the inverse of the
correlation matrix to be identically zero. The model is estimated using a
Markov chain simulation method that samples from the posterior
distribution of the volatilities and parameters. We illustrate the
approach using both simulated and real examples. In the real examples, the
method is applied to equities at three levels of aggregation: returns for
firms within the same industry, returns for different industries, and
returns aggregated at the index level. We find pronounced correlation
effects only at the highest level of aggregation.
Journal: Econometric Reviews
Pages: 245-274
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Bayesian estimation, Correlation matrix, Leverage, Markov chain Monte Carlo, Model averaging,
X-DOI: 10.1080/07474930600713309
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713309
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:245-274
Template-Type: ReDIF-Article 1.0
Author-Name: Catherine Doz
Author-X-Name-First: Catherine
Author-X-Name-Last: Doz
Author-Name: Eric Renault
Author-X-Name-First: Eric
Author-X-Name-Last: Renault
Title: Factor Stochastic Volatility in Mean Models: A GMM Approach
Abstract:
This paper provides a semiparametric framework for modeling multivariate
conditional heteroskedasticity. We put forward latent stochastic
volatility (SV) factors as capturing the commonality in the joint
conditional variance matrix of asset returns. This approach is in line
with common features as studied by Engle and Kozicki (1993), and it allows
us to focus on identification of factors and factor loadings through first-
and second-order conditional moments only. We assume that the time-varying
part of risk premiums is based on constant prices of factor risks, and we
consider a factor SV in mean model. Additional specification of both
expectations and volatility of future volatility of factors provides
conditional moment restrictions, through which the parameters of the model
are all identified. These conditional moment restrictions pave the way for
instrumental variables estimation and GMM inference.
Journal: Econometric Reviews
Pages: 275-309
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Asset pricing, Common features, Conditional factor models, Generalized method of moments, Multivariate conditional heteroskedasticity, Multiperiod conditional moment restrictions, Stochastic volatility,
X-DOI: 10.1080/07474930600713325
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713325
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:275-309
Template-Type: ReDIF-Article 1.0
Author-Name: Alexander Philipov
Author-X-Name-First: Alexander
Author-X-Name-Last: Philipov
Author-Name: Mark Glickman
Author-X-Name-First: Mark
Author-X-Name-Last: Glickman
Title: Factor Multivariate Stochastic Volatility via Wishart Processes
Abstract:
This paper proposes a high dimensional factor multivariate stochastic
volatility (MSV) model in which factor covariance matrices are driven by
Wishart random processes. The framework allows for unrestricted
specification of intertemporal sensitivities, which can capture the
persistence in volatilities, kurtosis in returns, and correlation
breakdowns and contagion effects in volatilities. The factor structure
allows addressing high dimensional setups used in portfolio analysis and
risk management, as well as modeling conditional means and conditional
variances within the model framework. Owing to the complexity of the
model, we perform inference using Markov chain Monte Carlo simulation from
the posterior distribution. A simulation study is carried out to
demonstrate the efficiency of the estimation algorithm. We illustrate our
model on a data set that includes 88 individual equity returns and the two
Fama-French size and value factors. With this application, we demonstrate
the ability of the model to address high dimensional applications suitable
for asset allocation, risk management, and asset pricing.
Journal: Econometric Reviews
Pages: 311-334
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Bayesian time series, Factor models, Stochastic covariance, Time-varying correlation,
X-DOI: 10.1080/07474930600713366
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713366
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:311-334
Template-Type: ReDIF-Article 1.0
Author-Name: Roman Liesenfeld
Author-X-Name-First: Roman
Author-X-Name-Last: Liesenfeld
Author-Name: Jean-Francois Richard
Author-X-Name-First: Jean-Francois
Author-X-Name-Last: Richard
Title: Classical and Bayesian Analysis of Univariate and Multivariate Stochastic Volatility Models
Abstract:
In this paper, efficient importance sampling (EIS) is used to perform a
classical and Bayesian analysis of univariate and multivariate stochastic
volatility (SV) models for financial return series. EIS provides a highly
generic and very accurate procedure for the Monte Carlo (MC) evaluation of
high-dimensional interdependent integrals. It can be used to carry out
ML-estimation of SV models as well as simulation smoothing where the
latent volatilities are sampled at once. Based on this EIS simulation
smoother, a Bayesian Markov chain Monte Carlo (MCMC) posterior analysis of
the parameters of SV models can be performed.
Journal: Econometric Reviews
Pages: 335-360
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Dynamic latent variables, Markov chain Monte Carlo, Maximum likelihood, Simulation smoother,
X-DOI: 10.1080/07474930600713424
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713424
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:335-360
Template-Type: ReDIF-Article 1.0
Author-Name: Jun Yu
Author-X-Name-First: Jun
Author-X-Name-Last: Yu
Author-Name: Renate Meyer
Author-X-Name-First: Renate
Author-X-Name-Last: Meyer
Title: Multivariate Stochastic Volatility Models: Bayesian Estimation and Model Comparison
Abstract:
In this paper we show that fully likelihood-based estimation and
comparison of multivariate stochastic volatility (SV) models can be easily
performed via WinBUGS, a freely available Bayesian software package.
Moreover, we introduce to the literature several new specifications that
are natural extensions to certain existing models, one of which allows for
time-varying correlation coefficients. Ideas are illustrated by fitting,
to bivariate time series data on weekly exchange rates, nine
multivariate SV models, including the specifications with Granger
causality in volatility, time-varying correlations, heavy-tailed error
distributions, additive factor structure, and multiplicative factor
structure. Empirical results suggest that the best specifications are
those that allow for time-varying correlation coefficients.
Journal: Econometric Reviews
Pages: 361-384
Issue: 2-3
Volume: 25
Year: 2006
Keywords: DIC, Factors, Granger causality in volatility, Heavy-tailed distributions, MCMC, Multivariate stochastic volatility, Time-varying correlations,
X-DOI: 10.1080/07474930600713465
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600713465
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:361-384
Template-Type: ReDIF-Article 1.0
Author-Name: Borus Jungbacker
Author-X-Name-First: Borus
Author-X-Name-Last: Jungbacker
Author-Name: Siem Jan Koopman
Author-X-Name-First: Siem Jan
Author-X-Name-Last: Koopman
Title: Monte Carlo Likelihood Estimation for Three Multivariate Stochastic Volatility Models
Abstract:
Estimating parameters in a stochastic volatility (SV) model is a
challenging task. Among other estimation methods and approaches, efficient
simulation methods based on importance sampling have been developed for
the Monte Carlo maximum likelihood estimation of univariate SV models.
This paper shows that importance sampling methods can be used in a general
multivariate SV setting. The sampling methods are computationally
efficient. To illustrate the versatility of this approach, three different
multivariate stochastic volatility models are estimated for a standard
data set. The empirical results are compared to those from earlier studies
in the literature. Monte Carlo simulation experiments, based on parameter
estimates from the standard data set, are used to show the effectiveness
of the importance sampling methods.
Journal: Econometric Reviews
Pages: 385-408
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Importance sampling, Monte Carlo likelihood, Stochastic volatility,
X-DOI: 10.1080/07474930600712848
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600712848
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:385-408
Template-Type: ReDIF-Article 1.0
Author-Name: Ben Tims
Author-X-Name-First: Ben
Author-X-Name-Last: Tims
Author-Name: Ronald Mahieu
Author-X-Name-First: Ronald
Author-X-Name-Last: Mahieu
Title: A Range-Based Multivariate Stochastic Volatility Model for Exchange Rates
Abstract:
In this paper we present a parsimonious multivariate model for exchange
rate volatilities based on logarithmic high-low ranges of daily exchange
rates. The multivariate stochastic volatility model decomposes the log
range of each exchange rate into two independent latent factors, which
could be interpreted as the underlying currency-specific components. Owing
to the empirical normality of the logarithmic range measure, the model can
be estimated conveniently with the standard Kalman filter methodology. Our
results show that our model fits the exchange rate data quite well.
Exchange rate news seems to be currency specific and allows identification
of currency contributions to both exchange rate levels and exchange rate
volatilities.
Journal: Econometric Reviews
Pages: 409-424
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Exchange rates, Multivariate stochastic volatility models, Range-based volatility,
X-DOI: 10.1080/07474930600712814
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600712814
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:409-424
Template-Type: ReDIF-Article 1.0
Author-Name: Michael Smith
Author-X-Name-First: Michael
Author-X-Name-Last: Smith
Author-Name: Andrew Pitts
Author-X-Name-First: Andrew
Author-X-Name-Last: Pitts
Title: Foreign Exchange Intervention by the Bank of Japan: Bayesian Analysis Using a Bivariate Stochastic Volatility Model
Abstract:
A bivariate stochastic volatility model is employed to measure the effect
of intervention by the Bank of Japan (BOJ) on daily returns and volume in
the USD/YEN foreign exchange market. Missing observations are accounted
for, and a data-based Wishart prior for the precision matrix of the errors
to the transition equation that is in line with the likelihood is
suggested. Empirical results suggest there is strong conditional
heteroskedasticity in the mean-corrected volume measure, as well as
contemporaneous correlation in the errors to both the observation and
transition equations. A threshold model is used for the BOJ reaction
function, which is estimated jointly with the bivariate stochastic
volatility model via Markov chain Monte Carlo. This accounts for
endogeneity between volatility in the market and the BOJ reaction
function, something that has hindered much previous empirical analysis in
the literature on central bank intervention. The empirical results suggest
there was a shift in behavior by the BOJ, with a movement away from a
policy of market stabilization and toward a role of support for domestic
monetary policy objectives. Throughout, we observe “leaning against
the wind” behavior, something that is a feature of most previous
empirical analysis of central bank intervention. A comparison with a
bivariate EGARCH model suggests that the bivariate stochastic volatility
model produces estimates that better capture spikes in in-sample
volatility. This is important in improving estimates of a central bank
reaction function because it is at these periods of high daily volatility
that central banks more frequently intervene.
Journal: Econometric Reviews
Pages: 425-451
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Central bank intervention, Foreign exchange volume, Markov chain Monte Carlo, Missing observations, Multivariate stochastic volatility, Threshold model,
X-DOI: 10.1080/07474930600712897
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600712897
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:425-451
Template-Type: ReDIF-Article 1.0
Author-Name: Manabu Asai
Author-X-Name-First: Manabu
Author-X-Name-Last: Asai
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: Asymmetric Multivariate Stochastic Volatility
Abstract:
This paper proposes and analyses two types of asymmetric multivariate
stochastic volatility (SV) models, namely, (i) the SV with leverage (SV-L)
model, which is based on the negative correlation between the innovations
in the returns and volatility, and (ii) the SV with leverage and size
effect (SV-LSE) model, which is based on the signs and magnitude of the
returns. The paper derives the state space form for the logarithm of the
squared returns, which follow the multivariate SV-L model, and develops
estimation methods for the multivariate SV-L and SV-LSE models based on
the Monte Carlo likelihood (MCL) approach. The empirical results show that
the multivariate SV-LSE model fits the bivariate and trivariate returns of
the S&P 500, the Nikkei 225, and the Hang Seng indexes with respect to AIC
and BIC more accurately than does the multivariate SV-L model. Moreover,
the empirical results suggest that the univariate models should be
rejected in favor of their bivariate and trivariate counterparts.
Journal: Econometric Reviews
Pages: 453-473
Issue: 2-3
Volume: 25
Year: 2006
Keywords: Asymmetric leverage, Bayesian Markov chain Monte Carlo, Dynamic leverage, Importance sampling, Multivariate stochastic volatility, Numerical likelihood, Size effect,
X-DOI: 10.1080/07474930600712913
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600712913
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:2-3:p:453-473
Template-Type: ReDIF-Article 1.0
Author-Name: Esmeralda Ramalho
Author-X-Name-First: Esmeralda
Author-X-Name-Last: Ramalho
Author-Name: Joaquim Ramalho
Author-X-Name-First: Joaquim
Author-X-Name-Last: Ramalho
Title: Bias-Corrected Moment-Based Estimators for Parametric Models Under Endogenous Stratified Sampling
Abstract:
This paper provides an integrated approach for estimating parametric
models from endogenous stratified samples. We discuss several alternative
ways of removing the bias of the moment indicators usually employed under
random sampling for estimating the parameters of the structural model and
the proportion of the strata in the population. Those alternatives give
rise to a number of moment-based estimators that are appropriate both when
the marginal strata probabilities are known and when they are unknown. The
derivation of our estimators is very simple and intuitive and incorporates
as particular cases most of the likelihood-based estimators previously
suggested by other authors.
Journal: Econometric Reviews
Pages: 475-496
Issue: 4
Volume: 25
Year: 2006
Keywords: Bias correction, Endogenous stratified sampling, GMM, Parametric models,
X-DOI: 10.1080/07474930600972574
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600972574
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:4:p:475-496
Template-Type: ReDIF-Article 1.0
Author-Name: Leopold Simar
Author-X-Name-First: Leopold
Author-X-Name-Last: Simar
Author-Name: Valentin Zelenyuk
Author-X-Name-First: Valentin
Author-X-Name-Last: Zelenyuk
Title: On Testing Equality of Distributions of Technical Efficiency Scores
Abstract:
The challenge of the econometric problem in production efficiency
analysis is that the efficiency scores to be analyzed are unobserved.
Statistical properties have recently been discovered for a type of
estimator popular in the literature, known as data envelopment analysis
(DEA). This opens up a wide range of possibilities for well-grounded
statistical inference about the true efficiency scores from their DEA
estimates. In this paper we investigate the possibility of using existing
tests for the equality of two distributions in such a context. Considering
the statistical complications involved, we consider
several approaches to adapting the Li test to this setting and explore
their performance in terms of the size and power of the test in various
Monte Carlo experiments. One of these approaches shows good performance
for both the size and the power of the test, thus encouraging its use in
empirical studies. We also present an empirical illustration analyzing the
efficiency distributions of countries in the world, following up a recent
study by Kumar and Russell (2002), and report very interesting results.
Journal: Econometric Reviews
Pages: 497-522
Issue: 4
Volume: 25
Year: 2006
Keywords: Bootstrap, DEA, Kernel density estimation and tests,
X-DOI: 10.1080/07474930600972582
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600972582
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:4:p:497-522
Template-Type: ReDIF-Article 1.0
Author-Name: Jeffery Racine
Author-X-Name-First: Jeffery
Author-X-Name-Last: Racine
Author-Name: Jeffrey Hart
Author-X-Name-First: Jeffrey
Author-X-Name-Last: Hart
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Title: Testing the Significance of Categorical Predictor Variables in Nonparametric Regression Models
Abstract:
In this paper we propose a test for the significance of categorical
predictors in nonparametric regression models. The test is fully
data-driven and employs cross-validated smoothing parameter selection
while the null distribution of the test is obtained via bootstrapping. The
proposed approach allows applied researchers to test hypotheses concerning
categorical variables in a fully nonparametric and robust framework,
thereby deflecting potential criticism that a particular finding is driven
by an arbitrary parametric specification. Simulations reveal that the test
performs well, having significantly better power than a conventional
frequency-based nonparametric test. The test is applied to determine
whether OECD and non-OECD countries follow the same growth rate model or
not. Our test suggests that OECD and non-OECD countries follow different
growth rate models, while the tests based on a popular parametric
specification and the conventional frequency-based nonparametric
estimation method fail to detect any significant difference.
Journal: Econometric Reviews
Pages: 523-544
Issue: 4
Volume: 25
Year: 2006
Keywords: Discrete regressors, Inference, Kernel smoothing,
X-DOI: 10.1080/07474930600972590
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600972590
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:25:y:2006:i:4:p:523-544
Template-Type: ReDIF-Article 1.0
Author-Name: Massimiliano Caporin
Author-X-Name-First: Massimiliano
Author-X-Name-Last: Caporin
Title: Variance (Non) Causality in Multivariate GARCH
Abstract:
This paper extends the current literature on the variance-causality topic
providing the coefficient restrictions ensuring variance noncausality
within multivariate GARCH models with in-mean effects. Furthermore, this
paper presents a new multivariate model, the exponential causality GARCH.
Through the introduction of a multiplicative causality impact function, the
variance causality effects become directly interpretable and can
therefore be used to detect both the existence of causality and its
direction; notably, the proposed model allows for increasing and
decreasing variance effects. An empirical application provides evidence of
negative causality effects between returns and volume of an Italian stock
market index futures contract.
Journal: Econometric Reviews
Pages: 1-24
Issue: 1
Volume: 26
Year: 2007
Keywords: Multivariate GARCH, Variance causality, Volatility,
X-DOI: 10.1080/07474930600972178
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600972178
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:1:p:1-24
Template-Type: ReDIF-Article 1.0
Author-Name: Erik Meijer
Author-X-Name-First: Erik
Author-X-Name-Last: Meijer
Author-Name: Tom Wansbeek
Author-X-Name-First: Tom
Author-X-Name-Last: Wansbeek
Title: The Sample Selection Model from a Method of Moments Perspective
Abstract:
It is shown how the usual two-step estimator for the standard sample
selection model can be seen as a method of moments estimator. Standard GMM
theory can be brought to bear on this model, greatly simplifying the
derivation of its asymptotic properties. Using this setup,
the asymptotic variance is derived in detail and a consistent estimator of
it is obtained that is guaranteed to be positive definite, in contrast
with the estimator given in the literature. It is demonstrated how the MM
approach easily accommodates variations on the estimator, like the
two-step IV estimator that handles endogenous regressors, and a two-step
GLS estimator. Furthermore, it is shown that from the MM formulation, it
is straightforward to derive various specification tests, in particular
tests for selection bias, equivalence with the censored regression model,
normality, homoskedasticity, and exogeneity.
Journal: Econometric Reviews
Pages: 25-51
Issue: 1
Volume: 26
Year: 2007
Keywords: GMM, Heckman estimator, Tobit,
X-DOI: 10.1080/07474930600972194
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600972194
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:1:p:25-51
Template-Type: ReDIF-Article 1.0
Author-Name: Eric Ghysels
Author-X-Name-First: Eric
Author-X-Name-Last: Ghysels
Author-Name: Arthur Sinko
Author-X-Name-First: Arthur
Author-X-Name-Last: Sinko
Author-Name: Rossen Valkanov
Author-X-Name-First: Rossen
Author-X-Name-Last: Valkanov
Title: MIDAS Regressions: Further Results and New Directions
Abstract:
We explore mixed data sampling (henceforth MIDAS) regression models. The
regressions involve time series data sampled at different frequencies.
Volatility and related processes are our prime focus, though the
regression method has wider applications in macroeconomics and finance,
among other areas. The regressions combine recent developments regarding
estimation of volatility and a not-so-recent literature on distributed lag
models. We study various lag structures to parameterize parsimoniously the
regressions and relate them to existing models. We also propose several
new extensions of the MIDAS framework. The paper concludes with an
empirical section where we provide further evidence and new results on the
risk-return trade-off. We also report empirical evidence on microstructure
noise and volatility forecasting.
Journal: Econometric Reviews
Pages: 53-90
Issue: 1
Volume: 26
Year: 2007
Keywords: Microstructure noise, Nonlinear MIDAS, Risk, Tick-by-tick applications, Volatility,
X-DOI: 10.1080/07474930600972467
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600972467
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:1:p:53-90
Template-Type: ReDIF-Article 1.0
Author-Name: Isabel Casas
Author-X-Name-First: Isabel
Author-X-Name-Last: Casas
Author-Name: Jiti Gao
Author-X-Name-First: Jiti
Author-X-Name-Last: Gao
Title: Nonparametric Methods in Continuous Time Model Specification
Abstract:
Several popular parametric diffusion processes are commonly assumed as the
underlying diffusion process. This paper considers an important case
where both the drift and volatility functions of the underlying diffusion
process are unknown functions of the underlying process, and then proposes
using two novel testing procedures for the parametric specification of
both the drift and diffusion functions. The finite-sample properties of
the proposed tests are assessed using data generated from four
popular parametric models. In our implementation, we suggest using a
simulated critical value for each case in addition to the use of an
asymptotic critical value. Our detailed studies show that there is little
size distortion when a simulated critical value is used, whereas the
proposed tests exhibit some size distortion when an asymptotic critical
value is used in each case.
Journal: Econometric Reviews
Pages: 91-106
Issue: 1
Volume: 26
Year: 2007
Keywords: Continuous-time model, Financial econometrics, Nonparametric kernel, Specification testing,
X-DOI: 10.1080/07474930600972558
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930600972558
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:1:p:91-106
Template-Type: ReDIF-Article 1.0
Author-Name: Gary Koop
Author-X-Name-First: Gary
Author-X-Name-Last: Koop
Author-Name: Herman K. van Dijk
Author-X-Name-First: Herman K.
Author-X-Name-Last: van Dijk
Title: Editors' Introduction to the Special Issue of Econometric Reviews on Bayesian Dynamic Econometrics
Abstract:
Journal: Econometric Reviews
Pages: 107-112
Issue: 2-4
Volume: 26
Year: 2007
X-DOI: 10.1080/07474930701220675
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220675
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:107-112
Template-Type: ReDIF-Article 1.0
Author-Name: Sungbae An
Author-X-Name-First: Sungbae
Author-X-Name-Last: An
Author-Name: Frank Schorfheide
Author-X-Name-First: Frank
Author-X-Name-Last: Schorfheide
Title: Bayesian Analysis of DSGE Models
Abstract:
This paper reviews Bayesian methods that have been developed in recent
years to estimate and evaluate dynamic stochastic general equilibrium
(DSGE) models. We consider the estimation of linearized DSGE models, the
evaluation of models based on Bayesian model checking, posterior odds
comparisons, and comparisons to vector autoregressions, as well as the
non-linear estimation based on a second-order accurate model solution.
These methods are applied to data generated from correctly specified and
misspecified linearized DSGE models and a DSGE model that was solved with
a second-order perturbation method.
Journal: Econometric Reviews
Pages: 113-172
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian analysis, DSGE models, Model evaluation, Vector autoregressions,
X-DOI: 10.1080/07474930701220071
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220071
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:113-172
Template-Type: ReDIF-Article 1.0
Author-Name: Malin Adolfson
Author-X-Name-First: Malin
Author-X-Name-Last: Adolfson
Author-Name: Jesper Linde
Author-X-Name-First: Jesper
Author-X-Name-Last: Linde
Author-Name: Mattias Villani
Author-X-Name-First: Mattias
Author-X-Name-Last: Villani
Title: Bayesian Analysis of DSGE Models—Some Comments
Abstract:
Sungbae An and Frank Schorfheide have provided an excellent review of the
main elements of Bayesian inference in Dynamic Stochastic General
Equilibrium (DSGE) models. Bayesian methods have, for reasons clearly
outlined in the paper, a very natural role to play in DSGE analysis, and
the appeal of the Bayesian paradigm is indeed strongly evidenced by the
flood of empirical applications in the area over the last couple of years.
We expect their paper to be the natural starting point for applied
economists interested in learning about Bayesian techniques for analyzing
DSGE models, and as such the paper is likely to have a strong influence on
what will be considered best practice for estimating DSGE models. The
authors have, for good reasons, chosen a stylized six-equation model to
present the methodology. We shall use here the large-scale model in
Adolfson et al. (2005), henceforth ALLV, to illustrate a few econometric
problems which we have found to be especially important as the size of the
model increases. The model in ALLV is an open economy extension of the
closed economy model in Christiano et al. (2005). It consists of 25
log-linearized equations, which can be written as a state space
representation with 60 state variables, many of them unobserved. Fifteen
observed unfiltered time series are used to estimate 51 structural
parameters. An additional complication compared to the model in An and
Schorfheide's paper is that some of the coefficients in the measurement
equation are non-linear functions of the structural parameters. The model
is currently the main vehicle for policy analysis at Sveriges Riksbank
(Central Bank of Sweden) and similar models are being developed in many
other policy institutions, which testifies to the model's practical
relevance. The version considered here is estimated on Euro area data over
the period 1980Q1-2002Q4. We refer to ALLV for details.
Journal: Econometric Reviews
Pages: 173-185
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian, DSGE, MCMC, Marginal likelihood,
X-DOI: 10.1080/07474930701220121
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220121
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:173-185
Template-Type: ReDIF-Article 1.0
Author-Name: Fabio Canova
Author-X-Name-First: Fabio
Author-X-Name-Last: Canova
Title: Bayesian Analysis of DSGE Models by S. An and F. Schorfheide
Abstract:
The paper that An and Schorfheide have written is an excellent piece of
work and will become a useful reference for teaching and consultation
purposes. The paper discusses in an articulate and convincing manner
almost everything that one could think of covering in such a review. This
makes the task of the commentator difficult. Nevertheless, I will attempt
to add a few insights on three issues which, in my opinion, play an
important role in applied work and in the interpretation of estimation
results. In particular, I will discuss a) the sensitivity of posterior
distributions to prior spreads; b) the effects of model misspecification
and an approach to model respecification; c) parameter identification and
its consequences for posterior inference.
Journal: Econometric Reviews
Pages: 187-192
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian analysis, DSGE models, Model evaluation, Vector autoregressions,
X-DOI: 10.1080/07474930701220162
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220162
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:187-192
Template-Type: ReDIF-Article 1.0
Author-Name: John Geweke
Author-X-Name-First: John
Author-X-Name-Last: Geweke
Title: Comment
Abstract:
The article provides detailed and accurate illustrations of Bayesian
analysis of DSGE models that are likely to be used increasingly in support
of central bank policy making. These comments identify a dozen aspects of
these methods, discussing how their application and improvement can
contribute to effective support of policy.
Journal: Econometric Reviews
Pages: 193-200
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian, DSGE models, Markov Chain Monte Carlo, Model selection,
X-DOI: 10.1080/07474930701220196
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220196
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:193-200
Template-Type: ReDIF-Article 1.0
Author-Name: Fabio Milani
Author-X-Name-First: Fabio
Author-X-Name-Last: Milani
Author-Name: Dale J. Poirier
Author-X-Name-First: Dale J.
Author-X-Name-Last: Poirier
Title: Econometric Issues in DSGE Models
Abstract:
Journal: Econometric Reviews
Pages: 201-204
Issue: 2-4
Volume: 26
Year: 2007
X-DOI: 10.1080/07474930701220204
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220204
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:201-204
Template-Type: ReDIF-Article 1.0
Author-Name: Tao Zha
Author-X-Name-First: Tao
Author-X-Name-Last: Zha
Title: Comment on An and Schorfheide's Bayesian Analysis of DSGE Models
Abstract:
An and Schorfheide's article provides an excellent review of Bayesian
estimation of DSGE models. Rather than recapitulating the points already
made in this article, my comment focuses on three aspects. It proposes a
convergence measure to take account of serial correlation of MCMC draws,
explains why the DSGE-VAR framework for policy analysis can be improved by
avoiding the ad hoc identification assumption, and discusses an
alternative structural approach to model misspecification.
Journal: Econometric Reviews
Pages: 205-210
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Ad hoc identification, Beliefs, B/W ratio, Off-equilibrium paths, Self-confirming equilibrium,
X-DOI: 10.1080/07474930701220212
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220212
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:205-210
Template-Type: ReDIF-Article 1.0
Author-Name: Sungbae An
Author-X-Name-First: Sungbae
Author-X-Name-Last: An
Author-Name: Frank Schorfheide
Author-X-Name-First: Frank
Author-X-Name-Last: Schorfheide
Title: Bayesian Analysis of DSGE Models—Rejoinder
Abstract:
We would like to thank all the discussants for their stimulating
comments. While our article to a large extent reviews current practice of
Bayesian analysis of Dynamic Stochastic General Equilibrium (DSGE) models,
the discussants provide many ideas to improve upon the current practice,
thereby outlining a research agenda for the years to come. In our
rejoinder we will briefly revisit some of the issues that were raised.
Journal: Econometric Reviews
Pages: 211-219
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian analysis, DSGE models, Hybrid MCMC algorithm, Identification,
X-DOI: 10.1080/07474930701220246
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220246
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:211-219
Template-Type: ReDIF-Article 1.0
Author-Name: James D. Hamilton
Author-X-Name-First: James D.
Author-X-Name-Last: Hamilton
Author-Name: Daniel F. Waggoner
Author-X-Name-First: Daniel F.
Author-X-Name-Last: Waggoner
Author-Name: Tao Zha
Author-X-Name-First: Tao
Author-X-Name-Last: Zha
Title: Normalization in Econometrics
Abstract:
The issue of normalization arises whenever two different values for a
vector of unknown parameters imply the identical economic model. A
normalization not only provides a rule for selecting which among equivalent
points to call the maximum likelihood estimate (MLE); it also governs the
topography of the set of points that go into a small-sample confidence
interval associated with that MLE. A poor normalization can lead to
multimodal distributions, disjoint confidence intervals, and very
misleading characterizations of the true statistical uncertainty. This
paper introduces an identification principle as a framework upon which a
normalization should be imposed, according to which the boundaries of the
allowable parameter space should correspond to loci along which the model
is locally unidentified. We illustrate these issues with examples taken
from mixture models, structural vector autoregressions, and cointegration
models.
Journal: Econometric Reviews
Pages: 221-252
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Cointegration, Local identification, Mixture distributions, Maximum likelihood estimate, Numerical Bayesian methods, Regime-switching, Small sample distributions, Vector autoregressions, Weak identification,
X-DOI: 10.1080/07474930701220329
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220329
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:221-252
Template-Type: ReDIF-Article 1.0
Author-Name: Hashem Pesaran
Author-X-Name-First: Hashem
Author-X-Name-Last: Pesaran
Author-Name: Davide Pettenuzzo
Author-X-Name-First: Davide
Author-X-Name-Last: Pettenuzzo
Author-Name: Allan Timmermann
Author-X-Name-First: Allan
Author-X-Name-Last: Timmermann
Title: Learning, Structural Instability, and Present Value Calculations
Abstract:
Present value calculations require predictions of cash flows both at near
and distant future points in time. Such predictions are generally
surrounded by considerable uncertainty and may critically depend on
assumptions about parameter values as well as the form and stability of
the data generating process underlying the cash flows. This paper presents
new theoretical results for the existence of the infinite sum of
discounted expected future values under uncertainty about the parameters
characterizing the growth rate of the cash flow process. Furthermore, we
explore the consequences for present values of relaxing the stability
assumption in a way that allows for past and future breaks to the
underlying cash flow process. We find that such breaks can lead to
considerable changes in present values.
Journal: Econometric Reviews
Pages: 253-288
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian learning, Present value, Stock prices, Structural breaks,
X-DOI: 10.1080/07474930701220352
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220352
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:253-288
Template-Type: ReDIF-Article 1.0
Author-Name: Malin Adolfson
Author-X-Name-First: Malin
Author-X-Name-Last: Adolfson
Author-Name: Jesper Linde
Author-X-Name-First: Jesper
Author-X-Name-Last: Linde
Author-Name: Mattias Villani
Author-X-Name-First: Mattias
Author-X-Name-Last: Villani
Title: Forecasting Performance of an Open Economy DSGE Model
Abstract:
This paper analyzes the forecasting performance of an open economy
dynamic stochastic general equilibrium (DSGE) model, estimated with
Bayesian methods, for the Euro area during 1994Q1-2002Q4. We compare the
DSGE model and a few variants of this model to various reduced-form
forecasting models such as vector autoregressions (VARs) and vector error
correction models (VECM), estimated both by maximum likelihood and two
different Bayesian approaches, and traditional benchmark models, e.g., the
random walk. The accuracy of point forecasts, interval forecasts and the
predictive distribution as a whole are assessed in an out-of-sample
rolling event evaluation using several univariate and multivariate
measures. The results show that the open economy DSGE model compares well
with more empirical models and thus that the tension between rigor and fit
in older generations of DSGE models is no longer present. We also
critically examine the role of Bayesian model probabilities and other
frequently used low-dimensional summaries, e.g., the log determinant
statistic, as measures of overall forecasting performance.
Journal: Econometric Reviews
Pages: 289-328
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian inference, Forecasting, Open economy DSGE model, Vector autoregressive models,
X-DOI: 10.1080/07474930701220543
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220543
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:289-328
Template-Type: ReDIF-Article 1.0
Author-Name: Jana Eklund
Author-X-Name-First: Jana
Author-X-Name-Last: Eklund
Author-Name: Sune Karlsson
Author-X-Name-First: Sune
Author-X-Name-Last: Karlsson
Title: Forecast Combination and Model Averaging Using Predictive Measures
Abstract:
We extend the standard approach to Bayesian forecast combination by
forming the weights for the model averaged forecast from the predictive
likelihood rather than the standard marginal likelihood. The use of
predictive measures of fit offers greater protection against in-sample
overfitting when uninformative priors on the model parameters are used and
improves forecast performance. For the predictive likelihood we argue that
the forecast weights have good large and small sample properties. This is
confirmed in a simulation study and in an application to forecasts of the
Swedish inflation rate, where forecast combination using the predictive
likelihood outperforms standard Bayesian model averaging using the
marginal likelihood.
Journal: Econometric Reviews
Pages: 329-363
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian model averaging, Inflation rate, Partial Bayes factor, Predictive likelihood, Training sample, Uninformative priors,
X-DOI: 10.1080/07474930701220550
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220550
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:329-363
Template-Type: ReDIF-Article 1.0
Author-Name: L. Bauwens
Author-X-Name-First: L.
Author-X-Name-Last: Bauwens
Author-Name: J. V. K. Rombouts
Author-X-Name-First: J. V. K.
Author-X-Name-Last: Rombouts
Title: Bayesian Clustering of Many Garch Models
Abstract:
We consider the estimation of a large number of GARCH models, of the
order of several hundreds. Our interest lies in the identification of
common structures in the volatility dynamics of the univariate time
series. To do so, we classify the series in an unknown number of clusters.
Within a cluster, the series share the same model and the same parameters.
Each cluster therefore contains similar series. We do not know a priori
which series belong to which cluster. The model is a finite mixture of
distributions, where the component weights are unknown parameters and each
component distribution has its own conditional mean and variance.
Inference is done by the Bayesian approach, using data augmentation
techniques. Simulations and an illustration using data on U.S. stocks are
provided.
Journal: Econometric Reviews
Pages: 365-386
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian inference, Clustering, GARCH, Gibbs sampling, Mixtures,
X-DOI: 10.1080/07474930701220576
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220576
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:365-386
Template-Type: ReDIF-Article 1.0
Author-Name: Catherine S. Forbes
Author-X-Name-First: Catherine S.
Author-X-Name-Last: Forbes
Author-Name: Gael M. Martin
Author-X-Name-First: Gael M.
Author-X-Name-Last: Martin
Author-Name: Jill Wright
Author-X-Name-First: Jill
Author-X-Name-Last: Wright
Title: Inference for a Class of Stochastic Volatility Models Using Option and Spot Prices: Application of a Bivariate Kalman Filter
Abstract:
In this paper Bayesian methods are applied to a stochastic volatility
model using both the prices of the asset and the prices of options written
on the asset. Posterior densities for all model parameters, latent
volatilities and the market price of volatility risk are produced via a
Markov Chain Monte Carlo (MCMC) sampling algorithm. Candidate draws for
the unobserved volatilities are obtained in blocks by applying the Kalman
filter and simulation smoother to a linearization of a nonlinear state
space representation of the model. Crucially, information from both the
spot and option prices affects the draws via the specification of a
bivariate measurement equation, with implied Black-Scholes volatilities
used to proxy observed option prices in the candidate model. Alternative
models nested within the Heston (1993) framework are ranked via posterior
odds ratios, as well as via fit, predictive and hedging performance. The
method is illustrated using Australian News Corporation spot and option
price data.
Journal: Econometric Reviews
Pages: 387-418
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian inference, Markov Chain Monte Carlo, Multi-move sampler, Option pricing, Nonlinear state space model, Volatility risk,
X-DOI: 10.1080/07474930701220584
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220584
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:387-418
Template-Type: ReDIF-Article 1.0
Author-Name: Petros Dellaportas
Author-X-Name-First: Petros
Author-X-Name-Last: Dellaportas
Author-Name: David G. T. Denison
Author-X-Name-First: David G. T.
Author-X-Name-Last: Denison
Author-Name: Chris Holmes
Author-X-Name-First: Chris
Author-X-Name-Last: Holmes
Title: Flexible Threshold Models for Modelling Interest Rate Volatility
Abstract:
This paper focuses on interest rate models with regime switching and
extends previous nonlinear threshold models by relaxing the assumption of
a fixed number of regimes. Instead we suggest automatic model
determination through Bayesian inference via the reversible jump Markov
Chain Monte Carlo (MCMC) algorithm. Moreover, we allow the thresholds in
the volatility to be driven not only by the interest rate but also by
other economic factors. We illustrate our methodology by applying it to
interest rates and other economic factors of the American economy.
Journal: Econometric Reviews
Pages: 419-437
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Interest rates, Markov Chain Monte Carlo, Reversible jump, Threshold model,
X-DOI: 10.1080/07474930701220600
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220600
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:419-437
Template-Type: ReDIF-Article 1.0
Author-Name: Rodney W. Strachan
Author-X-Name-First: Rodney W.
Author-X-Name-Last: Strachan
Title: Bayesian Inference in Cointegrated I(2) Systems: A Generalization of the Triangular Model
Abstract:
This paper generalizes the cointegrating model of Phillips (1991) to
allow for I(0), I(1), and I(2) processes. The model has a simple form
that permits a wider range of I(2) processes than are usually considered,
including a more flexible form of polynomial cointegration. Further, the
specification relaxes restrictions identified by Phillips (1991) on the
I(1) and I(2) cointegrating vectors and restrictions on how the stochastic
trends enter the system. To date there has been little work on Bayesian
I(2) analysis, and so this paper attempts to address this gap in the
literature. A method of Bayesian inference in potentially I(2) processes
is presented with an application to Australian money demand using a
Jeffreys prior and a shrinkage prior.
Journal: Econometric Reviews
Pages: 439-468
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Cointegration, Bayesian, I(2) Analysis, Money demand,
X-DOI: 10.1080/07474930701220618
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220618
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:439-468
Template-Type: ReDIF-Article 1.0
Author-Name: Luc Bauwens
Author-X-Name-First: Luc
Author-X-Name-Last: Bauwens
Author-Name: Michel Lubrano
Author-X-Name-First: Michel
Author-X-Name-Last: Lubrano
Title: Bayesian Inference in Dynamic Disequilibrium Models: An Application to the Polish Credit Market
Abstract:
We propose a Bayesian approach for inference in a dynamic disequilibrium
model. To circumvent the difficulties raised by the Maddala and Nelson
(1974) specification in the dynamic case, we analyze a dynamic extended
version of the disequilibrium model of Ginsburgh et al. (1980). We develop
a Gibbs sampler based on the simulation of the missing observations. The
feasibility of the approach is illustrated by an empirical analysis of the
Polish credit market, for which we conduct a specification search using
the posterior deviance criterion of Spiegelhalter et al. (2002).
Journal: Econometric Reviews
Pages: 469-486
Issue: 2-4
Volume: 26
Year: 2007
Keywords: Bayesian inference, Credit rationing, Data augmentation, Disequilibrium model, Latent variables, Poland,
X-DOI: 10.1080/07474930701220634
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701220634
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:2-4:p:469-486
Template-Type: ReDIF-Article 1.0
Author-Name: Bent Nielsen
Author-X-Name-First: Bent
Author-X-Name-Last: Nielsen
Author-Name: J. James Reade
Author-X-Name-First: J. James
Author-X-Name-Last: Reade
Title: Simulating Properties of the Likelihood Ratio Test for a Unit Root in an Explosive Second-Order Autoregression
Abstract:
This paper provides a means of accurately simulating explosive
autoregressive processes and uses this method to analyze the distribution
of the likelihood ratio test statistic for a unit root in an explosive
second-order autoregressive process. While the standard Dickey-Fuller
distribution is known to apply in this case, simulations of statistics in
the explosive region are beset by the magnitude of the numbers involved,
which causes numerical inaccuracies. This has previously constituted a bar
to supporting asymptotic results by means of simulation and to analyzing
the finite sample properties of tests in the explosive region.
Journal: Econometric Reviews
Pages: 487-501
Issue: 5
Volume: 26
Year: 2007
Keywords: Explosive autoregression, Simulation, Unit root test,
X-DOI: 10.1080/07474930701512055
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701512055
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:487-501
Template-Type: ReDIF-Article 1.0
Author-Name: Florenz Plassmann
Author-X-Name-First: Florenz
Author-X-Name-Last: Plassmann
Author-Name: Neha Khanna
Author-X-Name-First: Neha
Author-X-Name-Last: Khanna
Title: Assessing the Precision of Turning Point Estimates in Polynomial Regression Functions
Abstract:
Researchers often report point estimates of turning point(s) obtained in
polynomial regression models but rarely assess the precision of these
estimates. We discuss three methods to assess the precision of such
turning point estimates. The first is the delta method that leads to a
normal approximation of the distribution of the turning point estimator.
The second method uses the exact distribution of the turning point
estimator of quadratic regression functions. The third method relies on
Markov chain Monte Carlo methods to provide a finite sample approximation
of the exact distribution of the turning point estimator. We argue that
the delta method may lead to misleading inference and that the other two
methods are more reliable. We compare the three methods using two data
sets from the environmental Kuznets curve literature, where the presence
and location of a turning point in the income-pollution relationship are
the focus of much empirical work.
Journal: Econometric Reviews
Pages: 503-528
Issue: 5
Volume: 26
Year: 2007
Keywords: Asymmetric confidence interval, Environmental Kuznets curve hypothesis, MCMC, Quantiles,
X-DOI: 10.1080/07474930701512105
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701512105
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:503-528
Template-Type: ReDIF-Article 1.0
Author-Name: Mingliang Li
Author-X-Name-First: Mingliang
Author-X-Name-Last: Li
Title: Bayesian Proportional Hazard Analysis of the Timing of High School Dropout Decisions
Abstract:
In this paper, I study the timing of high school dropout decisions using
data from High School and Beyond. I propose a Bayesian proportional hazard
analysis framework that takes into account the specification of piecewise
constant baseline hazard, the time-varying covariate of dropout
eligibility, and individual, school, and state level random effects in the
dropout hazard. I find that students who have reached their state
compulsory school attendance ages are more likely to drop out of high
school than those who have not.
Regarding the school quality effects, a student is more likely to drop out
of high school if the school she attends is associated with a higher
pupil-teacher ratio or lower district expenditure per pupil. An
interesting finding that accompanies the empirical results is that
failing to account for time-varying heterogeneity in the hazard, in this
application, results in upward biases in the duration dependence
estimates. Moreover, these upward biases are comparable in
magnitude to the well-known downward biases in the duration dependence
estimates when the modeling of the time-invariant heterogeneity in the
hazard is absent.
Journal: Econometric Reviews
Pages: 529-556
Issue: 5
Volume: 26
Year: 2007
Keywords: Bayesian analysis, High school dropout behavior, Proportional hazard analysis,
X-DOI: 10.1080/07474930701509416
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701509416
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:529-556
Template-Type: ReDIF-Article 1.0
Author-Name: Yasemin Ulu
Author-X-Name-First: Yasemin
Author-X-Name-Last: Ulu
Title: A Comparison of the Runs Test for Volatility Forecastability and the LM Test for GARCH Using Aggregated Returns
Abstract:
Christoffersen and Diebold (2000) have introduced a runs test for
forecastable volatility in aggregated returns. In this note, we compare
the size and power of their runs test and the more conventional LM test
for GARCH by Monte Carlo simulation. When the true daily process is GARCH,
EGARCH, or stochastic volatility, the LM test has better power than the
runs test for the moderate-horizon returns considered by Christoffersen
and Diebold. For long-horizon returns, however, the tests have very
similar power. We also consider a qualitative threshold GARCH model. For
this process, we find that the runs test has greater power than the LM
test. The results support the use of the runs test with aggregated returns.
Journal: Econometric Reviews
Pages: 557-566
Issue: 5
Volume: 26
Year: 2007
Keywords: Aggregated returns, Forecast horizon, GARCH, LM test, Monte Carlo simulation, Runs test, Volatility forecastability,
X-DOI: 10.1080/07474930701512147
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701512147
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:557-566
Template-Type: ReDIF-Article 1.0
Author-Name: Kuan Xu
Author-X-Name-First: Kuan
Author-X-Name-Last: Xu
Title: U-Statistics and Their Asymptotic Results for Some Inequality and Poverty Measures
Abstract:
U-statistics form a general class of statistics that have certain
important features in common. This class arises as a generalization of the
sample mean and the sample variance, and typically members of the class
are asymptotically normal with good consistency properties. The class
encompasses some widely used income inequality and poverty measures, in
particular the variance, the Gini index, the poverty rate, the average
poverty gap ratios, the Foster-Greer-Thorbecke index, and the Sen index
and its modified form. This paper illustrates how these measures come
together within the class of U-statistics, and thereby why U-statistics
are useful in econometrics.
Journal: Econometric Reviews
Pages: 567-577
Issue: 5
Volume: 26
Year: 2007
Keywords: Asymptotic theory, Inequality, Measures, Poverty, U-statistics,
X-DOI: 10.1080/07474930701512170
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701512170
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:567-577
Template-Type: ReDIF-Article 1.0
Author-Name: Camilo Dagum
Author-X-Name-First: Camilo
Author-X-Name-Last: Dagum
Author-Name: Giorgio Vittadini
Author-X-Name-First: Giorgio
Author-X-Name-Last: Vittadini
Author-Name: Pietro Giorgio Lovaglio
Author-X-Name-First: Pietro Giorgio
Author-X-Name-Last: Lovaglio
Title: Formative Indicators and Effects of a Causal Model for Household Human Capital with Application
Abstract:
Dagum and Slottje (2000) estimated household human capital (HC) as a
latent variable (LV) and proposed its monetary estimation by means of an
actuarial approach. This paper introduces an improved method for the
estimation of household HC as an LV by means of formative and reflective
indicators in agreement with the accepted economic definition of HC. The
monetary value of HC is used in a recursive causal model to obtain short-
and long-term multipliers that measure the direct and total effects of the
variables that determine household HC. The new method is applied to
estimate US household HC for the year 2004.
Journal: Econometric Reviews
Pages: 579-596
Issue: 5
Volume: 26
Year: 2007
Keywords: Formative and reflective indicators, Latent variable, Short-term and long-term multipliers, U.S. household human capital distribution,
X-DOI: 10.1080/07474930701512246
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701512246
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:579-596
Template-Type: ReDIF-Article 1.0
Author-Name: Massimo Guidolin
Author-X-Name-First: Massimo
Author-X-Name-Last: Guidolin
Title: A Review of: “Empirical Dynamic Asset Pricing”
Abstract:
Journal: Econometric Reviews
Pages: 597-604
Issue: 5
Volume: 26
Year: 2007
X-DOI: 10.1080/07474930701512410
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701512410
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:597-604
Template-Type: ReDIF-Article 1.0
Author-Name: Silvia Goncalves
Author-X-Name-First: Silvia
Author-X-Name-Last: Goncalves
Author-Name: Lutz Kilian
Author-X-Name-First: Lutz
Author-X-Name-Last: Kilian
Title: Asymptotic and Bootstrap Inference for AR(∞) Processes with Conditional Heteroskedasticity
Abstract:
The main contribution of this paper is a proof of the asymptotic validity
of the application of the bootstrap to AR(∞) processes with
unmodelled conditional heteroskedasticity. We first derive the asymptotic
properties of the least-squares estimator of the autoregressive sieve
parameters when the data are generated by a stationary linear process with
martingale difference errors that are possibly subject to conditional
heteroskedasticity of unknown form. These results are then used in
establishing that a suitably constructed bootstrap estimator will have the
same limit distribution as the least-squares estimator. Our results
provide theoretical justification for the use of either the conventional
asymptotic approximation based on robust standard errors or the bootstrap
approximation of the distribution of autoregressive parameters. A
simulation study suggests that the bootstrap approach tends to be more
accurate in small samples.
Journal: Econometric Reviews
Pages: 609-641
Issue: 6
Volume: 26
Year: 2007
Keywords: Autoregression, Bootstrap, GARCH,
X-DOI: 10.1080/07474930701624462
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701624462
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:6:p:609-641
Template-Type: ReDIF-Article 1.0
Author-Name: Zhijie Xiao
Author-X-Name-First: Zhijie
Author-X-Name-Last: Xiao
Author-Name: Luiz Renato Lima
Author-X-Name-First: Luiz Renato
Author-X-Name-Last: Lima
Title: Testing Covariance Stationarity
Abstract:
In this paper, we show that widely used stationarity tests such as
the Kwiatkowski, Phillips, Schmidt, and Shin (KPSS) test have power close
to size in the presence of time-varying unconditional variance. We propose
a new test as a complement of the existing tests. Monte Carlo experiments
show that the proposed test possesses the following characteristics: (i)
in the presence of a unit root or a structural change in the mean, the
proposed test is as powerful as the KPSS and other tests; (ii) in the
presence of a changing variance, the traditional tests perform poorly,
whereas the proposed test has high power compared to the existing tests;
(iii) the proposed test has the same size as traditional stationarity
tests under the null hypothesis of stationarity. An application to daily
observations of the return on the U.S. Dollar/Euro exchange rate reveals the
existence of instability in the unconditional variance when the entire
sample is considered, but stability is found in subsamples.
Journal: Econometric Reviews
Pages: 643-667
Issue: 6
Volume: 26
Year: 2007
Keywords: Asymptotic theory, KPSS, Stationarity testing, Time-varying variance,
X-DOI: 10.1080/07474930701639080
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701639080
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:6:p:643-667
Template-Type: ReDIF-Article 1.0
Author-Name: Walter Beckert
Author-X-Name-First: Walter
Author-X-Name-Last: Beckert
Title: Specification and Identification of Stochastic Demand Models
Abstract:
This paper is concerned with stochastic demand systems for continuous
choices that arise from structural random utility models. It examines
under which nonparametric conditions on the structural random utility
specification the implied reduced form model is nonsingular and
invertible. For parametric members within this class of random utility
models, the paper provides conditions for local identification from the
reduced form under moment assumptions.
Journal: Econometric Reviews
Pages: 669-683
Issue: 6
Volume: 26
Year: 2007
Keywords: Invertibility, Local identification, Random utility, Stochastic demand,
X-DOI: 10.1080/07474930701653719
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701653719
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:6:p:669-683
Template-Type: ReDIF-Article 1.0
Author-Name: Timothy Halliday
Author-X-Name-First: Timothy
Author-X-Name-Last: Halliday
Title: Testing for State Dependence with Time-Variant Transition Probabilities
Abstract:
We derive a simple result that allows us to test for the presence of
state dependence in a dynamic Logit model with time-variant transition
probabilities and an arbitrary distribution of the unobserved
heterogeneity. Monte Carlo evidence suggests that this test has desirable
properties even when there are some violations of the model's assumptions.
We also consider alternative tests that will have desirable properties
only when the transition probabilities do not depend on time and provide
evidence that there is an “acceptable” range in which
ignoring time-dependence does not matter too much. We conclude with an
application to the Barker Hypothesis.
Journal: Econometric Reviews
Pages: 685-703
Issue: 6
Volume: 26
Year: 2007
Keywords: Dynamic panel data models, Health, State dependence,
X-DOI: 10.1080/07474930701653768
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701653768
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:6:p:685-703
Template-Type: ReDIF-Article 1.0
Author-Name: Yoichi Arai
Author-X-Name-First: Yoichi
Author-X-Name-Last: Arai
Author-Name: Eiji Kurozumi
Author-X-Name-First: Eiji
Author-X-Name-Last: Kurozumi
Title: Testing for the Null Hypothesis of Cointegration with a Structural Break
Abstract:
In this paper we propose residual-based tests for the null hypothesis of
cointegration with a structural break against the alternative of no
cointegration. The Lagrange Multiplier (LM) test is proposed and its
limiting distribution is obtained for the case in which the timing of a
structural break is known. Then the test statistic is extended to deal
with a structural break of unknown timing. The test statistic, a plug-in
version of the test statistic for known timing, replaces the true break
point by the estimated one. We show the limiting properties of the test
statistic under the null as well as the alternative. Critical values are
calculated for the tests by simulation methods. Finite-sample simulations
show that the empirical size of the test is close to the nominal one
unless the regression error is very persistent and that the test rejects
the null when no cointegrating relationship with a structural break is
present. We provide empirical examples based on the present-value model,
the term structure model, and the money-output relationship model.
Journal: Econometric Reviews
Pages: 705-739
Issue: 6
Volume: 26
Year: 2007
Keywords: Cointegration, Integrated time series, No cointegration, Structural break,
X-DOI: 10.1080/07474930701653776
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701653776
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:6:p:705-739
Template-Type: ReDIF-Article 1.0
Author-Name: James Davidson
Author-X-Name-First: James
Author-X-Name-Last: Davidson
Title: A Review of: “Mathematical and Statistical Foundations”
Abstract:
Journal: Econometric Reviews
Pages: 605-607
Issue: 5
Volume: 26
Year: 2007
X-DOI: 10.1080/07474930701512436
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701512436
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:26:y:2007:i:5:p:605-607
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: Realized Volatility and Long Memory: An Overview
Abstract:
The challenge of modeling, estimating, testing, and forecasting financial
volatility is both intellectually worthwhile and also central to the
successful analysis of financial returns and optimal investment
strategies. In each of the three primary areas of volatility modeling,
namely, conditional (or generalized autoregressive conditional
heteroskedasticity) volatility, stochastic volatility and realized
volatility (RV), numerous univariate volatility models of individual
financial assets and multivariate volatility models of portfolios of
assets have been established. This special issue has eleven innovative
articles, eight of which are focused directly on RV and three on long
memory, while two are concerned with both RV and long memory.
Journal: Econometric Reviews
Pages: 1-9
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Forecasting, Integrated variance, Realized quarticity, Realized volatility, Returns, Risk, Securities,
X-DOI: 10.1080/07474930701853459
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701853459
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:1-9
Template-Type: ReDIF-Article 1.0
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Author-Name: Marcelo Medeiros
Author-X-Name-First: Marcelo
Author-X-Name-Last: Medeiros
Title: Realized Volatility: A Review
Abstract:
This article reviews the exciting and rapidly expanding literature on
realized volatility. After presenting a general univariate framework for
estimating realized volatilities, a simple discrete time model is
presented in order to motivate the main results. A continuous time
specification provides the theoretical foundation for the main results in
this literature. Cases with and without microstructure noise are
considered, and it is shown how microstructure noise can cause severe
problems in terms of consistent estimation of the daily realized
volatility. Independent and dependent noise processes are examined. The
most important methods for providing consistent estimators are presented,
and a critical exposition of different techniques is given. The finite
sample properties are discussed in comparison with their asymptotic
properties. A multivariate model is presented to discuss estimation of the
realized covariances. Various issues relating to modelling and forecasting
realized volatilities are considered. The main empirical findings using
univariate and multivariate methods are summarized.
Journal: Econometric Reviews
Pages: 10-45
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Continuous time processes, Finance, Financial econometrics, Forecasting, High frequency data, Quadratic variation, Realized volatility, Risk, Trading rules,
X-DOI: 10.1080/07474930701853509
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701853509
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:10-45
Template-Type: ReDIF-Article 1.0
Author-Name: Fulvio Corsi
Author-X-Name-First: Fulvio
Author-X-Name-Last: Corsi
Author-Name: Stefan Mittnik
Author-X-Name-First: Stefan
Author-X-Name-Last: Mittnik
Author-Name: Christian Pigorsch
Author-X-Name-First: Christian
Author-X-Name-Last: Pigorsch
Author-Name: Uta Pigorsch
Author-X-Name-First: Uta
Author-X-Name-Last: Pigorsch
Title: The Volatility of Realized Volatility
Abstract:
In recent years, with the availability of high-frequency financial market
data, modeling realized volatility has become a new and innovative research
direction. The construction of “observable” or realized
volatility series from intra-day transaction data and the use of standard
time-series techniques have led to promising strategies for modeling and
predicting (daily) volatility. In this article, we show that the residuals
of commonly used time-series models for realized volatility and
logarithmic realized variance exhibit non-Gaussianity and volatility
clustering. We propose extensions to explicitly account for these
properties and assess their relevance for modeling and forecasting
realized volatility. In an empirical application for S&P 500 index futures
we show that allowing for time-varying volatility of realized volatility
and logarithmic realized variance substantially improves the fit as well
as predictive performance. Furthermore, the distributional assumption for
residuals plays a crucial role in density forecasting.
Journal: Econometric Reviews
Pages: 46-78
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Density forecasting, Finance, HAR-GARCH, Normal inverse Gaussian distribution, Realized quarticity, Realized volatility,
X-DOI: 10.1080/07474930701853616
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701853616
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:46-78
Template-Type: ReDIF-Article 1.0
Author-Name: Peter Hansen
Author-X-Name-First: Peter
Author-X-Name-Last: Hansen
Author-Name: Jeremy Large
Author-X-Name-First: Jeremy
Author-X-Name-Last: Large
Author-Name: Asger Lunde
Author-X-Name-First: Asger
Author-X-Name-Last: Lunde
Title: Moving Average-Based Estimators of Integrated Variance
Abstract:
We examine moving average (MA) filters for estimating the integrated
variance (IV) of a financial asset price in a framework where
high-frequency price data are contaminated with market microstructure
noise. We show that the sum of squared MA residuals must be scaled to
enable a suitable estimator of IV. The scaled estimator is shown to be
consistent, first-order efficient, and asymptotically Gaussian distributed
about the integrated variance under restrictive assumptions. Under more
plausible assumptions, such as time-varying volatility, the MA model is
misspecified. This motivates an extensive simulation study of the merits
of the MA-based estimator under misspecification. Specifically, we
consider nonconstant volatility combined with rounding errors and various
forms of dependence between the noise and efficient returns. We benchmark
the scaled MA-based estimator to subsample and realized kernel estimators
and find that the MA-based estimator performs well despite the
misspecification.
Journal: Econometric Reviews
Pages: 79-111
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Bias correction, High-frequency data, Integrated variance, Moving average, Realized variance, Realized volatility,
X-DOI: 10.1080/07474930701853640
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701853640
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:79-111
Template-Type: ReDIF-Article 1.0
Author-Name: Toshiya Hoshikawa
Author-X-Name-First: Toshiya
Author-X-Name-Last: Hoshikawa
Author-Name: Keiji Nagai
Author-X-Name-First: Keiji
Author-X-Name-Last: Nagai
Author-Name: Taro Kanatani
Author-X-Name-First: Taro
Author-X-Name-Last: Kanatani
Author-Name: Yoshihiko Nishiyama
Author-X-Name-First: Yoshihiko
Author-X-Name-Last: Nishiyama
Title: Nonparametric Estimation Methods of Integrated Multivariate Volatilities
Abstract:
Estimation of integrated multivariate volatilities of an Ito process is
an interesting and important issue in finance, for example, in order to
evaluate portfolios. New non-parametric estimators have been recently
proposed by Malliavin and Mancino (2002) and Hayashi and Yoshida (2005a)
as alternative methods to classical realized quadratic covariation. The
purpose of this article is to compare these alternative estimators both
theoretically and empirically, when high frequency data is available. We
found that the Hayashi-Yoshida estimator performs the best among the
alternatives in view of the bias and the MSE. The other estimators are
shown to have potentially heavy bias, mostly toward the origin. We also
applied these estimators to Japanese Government Bond futures and obtained
results consistent with our simulations.
Journal: Econometric Reviews
Pages: 112-138
Issue: 1-3
Volume: 27
Year: 2008
Keywords: High frequency data, Integrated volatility, Nonparametric estimators, Weighted realized volatility,
X-DOI: 10.1080/07474930701853855
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701853855
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:112-138
Template-Type: ReDIF-Article 1.0
Author-Name: Silvia Goncalves
Author-X-Name-First: Silvia
Author-X-Name-Last: Goncalves
Author-Name: Nour Meddahi
Author-X-Name-First: Nour
Author-X-Name-Last: Meddahi
Title: Edgeworth Corrections for Realized Volatility
Abstract:
The quality of the asymptotic normality of realized volatility can be
poor if sampling does not occur at very high frequencies. In this article
we consider an alternative approximation to the finite sample distribution
of realized volatility based on Edgeworth expansions. In particular, we
show how confidence intervals for integrated volatility can be constructed
using these Edgeworth expansions. The Monte Carlo study we conduct shows
that the intervals based on the Edgeworth corrections have improved
properties relative to the conventional intervals based on the normal
approximation. In contrast to the bootstrap, the Edgeworth approach is an
analytical approach that is easily implemented, without requiring any
resampling of one's data. A comparison between the bootstrap and the
Edgeworth expansion shows that the bootstrap outperforms the
Edgeworth-corrected intervals. Thus, if we are willing to incur the
additional computational cost of computing bootstrap intervals, these are
preferred over the Edgeworth intervals. Nevertheless, if we are not
willing to incur this additional cost, our results suggest that
Edgeworth-corrected intervals should replace the conventional intervals
based on the first-order normal approximation.
Journal: Econometric Reviews
Pages: 139-162
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Confidence intervals, Edgeworth expansions, Realized volatility,
X-DOI: 10.1080/07474930701870420
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701870420
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:139-162
Template-Type: ReDIF-Article 1.0
Author-Name: Federico Bandi
Author-X-Name-First: Federico
Author-X-Name-Last: Bandi
Author-Name: Jeffrey Russell
Author-X-Name-First: Jeffrey
Author-X-Name-Last: Russell
Author-Name: Yinghua Zhu
Author-X-Name-First: Yinghua
Author-X-Name-Last: Zhu
Title: Using High-Frequency Data in Dynamic Portfolio Choice
Abstract:
This article evaluates the economic benefit of methods that have been
suggested to optimally sample (in an MSE sense) high-frequency return data
for the purpose of realized variance/covariance estimation in the presence
of market microstructure noise (Bandi and Russell, 2005a, 2008). We
compare certainty equivalents derived from volatility-timing trading
strategies relying on optimally-sampled realized variances and
covariances, on realized variances and covariances obtained by sampling
every 5 minutes, and on realized variances and covariances obtained by
sampling every 15 minutes. In our sample, we show that a risk-averse
investor who is given the option of choosing variance/covariance forecasts
derived from MSE-based optimal sampling methods versus forecasts obtained
from 5- and 15-minute intervals (as generally proposed in the literature)
would be willing to pay up to about 80 basis points per year to achieve
the level of utility that is guaranteed by optimal sampling. We find that
the gains yielded by optimal sampling are economically large,
statistically significant, and robust to realistic transaction costs.
Journal: Econometric Reviews
Pages: 163-198
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Dynamic portfolio choice, Market microstructure noise, Realized covariance, Realized variance,
X-DOI: 10.1080/07474930701870461
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701870461
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:163-198
Template-Type: ReDIF-Article 1.0
Author-Name: Michiel de Pooter
Author-X-Name-First: Michiel
Author-X-Name-Last: de Pooter
Author-Name: Martin Martens
Author-X-Name-First: Martin
Author-X-Name-Last: Martens
Author-Name: Dick van Dijk
Author-X-Name-First: Dick
Author-X-Name-Last: van Dijk
Title: Predicting the Daily Covariance Matrix for S&P 100 Stocks Using Intraday Data—But Which Frequency to Use?
Abstract:
This article investigates the merits of high-frequency intraday data when
forming mean-variance efficient stock portfolios with daily rebalancing
from the individual constituents of the S&P 100 index. We focus on the
issue of determining the optimal sampling frequency as judged by the
performance of these portfolios. The optimal sampling frequency ranges
between 30 and 65 minutes, considerably lower than the popular five-minute
frequency, which typically is motivated by the aim of striking a balance
between the variance and bias in covariance matrix estimates due to market
microstructure effects such as non-synchronous trading and bid-ask bounce.
Bias-correction procedures, based on combining low-frequency and
high-frequency covariance matrix estimates and on the addition of leads
and lags, do not substantially affect the optimal sampling frequency or the
portfolio performance. Our findings are also robust to the presence of
transaction costs and to the portfolio rebalancing frequency.
Journal: Econometric Reviews
Pages: 199-229
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Bias-correction, High-frequency data, Mean-variance analysis, Realized volatility, Tracking error, Volatility timing,
X-DOI: 10.1080/07474930701873333
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701873333
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:199-229
Template-Type: ReDIF-Article 1.0
Author-Name: Jim Griffin
Author-X-Name-First: Jim
Author-X-Name-Last: Griffin
Author-Name: Roel Oomen
Author-X-Name-First: Roel
Author-X-Name-Last: Oomen
Title: Sampling Returns for Realized Variance Calculations: Tick Time or Transaction Time?
Abstract:
This article introduces a new model for transaction prices in the
presence of market microstructure noise in order to study the properties
of the price process on two different time scales, namely, transaction
time where prices are sampled with every transaction and tick time where
prices are sampled with every price change. Both sampling schemes have
been used in the literature on realized variance, but a formal
investigation into their properties has been lacking. Our empirical and
theoretical results indicate that the return dynamics in transaction time
are very different from those in tick time and the choice of sampling
scheme can therefore have an important impact on the properties of
realized variance. For RV we find that tick time sampling is superior to
transaction time sampling in terms of mean-squared-error, especially when
the level of noise, the number of ticks, or the arrival frequency of efficient
price moves is low. Importantly, we show that while the microstructure
noise may appear close to IID in transaction time, in tick time it is
highly dependent. As a result, bias correction procedures that rely on the
noise being independent can fail in tick time and are better implemented
in transaction time.
Journal: Econometric Reviews
Pages: 230-253
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Market microstructure noise, Optimal sampling, Pure jump process, Realized variance, Tick time, Transaction time,
X-DOI: 10.1080/07474930701873341
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701873341
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:230-253
Template-Type: ReDIF-Article 1.0
Author-Name: Offer Lieberman
Author-X-Name-First: Offer
Author-X-Name-Last: Lieberman
Author-Name: Peter Phillips
Author-X-Name-First: Peter
Author-X-Name-Last: Phillips
Title: Refined Inference on Long Memory in Realized Volatility
Abstract:
There is an emerging consensus in empirical finance that realized
volatility series typically display long range dependence with a memory
parameter (d) around 0.4 (Andersen et al., 2001; Martens et al., 2004).
The present article provides some illustrative analysis of how long memory
may arise from the accumulative process underlying realized volatility.
The article also uses results in Lieberman and Phillips (2004, 2005) to
refine statistical inference about d by higher order theory. Standard
asymptotic theory has an O(n^(-1/2)) error rate for error rejection
probabilities, and the theory used here refines the approximation to an
error rate of o(n^(-1/2)). The new formula is independent of unknown
parameters, simple to calculate, and user-friendly. The method is
applied to test whether the reported long memory parameter estimates of
Andersen et al. (2001) and Martens et al. (2004) differ significantly from
the lower boundary (d = 0.5) of nonstationary long memory, and
generally confirms earlier findings.
Journal: Econometric Reviews
Pages: 254-267
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Edgeworth expansion, Long memory, Realized volatility,
X-DOI: 10.1080/07474930701873374
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701873374
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:254-267
Template-Type: ReDIF-Article 1.0
Author-Name: Afonso Goncalves da Silva
Author-X-Name-First: Afonso Goncalves
Author-X-Name-Last: da Silva
Author-Name: Peter Robinson
Author-X-Name-First: Peter
Author-X-Name-Last: Robinson
Title: Finite Sample Performance in Cointegration Analysis of Nonlinear Time Series with Long Memory
Abstract:
Nonlinear functions of multivariate financial time series can exhibit
long memory and fractional cointegration. However, tools for analysing
these phenomena have principally been justified under assumptions that are
invalid in this setting. Determination of asymptotic theory under more
plausible assumptions can be complicated and lengthy. We discuss these
issues and present a Monte Carlo study, showing that asymptotic theory
should not necessarily be expected to provide a good approximation to
finite-sample behavior.
Journal: Econometric Reviews
Pages: 268-297
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Fractional cointegration, Memory estimation, Stochastic volatility,
X-DOI: 10.1080/07474930701873382
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701873382
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:268-297
Template-Type: ReDIF-Article 1.0
Author-Name: Leonardo Rocha Souza
Author-X-Name-First: Leonardo Rocha
Author-X-Name-Last: Souza
Title: Why Aggregate Long Memory Time Series?
Abstract:
This article shows that, for large samples, temporally aggregating a true
long memory time series (in order to get an improved estimator) may make
little or no sense, as the practitioner can get virtually the same
estimates as those from the aggregated series by choosing the appropriate
bandwidths on the original one, provided some fairly general conditions
apply. Moreover, the practitioner has a wider choice of bandwidths than she
has of aggregating levels. However, these results apply only to two
specific and commonly used estimators, and do not apply to the aggregation
procedure undertaken to compute the realized volatility. Also, aggregating
a time series in order to test true versus spurious long memory (as in
Ohanissian et al., 2008) is a relevant issue, particularly regarding
stochastic and/or realized volatility, as many nonlinear processes display
spurious long memory, to which the above result does not apply.
Journal: Econometric Reviews
Pages: 298-316
Issue: 1-3
Volume: 27
Year: 2008
Keywords: Bandwidth, Long memory, Spectrum, Temporal aggregation,
X-DOI: 10.1080/07474930701873408
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930701873408
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:1-3:p:298-316
Template-Type: ReDIF-Article 1.0
Author-Name: Amos Golan
Author-X-Name-First: Amos
Author-X-Name-Last: Golan
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: Information Theoretic and Entropy Methods: An Overview
Abstract:
Journal: Econometric Reviews
Pages: 317-328
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Entropy, Information, Information and entropy econometrics, Information theory, Information-theoretic estimators,
X-DOI: 10.1080/07474930801959685
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801959685
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:317-328
Template-Type: ReDIF-Article 1.0
Author-Name: Steve Pincus
Author-X-Name-First: Steve
Author-X-Name-Last: Pincus
Title: Approximate Entropy as an Irregularity Measure for Financial Data
Abstract:
The need to assess subtle, potentially exploitable changes in serial
structure is paramount in the analysis of financial and econometric data.
We demonstrate the utility of approximate entropy (ApEn), a
model-independent measure of sequential irregularity, toward this goal
via several distinct applications, both to empirical data and to models. We
also consider cross-ApEn, a related two-variable measure of asynchrony
that provides a more robust and ubiquitous measure of bivariate
correspondence than does correlation, and the resultant implications for
diversification strategies. We provide analytic expressions for and
statistical properties of ApEn, and compare ApEn to nonlinear (complexity)
measures, correlation and spectral analyses, and other entropy measures.
Journal: Econometric Reviews
Pages: 329-362
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Approximate entropy, Asynchrony, Complexity, Irregularity,
X-DOI: 10.1080/07474930801959750
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801959750
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:329-362
Template-Type: ReDIF-Article 1.0
Author-Name: Andreas Koutris
Author-X-Name-First: Andreas
Author-X-Name-Last: Koutris
Author-Name: Maria Heracleous
Author-X-Name-First: Maria
Author-X-Name-Last: Heracleous
Author-Name: Aris Spanos
Author-X-Name-First: Aris
Author-X-Name-Last: Spanos
Title: Testing for Nonstationarity Using Maximum Entropy Resampling: A Misspecification Testing Perspective
Abstract:
One of the most important assumptions in empirical modeling is the
constancy of the statistical model parameters which usually reflects the
stationarity of the underlying stochastic process. In the 1980s and 1990s,
the issue of nonstationarity in economic time series was discussed in
the context of unit roots vs. mean trends in AR(p) models. This
perspective was subsequently extended to include structural breaks. In
this article we take a much broader perspective by allowing for more
general forms of nonstationarity. The focus of the article is primarily on
misspecification testing. The proposed test relies on Maximum Entropy (ME)
resampling techniques to enhance the information in the data in an attempt
to capture heterogeneity “locally” using rolling window
estimators. The t-heterogeneity of the primary moments of the process is
generically captured using orthogonal Bernstein polynomials. The
effectiveness of the testing procedure is assessed using Monte Carlo
simulations.
Journal: Econometric Reviews
Pages: 363-384
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Bernstein polynomials, t-Heterogeneity, Maximum Entropy bootstrap, Nonstationarity, Parameter t-invariance, Rolling window estimates,
X-DOI: 10.1080/07474930801959776
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801959776
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:363-384
Template-Type: ReDIF-Article 1.0
Author-Name: Jan Jacobs
Author-X-Name-First: Jan
Author-X-Name-Last: Jacobs
Author-Name: Pieter Otter
Author-X-Name-First: Pieter
Author-X-Name-Last: Otter
Title: Determining the Number of Factors and Lag Order in Dynamic Factor Models: A Minimum Entropy Approach
Abstract:
This article proposes a solution to one of the issues in the rapidly
growing literature on dynamic factor models, i.e., how to determine the
optimal number of factors. Our formal test, based upon the canonical
correlation procedure related to concepts from information theory,
produces estimates of the number of factors and the lag order
simultaneously. Simulation experiments illustrate the potential of our
approach.
Journal: Econometric Reviews
Pages: 385-397
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Canonical correlation, Factor analysis, Model selection,
X-DOI: 10.1080/07474930801960196
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960196
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:385-397
Template-Type: ReDIF-Article 1.0
Author-Name: Alastair Hall
Author-X-Name-First: Alastair
Author-X-Name-Last: Hall
Author-Name: Atsushi Inoue
Author-X-Name-First: Atsushi
Author-X-Name-Last: Inoue
Author-Name: Changmock Shin
Author-X-Name-First: Changmock
Author-X-Name-Last: Shin
Title: Entropy-Based Moment Selection in the Presence of Weak Identification
Abstract:
Hall et al. (2007) propose a method for moment selection based on an
information criterion that is a function of the entropy of the limiting
distribution of the Generalized Method of Moments (GMM) estimator. They
establish the consistency of the method subject to certain conditions that
include the identification of the parameter vector by at least one of the
moment conditions being considered. In this article, we examine the
limiting behavior of this moment selection method when the parameter
vector is weakly identified by all the moment conditions being considered.
It is shown that the selected moment condition is random and hence not
consistent in any meaningful sense. As a result, we propose a two-step
procedure for moment selection in which identification is first tested
using a statistic proposed by Stock and Yogo (2003) and then only if this
statistic indicates identification does the researcher proceed to the
second step in which the aforementioned information criterion is used to
select moments. The properties of this two-step procedure are contrasted
with those of strategies based on either using all available moments or
using the information criterion without the identification pre-test. The
performances of these strategies are compared via an evaluation of the
finite sample behavior of various methods for inference about the
parameter vector. The inference methods considered are based on the Wald
statistic, Anderson and Rubin's (1949) statistic, Kleibergen's (2002) K
statistic, and combinations thereof in which the choice is based on the
outcome of the test for weak identification.
Journal: Econometric Reviews
Pages: 398-427
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Generalized method of moments, Inference, Moment selection, Weak identification,
X-DOI: 10.1080/07474930801960261
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960261
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:398-427
Template-Type: ReDIF-Article 1.0
Author-Name: Thomas Mazzuchi
Author-X-Name-First: Thomas
Author-X-Name-Last: Mazzuchi
Author-Name: Ehsan Soofi
Author-X-Name-First: Ehsan
Author-X-Name-Last: Soofi
Author-Name: Refik Soyer
Author-X-Name-First: Refik
Author-X-Name-Last: Soyer
Title: Bayes Estimate and Inference for Entropy and Information Index of Fit
Abstract:
This article defines a quantized entropy and develops Bayes estimates and
inference for the entropy and a Kullback-Leibler information index of the
model fit. We use a Dirichlet process prior for the unknown
data-generating distribution with a maximum entropy candidate model as the
expected distribution. This formulation produces prior and posterior
distributions for the quantized entropy, the information index of fit, the
moments, and the model parameters. The posterior mean of the quantized
entropy provides a Bayes estimate of entropy under quadratic loss. The
consistency of the Bayes estimates and of the information index is shown.
The implementation and the performances of the Bayes estimates are
illustrated using data simulated from exponential, gamma, and lognormal
distributions.
Journal: Econometric Reviews
Pages: 428-456
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Dirichlet process, Kullback-Leibler, Model selection, Nonparametric Bayes,
X-DOI: 10.1080/07474930801960311
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960311
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:428-456
Template-Type: ReDIF-Article 1.0
Author-Name: M. Ryan Haley
Author-X-Name-First: M. Ryan
Author-X-Name-Last: Haley
Author-Name: Charles Whiteman
Author-X-Name-First: Charles
Author-X-Name-Last: Whiteman
Title: Generalized Safety First and a New Twist on Portfolio Performance
Abstract:
We propose a generalization of Roy's (1952) Safety First (SF) principle
and relate it to the IID versions of Stutzer's (2000, 2003)
Portfolio Performance Index and underperformance probability Decay-Rate
Maximization criteria. Like the original SF, the Generalized Safety First
(GSF) rule seeks to minimize an upper bound on the probability of ruin (or
shortfall, more generally) in a single drawing from a return distribution.
We show that this upper bound coincides with what Stutzer showed will
maximize the rate at which the probability of shortfall in the long-run
average return shrinks to zero in repeated drawings from the return
distribution. Our setup is simple enough that we can illustrate via direct
calculation a deep result from Large Deviations theory: in the IID case
the GSF probability bound and the decay rate correspond to the
Kullback-Leibler (KL) divergence between the one-shot portfolio
distribution and the “closest” mean-shortfall distribution.
This enables us to produce examples in which minimizing the upper bound on
the underperformance probability does not lead to the same decision as
minimizing the underperformance probability itself, and thus that the
decay-rate maximizing strategy may require the investor to take positions
that do not minimize the probability of shortfall in each successive
period. It also makes clear that the relationship between the marginal
distribution of the one-period portfolio return and the mean-shortfall
distribution is the same as that between the source density and the target
density in importance sampling. Thus Geweke's (1989) measure of Relative
Numerical Efficiency can be used to assess the quality of the
divergence measure. Our interpretation of the decay-rate-maximizing
criterion in terms of a one-shot problem enables us to use the tools of
importance sampling to develop a “performance index”
(standard error) for the Portfolio Performance Index (PPI). It turns out
that in a simple stock portfolio example, portfolios within one
(divergence) standard error of one another can have very different weights
on individual securities.
Journal: Econometric Reviews
Pages: 457-483
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Entropy, Importance sampling, Kullback-Leibler divergence, Portfolio choice, Portfolio performance, Safety first, Shortfall,
X-DOI: 10.1080/07474930801960360
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960360
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:457-483
Template-Type: ReDIF-Article 1.0
Author-Name: Anil Bera
Author-X-Name-First: Anil
Author-X-Name-Last: Bera
Author-Name: Sung Park
Author-X-Name-First: Sung
Author-X-Name-Last: Park
Title: Optimal Portfolio Diversification Using the Maximum Entropy Principle
Abstract:
Markowitz's mean-variance (MV) efficient portfolio selection is one of
the most widely used approaches to the portfolio diversification
problem. However, contrary to the notion of diversification, the MV
approach often leads to portfolios highly concentrated on a few assets.
It also tends to perform poorly out of sample. Entropy is a well-known
measure of diversity and also has a shrinkage interpretation. In this
article, we propose to use a cross-entropy measure as the objective
function with side conditions coming from the mean and variance-covariance
matrix of the resampled asset returns. This automatically captures the
degree of imprecision of input estimates. Our approach can be viewed as a
shrinkage estimation of portfolio weights (probabilities) which are shrunk
towards a predetermined portfolio, for example, the equally weighted
portfolio or the minimum variance portfolio. Our procedure is illustrated
with an application to international equity indexes.
Journal: Econometric Reviews
Pages: 484-512
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Diversification, Entropy measure, Portfolio selection, Shrinkage rule, Simulation methods,
X-DOI: 10.1080/07474930801960394
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960394
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:484-512
Template-Type: ReDIF-Article 1.0
Author-Name: Marian Grendar
Author-X-Name-First: Marian
Author-X-Name-Last: Grendar
Author-Name: George Judge
Author-X-Name-First: George
Author-X-Name-Last: Judge
Title: Large-Deviations Theory and Empirical Estimator Choice
Abstract:
In this article, we consider the problem of criterion choice in
information recovery and inference in a large-deviations (LD) context.
Kitamura and Stutzer recognize that the Maximum Entropy Empirical
Likelihood estimator can be given a LD justification (Kitamura and
Stutzer, 2002). We demonstrate that there exists a similar LD justification for
Owen's Empirical Likelihood estimator (Owen, 2001). We tie the two
empirical estimators and related LD theorems to two basic ill-posed
inverse problems α and β. We note that other estimators in
this family lack an LD footing and provide an extensive discussion of the
implications of these results. The appendix contains formal statements
regarding relevant LD theorems.
Journal: Econometric Reviews
Pages: 513-525
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Boltzmann Jaynes inverse problem, Criterion choice problem, Empirical likelihood, Entropy, Information theory, Large deviations, Probabilistic laws,
X-DOI: 10.1080/07474930801960402
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960402
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:513-525
Template-Type: ReDIF-Article 1.0
Author-Name: Patrik Guggenberger
Author-X-Name-First: Patrik
Author-X-Name-Last: Guggenberger
Title: Finite Sample Evidence Suggesting a Heavy Tail Problem of the Generalized Empirical Likelihood Estimator
Abstract:
Comprehensive Monte Carlo evidence is provided that compares the finite
sample properties of generalized empirical likelihood (GEL) estimators to
the ones of k-class estimators in the linear instrumental variables (IV)
model. We focus on sample median, mean, mean squared error, and on the
coverage probability and length of confidence intervals obtained from
inverting a t-statistic based on the various estimators. The results
indicate that in terms of the above criteria, all the GEL estimators and
the limited information maximum likelihood (LIML) estimator behave very
similarly. This suggests that GEL estimators might also share the
“no-moment” problem of LIML. At sample sizes as in our Monte
Carlo study, there is no systematic bias advantage of GEL estimators over
k-class estimators. On the other hand, the standard deviation of GEL
estimators is pronouncedly higher than for some of the k-class estimators.
Therefore, if mean squared error is used as the underlying loss function,
our study suggests the use of computationally simple estimators, such as
two-stage least squares, in the linear IV model rather than GEL. Based on
the properties of confidence intervals, we cannot recommend the use of GEL
estimators in the linear IV model either.
Journal: Econometric Reviews
Pages: 526-541
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Generalized empirical likelihood estimator, Generalized method of moments, Monte Carlo simulation, No-moment problem,
X-DOI: 10.1080/07474930801960410
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960410
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:526-541
Template-Type: ReDIF-Article 1.0
Author-Name: Carlos Martins-Filho
Author-X-Name-First: Carlos
Author-X-Name-Last: Martins-Filho
Author-Name: Santosh Mishra
Author-X-Name-First: Santosh
Author-X-Name-Last: Mishra
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Title: A Class of Improved Parametrically Guided Nonparametric Regression Estimators
Abstract:
In this article we define a class of estimators for a nonparametric
regression model with the aim of reducing bias. The estimators in the
class are obtained via a simple two-stage procedure. In the first stage, a
potentially misspecified parametric model is estimated and in the second
stage the parametric estimate is used to guide the derivation of a final
semiparametric estimator. Mathematically, the proposed estimators can be
thought of as minimizing a suitably defined Cressie-Read discrepancy
that can be shown to produce conventional nonparametric estimators, such
as the local polynomial estimator, as well as existing two-stage
multiplicative estimators, such as that proposed by Glad (1998). We show
that under fairly mild conditions the estimators in the proposed class are
asymptotically normal, and we explore their finite sample (simulation)
behavior.
Journal: Econometric Reviews
Pages: 542-573
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Asymptotic normality, Combined semiparametric estimation,
X-DOI: 10.1080/07474930801960444
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960444
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:542-573
Template-Type: ReDIF-Article 1.0
Author-Name: Avinash Singh Bhati
Author-X-Name-First: Avinash Singh
Author-X-Name-Last: Bhati
Title: A Generalized Cross-Entropy Approach for Modeling Spatially Correlated Counts
Abstract:
This article discusses and applies an information-theoretic framework for
incorporating knowledge of the spatial structure in a sample while
extracting from it information about processes resulting in count
outcomes. The framework, an application of the Generalized Cross-Entropy
(GCE) method of estimating count outcome models, allows researchers to
incorporate such real-world features as unobserved
heterogeneity—with or without spatial clustering—when
modeling spatially correlated counts. The information-recovering potential
of the approach is investigated using a limited set of simulations. It is
then used to study the determinants of counts of homicides recorded in 343
neighborhoods in Chicago, Illinois.
Journal: Econometric Reviews
Pages: 574-595
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Count outcomes, Generalized Cross-Entropy estimation, Homicide rate, Spatial processes, Unobserved heterogeneity,
X-DOI: 10.1080/07474930801960451
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960451
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:574-595
Template-Type: ReDIF-Article 1.0
Author-Name: R. Bernardini Papalia
Author-X-Name-First: R. Bernardini
Author-X-Name-Last: Papalia
Title: A Composite Generalized Cross-Entropy Formulation in Small Samples Estimation
Abstract:
This article introduces a maximum entropy-based estimation methodology
that can be used both to represent the uncertainty of a partial-incomplete
economic data generation process and to consider the direct influence of
learning from repeated samples. Then, a composite cross-entropy estimator,
incorporating information from a subpopulation based on a small sample and
from a population with a larger sample size, is derived. The proposed
estimator is employed to estimate the local area expenditure shares of a
subpopulation of Italian households using a system of censored demand
equations.
Journal: Econometric Reviews
Pages: 596-609
Issue: 4-6
Volume: 27
Year: 2008
Keywords: Generalized cross-entropy, Microeconometric models, Repeated samples, Small sample estimation,
X-DOI: 10.1080/07474930801960469
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930801960469
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:27:y:2008:i:4-6:p:596-609
Template-Type: ReDIF-Article 1.0
Author-Name: Estela Bee Dagum
Author-X-Name-First: Estela Bee
Author-X-Name-Last: Dagum
Author-Name: Silvano Bordignon
Author-X-Name-First: Silvano
Author-X-Name-Last: Bordignon
Title: Editorial: Special Issue on Statistical Inference on Time Series Stochastic and Deterministic Dynamics
Abstract:
Journal: Econometric Reviews
Pages: 1-3
Issue: 1-3
Volume: 28
Year: 2009
X-DOI: 10.1080/07474930802387720
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387720
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:1-3
Template-Type: ReDIF-Article 1.0
Author-Name: Richard Ashley
Author-X-Name-First: Richard
Author-X-Name-Last: Ashley
Author-Name: Randal Verbrugge
Author-X-Name-First: Randal
Author-X-Name-Last: Verbrugge
Title: Frequency Dependence in Regression Model Coefficients: An Alternative Approach for Modeling Nonlinear Dynamic Relationships in Time Series
Abstract:
This article proposes a new class of nonlinear time series models in
which one of the coefficients of an existing regression model is frequency
dependent—that is, the relationship between the dependent variable
and this explanatory variable varies across its frequency components. We
show that such frequency dependence implies that the relationship between
the dependent variable and this explanatory variable is nonlinear. Past
efforts to detect frequency dependence have not been satisfactory; for
example, we note that the two-sided bandpass filtering used in such
efforts yields inconsistent estimates of frequency dependence where there
is feedback in the relationship. Consequently, we provide an explicit
procedure for partitioning an explanatory variable into frequency
components using one-sided bandpass filters. This procedure allows us to
test for and quantify frequency dependence even where feedback may be
present. A distinguishing feature of these new models is their potentially
tight connection to macroeconomic theory; indeed, they are perhaps best
introduced by reference to the frequency dependence in the marginal
propensity to consume posited by the Permanent Income Hypothesis (PIH) of
consumption theory. An illustrative empirical application is given, in
which the Phillips Curve relationship between inflation and unemployment
is found to be negligible at low frequencies, corresponding to
periods ≥ 18 months, but inverse at higher frequencies,
just as predicted by Friedman and Phelps in the 1960s.
Journal: Econometric Reviews
Pages: 4-20
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Frequency dependence, Nonlinear dependence, Nonlinear modelling, Phillips Curve, Spectral regression,
X-DOI: 10.1080/07474930802387753
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387753
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:4-20
Template-Type: ReDIF-Article 1.0
Author-Name: Anthony Atkinson
Author-X-Name-First: Anthony
Author-X-Name-Last: Atkinson
Title: Econometric Applications of the Forward Search in Regression: Robustness, Diagnostics, and Graphics
Abstract:
The article illustrates the use of the forward search to provide robust
analyses of econometric data. The emphasis is on informative plots that
reveal the inferential importance of each observation. The division of
observations into “good” and “bad” leverage
points is shown to be potentially misleading.
Journal: Econometric Reviews
Pages: 21-39
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Bad leverage point, Fan plot, Leverage, LTS, Outliers, Residuals, Very robust methods,
X-DOI: 10.1080/07474930802387803
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387803
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:21-39
Template-Type: ReDIF-Article 1.0
Author-Name: Estela Bee Dagum
Author-X-Name-First: Estela Bee
Author-X-Name-Last: Dagum
Author-Name: Alessandra Luati
Author-X-Name-First: Alessandra
Author-X-Name-Last: Luati
Title: A Cascade Linear Filter to Reduce Revisions and False Turning Points for Real Time Trend-Cycle Estimation
Abstract:
The problem of identifying the direction of the short-term trend
(nonstationary mean) of seasonally adjusted series contaminated by high
levels of variability has attracted considerable interest in recent years. In
fact, major financial and economic changes of global character have
introduced a large amount of noise in time series data, particularly, in
socioeconomic indicators used for real time economic analysis. The aim of
this study is to construct a cascade linear filter via the convolution of
several noise suppression, trend estimation, and extrapolation linear
filters. The cascading approach approximates the steps followed by the
nonlinear Dagum (1996) trend-cycle estimator, a modified version of the
13-term Henderson filter (H13). The Dagum estimator consists of first extending the
seasonally adjusted series with ARIMA extrapolations, and then applying a
very strict replacement of extreme values. The nonlinear Dagum filter has
been shown to improve significantly the size of revisions and number of
false turning points with respect to H13. We construct a linear
approximation of the nonlinear filter because it offers several
advantages. For one, its application is direct and hence does not require
knowledge of ARIMA model identification. Furthermore, linear
filtering preserves the crucial additive constraint by which the trend of
an aggregated variable should be equal to the algebraic addition of its
component trends, thus avoiding the selection problem of direct versus
indirect adjustments. Finally, the properties of a linear filter
concerning signal passing and noise suppression can always be compared to
those of other linear filters by means of spectral analysis.
Journal: Econometric Reviews
Pages: 40-59
Issue: 1-3
Volume: 28
Year: 2009
Keywords: False turning points, Gain function, Smoothing, Symmetric linear filter, 13-Term Henderson filter,
X-DOI: 10.1080/07474930802387837
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387837
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:40-59
Template-Type: ReDIF-Article 1.0
Author-Name: Silvano Bordignon
Author-X-Name-First: Silvano
Author-X-Name-Last: Bordignon
Author-Name: Massimiliano Caporin
Author-X-Name-First: Massimiliano
Author-X-Name-Last: Caporin
Author-Name: Francesco Lisi
Author-X-Name-First: Francesco
Author-X-Name-Last: Lisi
Title: Periodic Long-Memory GARCH Models
Abstract:
A distinguishing feature of the intraday time-varying volatility of
financial time series is given by the presence of long-range dependence of
periodic type, due mainly to time-of-the-day phenomena. In this work, we
introduce a model able to describe the empirical evidence given by this
periodic long-memory behaviour. The model, named PLM-GARCH (Periodic
Long-Memory GARCH), represents a natural extension of the FIGARCH model
proposed for modelling long-range persistence of volatility. Periodic long
memory versions of EGARCH (PLM-EGARCH) and of Log-GARCH (PLM-LGARCH)
models are also examined. Some properties and characteristics of the
models are given and finite sample performance of quasi-maximum likelihood
estimation is studied with Monte Carlo simulations. Further possible
extensions of the model to take into account multiple sources of periodic
long-memory behaviour are proposed. Two empirical applications on
intra-day financial time series are also provided.
Journal: Econometric Reviews
Pages: 60-82
Issue: 1-3
Volume: 28
Year: 2009
Keywords: GARCH models, Intra-day volatility, Long-memory, Periodicity,
X-DOI: 10.1080/07474930802387860
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387860
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:60-82
Template-Type: ReDIF-Article 1.0
Author-Name: Yongjae Kwon
Author-X-Name-First: Yongjae
Author-X-Name-Last: Kwon
Author-Name: Hamparsum Bozdogan
Author-X-Name-First: Hamparsum
Author-X-Name-Last: Bozdogan
Author-Name: Halima Bensmail
Author-X-Name-First: Halima
Author-X-Name-Last: Bensmail
Title: Performance of Model Selection Criteria in Bayesian Threshold VAR (TVAR) Models
Abstract:
This article presents new Bayesian modeling and information-theoretic
model selection criteria for threshold vector autoregressive (TVAR)
models. The analytical framework of Bayesian modeling for threshold VAR
models is developed. Markov Chain Monte Carlo (MCMC) simulation and
importance/rejection sampling methods are used to estimate the parameters
of the model and to obtain posterior samples. We propose reliable modeling
procedures using the Bayes factor and information-theoretic model
selection criteria such as Akaike's (1973) Information Criterion (AIC),
Schwarz's (1978) Bayesian Criterion (SBC), the Information Complexity (ICOMP)
Criterion of Bozdogan (1990, 1994, 2000), the Extended Consistent AIC with
Fisher Information (CAICFE), and the new Bayesian Model Selection (BMS)
Criterion of Bozdogan and Ueno (2000). We study the performance of these
criteria under different designs of the simulation protocol with varying
sample sizes in TVAR models. Our results show that these criteria perform
well in small as well as large samples while avoiding the heavy
computational burden of conventional procedures.
Journal: Econometric Reviews
Pages: 83-101
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Bayesian modeling, Information-theoretic model selection criteria and model selection, Threshold autoregressive models,
X-DOI: 10.1080/07474930802387894
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387894
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:83-101
Template-Type: ReDIF-Article 1.0
Author-Name: Giovanni Luca
Author-X-Name-First: Giovanni
Author-X-Name-Last: Luca
Author-Name: Giampiero Gallo
Author-X-Name-First: Giampiero
Author-X-Name-Last: Gallo
Title: Time-Varying Mixing Weights in Mixture Autoregressive Conditional Duration Models
Abstract:
Financial market price formation and exchange activity can be
investigated by means of ultra-high frequency data. In this article, we
investigate an extension of the Autoregressive Conditional Duration (ACD)
model of Engle and Russell (1998) by adopting a mixture of distribution
approach with time-varying weights. Empirical estimation of the Mixture
ACD model shows that our model suitably overcomes the limitations of the
standard base model, notably its inadequacy in capturing the tail behavior
of the distribution. When the weights are made dependent on some
market activity data, the model lends itself to some structural
interpretation related to price formation and information diffusion in the
market.
Journal: Econometric Reviews
Pages: 102-120
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Autoregressive, Conditional Durations, Financial durations, Mixture of distributions, Time-varying weights,
X-DOI: 10.1080/07474930802387944
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387944
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:102-120
Template-Type: ReDIF-Article 1.0
Author-Name: Søren Johansen
Author-X-Name-First: Søren
Author-X-Name-Last: Johansen
Title: Representation of Cointegrated Autoregressive Processes with Application to Fractional Processes
Abstract:
We analyze vector autoregressive processes using the matrix-valued
characteristic polynomial. The purpose of this article is to give a survey
of the mathematical results on inversion of a matrix polynomial in case
there are unstable roots, to study integrated and cointegrated processes.
The new results are in the I(2) representation, which contains explicit
formulas for the first two terms and a useful property of the third. We
define a new error correction model for fractional processes and derive a
representation of the solution.
Journal: Econometric Reviews
Pages: 121-145
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Error correction models, Fractional autoregressive model, Granger representation, Integration of order 1 and 2,
X-DOI: 10.1080/07474930802387977
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387977
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:121-145
Template-Type: ReDIF-Article 1.0
Author-Name: Fabrizio Laurini
Author-X-Name-First: Fabrizio
Author-X-Name-Last: Laurini
Author-Name: Jonathan Tawn
Author-X-Name-First: Jonathan
Author-X-Name-Last: Tawn
Title: Regular Variation and Extremal Dependence of GARCH Residuals with Application to Market Risk Measures
Abstract:
Stock returns exhibit heavy tails and volatility clustering. These
features, motivating the use of GARCH models, make it difficult to predict
times and sizes of losses that might occur. Estimation of losses, like the
Value-at-Risk, often assumes that returns, normalized by the level of
volatility, are Gaussian. Under ARMA-GARCH modeling, such scaled
returns are often heavy tailed and show extremal dependence, whose strength
diminishes at increasingly extreme levels. We model the heavy tails of scaled
returns with generalized Pareto distributions, while extremal dependence
can be reduced by declustering data.
Journal: Econometric Reviews
Pages: 146-169
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Declustering, Expected shortfalls, Extremal dependence, Generalized Pareto distribution, Regular variation, Value-at-Risk,
X-DOI: 10.1080/07474930802387985
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802387985
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:146-169
Template-Type: ReDIF-Article 1.0
Author-Name: Cristiano Varin
Author-X-Name-First: Cristiano
Author-X-Name-Last: Varin
Author-Name: Paolo Vidoni
Author-X-Name-First: Paolo
Author-X-Name-Last: Vidoni
Title: Pairwise Likelihood Inference for General State Space Models
Abstract:
This article concerns parameter estimation for general state space
models, following a frequentist likelihood-based approach. Since exact
methods for computing and maximizing the likelihood function are usually
not feasible, approximate solutions, based on Monte Carlo or numerical
methods, have to be considered. Here, we concentrate on a different
approach based on a simple pseudolikelihood, called “pairwise
likelihood.” Its merit is to reduce the computational burden so
that it is possible to fit highly structured statistical models, even when
the use of standard likelihood methods is not possible. We discuss
pairwise likelihood inference for state space models, and we present some
touchstone examples concerning autoregressive models with additive
observation noise and switching regimes, the local level model, and a
non-Markovian generalization of the dynamic Tobit model.
Journal: Econometric Reviews
Pages: 170-185
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Composite likelihood, Efficiency, Pairwise likelihood, Pseudolikelihood, Regime switching, State space model, Tobit model,
X-DOI: 10.1080/07474930802388009
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802388009
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:170-185
Template-Type: ReDIF-Article 1.0
Author-Name: Tommaso Proietti
Author-X-Name-First: Tommaso
Author-X-Name-Last: Proietti
Title: On the Model-Based Interpretation of Filters and the Reliability of Trend-Cycle Estimates
Abstract:
The article explores and illustrates some of the typical trade-offs which
arise in designing filters for the measurement of trends and cycles in
economic time series, focusing, in particular, on the fundamental
trade-off between the reliability of the estimates and the magnitude of
the revisions as new observations become available. This assessment is
available through a novel model-based approach, according to which an
important class of highpass and bandpass filters, encompassing the
Hodrick-Prescott (HP) filter, is adapted to the particular time series
under investigation. Via a suitable decomposition of the innovation
process, it is shown that any linear time series with ARIMA representation
can be broken down into orthogonal trend and cycle components, for which
the class of filters is optimal. The main results then follow from
Wiener-Kolmogorov (WK) signal extraction theory, whereas exact finite
sample inferences are provided by the Kalman filter and smoother for the
relevant state space representation of the decomposition.
Journal: Econometric Reviews
Pages: 186-208
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Bandpass filters, Kalman filter and Smoother, Revisions, Signal extraction,
X-DOI: 10.1080/07474930802388025
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802388025
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:186-208
Template-Type: ReDIF-Article 1.0
Author-Name: Matteo Grigoletto
Author-X-Name-First: Matteo
Author-X-Name-Last: Grigoletto
Author-Name: Corrado Provasi
Author-X-Name-First: Corrado
Author-X-Name-Last: Provasi
Title: Misspecification Testing for the Conditional Distribution Model in GARCH-Type Processes
Abstract:
In this article, we study goodness of fit tests for some distributions of
the innovations which are usually adopted to explain the behavior of
financial time series. Inference is developed in the context of GARCH-type
models. Functional bootstrap tests are employed, assuming that the
conditional means and variances of the model are correctly specified. The
performances of the functional tests are assessed with a Monte Carlo
experiment, based on some of the most common distributions adopted in the
financial framework. The results of an application to the series of
squared residuals from a PARCH(1,1) model fitted to a series of foreign
exchange rate returns are also shown.
Journal: Econometric Reviews
Pages: 209-224
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Bootstrap, Functional tests, GARCH model, Goodness of fit,
X-DOI: 10.1080/07474930802388033
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802388033
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:209-224
Template-Type: ReDIF-Article 1.0
Author-Name: Changli He
Author-X-Name-First: Changli
Author-X-Name-Last: He
Author-Name: Timo Terasvirta
Author-X-Name-First: Timo
Author-X-Name-Last: Terasvirta
Author-Name: Andres Gonzalez
Author-X-Name-First: Andres
Author-X-Name-Last: Gonzalez
Title: Testing Parameter Constancy in Stationary Vector Autoregressive Models Against Continuous Change
Abstract:
In this article we derive a parameter constancy test of a stationary
vector autoregressive model against the hypothesis that the parameters of
the model change smoothly over time. A single structural break is
contained in this alternative hypothesis as a special case. The test is a
generalization of a single-equation test of a similar hypothesis proposed
in the literature. An advantage here is that the asymptotic distribution
theory is standard. The performance of the tests is compared to that of
generalized Chow tests and found satisfactory in terms of both size and
power.
Journal: Econometric Reviews
Pages: 225-245
Issue: 1-3
Volume: 28
Year: 2009
Keywords: JEL, C32, C12,
X-DOI: 10.1080/07474930802388041
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802388041
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:225-245
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Jeffrey Racine
Author-X-Name-First: Jeffrey
Author-X-Name-Last: Racine
Title: A Robust Entropy-Based Test of Asymmetry for Discrete and Continuous Processes
Abstract:
We consider a metric entropy capable of detecting deviations from
symmetry that is suitable for both discrete and continuous processes. A
test statistic is constructed from an integrated normed difference between
nonparametric estimates of two density functions. The null distribution
(symmetry) is obtained by resampling from an artificially lengthened
series constructed from a rotation of the original series about its mean
(median, mode). Simulations demonstrate that the test has correct size and
good power in the direction of interesting alternatives, while
applications to updated Nelson and Plosser (1982) data demonstrate its
potential power gains relative to existing tests.
Journal: Econometric Reviews
Pages: 246-261
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Entropy, Kernel, Metric, Nonparametric, Symmetry, Time series,
X-DOI: 10.1080/07474930802388066
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802388066
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:246-261
Template-Type: ReDIF-Article 1.0
Author-Name: Marco Riani
Author-X-Name-First: Marco
Author-X-Name-Last: Riani
Title: Robust Transformations in Univariate and Multivariate Time Series
Abstract:
It is well known that transformation of the response may improve the
homogeneity and the approximate normality of the errors. Unfortunately,
the estimated transformation and related test statistic may be sensitive
to the presence of one, or several, atypical observations. In addition, it
is important to remark that outliers in one transformed scale may not be
atypical in another scale. Therefore, it is important to choose a
transformation which does not depend on the presence of particular
observations. In this article we suggest an efficient procedure based on a
robust score test statistic which quantifies the effect of each
observation on the choice of the transformation.
Journal: Econometric Reviews
Pages: 262-278
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Fan plot, Forward search, Kalman filter, Outlier detection, Robust methods, Score test,
X-DOI: 10.1080/07474930802388074
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802388074
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:262-278
Template-Type: ReDIF-Article 1.0
Author-Name: Elena Rusticelli
Author-X-Name-First: Elena
Author-X-Name-Last: Rusticelli
Author-Name: Richard Ashley
Author-X-Name-First: Richard
Author-X-Name-Last: Ashley
Author-Name: Estela Bee Dagum
Author-X-Name-First: Estela Bee
Author-X-Name-Last: Dagum
Author-Name: Douglas Patterson
Author-X-Name-First: Douglas
Author-X-Name-Last: Patterson
Title: A New Bispectral Test for Nonlinear Serial Dependence
Abstract:
Nonconstancy of the bispectrum of a time series has been taken as a
measure of non-Gaussianity and nonlinear serial dependence in a stochastic
process by Subba Rao and Gabr (1980) and by Hinich (1982), leading to
Hinich's statistical test of the null hypothesis of a linear generating
mechanism for a time series. Hinich's test has the advantage of focusing
directly on nonlinear serial dependence—in contrast to subsequent
approaches, which actually test for serial dependence of any kind
(nonlinear or linear) on data which have been pre-whitened. The Hinich
test tends to have low power, however, and (in common with most
statistical procedures in the frequency domain) requires the specification
of a smoothing or window-width parameter. In this article, we develop a
modification of the Hinich bispectral test which substantially ameliorates
both of these problems by the simple expedient of maximizing the test
statistic over the feasible values of the smoothing parameter. Monte Carlo
simulation results are presented indicating that the new test is well
sized and has substantially larger power than the original Hinich test
against a number of relevant alternatives; the simulations also indicate
that the new test preserves the Hinich test's robustness to
misspecifications in the identification of a pre-whitening model.
Journal: Econometric Reviews
Pages: 279-293
Issue: 1-3
Volume: 28
Year: 2009
Keywords: Bispectrum, Linear prefiltering procedure, Maximization technique, Nonlinearity, Smoothing parameter,
X-DOI: 10.1080/07474930802388090
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802388090
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:1-3:p:279-293
Template-Type: ReDIF-Article 1.0
Author-Name: Claude Lopez
Author-X-Name-First: Claude
Author-X-Name-Last: Lopez
Title: A Panel Unit Root Test with Good Power in Small Samples
Abstract:
We propose a new pooled panel unit root test allowing for serial and
contemporaneous correlation. The test combines the Elliott et al. (1996)
local-to-unity transformation with a pooled panel ADF test, and accounts
for contemporaneous correlation by estimating the residual covariance
matrix. The critical values are bootstrapped, and Monte Carlo simulations
demonstrate enhanced performance, especially when the series are highly
persistent and the panel cross-sectional and time series dimensions are
relatively small. An application of the test to the real exchange rate
convergence for the post Bretton-Woods period leads to strong and reliable
rejections of the unit root.
Journal: Econometric Reviews
Pages: 295-313
Issue: 4
Volume: 28
Year: 2009
Keywords: Bootstrap test, GLS-detrending, Panel unit root test,
X-DOI: 10.1080/07474930802458620
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802458620
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:4:p:295-313
Template-Type: ReDIF-Article 1.0
Author-Name: D. M. Mahinda Samarakoon
Author-X-Name-First: D. M. Mahinda
Author-X-Name-Last: Samarakoon
Author-Name: Keith Knight
Author-X-Name-First: Keith
Author-X-Name-Last: Knight
Title: A Note on Unit Root Tests with Infinite Variance Noise
Abstract:
In recent years, a number of authors have considered extensions of
classical unit root tests to cases where the process is driven by infinite
variance innovations, as well as considering their asymptotic properties.
Unfortunately, these extensions are typically inefficient as they do not
exploit the dynamics of the infinite variance process. In this article, we
consider Dickey-Fuller-type tests based on M-estimators and develop the
asymptotic theory for these estimators and resulting test statistics.
Journal: Econometric Reviews
Pages: 314-334
Issue: 4
Volume: 28
Year: 2009
Keywords: Infinite variance, M-estimators, Stable laws, Unit roots,
X-DOI: 10.1080/07474930802458638
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802458638
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:4:p:314-334
Template-Type: ReDIF-Article 1.0
Author-Name: Daiki Maki
Author-X-Name-First: Daiki
Author-X-Name-Last: Maki
Title: Tests for a Unit Root Using Three-Regime TAR Models: Power Comparison and Some Applications
Abstract:
Tests for a unit root using three-regime threshold autoregressive (TAR)
models play a significant role in the empirical analysis of some economic
theories. This article compares the powers of recently proposed unit root
tests in three-regime TAR models using Monte Carlo experiments. The
following results are obtained from the Monte Carlo simulations:
Kapetanios and Shin's (2006) Wsup, Wave, and Wexp statistics, which
degenerate with respect to the threshold parameters under the null
hypothesis, have better power in the three-regime TAR process with a
relatively narrow band of a unit root process and a small sample, whereas
their statistics do not perform well when the threshold and sample size
increase; Bec et al.'s (2004, BBC) sup W and Park and Shintani's (2005)
inf-t statistics and their restricted models, which do not degenerate with
respect to the threshold parameters in the limit, perform poorly in the
three-regime TAR process with a small threshold even when compared with
the Dickey-Fuller test, whereas their statistics perform better in the
case of a large threshold; sup W, inf-t, and their restricted models
perform much better when the sample size and threshold increase and the
outer regimes exhibit rapid convergence. In order to substantiate the use
of our Monte Carlo results for some of the applied work, we apply these
tests to the real exchange rates for many countries.
Journal: Econometric Reviews
Pages: 335-363
Issue: 4
Volume: 28
Year: 2009
Keywords: Power, Three-regime TAR model, Unit root test,
X-DOI: 10.1080/07474930802458893
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802458893
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:4:p:335-363
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Title: A Note on Testing Covariance Stationarity
Abstract:
In a recent article, Xiao and Lima (2007) show numerically that the
stationarity test of Kwiatkowski et al. (1992) has power close to size
when the volatility of the innovation process follows a linear trend. In
this article, highlighting published results in Cavaliere and Taylor
(2005), we show that this observation does not in general hold under
time-varying volatility. We also propose alternative tests of covariance
stationarity which we show to improve upon the power properties of the
tests proposed in Xiao and Lima (2007) against changes in the
unconditional variance. Practical recommendations are also made.
Journal: Econometric Reviews
Pages: 364-371
Issue: 4
Volume: 28
Year: 2009
Keywords: KPSS, Nonstationary volatility, Stationarity testing,
X-DOI: 10.1080/07474930802458992
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802458992
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:4:p:364-371
Template-Type: ReDIF-Article 1.0
Author-Name: Andrew Patton
Author-X-Name-First: Andrew
Author-X-Name-Last: Patton
Author-Name: Dimitris Politis
Author-X-Name-First: Dimitris
Author-X-Name-Last: Politis
Author-Name: Halbert White
Author-X-Name-First: Halbert
Author-X-Name-Last: White
Title: Correction to “Automatic Block-Length Selection for the Dependent Bootstrap” by D. Politis and H. White
Abstract:
A correction to the optimal block size algorithms of Politis and White
(2004) is given following a correction of Lahiri's (Lahiri 1999)
theoretical results by Nordman (2008).
Journal: Econometric Reviews
Pages: 372-375
Issue: 4
Volume: 28
Year: 2009
Keywords: Block bootstrap, Block size, Circular bootstrap, Stationary bootstrap,
X-DOI: 10.1080/07474930802459016
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802459016
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:4:p:372-375
Template-Type: ReDIF-Article 1.0
Author-Name: Jesus Fernandez-Villaverde
Author-X-Name-First: Jesus
Author-X-Name-Last: Fernandez-Villaverde
Author-Name: Juan Rubio-Ramirez
Author-X-Name-First: Juan
Author-X-Name-Last: Rubio-Ramirez
Title: Two Books on the New Macroeconometrics
Abstract:
Methods for Applied Macroeconomic Research by Fabio Canova, and
Structural Macroeconometrics by David N. DeJong and Chetan Dave are two
outstanding new books that provide an excellent introduction to what is
sometimes called the New Macroeconometrics. This area of empirical
macroeconomics is centered on the estimation and validation of dynamic
stochastic general equilibrium (DSGE) models. Canova's and DeJong and
Dave's volumes fill a tremendous gap in economists' libraries. Not only
does the writing style of both books allow them to be adopted as a
reference text for a class, but also the books come filled with
applications, exercises, and pointers to computer code that will
complement the lectures. Despite sharing the common theme of an
introduction to the new macroeconometrics, each book has its own focus.
Canova's book aims to survey a long list of techniques relevant to
macroeconomists: filters, vector autoregressions (VARs), generalized
method of moments (GMM), simulation methods, dynamic panels, maximum likelihood, and
Bayesian econometrics; it also offers two preliminary chapters on
probability theory and on DSGE modeling. In contrast, DeJong and Dave have
the more modest goal of showing how to compute and estimate DSGE models,
which makes it more suitable for a second year graduate class. In
exchange, DeJong and Dave often dig a bit deeper into issues of interest
to them and build the material at a more leisurely pace.
Journal: Econometric Reviews
Pages: 376-387
Issue: 4
Volume: 28
Year: 2009
Keywords: Bayesian econometrics, Dynamic macroeconomic models, Likelihood function, Monte Carlo methods, New macroeconometrics,
X-DOI: 10.1080/07474930802459040
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802459040
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:4:p:376-387
Template-Type: ReDIF-Article 1.0
Author-Name: Tong Li
Author-X-Name-First: Tong
Author-X-Name-Last: Li
Title: Book Review
Abstract:
Journal: Econometric Reviews
Pages: 388-392
Issue: 4
Volume: 28
Year: 2009
X-DOI: 10.1080/07474930802459073
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802459073
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:4:p:388-392
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M.
Author-X-Name-Last: Robert Taylor
Title: Bootstrap M Unit Root Tests
Abstract:
In this article we propose wild bootstrap implementations of the local
generalized least squares (GLS) de-trended M and ADF unit root tests of
Stock (1999), Ng and Perron (2001), and Elliott et al. (1996),
respectively. The bootstrap statistics are shown to replicate the
first-order asymptotic distributions of the original statistics, while
numerical evidence suggests that the bootstrap tests perform well in small
samples. A recolored version of our bootstrap is also proposed which can
further improve upon the finite sample size properties of the procedure
when the shocks are serially correlated, in particular ameliorating the
significant under-size seen in the M tests against processes with
autoregressive or moving average roots close to -1. The wild bootstrap is
used because it has the desirable property of preserving in the resampled
data the pattern of heteroskedasticity present in the original shocks,
thereby allowing for cases where the series under test is driven by
martingale difference innovations.
Journal: Econometric Reviews
Pages: 393-421
Issue: 5
Volume: 28
Year: 2009
Keywords: Conditional heteroskedasticity, Re-colouring, Unit root tests, Wild bootstrap,
X-DOI: 10.1080/07474930802467167
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802467167
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:5:p:393-421
Template-Type: ReDIF-Article 1.0
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Author-Name: Suhejla Hoti
Author-X-Name-First: Suhejla
Author-X-Name-Last: Hoti
Author-Name: Felix Chan
Author-X-Name-First: Felix
Author-X-Name-Last: Chan
Title: Structure and Asymptotic Theory for Multivariate Asymmetric Conditional Volatility
Abstract:
Various univariate and multivariate models of volatility have been used
to evaluate market risk, asymmetric shocks, thresholds, leverage effects,
and Value-at-Risk in economics and finance. This article is concerned with
market risk, and develops a constant conditional correlation vector
ARMA-asymmetric GARCH (VARMA-AGARCH) model, as an extension of the widely
used univariate asymmetric (or threshold) GJR model of Glosten et al.
(1992), and establishes its underlying structure, including the unique,
strictly stationary, and ergodic solution of the model, its causal
expansion, and convenient sufficient conditions for the existence of
moments. Alternative empirically verifiable sufficient conditions for the
consistency and asymptotic normality of the quasi-maximum likelihood
estimator are established under non-normality of the standardized shocks.
Journal: Econometric Reviews
Pages: 422-440
Issue: 5
Volume: 28
Year: 2009
Keywords: Asymmetric effects, Asymptotic theory, Conditional volatility, Multivariate structure, Regularity conditions,
X-DOI: 10.1080/07474930802467217
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802467217
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:5:p:422-440
Template-Type: ReDIF-Article 1.0
Author-Name: Kenneth West
Author-X-Name-First: Kenneth
Author-X-Name-Last: West
Author-Name: Ka-fu Wong
Author-X-Name-First: Ka-fu
Author-X-Name-Last: Wong
Author-Name: Stanislav Anatolyev
Author-X-Name-First: Stanislav
Author-X-Name-Last: Anatolyev
Title: Instrumental Variables Estimation of Heteroskedastic Linear Models Using All Lags of Instruments
Abstract:
We propose and evaluate a technique for instrumental variables estimation
of linear models with conditional heteroskedasticity. The technique uses
approximating parametric models for the projection of right-hand side
variables onto the instrument space, and for conditional
heteroskedasticity and serial correlation of the disturbance. Use of
parametric models allows one to exploit information in all lags of
instruments, unconstrained by degrees of freedom limitations. Analytical
calculations and simulations indicate that sometimes there are large
asymptotic and finite sample efficiency gains relative to conventional
estimators (Hansen, 1982), and modest gains or losses depending on data
generating process and sample size relative to quasi-maximum likelihood.
These results are robust to minor misspecification of the parametric
models used by our estimator. [Supplemental materials are available for
this article. Go to the publisher's online edition of Econometric Reviews
for the following free supplemental resources: two appendices containing
additional results from this article.]
Journal: Econometric Reviews
Pages: 441-467
Issue: 5
Volume: 28
Year: 2009
Keywords: Efficiency, Efficiency bounds, Instrumental variables, Optimal instrument, Stationary time series,
X-DOI: 10.1080/07474930802467241
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802467241
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:5:p:441-467
Template-Type: ReDIF-Article 1.0
Author-Name: Simon Broda
Author-X-Name-First: Simon
Author-X-Name-Last: Broda
Author-Name: Kai Carstensen
Author-X-Name-First: Kai
Author-X-Name-Last: Carstensen
Author-Name: Marc Paolella
Author-X-Name-First: Marc
Author-X-Name-Last: Paolella
Title: Assessing and Improving the Performance of Nearly Efficient Unit Root Tests in Small Samples
Abstract:
The development of unit root tests continues unabated, with many recent
contributions using techniques such as generalized least squares (GLS)
detrending and recursive detrending to improve the power of the test. In
this article, the relation between the seemingly disparate tests is
demonstrated by algebraically nesting all of them as ratios of quadratic
forms in normal variables. By doing so, and using the exact sampling
distribution of the ratio, it is straightforward to compute, examine, and
compare the tests' critical values and power functions. It is shown that
use of GLS detrending parameters other than those recommended in the
literature can lead to substantial power improvements. The open and
important question regarding the nature of the first observation is
addressed. Tests with high power are proposed irrespective of the
distribution of the initial observation, which should be of great use in
practical applications.
Journal: Econometric Reviews
Pages: 468-494
Issue: 5
Volume: 28
Year: 2009
Keywords: GLS detrending, Power loss, Recursive detrending,
X-DOI: 10.1080/07474930802467282
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802467282
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:5:p:468-494
Template-Type: ReDIF-Article 1.0
Author-Name: M. Hashem Pesaran
Author-X-Name-First: M. Hashem
Author-X-Name-Last: Pesaran
Author-Name: Ron Smith
Author-X-Name-First: Ron
Author-X-Name-Last: Smith
Author-Name: Takashi Yamagata
Author-X-Name-First: Takashi
Author-X-Name-Last: Yamagata
Author-Name: Lyudmyla Hvozdyk
Author-X-Name-First: Lyudmyla
Author-X-Name-Last: Hvozdyk
Title: Pairwise Tests of Purchasing Power Parity
Abstract:
Given nominal exchange rates and price data on N + 1 countries
indexed by i = 0,1,2,…, N, the standard procedure for
testing purchasing power parity (PPP) is to apply unit root or
stationarity tests to N real exchange rates all measured relative to a
base country, 0, often taken to be the U.S. Such a procedure is sensitive
to the choice of base country, ignores the information in all the other
cross-rates and is subject to a high degree of cross-section dependence
which has adverse effects on estimation and inference. In this article, we
conduct a variety of unit root tests on all possible N(N + 1)/2
real rates between pairs of the N + 1 countries and estimate the
proportion of the pairs that are stationary. This proportion can be
consistently estimated even in the presence of cross-section dependence.
We estimate this proportion using quarterly data on the real exchange rate
for 50 countries over the period 1957-2001. The main substantive
conclusion is that to reject the null of no adjustment to PPP requires
sufficiently large disequilibria to move the real rate out of the band of
inaction set by trade costs. In such cases, one can reject the null of no
adjustment to PPP up to 90% of the time as compared to around 40% in the
whole sample using a linear alternative and almost 60% using a nonlinear
alternative.
Journal: Econometric Reviews
Pages: 495-521
Issue: 6
Volume: 28
Year: 2009
Keywords: Cross-rates, Cross-section dependence, Pairwise approach, Panel data, Purchasing power parity,
X-DOI: 10.1080/07474930802473702
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802473702
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:6:p:495-521
Template-Type: ReDIF-Article 1.0
Author-Name: Suhejla Hoti
Author-X-Name-First: Suhejla
Author-X-Name-Last: Hoti
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Author-Name: Daniel Slottje
Author-X-Name-First: Daniel
Author-X-Name-Last: Slottje
Title: Measuring the Volatility in U.S. Treasury Benchmarks and Debt Instruments
Abstract:
As U.S. Treasury securities carry the full faith and credit of the U.S.
government, they are free of default risk. Thus, their yields are
risk-free rates of return, which allows the most recently issued U.S.
Treasury securities to be used as a benchmark to price other fixed-income
instruments. This article analyzes the time series properties of interest
rates on U.S. Treasury benchmarks and related debt instruments by
modelling the conditional mean and conditional volatility for weekly
yields on 12 Treasury Bills and other debt instruments for the period
January 8, 1982 to August 20, 2004. The conditional correlations between
all pairs of debt instruments are also calculated. These estimates are of
interest as they enable an assessment of the implications of modelling
conditional volatility on forecasting performance. The estimated
conditional correlation coefficients indicate whether there is
specialization, diversification, or independence in the debt instrument
shocks. Constant conditional correlation estimates of the standardized
shocks indicate that the shocks to the first differences in the debt
instrument yields are generally high and always positively correlated. In
general, the primary purpose in holding a portfolio of Treasury Bills and
other debt instruments should be to specialize on instruments that provide
the largest returns. Tests for Stochastic Dominance are generally
consistent with these findings, but find somewhat surprising rankings
between debt instruments, with implications for portfolio composition.
Thirty-year Treasuries, Aaa bonds, and mortgages tend to dominate the
other instruments, at least to the second order.
Journal: Econometric Reviews
Pages: 522-554
Issue: 6
Volume: 28
Year: 2009
Keywords: Asymmetry, Conditional correlation, Conditional volatility, Debt instruments, Diversification, Forecasting, Independence, Risk, Specialization, Treasury bills,
X-DOI: 10.1080/07474930802473736
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802473736
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:6:p:522-554
Template-Type: ReDIF-Article 1.0
Author-Name: Joseph Terza
Author-X-Name-First: Joseph
Author-X-Name-Last: Terza
Title: Parametric Nonlinear Regression with Endogenous Switching
Abstract:
Based on the insightful work of Olsen (1980) for the linear context, a
generic and unifying framework is developed that affords a simple
extension of the classical method of Heckman (1974, 1976, 1978, 1979) to a
broad class of nonlinear regression models involving endogenous switching
and its two most common incarnations, endogenous sample selection and
endogenous treatment effects. The approach should be appealing to applied
researchers for three reasons. First, econometric applications involving
endogenous switching abound. Secondly, the approach requires neither
linearity of the regression function nor full parametric specification of
the model. It can, in fact, be applied under the minimal parametric
assumptions—i.e., specification of only the conditional means of
the outcome and switching variables. Finally, it is amenable to relatively
straightforward estimation methods. Examples of applications of the method
are discussed.
Journal: Econometric Reviews
Pages: 555-580
Issue: 6
Volume: 28
Year: 2009
Keywords: Sample selection, Treatment effects, Two-stage estimation,
X-DOI: 10.1080/07474930802473751
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802473751
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:6:p:555-580
Template-Type: ReDIF-Article 1.0
Author-Name: Gonzalo Camba-Mendez
Author-X-Name-First: Gonzalo
Author-X-Name-Last: Camba-Mendez
Author-Name: George Kapetanios
Author-X-Name-First: George
Author-X-Name-Last: Kapetanios
Title: Statistical Tests and Estimators of the Rank of a Matrix and Their Applications in Econometric Modelling
Abstract:
Testing and estimating the rank of a matrix of estimated parameters is
key in a large variety of econometric modelling scenarios. This article
describes general methods to test for and estimate the rank of a matrix,
and provides details on a variety of modelling scenarios in the
econometrics literature where such methods are required. Four different
methods to test for the true rank of a general matrix are described, as
well as one method that can handle the case of a matrix subject to
parameter constraints associated with definiteness structures. The
technical requirements for the implementation of the tests of rank of a
general matrix differ and hence there are merits to all of them that
justify their use in applied work. Nonetheless, we review available
evidence of their small sample properties in the context of different
modelling scenarios where all, or some, are applicable.
Journal: Econometric Reviews
Pages: 581-611
Issue: 6
Volume: 28
Year: 2009
Keywords: Model specification, Multiple time series, Tests of rank,
X-DOI: 10.1080/07474930802473785
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930802473785
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:6:p:581-611
Template-Type: ReDIF-Article 1.0
Author-Name: Christian Hafner
Author-X-Name-First: Christian
Author-X-Name-Last: Hafner
Author-Name: Philip Hans Franses
Author-X-Name-First: Philip Hans
Author-X-Name-Last: Franses
Title: A Generalized Dynamic Conditional Correlation Model: Simulation and Application to Many Assets
Abstract:
In this article, we put forward a generalization of the Dynamic
Conditional Correlation (DCC) Model of Engle (2002). Our model allows for
asset-specific correlation sensitivities, which is useful in particular if
one aims to summarize a large number of asset returns. We propose two
estimation methods, one based on a full likelihood maximization, the other
on individual correlation estimates. The resultant generalized DCC (GDCC)
model is considered for daily data on 39 U.K. stock returns in the FTSE.
We find convincing evidence that the GDCC model improves on the DCC model
and also on the CCC model of Bollerslev (1990).
Journal: Econometric Reviews
Pages: 612-631
Issue: 6
Volume: 28
Year: 2009
Keywords: Dynamic conditional correlation, Multivariate GARCH,
X-DOI: 10.1080/07474930903038834
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903038834
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:6:p:612-631
Template-Type: ReDIF-Article 1.0
Author-Name: Bo Li
Author-X-Name-First: Bo
Author-X-Name-Last: Li
Title: Asymptotically Distribution-Free Goodness-of-Fit Testing: A Unifying View
Abstract:
We outline a general paradigm for constructing asymptotically
distribution-free (ADF) goodness-of-fit tests, which can be regarded as a
generalization of Khmaladze (1993). This is achieved by a nonorthogonal
projection of a class of functions onto the ortho-complement of the
extended tangent space (ETS) associated with the null hypothesis. In
parallel with the work of Bickel et al. (2006), we obtain transformed
empirical processes (TEP) which are the building blocks for constructing
omnibus tests such as the usual Kolmogorov-Smirnov type tests and
Cramer-von Mise type tests, as well as Portmanteau tests and directional
tests. The critical values can be tabulated due to the ADF property. All
the tests are capable of detecting local (Pitman) alternative at the
root-n scale. We shall illustrate the framework in several examples,
mostly in regression model specification testing.
Journal: Econometric Reviews
Pages: 632-657
Issue: 6
Volume: 28
Year: 2009
Keywords: ADF, Empirical process, Goodness-of-fit tests, Martingale transform, Semiparametric, Tangent space,
X-DOI: 10.1080/07474930903038933
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903038933
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:6:p:632-657
Template-Type: ReDIF-Article 1.0
Author-Name: Youngki Shin
Author-X-Name-First: Youngki
Author-X-Name-Last: Shin
Title: Length-bias Correction in Transformation Models with Supplementary Data
Abstract:
In this article, I propose an inferential procedure of monotone
transformation models with random truncation points, which may not be
observable. This class includes length-biased samples that are common in
duration analysis. The proposed estimator can be applied to more general
situations than existing estimators, since it imposes restrictions on
neither the transformation function nor the error terms. Furthermore, it
does not require observed truncation points either. It is sufficient for
point identification to know the cdf of the truncation variable, which can
be estimated from supplementary data that are easily found in
applications. The estimator converges to a normal distribution at the rate
of [image omitted] and Monte Carlo simulations confirm its robustness to
error distributions in finite samples. For an empirical illustration, I
estimate the effect of unemployment insurance benefits on unemployment
duration, using length-biased microdata and supplementary macrodata.
Journal: Econometric Reviews
Pages: 658-681
Issue: 6
Volume: 28
Year: 2009
Keywords: Duration models, Length-biased data, Rank estimation, Random truncation, Transformation model,
X-DOI: 10.1080/07474930903039246
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903039246
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:28:y:2009:i:6:p:658-681
Template-Type: ReDIF-Article 1.0
Author-Name: Pål Børing
Author-X-Name-First: Pål
Author-X-Name-Last: Børing
Title: Gamma Unobserved Heterogeneity and Duration Bias
Abstract:
Røed et al. (1999) demonstrate that the standard result of known
negative duration bias does not necessarily hold in a two-state mixed
proportional hazard (MPH) model. We show that the duration bias is still
ambiguous in a MPH model with a multivariate gamma distribution. A
discrete time two-state version of our MPH model is developed to analyze
the duration of higher education. The estimation results show that we
cannot reject the hypothesis that the two unobserved heterogeneity
variables are uncorrelated. Accepting this hypothesis implies that the
standard result holds in our analysis.
Journal: Econometric Reviews
Pages: 1-19
Issue: 1
Volume: 29
Year: 2010
Keywords: Duration bias, Higher education, Multivariate gamma distribution, Unobserved heterogeneity,
X-DOI: 10.1080/07474930903323822
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903323822
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:1:p:1-19
Template-Type: ReDIF-Article 1.0
Author-Name: W. Kwan
Author-X-Name-First: W.
Author-X-Name-Last: Kwan
Author-Name: W. K. Li
Author-X-Name-First: W. K.
Author-X-Name-Last: Li
Author-Name: K. W. Ng
Author-X-Name-First: K. W.
Author-X-Name-Last: Ng
Title: A Multivariate Threshold Varying Conditional Correlations Model
Abstract:
In this article, a multivariate threshold varying conditional correlation
(TVCC) model is proposed. The model extends the idea of Engle (2002) and
Tse and Tsui (2002) to a threshold framework. This model retains the
interpretation of the univariate threshold GARCH model and allows for
dynamic conditional correlations. Techniques of model identification,
estimation, and model checking are developed. Some simulation results are
reported on the finite sample distribution of the maximum likelihood
estimate of the TVCC model. Real examples demonstrate the asymmetric
behavior of the mean and the variance in financial time series and the
ability of the TVCC model to capture these phenomena.
Journal: Econometric Reviews
Pages: 20-38
Issue: 1
Volume: 29
Year: 2010
Keywords: Conditional correlation, Multivariate TVCC model, Threshold, Volatility,
X-DOI: 10.1080/07474930903327260
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903327260
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:1:p:20-38
Template-Type: ReDIF-Article 1.0
Author-Name: Kien Tran
Author-X-Name-First: Kien
Author-X-Name-Last: Tran
Author-Name: Efthymios Tsionas
Author-X-Name-First: Efthymios
Author-X-Name-Last: Tsionas
Title: Local GMM Estimation of Semiparametric Panel Data with Smooth Coefficient Models
Abstract:
In this article, we consider the estimation of semiparametric panel data
smooth coefficient models. We propose a class of local generalized method
of moments (LGMM) estimators that are simple and easy to implement in
practice. We show that the proposed LGMM estimators are consistent and
asymptotically normal. Monte Carlo simulations suggest that our proposed
estimator performs quite well in finite samples. An empirical application
using a large panel of U.K. firms is also presented.
Journal: Econometric Reviews
Pages: 39-61
Issue: 1
Volume: 29
Year: 2010
Keywords: Local Generalized Method of Moments, Monte Carlo simulation, Semiparametric panel data model, Smooth coefficient,
X-DOI: 10.1080/07474930903327856
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903327856
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:1:p:39-61
Template-Type: ReDIF-Article 1.0
Author-Name: Leopold Simar
Author-X-Name-First: Leopold
Author-X-Name-Last: Simar
Author-Name: Paul Wilson
Author-X-Name-First: Paul
Author-X-Name-Last: Wilson
Title: Inferences from Cross-Sectional, Stochastic Frontier Models
Abstract:
Conventional approaches for inference about efficiency in parametric
stochastic frontier (PSF) models are based on percentiles of the estimated
distribution of the one-sided error term, conditional on the composite
error. When used as prediction intervals, coverage is poor when the
signal-to-noise ratio is low, but improves slowly as sample size
increases. We show that prediction intervals estimated by bagging yield
much better coverages than the conventional approach, even with low
signal-to-noise ratios. We also present a bootstrap method that gives
confidence interval estimates for (conditional) expectations of
efficiency, and which have good coverage properties that improve with
sample size. In addition, researchers who estimate PSF models typically
reject models, samples, or both when residuals have skewness in the
“wrong” direction, i.e., in a direction that would seem to
indicate absence of inefficiency. We show that correctly specified models
can generate samples with “wrongly” skewed residuals, even
when the variance of the inefficiency process is nonzero. Both our bagging
and bootstrap methods provide useful information about inefficiency and
model parameters irrespective of whether residuals have skewness in the
desired direction.
Journal: Econometric Reviews
Pages: 62-98
Issue: 1
Volume: 29
Year: 2010
Keywords: Bagging, Bootstrap, Efficiency, Inference, Stochastic frontier,
X-DOI: 10.1080/07474930903324523
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903324523
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:1:p:62-98
Template-Type: ReDIF-Article 1.0
Author-Name: Patrik Guggenberger
Author-X-Name-First: Patrik
Author-X-Name-Last: Guggenberger
Title: Book Review: Identification and Inference for Econometric Models
Abstract:
Journal: Econometric Reviews
Pages: 99-105
Issue: 1
Volume: 29
Year: 2010
X-DOI: 10.1080/07474930903324549
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903324549
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:1:p:99-105
Template-Type: ReDIF-Article 1.0
Author-Name: Francis Vella
Author-X-Name-First: Francis
Author-X-Name-Last: Vella
Title: Book Review: Econometrics, Statistics and Computational Approaches in Food and Health Sciences
Abstract:
Journal: Econometric Reviews
Pages: 106-109
Issue: 1
Volume: 29
Year: 2010
X-DOI: 10.1080/07474930903324572
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903324572
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:1:p:106-109
Template-Type: ReDIF-Article 1.0
Author-Name: Christian Gengenbach
Author-X-Name-First: Christian
Author-X-Name-Last: Gengenbach
Author-Name: Franz C. Palm
Author-X-Name-First: Franz C.
Author-X-Name-Last: Palm
Author-Name: Jean-Pierre Urbain
Author-X-Name-First: Jean-Pierre
Author-X-Name-Last: Urbain
Title: Panel Unit Root Tests in the Presence of Cross-Sectional Dependencies: Comparison and Implications for Modelling
Abstract:
Several panel unit root tests that account for cross-section
dependence using a common factor structure have been proposed in the
literature recently. Pesaran's (2007) cross-sectionally augmented unit
root tests are designed for cases where cross-sectional dependence is due
to a single factor. The Moon and Perron (2004) tests which use defactored
data are similar in spirit but can account for multiple common factors.
The Bai and Ng (2004a) tests allow one to determine the source of
nonstationarity by testing for unit roots in the common factors and the
idiosyncratic factors separately. Breitung and Das (2008) and Sul (2007)
propose panel unit root tests when cross-section dependence is present
possibly due to common factors, but the common factor structure is not
fully exploited. This article makes four contributions: (1) it
compares the testing procedures in terms of similarities and differences
in the data generation process, tests, null, and alternative hypotheses
considered, (2) using Monte Carlo results it compares the small sample
properties of the tests in models with up to two common factors, (3) it
provides an application which illustrates the use of the tests, and (4)
finally, it discusses the use of the tests in modelling in general.
Journal: Econometric Reviews
Pages: 111-145
Issue: 2
Volume: 29
Year: 2010
Month: 4
X-DOI: 10.1080/07474930903382125
File-URL: http://hdl.handle.net/10.1080/07474930903382125
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:2:p:111-145
Template-Type: ReDIF-Article 1.0
Author-Name: Edward Cripps
Author-X-Name-First: Edward
Author-X-Name-Last: Cripps
Author-Name: Denzil G. Fiebig
Author-X-Name-First: Denzil G.
Author-X-Name-Last: Fiebig
Author-Name: Robert Kohn
Author-X-Name-First: Robert
Author-X-Name-Last: Kohn
Title: Parsimonious Estimation of the Covariance Matrix in Multinomial Probit Models
Abstract:
This article presents a Bayesian analysis of a multinomial
probit model by building on previous work that specified priors on
identified parameters. The main contribution of our article is to propose
a prior on the covariance matrix of the latent utilities that permits
elements of the inverse of the covariance matrix to be identically zero.
This allows a parsimonious representation of the covariance matrix when
such parsimony exists. The methodology is applied to both simulated and
real data, and its ability to obtain more efficient estimators of the
covariance matrix and regression coefficients is assessed using simulated
data.
Journal: Econometric Reviews
Pages: 146-157
Issue: 2
Volume: 29
Year: 2010
Month: 4
X-DOI: 10.1080/07474930903382158
File-URL: http://hdl.handle.net/10.1080/07474930903382158
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:2:p:146-157
Template-Type: ReDIF-Article 1.0
Author-Name: Bernd Fitzenberger
Author-X-Name-First: Bernd
Author-X-Name-Last: Fitzenberger
Author-Name: Ralf A. Wilke
Author-X-Name-First: Ralf A.
Author-X-Name-Last: Wilke
Author-Name: Xuan Zhang
Author-X-Name-First: Xuan
Author-X-Name-Last: Zhang
Title: Implementing Box-Cox Quantile Regression
Abstract:
The Box-Cox quantile regression model introduced by Powell
(1991) is a flexible and numerically attractive extension of linear
quantile regression techniques. Chamberlain (1994) and Buchinsky (1995)
suggest a two-stage estimator for this model, but the objective function in
stage two of their method may not be defined in an application. We suggest
a modification of the estimator which is easy to implement. A simulation
study demonstrates that the modified estimator works well in situations
where the original estimator is not well defined.
Journal: Econometric Reviews
Pages: 158-181
Issue: 2
Volume: 29
Year: 2010
Month: 4
X-DOI: 10.1080/07474930903382166
File-URL: http://hdl.handle.net/10.1080/07474930903382166
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:2:p:158-181
Template-Type: ReDIF-Article 1.0
Author-Name: Martin Wagner
Author-X-Name-First: Martin
Author-X-Name-Last: Wagner
Author-Name: Jaroslava Hlouskova
Author-X-Name-First: Jaroslava
Author-X-Name-Last: Hlouskova
Title: The Performance of Panel Cointegration Methods: Results from a Large Scale Simulation Study
Abstract:
This article presents results concerning the performance of
both single equation and system panel cointegration tests and estimators.
The study considers the tests developed in Pedroni (1999, 2004),
Westerlund (2005), Larsson et al. (2001), and Breitung (2005) and the
estimators developed in Phillips and Moon (1999), Pedroni (2000), Kao and
Chiang (2000), Mark and Sul (2003), Pedroni (2001), and Breitung (2005).
We study the impact of stable autoregressive roots approaching the unit
circle, of I(2) components, of short-run cross-sectional
correlation and of cross-unit cointegration on the performance of the
tests and estimators. The data are simulated from three-dimensional
individual specific VAR systems with cointegrating ranks varying from zero
to two for fourteen different panel dimensions. The usual specifications
of deterministic components are considered.
Journal: Econometric Reviews
Pages: 182-223
Issue: 2
Volume: 29
Year: 2010
Month: 4
X-DOI: 10.1080/07474930903382182
File-URL: http://hdl.handle.net/10.1080/07474930903382182
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:2:p:182-223
Template-Type: ReDIF-Article 1.0
Author-Name: Gary Koop
Author-X-Name-First: Gary
Author-X-Name-Last: Koop
Author-Name: Roberto León-González
Author-X-Name-First: Roberto
Author-X-Name-Last: León-González
Author-Name: Rodney W. Strachan
Author-X-Name-First: Rodney W.
Author-X-Name-Last: Strachan
Title: Efficient Posterior Simulation for Cointegrated Models with Priors on the Cointegration Space
Abstract:
A message coming out of the recent Bayesian literature on
cointegration is that it is important to elicit a prior on the space
spanned by the cointegrating vectors (as opposed to a particular
identified choice for these vectors). In previous work, such priors have
been found to greatly complicate computation. In this article, we develop
algorithms to carry out efficient posterior simulation in cointegration
models. In particular, we develop a collapsed Gibbs sampling algorithm
which can be used with just-identified models and demonstrate that it has
very large computational advantages relative to existing approaches. For
over-identified models, we develop a parameter-augmented Gibbs sampling
algorithm and demonstrate that it also has attractive computational
properties.
Journal: Econometric Reviews
Pages: 224-242
Issue: 2
Volume: 29
Year: 2010
Month: 4
X-DOI: 10.1080/07474930903382208
File-URL: http://hdl.handle.net/10.1080/07474930903382208
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:2:p:224-242
Template-Type: ReDIF-Article 1.0
Author-Name: Òscar Jordà
Author-X-Name-First: Òscar
Author-X-Name-Last: Jordà
Title: Book Review: New Introduction to Multiple Time Series Analysis
Journal: Econometric Reviews
Pages: 243-246
Issue: 2
Volume: 29
Year: 2010
Month: 4
X-DOI: 10.1080/07474930903472868
File-URL: http://hdl.handle.net/10.1080/07474930903472868
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:2:p:243-246
Template-Type: ReDIF-Article 1.0
Author-Name: Gordon Anderson
Author-X-Name-First: Gordon
Author-X-Name-Last: Anderson
Author-Name: Ying Ge
Author-X-Name-First: Ying
Author-X-Name-Last: Ge
Author-Name: Teng Wah Leo
Author-X-Name-First: Teng Wah
Author-X-Name-Last: Leo
Title: Distributional Overlap: Simple, Multivariate, Parametric, and Nonparametric Tests for Alienation, Convergence, and General Distributional Difference Issues
Abstract:
This paper proposes a convenient measure of the degree of distributional
overlap, both parametric and nonparametric, useful in measuring the degree
of Polarization, Alienation, and Convergence. We show the measure is
asymptotically normally distributed, making it amenable to inference in
consequence. This Overlap measure can be used in the univariate and
multivariate framework, and three examples are used to illustrate its use.
The nonparametric Overlap Index has two sources of bias, the first being a
positive bias induced by the unknown intersection point of the underlying
distribution and the second being a negative bias induced by the
expectation of cell probabilities being less than the conditional expected
values. We show that the inconsistency problem generated by the first
bias, prevalent within this class of Goodness of Fit measures, is limited
by the number of intersection points of the underlying distributions. A
Monte Carlo study was used to examine the biases, and it was found that
the latter bias dominates the former. These biases can be diluted by
increasing the number of partitions, but they prevail asymptotically
nonetheless.
Journal: Econometric Reviews
Pages: 247-275
Issue: 3
Volume: 29
Year: 2010
Keywords: Alienation, Convergence, Overlap index, Polarization,
X-DOI: 10.1080/07474930903451532
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903451532
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:3:p:247-275
Template-Type: ReDIF-Article 1.0
Author-Name: Marcelo Fernandes
Author-X-Name-First: Marcelo
Author-X-Name-Last: Fernandes
Author-Name: Breno Neri
Author-X-Name-First: Breno
Author-X-Name-Last: Neri
Title: Nonparametric Entropy-Based Tests of Independence Between Stochastic Processes
Abstract:
This article develops nonparametric tests of independence between two
stochastic processes satisfying β-mixing conditions. The testing
strategy boils down to gauging the closeness between the joint and the
product of the marginal stationary densities. For that purpose, we take
advantage of a generalized entropic measure so as to build a whole family
of nonparametric tests of independence. We derive asymptotic normality and
local power using the functional delta method for kernels. As a corollary,
we also develop a class of entropy-based tests for serial independence.
The latter are nuisance parameter free, and hence also qualify for dynamic
misspecification analyses. We then investigate the finite-sample
properties of our serial independence tests through Monte Carlo
simulations. They perform quite well, entailing more power against some
nonlinear AR alternatives than two popular nonparametric
serial-independence tests.
Journal: Econometric Reviews
Pages: 276-306
Issue: 3
Volume: 29
Year: 2010
Keywords: Independence, Misspecification testing, Nonparametric theory, Tsallis entropy,
X-DOI: 10.1080/07474930903451557
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903451557
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:3:p:276-306
Template-Type: ReDIF-Article 1.0
Author-Name: Thanasis Stengos
Author-X-Name-First: Thanasis
Author-X-Name-Last: Stengos
Author-Name: Ximing Wu
Author-X-Name-First: Ximing
Author-X-Name-Last: Wu
Title: Information-Theoretic Distribution Test with Application to Normality
Abstract:
We derive general distribution tests based on the method of maximum
entropy (ME) density. The proposed tests are derived from maximizing the
differential entropy subject to given moment constraints. By exploiting
the equivalence between the ME and maximum likelihood (ML) estimates for
the general exponential family, we can use the conventional likelihood
ratio (LR), Wald, and Lagrange multiplier (LM) testing principles in the
maximum entropy framework. In particular, we use the LM approach to derive
tests for normality. Monte Carlo evidence suggests that the proposed tests
are comparable to and sometimes outperform some commonly used normality
tests. We show that the proposed tests can be extended to tests based on
regression residuals and non-i.i.d. data in a straightforward manner. An
empirical example on production function estimation is presented.
Journal: Econometric Reviews
Pages: 307-329
Issue: 3
Volume: 29
Year: 2010
Keywords: Distribution test, Maximum entropy, Normality,
X-DOI: 10.1080/07474930903451565
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903451565
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:3:p:307-329
Template-Type: ReDIF-Article 1.0
Author-Name: Mehmet Caner
Author-X-Name-First: Mehmet
Author-X-Name-Last: Caner
Title: Testing, Estimation in GMM and CUE with Nearly-Weak Identification
Abstract:
In this article, we analyze Generalized Method of Moments (GMM) and
Continuous Updating Estimator (CUE) with strong, nearly-weak, and weak
identification. We show that with this mixed system, the limits of the
estimators are nonstandard. In the subcase of the GMM estimator with only
nearly-weak instruments, the correlation between the instruments and the
first order conditions declines at a slower rate than root T. We find an
important difference between the nearly-weak case and the weak case.
Inference with point estimates is possible with the Wald, likelihood ratio
(LR), and Lagrange multiplier (LM) tests in GMM estimator with only
nearly-weak instruments present in the system. The limit is the standard
χ2 limit. This is important from an applied perspective, since tests
on the weak case do depend on the true value and can only test a simple
null. We also show that, in the more realistic case of a mix of strong,
weak, and nearly-weak instruments, Anderson and Rubin (1949) and
Kleibergen (2005) type tests are asymptotically pivotal and have a
χ2 limit.
Journal: Econometric Reviews
Pages: 330-363
Issue: 3
Volume: 29
Year: 2010
Keywords: Empirical process, Rate of convergence, Triangular central limit theorem,
X-DOI: 10.1080/07474930903451599
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903451599
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:3:p:330-363
Template-Type: ReDIF-Article 1.0
Author-Name: Yingyao Hu
Author-X-Name-First: Yingyao
Author-X-Name-Last: Hu
Author-Name: Geert Ridder
Author-X-Name-First: Geert
Author-X-Name-Last: Ridder
Title: On Deconvolution as a First Stage Nonparametric Estimator
Abstract:
We reconsider Taupin's (2001) Integrated Nonlinear Regression (INLR)
estimator for a nonlinear regression with a mismeasured covariate. We find
that if we restrict the distribution of the measurement error to a class
of distributions with restricted support, then much weaker smoothness
assumptions than hers suffice to ensure [image omitted] consistency of the
estimator. In addition, we show that the INLR estimator remains consistent
under these weaker smoothness assumptions if the support of the
measurement error distribution expands with the sample size. In that case
the estimator remains also asymptotically normal with a rate of
convergence that is arbitrarily close to [image omitted]. Our results show
that deconvolution can be used in a nonparametric first step without
imposing restrictive smoothness assumptions on the parametric model.
Journal: Econometric Reviews
Pages: 365-396
Issue: 4
Volume: 29
Year: 2010
Keywords: Asymptotic normality, Bounded support, Deconvolution, Measurement error model, Nonparametric estimation, Ordinary smooth,
X-DOI: 10.1080/07474930903559276
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903559276
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:4:p:365-396
Template-Type: ReDIF-Article 1.0
Author-Name: Chang Sik Kim
Author-X-Name-First: Chang Sik
Author-X-Name-Last: Kim
Author-Name: Joon Park
Author-X-Name-First: Joon
Author-X-Name-Last: Park
Title: Cointegrating Regressions with Time Heterogeneity
Abstract:
This article considers the cointegrating regression with errors whose
variances change smoothly over time. The model can be used to describe a
long-run cointegrating relationship, the tightness of which varies along
with time. Heteroskedasticity in the errors is modeled nonparametrically
and is assumed to be generated by a smooth function of time. We show that
it can be consistently estimated by the kernel method. Given consistent
estimates for error variances, the cointegrating relationship can be
efficiently estimated by the usual generalized least squares (GLS)
correction for heteroskedastic errors. It is shown that the U.S. money
demand function, both for M1 and M2, is well fitted to such a
cointegrating model with an increasing trend in error variances. Moreover,
we found that the bilateral purchasing power parities among the leading
industrialized countries such as the United States, Japan, Canada, and the
United Kingdom have changed somewhat conspicuously over the past
thirty years. In particular, it appears that they have all generally
become tighter during the period.
Journal: Econometric Reviews
Pages: 397-438
Issue: 4
Volume: 29
Year: 2010
Keywords: Cointegrating regression, GLS correction for heteroskedasticity, Kernel estimation, Time heterogeneity,
X-DOI: 10.1080/07474930903562221
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903562221
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:4:p:397-438
Template-Type: ReDIF-Article 1.0
Author-Name: Sebastiano Manzan
Author-X-Name-First: Sebastiano
Author-X-Name-Last: Manzan
Author-Name: Dawit Zerom
Author-X-Name-First: Dawit
Author-X-Name-Last: Zerom
Title: A Semiparametric Analysis of Gasoline Demand in the United States Reexamining The Impact of Price
Abstract:
The evaluation of the impact of an increase in gasoline tax on demand
relies crucially on the estimate of the price elasticity. This article
presents an extended application of the Partially Linear Additive Model
(PLAM) to the analysis of gasoline demand using a panel of U.S.
households, focusing mainly on the estimation of the price elasticity.
Unlike previous semiparametric studies that use household-level data, we
work with vehicle-level data within households that can potentially add
richer details to the price variable. Both households and vehicles data
are obtained from the Residential Transportation Energy Consumption Survey
(RTECS) of 1991 and 1994, conducted by the U.S. Energy Information
Administration (EIA). As expected, the derived vehicle-based gasoline
price has significant dispersion across the country and across grades of
gasoline. By using a PLAM specification for gasoline demand, we obtain a
measure of gasoline price elasticity that circumvents the implausible
price effects reported in earlier studies. In particular, our results show
that the price elasticity ranges between -0.2, at low prices, and -0.5, at
high prices, suggesting that households might respond differently to price
changes depending on the level of price. In addition, we estimate the
model separately for households that buy only regular gasoline and for
those that also buy midgrade/premium gasoline. The results show that the
price elasticities for these groups are increasing in price and that
regular households are more price sensitive than nonregular households.
Journal: Econometric Reviews
Pages: 439-468
Issue: 4
Volume: 29
Year: 2010
Keywords: Gasoline demand, Partially linear additive model, Semiparametric methods,
X-DOI: 10.1080/07474930903562320
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903562320
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:4:p:439-468
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Marcelo Medeiros
Author-X-Name-First: Marcelo
Author-X-Name-Last: Medeiros
Title: The Link Between Statistical Learning Theory and Econometrics: Applications in Economics, Finance, and Marketing
Abstract:
Statistical Learning refers to statistical aspects of automated
extraction of regularities (structure) in datasets. It is a broad area
which includes neural networks, regression-trees, nonparametric statistics
and sieve approximation, boosting, mixtures of models, computational
complexity, computational statistics, and nonlinear models in general.
Although Statistical Learning Theory and Econometrics are closely related,
much of the development in each of the areas is seemingly proceeding
independently. This special issue brings together these two areas, and is
intended to stimulate new applications and appreciation in economics,
finance, and marketing. This special volume contains ten innovative
articles covering a broad range of relevant topics.
Journal: Econometric Reviews
Pages: 470-475
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Bagging, Forecasting, Mixture of models, Model combination, Neural networks, Nonlinear models, Regression trees, Statistical learning, Support vector regression,
X-DOI: 10.1080/07474938.2010.481544
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481544
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:470-475
Template-Type: ReDIF-Article 1.0
Author-Name: Nii Ayi Armah
Author-X-Name-First: Nii
Author-X-Name-Last: Ayi Armah
Author-Name: Norman Swanson
Author-X-Name-First: Norman
Author-X-Name-Last: Swanson
Title: Seeing Inside the Black Box: Using Diffusion Index Methodology to Construct Factor Proxies in Large Scale Macroeconomic Time Series Environments
Abstract:
In economics, common factors are often assumed to underlie the
co-movements of a set of macroeconomic variables. For this reason, many
authors have used estimated factors in the construction of prediction
models. In this article, we begin by surveying the extant literature on
diffusion indexes. We then outline a number of approaches to the selection
of factor proxies (observed variables that proxy unobserved estimated
factors) using the statistics developed in Bai and Ng (2006a,b). Our
approach to factor proxy selection is examined via a small Monte Carlo
experiment, where evidence supporting our proposed methodology is
presented, and via a large set of prediction experiments using the panel
dataset of Stock and Watson (2005). One of our main empirical findings is
that our “smoothed” approaches to factor proxy selection
appear to yield predictions that are often superior not only to a
benchmark factor model, but also to simple linear time series models which
are generally difficult to beat in forecasting competitions. In some
sense, by using our approach to predictive factor proxy selection, one is
able to open up the “black box” often associated with factor
analysis, and to identify actual variables that can serve as primitive
building blocks for (prediction) models of a host of macroeconomic
variables, and that can also serve as policy instruments, for example. Our
findings suggest that important observable variables include various
S&P500 variables, including stock price indices and dividend series; a
1-year Treasury bond rate; various housing activity variables; industrial
production; and exchange rates.
Journal: Econometric Reviews
Pages: 476-510
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Diffusion index, Factor, Forecast, Macroeconometrics, Parameter estimation error, Proxy,
X-DOI: 10.1080/07474938.2010.481549
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481549
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:476-510
Template-Type: ReDIF-Article 1.0
Author-Name: David Rapach
Author-X-Name-First: David
Author-X-Name-Last: Rapach
Author-Name: Jack Strauss
Author-X-Name-First: Jack
Author-X-Name-Last: Strauss
Title: Bagging or Combining (or Both)? An Analysis Based on Forecasting U.S. Employment Growth
Abstract:
Forecasting a macroeconomic variable is challenging in an environment
with many potential predictors whose predictive ability can vary over
time. We compare two approaches to forecasting U.S. employment growth in
this type of environment. The first approach applies bootstrap aggregating
(bagging) to a general-to-specific procedure based on a general dynamic
linear regression model with 30 potential predictors. The second approach
considers several methods for combining forecasts from 30 individual
autoregressive distributed lag (ARDL) models, where each individual ARDL
model contains a potential predictor. We analyze bagging and combination
forecasts at multiple horizons over four different out-of-sample periods
using a mean square forecast error (MSFE) criterion and forecast
encompassing tests. We find that bagging forecasts often deliver the
lowest MSFE. Interestingly, we also find that incorporating information
from both bagging and combination forecasts based on principal components
often leads to further gains in forecast accuracy.
Journal: Econometric Reviews
Pages: 511-533
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Bagging, Combination forecasts, Employment, Forecast encompassing, Principal components,
X-DOI: 10.1080/07474938.2010.481550
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481550
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:511-533
Template-Type: ReDIF-Article 1.0
Author-Name: Huiyu Huang
Author-X-Name-First: Huiyu
Author-X-Name-Last: Huang
Author-Name: Tae-Hwy Lee
Author-X-Name-First: Tae-Hwy
Author-X-Name-Last: Lee
Title: To Combine Forecasts or to Combine Information?
Abstract:
When the objective is to forecast a variable of interest but with many
explanatory variables available, one could possibly improve the forecast
by carefully integrating them. There are generally two directions one
could proceed: combination of forecasts (CF) or combination of information
(CI). CF combines forecasts generated from simple models each
incorporating a part of the whole information set, while CI brings the
entire information set into one super model to generate an ultimate
forecast. Through linear regression analysis and simulation, we show the
relative merits of each, particularly the circumstances in which forecasts
by CF can be superior to forecasts by CI, both when the CI model is
correctly specified and when it is misspecified, and shed some light on
the success of equally
weighted CF. In our empirical application on prediction of monthly,
quarterly, and annual equity premium, we compare the CF forecasts (with
various weighting schemes) to CI forecasts (with principal component
approach mitigating the problem of parameter proliferation). We find that
CF with (close to) equal weights is generally the best and dominates all
CI schemes, while also performing substantially better than the historical
mean.
Journal: Econometric Reviews
Pages: 534-570
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Equally weighted combination of forecasts, Equity premium, Factor models, Forecast combination, Forecast combination puzzle, Information sets, Many predictors, Principal components, Shrinkage,
X-DOI: 10.1080/07474938.2010.481553
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481553
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:534-570
Template-Type: ReDIF-Article 1.0
Author-Name: Eric Hillebrand
Author-X-Name-First: Eric
Author-X-Name-Last: Hillebrand
Author-Name: Marcelo Medeiros
Author-X-Name-First: Marcelo
Author-X-Name-Last: Medeiros
Title: The Benefits of Bagging for Forecast Models of Realized Volatility
Abstract:
This article shows that bagging can improve the forecast accuracy of time
series models for realized volatility. We consider 23 stocks from the Dow
Jones Industrial Average over the sample period 1995 to 2005 and employ
two different forecast models, a log-linear specification in the spirit of
the heterogeneous autoregressive model and a nonlinear specification with
logistic transitions. Both forecast model types benefit from bagging, in
particular in the 1990s part of our sample. The log-linear specification
shows larger improvements than the nonlinear model. Bagging the log-linear
model yields the highest forecast accuracy on our sample.
Journal: Econometric Reviews
Pages: 571-593
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Bagging, Bootstrap, HAR, Realized volatility,
X-DOI: 10.1080/07474938.2010.481554
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481554
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:571-593
Template-Type: ReDIF-Article 1.0
Author-Name: Nesreen Ahmed
Author-X-Name-First: Nesreen
Author-X-Name-Last: Ahmed
Author-Name: Amir Atiya
Author-X-Name-First: Amir
Author-X-Name-Last: Atiya
Author-Name: Neamat El Gayar
Author-X-Name-First: Neamat El
Author-X-Name-Last: Gayar
Author-Name: Hisham El-Shishiny
Author-X-Name-First: Hisham
Author-X-Name-Last: El-Shishiny
Title: An Empirical Comparison of Machine Learning Models for Time Series Forecasting
Abstract:
In this work we present a large scale comparison study for the major
machine learning models for time series forecasting. Specifically, we
apply the models on the monthly M3 time series competition data (around a
thousand time series). There have been very few, if any, large scale
comparison studies for machine learning models for the regression or the
time series forecasting problems, so we hope this study fills this
gap. The models considered are multilayer perceptron, Bayesian neural
networks, radial basis functions, generalized regression neural networks
(also called kernel regression), K-nearest neighbor regression, CART
regression trees, support vector regression, and Gaussian processes. The
study reveals significant differences between the different methods. The
best two methods turned out to be the multilayer perceptron and the
Gaussian process regression. In addition to model comparisons, we have
tested different preprocessing methods and have shown that they have
different impacts on the performance.
Journal: Econometric Reviews
Pages: 594-621
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Comparison study, Gaussian process regression, Machine learning models, Neural network forecasting, Support vector regression,
X-DOI: 10.1080/07474938.2010.481556
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481556
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:594-621
Template-Type: ReDIF-Article 1.0
Author-Name: Philip Yu
Author-X-Name-First: Philip
Author-X-Name-Last: Yu
Author-Name: Wai Keung Li
Author-X-Name-First: Wai Keung
Author-X-Name-Last: Li
Author-Name: Shusong Jin
Author-X-Name-First: Shusong
Author-X-Name-Last: Jin
Title: On Some Models for Value-At-Risk
Abstract:
The idea of statistical learning can be applied in financial risk
management. In recent years, value-at-risk (VaR) has become the standard
tool for market risk measurement and management. For better VaR
estimation, Engle and Manganelli (2004) introduced the conditional
autoregressive value-at-risk (CAViaR) model to estimate the VaR directly
by quantile regression. To entertain the nonlinearity and structural
change in the VaR, we extend the CAViaR idea using two approaches: the
threshold GARCH (TGARCH) and the mixture-GARCH models. The estimation
method of these models are proposed. Our models should possess all the
advantages of the CAViaR model and enhance the nonlinear structure. The
methods are applied to the S&P500, Hang Seng, Nikkei and Nasdaq indices to
illustrate our models.
Journal: Econometric Reviews
Pages: 622-641
Issue: 5-6
Volume: 29
Year: 2010
Keywords: GARCH model, Mixtures, Threshold models, Value-at-risk,
X-DOI: 10.1080/07474938.2010.481972
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481972
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:622-641
Template-Type: ReDIF-Article 1.0
Author-Name: Alexandre Carvalho
Author-X-Name-First: Alexandre
Author-X-Name-Last: Carvalho
Author-Name: Georgios Skoulakis
Author-X-Name-First: Georgios
Author-X-Name-Last: Skoulakis
Title: Time Series Mixtures of Generalized t Experts: ML Estimation and an Application to Stock Return Density Forecasting
Abstract:
We propose and analyze a new nonlinear time series model based on local
mixtures of linear regressions, referred to as experts, with thick-tailed
disturbances. The mean function of each expert is an affine function of
covariates that may include lags of the dependent variable and/or lags of
external predictors. The mixing of the experts is determined by a latent
variable, the distribution of which depends on the same covariates used in
the regressions. The expert error terms are assumed to follow the
generalized t distribution, a rather flexible parametric form encompassing
the standard t and normal distributions as special cases and allowing
separate modeling of scale and kurtosis. We show consistency and
asymptotic normality of the maximum likelihood estimator, for correctly
specified and for misspecified models, and provide Monte Carlo evidence on
the performance of standard model selection criteria in selecting the
number of experts. We further employ the model to obtain density forecasts
for daily stock returns and find evidence to support the model.
Journal: Econometric Reviews
Pages: 642-687
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Conditional density forecast, Generalized t distribution, Heavy tail distributions, Maximum likelihood estimation, Mixtures-of-experts, Nonlinear time series,
X-DOI: 10.1080/07474938.2010.481987
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481987
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:642-687
Template-Type: ReDIF-Article 1.0
Author-Name: Georgi Nalbantov
Author-X-Name-First: Georgi
Author-X-Name-Last: Nalbantov
Author-Name: Philip Hans Franses
Author-X-Name-First: Philip Hans
Author-X-Name-Last: Franses
Author-Name: Patrick Groenen
Author-X-Name-First: Patrick
Author-X-Name-Last: Groenen
Author-Name: Jan Bioch
Author-X-Name-First: Jan
Author-X-Name-Last: Bioch
Title: Estimating the Market Share Attraction Model using Support Vector Regressions
Abstract:
We propose to estimate the parameters of the Market Share Attraction
Model (Cooper and Nakanishi, 1988; Fok and Franses, 2004) in a novel way
by using a nonparametric technique for function estimation called Support
Vector Regressions (SVR) (Smola, 1996; Vapnik, 1995). Traditionally, the
parameters of the Market Share Attraction Model are estimated via a
Maximum Likelihood (ML) procedure, assuming that the data are drawn from a
conditional Gaussian distribution. However, if the distribution is
unknown, Ordinary Least Squares (OLS) estimation may seriously fail
(Vapnik, 1982). One way to tackle this problem is to introduce a linear
loss function over the errors and a penalty on the magnitude of model
coefficients. This leads to qualities such as robustness to outliers and
avoidance of the problem of overfitting. This kind of estimation forms the
basis of the SVR technique, which, as we will argue, makes it a good
candidate for estimating the Market Share Attraction Model. We apply the
SVR approach to predict (the evolution of) the market shares of 36 car
brands simultaneously and report promising results.
Journal: Econometric Reviews
Pages: 688-716
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Marketing, Market share attraction model, Multi-output forecasting, Shrinkage estimators, Support vector regression,
X-DOI: 10.1080/07474938.2010.481989
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481989
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:688-716
Template-Type: ReDIF-Article 1.0
Author-Name: Andre d'Almeida Monteiro
Author-X-Name-First: Andre
Author-X-Name-Last: d'Almeida Monteiro
Title: Estimating Interest Rate Curves by Support Vector Regression
Abstract:
A model that seeks to estimate an interest rate curve should have two
desirable capabilities in addition to the usual characteristics required
from any function-estimation model: it should incorporate the bid-ask
spreads of the securities from which the curve is extracted and restrict
the curve shape. The goal of this article is to estimate interest rate
curves by using Support Vector Regression (SVR), a method derived from the
Statistical Learning Theory developed by Vapnik (1995). The motivation is
that SVR features these extra capabilities at a low estimation cost. The
SVR is specified by a loss function, a kernel function and a smoothing
parameter. The SVR is applied to the daily U.S. dollar interest rate swap
curves from 1997 to 2001. As expected from a priori and sensitivity
analyses, the SVR equipped with the kernel generating a spline with an
infinite number of nodes was the best-performing SVR. Compared with other
models, this SVR achieved the best cross-validation interpolation performance in
controlling the bias-variance trade-off and generating the lowest error
considering the desired accuracy fixed by the bid-ask spreads.
Journal: Econometric Reviews
Pages: 717-753
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Bid-ask spread, Interest rate curves, Interest rate swaps, Support Vector Regression,
X-DOI: 10.1080/07474938.2010.481998
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.481998
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:717-753
Template-Type: ReDIF-Article 1.0
Author-Name: William Rea
Author-X-Name-First: William
Author-X-Name-Last: Rea
Author-Name: Marco Reale
Author-X-Name-First: Marco
Author-X-Name-Last: Reale
Author-Name: Carmela Cappelli
Author-X-Name-First: Carmela
Author-X-Name-Last: Cappelli
Author-Name: Jennifer Brown
Author-X-Name-First: Jennifer
Author-X-Name-Last: Brown
Title: Identification of Changes in Mean with Regression Trees: An Application to Market Research
Abstract:
In this article we present a computationally efficient method for finding
multiple structural breaks at unknown dates based on regression trees. We
outline the procedure and present the results of a simulation study to
assess the performance of the method and to compare it with the procedure
proposed by Bai and Perron. We find that the tree-based method performs
well in long series, which are impractical to analyze with current methods. We
apply these methods plus the CUSUM test to the market share of Crest
toothpaste between 1958 and 1963.
Journal: Econometric Reviews
Pages: 754-777
Issue: 5-6
Volume: 29
Year: 2010
Keywords: Identification of multiple structural breaks at unknown times, Time series analysis,
X-DOI: 10.1080/07474938.2010.482001
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2010.482001
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:29:y:2010:i:5-6:p:754-777
Template-Type: ReDIF-Article 1.0
Author-Name: Francesco Bravo
Author-X-Name-First: Francesco
Author-X-Name-Last: Bravo
Author-Name: David Jacho-Chavez
Author-X-Name-First: David
Author-X-Name-Last: Jacho-Chavez
Title: Empirical Likelihood for Efficient Semiparametric Average Treatment Effects
Abstract:
This article considers empirical likelihood in the context of efficient
semiparametric estimators of average treatment effects. It shows that the
empirical likelihood ratio converges to a nonstandard distribution, and
proposes a corrected test statistic that is asymptotically chi-squared. A
small Monte Carlo experiment suggests that the corrected empirical
likelihood ratio statistic has competitive finite sample properties. The
results of the article are applied to estimate the environmental effect of
the World Trade Organisation.
Journal: Econometric Reviews
Pages: 1-24
Issue: 1
Volume: 30
Year: 2011
Keywords: Empirical likelihood, Local polynomial regression, Plug-in principle, Propensity score, Weighted moment conditions, WTO,
X-DOI: 10.1080/07474938.2011.520547
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.520547
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:1:p:1-24
Template-Type: ReDIF-Article 1.0
Author-Name: Dinghai Xu
Author-X-Name-First: Dinghai
Author-X-Name-Last: Xu
Author-Name: John Knight
Author-X-Name-First: John
Author-X-Name-Last: Knight
Title: Continuous Empirical Characteristic Function Estimation of Mixtures of Normal Parameters
Abstract:
This article develops an efficient method for estimating the discrete
mixtures of normal family based on the continuous empirical characteristic
function (CECF). An iterated estimation procedure based on the closed form
objective distance function is proposed to improve the estimation
efficiency. The results from the Monte Carlo simulation reveal that the
CECF estimator produces good finite sample properties. In particular, it
outperforms the discrete-type methods when maximum likelihood
estimation fails to converge. An empirical example is provided for
illustrative purposes.
Journal: Econometric Reviews
Pages: 25-50
Issue: 1
Volume: 30
Year: 2011
Keywords: Empirical characteristic function, Mixtures of normal,
X-DOI: 10.1080/07474938.2011.520565
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.520565
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:1:p:25-50
Template-Type: ReDIF-Article 1.0
Author-Name: Nikolaos Kourogenis
Author-X-Name-First: Nikolaos
Author-X-Name-Last: Kourogenis
Author-Name: Nikitas Pittis
Author-X-Name-First: Nikitas
Author-X-Name-Last: Pittis
Title: Mixing Conditions, Central Limit Theorems, and Invariance Principles: A Survey of the Literature with Some New Results on Heteroscedastic Sequences
Abstract:
This article is a survey of the main results on the central limit theorem
(CLT) and its invariance principle (IP) for mixing sequences that have
been obtained in the probabilistic literature in the last fifty years or
so with a view towards econometric applications. Each of these theorems
specifies a set of moment, dependence, and heterogeneity conditions on the
underlying sequence that ensures the validity of the CLT and IP. Special
emphasis is placed on the case in which the underlying sequence has just
barely infinite variance, since this case is relevant to econometric
applications that involve high-frequency financial data. Moreover, two new
results on IPs that apply to heteroscedastic sequences are obtained. The
first IP applies to sequences whose variances evolve over time in a
polynomial-like fashion, whereas the second IP concerns sequences that
experience a single variance break at some point within the sample.
Journal: Econometric Reviews
Pages: 88-108
Issue: 1
Volume: 30
Year: 2011
Keywords: Central limit theorem, Invariance Principle, Mixing, Trending variances, Variance break,
X-DOI: 10.1080/07474938.2011.520569
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.520569
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:1:p:88-108
Template-Type: ReDIF-Article 1.0
Author-Name: Michael Lechner
Author-X-Name-First: Michael
Author-X-Name-Last: Lechner
Title: The Relation of Different Concepts of Causality Used in Time Series and Microeconometrics
Abstract:
Granger and Sims noncausality (GSNC), a concept frequently applied in
time series econometrics, is compared to noncausality based on concepts
popular in microeconometrics, program evaluation, and epidemiology
literature (potential outcome noncausality or PONC). GSNC is defined as a
set of restrictions on joint distributions of random variables with
observable sample counterparts, whereas PONC combines restrictions on
partially unobservable variables (potential outcomes) with different
identifying assumptions that relate potential outcome variables to their
observable counterparts. Based on Robins' dynamic model of potential
outcomes, we find that in general neither concept implies the
other without further (untestable) assumptions. However, the identifying
assumptions associated with the sequential selection of the observables
link these concepts such that GSNC implies PONC, and vice versa.
Journal: Econometric Reviews
Pages: 109-127
Issue: 1
Volume: 30
Year: 2011
Keywords: Dynamic treatments, Granger causality, Potential outcome model, Rubin causality, Robins causality, Sims causality,
X-DOI: 10.1080/07474938.2011.520571
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.520571
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:1:p:109-127
Template-Type: ReDIF-Article 1.0
Author-Name: Theis Lange
Author-X-Name-First: Theis
Author-X-Name-Last: Lange
Author-Name: Anders Rahbek
Author-X-Name-First: Anders
Author-X-Name-Last: Rahbek
Author-Name: Søren Tolver Jensen
Author-X-Name-First: Søren Tolver
Author-X-Name-Last: Jensen
Title: Estimation and Asymptotic Inference in the AR-ARCH Model
Abstract:
This article studies asymptotic properties of the quasi-maximum
likelihood estimator (QMLE) for the parameters in the autoregressive (AR)
model with autoregressive conditional heteroskedastic (ARCH) errors. A
modified QMLE (MQMLE) is also studied. This estimator is based on
truncation of individual terms of the likelihood function and is related
to the recent so-called self-weighted QMLE in Ling (2007b). We show that
the MQMLE is asymptotically normal irrespective of the existence of
finite moments, as geometric ergodicity alone suffices. Moreover, our
included simulations show that the MQMLE is remarkably well-behaved in
small samples. On the other hand, the ordinary QMLE, as is well-known,
requires finite fourth order moments for asymptotic normality. But based
on our considerations and simulations, we conjecture that in fact only
geometric ergodicity and finite second order moments are needed for the
QMLE to be asymptotically normal. Finally, geometric ergodicity for
AR-ARCH processes is shown to hold under mild and classic conditions on
the AR and ARCH processes.
Journal: Econometric Reviews
Pages: 129-153
Issue: 2
Volume: 30
Year: 2011
Keywords: ARCH, Asymptotic theory, Geometric ergodicity, Modified QMLE, QMLE,
X-DOI: 10.1080/07474938.2011.534031
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.534031
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:2:p:129-153
Template-Type: ReDIF-Article 1.0
Author-Name: Gabriel Montes-Rojas
Author-X-Name-First: Gabriel
Author-X-Name-Last: Montes-Rojas
Title: Robust Misspecification Tests for the Heckman's Two-Step Estimator
Abstract:
This article constructs and evaluates Lagrange multiplier (LM) and
Neyman's C(α) tests based on bivariate Edgeworth series expansions
for the consistency of the Heckman's two-step estimator in sample
selection models, that is, for marginal normality and linearity of the
conditional expectation of the error terms. The proposed tests are robust
to local misspecification in nuisance distributional parameters. Monte
Carlo results show that testing marginal normality and linearity of the
conditional expectations separately have a better size performance than
testing bivariate normality. Moreover, the robust variants of the tests
have better empirical size than nonrobust tests, which determines that
these tests can be successfully applied to detect specific departures from
the null model of bivariate normality. Finally, the tests are applied to
women's labor supply data.
Journal: Econometric Reviews
Pages: 154-172
Issue: 2
Volume: 30
Year: 2011
Keywords: Heckman's two-step, LM tests, Neyman's C(α) tests,
X-DOI: 10.1080/07474938.2011.534035
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.534035
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:2:p:154-172
Template-Type: ReDIF-Article 1.0
Author-Name: Qingyan Shang
Author-X-Name-First: Qingyan
Author-X-Name-Last: Shang
Author-Name: Lung-fei Lee
Author-X-Name-First: Lung-fei
Author-X-Name-Last: Lee
Title: Two-Step Estimation of Endogenous and Exogenous Group Effects
Abstract:
In this article, we propose a two-step method to identify and estimate
endogenous and exogenous social interactions in the Manski (1993) and
Brock and Durlauf's (2001a,b) discrete choice model with unobserved group
variables. Taking advantage of social groups with large group sizes, we
first estimate a probit model with group fixed-effects, and then use the
instrumental variables method to estimate endogenous and exogenous group
effects via the group fixed-effect estimates. Our method is
computationally simple. The method is applicable not only to the case of
a single equilibrium but also to the case of multiple equilibria, without the need
to specify an (arbitrary) equilibrium selection mechanism. The article
provides a Monte Carlo study on the finite sample performance of such
estimators.
Journal: Econometric Reviews
Pages: 173-207
Issue: 2
Volume: 30
Year: 2011
Keywords: Correlated effect, Discrete choice, Endogenous effect, Exogenous effect, Instrumental variables, Large size group, Monte Carlo, Social interaction, Two-step estimator,
X-DOI: 10.1080/07474938.2011.534039
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.534039
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:2:p:173-207
Template-Type: ReDIF-Article 1.0
Author-Name: Loukia Meligkotsidou
Author-X-Name-First: Loukia
Author-X-Name-Last: Meligkotsidou
Author-Name: Elias Tzavalis
Author-X-Name-First: Elias
Author-X-Name-Last: Tzavalis
Author-Name: Ioannis Vrontos
Author-X-Name-First: Ioannis
Author-X-Name-Last: Vrontos
Title: A Bayesian Analysis of Unit Roots and Structural Breaks in the Level, Trend, and Error Variance of Autoregressive Models of Economic Series
Abstract:
In this article, a Bayesian approach is suggested to compare unit root
models with stationary autoregressive models when the level, the trend,
and the error variance are subject to structural changes (known as breaks)
of an unknown date. Ignoring structural breaks in the error variance may
be responsible for not rejecting the unit root hypothesis, even if
allowance is made in the inferential procedures for breaks in the mean.
The article utilizes analytic and Monte Carlo integration techniques for
calculating the marginal likelihoods of the models under consideration, in
order to compute the posterior model probabilities. The performance of the
method is assessed by simulation experiments. Some empirical applications
of the method are conducted with the aim of investigating whether it can detect
structural breaks in financial series, especially with changes in the
error variance.
Journal: Econometric Reviews
Pages: 208-249
Issue: 2
Volume: 30
Year: 2011
Keywords: Autoregressive models, Bayesian inference, Model comparison, Structural breaks, Unit roots,
X-DOI: 10.1080/07474938.2011.534046
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.534046
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:2:p:208-249
Template-Type: ReDIF-Article 1.0
Author-Name: Fuchun Li
Author-X-Name-First: Fuchun
Author-X-Name-Last: Li
Author-Name: Greg Tkacz
Author-X-Name-First: Greg
Author-X-Name-Last: Tkacz
Title: A Consistent Test for Multivariate Conditional Distributions
Abstract:
We propose a new test for a multivariate parametric conditional
distribution of a vector of variables yt given a conditioning vector xt.
The proposed test is shown to have an asymptotic normal distribution under
the null hypothesis, while being consistent for all fixed alternatives,
and having nontrivial power against a sequence of local alternatives.
Monte Carlo simulations show that our test has reasonable size and good
power for both univariate and multivariate models, even for highly
persistent dependent data with sample sizes often encountered in empirical
finance.
Journal: Econometric Reviews
Pages: 251-273
Issue: 3
Volume: 30
Year: 2011
Keywords: Absolutely regular process, Consistent test, Degenerate U-statistics, Stochastic differential equation,
X-DOI: 10.1080/07474938.2011.553518
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.553518
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:3:p:251-273
Template-Type: ReDIF-Article 1.0
Author-Name: Rehim Kılıç
Author-X-Name-First: Rehim
Author-X-Name-Last: Kılıç
Title: Testing for a unit root in a stationary ESTAR process
Abstract:
This article develops a statistic for testing the null of a linear unit
root process against the alternative of a stationary exponential smooth
transition autoregressive model. The asymptotic distribution of the test
is shown to be nonstandard but nuisance parameter-free and hence critical
values are obtained by simulations. Simulations show that the proposed
statistic has considerable power under various data generating scenarios.
Applications to real exchange rates also illustrate the ability of our
test to reject the null of a unit root when some of the alternative tests do
not.
Journal: Econometric Reviews
Pages: 274-302
Issue: 3
Volume: 30
Year: 2011
Keywords: ESTAR model, Nonlinearity, Unit root,
X-DOI: 10.1080/07474938.2011.553511
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.553511
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:3:p:274-302
Template-Type: ReDIF-Article 1.0
Author-Name: Emma Iglesias
Author-X-Name-First: Emma
Author-X-Name-Last: Iglesias
Author-Name: Garry Phillips
Author-X-Name-First: Garry
Author-X-Name-Last: Phillips
Title: Small Sample Estimation Bias in GARCH Models with Any Number of Exogenous Variables in the Mean Equation
Abstract:
In this article we show how bias approximations for the quasi maximum
likelihood estimators of the parameters in Generalized Autoregressive
Conditional Heteroskedastic (GARCH)(p, q) models change when any number of
exogenous variables are included in the mean equation. The approximate
biases are shown to vary in an additive and proportional way in relation
to the number of exogenous variables, and they do not depend on the
moments of the regressors under the correct specification of the model.
This suggests a rule of thumb in testing for misspecification in GARCH
models. We also extend the theoretical bias approximations given in Linton
(1997) for the GARCH(1, 1). Because the expressions are not in closed
form, we concentrate in detail, and for simplicity of interpretation, on
the ARCH(1) model. At each stage, we check our theoretical results by
simulation and generally, we find that the approximations are quite
accurate for sample sizes of at least 50. We find that the biases are not
trivial in some circumstances and we discuss how the bias approximations
may be used, in practice, to reduce the bias. We also carry out
simulations for the GARCH(1,1) model and show that the biases change as
predicted by the approximations when the mean equation is augmented.
Finally, we illustrate the usefulness of our approach for U.S. monthly
inflation rates.
Journal: Econometric Reviews
Pages: 303-336
Issue: 3
Volume: 30
Year: 2011
Keywords: Bias correction, GARCH, Quasi maximum likelihood,
X-DOI: 10.1080/07474930903562551
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903562551
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:3:p:303-336
Template-Type: ReDIF-Article 1.0
Author-Name: Scott Atkinson
Author-X-Name-First: Scott
Author-X-Name-Last: Atkinson
Author-Name: Christopher Cornwell
Author-X-Name-First: Christopher
Author-X-Name-Last: Cornwell
Title: Estimation of Allocative Inefficiency and Productivity Growth with Dynamic Adjustment Costs
Abstract:
A substantial literature has been generated on the estimation of
allocative and technical inefficiency using static production, cost,
profit, and distance functions. We develop a dynamic shadow distance
system that integrates dynamic adjustment costs into a long-run shadow
cost-minimization problem, which allows us to distinguish static
allocative distortions from short-run inefficiencies that arise due to
period-to-period adjustment costs. The set of estimating equations is
comprised of the first-order conditions from the short-run shadow
cost-minimization problem for the variable shadow input quantities, a set
of Euler equations derived from subsequent shadow cost minimization with
respect to the quasi-fixed inputs, and the input distance function,
expressed in terms of shadow quantities. This system nests within it the
static model with zero adjustment costs. Using panel data on U.S. electric
utilities, we contrast the results of static and dynamic shadow distance
systems. First, the zero-adjustment-cost restriction is strongly rejected.
Second, we find that adjustment costs represent about 0.42% of total cost,
and about 1.26% of capital costs. Third, while both models reveal that
labor is not utilized efficiently, the dynamic model indicates a longer
period of over-use and less variance over time in the degree of
inefficiency. With the dynamic model, productivity growth is larger but
more stable.
Journal: Econometric Reviews
Pages: 337-357
Issue: 3
Volume: 30
Year: 2011
Keywords: Allocative inefficiency, Dynamic estimation, Euler equations, Productivity change, Technical change, Technical inefficiency,
X-DOI: 10.1080/07474930903451581
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474930903451581
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:3:p:337-357
Template-Type: ReDIF-Article 1.0
Author-Name: Kulan Ranasinghe
Author-X-Name-First: Kulan
Author-X-Name-Last: Ranasinghe
Author-Name: Mervyn J. Silvapulle
Author-X-Name-First: Mervyn J.
Author-X-Name-Last: Silvapulle
Title: Estimation Under Inequality Constraints: Semiparametric Estimation of Conditional Duration Models
Abstract:
This article proposes a semiparametric estimator of the
parameter in a conditional duration model when there are inequality
constraints on some parameters and the error distribution may be unknown.
We propose to estimate the parameter by a constrained version of an
unrestricted semiparametrically efficient estimator. The main requirement
for applying this method is that the initial unrestricted estimator
converges in distribution. Apart from this, additional regularity
conditions on the data generating process or the likelihood function are
not required. Hence the method is applicable to a broad range of models
where the parameter space is constrained by inequality constraints, such
as the conditional duration models. In a simulation study involving
conditional duration models, the overall performance of the constrained
estimator was better than its competitors, in terms of mean squared error.
A data example is used to illustrate the method.
Journal: Econometric Reviews
Pages: 359-378
Issue: 4
Volume: 30
Year: 2011
Month: 8
X-DOI: 10.1080/07474938.2011.553537
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553537
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:4:p:359-378
Template-Type: ReDIF-Article 1.0
Author-Name: Nikolay Gospodinov
Author-X-Name-First: Nikolay
Author-X-Name-Last: Gospodinov
Author-Name: Ye Tao
Author-X-Name-First: Ye
Author-X-Name-Last: Tao
Title: Bootstrap Unit Root Tests in Models with GARCH(1,1) Errors
Abstract:
This article proposes a bootstrap unit root test in models
with GARCH(1,1) errors and establishes its asymptotic validity under mild
moment and distributional restrictions. While the proposed bootstrap test
for a unit root shares the power enhancing properties of its asymptotic
counterpart (Ling and Li, 2003), it offers a number of important
advantages. In particular, the bootstrap procedure does not require
explicit estimation of nuisance parameters that enter the distribution of
the test statistic and corrects the substantial size distortions of the
asymptotic test that occur for strongly heteroskedastic processes. The
simulation results demonstrate the excellent finite-sample properties of
the bootstrap unit root test for a wide range of GARCH specifications.
Journal: Econometric Reviews
Pages: 379-405
Issue: 4
Volume: 30
Year: 2011
Month: 8
X-DOI: 10.1080/07474938.2011.553538
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553538
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:4:p:379-405
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Ragusa
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Ragusa
Title: Minimum Divergence, Generalized Empirical Likelihoods, and Higher Order Expansions
Abstract:
This article studies the minimum divergence (MD) class of
estimators for econometric models specified through moment restrictions.
We show that MD estimators can be obtained as solutions to a tractable
lower dimensional optimization problem. This problem is similar to the one
solved by the generalized empirical likelihood estimators of Newey and
Smith (2004), but it is equivalent to it only for a subclass of
divergences. The MD framework provides a coherent testing theory: tests
for overidentification and parametric restrictions in this framework can
be interpreted as semiparametric versions of Pearson-type goodness of fit
tests. The higher order properties of MD estimators are also studied and
it is shown that MD estimators that have the same higher order bias as the
empirical likelihood (EL) estimator also share the same higher order mean
square error and are all higher order efficient. We identify members of
the MD class that are not only higher order efficient, but also, unlike
the EL estimator, well behaved when the moment restrictions are
misspecified.
Journal: Econometric Reviews
Pages: 406-456
Issue: 4
Volume: 30
Year: 2011
Month: 8
X-DOI: 10.1080/07474938.2011.553541
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553541
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:4:p:406-456
Template-Type: ReDIF-Article 1.0
Author-Name: Dale J. Poirier
Author-X-Name-First: Dale J.
Author-X-Name-Last: Poirier
Title: Bayesian Interpretations of Heteroskedastic Consistent Covariance Estimators Using the Informed Bayesian Bootstrap
Abstract:
This article provides Bayesian interpretations for White's
heteroskedastic consistent (HC) covariance estimator, and
various modifications of it, in linear regression models. An informed
Bayesian bootstrap provides a useful framework.
Journal: Econometric Reviews
Pages: 457-468
Issue: 4
Volume: 30
Year: 2011
Month: 8
X-DOI: 10.1080/07474938.2011.553542
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553542
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:4:p:457-468
Template-Type: ReDIF-Article 1.0
Author-Name: Richard Luger
Author-X-Name-First: Richard
Author-X-Name-Last: Luger
Title: Book Review: Introducing Monte Carlo Methods with R
Journal: Econometric Reviews
Pages: 469-474
Issue: 4
Volume: 30
Year: 2011
Month: 8
X-DOI: 10.1080/07474938.2011.553548
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553548
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:4:p:469-474
Template-Type: ReDIF-Article 1.0
Author-Name: Tucker McElroy
Author-X-Name-First: Tucker
Author-X-Name-Last: McElroy
Author-Name: Thomas M. Trimbur
Author-X-Name-First: Thomas M.
Author-X-Name-Last: Trimbur
Title: On the Discretization of Continuous-Time Filters for Nonstationary Stock and Flow Time Series
Abstract:
This article discusses the discretization of continuous-time
filters for application to discrete time series sampled at any fixed
frequency. In this approach, the filter is first set up directly in
continuous-time; since the filter is expressed over a continuous range of
lags, we also refer to them as continuous-lag filters. The second step is
to discretize the filter itself. This approach applies to different
problems in signal extraction, including trend or business cycle analysis,
and the method allows for coherent design of discrete filters for observed
data sampled as a stock or a flow, for nonstationary data with stochastic
trend, and for different sampling frequencies. We derive explicit formulas
for the mean squared error (MSE) optimal discretization filters. We also
discuss the problem of optimal interpolation for nonstationary processes -
namely, how to estimate the values of a process and its components at
arbitrary times in between the sampling times. A number of illustrations
of discrete filter coefficient calculations are provided, including the
local level model (LLM) trend filter, the smooth trend model (STM) trend
filter, and the Band Pass (BP) filter. The essential methodology can be
applied to other kinds of trend extraction problems. Finally, we provide
an extended demonstration of the method on CPI flow data measured at
monthly and annual sampling frequencies.
Journal: Econometric Reviews
Pages: 475-513
Issue: 5
Volume: 30
Year: 2011
Month: 10
X-DOI: 10.1080/07474938.2011.553554
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553554
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:5:p:475-513
Template-Type: ReDIF-Article 1.0
Author-Name: David I. Harvey
Author-X-Name-First: David I.
Author-X-Name-Last: Harvey
Author-Name: Stephen J. Leybourne
Author-X-Name-First: Stephen J.
Author-X-Name-Last: Leybourne
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M.
Author-X-Name-Last: Robert Taylor
Title: Testing for Unit Roots and the Impact of Quadratic Trends, with an Application to Relative Primary Commodity Prices
Abstract:
In practice a degree of uncertainty will always exist
concerning what specification to adopt for the deterministic trend
function when running unit root tests. While most macroeconomic time
series appear to display an underlying trend, it is often far from clear
whether this component is best modeled as a simple linear trend (so that
long-run growth rates are constant) or by a more complicated nonlinear
trend function which may, for instance, allow the deterministic trend
component to evolve gradually over time. In this article, we consider the
effects on unit root testing of allowing for a local quadratic trend, a
simple yet very flexible example of the latter. Where a local quadratic
trend is present but not modeled, we show that the quasi-differenced
detrended Dickey-Fuller-type test of Elliott et al. (1996) has both size
and power which tend to zero asymptotically. An extension of the Elliott
et al. (1996) approach to allow for a quadratic trend resolves this
problem but is shown to result in large power losses relative to the
standard detrended test when no quadratic trend is present. We
consequently propose a simple and practical approach to dealing with this
form of uncertainty based on a union of rejections-based decision rule
whereby the unit root is rejected whenever either of the detrended or
quadratic detrended unit root tests rejects. A modification of this basic
strategy is also suggested which further improves on the properties of the
procedure. An application to relative primary commodity price data
highlights the empirical relevance of the methods outlined in this
article. A by-product of our analysis is the development of a test for the
presence of a quadratic trend which is robust to whether the data admit a
unit root.
Journal: Econometric Reviews
Pages: 514-547
Issue: 5
Volume: 30
Year: 2011
Month: 10
X-DOI: 10.1080/07474938.2011.553561
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553561
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:5:p:514-547
Template-Type: ReDIF-Article 1.0
Author-Name: Manabu Asai
Author-X-Name-First: Manabu
Author-X-Name-Last: Asai
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: Alternative Asymmetric Stochastic Volatility Models
Abstract:
The stochastic volatility model usually incorporates
asymmetric effects by introducing the negative correlation between the
innovations in returns and volatility. In this paper, we propose a new
asymmetric stochastic volatility model, based on the leverage and size
effects. The model is a generalization of the exponential GARCH (EGARCH)
model of Nelson (1991). We consider categories for asymmetric effects,
which describe the differences among the asymmetric effect of the EGARCH
model, the threshold effects indicator function of Glosten et al. (1993),
and the negative correlation between the innovations in returns and
volatility. The new model is estimated by the efficient importance
sampling method of Liesenfeld and Richard (2003), and the finite sample
properties of the estimator are investigated using numerical simulations.
Four financial time series are used to estimate the alternative asymmetric
stochastic volatility (SV) models, with empirical asymmetric effects found
to be statistically significant in each case. The empirical results for
S&P 500 and Yen/USD returns indicate that the leverage and size effects
are significant, supporting the general model. For the Tokyo Stock Price Index
(TOPIX) and USD/AUD returns, the size effect is insignificant, favoring
the negative correlation between the innovations in returns and
volatility. We also consider standardized t distribution
for capturing the tail behavior. The results for Yen/USD returns show that
the model is correctly specified, while the results for three other data
sets suggest there is scope for improvement.
Journal: Econometric Reviews
Pages: 548-564
Issue: 5
Volume: 30
Year: 2011
Month: 10
X-DOI: 10.1080/07474938.2011.553156
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553156
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:5:p:548-564
Template-Type: ReDIF-Article 1.0
Author-Name: Andreas C. Drichoutis
Author-X-Name-First: Andreas C.
Author-X-Name-Last: Drichoutis
Author-Name: Rodolfo M. Nayga
Author-X-Name-First: Rodolfo M.
Author-X-Name-Last: Nayga
Title: Marginal Changes in Random Parameters Ordered Response Models with Interaction Terms
Abstract:
Marginal changes of interacted variables and interaction
terms in random parameters ordered response models are calculated
incorrectly in econometric software. We derive the correct formulas for
calculating these marginal changes. In our empirical example, we observe
significant changes not only in the magnitude of the marginal effects but
also in their standard errors, suggesting that the incorrect estimation of
the marginal effects of these variables, as is commonly practiced, can
lead to biased inferences about the findings.
Journal: Econometric Reviews
Pages: 565-576
Issue: 5
Volume: 30
Year: 2011
Month: 10
X-DOI: 10.1080/07474938.2011.553564
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553564
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:5:p:565-576
Template-Type: ReDIF-Article 1.0
Author-Name: Jean-François Richard
Author-X-Name-First: Jean-François
Author-X-Name-Last: Richard
Title: Book Review: Econometric Modeling and Inference
Journal: Econometric Reviews
Pages: 577-581
Issue: 5
Volume: 30
Year: 2011
Month: 10
X-DOI: 10.1080/07474938.2011.553565
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553565
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:5:p:577-581
Template-Type: ReDIF-Article 1.0
Author-Name: George Kapetanios
Author-X-Name-First: George
Author-X-Name-Last: Kapetanios
Author-Name: Yongcheol Shin
Author-X-Name-First: Yongcheol
Author-X-Name-Last: Shin
Title: Testing the Null Hypothesis of Nonstationary Long Memory Against the Alternative Hypothesis of a Nonlinear Ergodic Model
Abstract:
Interest in the interface of nonstationarity and nonlinearity has been
increasing in the econometric literature. This paper provides a formal
method of testing for nonstationary long memory against the alternative of
a particular form of nonlinear ergodic processes; namely, exponential
smooth transition autoregressive processes. In this regard, the current
paper provides a significant generalization of existing unit root tests by
allowing the null hypothesis to encompass a much larger class of
nonstationary processes. The asymptotic theory associated with the
proposed Wald statistic is derived, and Monte Carlo simulation results
confirm that the Wald statistics have reasonably correct size and good
power in small samples. In an application to real interest rates and the
Yen real exchange rates, we find that the tests are able to distinguish
between these competing processes in most cases, supporting the long-run
Purchasing Power Parity (PPP) and Fisher hypotheses. However, there are a few
cases in which long memory and nonlinear ergodic processes display similar
characteristics and are thus confused with each other in small samples.
Journal: Econometric Reviews
Pages: 620-645
Issue: 6
Volume: 30
Year: 2011
Keywords: Long memory I(d) and ESTAR processes, Monte Carlo simulations, Real exchange rates, Real interest rates, The Wald tests,
X-DOI: 10.1080/07474938.2011.553568
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.553568
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:6:p:620-645
Template-Type: ReDIF-Article 1.0
Author-Name: Jose Luis Aznarte
Author-X-Name-First: Jose Luis
Author-X-Name-Last: Aznarte
Author-Name: Jesus Alcala-Fdez
Author-X-Name-First: Jesus
Author-X-Name-Last: Alcala-Fdez
Author-Name: Antonio Arauzo
Author-X-Name-First: Antonio
Author-X-Name-Last: Arauzo
Author-Name: Jose Manuel Benitez
Author-X-Name-First: Jose Manuel
Author-X-Name-Last: Benitez
Title: Fuzzy Autoregressive Rules: Towards Linguistic Time Series Modeling
Abstract:
Fuzzy rule-based models, a key element in soft computing (SC), have
arisen as an alternative for time series analysis and modeling. One
difference with preexisting models is their interpretability in terms of
human language. Their interactions with other components have also
contributed to a huge development in their identification and estimation
procedures. In this article, we present fuzzy rule-based models, their
links with some regime-switching autoregressive models, and how the use of
soft computing concepts can help the practitioner to solve and gain a
deeper insight into a given problem. An example on a realized volatility
series is presented to show the forecasting abilities of a fuzzy
rule-based model.
Journal: Econometric Reviews
Pages: 646-668
Issue: 6
Volume: 30
Year: 2011
Keywords: Fuzzy models, Regime-switching models, Soft computing, Time series, Volatility,
X-DOI: 10.1080/07474938.2011.553569
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.553569
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:6:p:646-668
Template-Type: ReDIF-Article 1.0
Author-Name: Chia-Lin Chang
Author-X-Name-First: Chia-Lin
Author-X-Name-Last: Chang
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Author-Name: Les Oxley
Author-X-Name-First: Les
Author-X-Name-Last: Oxley
Title: Great Expectatrics: Great Papers, Great Journals, Great Econometrics
Abstract:
The article discusses alternative Research Assessment Measures (RAM),
with an emphasis on the Thomson Reuters ISI Web of Science database
(hereafter ISI). Some analysis and comparisons are also made with data
from the SciVerse Scopus database. The various RAM that are calculated
annually or updated daily are defined and analyzed, including the classic
2-year impact factor (2YIF), 2YIF without journal self-citations (2YIF*),
5-year impact factor (5YIF), Immediacy (or zero-year impact factor
(0YIF)), Impact Factor Inflation (IFI), Self-citation Threshold Approval
Rating (STAR), Eigenfactor score, Article Influence, C3PO (Citation
Performance Per Paper Online), h-index, Zinfluence, and PI-BETA (Papers
Ignored - By Even The Authors). The RAM are analyzed for 10 leading
econometrics journals and 4 leading statistics journals. The application
to econometrics can be used as a template for other areas in economics,
for other scientific disciplines, and as a benchmark for newer journals in
a range of disciplines. In addition to evaluating high-quality research in
leading econometrics journals, the paper compares econometrics and
statistics using the alternative RAM, highlights the similarities and
differences among the alternative RAM, and finds that several RAM capture
similar performance characteristics for the leading econometrics and
statistics journals, while the new PI-BETA criterion is not highly
correlated with any of the other RAM and hence conveys additional
information. The paper also highlights major research areas in leading
econometrics journals, discusses some likely future uses of RAM, and shows
that the harmonic mean of 13 RAM provides more robust journal rankings
than relying solely on 2YIF.
Journal: Econometric Reviews
Pages: 583-619
Issue: 6
Volume: 30
Year: 2011
Keywords: Article influence, Cited article influence, C3PO, Eigenfactor, IFI, Immediacy, Impact factors, h-Index, PI-BETA, STAR, Research assessment measures, Zinfluence,
X-DOI: 10.1080/07474938.2011.586614
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.586614
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:6:p:583-619
Template-Type: ReDIF-Article 1.0
Author-Name: Davide Raggi
Author-X-Name-First: Davide
Author-X-Name-Last: Raggi
Author-Name: Silvano Bordignon
Author-X-Name-First: Silvano
Author-X-Name-Last: Bordignon
Title: Volatility, Jumps, and Predictability of Returns: A Sequential Analysis
Abstract:
In this article we propose a Monte Carlo algorithm for sequential
parameter learning for a stochastic volatility model with leverage,
nonconstant conditional mean and jumps. We are interested in estimating
the time invariant parameters and the nonobservable dynamics involved in
the model. Our simple but effective idea relies on the auxiliary particle
filter algorithm mixed together with the Markov Chain Monte Carlo (MCMC)
methodology. Adding an MCMC step to the auxiliary particle filter prevents
numerical degeneracies in the sequential algorithm and allows sequential
evaluation of the fixed parameters and the latent processes. Empirical
evaluation on simulated and real data is presented to assess the
performance of the algorithm. A numerical comparison with a full MCMC
procedure is also provided. We also extend our methodology to
superposition models in which volatility is obtained by a linear
combination of independent processes.
Journal: Econometric Reviews
Pages: 669-695
Issue: 6
Volume: 30
Year: 2011
Keywords: Auxiliary particle filters, Bayesian estimation, Leverage, MCMC, Return's predictability, Stochastic volatility with jumps,
X-DOI: 10.1080/07474938.2011.553570
File-URL: http://www.tandfonline.com/doi/abs/10.1080/07474938.2011.553570
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:30:y:2011:i:6:p:669-695
Template-Type: ReDIF-Article 1.0
Author-Name: Vasilis Sarafidis
Author-X-Name-First: Vasilis
Author-X-Name-Last: Sarafidis
Author-Name: Tom Wansbeek
Author-X-Name-First: Tom
Author-X-Name-Last: Wansbeek
Title: Cross-Sectional Dependence in Panel Data Analysis
Abstract:
This article provides an overview of the existing literature on panel
data models with error cross-sectional dependence (CSD). We distinguish
between weak and strong CSD and link these concepts to the spatial and
factor structure approaches. We consider estimation under strong and weak
exogeneity of the regressors for both the fixed T and
large T cases. Available tests for CSD and methods for
determining the number of factors are discussed in detail. The
finite-sample properties of some estimators and statistics are
investigated using Monte Carlo experiments.
Journal: Econometric Reviews
Pages: 483-531
Issue: 5
Volume: 31
Year: 2012
Month: 9
X-DOI: 10.1080/07474938.2011.611458
File-URL: http://hdl.handle.net/10.1080/07474938.2011.611458
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:5:p:483-531
Template-Type: ReDIF-Article 1.0
Author-Name: Emma M. Iglesias
Author-X-Name-First: Emma M.
Author-X-Name-Last: Iglesias
Author-Name: Garry D. A. Phillips
Author-X-Name-First: Garry D. A.
Author-X-Name-Last: Phillips
Title: Estimation, Testing, and Finite Sample Properties of Quasi-Maximum Likelihood Estimators in GARCH-M Models
Abstract:
We provide three new results concerning quasi-maximum likelihood (QML)
estimators in generalized autoregressive conditional heteroskedastic in
mean (GARCH-M) models. We first show that, depending on the functional
form that we impose in the mean equation, the properties of the model may
change and the conditional variance parameter space may be restricted, in
contrast to the theory of traditional GARCH processes. Second, we also
present a new test for GARCH effects in the GARCH-M context which is
simpler to implement than alternative procedures such as in Beg et al.
(2001). We propose a new way of dealing with parameters that are not
identified by creating composites of parameters that are identified.
Third, the finite sample properties of QML estimators are explored in a
restricted ARCH-M model and bias and variance approximations are found
which show that the larger the volatility of the process, the better the
variance parameters are estimated. The invariance properties that
Lumsdaine (1995) proved for the traditional GARCH are shown not to hold in
the GARCH-M. For those researchers who choose not to rely on the first
order asymptotic approximation of our proposed test statistic, we also
show how our bias expressions can be used to bias correct the QML
estimates with a view to improving the finite sample performance of the
test. Finally, we show how our new proposed test works in practice in an
empirical economic application.
Journal: Econometric Reviews
Pages: 532-557
Issue: 5
Volume: 31
Year: 2012
Month: 9
X-DOI: 10.1080/07474938.2011.608007
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608007
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:5:p:532-557
Template-Type: ReDIF-Article 1.0
Author-Name: Roseline Bilina
Author-X-Name-First: Roseline
Author-X-Name-Last: Bilina
Author-Name: Steve Lawford
Author-X-Name-First: Steve
Author-X-Name-Last: Lawford
Title: Python for Unified Research in Econometrics and Statistics
Abstract:
Python is a powerful high-level open source programming language that is
available for multiple platforms. It supports object-oriented programming
and has recently become a serious alternative to low-level compiled
languages such as C++. It is easy to learn and use, and is
recognized for very fast development times, which makes it suitable for
rapid software prototyping as well as teaching purposes. We motivate the
use of Python and its free extension modules for high performance
stand-alone applications in econometrics and statistics, and as a tool for
gluing different applications together. (It is in this sense that Python
forms a “unified” environment for statistical research.) We
give details on the core language features, which will enable a user to
immediately begin work, and then provide practical examples of advanced
uses of Python. Finally, we compare the run-time performance of extended
Python against a number of commonly used statistical packages and
programming environments. Supplemental materials are available for
this article. Go to the publisher's online edition of Econometric
Reviews to view the free supplemental file.
Journal: Econometric Reviews
Pages: 558-591
Issue: 5
Volume: 31
Year: 2012
Month: 9
X-DOI: 10.1080/07474938.2011.553573
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553573
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:5:p:558-591
Template-Type: ReDIF-Article 1.0
Author-Name: Theodore Alexandrov
Author-X-Name-First: Theodore
Author-X-Name-Last: Alexandrov
Author-Name: Silvia Bianconcini
Author-X-Name-First: Silvia
Author-X-Name-Last: Bianconcini
Author-Name: Estela Bee Dagum
Author-X-Name-First: Estela Bee
Author-X-Name-Last: Dagum
Author-Name: Peter Maass
Author-X-Name-First: Peter
Author-X-Name-Last: Maass
Author-Name: Tucker S. McElroy
Author-X-Name-First: Tucker S.
Author-X-Name-Last: McElroy
Title: A Review of Some Modern Approaches to the Problem of Trend Extraction
Abstract:
This article presents a review of some modern approaches to trend
extraction for one-dimensional time series, which is one of the major
tasks of time series analysis. The trend of a time series is usually
defined as a smooth additive component which contains information about
the time series global change, and we discuss this and other definitions
of the trend. We do not aim to review all the novel approaches, but rather
to observe the problem from different viewpoints and from different areas
of expertise. The article contributes to understanding the concept of a
trend and the problem of its extraction. We present an overview of
advantages and disadvantages of the approaches under consideration, which
are: the model-based approach (MBA), nonparametric linear filtering,
singular spectrum analysis (SSA), and wavelets. The MBA assumes the
specification of a stochastic time series model, which is usually either
an autoregressive integrated moving average (ARIMA) model or a state space
model. The nonparametric filtering methods do not require specification of
a model and are popular because of their simplicity in application. We
discuss the Henderson, LOESS, and Hodrick--Prescott filters and their
versions derived by exploiting the Reproducing Kernel Hilbert Space
methodology. In addition to these prominent approaches, we consider SSA
and wavelet methods. SSA is widespread in the geosciences; its algorithm
is similar to that of principal components analysis, but SSA is applied to
time series. Wavelet methods are the de facto standard for denoising in
signal processing, and recent work has revealed their potential in trend
analysis.
Journal: Econometric Reviews
Pages: 593-624
Issue: 6
Volume: 31
Year: 2012
Month: 11
X-DOI: 10.1080/07474938.2011.608032
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608032
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:6:p:593-624
Template-Type: ReDIF-Article 1.0
Author-Name: Yi-Ting Chen
Author-X-Name-First: Yi-Ting
Author-X-Name-Last: Chen
Author-Name: Hung-Jen Wang
Author-X-Name-First: Hung-Jen
Author-X-Name-Last: Wang
Title: Centered-Residuals-Based Moment Estimator and Test for Stochastic Frontier Models
Abstract:
The composed error of a stochastic frontier (SF) model consists of two
random variables, and the identification of the model relies heavily on
the distribution assumptions for each of these variables. While the
literature has put much effort into applying various SF models to a wide
range of empirical problems, little has been done to test the distribution
assumptions of these two variables. In this article, by exploiting the
specification structures of the SF model, we propose a
centered-residuals-based method of moments which can be easily and
flexibly applied to testing the distribution assumptions on both of the
random variables and to estimating the model parameters. A Monte Carlo
simulation is conducted to assess the performance of the proposed method.
We also provide two empirical examples to demonstrate the use of the
proposed estimator and test using real data.
Journal: Econometric Reviews
Pages: 625-653
Issue: 6
Volume: 31
Year: 2012
Month: 11
X-DOI: 10.1080/07474938.2011.608037
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608037
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:6:p:625-653
Template-Type: ReDIF-Article 1.0
Author-Name: Hans Manner
Author-X-Name-First: Hans
Author-X-Name-Last: Manner
Author-Name: Olga Reznikova
Author-X-Name-First: Olga
Author-X-Name-Last: Reznikova
Title: A Survey on Time-Varying Copulas: Specification, Simulations, and Application
Abstract:
The aim of this article is to bring together different specifications for
copula models with time-varying dependence structure. Copula models are
widely used in financial econometrics and risk management. They are
considered to be a competitive alternative to the Gaussian dependence
structure. The dynamic structure of the dependence between the data can be
modeled by allowing either the copula function or the dependence parameter
to be time-varying. First, we give a brief description of eight different
models, among which there are fully parametric, semiparametric, and
adaptive methods. The purpose of this study is to compare the
applicability of each particular model in different cases. We conduct a
simulation study to show the performance for model selection, to compare
the model fit for different setups and to study the ability of the models
to estimate the (latent) time-varying dependence parameter. Finally, we
provide an illustration by applying the competing models on the same
financial dataset and compare their performance by means of Value-at-Risk.
Journal: Econometric Reviews
Pages: 654-687
Issue: 6
Volume: 31
Year: 2012
Month: 11
X-DOI: 10.1080/07474938.2011.608042
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608042
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:6:p:654-687
Template-Type: ReDIF-Article 1.0
Author-Name: E. Maasoumi
Author-X-Name-First: E.
Author-X-Name-Last: Maasoumi
Author-Name: G. Yalonetzky
Author-X-Name-First: G.
Author-X-Name-Last: Yalonetzky
Title: Introduction to Robustness in Multidimensional Wellbeing Analysis
Journal: Econometric Reviews
Pages: 1-6
Issue: 1
Volume: 32
Year: 2013
Month: 1
X-DOI: 10.1080/07474938.2012.690650
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690650
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:1:p:1-6
Template-Type: ReDIF-Article 1.0
Author-Name: Koen Decancq
Author-X-Name-First: Koen
Author-X-Name-Last: Decancq
Author-Name: María Ana Lugo
Author-X-Name-First: María Ana
Author-X-Name-Last: Lugo
Title: Weights in Multidimensional Indices of Wellbeing: An Overview
Abstract:
Multidimensional indices are becoming increasingly important instruments
to assess the wellbeing of societies. They move beyond the focus on a
single indicator and yet they are easy to present and communicate. A
crucial step in the construction of a multidimensional index of wellbeing
is the selection of the relative weights for the different dimensions. The
aim of this article is to study the role of these weights and to
critically survey eight different approaches to set them. We categorize
the approaches into three classes: data-driven, normative, and hybrid
weighting, and compare their respective advantages and drawbacks.
Journal: Econometric Reviews
Pages: 7-34
Issue: 1
Volume: 32
Year: 2013
Month: 1
X-DOI: 10.1080/07474938.2012.690641
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690641
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:1:p:7-34
Template-Type: ReDIF-Article 1.0
Author-Name: James E. Foster
Author-X-Name-First: James E.
Author-X-Name-Last: Foster
Author-Name: Mark McGillivray
Author-X-Name-First: Mark
Author-X-Name-Last: McGillivray
Author-Name: Suman Seth
Author-X-Name-First: Suman
Author-X-Name-Last: Seth
Title: Composite Indices: Rank Robustness, Statistical Association, and Redundancy
Abstract:
This article evaluates the robustness of rankings obtained from composite
indices that combine information from two or more components via a
weighted sum. It examines the empirical prevalence of robust comparisons
using the method proposed by Foster et al. (2010). Indices examined are
the Human Development Index (HDI), the Index of Economic Freedom (IEF),
and the Environmental Performance Index (EPI). Key theoretical results
demonstrate links between the prevalence of robust comparisons, Kendall's
tau rank correlation coefficient, and statistical association across
components. Implications for redundancy among index components are also
examined.
Journal: Econometric Reviews
Pages: 35-56
Issue: 1
Volume: 32
Year: 2013
Month: 1
X-DOI: 10.1080/07474938.2012.690647
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690647
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:1:p:35-56
Template-Type: ReDIF-Article 1.0
Author-Name: Christopher J. Bennett
Author-X-Name-First: Christopher J.
Author-X-Name-Last: Bennett
Author-Name: Shabana Mitra
Author-X-Name-First: Shabana
Author-X-Name-Last: Mitra
Title: Multidimensional Poverty: Measurement, Estimation, and Inference
Abstract:
Multidimensional poverty measures give rise to a host of statistical
hypotheses that are of interest to applied economists and policy-makers
alike. In the specific context of the generalized Alkire--Foster (Alkire
and Foster, 2008) class of measures, we show that many of these hypotheses
can be treated in a unified manner and also tested simultaneously using a
minimum p-value approach. When applied to study the
relative state of poverty among Hindus and Muslims in India, these tests
reveal novel insights into the plight of the poor which are not otherwise
captured by traditional univariate approaches.
Journal: Econometric Reviews
Pages: 57-83
Issue: 1
Volume: 32
Year: 2013
Month: 1
X-DOI: 10.1080/07474938.2012.690331
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690331
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:1:p:57-83
Template-Type: ReDIF-Article 1.0
Author-Name: Russell Davidson
Author-X-Name-First: Russell
Author-X-Name-Last: Davidson
Author-Name: Jean-Yves Duclos
Author-X-Name-First: Jean-Yves
Author-X-Name-Last: Duclos
Title: Testing for Restricted Stochastic Dominance
Abstract:
Asymptotic and bootstrap tests are studied for testing whether there is a
relation of stochastic dominance between two distributions. These tests
have a null hypothesis of nondominance, with the advantage that, if this
null is rejected, then all that is left is dominance. This also leads us
to define and focus on restricted stochastic dominance,
the only empirically useful form of dominance relation that we can seek to
infer in many settings. One testing procedure that we consider is based on
an empirical likelihood ratio. The computations necessary for obtaining a
test statistic also provide estimates of the distributions under study
that satisfy the null hypothesis, on the frontier between dominance and
nondominance. These estimates can be used to perform dominance tests that
can turn out to provide much improved reliability of inference compared
with the asymptotic tests so far proposed in the literature.
Journal: Econometric Reviews
Pages: 84-125
Issue: 1
Volume: 32
Year: 2013
Month: 1
X-DOI: 10.1080/07474938.2012.690332
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690332
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:1:p:84-125
Template-Type: ReDIF-Article 1.0
Author-Name: Gaston Yalonetzky
Author-X-Name-First: Gaston
Author-X-Name-Last: Yalonetzky
Title: Stochastic Dominance with Ordinal Variables: Conditions and a Test
Abstract:
A re-emerging literature on robustness in multidimensional welfare and
poverty comparisons has revived interest in multidimensional stochastic
dominance. Considering the widespread use of ordinal variables in
wellbeing measurement, and particularly in composite indices, I derive
multivariate stochastic dominance conditions for ordinal variables. These
are the analogues of the conditions for continuous variables (e.g., Bawa,
1975, and Atkinson and Bourguignon, 1982). The article also derives
mixed-order-of-dominance conditions for any type of variable. Then I
propose an extension of Anderson's nonparametric test in order to test
these conditions for ordinal variables. In addition, I propose the use of
vectors and matrices of positions in order to handle
multivariate, multinomial distributions. An empirical application to
multidimensional wellbeing in Peru illustrates these tests.
Journal: Econometric Reviews
Pages: 126-163
Issue: 1
Volume: 32
Year: 2013
Month: 1
X-DOI: 10.1080/07474938.2012.690653
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690653
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:1:p:126-163
Template-Type: ReDIF-Article 1.0
Author-Name: Gordon Anderson
Author-X-Name-First: Gordon
Author-X-Name-Last: Anderson
Author-Name: Kinda Hachem
Author-X-Name-First: Kinda
Author-X-Name-Last: Hachem
Title: Institutions and Economic Outcomes: A Dominance-Based Analysis
Abstract:
An important issue in both welfare and development economics is the
interaction between institutions and economic outcomes. While welfarists
are typically concerned with how these variables contribute to overall
wellbeing, empirical assessments of their joint contribution are limited.
Development economists, on the other hand, have focused extensively on
whether institutions cause or are caused by growth, yet the relevant
literature is still rife with debate. In this article, we use a notion of
distributional dominance to tackle both the measurement of multivariate
welfare and the evaluation of inter-temporal dependence without hindrance
from the mix of discrete (political) and continuous (economic) variables
in our data set. On the causality front, our results support the view that
institutions promote growth more than growth promotes institutions. On the
welfare front, we find that economic growth had a positive impact from
1960 to 2000 but declines in institutional quality over the earlier part
of this period were sufficient to produce a decline in overall wellbeing
until the mid-1970s. Subsequent improvements in institutions then reversed
the trend and, ultimately, wellbeing in 2000 was higher than that in 1960.
Journal: Econometric Reviews
Pages: 164-182
Issue: 1
Volume: 32
Year: 2013
Month: 1
X-DOI: 10.1080/07474938.2012.690330
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690330
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:1:p:164-182
Template-Type: ReDIF-Article 1.0
Author-Name: Christoph Hanck
Author-X-Name-First: Christoph
Author-X-Name-Last: Hanck
Title: An Intersection Test for Panel Unit Roots
Abstract:
This article proposes a new panel unit root test based on Simes’
(1986) classical intersection test. The test is robust to general patterns
of cross-sectional dependence and yet is straightforward to implement,
only requiring p-values of time series unit root tests of
the series in the panel, and no resampling. Monte Carlo experiments show
good size and power properties relative to existing panel unit root tests.
Unlike previously suggested tests, the new test makes it possible to
identify the units in the panel for which the alternative of stationarity can be said
to hold. We provide an empirical application to real exchange rate data.
Journal: Econometric Reviews
Pages: 183-203
Issue: 2
Volume: 32
Year: 2013
Month: 2
X-DOI: 10.1080/07474938.2011.608058
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608058
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:2:p:183-203
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: Iliyan Georgiev
Author-X-Name-First: Iliyan
Author-X-Name-Last: Georgiev
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Title: Wild Bootstrap of the Sample Mean in the Infinite Variance Case
Abstract:
It is well known that the standard independent, identically distributed
(iid) bootstrap of the mean is inconsistent in a location model with
infinite variance (α-stable) innovations. This occurs because the
bootstrap distribution of a normalised sum of infinite variance random
variables tends to a random distribution. Consistent bootstrap algorithms
based on subsampling methods have been proposed but have the drawback that
they deliver much wider confidence sets than those generated by the iid
bootstrap owing to the fact that they eliminate the dependence of the
bootstrap distribution on the sample extremes. In this paper we propose
sufficient conditions that allow a simple modification of the bootstrap,
the wild bootstrap (Wu, 1986), to be consistent (in a conditional sense) yet also reproduce
the narrower confidence sets of the iid bootstrap. Numerical results
demonstrate that our proposed bootstrap method works very well in practice,
delivering coverage rates very close to the nominal level and
significantly narrower confidence sets than other consistent methods.
Journal: Econometric Reviews
Pages: 204-219
Issue: 2
Volume: 32
Year: 2013
Month: 2
X-DOI: 10.1080/07474938.2012.690660
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690660
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:2:p:204-219
Template-Type: ReDIF-Article 1.0
Author-Name: Alan J. Rogers
Author-X-Name-First: Alan J.
Author-X-Name-Last: Rogers
Title: Concentration Ellipsoids, Their Planes of Support, and the Linear Regression Model
Abstract:
The relationship between the concentration ellipsoid of a random vector
and its planes of support is exploited to provide a geometric derivation
and interpretation of existing results for a general form of the linear
regression model. In particular, the planes of support whose points of
tangency to the ellipsoid are contained in the range (or column space) of
the design matrix are the source of all linear unbiased minimum variance
estimators. The connection between this idea and estimators based on
projections is explored, as is also its use in obtaining and interpreting
some existing relative efficiency results.
Journal: Econometric Reviews
Pages: 220-243
Issue: 2
Volume: 32
Year: 2013
Month: 2
X-DOI: 10.1080/07474938.2011.608055
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608055
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:2:p:220-243
Template-Type: ReDIF-Article 1.0
Author-Name: Sarantis Tsiaplias
Author-X-Name-First: Sarantis
Author-X-Name-Last: Tsiaplias
Author-Name: Chew Lian Chua
Author-X-Name-First: Chew Lian
Author-X-Name-Last: Chua
Title: A Multivariate GARCH Model Incorporating the Direct and Indirect Transmission of Shocks
Abstract:
Theoretical models of contagion and spillovers allow for asset-specific
shocks that can be directly transmitted from one asset to another, as well
as indirectly transmitted across uncorrelated assets through some
intermediary mechanism. Standard multivariate Generalized Autoregressive
Conditional Heteroskedasticity (GARCH) models, however, provide estimates
of volatilities and correlations based only on the direct transmission of
shocks across assets. As such, spillover effects via an intermediary asset
or market are not considered. In this article, a multivariate GARCH model
is constructed that provides estimates of volatilities and correlations
based on both directly and indirectly transmitted shocks. The model is
applied to exchange rate and equity returns data. The results suggest that
if a spillover component is observed in the data, the spillover augmented
models provide significantly different volatility estimates compared to
standard multivariate GARCH models.
Journal: Econometric Reviews
Pages: 244-271
Issue: 2
Volume: 32
Year: 2013
Month: 2
X-DOI: 10.1080/07474938.2011.608045
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608045
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:2:p:244-271
Template-Type: ReDIF-Article 1.0
Author-Name: Halbert White
Author-X-Name-First: Halbert
Author-X-Name-Last: White
Author-Name: Karim Chalak
Author-X-Name-First: Karim
Author-X-Name-Last: Chalak
Title: Identification and Identification Failure for Treatment Effects Using Structural Systems
Abstract:
We provide necessary and sufficient conditions for effect identification,
thereby characterizing the limits to identification. Our results link the
nonstructural potential outcome framework for identifying and estimating
treatment effects to structural approaches in economics. This permits
economic theory to be built into treatment effect methods. We elucidate
the sources and consequences of identification failure by examining the
biases arising when the necessary conditions fail, and we clarify the
relations between unconfoundedness, conditional exogeneity, and the
necessary and sufficient identification conditions. A new quantity, the
exogeneity score, plays a central role in this analysis, permitting an
omitted variable representation for effect biases. This analysis also
provides practical guidance for selecting covariates and insight into the
price paid for making various identifying assumptions and the benefits
gained.
Journal: Econometric Reviews
Pages: 273-317
Issue: 3
Volume: 32
Year: 2013
Month: 11
X-DOI: 10.1080/07474938.2012.690664
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690664
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:3:p:273-317
Template-Type: ReDIF-Article 1.0
Author-Name: Alex Maynard
Author-X-Name-First: Alex
Author-X-Name-Last: Maynard
Author-Name: Aaron Smallwood
Author-X-Name-First: Aaron
Author-X-Name-Last: Smallwood
Author-Name: Mark E. Wohar
Author-X-Name-First: Mark E.
Author-X-Name-Last: Wohar
Title: Long Memory Regressors and Predictive Testing: A Two-stage Rebalancing Approach
Abstract:
Predictability tests with long memory regressors may entail both size
distortion and incompatibility between the orders of integration of the
dependent and independent variables. Addressing both problems
simultaneously, this paper proposes a two-step procedure that rebalances
the predictive regression by fractionally differencing the predictor based
on a first-stage estimation of the memory parameter. Extensive simulations
indicate that our procedure has good size, is robust to estimation error
in the first stage, and can yield improved power over cases in which an
integer order is assumed for the regressor. We also extend our approach
beyond the standard predictive regression context to cases in which the
dependent variable is also fractionally integrated, but not cointegrated
with the regressor. We use our procedure to provide a valid test of
forward rate unbiasedness that allows for a long memory forward premium.
Journal: Econometric Reviews
Pages: 318-360
Issue: 3
Volume: 32
Year: 2013
Month: 11
X-DOI: 10.1080/07474938.2012.690663
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690663
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:3:p:318-360
Template-Type: ReDIF-Article 1.0
Author-Name: Jonathan B. Hill
Author-X-Name-First: Jonathan B.
Author-X-Name-Last: Hill
Title: Consistent GMM Residuals-Based Tests of Functional Form
Abstract:
This paper presents a consistent Generalized Method of Moments (GMM)
residuals-based test of functional form for time series models. By
relating two moments we deliver a vector moment condition in which at
least one element must be nonzero if the model is misspecified. The test
will never fail to detect misspecification of any form for large samples,
and is asymptotically chi-squared under the null, allowing for fast and
simple inference. A simulation study reveals that randomly selecting the
nuisance parameter leads to more power than supremum-tests, and can obtain
empirical power nearly equivalent to the most powerful test for even
relatively small n.
Journal: Econometric Reviews
Pages: 361-383
Issue: 3
Volume: 32
Year: 2013
Month: 11
X-DOI: 10.1080/07474938.2012.690662
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690662
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:3:p:361-383
Template-Type: ReDIF-Article 1.0
Author-Name: Markus Frölich
Author-X-Name-First: Markus
Author-X-Name-Last: Frölich
Author-Name: Blaise Melly
Author-X-Name-First: Blaise
Author-X-Name-Last: Melly
Title: Identification of Treatment Effects on the Treated with One-Sided Non-Compliance
Abstract:
Traditional instrumental variable estimators do not generally estimate
effects for the treated population but for the unobserved population of
compliers. On the other hand, when there is one-sided non-compliance, they
do identify effects for the treated because the populations of treated and
compliers are identical in this case. However, this property is lost when
covariates are included in the model. In this case, we
show that the effects for the treated are still identified but require
modified estimators. We consider both average and quantile treatment
effects and allow the instrument to be discrete or continuous.
Journal: Econometric Reviews
Pages: 384-414
Issue: 3
Volume: 32
Year: 2013
Month: 11
X-DOI: 10.1080/07474938.2012.718684
File-URL: http://hdl.handle.net/10.1080/07474938.2012.718684
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:3:p:384-414
Template-Type: ReDIF-Article 1.0
Author-Name: Yong Bao
Author-X-Name-First: Yong
Author-X-Name-Last: Bao
Title: On Sample Skewness and Kurtosis
Abstract:
It is well documented in the literature that the sample skewness and
excess kurtosis can be severely biased in finite samples. In this paper,
we derive analytical results for their finite-sample biases up to the
second order. In general, the bias results depend on the cumulants (up to
the sixth order) as well as the dependency structure of the data. Using an
AR(1) process for illustration, we show that a feasible bias-correction
procedure based on our analytical results works remarkably well for
reducing the bias of the sample skewness. Bias-correction works reasonably
well also for the sample kurtosis under some moderate degree of
dependency. In terms of hypothesis testing, bias-correction offers power
improvement when testing for normality, and bias-correction under the null
provides also size improvement. However, for testing nonzero skewness
and/or excess kurtosis, there exist nonnegligible size distortions in
finite samples and bias-correction may not help.
Journal: Econometric Reviews
Pages: 415-448
Issue: 4
Volume: 32
Year: 2013
Month: 12
X-DOI: 10.1080/07474938.2012.690665
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690665
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:4:p:415-448
Template-Type: ReDIF-Article 1.0
Author-Name: C. Gourieroux
Author-X-Name-First: C.
Author-X-Name-Last: Gourieroux
Author-Name: A. Monfort
Author-X-Name-First: A.
Author-X-Name-Last: Monfort
Title: Granularity Adjustment for Efficient Portfolios
Abstract:
This article considers large portfolios of assets submitted to both
systematic and unsystematic (or idiosyncratic) risks. The idiosyncratic
risks can be fully diversified if the portfolio size is infinite, but only
partly diversified otherwise. The granularity adjustment measures the
effect of partly diversifying idiosyncratic risks. We derive the
granularity adjustments for a portfolio with naive diversification and for
the efficient mean-variance portfolio allocation. We consider in
particular the Sharpe performances, with and without short-sale
restrictions, and we highlight the effect of concentration risk.
Journal: Econometric Reviews
Pages: 449-468
Issue: 4
Volume: 32
Year: 2013
Month: 12
X-DOI: 10.1080/07474938.2012.690667
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690667
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:4:p:449-468
Template-Type: ReDIF-Article 1.0
Author-Name: Sainan Jin
Author-X-Name-First: Sainan
Author-X-Name-Last: Jin
Author-Name: Liangjun Su
Author-X-Name-First: Liangjun
Author-X-Name-Last: Su
Title: A Nonparametric Poolability Test for Panel Data Models with Cross Section Dependence
Abstract:
In this article we propose a nonparametric test for poolability in large
dimensional semiparametric panel data models with cross-section dependence
based on the sieve estimation technique. To construct the test statistic,
we only need to estimate the model under the alternative. We establish the
asymptotic normal distributions of our test statistic under the null
hypothesis of poolability and a sequence of local alternatives, and prove
the consistency of our test. We also suggest a bootstrap method as an
alternative way to obtain the critical values. A small set of Monte Carlo
simulations indicates that the test performs reasonably well in finite samples.
Journal: Econometric Reviews
Pages: 469-512
Issue: 4
Volume: 32
Year: 2013
Month: 12
X-DOI: 10.1080/07474938.2012.690669
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690669
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:4:p:469-512
Template-Type: ReDIF-Article 1.0
Author-Name: Brennan S. Thompson
Author-X-Name-First: Brennan S.
Author-X-Name-Last: Thompson
Title: Empirical Likelihood-Based Inference for Poverty Measures with Relative Poverty Lines
Abstract:
In this article, we propose an empirical likelihood-based method of
inference for decomposable poverty measures utilizing poverty lines which
are some fraction of the median of the underlying income distribution.
Specifically, we focus on making poverty comparisons between two subgroups
of the population which share the same poverty line. Our proposed method
is assessed using a Monte Carlo simulation and is applied to some Canadian
household income data.
Journal: Econometric Reviews
Pages: 513-523
Issue: 4
Volume: 32
Year: 2013
Month: 12
X-DOI: 10.1080/07474938.2012.690671
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690671
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:4:p:513-523
Template-Type: ReDIF-Article 1.0
Author-Name: M. Ryan Haley
Author-X-Name-First: M. Ryan
Author-X-Name-Last: Haley
Author-Name: M. Kevin McGee
Author-X-Name-First: M. Kevin
Author-X-Name-Last: McGee
Author-Name: Todd B. Walker
Author-X-Name-First: Todd B.
Author-X-Name-Last: Walker
Title: Disparity, Shortfall, and Twice-Endogenous HARA Utility
Abstract:
We derive a mapping between the shortfall-minimizing portfolio selection
based on higher-order entropy measures and expected utility theory. We
show that the family of HARA utility functions has a minimum-divergence,
shortfall-based representation. This facilitates an interpretation in
which the risk aversion parameters and the type of risk aversion arise
endogenously. We provide a numerical example illustrating this
interpretation.
Journal: Econometric Reviews
Pages: 524-541
Issue: 4
Volume: 32
Year: 2013
Month: 12
X-DOI: 10.1080/07474938.2012.690672
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690672
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:4:p:524-541
Template-Type: ReDIF-Article 1.0
Author-Name: Badi H. Baltagi
Author-X-Name-First: Badi H.
Author-X-Name-Last: Baltagi
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: An Overview of Dependence in Cross-Section, Time-Series, and Panel Data
Journal: Econometric Reviews
Pages: 543-546
Issue: 5-6
Volume: 32
Year: 2013
Month: 8
X-DOI: 10.1080/07474938.2012.740957
File-URL: http://hdl.handle.net/10.1080/07474938.2012.740957
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:5-6:p:543-546
Template-Type: ReDIF-Article 1.0
Author-Name: Joakim Westerlund
Author-X-Name-First: Joakim
Author-X-Name-Last: Westerlund
Author-Name: Jörg Breitung
Author-X-Name-First: Jörg
Author-X-Name-Last: Breitung
Title: Lessons from a Decade of IPS and LLC
Abstract:
This paper points to some of the facts that have emerged from
20 years of research into the analysis of unit roots in panel data, an
area that has been heavily influenced by two studies, IPS (Im, Pesaran,
and Shin, 2003) and LLC (Levin et al., 2002). Some of these facts are
known, others are not. But they all have in common that, if ignored, the
effects can be very serious. This is demonstrated using both simulations
and theoretical arguments.
Journal: Econometric Reviews
Pages: 547-591
Issue: 5-6
Volume: 32
Year: 2013
Month: 8
X-DOI: 10.1080/07474938.2013.741023
File-URL: http://hdl.handle.net/10.1080/07474938.2013.741023
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:5-6:p:547-591
Template-Type: ReDIF-Article 1.0
Author-Name: Alexander Chudik
Author-X-Name-First: Alexander
Author-X-Name-Last: Chudik
Author-Name: M. Hashem Pesaran
Author-X-Name-First: M. Hashem
Author-X-Name-Last: Pesaran
Title: Econometric Analysis of High Dimensional VARs Featuring a Dominant Unit
Abstract:
This paper extends the analysis of infinite dimensional
vector autoregressive (IVAR) models proposed in Chudik and Pesaran (2011)
to the case where one of the variables or the cross-section units in the
IVAR model is dominant or pervasive. It is an important extension from
empirical as well as theoretical perspectives. In the theory of networks a
dominant unit is the centre node of a star network and arises as an
efficient outcome of a distance-based utility model. Empirically, the
extension poses a number of technical challenges that go well beyond the
analysis of IVAR models provided in Chudik and Pesaran. This is because
the dominant unit influences the rest of the variables in the IVAR model
both directly and indirectly, and its effects do not vanish as the
dimension of the model (N) tends to infinity. The dominant unit acts as a
dynamic factor in the regressions of the non-dominant units and yields an
infinite order distributed lag relationship between the two types of
units. Despite this it is shown that the effects of the dominant unit as
well as those of the neighborhood units can be consistently estimated by
running augmented least squares regressions that include distributed lag
functions of the dominant unit and its neighbors (if any). The asymptotic
distribution of the estimators is derived and their small sample
properties investigated by means of Monte Carlo experiments.
Journal: Econometric Reviews
Pages: 592-649
Issue: 5-6
Volume: 32
Year: 2013
Month: 8
X-DOI: 10.1080/07474938.2012.740374
File-URL: http://hdl.handle.net/10.1080/07474938.2012.740374
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:5-6:p:592-649
Template-Type: ReDIF-Article 1.0
Author-Name: Badi H. Baltagi
Author-X-Name-First: Badi H.
Author-X-Name-Last: Baltagi
Author-Name: Peter Egger
Author-X-Name-First: Peter
Author-X-Name-Last: Egger
Author-Name: Michael Pfaffermayr
Author-X-Name-First: Michael
Author-X-Name-Last: Pfaffermayr
Title: A Generalized Spatial Panel Data Model with Random Effects
Abstract:
This paper proposes a generalized panel data model with
random effects and first-order spatially autocorrelated residuals that
encompasses two previously suggested specifications. The first one is
described in Anselin's (1988) book and the second one by Kapoor et al.
(2007). Our encompassing specification allows us to test for these models
as restricted specifications. In particular, we derive three Lagrange
multiplier (LM) and likelihood ratio (LR) tests that restrict our
generalized model to obtain (i) the Anselin model, (ii) the Kapoor,
Kelejian, and Prucha model, and (iii) the simple random effects model that
ignores the spatial correlation in the residuals. For two of these three
tests, we obtain closed form solutions and we derive their large sample
distributions. Our Monte Carlo results show that the suggested tests are
powerful in testing for these restricted specifications even in small and
medium sized samples.
Journal: Econometric Reviews
Pages: 650-685
Issue: 5-6
Volume: 32
Year: 2013
Month: 8
X-DOI: 10.1080/07474938.2012.742342
File-URL: http://hdl.handle.net/10.1080/07474938.2012.742342
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:5-6:p:650-685
Template-Type: ReDIF-Article 1.0
Author-Name: David M. Drukker
Author-X-Name-First: David M.
Author-X-Name-Last: Drukker
Author-Name: Peter Egger
Author-X-Name-First: Peter
Author-X-Name-Last: Egger
Author-Name: Ingmar R. Prucha
Author-X-Name-First: Ingmar R.
Author-X-Name-Last: Prucha
Title: On Two-Step Estimation of a Spatial Autoregressive Model with Autoregressive Disturbances and Endogenous Regressors
Abstract:
In this paper, we consider a spatial-autoregressive model
with autoregressive disturbances, where we allow for endogenous regressors
in addition to a spatial lag of the dependent variable. We suggest a
two-step generalized method of moments (GMM) and instrumental variable
(IV) estimation approach extending earlier work by, e.g., Kelejian and
Prucha (1998, 1999). In contrast to those papers, we not only prove
consistency for our GMM estimator for the spatial-autoregressive parameter
in the disturbance process, but we also derive the joint limiting
distribution for our GMM estimator and the IV estimator for the regression
parameters. Thus the theory allows for a joint test of zero spatial
interactions in the dependent variable, the exogenous variables and the
disturbances. The paper also provides a Monte Carlo study to illustrate
the performance of the estimator in small samples.
Journal: Econometric Reviews
Pages: 686-733
Issue: 5-6
Volume: 32
Year: 2013
Month: 8
X-DOI: 10.1080/07474938.2013.741020
File-URL: http://hdl.handle.net/10.1080/07474938.2013.741020
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:5-6:p:686-733
Template-Type: ReDIF-Article 1.0
Author-Name: Xiaodong Liu
Author-X-Name-First: Xiaodong
Author-X-Name-Last: Liu
Author-Name: Lung-Fei Lee
Author-X-Name-First: Lung-Fei
Author-X-Name-Last: Lee
Title: Two-Stage Least Squares Estimation of Spatial Autoregressive Models with Endogenous Regressors and Many Instruments
Abstract:
This paper considers the IV estimation of spatial
autoregressive models with endogenous regressors in the presence of many
instruments. To improve asymptotic efficiency, it may be desirable to use
many valid instruments. However, finite sample properties of IV estimators
can be sensitive to the number of instruments. For a spatial model with
endogenous regressors, this paper derives the asymptotic distribution of
the two-stage least squares (2SLS) estimator when the number of
instruments grows with the sample size, and suggests a bias-correction
procedure based on the leading-order many-instrument bias. The paper also
gives the Nagar-type approximate mean square errors (MSEs) of the 2SLS
estimator and the bias-corrected 2SLS estimator, which can be minimized to
choose instruments as in Donald and Newey (2001). A limited Monte Carlo
experiment is carried out to study the finite sample performance of the
instrument selection procedure.
Journal: Econometric Reviews
Pages: 734-753
Issue: 5-6
Volume: 32
Year: 2013
Month: 8
X-DOI: 10.1080/07474938.2013.741018
File-URL: http://hdl.handle.net/10.1080/07474938.2013.741018
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:5-6:p:734-753
Template-Type: ReDIF-Article 1.0
Author-Name: Xiao Huang
Author-X-Name-First: Xiao
Author-X-Name-Last: Huang
Title: Nonparametric Estimation in Large Panels with Cross-Sectional Dependence
Abstract:
In this paper we consider nonparametric estimation in panel
data under cross-sectional dependence. Both the number of cross-sectional
units (N) and the time dimension of the panel (T) are assumed to be large,
and the cross-sectional dependence has a multifactor structure. Local
linear regression is used to filter the unobserved cross-sectional factors
and to estimate the nonparametric conditional mean. A Monte Carlo
simulation study shows that the proposed estimator yields good finite
sample properties.
Journal: Econometric Reviews
Pages: 754-777
Issue: 5-6
Volume: 32
Year: 2013
Month: 8
X-DOI: 10.1080/07474938.2013.740998
File-URL: http://hdl.handle.net/10.1080/07474938.2013.740998
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:5-6:p:754-777
Template-Type: ReDIF-Article 1.0
Author-Name: Jennie Bai
Author-X-Name-First: Jennie
Author-X-Name-Last: Bai
Author-Name: Eric Ghysels
Author-X-Name-First: Eric
Author-X-Name-Last: Ghysels
Author-Name: Jonathan H. Wright
Author-X-Name-First: Jonathan H.
Author-X-Name-Last: Wright
Title: State Space Models and MIDAS Regressions
Abstract:
We examine the relationship between Mi(xed) Da(ta) S(ampling)
(MIDAS) regressions and the Kalman filter when forecasting with mixed
frequency data. In general, state space models involve a system of
equations, whereas MIDAS regressions involve a single equation. As a
consequence, MIDAS regressions might be less efficient, but could also be
less prone to parameter estimation error and/or specification errors. We
examine how MIDAS regressions and Kalman filters match up under ideal
circumstances, that is, in population, and in cases where all the
stochastic processes—low and high frequency—are correctly
specified. We characterize cases where the MIDAS regression exactly
replicates the steady state Kalman filter weights. We compare MIDAS and
Kalman filter forecasts in population where the state space model is
misspecified. We also compare MIDAS and Kalman filter forecasts in small
samples. The paper concludes with an empirical application. Overall we
find that the MIDAS and Kalman filter methods give similar forecasts. In
most cases, the Kalman filter is a bit more accurate, but it is also
computationally much more demanding.
Journal: Econometric Reviews
Pages: 779-813
Issue: 7
Volume: 32
Year: 2013
Month: 10
X-DOI: 10.1080/07474938.2012.690675
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690675
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:7:p:779-813
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Author-Name: Carsten Trenkler
Author-X-Name-First: Carsten
Author-X-Name-Last: Trenkler
Title: Bootstrap Cointegration Rank Testing: The Role of Deterministic Variables and Initial Values in the Bootstrap Recursion
Abstract:
In this paper we investigate the role of deterministic
components and initial values in bootstrap likelihood ratio type tests of
cointegration rank. A number of bootstrap procedures have been proposed in
the recent literature some of which include estimated deterministic
components and nonzero initial values in the bootstrap recursion while
others do the opposite. To date, however, there has not been a study into
the relative performance of these two alternative approaches. In this
paper we fill this gap in the literature and consider the impact of these
choices on both ordinary least squares (OLS) and generalized least squares
(GLS) detrended tests, in the case of the latter proposing a new bootstrap
algorithm as part of our analysis. Overall, for OLS detrended tests our
findings suggest that it is preferable to take the computationally simpler
approach of not including estimated deterministic components in the
bootstrap recursion and setting the initial values of the bootstrap
recursion to zero. For GLS detrended tests, we find that the approach of
Trenkler (2009), who includes a restricted estimate of the deterministic
component in the bootstrap recursion, can improve finite sample behavior
further.
Journal: Econometric Reviews
Pages: 814-847
Issue: 7
Volume: 32
Year: 2013
Month: 10
X-DOI: 10.1080/07474938.2012.690677
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690677
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:7:p:814-847
Template-Type: ReDIF-Article 1.0
Author-Name: Estela Bee Dagum
Author-X-Name-First: Estela Bee
Author-X-Name-Last: Dagum
Author-Name: Silvia Bianconcini
Author-X-Name-First: Silvia
Author-X-Name-Last: Bianconcini
Title: A Unified View of Nonparametric Trend-Cycle Predictors Via Reproducing Kernel Hilbert Spaces
Abstract:
We provide a common approach for studying several
nonparametric estimators used for smoothing functional time series data.
Linear filters based on different building assumptions are transformed
into kernel functions via reproducing kernel Hilbert spaces. For each
estimator, we identify a density function or second order kernel, from
which a hierarchy of higher order estimators is derived. These are shown
to give excellent representations for the currently applied symmetric
filters. In particular, we derive equivalent kernels of smoothing splines
in Sobolev and polynomial spaces. The asymmetric weights are obtained by
adapting the kernel functions to the length of the various filters, and a
theoretical and empirical comparison is made with the classical estimators
used in real time analysis. The former are shown to be superior in terms
of signal passing, noise suppression and speed of convergence to the
symmetric filter.
Journal: Econometric Reviews
Pages: 848-867
Issue: 7
Volume: 32
Year: 2013
Month: 10
X-DOI: 10.1080/07474938.2012.690674
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690674
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:7:p:848-867
Template-Type: ReDIF-Article 1.0
Author-Name: Stephan Smeekes
Author-X-Name-First: Stephan
Author-X-Name-Last: Smeekes
Title: Detrending Bootstrap Unit Root Tests
Abstract:
The role of detrending in bootstrap unit root tests is
investigated. When bootstrapping, detrending must not only be done for the
construction of the test statistic, but also in the first step of the
bootstrap algorithm. It is argued that the two issues should be treated
separately. Asymptotic validity of sieve bootstrap augmented
Dickey--Fuller (ADF) unit root tests is shown for test statistics based on
full sample and recursive ordinary least squares (OLS) and generalized
least squares (GLS) detrending. It is also shown that the detrending
method in the first step of the bootstrap may differ from the one used in
the construction of the test statistic. A simulation study is conducted to
analyze the effects of detrending on finite sample performance of the
bootstrap test. It is found that full sample OLS detrending should be
preferred based on power in the first step of the bootstrap algorithm, and
that the decision about the detrending method used to obtain the test
statistic should be based on the power properties of the corresponding
asymptotic tests.
Journal: Econometric Reviews
Pages: 869-891
Issue: 8
Volume: 32
Year: 2013
Month: 11
X-DOI: 10.1080/07474938.2012.690693
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690693
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:8:p:869-891
Template-Type: ReDIF-Article 1.0
Author-Name: Mohitosh Kejriwal
Author-X-Name-First: Mohitosh
Author-X-Name-Last: Kejriwal
Author-Name: Claude Lopez
Author-X-Name-First: Claude
Author-X-Name-Last: Lopez
Title: Unit Roots, Level Shifts, and Trend Breaks in Per Capita Output: A Robust Evaluation
Abstract:
Determining whether per capita output can be characterized by
a stochastic trend is complicated by the fact that infrequent breaks in
trend can bias standard unit root tests towards nonrejection of the unit
root hypothesis. The bulk of the existing literature has focused on the
application of unit root tests allowing for structural breaks in the trend
function under the trend stationary alternative but not under the unit
root null. These tests, however, provide little information regarding the
existence and number of trend breaks. Moreover, these tests suffer from
serious power and size distortions due to the asymmetric treatment of
breaks under the null and alternative hypotheses. This article estimates
the number of breaks in trend employing procedures that are robust to the
unit root/stationarity properties of the data. Our analysis of the per
capita gross domestic product (GDP) for Organization for Economic
Cooperation and Development (OECD) countries thereby permits a robust
classification of countries according to the “growth shift,”
“level shift,” and “linear trend” hypotheses.
In contrast to the extant literature, unit root tests conditional on the
presence or absence of breaks do not provide evidence against the unit
root hypothesis.
Journal: Econometric Reviews
Pages: 892-927
Issue: 8
Volume: 32
Year: 2013
Month: 11
X-DOI: 10.1080/07474938.2012.690689
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690689
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:8:p:892-927
Template-Type: ReDIF-Article 1.0
Author-Name: Jia Chen
Author-X-Name-First: Jia
Author-X-Name-Last: Chen
Author-Name: Jiti Gao
Author-X-Name-First: Jiti
Author-X-Name-Last: Gao
Author-Name: Degui Li
Author-X-Name-First: Degui
Author-X-Name-Last: Li
Title: Estimation in Single-Index Panel Data Models with Heterogeneous Link Functions
Abstract:
In this article, we study semiparametric estimation for a
single-index panel data model where the nonlinear link function varies
among the individuals. We propose using the refined minimum average
variance estimation method to estimate the parameter in the single-index.
As the cross-section dimension N and the time series
dimension T tend to infinity simultaneously, we establish
asymptotic distributions for the proposed estimator. In addition, we
provide a real-data example to illustrate the finite sample behavior of
the proposed estimation method.
Journal: Econometric Reviews
Pages: 928-955
Issue: 8
Volume: 32
Year: 2013
Month: 11
X-DOI: 10.1080/07474938.2012.690687
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690687
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:32:y:2013:i:8:p:928-955
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar (Essie) Maasoumi
Author-X-Name-First: Esfandiar (Essie)
Author-X-Name-Last: Maasoumi
Author-Name: Ehsan S. Soofi
Author-X-Name-First: Ehsan S.
Author-X-Name-Last: Soofi
Title: Arnold Zellner: Scientist, Leader, Mentor, and Friend
Journal: Econometric Reviews
Pages: 1-2
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.806840
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806840
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:1-2
Template-Type: ReDIF-Article 1.0
Author-Name: Arnold Zellner
Author-X-Name-First: Arnold
Author-X-Name-Last: Zellner
Author-Name: Tomohiro Ando
Author-X-Name-First: Tomohiro
Author-X-Name-Last: Ando
Author-Name: Nalan Baştürk
Author-X-Name-First: Nalan
Author-X-Name-Last: Baştürk
Author-Name: Lennart Hoogerheide
Author-X-Name-First: Lennart
Author-X-Name-Last: Hoogerheide
Author-Name: Herman K. van Dijk
Author-X-Name-First: Herman K.
Author-X-Name-Last: van Dijk
Title: Bayesian Analysis of Instrumental Variable Models: Acceptance-Rejection within Direct Monte Carlo
Abstract:
We discuss Bayesian inferential procedures within the family
of instrumental variables regression models and focus on two issues:
existence conditions for posterior moments of the parameters of interest
under a flat prior and the potential of Direct Monte Carlo (DMC)
approaches for efficient evaluation of such possibly highly non-elliptical
posteriors. We show that, for the general case of m endogenous variables
under a flat prior, posterior moments of order r exist for the
coefficients reflecting the endogenous regressors' effect on the dependent
variable, if the number of instruments is greater than m + r,
even though there is an issue of local non-identification
that causes non-elliptical shapes of the posterior. This stresses the need
for efficient Monte Carlo integration methods. We introduce an extension
of DMC that incorporates an acceptance-rejection sampling step within DMC.
This Acceptance-Rejection within Direct Monte Carlo (ARDMC) method has the
attractive property that the generated random drawings are independent,
which greatly helps the fast convergence of simulation results, and which
facilitates the evaluation of the numerical accuracy. The speed of ARDMC
can be easily further improved by making use of parallelized computation
using multiple core machines or computer clusters. We note that ARDMC is
an analogue to the well-known "Metropolis-Hastings within Gibbs" sampling
in the sense that one 'more difficult' step is used within an 'easier'
simulation method. We compare the ARDMC approach with the Gibbs sampler
using simulated data and two empirical data sets, involving the settler
mortality instrument of Acemoglu et al. (2001) and father's education's
instrument used by Hoogerheide et al. (2012a). Even without making use of
parallelized computation, an efficiency gain is observed both under strong
and weak instruments, where the gain can be enormous in the latter case.
Journal: Econometric Reviews
Pages: 3-35
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807094
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807094
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:3-35
Template-Type: ReDIF-Article 1.0
Author-Name: James J. Heckman
Author-X-Name-First: James J.
Author-X-Name-Last: Heckman
Author-Name: Hedibert F. Lopes
Author-X-Name-First: Hedibert F.
Author-X-Name-Last: Lopes
Author-Name: Rémi Piatek
Author-X-Name-First: Rémi
Author-X-Name-Last: Piatek
Title: Treatment Effects: A Bayesian Perspective
Abstract:
This paper contributes to the emerging Bayesian literature on
treatment effects. It derives treatment parameters in the framework of a
potential outcomes model with a treatment choice equation, where the
correlation between the unobservable components of the model is driven by
a low-dimensional vector of latent factors. The analyst is assumed to have
access to a set of measurements generated by the latent factors. This
approach has attractive features from both theoretical and practical
points of view. Not only does it address the fundamental identification
problem arising from the inability to observe the same person in both the
treated and untreated states, but it also turns out to be straightforward
to implement. Formulae are provided to compute mean treatment effects as
well as their distributional versions. A Monte Carlo simulation study is
carried out to illustrate how the methodology can easily be applied.
Journal: Econometric Reviews
Pages: 36-67
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807103
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807103
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:36-67
Template-Type: ReDIF-Article 1.0
Author-Name: Tomohiro Ando
Author-X-Name-First: Tomohiro
Author-X-Name-Last: Ando
Author-Name: Ruey S. Tsay
Author-X-Name-First: Ruey S.
Author-X-Name-Last: Tsay
Title: A Predictive Approach for Selection of Diffusion Index Models
Abstract:
In this article, we propose a predictive mean squared error
criterion for selecting diffusion index models, which are useful in
forecasting when many predictors are available. A special feature of the
proposed criterion is that it takes into account the uncertainty in
estimated common factors. The new criterion is based on estimating the
predictive mean squared error in forecasting with correction for
asymptotic bias. The resulting estimate of the bias-corrected forecast
error is shown to be consistent. The proposed criterion is a natural
extension of the traditional Akaike information criterion (AIC), but it
does not require the distributional assumptions for the likelihood.
Results of real data analysis and Monte Carlo simulations demonstrate that
the proposed criterion works well in comparison with the commonly used AIC
and Bayesian information criteria.
Journal: Econometric Reviews
Pages: 68-99
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807105
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807105
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:68-99
Template-Type: ReDIF-Article 1.0
Author-Name: Hedibert F. Lopes
Author-X-Name-First: Hedibert F.
Author-X-Name-Last: Lopes
Author-Name: Nicholas G. Polson
Author-X-Name-First: Nicholas G.
Author-X-Name-Last: Polson
Title: Bayesian Instrumental Variables: Priors and Likelihoods
Abstract:
Instrumental variable (IV) regression provides a number of
statistical challenges due to the shape of the likelihood. We review the
main Bayesian literature on instrumental variables and highlight these
pathologies. We discuss Jeffreys priors, the connection to the
errors-in-the-variables problems and more general error distributions. We
propose, as an alternative to the inverted Wishart prior, a new
Cholesky-based prior for the covariance matrix of the errors in IV
regressions. We argue that this prior is more flexible and more robust
than the inverted Wishart prior since it is not based on only one tightness
parameter and therefore can be more informative about certain components
of the covariance matrix and less informative about others. We show how
prior-posterior inference can be formulated in a Gibbs sampler and compare
its performance in the weak instruments case for synthetic as well as two
illustrations based on well-known real data.
Journal: Econometric Reviews
Pages: 100-121
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807146
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807146
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:100-121
Template-Type: ReDIF-Article 1.0
Author-Name: Alex Lenkoski
Author-X-Name-First: Alex
Author-X-Name-Last: Lenkoski
Author-Name: Theo S. Eicher
Author-X-Name-First: Theo S.
Author-X-Name-Last: Eicher
Author-Name: Adrian E. Raftery
Author-X-Name-First: Adrian E.
Author-X-Name-Last: Raftery
Title: Two-Stage Bayesian Model Averaging in Endogenous Variable Models
Abstract:
Economic modeling in the presence of endogeneity is subject
to model uncertainty at both the instrument and covariate level. We
propose a Two-Stage Bayesian Model Averaging (2SBMA) methodology that
extends the Two-Stage Least Squares (2SLS) estimator. By constructing a
Two-Stage Unit Information Prior in the endogenous variable model, we are
able to efficiently combine established methods for addressing model
uncertainty in regression models with the classic technique of 2SLS. To
assess the validity of instruments in the 2SBMA context, we develop
Bayesian tests of the identification restriction that are based on model
averaged posterior predictive p-values. A simulation study showed that
2SBMA has the ability to recover structure in both the instrument and
covariate set, and substantially improves the sharpness of resulting
coefficient estimates in comparison to 2SLS using the full specification
in an automatic fashion. Due to the increased parsimony of the 2SBMA
estimate, the Bayesian Sargan test had a power of 50% in detecting a
violation of the exogeneity assumption, while the method based on 2SLS
using the full specification had negligible power. We apply our approach
to the problem of development accounting, and find support not only for
institutions, but also for geography and integration as development
determinants, once both model uncertainty and endogeneity have been
jointly addressed.
Journal: Econometric Reviews
Pages: 122-151
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807150
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807150
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:122-151
Template-Type: ReDIF-Article 1.0
Author-Name: Siddhartha Chib
Author-X-Name-First: Siddhartha
Author-X-Name-Last: Chib
Author-Name: Srikanth Ramamurthy
Author-X-Name-First: Srikanth
Author-X-Name-Last: Ramamurthy
Title: DSGE Models with Student-t Errors
Abstract:
This paper deals with Dynamic Stochastic General Equilibrium
(DSGE) models under a multivariate student-t distribution
for the structural shocks. Based on the solution algorithm of Klein (2000)
and the gamma-normal representation of the
t-distribution, the TaRB-MH algorithm of Chib and
Ramamurthy (2010) is used to estimate the model. A technique for
estimating the marginal likelihood of the DSGE student-t
model is also provided. The methodologies are illustrated first with
simulated data and then with the DSGE model of Ireland (2004) where the
results support the t-error model in relation to the
Gaussian model.
Journal: Econometric Reviews
Pages: 152-171
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807152
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807152
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:152-171
Template-Type: ReDIF-Article 1.0
Author-Name: Brendan Kline
Author-X-Name-First: Brendan
Author-X-Name-Last: Kline
Author-Name: Justin L. Tobias
Author-X-Name-First: Justin L.
Author-X-Name-Last: Tobias
Title: Explaining Trends in Body Mass Index Using Demographic Counterfactuals
Abstract:
The United States is experiencing a major public health
problem relating to increasing levels of excess body fat. This paper is
about the relationship in the United States between trends in the
distribution of body mass index (BMI), including trends in overweight and
obesity, and demographic change. We provide estimates of the
counterfactual distribution of BMI that would have been observed in
2003--2008 had demographics remained fixed at 1980 values, roughly the
beginning of the period of increasing overweight and obesity. We find that
changes in demographics are partly responsible for the changes in the
population distribution of BMI and are capable of explaining about 8.6% of
the increase in the combined rate of overweight and obesity among women
and about 7.2% of the increase among men. We also use demographic
projections to predict a BMI distribution and corresponding rates of
overweight and obesity for 2050.
Journal: Econometric Reviews
Pages: 172-196
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807155
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807155
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:172-196
Template-Type: ReDIF-Article 1.0
Author-Name: James Berger
Author-X-Name-First: James
Author-X-Name-Last: Berger
Author-Name: M. J. Bayarri
Author-X-Name-First: M. J.
Author-X-Name-Last: Bayarri
Author-Name: L. R. Pericchi
Author-X-Name-First: L. R.
Author-X-Name-Last: Pericchi
Title: The Effective Sample Size
Abstract:
Model selection procedures often depend explicitly on the
sample size n of the experiment. One example is the Bayesian information
criterion (BIC) criterion and another is the use of Zellner--Siow priors
in Bayesian model selection. Sample size is well-defined if one has i.i.d.
real observations, but is not well-defined for vector observations or in
non-i.i.d. settings; extensions of criteria such as BIC to such settings
thus require a definition of effective sample size that applies also in
such cases. A definition of effective sample size that applies to fairly
general linear models is proposed and illustrated in a variety of
situations. The definition is also used to propose a suitable 'scale' for
default proper priors for Bayesian model selection.
Journal: Econometric Reviews
Pages: 197-217
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807157
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807157
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:197-217
Template-Type: ReDIF-Article 1.0
Author-Name: Xiao-Li Meng
Author-X-Name-First: Xiao-Li
Author-X-Name-Last: Meng
Author-Name: Xianchao Xie
Author-X-Name-First: Xianchao
Author-X-Name-Last: Xie
Title: I Got More Data, My Model is More Refined, but My Estimator is Getting Worse! Am I Just Dumb?
Abstract:
Possibly, but more likely you are merely a victim of
conventional wisdom. More data or better models by no means guarantee
better estimators (e.g., with a smaller mean squared error), when you are
not following probabilistically principled methods such as MLE (for large
samples) or Bayesian approaches. Estimating equations are particularly
vulnerable in this regard, almost a necessary price for their robustness.
These points will be demonstrated via common tasks of estimating
regression parameters and correlations, under simple models such as
bivariate normal and ARCH(1). Some general strategies for detecting and
avoiding such pitfalls are suggested, including checking for
self-efficiency (Meng, 1994; Statistical Science) and
adopting a guiding working model. Using the example of
estimating the autocorrelation ρ under a stationary AR(1) model, we
also demonstrate the interaction between model assumptions and observation
structures in seeking additional information, as the sampling interval
s increases. Furthermore, for a given sample size, the
optimal s for minimizing the asymptotic variance of the estimator of
ρ is s = 1 if and only if ρ^2 ≤ 1/3; beyond that region the optimal
s increases at the rate of log^{-1}(ρ^{-2}) as ρ approaches a unit
root, as does the gain in efficiency relative to using s = 1. A
practical implication
of this result is that the so-called "non-informative" Jeffreys prior can
be far from non-informative even for stationary time series models,
because here it converges rapidly to a point mass at a unit root as
s increases. Our overall emphasis is that intuition and
conventional wisdom need to be examined via critical thinking and
theoretical verification before they can be trusted fully.
Journal: Econometric Reviews
Pages: 218-250
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.808567
File-URL: http://hdl.handle.net/10.1080/07474938.2013.808567
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:218-250
Template-Type: ReDIF-Article 1.0
Author-Name: Edward I. George
Author-X-Name-First: Edward I.
Author-X-Name-Last: George
Author-Name: Yuzo Maruyama
Author-X-Name-First: Yuzo
Author-X-Name-Last: Maruyama
Title: Posterior Odds with a Generalized Hyper-g-Prior
Abstract:
Averaged orthogonal rotations of Zellner's g-prior yield
general, interpretable, closed form Bayes factors for the normal linear
model variable selection problem. Coupled with a model space prior that
balances the weight between the identifiable and the unidentifiable
models, limiting forms for the posterior odds ratios are seen to yield new
expressions for high dimensional model choice.
Journal: Econometric Reviews
Pages: 251-269
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807181
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807181
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:251-269
Template-Type: ReDIF-Article 1.0
Author-Name: John Geweke
Author-X-Name-First: John
Author-X-Name-Last: Geweke
Author-Name: Gianni Amisano
Author-X-Name-First: Gianni
Author-X-Name-Last: Amisano
Title: Analysis of Variance for Bayesian Inference
Abstract:
This paper develops a multiway analysis of variance for
non-Gaussian multivariate distributions and provides a practical
simulation algorithm to estimate the corresponding components of variance.
It specifically addresses variance in Bayesian predictive distributions,
showing that it may be decomposed into the sum of extrinsic variance,
arising from posterior uncertainty about parameters, and intrinsic
variance, which would exist even if parameters were known. Depending on
the application at hand, further decomposition of extrinsic or intrinsic
variance (or both) may be useful. The paper shows how to produce
simulation-consistent estimates of all of these components, and the method
demands little additional effort or computing time beyond that already
invested in the posterior simulator. It illustrates the methods using a
dynamic stochastic general equilibrium model of the US economy, both
before and during the global financial crisis.
Journal: Econometric Reviews
Pages: 270-288
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807182
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807182
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:270-288
Template-Type: ReDIF-Article 1.0
Author-Name: Fabrizio Ruggeri
Author-X-Name-First: Fabrizio
Author-X-Name-Last: Ruggeri
Title: On Some Optimal Bayesian Nonparametric Rules for Estimating Distribution Functions
Abstract:
In this paper, we present a novel approach to estimating
distribution functions, which combines ideas from Bayesian nonparametric
inference, decision theory and robustness. Given a sample from a Dirichlet
process on the space (𝒳, A), with parameter η
in a class of measures, the sampling distribution function is estimated
according to some optimality criteria (mainly minimax and regret), when a
quadratic loss function is assumed. Estimates are then compared in two
examples: one with simulated data and one with gas escapes data in a city
network.
Journal: Econometric Reviews
Pages: 289-304
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807183
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807183
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:289-304
Template-Type: ReDIF-Article 1.0
Author-Name: Minxian Yang
Author-X-Name-First: Minxian
Author-X-Name-Last: Yang
Title: Normality of Posterior Distribution Under Misspecification and Nonsmoothness, and Bayes Factor for Davies' Problem
Abstract:
We examine the large sample properties of Bayes procedures in
a general framework, where data may be dependent and models may be
misspecified and nonsmooth. The posterior distribution of parameters is
shown to be asymptotically normal, centered at the quasi maximum
likelihood estimator, under mild conditions. In this framework, the Bayes
factor for the test problem of Davies (1977, 1987), where a parameter is
unidentified under the null hypothesis, is analyzed. The probability that
the Bayes factor leads to a correct conclusion about the hypotheses in
Davies’ problem is shown to approach one.
Journal: Econometric Reviews
Pages: 305-336
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807185
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807185
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:305-336
Template-Type: ReDIF-Article 1.0
Author-Name: Bertrand Clarke
Author-X-Name-First: Bertrand
Author-X-Name-Last: Clarke
Author-Name: Jennifer Clarke
Author-X-Name-First: Jennifer
Author-X-Name-Last: Clarke
Author-Name: Chi Wai Yu
Author-X-Name-First: Chi Wai
Author-X-Name-Last: Yu
Title: Statistical Problem Classes and Their Links to Information Theory
Abstract:
We begin by recalling the tripartite division of statistical
problems into three classes, M-closed, M-complete, and M-open and then
reviewing the key ideas of introductory Shannon theory. Focusing on the
related but distinct goals of model selection and prediction, we argue
that different techniques for these two goals are appropriate for the
three different problem classes. For M-closed problems we give relative
entropy justification that the Bayes information criterion (BIC) is
appropriate for model selection and that the Bayes model average is
information optimal for prediction. For M-complete problems, we discuss
the principle of maximum entropy and a way to use the rate distortion
function to bypass the inaccessibility of the true distribution. For
prediction in the M-complete class, there is little work done on
information based model averaging so we discuss the Akaike information
criterion (AIC) and its properties and variants. For the
M-open class, we argue that essentially only predictive criteria are
suitable. Thus, as an analog to model selection, we present the key ideas
of prediction along a string under a codelength criterion and propose a
general form of this criterion. Since little work appears to have been
done on information methods for general prediction in the M-open class of
problems, we mention the field of information theoretic learning in
certain general function spaces.
Journal: Econometric Reviews
Pages: 337-371
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807190
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807190
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:337-371
Template-Type: ReDIF-Article 1.0
Author-Name: Joseph B. Kadane
Author-X-Name-First: Joseph B.
Author-X-Name-Last: Kadane
Author-Name: Jiashun Jin
Author-X-Name-First: Jiashun
Author-X-Name-Last: Jin
Title: Uniform Distributions on the Integers: A Connection to the Bernoulli Random Walk
Abstract:
Associate to each subset of the integers its almost sure
limiting relative frequency under the Bernoulli random walk, if it has
one. The resulting probability space is purely finitely additive, and
uniform in the sense of residue classes and shift-invariance. However, it
is not uniform in the sense of limiting relative frequency.
Journal: Econometric Reviews
Pages: 372-378
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807193
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807193
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:372-378
Template-Type: ReDIF-Article 1.0
Author-Name: Nozer D. Singpurwalla
Author-X-Name-First: Nozer D.
Author-X-Name-Last: Singpurwalla
Title: Adaptive Percolation Using Subjective Likelihoods
Abstract:
A phenomenon that I call "adaptive percolation" commonly
arises in biology, business, economics, defense, finance, manufacturing,
and the social sciences. Here one wishes to select a handful of entities
from a large pool of entities via a process of screening through a
hierarchy of sieves. The process is not unlike the percolation of a liquid
through a porous medium. The probability model developed here is based on
a nested and adaptive Bayesian approach that results in the product of
beta-binomial distributions with common parameters. The common parameters
happen to be the observed data. I call this the percolated beta-binomial
distribution. The model turns out to be a slight generalization of the
probabilistic model used in percolation theory. The generalization is a
consequence of using a subjectively specified likelihood function to
construct a probability model. The notion of using likelihoods for
constructing probability models is not a part of the conventional toolkit
of applied probabilists. To the best of my knowledge, a use of the product
of beta-binomial distributions as a probability model for Bernoulli trials
appears to be new. The development of the material of this article is
illustrated via data from the 2009 astronaut selection program, which
motivated this work.
Journal: Econometric Reviews
Pages: 379-394
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807195
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807195
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:379-394
Template-Type: ReDIF-Article 1.0
Author-Name: Nader Ebrahimi
Author-X-Name-First: Nader
Author-X-Name-Last: Ebrahimi
Author-Name: Nima Y. Jalali
Author-X-Name-First: Nima Y.
Author-X-Name-Last: Jalali
Author-Name: Ehsan S. Soofi
Author-X-Name-First: Ehsan S.
Author-X-Name-Last: Soofi
Author-Name: Refik Soyer
Author-X-Name-First: Refik
Author-X-Name-Last: Soyer
Title: Importance of Components for a System
Abstract:
Which component is most important for a system's survival? We
answer this question by ranking the information relationship between a
system and its components. The mutual information (M) measures dependence
between the operational states of the system and a component for a mission
time as well as between their life lengths. This measure ranks each
component in terms of its expected utility for predicting the system's
survival. We explore some relationships between the ordering of importance
of components by M and by Zellner's Maximal Data Information Prior (MDIP)
criterion. For many systems the bivariate distribution of the component
and system lifetimes does not have a density with respect to the
two-dimensional Lebesgue measure. For these systems, M is
not defined, so we use a modification of a mutual information index to
cover such situations. Our results for ordering dependence are general in
terms of binary structures, sum of random variables, and order statistics.
Journal: Econometric Reviews
Pages: 395-420
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807652
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807652
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:395-420
Template-Type: ReDIF-Article 1.0
Author-Name: Peter Rossi
Author-X-Name-First: Peter
Author-X-Name-Last: Rossi
Title: All Roads Lead to Arnold
Journal: Econometric Reviews
Pages: 421-423
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807654
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807654
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:421-423
Template-Type: ReDIF-Article 1.0
Author-Name: Ehsan S. Soofi
Author-X-Name-First: Ehsan S.
Author-X-Name-Last: Soofi
Title: Memorial Statements by Anderson, Judge, Press, Aigner, Allenby, and Palm
Abstract:
This collection presents memorial statements by Theodore W.
Anderson, George G. Judge, S. James Press, Dennis J. Aigner, Greg M.
Allenby, and Franz C. Palm.
Journal: Econometric Reviews
Pages: 424-427
Issue: 1-4
Volume: 33
Year: 2014
Month: 6
X-DOI: 10.1080/07474938.2013.807657
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807657
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:1-4:p:424-427
Template-Type: ReDIF-Article 1.0
Author-Name: Karim M. Abadir
Author-X-Name-First: Karim M.
Author-X-Name-Last: Abadir
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: Overview
Journal: Econometric Reviews
Pages: 429-430
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.824766
File-URL: http://hdl.handle.net/10.1080/07474938.2013.824766
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:429-430
Template-Type: ReDIF-Article 1.0
Author-Name: Chris D. Orme
Author-X-Name-First: Chris D.
Author-X-Name-Last: Orme
Author-Name: Takashi Yamagata
Author-X-Name-First: Takashi
Author-X-Name-Last: Yamagata
Title: A Heteroskedasticity-Robust F-Test Statistic for Individual Effects
Abstract:
We derive the asymptotic distribution of the standard F-test
statistic for fixed effects, in static linear panel data models, under
both non-normality and heteroskedasticity of the error terms, when the
cross-section dimension is large but the time series dimension is fixed.
It is shown that a simple linear transformation of the F-test statistic
yields asymptotically valid inferences and under local fixed (or
correlated) individual effects, this heteroskedasticity-robust F-test
enjoys higher asymptotic power than a suitably robustified Random Effects
test. Wild bootstrap versions of these tests are considered which, in a
Monte Carlo study, provide more reliable inference in finite samples.
Journal: Econometric Reviews
Pages: 431-471
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.824792
File-URL: http://hdl.handle.net/10.1080/07474938.2013.824792
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:431-471
Template-Type: ReDIF-Article 1.0
Author-Name: Jerry Hausman
Author-X-Name-First: Jerry
Author-X-Name-Last: Hausman
Author-Name: Tiemen Woutersen
Author-X-Name-First: Tiemen
Author-X-Name-Last: Woutersen
Title: Estimating the Derivative Function and Counterfactuals in Duration Models with Heterogeneity
Abstract:
This paper presents a new estimator for counterfactuals in
duration models. The counterfactual in a duration model is the length of
the spell in case the regressor would have been different. We introduce
the structural duration function, which gives these counterfactuals. The
advantage of focusing on counterfactuals is that one does not need to
identify the mixed proportional hazard model. In particular, we present
examples in which the mixed proportional hazard model is unidentified or
has a singular information matrix but our estimator for counterfactuals
still converges at rate N -super-1/2,
where N is the number of observations. We apply the
structural duration function to simulate important policy effects,
including a change in welfare benefits.
Journal: Econometric Reviews
Pages: 472-496
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825120
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825120
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:472-496
Template-Type: ReDIF-Article 1.0
Author-Name: Christine Amsler
Author-X-Name-First: Christine
Author-X-Name-Last: Amsler
Author-Name: Artem Prokhorov
Author-X-Name-First: Artem
Author-X-Name-Last: Prokhorov
Author-Name: Peter Schmidt
Author-X-Name-First: Peter
Author-X-Name-Last: Schmidt
Title: Using Copulas to Model Time Dependence in Stochastic Frontier Models
Abstract:
We consider stochastic frontier models in a panel data
setting where there is dependence over time. Current methods of modeling
time dependence in this setting are either unduly restrictive or
computationally infeasible. Some impose restrictive assumptions on the
nature of dependence such as the "scaling" property. Others involve
T-dimensional integration, where T is
the number of cross-sections, which may be large. Moreover, no known
multivariate distribution has the property of having commonly used,
convenient marginals such as normal/half-normal. We show how to use
copulas to resolve these issues. The range of dependence we allow for is
unrestricted and the computational task involved is easy compared to the
alternatives. Also, the resulting estimators are more efficient than those
that assume independence over time. We propose two alternative
specifications. One applies a copula function to the distribution of the
composed error term. This permits the use of maximum likelihood estimation
(MLE) and the generalized method of moments (GMM). The other applies a copula to
the distribution of the one-sided error term. This allows for a simulated
MLE and improved estimation of inefficiencies. An application demonstrates
the usefulness of our approach.
Journal: Econometric Reviews
Pages: 497-522
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825126
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825126
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:497-522
Template-Type: ReDIF-Article 1.0
Author-Name: Oliver Linton
Author-X-Name-First: Oliver
Author-X-Name-Last: Linton
Author-Name: Pedro Gozalo
Author-X-Name-First: Pedro
Author-X-Name-Last: Gozalo
Title: Testing Conditional Independence Restrictions
Abstract:
We propose a nonparametric test of the hypothesis of
conditional independence between variables of interest based on a
generalization of the empirical distribution function. This hypothesis is
of interest both for model specification purposes, parametric and
semiparametric, and for nonmodel-based testing of economic hypotheses. We
allow for both discrete variables and estimated parameters. The asymptotic
null distribution of the test statistic is a functional of a Gaussian
process. A bootstrap procedure is proposed for calculating the critical
values. Our test has power against alternatives at distance
n -super-−1/2 from the null; this
result holding independently of dimension. Monte Carlo simulations provide
evidence on size and power.
Journal: Econometric Reviews
Pages: 523-552
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825135
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825135
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:523-552
Template-Type: ReDIF-Article 1.0
Author-Name: Jennifer L. Castle
Author-X-Name-First: Jennifer L.
Author-X-Name-Last: Castle
Author-Name: Jurgen A. Doornik
Author-X-Name-First: Jurgen A.
Author-X-Name-Last: Doornik
Author-Name: David F. Hendry
Author-X-Name-First: David F.
Author-X-Name-Last: Hendry
Author-Name: Ragnar Nymoen
Author-X-Name-First: Ragnar
Author-X-Name-Last: Nymoen
Title: Misspecification Testing: Non-Invariance of Expectations Models of Inflation
Abstract:
Many economic models (such as the new-Keynesian Phillips
curve, NKPC) include expected future values, often estimated after
replacing the expected value by the actual future outcome, using
Instrumental Variables (IV) or Generalized Method of Moments (GMM).
Although crises, breaks, and regime shifts are relatively common, the
underlying theory does not allow for their occurrence. We show the
consequences for such models of breaks in data processes, and propose an
impulse-indicator saturation test of such specifications, applied to USA
and Euro-area NKPCs.
Journal: Econometric Reviews
Pages: 553-574
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825137
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825137
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:553-574
Template-Type: ReDIF-Article 1.0
Author-Name: Sainan Jin
Author-X-Name-First: Sainan
Author-X-Name-Last: Jin
Author-Name: Liangjun Su
Author-X-Name-First: Liangjun
Author-X-Name-Last: Su
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Title: Robustify Financial Time Series Forecasting with Bagging
Abstract:
In this paper we propose a revised version of bootstrap
aggregating (bagging) as a forecast
combination method for out-of-sample forecasts in time series models.
The revised version explicitly takes into account the dependence in time
series data and can be used to justify the validity of bagging in the
reduction of mean squared forecast error when compared with the unbagged
forecasts. Monte Carlo simulations show that the new method works quite
well and outperforms the traditional one-step-ahead linear forecast as
well as the nonparametric forecast in general, especially when the
in-sample estimation period is small. We also find that the bagging
forecasts based on misspecified linear models may work as effectively as
those based on nonparametric models, suggesting the robustification
property of the bagging method in terms of out-of-sample forecasts. We then
reexamine forecasting powers of predictive variables suggested in the
literature to forecast the excess returns or equity premium. We find that,
consistent with Goyal and Welch (2008), the historical average excess
stock return forecasts may beat other predictor variables in the
literature when we apply the traditional one-step linear forecast and the
nonparametric forecasting methods. However, when using the bagging method
or its revised version, which help to improve the mean squared forecast
error for "unstable" predictors, the predictive variables have a better
forecasting power than the historical mean.
Journal: Econometric Reviews
Pages: 575-605
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825142
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825142
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:575-605
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: Anders Rahbek
Author-X-Name-First: Anders
Author-X-Name-Last: Rahbek
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M.
Author-X-Name-Last: Robert Taylor
Title: Bootstrap Determination of the Co-Integration Rank in Heteroskedastic VAR Models
Abstract:
In a recent paper Cavaliere et al. (2012) develop bootstrap
implementations of the (pseudo-) likelihood ratio (PLR) co-integration
rank test and associated sequential rank determination procedure of
Johansen (1996). The bootstrap samples are constructed using the
restricted parameter estimates of the underlying vector autoregressive
(VAR) model which obtain under the reduced rank null hypothesis. They
propose methods based on an independent and identically distributed
(i.i.d.) bootstrap resampling scheme and establish the validity of their
proposed bootstrap procedures in the context of a co-integrated VAR model
with i.i.d. innovations. In this paper we investigate the properties of
their bootstrap procedures, together with analogous procedures based on a
wild bootstrap resampling scheme, when time-varying behavior is present in
either the conditional or unconditional variance of the innovations. We
show that the bootstrap PLR tests are asymptotically correctly sized and,
moreover, that the probability that the associated bootstrap sequential
procedures select a rank smaller than the true rank converges to zero.
This result is shown to hold for both the i.i.d. and wild bootstrap
variants under conditional heteroskedasticity but only for the latter
under unconditional heteroskedasticity. Monte Carlo evidence is reported
which suggests that the bootstrap approach of Cavaliere et al. (2012)
significantly improves upon the finite sample performance of corresponding
procedures based on either the asymptotic PLR test or an alternative
bootstrap method (where the short run dynamics in the VAR model are
estimated unrestrictedly) for a variety of conditionally and
unconditionally heteroskedastic innovation processes.
Journal: Econometric Reviews
Pages: 606-650
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825175
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825175
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:606-650
Template-Type: ReDIF-Article 1.0
Author-Name: Russell Davidson
Author-X-Name-First: Russell
Author-X-Name-Last: Davidson
Author-Name: James G. MacKinnon
Author-X-Name-First: James G.
Author-X-Name-Last: MacKinnon
Title: Bootstrap Confidence Sets with Weak Instruments
Abstract:
We study several methods of constructing confidence sets for
the coefficient of the single right-hand-side endogenous variable in a
linear equation with weak instruments. Two of these are based on
conditional likelihood ratio (CLR) tests, and the others are based on
inverting t statistics or the bootstrap
P values associated with them. We propose a new method
for constructing bootstrap confidence sets based on t
statistics. In large samples, the procedures that generally work best are
CLR confidence sets using asymptotic critical values and bootstrap
confidence sets based on limited-information maximum likelihood (LIML)
estimates.
Journal: Econometric Reviews
Pages: 651-675
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825177
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825177
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:651-675
Template-Type: ReDIF-Article 1.0
Author-Name: Ioannis Kasparis
Author-X-Name-First: Ioannis
Author-X-Name-Last: Kasparis
Author-Name: Peter C. B. Phillips
Author-X-Name-First: Peter C. B.
Author-X-Name-Last: Phillips
Author-Name: Tassos Magdalinos
Author-X-Name-First: Tassos
Author-X-Name-Last: Magdalinos
Title: Nonlinearity Induced Weak Instrumentation
Abstract:
In regressions involving integrable functions we examine the
limit properties of instrumental variable (IV) estimators that utilise
integrable transformations of lagged regressors as instruments. The
regressors can be either I(0) or nearly integrated
(NI) processes. We show that this kind of nonlinearity in
the regression function can significantly affect the relevance of the
instruments. In particular, such instruments become weak when the signal
of the regressor is strong, as it is in the NI case.
Instruments based on integrable functions of lagged NI
regressors display long range dependence and so remain relevant even at
long lags, continuing to contribute to variance reduction in IV
estimation. However, simulations show that ordinary least squares (OLS) is
generally superior to IV estimation in terms of mean squared error (MSE),
even in the presence of endogeneity. Estimation precision is also reduced
when the regressor is nonstationary.
Journal: Econometric Reviews
Pages: 676-712
Issue: 5-6
Volume: 33
Year: 2014
Month: 8
X-DOI: 10.1080/07474938.2013.825181
File-URL: http://hdl.handle.net/10.1080/07474938.2013.825181
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:5-6:p:676-712
Template-Type: ReDIF-Article 1.0
Author-Name: Marcelo C. Medeiros
Author-X-Name-First: Marcelo C.
Author-X-Name-Last: Medeiros
Author-Name: Eduardo Mendes
Author-X-Name-First: Eduardo
Author-X-Name-Last: Mendes
Author-Name: Les Oxley
Author-X-Name-First: Les
Author-X-Name-Last: Oxley
Title: A Note on Nonlinear Cointegration, Misspecification, and Bimodality
Abstract:
We derive the asymptotic distribution of the ordinary least
squares estimator in a regression with cointegrated variables under
misspecification and/or nonlinearity in the regressors. We show that,
under some circumstances, the order of convergence of the estimator
changes and the asymptotic distribution is non-standard. The
t-statistic might also diverge. A simple case arises when
the intercept is erroneously omitted from the estimated model or in
nonlinear-in-variables models with endogenous regressors. In the latter
case, a solution is to use an instrumental variable estimator. The core
results in this paper also generalise to more complicated nonlinear models
involving integrated time series.
Journal: Econometric Reviews
Pages: 713-731
Issue: 7
Volume: 33
Year: 2014
Month: 10
X-DOI: 10.1080/07474938.2012.690676
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690676
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:7:p:713-731
Template-Type: ReDIF-Article 1.0
Author-Name: Randall A. Lewis
Author-X-Name-First: Randall A.
Author-X-Name-Last: Lewis
Author-Name: James B. McDonald
Author-X-Name-First: James B.
Author-X-Name-Last: McDonald
Title: Partially Adaptive Estimation of the Censored Regression Model
Abstract:
Data censoring causes ordinary least squares estimates of
linear models to be biased and inconsistent. Tobit, semiparametric, and
partially adaptive estimators have been considered as possible solutions.
This paper proposes several new partially adaptive estimators that cover a
wide range of distributional characteristics. A simulation study is used
to investigate the estimators' relative efficiency in these settings. The
partially adaptive censored regression estimators have little efficiency
loss for censored normal errors and may outperform Tobit and
semiparametric estimators considered for non-normal distributions. An
empirical application to out-of-pocket expenditures for a health insurance
plan supports these results.
Journal: Econometric Reviews
Pages: 732-750
Issue: 7
Volume: 33
Year: 2014
Month: 10
X-DOI: 10.1080/07474938.2012.690691
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690691
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:7:p:732-750
Template-Type: ReDIF-Article 1.0
Author-Name: Wanling Huang
Author-X-Name-First: Wanling
Author-X-Name-Last: Huang
Author-Name: Artem Prokhorov
Author-X-Name-First: Artem
Author-X-Name-Last: Prokhorov
Title: A Goodness-of-fit Test for Copulas
Abstract:
We propose a new rank-based goodness-of-fit test for copulas.
It uses the information matrix equality and so relates to the White (1982)
specification test. The test avoids parametric specification of marginal
distributions, it does not involve kernel weighting, bandwidth selection,
or any other strategic choices, it is asymptotically pivotal with a
standard distribution, and it is simple to compute compared to available
alternatives. The finite-sample size of tests of this type is known to
deviate from the nominal size based on asymptotic critical values, so
bootstrapped critical values may be a preferred alternative. A power
study shows that, in a bivariate setting, the test has reasonable
properties compared to its competitors. We conclude with an application in
which we apply the test to two stock indices.
Journal: Econometric Reviews
Pages: 751-771
Issue: 7
Volume: 33
Year: 2014
Month: 10
X-DOI: 10.1080/07474938.2012.690692
File-URL: http://hdl.handle.net/10.1080/07474938.2012.690692
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:7:p:751-771
Template-Type: ReDIF-Article 1.0
Author-Name: Clément Bosquet
Author-X-Name-First: Clément
Author-X-Name-Last: Bosquet
Author-Name: Hervé Boulhol
Author-X-Name-First: Hervé
Author-X-Name-Last: Boulhol
Title: Applying the GLM Variance Assumption to Overcome the Scale-Dependence of the Negative Binomial QGPML Estimator
Abstract:
Recently, various studies have used the Poisson
Pseudo-Maximum Likelihood (PML) estimator to estimate gravity specifications of trade
flows and non-count data models more generally. Some papers also report
results based on the Negative Binomial Quasi-Generalised Pseudo-Maximum
Likelihood (NB QGPML) estimator, which encompasses the Poisson assumption
as a special case. This note shows that the NB QGPML estimators that have
been used so far are unappealing when applied to a continuous dependent
variable whose unit of measurement is arbitrary, because estimates artificially
depend on that choice. A new NB QGPML estimator is introduced to overcome
this shortcoming.
Journal: Econometric Reviews
Pages: 772-784
Issue: 7
Volume: 33
Year: 2014
Month: 10
X-DOI: 10.1080/07474938.2013.806102
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806102
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:7:p:772-784
Template-Type: ReDIF-Article 1.0
Author-Name: Eduardo Rossi
Author-X-Name-First: Eduardo
Author-X-Name-Last: Rossi
Author-Name: Paolo Santucci de Magistris
Author-X-Name-First: Paolo
Author-X-Name-Last: Santucci de Magistris
Title: Estimation of Long Memory in Integrated Variance
Abstract:
A stylized fact is that realized variance has long memory. We
show that, when the instantaneous volatility is a long memory process of
order d, the integrated variance is characterized by the
same long-range dependence. We prove that the spectral density of realized
variance is given by the sum of the spectral density of the integrated
variance plus that of a measurement error, due to the sparse sampling and
market microstructure noise. Hence, the realized volatility has the same
degree of long memory as the integrated variance. The additional term in
the spectral density induces a finite-sample bias in the semiparametric
estimates of the long memory. A Monte Carlo simulation provides evidence
that the corrected local Whittle estimator of Hurvich et al. (2005) is
much less biased than the standard local Whittle estimator and the
empirical application shows that it is robust to the choice of the
sampling frequency used to compute the realized variance. Finally, the
empirical results suggest that the volatility series are more likely to be
generated by a nonstationary fractional process.
Journal: Econometric Reviews
Pages: 785-814
Issue: 7
Volume: 33
Year: 2014
Month: 10
X-DOI: 10.1080/07474938.2013.806131
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806131
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:7:p:785-814
Template-Type: ReDIF-Article 1.0
Author-Name: Carlos A. Flores
Author-X-Name-First: Carlos A.
Author-X-Name-Last: Flores
Author-Name: Alfonso Flores-Lagunes
Author-X-Name-First: Alfonso
Author-X-Name-Last: Flores-Lagunes
Author-Name: Dimitrios Kapetanakis
Author-X-Name-First: Dimitrios
Author-X-Name-Last: Kapetanakis
Title: Lessons From Quantile Panel Estimation of the Environmental Kuznets Curve
Abstract:
We employ quantile regression fixed effects models to
estimate the income-pollution relationship for NOx (nitrogen oxide) and
SO2 (sulfur dioxide) using U.S. data. Conditional median results suggest
that conditional mean methods provide overly optimistic estimates of
emissions reduction for NOx, while the opposite is found for SO2.
Deleting outlier states reverses the absence of a turning point for SO2
in the conditional mean model, while the conditional median model is
robust to them. We also document the relationship's sensitivity to
including additional covariates for NOx, and undertake simulations to
shed light on some estimation issues of the methods employed.
Journal: Econometric Reviews
Pages: 815-853
Issue: 8
Volume: 33
Year: 2014
Month: 11
X-DOI: 10.1080/07474938.2013.806148
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806148
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:8:p:815-853
Template-Type: ReDIF-Article 1.0
Author-Name: Andrew Hodge
Author-X-Name-First: Andrew
Author-X-Name-Last: Hodge
Author-Name: Sriram Shankar
Author-X-Name-First: Sriram
Author-X-Name-Last: Shankar
Title: Partial Effects in Ordered Response Models with Factor Variables
Abstract:
Interpretation in nonlinear regression models that include
sets of dummy variables representing categories of underlying categorical
variables is not straightforward. Partial effects giving the differences
between each category and the reference category are routinely computed in
the empirical economics literature. Yet, partial effects yielding the
differences between each category and all other categories are not
calculated, despite having great interpretative value. We derive the
correct formulae for calculating these partial effects for an ordered
probit model. The results of an application using data on subjective
well-being illustrate the usefulness of the alternative partial effects.
Journal: Econometric Reviews
Pages: 854-868
Issue: 8
Volume: 33
Year: 2014
Month: 11
X-DOI: 10.1080/07474938.2013.806157
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806157
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:8:p:854-868
Template-Type: ReDIF-Article 1.0
Author-Name: Martin Huber
Author-X-Name-First: Martin
Author-X-Name-Last: Huber
Title: Treatment Evaluation in the Presence of Sample Selection
Abstract:
Sample selection and attrition are inherent in a range of
treatment evaluation problems such as the estimation of the returns to
schooling or training. Conventional estimators tackling selection bias
typically rely on restrictive functional form assumptions that are
unlikely to hold in reality. This paper shows identification of average
and quantile treatment effects in the presence of the double selection
problem into (i) a selective subpopulation (e.g., working-selection on
unobservables) and (ii) a binary treatment (e.g., training-selection on
observables) based on weighting observations by the inverse of a nested
propensity score that characterizes either selection probability.
Weighting estimators based on parametric propensity score models are
applied to female labor market data to estimate the returns to education.
Journal: Econometric Reviews
Pages: 869-905
Issue: 8
Volume: 33
Year: 2014
Month: 11
X-DOI: 10.1080/07474938.2013.806197
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806197
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:8:p:869-905
Template-Type: ReDIF-Article 1.0
Author-Name: Ted Juhl
Author-X-Name-First: Ted
Author-X-Name-Last: Juhl
Author-Name: Oleksandr Lugovskyy
Author-X-Name-First: Oleksandr
Author-X-Name-Last: Lugovskyy
Title: A Test for Slope Heterogeneity in Fixed Effects Models
Abstract:
Typical panel data models make use of the assumption that the
regression parameters are the same for each individual cross-sectional
unit. We propose tests for slope heterogeneity in panel data models. Our
tests are based on the conditional Gaussian likelihood function in order
to avoid the incidental parameters problem induced by the inclusion of
individual fixed effects for each cross-sectional unit. We derive the
Conditional Lagrange Multiplier test that is valid in cases where
N → ∞ and T is fixed. The
test applies to both balanced and unbalanced panels. We expand the test to
account for general heteroskedasticity where each cross-sectional unit has
its own form of heteroskedasticity. The modification is possible if
T is large enough to estimate regression coefficients for
each cross-sectional unit by using the MINQUE estimator for
regression variances under heteroskedasticity. All versions of the test
have a standard Normal distribution under general assumptions on the error
distribution as N → ∞. A Monte Carlo
experiment shows that the test has very good size properties under all
specifications considered, including heteroskedastic errors. In addition,
power of our test is very good relative to existing tests, particularly
when T is not large.
Journal: Econometric Reviews
Pages: 906-935
Issue: 8
Volume: 33
Year: 2014
Month: 11
X-DOI: 10.1080/07474938.2013.806708
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806708
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:8:p:906-935
Template-Type: ReDIF-Article 1.0
Author-Name: Hiroshi Yamada
Author-X-Name-First: Hiroshi
Author-X-Name-Last: Yamada
Author-Name: Wei Yanfeng
Author-X-Name-First: Wei
Author-X-Name-Last: Yanfeng
Title: Some Theoretical and Simulation Results on the Frequency Domain Causality Test
Abstract:
Breitung and Candelon (2006) in Journal of
Econometrics proposed a simple statistical testing procedure for
the noncausality hypothesis at a given frequency. In their paper, however,
they reported some theoretical results indicating that their test
suffers from quite low power when the noncausality hypothesis is tested at
a frequency close to 0 or π. This paper examines whether or not these
results indicate their procedure is useless at such frequencies.
Journal: Econometric Reviews
Pages: 936-947
Issue: 8
Volume: 33
Year: 2014
Month: 11
X-DOI: 10.1080/07474938.2013.808488
File-URL: http://hdl.handle.net/10.1080/07474938.2013.808488
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:33:y:2014:i:8:p:936-947
Template-Type: ReDIF-Article 1.0
Author-Name: James J. Heckman
Author-X-Name-First: James J.
Author-X-Name-Last: Heckman
Author-Name: Apostolos Serletis
Author-X-Name-First: Apostolos
Author-X-Name-Last: Serletis
Title: Introduction to Econometrics with Theory: A Special Issue Honoring William A. Barnett
Abstract:
This special issue of Econometric Reviews
honors William A. Barnett's exceptional contributions in the field of
economics. It follows and complements a recent Journal of
Econometrics special issue also in honor of William A. Barnett,
Internally Consistent Modeling, Aggregation, Inference, and Policy, and is
devoted to papers with emphasis on research in the time domain both at the
individual and aggregate level.
Journal: Econometric Reviews
Pages: 1-5
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.944465
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944465
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:1-5
Template-Type: ReDIF-Article 1.0
Author-Name: James J. Heckman
Author-X-Name-First: James J.
Author-X-Name-Last: Heckman
Author-Name: Rodrigo Pinto
Author-X-Name-First: Rodrigo
Author-X-Name-Last: Pinto
Title: Econometric Mediation Analyses: Identifying the Sources of Treatment Effects from Experimentally Estimated Production Technologies with Unmeasured and Mismeasured Inputs
Abstract:
This paper presents an econometric mediation analysis. It
considers identification of production functions and the sources of output
effects (treatment effects) from experimental interventions when some
inputs are mismeasured and others are entirely omitted.
Journal: Econometric Reviews
Pages: 6-31
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.944466
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944466
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:6-31
Template-Type: ReDIF-Article 1.0
Author-Name: Charles Engel
Author-X-Name-First: Charles
Author-X-Name-Last: Engel
Author-Name: Nelson C. Mark
Author-X-Name-First: Nelson C.
Author-X-Name-Last: Mark
Author-Name: Kenneth D. West
Author-X-Name-First: Kenneth D.
Author-X-Name-Last: West
Title: Factor Model Forecasts of Exchange Rates
Abstract:
We construct factors from a cross-section of exchange rates
and use the idiosyncratic deviations from the factors to forecast. In a
stylized data generating process, we show that such forecasts can be
effective even if there is essentially no serial correlation in the
univariate exchange rate processes. We apply the technique to a panel of
bilateral U.S. dollar rates against 17 Organisation for Economic
Co-operation and Development countries. We forecast using factors, and
using factors combined with any of the fundamentals suggested by Taylor rule,
monetary and purchasing power parity models. For long horizon (8 and 12
quarter) forecasts, we tend to improve on the forecast of a "no change"
benchmark in the late (1999-2007) but not early (1987-1998) parts of our
sample.
Journal: Econometric Reviews
Pages: 32-55
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.944467
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944467
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:32-55
Template-Type: ReDIF-Article 1.0
Author-Name: Arnold Zellner
Author-X-Name-First: Arnold
Author-X-Name-Last: Zellner
Author-Name: Jacques Kibambe Ngoie
Author-X-Name-First: Jacques Kibambe
Author-X-Name-Last: Ngoie
Title: Evaluation of the Effects of Reduced Personal and Corporate Tax Rates on the Growth Rates of the U.S. Economy
Abstract:
Using several variants of a Marshallian Macroeconomic Model
(MMM), see Zellner and Israilevich (2005) and Ngoie and Zellner (2010),
this paper investigates how various tax rate reductions may help stimulate
the U.S. economy while not adversely affecting aggregate U.S. debt.
Variants of our MMM that are shown to fit past data and to perform well in
forecasting experiments are employed to evaluate the effects of
alternative tax policies. Using quarterly data, our one-sector MMM has
been able to predict the 2008 downturn and the 2009Q3 upturn of the U.S.
economy. Among other results, this study, using transfer and impulse
response functions associated with our MMM, finds that a permanent 5
percentage point cut in the personal income and corporate profits tax
rates will cause the U.S. real gross domestic product (GDP) growth rate to
rise by 3.0 percentage points with a standard error of 0.6 percentage
points. Also, while this policy change leads to positive growth of the
government sector, its share of total real GDP is slightly reduced. This
is understandable since short run effects of tax cuts include the transfer
of tax revenue from the government to the private sector. The private
sector is allowed to manage a larger portion of its revenue, while
government is forced to cut public spending on social programs that have
little growth-enhancing effect. This broadens private economic activities
overall. Further, these tax rate policy changes stimulate the growth of
the federal tax base considerably, which helps to reduce annual budget
deficits and the federal debt.
Journal: Econometric Reviews
Pages: 56-81
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.944468
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944468
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:56-81
Template-Type: ReDIF-Article 1.0
Author-Name: Yingying Dong
Author-X-Name-First: Yingying
Author-X-Name-Last: Dong
Author-Name: Arthur Lewbel
Author-X-Name-First: Arthur
Author-X-Name-Last: Lewbel
Title: A Simple Estimator for Binary Choice Models with Endogenous Regressors
Abstract:
This paper provides a few variants of a simple estimator for
binary choice models with endogenous or mismeasured regressors, or with
heteroskedastic errors, or with panel fixed effects. Unlike control
function methods, which are generally only valid when endogenous
regressors are continuous, the estimators proposed here can be used with
limited, censored, continuous, or discrete endogenous regressors, and they
allow for latent errors having heteroskedasticity of unknown form,
including random coefficients. The variants of special regressor based
estimators we provide are numerically trivial to implement. We illustrate
these methods with an empirical application estimating migration
probabilities within the US.
Journal: Econometric Reviews
Pages: 82-105
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.944470
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944470
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:82-105
Template-Type: ReDIF-Article 1.0
Author-Name: W. Erwin Diewert
Author-X-Name-First: W. Erwin
Author-X-Name-Last: Diewert
Author-Name: Jan de Haan
Author-X-Name-First: Jan de
Author-X-Name-Last: Haan
Author-Name: Rens Hendriks
Author-X-Name-First: Rens
Author-X-Name-Last: Hendriks
Title: Hedonic Regressions and the Decomposition of a House Price Index into Land and Structure Components
Abstract:
The paper uses hedonic regression techniques in order to
decompose the price of a house into land and structure components using
readily available real estate sales data for a small Dutch city. To get
sensible results, it was useful to estimate a nonlinear model on data that
cover multiple time periods. It also proved necessary to incorporate
exogenous information on the rate of growth of construction costs in the
Netherlands in order to obtain meaningful constant quality indexes for the
price of land and structures separately.
Journal: Econometric Reviews
Pages: 106-126
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.944791
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944791
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:106-126
Template-Type: ReDIF-Article 1.0
Author-Name: Yiguo Sun
Author-X-Name-First: Yiguo
Author-X-Name-Last: Sun
Author-Name: Cheng Hsiao
Author-X-Name-First: Cheng
Author-X-Name-Last: Hsiao
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Title: Volatility Spillover Effect: A Semiparametric Analysis of Non-Cointegrated Process
Abstract:
Stock market volatility is highly persistent and exhibits
large fluctuations so that it is likely to be an integrated or a near
integrated process. Stock markets' volatilities from different countries
are intercorrelated, but are generally not cointegrated as many other
(domestic) factors also affect volatility. In this paper, we use a
semiparametric varying coefficient model to examine stock market
volatility spillover effects. Using the estimation method proposed by Sun
et al. (2011), we study the U.S./U.K. and U.S./Canadian stock market
volatility spillover effects. We find strikingly similar patterns in both
the U.S./U.K. and the U.S./Canadian markets. The stock market volatility
spillover effects are strengthened when the currency markets experience
large movements, and the spillover effects are asymmetric depending on
whether a foreign currency is appreciating or depreciating.
Journal: Econometric Reviews
Pages: 127-145
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.944793
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944793
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:127-145
Template-Type: ReDIF-Article 1.0
Author-Name: Shigeru Iwata
Author-X-Name-First: Shigeru
Author-X-Name-Last: Iwata
Author-Name: Han Li
Author-X-Name-First: Han
Author-X-Name-Last: Li
Title: What are the Differences in Trend Cycle Decompositions by Beveridge and Nelson and by Unobserved Component Models?
Abstract:
When a certain procedure is applied to extract two component
processes from a single observed process, it is necessary to impose a set
of restrictions that defines the two components. One popular restriction is
the assumption that the shocks to the trend and cycle are orthogonal.
Another is the assumption that the trend is a pure random walk process.
The unobserved components (UC) model (Harvey, 1985) assumes both of the
above, whereas the BN decomposition (Beveridge and Nelson, 1981) assumes
only the latter. Quah (1992) investigates a broad class of decompositions
by making the former assumption only. This paper develops a
convenient general framework in which alternative trend-cycle
decompositions are regarded as special cases, and examines alternative
decomposition schemes from the perspective of the frequency domain. We
find that, although the conventional UC model is not necessarily a
misspecification for describing the postwar U.S. GDP, choosing a
univariate model among alternatives on purely statistical grounds is
difficult.
Journal: Econometric Reviews
Pages: 146-173
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.945335
File-URL: http://hdl.handle.net/10.1080/07474938.2014.945335
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:146-173
Template-Type: ReDIF-Article 1.0
Author-Name: Annastiina Silvennoinen
Author-X-Name-First: Annastiina
Author-X-Name-Last: Silvennoinen
Author-Name: Timo Teräsvirta
Author-X-Name-First: Timo
Author-X-Name-Last: Teräsvirta
Title: Modeling Conditional Correlations of Asset Returns: A Smooth Transition Approach
Abstract:
In this paper we propose a new multivariate GARCH model with
time-varying conditional correlation structure. The time-varying
conditional correlations change smoothly between two extreme states of
constant correlations according to a predetermined or exogenous transition
variable. An LM-test is derived to test the constancy of correlations and
LM- and Wald tests to test the hypothesis of partially constant
correlations. Analytical expressions for the test statistics and the
required derivatives are provided to make computations feasible. An
empirical example based on daily return series of five frequently traded
stocks in the S&P 500 stock index completes the paper.
Journal: Econometric Reviews
Pages: 174-197
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.945336
File-URL: http://hdl.handle.net/10.1080/07474938.2014.945336
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:174-197
Template-Type: ReDIF-Article 1.0
Author-Name: Apostolos Serletis
Author-X-Name-First: Apostolos
Author-X-Name-Last: Serletis
Author-Name: Guohua Feng
Author-X-Name-First: Guohua
Author-X-Name-Last: Feng
Title: Imposing Theoretical Regularity on Flexible Functional Forms
Abstract:
In this paper we build on work by Gallant and Golub (1984),
Diewert and Wales (1987), and Barnett (2002) and provide a comparison
among three different methods of imposing theoretical regularity on
flexible functional forms: reparameterization using Cholesky factorization,
constrained optimization, and Bayesian methodology. We apply the
methodology to a translog cost and share equation system and make a
distinction between local, regional, pointwise, and global regularity. We
find that the imposition of curvature at a single point does not always
assure regularity. We also find that the imposition of global concavity
(at all possible positive input prices), irrespective of the method used,
exaggerates the elasticity estimates and rules out the possibility of a
complementarity relationship among the inputs. Finally, we find that
constrained optimization and the Bayesian methodology with regional (over
a neighborhood of data points in the sample) or pointwise (at every data
point in the sample) concavity imposed can guarantee inference consistent
with neoclassical microeconomic theory, without compromising much of the
flexibility of the functional form.
Journal: Econometric Reviews
Pages: 198-227
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.945385
File-URL: http://hdl.handle.net/10.1080/07474938.2014.945385
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:198-227
Template-Type: ReDIF-Article 1.0
Author-Name: Richard G. Anderson
Author-X-Name-First: Richard G.
Author-X-Name-Last: Anderson
Author-Name: Marcelle Chauvet
Author-X-Name-First: Marcelle
Author-X-Name-Last: Chauvet
Author-Name: Barry Jones
Author-X-Name-First: Barry
Author-X-Name-Last: Jones
Title: Nonlinear Relationship Between Permanent and Transitory Components of Monetary Aggregates and the Economy
Abstract:
This paper uses several methods to study the
interrelationship among Divisia monetary aggregates, prices, and income,
allowing for nonstationarity, nonlinearities, asymmetries, and time-varying
relationships among the series. We propose a multivariate regime switching
unobserved components model to obtain transitory and permanent components
for each series, allowing for potential recurrent and structural changes
in their dynamics. Each component follows distinct two-state Markov
processes representing low or high phases. Since the lead-lag relationship
between the phases can vary over time, rather than pre-imposing a
structure to their linkages, the proposed flexible framework enables us to
study their specific lead-lag relationship over each one of their cycles
and over each U.S. recession in the last 40 years. The decomposition of
the series into permanent and transitory components reveals striking
results. First, we find a strong nonlinear association between the
components of money and prices: all low phases of the transitory component
of prices were preceded by tight transitory and permanent money phases. We
also find that most recessions were preceded by tight money phases (its
cyclical and permanent components) and high transitory price phases (with
the exception of the 2001 and 2009-2010 recessions). In addition, all
recessions were associated with a decrease in transitory and permanent
income.
Journal: Econometric Reviews
Pages: 228-254
Issue: 1-2
Volume: 34
Year: 2015
Month: 2
X-DOI: 10.1080/07474938.2014.945386
File-URL: http://hdl.handle.net/10.1080/07474938.2014.945386
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:1-2:p:228-254
Template-Type: ReDIF-Article 1.0
Author-Name: Amos Golan
Author-X-Name-First: Amos
Author-X-Name-Last: Golan
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: Editorial Note
Journal: Econometric Reviews
Pages: 255-255
Issue: 3
Volume: 34
Year: 2015
Month: 3
X-DOI: 10.1080/07474938.2014.944472
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944472
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:3:p:255-255
Template-Type: ReDIF-Article 1.0
Author-Name: Joshua C. C. Chan
Author-X-Name-First: Joshua C. C.
Author-X-Name-Last: Chan
Author-Name: Eric Eisenstat
Author-X-Name-First: Eric
Author-X-Name-Last: Eisenstat
Title: Marginal Likelihood Estimation with the Cross-Entropy Method
Abstract:
We consider an adaptive importance sampling approach to
estimating the marginal likelihood, a quantity that is fundamental in
Bayesian model comparison and Bayesian model averaging. This approach is
motivated by the difficulty of obtaining an accurate estimate through
existing algorithms that use Markov chain Monte Carlo (MCMC) draws, where
the draws are typically costly to obtain and highly correlated in
high-dimensional settings. In contrast, we use the cross-entropy (CE)
method, a versatile adaptive Monte Carlo algorithm originally developed
for rare-event simulation. The main advantage of the importance sampling
approach is that random samples can be obtained from some convenient
density with little additional cost. As we are generating
independent draws instead of correlated
MCMC draws, the increase in simulation effort is much smaller should one
wish to reduce the numerical standard error of the estimator. Moreover,
the importance density derived via the CE method is grounded in
information theory, and therefore, is in a well-defined sense optimal. We
demonstrate the utility of the proposed approach by two empirical
applications involving women's labor market participation and U.S.
macroeconomic time series. In both applications, the proposed CE method
compares favorably to existing estimators.
Journal: Econometric Reviews
Pages: 256-285
Issue: 3
Volume: 34
Year: 2015
Month: 3
X-DOI: 10.1080/07474938.2014.944474
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944474
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:3:p:256-285
Template-Type: ReDIF-Article 1.0
Author-Name: Alastair R. Hall
Author-X-Name-First: Alastair R.
Author-X-Name-Last: Hall
Author-Name: Yuyi Li
Author-X-Name-First: Yuyi
Author-X-Name-Last: Li
Author-Name: Chris D. Orme
Author-X-Name-First: Chris D.
Author-X-Name-Last: Orme
Author-Name: Arthur Sinko
Author-X-Name-First: Arthur
Author-X-Name-Last: Sinko
Title: Testing for Structural Instability in Moment Restriction Models: An Info-Metric Approach
Abstract:
In this paper, we develop an info-metric framework for
testing hypotheses about structural instability in nonlinear, dynamic
models estimated from the information in population moment conditions. Our
methods are designed to distinguish between three states of the world: (i)
the model is structurally stable in the sense that the population moment
condition holds at the same parameter value throughout the sample; (ii)
the model parameters change at some point in the sample but otherwise the
model is correctly specified; and (iii) the model exhibits more general
forms of instability than a single shift in the parameters. An advantage
of the info-metric approach is that the null hypotheses concerned are
formulated in terms of distances between various choices of probability
measures constrained to satisfy (i) and (ii), and the empirical measure of
the sample. Under the alternative hypotheses considered, the model is
assumed to exhibit structural instability at a single point in the sample,
referred to as the break point; our analysis allows for the break point to
be either fixed a priori or treated as occurring at some
unknown point within a certain fraction of the sample. We propose various
test statistics that can be thought of as sample analogs of the distances
described above, and derive their limiting distributions under the
appropriate null hypothesis. The limiting distributions of our statistics
are nonstandard but coincide with various distributions that arise in the
literature on structural instability testing within the Generalized Method
of Moments framework. A small simulation study illustrates the finite
sample performance of our test statistics.
Journal: Econometric Reviews
Pages: 286-327
Issue: 3
Volume: 34
Year: 2015
Month: 3
X-DOI: 10.1080/07474938.2014.944477
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944477
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:3:p:286-327
Template-Type: ReDIF-Article 1.0
Author-Name: Saraswata Chaudhuri
Author-X-Name-First: Saraswata
Author-X-Name-Last: Chaudhuri
Author-Name: Eric Renault
Author-X-Name-First: Eric
Author-X-Name-Last: Renault
Title: Shrinkage of Variance for Minimum Distance Based Tests
Abstract:
This paper promotes information theoretic inference in the
context of minimum distance estimation. Various score test statistics
differ only through the embedded estimator of the variance of estimating
functions. We resort to implied probabilities provided by the constrained
maximization of generalized entropy to get a more accurate variance
estimator under the null. We document, both by theoretical higher order
expansions and by Monte-Carlo evidence, that our improved score tests have
better finite-sample size properties. The competitiveness of our
non-simulation based method with respect to bootstrap is confirmed in the
example of inference on covariance structures previously studied by
Horowitz (1998).
Journal: Econometric Reviews
Pages: 328-351
Issue: 3
Volume: 34
Year: 2015
Month: 3
X-DOI: 10.1080/07474938.2014.944794
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944794
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:3:p:328-351
Template-Type: ReDIF-Article 1.0
Author-Name: Lorenzo Camponovo
Author-X-Name-First: Lorenzo
Author-X-Name-Last: Camponovo
Author-Name: Taisuke Otsu
Author-X-Name-First: Taisuke
Author-X-Name-Last: Otsu
Title: Robustness of Bootstrap in Instrumental Variable Regression
Abstract:
This paper studies robustness of bootstrap inference methods
for instrumental variable (IV) regression models. We consider test
statistics for parameter hypotheses based on the IV estimator and
generalized method of trimmed moments (GMTM) estimator introduced by
Čížek (2008, 2009), and compare the pairs and implied
probability bootstrap approximations for these statistics by applying the
finite sample breakdown point theory. In particular, we study limiting
behaviors of the bootstrap quantiles when the values of outliers diverge
to infinity but the sample size is held fixed. The outliers are defined as
anomalous observations that can arbitrarily change the value of the
statistic of interest. We analyze both just- and overidentified cases and
discuss implications of the breakdown point analysis to the size and power
properties of bootstrap tests. We conclude that the implied probability
bootstrap test using the statistic based on the GMTM estimator shows
desirable robustness properties. Simulation studies endorse this
conclusion. An empirical example based on Romer's (1993) study on the
effect of openness of countries to inflation rates is presented. Several
extensions including the analysis for the residual bootstrap are provided.
Journal: Econometric Reviews
Pages: 352-393
Issue: 3
Volume: 34
Year: 2015
Month: 3
X-DOI: 10.1080/07474938.2014.944803
File-URL: http://hdl.handle.net/10.1080/07474938.2014.944803
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:3:p:352-393
Template-Type: ReDIF-Article 1.0
Author-Name: Song Li
Author-X-Name-First: Song
Author-X-Name-Last: Li
Author-Name: Mervyn J. Silvapulle
Author-X-Name-First: Mervyn J.
Author-X-Name-Last: Silvapulle
Author-Name: Param Silvapulle
Author-X-Name-First: Param
Author-X-Name-Last: Silvapulle
Author-Name: Xibin Zhang
Author-X-Name-First: Xibin
Author-X-Name-Last: Zhang
Title: Bayesian Approaches to Nonparametric Estimation of Densities on the Unit Interval
Abstract:
This paper investigates nonparametric estimation of density
on [0, 1]. The kernel estimator of density on [0, 1] has been found to be
sensitive to both bandwidth and kernel. This paper proposes a unified
Bayesian framework for choosing both the bandwidth and kernel function. In
a simulation study, the Bayesian bandwidth estimator performed better than
others, and kernel estimators were sensitive to the choice of the kernel
and the shapes of the population densities on [0, 1]. The simulation and
empirical results demonstrate that the methods proposed in this paper can
improve the way the probability densities on [0, 1] are presently
estimated.
Journal: Econometric Reviews
Pages: 394-412
Issue: 3
Volume: 34
Year: 2015
Month: 3
X-DOI: 10.1080/07474938.2013.807130
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807130
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:3:p:394-412
Template-Type: ReDIF-Article 1.0
Author-Name: Carlos Martins-Filho
Author-X-Name-First: Carlos
Author-X-Name-Last: Martins-Filho
Author-Name: Feng Yao
Author-X-Name-First: Feng
Author-X-Name-Last: Yao
Title: Semiparametric Stochastic Frontier Estimation via Profile Likelihood
Abstract:
We consider the estimation of a nonparametric stochastic
frontier model with composite error density which is known up to a finite
parameter vector. Our primary interest is on the estimation of the
parameter vector, as it provides the basis for estimation of firm specific
(in)efficiency. Our frontier model is similar to that of Fan et al.
(1996), but here we extend their work in that: a) we establish the
asymptotic properties of their estimation procedure, and b) propose and
establish the asymptotic properties of an alternative estimator based on
the maximization of a conditional profile likelihood function. The
estimator proposed in Fan et al. (1996) is asymptotically normally
distributed but has bias which does not vanish as the sample size
n → ∞. In contrast, our proposed estimator is
asymptotically normally distributed and correctly centered at the true
value of the parameter vector. In addition, our estimator is shown to be
efficient in a broad class of semiparametric estimators. Our estimation
procedure provides a fast converging alternative to the recently proposed
estimator in Kumbhakar et al. (2007). A Monte Carlo study is performed to
shed light on the finite sample properties of these competing estimators.
Journal: Econometric Reviews
Pages: 413-451
Issue: 4
Volume: 34
Year: 2015
Month: 4
X-DOI: 10.1080/07474938.2013.806729
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806729
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:4:p:413-451
Template-Type: ReDIF-Article 1.0
Author-Name: Arturo Leccadito
Author-X-Name-First: Arturo
Author-X-Name-Last: Leccadito
Author-Name: Omar Rachedi
Author-X-Name-First: Omar
Author-X-Name-Last: Rachedi
Author-Name: Giovanni Urga
Author-X-Name-First: Giovanni
Author-X-Name-Last: Urga
Title: True Versus Spurious Long Memory: Some Theoretical Results and a Monte Carlo Comparison
Abstract:
A common feature of financial time series is their strong
persistence. Yet, long memory may just be the spurious effect of either
structural breaks or slow switching regimes. We explore the effects of
spurious long memory on the elasticity of the stock market price with
respect to volatility and show how cross-sectional aggregation may
generate spurious persistence in the data. We undertake an extensive Monte
Carlo study to compare the performance of five tests, constructed under
the null of true long memory versus the alternative of spurious long
memory due to level shifts or breaks.
Journal: Econometric Reviews
Pages: 452-479
Issue: 4
Volume: 34
Year: 2015
Month: 4
X-DOI: 10.1080/07474938.2013.808462
File-URL: http://hdl.handle.net/10.1080/07474938.2013.808462
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:4:p:452-479
Template-Type: ReDIF-Article 1.0
Author-Name: Dingan Feng
Author-X-Name-First: Dingan
Author-X-Name-Last: Feng
Author-Name: Peter X.-K. Song
Author-X-Name-First: Peter X.-K.
Author-X-Name-Last: Song
Author-Name: Tony S. Wirjanto
Author-X-Name-First: Tony S.
Author-X-Name-Last: Wirjanto
Title: Time-Deformation Modeling of Stock Returns Directed by Duration Processes
Abstract:
This paper proposes a new time-deformation model for stock
returns sampled in transaction time and directed by a generalized duration
process. Stochastic volatility in this model is driven by an observed
duration process and a latent autoregressive process. Parameter estimation
in the model is carried out by using a method of simulated moments (MSM)
due to its analytical tractability and numerical stability for the
proposed model. Simulations are conducted to validate the choice of
moments used in the formulation of MSM. Both simulation and empirical
results indicate that the proposed MSM works well for the model. The main
empirical findings from the analysis of IBM transaction return data
include: (i) the return distribution conditional on the duration process
is not Gaussian, even though the duration process itself can marginally
serve as a directing process; (ii) the return process is highly leveraged;
(iii) longer trade duration tends to be associated with higher return
volatility; and (iv) the proposed model is capable of reproducing a return
process whose marginal density function is close to that of the empirical
return process.
Journal: Econometric Reviews
Pages: 480-511
Issue: 4
Volume: 34
Year: 2015
Month: 4
X-DOI: 10.1080/07474938.2013.808478
File-URL: http://hdl.handle.net/10.1080/07474938.2013.808478
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:4:p:480-511
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: Peter C. B. Phillips
Author-X-Name-First: Peter C. B.
Author-X-Name-Last: Phillips
Author-Name: Stephan Smeekes
Author-X-Name-First: Stephan
Author-X-Name-Last: Smeekes
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Title: Lag Length Selection for Unit Root Tests in the Presence of Nonstationary Volatility
Abstract:
A number of recent papers have focused on the problem of
testing for a unit root in the case where the driving shocks may be
unconditionally heteroskedastic. These papers have, however, taken the lag
length in the unit root test regression to be a deterministic function of
the sample size, rather than data-determined, the latter being standard
empirical practice. We investigate the finite sample impact of
unconditional heteroskedasticity on conventional data-dependent lag
selection methods in augmented Dickey-Fuller type regressions and propose
new lag selection criteria which allow for unconditional
heteroskedasticity. Standard lag selection methods are shown to have a
tendency to over-fit the lag order under heteroskedasticity, resulting in
significant power losses in the (wild bootstrap implementation of the)
augmented Dickey-Fuller tests under the alternative. The proposed new lag
selection criteria are shown to avoid this problem yet deliver unit root
tests with almost identical finite sample properties as the corresponding
tests based on conventional lag selection when the shocks are
homoskedastic.
Journal: Econometric Reviews
Pages: 512-536
Issue: 4
Volume: 34
Year: 2015
Month: 4
X-DOI: 10.1080/07474938.2013.808065
File-URL: http://hdl.handle.net/10.1080/07474938.2013.808065
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:4:p:512-536
Template-Type: ReDIF-Article 1.0
Author-Name: Markus Jochmann
Author-X-Name-First: Markus
Author-X-Name-Last: Jochmann
Title: Modeling U.S. Inflation Dynamics: A Bayesian Nonparametric Approach
Abstract:
The properties of the inflation process and especially
possible changes in its persistence have received much attention in the
literature. However, empirical evidence is ambiguous. Some studies find
that inflation persistence varied over time, others conclude it was
constant. This article contributes further evidence to this ongoing debate
by modeling U.S. inflation dynamics using a sticky infinite hidden
Markov model (sticky IHMM). The sticky IHMM is a Bayesian
nonparametric approach to modeling structural breaks. It allows for an
unknown number of breakpoints and is a flexible and attractive alternative
to existing methods. We find that inflation persistence was highest in
1973-74 and then again around 1980. However, credible intervals for our
estimates of inflation persistence are very wide. Thus, a substantial
amount of uncertainty about this aspect of inflation dynamics remains.
Journal: Econometric Reviews
Pages: 537-558
Issue: 5
Volume: 34
Year: 2015
Month: 5
X-DOI: 10.1080/07474938.2013.806199
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806199
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:5:p:537-558
Template-Type: ReDIF-Article 1.0
Author-Name: Massimiliano Caporin
Author-X-Name-First: Massimiliano
Author-X-Name-Last: Caporin
Author-Name: Paolo Paruolo
Author-X-Name-First: Paolo
Author-X-Name-Last: Paruolo
Title: Proximity-Structured Multivariate Volatility Models
Abstract:
In many multivariate volatility models, the number of
parameters increases faster than the cross-section dimension, hence
creating a curse of dimensionality problem. This paper discusses
specification and identification of structured parameterizations based on
weight matrices induced by economic proximity. It is shown that structured
specifications can mitigate or even solve the curse of dimensionality
problem. Identification and estimation of structured specifications are
analyzed, rank and order conditions for identification are given and the
specification of weight matrices is discussed. Several structured
specifications compare well with alternatives in modelling conditional
covariances of six returns from the New York Stock Exchange.
Journal: Econometric Reviews
Pages: 559-593
Issue: 5
Volume: 34
Year: 2015
Month: 5
X-DOI: 10.1080/07474938.2013.807102
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807102
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:5:p:559-593
Template-Type: ReDIF-Article 1.0
Author-Name: Alexios Ghalanos
Author-X-Name-First: Alexios
Author-X-Name-Last: Ghalanos
Author-Name: Eduardo Rossi
Author-X-Name-First: Eduardo
Author-X-Name-Last: Rossi
Author-Name: Giovanni Urga
Author-X-Name-First: Giovanni
Author-X-Name-Last: Urga
Title: Independent Factor Autoregressive Conditional Density Model
Abstract:
In this article, we propose a novel Independent Factor
Autoregressive Conditional Density (IFACD) model able to generate
time-varying higher moments using an independent factor setup. Our
proposed framework incorporates dynamic estimation of higher comovements
and feasible portfolio representation within a non-elliptical multivariate
distribution. We report an empirical application, using returns data from
14 MSCI equity index iShares for the period 1996 to 2010, and we show that
the IFACD model provides superior VaR forecasts and portfolio allocations
with respect to the Conditionally Heteroskedastic Independent Component
Analysis of Generalized Orthogonal GARCH (CHICAGO) and DCC models.
Journal: Econometric Reviews
Pages: 594-616
Issue: 5
Volume: 34
Year: 2015
Month: 5
X-DOI: 10.1080/07474938.2013.808561
File-URL: http://hdl.handle.net/10.1080/07474938.2013.808561
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:5:p:594-616
Template-Type: ReDIF-Article 1.0
Author-Name: Yi-Ting Chen
Author-X-Name-First: Yi-Ting
Author-X-Name-Last: Chen
Author-Name: Zhongjun Qu
Author-X-Name-First: Zhongjun
Author-X-Name-Last: Qu
Title: M Tests with a New Normalization Matrix
Abstract:
This paper proposes a new family of M tests
building on the work of Kuan and Lee (2006) and Kiefer et al. (2000). The
idea is to replace the asymptotic covariance matrix in conventional
M tests with an alternative normalization matrix,
constructed using moment functions estimated from (K + 1)
recursive subsamples. The new tests are simple to implement. They
automatically account for the effect of parameter estimation and allow for
conditional heteroskedasticity and serial correlation of general forms.
They converge to central F distributions under the
fixed-K asymptotics and to chi-square distributions if
K is allowed to approach infinity. We illustrate their
applications using three simulation examples: (1) specification testing
for conditional heteroskedastic models, (2) non-nested testing with
serially correlated errors, and (3) testing for serial correlation with
unknown heteroskedasticity. The results show that the new tests exhibit
good size properties with power often comparable to the conventional
M tests while being substantially higher than that of
Kuan and Lee (2006).
Journal: Econometric Reviews
Pages: 617-652
Issue: 5
Volume: 34
Year: 2015
Month: 5
X-DOI: 10.1080/07474938.2013.833822
File-URL: http://hdl.handle.net/10.1080/07474938.2013.833822
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:5:p:617-652
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Title: The Special Issue in Honor of Aman Ullah: An Overview
Journal: Econometric Reviews
Pages: 653-658
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956028
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956028
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:653-658
Template-Type: ReDIF-Article 1.0
Author-Name: Badi H. Baltagi
Author-X-Name-First: Badi H.
Author-X-Name-Last: Baltagi
Author-Name: Ying Deng
Author-X-Name-First: Ying
Author-X-Name-Last: Deng
Title: EC3SLS Estimator for a Simultaneous System of Spatial Autoregressive Equations with Random Effects
Abstract:
This article derives a 3SLS estimator for a simultaneous
system of spatial autoregressive equations with random effects which can
therefore handle endogeneity, spatial lag dependence, heterogeneity as
well as cross equation correlation. This is done by utilizing the Kelejian
and Prucha (1998) and Lee (2003) type instruments from the cross-section
spatial autoregressive literature and marrying them to the error
components 3SLS estimator derived by Baltagi (1981) for a system of
simultaneous panel data equations. Our Monte Carlo experiments indicate
that, for the single equation spatial error components 2SLS estimators,
there is a slight gain in efficiency when Lee (2003) type rather than
Kelejian and Prucha (1998) instruments are used. However, there is not
much difference in efficiency between these instruments for spatial error
components 3SLS estimators.
Journal: Econometric Reviews
Pages: 659-694
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956030
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956030
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:659-694
Template-Type: ReDIF-Article 1.0
Author-Name: Zongwu Cai
Author-X-Name-First: Zongwu
Author-X-Name-Last: Cai
Author-Name: Linna Chen
Author-X-Name-First: Linna
Author-X-Name-Last: Chen
Author-Name: Ying Fang
Author-X-Name-First: Ying
Author-X-Name-Last: Fang
Title: Semiparametric Estimation of Partially Varying-Coefficient Dynamic Panel Data Models
Abstract:
This paper studies a new class of semiparametric dynamic
panel data models, in which some of the coefficients are allowed to depend on
other informative variables and some of the regressors can be endogenous.
To estimate both parametric and nonparametric coefficients, a three-stage
estimation method is proposed. A nonparametric generalized method of
moments (GMM) is first adopted to estimate all coefficients, and an
average method is then used to obtain the root-N consistent estimator of
the parametric coefficients. At the last stage, the estimator of the varying
coefficients is obtained from the partial residuals. The consistency and
asymptotic normality of both estimators are derived. Monte Carlo
simulations are conducted to verify the theoretical results and to
demonstrate that the proposed estimators perform well in a finite sample.
Journal: Econometric Reviews
Pages: 695-719
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956569
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956569
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:695-719
Template-Type: ReDIF-Article 1.0
Author-Name: Keisuke Hirano
Author-X-Name-First: Keisuke
Author-X-Name-Last: Hirano
Author-Name: Jack R. Porter
Author-X-Name-First: Jack R.
Author-X-Name-Last: Porter
Title: Location Properties of Point Estimators in Linear Instrumental Variables and Related Models
Abstract:
We examine statistical models, including the workhorse linear
instrumental variables model, in which the mapping from the reduced form
distribution to the structural parameters of interest is singular. The
singularity of this mapping implies certain fundamental restrictions on
the finite sample properties of point estimators: they cannot be unbiased,
quantile-unbiased, or translation equivariant. The nonexistence of
unbiased estimators does not rule out bias reduction of standard
estimators, but implies that the bias-variance tradeoff cannot be avoided
and needs to be considered carefully. The results can also be extended to
weak instrument asymptotics by using the limits of experiments framework.
Journal: Econometric Reviews
Pages: 720-733
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956573
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956573
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:720-733
Template-Type: ReDIF-Article 1.0
Author-Name: Juan Carlos Escanciano
Author-X-Name-First: Juan Carlos
Author-X-Name-Last: Escanciano
Author-Name: Lin Zhu
Author-X-Name-First: Lin
Author-X-Name-Last: Zhu
Title: A Simple Data-Driven Estimator for the Semiparametric Sample Selection Model
Abstract:
This paper proposes a simple fully data-driven version of
Powell's (2001) two-step semiparametric estimator for the sample selection
model. The main feature of the proposal is that the bandwidth used to
estimate the infinite-dimensional nuisance parameter is chosen by
minimizing the mean squared error of the fitted semiparametric model. We
formally justify data-driven inference. We introduce the concept of
asymptotic normality, uniform in the bandwidth, and show that the proposed
estimator achieves this property for a wide range of bandwidths. The
method of proof is different from that in Powell (2001) and permits
straightforward extensions to other semiparametric or even fully
nonparametric specifications of the selection equation. The results of a
small Monte Carlo suggest that our estimator has excellent finite sample
performance, comparing well with other competing estimators based on
alternative choices of smoothing parameters.
Journal: Econometric Reviews
Pages: 734-762
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956577
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956577
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:734-762
Template-Type: ReDIF-Article 1.0
Author-Name: Yan Dong
Author-X-Name-First: Yan
Author-X-Name-Last: Dong
Author-Name: Li Gan
Author-X-Name-First: Li
Author-X-Name-Last: Gan
Author-Name: Yingning Wang
Author-X-Name-First: Yingning
Author-X-Name-Last: Wang
Title: Residential Mobility, Neighborhood Effects, and Educational Attainment of Blacks and Whites
Abstract:
This paper proposes a new model to identify if and how much
the educational attainment gap between blacks and whites is due to the
difference in their neighborhoods. In this model, individuals belong to
two unobserved types: the endogenous type, which may move in response to
the neighborhood effect on their education; or the exogenous type, which
may move for reasons unrelated to education. The Heckman sample selection
model becomes a special case of the current model in which the probability
of one type of individuals is zero. Although we cannot find any
significant neighborhood effect in the usual Heckman sample selection
model, we do find heterogeneous effects in our two-type model. In
particular, there is a substantial neighborhood effect for the movers who
belong to the endogenous type. No significant effects exist for other
groups. We also find that the endogenous type has more education and moves
more often than the exogenous type. On average, we find that the
neighborhood variable, the percentage of high school graduates in the
neighborhood, accounts for about 28.96% of the education gap between
blacks and whites.
Journal: Econometric Reviews
Pages: 763-798
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956586
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956586
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:763-798
Template-Type: ReDIF-Article 1.0
Author-Name: Chunrong Ai
Author-X-Name-First: Chunrong
Author-X-Name-Last: Ai
Author-Name: Meixia Meng
Author-X-Name-First: Meixia
Author-X-Name-Last: Meng
Title: Endogeneity in Semiparametric Panel Binary Choice Model
Abstract:
In this paper, we study estimation of a semiparametric panel
binary choice model with fixed-effects and continuous endogenous
regressors. The proposed procedure combines the smoothed maximum score
approach with the control function approach and allows for a fixed-effects
nonparametric first-stage regression. Under some sufficient conditions, we
show that the proposed estimator for the finite dimensional parameter is
consistent and asymptotically normally distributed. A small scale
simulation study demonstrates that the proposed procedure has some
practical value.
Journal: Econometric Reviews
Pages: 799-827
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956589
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956589
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:799-827
Template-Type: ReDIF-Article 1.0
Author-Name: Yanqin Fan
Author-X-Name-First: Yanqin
Author-X-Name-Last: Fan
Author-Name: Ruixuan Liu
Author-X-Name-First: Ruixuan
Author-X-Name-Last: Liu
Title: Symmetrized Multivariate k-NN Estimators
Abstract:
In this article, we propose a symmetrized multivariate
k-NN estimator for the conditional mean and for the
conditional distribution function. We establish consistency and asymptotic
normality of each estimator. For the estimator of the conditional
distribution function, we also establish the weak convergence of the
conditional empirical process to a Gaussian process. Compared with the
corresponding kernel estimators, the asymptotic distributions of our
k-NN estimators do not depend on the existence of the
marginal probability density functions of the covariate vector. A small
simulation study compares the finite sample performance of our symmetrized
multivariate k-NN estimator with the Nadaraya-Watson
kernel estimator for the conditional mean.
Journal: Econometric Reviews
Pages: 828-848
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956590
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956590
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:828-848
Template-Type: ReDIF-Article 1.0
Author-Name: Patrick W. Saart
Author-X-Name-First: Patrick W.
Author-X-Name-Last: Saart
Author-Name: Jiti Gao
Author-X-Name-First: Jiti
Author-X-Name-Last: Gao
Author-Name: David E. Allen
Author-X-Name-First: David E.
Author-X-Name-Last: Allen
Title: Semiparametric Autoregressive Conditional Duration Model: Theory and Practice
Abstract:
Many existing extensions of Engle and Russell's (1998)
Autoregressive Conditional Duration (ACD) model in the literature are
aimed at providing additional flexibility either in the dynamics of the
conditional duration model or in the allowed shape of the hazard function,
i.e., its two most essential components. This article introduces an
alternative semiparametric regression approach to a
nonlinear ACD model; the use of a semiparametric functional form for the
dynamics of the duration process motivates calling the model the
Semiparametric ACD (SEMI-ACD) model. Unlike existing alternatives, the
SEMI-ACD model allows simultaneous generalizations on both of the
above-mentioned components of the ACD framework. To estimate the model, we
establish an alternative use of Bühlmann and McNeil's (2002)
iterative estimation algorithm in the semiparametric setting and provide
the mathematical proof of its statistical consistency in our context.
Furthermore, we investigate the asymptotic properties of the
semiparametric estimators employed in order to ensure the statistical
rigor of the SEMI-ACD estimation procedure. These asymptotic results are
presented in conjunction with simulated examples, which provide
empirical evidence of the SEMI-ACD model's robust finite-sample
performance. Finally, we apply the proposed model to study the price duration
process in the foreign exchange market to illustrate its usefulness in
practice.
Journal: Econometric Reviews
Pages: 849-881
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956594
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956594
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:849-881
Template-Type: ReDIF-Article 1.0
Author-Name: Zhongwen Liang
Author-X-Name-First: Zhongwen
Author-X-Name-Last: Liang
Author-Name: Zhongjian Lin
Author-X-Name-First: Zhongjian
Author-X-Name-Last: Lin
Author-Name: Cheng Hsiao
Author-X-Name-First: Cheng
Author-X-Name-Last: Hsiao
Title: Local Linear Estimation of a Nonparametric Cointegration Model
Abstract:
The nonparametric local linear method has superior properties
compared with the local constant method in the independent and weakly
dependent data settings; see, e.g., Fan and Gijbels (1996). Recently, much
attention has been drawn to nonparametric models with nonstationary
data. Wang and Phillips (2009a) studied the asymptotic properties of a local
constant estimator of a nonparametric regression model with a
nonstationary I(1) regressor. Sun and Li (2011) show a surprising result
that for a semiparametric varying coefficient model with nonstationary I(1)
regressors, the local linear estimator has a faster rate of convergence
than the local constant estimator. In this article, we study the
asymptotic behavior of the local linear estimator for the same
nonparametric regression model as considered by Wang and Phillips (2009a).
We focus on the derivation of the joint asymptotic result of both the
unknown regression function and its derivative function. We also examine
the performance of the local linear estimator with the bandwidth selected
by the data driven least squares cross validation (LS-CV) method.
Simulation results show that the local linear estimator, coupled with the
LS-CV selected bandwidth, enjoys substantial efficiency gains over the
local constant estimator.
Journal: Econometric Reviews
Pages: 882-906
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956610
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956610
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:882-906
Template-Type: ReDIF-Article 1.0
Author-Name: Carlos Martins-Filho
Author-X-Name-First: Carlos
Author-X-Name-Last: Martins-Filho
Author-Name: Feng Yao
Author-X-Name-First: Feng
Author-X-Name-Last: Yao
Author-Name: Maximo Torero
Author-X-Name-First: Maximo
Author-X-Name-Last: Torero
Title: High-Order Conditional Quantile Estimation Based on Nonparametric Models of Regression
Abstract:
We consider the estimation of a high order quantile
associated with the conditional distribution of a regressand in a
nonparametric regression model. Our estimator is inspired by Pickands
(1975) where it is shown that arbitrary distributions which lie in the
domain of attraction of an extreme value type have tails that, in the
limit, behave as generalized Pareto distributions (GPD). Smith (1987) has
studied the asymptotic properties of maximum likelihood (ML) estimators
for the parameters of the GPD in this context, but in our paper the
relevant random variables used in estimation are standardized residuals
from a first stage kernel based nonparametric estimation. We obtain
convergence in probability and distribution of the residual based ML
estimator for the parameters of the GPD as well as the asymptotic
distribution for a suitably defined quantile estimator. A Monte Carlo
study provides evidence that our estimator behaves well in finite samples
and is easily implementable. Our results have direct application in
finance, particularly in the estimation of conditional Value-at-Risk, but
other researchers in applied fields such as insurance will also find the
results useful.
Journal: Econometric Reviews
Pages: 907-958
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956612
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956612
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:907-958
Template-Type: ReDIF-Article 1.0
Author-Name: Qi Gao
Author-X-Name-First: Qi
Author-X-Name-Last: Gao
Author-Name: Long Liu
Author-X-Name-First: Long
Author-X-Name-Last: Liu
Author-Name: Jeffrey S. Racine
Author-X-Name-First: Jeffrey S.
Author-X-Name-Last: Racine
Title: A Partially Linear Kernel Estimator for Categorical Data
Abstract:
We extend Robinson's (1988) partially linear estimator to
admit the mix of datatypes typically encountered by applied researchers,
namely, categorical (nominal and ordinal) and continuous. We also relax
the independence assumption that is prevalent in this literature and allow
for β-mixing time-series data. We employ Li, Ouyang, and Racine's
(2009) categorical and continuous data kernel method, and extend this so
that a mix of continuous and/or categorical variables can appear in the
nonparametric part of a partially linear time-series model. The estimator
appearing in the linear part is shown to be
root-N-consistent, which is of course the case for
Robinson's (1988) estimator. Asymptotic normality of the nonparametric
component is also established. A modest Monte Carlo simulation
demonstrates that the proposed estimator can outperform existing
nonparametric, semiparametric, and popular parametric specifications that
appear in the literature. An application using Survey of Income and
Program Participation (SIPP) data to model a dynamic labor supply function
is undertaken that provides a robustness check and demonstrates that the
proposed method is capable of outperforming popular parametric
specifications that have been used to model this dataset.
Journal: Econometric Reviews
Pages: 959-978
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956613
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956613
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:959-978
Template-Type: ReDIF-Article 1.0
Author-Name: Jingping Gu
Author-X-Name-First: Jingping
Author-X-Name-Last: Gu
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Author-Name: Jui-Chung Yang
Author-X-Name-First: Jui-Chung
Author-X-Name-Last: Yang
Title: Multivariate Local Polynomial Kernel Estimators: Leading Bias and Asymptotic Distribution
Abstract:
Masry (1996b) provides estimation bias and variance
expressions for a general local polynomial kernel estimator in a general
multivariate regression framework. Under stronger smoothness conditions on the
unknown regression function and by including more refined approximation
terms than those in Masry (1996b), we extend that result to
obtain explicit leading bias terms for the whole vector of the local
polynomial estimator. Specifically, we derive the leading bias and leading
variance terms of the nonparametric local polynomial kernel estimator in a
general nonparametric multivariate regression model framework. The results
can be used to obtain optimal smoothing parameters in local polynomial
estimation of the unknown conditional mean function and its derivative
functions.
Journal: Econometric Reviews
Pages: 979-1010
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956615
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956615
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:979-1010
Template-Type: ReDIF-Article 1.0
Author-Name: Zaichao Du
Author-X-Name-First: Zaichao
Author-X-Name-Last: Du
Author-Name: Juan Carlos Escanciano
Author-X-Name-First: Juan Carlos
Author-X-Name-Last: Escanciano
Title: A Nonparametric Distribution-Free Test for Serial Independence of Errors
Abstract:
In this article, we propose a test for the serial
independence of unobservable errors in location-scale models. We consider
a Hoeffding-Blum-Kiefer-Rosenblatt type empirical process applied to
residuals, and show that under certain conditions it converges weakly to
the same limit as the process based on true errors. We then consider a
generalized spectral test applied to estimated residuals, and obtain a test
that is asymptotically distribution-free and powerful against any type of
pairwise dependence at all lags. Some Monte Carlo simulations validate our
theoretical findings.
Journal: Econometric Reviews
Pages: 1011-1034
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956616
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956616
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:1011-1034
Template-Type: ReDIF-Article 1.0
Author-Name: Peter C. B. Phillips
Author-X-Name-First: Peter C. B.
Author-X-Name-Last: Phillips
Author-Name: Ji Hyung Lee
Author-X-Name-First: Ji Hyung
Author-X-Name-Last: Lee
Title: Limit Theory for VARs with Mixed Roots Near Unity
Abstract:
Limit theory is developed for nonstationary vector
autoregression (VAR) with mixed roots in the vicinity of unity involving
persistent and explosive components. Statistical tests for common roots
are examined and model selection approaches for discriminating roots are
explored. The results are useful in empirical testing for multiple
manifestations of nonstationarity - in particular for distinguishing
mildly explosive roots from roots that are local to unity and for testing
commonality in persistence.
Journal: Econometric Reviews
Pages: 1035-1056
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956617
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956617
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:1035-1056
Template-Type: ReDIF-Article 1.0
Author-Name: Liangjun Su
Author-X-Name-First: Liangjun
Author-X-Name-Last: Su
Author-Name: Yundong Tu
Author-X-Name-First: Yundong
Author-X-Name-Last: Tu
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Title: Testing Additive Separability of Error Term in Nonparametric Structural Models
Abstract:
This article considers testing additive error structure in
nonparametric structural models, against the alternative hypothesis that
the random error term enters the nonparametric model nonadditively. We
propose a test statistic under a set of identification conditions
considered by Hoderlein et al. (2012), which require the existence of a
control variable such that the regressor is independent of the error term
given the control variable. The test statistic is motivated from the
observation that, under the additive error structure, the partial
derivative of the nonparametric structural function with respect to the
error term is one under identification. The asymptotic distribution of the
test is established, and a bootstrap version is proposed to enhance its
finite sample performance. Monte Carlo simulations show that the test has
proper size and reasonable power in finite samples.
Journal: Econometric Reviews
Pages: 1057-1088
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956621
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956621
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:1057-1088
Template-Type: ReDIF-Article 1.0
Author-Name: M. Hashem Pesaran
Author-X-Name-First: M. Hashem
Author-X-Name-Last: Pesaran
Title: Testing Weak Cross-Sectional Dependence in Large Panels
Abstract:
This article considers testing the hypothesis that errors in
a panel data model are weakly cross-sectionally dependent, using the
exponent of cross-sectional dependence α, introduced recently in
Bailey, Kapetanios, and Pesaran (2012). It is shown that the implicit null
of the cross-sectional dependence (CD) test depends on the relative
expansion rates of N and T. When
T = O(N
-super-ε), for some 0 &lt; ε ≤ 1, then
the implicit null of the CD test is given by
0 ≤ α &lt; (2 - ε)/4, which gives
0 ≤ α &lt; 1/4, when N and
T tend to infinity at the same rate such that
T/N → κ, with
κ being a finite positive constant. It is argued that in the case of
large N panels, the null of weak dependence is more
appropriate than the null of independence which could be quite restrictive
for large panels. Using Monte Carlo experiments, it is shown that the
CD test has the correct size for values of α in the
range [0, 1/4], for all combinations of N and
T, and irrespective of whether the panel contains lagged
values of the dependent variables, so long as there are no major
asymmetries in the error distribution.
Journal: Econometric Reviews
Pages: 1089-1117
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956623
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956623
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:1089-1117
Template-Type: ReDIF-Article 1.0
Author-Name: Dayong Zhang
Author-X-Name-First: Dayong
Author-X-Name-Last: Zhang
Author-Name: Marco R. Barassi
Author-X-Name-First: Marco R.
Author-X-Name-Last: Barassi
Author-Name: Jijun Tan
Author-X-Name-First: Jijun
Author-X-Name-Last: Tan
Title: Residual-Based Tests for Fractional Cointegration: Testing the Term Structure of Interest Rates
Abstract:
Campbell and Shiller (1987) and Hall et al. (1992) suggest
that the term spread of long-term and short-term interest rates should be
a stationary I(0) process. However, an empirically nonstationary term
spread or rejection of cointegration between long and short term interest
rates need not be considered an empirical rejection of this theoretical
relationship. It is likely that the dichotomy between I(1) and I(0), and/or
integer orders of cointegration, provides an environment that is too
restrictive to model the term structure. To overcome this problem, we propose a
residual-based approach to test for the null of no cointegration against a
fractional alternative which relies on the Exact Local Whittle Estimator
(Shimotsu and Phillips, 2005, 2006). We compare its performance to other
residual-based tests for fractional cointegration, and then we use it to
investigate the term structure in the U.K. and the U.S.
Journal: Econometric Reviews
Pages: 1118-1140
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956624
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956624
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:1118-1140
Template-Type: ReDIF-Article 1.0
Author-Name: Serena Ng
Author-X-Name-First: Serena
Author-X-Name-Last: Ng
Title: Constructing Common Factors from Continuous and Categorical Data
Abstract:
The method of principal components is widely used to estimate
common factors in large panels of continuous data. This article first
reviews alternative methods that obtain the common factors by solving a
Procrustes problem. While these matrix decomposition methods do not
specify the probabilistic structure of the data and hence do not permit
statistical evaluations of the estimates, they can be extended to analyze
categorical data. This involves the additional step of quantifying the
ordinal and nominal variables. The article then reviews and explores the
numerical properties of these methods. An interesting finding is that the
factor space can be quite precisely estimated directly from categorical
data without quantification. This may require using a larger number of
estimated factors to compensate for the information loss in categorical
variables. Separate treatment of categorical and continuous variables may
not be necessary if structural interpretation of the factors is not
required, such as in forecasting exercises.
Journal: Econometric Reviews
Pages: 1141-1171
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956625
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956625
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:1141-1171
Template-Type: ReDIF-Article 1.0
Author-Name: John W. Galbraith
Author-X-Name-First: John W.
Author-X-Name-Last: Galbraith
Author-Name: Victoria Zinde-Walsh
Author-X-Name-First: Victoria
Author-X-Name-Last: Zinde-Walsh
Author-Name: Jingmei Zhu
Author-X-Name-First: Jingmei
Author-X-Name-Last: Zhu
Title: GARCH Model Estimation Using Estimated Quadratic Variation
Abstract:
We consider estimates of the parameters of Generalized
Autoregressive Conditional Heteroskedasticity (GARCH) models obtained
using auxiliary information on latent variance which may be available from
higher-frequency data, for example from an estimate of the daily quadratic
variation such as the realized variance. We obtain consistent estimators
of the parameters of the infinite Autoregressive Conditional
Heteroskedasticity (ARCH) representation via a regression using the
estimated quadratic variation, without requiring that it be a consistent
estimate; that is, variance information containing measurement error can
be used for consistent estimation. We obtain GARCH parameters using a
minimum distance estimator based on the estimated ARCH parameters. With
Least Absolute Deviations (LAD) estimation of the truncated ARCH
approximation, we show that consistency and asymptotic normality can be
obtained using a general result on LAD estimation in truncated models of
infinite-order processes. We provide simulation evidence on small-sample
performance for varying numbers of intra-day observations.
Journal: Econometric Reviews
Pages: 1172-1192
Issue: 6-10
Volume: 34
Year: 2015
Month: 12
X-DOI: 10.1080/07474938.2014.956629
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956629
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:34:y:2015:i:6-10:p:1172-1192
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: Special Section on Meritocracy and Assessment of Scholarly Outcomes
Journal: Econometric Reviews
Pages: 1-1
Issue: 1
Volume: 35
Year: 2016
Month: 1
X-DOI: 10.1080/07474938.2015.1078626
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1078626
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:1:p:1-1
Template-Type: ReDIF-Article 1.0
Author-Name: Peter C. B. Phillips
Author-X-Name-First: Peter C. B.
Author-X-Name-Last: Phillips
Title: Meritocracy Voting: Measuring the Unmeasurable
Abstract:
Learned societies commonly carry out selection processes to
add new fellows to an existing fellowship. Criteria vary across societies
but are typically based on subjective judgments concerning the merit of
individuals who are nominated for fellowships. These subjective
assessments may be made by existing fellows as they vote in elections to
determine the new fellows or they may be decided by a selection committee
of fellows and officers of the society who determine merit after reviewing
nominations and written assessments. Human judgment inevitably plays a
central role in these determinations and, notwithstanding its limitations,
is usually regarded as being a necessary ingredient in making an overall
assessment of qualifications for fellowship. The present article suggests
a mechanism by which these merit assessments may be complemented with a
quantitative rule that incorporates both subjective and objective
elements. The goal of "measuring merit" may be elusive, but quantitative
assessment rules can help to widen the effective electorate (for instance,
by including the decisions of editors, the judgments of independent
referees, and received opinion about research) and mitigate distortions
that can arise from cluster effects, invisible college coalition voting,
and inner sanctum bias. The rule considered here is designed to assist the
selection process by explicitly taking into account subjective assessments
of individual candidates for election as well as direct quantitative
measures of quality obtained from bibliometric data. Audit methods are
suggested to mitigate possible gaming effects by electors in the peer
review process. The methodology has application to a wide arena of quality
assessment and professional ranking exercises. Some specific issues of
implementation are discussed in the context of the Econometric Society
fellowship elections.
Journal: Econometric Reviews
Pages: 2-40
Issue: 1
Volume: 35
Year: 2016
Month: 1
X-DOI: 10.1080/07474938.2014.956633
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956633
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:1:p:2-40
Template-Type: ReDIF-Article 1.0
Author-Name: Peter Schmidt
Author-X-Name-First: Peter
Author-X-Name-Last: Schmidt
Title: Meritocracy Voting: Measuring the Unmeasurable
Abstract:
This article is a set of comments on "Meritocracy Voting:
Measuring the Unmeasurable" by Peter C. B. Phillips.
Journal: Econometric Reviews
Pages: 41-43
Issue: 1
Volume: 35
Year: 2016
Month: 1
X-DOI: 10.1080/07474938.2015.1078624
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1078624
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:1:p:41-43
Template-Type: ReDIF-Article 1.0
Author-Name: Les Oxley
Author-X-Name-First: Les
Author-X-Name-Last: Oxley
Title: Elites and Secret Handshakes Versus Metrics and Rule-Based Acclamation: A Comment on "Measuring the Unmeasurable"
Abstract:
In this issue of Econometric Reviews, Peter
Phillips proposes a quantitative (objective) rule that
might be used as part of the exercise of measuring merit. In this short
comment, I will try and place the need for such a rule (lack of trust and
pluralism), potential for adoption (negotiation and power), pros
(transparency), and cons (metric failure and loss of institutional
memory), within both an historical context and a broader strategic
management literature. Ultimately, the adoption of such a rule will depend
on its ability to convince the relevant community, via negotiation, that
subjective assessments can be appropriately summarized numerically. Rather
than argue for objectivity or suggest that numbers are somehow neutral
transformations of the real world, it may be advantageous to consider some
of the lessons from the critical accounting literature that has focused on
the "socially constructed nature" of their numerical systems and
techniques.
Journal: Econometric Reviews
Pages: 44-49
Issue: 1
Volume: 35
Year: 2016
Month: 1
X-DOI: 10.1080/07474938.2014.956638
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956638
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:1:p:44-49
Template-Type: ReDIF-Article 1.0
Author-Name: Chia-Lin Chang
Author-X-Name-First: Chia-Lin
Author-X-Name-Last: Chang
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: Robust Ranking of Journal Quality: An Application to Economics
Abstract:
The article focuses on the robustness of rankings of academic
journal quality and research impact in general, and in economics, in
particular, based on the widely-used Thomson Reuters ISI Web of Science
citations database (ISI). The article analyzes 299 leading international
journals in economics using quantifiable Research Assessment Measures
(RAMs), and highlights the similarities and differences in various RAMs,
which are based on alternative transformations of citations and influence.
All existing RAMs to date have been static, so two new dynamic RAMs are
developed to capture changes in impact factor over time and escalating
journal self-citations. Alternative RAMs may be calculated annually or
updated daily to determine When, Where, and How (frequently) published
articles are cited (see Chang et al., 2011a-c). The RAMs are grouped in
four distinct classes that include impact factor, mean citations, and
non-citations, journal policy, number of high quality articles, journal
influence, and article influence. These classes include the most widely
used RAMs, namely, the classic 2-year impact factor including journal
self-citations (2YIF), 2-year impact factor excluding journal self
citations (2YIF*), 5-year impact factor including journal self citations
(5YIF), Eigenfactor (or Journal Influence), Article Influence, h-index,
and Papers Ignored-By Even The Authors (PI-BETA). As all existing RAMs to
date have been static, two new dynamic RAMs are developed to capture
changes in impact factor over time (5YD2 = 5YIF/2YIF) and
Escalating Self-Citations (ESC). We highlight robust rankings based on the
harmonic mean of the ranks of RAMs across the four classes. It is shown
that emphasizing the 2YIF of a journal, which partly answers the question
as to When published articles are cited, to the exclusion of other
informative RAMs, which answer Where and How (frequently) published
articles are cited, can lead to a distorted evaluation of journal quality,
impact, and influence relative to the harmonic mean of the ranks.
Journal: Econometric Reviews
Pages: 50-97
Issue: 1
Volume: 35
Year: 2016
Month: 1
X-DOI: 10.1080/07474938.2014.956639
File-URL: http://hdl.handle.net/10.1080/07474938.2014.956639
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:1:p:50-97
Template-Type: ReDIF-Article 1.0
Author-Name: Xiaodong Liu
Author-X-Name-First: Xiaodong
Author-X-Name-Last: Liu
Title: Nonparametric Estimation of Large Auctions with Risk Averse Bidders
Abstract:
This article studies the robustness of Guerre et al.'s (2000)
two-step nonparametric estimation procedure in a first-price, sealed-bid
auction with n (n >> 1) risk
averse bidders. Based on an asymptotic approximation with precision of
order O(n -super- - 2) of
the intractable equilibrium bidding function, we establish the uniform
consistency with rates of convergence of Guerre et al.'s (2000) two-step
nonparametric estimator in the presence of risk aversion. Monte Carlo
experiments show that the two-step nonparametric estimator performs
reasonably well with a moderate number of bidders such as six.
Journal: Econometric Reviews
Pages: 98-121
Issue: 1
Volume: 35
Year: 2016
Month: 1
X-DOI: 10.1080/07474938.2013.806719
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806719
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:1:p:98-121
Template-Type: ReDIF-Article 1.0
Author-Name: Tomás del Barrio Castro
Author-X-Name-First: Tomás
Author-X-Name-Last: del Barrio Castro
Author-Name: Denise R. Osborn
Author-X-Name-First: Denise R.
Author-X-Name-Last: Osborn
Author-Name: A.M. Robert Taylor
Author-X-Name-First: A.M. Robert
Author-X-Name-Last: Taylor
Title: The Performance of Lag Selection and Detrending Methods for HEGY Seasonal Unit Root Tests
Abstract:
This paper analyzes two key issues for the empirical
implementation of parametric seasonal unit root tests, namely generalized
least squares (GLS) versus ordinary least squares (OLS) detrending and the
selection of the lag augmentation polynomial. Through an extensive Monte
Carlo analysis, the performance of a battery of lag selection techniques
is analyzed, including a new extension of modified information criteria
for the seasonal unit root context. All procedures are applied for both
OLS and GLS detrending for a range of data generating processes, also
including an examination of hybrid OLS-GLS detrending in conjunction with
(seasonal) modified AIC lag selection. An application to quarterly U.S.
industrial production indices illustrates the practical implications of
choices made.
Journal: Econometric Reviews
Pages: 122-168
Issue: 1
Volume: 35
Year: 2016
Month: 1
X-DOI: 10.1080/07474938.2013.807710
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807710
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:1:p:122-168
Template-Type: ReDIF-Article 1.0
Author-Name: Heino Bohn Nielsen
Author-X-Name-First: Heino Bohn
Author-X-Name-Last: Nielsen
Title: The Co-Integrated Vector Autoregression with Errors-in-Variables
Abstract:
The co-integrated vector autoregression is extended to allow
variables to be observed with classical measurement errors (ME). For
estimation, the model is parametrized as a time invariant state-space
form, and an accelerated expectation-maximization
algorithm is derived. A simulation study shows that (i)
the finite-sample properties of the maximum likelihood (ML) estimates and
reduced rank test statistics are excellent, (ii) neglected
measurement errors will generally distort unit root inference due to a
moving average component in the residuals, and (iii) the
moving average component may, in principle, be approximated by a long
autoregression, but a pure autoregression cannot identify the
autoregressive structure of the latent process, and the adjustment
coefficients are estimated with a substantial asymptotic bias. An
application to the zero-coupon yield-curve is given.
Journal: Econometric Reviews
Pages: 169-200
Issue: 2
Volume: 35
Year: 2016
Month: 2
X-DOI: 10.1080/07474938.2013.806853
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806853
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:2:p:169-200
Template-Type: ReDIF-Article 1.0
Author-Name: Biao Zhang
Author-X-Name-First: Biao
Author-X-Name-Last: Zhang
Title: Empirical Likelihood in Causal Inference
Abstract:
This paper discusses the estimation of average treatment
effects in observational causal inferences. By employing a working
propensity score and two working regression models for treatment and
control groups, Robins et al. (1994, 1995) introduced the augmented
inverse probability weighting (AIPW) method for estimation of average
treatment effects, which extends the inverse probability weighting (IPW)
method of Horvitz and Thompson (1952); the AIPW estimators are locally
efficient and doubly robust. In this paper, we study a hybrid of the
empirical likelihood method and the method of moments by employing three
estimating functions, which can generate estimators for average treatment
effects that are locally efficient and doubly robust. The proposed
estimators of average treatment effects are efficient for the given choice
of three estimating functions when the working propensity score is
correctly specified, and thus are more efficient than the AIPW estimators.
In addition, we consider a regression method for estimation of the average
treatment effects when working regression models for both the treatment
and control groups are correctly specified; the asymptotic variance of the
resulting estimator is no greater than the semiparametric variance bound
characterized by the theory of Robins et al. (1994, 1995). Finally, we
present a simulation study to compare the finite-sample performance of
various methods with respect to bias, efficiency, and robustness to model
misspecification.
Journal: Econometric Reviews
Pages: 201-231
Issue: 2
Volume: 35
Year: 2016
Month: 2
X-DOI: 10.1080/07474938.2013.808490
File-URL: http://hdl.handle.net/10.1080/07474938.2013.808490
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:2:p:201-231
Template-Type: ReDIF-Article 1.0
Author-Name: Francesco Audrino
Author-X-Name-First: Francesco
Author-X-Name-Last: Audrino
Author-Name: Fulvio Corsi
Author-X-Name-First: Fulvio
Author-X-Name-Last: Corsi
Author-Name: Kameliya Filipova
Author-X-Name-First: Kameliya
Author-X-Name-Last: Filipova
Title: Bond Risk Premia Forecasting: A Simple Approach for Extracting Macroeconomic Information from a Panel of Indicators
Abstract:
We propose a simple but effective estimation procedure to
extract the level and the volatility dynamics of a latent macroeconomic
factor from a panel of observable indicators. Our approach is based on a
multivariate conditionally heteroskedastic exact factor model that can
take into account the heteroskedasticity feature shown by most
macroeconomic variables and relies on an iterated Kalman filter procedure.
In simulations we show the unbiasedness of the proposed estimator and its
superiority to different approaches introduced in the literature.
Simulation results are confirmed in applications to real inflation data
with the goal of forecasting long-term bond risk premia. Moreover, we find
that the extracted level and conditional variance of the latent factor for
inflation are strongly related to NBER business cycles.
Journal: Econometric Reviews
Pages: 232-256
Issue: 2
Volume: 35
Year: 2016
Month: 2
X-DOI: 10.1080/07474938.2013.833809
File-URL: http://hdl.handle.net/10.1080/07474938.2013.833809
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:2:p:232-256
Template-Type: ReDIF-Article 1.0
Author-Name: Moawia Alghalith
Author-X-Name-First: Moawia
Author-X-Name-Last: Alghalith
Title: Estimating the Stock/Portfolio Volatility and the Volatility of Volatility: A New Simple Method
Abstract:
We devise a convenient way to estimate stochastic volatility
and its volatility. Our method is applicable to both cross-sectional and
time series data, and both high-frequency and low-frequency data.
Moreover, this method, when applied to cross-sectional data (a collection
of risky assets, i.e., a portfolio), provides a great simplification in the sense
that estimating the volatility of the portfolio does not require an
estimation of a volatility matrix (the volatilities of
the individual assets in the portfolio and their correlations).
Furthermore, there is no need to generate volatility data.
Journal: Econometric Reviews
Pages: 257-262
Issue: 2
Volume: 35
Year: 2016
Month: 2
X-DOI: 10.1080/07474938.2014.932144
File-URL: http://hdl.handle.net/10.1080/07474938.2014.932144
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:2:p:257-262
Template-Type: ReDIF-Article 1.0
Author-Name: Frédéric Ferraty
Author-X-Name-First: Frédéric
Author-X-Name-Last: Ferraty
Author-Name: Alejandro Quintela-Del-Río
Author-X-Name-First: Alejandro
Author-X-Name-Last: Quintela-Del-Río
Title: Conditional VAR and Expected Shortfall: A New Functional Approach
Abstract:
We estimate two well-known risk measures, the value-at-risk
(VAR) and the expected shortfall, conditionally to a functional variable
(i.e., a random variable valued in some semi(pseudo)-metric space). We use
nonparametric kernel estimation for constructing estimators of these
quantities, under general dependence conditions. Theoretical properties
are stated whereas practical aspects are illustrated on simulated data:
nonlinear functional and GARCH(1,1) models. Some ideas on bandwidth
selection using bootstrap are introduced. Finally, an empirical example is
given through data of the S&P 500 time series.
Journal: Econometric Reviews
Pages: 263-292
Issue: 2
Volume: 35
Year: 2016
Month: 2
X-DOI: 10.1080/07474938.2013.807107
File-URL: http://hdl.handle.net/10.1080/07474938.2013.807107
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:2:p:263-292
Template-Type: ReDIF-Article 1.0
Author-Name: Sofia Anyfantaki
Author-X-Name-First: Sofia
Author-X-Name-Last: Anyfantaki
Author-Name: Antonis Demos
Author-X-Name-First: Antonis
Author-X-Name-Last: Demos
Title: Estimation and Properties of a Time-Varying EGARCH(1,1) in Mean Model
Abstract:
Time-varying GARCH-M models are commonly employed in
econometrics and financial economics. Yet the recursive nature of the
conditional variance makes likelihood analysis of these models
computationally infeasible. This article outlines the issues and suggests
employing a Markov chain Monte Carlo algorithm which allows the
calculation of a classical estimator via the simulated EM algorithm or a
simulated Bayesian solution in only O(T)
computational operations, where T is the sample size.
Furthermore, the theoretical dynamic properties of a
time-varying-parameter EGARCH(1,1)-M are derived. We discuss them and
apply the suggested Bayesian estimation to three major stock markets.
Journal: Econometric Reviews
Pages: 293-310
Issue: 2
Volume: 35
Year: 2016
Month: 2
X-DOI: 10.1080/07474938.2014.966639
File-URL: http://hdl.handle.net/10.1080/07474938.2014.966639
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:2:p:293-310
Template-Type: ReDIF-Article 1.0
Author-Name: Koji Miyawaki
Author-X-Name-First: Koji
Author-X-Name-Last: Miyawaki
Author-Name: Yasuhiro Omori
Author-X-Name-First: Yasuhiro
Author-X-Name-Last: Omori
Author-Name: Akira Hibiki
Author-X-Name-First: Akira
Author-X-Name-Last: Hibiki
Title: Exact Estimation of Demand Functions under Block-Rate Pricing
Abstract:
This article proposes an exact estimation of demand functions under
block-rate pricing by focusing on increasing block-rate pricing. This is
the first study that explicitly considers the separability condition which
has been ignored in previous literature. Under this pricing structure, the
price changes when consumption exceeds a certain threshold and the
consumer faces a utility maximization problem subject to a
piecewise-linear budget constraint. Solving this maximization problem
leads to a statistical model in which model parameters are strongly
restricted by the separability condition. In this article, by taking a
hierarchical Bayesian approach, we implement a Markov chain Monte Carlo
simulation to properly estimate the demand function. We find, however,
that the convergence of the distribution of simulated samples to the
posterior distribution is slow, requiring the addition of a scale
transformation step for the parameters in the Gibbs sampler. These proposed
methods are then applied to estimate the Japanese residential water demand
function.
Journal: Econometric Reviews
Pages: 311-343
Issue: 3
Volume: 35
Year: 2016
Month: 3
X-DOI: 10.1080/07474938.2013.806857
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806857
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:3:p:311-343
Template-Type: ReDIF-Article 1.0
Author-Name: Alain Guay
Author-X-Name-First: Alain
Author-X-Name-Last: Guay
Author-Name: Florian Pelgrin
Author-X-Name-First: Florian
Author-X-Name-Last: Pelgrin
Title: Using Implied Probabilities to Improve the Estimation of Unconditional Moment Restrictions for Weakly Dependent Data
Abstract:
In this article, we investigate the use of implied probabilities (Back and
Brown, 1993) to improve estimation in unconditional moment conditions
models. Using the seminal contributions of Bonnal and Renault (2001) and
Antoine et al. (2007), we propose two three-step Euclidean empirical
likelihood (3S-EEL) estimators for weakly dependent data. Both estimators
make use of a control variates principle that can be interpreted in terms
of implied probabilities in order to achieve higher-order improvements
relative to the traditional two-step GMM estimator. A Monte Carlo study
reveals that the finite and large sample properties of the three-step
estimators compare favorably to the existing approaches: the two-step GMM
and the continuous updating estimator.
Journal: Econometric Reviews
Pages: 344-372
Issue: 3
Volume: 35
Year: 2016
Month: 3
X-DOI: 10.1080/07474938.2014.966630
File-URL: http://hdl.handle.net/10.1080/07474938.2014.966630
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:3:p:344-372
Template-Type: ReDIF-Article 1.0
Author-Name: Tommaso Proietti
Author-X-Name-First: Tommaso
Author-X-Name-Last: Proietti
Title: The Multistep Beveridge--Nelson Decomposition
Abstract:
The Beveridge--Nelson decomposition defines the trend component in terms
of the eventual forecast function, as the value the series would take if
it were on its long-run path. The article introduces the multistep
Beveridge--Nelson decomposition, which arises when the forecast function
is obtained by the direct autoregressive approach, which optimizes the
predictive ability of the AR model at forecast horizons greater than one.
We compare our proposal with the standard Beveridge--Nelson decomposition,
for which the forecast function is obtained by iterating the
one-step-ahead predictions via the chain rule. We illustrate that the
multistep Beveridge--Nelson trend is more efficient than the standard one
in the presence of model misspecification, and we subsequently assess the
predictive validity of the extracted transitory component with respect to
future growth.
Journal: Econometric Reviews
Pages: 373-395
Issue: 3
Volume: 35
Year: 2016
Month: 3
X-DOI: 10.1080/07474938.2014.966631
File-URL: http://hdl.handle.net/10.1080/07474938.2014.966631
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:3:p:373-395
Template-Type: ReDIF-Article 1.0
Author-Name: Joakim Westerlund
Author-X-Name-First: Joakim
Author-X-Name-Last: Westerlund
Title: Pooled Panel Unit Root Tests and the Effect of Past Initialization
Abstract:
This paper analyzes the role of initialization when testing for a unit
root in panel data, an issue that has received surprisingly little
attention in the literature. In fact, most studies assume that the initial
value is either zero or bounded. As a response to this, the current paper
considers a model in which the initialization is in the past, which is
shown to have several distinctive features that make it attractive, even
in comparison to the common time series practice of making the initial
value a draw from its unconditional distribution under the stationary
alternative. The results have implications not only for theory, but also
for applied work. In particular, and in contrast to the time series case,
in panels the effect of the initialization need not be negative but can
actually lead to improved test performance.
Journal: Econometric Reviews
Pages: 396-427
Issue: 3
Volume: 35
Year: 2016
Month: 3
X-DOI: 10.1080/07474938.2013.833829
File-URL: http://hdl.handle.net/10.1080/07474938.2013.833829
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:3:p:396-427
Template-Type: ReDIF-Article 1.0
Author-Name: Gerdie Everaert
Author-X-Name-First: Gerdie
Author-X-Name-Last: Everaert
Author-Name: Tom De Groote
Author-X-Name-First: Tom
Author-X-Name-Last: De Groote
Title: Common Correlated Effects Estimation of Dynamic Panels with Cross-Sectional Dependence
Abstract:
We derive inconsistency expressions for dynamic panel data estimators
under error cross-sectional dependence generated by an unobserved common
factor in both the fixed effect and the incidental trends case. We show
that for a temporally dependent factor, the standard within groups (WG)
estimator is inconsistent even as both N and
T tend to infinity. Next we investigate the properties of
the common correlated effects pooled (CCEP) estimator of Pesaran (2006)
which eliminates the error cross-sectional dependence using
cross-sectional averages of the data. In contrast to the static case, the
CCEP estimator is consistent only when, in addition to N,
T also tends to infinity. It is shown that for the most
relevant parameter settings, the inconsistency of the CCEP estimator is
larger than that of the infeasible WG estimator, which includes the common
factors as regressors. Restricting the CCEP estimator results in a
somewhat smaller inconsistency. The small sample properties of the various
estimators are analyzed using Monte Carlo experiments. The simulation
results suggest that the CCEP estimator can be used to estimate dynamic
panel data models provided T is not too small. The size
of N is of less importance.
Journal: Econometric Reviews
Pages: 428-463
Issue: 3
Volume: 35
Year: 2016
Month: 3
X-DOI: 10.1080/07474938.2014.966635
File-URL: http://hdl.handle.net/10.1080/07474938.2014.966635
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:3:p:428-463
Template-Type: ReDIF-Article 1.0
Author-Name: Prosper Dovonon
Author-X-Name-First: Prosper
Author-X-Name-Last: Dovonon
Title: Large Sample Properties of the Three-Step Euclidean Likelihood Estimators under Model Misspecification
Abstract:
This article studies the three-step Euclidean likelihood (3S) estimator
and its corrected version as proposed by Antoine et al. (2007) in globally
misspecified models. We establish that the 3S estimator stays
√n-convergent and
asymptotically Gaussian. The discontinuity in the shrinkage factor makes
the analysis of the corrected-3S estimator harder to carry out in
misspecified models. We propose a slight modification to this factor to
control its rate of divergence in case of misspecification. We show that
the resulting modified-3S estimator is also higher order equivalent to the
maximum empirical likelihood (EL) estimator in well-specified models and
√n-convergent and
asymptotically Gaussian in misspecified models. Its asymptotic
distribution robust to misspecification is also provided. Because of these
properties, both the 3S and the modified-3S estimators could be considered
as computationally attractive alternatives to the exponentially tilted
empirical likelihood estimator proposed by Schennach (2007) which also is
higher order equivalent to EL in well-specified models and
√n-convergent in
misspecified models.
Journal: Econometric Reviews
Pages: 465-514
Issue: 4
Volume: 35
Year: 2016
Month: 4
X-DOI: 10.1080/07474938.2014.966634
File-URL: http://hdl.handle.net/10.1080/07474938.2014.966634
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:4:p:465-514
Template-Type: ReDIF-Article 1.0
Author-Name: José M. R. Murteira
Author-X-Name-First: José M. R.
Author-X-Name-Last: Murteira
Author-Name: Joaquim J. S. Ramalho
Author-X-Name-First: Joaquim J. S.
Author-X-Name-Last: Ramalho
Title: Regression Analysis of Multivariate Fractional Data
Abstract:
The present article discusses alternative regression models and estimation
methods for dealing with multivariate fractional response variables. Both
conditional mean models, estimable by quasi-maximum likelihood, and fully
parametric models (Dirichlet and Dirichlet-multinomial), estimable by
maximum likelihood, are considered. A new parameterization is proposed for
the parametric models, which accommodates the most common specifications
for the conditional mean (e.g., multinomial logit, nested logit, random
parameters logit, dogit). The text also discusses at some length the
specification analysis of fractional regression models, proposing several
tests that can be performed through artificial regressions. Finally, an
extensive Monte Carlo study evaluates the finite sample properties of most
of the estimators and tests considered.
Journal: Econometric Reviews
Pages: 515-552
Issue: 4
Volume: 35
Year: 2016
Month: 4
X-DOI: 10.1080/07474938.2013.806849
File-URL: http://hdl.handle.net/10.1080/07474938.2013.806849
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:4:p:515-552
Template-Type: ReDIF-Article 1.0
Author-Name: Stephen G. Donald
Author-X-Name-First: Stephen G.
Author-X-Name-Last: Donald
Author-Name: Yu-Chin Hsu
Author-X-Name-First: Yu-Chin
Author-X-Name-Last: Hsu
Title: Improving the Power of Tests of Stochastic Dominance
Abstract:
We extend Hansen's (2005) recentering method to a continuum of inequality
constraints to construct new Kolmogorov--Smirnov tests for stochastic
dominance of any pre-specified order. We show that our tests have correct
size asymptotically, are consistent against fixed alternatives and are
unbiased against some N^{-1/2} local
alternatives. It is shown that by avoiding the use of the least favorable
configuration, our tests are less conservative and more powerful than
Barrett and Donald's (2003) and in some simulation examples we consider,
we find that our tests can be more powerful than the subsampling test of
Linton et al. (2005). We apply our method to test stochastic dominance
relations between Canadian income distributions in 1978 and 1986 as
considered in Barrett and Donald (2003) and find that some of the
hypothesis testing results are different using the new method.
Journal: Econometric Reviews
Pages: 553-585
Issue: 4
Volume: 35
Year: 2016
Month: 4
X-DOI: 10.1080/07474938.2013.833813
File-URL: http://hdl.handle.net/10.1080/07474938.2013.833813
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:4:p:553-585
Template-Type: ReDIF-Article 1.0
Author-Name: Ping Yu
Author-X-Name-First: Ping
Author-X-Name-Last: Yu
Title: Understanding Estimators of Treatment Effects in Regression Discontinuity Designs
Abstract:
In this paper, we propose two new estimators of treatment effects in
regression discontinuity designs. These estimators can aid understanding
of the existing estimators such as the local polynomial estimator and the
partially linear estimator. The first estimator is the partially
polynomial estimator which extends the partially linear estimator by
further incorporating derivative differences of the conditional mean of
the outcome on the two sides of the discontinuity point. This estimator is
related to the local polynomial estimator by a relocalization effect.
Unlike the partially linear estimator, this estimator can achieve the
optimal rate of convergence even under broader regularity conditions. The
second estimator is an instrumental variable estimator in the fuzzy
design. This estimator will reduce to the local polynomial estimator if
higher order endogeneities are neglected. We study the asymptotic
properties of these two estimators and conduct simulation studies to
confirm the theoretical analysis.
Journal: Econometric Reviews
Pages: 586-637
Issue: 4
Volume: 35
Year: 2016
Month: 4
X-DOI: 10.1080/07474938.2013.833831
File-URL: http://hdl.handle.net/10.1080/07474938.2013.833831
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:4:p:586-637
Template-Type: ReDIF-Article 1.0
Author-Name: Badi H. Baltagi
Author-X-Name-First: Badi H.
Author-X-Name-Last: Baltagi
Author-Name: Long Liu
Author-X-Name-First: Long
Author-X-Name-Last: Liu
Title: Random Effects, Fixed Effects and Hausman's Test for the Generalized Mixed Regressive Spatial Autoregressive Panel Data Model
Abstract:
This article suggests random and fixed effects spatial two-stage least
squares estimators for the generalized mixed regressive spatial
autoregressive panel data model. This extends the generalized spatial
panel model of Baltagi et al. (2013) by the inclusion of a spatial lag
term. The estimation method utilizes the Generalized Moments method
suggested by Kapoor et al. (2007) for a spatial autoregressive panel data
model. We derive the asymptotic distributions of these estimators and
suggest a Hausman test à la Mutl and Pfaffermayr (2011) based on the
difference between these estimators. Monte Carlo experiments are performed
to investigate the performance of these estimators as well as the
corresponding Hausman test.
Journal: Econometric Reviews
Pages: 638-658
Issue: 4
Volume: 35
Year: 2016
Month: 4
X-DOI: 10.1080/07474938.2014.998148
File-URL: http://hdl.handle.net/10.1080/07474938.2014.998148
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:4:p:638-658
Template-Type: ReDIF-Article 1.0
Author-Name: G. Mesters
Author-X-Name-First: G.
Author-X-Name-Last: Mesters
Author-Name: S. J. Koopman
Author-X-Name-First: S. J.
Author-X-Name-Last: Koopman
Author-Name: M. Ooms
Author-X-Name-First: M.
Author-X-Name-Last: Ooms
Title: Monte Carlo Maximum Likelihood Estimation for Generalized Long-Memory Time Series Models
Abstract:
An exact maximum likelihood method is developed for the estimation of
parameters in a non-Gaussian nonlinear density function that depends on a
latent Gaussian dynamic process with long-memory properties. Our method
relies on the method of importance sampling and on a linear Gaussian
approximating model from which the latent process can be simulated. Given
the presence of a latent long-memory process, we require a modification of
the importance sampling technique. In particular, the long-memory process
needs to be approximated by a finite dynamic linear process. Two possible
approximations are discussed and are compared with each other. We show
that an autoregression obtained from minimizing mean squared prediction
errors leads to an effective and feasible method. In our empirical study,
we analyze ten daily log-return series from the S&P 500 stock index by
univariate and multivariate long-memory stochastic volatility models. We
compare the in-sample and out-of-sample performance of a number of models
within the class of long-memory stochastic volatility models.
Journal: Econometric Reviews
Pages: 659-687
Issue: 4
Volume: 35
Year: 2016
Month: 4
X-DOI: 10.1080/07474938.2015.1031014
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1031014
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:4:p:659-687
Template-Type: ReDIF-Article 1.0
Author-Name: Florian Heiss
Author-X-Name-First: Florian
Author-X-Name-Last: Heiss
Title: Discrete Choice Methods with Simulation
Journal: Econometric Reviews
Pages: 688-692
Issue: 4
Volume: 35
Year: 2016
Month: 4
X-DOI: 10.1080/07474938.2014.975634
File-URL: http://hdl.handle.net/10.1080/07474938.2014.975634
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:4:p:688-692
Template-Type: ReDIF-Article 1.0
Author-Name: Sadat Reza
Author-X-Name-First: Sadat
Author-X-Name-Last: Reza
Author-Name: Paul Rilstone
Author-X-Name-First: Paul
Author-X-Name-Last: Rilstone
Title: Semiparametric Efficiency Bounds and Efficient Estimation of Discrete Duration Models with Unspecified Hazard Rate
Abstract:
We consider semiparametric estimation of discrete duration models whose
hazard rate can be characterized as an unknown transformation of a
parametric index of the observable covariates and elapsed spell length.
The information matrix is derived. In the case of separable duration
dependence the information matrix for the parameters of the index is
singular. In that situation, the information matrix for the regression
component of the hazard function is derived. The information matrix is
also singular when individual-specific/time-invariant unobserved errors
are introduced. We develop √N-consistent
estimators for those instances when the information matrix is nonsingular.
Less-than-√N-consistent
estimators of the duration dependence component of the hazard rate of the
separable model are developed. Simulations and an application to strike
durations illustrate the viability of the approach.
Journal: Econometric Reviews
Pages: 693-726
Issue: 5
Volume: 35
Year: 2016
Month: 5
X-DOI: 10.1080/07474938.2014.966637
File-URL: http://hdl.handle.net/10.1080/07474938.2014.966637
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:5:p:693-726
Template-Type: ReDIF-Article 1.0
Author-Name: Helmut Herwartz
Author-X-Name-First: Helmut
Author-X-Name-Last: Herwartz
Author-Name: Florian Siedenburg
Author-X-Name-First: Florian
Author-X-Name-Last: Siedenburg
Author-Name: Yabibal M. Walle
Author-X-Name-First: Yabibal M.
Author-X-Name-Last: Walle
Title: Heteroskedasticity Robust Panel Unit Root Testing Under Variance Breaks in Pooled Regressions
Abstract:
Noting that many economic variables display occasional shifts in their
second order moments, we investigate the performance of homogenous panel
unit root tests in the presence of permanent volatility shifts. It is
shown that in this case the test statistic proposed by Herwartz and
Siedenburg (2008) is asymptotically standard Gaussian. By means of a
simulation study we illustrate the performance of first and second
generation panel unit root tests and undertake a more detailed comparison
of the test in Herwartz and Siedenburg (2008) and its heteroskedasticity
consistent Cauchy counterpart introduced in Demetrescu and Hanck (2012a).
As an empirical illustration, we reassess evidence on the Fisher
hypothesis with data from nine countries over the period 1961Q2--2011Q2.
Empirical evidence supports panel stationarity of the real interest rate
for the entire subperiod. With regard to the most recent two decades, the
test results cast doubts on market integration, since the real interest
rate is diagnosed nonstationary.
Journal: Econometric Reviews
Pages: 727-750
Issue: 5
Volume: 35
Year: 2016
Month: 5
X-DOI: 10.1080/07474938.2014.966638
File-URL: http://hdl.handle.net/10.1080/07474938.2014.966638
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:5:p:727-750
Template-Type: ReDIF-Article 1.0
Author-Name: Matei Demetrescu
Author-X-Name-First: Matei
Author-X-Name-Last: Demetrescu
Author-Name: Christoph Hanck
Author-X-Name-First: Christoph
Author-X-Name-Last: Hanck
Title: Robust Inference for Near-Unit Root Processes with Time-Varying Error Variances
Abstract:
The autoregressive Cauchy estimator uses the sign of the first lag as
instrumental variable (IV); under independent and identically distributed
(i.i.d.) errors, the resulting IV t-type statistic is
known to have a standard normal limiting distribution in the unit root
case. With unconditional heteroskedasticity, the ordinary least squares
(OLS) t statistic is affected in the unit root case; but
the paper shows that, by using some nonlinear transformation behaving
asymptotically like the sign as instrument, limiting normality of the IV
t-type statistic is maintained when the series to be
tested has no deterministic trends. Neither estimation of the so-called
variance profile nor bootstrap procedures are required to this end. The
Cauchy unit root test has power in the same 1/T
neighborhoods as the usual unit root tests, also for a wide range of
magnitudes for the initial value. It is furthermore shown to be
competitive with other, bootstrap-based, robust tests. When the series
exhibit a linear trend, however, the null distribution of the Cauchy test
for a unit root becomes nonstandard, reminiscent of the Dickey-Fuller
distribution. In this case, inference robust to nonstationary volatility
is obtained via the wild bootstrap.
Journal: Econometric Reviews
Pages: 751-781
Issue: 5
Volume: 35
Year: 2016
Month: 5
X-DOI: 10.1080/07474938.2014.976525
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976525
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:5:p:751-781
Template-Type: ReDIF-Article 1.0
Author-Name: Pierre Perron
Author-X-Name-First: Pierre
Author-X-Name-Last: Perron
Author-Name: Yohei Yamamoto
Author-X-Name-First: Yohei
Author-X-Name-Last: Yamamoto
Title: On the Usefulness or Lack Thereof of Optimality Criteria for Structural Change Tests
Abstract:
Elliott and Müller (2006) considered the problem of testing for
general types of parameter variations, including infrequent breaks. They
developed a framework that yields optimal tests, in the sense that they
nearly attain some local Gaussian power envelope. The main ingredient in
their setup is that the variance of the process generating the changes in
the parameters must go to zero at a fast rate. They recommended the
so-called q̂LL test, a partial-sums-type test based
on the residuals obtained from the restricted model. We show that for
breaks that are very small, its power is indeed higher than other tests,
including the popular sup-Wald (SW) test. However, the differences are
very minor. When the magnitude of change is moderate to large, the power
of the test is very low in the context of a regression with lagged
dependent variables or when a correction is applied to account for serial
correlation in the errors. In many cases, the power goes to zero as the
magnitude of change increases. The power of the SW test does not show this
non-monotonicity and its power is far superior to the
q̂LL test when the break is not very small. We
claim that the optimality of the q̂LL test does not
come from the properties of the test statistics but the criterion adopted,
which is not useful to analyze structural change tests. Instead, we use
fixed-break size asymptotic approximations to assess the relative
efficiency or power of the two tests. When doing so, it is shown that the
SW test indeed dominates the q̂LL test and, in many
cases, the latter has zero relative asymptotic efficiency.
Journal: Econometric Reviews
Pages: 782-844
Issue: 5
Volume: 35
Year: 2016
Month: 5
X-DOI: 10.1080/07474938.2014.977621
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977621
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:5:p:782-844
Template-Type: ReDIF-Article 1.0
Author-Name: Joakim Westerlund
Author-X-Name-First: Joakim
Author-X-Name-Last: Westerlund
Author-Name: Mehdi Hosseinkouchack
Author-X-Name-First: Mehdi
Author-X-Name-Last: Hosseinkouchack
Author-Name: Martin Solberger
Author-X-Name-First: Martin
Author-X-Name-Last: Solberger
Title: The Local Power of the CADF and CIPS Panel Unit Root Tests
Abstract:
Very little is known about the local power of second generation panel unit
root tests that are robust to cross-section dependence. This article
derives the local asymptotic power functions of the cross-sectionally
augmented Dickey--Fuller (CADF) and CIPS tests of Pesaran (2007), which
are among the most popular tests around.
Journal: Econometric Reviews
Pages: 845-870
Issue: 5
Volume: 35
Year: 2016
Month: 5
X-DOI: 10.1080/07474938.2014.977077
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977077
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:5:p:845-870
Template-Type: ReDIF-Article 1.0
Author-Name: Efthymios G. Tsionas
Author-X-Name-First: Efthymios G.
Author-X-Name-Last: Tsionas
Author-Name: Kien C. Tran
Author-X-Name-First: Kien C.
Author-X-Name-Last: Tran
Title: On the Joint Estimation of Heterogeneous Technologies, Technical, and Allocative Inefficiency
Abstract:
In this article, we provide a semiparametric approach to the joint
measurement of technical and allocative inefficiency in a way that the
internal consistency of the specification of allocative errors in the
objective function (e.g., cost function) and the derivative equations
(e.g., share or input demand functions) is assured. We start from the
Cobb--Douglas production and shadow cost system. We show that the shadow
cost system has a closed-form likelihood function contrary to what was
previously thought. In turn, we use the method of local maximum likelihood
applied to a system of equations to obtain firm-specific parameter
estimates (which reveal heterogeneity in production) as well as measures
of technical and allocative inefficiency and its cost. We illustrate its
practical application using data on U.S. electric utilities.
Journal: Econometric Reviews
Pages: 871-893
Issue: 5
Volume: 35
Year: 2016
Month: 5
X-DOI: 10.1080/07474938.2014.975635
File-URL: http://hdl.handle.net/10.1080/07474938.2014.975635
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:5:p:871-893
Template-Type: ReDIF-Article 1.0
Author-Name: Alan T. K. Wan
Author-X-Name-First: Alan T. K.
Author-X-Name-Last: Wan
Author-Name: Jinhong You
Author-X-Name-First: Jinhong
Author-X-Name-Last: You
Author-Name: Riquan Zhang
Author-X-Name-First: Riquan
Author-X-Name-Last: Zhang
Title: A Seemingly Unrelated Nonparametric Additive Model with Autoregressive Errors
Abstract:
This article considers a nonparametric additive seemingly unrelated
regression model with autoregressive errors, and develops estimation and
inference procedures for this model. Our proposed method first estimates
the unknown functions by combining polynomial spline series approximations
with least squares, and then uses the fitted residuals together with the
smoothly clipped absolute deviation (SCAD) penalty to identify the error
structure and estimate the unknown autoregressive coefficients. Based on
the polynomial spline series estimator and the fitted error structure, a
two-stage local polynomial improved estimator for the unknown functions of
the mean is further developed. Our procedure applies a prewhitening
transformation of the dependent variable, and also takes into account the
contemporaneous correlations across equations. We show that the resulting
estimator possesses an oracle property, and is asymptotically more
efficient than estimators that neglect the autocorrelation and/or
contemporaneous correlations of errors. We investigate the small sample
properties of the proposed procedure in a simulation study.
Journal: Econometric Reviews
Pages: 894-928
Issue: 5
Volume: 35
Year: 2016
Month: 5
X-DOI: 10.1080/07474938.2014.998149
File-URL: http://hdl.handle.net/10.1080/07474938.2014.998149
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:5:p:894-928
Template-Type: ReDIF-Article 1.0
Author-Name: Álvaro Cartea
Author-X-Name-First: Álvaro
Author-X-Name-Last: Cartea
Author-Name: Dimitrios Karyampas
Author-X-Name-First: Dimitrios
Author-X-Name-Last: Karyampas
Title: The Relationship between the Volatility of Returns and the Number of Jumps in Financial Markets
Abstract:
We propose a methodology to employ high frequency financial data to obtain
estimates of volatility of log-prices which are not affected by
microstructure noise and Lévy jumps. We introduce the “number
of jumps” as a variable to explain and predict volatility and show
that the number of jumps in SPY prices is an important variable to explain
the daily volatility of the SPY log-returns, has more explanatory power
than other variables (e.g., high and low, open and close), and has a
similar explanatory power to that of the VIX. Finally, the number of jumps
is very useful to forecast volatility and contains information that is not
impounded in the VIX.
Journal: Econometric Reviews
Pages: 929-950
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.976529
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976529
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:929-950
Template-Type: ReDIF-Article 1.0
Author-Name: George Kapetanios
Author-X-Name-First: George
Author-X-Name-Last: Kapetanios
Author-Name: Zacharias Psaradakis
Author-X-Name-First: Zacharias
Author-X-Name-Last: Psaradakis
Title: Semiparametric Sieve-Type Generalized Least Squares Inference
Abstract:
This article considers the problem of statistical inference in linear
regression models with dependent errors. A sieve-type generalized least
squares (GLS) procedure is proposed based on an autoregressive
approximation to the generating mechanism of the errors. The asymptotic
properties of the sieve-type GLS estimator are established under general
conditions, including mixingale-type conditions as well as conditions
which allow for long-range dependence in the stochastic regressors and/or
the errors. A Monte Carlo study examines the finite-sample properties of
the method for testing regression hypotheses.
Journal: Econometric Reviews
Pages: 951-985
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.975639
File-URL: http://hdl.handle.net/10.1080/07474938.2014.975639
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:951-985
Template-Type: ReDIF-Article 1.0
Author-Name: Aaron D. Smallwood
Author-X-Name-First: Aaron D.
Author-X-Name-Last: Smallwood
Title: A Monte Carlo Investigation of Unit Root Tests and Long Memory in Detecting Mean Reversion in I(0) Regime Switching, Structural Break, and Nonlinear Data
Abstract:
The potential observational equivalence between various types of
nonlinearity and long memory has been recognized by the econometrics
community since at least the contribution of Diebold and Inoue (2001). A
large literature has developed in an attempt to ascertain whether or not
the long memory finding in many economic series is spurious. Yet to date,
no study has analyzed the consequences of using long memory methods to
test for unit roots when the “truth” derives from regime
switching, structural breaks, or other types of mean reverting
nonlinearity. In this article, I conduct a comprehensive Monte Carlo
analysis to investigate the consequences of using tests designed to have
power against fractional integration when the actual data generating
process is unknown. I additionally consider the use of tests designed to
have power against breaks and threshold nonlinearity. The findings are
compelling and demonstrate that the use of long memory as an approximation
to nonlinearity yields tests with relatively high power. In contrast,
misspecification has severe consequences for tests designed to have power
against threshold nonlinearity, and especially for tests designed to have
power against breaks.
Journal: Econometric Reviews
Pages: 986-1012
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.976526
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976526
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:986-1012
Template-Type: ReDIF-Article 1.0
Author-Name: Hendrik Wolff
Author-X-Name-First: Hendrik
Author-X-Name-Last: Wolff
Title: Imposing and Testing for Shape Restrictions in Flexible Parametric Models
Abstract:
In many economic models, theory restricts the shape of functions, such as
monotonicity or curvature conditions. This article reviews and presents a
framework for constrained estimation and inference to test for shape
conditions in parametric models. We show that “regional”
shape-restricting estimators have important advantages in terms of model
fit and flexibility (as opposed to standard “local” or
“global” shape-restricting estimators). In our empirical
illustration, this is the first article to impose and test for all shape
restrictions required by economic theory simultaneously in the
“Berndt and Wood” data. We find that this dataset is
consistent with “duality theory,” whereas previous studies
have found violations of economic theory. We discuss policy consequences
for key parameters, such as whether energy and capital are complements or
substitutes.
Journal: Econometric Reviews
Pages: 1013-1039
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.975637
File-URL: http://hdl.handle.net/10.1080/07474938.2014.975637
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:1013-1039
Template-Type: ReDIF-Article 1.0
Author-Name: Jan R. Magnus
Author-X-Name-First: Jan R.
Author-X-Name-Last: Magnus
Author-Name: Wendun Wang
Author-X-Name-First: Wendun
Author-X-Name-Last: Wang
Author-Name: Xinyu Zhang
Author-X-Name-First: Xinyu
Author-X-Name-Last: Zhang
Title: Weighted-Average Least Squares Prediction
Abstract:
Prediction under model uncertainty is an important and difficult issue.
Traditional prediction methods (such as pretesting) are based on model
selection followed by prediction in the selected model, but the reported
prediction and the reported prediction variance ignore the uncertainty
from the selection procedure. This article proposes a weighted-average
least squares (WALS) prediction procedure that is not conditional on the
selected model. Taking both model and error uncertainty into account, we
also propose an appropriate estimate of the variance of the WALS
predictor. Correlations among the random errors are explicitly allowed.
Compared to other prediction averaging methods, the WALS predictor has
important advantages both theoretically and computationally. Simulation
studies show that the WALS predictor generally produces lower mean squared
prediction errors than its competitors, and that the proposed estimator
for the prediction variance performs particularly well when model
uncertainty increases.
Journal: Econometric Reviews
Pages: 1040-1074
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.977065
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977065
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:1040-1074
Template-Type: ReDIF-Article 1.0
Author-Name: Gerard J. van den Berg
Author-X-Name-First: Gerard J.
Author-X-Name-Last: van den Berg
Author-Name: Bettina Drepper
Author-X-Name-First: Bettina
Author-X-Name-Last: Drepper
Title: Inference for Shared-Frailty Survival Models with Left-Truncated Data
Abstract:
Shared-frailty survival models specify that systematic unobserved
determinants of duration outcomes are identical within groups of
individuals. We consider random-effects likelihood-based statistical
inference if the duration data are subject to left-truncation. Such
inference with left-truncated data can be performed in previous versions
of the Stata software package for parametric and semi-parametric shared
frailty models. We show that with left-truncated data, the commands ignore
the weeding-out process before the left-truncation points, affecting the
distribution of unobserved determinants among group members in the data,
namely among the group members who survive until their truncation points.
We critically examine studies in the statistical literature on this issue
as well as published empirical studies that use the commands. Simulations
illustrate the size of the (asymptotic) bias and its dependence on the
degree of truncation.
Journal: Econometric Reviews
Pages: 1075-1098
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.975640
File-URL: http://hdl.handle.net/10.1080/07474938.2014.975640
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:1075-1098
Template-Type: ReDIF-Article 1.0
Author-Name: Jeremiah Richey
Author-X-Name-First: Jeremiah
Author-X-Name-Last: Richey
Title: An Odd Couple: Monotone Instrumental Variables and Binary Treatments
Abstract:
This article investigates Monotone Instrumental Variables (MIV) and their
ability to aid in identifying treatment effects when the treatment is
binary in a nonparametric bounding framework. I show that an MIV can only
aid in identification beyond that of a Monotone Treatment Selection
assumption if for some region of the instrument the observed
conditional-on-received-treatment outcomes exhibit monotonicity in the
instrument in the opposite direction as that assumed by the MIV in a
Simpson's Paradox-like fashion. Furthermore, an MIV can only aid in
identification beyond that of a Monotone Treatment Response assumption if
for some region of the instrument either the above Simpson's Paradox-like
relationship exists or the instrument's indirect effect on the outcome (as
through its influence on treatment selection) is the opposite of its
direct effect as assumed by the MIV. The implications of the main findings
for empirical work are discussed and the results are highlighted with an
application investigating the effect of criminal convictions on job match
quality using data from the 1997 National Longitudinal Survey of
Youth. Though the main results are shown to hold only for the binary
treatment case in general, they are shown to have important implications
for the multi-valued treatment case as well.
Journal: Econometric Reviews
Pages: 1099-1110
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.977082
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977082
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:1099-1110
Template-Type: ReDIF-Article 1.0
Author-Name: Indeewara Perera
Author-X-Name-First: Indeewara
Author-X-Name-Last: Perera
Author-Name: Javier Hidalgo
Author-X-Name-First: Javier
Author-X-Name-Last: Hidalgo
Author-Name: Mervyn J. Silvapulle
Author-X-Name-First: Mervyn J.
Author-X-Name-Last: Silvapulle
Title: A Goodness-of-Fit Test for a Class of Autoregressive Conditional Duration Models
Abstract:
This article develops a method for testing the goodness-of-fit of a given
parametric autoregressive conditional duration model against unspecified
nonparametric alternatives. The test statistics are functions of the
residuals corresponding to the quasi maximum likelihood estimate of the
given parametric model, and are easy to compute. The limiting
distributions of the test statistics are not free from nuisance
parameters. Hence, critical values cannot be tabulated for general use. A
bootstrap procedure is proposed to implement the tests, and its asymptotic
validity is established. The finite sample performances of the proposed
tests and several other competing ones in the literature were compared
using a simulation study. The tests proposed in this article performed
well consistently throughout, and they were either the best or close to
the best. None of the tests performed uniformly the best. The tests are
illustrated using an empirical example.
Journal: Econometric Reviews
Pages: 1111-1141
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.975644
File-URL: http://hdl.handle.net/10.1080/07474938.2014.975644
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:1111-1141
Template-Type: ReDIF-Article 1.0
Author-Name: J. Isaac Miller
Author-X-Name-First: J. Isaac
Author-X-Name-Last: Miller
Title: Conditionally Efficient Estimation of Long-Run Relationships Using Mixed-Frequency Time Series
Abstract:
I analyze efficient estimation of a cointegrating vector when the
regressand and regressor are observed at different frequencies. Previous
authors have examined the effects of specific temporal aggregation or
sampling schemes, finding conventionally efficient techniques to be
efficient only when both the regressand and the regressors are average
sampled. Using an alternative method for analyzing aggregation under more
general weighting schemes, I derive an efficiency bound that is
conditional on the type of aggregation used on the low-frequency series
and differs from the unconditional bound defined by the full-information
high-frequency data-generating process, which is infeasible due to
aggregation of at least one series. I modify a conventional estimator,
canonical cointegrating regression (CCR), to accommodate cases in which
the aggregation weights are known. The correlation structure may be
utilized to offset the potential information loss from aggregation,
resulting in a conditionally efficient estimator. In the case of unknown
weights, the correlation structure of the error term generally confounds
identification of conditionally efficient weights. Efficiency is
illustrated using a simulation study and an application to estimating a
gasoline demand equation.
Journal: Econometric Reviews
Pages: 1142-1171
Issue: 6
Volume: 35
Year: 2016
Month: 6
X-DOI: 10.1080/07474938.2014.976527
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976527
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:6:p:1142-1171
Template-Type: ReDIF-Article 1.0
Author-Name: Yuzhi Cai
Author-X-Name-First: Yuzhi
Author-X-Name-Last: Cai
Title: A General Quantile Function Model for Economic and Financial Time Series
Abstract:
This article proposes a general quantile function model that covers both
one- and multiple-dimensional models and that takes several existing
models in the literature as its special cases. It also develops
a new uniform Bayesian framework for quantile function modelling and
illustrates the developed approach through different quantile function
models. Many distributions are defined explicitly only via their quantile
functions as the corresponding distribution or density functions do not
have an explicit mathematical expression. Such distributions are rarely
used in economic and financial modelling in practice. The developed
methodology makes it more convenient to use these distributions in
analyzing economic and financial data. Empirical applications to economic
and financial time series and comparisons with other types of models and
methods show that the developed method can be very useful in practice.
Journal: Econometric Reviews
Pages: 1173-1193
Issue: 7
Volume: 35
Year: 2016
Month: 8
X-DOI: 10.1080/07474938.2014.976528
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976528
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:7:p:1173-1193
Template-Type: ReDIF-Article 1.0
Author-Name: Kun Ho Kim
Author-X-Name-First: Kun Ho
Author-X-Name-Last: Kim
Title: Inference of the Trend in a Partially Linear Model with Locally Stationary Regressors
Abstract:
In this article, we construct the uniform confidence band (UCB) of
nonparametric trend in a partially linear model with locally stationary
regressors. A two-stage semiparametric regression is employed to estimate
the trend function. Based on this estimate, we develop an invariance
principle to construct the UCB of the trend function. The proposed
methodology is used to estimate the Non-Accelerating Inflation Rate of
Unemployment (NAIRU) in the Phillips Curve and to perform inference of the
parameter based on its UCB. The empirical results strongly suggest that
the U.S. NAIRU is time-varying.
Journal: Econometric Reviews
Pages: 1194-1220
Issue: 7
Volume: 35
Year: 2016
Month: 8
X-DOI: 10.1080/07474938.2014.976530
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976530
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:7:p:1194-1220
Template-Type: ReDIF-Article 1.0
Author-Name: Marcelo Fernandes
Author-X-Name-First: Marcelo
Author-X-Name-Last: Fernandes
Author-Name: Marcelo C. Medeiros
Author-X-Name-First: Marcelo C.
Author-X-Name-Last: Medeiros
Author-Name: Alvaro Veiga
Author-X-Name-First: Alvaro
Author-X-Name-Last: Veiga
Title: A (Semi)Parametric Functional Coefficient Logarithmic Autoregressive Conditional Duration Model
Abstract:
In this article, we propose a class of logarithmic autoregressive
conditional duration (ACD)-type models that accommodates overdispersion,
intermittent dynamics, multiple regimes, and asymmetries in financial
durations. In particular, our functional coefficient logarithmic
autoregressive conditional duration (FC-LACD) model relies on a
smooth-transition autoregressive specification. The motivation lies in the
fact that the latter yields a universal approximation if one lets the
number of regimes grow without bound. After establishing sufficient
conditions for strict stationarity, we address model identifiability as
well as the asymptotic properties of the quasi-maximum likelihood (QML)
estimator for the FC-LACD model with a fixed number of regimes. In
addition, we discuss how to consistently estimate a semiparametric
variant of the FC-LACD model that takes the number of regimes to infinity.
An empirical illustration indicates that our functional coefficient model
is flexible enough to model IBM price durations.
Journal: Econometric Reviews
Pages: 1221-1250
Issue: 7
Volume: 35
Year: 2016
Month: 8
X-DOI: 10.1080/07474938.2014.977071
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977071
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:7:p:1221-1250
Template-Type: ReDIF-Article 1.0
Author-Name: Mariano Kulish
Author-X-Name-First: Mariano
Author-X-Name-Last: Kulish
Author-Name: Adrian Pagan
Author-X-Name-First: Adrian
Author-X-Name-Last: Pagan
Title: Issues in Estimating New Keynesian Phillips Curves in the Presence of Unknown Structural Change
Abstract:
Many articles which have estimated models with forward looking
expectations have reported that the magnitude of the coefficients of the
expectations term is very large when compared with the effects coming from
past dynamics. This has sometimes been regarded as implausible and led to
the feeling that the expectations coefficient is biased upwards. A
relatively general argument that has been advanced is that the bias could
be due to structural changes in the means of the variables entering the
structural equation. An alternative explanation is that the bias comes
from weak instruments. In this article, we investigate the issue of upward
bias in the estimated coefficients of the expectations variable based on a
model where we can see what causes the breaks and how to control for them.
We conclude that weak instruments are the most likely cause of any bias
and note that structural change can affect the quality of instruments. We
also look at some empirical work in Castle et al. (2014) on the new
Keynesian Phillips curve (NKPC) in the Euro Area and U.S., assessing
whether the smaller coefficient on expectations that Castle et al. (2014)
highlight is due to structural change. Our conclusion is that it is not.
Instead it comes from their addition of variables to the NKPC. After
allowing for the fact that there are weak instruments in the estimated
re-specified model, it would seem that the forward coefficient estimate is
actually quite high rather than low.
Journal: Econometric Reviews
Pages: 1251-1270
Issue: 7
Volume: 35
Year: 2016
Month: 8
X-DOI: 10.1080/07474938.2014.977075
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977075
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:7:p:1251-1270
Template-Type: ReDIF-Article 1.0
Author-Name: F. Bartolucci
Author-X-Name-First: F.
Author-X-Name-Last: Bartolucci
Author-Name: R. Bellio
Author-X-Name-First: R.
Author-X-Name-Last: Bellio
Author-Name: A. Salvan
Author-X-Name-First: A.
Author-X-Name-Last: Salvan
Author-Name: N. Sartori
Author-X-Name-First: N.
Author-X-Name-Last: Sartori
Title: Modified Profile Likelihood for Fixed-Effects Panel Data Models
Abstract:
We show how modified profile likelihood methods, developed in the
statistical literature, may be effectively applied to estimate the
structural parameters of econometric models for panel data, with a
remarkable reduction of bias with respect to ordinary likelihood methods.
Initially, the implementation of these methods is illustrated for general
models for panel data including individual-specific fixed effects and
then, in more detail, for the truncated linear regression model and
dynamic regression models for binary data formulated along with different
specifications. Simulation studies show the good behavior of the inference
based on the modified profile likelihood, even when compared to an ideal,
although infeasible, procedure (in which the fixed effects are known) and
also to alternative estimators existing in the econometric literature. The
proposed estimation methods are implemented in an
R package that we make available to the reader.
Journal: Econometric Reviews
Pages: 1271-1289
Issue: 7
Volume: 35
Year: 2016
Month: 8
X-DOI: 10.1080/07474938.2014.975642
File-URL: http://hdl.handle.net/10.1080/07474938.2014.975642
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:7:p:1271-1289
Template-Type: ReDIF-Article 1.0
Author-Name: Benjamin Born
Author-X-Name-First: Benjamin
Author-X-Name-Last: Born
Author-Name: Jörg Breitung
Author-X-Name-First: Jörg
Author-X-Name-Last: Breitung
Title: Testing for Serial Correlation in Fixed-Effects Panel Data Models
Abstract:
In this article, we propose various tests for serial correlation in
fixed-effects panel data regression models with a small number of time
periods. First, a simplified version of the test suggested by Wooldridge
(2002) and Drukker (2003) is considered. The second test is based on the
Lagrange Multiplier (LM) statistic suggested by Baltagi and Li (1995), and
the third test is a modification of the classical Durbin--Watson
statistic. Under the null hypothesis of no serial correlation, all tests
possess a standard normal limiting distribution as N tends to infinity and
T is fixed. Analyzing the local power of the tests, we find that the LM
statistic has superior power properties. Furthermore, a generalization to
test for autocorrelation up to some given lag order and a test statistic
that is robust against time-dependent heteroskedasticity are proposed.
Journal: Econometric Reviews
Pages: 1290-1316
Issue: 7
Volume: 35
Year: 2016
Month: 8
X-DOI: 10.1080/07474938.2014.976524
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976524
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:7:p:1290-1316
Template-Type: ReDIF-Article 1.0
Author-Name: Maria Grazia Pittau
Author-X-Name-First: Maria Grazia
Author-X-Name-Last: Pittau
Author-Name: Roberto Zelli
Author-X-Name-First: Roberto
Author-X-Name-Last: Zelli
Author-Name: Riccardo Massari
Author-X-Name-First: Riccardo
Author-X-Name-Last: Massari
Title: Evidence of Convergence Clubs Using Mixture Models
Abstract:
Cross-country economic convergence has been increasingly investigated by
finite mixture models. Multiple components in a mixture reflect groups of
countries that converge locally. Testing for the number of components is
crucial for detecting “convergence clubs.” To assess the
number of components of the mixture, we propose a sequential procedure
that compares the shape of the hypothesized mixture distribution with the
true unknown density, consistently estimated through a kernel estimator.
The novelty of our approach is its capability to select the number of
components along with a satisfactory fitting of the model. Simulation
studies and an empirical application to per capita income distribution
across countries testify to the good performance of our approach. A
three-club convergence pattern seems to emerge.
Journal: Econometric Reviews
Pages: 1317-1342
Issue: 7
Volume: 35
Year: 2016
Month: 8
X-DOI: 10.1080/07474938.2014.977062
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977062
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:7:p:1317-1342
Template-Type: ReDIF-Article 1.0
Author-Name: Mehmet Caner
Author-X-Name-First: Mehmet
Author-X-Name-Last: Caner
Author-Name: Marcelo C. Medeiros
Author-X-Name-First: Marcelo C.
Author-X-Name-Last: Medeiros
Title: Model Selection and Shrinkage: An Overview
Abstract:
This special issue is concerned with model selection and shrinkage
estimators. This Introduction gives an overview of the papers published in
this special issue.
Journal: Econometric Reviews
Pages: 1343-1346
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1071157
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1071157
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1343-1346
Template-Type: ReDIF-Article 1.0
Author-Name: Clifford Lam
Author-X-Name-First: Clifford
Author-X-Name-Last: Lam
Author-Name: Pedro C. L. Souza
Author-X-Name-First: Pedro C. L.
Author-X-Name-Last: Souza
Title: Detection and Estimation of Block Structure in Spatial Weight Matrix
Abstract:
In many economic applications, it is often of interest to categorize,
classify, or label individuals by groups based on similarity of observed
behavior. We propose a method that captures group affiliation or,
equivalently, estimates the block structure of a neighboring matrix
embedded in a Spatial Econometric model. The main results for the Least
Absolute Shrinkage and Selection Operator (Lasso) estimator show that
off-diagonal block elements are estimated as zeros with high probability,
a property defined as “zero-block consistency.” Furthermore,
we present and prove zero-block consistency for the estimated spatial
weight matrix even under a thin margin of interaction between groups. The
tool developed in this article can be used as a verification of block
structure by applied researchers, or as an exploration tool for estimating
unknown block structures. We analyzed the U.S. Senate voting data and
correctly identified blocks based on party affiliations. Simulations also
show that the method performs well.
Journal: Econometric Reviews
Pages: 1347-1376
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1085775
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1085775
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1347-1376
Template-Type: ReDIF-Article 1.0
Author-Name: Mehmet Caner
Author-X-Name-First: Mehmet
Author-X-Name-Last: Caner
Author-Name: Anders Bredahl Kock
Author-X-Name-First: Anders Bredahl
Author-X-Name-Last: Kock
Title: Oracle Inequalities for Convex Loss Functions with Nonlinear Targets
Abstract:
This article considers penalized empirical loss minimization of convex
loss functions with unknown target functions. Using the elastic net
penalty, of which the Least Absolute Shrinkage and Selection Operator
(Lasso) is a special case, we establish a finite sample oracle inequality
which bounds the loss of our estimator from above with high probability.
If the unknown target is linear, this inequality also provides an upper
bound of the estimation error of the estimated parameter vector. Next, we
use the non-asymptotic results to show that the excess loss of our
estimator is asymptotically of the same order as that of the oracle. If
the target is linear, we give sufficient conditions for consistency of the
estimated parameter vector. We briefly discuss how a thresholded version
of our estimator can be used to perform consistent variable selection. We
give two examples of loss functions covered by our framework.
Journal: Econometric Reviews
Pages: 1377-1411
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092797
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092797
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1377-1411
Template-Type: ReDIF-Article 1.0
Author-Name: Ulrike Schneider
Author-X-Name-First: Ulrike
Author-X-Name-Last: Schneider
Title: Confidence Sets Based on Thresholding Estimators in High-Dimensional Gaussian Regression Models
Abstract:
We study confidence intervals based on hard-thresholding,
soft-thresholding, and adaptive soft-thresholding in a linear regression
model where the number of regressors k may depend on and
diverge with sample size n. In addition to the case of
known error variance, we define and study versions of the estimators when
the error variance is unknown. In the known-variance case, we provide an
exact analysis of the coverage properties of such intervals in finite
samples. We show that these intervals are always larger than the standard
interval based on the least-squares estimator. Asymptotically, the
intervals based on the thresholding estimators are larger even by an order
of magnitude when the estimators are tuned to perform consistent variable
selection. For the unknown-variance case, we provide nontrivial lower
bounds and a small numerical study for the coverage probabilities in
finite samples. We also conduct an asymptotic analysis where the results
from the known-variance case can be shown to carry over asymptotically if
the number of degrees of freedom
n − k tends to infinity
fast enough in relation to the thresholding parameter.
Journal: Econometric Reviews
Pages: 1412-1455
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092798
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092798
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1412-1455
Template-Type: ReDIF-Article 1.0
Author-Name: Bruce E. Hansen
Author-X-Name-First: Bruce E.
Author-X-Name-Last: Hansen
Title: The Risk of James--Stein and Lasso Shrinkage
Abstract:
This article compares the mean-squared error (or ℓ2
risk) of ordinary least squares (OLS), James--Stein, and least absolute
shrinkage and selection operator (Lasso) shrinkage estimators in simple
linear regression where the number of regressors is smaller than the
sample size. We compare and contrast the known risk bounds for these
estimators, which shows that neither James--Stein nor Lasso uniformly
dominates the other. We investigate the finite sample risk using a simple
simulation experiment. We find that the risk of Lasso estimation is
particularly sensitive to coefficient parameterization, and for a
significant portion of the parameter space Lasso has higher mean-squared
error than OLS. This investigation suggests that there are potential
pitfalls arising with Lasso estimation, and simulation studies need to be
more attentive to careful exploration of the parameter space.
Journal: Econometric Reviews
Pages: 1456-1470
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092799
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092799
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1456-1470
Template-Type: ReDIF-Article 1.0
Author-Name: Keith Knight
Author-X-Name-First: Keith
Author-X-Name-Last: Knight
Title: The Penalized Analytic Center Estimator
Abstract:
In a linear regression model, the Dantzig selector (Candès and Tao,
2007) minimizes the L1 norm of the regression
coefficients subject to a bound λ on the
L∞ norm of the covariances between the
predictors and the residuals; the resulting estimator is the solution of a
linear program, which may be nonunique or unstable. We propose a
regularized alternative to the Dantzig selector. These estimators (which
depend on λ and an additional tuning parameter
r) minimize objective functions that are the sum of the
L1 norm of the regression coefficients plus
r times the logarithmic potential function of the Dantzig
selector constraints, and can be viewed as penalized analytic centers of
the latter constraints. The tuning parameter r controls
the smoothness of the estimators as functions of λ
and, when λ is sufficiently large, the estimators
depend approximately on r and λ
via r/λ².
Journal: Econometric Reviews
Pages: 1471-1484
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092800
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092800
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1471-1484
Template-Type: ReDIF-Article 1.0
Author-Name: Francesco Audrino
Author-X-Name-First: Francesco
Author-X-Name-Last: Audrino
Author-Name: Simon D. Knaus
Author-X-Name-First: Simon D.
Author-X-Name-Last: Knaus
Title: Lassoing the HAR Model: A Model Selection Perspective on Realized Volatility Dynamics
Abstract:
Realized volatility computed from high-frequency data is an important
measure for many applications in finance, and its dynamics have been
widely investigated. Recent notable advances that perform well include the
heterogeneous autoregressive (HAR) model which can approximate long
memory, is very parsimonious, is easy to estimate, and features good
out-of-sample performance. We prove that the least absolute shrinkage and
selection operator (Lasso) recovers the lag structure of the HAR model
asymptotically if it is the true model, and we present Monte Carlo
evidence in finite samples. The HAR model's lag structure is not fully in
agreement with the one found using the Lasso on real data. Moreover, we
provide empirical evidence that there are two clear breaks in structure
for most of the assets we consider. These results bring into question the
appropriateness of the HAR model for realized volatility. Finally, in an
out-of-sample analysis, we show equal performance of the HAR model and the
Lasso approach.
Journal: Econometric Reviews
Pages: 1485-1521
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092801
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092801
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1485-1521
Template-Type: ReDIF-Article 1.0
Author-Name: Malene Kallestrup-Lamb
Author-X-Name-First: Malene
Author-X-Name-Last: Kallestrup-Lamb
Author-Name: Anders Bredahl Kock
Author-X-Name-First: Anders Bredahl
Author-X-Name-Last: Kock
Author-Name: Johannes Tang Kristensen
Author-X-Name-First: Johannes Tang
Author-X-Name-Last: Kristensen
Title: Lassoing the Determinants of Retirement
Abstract:
This article uses Danish register data to explain the retirement decision
of workers in 1990 and 1998. Many variables might be conjectured to
influence this decision, such as demographic, socioeconomic, financial, and
health-related variables, as well as all the same factors for the spouse in
case the individual is married. In total, we have access to 399
individual-specific variables that could potentially impact the retirement
decision. We use variants of the least absolute shrinkage and selection
operator (Lasso) and the adaptive Lasso applied to logistic regression in
order to uncover determinants of the retirement decision. To the best of
our knowledge, this is the first application of these estimators in
microeconometrics to a problem of this type and scale. Furthermore, we
investigate whether the factors influencing the retirement decision are
stable over time, gender, and marital status. It is found that this is the
case for core variables such as age, income, wealth, and general health.
We also point out the most important differences between these groups and
explain why these might be present.
Journal: Econometric Reviews
Pages: 1522-1561
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092803
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092803
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1522-1561
Template-Type: ReDIF-Article 1.0
Author-Name: Mehmet Caner
Author-X-Name-First: Mehmet
Author-X-Name-Last: Caner
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Juan Andrés Riquelme
Author-X-Name-First: Juan Andrés
Author-X-Name-Last: Riquelme
Title: Moment and IV Selection Approaches: A Comparative Simulation Study
Abstract:
We compare three moment selection approaches, followed by post-selection
estimation strategies. The first is adaptive least absolute shrinkage and
selection operator (ALASSO) of Zou (2006), recently extended by Liao
(2013) to possibly invalid moments in GMM. In this method, we select the
valid instruments with ALASSO. The second method is based on the
J test, as in Andrews and Lu (2001). The third one is
using a Continuous Updating Objective (CUE) function. This last approach
is based on Hong et al. (2003), who propose a penalized generalized
empirical likelihood-based function to pick up valid moments. They use
empirical likelihood, and exponential tilting in their simulations.
However, the J-test-based approach of Andrews and Lu (2001) provides
generally better moment selection results than the empirical likelihood
and exponential tilting as can be seen in Hong et al. (2003). In this
article, we examine penalized CUE as a third way of selecting valid
moments. Following a determination of valid moments, we run unpenalized
generalized method of moments (GMM), CUE, and the model averaging technique
of Okui (2011) to see which one has better postselection estimator
performance for structural parameters. The simulations are aimed at the
following questions: Which moment selection criterion can better select
the valid ones and eliminate the invalid ones? Given the chosen
instruments in the first stage, which strategy delivers the best finite
sample performance? We find that the ALASSO in the model selection stage,
coupled with either unpenalized GMM or moment averaging of Okui delivers
generally the smallest root mean square error (RMSE) for the second stage
coefficient estimators.
Journal: Econometric Reviews
Pages: 1562-1581
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092804
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092804
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1562-1581
Template-Type: ReDIF-Article 1.0
Author-Name: Zhentao Shi
Author-X-Name-First: Zhentao
Author-X-Name-Last: Shi
Title: Estimation of Sparse Structural Parameters with Many Endogenous Variables
Abstract:
We apply the generalized method of moments--least absolute shrinkage and
selection operator (GMM-Lasso) (Caner, 2009) to a linear structural model
with many endogenous regressors. If the true parameter is sufficiently
sparse, we can establish a new oracle inequality, which implies that
GMM-Lasso performs almost as well as if we knew a priori
the identities of the relevant variables. Sparsity, meaning that most of
the true coefficients are too small to matter, naturally arises in
econometric applications where the model can be derived from economic
theory. In addition, we propose to use a modified version of AIC or BIC to
select the tuning parameter in practical implementation. Simulations
provide supportive evidence concerning the finite sample properties of the
GMM-Lasso.
Journal: Econometric Reviews
Pages: 1582-1608
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092805
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092805
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1582-1608
Template-Type: ReDIF-Article 1.0
Author-Name: Marine Carrasco
Author-X-Name-First: Marine
Author-X-Name-Last: Carrasco
Author-Name: Guy Tchuente
Author-X-Name-First: Guy
Author-X-Name-Last: Tchuente
Title: Efficient Estimation with Many Weak Instruments Using Regularization Techniques
Abstract:
The problem of weak instruments is due to a very small concentration
parameter. To boost the concentration parameter, we propose to increase
the number of instruments to a large number or even up to a continuum.
However, in finite samples, the inclusion of an excessive number of
moments may be harmful. To address this issue, we use regularization
techniques as in Carrasco (2012) and Carrasco and Tchuente (2014). We show
that normalized regularized two-stage least squares (2SLS) and limited
information maximum likelihood (LIML) are consistent and asymptotically normally
distributed. Moreover, our estimators are asymptotically more efficient
than most competing estimators. Our simulations show that the leading
regularized estimators (LF and T of LIML) work very well (are nearly
median unbiased) even in the case of relatively weak instruments. An
application to the effect of institutions on output growth completes the
article.
Journal: Econometric Reviews
Pages: 1609-1637
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092806
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092806
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1609-1637
Template-Type: ReDIF-Article 1.0
Author-Name: Eric Eisenstat
Author-X-Name-First: Eric
Author-X-Name-Last: Eisenstat
Author-Name: Joshua C. C. Chan
Author-X-Name-First: Joshua C. C.
Author-X-Name-Last: Chan
Author-Name: Rodney W. Strachan
Author-X-Name-First: Rodney W.
Author-X-Name-Last: Strachan
Title: Stochastic Model Specification Search for Time-Varying Parameter VARs
Abstract:
This article develops a new econometric methodology for performing
stochastic model specification search (SMSS) in the vast model space of
time-varying parameter vector autoregressions (VARs) with stochastic
volatility and correlated state transitions. This is motivated by the
concern of overfitting and the typically imprecise inference in these
highly parameterized models. For each VAR coefficient, this new method
automatically decides whether it is constant or time-varying. Moreover, it
can be used to shrink an otherwise unrestricted time-varying parameter VAR
to a stationary VAR, thus providing an easy way to (probabilistically)
impose stationarity in time-varying parameter models. We demonstrate the
effectiveness of the approach with a topical application, where we
investigate the dynamic effects of structural shocks in government
spending on U.S. taxes and gross domestic product (GDP) during a period of
very low interest rates.
Journal: Econometric Reviews
Pages: 1638-1665
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092808
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092808
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1638-1665
Template-Type: ReDIF-Article 1.0
Author-Name: Hedibert F. Lopes
Author-X-Name-First: Hedibert F.
Author-X-Name-Last: Lopes
Author-Name: Nicholas G. Polson
Author-X-Name-First: Nicholas G.
Author-X-Name-Last: Polson
Title: Particle Learning for Fat-Tailed Distributions
Abstract:
It is well known that parameter estimates and forecasts are sensitive to
assumptions about the tail behavior of the error distribution. In this
article, we develop an approach to sequential inference that also
simultaneously estimates the tail of the accompanying error distribution.
Our simulation-based approach models errors with a
t(ν)-distribution and, as new data
arrives, we sequentially compute the marginal posterior distribution of
the tail thickness. Our method naturally incorporates fat-tailed error
distributions and can be extended to other data features such as
stochastic volatility. We show that the sequential Bayes factor provides
an optimal test of fat-tails versus normality. We provide an empirical and
theoretical analysis of the rate of learning of tail thickness under a
default Jeffreys prior. We illustrate our sequential methodology on the
British pound/U.S. dollar daily exchange rate data and on data from the
2008--2009 credit crisis using daily S&P500 returns. Our method naturally
extends to multivariate and dynamic panel data.
Journal: Econometric Reviews
Pages: 1666-1691
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092809
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092809
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1666-1691
Template-Type: ReDIF-Article 1.0
Author-Name: Qingfeng Liu
Author-X-Name-First: Qingfeng
Author-X-Name-Last: Liu
Author-Name: Ryo Okui
Author-X-Name-First: Ryo
Author-X-Name-Last: Okui
Author-Name: Arihiro Yoshimura
Author-X-Name-First: Arihiro
Author-X-Name-Last: Yoshimura
Title: Generalized Least Squares Model Averaging
Abstract:
In this article, we propose a method of averaging generalized least
squares estimators for linear regression models with heteroskedastic
errors. The averaging weights are chosen to minimize Mallows’
Cp-like criterion. We show
that the weight vector selected by our method is optimal. It is also shown
that this optimality holds even when the variances of the error terms are
estimated and the feasible generalized least squares estimators are
averaged. The variances can be estimated parametrically or
nonparametrically. Monte Carlo simulation results are encouraging. An
empirical example illustrates that the proposed method is useful for
predicting a measure of firms’ performance.
Journal: Econometric Reviews
Pages: 1692-1752
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1092817
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092817
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1692-1752
Template-Type: ReDIF-Article 1.0
Author-Name: Anders Bredahl Kock
Author-X-Name-First: Anders
Author-X-Name-Last: Bredahl Kock
Author-Name: Timo Teräsvirta
Author-X-Name-First: Timo
Author-X-Name-Last: Teräsvirta
Title: Forecasting Macroeconomic Variables Using Neural Network Models and Three Automated Model Selection Techniques
Abstract:
When forecasting with neural network models one faces several problems,
all of which influence the accuracy of the forecasts. First, neural
networks are often hard to estimate due to their highly nonlinear
structure. To alleviate the problem, White (2006) presented a solution
(QuickNet) that converts the specification and nonlinear estimation
problem into a linear model selection and estimation problem. We shall
compare its performance to that of two other procedures building on the
linearization idea: the Marginal Bridge Estimator and Autometrics. Second,
one must decide whether forecasting should be carried out recursively or
directly. This choice is investigated in this work. The economic time
series used in this study are the consumer price indices for the G7 and
the Scandinavian countries. In addition, a number of simulations are
carried out and results reported in the article.
Journal: Econometric Reviews
Pages: 1753-1779
Issue: 8-10
Volume: 35
Year: 2016
Month: 12
X-DOI: 10.1080/07474938.2015.1035163
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1035163
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:35:y:2016:i:8-10:p:1753-1779
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Robin Sickles
Author-X-Name-First: Robin
Author-X-Name-Last: Sickles
Title: Peter Schmidt: Econometrician and consummate professional
Abstract:
Peter Schmidt has been one of the best-known and most respected
econometricians in the profession for four decades. He has brought his
talents to many scholarly outlets and societies, and has played a
foundational and constructive role in the development of the field of
econometrics. Peter Schmidt has also served and led the development of
Econometric Reviews since its inception in 1982. His judgment has always
been fair, informed, clear, decisive, and constructive. Respect for ideas
and scholarship of others, young and old, is second nature to him. This is
the best of traits, and Peter serves as an uncommon example to us all. The
seventeen articles that make up this Econometric Reviews Special Issue in
Honor of Peter Schmidt represent the work of fifty of the very best
econometricians in our profession. They honor Professor Schmidt's lifelong
accomplishments by providing fundamental research work that reflects many
of the broad research themes that have distinguished his long and
productive career. These include time series econometrics, panel data
econometrics, and stochastic frontier production analysis.
Journal: Econometric Reviews
Pages: 1-5
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1116051
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1116051
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:1-5
Template-Type: ReDIF-Article 1.0
Author-Name: Chunrong Ai
Author-X-Name-First: Chunrong
Author-X-Name-Last: Ai
Author-Name: Yuanqing Zhang
Author-X-Name-First: Yuanqing
Author-X-Name-Last: Zhang
Title: Estimation of partially specified spatial panel data models with fixed-effects
Abstract:
This article extends the spatial panel data regression with fixed-effects
to the case where the regression function is partially linear and some
regressors may be endogenous or predetermined. Under the assumption that
the spatial weighting matrix is strictly exogenous, we propose a sieve
two-stage least squares (S2SLS) regression. Under some sufficient conditions,
we show that the proposed estimator for the finite dimensional parameter
is root-N consistent and asymptotically normally distributed and that the
proposed estimator for the unknown function is consistent and also
asymptotically normally distributed but at a rate slower than root-N.
Consistent estimators for the asymptotic variances of the proposed
estimators are provided. A small scale simulation study is conducted, and
the simulation results show that the proposed procedure has good finite
sample performance.
Journal: Econometric Reviews
Pages: 6-22
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1113641
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1113641
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:6-22
Template-Type: ReDIF-Article 1.0
Author-Name: Martyn Andrews
Author-X-Name-First: Martyn
Author-X-Name-Last: Andrews
Author-Name: Obbey Elamin
Author-X-Name-First: Obbey
Author-X-Name-Last: Elamin
Author-Name: Alastair R. Hall
Author-X-Name-First: Alastair R.
Author-X-Name-Last: Hall
Author-Name: Kostas Kyriakoulis
Author-X-Name-First: Kostas
Author-X-Name-Last: Kyriakoulis
Author-Name: Matthew Sutton
Author-X-Name-First: Matthew
Author-X-Name-Last: Sutton
Title: Inference in the presence of redundant moment conditions and the impact of government health expenditure on health outcomes in England
Abstract:
In his 1999 article with Breusch, Qian, and Wyhowski in the
Journal of Econometrics, Peter Schmidt introduced the
concept of “redundant” moment conditions. Such conditions
arise when estimation is based on moment conditions that are valid and can
be divided into two subsets: one that identifies the parameters and
another that provides no further information. Their framework highlights
an important concept in the moment-based estimation literature, namely,
that not all valid moment conditions need be informative about the
parameters of interest. In this article, we demonstrate the empirical
relevance of the concept in the context of the impact of government health
expenditure on health outcomes in England. Using a simulation study
calibrated to this data, we perform a comparative study of the finite
sample performance of inference procedures based on the Generalized Method
of Moments (GMM) and info-metric (IM) estimators. The results indicate that
the properties of GMM procedures deteriorate as the number of redundant
moment conditions increases; in contrast, the IM methods provide reliable
point estimators, but the performance of associated inference techniques
based on first order asymptotic theory, such as confidence intervals and
overidentifying restriction tests, deteriorates as the number of redundant
moment conditions increases. However, for IM methods, it is shown that
bootstrap procedures can provide reliable inferences; we illustrate such
methods when analysing the impact of government health expenditure on
health outcomes in England.
Journal: Econometric Reviews
Pages: 23-41
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2016.1114205
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1114205
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:23-41
Template-Type: ReDIF-Article 1.0
Author-Name: Manabu Asai
Author-X-Name-First: Manabu
Author-X-Name-Last: Asai
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: A fractionally integrated Wishart stochastic volatility model
Abstract:
There has recently been growing interest in modeling and estimating
alternative continuous time multivariate stochastic volatility models. We
propose a continuous time fractionally integrated Wishart stochastic
volatility (FIWSV) process, and derive the conditional Laplace transform
of the FIWSV model in order to obtain a closed form expression of moments.
A two-step procedure is used, namely estimating the parameter of
fractional integration via the local Whittle estimator in the first step,
and estimating the remaining parameters via the generalized method of
moments in the second step. Monte Carlo results for the procedure show a
reasonable performance in finite samples. The empirical results for the
S&P 500 and FTSE 100 indexes show that the data favor the new FIWSV
process rather than the one-factor and two-factor models of the Wishart
autoregressive process for the covariance structure.
Journal: Econometric Reviews
Pages: 42-59
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114235
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114235
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:42-59
Template-Type: ReDIF-Article 1.0
Author-Name: Richard T. Baillie
Author-X-Name-First: Richard T.
Author-X-Name-Last: Baillie
Author-Name: George Kapetanios
Author-X-Name-First: George
Author-X-Name-Last: Kapetanios
Author-Name: Fotis Papailias
Author-X-Name-First: Fotis
Author-X-Name-Last: Papailias
Title: Inference for impulse response coefficients from multivariate fractionally integrated processes
Abstract:
This article considers a multivariate system of fractionally integrated
time series and investigates the most appropriate way for estimating
Impulse Response (IR) coefficients and their associated
confidence intervals. The article extends the univariate analysis recently
provided by Baillie and Kapetanios (2013), and uses a semiparametric, time
domain estimator, based on a vector autoregression (VAR)
approximation. Results are also derived for the orthogonalized estimated
IRs which are generally more practically relevant.
Simulation evidence strongly indicates the desirability of applying the
Kilian small sample bias correction, which is found to improve the
coverage accuracy of confidence intervals for IRs. The
most appropriate order of the VAR turns out to be
relevant for the lag length of the IR being estimated.
Journal: Econometric Reviews
Pages: 60-84
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114253
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114253
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:60-84
Template-Type: ReDIF-Article 1.0
Author-Name: Badi H. Baltagi
Author-X-Name-First: Badi H.
Author-X-Name-Last: Baltagi
Author-Name: Chihwa Kao
Author-X-Name-First: Chihwa
Author-X-Name-Last: Kao
Author-Name: Long Liu
Author-X-Name-First: Long
Author-X-Name-Last: Liu
Title: Estimation and identification of change points in panel models with nonstationary or stationary regressors and error term
Abstract:
This article studies the estimation of change point in panel models. We
extend Bai (2010) and Feng et al. (2009) to the case of stationary or
nonstationary regressors and error term, and whether the change point is
present or not. We prove consistency and derive the asymptotic
distributions of the Ordinary Least Squares (OLS) and First Difference
(FD) estimators. We find that the FD estimator is robust for all cases
considered.
Journal: Econometric Reviews
Pages: 85-102
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114262
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114262
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:85-102
Template-Type: ReDIF-Article 1.0
Author-Name: Herman J. Bierens
Author-X-Name-First: Herman J.
Author-X-Name-Last: Bierens
Author-Name: Li Wang
Author-X-Name-First: Li
Author-X-Name-Last: Wang
Title: Weighted simulated integrated conditional moment tests for parametric conditional distributions of stationary time series processes
Abstract:
In this article, we propose a weighted simulated integrated conditional
moment (WSICM) test of the validity of parametric specifications of
conditional distribution models for stationary time series data, by
combining the weighted integrated conditional moment (ICM) test of Bierens
(1984) for time series regression models with the simulated ICM test of
Bierens and Wang (2012) of conditional distribution models for
cross-section data. To the best of our knowledge, no other consistent test
for parametric conditional time series distributions has been proposed yet
in the literature, despite consistency claims made by some authors.
Journal: Econometric Reviews
Pages: 103-135
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114275
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114275
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:103-135
Template-Type: ReDIF-Article 1.0
Author-Name: Yoosoon Chang
Author-X-Name-First: Yoosoon
Author-X-Name-Last: Chang
Author-Name: Robin C. Sickles
Author-X-Name-First: Robin C.
Author-X-Name-Last: Sickles
Author-Name: Wonho Song
Author-X-Name-First: Wonho
Author-X-Name-Last: Song
Title: Bootstrapping unit root tests with covariates
Abstract:
We consider the bootstrap method for the covariates augmented
Dickey--Fuller (CADF) unit root test suggested in Hansen (1995) which uses
related variables to improve the power of univariate unit root tests. It
is shown that there are substantial power gains from including correlated
covariates. The limit distribution of the CADF test, however, depends on
the nuisance parameter that represents the correlation between the
equation error and the covariates. Hence, inference based directly on the
CADF test is not possible. To provide a valid inferential basis for the
CADF test, we propose to use the parametric bootstrap procedure to obtain
critical values, and establish the asymptotic validity of the bootstrap
CADF test. Simulations show that the bootstrap CADF test significantly
improves the asymptotic and finite sample size performance of the
CADF test, especially when the covariates are highly correlated with the
error. Indeed, the bootstrap CADF test offers drastic power gains over the
conventional unit root tests. Our testing procedures are applied to the
extended Nelson and Plosser data set.
Journal: Econometric Reviews
Pages: 136-155
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114279
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114279
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:136-155
Template-Type: ReDIF-Article 1.0
Author-Name: Abdelaati Daouia
Author-X-Name-First: Abdelaati
Author-X-Name-Last: Daouia
Author-Name: Léopold Simar
Author-X-Name-First: Léopold
Author-X-Name-Last: Simar
Author-Name: Paul W. Wilson
Author-X-Name-First: Paul W.
Author-X-Name-Last: Wilson
Title: Measuring firm performance using nonparametric quantile-type distances
Abstract:
When faced with multiple inputs and outputs, traditional
quantile regression of Y conditional on
X = x for measuring economic
efficiency in the output (input) direction is thwarted by the absence of a
natural ordering of Euclidean space for dimensions q
(p) greater than one. Daouia and Simar (2007) used
nonstandard conditional quantiles to address this problem, conditioning on
Y ≥ y
(X ≤ x) in the output
(input) orientation, but the resulting quantiles depend on the a priori
chosen direction. This article uses a dimensionless transformation of the
(p + q)-dimensional production
process to develop an alternative formulation of distance from a
realization of (X, Y) to the efficient
support boundary, motivating a new, unconditional
quantile frontier lying inside the joint support of (X,
Y), but near the full, efficient frontier. The
interpretation is analogous to univariate quantiles and corrects some of
the disappointing properties of the conditional quantile-based approach.
By contrast with the latter, our approach determines a unique
partial-quantile frontier independent of the chosen orientation (input,
output, hyperbolic, or directional distance). We prove that both the
resulting efficiency score and its estimator share desirable monotonicity
properties. Simple arguments from extreme-value theory are used to derive
the asymptotic distributional properties of the corresponding empirical
efficiency scores (both full and partial). The usefulness of the
quantile-type estimator is shown from infinitesimal and global
robustness theory viewpoints via a comparison with the previous
conditional quantile-based approach. A diagnostic tool is developed to
find the appropriate quantile-order; in the literature to date, this
trimming order has been fixed a priori. The methodology
is used to analyze the performance of U.S. credit unions, where outliers
are likely to affect traditional approaches.
Journal: Econometric Reviews
Pages: 156-181
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114289
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114289
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:156-181
Template-Type: ReDIF-Article 1.0
Author-Name: Jean-Marie Dufour
Author-X-Name-First: Jean-Marie
Author-X-Name-Last: Dufour
Author-Name: Alain Trognon
Author-X-Name-First: Alain
Author-X-Name-Last: Trognon
Author-Name: Purevdorj Tuvaandorj
Author-X-Name-First: Purevdorj
Author-X-Name-Last: Tuvaandorj
Title: Invariant tests based on M-estimators, estimating functions, and the generalized method of moments
Abstract:
We study the invariance properties of various test criteria which have
been proposed for hypothesis testing in the context of incompletely
specified models, such as models which are formulated in terms of
estimating functions (Godambe, 1960) or moment conditions and are
estimated by generalized method of moments (GMM) procedures (Hansen,
1982), and models estimated by pseudo-likelihood (Gouriéroux,
Monfort, and Trognon, 1984b,c) and M-estimation methods.
The invariance properties considered include invariance to (possibly
nonlinear) hypothesis reformulations and reparameterizations. The test
statistics examined include Wald-type, LR-type, LM-type, score-type, and
C(α)-type criteria.
Extending the approach used in Dagenais and Dufour (1991), we show first
that all these test statistics except the Wald-type ones are invariant to
equivalent hypothesis reformulations (under usual regularity conditions),
but that none of the five is generally invariant to model
reparameterizations, including measurement unit changes in nonlinear
models. In other words, testing two equivalent hypotheses in the context
of equivalent models may lead to completely different inferences. For
example, this may occur after an apparently innocuous rescaling of some
model variables. Then, in view of avoiding such undesirable properties, we
study restrictions that can be imposed on the objective functions used for
pseudo-likelihood (or M-estimation) as well as the structure of the test
criteria used with estimating functions and generalized method of moments
(GMM) procedures to obtain invariant tests. In particular, we show that
using linear exponential pseudo-likelihood functions allows one to obtain
invariant score-type and
C(α)-type test criteria,
while in the context of estimating function (or GMM) procedures it is
possible to modify an LR-type statistic proposed by Newey and West (1987)
to obtain a test statistic that is invariant to general
reparameterizations. The invariance associated with linear exponential
pseudo-likelihood functions is interpreted as a strong argument for
using such pseudo-likelihood functions in empirical work.
Journal: Econometric Reviews
Pages: 182-204
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114285
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114285
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:182-204
Template-Type: ReDIF-Article 1.0
Author-Name: Carl Green
Author-X-Name-First: Carl
Author-X-Name-Last: Green
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Author-Name: Yu Yvette Zhang
Author-X-Name-First: Yu Yvette
Author-X-Name-Last: Zhang
Title: Nonparametric estimation of regression models with mixed discrete and continuous covariates by the K-nn method
Abstract:
In this article we consider the problem of estimating a nonparametric
conditional mean function with mixed discrete and continuous covariates by
the nonparametric k-nearest-neighbor
(k-nn) method. We derive the asymptotic normality result
of the proposed estimator and use Monte Carlo simulations to demonstrate
its finite sample performance. We also provide an illustrative empirical
example of our method.
Journal: Econometric Reviews
Pages: 205-224
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114295
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114295
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:205-224
Template-Type: ReDIF-Article 1.0
Author-Name: Chirok Han
Author-X-Name-First: Chirok
Author-X-Name-Last: Han
Author-Name: Peter C. B. Phillips
Author-X-Name-First: Peter C. B.
Author-X-Name-Last: Phillips
Author-Name: Donggyu Sul
Author-X-Name-First: Donggyu
Author-X-Name-Last: Sul
Title: Lag length selection in panel autoregression
Abstract:
Model selection by BIC is well known to be inconsistent in the presence of
incidental parameters. This article shows that, somewhat surprisingly,
even without fixed effects in dynamic panels, BIC is inconsistent and
overestimates the true lag length with considerable probability. The
reason for the inconsistency is explained, and the probability of
overestimation is found to be 50% asymptotically. Three alternative
consistent lag selection methods are considered. Two of these modify BIC,
and the third involves sequential testing. Simulations evaluate the
performance of these alternative lag selection methods in finite samples.
Journal: Econometric Reviews
Pages: 225-240
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114313
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114313
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:225-240
Template-Type: ReDIF-Article 1.0
Author-Name: Nicholas M. Kiefer
Author-X-Name-First: Nicholas M.
Author-X-Name-Last: Kiefer
Author-Name: Jeffrey S. Racine
Author-X-Name-First: Jeffrey S.
Author-X-Name-Last: Racine
Title: The smooth colonel and the reverend find common ground
Abstract:
A semiparametric regression estimator that exploits categorical (i.e.,
discrete-support) kernel functions is developed for a broad class of
hierarchical models including the pooled regression estimator, the
fixed-effects estimator familiar from panel data, and the varying
coefficient estimator, among others. Separate shrinking is allowed for
each coefficient. Regressors may be continuous or discrete. The estimator
is motivated as an intuitive and appealing generalization of existing
methods. It is then supported by demonstrating that it can be realized as
a posterior mean in the Lindley and Smith (1972) framework. As a
demonstration of the flexibility of the proposed approach, the model is
extended to nonparametric hierarchical regression based on B-splines.
Journal: Econometric Reviews
Pages: 241-256
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114304
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114304
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:241-256
Template-Type: ReDIF-Article 1.0
Author-Name: Kajal Lahiri
Author-X-Name-First: Kajal
Author-X-Name-Last: Lahiri
Author-Name: Huaming Peng
Author-X-Name-First: Huaming
Author-X-Name-Last: Peng
Author-Name: Yongchen Zhao
Author-X-Name-First: Yongchen
Author-X-Name-Last: Zhao
Title: Online learning and forecast combination in unbalanced panels
Abstract:
This article evaluates the performance of a few newly proposed online
forecast combination algorithms and compares them with some of the
existing ones including the simple average and that of Bates and Granger
(1969). We derive asymptotic results for the new algorithms that justify
certain established approaches to forecast combination including trimming,
clustering, weighting, and shrinkage. We also show that when implemented
on unbalanced panels, different combination algorithms implicitly impute
missing data differently, so that the performances of the resulting
combined forecasts are not comparable. After explicitly imputing the
missing observations in the U.S. Survey of Professional Forecasters (SPF)
over 1968 IV-2013 I, we find that the equally weighted average continues
to be hard to beat, but the new algorithms can potentially deliver
superior performance at shorter horizons, especially during periods of
volatility clustering and structural breaks.
Journal: Econometric Reviews
Pages: 257-288
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114550
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114550
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:257-288
Template-Type: ReDIF-Article 1.0
Author-Name: Ye Li
Author-X-Name-First: Ye
Author-X-Name-Last: Li
Author-Name: Pierre Perron
Author-X-Name-First: Pierre
Author-X-Name-Last: Perron
Title: Inference on locally ordered breaks in multiple regressions
Abstract:
We consider issues related to inference about locally ordered breaks in a
system of equations, as originally proposed by Qu and Perron (2007). These
apply when break dates in different equations within the system are not
separated by a positive fraction of the sample size. This allows
constructing joint confidence intervals of all such locally ordered break
dates. We extend the results of Qu and Perron (2007) in several
directions. First, we allow the covariates to be any mix of trends and
stationary or integrated regressors. Second, we allow for breaks in the
variance-covariance matrix of the errors. Third, we allow for multiple
locally ordered breaks, each occurring in a different equation within a
subset of equations in the system. Via some simulation experiments, we
show first that the limit distributions derived provide good
approximations to the finite sample distributions. Second, we show that
forming confidence intervals in such a joint fashion allows more precision
(tighter intervals) compared to the standard approach of forming
confidence intervals using the method of Bai and Perron (1998) applied to
a single equation. Simulations also indicate that using the locally
ordered break confidence intervals yields better coverage rates than using
the framework for globally distinct breaks when the break dates are
separated by roughly 10% of the total sample size.
Journal: Econometric Reviews
Pages: 289-353
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114552
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114552
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:289-353
Template-Type: ReDIF-Article 1.0
Author-Name: Kunpeng Li
Author-X-Name-First: Kunpeng
Author-X-Name-Last: Li
Author-Name: Degui Li
Author-X-Name-First: Degui
Author-X-Name-Last: Li
Author-Name: Zhongwen Liang
Author-X-Name-First: Zhongwen
Author-X-Name-Last: Liang
Author-Name: Cheng Hsiao
Author-X-Name-First: Cheng
Author-X-Name-Last: Hsiao
Title: Estimation of semi-varying coefficient models with nonstationary regressors
Abstract:
We study a semivarying coefficient model where the regressors are
generated by multivariate unit root I(1) processes. The influence of
the explanatory vectors on the response variable satisfies the
semiparametric partially linear structure with the nonlinear component
being functional coefficients. A semiparametric estimation methodology
with the first-stage local polynomial smoothing is applied to estimate
both the constant coefficients in the linear component and the functional
coefficients in the nonlinear component. The asymptotic distribution
theory for the proposed semiparametric estimators is established under
some mild conditions, from which both the parametric and nonparametric
estimators are shown to enjoy the well-known super-consistency property.
Furthermore, a simulation study is conducted to investigate the finite
sample performance of the developed methodology and results.
Journal: Econometric Reviews
Pages: 354-369
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114563
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114563
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:354-369
Template-Type: ReDIF-Article 1.0
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Author-Name: Alan T. K. Wan
Author-X-Name-First: Alan T. K.
Author-X-Name-Last: Wan
Author-Name: Huansha Wang
Author-X-Name-First: Huansha
Author-X-Name-Last: Wang
Author-Name: Xinyu Zhang
Author-X-Name-First: Xinyu
Author-X-Name-Last: Zhang
Author-Name: Guohua Zou
Author-X-Name-First: Guohua
Author-X-Name-Last: Zou
Title: A semiparametric generalized ridge estimator and link with model averaging
Abstract:
In recent years, the suggestion of combining models as an alternative to
selecting a single model from a frequentist perspective has been advanced
in a number of studies. In this article, we propose a new semiparametric
estimator of regression coefficients, which takes the form of the feasible
generalized ridge estimator of Hoerl and Kennard (1970b) but with
different biasing factors. We prove that after reparameterization such
that the regressors are orthogonal, the generalized ridge estimator is
algebraically identical to the model average estimator. Further, the
biasing factors that determine the properties of both the generalized
ridge and semiparametric estimators are directly linked to the weights
used in model averaging. These are interesting results for the
interpretations and applications of both semiparametric and ridge
estimators. Furthermore, we demonstrate that these estimators based on
model averaging weights can have properties superior to the well-known
feasible generalized ridge estimator in a large region of the parameter
space. Two empirical examples are presented.
Journal: Econometric Reviews
Pages: 370-384
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114564
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114564
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:370-384
Template-Type: ReDIF-Article 1.0
Author-Name: Tom Wansbeek
Author-X-Name-First: Tom
Author-X-Name-Last: Wansbeek
Author-Name: Dennis Prak
Author-X-Name-First: Dennis
Author-X-Name-Last: Prak
Title: LIML in the static linear panel data model
Abstract:
We consider the static linear panel data model with a single regressor.
For this model, we derive the LIML estimator. We study the asymptotic
behavior of this estimator under many-instruments asymptotics by showing
its consistency, deriving its asymptotic variance, and presenting an
estimator of the asymptotic variance that is consistent under
many-instruments asymptotics. We briefly indicate the extension to the
static panel data model with multiple regressors.
Journal: Econometric Reviews
Pages: 385-395
Issue: 1-3
Volume: 36
Year: 2017
Month: 3
X-DOI: 10.1080/07474938.2015.1114566
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1114566
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:1-3:p:385-395
Template-Type: ReDIF-Article 1.0
Author-Name: Esmeralda A. Ramalho
Author-X-Name-First: Esmeralda A.
Author-X-Name-Last: Ramalho
Author-Name: Joaquim J. S. Ramalho
Author-X-Name-First: Joaquim J. S.
Author-X-Name-Last: Ramalho
Title: Moment-based estimation of nonlinear regression models with boundary outcomes and endogeneity, with applications to nonnegative and fractional responses
Abstract:
In this article, we suggest simple moment-based estimators to deal with
unobserved heterogeneity in a special class of nonlinear regression models
that includes as main particular cases exponential models for nonnegative
responses and logit and complementary loglog models for fractional
responses. The proposed estimators: (i) treat observed and omitted
covariates in a similar manner; (ii) can deal with boundary outcomes;
(iii) accommodate endogenous explanatory variables without requiring
knowledge on the reduced form model, although such information may be
easily incorporated in the estimation process; (iv) do not require
distributional assumptions on the unobservables, a conditional mean
assumption being enough for consistent estimation of the structural
parameters; and (v) under the additional assumption that the dependence
between observables and unobservables is restricted to the conditional
mean, produce consistent estimators of partial effects conditional only on
observables.
Journal: Econometric Reviews
Pages: 397-420
Issue: 4
Volume: 36
Year: 2017
Month: 4
X-DOI: 10.1080/07474938.2014.976531
File-URL: http://hdl.handle.net/10.1080/07474938.2014.976531
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:4:p:397-420
Template-Type: ReDIF-Article 1.0
Author-Name: Cristina Amado
Author-X-Name-First: Cristina
Author-X-Name-Last: Amado
Author-Name: Timo Teräsvirta
Author-X-Name-First: Timo
Author-X-Name-Last: Teräsvirta
Title: Specification and testing of multiplicative time-varying GARCH models with applications
Abstract:
In this article, we develop a specification technique for building
multiplicative time-varying GARCH models of Amado and Teräsvirta
(2008, 2013). The variance is decomposed into an unconditional and a
conditional component such that the unconditional variance component is
allowed to evolve smoothly over time. This nonstationary component is
defined as a linear combination of logistic transition functions with time
as the transition variable. The appropriate number of transition functions
is determined by a sequence of specification tests. For that purpose, a
coherent modelling strategy based on statistical inference is presented.
It is heavily dependent on Lagrange multiplier type misspecification
tests. The tests are easily implemented as they are entirely based on
auxiliary regressions. Finite-sample properties of the strategy and tests
are examined by simulation. The modelling strategy is illustrated in
practice with two real examples: an empirical application to daily
exchange rate returns and another one to daily coffee futures returns.
Journal: Econometric Reviews
Pages: 421-446
Issue: 4
Volume: 36
Year: 2017
Month: 4
X-DOI: 10.1080/07474938.2014.977064
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977064
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:4:p:421-446
Template-Type: ReDIF-Article 1.0
Author-Name: Chris Blakely
Author-X-Name-First: Chris
Author-X-Name-Last: Blakely
Author-Name: Tucker McElroy
Author-X-Name-First: Tucker
Author-X-Name-Last: McElroy
Title: Signal extraction goodness-of-fit diagnostic tests under model parameter uncertainty: Formulations and empirical evaluation
Abstract:
We present a time-domain goodness-of-fit (gof) diagnostic test that is
based on signal-extraction variances for nonstationary time series. This
diagnostic test extends the time-domain gof statistic of
Maravall (2003) by taking into account the effects of model parameter
uncertainty, utilizing theoretical results of McElroy and
Holan (2009). We demonstrate that omitting this correction results in
a severely undersized statistic. Adequate size and power are obtained in
Monte Carlo studies for fairly short time series (10 to 15 years of
monthly data). Our Monte Carlo studies of finite sample size and power
consider different combinations of both signal and noise components using
seasonal, trend, and irregular component models obtained via canonical
decomposition. Details of the implementation appropriate for SARIMA models
are given. We apply the gof diagnostic test statistics to several U.S.
Census Bureau time series. The results generally corroborate the output of
the automatic model selection procedure of the X-12-ARIMA software, which
in contrast to our diagnostic test statistic does not involve hypothesis
testing. We conclude that these diagnostic test statistics are a useful
supplementary model-checking tool for practitioners engaged in the task of
model-based seasonal adjustment.
Journal: Econometric Reviews
Pages: 447-467
Issue: 4
Volume: 36
Year: 2017
Month: 4
X-DOI: 10.1080/07474938.2016.1140277
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1140277
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:4:p:447-467
Template-Type: ReDIF-Article 1.0
Author-Name: Zdeněk Hlávka
Author-X-Name-First: Zdeněk
Author-X-Name-Last: Hlávka
Author-Name: Marie Hušková
Author-X-Name-First: Marie
Author-X-Name-Last: Hušková
Author-Name: Claudia Kirch
Author-X-Name-First: Claudia
Author-X-Name-Last: Kirch
Author-Name: Simos G. Meintanis
Author-X-Name-First: Simos G.
Author-X-Name-Last: Meintanis
Title: Fourier-type tests involving martingale difference processes
Abstract:
We develop testing procedures that detect whether the observed time series
is a martingale difference sequence. Furthermore, tests are developed that
detect change-points in the conditional expectation of the series given
its past. The test statistics are formulated following the approach of
Fourier-type conditional expectations first proposed by
Bierens (1982) and have the advantage of computational simplicity.
The limit behavior of the test statistics is investigated under the null
hypothesis as well as under alternatives. Since the asymptotic null
distribution contains unknown parameters, a bootstrap procedure is
proposed in order to actually perform the test. The performance of the
bootstrap version of the test is compared in finite samples with other
methods for the same problem. A real-data application is also included.
Journal: Econometric Reviews
Pages: 468-492
Issue: 4
Volume: 36
Year: 2017
Month: 4
X-DOI: 10.1080/07474938.2014.977074
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977074
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:4:p:468-492
Template-Type: ReDIF-Article 1.0
Author-Name: Massimiliano Caporin
Author-X-Name-First: Massimiliano
Author-X-Name-Last: Caporin
Author-Name: Paolo Paruolo
Author-X-Name-First: Paolo
Author-X-Name-Last: Paruolo
Title: Correction of Caporin and Paruolo (2015)
Journal: Econometric Reviews
Pages: 493-493
Issue: 4
Volume: 36
Year: 2017
Month: 4
X-DOI: 10.1080/07474938.2016.1275203
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1275203
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:4:p:493-493
Template-Type: ReDIF-Article 1.0
Author-Name: Tucker McElroy
Author-X-Name-First: Tucker
Author-X-Name-Last: McElroy
Author-Name: Michael W. McCracken
Author-X-Name-First: Michael W.
Author-X-Name-Last: McCracken
Title: Multistep ahead forecasting of vector time series
Abstract:
This article develops the theory of multistep ahead forecasting for vector
time series that exhibit temporal nonstationarity and co-integration. We
treat the case of a semi-infinite past by developing the forecast filters
and the forecast error filters explicitly. We also provide formulas for
forecasting from a finite data sample. This latter application can be
accomplished by using large matrices, which remains practicable when the
total sample size is moderate. Expressions for the mean square error of
forecasts are also derived and can be implemented readily. The flexibility
and generality of these formulas are illustrated by four diverse
applications: forecasting euro area macroeconomic aggregates; backcasting
fertility rates by racial category; forecasting long memory inflation
data; and forecasting regional housing starts using a seasonally
co-integrated model.
Journal: Econometric Reviews
Pages: 495-513
Issue: 5
Volume: 36
Year: 2017
Month: 5
X-DOI: 10.1080/07474938.2014.977088
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977088
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:5:p:495-513
Template-Type: ReDIF-Article 1.0
Author-Name: Guillaume Chevillon
Author-X-Name-First: Guillaume
Author-X-Name-Last: Chevillon
Title: Robust cointegration testing in the presence of weak trends, with an application to the human origin of global warming
Abstract:
Standard tests for the rank of cointegration of a vector autoregressive
process present distributions that are affected by the presence of
deterministic trends. We consider the recent approach of Demetrescu et al.
(2009) who recommend testing a composite null. We assess this methodology
in the presence of trends (linear or broken) whose magnitudes are small
enough that they are not always detectable at conventional significance levels. We
model them using local asymptotics and derive the properties of the test
statistics. We show that whether the trend is orthogonal to the
cointegrating vector has a major impact on the distributions but that the
test combination approach remains valid. We apply the methodology to
the study of cointegration properties between global temperatures and the
radiative forcing of human gas emissions. We find new evidence of Granger
Causality.
Journal: Econometric Reviews
Pages: 514-545
Issue: 5
Volume: 36
Year: 2017
Month: 5
X-DOI: 10.1080/07474938.2014.977080
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977080
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:5:p:514-545
Template-Type: ReDIF-Article 1.0
Author-Name: Jouchi Nakajima
Author-X-Name-First: Jouchi
Author-X-Name-Last: Nakajima
Title: Bayesian analysis of multivariate stochastic volatility with skew return distribution
Abstract:
Multivariate stochastic volatility models with skew distributions are
proposed. Exploiting Cholesky stochastic volatility modeling, univariate
stochastic volatility processes with leverage effect and generalized
hyperbolic skew t-distributions are embedded into a multivariate analysis
with time-varying correlations. Bayesian modeling allows this approach to
provide a parsimonious skew structure and to scale up easily to
high-dimensional problems. Analyses of daily stock returns are illustrated.
Empirical results show that the time-varying correlations and the sparse
skew structure contribute to improved prediction performance and
Value-at-Risk forecasts.
Journal: Econometric Reviews
Pages: 546-562
Issue: 5
Volume: 36
Year: 2017
Month: 5
X-DOI: 10.1080/07474938.2014.977093
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977093
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:5:p:546-562
Template-Type: ReDIF-Article 1.0
Author-Name: Zongwu Cai
Author-X-Name-First: Zongwu
Author-X-Name-Last: Cai
Author-Name: Ying Fang
Author-X-Name-First: Ying
Author-X-Name-Last: Fang
Author-Name: Henong Li
Author-X-Name-First: Henong
Author-X-Name-Last: Li
Title: Weak Instrumental Variables Models for Longitudinal Data
Abstract: This article considers the estimation and testing of a within-group two-stage least squares (TSLS) estimator for instruments with varying degrees of weakness in a longitudinal (panel) data model. We show that adding repeated cross-sectional information into a regression model can improve estimation in the presence of weak instruments. Moreover, the consistency and limiting distribution of the TSLS estimator are established when both N and T tend to infinity. Some asymptotically pivotal tests are extended to a longitudinal data model and their asymptotic properties are examined. A Monte Carlo experiment is conducted to evaluate the finite sample performance of the proposed estimators.
Journal: Econometric Reviews
Pages: 361-389
Issue: 4
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607356
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607356
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:4:p:361-389
Template-Type: ReDIF-Article 1.0
Author-Name: Chihwa Kao
Author-X-Name-First: Chihwa
Author-X-Name-Last: Kao
Author-Name: Lorenzo Trapani
Author-X-Name-First: Lorenzo
Author-X-Name-Last: Trapani
Author-Name: Giovanni Urga
Author-X-Name-First: Giovanni
Author-X-Name-Last: Urga
Title: Asymptotics for Panel Models with Common Shocks
Abstract: This article develops a novel asymptotic theory for panel models with common shocks. We assume that contemporaneous correlation can be generated by both the presence of common regressors among units and weak spatial dependence among the error terms. Several characteristics of the panel are considered: cross-sectional and time-series dimensions can either be fixed or large; factors can either be observable or unobservable; the factor model can describe either a cointegration relationship or a spurious regression, and we also consider the stationary case. We derive the rate of convergence and the limit distributions for the ordinary least squares (OLS) estimates of the model parameters under all the aforementioned cases.
Journal: Econometric Reviews
Pages: 390-439
Issue: 4
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607991
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607991
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:4:p:390-439
Template-Type: ReDIF-Article 1.0
Author-Name: J. Arteche
Author-X-Name-First: J.
Author-X-Name-Last: Arteche
Title: Semiparametric Inference in Correlated Long Memory Signal Plus Noise Models
Abstract: This article proposes an extension of the log periodogram regression in perturbed long memory series that accounts for the added noise, while also allowing for correlation between signal and noise, a common situation in many economic and financial series. Consistency (for d < 1) and asymptotic normality (for d < 3/4) are shown with the same bandwidth restriction as required for the original log periodogram regression in a fully observable series, with the corresponding gain in asymptotic efficiency and faster convergence over competitors. Local Wald, Lagrange Multiplier, and Hausman type tests of the hypothesis of no correlation between the latent signal and noise are also proposed.
Journal: Econometric Reviews
Pages: 440-474
Issue: 4
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607996
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607996
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:4:p:440-474
Template-Type: ReDIF-Article 1.0
Author-Name: Saralees Nadarajah
Author-X-Name-First: Saralees
Author-X-Name-Last: Nadarajah
Author-Name: Mahdi Teimouri
Author-X-Name-First: Mahdi
Author-X-Name-Last: Teimouri
Title: On the Characteristic Function for Asymmetric Exponential Power Distributions
Abstract: The econometric literature has seen a surge of developments in the theory and applications of asymmetric exponential power distributions (AEPDs). Here, for the first time, we derive explicit closed form expressions for the characteristic function of AEPDs. The expressions involve the complex parameter Wright generalized hypergeometric function.
Journal: Econometric Reviews
Pages: 475-481
Issue: 4
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.608000
File-URL: http://hdl.handle.net/10.1080/07474938.2011.608000
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:4:p:475-481
Template-Type: ReDIF-Article 1.0
Author-Name: Xuexin Wang
Author-X-Name-First: Xuexin
Author-X-Name-Last: Wang
Title: A general approach to conditional moment specification testing with projections
Abstract:
This article develops a general approach for model specification analysis within the conditional moment specification testing framework. The new methodology removes the non-negligible estimation effect from the test statistic via a projection-based transformation, exploiting the nature of conditional moment specification testing. That is, the conditional moment restrictions, which are implicitly defined in the conditional moment testing framework, not only imply the unconditional moment restrictions we are testing, but also many other unconditional moment restrictions. This approach is robust to departures from the distributional assumptions that are not being tested; moreover, only a preliminary √T-consistent estimator is needed, and the transformation is asymptotically distribution free. Furthermore, the transformed statistic attains asymptotic efficiency in the sense of generalized method of moments (GMM) estimation. For some specific alternatives, we establish optimal tests. We apply the methodology to test the adequacy and nonlinearity of generalized autoregressive conditional heteroskedasticity (GARCH) models. Finally, an application to the S&P 500 daily data highlights the merits of our approach.
Journal: Econometric Reviews
Pages: 140-165
Issue: 2
Volume: 37
Year: 2018
Month: 2
X-DOI: 10.1080/07474938.2015.1032165
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1032165
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:2:p:140-165
Template-Type: ReDIF-Article 1.0
Author-Name: Koji Miyawaki
Author-X-Name-First: Koji
Author-X-Name-Last: Miyawaki
Author-Name: Yasuhiro Omori
Author-X-Name-First: Yasuhiro
Author-X-Name-Last: Omori
Author-Name: Akira Hibiki
Author-X-Name-First: Akira
Author-X-Name-Last: Hibiki
Title: A discrete/continuous choice model on a nonconvex budget set
Abstract:
Decreasing block rate pricing is a nonlinear price system often used for public utility services. Residential gas services in Japan and the United Kingdom are provided under this price schedule. The discrete/continuous choice approach is used to analyze the demand under decreasing block rate pricing. However, the nonlinearity problem, which has not been examined in previous studies, arises because a consumer’s budget set (a set of affordable consumption amounts) is nonconvex, and hence, the resulting model includes highly nonlinear functions. To address this problem, we propose a feasible, efficient method of demand estimation on the nonconvex budget set. The advantages of our method are as follows: (i) the construction of a Markov chain Monte Carlo algorithm with an efficient blanket based on the Hermite–Hadamard integral inequality and the power-mean inequality, (ii) the explicit consideration of the (highly nonlinear) separability condition, which often makes numerical likelihood maximization difficult, and (iii) the introduction of normal disturbance into the discrete/continuous choice model on the nonconvex budget set. The proposed method is applied to estimate the Japanese residential gas demand function and evaluate the effect of price schedule changes as a policy experiment.
Journal: Econometric Reviews
Pages: 89-113
Issue: 2
Volume: 37
Year: 2018
Month: 2
X-DOI: 10.1080/07474938.2015.1032166
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1032166
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:2:p:89-113
Template-Type: ReDIF-Article 1.0
Author-Name: Stanislav Anatolyev
Author-X-Name-First: Stanislav
Author-X-Name-Last: Anatolyev
Author-Name: Nikita Kobotaev
Author-X-Name-First: Nikita
Author-X-Name-Last: Kobotaev
Title: Modeling and forecasting realized covariance matrices with accounting for leverage
Abstract:
The existing dynamic models for realized covariance matrices do not account for an asymmetry with respect to price directions. We modify the recently proposed conditional autoregressive Wishart (CAW) model to allow for the leverage effect. In the conditional threshold autoregressive Wishart (CTAW) model and its variations, the parameters governing each asset's volatility and covolatility dynamics are subject to switches that depend on signs of previous asset returns or previous market returns. We evaluate the predictive ability of the CTAW model and its restricted and extended specifications from both statistical and economic points of view. We find strong evidence that many CTAW specifications have a better in-sample fit and tend to have a better out-of-sample predictive ability than the original CAW model and its modifications.
Journal: Econometric Reviews
Pages: 114-139
Issue: 2
Volume: 37
Year: 2018
Month: 2
X-DOI: 10.1080/07474938.2015.1035165
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1035165
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:2:p:114-139
Template-Type: ReDIF-Article 1.0
Author-Name: Francisco Blasques
Author-X-Name-First: Francisco
Author-X-Name-Last: Blasques
Author-Name: André Lucas
Author-X-Name-First: André
Author-X-Name-Last: Lucas
Author-Name: Erkki Silde
Author-X-Name-First: Erkki
Author-X-Name-Last: Silde
Title: A stochastic recurrence equations approach for score driven correlation models
Abstract:
We describe stationarity and ergodicity (SE) regions for a recently proposed class of score driven dynamic correlation models. These models have important applications in empirical work. The regions are derived from sufficiency conditions in Bougerol (1993) and take a nonstandard form. We show that the nonstandard shape of the sufficiency regions cannot be avoided by reparameterizing the model or by rescaling the score steps in the transition equation for the correlation parameter. This makes the result markedly different from the volatility case. Observationally equivalent decompositions of the stochastic recurrence equation yield regions with different shapes and sizes. We use these results to establish the consistency and asymptotic normality of the maximum likelihood estimator. We illustrate our results with an analysis of time-varying correlations between U.K. and Greek equity indices. We find that, in empirical applications as well, different decompositions can give rise to different conclusions regarding the stability of the estimated model.
Journal: Econometric Reviews
Pages: 166-181
Issue: 2
Volume: 37
Year: 2018
Month: 2
X-DOI: 10.1080/07474938.2016.1139821
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1139821
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:2:p:166-181
Template-Type: ReDIF-Article 1.0
Author-Name: Yoshimasa Uematsu
Author-X-Name-First: Yoshimasa
Author-X-Name-Last: Uematsu
Title: Nonstationary nonlinear quantile regression
Abstract:
This study examines estimation and inference based on quantile regression for parametric nonlinear models with an integrated time series covariate. We first derive the limiting distribution of the nonlinear quantile regression estimator and then consider testing for parameter restrictions, when the regression function is specified as an asymptotically homogeneous function. We also study linear-in-parameter regression models when the regression function is given by integrable regression functions as well as asymptotically homogeneous regression functions. Furthermore, we propose a fully modified estimator to reduce the bias in the original estimator under a certain set of conditions. Finally, simulation studies show that the estimators behave well, especially when the regression error term has a fat-tailed distribution.
Journal: Econometric Reviews
Pages: 386-416
Issue: 4
Volume: 38
Year: 2019
Month: 4
X-DOI: 10.1080/07474938.2017.1308056
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1308056
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:4:p:386-416
Template-Type: ReDIF-Article 1.0
Author-Name: Xiaodong Liu
Author-X-Name-First: Xiaodong
Author-X-Name-Last: Liu
Author-Name: Paulo Saraiva
Author-X-Name-First: Paulo
Author-X-Name-Last: Saraiva
Title: GMM estimation of spatial autoregressive models in a system of simultaneous equations with heteroskedasticity
Abstract:
This paper proposes a GMM estimation framework for the SAR model in a system of simultaneous equations with heteroskedastic disturbances. Besides linear moment conditions, the proposed GMM estimator also utilizes quadratic moment conditions based on the covariance structure of model disturbances within and across equations. Compared with the QML approach, the GMM estimator is easier to implement and robust under heteroskedasticity of unknown form. We derive the heteroskedasticity-robust standard error for the GMM estimator. Monte Carlo experiments show that the proposed GMM estimator performs well in finite samples.
Journal: Econometric Reviews
Pages: 359-385
Issue: 4
Volume: 38
Year: 2019
Month: 4
X-DOI: 10.1080/07474938.2017.1308087
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1308087
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:4:p:359-385
Template-Type: ReDIF-Article 1.0
Author-Name: Liangjun Su
Author-X-Name-First: Liangjun
Author-X-Name-Last: Su
Author-Name: Pai Xu
Author-X-Name-First: Pai
Author-X-Name-Last: Xu
Title: Common threshold in quantile regressions with an application to pricing for reputation
Abstract:
The paper develops a systematic estimation and inference procedure for quantile regression models where there may exist a common threshold effect across different quantile indices. We first propose a sup-Wald test for the existence of a threshold effect, and then study the asymptotic properties of the estimators in a threshold quantile regression model under the shrinking threshold effect framework. We consider several tests for the presence of a common threshold value across different quantile indices and obtain their limiting distributions. We apply our methodology to study the pricing strategy for reputation through the use of a data set from Taobao.com. In our economic model, an online seller maximizes the sum of the profit from current sales and the possible future gain from a targeted higher reputation level. We show that the model can predict a jump in optimal pricing behavior, which is referred to as the “reputation effect” in this paper. The use of the threshold quantile regression model allows us to identify and explore the reputation effect and its heterogeneity in data. We find both reputation effects and common thresholds for a range of quantile indices in sellers’ pricing strategies in our application.
Journal: Econometric Reviews
Pages: 417-450
Issue: 4
Volume: 38
Year: 2019
Month: 4
X-DOI: 10.1080/07474938.2017.1318469
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1318469
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:4:p:417-450
Template-Type: ReDIF-Article 1.0
Author-Name: Sivagowry Sriananthakumar
Author-X-Name-First: Sivagowry
Author-X-Name-Last: Sriananthakumar
Title: Using point optimal test of a simple null hypothesis for testing a composite null hypothesis via maximized Monte Carlo approach
Abstract:
King’s Point Optimal (PO) test of a simple null hypothesis is useful in a number of ways; for example, it can be used to trace the power envelope against which existing tests can be compared. However, this test cannot always be constructed when testing a composite null hypothesis. It is suggested in the literature that approximate PO (APO) tests can overcome this problem, but they also have some drawbacks. This paper investigates whether King’s PO test can be used for testing a composite null in the presence of nuisance parameters via a maximized Monte Carlo (MMC) approach, with encouraging results.
Journal: Econometric Reviews
Pages: 451-464
Issue: 4
Volume: 38
Year: 2019
Month: 4
X-DOI: 10.1080/07474938.2017.1382781
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1382781
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:4:p:451-464
Template-Type: ReDIF-Article 1.0
Author-Name: Arnaud Dufays
Author-X-Name-First: Arnaud
Author-X-Name-Last: Dufays
Author-Name: Jeroen V. K. Rombouts
Author-X-Name-First: Jeroen V. K.
Author-X-Name-Last: Rombouts
Title: Sparse Change-point HAR Models for Realized Variance
Abstract:
Change-point time series specifications constitute flexible models that capture unknown structural changes by allowing for switches in the model parameters. Nevertheless, most models suffer from an over-parametrization issue since typically only one latent state variable drives the switches in all parameters. This implies that all parameters have to change when a break happens. To gauge whether and where there are structural breaks in realized variance, we introduce the sparse change-point HAR model. The approach controls for model parsimony by limiting the number of parameters which evolve from one regime to another. Sparsity is achieved by employing a nonstandard shrinkage prior distribution. We derive a Gibbs sampler for inferring the parameters of this process. Simulation studies illustrate the excellent performance of the sampler. Relying on this new framework, we study the stability of the HAR model using realized variance series of several major international indices between January 2000 and August 2015.
Journal: Econometric Reviews
Pages: 857-880
Issue: 8
Volume: 38
Year: 2019
Month: 9
X-DOI: 10.1080/07474938.2018.1454366
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1454366
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:8:p:857-880
Template-Type: ReDIF-Article 1.0
Author-Name: Roberto León-González
Author-X-Name-First: Roberto
Author-X-Name-Last: León-González
Title: Efficient Bayesian inference in generalized inverse gamma processes for stochastic volatility
Abstract:
This paper develops a novel and efficient algorithm for Bayesian inference in inverse Gamma stochastic volatility models. It is shown that by conditioning on auxiliary variables, it is possible to sample all the volatilities jointly directly from their posterior conditional density, using simple distributions that are easy to draw from. Furthermore, this paper develops a generalized inverse gamma process with more flexible tails in the distribution of volatilities, which still allows for simple and efficient calculations. Using several macroeconomic and financial datasets, it is shown that the inverse gamma and generalized inverse gamma processes can greatly outperform the commonly used log normal volatility processes with Student’s t errors or jumps in the mean equation.
Journal: Econometric Reviews
Pages: 899-920
Issue: 8
Volume: 38
Year: 2019
Month: 9
X-DOI: 10.1080/07474938.2018.1485614
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1485614
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:8:p:899-920
Template-Type: ReDIF-Article 1.0
Author-Name: Takahide Yanagi
Author-X-Name-First: Takahide
Author-X-Name-Last: Yanagi
Title: Inference on local average treatment effects for misclassified treatment
Abstract:
We develop point-identification for the local average treatment effect when the binary treatment contains a measurement error. The standard instrumental variable estimator is inconsistent for the parameter since the measurement error is nonclassical by construction. We correct the problem by identifying the distribution of the measurement error based on the use of an exogenous variable that can even be a binary covariate. The moment conditions derived from the identification lead to generalized method of moments estimation with asymptotically valid inferences. Monte Carlo simulations and an empirical illustration demonstrate the usefulness of the proposed procedure.
Journal: Econometric Reviews
Pages: 938-960
Issue: 8
Volume: 38
Year: 2019
Month: 9
X-DOI: 10.1080/07474938.2018.1485833
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1485833
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:8:p:938-960
Template-Type: ReDIF-Article 1.0
Author-Name: Xiaodong Liu
Author-X-Name-First: Xiaodong
Author-X-Name-Last: Liu
Title: Simultaneous equations with binary outcomes and social interactions
Abstract:
This paper introduces a discrete-choice simultaneous-equation social interaction model. We provide a microfoundation for the econometric model by considering an incomplete information game where individuals interact in multiple activities through a network. We characterize the sufficient condition for the existence of a unique BNE of the game. We discuss the identification of the econometric model and propose a two-stage estimation procedure, where the reduced form parameters are estimated by the NPL algorithm in the first stage and the structural parameters are recovered from the estimated reduced form parameters by the AGLS estimator in the second stage. Monte Carlo experiments show that the proposed estimation procedure performs well in finite samples and remains computationally feasible when networks are large. We also provide an empirical example to illustrate the empirical relevance of the proposed model.
Journal: Econometric Reviews
Pages: 921-937
Issue: 8
Volume: 38
Year: 2019
Month: 9
X-DOI: 10.1080/07474938.2018.1485836
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1485836
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:8:p:921-937
Template-Type: ReDIF-Article 1.0
Author-Name: Chaohua Dong
Author-X-Name-First: Chaohua
Author-X-Name-Last: Dong
Author-Name: Jiti Gao
Author-X-Name-First: Jiti
Author-X-Name-Last: Gao
Author-Name: Bin Peng
Author-X-Name-First: Bin
Author-X-Name-Last: Peng
Title: Estimation in a semiparametric panel data model with nonstationarity
Abstract:
In this paper, we consider a partially linear panel data model with nonstationarity and certain cross-sectional dependence. To account for the explosive feature of the nonstationary time series, we employ Hermite orthogonal functions in this study. Under a general spatial error dependence structure, we then establish consistent closed-form estimates for both the unknown parameters and the unknown functions for the cases where N and T go jointly to infinity. Rates of convergence and asymptotic normality are established for the proposed estimators. Both the finite sample performance and the empirical applications show that the proposed estimation methods work well.
Journal: Econometric Reviews
Pages: 961-977
Issue: 8
Volume: 38
Year: 2019
Month: 9
X-DOI: 10.1080/07474938.2018.1514021
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1514021
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:8:p:961-977
Template-Type: ReDIF-Article 1.0
Author-Name: Josep Lluís Carrion-i-Silvestre
Author-X-Name-First: Josep Lluís
Author-X-Name-Last: Carrion-i-Silvestre
Author-Name: Dukpa Kim
Author-X-Name-First: Dukpa
Author-X-Name-Last: Kim
Title: Quasi-likelihood ratio tests for cointegration, cobreaking, and cotrending
Abstract:
We consider a set of variables with two types of nonstationary features, stochastic trends and broken linear trends. We develop tests that can determine whether there is a linear combination of these variables under which the nonstationary features can be canceled out. The first test can determine whether stochastic trends can be eliminated and thus whether cointegration holds, regardless of whether structural breaks in linear trends are eliminated. The second test can determine whether both stochastic trends and breaks in linear trends are simultaneously removed and thus whether cointegration and cobreaking simultaneously hold. The third test can determine whether not only breaks in linear trends but also linear trends themselves are eliminated along with stochastic trends and thus whether both cointegration and cotrending hold.
Journal: Econometric Reviews
Pages: 881-898
Issue: 8
Volume: 38
Year: 2019
Month: 9
X-DOI: 10.1080/07474938.2018.1528416
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1528416
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:8:p:881-898
Template-Type: ReDIF-Article 1.0
Author-Name: Laszlo Balazsi
Author-X-Name-First: Laszlo
Author-X-Name-Last: Balazsi
Author-Name: Laszlo Matyas
Author-X-Name-First: Laszlo
Author-X-Name-Last: Matyas
Author-Name: Tom Wansbeek
Author-X-Name-First: Tom
Author-X-Name-Last: Wansbeek
Title: The estimation of multidimensional fixed effects panel data models
Abstract:
This article introduces the appropriate within estimators for the most frequently used three-dimensional fixed effects panel data models. It analyzes the behavior of these estimators in the cases of no self-flow data, unbalanced data, and dynamic autoregressive models. The main results are then generalized for higher dimensional panel data sets as well.
Journal: Econometric Reviews
Pages: 212-227
Issue: 3
Volume: 37
Year: 2018
Month: 3
X-DOI: 10.1080/07474938.2015.1032164
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1032164
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:3:p:212-227
Template-Type: ReDIF-Article 1.0
Author-Name: D. S. G. Pollock
Author-X-Name-First: D. S. G.
Author-X-Name-Last: Pollock
Title: Trends, cycles and seasons: Econometric methods of signal extraction
Abstract:
Alternative methods of trend extraction and of seasonal adjustment are described that operate in the time domain and in the frequency domain. The time-domain methods that are implemented in the TRAMO–SEATS and the STAMP programs are compared. An abbreviated time-domain method of seasonal adjustment that is implemented in the IDEOLOG program is also presented. Finite-sample versions of the Wiener–Kolmogorov filter are described that can be used to implement the methods in a common way. The frequency-domain method, which is also implemented in the IDEOLOG program, employs an ideal frequency selective filter that depends on identifying the ordinates of the Fourier transform of a detrended data sequence that should lie in the pass band of the filter and those that should lie in its stop band. Filters of this nature can be used both for extracting a low-frequency cyclical component of the data and for extracting the seasonal component.
Journal: Econometric Reviews
Pages: 228-246
Issue: 3
Volume: 37
Year: 2018
Month: 3
X-DOI: 10.1080/07474938.2015.1033218
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1033218
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:3:p:228-246
Template-Type: ReDIF-Article 1.0
Author-Name: William C. Horrace
Author-X-Name-First: William C.
Author-X-Name-Last: Horrace
Author-Name: Christopher F. Parmeter
Author-X-Name-First: Christopher F.
Author-X-Name-Last: Parmeter
Title: A Laplace stochastic frontier model
Abstract:
We propose a Laplace stochastic frontier model as an alternative to the traditional model with normal errors. An interesting feature of the Laplace model is that the distribution of inefficiency conditional on the composed error is constant for positive values of the composed error, but varies for negative values. A simulation study suggests that the model performs well relative to the normal-exponential model when the two-sided error is misspecified. An application to U.S. Airlines is provided.
Journal: Econometric Reviews
Pages: 260-280
Issue: 3
Volume: 37
Year: 2018
Month: 3
X-DOI: 10.1080/07474938.2015.1059715
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1059715
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:3:p:260-280
Template-Type: ReDIF-Article 1.0
Author-Name: Tomohiro Ando
Author-X-Name-First: Tomohiro
Author-X-Name-Last: Ando
Author-Name: Jushan Bai
Author-X-Name-First: Jushan
Author-X-Name-Last: Bai
Title: Selecting the regularization parameters in high-dimensional panel data models: Consistency and efficiency
Abstract:
This article considers panel data models in the presence of a large number of potential predictors and unobservable common factors. The model is estimated by the regularization method together with the principal components procedure. We propose a panel information criterion for selecting the regularization parameter and the number of common factors under a diverging number of predictors. Under the correct model specification, we show that the proposed criterion consistently identifies the true model. If the model is instead misspecified, the proposed criterion achieves asymptotically efficient model selection. Simulation results confirm these theoretical arguments.
Journal: Econometric Reviews
Pages: 183-211
Issue: 3
Volume: 37
Year: 2018
Month: 3
X-DOI: 10.1080/07474938.2015.1092822
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092822
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:3:p:183-211
Template-Type: ReDIF-Article 1.0
Author-Name: S. M. Hatefi
Author-X-Name-First: S. M.
Author-X-Name-Last: Hatefi
Author-Name: S. A. Torabi
Author-X-Name-First: S. A.
Author-X-Name-Last: Torabi
Title: A slack analysis framework for improving composite indicators with applications to human development and sustainable energy indices
Abstract:
Data envelopment analysis models are used for measuring composite indicators in various areas. Although there are many models for measuring composite indicators in the literature, surprisingly, there is no methodology that clearly shows how the improvement of composite indicators could be performed. This article proposes a slack analysis framework for improving the composite indicator of inefficient entities. To do so, two dual problems originating from two data envelopment analysis models in the literature are proposed, which can guide decision makers on how to adjust the subindicators of inefficient entities to improve their composite indicators, by identifying which subindicators must be improved and how much they should be augmented. The proposed methodology for improving composite indicators is inspired by data envelopment analysis and slack analysis approaches. Applicability of the proposed methodology is investigated for improving two well-known composite indicators, i.e., the Sustainable Energy Index and the Human Development Index. The results show that 12 out of 18 economies are inefficient in the context of the Sustainable Energy Index, for which the proposed slack analysis models provide the suggested adjustments in terms of their respective subindicators. Furthermore, the proposed methodology suggests how to adjust the life expectancy, the education, and the gross domestic product (GDP) as the three socioeconomic indicators to improve the Human Development Index of 24 countries which are identified as inefficient entities among 27 countries.
Journal: Econometric Reviews
Pages: 247-259
Issue: 3
Volume: 37
Year: 2018
Month: 3
X-DOI: 10.1080/07474938.2016.1140286
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1140286
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:3:p:247-259
Template-Type: ReDIF-Article 1.0
Author-Name: Stanislav Anatolyev
Author-X-Name-First: Stanislav
Author-X-Name-Last: Anatolyev
Author-Name: Nikolay Gospodinov
Author-X-Name-First: Nikolay
Author-X-Name-Last: Gospodinov
Title: Multivariate Return Decomposition: Theory and Implications
Abstract:
In this paper, we propose a model based on a multivariate decomposition of the multiplicative components of asset returns – absolute values and signs. In the m-variate case, the marginals for the m absolute values and the binary marginals for the m directions are linked through a 2m-dimensional copula. The approach is detailed in the case of a bivariate decomposition. We outline the construction of the likelihood function and the computation of different conditional measures. The finite-sample properties of the maximum likelihood estimator are assessed by simulation. An application to predicting bond returns illustrates the usefulness of the proposed method.
Journal: Econometric Reviews
Pages: 487-508
Issue: 5
Volume: 38
Year: 2019
Month: 5
X-DOI: 10.1080/07474938.2017.1348677
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1348677
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:5:p:487-508
Template-Type: ReDIF-Article 1.0
Author-Name: Ricardo Mora
Author-X-Name-First: Ricardo
Author-X-Name-Last: Mora
Author-Name: Iliana Reggio
Author-X-Name-First: Iliana
Author-X-Name-Last: Reggio
Title: Alternative diff-in-diffs estimators with several pretreatment periods
Abstract:
This paper deals with the identification of treatment effects using difference-in-differences estimators when several pretreatment periods are available. We define a family of identifying nonnested assumptions that lead to alternative difference-in-differences estimators. We show that the most usual difference-in-differences estimators imply equivalence conditions for the identifying nonnested assumptions. We further propose a model that can be used to test multiple equivalence conditions without imposing any of them. We conduct a Monte Carlo analysis and apply our approach to several recent papers to show its practical relevance.
Journal: Econometric Reviews
Pages: 465-486
Issue: 5
Volume: 38
Year: 2019
Month: 5
X-DOI: 10.1080/07474938.2017.1348683
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1348683
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:5:p:465-486
Template-Type: ReDIF-Article 1.0
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: Anton Skrobotov
Author-X-Name-First: Anton
Author-X-Name-Last: Skrobotov
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Title: Wild bootstrap seasonal unit root tests for time series with periodic nonstationary volatility
Abstract:
We investigate the behavior of the well-known Hylleberg, Engle, Granger and Yoo (HEGY) regression-based seasonal unit root tests in cases where the driving shocks can display periodic nonstationary volatility and conditional heteroskedasticity. Our set up allows for periodic heteroskedasticity, nonstationary volatility and (seasonal) generalized autoregressive-conditional heteroskedasticity as special cases. We show that the limiting null distributions of the HEGY tests depend, in general, on nuisance parameters which derive from the underlying volatility process. Monte Carlo simulations show that the standard HEGY tests can be substantially oversized in the presence of such effects. As a consequence, we propose wild bootstrap implementations of the HEGY tests. Two possible wild bootstrap resampling schemes are discussed, both of which are shown to deliver asymptotically pivotal inference under our general conditions on the shocks. Simulation evidence is presented which suggests that our proposed bootstrap tests perform well in practice, largely correcting the size problems seen with the standard HEGY tests even under extreme patterns of heteroskedasticity, yet not losing finite sample power relative to the standard HEGY tests.
Journal: Econometric Reviews
Pages: 509-532
Issue: 5
Volume: 38
Year: 2019
Month: 5
X-DOI: 10.1080/07474938.2017.1348684
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1348684
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:5:p:509-532
Template-Type: ReDIF-Article 1.0
Author-Name: Qiang Chen
Author-X-Name-First: Qiang
Author-X-Name-Last: Chen
Author-Name: Meidi Hu
Author-X-Name-First: Meidi
Author-X-Name-Last: Hu
Author-Name: Xiaojun Song
Author-X-Name-First: Xiaojun
Author-X-Name-Last: Song
Title: A nonparametric specification test for the volatility functions of diffusion processes
Abstract:
This paper develops a new test for the parametric volatility function of a diffusion model based on nonparametric estimation techniques. The proposed test imposes no restriction on the functional form of the drift function and has an asymptotically standard normal distribution under the null hypothesis of correct specification. It is consistent against any fixed alternatives and has nontrivial asymptotic power against a class of local alternatives with proper rates. Monte Carlo simulations show that the test performs well in finite samples and generally has better power performance than the nonparametric test of Li (2007) and the stochastic process-based tests of Dette and Podolskij (2008). When applying the test to high frequency data of EUR/USD exchange rate, the empirical results show that the commonly used volatility functions fit more poorly when the data frequency becomes higher, and the general volatility functions fit relatively better than the constant volatility function.
Journal: Econometric Reviews
Pages: 557-576
Issue: 5
Volume: 38
Year: 2019
Month: 5
X-DOI: 10.1080/07474938.2017.1365428
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1365428
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:5:p:557-576
Template-Type: ReDIF-Article 1.0
Author-Name: M. Victoria Caballero-Pintado
Author-X-Name-First: M. Victoria
Author-X-Name-Last: Caballero-Pintado
Author-Name: Mariano Matilla-García
Author-X-Name-First: Mariano
Author-X-Name-Last: Matilla-García
Author-Name: Manuel Ruiz Marín
Author-X-Name-First: Manuel Ruiz
Author-X-Name-Last: Marín
Title: Symbolic correlation integral
Abstract:
This paper aims to introduce the concept of the symbolic correlation integral SC, a symbolic counterpart of the correlation integral that is extensively used in many scientific fields. The new correlation integral SC avoids the noisy parameter 𝜀 of the classical correlation integral, defined by Grassberger and Procaccia (1983) and extensively used for constructing correlation-integral-based statistics, as in the BDS test. Once the free parameter 𝜀 disappears, it is possible to construct a powerful nonparametric test for independence that can also be used as a diagnostic tool for model selection. The symbolic correlation integral is also extended to deal with multivariate models, and a test for causality is proposed as an example of the theoretical power of the new concept. With extensive Monte Carlo simulations, the paper shows the good size and power performance of symbolic correlation-integral-based tests.
Journal: Econometric Reviews
Pages: 533-556
Issue: 5
Volume: 38
Year: 2019
Month: 5
X-DOI: 10.1080/07474938.2017.1365431
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1365431
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:5:p:533-556
Template-Type: ReDIF-Article 1.0
Author-Name: Joachim Freyberger
Author-X-Name-First: Joachim
Author-X-Name-Last: Freyberger
Author-Name: Matthew A. Masten
Author-X-Name-First: Matthew A.
Author-X-Name-Last: Masten
Title: A practical guide to compact infinite dimensional parameter spaces
Abstract:
Compactness is a widely used assumption in econometrics. In this article, we gather and review general compactness results for many commonly used parameter spaces in nonparametric estimation, and we provide several new results. We consider three kinds of functions: (1) functions with bounded domains which satisfy standard norm bounds, (2) functions with bounded domains which do not satisfy standard norm bounds, and (3) functions with unbounded domains. In all three cases, we provide two kinds of results, compact embedding and closedness, which together allow one to show that parameter spaces defined by a ||·||s-norm bound are compact under a norm ||·||c. We illustrate how the choice of norms affects the parameter space, the strength of the conclusions, as well as other regularity conditions in two common settings: nonparametric mean regression and nonparametric instrumental variables estimation.
Journal: Econometric Reviews
Pages: 979-1006
Issue: 9
Volume: 38
Year: 2019
Month: 10
X-DOI: 10.1080/07474938.2018.1514025
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1514025
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:9:p:979-1006
Template-Type: ReDIF-Article 1.0
Author-Name: Audronė Virbickaitė
Author-X-Name-First: Audronė
Author-X-Name-Last: Virbickaitė
Author-Name: Hedibert F. Lopes
Author-X-Name-First: Hedibert F.
Author-X-Name-Last: Lopes
Author-Name: M. Concepción Ausín
Author-X-Name-First: M.
Author-X-Name-Last: Concepción Ausín
Author-Name: Pedro Galeano
Author-X-Name-First: Pedro
Author-X-Name-Last: Galeano
Title: Particle learning for Bayesian semi-parametric stochastic volatility model
Abstract:
This article designs a Sequential Monte Carlo (SMC) algorithm for the estimation of a Bayesian semi-parametric stochastic volatility model for financial data. In particular, it makes use of one of the most recent particle filters called Particle Learning (PL). SMC methods are especially well suited for state-space models and can be seen as a cost-efficient alternative to Markov Chain Monte Carlo (MCMC), since they allow for online-type inference. The posterior distributions are updated as new data is observed, which is exceedingly costly using MCMC. Also, PL allows for consistent online model comparison using sequential predictive log Bayes factors. A simulated dataset is used in order to compare the posterior outputs for the PL and MCMC schemes, which are shown to be almost identical. Finally, a short real data application is included.
Journal: Econometric Reviews
Pages: 1007-1023
Issue: 9
Volume: 38
Year: 2019
Month: 10
X-DOI: 10.1080/07474938.2018.1514022
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1514022
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:9:p:1007-1023
Template-Type: ReDIF-Article 1.0
Author-Name: Artem Prokhorov
Author-X-Name-First: Artem
Author-X-Name-Last: Prokhorov
Author-Name: Ulf Schepsmeier
Author-X-Name-First: Ulf
Author-X-Name-Last: Schepsmeier
Author-Name: Yajing Zhu
Author-X-Name-First: Yajing
Author-X-Name-Last: Zhu
Title: Generalized information matrix tests for copulas
Abstract:
We propose a family of goodness-of-fit tests for copulas. The tests use generalizations of the information matrix (IM) equality of White and so relate to the copula test proposed by Huang and Prokhorov. The idea is that eigenspectrum-based statements of the IM equality reduce the degrees of freedom of the test’s asymptotic distribution and lead to better size-power properties, even in high dimensions. The gains are especially pronounced for vine copulas, where additional benefits come from simplifications of score functions and the Hessian. We derive the asymptotic distribution of the generalized tests, accounting for the nonparametric estimation of the marginals and apply a parametric bootstrap procedure, valid when asymptotic critical values are inaccurate. In Monte Carlo simulations, we study the behavior of the new tests, compare them with several Cramer–von Mises type tests and confirm the desired properties of the new tests in high dimensions.
Journal: Econometric Reviews
Pages: 1024-1054
Issue: 9
Volume: 38
Year: 2019
Month: 10
X-DOI: 10.1080/07474938.2018.1514023
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1514023
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:9:p:1024-1054
Template-Type: ReDIF-Article 1.0
Author-Name: Kazuhiko Hayakawa
Author-X-Name-First: Kazuhiko
Author-X-Name-Last: Hayakawa
Author-Name: Meng Qi
Author-X-Name-First: Meng
Author-X-Name-Last: Qi
Author-Name: Jörg Breitung
Author-X-Name-First: Jörg
Author-X-Name-Last: Breitung
Title: Double filter instrumental variable estimation of panel data models with weakly exogenous variables
Abstract:
In this article, we propose instrumental variables (IV) and generalized method of moments (GMM) estimators for panel data models with weakly exogenous variables. The model is allowed to include heterogeneous time trends besides the standard fixed effects (FE). The proposed IV and GMM estimators are obtained by applying a forward filter to the model and a backward filter to the instruments in order to remove FE, and are hence called the double filter IV and GMM estimators. We derive the asymptotic properties of the proposed estimators under fixed T and large N, and large T and large N asymptotics, where N and T denote the dimensions of cross section and time series, respectively. It is shown that the proposed IV estimator has the same asymptotic distribution as the bias corrected FE estimator when both N and T are large. Monte Carlo simulation results reveal that the proposed estimator performs well in finite samples and outperforms the conventional IV/GMM estimators using instruments in levels in many cases.
Journal: Econometric Reviews
Pages: 1055-1088
Issue: 9
Volume: 38
Year: 2019
Month: 10
X-DOI: 10.1080/07474938.2018.1514024
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1514024
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:9:p:1055-1088
Template-Type: ReDIF-Article 1.0
Author-Name: Stephan Smeekes
Author-X-Name-First: Stephan
Author-X-Name-Last: Smeekes
Author-Name: Joakim Westerlund
Author-X-Name-First: Joakim
Author-X-Name-Last: Westerlund
Title: Robust block bootstrap panel predictability tests
Abstract:
This article develops two block bootstrap-based panel predictability test procedures that are valid under very general conditions. Some of the allowable features include cross-sectional dependence, heterogeneous predictive slopes, persistent predictors, and complex error dynamics, including cross-unit endogeneity. While the first test procedure tests if there is any predictability at all, the second procedure determines the units for which predictability holds in case of a rejection by the first. A weak unit root framework is adopted to allow persistent predictors, and a novel theory is developed to establish asymptotic validity of the proposed bootstrap. Simulations are used to evaluate the performance of our tests in small samples, and their implementation is illustrated through an empirical application to stock returns.
Journal: Econometric Reviews
Pages: 1089-1107
Issue: 9
Volume: 38
Year: 2019
Month: 10
X-DOI: 10.1080/07474938.2018.1536102
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1536102
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:9:p:1089-1107
Template-Type: ReDIF-Article 1.0
Author-Name: In Choi
Author-X-Name-First: In
Author-X-Name-Last: Choi
Author-Name: Hanbat Jeong
Author-X-Name-First: Hanbat
Author-X-Name-Last: Jeong
Title: Model selection for factor analysis: Some new criteria and performance comparisons
Abstract:
This paper derives the Akaike information criterion (AIC), corrected AIC, the Bayesian information criterion (BIC) and Hannan and Quinn’s information criterion for approximate factor models assuming a large number of cross-sectional observations and studies the consistency properties of these information criteria. It also reports extensive simulation results comparing the performance of the extant and new procedures for the selection of the number of factors. The simulation results show the difficulty of determining which criterion performs best. In practice, it is advisable to consider several criteria at the same time, especially Hannan and Quinn’s information criterion, Bai and Ng’s ICp2 and BIC3, and Onatski’s and Ahn and Horenstein’s eigenvalue-based criteria. The model-selection criteria considered in this paper are also applied to Stock and Watson’s two macroeconomic data sets. The results differ considerably depending on the model-selection criterion in use, but evidence suggesting five factors for the first data set and five to seven factors for the second data set is obtainable.
Journal: Econometric Reviews
Pages: 577-596
Issue: 6
Volume: 38
Year: 2019
Month: 7
X-DOI: 10.1080/07474938.2017.1382763
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1382763
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:6:p:577-596
Template-Type: ReDIF-Article 1.0
Author-Name: Tingting Cheng
Author-X-Name-First: Tingting
Author-X-Name-Last: Cheng
Title: Functional coefficient time series models with trending regressors
Abstract:
This paper studies a functional coefficient time series model with trending regressors, where the coefficients are unknown functions of time and random variables. We propose a local linear estimation method to estimate the unknown coefficient functions, and establish the corresponding asymptotic theory under mild conditions. We also develop a test procedure to see if the functional coefficients take particular parametric forms. For practical use, we further propose a Bayesian approach to select the bandwidths, and conduct several numerical experiments to examine the finite sample performance of our proposed local linear estimator and the test procedure. The results show that the local linear estimator works well and the proposed test has satisfactory size and power. In addition, our simulation studies show that the Bayesian bandwidth selection method performs better than the cross-validation method. Furthermore, we use the functional coefficient model to study the relationship between consumption per capita and income per capita in the United States, and it is shown that the functional coefficient model with our proposed local linear estimator and Bayesian bandwidth selection method performs well in both in-sample fitting and out-of-sample forecasting.
Journal: Econometric Reviews
Pages: 636-659
Issue: 6
Volume: 38
Year: 2019
Month: 7
X-DOI: 10.1080/07474938.2017.1382774
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1382774
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:6:p:636-659
Template-Type: ReDIF-Article 1.0
Author-Name: Jan Mutl
Author-X-Name-First: Jan
Author-X-Name-Last: Mutl
Author-Name: Leopold Sögner
Author-X-Name-First: Leopold
Author-X-Name-Last: Sögner
Title: Parameter estimation and inference with spatial lags and cointegration
Abstract:
This article studies dynamic panel data models in which the long run outcome for a particular cross-section is affected by a weighted average of the outcomes in the other cross-sections. We show that imposing such a structure implies a model with several cointegrating relationships that, unlike in the standard case, are nonlinear in the coefficients to be estimated. Assuming that the weights are exogenously given, we extend the dynamic ordinary least squares methodology and provide a dynamic two-stage least squares estimator. We derive the large sample properties of our proposed estimator under a set of low-level assumptions. Then our methodology is applied to US financial market data, which consist of credit default swap spreads, as well as firm-specific and industry data. We construct the economic space using a “closeness” measure for firms based on input–output matrices. Our estimates show that this particular form of spatial correlation of credit default swap spreads is substantial and highly significant.
Journal: Econometric Reviews
Pages: 597-635
Issue: 6
Volume: 38
Year: 2019
Month: 7
X-DOI: 10.1080/07474938.2017.1382803
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1382803
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:6:p:597-635
Template-Type: ReDIF-Article 1.0
Author-Name: Andreea G. Halunga
Author-X-Name-First: Andreea G.
Author-X-Name-Last: Halunga
Author-Name: Christos S. Savva
Author-X-Name-First: Christos S.
Author-X-Name-Last: Savva
Title: Neglecting structural breaks when estimating and valuing dynamic correlations for asset allocation
Abstract:
This paper assesses the econometric and economic value consequences of neglecting structural breaks in dynamic correlation models in the context of an asset allocation framework. It is shown that changes in the parameters of the conditional correlation process can lead to biased estimates of persistence. Monte Carlo simulations reveal that short-run persistence is downward biased while long-run persistence is severely upward biased, leading to spurious high persistence of shocks to conditional correlation. An application to stock returns supports these results and concludes that neglecting such structural shifts could lead to misleading decisions on portfolio diversification, hedging, and risk management.
Journal: Econometric Reviews
Pages: 660-678
Issue: 6
Volume: 38
Year: 2019
Month: 7
X-DOI: 10.1080/07474938.2017.1411431
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1411431
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:6:p:660-678
Template-Type: ReDIF-Article 1.0
Author-Name: Katerina Aristodemou
Author-X-Name-First: Katerina
Author-X-Name-Last: Aristodemou
Author-Name: Jian He
Author-X-Name-First: Jian
Author-X-Name-Last: He
Author-Name: Keming Yu
Author-X-Name-First: Keming
Author-X-Name-Last: Yu
Title: Binary quantile regression and variable selection: A new approach
Abstract:
In this paper, we propose a new estimation method for binary quantile regression and variable selection which can be implemented by an iteratively reweighted least squares approach. In contrast to existing approaches, this method is computationally simple, guaranteed to converge to a unique solution, and can be implemented with standard software packages. We demonstrate our methods using Monte Carlo experiments and then apply the proposed method to the widely used work trip mode choice dataset. The results indicate that the proposed estimators work well in finite samples.
Journal: Econometric Reviews
Pages: 679-694
Issue: 6
Volume: 38
Year: 2019
Month: 7
X-DOI: 10.1080/07474938.2017.1417701
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1417701
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:6:p:679-694
Template-Type: ReDIF-Article 1.0
Author-Name: Christian Schluter
Author-X-Name-First: Christian
Author-X-Name-Last: Schluter
Author-Name: Mark Trede
Author-X-Name-First: Mark
Author-X-Name-Last: Trede
Title: Size distributions reconsidered
Abstract:
We consider tests of the hypothesis that the tail of size distributions decays faster than any power function. These are based on a single parameter that emerges from the Fisher–Tippett limit theorem, and discriminate between leading laws considered in the literature without requiring fully parametric models/specifications. We study the proposed tests taking into account the higher order regular variation of the size distribution that can lead to catastrophic distortions. The theoretical bias corrections realign successfully nominal and empirical test behavior, and inform a sensitivity analysis for practical work. The methods are used in an examination of the size distribution of cities and firms.
Journal: Econometric Reviews
Pages: 695-710
Issue: 6
Volume: 38
Year: 2019
Month: 7
X-DOI: 10.1080/07474938.2017.1417732
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1417732
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:6:p:695-710
Template-Type: ReDIF-Article 1.0
Author-Name: Francesco Bartolucci
Author-X-Name-First: Francesco
Author-X-Name-Last: Bartolucci
Author-Name: Valentina Nigro
Author-X-Name-First: Valentina
Author-X-Name-Last: Nigro
Author-Name: Claudia Pigini
Author-X-Name-First: Claudia
Author-X-Name-Last: Pigini
Title: Testing for state dependence in binary panel data with individual covariates by a modified quadratic exponential model
Abstract:
We propose a test for state dependence in binary panel data with individual covariates. For this aim, we rely on a quadratic exponential model in which the association between the response variables is accounted for in a different way with respect to more standard formulations. The level of association is measured by a single parameter that may be estimated by a Conditional Maximum Likelihood (CML) approach. Under the dynamic logit model, the conditional estimator of this parameter converges to zero when the hypothesis of absence of state dependence is true. Therefore, it is possible to implement a t-test for this hypothesis which may be very simply performed and attains the nominal significance level under several structures of the individual covariates. Through an extensive simulation study, we find that our test has good finite sample properties and is more robust to the presence of (autocorrelated) covariates in the model specification in comparison with other existing testing procedures for state dependence. The proposed approach is illustrated by two empirical applications: the first is based on data coming from the Panel Study of Income Dynamics and concerns employment and fertility; the second is based on the Health and Retirement Study and concerns the self-reported health status.
Journal: Econometric Reviews
Pages: 61-88
Issue: 1
Volume: 37
Year: 2018
Month: 1
X-DOI: 10.1080/07474938.2015.1060039
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1060039
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:1:p:61-88
Template-Type: ReDIF-Article 1.0
Author-Name: Saburo Ohno
Author-X-Name-First: Saburo
Author-X-Name-Last: Ohno
Author-Name: Tomohiro Ando
Author-X-Name-First: Tomohiro
Author-X-Name-Last: Ando
Title: Stock return predictability: A factor-augmented predictive regression system with shrinkage method
Abstract:
To predict stock market behaviors, we use a factor-augmented predictive regression with shrinkage to incorporate the information available across literally thousands of financial and economic variables. The system is constructed in terms of both expected returns and the tails of the return distribution. We develop the variable selection consistency and asymptotic normality of the estimator. To select the regularization parameter, we employ the prediction error, with the aim of predicting the behavior of the stock market. Through analysis of the Tokyo Stock Exchange, we find that a large number of variables provide useful information for predicting stock market behaviors.
Journal: Econometric Reviews
Pages: 29-60
Issue: 1
Volume: 37
Year: 2018
Month: 1
X-DOI: 10.1080/07474938.2014.977086
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977086
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:1:p:29-60
Template-Type: ReDIF-Article 1.0
Author-Name: Eric S. Lin
Author-X-Name-First: Eric S.
Author-X-Name-Last: Lin
Author-Name: Ta-Sheng Chou
Author-X-Name-First: Ta-Sheng
Author-X-Name-Last: Chou
Title: Finite-sample refinement of GMM approach to nonlinear models under heteroskedasticity of unknown form
Abstract:
It is quite common to observe heteroskedasticity in real data, in particular, cross-sectional or micro data. Previous studies concentrate on improving the finite-sample properties of tests under heteroskedasticity of unknown forms in linear models. The advantage of a heteroskedasticity consistent covariance matrix estimator (HCCME)-type small-sample improvement for linear models does not carry over to the nonlinear model specifications since there is no obvious counterpart for the diagonal element of the projection matrix in linear models, which is crucial for implementing the finite-sample refinement. Within the framework of nonlinear models, we develop a straightforward approach by extending the applicability of HCCME-type corrections to the two-step GMM method. The Monte Carlo experiments show that the proposed method not only refines the testing procedure in terms of the error of rejection probability, but also improves the coefficient estimation based on the mean squared error (MSE) and the mean absolute error (MAE). The estimation of a constant elasticity of substitution (CES)-type production function is also provided to illustrate how to implement the proposed method empirically.
Journal: Econometric Reviews
Pages: 1-28
Issue: 1
Volume: 37
Year: 2018
Month: 1
X-DOI: 10.1080/07474938.2014.999499
File-URL: http://hdl.handle.net/10.1080/07474938.2014.999499
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:1:p:1-28
Template-Type: ReDIF-Article 1.0
Author-Name: Tingting Cheng
Author-X-Name-First: Tingting
Author-X-Name-Last: Cheng
Author-Name: Jiti Gao
Author-X-Name-First: Jiti
Author-X-Name-Last: Gao
Author-Name: Xibin Zhang
Author-X-Name-First: Xibin
Author-X-Name-Last: Zhang
Title: Nonparametric localized bandwidth selection for Kernel density estimation
Abstract:
As conventional cross-validation bandwidth selection methods do not work properly in the situation where the data are serially dependent time series, alternative bandwidth selection methods are necessary. In recent years, Bayesian-based methods for global bandwidth selection have been studied. Our experience shows that a global bandwidth is however less suitable than a localized bandwidth in kernel density estimation based on serially dependent time series data. Nonetheless, a difficult issue is how we can consistently estimate a localized bandwidth. This paper presents a nonparametric localized bandwidth estimator, for which we establish a completely new asymptotic theory. Applications of this new bandwidth estimator to the kernel density estimation of the Eurodollar deposit rate and the S&P 500 daily return demonstrate the effectiveness and competitiveness of the proposed localized bandwidth.
Journal: Econometric Reviews
Pages: 733-762
Issue: 7
Volume: 38
Year: 2019
Month: 8
X-DOI: 10.1080/07474938.2017.1397835
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1397835
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:7:p:733-762
Template-Type: ReDIF-Article 1.0
Author-Name: Marie Bessec
Author-X-Name-First: Marie
Author-X-Name-Last: Bessec
Title: Revisiting the transitional dynamics of business cycle phases with mixed-frequency data
Abstract:
This paper introduces a Markov-switching model in which transition probabilities depend on higher frequency indicators and their lags through polynomial weighting schemes. The MSV-MIDAS model is estimated through maximum likelihood (ML) methods with a slightly modified version of Hamilton’s filter. Monte Carlo simulations show that ML provides accurate estimates, but they suggest some caution in interpreting the tests of the parameters in the transition probabilities. We apply this new model to forecast business cycle turning points in the United States. We properly detect recessions by exploiting the link between GDP growth and higher frequency variables from financial and energy markets.
Journal: Econometric Reviews
Pages: 711-732
Issue: 7
Volume: 38
Year: 2019
Month: 8
X-DOI: 10.1080/07474938.2017.1397837
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1397837
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:7:p:711-732
Template-Type: ReDIF-Article 1.0
Author-Name: Jan Lohmeyer
Author-X-Name-First: Jan
Author-X-Name-Last: Lohmeyer
Author-Name: Franz Palm
Author-X-Name-First: Franz
Author-X-Name-Last: Palm
Author-Name: Hanno Reuvers
Author-X-Name-First: Hanno
Author-X-Name-Last: Reuvers
Author-Name: Jean-Pierre Urbain
Author-X-Name-First: Jean-Pierre
Author-X-Name-Last: Urbain
Title: Focused information criterion for locally misspecified vector autoregressive models
Abstract:
This paper investigates the focused information criterion and plug-in average for vector autoregressive models with local-to-zero misspecification. These methods have the advantage of focusing on a quantity of interest rather than aiming at overall model fit. Any (sufficiently regular) function of the parameters can be used as a quantity of interest. We determine the asymptotic properties and elaborate on the role of the locally misspecified parameters. In particular, we show that the inability to consistently estimate locally misspecified parameters translates into suboptimal selection and averaging. We apply this framework to impulse response analysis. A Monte Carlo simulation study supports our claims.
Journal: Econometric Reviews
Pages: 763-792
Issue: 7
Volume: 38
Year: 2019
Month: 8
X-DOI: 10.1080/07474938.2017.1409410
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1409410
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:7:p:763-792
Template-Type: ReDIF-Article 1.0
Author-Name: Uwe Hassler
Author-X-Name-First: Uwe
Author-X-Name-Last: Hassler
Author-Name: Mehdi Hosseinkouchack
Author-X-Name-First: Mehdi
Author-X-Name-Last: Hosseinkouchack
Title: Ratio tests under limiting normality
Abstract:
We propose a class of ratio tests that is applicable whenever a cumulation of (transformed) data is asymptotically normal upon appropriate normalization. The Karhunen–Loève theorem is employed to compute weighted averages. The test statistics are ratios of quadratic forms of these averages and hence scale-invariant, also called self-normalizing: the scaling parameter cancels asymptotically. Limiting distributions are obtained. Critical values and asymptotic local power functions can be calculated by standard numerical means. The ratio tests are directed against local alternatives and turn out to be almost as powerful as optimal competitors, without being plagued by nuisance parameters. Also in finite samples they perform well relative to self-normalizing competitors.
Journal: Econometric Reviews
Pages: 793-813
Issue: 7
Volume: 38
Year: 2019
Month: 8
X-DOI: 10.1080/07474938.2018.1427296
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1427296
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:7:p:793-813
Template-Type: ReDIF-Article 1.0
Author-Name: Maurice J. G. Bun
Author-X-Name-First: Maurice J. G.
Author-X-Name-Last: Bun
Author-Name: Teresa D. Harrison
Author-X-Name-First: Teresa D.
Author-X-Name-Last: Harrison
Title: OLS and IV estimation of regression models including endogenous interaction terms
Abstract:
We analyze a class of linear regression models including interactions of endogenous regressors and exogenous covariates. We show how to generate instrumental variables using the nonlinear functional form of the structural equation when traditional excluded instruments are unknown. We propose to use these instruments with identification robust IV inference. We furthermore show that, whenever functional form identification is not valid, the ordinary least squares (OLS) estimator of the coefficient of the interaction term is consistent and standard OLS inference applies. Using our alternative empirical methods we confirm recent empirical findings on the nonlinear causal relation between financial development and economic growth.
Journal: Econometric Reviews
Pages: 814-827
Issue: 7
Volume: 38
Year: 2019
Month: 8
X-DOI: 10.1080/07474938.2018.1427486
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1427486
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:7:p:814-827
Template-Type: ReDIF-Article 1.0
Author-Name: Jaromír Antoch
Author-X-Name-First: Jaromír
Author-X-Name-Last: Antoch
Author-Name: Jan Hanousek
Author-X-Name-First: Jan
Author-X-Name-Last: Hanousek
Author-Name: Lajos Horváth
Author-X-Name-First: Lajos
Author-X-Name-Last: Horváth
Author-Name: Marie Hušková
Author-X-Name-First: Marie
Author-X-Name-Last: Hušková
Author-Name: Shixuan Wang
Author-X-Name-First: Shixuan
Author-X-Name-Last: Wang
Title: Structural breaks in panel data: Large number of panels and short length time series
Abstract:
The detection of (structural) breaks, the so-called change-point problem, has drawn increasing attention from the theoretical, applied economic and financial fields. Much of the existing research concentrates on the detection of change points and asymptotic properties of their estimators in panels when N, the number of panels, as well as T, the number of observations in each panel, are large. In this paper we pursue a different approach, i.e., we consider the asymptotic properties when N→∞ while keeping T fixed. This situation is typically related to large (firm-level) data containing financial information about an immense number of firms/stocks across a limited number of years/quarters/months. We propose a general approach for testing for break(s) in this setup. In particular, we obtain the asymptotic behavior of test statistics. We also propose a wild bootstrap procedure that could be used to generate the critical values of the test statistics. The theoretical approach is supplemented by numerous simulations and by an empirical illustration. We demonstrate that the testing procedure works well in the framework of the four-factor CAPM. In particular, we estimate the breaks in the monthly returns of US mutual funds during the period January 2006 to February 2010, which covers the subprime crisis.
Journal: Econometric Reviews
Pages: 828-855
Issue: 7
Volume: 38
Year: 2019
Month: 8
X-DOI: 10.1080/07474938.2018.1454378
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1454378
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:7:p:828-855
Template-Type: ReDIF-Article 1.0
Author-Name: Wasel Shadat
Author-X-Name-First: Wasel
Author-X-Name-Last: Shadat
Author-Name: Chris Orme
Author-X-Name-First: Chris
Author-X-Name-Last: Orme
Title: Robust parametric tests of constant conditional correlation in a MGARCH model
Abstract:
This article provides a rigorous asymptotic treatment of new and existing asymptotically valid conditional moment (CM) testing procedures of the constant conditional correlation (CCC) assumption in a multivariate GARCH model. Full and partial quasi maximum likelihood estimation (QMLE) frameworks are considered, as is the robustness of these tests to non-normality. In particular, the asymptotic validity of the LM procedure proposed by Tse (2000) is analyzed, and new asymptotically robust versions of this test are proposed for both estimation frameworks. A Monte Carlo study suggests that a robust Tse test procedure exhibits good size and power properties, unlike the original variant which exhibits size distortion under non-normality.
Journal: Econometric Reviews
Pages: 551-576
Issue: 6
Volume: 37
Year: 2018
Month: 7
X-DOI: 10.1080/07474938.2015.1122120
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1122120
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:6:p:551-576
Template-Type: ReDIF-Article 1.0
Author-Name: Seong Yeon Chang
Author-X-Name-First: Seong Yeon
Author-X-Name-Last: Chang
Author-Name: Pierre Perron
Author-X-Name-First: Pierre
Author-X-Name-Last: Perron
Title: A comparison of alternative methods to construct confidence intervals for the estimate of a break date in linear regression models
Abstract:
This article considers constructing confidence intervals for the date of a structural break in linear regression models. Using extensive simulations, we compare the performance of various procedures in terms of exact coverage rates and lengths of the confidence intervals. These include the procedures of Bai (1997) based on the asymptotic distribution under a shrinking shift framework, Elliott and Müller (2007) based on inverting a test locally invariant to the magnitude of break, Eo and Morley (2015) based on inverting a likelihood ratio test, and various bootstrap procedures. On the basis of achieving an exact coverage rate that is closest to the nominal level, Elliott and Müller's (2007) approach is by far the best one. However, this comes with a very high cost in terms of the length of the confidence intervals. When the errors are serially correlated and dealing with a change in intercept or a change in the coefficient of a stationary regressor with a high signal-to-noise ratio, the length of the confidence interval increases and approaches the whole sample as the magnitude of the change increases. The same problem occurs in models with a lagged dependent variable, a common case in practice. This drawback is not present for the other methods, which have similar properties. Theoretical results are provided to explain the drawbacks of Elliott and Müller's (2007) method.
Journal: Econometric Reviews
Pages: 577-601
Issue: 6
Volume: 37
Year: 2018
Month: 7
X-DOI: 10.1080/07474938.2015.1122142
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1122142
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:6:p:577-601
Template-Type: ReDIF-Article 1.0
Author-Name: Liangjun Su
Author-X-Name-First: Liangjun
Author-X-Name-Last: Su
Author-Name: Zhenlin Yang
Author-X-Name-First: Zhenlin
Author-X-Name-Last: Yang
Title: Asymptotics and bootstrap for random-effects panel data transformation models
Abstract:
This article investigates the asymptotic properties of quasi-maximum likelihood (QML) estimators for random-effects panel data transformation models where both the response and (some of) the covariates are subject to transformations for inducing normality, flexible functional form, homoskedasticity, and simple model structure. We develop a QML-type procedure for model estimation and inference. We prove the consistency and asymptotic normality of the QML estimators, and propose a simple bootstrap procedure that leads to a robust estimate of the variance-covariance (VC) matrix. Monte Carlo results reveal that the QML estimators perform well in finite samples, and that the gains by using the robust VC matrix estimate for inference can be enormous.
Journal: Econometric Reviews
Pages: 602-625
Issue: 6
Volume: 37
Year: 2018
Month: 7
X-DOI: 10.1080/07474938.2015.1122235
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1122235
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:6:p:602-625
Template-Type: ReDIF-Article 1.0
Author-Name: Renée Fry-McKibbin
Author-X-Name-First: Renée
Author-X-Name-Last: Fry-McKibbin
Author-Name: Cody Yu-Ling Hsiao
Author-X-Name-First: Cody Yu-Ling
Author-X-Name-Last: Hsiao
Title: Extremal dependence tests for contagion
Abstract:
A new test for financial market contagion based on changes in extremal dependence defined as co-kurtosis and co-volatility is developed to identify the propagation mechanism of shocks across international financial markets. The proposed approach captures changes in various aspects of the asset return relationships such as cross-market mean and skewness (co-kurtosis) as well as cross-market volatilities (co-volatility). Monte Carlo experiments show that the tests perform well except when crisis periods are short in duration. Small crisis sample critical values are calculated for use in this case. In an empirical application involving the global financial crisis of 2008–2009, the results show that significant contagion effects are widespread from the US banking sector to global equity markets and banking sectors through either the co-kurtosis or the co-volatility channels, reinforcing that higher order moments matter during crises.
Journal: Econometric Reviews
Pages: 626-649
Issue: 6
Volume: 37
Year: 2018
Month: 7
X-DOI: 10.1080/07474938.2015.1122270
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1122270
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:6:p:626-649
Template-Type: ReDIF-Article 1.0
Author-Name: Artūras Juodis
Author-X-Name-First: Artūras
Author-X-Name-Last: Juodis
Title: First difference transformation in panel VAR models: Robustness, estimation, and inference
Abstract:
This article considers estimation of Panel Vector Autoregressive Models of order 1 (PVAR(1)) with a focus on fixed-T consistent estimation methods in First Differences (FD) with additional strictly exogenous regressors. Additional results for the panel FD ordinary least squares (OLS) estimator and the FDLS-type estimator of Han and Phillips (2010) are provided. Furthermore, we simplify the analysis of Binder et al. (2005) by providing additional analytical results and extend the original model by taking into account possible cross-sectional heteroscedasticity and the presence of strictly exogenous regressors. We show that in the three-wave panel the log-likelihood function of the unrestricted Transformed Maximum Likelihood (TML) estimator might violate the global identification assumption. The finite-sample performance of the analyzed methods is investigated in a Monte Carlo study.
Journal: Econometric Reviews
Pages: 650-693
Issue: 6
Volume: 37
Year: 2018
Month: 7
X-DOI: 10.1080/07474938.2016.1139559
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1139559
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:6:p:650-693
Template-Type: ReDIF-Article 1.0
Author-Name: Qichang Xie
Author-X-Name-First: Qichang
Author-X-Name-Last: Xie
Author-Name: Qiankun Sun
Author-X-Name-First: Qiankun
Author-X-Name-Last: Sun
Author-Name: Junxian Liu
Author-X-Name-First: Junxian
Author-X-Name-Last: Liu
Title: Local weighted composite quantile estimation and smoothing parameter selection for nonparametric derivative function
Abstract:
Estimating derivatives is of primary interest as it quantitatively measures the rate of change of the relationship between response and explanatory variables. We propose a local weighted composite quantile method to estimate the gradient of an unknown regression function. Because of the use of weights, a data-driven weighting scheme is discussed for maximizing the asymptotic efficiency of the estimators. We derive the leading bias, variance, and asymptotic normality of the proposed estimator. The asymptotic relative efficiency is investigated and reveals that the new approach provides a highly efficient alternative to the local least squares, local quantile regression, and local composite quantile regression methods. In addition, a fully automatic bandwidth selection method is considered and shown to deliver a bandwidth with the oracle property, meaning that it is asymptotically equivalent to the optimal bandwidth that would be used if the true gradient were known. Simulations are conducted to compare the different estimators, and an example is used to illustrate their performance. Both the simulation and empirical results are consistent with our theoretical findings.
Journal: Econometric Reviews
Pages: 215-233
Issue: 3
Volume: 39
Year: 2020
Month: 3
X-DOI: 10.1080/07474938.2019.1580947
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1580947
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:3:p:215-233
Template-Type: ReDIF-Article 1.0
Author-Name: Halvor Mehlum
Author-X-Name-First: Halvor
Author-X-Name-Last: Mehlum
Title: The polar confidence curve for a ratio
Abstract:
Based on Fieller’s method for the estimation of a confidence set for a ratio, I construct a polar plot of the test statistics for all angles associated with the ratio. This polar confidence plot clarifies and systematizes the inherent properties of the confidence set for ratios and, in particular, determines how the confidence set may be uninformative or disconnected.
Journal: Econometric Reviews
Pages: 234-243
Issue: 3
Volume: 39
Year: 2020
Month: 3
X-DOI: 10.1080/07474938.2019.1580951
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1580951
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:3:p:234-243
Template-Type: ReDIF-Article 1.0
Author-Name: Rasmus Søndergaard Pedersen
Author-X-Name-First: Rasmus Søndergaard
Author-X-Name-Last: Pedersen
Title: Robust inference in conditionally heteroskedastic autoregressions
Abstract:
We consider robust inference for an autoregressive parameter in a stationary linear autoregressive model with GARCH innovations. As the innovations exhibit GARCH, they are by construction heavy-tailed with some tail index κ. This implies that the rate of convergence as well as the limiting distribution of the least squares estimator depend on κ. In the spirit of Ibragimov and Müller (“t-statistic based correlation and heterogeneity robust inference”, Journal of Business & Economic Statistics, 2010, vol. 28, pp. 453–468), we consider testing a hypothesis about a parameter based on a Student’s t-statistic based on least squares estimates for a fixed number of groups of the original sample. The merit of this approach is that no knowledge about the value of κ nor about the rate of convergence and the limiting distribution of the least squares estimator is required. We verify that the two-sided t-test is asymptotically a level α test whenever α≤5% for any κ≥2, which includes cases where the innovations have infinite variance. A simulation experiment suggests that the finite-sample properties of the test are quite good.
Journal: Econometric Reviews
Pages: 244-259
Issue: 3
Volume: 39
Year: 2020
Month: 3
X-DOI: 10.1080/07474938.2019.1580950
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1580950
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:3:p:244-259
Template-Type: ReDIF-Article 1.0
Author-Name: N. R. Ramírez-Rondán
Author-X-Name-First: N. R.
Author-X-Name-Last: Ramírez-Rondán
Title: Maximum likelihood estimation of dynamic panel threshold models
Abstract:
Threshold estimation methods are developed for dynamic panels with individual-specific fixed effects covering short time periods. Maximum likelihood estimation of the threshold and slope parameters is proposed using first difference transformations. The threshold estimate is shown to be consistent, and its asymptotic distribution is nonstandard when the number of individuals grows to infinity for a fixed time period; the slope estimates are consistent and asymptotically normally distributed. The method is applied to a sample of 74 countries and 11 periods of 5-year averages to determine the effect of the inflation rate on long-run economic growth.
Journal: Econometric Reviews
Pages: 260-276
Issue: 3
Volume: 39
Year: 2020
Month: 3
X-DOI: 10.1080/07474938.2019.1624401
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1624401
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:3:p:260-276
Template-Type: ReDIF-Article 1.0
Author-Name: Alexandra Soberon
Author-X-Name-First: Alexandra
Author-X-Name-Last: Soberon
Author-Name: Winfried Stute
Author-X-Name-First: Winfried
Author-X-Name-Last: Stute
Author-Name: Juan M. Rodriguez-Poo
Author-X-Name-First: Juan M.
Author-X-Name-Last: Rodriguez-Poo
Title: Testing for distributional features in varying coefficient panel data models
Abstract:
This article provides several tests for skewness and kurtosis for the error terms in a one-way fixed-effects varying coefficient panel data model. To obtain these tests, estimators of higher-order moments of both error components are obtained as solutions of estimating equations. Additionally, to obtain the nonparametric residuals, a local constant estimator based on a pairwise differencing transformation is proposed. The asymptotic properties of these estimators and tests are established. The proposed estimators and test statistics are augmented by simulation studies, and they are also illustrated in an empirical analysis regarding the technical efficiency of European Union companies.
Journal: Econometric Reviews
Pages: 277-298
Issue: 3
Volume: 39
Year: 2020
Month: 3
X-DOI: 10.1080/07474938.2019.1624403
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1624403
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:3:p:277-298
Template-Type: ReDIF-Article 1.0
Author-Name: Yundong Tu
Author-X-Name-First: Yundong
Author-X-Name-Last: Tu
Author-Name: Ying Wang
Author-X-Name-First: Ying
Author-X-Name-Last: Wang
Title: Adaptive estimation of heteroskedastic functional-coefficient regressions with an application to fiscal policy evaluation on asset markets
Abstract:
This article studies the adaptive estimation of heteroskedastic functional-coefficient regressions. The motivation for such a theoretical study originates from the empirical analysis of Jansen et al., where the role of fiscal policy on the U.S. asset markets (treasury bonds) is evaluated via the functional-coefficient model. It is found that this model is subject to time-varying heteroskedasticity. As a result, the local least squares (LLS) estimator suffers from efficiency loss. To overcome this problem, we propose an adaptive LLS (ALLS) estimator, which can adapt to heteroskedasticity of unknown form asymptotically. Simulation studies confirm that the ALLS estimator can achieve significant efficiency gains in finite samples compared to the LLS estimator. Real data analysis reveals that the heteroskedastic functional-coefficient model provides an adequate fit and better out-of-sample forecasting.
Journal: Econometric Reviews
Pages: 299-318
Issue: 3
Volume: 39
Year: 2020
Month: 3
X-DOI: 10.1080/07474938.2019.1624402
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1624402
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:3:p:299-318
Template-Type: ReDIF-Article 1.0
Author-Name: Otilia Boldea
Author-X-Name-First: Otilia
Author-X-Name-Last: Boldea
Author-Name: Alastair Hall
Author-X-Name-First: Alastair
Author-X-Name-Last: Hall
Author-Name: Sanggohn Han
Author-X-Name-First: Sanggohn
Author-X-Name-Last: Han
Title: Asymptotic Distribution Theory for Break Point Estimators in Models Estimated via 2SLS
Abstract: In this article, we present a limiting distribution theory for the break point estimator in a linear regression model with multiple structural breaks obtained by minimizing a Two Stage Least Squares (2SLS) objective function. Our analysis covers both the case in which the reduced form for the endogenous regressors is stable and the case in which it is unstable with multiple structural breaks. For stable reduced forms, we present a limiting distribution theory under two different scenarios: in the case where the parameter change is of fixed magnitude, it is shown that the resulting distribution depends on the distribution of the data and is not of much practical use for inference; in the case where the magnitude of the parameter change shrinks with the sample size, it is shown that the resulting distribution can be used to construct approximate large sample confidence intervals for the break points. For unstable reduced forms, we consider the case where the magnitudes of the parameter changes in both the equation of interest and the reduced forms shrink with the sample size at potentially different rates and not necessarily the same locations in the sample. The resulting limiting distribution theory can be used to construct approximate large sample confidence intervals for the break points. Its usefulness is illustrated via an application to the New Keynesian Phillips curve.
Journal: Econometric Reviews
Pages: 1-33
Issue: 1
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607082
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607082
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:1:p:1-33
Template-Type: ReDIF-Article 1.0
Author-Name: Changli He
Author-X-Name-First: Changli
Author-X-Name-Last: He
Author-Name: Rickard Sandberg
Author-X-Name-First: Rickard
Author-X-Name-Last: Sandberg
Title: Testing Parameter Constancy in Unit Root Autoregressive Models Against Multiple Continuous Structural Changes
Abstract: This article considers tests for logistic smooth transition autoregressive (LSTAR) models accommodating multiple time dependent transitions between regimes when the data generating process is a random walk. The asymptotic null distributions of the tests, in contrast to the standard results in Lin and Teräsvirta (1994), are nonstandard. Monte Carlo experiments reveal that the tests have modest size distortions and satisfactory power against LSTAR models with multiple smooth breaks. The tests are applied to Swedish unemployment rates and the hysteresis hypothesis is overturned in favour of an LSTAR model with two transitions between extreme regimes.
Journal: Econometric Reviews
Pages: 34-59
Issue: 1
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607085
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607085
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:1:p:34-59
Template-Type: ReDIF-Article 1.0
Author-Name: George Athanasopoulos
Author-X-Name-First: George
Author-X-Name-Last: Athanasopoulos
Author-Name: D. Poskitt
Author-X-Name-First: D.
Author-X-Name-Last: Poskitt
Author-Name: Farshid Vahid
Author-X-Name-First: Farshid
Author-X-Name-Last: Vahid
Title: Two Canonical VARMA Forms: Scalar Component Models Vis-à-Vis the Echelon Form
Abstract: In this article we study two methodologies which identify and specify canonical form VARMA models. The two methodologies are: (1) an extension of the scalar component methodology which specifies canonical VARMA models by identifying scalar components through canonical correlations analysis; and (2) the Echelon form methodology, which specifies canonical VARMA models through the estimation of Kronecker indices. We compare the actual forms and the methodologies on three levels. Firstly, we present a theoretical comparison. Secondly, we present a Monte Carlo simulation study that compares the performances of the two methodologies in identifying some pre-specified data generating processes. Lastly, we compare the out-of-sample forecast performance of the two forms when models are fitted to real macroeconomic data.
Journal: Econometric Reviews
Pages: 60-83
Issue: 1
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607088
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607088
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:1:p:60-83
Template-Type: ReDIF-Article 1.0
Author-Name: Andriy Norets
Author-X-Name-First: Andriy
Author-X-Name-Last: Norets
Title: Estimation of Dynamic Discrete Choice Models Using Artificial Neural Network Approximations
Abstract: I propose a method for inference in dynamic discrete choice models (DDCM) that utilizes Markov chain Monte Carlo (MCMC) and artificial neural networks (ANNs). MCMC is intended to handle high-dimensional integration in the likelihood function of richly specified DDCMs. ANNs approximate the dynamic-program (DP) solution as a function of the parameters and state variables prior to estimation to avoid having to solve the DP on each iteration. Potential applications of the proposed methodology include inference in DDCMs with random coefficients, serially correlated unobservables, and dependence across individual observations. The article discusses MCMC estimation of DDCMs, provides relevant background on ANNs, and derives a theoretical justification for the method. Experiments suggest this to be a promising approach.
Journal: Econometric Reviews
Pages: 84-106
Issue: 1
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607089
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607089
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:1:p:84-106
Template-Type: ReDIF-Article 1.0
Author-Name: Patrick Bajari
Author-X-Name-First: Patrick
Author-X-Name-Last: Bajari
Author-Name: Thomas Youle
Author-X-Name-First: Thomas
Author-X-Name-Last: Youle
Title: Book Review: and
Journal: Econometric Reviews
Pages: 107-117
Issue: 1
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607090
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607090
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:1:p:107-117
Template-Type: ReDIF-Article 1.0
Author-Name: Pierre Chaussé
Author-X-Name-First: Pierre
Author-X-Name-Last: Chaussé
Author-Name: Dinghai Xu
Author-X-Name-First: Dinghai
Author-X-Name-Last: Xu
Title: GMM estimation of a realized stochastic volatility model: A Monte Carlo study
Abstract:
This article investigates alternative generalized method of moments (GMM) estimation procedures of a stochastic volatility model with realized volatility measures. The extended model can accommodate a more general correlation structure. General closed form moment conditions are derived to examine the model properties and to evaluate the performance of various GMM estimation procedures in a Monte Carlo environment, including standard GMM, principal component GMM, robust GMM, and regularized GMM. An application to five company stocks and one stock index is also provided for an empirical demonstration.
Journal: Econometric Reviews
Pages: 719-743
Issue: 7
Volume: 37
Year: 2018
Month: 8
X-DOI: 10.1080/07474938.2016.1152654
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1152654
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:7:p:719-743
Template-Type: ReDIF-Article 1.0
Author-Name: Hung-Pin Lai
Author-X-Name-First: Hung-Pin
Author-X-Name-Last: Lai
Author-Name: Wen-Jen Tsay
Author-X-Name-First: Wen-Jen
Author-X-Name-Last: Tsay
Title: Maximum simulated likelihood estimation of the panel sample selection model
Abstract:
Heckman's (1976, 1979) sample selection model has been employed in many studies of linear and nonlinear regression applications. It is well known that ignoring the sample selectivity may result in inconsistency of the estimator due to the correlation between the statistical errors in the selection and main equations. In this article, we reconsider the maximum likelihood estimator for the panel sample selection model in Keane et al. (1988). Since the panel data model contains individual effects, such as fixed or random effects, the likelihood function is more complicated than that of the classical Heckman model. As an alternative to the existing derivation of the likelihood function in the literature, we show that the conditional distribution of the main equation follows a closed skew-normal (CSN) distribution, of which the linear transformation is still a CSN. Although the evaluation of the likelihood function involves high-dimensional integration, we show that the integration can be further simplified into a one-dimensional problem and can be evaluated by the simulated likelihood method. Moreover, we also conduct a Monte Carlo experiment to investigate the finite sample performance of the proposed estimator and find that our estimator provides reliable and quite satisfactory results.
Journal: Econometric Reviews
Pages: 744-759
Issue: 7
Volume: 37
Year: 2018
Month: 8
X-DOI: 10.1080/07474938.2016.1152657
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1152657
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:7:p:744-759
Template-Type: ReDIF-Article 1.0
Author-Name: Nikolay Gospodinov
Author-X-Name-First: Nikolay
Author-X-Name-Last: Gospodinov
Author-Name: Raymond Kan
Author-X-Name-First: Raymond
Author-X-Name-Last: Kan
Author-Name: Cesare Robotti
Author-X-Name-First: Cesare
Author-X-Name-Last: Robotti
Title: Asymptotic variance approximations for invariant estimators in uncertain asset-pricing models
Abstract:
This article derives explicit expressions for the asymptotic variances of the maximum likelihood and continuously-updated GMM estimators in models that may not satisfy the fundamental asset-pricing restrictions in population. The proposed misspecification-robust variance estimators allow the researcher to conduct valid inference on the model parameters even when the model is rejected by the data. While the results for the maximum likelihood estimator are only applicable to linear asset-pricing models, the asymptotic distribution of the continuously-updated GMM estimator is derived for general, possibly nonlinear, models. The large corrections in the asymptotic variances, that arise from explicitly incorporating model misspecification in the analysis, are illustrated using simulations and an empirical application.
Journal: Econometric Reviews
Pages: 695-718
Issue: 7
Volume: 37
Year: 2018
Month: 8
X-DOI: 10.1080/07474938.2016.1165945
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1165945
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:7:p:695-718
Template-Type: ReDIF-Article 1.0
Author-Name: Ke Yang
Author-X-Name-First: Ke
Author-X-Name-Last: Yang
Title: More efficient local polynomial regression with random-effects panel data models
Abstract:
We propose a modification of the local polynomial estimation procedure to account for the “within-subject” correlation present in panel data. The proposed procedure is rather simple to compute and has a closed-form expression. We study the asymptotic bias and variance of the proposed procedure and show that it outperforms the working independence estimator uniformly up to the first order. A simulation study shows that the gains in efficiency with the proposed method in the presence of “within-subject” correlation can be significant in small samples. For illustration purposes, the procedure is applied to explore the impact of market concentration on airfare.
Journal: Econometric Reviews
Pages: 760-776
Issue: 7
Volume: 37
Year: 2018
Month: 8
X-DOI: 10.1080/07474938.2016.1167813
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1167813
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:7:p:760-776
Template-Type: ReDIF-Article 1.0
Author-Name: Huigang Chen
Author-X-Name-First: Huigang
Author-X-Name-Last: Chen
Author-Name: Alin Mirestean
Author-X-Name-First: Alin
Author-X-Name-Last: Mirestean
Author-Name: Charalambos G. Tsangarides
Author-X-Name-First: Charalambos G.
Author-X-Name-Last: Tsangarides
Title: Bayesian model averaging for dynamic panels with an application to a trade gravity model
Abstract:
We extend the Bayesian Model Averaging (BMA) framework to dynamic panel data models with endogenous regressors using a Limited Information Bayesian Model Averaging (LIBMA) methodology. Monte Carlo simulations confirm the asymptotic performance of our methodology both in BMA and selection, with high posterior inclusion probabilities for all relevant regressors, and parameter estimates very close to their true values. In addition, we illustrate the use of LIBMA by estimating a dynamic gravity model for bilateral trade. Once model uncertainty, dynamics, and endogeneity are accounted for, we find several factors that are robustly correlated with bilateral trade. We also find that applying methodologies that do not account for either dynamics or endogeneity (or both) results in different sets of robust determinants.
Journal: Econometric Reviews
Pages: 777-805
Issue: 7
Volume: 37
Year: 2018
Month: 8
X-DOI: 10.1080/07474938.2016.1167857
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1167857
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:7:p:777-805
Template-Type: ReDIF-Article 1.0
Author-Name: Antonia Arsova
Author-X-Name-First: Antonia
Author-X-Name-Last: Arsova
Author-Name: Deniz Dilan Karaman Örsal
Author-X-Name-First: Deniz Dilan Karaman
Author-X-Name-Last: Örsal
Title: Likelihood-based panel cointegration test in the presence of a linear time trend and cross-sectional dependence
Abstract:
This article proposes a new likelihood-based panel cointegration rank test which extends the test of Örsal and Droge (2014) (henceforth panel SL test) to dependent panels. The dependence is modelled by unobserved common factors which affect the variables in each cross-section through heterogeneous loadings. The data are defactored following the panel analysis of nonstationarity in idiosyncratic and common components (PANIC) approach of Bai and Ng (2004) and the cointegrating rank of the defactored data is then tested by the panel SL test. A Monte Carlo study demonstrates that the proposed testing procedure has reasonable size and power properties in finite samples.
Journal: Econometric Reviews
Pages: 1033-1050
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2016.1183070
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1183070
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:1033-1050
Template-Type: ReDIF-Article 1.0
Author-Name: Jing Zheng
Author-X-Name-First: Jing
Author-X-Name-Last: Zheng
Author-Name: Wentao Gu
Author-X-Name-First: Wentao
Author-X-Name-Last: Gu
Author-Name: Baolin Xu
Author-X-Name-First: Baolin
Author-X-Name-Last: Xu
Author-Name: Zongwu Cai
Author-X-Name-First: Zongwu
Author-X-Name-Last: Cai
Title: The estimation for Lévy processes in high frequency data
Abstract:
In this article, a generalized Lévy model is proposed and its parameters are estimated in high-frequency data settings. An infinitesimal generator of Lévy processes is used to study the asymptotic properties of the drift and volatility estimators. They are asymptotically consistent and independent of the other parameters, making them better than those in Chen et al. (2010). The estimators proposed here also have fast convergence rates and are simple to implement.
Journal: Econometric Reviews
Pages: 1051-1066
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2016.1188876
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1188876
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:1051-1066
Template-Type: ReDIF-Article 1.0
Author-Name: M. Hashem Pesaran
Author-X-Name-First: M. Hashem
Author-X-Name-Last: Pesaran
Author-Name: Qiankun Zhou
Author-X-Name-First: Qiankun
Author-X-Name-Last: Zhou
Title: Estimation of time-invariant effects in static panel data models
Abstract:
This article proposes the Fixed Effects Filtered (FEF) and Fixed Effects Filtered instrumental variable (FEF-IV) estimators for estimation and inference in the case of time-invariant effects in static panel data models when N is large and T is fixed. The FEF-IV allows for endogenous time-invariant regressors but assumes that there exists a sufficient number of instruments for such regressors. It is shown that the FEF and FEF-IV estimators are $\sqrt{N}$-consistent and asymptotically normally distributed. The FEF estimator is compared with the Fixed Effects Vector Decomposition (FEVD) estimator proposed by Plumper and Troeger (2007), and conditions under which the two estimators are equivalent are established. It is also shown that the variance estimator proposed for the FEVD estimator is inconsistent and that its use could lead to misleading inference. Alternative variance estimators are proposed for both the FEF and FEF-IV estimators, which are shown to be consistent under fairly general conditions. The small sample properties of the FEF and FEF-IV estimators are investigated by Monte Carlo experiments, and it is shown that FEF has smaller bias and RMSE, unless an intercept is included in the second stage of the FEVD procedure, which renders the FEF and FEVD estimators identical. The FEVD procedure, however, results in substantial size distortions since it uses incorrect standard errors. In the case where some of the time-invariant regressors are endogenous, the FEF-IV procedure is compared with a modified version of the Hausman and Taylor (1981) (HT) estimator. It is shown that both estimators perform well and have similar small sample properties. However, applying the standard HT procedure, which incorrectly assumes that a subset of the time-varying regressors is uncorrelated with the individual effects, yields biased estimates and significant size distortions.
Journal: Econometric Reviews
Pages: 1137-1171
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2016.1222225
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222225
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:1137-1171
Template-Type: ReDIF-Article 1.0
Author-Name: Rehim Kılıç
Author-X-Name-First: Rehim
Author-X-Name-Last: Kılıç
Title: Robust inference for predictability in smooth transition predictive regressions
Abstract:
This article provides a novel test for predictability within a nonlinear smooth transition predictive regression (STPR) model where inference is complicated not only by the presence of persistent, local to unit root predictors and endogeneity but also by the presence of unidentified parameters under the null of no predictability. In order to circumvent the unidentified parameters problem, the t-statistic for the predictor in the STPR model is optimized over the Cartesian product of the spaces for the transition and threshold parameters; and to address the difficulties due to persistent and endogenous predictors, the instrumental variable (IVX) method originally developed in the linear cointegration testing framework is adopted within the STPR model. The limit distribution of this statistic (i.e., the sup-tIVX test) is shown to be nuisance parameter-free and robust to local to unit root and endogenous regressors. Simulations show that sup-tIVX has good size and power properties. An application to stock return predictability reveals the presence of asymmetric regime-dependence and variability in the strength and size of predictability across asset-related (e.g., dividend/price ratio) vs. other (e.g., default yield spread) predictors.
Journal: Econometric Reviews
Pages: 1067-1094
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2016.1222233
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222233
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:1067-1094
Template-Type: ReDIF-Article 1.0
Author-Name: Emir Malikov
Author-X-Name-First: Emir
Author-X-Name-Last: Malikov
Author-Name: Diego A. Restrepo-Tobón
Author-X-Name-First: Diego A.
Author-X-Name-Last: Restrepo-Tobón
Author-Name: Subal C. Kumbhakar
Author-X-Name-First: Subal C.
Author-X-Name-Last: Kumbhakar
Title: Heterogeneous credit union production technologies with endogenous switching and correlated effects
Abstract:
Credit unions differ in the types of financial services they offer to their members. This article explicitly models this observed heterogeneity using a generalized model of endogenous ordered switching. Our approach captures the endogenous choice that credit unions make when adding new products to their financial services mix. The model that we consider also allows for the dependence between unobserved effects and regressors in both the selection and outcome equations and can accommodate the presence of predetermined covariates in the model. We use this model to estimate returns to scale for U.S. retail credit unions from 1996 to 2011. We document strong evidence of persistent technological heterogeneity among credit unions offering different financial service mixes, which, if ignored, can produce quite misleading results. Employing our model, we find that credit unions of all types exhibit substantial economies of scale.
Journal: Econometric Reviews
Pages: 1095-1119
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2016.1222234
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222234
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:1095-1119
Template-Type: ReDIF-Article 1.0
Author-Name: Thomas Demuynck
Author-X-Name-First: Thomas
Author-X-Name-Last: Demuynck
Title: Testing the homogeneous marginal utility of income assumption
Abstract:
We develop a test for the hypothesis that every agent from a population of heterogeneous consumers has the same marginal utility of income function. This homogeneous marginal utility of income (HMUI) assumption is often (implicitly) used in applied demand studies because it has nice aggregation properties and facilitates welfare analysis. If the HMUI assumption holds, we can also identify the common marginal utility of income function. We apply our results using a U.S. cross sectional dataset on food consumption.
Journal: Econometric Reviews
Pages: 1120-1136
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2016.1222235
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222235
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:1120-1136
Template-Type: ReDIF-Article 1.0
Author-Name: The Editors
Title: List of Referees
Journal: Econometric Reviews
Pages: 1172-1173
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2018.1468300
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1468300
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:1172-1173
Template-Type: ReDIF-Article 1.0
Author-Name: The Editors
Title: Editorial Board EOV
Journal: Econometric Reviews
Pages: ebi-ebi
Issue: 10
Volume: 37
Year: 2018
Month: 11
X-DOI: 10.1080/07474938.2018.1468304
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1468304
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:10:p:ebi-ebi
Template-Type: ReDIF-Article 1.0
Author-Name: David Pacini
Author-X-Name-First: David
Author-X-Name-Last: Pacini
Title: Two-sample least squares projection
Abstract:
This article investigates the problem of making inference about the coefficients in the linear projection of an outcome variable y on covariates (x,z) when data are available from two independent random samples; the first sample contains information on only the variables (y,z), while the second sample contains information on only the covariates. In this context, the validity of existing inference procedures depends crucially on the assumptions imposed on the joint distribution of (y,z,x). This article introduces a novel characterization of the identified set of the coefficients of interest when no assumption (except for the existence of second moments) on this joint distribution is imposed. One finding is that inference is necessarily nonstandard because the function characterizing the identified set is a nondifferentiable (yet directionally differentiable) function of the data. The article then introduces an estimator and a confidence interval based on the directional differential of the function characterizing the identified set. Monte Carlo experiments explore the numerical performance of the proposed estimator and confidence interval.
Journal: Econometric Reviews
Pages: 95-123
Issue: 1
Volume: 38
Year: 2019
Month: 1
X-DOI: 10.1080/07474938.2016.1222068
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222068
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:1:p:95-123
Template-Type: ReDIF-Article 1.0
Author-Name: Ulf Schepsmeier
Author-X-Name-First: Ulf
Author-X-Name-Last: Schepsmeier
Title: A goodness-of-fit test for regular vine copula models
Abstract:
We introduce a new goodness-of-fit test for regular vine (R-vine) copula models, a very flexible class of multivariate copulas based on a pair-copula construction (PCC). The test arises from White’s information matrix test and extends an existing goodness-of-fit test for copulas. The corresponding critical value can be approximated by asymptotic theory or simulation. The simulation-based test shows excellent performance with regard to observed size and power in an extensive simulation study, while the asymptotic theory based test is inadequate for n ≤ 10,000 in a 5-dimensional model (for d = 8, even 20,000 observations are not enough). The simulation-based test is applied to select among different R-vine specifications modeling the dependency among exchange rates.
Journal: Econometric Reviews
Pages: 25-46
Issue: 1
Volume: 38
Year: 2019
Month: 1
X-DOI: 10.1080/07474938.2016.1222231
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222231
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:1:p:25-46
Template-Type: ReDIF-Article 1.0
Author-Name: Mirza Trokić
Author-X-Name-First: Mirza
Author-X-Name-Last: Trokić
Title: Wavelet energy ratio unit root tests
Abstract:
This article uses wavelet theory to propose a frequency domain nonparametric and tuning parameter-free family of unit root tests. The proposed test exploits the wavelet power spectrum of the observed series and its fractional partial sum to construct a test of the unit root based on the ratio of the resulting scaling energies. The proposed statistic enjoys good power properties and is robust to severe size distortions even in the presence of serially correlated MA(1) errors with a highly negative moving average (MA) parameter, as well as in the presence of random additive outliers. Any remaining size distortions are effectively eliminated using a novel wavestrapping algorithm.
Journal: Econometric Reviews
Pages: 69-94
Issue: 1
Volume: 38
Year: 2019
Month: 1
X-DOI: 10.1080/07474938.2016.1222232
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222232
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:1:p:69-94
Template-Type: ReDIF-Article 1.0
Author-Name: Neshat Beheshti
Author-X-Name-First: Neshat
Author-X-Name-Last: Beheshti
Author-Name: Jeffrey S. Racine
Author-X-Name-First: Jeffrey S.
Author-X-Name-Last: Racine
Author-Name: Ehsan S. Soofi
Author-X-Name-First: Ehsan S.
Author-X-Name-Last: Soofi
Title: Information measures of kernel estimation
Abstract:
Kernel estimates of entropy and mutual information have been studied extensively in statistics and econometrics. Kullback–Leibler divergence has been used in the kernel estimation literature; yet the information characteristic of kernel estimation remains unexplored. We explore kernel estimation as an information transmission operation where the empirical cumulative distribution function is transformed into a smooth estimate. The smooth kernel estimate is a mixture of kernel functions. The Jensen–Shannon (JS) divergence of the mixture distribution provides the information measure of kernel estimation. This measure admits Kullback–Leibler and mutual information representations and provides a lower bound for the entropy of the kernel estimate of the distribution in terms of the Shannon entropy of the kernel function and the bandwidth. The JS divergence provides guidance for kernel choice based on information-theoretic considerations which helps resolve a conundrum, namely that it is legitimate and desirable to base such choice on considerations other than the mean integrated square error of the kernel smoother. We introduce a generalized polynomial kernel (GPK) family that nests a broad range of popular kernel functions, and explore its properties in terms of Shannon and Rényi entropies. We show that these entropies and variance order the GPK functions similarly. The JS information measures of six kernel functions are compared via simulations from Gaussian, gamma, and Student-t data-generating processes. The proposed framework provides the foundation for further explorations into the information-theoretic nature of kernel smoothing.
Journal: Econometric Reviews
Pages: 47-68
Issue: 1
Volume: 38
Year: 2019
Month: 1
X-DOI: 10.1080/07474938.2016.1222236
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222236
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:1:p:47-68
Template-Type: ReDIF-Article 1.0
Author-Name: Luke Taylor
Author-X-Name-First: Luke
Author-X-Name-Last: Taylor
Author-Name: Taisuke Otsu
Author-X-Name-First: Taisuke
Author-X-Name-Last: Otsu
Title: Estimation of nonseparable models with censored dependent variables and endogenous regressors
Abstract:
In this article we develop a nonparametric estimator for the local average response of a censored dependent variable to endogenous regressors in a nonseparable model where the unobservable error term is not restricted to be scalar and where the nonseparable function need not be monotone in the unobservables. We formalize the identification argument put forward in Altonji, Ichimura, and Otsu (2012), construct a nonparametric estimator, characterize its asymptotic properties, and conduct a Monte Carlo investigation to study its small sample properties. Identification is constructive and is achieved through a control function approach. We show that the estimator is consistent and asymptotically normally distributed. The Monte Carlo results are encouraging.
Journal: Econometric Reviews
Pages: 4-24
Issue: 1
Volume: 38
Year: 2019
Month: 1
X-DOI: 10.1080/07474938.2016.1235310
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1235310
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:1:p:4-24
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: “Fellows and Scholars of Econometric Reviews”
Journal: Econometric Reviews
Pages: 1-3
Issue: 1
Volume: 38
Year: 2019
Month: 1
X-DOI: 10.1080/07474938.2018.1554868
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1554868
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:1:p:1-3
Template-Type: ReDIF-Article 1.0
Author-Name: Laura Magazzini
Author-X-Name-First: Laura
Author-X-Name-Last: Magazzini
Author-Name: Giorgio Calzolari
Author-X-Name-First: Giorgio
Author-X-Name-Last: Calzolari
Title: Testing initial conditions in dynamic panel data models
Abstract:
We propose a new framework for testing the “mean stationarity” assumption in dynamic panel data models, required for the consistency of the system GMM estimator. In our setup the assumption is obtained as a parametric restriction in an extended set of moment conditions, allowing the use of an LM test to check its validity. Our framework provides a ranking in terms of power of the analyzed test statistics: our approach exhibits better power than the difference-in-Sargan/Hansen test that compares system GMM and difference GMM, which is, in turn, more powerful than the Sargan/Hansen test based on the system GMM moment conditions.
Journal: Econometric Reviews
Pages: 115-134
Issue: 2
Volume: 39
Year: 2020
Month: 2
X-DOI: 10.1080/07474938.2019.1690194
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690194
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:2:p:115-134
Template-Type: ReDIF-Article 1.0
Author-Name: Hervé Cardot
Author-X-Name-First: Hervé
Author-X-Name-Last: Cardot
Author-Name: Antonio Musolesi
Author-X-Name-First: Antonio
Author-X-Name-Last: Musolesi
Title: Modeling temporal treatment effects with zero inflated semi-parametric regression models: The case of local development policies in France
Abstract:
A semi-parametric approach is proposed to estimate the variation along time of the effects of two distinct public policies that were devoted to boosting rural development in France over a similar period of time. At a micro data level, it is often observed that the dependent variable, such as local employment, does not vary along time, so that we face a kind of zero inflated phenomenon that cannot be dealt with by a continuous response model. We introduce a conditional mixture model which combines a mass at zero and a continuous response. The suggested zero inflated semi-parametric statistical approach combines the flexibility and modularity of additive models with the ability of panel data to deal with selection bias and to allow for the estimation of dynamic treatment effects. In this multiple treatment analysis, we find evidence of interesting patterns of temporal treatment effects with relevant nonlinear policy effects. The adopted semi-parametric modeling also offers the possibility of making a counterfactual analysis at an individual level. The methodology is illustrated and compared with parametric linear approaches on a few municipalities for which the mean evolution of the potential outcomes is estimated under the different possible treatments.
Journal: Econometric Reviews
Pages: 135-157
Issue: 2
Volume: 39
Year: 2020
Month: 2
X-DOI: 10.1080/07474938.2019.1690193
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690193
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:2:p:135-157
Template-Type: ReDIF-Article 1.0
Author-Name: Michael S. Delgado
Author-X-Name-First: Michael S.
Author-X-Name-Last: Delgado
Author-Name: Deniz Ozabaci
Author-X-Name-First: Deniz
Author-X-Name-Last: Ozabaci
Author-Name: Yiguo Sun
Author-X-Name-First: Yiguo
Author-X-Name-Last: Sun
Author-Name: Subal C. Kumbhakar
Author-X-Name-First: Subal C.
Author-X-Name-Last: Kumbhakar
Title: Smooth coefficient models with endogenous environmental variables
Abstract:
We develop a three-step, oracle-efficient estimator for a structural semiparametric smooth coefficient model with endogenous variables in the nonparametric part of the model. We use a control function approach, combined with both series and kernel estimators to obtain consistent and asymptotically normal estimators of the functions and their partial derivatives. We develop a residual-based test statistic for testing endogeneity, and demonstrate the finite sample performance of our estimators, as well as our test, via Monte Carlo simulations. Finally, we develop an application of our estimator to the relationship between public benefits and private savings.
Journal: Econometric Reviews
Pages: 158-180
Issue: 2
Volume: 39
Year: 2020
Month: 2
X-DOI: 10.1080/07474938.2018.1552413
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1552413
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:2:p:158-180
Template-Type: ReDIF-Article 1.0
Author-Name: Paul Bekker
Author-X-Name-First: Paul
Author-X-Name-Last: Bekker
Author-Name: Joëlle van Essen
Author-X-Name-First: Joëlle
Author-X-Name-Last: van Essen
Title: ML and GMM with concentrated instruments in the static panel data model
Abstract:
We study the asymptotic behavior of instrumental variable estimators in the static panel model under many-instruments asymptotics. We provide new estimators and standard errors based on concentrated instruments as alternatives to an estimator based on maximum likelihood. We prove that the latter estimator is consistent under many-instruments asymptotics only if the starting value in an iterative procedure is root-N consistent. A similar approach for continuous updating GMM shows the derivation is nontrivial. For the standard cross-sectional case (T = 1), the simple formulation of standard errors offers an alternative to earlier formulations.
Journal: Econometric Reviews
Pages: 181-195
Issue: 2
Volume: 39
Year: 2020
Month: 2
X-DOI: 10.1080/07474938.2019.1580946
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1580946
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:2:p:181-195
Template-Type: ReDIF-Article 1.0
Author-Name: Zhengyu Zhang
Author-X-Name-First: Zhengyu
Author-X-Name-Last: Zhang
Author-Name: Zequn Jin
Author-X-Name-First: Zequn
Author-X-Name-Last: Jin
Title: Identification and estimation in a linear correlated random coefficients model with censoring
Abstract:
In this paper, we study the identification and estimation of a linear correlated random coefficients model with censoring, namely, Y=max{B0+X′B,C}, where C is a known constant or an unknown function of regressors. Here, random coefficients (B0,B) can be correlated with one or more components of X. Under a generalized conditional median restriction similar to that in Hoderlein and Sherman, we show that both the average partial effect and the average partial effect on the treated are identified. We develop estimators for the identified parameters and analyze their large sample properties. A Monte Carlo simulation indicates that our estimators perform reasonably well with small samples. We then present an application.
Journal: Econometric Reviews
Pages: 196-213
Issue: 2
Volume: 39
Year: 2020
Month: 2
X-DOI: 10.1080/07474938.2019.1580949
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1580949
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:2:p:196-213
Template-Type: ReDIF-Article 1.0
Author-Name: Mototsugu Shintani
Author-X-Name-First: Mototsugu
Author-X-Name-Last: Shintani
Author-Name: Zi-Yi Guo
Author-X-Name-First: Zi-Yi
Author-X-Name-Last: Guo
Title: Improving the finite sample performance of autoregression estimators in dynamic factor models: A bootstrap approach
Abstract:
We investigate the finite sample properties of the estimator of a persistence parameter of an unobservable common factor when the factor is estimated by the principal components method. When the number of cross-sectional observations is not sufficiently large, relative to the number of time series observations, the autoregressive coefficient estimator of a positively autocorrelated factor is biased downward, and the bias becomes larger for a more persistent factor. Based on theoretical and simulation analyses, we show that bootstrap procedures are effective in reducing the bias, and bootstrap confidence intervals outperform naive asymptotic confidence intervals in terms of the coverage probability.
Journal: Econometric Reviews
Pages: 360-379
Issue: 4
Volume: 37
Year: 2018
Month: 4
X-DOI: 10.1080/07474938.2015.1092825
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092825
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:4:p:360-379
Template-Type: ReDIF-Article 1.0
Author-Name: Rosa Bernardini Papalia
Author-X-Name-First: Rosa Bernardini
Author-X-Name-Last: Papalia
Author-Name: Esteban Fernandez-Vazquez
Author-X-Name-First: Esteban
Author-X-Name-Last: Fernandez-Vazquez
Title: Information theoretic methods in small domain estimation
Abstract:
Small area estimation techniques are becoming increasingly used in survey applications to provide estimates for local areas of interest. The objective of this article is to develop and apply Information Theoretic (IT)-based formulations to estimate small area business and trade statistics. More specifically, we propose a Generalized Maximum Entropy (GME) approach to the problem of small area estimation that exploits auxiliary information relating to other known variables on the population and adjusts for consistency and additivity. The GME formulations, combining information from the sample together with out-of-sample aggregates of the population of interest, can be particularly useful in the context of small area estimation, for both direct and model-based estimators, since they do not require strong distributional assumptions on the disturbances. The performance of the proposed IT formulations is illustrated through real and simulated datasets.
Journal: Econometric Reviews
Pages: 347-359
Issue: 4
Volume: 37
Year: 2018
Month: 4
X-DOI: 10.1080/07474938.2015.1092834
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092834
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:4:p:347-359
Template-Type: ReDIF-Article 1.0
Author-Name: Tomasz Woźniak
Author-X-Name-First: Tomasz
Author-X-Name-Last: Woźniak
Title: Granger-causal analysis of GARCH models: A Bayesian approach
Abstract:
A multivariate GARCH model is used to investigate Granger causality in the conditional variance of time series. Parametric restrictions for the hypothesis of noncausality in conditional variances between two groups of variables, when there are other variables in the system as well, are derived. These novel conditions are convenient for the analysis of potentially large systems of economic variables. To evaluate hypotheses of noncausality, a Bayesian testing procedure is proposed. It avoids the singularity problem that may appear in the Wald test, and it relaxes the assumption of the existence of higher-order moments of the residuals required in classical tests.
Journal: Econometric Reviews
Pages: 325-346
Issue: 4
Volume: 37
Year: 2018
Month: 4
X-DOI: 10.1080/07474938.2015.1092839
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092839
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:4:p:325-346
Template-Type: ReDIF-Article 1.0
Author-Name: E. C. Brechmann
Author-X-Name-First: E. C.
Author-X-Name-Last: Brechmann
Author-Name: M. Heiden
Author-X-Name-First: M.
Author-X-Name-Last: Heiden
Author-Name: Y. Okhrin
Author-X-Name-First: Y.
Author-X-Name-Last: Okhrin
Title: A multivariate volatility vine copula model
Abstract:
This article proposes a dynamic framework for modeling and forecasting of realized covariance matrices using vine copulas to allow for more flexible dependencies between assets. Our model automatically guarantees positive definiteness of the forecast through the use of a Cholesky decomposition of the realized covariance matrix. We explicitly account for long-memory behavior by using fractionally integrated autoregressive moving average (ARFIMA) and heterogeneous autoregressive (HAR) models for the individual elements of the decomposition. Furthermore, our model incorporates non-Gaussian innovations and GARCH effects, accounting for volatility clustering and unconditional kurtosis. The dependence structure between assets is studied using vine copula constructions, which allow for nonlinearity and asymmetry without suffering from an inflexible tail behavior or symmetry restrictions as in conventional multivariate models. Further, the copulas have a direct impact on the point forecasts of the realized covariance matrices, due to being computed as a nonlinear transformation of the forecasts for the Cholesky matrix. Besides studying in-sample properties, we assess the usefulness of our method in a one-day-ahead forecasting framework, comparing recent types of models for the realized covariance matrix based on a model confidence set approach. Additionally, we find that in Value-at-Risk (VaR) forecasting, vine models entail lower capital requirements due to smoother and more accurate forecasts.
Journal: Econometric Reviews
Pages: 281-308
Issue: 4
Volume: 37
Year: 2018
Month: 4
X-DOI: 10.1080/07474938.2015.1096695
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1096695
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:4:p:281-308
Template-Type: ReDIF-Article 1.0
Author-Name: Christian M. Hafner
Author-X-Name-First: Christian M.
Author-X-Name-Last: Hafner
Author-Name: Hans Manner
Author-X-Name-First: Hans
Author-X-Name-Last: Manner
Author-Name: Léopold Simar
Author-X-Name-First: Léopold
Author-X-Name-Last: Simar
Title: The “wrong skewness” problem in stochastic frontier models: A new approach
Abstract:
Stochastic frontier models are widely used to measure, e.g., technical efficiencies of firms. The classical stochastic frontier model often suffers from the empirical artefact that the residuals of the production function may have a positive skewness, whereas a negative one is expected under the model, which leads to estimated full efficiencies of all firms. We propose a new approach to the problem by generalizing the distribution used for the inefficiency variable. This generalized stochastic frontier model allows the sample data to have the wrong skewness while estimating well-defined and nondegenerate efficiency measures. We discuss the statistical properties of the model, and we discuss a test for the symmetry of the error term (no inefficiency). We provide a simulation study to show that our model delivers estimators of efficiency with smaller bias than those of the classical model even if the population skewness has the correct sign. Finally, we apply the model to data of the U.S. textile industry for 1958–2005 and show that for a number of years our model suggests technical efficiencies well below the frontier while the classical one estimates no inefficiency in those years.
Journal: Econometric Reviews
Pages: 380-400
Issue: 4
Volume: 37
Year: 2018
Month: 4
X-DOI: 10.1080/07474938.2016.1140284
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1140284
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:4:p:380-400
Template-Type: ReDIF-Article 1.0
Author-Name: Yong Bao
Author-X-Name-First: Yong
Author-X-Name-Last: Bao
Title: The asymptotic covariance matrix of the QMLE in ARMA models
Abstract:
A compact analytical representation of the asymptotic covariance matrix, in terms of model parameters directly, of the quasi maximum likelihood estimator (QMLE) is derived in autoregressive moving average (ARMA) models with possible nonzero means and non-Gaussian error terms. For model parameters excluding the error variance, it is found that the Huber (1967) sandwich form for the asymptotic covariance matrix degenerates into the inverse of the associated information matrix. In comparison to the existing result that involves the second moments of some auxiliary variables for the case of zero-mean ARMA models, the analytical asymptotic covariance in this article has an advantage in that it can be conveniently estimated by plugging in the estimated model parameters directly.
Journal: Econometric Reviews
Pages: 309-324
Issue: 4
Volume: 37
Year: 2018
Month: 4
X-DOI: 10.1080/07474938.2016.1140287
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1140287
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:4:p:309-324
Template-Type: ReDIF-Article 1.0
Author-Name: Young Min Kim
Author-X-Name-First: Young Min
Author-X-Name-Last: Kim
Author-Name: Kyu Ho Kang
Author-X-Name-First: Kyu Ho
Author-X-Name-Last: Kang
Title: Likelihood inference for dynamic linear models with Markov switching parameters: on the efficiency of the Kim filter
Abstract:
The Kim filter (KF) approximation is widely used for the likelihood calculation of dynamic linear models with Markov regime-switching parameters. However, despite its popularity, its approximation error has not yet been examined rigorously. Therefore, this study investigates the reliability of the KF approximation for maximum likelihood (ML) and Bayesian estimations. To measure the approximation error, we compare the outcomes of the KF method with those of the auxiliary particle filter (APF). The APF is a numerical method that requires a longer computing time, but its numerical error can be sufficiently minimized by increasing simulation size. According to our extensive simulation and empirical studies, the likelihood values obtained from the KF approximation are practically identical to those of the APF. Furthermore, we show that the KF method is reliable, particularly when regimes are persistent and sample size is small. From the Bayesian perspective, we show that the KF method improves the efficiency of posterior simulation. This study contributes to the literature by providing evidence to justify the use of the KF method in both ML and Bayesian estimations.
Journal: Econometric Reviews
Pages: 1109-1130
Issue: 10
Volume: 38
Year: 2019
Month: 11
X-DOI: 10.1080/07474938.2018.1514027
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1514027
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:10:p:1109-1130
Template-Type: ReDIF-Article 1.0
Author-Name: David I. Harvey
Author-X-Name-First: David I.
Author-X-Name-Last: Harvey
Author-Name: Stephen J. Leybourne
Author-X-Name-First: Stephen J.
Author-X-Name-Last: Leybourne
Author-Name: Yang Zu
Author-X-Name-First: Yang
Author-X-Name-Last: Zu
Title: Testing explosive bubbles with time-varying volatility
Abstract:
This article considers the problem of testing for an explosive bubble in financial data in the presence of time-varying volatility. We propose a weighted least squares-based variant of the Phillips et al. test for explosive autoregressive behavior. We find that such an approach has appealing asymptotic power properties, with the potential to deliver substantially greater power than the established OLS-based approach for many volatility and bubble settings. Given that the OLS-based test can outperform the weighted least squares-based test for other volatility and bubble specifications, we also suggest a union of rejections procedure that succeeds in capturing the better power available from the two constituent tests for a given alternative. Our approach involves a nonparametric kernel-based volatility function estimator for computation of the weighted least squares-based statistic, together with the use of a wild bootstrap procedure applied jointly to both individual tests, delivering a powerful testing procedure that is asymptotically size-robust to a wide range of time-varying volatility specifications.
Journal: Econometric Reviews
Pages: 1131-1151
Issue: 10
Volume: 38
Year: 2019
Month: 11
X-DOI: 10.1080/07474938.2018.1536099
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1536099
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:10:p:1131-1151
Template-Type: ReDIF-Article 1.0
Author-Name: Oliver Grothe
Author-X-Name-First: Oliver
Author-X-Name-Last: Grothe
Author-Name: Tore Selland Kleppe
Author-X-Name-First: Tore Selland
Author-X-Name-Last: Kleppe
Author-Name: Roman Liesenfeld
Author-X-Name-First: Roman
Author-X-Name-Last: Liesenfeld
Title: The Gibbs sampler with particle efficient importance sampling for state-space models
Abstract:
We consider Particle Gibbs (PG) for Bayesian analysis of non-linear non-Gaussian state-space models. As a Monte Carlo (MC) approximation of the Gibbs procedure, PG uses sequential MC (SMC) importance sampling inside the Gibbs procedure to update the latent states. We propose to combine PG with Particle Efficient Importance Sampling (PEIS). By using SMC sampling densities which are approximately globally fully adapted to the targeted density of the states, PEIS can substantially improve the simulation efficiency of PG relative to existing PG implementations. The efficiency gains are illustrated in PG applications to a non-linear local-level model and stochastic volatility models.
Journal: Econometric Reviews
Pages: 1152-1175
Issue: 10
Volume: 38
Year: 2019
Month: 11
X-DOI: 10.1080/07474938.2018.1536098
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1536098
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:10:p:1152-1175
Template-Type: ReDIF-Article 1.0
Author-Name: Massimo Franchi
Author-X-Name-First: Massimo
Author-X-Name-Last: Franchi
Author-Name: Paolo Paruolo
Author-X-Name-First: Paolo
Author-X-Name-Last: Paruolo
Title: A general inversion theorem for cointegration
Abstract:
A generalization of the Granger and the Johansen Representation Theorems valid for any (possibly fractional) order of integration is presented. This Representation Theorem is based on inversion results that characterize the order of the pole and the coefficients of the Laurent series representation of the inverse of a matrix function around a singular point. Explicit expressions for the matrix coefficients of the (polynomial) cointegrating relations, of the Common Trends, and of the Triangular representations are provided, starting from either the Moving Average or the Auto Regressive form. This contribution unifies different approaches in the literature and extends them to an arbitrary order of integration. The role of deterministic terms is discussed in detail.
Journal: Econometric Reviews
Pages: 1176-1201
Issue: 10
Volume: 38
Year: 2019
Month: 11
X-DOI: 10.1080/07474938.2018.1536100
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1536100
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:10:p:1176-1201
Template-Type: ReDIF-Article 1.0
Author-Name: Shuo Li
Author-X-Name-First: Shuo
Author-X-Name-Last: Li
Author-Name: Yundong Tu
Author-X-Name-First: Yundong
Author-X-Name-Last: Tu
Title: A joint test for parametric specification and independence in nonlinear regression models
Abstract:
This paper develops a testing procedure to simultaneously check (i) the independence between the error and the regressor(s), and (ii) the parametric specification in nonlinear regression models. This procedure generalizes the existing work of Sen and Sen [“Testing Independence and Goodness-of-fit in Linear Models,” Biometrika, 101, 927–942.] to a regression setting that allows any smooth parametric form of the regression function. We establish asymptotic theory for the test procedure under both conditionally homoscedastic and heteroscedastic errors. The derived tests are easily implementable, asymptotically normal, and consistent against a large class of fixed alternatives. In addition, the local power performance is investigated. To calibrate the finite sample distribution of the test statistics, a smooth bootstrap procedure is proposed and found to work well in simulation studies. Finally, two real data examples are analyzed to illustrate the practical merit of our proposed tests.
Journal: Econometric Reviews
Pages: 1202-1215
Issue: 10
Volume: 38
Year: 2019
Month: 11
X-DOI: 10.1080/07474938.2018.1536101
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1536101
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:10:p:1202-1215
Template-Type: ReDIF-Article 1.0
Author-Name: The Editors
Title: List of Referees
Journal: Econometric Reviews
Pages: 1216-1217
Issue: 10
Volume: 38
Year: 2019
Month: 11
X-DOI: 10.1080/07474938.2019.1630074
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1630074
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:10:p:1216-1217
Template-Type: ReDIF-Article 1.0
Author-Name: Federico Martellosio
Author-X-Name-First: Federico
Author-X-Name-Last: Martellosio
Title: Testing for Spatial Autocorrelation: The Regressors that Make the Power Disappear
Abstract: We show that for any sample size, any size of the test, and any weights matrix outside a small class of exceptions, there exists a positive measure set of regression spaces such that the power of the Cliff–Ord test vanishes as the autocorrelation increases in a spatial error model. This result extends to the tests that define the Gaussian power envelope of all invariant tests for residual spatial autocorrelation. In most cases, the regression spaces such that the problem occurs depend on the size of the test, but there also exist regression spaces such that the power vanishes regardless of the size. A characterization of such particularly hostile regression spaces is provided.
Journal: Econometric Reviews
Pages: 215-240
Issue: 2
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.553571
File-URL: http://hdl.handle.net/10.1080/07474938.2011.553571
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:2:p:215-240
Template-Type: ReDIF-Article 1.0
Author-Name: Silvia Platoni
Author-X-Name-First: Silvia
Author-X-Name-Last: Platoni
Author-Name: Paolo Sckokai
Author-X-Name-First: Paolo
Author-X-Name-Last: Sckokai
Author-Name: Daniele Moro
Author-X-Name-First: Daniele
Author-X-Name-Last: Moro
Title: A Note on Two-Way ECM Estimation of SUR Systems on Unbalanced Panel Data
Abstract: This article considers the two-way error components model (ECM) estimation of seemingly unrelated regressions (SUR) on unbalanced panel by generalized least squares (GLS). As suggested by Biørn (2004) for the one-way case, in order to use the standard results for the balanced case the individuals are arranged in groups according to the number of times they are observed. Thus, the GLS estimator can be interpreted as a matrix weighted average of the group specific GLS estimators with weights equal to the inverse of their respective covariance matrices.
Journal: Econometric Reviews
Pages: 119-141
Issue: 2
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607098
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607098
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:2:p:119-141
Template-Type: ReDIF-Article 1.0
Author-Name: Haiqiang Chen
Author-X-Name-First: Haiqiang
Author-X-Name-Last: Chen
Author-Name: Terence Chong
Author-X-Name-First: Terence
Author-X-Name-Last: Chong
Author-Name: Jushan Bai
Author-X-Name-First: Jushan
Author-X-Name-Last: Bai
Title: Theory and Applications of TAR Model with Two Threshold Variables
Abstract: A growing body of threshold models has been developed over the past two decades to capture the nonlinear movement of financial time series. Most of these models, however, contain a single threshold variable only. In many empirical applications, models with two or more threshold variables are needed. This article develops a new threshold autoregressive model which contains two threshold variables. A likelihood ratio test is proposed to determine the number of regimes in the model. The finite-sample performance of the estimators is evaluated and an empirical application is provided.
Journal: Econometric Reviews
Pages: 142-170
Issue: 2
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607100
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607100
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:2:p:142-170
Template-Type: ReDIF-Article 1.0
Author-Name: Myoung-Jae Lee
Author-X-Name-First: Myoung-Jae
Author-X-Name-Last: Lee
Title: Semiparametric Estimators for Limited Dependent Variable (LDV) Models with Endogenous Regressors
Abstract: This article reviews semiparametric estimators for limited dependent variable (LDV) models with endogenous regressors, where nonlinearity and nonseparability pose difficulties. We first introduce six main approaches in the linear equation system literature to handle endogenous regressors with linear projections: (i) ‘substitution’ replacing the endogenous regressors with their projected versions on the system exogenous regressors x, (ii) instrumental variable estimator (IVE) based on E{(error) × x} = 0, (iii) ‘model-projection’ turning the original model into a model in terms of only x-projected variables, (iv) ‘system reduced form (RF)’ finding RF parameters first and then the structural form (SF) parameters, (v) ‘artificial instrumental regressor’ using instruments as artificial regressors with zero coefficients, and (vi) ‘control function’ adding an extra term as a regressor to control for the endogeneity source. We then check if these approaches are applicable to LDV models using conditional mean/quantiles instead of linear projection. The six approaches provide a convenient forum on which semiparametric estimators in the literature can be categorized, although there are a few exceptions. The pros and cons of the approaches are discussed, and a small-scale simulation study is provided for some reviewed estimators.
Journal: Econometric Reviews
Pages: 171-214
Issue: 2
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607101
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607101
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:2:p:171-214
Template-Type: ReDIF-Article 1.0
Author-Name: Ron Smith
Author-X-Name-First: Ron
Author-X-Name-Last: Smith
Title: Review of Microfit5
Journal: Econometric Reviews
Pages: 241-244
Issue: 2
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607102
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607102
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:2:p:241-244
Template-Type: ReDIF-Article 1.0
Author-Name: James A. Duffy
Author-X-Name-First: James A.
Author-X-Name-Last: Duffy
Author-Name: David F. Hendry
Author-X-Name-First: David F.
Author-X-Name-Last: Hendry
Title: The impact of integrated measurement errors on modeling long-run macroeconomic time series
Abstract:
Data spanning long time periods, such as that over 1860–2012 for the UK, seem likely to have substantial errors of measurement that may even be integrated of order one, but which are probably cointegrated for cognate variables. We analyze and simulate the impacts of such measurement errors on parameter estimates and tests in a bivariate cointegrated system with trends and location shifts which reflect the many major turbulent events that have occurred historically. When trends or shifts therein are large, cointegration analysis is not much affected by such measurement errors, leading to conventional stationary attenuation biases dependent on the measurement error variance, unlike the outcome when there are no offsetting shifts or trends.
Journal: Econometric Reviews
Pages: 568-587
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307177
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307177
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:568-587
Template-Type: ReDIF-Article 1.0
Author-Name: Francis X. Diebold
Author-X-Name-First: Francis X.
Author-X-Name-Last: Diebold
Author-Name: Minchul Shin
Author-X-Name-First: Minchul
Author-X-Name-Last: Shin
Title: Assessing point forecast accuracy by stochastic error distance
Abstract:
We propose point forecast accuracy measures based directly on distance of the forecast-error c.d.f. from the unit step function at 0 (“stochastic error distance,” or SED). We provide a precise characterization of the relationship between SED and standard predictive loss functions, and we show that all such loss functions can be written as weighted SEDs. The leading case is absolute error loss. Among other things, this suggests shifting attention away from conditional-mean forecasts and toward conditional-median forecasts.
Journal: Econometric Reviews
Pages: 588-598
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307247
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307247
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:588-598
Template-Type: ReDIF-Article 1.0
Author-Name: Paul Catani
Author-X-Name-First: Paul
Author-X-Name-Last: Catani
Author-Name: Timo Teräsvirta
Author-X-Name-First: Timo
Author-X-Name-Last: Teräsvirta
Author-Name: Meiqun Yin
Author-X-Name-First: Meiqun
Author-X-Name-Last: Yin
Title: A Lagrange multiplier test for testing the adequacy of constant conditional correlation GARCH model
Abstract:
A Lagrange multiplier test for testing the parametric structure of a constant conditional correlation-generalized autoregressive conditional heteroskedasticity (CCC-GARCH) model is proposed. The test is based on decomposing the CCC-GARCH model multiplicatively into two components, one of which represents the null model, whereas the other one describes the misspecification. A simulation study shows that the test has good finite sample properties. We compare the test with other tests for misspecification of multivariate GARCH models. The test has high power against alternatives where the misspecification is in the GARCH parameters and is superior to other tests. The test is not greatly affected by misspecification in the conditional correlations and is therefore well suited for considering misspecification of GARCH equations.
Journal: Econometric Reviews
Pages: 599-621
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307311
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307311
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:599-621
Template-Type: ReDIF-Article 1.0
Author-Name: Marcelo C. Medeiros
Author-X-Name-First: Marcelo C.
Author-X-Name-Last: Medeiros
Author-Name: Eduardo F. Mendes
Author-X-Name-First: Eduardo F.
Author-X-Name-Last: Mendes
Title: Adaptive LASSO estimation for ARDL models with GARCH innovations
Abstract:
In this paper, we show the validity of the adaptive least absolute shrinkage and selection operator (LASSO) procedure in estimating stationary autoregressive distributed lag (p,q) models with innovations in a broad class of conditionally heteroskedastic models. We show that the adaptive LASSO selects the relevant variables with probability converging to one and that the estimator is oracle efficient, meaning that its distribution converges to the same distribution as that of the oracle-assisted least squares, i.e., the least squares estimator calculated as if we knew the set of relevant variables beforehand. Finally, we show that the LASSO estimator can be used to construct the initial weights. The performance of the method in finite samples is illustrated using Monte Carlo simulation.
Journal: Econometric Reviews
Pages: 622-637
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307319
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307319
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:622-637
Template-Type: ReDIF-Article 1.0
Author-Name: Manabu Asai
Author-X-Name-First: Manabu
Author-X-Name-Last: Asai
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: The impact of jumps and leverage in forecasting covolatility
Abstract:
The paper investigates the impact of jumps in forecasting covolatility, accommodating leverage effects. We modify the preaveraged truncated covariance estimator of Koike (2016) such that the estimated matrix is positive definite. Using this approach, we can disentangle the estimates of the integrated covolatility matrix and jump variations from the quadratic covariation matrix. Empirical results for three stocks traded on the New York Stock Exchange indicate that the cojumps of two assets have a significant impact on future covolatility, but the impact is negligible for forecasting weekly and monthly horizons.
Journal: Econometric Reviews
Pages: 638-650
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307326
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307326
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:638-650
Template-Type: ReDIF-Article 1.0
Author-Name: Sam Astill
Author-X-Name-First: Sam
Author-X-Name-Last: Astill
Author-Name: David I. Harvey
Author-X-Name-First: David I.
Author-X-Name-Last: Harvey
Author-Name: Stephen J. Leybourne
Author-X-Name-First: Stephen J.
Author-X-Name-Last: Leybourne
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Title: Tests for an end-of-sample bubble in financial time series
Abstract:
In this paper, we examine the issue of detecting explosive behavior in economic and financial time series when an explosive episode is both ongoing at the end of the sample and of finite length. We propose a testing strategy based on a subsampling method in which a suitable test statistic is calculated on a finite number of end-of-sample observations, with a critical value obtained using subsample test statistics calculated on the remaining observations. This approach also has the practical advantage that, by virtue of how the critical values are obtained, it can deliver tests which are robust to, among other things, conditional heteroskedasticity and serial correlation in the driving shocks. We also explore modifications of the raw statistics to account for unconditional heteroskedasticity using studentization and a White-type correction. We evaluate the finite sample size and power properties of our proposed procedures and find that they offer promising levels of power, suggesting the possibility for earlier detection of end-of-sample bubble episodes compared to existing procedures.
Journal: Econometric Reviews
Pages: 651-666
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307490
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307490
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:651-666
Template-Type: ReDIF-Article 1.0
Author-Name: Alastair R. Hall
Author-X-Name-First: Alastair R.
Author-X-Name-Last: Hall
Author-Name: Denise R. Osborn
Author-X-Name-First: Denise R.
Author-X-Name-Last: Osborn
Author-Name: Nikolaos Sakkas
Author-X-Name-First: Nikolaos
Author-X-Name-Last: Sakkas
Title: The asymptotic behaviour of the residual sum of squares in models with multiple break points
Abstract:
Models with multiple discrete breaks in parameters are usually estimated via least squares. This paper, first, derives the asymptotic expectation of the residual sum of squares and shows that the number of estimated break points and the number of regression parameters affect the expectation differently. Second, we propose a statistic for testing the joint hypothesis that the breaks occur at specified points in the sample. Our analytical results cover models estimated by the ordinary, nonlinear, and two-stage least squares. An application to U.S. monetary policy rejects the assumption that breaks are associated with changes in the chair of the Fed.
Journal: Econometric Reviews
Pages: 667-698
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307523
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307523
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:667-698
Template-Type: ReDIF-Article 1.0
Author-Name: Nicholas M. Kiefer
Author-X-Name-First: Nicholas M.
Author-X-Name-Last: Kiefer
Title: Correlated defaults, temporal correlation, expert information and predictability of default rates
Abstract:
Dependence among defaults both across assets and over time is an important characteristic of financial risk. A Bayesian approach to default rate estimation is proposed and illustrated using prior distributions assessed from an experienced industry expert. Two extensions of the binomial model are proposed. The first allows correlated defaults yet remains consistent with Basel II’s asymptotic single-factor model. The second adds temporal correlation in default rates through autocorrelation in the systemic factor. Implications for the predictability of default rates are considered. The single-factor model generates more forecast uncertainty than does the parameter uncertainty. A robustness exercise illustrates that the correlation indicated by the data is much smaller than that specified in the Basel II regulations.
Journal: Econometric Reviews
Pages: 699-712
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307547
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307547
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:699-712
Template-Type: ReDIF-Article 1.0
Author-Name: Jean-Marie Dufour
Author-X-Name-First: Jean-Marie
Author-X-Name-Last: Dufour
Author-Name: Richard Luger
Author-X-Name-First: Richard
Author-X-Name-Last: Luger
Title: Identification-robust moment-based tests for Markov switching in autoregressive models
Abstract:
This paper develops tests of the null hypothesis of linearity in the context of autoregressive models with Markov-switching means and variances. These tests are robust to the identification failures that plague conventional likelihood-based inference methods. The approach exploits the moments of normal mixtures implied by the regime-switching process and uses Monte Carlo test techniques to deal with the presence of an autoregressive component in the model specification. The proposed tests have very respectable power in comparison with the optimal tests for Markov-switching parameters of Carrasco et al. (2014), and they are also quite attractive owing to their computational simplicity. The new tests are illustrated with an empirical application to an autoregressive model of USA output growth.
Journal: Econometric Reviews
Pages: 713-727
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307548
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307548
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:713-727
Template-Type: ReDIF-Article 1.0
Author-Name: Yongmiao Hong
Author-X-Name-First: Yongmiao
Author-X-Name-Last: Hong
Author-Name: Xia Wang
Author-X-Name-First: Xia
Author-X-Name-Last: Wang
Author-Name: Wenjie Zhang
Author-X-Name-First: Wenjie
Author-X-Name-Last: Zhang
Author-Name: Shouyang Wang
Author-X-Name-First: Shouyang
Author-X-Name-Last: Wang
Title: An efficient integrated nonparametric entropy estimator of serial dependence
Abstract:
We propose an efficient numerical integration-based nonparametric entropy estimator for serial dependence and show that the new entropy estimator has a smaller asymptotic variance than Hong and White’s (2005) sample average-based estimator. This delivers an asymptotically more efficient test for serial dependence. In particular, the uniform kernel gives the smallest asymptotic variance for the numerical integration-based entropy estimator over a class of positive kernel functions. Moreover, the naive bootstrap can be used to obtain accurate inferences for our test, whereas it is not applicable to Hong and White’s (2005) sample averaging approach. A simulation study confirms the merits of our approach.
Journal: Econometric Reviews
Pages: 728-780
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307564
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307564
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:728-780
Template-Type: ReDIF-Article 1.0
Author-Name: Amos Golan
Author-X-Name-First: Amos
Author-X-Name-Last: Golan
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Title: Interval estimation: An information theoretic approach
Abstract:
We develop here an alternative information theoretic method of inference for problems in which all of the observed information is in terms of intervals. We focus on the unconditional case in which the observed information is in terms of the minimal and maximal values at each period. Given interval data, we infer the joint and marginal distributions of the interval variable and its range. Our inferential procedure is based on entropy maximization subject to multidimensional moment conditions and normalization in which the entropy is defined over discretized intervals. The discretization is based on theory or empirically observed quantities. The number of estimated parameters is independent of the discretization, so the level of discretization does not change the fundamental level of complexity of our model. As an example, we apply our method to study the weather pattern for Los Angeles and New York City across the last century.
Journal: Econometric Reviews
Pages: 781-795
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307573
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307573
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:781-795
Template-Type: ReDIF-Article 1.0
Author-Name: Mehdi Shoja
Author-X-Name-First: Mehdi
Author-X-Name-Last: Shoja
Author-Name: Ehsan S. Soofi
Author-X-Name-First: Ehsan S.
Author-X-Name-Last: Soofi
Title: Uncertainty, information, and disagreement of economic forecasters
Abstract:
An information framework is proposed for studying uncertainty and disagreement of economic forecasters. This framework builds upon the mixture model of combining density forecasts through a systematic application of the information theory. The framework encompasses the measures used in the literature and leads to their generalizations. The focal measure is the Jensen–Shannon divergence of the mixture which admits Kullback–Leibler and mutual information representations. Illustrations include exploring the dynamics of the individual and aggregate uncertainty about the US inflation rate using the survey of professional forecasters (SPF). We show that the normalized entropy index corrects some of the distortions caused by changes of the design of the SPF over time. Bayesian hierarchical models are used to examine the association of the inflation uncertainty with the anticipated inflation and the dispersion of point forecasts. Implementation of the information framework based on the variance and Dirichlet model for capturing uncertainty about the probability distribution of the economic variable are briefly discussed.
Journal: Econometric Reviews
Pages: 796-817
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307577
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307577
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:796-817
Template-Type: ReDIF-Article 1.0
Author-Name: Peter C. B. Phillips
Author-X-Name-First: Peter C. B.
Author-X-Name-Last: Phillips
Title: Reduced forms and weak instrumentation
Abstract:
This paper develops exact finite sample and asymptotic distributions for a class of reduced form estimators and predictors, allowing for the presence of unidentified or weakly identified structural equations. Weak instrument asymptotic theory is developed directly from finite sample results, unifying earlier findings and showing the usefulness of structural information in making predictions from reduced form systems in applications. Asymptotic results are reported for predictions from models with many weak instruments. Of particular interest is the finding that, in unidentified and weakly identified structural models, partially restricted reduced form predictors have considerably smaller forecast mean square errors than unrestricted reduced forms. These results are related to the use of shrinkage methods in system-wide reduced form estimation.
Journal: Econometric Reviews
Pages: 818-839
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307578
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307578
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:818-839
Template-Type: ReDIF-Article 1.0
Author-Name: Bruce E. Hansen
Author-X-Name-First: Bruce E.
Author-X-Name-Last: Hansen
Title: Stein-like 2SLS estimator
Abstract:
Maasoumi (1978) proposed a Stein-like estimator for simultaneous equations and showed that his Stein shrinkage estimator has bounded finite sample risk, unlike the three-stage least squares estimator. We revisit his proposal by investigating Stein-like shrinkage in the context of two-stage least squares (2SLS) estimation of a structural parameter. Our estimator follows Maasoumi (1978) in taking a weighted average of the 2SLS and ordinary least squares estimators, with the weight depending inversely on the Hausman (1978) statistic for exogeneity. Using a local-to-exogenous asymptotic theory, we derive the asymptotic distribution of the Stein estimator and calculate its asymptotic risk. We find that if the number of endogenous variables exceeds 2, then the shrinkage estimator has strictly smaller risk than the 2SLS estimator, extending the classic result of James and Stein (1961). In a simple simulation experiment, we show that the shrinkage estimator has substantially reduced finite sample median squared error relative to the standard 2SLS estimator.
Journal: Econometric Reviews
Pages: 840-852
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307579
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307579
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:840-852
Template-Type: ReDIF-Article 1.0
Author-Name: Badi H. Baltagi
Author-X-Name-First: Badi H.
Author-X-Name-Last: Baltagi
Author-Name: Chihwa Kao
Author-X-Name-First: Chihwa
Author-X-Name-Last: Kao
Author-Name: Fa Wang
Author-X-Name-First: Fa
Author-X-Name-Last: Wang
Title: Asymptotic power of the sphericity test under weak and strong factors in a fixed effects panel data model
Abstract:
This paper studies the asymptotic power for the sphericity test in a fixed effect panel data model proposed by Baltagi et al. (2011), (JBFK). This is done under the alternative hypotheses of weak and strong factors. By weak factors, we mean that the Euclidean norm of the vector of the factor loadings is O(1). By strong factors, we mean that the Euclidean norm of the vector of factor loadings is O(√n), where n is the number of individuals in the panel. To derive the limiting distribution of JBFK under the alternative, we first derive the limiting distribution of its raw data counterpart. Our results show that, when the factor is strong, the test statistic diverges in probability to infinity as fast as Op(nT). However, when the factor is weak, its limiting distribution is a rightward mean shift of the limit distribution under the null. Second, we derive the asymptotic behavior of the difference between JBFK and its raw data counterpart. Our results show that when the factor is strong, this difference is as large as Op(n). In contrast, when the factor is weak, this difference converges in probability to a constant. Taken together, these results imply that when the factor is strong, JBFK is consistent, but when the factor is weak, JBFK is inconsistent even though its asymptotic power is nontrivial.
Journal: Econometric Reviews
Pages: 853-882
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307580
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307580
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:853-882
Template-Type: ReDIF-Article 1.0
Author-Name: Cheng Hsiao
Author-X-Name-First: Cheng
Author-X-Name-Last: Hsiao
Author-Name: Qiankun Zhou
Author-X-Name-First: Qiankun
Author-X-Name-Last: Zhou
Title: First difference or forward demeaning: Implications for the method of moments estimators
Abstract:
In this paper, we consider the method of moment estimation for dynamic panel models based on either forward demeaning (FOD) or first difference (FD) transformations to eliminate the individual-specific effects, using either all lags or one lag as instruments. We show that the Arellano–Bond-type generalized method of moment (GMM) based on FD is asymptotically biased of order √c using all lags or one lag as instruments, where c = T/N < ∞ as N,T→∞. For GMM based on FOD, it is asymptotically biased of order √c when using all lags, but it is asymptotically unbiased when using only a fixed number of lags as instruments. We also discuss these findings in light of the simple IV estimator. Monte Carlo simulations confirm our findings in this paper.
Journal: Econometric Reviews
Pages: 883-897
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307594
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307594
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:883-897
Template-Type: ReDIF-Article 1.0
Author-Name: Majid M. Al-Sadoon
Author-X-Name-First: Majid M.
Author-X-Name-Last: Al-Sadoon
Author-Name: Tong Li
Author-X-Name-First: Tong
Author-X-Name-Last: Li
Author-Name: M. Hashem Pesaran
Author-X-Name-First: M. Hashem
Author-X-Name-Last: Pesaran
Title: Exponential class of dynamic binary choice panel data models with fixed effects
Abstract:
This paper proposes an exponential class of dynamic binary choice panel data models for the analysis of short T (time dimension) large N (cross section dimension) panel data sets that allow for unobserved heterogeneity (fixed effects) to be arbitrarily correlated with the covariates. The paper derives moment conditions that are invariant to the fixed effects which are then used to identify and estimate the parameters of the model. Accordingly, generalized method of moments (GMM) estimators are proposed that are consistent and asymptotically normally distributed at the root-N rate. We also study the conditional likelihood approach and show that under exponential specification, it can identify the effect of state dependence but not the effects of other covariates. Monte Carlo experiments show satisfactory finite sample performance for the proposed estimators and investigate their robustness to misspecification.
Journal: Econometric Reviews
Pages: 898-927
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307597
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307597
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:898-927
Template-Type: ReDIF-Article 1.0
Author-Name: Bertille Antoine
Author-X-Name-First: Bertille
Author-X-Name-Last: Antoine
Author-Name: Eric Renault
Author-X-Name-First: Eric
Author-X-Name-Last: Renault
Title: On the relevance of weaker instruments
Abstract:
We study the asymptotic properties of the standard GMM estimator when additional moment restrictions, weaker than the original ones, are available. We provide conditions under which these additional weaker restrictions improve the efficiency of the GMM estimator. To detect “spurious” identification that may come from invalid moments, we rely on the Hansen J-test that assesses the compatibility between existing restrictions and additional ones. Our simulations reveal that the J-test has good power properties and that its power increases with the weakness of the additional restrictions. Our theoretical characterization of the J-test provides some intuition for why that is.
Journal: Econometric Reviews
Pages: 928-945
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307598
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307598
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:928-945
Template-Type: ReDIF-Article 1.0
Author-Name: Xu Han
Author-X-Name-First: Xu
Author-X-Name-Last: Han
Author-Name: Mehmet Caner
Author-X-Name-First: Mehmet
Author-X-Name-Last: Caner
Title: Determining the number of factors with potentially strong within-block correlations in error terms
Abstract:
We develop methods to estimate the number of factors when error terms have potentially strong correlations in the cross-sectional dimension. The information criteria proposed by Bai and Ng (2002) require the cross-sectional correlations between the error terms to be weak. Violation of this weak correlation assumption may lead to inconsistent estimates of the number of factors. We establish two data-dependent estimators that are consistent whether the error terms are weakly or strongly correlated in the cross-sectional dimension. To handle potentially strong cross-sectional correlations between the error terms, we use a block structure in which the within-block correlation may either be weak or strong, but the between-block correlation is limited. Our estimators allow imperfect knowledge and a moderate misspecification of the block structure. Monte-Carlo simulation results show that our estimators perform similarly to existing methods for cases in which the conventional weak correlation assumption is satisfied. When the error terms have a strong cross-sectional correlation, our estimators outperform the existing methods.
Journal: Econometric Reviews
Pages: 946-969
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307599
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307599
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:946-969
Template-Type: ReDIF-Article 1.0
Author-Name: Cong Li
Author-X-Name-First: Cong
Author-X-Name-Last: Li
Author-Name: Hongjun Li
Author-X-Name-First: Hongjun
Author-X-Name-Last: Li
Author-Name: Jeffrey S. Racine
Author-X-Name-First: Jeffrey S.
Author-X-Name-Last: Racine
Title: Cross-validated mixed-datatype bandwidth selection for nonparametric cumulative distribution/survivor functions
Abstract:
We propose a computationally efficient data-driven least square cross-validation method to optimally select smoothing parameters for the nonparametric estimation of cumulative distribution/survivor functions. We allow for general multivariate covariates that can be continuous, discrete/ordered categorical, or a mix of either. We provide asymptotic analysis, examine finite-sample properties through Monte Carlo simulation, and consider an illustration involving nonparametric copula modeling. We also demonstrate how the approach can be used to construct a smooth Kolmogorov–Smirnov test that has a slightly better power profile than its nonsmooth counterpart.
Journal: Econometric Reviews
Pages: 970-987
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307900
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307900
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:970-987
Template-Type: ReDIF-Article 1.0
Author-Name: Zheng Li
Author-X-Name-First: Zheng
Author-X-Name-Last: Li
Author-Name: Guannan Liu
Author-X-Name-First: Guannan
Author-X-Name-Last: Liu
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Title: Nonparametric Knn estimation with monotone constraints
Abstract:
The K-nearest-neighbor (Knn) method is known to be more suitable in fitting nonparametrically specified curves than the kernel method (with a globally fixed smoothing parameter) when data sets are highly unevenly distributed. In this paper, we propose to estimate a nonparametric regression function subject to a monotonicity restriction using the Knn method. We also propose using a new convergence criterion to measure the closeness between an unconstrained and the (monotone) constrained Knn-estimated curves. This method is an alternative to the monotone kernel methods proposed by Hall and Huang (2001), and Du et al. (2013). We use a bootstrap procedure for testing the validity of the monotone restriction. We apply our method to the “Job Market Matching” data taken from Gan and Li (2016) and find that the unconstrained/constrained Knn estimators work better than kernel estimators for this type of highly unevenly distributed data.
Journal: Econometric Reviews
Pages: 988-1006
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307904
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307904
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:988-1006
Template-Type: ReDIF-Article 1.0
Author-Name: Russell Davidson
Author-X-Name-First: Russell
Author-X-Name-Last: Davidson
Title: Diagnostics for the bootstrap and fast double bootstrap
Abstract:
The bootstrap is typically less reliable in the context of time-series models with serial correlation of unknown form than when regularity conditions for the conventional IID bootstrap apply. It is, therefore, useful to have diagnostic techniques capable of evaluating bootstrap performance in specific cases. Those suggested in this paper are closely related to the fast double bootstrap (FDB) and are not computationally intensive. They can also be used to gauge the performance of the FDB itself. Examples of bootstrapping time series are presented, which illustrate the diagnostic procedures, and show how the results can cast light on bootstrap performance.
Journal: Econometric Reviews
Pages: 1021-1038
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307918
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307918
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:1021-1038
Template-Type: ReDIF-Article 1.0
Author-Name: Yong Bao
Author-X-Name-First: Yong
Author-X-Name-Last: Bao
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Author-Name: Yun Wang
Author-X-Name-First: Yun
Author-X-Name-Last: Wang
Title: Distribution of the mean reversion estimator in the Ornstein–Uhlenbeck process
Abstract:
We derive the exact distribution of the maximum likelihood estimator of the mean reversion parameter (κ) in the Ornstein–Uhlenbeck process using numerical integration through analytical evaluation of a joint characteristic function. Different scenarios are considered: known or unknown drift term, fixed or random start-up value, and zero or positive κ. Monte Carlo results demonstrate the remarkably reliable performance of our exact approach across all the scenarios. In comparison, misleading results may arise under the asymptotic distributions, including the advocated infill asymptotic distribution, which performs poorly in the tails when there is no intercept in the regression and the starting value of the process is nonzero.
Journal: Econometric Reviews
Pages: 1039-1056
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1307977
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1307977
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:1039-1056
Template-Type: ReDIF-Article 1.0
Author-Name: Yanqin Fan
Author-X-Name-First: Yanqin
Author-X-Name-Last: Fan
Author-Name: Carlos A. Manzanares
Author-X-Name-First: Carlos A.
Author-X-Name-Last: Manzanares
Title: Partial identification of average treatment effects on the treated through difference-in-differences
Abstract:
The difference-in-differences (DID) method is widely used as a tool for identifying causal effects of treatments in program evaluation. When panel data sets are available, it is well-known that the average treatment effect on the treated (ATT) is point-identified under the DID setup. If a panel data set is not available, repeated cross sections (pretreatment and posttreatment) may be used, but may not point-identify the ATT. This paper systematically studies the identification of the ATT under the DID setup when posttreatment treatment status is unknown for the pretreatment sample. This is done through a novel application of an extension of a continuous version of the classical monotone rearrangement inequality which allows for general copula bounds. The identifying power of an instrumental variable and of a ‘matched subsample’ is also explored. Finally, we illustrate our approach by estimating the effect of the Americans with Disabilities Act of 1991 on employment outcomes of the disabled.
Journal: Econometric Reviews
Pages: 1057-1080
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1308036
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1308036
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:1057-1080
Template-Type: ReDIF-Article 1.0
Author-Name: Christine Amsler
Author-X-Name-First: Christine
Author-X-Name-Last: Amsler
Author-Name: Christopher J. O’Donnell
Author-X-Name-First: Christopher J.
Author-X-Name-Last: O’Donnell
Author-Name: Peter Schmidt
Author-X-Name-First: Peter
Author-X-Name-Last: Schmidt
Title: Stochastic metafrontiers
Abstract:
We consider the case of production units arranged into a number of groups. All units within a group choose output–input combinations from the same production possibilities set that is represented by a stochastic frontier model. The metafrontier is the envelope of the group-specific frontiers. We are interested in the metafrontier distance, which is the amount by which the group-specific frontier lies below the metafrontier. Previous work has measured the metafrontier distance using the deterministic portion of the frontier. In a stochastic frontier model, this is not appropriate. We show how to evaluate the metafrontier distance, and we demonstrate the empirical relevance of this issue.
Journal: Econometric Reviews
Pages: 1007-1020
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1308345
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1308345
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:1007-1020
Template-Type: ReDIF-Article 1.0
Author-Name: Peter C. B. Phillips
Author-X-Name-First: Peter C. B.
Author-X-Name-Last: Phillips
Author-Name: Aman Ullah
Author-X-Name-First: Aman
Author-X-Name-Last: Ullah
Title: Econometric Reviews honors Esfandiar Maasoumi
Journal: Econometric Reviews
Pages: 563-567
Issue: 6-9
Volume: 36
Year: 2017
Month: 10
X-DOI: 10.1080/07474938.2017.1312074
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1312074
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:6-9:p:563-567
Template-Type: ReDIF-Article 1.0
Author-Name: Joshua C. C. Chan
Author-X-Name-First: Joshua C. C.
Author-X-Name-Last: Chan
Title: Specification tests for time-varying parameter models with stochastic volatility
Abstract:
We propose an easy technique to test for time-variation in coefficients and volatilities. Specifically, by using a noncentered parameterization for state space models, we develop a method to directly calculate the relevant Bayes factor using the Savage–Dickey density ratio—thus avoiding the computation of the marginal likelihood altogether. The proposed methodology is illustrated via two empirical applications. In the first application, we test for time-variation in the volatility of inflation in the G7 countries. The second application investigates if there is substantial time-variation in the nonaccelerating inflation rate of unemployment (NAIRU) in the United States.
Journal: Econometric Reviews
Pages: 807-823
Issue: 8
Volume: 37
Year: 2018
Month: 9
X-DOI: 10.1080/07474938.2016.1167948
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1167948
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:8:p:807-823
Template-Type: ReDIF-Article 1.0
Author-Name: Guillaume Gaetan Martinet
Author-X-Name-First: Guillaume Gaetan
Author-X-Name-Last: Martinet
Author-Name: Michael McAleer
Author-X-Name-First: Michael
Author-X-Name-Last: McAleer
Title: On the invertibility of EGARCH(p, q)
Abstract:
Of the two most widely estimated univariate asymmetric conditional volatility models, the exponential GARCH (or EGARCH) specification is said to be able to capture asymmetry, which refers to the different effects on conditional volatility of positive and negative effects of equal magnitude, and leverage, which refers to the negative correlation between the returns shocks and subsequent shocks to volatility. However, the statistical properties of the (quasi-)maximum likelihood estimator (QMLE) of the EGARCH(p, q) parameters are not available under general conditions, but only for special cases under highly restrictive and unverifiable sufficient conditions, such as EGARCH(1,0) or EGARCH(1,1), and possibly only under simulation. A limitation in the development of asymptotic properties of the QMLE for the EGARCH(p, q) model is the lack of an invertibility condition for the returns shocks underlying the model. It is shown in this article that the EGARCH(p, q) model can be derived from a stochastic process, for which sufficient invertibility conditions can be stated simply and explicitly when the parameters respect a simple condition. (Using the notation introduced in Part 2, this refers to the cases where α ≥ |γ| or α ≤ −|γ|; the first inequality is generally assumed in the literature related to the invertibility of EGARCH.) This article provides (in the Appendix) an argument for the possible lack of invertibility when these conditions are not met. This will be useful in reinterpreting the existing properties of the QMLE of the EGARCH(p, q) parameters.
Journal: Econometric Reviews
Pages: 824-849
Issue: 8
Volume: 37
Year: 2018
Month: 9
X-DOI: 10.1080/07474938.2016.1167994
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1167994
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:8:p:824-849
Template-Type: ReDIF-Article 1.0
Author-Name: Victor Troster
Author-X-Name-First: Victor
Author-X-Name-Last: Troster
Title: Testing for Granger-causality in quantiles
Abstract:
This paper proposes a consistent parametric test of Granger-causality in quantiles. Although the concept of Granger-causality is defined in terms of the conditional distribution, most articles have tested Granger-causality using conditional mean regression models in which the causal relations are linear. Rather than focusing on a single part of the conditional distribution, we develop a test that evaluates nonlinear causalities and possible causal relations in all conditional quantiles, which provides a sufficient condition for Granger-causality when all quantiles are considered. The proposed test statistic has correct asymptotic size, is consistent against fixed alternatives, and has power against Pitman deviations from the null hypothesis. As the proposed test statistic is asymptotically nonpivotal, we tabulate critical values via a subsampling approach. We present Monte Carlo evidence and an application considering the causal relation between the gold price, the USD/GBP exchange rate, and the oil price.
Journal: Econometric Reviews
Pages: 850-866
Issue: 8
Volume: 37
Year: 2018
Month: 9
X-DOI: 10.1080/07474938.2016.1172400
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1172400
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:8:p:850-866
Template-Type: ReDIF-Article 1.0
Author-Name: Haiqi Li
Author-X-Name-First: Haiqi
Author-X-Name-Last: Li
Author-Name: Sung Y. Park
Author-X-Name-First: Sung Y.
Author-X-Name-Last: Park
Title: Testing for a unit root in a nonlinear quantile autoregression framework
Abstract:
The nonlinear unit root test of Kapetanios, Shin, and Snell (2003) (KSS) has attracted much recent attention. However, the KSS test relies on the ordinary least squares (OLS) estimator, which is not robust to a heavy-tailed distribution and, in practice, the test suffers from a large power loss. This study develops three kinds of quantile nonlinear unit root tests: the quantile t-ratio test; the quantile Kolmogorov–Smirnov test; and the quantile Cramer–von Mises test. A Monte Carlo simulation shows that these tests have significantly better power when an innovation follows a non-normal distribution. In addition, the quantile t-ratio test can reveal the heterogeneity of the asymmetric dynamics in a time series. In our empirical studies, we investigate the unit root properties of U.S. macroeconomic time series and the real effective exchange rates for 61 countries. The results show that our proposed tests reject the unit root hypothesis more often, indicating that the series are likely to be asymmetric nonlinear reverting processes.
Journal: Econometric Reviews
Pages: 867-892
Issue: 8
Volume: 37
Year: 2018
Month: 9
X-DOI: 10.1080/00927872.2016.1178871
File-URL: http://hdl.handle.net/10.1080/00927872.2016.1178871
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:8:p:867-892
Template-Type: ReDIF-Article 1.0
Author-Name: Artūras Juodis
Author-X-Name-First: Artūras
Author-X-Name-Last: Juodis
Author-Name: Vasilis Sarafidis
Author-X-Name-First: Vasilis
Author-X-Name-Last: Sarafidis
Title: Fixed T dynamic panel data estimators with multifactor errors
Abstract:
This article analyzes a growing group of fixed T dynamic panel data estimators with a multifactor error structure. We use a unified notational approach to describe these estimators and discuss their properties in terms of deviations from an underlying set of basic assumptions. Furthermore, we consider the extendability of these estimators to practical situations that may frequently arise, such as their ability to accommodate unbalanced panels and common observed factors. Using a large-scale simulation exercise, we consider scenarios that remain largely unexplored in the literature, albeit being of great empirical relevance. In particular, we examine (i) the effect of the presence of weakly exogenous covariates, (ii) the effect of changing the magnitude of the correlation between the factor loadings of the dependent variable and those of the covariates, (iii) the impact of the number of moment conditions on bias and size for GMM estimators, and finally (iv) the effect of sample size. We apply each of these estimators to a crime application using a panel data set of local government authorities in New South Wales, Australia; we find that the results bear substantially different policy implications relative to those potentially derived from standard dynamic panel GMM estimators. Thus, our study may serve as a useful guide to practitioners who wish to allow for multiplicative sources of unobserved heterogeneity in their model.
Journal: Econometric Reviews
Pages: 893-929
Issue: 8
Volume: 37
Year: 2018
Month: 9
X-DOI: 10.1080/00927872.2016.1178875
File-URL: http://hdl.handle.net/10.1080/00927872.2016.1178875
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:8:p:893-929
Template-Type: ReDIF-Article 1.0
Author-Name: Milan Nedeljkovic
Author-X-Name-First: Milan
Author-X-Name-Last: Nedeljkovic
Title: A Projection-Based Nonparametric Test of Conditional Quantile Independence
Abstract:
This paper proposes a nonparametric procedure for testing conditional quantile independence using projections. Relative to existing smoothed nonparametric tests, the resulting test statistic: (i) detects the high frequency local alternatives that converge to the null hypothesis in probability at a faster rate and, (ii) yields improvements in the finite sample power when a large number of variables are included under the alternative. In addition, it allows the researcher to include qualitative information and, if desired, direct the test against specific subsets of alternatives without imposing any functional form on them. We use the weighted Nadaraya-Watson (WNW) estimator of the conditional quantile function, avoiding the boundary problems in estimation and testing, and prove weak uniform consistency (with rate) of the WNW estimator for absolutely regular processes. The procedure is applied to a study of risk spillovers among banks. We show that the methodology generalizes some of the recently proposed measures of systemic risk, and we use the quantile framework to assess the intensity of risk spillovers among individual financial institutions.
Journal: Econometric Reviews
Pages: 1-26
Issue: 1
Volume: 39
Year: 2020
Month: 1
X-DOI: 10.1080/07474938.2019.1690192
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690192
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:1:p:1-26
Template-Type: ReDIF-Article 1.0
Author-Name: Ruijun Bu
Author-X-Name-First: Ruijun
Author-X-Name-Last: Bu
Author-Name: Fredj Jawadi
Author-X-Name-First: Fredj
Author-X-Name-Last: Jawadi
Author-Name: Yuyi Li
Author-X-Name-First: Yuyi
Author-X-Name-Last: Li
Title: A multifactor transformed diffusion model with applications to VIX and VIX futures
Abstract:
Transformed diffusions (TDs) have become increasingly popular in financial modeling for their model flexibility and tractability. While existing TD models are predominately one-factor models, empirical evidence often prefers models with multiple factors. We propose a novel distribution-driven nonlinear multifactor TD model with latent components. Our model is a transformation of an underlying multivariate Ornstein–Uhlenbeck (MVOU) process, where the transformation function is endogenously specified by a flexible parametric stationary distribution of the observed variable. Computationally efficient exact likelihood inference can be implemented for our model using a modified Kalman filter algorithm, and the transformed affine structure also allows us to price derivatives in semi-closed form. We compare the proposed multifactor model with existing TD models for modeling VIX and pricing VIX futures. Our results show that the proposed model outperforms all existing TD models both in the sample and out of the sample consistently across all categories and scenarios of our comparison.
Journal: Econometric Reviews
Pages: 27-53
Issue: 1
Volume: 39
Year: 2020
Month: 1
X-DOI: 10.1080/07474938.2019.1690195
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690195
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:1:p:27-53
Template-Type: ReDIF-Article 1.0
Author-Name: Fredj Jawadi
Author-X-Name-First: Fredj
Author-X-Name-Last: Jawadi
Author-Name: Zied Ftiti
Author-X-Name-First: Zied
Author-X-Name-Last: Ftiti
Author-Name: Waël Louhichi
Author-X-Name-First: Waël
Author-X-Name-Last: Louhichi
Title: Forecasting energy futures volatility with threshold augmented heterogeneous autoregressive jump models
Abstract:
This study forecasts the volatility of two energy futures markets (oil and gas), using high-frequency data. We, first, disentangle volatility into continuous volatility and jumps. Second, we apply wavelet analysis to study the relationship between volume and the volatility measures for different horizons. Third, we augment the heterogeneous autoregressive (HAR) model by nonlinearly including both jumps and volume. We then propose different empirical extensions of the HAR model. Our study shows that oil and gas volatilities nonlinearly depend on public information (jumps), private information (continuous volatility), and trading volume. Moreover, our threshold augmented HAR model with heterogeneous jumps and continuous volatility outperforms the HAR model in forecasting volatility.
Journal: Econometric Reviews
Pages: 54-70
Issue: 1
Volume: 39
Year: 2020
Month: 1
X-DOI: 10.1080/07474938.2019.1690190
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690190
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:1:p:54-70
Template-Type: ReDIF-Article 1.0
Author-Name: Cem Çakmaklı
Author-X-Name-First: Cem
Author-X-Name-Last: Çakmaklı
Title: Modeling the density of US yield curve using Bayesian semiparametric dynamic Nelson-Siegel model
Abstract:
This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model for estimating the density of bond yields. Specifically, we model the distribution of the yield curve factors according to an infinite Markov mixture (iMM). The model allows for time variation in the mean and covariance matrix of factors in a discrete manner, as opposed to continuous changes in these parameters as in Time Varying Parameter (TVP) models. Estimating the number of regimes endogenously using the iMM structure leads to an adaptive process that can generate newly emerging regimes over time in response to changing economic conditions, in addition to existing regimes. The potential of the proposed framework is examined using US bond yields data. The semiparametric structure of the factors can handle various forms of non-normality, including fat tails and nonlinear dependence between factors, using a unified approach by generating new clusters capturing these specific characteristics. We document that modeling parameter changes in a discrete manner increases the model fit as well as forecasting performance at both short and long horizons relative to models with fixed parameters and to the TVP model with continuous parameter changes. This is mainly due to the fact that discrete changes in parameters better suit the characteristics of typical low-frequency monthly bond yields data.
Journal: Econometric Reviews
Pages: 71-91
Issue: 1
Volume: 39
Year: 2020
Month: 1
X-DOI: 10.1080/07474938.2019.1690191
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690191
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:1:p:71-91
Template-Type: ReDIF-Article 1.0
Author-Name: Benjamin Williams
Author-X-Name-First: Benjamin
Author-X-Name-Last: Williams
Title: Identification of the linear factor model
Abstract:
This paper provides several new results on identification of the linear factor model. The model allows for correlated latent factors and dependence among the idiosyncratic errors. I also illustrate identification under a dedicated measurement structure and other reduced rank restrictions. I use these results to study identification in a model with both observed covariates and latent factors. The analysis emphasizes the different roles played by restrictions on the error covariance matrix, restrictions on the factor loadings and the factor covariance matrix, and restrictions on the coefficients on covariates. The identification results are simple, intuitive, and directly applicable to many settings.
Journal: Econometric Reviews
Pages: 92-109
Issue: 1
Volume: 39
Year: 2020
Month: 1
X-DOI: 10.1080/07474938.2018.1550042
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1550042
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:1:p:92-109
Template-Type: ReDIF-Article 1.0
Author-Name: Alastair R. Hall
Author-X-Name-First: Alastair R.
Author-X-Name-Last: Hall
Title: Foundations of info-metrics: modeling, inference and imperfect information
Journal: Econometric Reviews
Pages: 110-113
Issue: 1
Volume: 39
Year: 2020
Month: 1
X-DOI: 10.1080/07474938.2019.1682315
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1682315
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:1:p:110-113
Template-Type: ReDIF-Article 1.0
Author-Name: Jin-Huei Yeh
Author-X-Name-First: Jin-Huei
Author-X-Name-Last: Yeh
Author-Name: Jying-Nan Wang
Author-X-Name-First: Jying-Nan
Author-X-Name-Last: Wang
Title: Bias-corrected realized variance
Abstract:
We propose a novel “bias-corrected realized variance” (BCRV) estimator based upon the appropriate re-weighting of two realized variances calculated at different sampling frequencies. Our bias-correction methodology is found to be extremely accurate, with the finite sample variance being significantly minimized. In our Monte Carlo experiments and a finite sample MSE comparison of alternative estimators, the performance of our straightforward BCRV estimator is shown to be comparable to other widely-used integrated variance estimators. Given its simplicity, our BCRV estimator is likely to appeal to researchers and practitioners alike for the estimation of integrated variance.
Journal: Econometric Reviews
Pages: 170-192
Issue: 2
Volume: 38
Year: 2019
Month: 2
X-DOI: 10.1080/07474938.2016.1222230
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1222230
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:2:p:170-192
Template-Type: ReDIF-Article 1.0
Author-Name: Rongmao Zhang
Author-X-Name-First: Rongmao
Author-X-Name-Last: Zhang
Author-Name: Chenxue Li
Author-X-Name-First: Chenxue
Author-X-Name-Last: Li
Author-Name: Liang Peng
Author-X-Name-First: Liang
Author-X-Name-Last: Peng
Title: Inference for the tail index of a GARCH(1,1) model and an AR(1) model with ARCH(1) errors
Abstract:
For a GARCH(1,1) sequence or an AR(1) model with ARCH(1) errors, one can estimate the tail index by solving an estimating equation with unknown parameters replaced by the quasi maximum likelihood estimation, and a profile empirical likelihood method can be employed to effectively construct a confidence interval for the tail index. However, this requires that the errors of such a model have at least a finite fourth moment. In this article, we show that the finite fourth moment can be relaxed by employing a least absolute deviations estimate for the unknown parameters by noting that the estimating equation for determining the tail index is invariant to a scale transformation of the underlying model.
Journal: Econometric Reviews
Pages: 151-169
Issue: 2
Volume: 38
Year: 2019
Month: 2
X-DOI: 10.1080/07474938.2016.1224024
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1224024
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:2:p:151-169
Template-Type: ReDIF-Article 1.0
Author-Name: Chaohua Dong
Author-X-Name-First: Chaohua
Author-X-Name-Last: Dong
Author-Name: Jiti Gao
Author-X-Name-First: Jiti
Author-X-Name-Last: Gao
Title: Expansion and estimation of Lévy process functionals in nonlinear and nonstationary time series regression
Abstract:
In this article, we develop a series estimation method for unknown time-inhomogeneous functionals of Lévy processes involved in econometric time series models. To obtain an asymptotic distribution for the proposed estimators, we establish a general asymptotic theory for partial sums of bivariate functionals of time and nonstationary variables. These results show that the proposed estimators in different situations converge to quite different random variables. In addition, the rates of convergence depend on various factors rather than just the sample size. Finite sample simulations are provided to evaluate the finite sample performance of the proposed model and estimation method.
Journal: Econometric Reviews
Pages: 125-150
Issue: 2
Volume: 38
Year: 2019
Month: 2
X-DOI: 10.1080/07474938.2016.1235305
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1235305
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:2:p:125-150
Template-Type: ReDIF-Article 1.0
Author-Name: Zacharias Psaradakis
Author-X-Name-First: Zacharias
Author-X-Name-Last: Psaradakis
Author-Name: Marián Vávra
Author-X-Name-First: Marián
Author-X-Name-Last: Vávra
Title: Portmanteau tests for linearity of stationary time series
Abstract:
This article considers the problem of testing for linearity of stationary time series. Portmanteau tests are discussed which are based on generalized correlations of residuals from a linear model (that is, autocorrelations and cross-correlations of different powers of the residuals). The finite-sample properties of the tests are assessed by means of Monte Carlo experiments. The tests are applied to 100 time series of stock returns.
Journal: Econometric Reviews
Pages: 248-262
Issue: 2
Volume: 38
Year: 2019
Month: 2
X-DOI: 10.1080/07474938.2016.1261015
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1261015
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:2:p:248-262
Template-Type: ReDIF-Article 1.0
Author-Name: Giovanni Forchini
Author-X-Name-First: Giovanni
Author-X-Name-Last: Forchini
Author-Name: Bin Jiang
Author-X-Name-First: Bin
Author-X-Name-Last: Jiang
Title: The unconditional distributions of the OLS, TSLS and LIML estimators in a simple structural equations model
Abstract:
The exact distributions of the standard estimators of the structural coefficients in a linear structural equations model conditional on the exogenous variables have been shown to have some unexpected and quirky features. Since the argument for conditioning on exogenous (ancillary) variables has been weakened over the past 20 years by the discovery of an “ancillarity paradox,” it is natural to wonder whether such finite sample properties are in fact due to conditioning on the exogenous variables. This article studies the exact distributions of the ordinary least squares (OLS), two-stage least squares (TSLS), and limited information maximum likelihood (LIML) estimators of the structural coefficients in a linear structural equation without conditioning on the exogenous variables.
Journal: Econometric Reviews
Pages: 208-247
Issue: 2
Volume: 38
Year: 2019
Month: 2
X-DOI: 10.1080/07474938.2016.1261072
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1261072
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:2:p:208-247
Template-Type: ReDIF-Article 1.0
Author-Name: Michael Lechner
Author-X-Name-First: Michael
Author-X-Name-Last: Lechner
Author-Name: Anthony Strittmatter
Author-X-Name-First: Anthony
Author-X-Name-Last: Strittmatter
Title: Practical procedures to deal with common support problems in matching estimation
Abstract:
This paper assesses the performance of common estimators adjusting for differences in covariates, such as matching and regression, when faced with the so-called common support problems. It also shows how different procedures suggested in the literature affect the properties of such estimators. Based on an empirical Monte Carlo simulation design, a lack of common support is found to increase the root-mean-squared error of all investigated parametric and semiparametric estimators. Dropping observations that are off support usually improves their performance, although the magnitude of the improvement depends on the particular method used.
Journal: Econometric Reviews
Pages: 193-207
Issue: 2
Volume: 38
Year: 2019
Month: 2
X-DOI: 10.1080/07474938.2017.1318509
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1318509
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:2:p:193-207
Template-Type: ReDIF-Article 1.0
Author-Name: Luis F. Martins
Author-X-Name-First: Luis F.
Author-X-Name-Last: Martins
Title: Bootstrap tests for time varying cointegration
Abstract:
This article proposes wild and independent and identically distributed (i.i.d.) parametric bootstrap implementations of the time-varying cointegration test of Bierens and Martins (2010). The bootstrap statistics and the original likelihood ratio test share the same first-order asymptotic null distribution. Monte Carlo results suggest that the bootstrap approximation to the finite-sample distribution is very accurate, in particular for the wild bootstrap case. The tests are applied to study the purchasing power parity hypothesis for twelve Organisation for Economic Co-operation and Development (OECD) countries, and we only find evidence of a constant long-term equilibrium for the U.S.–U.K. relationship.
Journal: Econometric Reviews
Pages: 466-483
Issue: 5
Volume: 37
Year: 2018
Month: 5
X-DOI: 10.1080/07474938.2015.1092830
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092830
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:5:p:466-483
Template-Type: ReDIF-Article 1.0
Author-Name: Feng Liu
Author-X-Name-First: Feng
Author-X-Name-Last: Liu
Author-Name: Dong Li
Author-X-Name-First: Dong
Author-X-Name-Last: Li
Author-Name: Xinmei Kang
Author-X-Name-First: Xinmei
Author-X-Name-Last: Kang
Title: Sample path properties of an explosive double autoregressive model
Abstract:
This article studies sample path properties of an explosive double autoregressive (DAR) model. After suitable renormalization, it is shown that the sample path converges weakly to a geometric Brownian motion. This further strengthens our understanding of sample paths of nonstationary DAR processes. The obtained results can be extended to nonstationary random coefficient autoregressive (RCA) models. Simulation studies are carried out to support our results.
Journal: Econometric Reviews
Pages: 484-490
Issue: 5
Volume: 37
Year: 2018
Month: 5
X-DOI: 10.1080/07474938.2015.1092841
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092841
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:5:p:484-490
Template-Type: ReDIF-Article 1.0
Author-Name: Guangyu Mao
Author-X-Name-First: Guangyu
Author-X-Name-Last: Mao
Title: Testing for sphericity in a two-way error components panel data model
Abstract:
This article is concerned with the sphericity test for the two-way error components panel data model. It is found that the John statistic and the bias-corrected LM statistic recently developed by Baltagi et al. (2011, 2012), which are based on the within residuals, are not helpful under the present circumstances, even though they are in the one-way fixed effects model. However, we prove that when the within residuals are properly transformed, the resulting residuals can serve to construct useful statistics that are similar to those of Baltagi et al. (2011, 2012). Simulation results show that the newly proposed statistics perform well under the null hypothesis and several typical alternatives.
Journal: Econometric Reviews
Pages: 491-506
Issue: 5
Volume: 37
Year: 2018
Month: 5
X-DOI: 10.1080/07474938.2015.1092844
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092844
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:5:p:491-506
Template-Type: ReDIF-Article 1.0
Author-Name: Masayuki Hirukawa
Author-X-Name-First: Masayuki
Author-X-Name-Last: Hirukawa
Author-Name: Mari Sakudo
Author-X-Name-First: Mari
Author-X-Name-Last: Sakudo
Title: Functional-coefficient cointegration models in the presence of deterministic trends
Abstract:
In this article, we extend the functional-coefficient cointegration model (FCCM) to the cases in which nonstationary regressors contain both stochastic and deterministic trends. A nondegenerate distributional theory on the local linear (LL) regression smoother of the FCCM is explored. It is demonstrated that even when integrated regressors are endogenous, the limiting distribution is the same as if they were exogenous. Finite-sample performance of the LL estimator is investigated via Monte Carlo simulations in comparison with an alternative estimation method. As an application of the FCCM, electricity demand analysis in Illinois is considered.
Journal: Econometric Reviews
Pages: 507-533
Issue: 5
Volume: 37
Year: 2018
Month: 5
X-DOI: 10.1080/07474938.2015.1092845
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1092845
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:5:p:507-533
Template-Type: ReDIF-Article 1.0
Author-Name: Koen Bel
Author-X-Name-First: Koen
Author-X-Name-Last: Bel
Author-Name: Dennis Fok
Author-X-Name-First: Dennis
Author-X-Name-Last: Fok
Author-Name: Richard Paap
Author-X-Name-First: Richard
Author-X-Name-Last: Paap
Title: Parameter estimation in multivariate logit models with many binary choices
Abstract:
Multivariate Logit models are convenient to describe multivariate correlated binary choices as they provide closed-form likelihood functions. However, the computation time required for calculating choice probabilities increases exponentially with the number of choices, which makes maximum likelihood-based estimation infeasible when many choices are considered. To solve this, we propose three novel estimation methods: (i) stratified importance sampling, (ii) composite conditional likelihood (CCL), and (iii) generalized method of moments, which yield consistent estimates and still have similar small-sample bias to maximum likelihood. Our simulation study shows that computation times for CCL are much smaller and that its efficiency loss is small.
Journal: Econometric Reviews
Pages: 534-550
Issue: 5
Volume: 37
Year: 2018
Month: 5
X-DOI: 10.1080/07474938.2015.1093780
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1093780
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:5:p:534-550
Template-Type: ReDIF-Article 1.0
Author-Name: Simon Reese
Author-X-Name-First: Simon
Author-X-Name-Last: Reese
Author-Name: Joakim Westerlund
Author-X-Name-First: Joakim
Author-X-Name-Last: Westerlund
Title: Estimation of factor-augmented panel regressions with weakly influential factors
Abstract:
The use of factor-augmented panel regressions has become very popular in recent years. Existing methods for such regressions require that the common factors are strong, an assumption that is likely to be mistaken in practice. Motivated by this, the current article offers an analysis of the effect of weak, semi-weak, and semi-strong factors on two of the most popular estimators for factor-augmented regressions, namely, principal components (PC) and common correlated effects (CCE).
Journal: Econometric Reviews
Pages: 401-465
Issue: 5
Volume: 37
Year: 2018
Month: 5
X-DOI: 10.1080/07474938.2015.1106758
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1106758
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:5:p:401-465
Template-Type: ReDIF-Article 1.0
Author-Name: Filip Žikeš
Author-X-Name-First: Filip
Author-X-Name-Last: Žikeš
Author-Name: Jozef Baruník
Author-X-Name-First: Jozef
Author-X-Name-Last: Baruník
Author-Name: Nikhil Shenai
Author-X-Name-First: Nikhil
Author-X-Name-Last: Shenai
Title: Modeling and forecasting persistent financial durations
Abstract:
This article introduces the Markov-Switching Multifractal Duration (MSMD) model by adapting the MSM stochastic volatility model of Calvet and Fisher (2004) to the duration setting. Although the MSMD process is exponential β-mixing as we show in the article, it is capable of generating highly persistent autocorrelation. We study, analytically and by simulation, how this feature of durations generated by the MSMD process propagates to counts and realized volatility. We employ a quasi-maximum likelihood estimator of the MSMD parameters based on the Whittle approximation and establish its strong consistency and asymptotic normality for general MSMD specifications. We show that the Whittle estimation is a computationally simple and fast alternative to maximum likelihood. Finally, we compare the performance of the MSMD model with competing short- and long-memory duration models in an out-of-sample forecasting exercise based on price durations of three major foreign exchange futures contracts. The results of the comparison show that the MSMD and the Long Memory Stochastic Duration model perform similarly and are superior to the short-memory Autoregressive Conditional Duration models.
Journal: Econometric Reviews
Pages: 1081-1110
Issue: 10
Volume: 36
Year: 2017
Month: 11
X-DOI: 10.1080/07474938.2014.977057
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977057
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:10:p:1081-1110
Template-Type: ReDIF-Article 1.0
Author-Name: Apostolos Serletis
Author-X-Name-First: Apostolos
Author-X-Name-Last: Serletis
Author-Name: Maksim Isakin
Author-X-Name-First: Maksim
Author-X-Name-Last: Isakin
Title: Stochastic volatility demand systems
Abstract:
We address the estimation of stochastic volatility demand systems. In particular, we relax the homoscedasticity assumption and instead assume that the covariance matrix of the errors of demand systems is time-varying. Since most economic and financial time series are nonlinear, we achieve superior modeling using parametric nonlinear demand systems in which the unconditional variance is constant but the conditional variance, like the conditional mean, is also a random variable depending on current and past information. We also prove an important practical result of invariance of the maximum likelihood estimator with respect to the choice of equation eliminated from a singular demand system. An empirical application is provided, using the BEKK specification to model the conditional covariance matrix of the errors of the basic translog demand system.
Journal: Econometric Reviews
Pages: 1111-1122
Issue: 10
Volume: 36
Year: 2017
Month: 11
X-DOI: 10.1080/07474938.2014.977091
File-URL: http://hdl.handle.net/10.1080/07474938.2014.977091
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:10:p:1111-1122
Template-Type: ReDIF-Article 1.0
Author-Name: Yiannis Karavias
Author-X-Name-First: Yiannis
Author-X-Name-Last: Karavias
Author-Name: Elias Tzavalis
Author-X-Name-First: Elias
Author-X-Name-Last: Tzavalis
Title: Local power of panel unit root tests allowing for structural breaks
Abstract:
The asymptotic local power of least squares–based fixed-T panel unit root tests allowing for a structural break in their individual effects and/or incidental trends of the AR(1) panel data model is studied. Limiting distributions of these tests are derived under a sequence of local alternatives, and analytic expressions show how their means and variances are functions of the break date and the time dimension of the panel. The considered tests have nontrivial local power in a N−1/2 neighborhood of unity when the panel data model includes individual intercepts. For panel data models with incidental trends, the power of the tests becomes trivial in this neighborhood. However, this problem does not always appear if the tests allow for serial correlation in the error term and completely vanishes in the presence of cross-section correlation. These results show that fixed-T tests have very different theoretical properties than their large-T counterparts. Monte Carlo experiments demonstrate the usefulness of the asymptotic theory in small samples.
Journal: Econometric Reviews
Pages: 1123-1156
Issue: 10
Volume: 36
Year: 2017
Month: 11
X-DOI: 10.1080/07474938.2015.1059722
File-URL: http://hdl.handle.net/10.1080/07474938.2015.1059722
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:10:p:1123-1156
Template-Type: ReDIF-Article 1.0
Author-Name: Dominik Wied
Author-X-Name-First: Dominik
Author-X-Name-Last: Wied
Title: A nonparametric test for a constant correlation matrix
Abstract:
We propose a nonparametric procedure to test for changes in correlation matrices at an unknown point in time. The new test requires constant expectations and variances, but only mild assumptions on the serial dependence structure, and has considerable power in finite samples. We derive the asymptotic distribution under the null hypothesis of no change as well as local power results and apply the test to stock returns.
Journal: Econometric Reviews
Pages: 1157-1172
Issue: 10
Volume: 36
Year: 2017
Month: 11
X-DOI: 10.1080/07474938.2014.998152
File-URL: http://hdl.handle.net/10.1080/07474938.2014.998152
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:10:p:1157-1172
Template-Type: ReDIF-Article 1.0
Author-Name: The Editors
Title: List of Referees
Journal: Econometric Reviews
Pages: 1173-1174
Issue: 10
Volume: 36
Year: 2017
Month: 11
X-DOI: 10.1080/07474938.2017.1329616
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1329616
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:10:p:1173-1174
Template-Type: ReDIF-Article 1.0
Author-Name: The Editors
Title: Editorial Board EOV
Journal: Econometric Reviews
Pages: ebi-ebi
Issue: 10
Volume: 36
Year: 2017
Month: 11
X-DOI: 10.1080/07474938.2017.1363147
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1363147
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:36:y:2017:i:10:p:ebi-ebi
Template-Type: ReDIF-Article 1.0
Author-Name: Drew Creal
Author-X-Name-First: Drew
Author-X-Name-Last: Creal
Title: A Survey of Sequential Monte Carlo Methods for Economics and Finance
Abstract: This article serves as an introduction and survey for economists to the field of sequential Monte Carlo methods, which are also known as particle filters. Sequential Monte Carlo methods are simulation-based algorithms used to compute the high-dimensional and/or complex integrals that arise regularly in applied work. These methods are becoming increasingly popular in economics and finance, from dynamic stochastic general equilibrium models in macroeconomics to option pricing. The objective of this article is to explain the basics of the methodology, provide references to the literature, and cover some of the theoretical results that justify the methods in practice.
Journal: Econometric Reviews
Pages: 245-296
Issue: 3
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607333
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607333
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:3:p:245-296
Template-Type: ReDIF-Article 1.0
Author-Name: Christian Kascha
Author-X-Name-First: Christian
Author-X-Name-Last: Kascha
Title: A Comparison of Estimation Methods for Vector Autoregressive Moving-Average Models
Abstract: Recently, there has been a renewed interest in modeling economic time series by vector autoregressive moving-average models. However, this class of models has been unpopular in practice because of estimation problems and the complexity of the identification stage. These disadvantages could have led to the dominant use of vector autoregressive models in macroeconomic research. In this article, several simple estimation methods for vector autoregressive moving-average models are compared among each other and with pure vector autoregressive modeling using ordinary least squares by means of a Monte Carlo study. Different evaluation criteria are used to judge the relative performances of the algorithms.
Journal: Econometric Reviews
Pages: 297-324
Issue: 3
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607343
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607343
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:3:p:297-324
Template-Type: ReDIF-Article 1.0
Author-Name: Takamitsu Kurita
Author-X-Name-First: Takamitsu
Author-X-Name-Last: Kurita
Title: Likelihood-Based Inference for Weak Exogeneity in I(2) Cointegrated VAR Models
Abstract: This article develops limit theory for likelihood analysis of weak exogeneity in I(2) cointegrated vector autoregressive (VAR) models incorporating deterministic terms. Conditions for weak exogeneity in I(2) VAR models are reviewed, and the asymptotic properties of conditional maximum likelihood estimators and a likelihood-based weak exogeneity test are then investigated. It is demonstrated that weak exogeneity in I(2) VAR models allows us to conduct asymptotic conditional inference based on mixed Gaussian distributions. It is then proved that a log-likelihood ratio test statistic for weak exogeneity in I(2) VAR models is asymptotically χ2 distributed. The article also presents an empirical illustration of the proposed test for weak exogeneity using Japan's macroeconomic data.
Journal: Econometric Reviews
Pages: 325-360
Issue: 3
Volume: 31
Year: 2012
X-DOI: 10.1080/07474938.2011.607346
File-URL: http://hdl.handle.net/10.1080/07474938.2011.607346
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:31:y:2012:i:3:p:325-360
Template-Type: ReDIF-Article 1.0
Author-Name: Süleyman Taşpınar
Author-X-Name-First: Süleyman
Author-X-Name-Last: Taşpınar
Author-Name: Osman Doğan
Author-X-Name-First: Osman
Author-X-Name-Last: Doğan
Author-Name: Wim P. M. Vijverberg
Author-X-Name-First: Wim P. M.
Author-X-Name-Last: Vijverberg
Title: GMM inference in spatial autoregressive models
Abstract:
In this study, we investigate the finite sample properties of the optimal generalized method of moments estimator (OGMME) for a spatial econometric model with a first-order spatial autoregressive process in both the dependent variable and the disturbance term (SARAR(1,1) for short). We show that the estimated asymptotic standard errors for spatial autoregressive parameters can be substantially smaller than their empirical counterparts. Hence, we extend the finite sample variance correction methodology of Windmeijer (2005) to the OGMME for the SARAR(1,1) model. Results from simulation studies indicate that the correction method improves the variance estimates in small samples and leads to more accurate inference for the spatial autoregressive parameters. For the same model, we compare the finite sample properties of various test statistics for linear restrictions on autoregressive parameters. These tests include the standard asymptotic Wald test based on various GMMEs, a bootstrapped version of the Wald test, two versions of the C(α) test, the standard Lagrange multiplier (LM) test, the minimum chi-square (MC) test, and two versions of the generalized method of moments (GMM) criterion test. Finally, we study the finite sample properties of effects estimators that show how changes in explanatory variables impact the dependent variable.
Journal: Econometric Reviews
Pages: 931-954
Issue: 9
Volume: 37
Year: 2018
Month: 10
X-DOI: 10.1080/00927872.2016.1178885
File-URL: http://hdl.handle.net/10.1080/00927872.2016.1178885
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:9:p:931-954
Template-Type: ReDIF-Article 1.0
Author-Name: Efstathios Paparoditis
Author-X-Name-First: Efstathios
Author-X-Name-Last: Paparoditis
Author-Name: Dimitris N. Politis
Author-X-Name-First: Dimitris N.
Author-X-Name-Last: Politis
Title: The asymptotic size and power of the augmented Dickey–Fuller test for a unit root
Abstract:
It is shown that the limiting distribution of the augmented Dickey–Fuller (ADF) test under the null hypothesis of a unit root is valid under a very general set of assumptions that goes far beyond the linear AR(∞) process assumption typically imposed. In essence, all that is required is that the error process driving the random walk possesses a continuous spectral density that is strictly positive. Furthermore, under the same weak assumptions, the limiting distribution of the ADF test is derived under the alternative of stationarity, and a theoretical explanation is given for the well-known empirical fact that the test's power is a decreasing function of the chosen autoregressive order p. The intuitive reason for the reduced power of the ADF test is that, as p tends to infinity, the p regressors become asymptotically collinear.
Journal: Econometric Reviews
Pages: 955-973
Issue: 9
Volume: 37
Year: 2018
Month: 10
X-DOI: 10.1080/00927872.2016.1178887
File-URL: http://hdl.handle.net/10.1080/00927872.2016.1178887
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:9:p:955-973
Template-Type: ReDIF-Article 1.0
Author-Name: Yohei Yamamoto
Author-X-Name-First: Yohei
Author-X-Name-Last: Yamamoto
Title: A modified confidence set for the structural break date in linear regression models
Abstract:
Elliott and Müller (EM) (2007) provide a method for constructing a confidence set for the structural break date by inverting a variant of the locally best test statistic. Previous studies have shown that the EM method produces a set with an accurate coverage ratio even for a small break; however, the set is often overly lengthy. This study proposes a simple modification to rehabilitate their method through long-run variance estimation. Following the literature, we provide an asymptotic justification for the improvement of the modified method over the original method under a nonlocal asymptotic framework. A Monte Carlo simulation shows that the modified method achieves a shorter confidence set than the EM method, especially when the break is large or the HAC correction is conducted. The modified method may exhibit minor errors in the coverage rate when the break is small; however, the coverage is more stable than that of alternative methods when the break is large. We apply our method to a level shift in post-1980s Japanese inflation data.
Journal: Econometric Reviews
Pages: 974-999
Issue: 9
Volume: 37
Year: 2018
Month: 10
X-DOI: 10.1080/00927872.2016.1178892
File-URL: http://hdl.handle.net/10.1080/00927872.2016.1178892
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:9:p:974-999
Template-Type: ReDIF-Article 1.0
Author-Name: Alain Guay
Author-X-Name-First: Alain
Author-X-Name-Last: Guay
Author-Name: Jean-François Lamarche
Author-X-Name-First: Jean-François
Author-X-Name-Last: Lamarche
Title: Structural change tests for GEL criteria
Abstract:
This article examines structural change tests based on generalized empirical likelihood methods in the time series context, allowing for dependent data. Standard structural change tests for the generalized method of moments (GMM) are adapted to the generalized empirical likelihood (GEL) context. We show that when moment conditions are properly smoothed, these test statistics converge to the same asymptotic distribution as in the GMM case, with both known and unknown breakpoints. New test statistics specific to GEL methods, which are robust to weak identification, are also introduced. A simulation study examines the small sample properties of the tests and reveals that GEL-based robust tests perform well, both in terms of the presence and location of a structural change and in terms of the nature of identification.
Journal: Econometric Reviews
Pages: 1000-1032
Issue: 9
Volume: 37
Year: 2018
Month: 10
X-DOI: 10.1080/00927872.2016.1178893
File-URL: http://hdl.handle.net/10.1080/00927872.2016.1178893
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:37:y:2018:i:9:p:1000-1032
Template-Type: ReDIF-Article 1.0
Author-Name: Sven Schreiber
Author-X-Name-First: Sven
Author-X-Name-Last: Schreiber
Title: The estimation uncertainty of permanent-transitory decompositions in co-integrated systems
Abstract:
The topic of this article is the estimation uncertainty of the Stock–Watson and Gonzalo–Granger permanent-transitory decompositions in the framework of the co-integrated vector autoregression. We suggest an approach to construct the confidence interval of the transitory component estimate in a given period (e.g., the latest observation) by conditioning on the observed data in that period. To calculate asymptotically valid confidence intervals, we use the delta method and two bootstrap variants. As an illustration, we analyze the uncertainty of (U.S.) output gap estimates in a system of output, consumption, and investment.
Journal: Econometric Reviews
Pages: 279-300
Issue: 3
Volume: 38
Year: 2019
Month: 3
X-DOI: 10.1080/07474938.2016.1235257
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1235257
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:3:p:279-300
Template-Type: ReDIF-Article 1.0
Author-Name: Andrés Ramírez Hassan
Author-X-Name-First: Andrés
Author-X-Name-Last: Ramírez Hassan
Author-Name: Santiago Montoya Blandón
Author-X-Name-First: Santiago
Author-X-Name-Last: Montoya Blandón
Title: Welfare gains of the poor: An endogenous Bayesian approach with spatial random effects
Abstract:
We introduce a Bayesian instrumental variable procedure with spatial random effects that handles endogeneity and spatial dependence with unobserved heterogeneity. We find through a limited Monte Carlo experiment that our proposal works well in terms of point estimates and prediction. We apply our method to analyze the welfare effects generated by a process of electricity tariff unification on the poorest households. In particular, we deduce an Equivalent Variation measure where there is a budget constraint for a two-tiered pricing scheme, and find that 10% of the poorest municipalities attained welfare gains above 2% of their initial income.
Journal: Econometric Reviews
Pages: 301-318
Issue: 3
Volume: 38
Year: 2019
Month: 3
X-DOI: 10.1080/07474938.2016.1261062
File-URL: http://hdl.handle.net/10.1080/07474938.2016.1261062
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:3:p:301-318
Template-Type: ReDIF-Article 1.0
Author-Name: Gabi Gayer
Author-X-Name-First: Gabi
Author-X-Name-Last: Gayer
Author-Name: Offer Lieberman
Author-X-Name-First: Offer
Author-X-Name-Last: Lieberman
Author-Name: Omer Yaffe
Author-X-Name-First: Omer
Author-X-Name-Last: Yaffe
Title: Similarity-based model for ordered categorical data
Abstract:
In a large variety of applications, the data for a variable we wish to explain are ordered and categorical. In this paper, we present a new similarity-based model for this scenario and investigate its properties. We establish that the process is ψ-mixing and strictly stationary and derive the explicit form of the autocorrelation function in some special cases. Consistency and asymptotic normality of the maximum likelihood estimator of the model's parameters are proven. A simulation study supports our findings. The results are applied to the Netflix data set, comprising a survey of users' ratings of movies.
Journal: Econometric Reviews
Pages: 263-278
Issue: 3
Volume: 38
Year: 2019
Month: 3
X-DOI: 10.1080/07474938.2017.1308054
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1308054
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:3:p:263-278
Template-Type: ReDIF-Article 1.0
Author-Name: Heino Bohn Nielsen
Author-X-Name-First: Heino Bohn
Author-X-Name-Last: Nielsen
Title: Estimation bias and bias correction in reduced rank autoregressions
Abstract:
This paper characterizes the finite-sample bias of the maximum likelihood estimator (MLE) in a reduced rank vector autoregression and suggests two simulation-based bias corrections. One is a simple bootstrap implementation that approximates the bias at the MLE. The other is an iterative root-finding algorithm implemented using stochastic approximation methods. Both algorithms are shown to be improvements over the MLE, measured in terms of mean square error and mean absolute deviation. An illustration to US macroeconomic time series is given.
Journal: Econometric Reviews
Pages: 332-349
Issue: 3
Volume: 38
Year: 2019
Month: 3
X-DOI: 10.1080/07474938.2017.1308065
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1308065
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:3:p:332-349
Template-Type: ReDIF-Article 1.0
Author-Name: José Ignacio Cuesta
Author-X-Name-First: José Ignacio
Author-X-Name-Last: Cuesta
Author-Name: Jonathan M. V. Davis
Author-X-Name-First: Jonathan M. V.
Author-X-Name-Last: Davis
Author-Name: Andrew Gianou
Author-X-Name-First: Andrew
Author-X-Name-Last: Gianou
Author-Name: Alejandro Hoyos
Author-X-Name-First: Alejandro
Author-X-Name-Last: Hoyos
Title: Identification of average marginal effects under misspecification when covariates are normal
Abstract:
A previously known result in the econometrics literature is that when covariates of an underlying data generating process are jointly normally distributed, estimates from a nonlinear model that is misspecified as linear can be interpreted as average marginal effects. This has been shown for models with exogenous covariates and separability between covariates and errors. In this paper, we extend this identification result to a variety of more general cases, in particular for combinations of separable and nonseparable models under both exogeneity and endogeneity. So long as the underlying model belongs to one of these large classes of data generating processes, our results show that nothing else must be known about the true DGP—beyond normality of observable data, a testable assumption—in order for linear estimators to be interpretable as average marginal effects. We use simulation to explore the performance of these estimators using a misspecified linear model and show they perform well when the data are normal but can perform poorly when this is not the case.
Journal: Econometric Reviews
Pages: 350-357
Issue: 3
Volume: 38
Year: 2019
Month: 3
X-DOI: 10.1080/07474938.2017.1308091
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1308091
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:3:p:350-357
Template-Type: ReDIF-Article 1.0
Author-Name: Dong Li
Author-X-Name-First: Dong
Author-X-Name-Last: Li
Author-Name: Shaojun Guo
Author-X-Name-First: Shaojun
Author-X-Name-Last: Guo
Author-Name: Ke Zhu
Author-X-Name-First: Ke
Author-X-Name-Last: Zhu
Title: Double AR model without intercept: An alternative to modeling nonstationarity and heteroscedasticity
Abstract:
This paper presents a double AR model without intercept (DARWIN model) and provides a new way to study nonstationary heteroscedastic time series. It is shown that the DARWIN model is always nonstationary and heteroscedastic, and its sample properties depend on the Lyapunov exponent. An easy-to-implement estimator is proposed for the Lyapunov exponent, and it is unbiased, strongly consistent, and asymptotically normal. Based on this estimator, a powerful test is constructed for testing the ordinary oscillation of the model. Moreover, this paper proposes the quasi-maximum likelihood estimator (QMLE) for the DARWIN model, which has an explicit form. The strong consistency and asymptotic normality of the QMLE are established regardless of the sign of the Lyapunov exponent. Simulation studies are conducted to assess the performance of the estimation and testing, and an empirical example is given to illustrate the usefulness of the DARWIN model.
Journal: Econometric Reviews
Pages: 319-331
Issue: 3
Volume: 38
Year: 2019
Month: 3
X-DOI: 10.1080/07474938.2017.1310080
File-URL: http://hdl.handle.net/10.1080/07474938.2017.1310080
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:38:y:2019:i:3:p:319-331
Template-Type: ReDIF-Article 1.0
Author-Name: Stefanos Dimitrakopoulos
Author-X-Name-First: Stefanos
Author-X-Name-Last: Dimitrakopoulos
Author-Name: Michalis Kolossiatis
Author-X-Name-First: Michalis
Author-X-Name-Last: Kolossiatis
Title: Bayesian analysis of moving average stochastic volatility models: modeling in-mean effects and leverage for financial time series
Abstract:
We propose a moving average stochastic volatility in mean model and a moving average stochastic volatility model with leverage. For parameter estimation, we develop efficient Markov chain Monte Carlo algorithms and illustrate our methods, using simulated and real data sets. We compare the proposed specifications against several competing stochastic volatility models, using marginal likelihoods and the observed-data Deviance information criterion. We also perform a forecasting exercise, using predictive likelihoods, the root mean square forecast error and Kullback-Leibler divergence. We find that the moving average stochastic volatility model with leverage better fits the four empirical data sets used.
Journal: Econometric Reviews
Pages: 319-343
Issue: 4
Volume: 39
Year: 2020
Month: 4
X-DOI: 10.1080/07474938.2019.1630075
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1630075
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:4:p:319-343
Template-Type: ReDIF-Article 1.0
Author-Name: Gholamreza Hajargasht
Author-X-Name-First: Gholamreza
Author-X-Name-Last: Hajargasht
Author-Name: William E. Griffiths
Author-X-Name-First: William E.
Author-X-Name-Last: Griffiths
Title: Minimum distance estimation of parametric Lorenz curves based on grouped data
Abstract:
The Lorenz curve, introduced more than 100 years ago, remains one of the main tools for the analysis of inequality. International institutions such as the World Bank collect and publish grouped income data in the form of population and income shares for a large number of countries. These data are often used for estimation of parametric Lorenz curves, which in turn form the basis for most inequality analyses. Despite the prevalence of parametric estimation of Lorenz curves from grouped data, and the existence of well-developed nonparametric methods, a formal description of rigorous methodology for estimating parametric Lorenz curves from grouped data is lacking. We fill this gap. Building on two data generating mechanisms, efficient methods of estimation and inference are described; several results useful for comparing the two methods of inference, and aiding computation, are derived. Simulations are used to assess the estimators, and curves are estimated for some example countries. We also show how the proposed methods improve upon World Bank methods and make recommendations for improving current practices.
Journal: Econometric Reviews
Pages: 344-361
Issue: 4
Volume: 39
Year: 2020
Month: 4
X-DOI: 10.1080/07474938.2019.1630077
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1630077
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2020:i:4:p:344-361
Template-Type: ReDIF-Article 1.0
Author-Name: Lorenzo Camponovo
Author-X-Name-First: Lorenzo
Author-X-Name-Last: Camponovo
Title: Bootstrap inference for penalized GMM estimators with oracle properties
Abstract:
We study the validity of bootstrap methods in approximating the sampling distribution of penalized GMM estimators with oracle properties. More precisely, we focus on bridge estimators with Lq penalty for 0<q<1. We confirm these results through finite sample Monte Carlo simulations.
Journal: Econometric Reviews
Pages: 830-851
Issue: 9
Volume: 40
Year: 2021
Month: 10
X-DOI: 10.1080/07474938.2021.1889195
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1889195
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:40:y:2021:i:9:p:830-851
Template-Type: ReDIF-Article 1.0
Author-Name: Yu Sun
Author-X-Name-First: Yu
Author-X-Name-Last: Sun
Author-Name: Karen X. Yan
Author-X-Name-First: Karen X.
Author-X-Name-Last: Yan
Author-Name: Qi Li
Author-X-Name-First: Qi
Author-X-Name-Last: Li
Title: Estimation of average treatment effect based on a semiparametric propensity score
Abstract:
This paper considers the estimation of the average treatment effect using the propensity score method. We propose to use a semiparametric single-index model to estimate the propensity score. This avoids the curse-of-dimensionality problem associated with fully nonparametric propensity score estimators. We establish the asymptotic distribution of the average treatment effect estimator. Monte Carlo simulation results show that the proposed method works well in finite samples and outperforms the conventional nonparametric kernel approach. We apply the proposed method to an empirical data set examining the efficacy of right heart catheterization on medical outcomes.
Journal: Econometric Reviews
Pages: 852-866
Issue: 9
Volume: 40
Year: 2021
Month: 10
X-DOI: 10.1080/07474938.2021.1889206
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1889206
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:40:y:2021:i:9:p:852-866
Template-Type: ReDIF-Article 1.0
Author-Name: Burak Alparslan Eroğlu
Author-X-Name-First: Burak Alparslan
Author-X-Name-Last: Eroğlu
Author-Name: J. Isaac Miller
Author-X-Name-First: J. Isaac
Author-X-Name-Last: Miller
Author-Name: Taner Yiğit
Author-X-Name-First: Taner
Author-X-Name-Last: Yiğit
Title: Time-varying cointegration and the Kalman filter
Abstract:
We show that time-varying parameter state-space models estimated using the Kalman filter are particularly vulnerable to the problem of spurious regression, because the integrated error is transferred to the estimated state equation. We offer a simple yet effective methodology to reliably recover the instability in cointegrating vectors. In the process, the proposed methodology successfully distinguishes between the cases of no cointegration, fixed cointegration, and time-varying cointegration. We apply these proposed tests to elucidate the relationship between concentrations of greenhouse gases and global temperatures, an important relationship to both climate scientists and economists.
Journal: Econometric Reviews
Pages: 1-21
Issue: 1
Volume: 41
Year: 2022
Month: 1
X-DOI: 10.1080/07474938.2020.1861776
File-URL: http://hdl.handle.net/10.1080/07474938.2020.1861776
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:1:p:1-21
Template-Type: ReDIF-Article 1.0
Author-Name: Alecos Papadopoulos
Author-X-Name-First: Alecos
Author-X-Name-Last: Papadopoulos
Author-Name: Mike G. Tsionas
Author-X-Name-First: Mike G.
Author-X-Name-Last: Tsionas
Title: Efficiency gains in least squares estimation: A new approach
Abstract:
In pursuit of efficiency, we propose a new way to construct least squares estimators, as the minimizers of an augmented objective function that takes explicitly into account the variability of the error term and the resulting uncertainty, as well as the possible existence of heteroskedasticity. We initially derive an infeasible estimator which we then approximate using Ordinary Least Squares (OLS) residuals from a first-step regression to obtain the feasible “HOLS” estimator. This estimator has negligible bias, is consistent and outperforms OLS in terms of finite-sample Mean Squared Error, but also in terms of asymptotic efficiency, under all skedastic scenarios, including homoskedasticity. Analogous efficiency gains are obtained for the case of Instrumental Variables estimation. Theoretical results are accompanied by simulations that support them.
Journal: Econometric Reviews
Pages: 51-74
Issue: 1
Volume: 41
Year: 2022
Month: 1
X-DOI: 10.1080/07474938.2020.1824731
File-URL: http://hdl.handle.net/10.1080/07474938.2020.1824731
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:1:p:51-74
Template-Type: ReDIF-Article 1.0
Author-Name: Tobias Hartl
Author-X-Name-First: Tobias
Author-X-Name-Last: Hartl
Author-Name: Roland Jucknewitz
Author-X-Name-First: Roland
Author-X-Name-Last: Jucknewitz
Title: Approximate state space modelling of unobserved fractional components
Abstract:
We propose convenient inferential methods for potentially nonstationary multivariate unobserved components models with fractional integration and cointegration. Based on finite-order ARMA approximations in the state space representation, maximum likelihood estimation can make use of the EM algorithm and related techniques. The approximation outperforms the frequently used autoregressive or moving average truncation, both in terms of computational costs and with respect to approximation quality. Monte Carlo simulations reveal good estimation properties of the proposed methods for processes of different complexity and dimension.
Journal: Econometric Reviews
Pages: 75-98
Issue: 1
Volume: 41
Year: 2022
Month: 1
X-DOI: 10.1080/07474938.2020.1841444
File-URL: http://hdl.handle.net/10.1080/07474938.2020.1841444
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:1:p:75-98
Template-Type: ReDIF-Article 1.0
Author-Name: Yu-Chin Hsu
Author-X-Name-First: Yu-Chin
Author-X-Name-Last: Hsu
Author-Name: Tsung-Chih Lai
Author-X-Name-First: Tsung-Chih
Author-X-Name-Last: Lai
Author-Name: Robert P. Lieli
Author-X-Name-First: Robert P.
Author-X-Name-Last: Lieli
Title: Estimation and inference for distribution and quantile functions in endogenous treatment effect models
Abstract:
Given a standard endogenous treatment effect model, we propose nonparametric estimation and inference procedures for the distribution and quantile functions of the potential outcomes among compliers, as well as the local quantile treatment effect function. The preliminary distribution function estimator is a weighted average of indicator functions, but is not monotonically increasing in general. We therefore propose a simple monotonizing method for proper distribution function estimation, and obtain the quantile function estimator by inversion. Our monotonizing method is an alternative to Chernozhukov et al. (2010) and is arguably preferable when the outcome has unbounded support. We show that all the estimators converge weakly to Gaussian processes at the parametric rate, and propose a multiplier bootstrap for uniform inference. Our uniform results thus generalize the pointwise theory developed by Frölich and Melly (2013). Monte Carlo simulations and an application to the effect of fertility on family income distribution illustrate the use of the methods. All results extend to the subpopulation of treated compliers as well.
Journal: Econometric Reviews
Pages: 22-50
Issue: 1
Volume: 41
Year: 2022
Month: 1
X-DOI: 10.1080/07474938.2020.1847479
File-URL: http://hdl.handle.net/10.1080/07474938.2020.1847479
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:1:p:22-50
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: Best Paper Award Econometric Reviews, 2017–2018
Journal: Econometric Reviews
Pages: 115-115
Issue: 1
Volume: 41
Year: 2022
Month: 1
X-DOI: 10.1080/07474938.2022.2035112
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2035112
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:1:p:115-115
Template-Type: ReDIF-Article 1.0
Author-Name: Collin S. Philipps
Author-X-Name-First: Collin S.
Author-X-Name-Last: Philipps
Title: The MLE of Aigner, Amemiya, and Poirier is not the expectile MLE
Abstract:
This article compares two asymmetric Gaussian likelihood models and their corresponding estimators. Recently, there has been confusion in the literature regarding these models and (1) whether they are the same, or (2) whether both of them can be used to estimate expectiles. After the comparison, it becomes clear that they are not the same and only one of these models is appropriate for that purpose. The similarity between these models is purely superficial. The historical origin of expectiles has also been disputed: some degree of credit can be shared between two papers.
Journal: Econometric Reviews
Pages: 99-114
Issue: 1
Volume: 41
Year: 2022
Month: 1
X-DOI: 10.1080/07474938.2021.1899505
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1899505
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:1:p:99-114
Template-Type: ReDIF-Article 1.0
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Title: Best Paper Award Econometric Reviews, 2019–2020
Journal: Econometric Reviews
Pages: 116-116
Issue: 1
Volume: 41
Year: 2022
Month: 1
X-DOI: 10.1080/07474938.2022.2035113
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2035113
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:1:p:116-116
Template-Type: ReDIF-Article 1.0
Author-Name: Laszlo Balazsi
Author-X-Name-First: Laszlo
Author-X-Name-Last: Balazsi
Author-Name: Felix Chan
Author-X-Name-First: Felix
Author-X-Name-Last: Chan
Author-Name: Laszlo Matyas
Author-X-Name-First: Laszlo
Author-X-Name-Last: Matyas
Title: Event count estimation
Abstract:
This paper proposes a new estimation procedure called Event Count Estimator (ECE). The estimator is straightforward to implement and is robust against outliers, censoring and ‘excess zeros’ in the data. The paper establishes asymptotic properties of the new estimator and the theoretical results are supported by several Monte Carlo experiments. Monte Carlo experiments also show that the estimator has reasonable properties in moderate to large samples. As such, the cost of trading efficiency for robustness here is negligible from an applied viewpoint. The practical usefulness of the new estimator is demonstrated via an empirical application of the Gravity Model of trade.
Journal: Econometric Reviews
Pages: 147-176
Issue: 2
Volume: 41
Year: 2022
Month: 2
X-DOI: 10.1080/07474938.2020.1862505
File-URL: http://hdl.handle.net/10.1080/07474938.2020.1862505
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:2:p:147-176
Template-Type: ReDIF-Article 1.0
Author-Name: Marta Regis
Author-X-Name-First: Marta
Author-X-Name-Last: Regis
Author-Name: Paulo Serra
Author-X-Name-First: Paulo
Author-X-Name-Last: Serra
Author-Name: Edwin R. van den Heuvel
Author-X-Name-First: Edwin R.
Author-X-Name-Last: van den Heuvel
Title: Random autoregressive models: A structured overview
Abstract:
Models characterized by autoregressive structure and random coefficients are powerful tools for the analysis of high-frequency, high-dimensional and volatile time series. The available literature on such models is broad, but also sector-specific, overlapping, and confusing. Most models focus on one property of the data, while much can be gained by combining the strength of various models and their sources of heterogeneity. We present a structured overview of the literature on autoregressive models with random coefficients. We describe hierarchy and analogies among models, and for each we systematically list properties, estimation methods, tests, software packages and typical applications.
Journal: Econometric Reviews
Pages: 207-230
Issue: 2
Volume: 41
Year: 2022
Month: 2
X-DOI: 10.1080/07474938.2021.1899504
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1899504
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:2:p:207-230
Template-Type: ReDIF-Article 1.0
Author-Name: Kyoo il Kim
Author-X-Name-First: Kyoo il
Author-X-Name-Last: Kim
Title: Semiparametric estimation of signaling games with equilibrium refinement
Abstract:
We study an econometric model of a signaling game where one informed player may have multiple types. For this game, the problem of multiple equilibria arises, and we achieve uniqueness of equilibrium using an equilibrium refinement, which enables us to identify the model parameters. We then develop an estimation strategy that identifies the payoff structure and the distribution of types from the observed actions. In this game, the type distribution is nonparametrically specified, and we estimate the model using a sieve conditional MLE. We establish consistency and asymptotic normality of the structural parameter estimates.
Journal: Econometric Reviews
Pages: 231-267
Issue: 2
Volume: 41
Year: 2022
Month: 2
X-DOI: 10.1080/07474938.2021.1899506
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1899506
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:2:p:231-267
Template-Type: ReDIF-Article 1.0
Author-Name: Yixiao Sun
Author-X-Name-First: Yixiao
Author-X-Name-Last: Sun
Author-Name: Xuexin Wang
Author-X-Name-First: Xuexin
Author-X-Name-Last: Wang
Title: An asymptotically F-distributed Chow test in the presence of heteroscedasticity and autocorrelation
Abstract:
This study proposes a simple, trustworthy Chow test in the presence of heteroscedasticity and autocorrelation. The test is based on a series heteroscedasticity and autocorrelation robust variance estimator with judiciously crafted basis functions. Like the Chow test in a classical normal linear regression, the proposed test employs the standard F distribution as the reference distribution, which is justified under fixed-smoothing asymptotics. Monte Carlo simulations show that the null rejection probability of the asymptotic F test is closer to the nominal level than that of the chi-square test.
Journal: Econometric Reviews
Pages: 177-206
Issue: 2
Volume: 41
Year: 2022
Month: 2
X-DOI: 10.1080/07474938.2021.1874703
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1874703
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:2:p:177-206
Template-Type: ReDIF-Article 1.0
Author-Name: Hugo Kruiniger
Author-X-Name-First: Hugo
Author-X-Name-Last: Kruiniger
Title: Estimation of dynamic panel data models with a lot of heterogeneity
Abstract:
The commonly used 1-step and 2-step System GMM estimators for the panel AR(1) model are inconsistent under mean stationarity when the ratio of the variance of the individual effects to the variance of the idiosyncratic errors is unbounded when N→∞. The reason for their inconsistency is that their weight matrices select moment conditions that do not identify the autoregressive parameter. This paper proposes a new 2-step System estimator that is still consistent in this case provided that T>3. Unlike the commonly used 2-step System estimator, the new estimator uses an estimator of the optimal weight matrix that remains consistent in this case. We also show that the commonly used 1-step and 2-step Arellano-Bond GMM estimators and the Random Effects Quasi MLE remain consistent under the same conditions. To illustrate the usefulness of our new System estimator we revisit the growth study of Levine et al. (2000).
Journal: Econometric Reviews
Pages: 117-146
Issue: 2
Volume: 41
Year: 2022
Month: 2
X-DOI: 10.1080/07474938.2021.1899507
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1899507
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:2:p:117-146
Template-Type: ReDIF-Article 1.0
Author-Name: Hande Karabiyik
Author-X-Name-First: Hande
Author-X-Name-Last: Karabiyik
Author-Name: Joakim Westerlund
Author-X-Name-First: Joakim
Author-X-Name-Last: Westerlund
Author-Name: Paresh Narayan
Author-X-Name-First: Paresh
Author-X-Name-Last: Narayan
Title: Panel data measures of price discovery
Abstract:
This paper considers disaggregated price data that are observed not only for multiple markets over extended periods of time, but also for a large number of assets. The previous literature has argued that in such data-rich environments, which arise frequently in applied work, the analysis of price discovery can be made more precise by accounting for the panel structure of the data. Moreover, since the individual assets are typically not of interest in themselves, little is lost by taking the overall panel perspective. These arguments are, however, mainly based on empirical observations, and there is little in terms of econometric support. The purpose of the present study is to fill this gap in the literature. This is done by offering a full-blown econometric analysis of panel analogs of the information share and permanent–transitory measures of price discovery, which are the workhorses of the time series literature. Both measures are shown to be consistent and to support standard normal inference, which is in contrast to the time series case, where such inference is only possible for the permanent–transitory measure.
Journal: Econometric Reviews
Pages: 269-290
Issue: 3
Volume: 41
Year: 2022
Month: 5
X-DOI: 10.1080/07474938.2021.1912973
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1912973
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:3:p:269-290
Template-Type: ReDIF-Article 1.0
Author-Name: Daniel J. Henderson
Author-X-Name-First: Daniel J.
Author-X-Name-Last: Henderson
Author-Name: Alexandra Soberon
Author-X-Name-First: Alexandra
Author-X-Name-Last: Soberon
Author-Name: Juan M. Rodriguez-Poo
Author-X-Name-First: Juan M.
Author-X-Name-Last: Rodriguez-Poo
Title: Nonparametric multidimensional fixed effects panel data models
Abstract:
Multidimensional panel datasets are routinely employed to identify marginal effects in empirical research. Fixed effects estimators are typically used to deal with potential correlation between unobserved effects and regressors. Nonparametric estimators for one-way fixed effects models exist, but are cumbersome to employ in practice as they typically require iteration, marginal integration or profile estimation. We develop a nonparametric estimator that works for essentially any dimension fixed effects model, has a closed form solution and can be estimated in a single step. A cross-validation bandwidth selection procedure is proposed and asymptotic properties (for either a fixed or large time dimension) are given. Finite sample properties are shown via simulations, as well as with an empirical application, which further extends our model to the partially linear setting.
Journal: Econometric Reviews
Pages: 321-358
Issue: 3
Volume: 41
Year: 2022
Month: 5
X-DOI: 10.1080/07474938.2021.1957283
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1957283
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:3:p:321-358
Template-Type: ReDIF-Article 1.0
Author-Name: Minyu Han
Author-X-Name-First: Minyu
Author-X-Name-Last: Han
Author-Name: Jihun Kwak
Author-X-Name-First: Jihun
Author-X-Name-Last: Kwak
Author-Name: Donggyu Sul
Author-X-Name-First: Donggyu
Author-X-Name-Last: Sul
Title: Two-way fixed effects versus panel factor-augmented estimators: asymptotic comparison among pretesting procedures
Abstract:
Empirical researchers may wonder whether or not a two-way fixed effects estimator (with individual and period fixed effects) is sufficiently sophisticated to isolate the influence of common shocks on the estimation of slope coefficients. If it is not, practitioners need to run the so-called panel factor augmented regression instead. There are two pretesting procedures available in the literature: the use of the estimated number of factors and the direct test of estimated factor loading coefficients. This article compares the two pretesting methods asymptotically. In the presence of heterogeneous factor loadings, both pretesting procedures suggest using the common correlated effects (CCE) estimator. Meanwhile, when factor loadings are homogeneous, the pretesting method utilizing the estimated number of factors always suggests more efficient estimation methods. By comparing asymptotic variances, this article finds that when the slope coefficients are homogeneous with homogeneous factor loadings, the two-way fixed effects estimation is more efficient than the CCE estimation. However, when the slope coefficients are heterogeneous with homogeneous factor loadings, the CCE estimation is, surprisingly, more efficient than the two-way fixed effects estimation. By means of Monte Carlo simulations, we verify the asymptotic claims. We demonstrate how to use the two pretesting methods through the use of an empirical example.
Journal: Econometric Reviews
Pages: 291-320
Issue: 3
Volume: 41
Year: 2022
Month: 5
X-DOI: 10.1080/07474938.2021.1957282
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1957282
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:3:p:291-320
Template-Type: ReDIF-Article 1.0
Author-Name: Giacomo Benini
Author-X-Name-First: Giacomo
Author-X-Name-Last: Benini
Author-Name: Stefan Sperlich
Author-X-Name-First: Stefan
Author-X-Name-Last: Sperlich
Title: Modeling heterogeneous treatment effects in the presence of endogeneity
Abstract:
An inappropriate handling of cross-sectional heterogeneity renders estimates of causal effects inaccurate and uninformative. The present paper discusses how the direct modeling of cross-sectional differences via semiparametric models represents a useful bridge between a statistical approach, where the conditional distribution of the dependent variable returns any value of the outcome given any value of the explanatory variables, and an econometric analysis, where functions and parameters have direct policy implications. The explicit modeling of heterogeneity across different groups improves the quality of the estimates, mitigates their dependence upon the chosen instrumental variable, diminishes the self-selection problem, and fosters the acquisition of useful information for the entire sample.
Journal: Econometric Reviews
Pages: 359-372
Issue: 3
Volume: 41
Year: 2022
Month: 5
X-DOI: 10.1080/07474938.2021.1927548
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1927548
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:3:p:359-372
Template-Type: ReDIF-Article 1.0
Author-Name: Kyoo il Kim
Author-X-Name-First: Kyoo il
Author-X-Name-Last: Kim
Author-Name: Suyong Song
Author-X-Name-First: Suyong
Author-X-Name-Last: Song
Title: Control variables approach to estimate semiparametric models of mismeasured endogenous regressors with an application to U.K. twin data
Abstract:
We study the identification and estimation of semiparametric models with mismeasured endogenous regressors using control variables that ensure the conditional covariance restriction on endogenous regressors and unobserved causes. We provide a set of sufficient conditions for identification, which control for both endogeneity and measurement error. We propose a sieve-based estimator and derive its asymptotic properties. Given the sieve approximation, our proposed estimator is easy to implement as weighted least squares. Monte Carlo simulations illustrate that our proposed estimator performs well in finite samples. In an empirical application, we estimate the return to education on earnings using U.K. twin data, in which self-reported education is potentially measured with error and is also correlated with unobserved factors. Our approach utilizes the twin’s reported education as a control variable to obtain consistent estimates. We find that a one-year increase in education leads to an 11% increase in hourly wage. The estimate is significantly higher than those from OLS and IV approaches, which are potentially biased. The application underscores that our proposed estimator is useful to correct for both endogeneity and measurement error in estimating returns to education.
Journal: Econometric Reviews
Pages: 448-483
Issue: 4
Volume: 41
Year: 2022
Month: 4
X-DOI: 10.1080/07474938.2021.1960752
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1960752
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:4:p:448-483
Template-Type: ReDIF-Article 1.0
Author-Name: Yingyao Hu
Author-X-Name-First: Yingyao
Author-X-Name-Last: Hu
Author-Name: Ji-Liang Shiu
Author-X-Name-First: Ji-Liang
Author-X-Name-Last: Shiu
Title: A simple test of completeness in a class of nonparametric specification
Abstract:
This paper provides a test for completeness in a class of nonparametric specification with an additive and independent error term. It is known that such a nonparametric location family of functions is complete if and only if the characteristic function of the error term has no zeros on the real line. Because a zero of the error characteristic function implies a zero of an observed marginal distribution, we propose a simple test for zeros of the characteristic function of the observed distribution, in which rejection of the null hypothesis implies completeness. This test is applicable to many popular settings, such as nonparametric regression models with instrumental variables and nonclassical measurement error models. We describe the asymptotic behavior of the tests under the null and alternative hypotheses and investigate the finite sample properties of the proposed test through a Monte Carlo study. We illustrate our method empirically by estimating a measurement error model using the CPS/SSR 1978 exact match file.
Journal: Econometric Reviews
Pages: 373-399
Issue: 4
Volume: 41
Year: 2022
Month: 4
X-DOI: 10.1080/07474938.2021.1957285
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1957285
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:4:p:373-399
Template-Type: ReDIF-Article 1.0
Author-Name: Pavel Čížek
Author-X-Name-First: Pavel
Author-X-Name-Last: Čížek
Author-Name: Chao Hui Koo
Author-X-Name-First: Chao Hui
Author-X-Name-Last: Koo
Title: Semiparametric transition models
Abstract:
A new semiparametric time series model is introduced – the semiparametric transition (SETR) model – that generalizes the threshold and smooth transition models by allowing the transition function to be of an unknown form. Estimation is based on a combination of the (local) least squares estimations of the transition function and regression parameters. The asymptotic behavior of the regression coefficient estimator of the SETR model is established, including its oracle property. Monte Carlo simulations demonstrate that the proposed estimator is more robust to the form of the transition function than parametric threshold and smooth transition methods and more precise than varying coefficient estimators.
Journal: Econometric Reviews
Pages: 400-415
Issue: 4
Volume: 41
Year: 2022
Month: 4
X-DOI: 10.1080/07474938.2021.1957281
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1957281
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:4:p:400-415
Template-Type: ReDIF-Article 1.0
Author-Name: Alexander Chudik
Author-X-Name-First: Alexander
Author-X-Name-Last: Chudik
Author-Name: M. Hashem Pesaran
Author-X-Name-First: M. Hashem
Author-X-Name-Last: Pesaran
Title: An augmented Anderson–Hsiao estimator for dynamic short-T panels†
Abstract:
This article introduces the idea of self-instrumenting endogenous regressors in settings where the correlation between these regressors and the errors can be derived and used to bias-correct the moment conditions. The resulting bias-corrected moment conditions are less likely to be subject to the weak instrument problem and can be used on their own or in conjunction with other available moment conditions to obtain more efficient estimators. This approach can be applied to the estimation of a variety of models, such as spatial and dynamic panel data models. This article focuses on the latter and proposes a new estimator for short-T dynamic panels by augmenting the Anderson and Hsiao (AH) estimator with bias-corrected quadratic moment conditions in first differences, which substantially improve the small sample performance of the AH estimator without sacrificing the generality of its underlying assumptions regarding the fixed effects, initial values, and heteroskedasticity of the error terms. Using Monte Carlo experiments, it is shown that the augmented AH (AAH) estimator represents a substantial improvement over the AH estimator and, more importantly, performs well even when compared to the Arellano and Bond and Blundell and Bond (BB) estimators that are based on more restrictive assumptions, and continues to have satisfactory performance in cases where the standard GMM estimators are inconsistent. Finally, to decide between the AAH and BB estimators, we also propose a Hausman-type test which is shown to work well when T is small and n sufficiently large.
Journal: Econometric Reviews
Pages: 416-447
Issue: 4
Volume: 41
Year: 2022
Month: 4
X-DOI: 10.1080/07474938.2021.1971388
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1971388
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:4:p:416-447
Template-Type: ReDIF-Article 1.0
Author-Name: Paul Bekker
Author-X-Name-First: Paul
Author-X-Name-Last: Bekker
Author-Name: Joëlle van Essen
Author-X-Name-First: Joëlle
Author-X-Name-Last: van Essen
Title: ML and GMM with concentrated instruments in the static panel data model
Abstract:
We study the asymptotic behavior of instrumental variable estimators in the static panel model under many-instruments asymptotics. We provide new estimators and standard errors based on concentrated instruments as alternatives to an estimator based on maximum likelihood. We prove that the latter estimator is consistent under many-instruments asymptotics only if the starting value in an iterative procedure is root-N consistent. A similar approach for continuous updating GMM shows the derivation is nontrivial. For the standard cross-sectional case (T = 1), the simple formulation of standard errors offers an alternative to earlier formulations.
Journal: Econometric Reviews
Pages: 181-195
Issue: 2
Volume: 39
Year: 2019
Month: 12
X-DOI: 10.1080/07474938.2019.1580946
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1580946
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2019:i:2:p:181-195
Template-Type: ReDIF-Article 1.0
Author-Name: Zhengyu Zhang
Author-X-Name-First: Zhengyu
Author-X-Name-Last: Zhang
Author-Name: Zequn Jin
Author-X-Name-First: Zequn
Author-X-Name-Last: Jin
Title: Identification and estimation in a linear correlated random coefficients model with censoring
Abstract:
In this paper, we study the identification and estimation of a linear correlated random coefficients model with censoring, namely, Y=max{B0+X′B,C}, where C is a known constant or an unknown function of regressors. Here, random coefficients (B0,B) can be correlated with one or more components of X. Under a generalized conditional median restriction similar to that in Hoderlein and Sherman, we show that both the average partial effect and the average partial effect on the treated are identified. We develop estimators for the identified parameters and analyze their large sample properties. A Monte Carlo simulation indicates that our estimators perform reasonably well with small samples. We then present an application.
Journal: Econometric Reviews
Pages: 196-213
Issue: 2
Volume: 39
Year: 2019
Month: 12
X-DOI: 10.1080/07474938.2019.1580949
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1580949
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2019:i:2:p:196-213
Template-Type: ReDIF-Article 1.0
Author-Name: Laura Magazzini
Author-X-Name-First: Laura
Author-X-Name-Last: Magazzini
Author-Name: Giorgio Calzolari
Author-X-Name-First: Giorgio
Author-X-Name-Last: Calzolari
Title: Testing initial conditions in dynamic panel data models
Abstract:
We propose a new framework for testing the “mean stationarity” assumption in dynamic panel data models, required for the consistency of the system GMM estimator. In our setup the assumption is obtained as a parametric restriction in an extended set of moment conditions, allowing the use of an LM test to check its validity. Our framework provides a ranking in terms of power of the analyzed test statistics: our approach exhibits better power than the difference-in-Sargan/Hansen test that compares system GMM and difference GMM, which is, in turn, more powerful than the Sargan/Hansen test based on the system GMM moment conditions.
Journal: Econometric Reviews
Pages: 115-134
Issue: 2
Volume: 39
Year: 2019
Month: 12
X-DOI: 10.1080/07474938.2019.1690194
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690194
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2019:i:2:p:115-134
Template-Type: ReDIF-Article 1.0
Author-Name: Michael S. Delgado
Author-X-Name-First: Michael S.
Author-X-Name-Last: Delgado
Author-Name: Deniz Ozabaci
Author-X-Name-First: Deniz
Author-X-Name-Last: Ozabaci
Author-Name: Yiguo Sun
Author-X-Name-First: Yiguo
Author-X-Name-Last: Sun
Author-Name: Subal C. Kumbhakar
Author-X-Name-First: Subal C.
Author-X-Name-Last: Kumbhakar
Title: Smooth coefficient models with endogenous environmental variables
Abstract:
We develop a three-step, oracle-efficient estimator for a structural semiparametric smooth coefficient model with endogenous variables in the nonparametric part of the model. We use a control function approach, combined with both series and kernel estimators to obtain consistent and asymptotically normal estimators of the functions and their partial derivatives. We develop a residual-based test statistic for testing endogeneity, and demonstrate the finite sample performance of our estimators, as well as our test, via Monte Carlo simulations. Finally, we develop an application of our estimator to the relationship between public benefits and private savings.
Journal: Econometric Reviews
Pages: 158-180
Issue: 2
Volume: 39
Year: 2019
Month: 12
X-DOI: 10.1080/07474938.2018.1552413
File-URL: http://hdl.handle.net/10.1080/07474938.2018.1552413
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2019:i:2:p:158-180
Template-Type: ReDIF-Article 1.0
Author-Name: Hervé Cardot
Author-X-Name-First: Hervé
Author-X-Name-Last: Cardot
Author-Name: Antonio Musolesi
Author-X-Name-First: Antonio
Author-X-Name-Last: Musolesi
Title: Modeling temporal treatment effects with zero inflated semi-parametric regression models: The case of local development policies in France
Abstract:
A semi-parametric approach is proposed to estimate the variation along time of the effects of two distinct public policies that were designed to boost rural development in France over a similar period of time. At a micro data level, it is often observed that the dependent variable, such as local employment, does not vary along time, so that we face a kind of zero inflated phenomenon that cannot be dealt with using a continuous response model. We introduce a conditional mixture model which combines a mass at zero and a continuous response. The suggested zero inflated semi-parametric statistical approach combines the flexibility and modularity of additive models with the ability of panel data to deal with selection bias and to allow for the estimation of dynamic treatment effects. In this multiple treatment analysis, we find evidence of interesting patterns of temporal treatment effects with relevant nonlinear policy effects. The adopted semi-parametric modeling also offers the possibility of making a counterfactual analysis at an individual level. The methodology is illustrated and compared with parametric linear approaches on a few municipalities for which the mean evolution of the potential outcomes is estimated under the different possible treatments.
Journal: Econometric Reviews
Pages: 135-157
Issue: 2
Volume: 39
Year: 2019
Month: 12
X-DOI: 10.1080/07474938.2019.1690193
File-URL: http://hdl.handle.net/10.1080/07474938.2019.1690193
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:39:y:2019:i:2:p:135-157
Template-Type: ReDIF-Article 1.0
Author-Name: Natalia Bailey
Author-X-Name-First: Natalia
Author-X-Name-Last: Bailey
Author-Name: Dandan Jiang
Author-X-Name-First: Dandan
Author-X-Name-Last: Jiang
Author-Name: Jianfeng Yao
Author-X-Name-First: Jianfeng
Author-X-Name-Last: Yao
Title: A RMT-based LM test for error cross-sectional independence in large heterogeneous panel data models*
Abstract:
This paper introduces a new test for error cross-sectional independence in large panel data models with exogenous regressors having heterogeneous slope coefficients. The proposed statistic, LMRMT, is based on the Lagrange Multiplier (LM) principle and the sample correlation matrix R^N of the model’s residuals. Since in large panels R^N poorly estimates its population counterpart, results from Random Matrix Theory (RMT) are used to establish the high-dimensional limiting distribution of LMRMT under heteroskedastic normal errors and assuming that both the panel size N and the sample size T grow to infinity in comparable magnitude. Simulation results show that LMRMT is largely correctly sized (except for some small values of N and T). Further, the empirical size and power outcomes show robustness of our statistic to deviations from the assumptions of normality for the error terms and of strict exogeneity for the regressors. The test has comparable small sample properties to related tests in the literature which have been developed under different asymptotic theory.
Journal: Econometric Reviews
Pages: 564-582
Issue: 5
Volume: 41
Year: 2022
Month: 6
X-DOI: 10.1080/07474938.2021.2009705
File-URL: http://hdl.handle.net/10.1080/07474938.2021.2009705
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:5:p:564-582
Template-Type: ReDIF-Article 1.0
Author-Name: Amaresh K. Tiwari
Author-X-Name-First: Amaresh K.
Author-X-Name-Last: Tiwari
Title: A control function approach to estimate panel data binary response model
Abstract:
We propose a new control function (CF) method to estimate a binary response model in a triangular system with multiple unobserved heterogeneities. The CFs are the expected values of the heterogeneity terms in the reduced form equations conditional on the histories of the endogenous and the exogenous variables. The method requires weaker restrictions compared to CF methods with similar imposed structures. If the support of the endogenous regressors is large, average partial effects are point-identified even when instruments are discrete. Bounds are provided when the support assumption is violated. An application and Monte Carlo experiments compare several alternative methods with ours.
Journal: Econometric Reviews
Pages: 505-538
Issue: 5
Volume: 41
Year: 2022
Month: 6
X-DOI: 10.1080/07474938.2021.1983328
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1983328
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:5:p:505-538
Template-Type: ReDIF-Article 1.0
Author-Name: Kien C. Tran
Author-X-Name-First: Kien C.
Author-X-Name-Last: Tran
Author-Name: Mike G. Tsionas
Author-X-Name-First: Mike G.
Author-X-Name-Last: Tsionas
Title: Efficient semiparametric copula estimation of regression models with endogeneity
Abstract:
An efficient sieve maximum likelihood estimation procedure for regression models with endogenous regressors using a copula-based approach is proposed. Specifically, the joint distribution of the endogenous regressor and the error term is characterized by a parametric copula function evaluated at the nonparametric marginal distributions. The asymptotic properties of the proposed estimator are derived, including its semiparametric efficiency. Monte Carlo simulations reveal that the proposed method performs well in finite samples compared to other existing methods. An empirical application is presented to demonstrate the usefulness of the proposed approach.
Journal: Econometric Reviews
Pages: 485-504
Issue: 5
Volume: 41
Year: 2022
Month: 6
X-DOI: 10.1080/07474938.2021.1957284
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1957284
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:5:p:485-504
Template-Type: ReDIF-Article 1.0
Author-Name: Siyang Peng
Author-X-Name-First: Siyang
Author-X-Name-Last: Peng
Author-Name: Shaojun Guo
Author-X-Name-First: Shaojun
Author-X-Name-Last: Guo
Author-Name: Yonghong Long
Author-X-Name-First: Yonghong
Author-X-Name-Last: Long
Title: Large dimensional portfolio allocation based on a mixed frequency dynamic factor model
Abstract:
In this paper, we propose a mixed-frequency dynamic factor model (MFDFM) taking into account the high-frequency variation and low-frequency variation at the same time. The factor loadings in our model are affected by the past quadratic variation of factor returns, while the process of the factor quadratic variation is under a mixed-frequency framework (DCC-RV). By combining the variations from the high-frequency and low-frequency domains, our approach exhibits a better estimation and forecast of the asset covariance matrix. Our empirical study compares our MFDFM model with the sample realized covariance matrix and the traditional factor model with intraday returns or daily returns. The results of the empirical study indicate that our proposed model indeed outperforms other models in the sense that Markowitz’s portfolios based on the MFDFM perform better.
Journal: Econometric Reviews
Pages: 539-563
Issue: 5
Volume: 41
Year: 2022
Month: 6
X-DOI: 10.1080/07474938.2021.1983327
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1983327
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:5:p:539-563
Template-Type: ReDIF-Article 1.0
Author-Name: Stan Hurn
Author-X-Name-First: Stan
Author-X-Name-Last: Hurn
Author-Name: Vance L. Martin
Author-X-Name-First: Vance L.
Author-X-Name-Last: Martin
Author-Name: Lina Xu
Author-X-Name-First: Lina
Author-X-Name-Last: Xu
Title: Specification tests for univariate diffusions
Abstract:
A new class of specification tests for stochastic differential equations (SDE) is proposed to determine whether the probability integral transform of the estimated model generates an independent and identically distributed uniform random variable. The tests are based on Neyman’s smooth test, appropriately adjusted to correct for both the size distortion arising from having to estimate the unknown parameters of the SDE and possible dependence in the uniform random variable. The suite of tests is compared against other commonly used specification tests for SDEs. The finite sample properties of the tests are investigated using a range of Monte Carlo experiments. The tests are then applied to testing the specification of SDEs used to model the spot interest rate and financial asset volatility.
Journal: Econometric Reviews
Pages: 607-632
Issue: 6
Volume: 41
Year: 2022
Month: 7
X-DOI: 10.1080/07474938.2021.1995683
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1995683
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:6:p:607-632
Template-Type: ReDIF-Article 1.0
Author-Name: Francesco Bravo
Author-X-Name-First: Francesco
Author-X-Name-Last: Bravo
Title: Second order expansions of estimators in nonparametric moment conditions models with weakly dependent data
Abstract:
This paper considers estimation of nonparametric moment conditions models with weakly dependent data. The estimator is based on a local linear version of the generalized empirical likelihood approach, and is an alternative to the popular local linear generalized method of moment estimator. The paper derives uniform convergence rates and pointwise asymptotic normality of the resulting local linear generalized empirical likelihood estimator. The paper also develops second order stochastic expansions (under a standard undersmoothing condition) that explain the better finite sample performance of the local linear generalized empirical likelihood estimator compared to that of the efficient local linear generalized method of moments estimator, and can be used to obtain (second order) bias corrected estimators. Monte Carlo simulations and an empirical application illustrate the competitive finite sample properties and the usefulness of the proposed estimators and second order bias corrections.
Journal: Econometric Reviews
Pages: 583-606
Issue: 6
Volume: 41
Year: 2022
Month: 7
X-DOI: 10.1080/07474938.2021.1991140
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1991140
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:6:p:583-606
Template-Type: ReDIF-Article 1.0
Author-Name: Dalia Ghanem
Author-X-Name-First: Dalia
Author-X-Name-Last: Ghanem
Title: A James-Stein-type adjustment to bias correction in fixed effects panel models
Abstract:
This paper proposes a James-Stein-type (JS) adjustment to analytical bias correction in fixed effects panel models that suffer from the incidental parameters problem. We provide high-level conditions under which the infeasible JS adjustment leads to a higher-order MSE improvement over the bias-corrected estimator, and the former is asymptotically equivalent to the latter. To obtain a feasible JS adjustment, we propose a nonparametric bootstrap procedure to estimate the JS weighting matrix and provide conditions for its consistency. We apply the JS adjustment to two models: (1) the linear autoregressive model with fixed effects, (2) the nonlinear static fixed effects model. For each application, we employ Monte Carlo simulations which confirm the theoretical results and illustrate the finite-sample improvements due to the JS adjustment. Finally, the extension of the JS procedure to a more general class of models and other policy parameters are illustrated.
Journal: Econometric Reviews
Pages: 633-651
Issue: 6
Volume: 41
Year: 2022
Month: 7
X-DOI: 10.1080/07474938.2021.1996994
File-URL: http://hdl.handle.net/10.1080/07474938.2021.1996994
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:6:p:633-651
Template-Type: ReDIF-Article 1.0
Author-Name: Fei Jin
Author-X-Name-First: Fei
Author-X-Name-Last: Jin
Author-Name: Yuqin Wang
Author-X-Name-First: Yuqin
Author-X-Name-Last: Wang
Title: GMM estimation of a spatial autoregressive model with autoregressive disturbances and endogenous regressors
Abstract:
This paper considers the generalized method of moments (GMM) estimation of a spatial autoregressive (SAR) model with SAR disturbances, where we allow for endogenous regressors in addition to a spatial lag of the dependent variable. We do not assume any reduced form of the endogenous regressors; thus we allow for spatial dependence and heterogeneity in endogenous regressors, and allow for nonlinear relations between endogenous regressors and their instruments. Innovations in the model can be homoskedastic or heteroskedastic with unknown forms. We prove that GMM estimators with linear and quadratic moments are consistent and asymptotically normal. In the homoskedastic case, we derive the best linear and quadratic moments that can generate an optimal GMM estimator with the minimum asymptotic variance.
Journal: Econometric Reviews
Pages: 652-674
Issue: 6
Volume: 41
Year: 2022
Month: 7
X-DOI: 10.1080/07474938.2021.2002521
File-URL: http://hdl.handle.net/10.1080/07474938.2021.2002521
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:6:p:652-674
Template-Type: ReDIF-Article 1.0
# input file: catalog-resolver5897852068298930950.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220713T202513 git hash: 99d3863004
Author-Name: Erik Meijer
Author-X-Name-First: Erik
Author-X-Name-Last: Meijer
Author-Name: Laura Spierdijk
Author-X-Name-First: Laura
Author-X-Name-Last: Spierdijk
Author-Name: Tom Wansbeek
Author-X-Name-First: Tom
Author-X-Name-Last: Wansbeek
Title: Moment conditions for the quadratic regression model with measurement error
Abstract:
We consider a new estimator for the quadratic errors-in-variables model that exploits higher-order moment conditions under the assumption that the distribution of the measurement error is symmetric and free of excess kurtosis. Our approach contributes to the literature by not requiring any side information and by straightforwardly allowing for one or more error-free control variables. We propose a Wald-type statistical test, based on an auxiliary method-of-moments estimator, to verify a necessary condition for our estimator’s consistency. We derive the asymptotic properties of the estimator and the statistical test and illustrate their finite-sample properties by means of a simulation study and an empirical application to existing data from the literature. Our simulations show that the method-of-moments estimator performs well in terms of bias and variance and even exhibits a certain degree of robustness to the distributional assumptions about the measurement error. In the simulation experiments where such robustness is not present, our statistical test already has high power for relatively small samples.
Journal: Econometric Reviews
Pages: 749-774
Issue: 7
Volume: 41
Year: 2022
Month: 8
X-DOI: 10.1080/07474938.2022.2052666
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2052666
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:7:p:749-774
Template-Type: ReDIF-Article 1.0
# input file: catalog-resolver7011080076405554254.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220713T202513 git hash: 99d3863004
Author-Name: Ye Yang
Author-X-Name-First: Ye
Author-X-Name-Last: Yang
Title: Unified M-estimation of matrix exponential spatial dynamic panel specification
Abstract:
In this paper, a unified M-estimation method in Yang (2018) is extended to the matrix exponential spatial dynamic panel specification (MESDPS) with fixed effects in short panels. Similar to the STLE model which includes the spatial lag effect, the space-time effect and the spatial error effect in Yang (2018), the quasi-maximum likelihood (QML) estimation for MESDPS also has the initial condition specification problem. The initial-condition free M-estimator in this paper solves this problem and is proved to be consistent and asymptotically normal. An outer product of martingale difference (OPMD) estimator for the variance-covariance (VC) matrix of the M-estimator is also derived and proved to be consistent. The finite sample property of the M-estimator is studied through an extensive Monte Carlo study. The method is applied to US outward FDI data to show its validity.
Journal: Econometric Reviews
Pages: 729-748
Issue: 7
Volume: 41
Year: 2022
Month: 8
X-DOI: 10.1080/07474938.2022.2039494
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2039494
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:7:p:729-748
Template-Type: ReDIF-Article 1.0
# input file: catalog-resolver2954190192561525714.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220713T202513 git hash: 99d3863004
Author-Name: Shuo Li
Author-X-Name-First: Shuo
Author-X-Name-Last: Li
Author-Name: Liuhua Peng
Author-X-Name-First: Liuhua
Author-X-Name-Last: Peng
Author-Name: Yundong Tu
Author-X-Name-First: Yundong
Author-X-Name-Last: Tu
Title: Testing independence between exogenous variables and unobserved errors
Abstract:
Although the exogeneity condition is usually used in many econometric models to identify parameters, the stronger restriction that the error term is independent of a vector of exogenous variables might lead to theoretical benefits. In this paper, we develop a unified methodology for testing the independence assumption. Our methodology can deal with a wide class of parametric models and allows for endogeneity and instrumental variables. As a first step, we construct tests that are continuous functionals of the estimated difference between the joint distribution and the product of the marginal distributions. Next, to remedy the dimensionality issue that arises when the dimension of the exogenous random vector is large, we propose a multiple testing approach which combines marginal p-values obtained by employing the original tests to test independence between the error term and each exogenous variable, while taking full account of the multiplicity nature of the testing problem. We obtain null limiting distributions of our tests, establish the testing consistency, and justify the sensitivity to n^{-1/2}-local alternatives, with n the sample size. The multiplier bootstrap is employed to estimate the critical values. Our methodology is illustrated in the linear regression, the instrumental variables regression, and the nonlinear quantile regression. Our tests are found to perform well in simulations and are demonstrated via an empirical example.
Journal: Econometric Reviews
Pages: 697-728
Issue: 7
Volume: 41
Year: 2022
Month: 8
X-DOI: 10.1080/07474938.2022.2039493
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2039493
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:7:p:697-728
Template-Type: ReDIF-Article 1.0
# input file: catalog-resolver2363472726524469405.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220713T202513 git hash: 99d3863004
Author-Name: Jiahui Zou
Author-X-Name-First: Jiahui
Author-X-Name-Last: Zou
Author-Name: Wendun Wang
Author-X-Name-First: Wendun
Author-X-Name-Last: Wang
Author-Name: Xinyu Zhang
Author-X-Name-First: Xinyu
Author-X-Name-Last: Zhang
Author-Name: Guohua Zou
Author-X-Name-First: Guohua
Author-X-Name-Last: Zou
Title: Optimal model averaging for divergent-dimensional Poisson regressions
Abstract:
This paper proposes a new model averaging method to address model uncertainty in Poisson regressions, allowing the dimension of covariates to increase with the sample size. We derive an unbiased estimator of the Kullback–Leibler (KL) divergence to choose averaging weights. We show that when all candidate models are misspecified, the proposed estimate is asymptotically optimal by achieving the least KL divergence among all possible averaging estimators. In another situation where correct models exist in the model space, our method can produce consistent coefficient estimates. We apply the proposed techniques to study the determinants and predict corporate innovation outcomes measured by the number of patents.
Journal: Econometric Reviews
Pages: 775-805
Issue: 7
Volume: 41
Year: 2022
Month: 8
X-DOI: 10.1080/07474938.2022.2047508
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2047508
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:7:p:775-805
Template-Type: ReDIF-Article 1.0
# input file: catalog-resolver8564431966767516299.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220713T202513 git hash: 99d3863004
Author-Name: Jack Fosten
Author-X-Name-First: Jack
Author-X-Name-Last: Fosten
Author-Name: Ryan Greenaway-McGrevy
Author-X-Name-First: Ryan
Author-X-Name-Last: Greenaway-McGrevy
Title: Panel data nowcasting
Abstract:
This article promotes the use of panel data methods in nowcasting. This shifts the focus of the literature from national to regional nowcasting of variables like gross domestic product (GDP). We propose a mixed-frequency panel VAR model and a bias-corrected least squares estimator which attenuates the bias in fixed effects dynamic panel settings. Simulations show that panel forecast model selection and combination methods are successfully adapted to the nowcasting setting. Our novel empirical application of nowcasting quarterly U.S. state-level real GDP growth highlights the success of state-level nowcasting, as well as the gains from pooling information across states.
Journal: Econometric Reviews
Pages: 675-696
Issue: 7
Volume: 41
Year: 2022
Month: 8
X-DOI: 10.1080/07474938.2021.2017670
File-URL: http://hdl.handle.net/10.1080/07474938.2021.2017670
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:7:p:675-696
Template-Type: ReDIF-Article 1.0
# input file: catalog-resolver-4356618351439161000.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220713T202513 git hash: 99d3863004
Author-Name: Aubrey Poon
Author-X-Name-First: Aubrey
Author-X-Name-Last: Poon
Author-Name: Dan Zhu
Author-X-Name-First: Dan
Author-X-Name-Last: Zhu
Title: A new Bayesian model for contagion and interdependence
Abstract:
We develop a flexible Bayesian time-varying parameter model with a Leamer correction to measure contagion and interdependence. Our proposed framework facilitates a model-based identification mechanism for static and dynamic interdependence. We also allow for fat-tailed stochastic volatility within the model, which enables us to capture volatility clustering and outliers in high-frequency financial data. We apply our new proposed framework to two empirical applications: the Chilean foreign exchange market during the Argentine crisis of 2001 and the recent Covid-19 pandemic in the United Kingdom. We find no evidence of contagion effects from Argentina or Brazil to Chile, and three additional key insights compared to the Ciccarelli and Rebucci (2006) study. For the Covid-19 pandemic application, our results convey that the United Kingdom government was largely ineffective in preventing the importation of Covid-19 cases from European countries during the second wave of the pandemic.
Journal: Econometric Reviews
Pages: 806-826
Issue: 7
Volume: 41
Year: 2022
Month: 8
X-DOI: 10.1080/07474938.2022.2072319
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2072319
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:7:p:806-826
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2073743_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220804T044749 git hash: 24b08f8188
Author-Name: Barbara Brune
Author-X-Name-First: Barbara
Author-X-Name-Last: Brune
Author-Name: Wolfgang Scherrer
Author-X-Name-First: Wolfgang
Author-X-Name-Last: Scherrer
Author-Name: Efstathia Bura
Author-X-Name-First: Efstathia
Author-X-Name-Last: Bura
Title: A state-space approach to time-varying reduced-rank regression
Abstract:
We propose a new approach to reduced-rank regression that allows for time-variation in the regression coefficients. The Kalman filter based estimation allows the use of standard methods and easy implementation of our procedure, and the EM-algorithm ensures convergence to a local maximum of the likelihood. Our time-varying reduced-rank regression approach performs well in simulations, with an amplified competitive advantage in time series that experience large structural changes. We illustrate the performance of our approach with a simulation study and two applications to stock index and Covid-19 case data.
Journal: Econometric Reviews
Pages: 895-917
Issue: 8
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2073743
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2073743
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:8:p:895-917
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2047507_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220804T044749 git hash: 24b08f8188
Author-Name: Ye Yang
Author-X-Name-First: Ye
Author-X-Name-Last: Yang
Author-Name: Osman Doğan
Author-X-Name-First: Osman
Author-X-Name-Last: Doğan
Author-Name: Suleyman Taspinar
Author-X-Name-First: Suleyman
Author-X-Name-Last: Taspinar
Title: Model selection and model averaging for matrix exponential spatial models
Abstract:
In this paper, we focus on a model specification problem in spatial econometric models when an empiricist needs to choose from a pool of candidates for the spatial weights matrix. We propose a model selection (MS) procedure for the matrix exponential spatial specification (MESS), when the true spatial weights matrix may not be in the set of candidate spatial weights matrices. We show that the selection estimator is asymptotically optimal in the sense that asymptotically it is as efficient as the infeasible estimator that uses the best candidate spatial weights matrix. The proposed selection procedure is also consistent in the sense that when the data generating process involves spatial effects, it chooses the true spatial weights matrix with probability approaching one in large samples. We also propose a model averaging (MA) estimator that compromises across a set of candidate models. We show that it is asymptotically optimal. We further flesh out how to extend the proposed selection and averaging schemes to higher order specifications and to the MESS with heteroscedasticity. Our Monte Carlo simulation results indicate that the MS and MA estimators perform well in finite samples. We also illustrate the usefulness of the proposed MS and MA schemes in a spatially augmented economic growth model.
Journal: Econometric Reviews
Pages: 827-858
Issue: 8
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2047507
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2047507
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:8:p:827-858
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2074188_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220804T044749 git hash: 24b08f8188
Author-Name: Wen Xu
Author-X-Name-First: Wen
Author-X-Name-Last: Xu
Title: Testing for time-varying factor loadings in high-dimensional factor models
Abstract:
This paper proposes a test for structural changes in factor loadings in high-dimensional factor models under weak serial and cross-sectional dependence. The test is an aggregate statistic in the form of the maximum of the variable-specific statistics whose asymptotic null distribution and local power property are studied. Two approaches including extreme value theory and Bonferroni correction are adopted to compute the critical values of the aggregate test statistic. Monte Carlo simulations reveal the non-trivial power of the proposed test against various types of structural changes, including abrupt changes, nonrandom smooth changes, random-walk variations and stationary variations. Additionally, our test can be more powerful than some alternative tests in the considered scenarios. The usefulness of the test is illustrated by an empirical application to Stock and Watson’s U.S. data set.
Journal: Econometric Reviews
Pages: 918-965
Issue: 8
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2074188
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2074188
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:8:p:918-965
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2072323_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220804T044749 git hash: 24b08f8188
Author-Name: Dimitra Kyriakopoulou
Author-X-Name-First: Dimitra
Author-X-Name-Last: Kyriakopoulou
Author-Name: Christian M. Hafner
Author-X-Name-First: Christian M.
Author-X-Name-Last: Hafner
Title: Reconciling negative return skewness with positive time-varying risk premia
Abstract:
One of the implications of the intertemporal capital asset pricing model (ICAPM) is a positive and linear relationship between the conditional mean and conditional variance of returns to the market portfolio. Empirically, however, it is often observed that there is a negative skewness in equity returns. This article shows that a negative skewness is only compatible with a positive risk premium if the innovation distribution is asymmetric with a negative skewness. We extend recent work using the EGARCH-in-Mean specification to allow for asymmetric innovations, and give results for the unconditional skewness of returns. We apply the model to the prediction of Value-at-Risk of the largest stock market indices, and demonstrate its good performance.
Keywords: Exponential GARCH, in-mean, risk premium, ICAPM, unconditional skewness, asymmetric distribution, portfolio selection, Value-at-Risk
Journal: Econometric Reviews
Pages: 877-894
Issue: 8
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2072323
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2072323
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:8:p:877-894
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2072321_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220804T044749 git hash: 24b08f8188
Author-Name: Chuhui Li
Author-X-Name-First: Chuhui
Author-X-Name-Last: Li
Author-Name: Donald S. Poskitt
Author-X-Name-First: Donald S.
Author-X-Name-Last: Poskitt
Author-Name: Frank Windmeijer
Author-X-Name-First: Frank
Author-X-Name-Last: Windmeijer
Author-Name: Xueyan Zhao
Author-X-Name-First: Xueyan
Author-X-Name-Last: Zhao
Title: Binary outcomes, OLS, 2SLS and IV probit
Abstract:
For a binary outcome Y, generated by a simple threshold crossing model with a single exogenous normally distributed explanatory variable X, the OLS estimator of the coefficient on X in a linear probability model is a consistent estimator of the average partial effect of X. Even in this very simple setting, we show that when allowing for X to be endogenously determined, the 2SLS estimator, using a normally distributed instrumental variable Z, does not identify the same causal parameter. It instead estimates the average partial effect of Z, scaled by the coefficient on Z in the linear first-stage model for X, denoted γ1, or equivalently, it estimates the average partial effect of the population predicted value of X, Zγ1. These causal parameters can differ substantially as we show for the normal Probit model, which implies that care has to be taken when interpreting 2SLS estimation results in a linear probability model. Under joint normality of the error terms, IV Probit maximum likelihood estimation does identify the average partial effect of X. The two-step control function procedure of Rivers and Vuong can also estimate this causal parameter consistently, but a double averaging is needed, one over the distribution of the first-stage error V and one over the distribution of X. If instead a single averaging is performed over the joint distribution of X and V, then the same causal parameter is estimated as the one estimated by the 2SLS estimator in the linear probability model. The 2SLS estimator is a consistent estimator when the average partial effect is equal to 0, and the standard Wald test for this hypothesis has correct size under strong instrument asymptotics. We show that, in general, the standard weak instrument first-stage F-test interpretations do not apply in this setting.
Journal: Econometric Reviews
Pages: 859-876
Issue: 8
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2072321
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2072321
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:8:p:859-876
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2091713_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220804T044749 git hash: 24b08f8188
Author-Name: Dakyung Seong
Author-X-Name-First: Dakyung
Author-X-Name-Last: Seong
Author-Name: Jin Seo Cho
Author-X-Name-First: Jin Seo
Author-X-Name-Last: Cho
Author-Name: Timo Teräsvirta
Author-X-Name-First: Timo
Author-X-Name-Last: Teräsvirta
Title: Comprehensively testing linearity hypothesis using the smooth transition autoregressive model
Abstract:
This article examines the null limit distribution of the quasi-likelihood ratio (QLR) statistic for testing the linearity condition against the smooth transition autoregressive (STAR) model. We explicitly show that the QLR test statistic weakly converges to a functional of a multivariate Gaussian process under the null of linearity, which is done by resolving the identification problem that arises in two different ways under the null. In contrast with the Lagrange multiplier test that is widely employed for testing the linearity condition, the proposed QLR statistic has omnibus power, and thus it complements the existing testing procedure. We show the empirical relevance of our test by testing for neglected nonlinearity in the US fiscal multipliers and growth rates of US unemployment. These empirical examples demonstrate that the QLR test is useful for detecting nonlinear structure among economic variables.
Journal: Econometric Reviews
Pages: 966-984
Issue: 8
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2091713
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2091713
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:8:p:966-984
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2074187_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220823T191300 git hash: 39867e6e2f
Author-Name: Xin Geng
Author-X-Name-First: Xin
Author-X-Name-Last: Geng
Author-Name: Kai Sun
Author-X-Name-First: Kai
Author-X-Name-Last: Sun
Title: Estimation of a partially linear seemingly unrelated regressions model: application to a translog cost system
Abstract:
This article studies a partially linear seemingly unrelated regressions (SUR) model to estimate a translog cost system that consists of a partially linear translog cost function and input share equations. The parametric component is estimated via a simple two-step feasible SUR estimation procedure. We show that the resulting estimator achieves root-n convergence and is asymptotically normal. The nonparametric component is estimated with a nonparametric SUR estimator based on the Cholesky decomposition. We show that this estimator is consistent, asymptotically normal, and more efficient relative to the ones that ignore cross-equation correlation. We emphasize the importance and implication of the choice of square root of the covariance matrix by comparing the Cholesky and Spectral decompositions. A model specification test for parametric functional form is proposed. An Italian banking data set is used to estimate the translog cost system. Results show that marginal effects of risks on cost of production are heterogeneous but increase with risk levels.
Journal: Econometric Reviews
Pages: 1008-1046
Issue: 9
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2074187
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2074187
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:9:p:1008-1046
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2091361_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220823T191300 git hash: 39867e6e2f
Author-Name: Shuaizhang Feng
Author-X-Name-First: Shuaizhang
Author-X-Name-Last: Feng
Author-Name: Yingyao Hu
Author-X-Name-First: Yingyao
Author-X-Name-Last: Hu
Author-Name: Jiandong Sun
Author-X-Name-First: Jiandong
Author-X-Name-Last: Sun
Title: Rotation group bias and the persistence of misclassification errors in the Current Population Surveys
Abstract:
We develop a general misclassification model to explain the so-called “Rotation Group Bias (RGB)” problem in the Current Population Surveys, where different rotation groups report different labor force statistics. The key insight is that responses to repeated questions in surveys can depend not only on unobserved true values, but also on previous responses to the same questions. Our method provides a framework to understand why unemployment rates in rotation group one are higher than those in other rotation groups in the CPS, without imposing any a priori assumptions on the existence and direction of RGB. Using our method, we provide new estimates of the U.S. unemployment rates, which are much higher than the official series, but lower than previous estimates that ignored persistence in misclassification.
Journal: Econometric Reviews
Pages: 1077-1094
Issue: 9
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2091361
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2091361
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:9:p:1077-1094
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2091363_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220823T191300 git hash: 39867e6e2f
Author-Name: David M. Drukker
Author-X-Name-First: David M.
Author-X-Name-Last: Drukker
Author-Name: Di Liu
Author-X-Name-First: Di
Author-X-Name-Last: Liu
Title: Finite-sample results for lasso and stepwise Neyman-orthogonal Poisson estimators
Abstract:
High-dimensional models that include many covariates which might potentially affect an outcome are increasingly common. This paper begins by introducing a lasso-based approach and a stepwise-based approach to valid inference for a high-dimensional model. It then discusses several essential extensions to the literature that make the estimators more usable in practice. Finally, it presents Monte Carlo evidence to help applied researchers choose which of several available estimators should be used in practice. The Monte Carlo evidence shows that our extensions to the literature perform well. It also shows that a BIC-stepwise approach performs well for a data-generating process for which the lasso-based approaches and a testing-stepwise approach fail. The Monte Carlo evidence also indicates the BIC-based lasso and plugin-based lasso can produce better inferential results than the ubiquitous CV-based lasso. Easy-to-use Stata commands are available for all the methods that we discuss.
Journal: Econometric Reviews
Pages: 1047-1076
Issue: 9
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2091363
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2091363
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:9:p:1047-1076
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2091360_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220823T191300 git hash: 39867e6e2f
Author-Name: Shunan Zhao
Author-X-Name-First: Shunan
Author-X-Name-Last: Zhao
Author-Name: Yiguo Sun
Author-X-Name-First: Yiguo
Author-X-Name-Last: Sun
Author-Name: Subal C. Kumbhakar
Author-X-Name-First: Subal C.
Author-X-Name-Last: Kumbhakar
Title: Income and democracy: a semiparametric approach
Abstract:
We examine heterogeneous nonlinear effects of income on democracy using country-level data from 1960 to 2000. Existing studies have mainly focused on a linear relationship or restricted nonlinear ones and report mixed findings about the effects of income on democracy. The strong positive cross-country correlation between income and democracy is often found to disappear after controlling for country-specific fixed effects, although the result varies with different estimation methods and samples. In contrast to previous studies, we apply a flexible semiparametric additive partially linear dynamic panel data model to explore the heterogeneous effects of income on democracy. We assume income is endogenous and let it enter the regression model nonparametrically. Our model specification also allows for different democracy equilibria and adjustment speeds toward equilibria. We propose a nonlinearity test for our model and a penalized sieve minimum distance estimator to solve the ill-posed inverse problem in the semiparametric instrumental variable estimation. The finite sample performance of the proposed test and estimator is evaluated by simulations. In the empirical model, we find that the relationship between income and democracy is nonlinear and more complex than a simple inverted U-shape. Specifically, depending on the choice of the democracy measure, income may have positive effects on democracy for low-income countries, negative effects for middle-income countries, and no effects for high-income countries.
Journal: Econometric Reviews
Pages: 1113-1140
Issue: 9
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2091360
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2091360
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:9:p:1113-1140
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2091362_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220823T191300 git hash: 39867e6e2f
Author-Name: Bin Chen
Author-X-Name-First: Bin
Author-X-Name-Last: Chen
Title: A robust test for serial correlation in panel data models
Abstract:
We consider a new nonparametric test for serial correlation of unknown form in the estimated residuals of a panel regression model, where individual and time effects can be fixed or random, and the panel data can be balanced or unbalanced. Our test is robust against potential weak error cross-sectional dependence and error serial dependence in higher-order moments. This is in contrast to existing tests for serial correlation in panel data models, which assume error components to be cross-sectionally and serially independent. Our test has an asymptotic N(0, 1) distribution under the null hypothesis and is consistent against serial correlation of unknown form. No common alternative is assumed, and hence our test allows for substantial inhomogeneity in serial correlation across individuals. A simulation study highlights the merits of the proposed test relative to a variety of existing tests in the literature. We apply the new test to the empirical study of Wolfers on the relationship between unilateral divorce laws and divorce rates and find strong evidence against serial uncorrelatedness even after controlling for fixed effects.
Journal: Econometric Reviews
Pages: 1095-1112
Issue: 9
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2091362
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2091362
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:9:p:1095-1112
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2082169_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220823T191300 git hash: 39867e6e2f
Author-Name: Yu-Chin Hsu
Author-X-Name-First: Yu-Chin
Author-X-Name-Last: Hsu
Author-Name: Jen-Che Liao
Author-X-Name-First: Jen-Che
Author-X-Name-Last: Liao
Author-Name: Eric S. Lin
Author-X-Name-First: Eric S.
Author-X-Name-Last: Lin
Title: Two-step series estimation and specification testing of (partially) linear models with generated regressors
Abstract:
This paper studies three semiparametric models that are useful and frequently encountered in applied econometric work: a linear and two partially linear specifications with generated regressors, i.e., regressors that are unobserved but can be nonparametrically estimated from the data. Our framework allows generated regressors to appear in the linear or nonlinear components of partially linear models. We propose two-step series estimators for the finite-dimensional parameters, establish their √n-consistency (with sample size n) and asymptotic normality, and provide asymptotic variance formulae that take into account the estimation error of the generated regressors. Moreover, we develop a nonparametric specification test for the models considered. The numerical performance of the proposed estimators and test, evaluated via simulation experiments and an empirical application, illustrates the utility of our approach.
Journal: Econometric Reviews
Pages: 985-1007
Issue: 9
Volume: 41
Year: 2022
Month: 9
X-DOI: 10.1080/07474938.2022.2082169
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2082169
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:9:p:985-1007
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2127077_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220907T060133 git hash: 85d61bd949
Author-Name: Martin Huber
Author-X-Name-First: Martin
Author-X-Name-Last: Huber
Author-Name: Lukáš Lafférs
Author-X-Name-First: Lukáš
Author-X-Name-Last: Lafférs
Title: Bounds on direct and indirect effects under treatment/mediator endogeneity and outcome attrition
Abstract:
Causal mediation analysis aims at disentangling a treatment effect into an indirect mechanism operating through an intermediate outcome or mediator, as well as the direct effect of the treatment on the outcome of interest. However, the evaluation of direct and indirect effects is frequently complicated by non-ignorable selection into the treatment and/or mediator, even after controlling for observables, as well as by sample selection/outcome attrition. We propose a method for bounding direct and indirect effects in the presence of such complications, based on a sequence of linear programming problems. Considering inverse probability weighting by propensity scores, we compute the weights that would yield identification in the absence of complications and perturb them by an entropy parameter reflecting a specific amount of propensity score misspecification to set-identify the effects of interest. We apply our method to data from the National Longitudinal Survey of Youth 1979 to derive bounds on the explained and unexplained components of a gender wage gap decomposition that is likely prone to non-ignorable mediator selection and outcome attrition.
Journal: Econometric Reviews
Pages: 1141-1163
Issue: 10
Volume: 41
Year: 2022
Month: 11
X-DOI: 10.1080/07474938.2022.2127077
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2127077
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:10:p:1141-1163
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2114624_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220907T060133 git hash: 85d61bd949
Author-Name: Ju Hyun Kim
Author-X-Name-First: Ju Hyun
Author-X-Name-Last: Kim
Author-Name: Byoung G. Park
Author-X-Name-First: Byoung G.
Author-X-Name-Last: Park
Title: Testing rank similarity in the local average treatment effects model
Abstract:
This paper develops a test for the rank similarity condition of the nonseparable instrumental variable quantile regression model using the local average treatment effect model. When the instrument takes more than two values or multiple binary instruments are available, there exist multiple complier groups for which the marginal distributions of potential outcomes are identified. A testable implication is obtained by comparing the distributions of ranks across complier groups. We propose a test procedure in a semiparametric quantile regression specification. We establish the weak convergence of the test statistic and the validity of the bootstrap critical value. We illustrate the test with an empirical example of the effects of fertility on women’s labor supply.
Journal: Econometric Reviews
Pages: 1265-1286
Issue: 10
Volume: 41
Year: 2022
Month: 11
X-DOI: 10.1080/07474938.2022.2114624
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2114624
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:10:p:1265-1286
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2114623_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220907T060133 git hash: 85d61bd949
Author-Name: Maoshan Tian
Author-X-Name-First: Maoshan
Author-X-Name-Last: Tian
Author-Name: Huw Dixon
Author-X-Name-First: Huw
Author-X-Name-Last: Dixon
Title: The variances of non-parametric estimates of the cross-sectional distribution of durations
Abstract:
This paper focuses on the link between non-parametric survival analysis and three distributions. The delta method is applied to derive the variances of the non-parametric estimators of three distributions: the distribution of durations (DD), the cross-sectional distribution of ages (CSA) and the cross-sectional distribution of (completed) durations (CSD). The non-parametric estimator of the cross-sectional distribution of durations (CSD) was defined and derived by Dixon (2012) and used in the generalized Taylor price model (GTE) by Dixon and Le Bihan (2012). The Monte Carlo method is applied to evaluate the variances of the estimators of DD and CSD and how their performance varies with sample size and the censoring of data. We apply those estimators to two data sets: the UK CPI micro-price data and waiting-time data from UK hospitals. Both the estimates of the distributions and their variances are calculated. Based on the empirical results, the estimated variances indicate that the DD and CSD estimates are statistically significant.
Journal: Econometric Reviews
Pages: 1243-1264
Issue: 10
Volume: 41
Year: 2022
Month: 11
X-DOI: 10.1080/07474938.2022.2114623
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2114623
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:10:p:1243-1264
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2114625_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220907T060133 git hash: 85d61bd949
Author-Name: Marie-Claude Beaulieu
Author-X-Name-First: Marie-Claude
Author-X-Name-Last: Beaulieu
Author-Name: Lynda Khalaf
Author-X-Name-First: Lynda
Author-X-Name-Last: Khalaf
Author-Name: Maral Kichian
Author-X-Name-First: Maral
Author-X-Name-Last: Kichian
Author-Name: Olena Melin
Author-X-Name-First: Olena
Author-X-Name-Last: Melin
Title: Finite sample inference in multivariate instrumental regressions with an application to Catastrophe bonds*
Abstract:
We propose exact exogeneity tests and weak-instruments-robust tests on factor loadings for a system of regressions with possibly non-Gaussian disturbances. Our methodology is valid in finite samples and accounts for common cross-sectional factors. Analytical invariance results are derived, with companion simulation studies. Finally, a total-effect parameter is introduced that embeds the unobservable endogeneity factor. The proposed tests are applied to assess whether Catastrophe bond mutual funds co-move with financial markets. Significant risk premiums are detected globally and over time, although they are less pervasive from a domestic currency perspective. The findings underscore the importance of instrumenting and of assessing direct and total effects.
Journal: Econometric Reviews
Pages: 1205-1242
Issue: 10
Volume: 41
Year: 2022
Month: 11
X-DOI: 10.1080/07474938.2022.2114625
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2114625
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:10:p:1205-1242
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2127076_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220907T060133 git hash: 85d61bd949
Author-Name: Hao Dong
Author-X-Name-First: Hao
Author-X-Name-Last: Dong
Author-Name: Taisuke Otsu
Author-X-Name-First: Taisuke
Author-X-Name-Last: Otsu
Author-Name: Luke Taylor
Author-X-Name-First: Luke
Author-X-Name-Last: Taylor
Title: Nonparametric estimation of additive models with errors-in-variables
Abstract:
In the estimation of nonparametric additive models, conventional methods, such as backfitting and series approximation, cannot be applied when measurement error is present in a covariate. This paper proposes a two-stage estimator for such models. In the first stage, to adapt to the additive structure, we use a series approximation together with a ridge approach to deal with the ill-posedness brought by mismeasurement. We derive the uniform convergence rate of this first-stage estimator and characterize how the measurement error slows down the convergence rate for ordinary/super smooth cases. To establish the limiting distribution, we construct a second-stage estimator via one-step backfitting with a deconvolution kernel using the first-stage estimator. The asymptotic normality of the second-stage estimator is established for ordinary/super smooth measurement error cases. Finally, a Monte Carlo study and an empirical application highlight the applicability of the estimator.
Journal: Econometric Reviews
Pages: 1164-1204
Issue: 10
Volume: 41
Year: 2022
Month: 11
X-DOI: 10.1080/07474938.2022.2127076
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2127076
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:10:p:1164-1204
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2147136_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20220907T060133 git hash: 85d61bd949
Author-Name: The Editors
Title: Back Matter
Journal: Econometric Reviews
Pages: 1287-1288
Issue: 10
Volume: 41
Year: 2022
Month: 11
X-DOI: 10.1080/07474938.2022.2147136
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2147136
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:41:y:2022:i:10:p:1287-1288
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2135495_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Helmut Herwartz
Author-X-Name-First: Helmut
Author-X-Name-Last: Herwartz
Author-Name: Simone Maxand
Author-X-Name-First: Simone
Author-X-Name-Last: Maxand
Author-Name: Yabibal M. Walle
Author-X-Name-First: Yabibal M.
Author-X-Name-Last: Walle
Title: Forward detrending for heteroskedasticity-robust panel unit root testing
Abstract:
The variances of most economic time series display marked fluctuations over time. Panel unit root tests of the so-called first and second generation are not robust in such cases. In response to this problem, a few heteroskedasticity-robust panel unit root tests have been proposed. An important limitation of these tests is, however, that they become invalid if the data are trending. As a prominent means of drift adjustment under the panel unit root hypothesis, the (unweighted) forward detrending scheme of Breitung suffers from nuisance parameters if the data feature time-varying variances. In this article, we propose a weighted forward-detrending scheme. Unlike its unweighted counterpart, the new detrending scheme restores the pivotalness of the heteroskedasticity-robust panel unit root tests suggested by Demetrescu and Hanck and Herwartz et al. when applied to trending panels with heteroskedastic variances. As an empirical illustration, we provide evidence in favor of non-stationarity of health care expenditures as shares of GDP in a panel of OECD economies.
Journal: Econometric Reviews
Pages: 28-53
Issue: 1
Volume: 42
Year: 2023
Month: 1
X-DOI: 10.1080/07474938.2022.2135495
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2135495
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:1:p:28-53
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2094539_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Jingjie Xiang
Author-X-Name-First: Jingjie
Author-X-Name-Last: Xiang
Author-Name: Gangzheng Guo
Author-X-Name-First: Gangzheng
Author-X-Name-Last: Guo
Author-Name: Jiaolong Li
Author-X-Name-First: Jiaolong
Author-X-Name-Last: Li
Title: Determining the number of factors in constrained factor models via Bayesian information criterion
Abstract:
This paper estimates the number of factors in constrained and partially constrained factor models (Tsai and Tsay, 2010) based on a constrained Bayesian information criterion (CBIC). Following Bai and Ng (2002), the estimation of the number of factors depends on the tradeoff between good fit and parsimony, so we first derive the convergence rate of the constrained factor estimates under the framework of large cross-sections (N) and large time dimensions (T). Furthermore, we demonstrate that the penalty for overfitting can be a function of N alone, so the BIC form, which does not work in the case of (unconstrained) approximate factor models, consistently estimates the number of factors in constrained factor models. We then conduct Monte Carlo simulations to show that our proposed CBIC has good finite sample performance and outperforms competing methods.
Journal: Econometric Reviews
Pages: 98-122
Issue: 1
Volume: 42
Year: 2023
Month: 1
X-DOI: 10.1080/07474938.2022.2094539
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2094539
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:1:p:98-122
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2140982_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Martin Burda
Author-X-Name-First: Martin
Author-X-Name-Last: Burda
Author-Name: Remi Daviet
Author-X-Name-First: Remi
Author-X-Name-Last: Daviet
Title: Hamiltonian sequential Monte Carlo with application to consumer choice behavior
Abstract:
The practical use of nonparametric Bayesian methods requires the availability of efficient algorithms for posterior inference. The inherently serial nature of traditional Markov chain Monte Carlo (MCMC) methods imposes limitations on their efficiency and scalability. In recent years, there has been a surge of research activity devoted to developing alternative implementation methods that target parallel computing environments. Sequential Monte Carlo (SMC), also known as a particle filter, has been gaining popularity due to its desirable properties. SMC uses a genetic mutation-selection sampling approach with a set of particles representing the posterior distribution of a stochastic process. We propose to enhance the performance of SMC by utilizing Hamiltonian transition dynamics in the particle transition phase, in place of the random-walk transitions used in the previous literature. We call the resulting procedure Hamiltonian Sequential Monte Carlo (HSMC). Hamiltonian transition dynamics have been shown to yield superior mixing and convergence properties relative to random walk transition dynamics in the context of MCMC procedures. The rationale behind HSMC is to translate such gains to the SMC environment. HSMC will facilitate practical estimation of models with complicated latent structures, such as nonparametric individual unobserved heterogeneity, that are otherwise difficult to implement. We demonstrate the behavior of HSMC in a challenging simulation study and contrast its favorable performance with SMC and other alternative approaches. We then apply HSMC to a panel discrete choice model with nonparametric consumer heterogeneity, allowing for multiple modes, asymmetries, and data-driven clustering, providing insights for consumer segmentation, individual-level marketing, and price micromanagement.
Journal: Econometric Reviews
Pages: 54-77
Issue: 1
Volume: 42
Year: 2023
Month: 1
X-DOI: 10.1080/07474938.2022.2140982
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2140982
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:1:p:54-77
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2156740_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Saban Nazlioglu
Author-X-Name-First: Saban
Author-X-Name-Last: Nazlioglu
Author-Name: Junsoo Lee
Author-X-Name-First: Junsoo
Author-X-Name-Last: Lee
Author-Name: Margie Tieslau
Author-X-Name-First: Margie
Author-X-Name-Last: Tieslau
Author-Name: Cagin Karul
Author-X-Name-First: Cagin
Author-X-Name-Last: Karul
Author-Name: Yu You
Author-X-Name-First: Yu
Author-X-Name-Last: You
Title: Smooth structural changes and common factors in nonstationary panel data: an analysis of healthcare expenditures†
Abstract:
This article suggests new panel unit root tests that allow for multiple structural breaks and control for cross-correlations in the panel. Breaks are modeled with a Fourier function, which allows for smooth or gradual change rather than abrupt breaks. Cross-correlations are corrected for by using the PANIC procedure. The simulations show that our tests have good size and power properties and perform reasonably well when the nature of the breaks or the factor structure is unknown. The new panel unit root tests provide fresh evidence on the persistence of healthcare expenditures in OECD countries.
Journal: Econometric Reviews
Pages: 78-97
Issue: 1
Volume: 42
Year: 2023
Month: 1
X-DOI: 10.1080/07474938.2022.2156740
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2156740
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:1:p:78-97
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2157965_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Masayuki Hirukawa
Author-X-Name-First: Masayuki
Author-X-Name-Last: Hirukawa
Author-Name: Irina Murtazashvili
Author-X-Name-First: Irina
Author-X-Name-Last: Murtazashvili
Author-Name: Artem Prokhorov
Author-X-Name-First: Artem
Author-X-Name-Last: Prokhorov
Title: Yet another look at the omitted variable bias
Abstract:
When conducting regression analysis, econometricians often face the situation where some relevant regressors are unavailable in the data set at hand. This article shows how to construct a new class of nonparametric proxies by combining the original data set with one containing the missing regressors. Imputation of the missing values is done using a nonstandard kernel adapted to mixed data. We derive the asymptotic distribution of the resulting semiparametric two-sample estimator of the parameters of interest and show, using Monte Carlo simulations, that it dominates the solutions involving instrumental variables and other parametric alternatives. An application to the PSID and NLS data illustrates the importance of our estimation approach for empirical research.
Journal: Econometric Reviews
Pages: 1-27
Issue: 1
Volume: 42
Year: 2023
Month: 1
X-DOI: 10.1080/07474938.2022.2157965
File-URL: http://hdl.handle.net/10.1080/07474938.2022.2157965
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:1:p:1-27
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178089_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Bhavna Rai
Author-X-Name-First: Bhavna
Author-X-Name-Last: Rai
Title: Efficient estimation with missing data and endogeneity
Abstract:
I study the problem of missing values in the outcome and endogenous covariates in linear models. I propose an estimator that improves efficiency relative to complete-cases 2SLS. Unlike traditional imputation, my estimator is consistent even if the model contains nonlinear functions, such as squares and interactions, of the endogenous covariates. It can also be used to combine data sets with missing outcomes, missing endogenous covariates, and no missing variables. It includes the well-known “Two-Sample 2SLS” as a special case, under weaker assumptions than those in the corresponding literature.
Journal: Econometric Reviews
Pages: 220-239
Issue: 2
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178089
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178089
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:2:p:220-239
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178086_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Andres Aradillas-Lopez
Author-X-Name-First: Andres
Author-X-Name-Last: Aradillas-Lopez
Title: Inference in an incomplete information entry game with an incumbent and with beliefs conditioned on unobservable market characteristics
Abstract:
We consider a static entry game played between an incumbent and a collection of potential entrants. Entry decisions are made with incomplete information and beliefs are conditioned, at least partially, on a market characteristic that is unobserved by the econometrician. We describe conditions under which, even though the unobserved market characteristic cannot be identified, a subset of parameters of the model can still be identified, including all the strategic-interaction effects. We also characterize testable implications for strategic behavior by the incumbent when this player is able to shift the unobserved market characteristic to deter entry. We present results under Bayesian Nash equilibrium (BNE) and under the weaker behavioral model of iterated elimination of nonrationalizable strategies. Our empirical example analyzes geographic entry decisions in the Mexican internet service provider (ISP) industry. This industry has an incumbent, América Móvil (AMX), which established a widespread geographic presence as a monopolist following the privatization of Telmex in 1990. Our results show significant strategic interaction effects between AMX and its competitors, as well as evidence of strategic behavior by AMX to deter entry and maximize its market share.
Journal: Econometric Reviews
Pages: 123-156
Issue: 2
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178086
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178086
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:2:p:123-156
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178139_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Badi H. Baltagi
Author-X-Name-First: Badi H.
Author-X-Name-Last: Baltagi
Title: The two-way Mundlak estimator
Abstract:
Mundlak shows that the fixed effects estimator is equivalent to the random effects estimator in the one-way error component model once the random individual effects are modeled as a linear function of all the averaged regressors over time. In the spirit of Mundlak, this paper shows that this result also holds for the two-way error component model once the individual and time effects are modeled as linear functions of all the averaged regressors across time and across individuals. Wooldridge also shows that the two-way fixed effects estimator can be obtained as a pooled OLS with the regressors augmented by the time and individual averages and calls it the two-way Mundlak estimator. While Mundlak used GLS rather than OLS on this augmented regression, we show that both estimators are equivalent for this augmented regression. This extends Baltagi’s results from the one-way to the two-way error component model. The F-test suggested by Mundlak to test for this correlation between the random effects and the regressors generates a Hausman-type test that is easily generalizable to the two-way Mundlak regression. In fact, the resulting F-tests for the two-way error component regression are related to the Hausman-type tests proposed by Kang for the two-way error component model.
Journal: Econometric Reviews
Pages: 240-246
Issue: 2
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178139
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178139
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:2:p:240-246
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178087_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Fei Jin
Author-X-Name-First: Fei
Author-X-Name-Last: Jin
Author-Name: Lung-fei Lee
Author-X-Name-First: Lung-fei
Author-X-Name-Last: Lee
Author-Name: Jihai Yu
Author-X-Name-First: Jihai
Author-X-Name-Last: Yu
Title: Estimating flow data models of international trade: dual gravity and spatial interactions
Abstract:
This article investigates asymptotic properties of quasi-maximum likelihood (QML) estimates for flow data on the dual gravity model in international trade with spatial interactions (dependence). The dual gravity model has a well-established economic foundation, and it takes the form of a spatial autoregressive (SAR) model. The dual gravity model originates from Behrens et al., but the spatial weights matrix motivated by their economic theory has a feature that violates existing regularity conditions for asymptotic econometric analysis. By overcoming the limitations of existing asymptotic theory, we show that QML estimates are consistent and asymptotically normal. The simulation results show the satisfactory finite sample performance of the estimates. We illustrate the usefulness of the model by investigating the McCallum “border puzzle” in the gravity literature.
Journal: Econometric Reviews
Pages: 157-194
Issue: 2
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178087
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178087
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:2:p:157-194
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178088_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Eric Beutner
Author-X-Name-First: Eric
Author-X-Name-Last: Beutner
Author-Name: Yicong Lin
Author-X-Name-First: Yicong
Author-X-Name-Last: Lin
Author-Name: Stephan Smeekes
Author-X-Name-First: Stephan
Author-X-Name-Last: Smeekes
Title: GLS estimation and confidence sets for the date of a single break in models with trends
Abstract:
We develop a Feasible Generalized Least Squares estimator of the date of a structural break in level and/or trend. The estimator is based on a consistent estimate of a T-dimensional inverse autocovariance matrix. A cubic polynomial transformation of break date estimates can be approximated asymptotically by a nonstandard yet nuisance-parameter-free distribution. The new limiting distribution captures the asymmetry and bimodality in finite samples and is applicable for inference with a single, known set of critical values. We construct confidence intervals/sets for break dates both from Wald-type tests and by inverting multiple likelihood ratio (LR) tests. A simulation study shows that the proposed estimator increases the empirical concentration probability in a small neighborhood of the true break date and potentially reduces the mean squared error. The LR-based confidence intervals/sets have good coverage while maintaining informative length, even with highly persistent errors and small break sizes.
Journal: Econometric Reviews
Pages: 195-219
Issue: 2
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178088
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178088
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:2:p:195-219
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178138_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Federico Belotti
Author-X-Name-First: Federico
Author-X-Name-Last: Belotti
Author-Name: Alessandro Casini
Author-X-Name-First: Alessandro
Author-X-Name-Last: Casini
Author-Name: Leopoldo Catania
Author-X-Name-First: Leopoldo
Author-X-Name-Last: Catania
Author-Name: Stefano Grassi
Author-X-Name-First: Stefano
Author-X-Name-Last: Grassi
Author-Name: Pierre Perron
Author-X-Name-First: Pierre
Author-X-Name-Last: Perron
Title: Simultaneous bandwidths determination for DK-HAC estimators and long-run variance estimation in nonparametric settings
Abstract:
We consider the derivation of data-dependent simultaneous bandwidths for double kernel heteroscedasticity and autocorrelation consistent (DK-HAC) estimators. In addition to the usual smoothing over lagged autocovariances for classical HAC estimators, the DK-HAC estimator also applies smoothing over the time direction. We obtain the optimal bandwidths that jointly minimize the global asymptotic MSE criterion and discuss the tradeoff between bias and variance with respect to smoothing over lagged autocovariances and over time. Unlike the MSE results of Andrews, we establish how nonstationarity affects the bias-variance tradeoff. We use the plug-in approach to construct data-dependent bandwidths for the DK-HAC estimators and compare them with the DK-HAC estimators from Casini that use data-dependent bandwidths obtained from a sequential MSE criterion. The former performs better in terms of size control, especially with stationary and close to stationary data. Finally, we consider long-run variance (LRV) estimation under the assumption that the series is a function of a nonparametric estimator rather than of a semiparametric estimator that enjoys the usual √T rate of convergence. Thus, we also establish the validity of consistent LRV estimation in nonparametric parameter estimation settings.
Journal: Econometric Reviews
Pages: 281-306
Issue: 3
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178138
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178138
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:3:p:281-306
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178136_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Yong Bao
Author-X-Name-First: Yong
Author-X-Name-Last: Bao
Title: Indirect inference estimation of higher-order spatial autoregressive models
Abstract:
This paper proposes estimating parameters in higher-order spatial autoregressive models, where the error term also follows a spatial autoregression and its innovations are heteroskedastic, by matching the simple ordinary least squares estimator with its analytical approximate expectation, following the principle of indirect inference. The resulting estimator is shown to be consistent, asymptotically normal, simulation-free, and robust to unknown heteroskedasticity. Monte Carlo simulations demonstrate its good finite-sample properties in comparison with existing estimators. An empirical study of Airbnb rental prices in the city of Asheville illustrates that the structure of spatial correlation and effects of various factors at the early stage of the COVID-19 pandemic are quite different from those during the second summer. Notably, during the pandemic, safety is valued more and on-line reviews are valued much less.
Journal: Econometric Reviews
Pages: 247-280
Issue: 3
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178136
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178136
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:3:p:247-280
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178140_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Sungho Noh
Author-X-Name-First: Sungho
Author-X-Name-Last: Noh
Title: Nonparametric identification and estimation of heterogeneous causal effects under conditional independence
Abstract:
In this article, I propose a nonparametric strategy to identify the distribution of heterogeneous causal effects. A set of identification restrictions proposed in this article differs from existing approaches in three ways. First, it extends the random coefficient model by allowing potentially nonlinear interactions between distributional parameters and the set of covariates. Second, the causal effect distributions identified in this article give an alternative to those under the rank invariance assumption. Third, the identified distribution lies within the sharp bound of distributions of the treatment effect. I develop a consistent nonparametric estimator exploiting the identifying restriction by extending the conventional statistical deconvolution method to the Rubin causal framework. Results from a Monte Carlo experiment and an application to wage loss of displaced workers suggest that the method yields robust estimates under various scenarios.
Journal: Econometric Reviews
Pages: 307-341
Issue: 3
Volume: 42
Year: 2023
Month: 2
X-DOI: 10.1080/07474938.2023.2178140
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178140
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:3:p:307-341
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2191105_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Hao Dong
Author-X-Name-First: Hao
Author-X-Name-Last: Dong
Author-Name: Taisuke Otsu
Author-X-Name-First: Taisuke
Author-X-Name-Last: Otsu
Author-Name: Luke Taylor
Author-X-Name-First: Luke
Author-X-Name-Last: Taylor
Title: Bandwidth selection for nonparametric regression with errors-in-variables
Abstract:
We propose two novel bandwidth selection procedures for the nonparametric regression model with classical measurement error in the regressors. Each method evaluates the prediction errors of the regression using a second (density) deconvolution. The first approach uses a typical leave-one-out cross-validation criterion, while the second applies a bootstrap approach and the concept of out-of-bag prediction. We show the asymptotic validity of both procedures and compare them to the SIMEX method in a Monte Carlo study. As well as dramatically reducing computational cost, the methods proposed in this article lead to lower mean integrated squared error (MISE) compared to the current state-of-the-art.
Journal: Econometric Reviews
Pages: 393-419
Issue: 4
Volume: 42
Year: 2023
Month: 4
X-DOI: 10.1080/07474938.2023.2191105
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2191105
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:4:p:393-419
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178137_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Richard Startz
Author-X-Name-First: Richard
Author-X-Name-Last: Startz
Author-Name: Douglas G. Steigerwald
Author-X-Name-First: Douglas G.
Author-X-Name-Last: Steigerwald
Title: Inference and extrapolation in finite populations with special attention to clustering
Abstract:
Statistical inference in economics is commonly based on formulas assuming infinite populations. We present appropriate formulas for use when sampling from finite populations, with special attention given to issues of treatment effects and to issues of clustering. Issues of whether to apply finite population corrections are often subtle, and appropriate corrections may depend on difficult-to-observe parameters, leaving the investigator only with bounds on relevant estimator variances.
Journal: Econometric Reviews
Pages: 343-357
Issue: 4
Volume: 42
Year: 2023
Month: 4
X-DOI: 10.1080/07474938.2023.2178137
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178137
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:4:p:343-357
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2178141_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Martin Wagner
Author-X-Name-First: Martin
Author-X-Name-Last: Wagner
Author-Name: Karsten Reichold
Author-X-Name-First: Karsten
Author-X-Name-Last: Reichold
Title: Panel cointegrating polynomial regressions: group-mean fully modified OLS estimation and inference
Abstract:
We develop group-mean fully modified OLS (FM-OLS) estimation and inference for panels of cointegrating polynomial regressions, i.e., regressions that include an integrated process and its powers as explanatory variables. The stationary errors are allowed to be serially correlated, the integrated regressors – allowed to contain drifts – to be endogenous and, as usual in the panel literature, we include individual-specific fixed effects and also allow for individual-specific time trends. We consider a fixed cross-section dimension and asymptotics in the time dimension only. Within this setting, we develop cross-section dependence robust inference for the group-mean estimator. In both the simulations and an illustrative application estimating environmental Kuznets curves (EKCs) for carbon dioxide emissions we compare our group-mean FM-OLS approach with a recently proposed pooled FM-OLS approach of de Jong and Wagner.
Journal: Econometric Reviews
Pages: 358-392
Issue: 4
Volume: 42
Year: 2023
Month: 4
X-DOI: 10.1080/07474938.2023.2178141
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2178141
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:4:p:358-392
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2198930_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Enzo D’Innocenzo
Author-X-Name-First: Enzo
Author-X-Name-Last: D’Innocenzo
Author-Name: Alessandra Luati
Author-X-Name-First: Alessandra
Author-X-Name-Last: Luati
Author-Name: Mario Mazzocchi
Author-X-Name-First: Mario
Author-X-Name-Last: Mazzocchi
Title: A robust score-driven filter for multivariate time series
Abstract:
A multivariate score-driven filter is developed to extract signals from noisy vector processes. By assuming that the conditional location vector from a multivariate Student’s t distribution changes over time, we construct a robust filter which is able to overcome several issues that naturally arise when modeling heavy-tailed phenomena and, more generally, vectors of dependent non-Gaussian time series. We derive conditions for stationarity and invertibility and estimate the unknown parameters by maximum likelihood. Strong consistency and asymptotic normality of the estimator are derived. Analytical formulae are derived that allow the development of estimation procedures based on a fast and reliable Fisher scoring method. An extensive Monte Carlo study is designed to assess the finite-sample properties of the estimator, the impact of initial conditions on the filtered sequence, and the performance when some of the underlying assumptions, such as symmetry of the underlying distribution and homogeneity of the degrees-of-freedom parameter across marginals, are violated. The theory is supported by a novel empirical illustration that shows how the model can be effectively applied to estimate consumer prices from home scanner data.
Journal: Econometric Reviews
Pages: 441-470
Issue: 5
Volume: 42
Year: 2023
Month: 5
X-DOI: 10.1080/07474938.2023.2198930
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2198930
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:5:p:441-470
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2205339_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Offer Lieberman
Author-X-Name-First: Offer
Author-X-Name-Last: Lieberman
Author-Name: Francesca Rossi
Author-X-Name-First: Francesca
Author-X-Name-Last: Rossi
Title: Inference in a similarity-based spatial autoregressive model
Abstract:
In this article, we develop asymptotic theory for a spatial autoregressive (SAR) model where the network structure is defined according to a similarity-based weight matrix, in line with the similarity theory, which in turn has an axiomatic justification. We prove consistency of the quasi-maximum-likelihood estimator and derive its limit distribution. The contribution of this article is twofold: on one hand, we incorporate a regression component in the data generating process while allowing the similarity structure to accommodate non-ordered data, and we estimate the weight of the similarity explicitly, allowing it to equal unity. On the other hand, this work complements the literature on SAR models by adopting a data-driven weight matrix which depends on a finite set of parameters that have to be estimated. The spatial parameter, which corresponds to the weight of the similarity structure, is in turn allowed to take values at the boundary of the standard SAR parameter space. In addition, our setup accommodates strong forms of cross-sectional correlation that are normally ruled out in the standard SAR literature. Our framework is general enough to also include as special cases the random walk with a drift model, the local-to-unit-root (LUR) model with a drift, and the model for moderate integration with a drift.
Journal: Econometric Reviews
Pages: 471-486
Issue: 5
Volume: 42
Year: 2023
Month: 5
X-DOI: 10.1080/07474938.2023.2205339
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2205339
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:5:p:471-486
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2209008_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Estela Bee Dagum
Author-X-Name-First: Estela Bee
Author-X-Name-Last: Dagum
Author-Name: Silvia Bianconcini
Author-X-Name-First: Silvia
Author-X-Name-Last: Bianconcini
Title: Monitoring the direction of the short-term trend of economic indicators
Abstract:
Socioeconomic indicators have long been used by official statistical agencies to analyze and assess the current stage at which the economy stands via the application of linear filters used in conjunction with seasonal adjustment procedures. In this study, we propose a new set of symmetric and asymmetric weights that offer substantial gains in real time by providing timely and more accurate information for detecting short-term trends with respect to filters commonly applied by statistical agencies. We compare the new filters to the classical ones through application to indicators of the US economy, which remains the linchpin of the global economic system. To assess the superiority of the proposed filters, we develop and evaluate explicit tests of the null hypothesis of no difference in revision accuracy of two competing filters. Furthermore, asymptotic and exact finite-sample tests are proposed and illustrated to assess if two compared filters have equal probabilities of failing to detect turning points at different time horizons after their occurrence.
Journal: Econometric Reviews
Pages: 421-440
Issue: 5
Volume: 42
Year: 2023
Month: 5
X-DOI: 10.1080/07474938.2023.2209008
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2209008
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:5:p:421-440
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2198929_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Kohtaro Hitomi
Author-X-Name-First: Kohtaro
Author-X-Name-Last: Hitomi
Author-Name: Masamune Iwasawa
Author-X-Name-First: Masamune
Author-X-Name-Last: Iwasawa
Author-Name: Yoshihiko Nishiyama
Author-X-Name-First: Yoshihiko
Author-X-Name-Last: Nishiyama
Title: Optimal minimax rates of specification testing with data-driven bandwidth
Abstract:
This study investigates optimal minimax rates of specification testing for linear and non-linear instrumental variable regression models. The test constructed by non-parametric kernel techniques can be rate optimal when bandwidths are selected appropriately. Since bandwidths are often selected in a data-dependent way in empirical studies, the rate-optimality of the test with data-driven bandwidths is investigated. While least squares cross-validation selects bandwidths that are optimal for estimation, it is shown not to be optimal for testing. Thus, we propose a novel bandwidth selection method for testing, the performance of which is investigated in a simulation study.
Journal: Econometric Reviews
Pages: 487-512
Issue: 6
Volume: 42
Year: 2023
Month: 6
X-DOI: 10.1080/07474938.2023.2198929
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2198929
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:6:p:487-512
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2209007_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Yuta Yamauchi
Author-X-Name-First: Yuta
Author-X-Name-Last: Yamauchi
Author-Name: Yasuhiro Omori
Author-X-Name-First: Yasuhiro
Author-X-Name-Last: Omori
Title: Dynamic factor, leverage and realized covariances in multivariate stochastic volatility
Abstract:
In the stochastic volatility models for multivariate daily stock returns, it has been found that the estimates of parameters become unstable as the dimension of returns increases. To solve this problem, we focus on the factor structure of multiple returns and consider two additional sources of information: first, the stock index associated with the market factor and, second, the realized covariance matrix calculated from high-frequency data. The proposed dynamic factor model with the leverage effect and realized measures is applied to 10 top stocks composing the exchange traded fund linked with the investment return of the S&P 500 index and the model is shown to have a stable advantage in portfolio performance.
Journal: Econometric Reviews
Pages: 513-539
Issue: 6
Volume: 42
Year: 2023
Month: 6
X-DOI: 10.1080/07474938.2023.2209007
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2209007
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:6:p:513-539
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2217077_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Bingduo Yang
Author-X-Name-First: Bingduo
Author-X-Name-Last: Yang
Author-Name: Xiaohui Liu
Author-X-Name-First: Xiaohui
Author-X-Name-Last: Liu
Author-Name: Wei Long
Author-X-Name-First: Wei
Author-X-Name-Last: Long
Author-Name: Liang Peng
Author-X-Name-First: Liang
Author-X-Name-Last: Peng
Title: A unified unit root test regardless of intercept
Abstract:
Using the augmented Dickey-Fuller test to verify the existence of a unit root in an autoregressive process often requires a correctly specified intercept, since the test statistics can be distinct under different model specifications and lead to contradictory results at times. In this article, we develop a unified inference that not only unifies the specifications of the intercept but also accommodates different degrees of persistence of the underlying process and heteroscedastic errors. A simulation study shows that the resulting unified unit root test exhibits excellent size control and reasonably good power. In an empirical application, we implement the proposed test to re-examine the presence of unit roots in eleven variables widely used in stock return predictability studies.
Journal: Econometric Reviews
Pages: 540-555
Issue: 6
Volume: 42
Year: 2023
Month: 6
X-DOI: 10.1080/07474938.2023.2217077
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2217077
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:6:p:540-555
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2215034_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Xiaohu Wang
Author-X-Name-First: Xiaohu
Author-X-Name-Last: Wang
Author-Name: Jun Yu
Author-X-Name-First: Jun
Author-X-Name-Last: Yu
Title: Latent local-to-unity models
Abstract:
The article studies a class of state-space models where the state equation is a local-to-unity process. The parameter of interest is the persistence parameter of the latent process. The large sample theory for the least squares (LS) estimator and an instrumental variable (IV) estimator of the persistence parameter in the autoregressive (AR) representation of the model is developed under two sets of conditions. In the first set of conditions, the measurement error is independent and identically distributed, and the error term in the state equation is stationary and fractionally integrated with memory parameter d∈(−0.5,0.5). For both estimators, the convergence rate and the asymptotic distribution crucially depend on d. The LS estimator has a severe downward bias, which is aggravated even more by the measurement error when d≤0. The IV estimator eliminates the effects of the measurement error and reduces the bias. In the second set of conditions, the measurement error is independent but not necessarily identically distributed, and the error term in the state equation is strongly mixing. In this case, the IV estimator still leads to a smaller bias than the LS estimator. Special cases of our models and results in relation to those in the literature are discussed.
Journal: Econometric Reviews
Pages: 586-611
Issue: 7
Volume: 42
Year: 2023
Month: 8
X-DOI: 10.1080/07474938.2023.2215034
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2215034
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:7:p:586-611
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2219183_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Yining Chen
Author-X-Name-First: Yining
Author-X-Name-Last: Chen
Author-Name: Hudson S. Torrent
Author-X-Name-First: Hudson S.
Author-X-Name-Last: Torrent
Author-Name: Flavio A. Ziegelmann
Author-X-Name-First: Flavio A.
Author-X-Name-Last: Ziegelmann
Title: Robust nonparametric frontier estimation in two steps
Abstract:
We propose a robust methodology for estimating production frontiers with multi-dimensional input via a two-step nonparametric regression, in which we estimate the level and shape of the frontier before shifting it to an appropriate position. Our main contribution is to derive a novel frontier estimation method under a variety of flexible models which is robust to the presence of outliers and possesses some inherent advantages over traditional frontier estimators. Our approach may be viewed as a simplification, yet a generalization, of those proposed by Martins-Filho and coauthors, who estimate frontier surfaces in three steps. In particular, outliers, as well as commonly seen shape constraints of the frontier surfaces, such as concavity and monotonicity, can be straightforwardly handled by our estimation procedure. We show consistency and asymptotic distributional theory of our resulting estimators under standard assumptions in the multi-dimensional input setting. The competitive finite-sample performances of our estimators are highlighted in both simulation studies and empirical data analysis.
Journal: Econometric Reviews
Pages: 612-634
Issue: 7
Volume: 42
Year: 2023
Month: 8
X-DOI: 10.1080/07474938.2023.2219183
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2219183
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:7:p:612-634
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2213605_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Yundong Tu
Author-X-Name-First: Yundong
Author-X-Name-Last: Tu
Author-Name: Xinling Xie
Author-X-Name-First: Xinling
Author-X-Name-Last: Xie
Title: Forecasting vector autoregressions with mixed roots in the vicinity of unity
Abstract:
This article evaluates the forecast performance of model averaging forecasts in a nonstationary vector autoregression with mixed roots in the vicinity of unity. The deviation from a unit root allows for local-to-unity, moderate deviation from unity, and a strong unit root, and the direction of such deviation could be from either the stationary or the explosive side. We provide a theoretical foundation for comparison among various forecasts, including the least squares estimator, the constrained estimator imposing the unit root constraint, and the selection or average over these two basic estimators. Furthermore, three new types of estimators are constructed, i.e., the bagging version of the pretest estimator, the Mallows-pretest estimator that marries the Mallows averaging criterion and the Wald test, and the Mallows-bagging estimator that combines the Mallows averaging criterion and the bagging technique. The asymptotic risks are shown to depend on the local parameters, which are not consistently estimable. Via Monte Carlo simulations, graphic comparisons indicate that the Mallows averaging estimator has both robust and outstanding forecasting performance. Model averaging over the vector autoregressive lag order is further considered to address the issue of model uncertainty in the lag specification. Finite sample simulations show that the Mallows averaging estimator performs better than other frequently used selection and averaging methods. The application to forecasting financial indices popularly used in predictive regressions further illustrates the practical merit of the proposed estimator.
Journal: Econometric Reviews
Pages: 556-585
Issue: 7
Volume: 42
Year: 2023
Month: 8
X-DOI: 10.1080/07474938.2023.2213605
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2213605
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:7:p:556-585
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2222637_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: C. Grazian
Author-X-Name-First: C.
Author-X-Name-Last: Grazian
Author-Name: A. McInnes
Author-X-Name-First: A.
Author-X-Name-Last: McInnes
Title: An application of copulas to OPEC’s changing influence on fossil fuel prices
Abstract:
This work examines how the dependence structures between energy futures asset prices differ in two periods identified before and after the 2008 global financial crisis. These two periods were characterized by a difference in the number of extraordinary meetings of OPEC countries organized to announce a change in oil production. In the period immediately following the global financial crisis, the decrease in oil prices and oil and gas demand forced OPEC countries to make frequent adjustments to the production of oil, while, since the first quarter of 2010, the recovery led to more regular meetings, with only three organized extraordinary meetings. We propose to use a copula model to study how the dependence structure among energy prices changed between the two periods. The use of copula models makes it possible to introduce flexible and realistic models for the marginal time series; once the marginal parameters are estimated, the estimates are used to fit several copula models for all asset combinations. Model selection techniques based on information criteria are implemented to choose the best models both for the univariate asset price series and for the distribution of co-movements. The changes in the dependence structure of pairs of assets are investigated through copula functionals, and their uncertainty is estimated through a bootstrapping method. We find that the strength of dependence between asset combinations differs considerably between the two periods, showing a significant decrease for all pairs of assets.
Journal: Econometric Reviews
Pages: 676-699
Issue: 8
Volume: 42
Year: 2023
Month: 9
X-DOI: 10.1080/07474938.2023.2222637
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2222637
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:8:p:676-699
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2224658_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Helmut Farbmacher
Author-X-Name-First: Helmut
Author-X-Name-Last: Farbmacher
Author-Name: Harald Tauchmann
Author-X-Name-First: Harald
Author-X-Name-Last: Tauchmann
Title: Linear fixed-effects estimation with nonrepeated outcomes
Abstract:
We demonstrate that popular linear fixed-effects panel-data estimators are biased and inconsistent when applied in a discrete-time hazard setting, even if the data-generating process is consistent with the linear model. The bias is not just survival bias, but originates from the impossibility of transforming the model such that the remaining disturbance term becomes conditional mean independent of the explanatory variables. The bias is hence present even in the absence of unobserved heterogeneity. We discuss instrumental variables estimation, using first differences of the explanatory variables as instruments, as an alternative estimation strategy. Monte Carlo simulations and an empirical application substantiate our theoretical results.
Journal: Econometric Reviews
Pages: 635-654
Issue: 8
Volume: 42
Year: 2023
Month: 9
X-DOI: 10.1080/07474938.2023.2224658
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2224658
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:8:p:635-654
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2225947_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Fang Lu
Author-X-Name-First: Fang
Author-X-Name-Last: Lu
Author-Name: Sisheng Liu
Author-X-Name-First: Sisheng
Author-X-Name-Last: Liu
Author-Name: Jing Yang
Author-X-Name-First: Jing
Author-X-Name-Last: Yang
Author-Name: Xuewen Lu
Author-X-Name-First: Xuewen
Author-X-Name-Last: Lu
Title: Automatic variable selection for semiparametric spatial autoregressive model
Abstract:
This article studies the generalized method of moments estimation of a semiparametric varying-coefficient partially linear spatial autoregressive model. The technique of profile least squares is employed, and all estimators have explicit formulas which are computationally convenient. We derive the limiting distributions of the proposed estimators for both the parametric and nonparametric components. Variable selection procedures based on smooth-threshold estimating equations are proposed to automatically eliminate irrelevant parameters and zero varying-coefficient functions. Compared to alternative approaches based on shrinkage penalties, the new method is easily implemented. Oracle properties of the resulting estimators are established. Extensive Monte Carlo simulations confirm our theories and demonstrate that the estimators perform reasonably well in finite samples. We also apply the novel methods to an empirical data analysis.
Journal: Econometric Reviews
Pages: 655-675
Issue: 8
Volume: 42
Year: 2023
Month: 9
X-DOI: 10.1080/07474938.2023.2225947
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2225947
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:8:p:655-675
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2222634_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: David I. Harvey
Author-X-Name-First: David I.
Author-X-Name-Last: Harvey
Author-Name: Stephen J. Leybourne
Author-X-Name-First: Stephen J.
Author-X-Name-Last: Leybourne
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Title: Improved tests for stock return predictability
Abstract:
Predictive regression methods are widely used to examine the predictability of (excess) stock returns by lagged financial variables characterized by unknown degrees of persistence and endogeneity. We develop a new hybrid test for predictability in these circumstances based on simple regression t-statistics. Where the predictor is endogenous, the optimal, but infeasible, test for predictability is based on the t-statistic on the lagged predictor in the basic predictive regression augmented with the current period innovation driving the predictor. We propose a feasible version of this augmented test, designed for the case where the predictor is an endogenous near-unit root process, using a GLS-based estimate of the innovation used in the infeasible test regression. The limiting null distribution of this statistic depends on both the endogeneity correlation parameter and the local-to-unity parameter characterizing the predictor. A method for obtaining asymptotic critical values is discussed and response surfaces are provided. We compare the asymptotic power properties of the feasible augmented test with those of a (non-augmented) t-test recently considered in Harvey et al. and show that the augmented test is more powerful in the strongly persistent predictor case. We then propose using a weighted combination of the augmented statistic and the t-statistic of Harvey et al., where the weights are obtained using the p-values from a unit root test on the predictor. We find this can further improve asymptotic power in cases where the predictor has persistence at or close to that of a unit root process. Our final hybrid testing procedure then embeds the weighted statistic within a switching-based procedure which makes use of a standard predictive regression t-test, compared with standard normal critical values, when there is evidence for the predictor being weakly persistent.
Monte Carlo simulations suggest that overall our new hybrid test displays superior finite sample performance to comparable extant tests.
Journal: Econometric Reviews
Pages: 834-861
Issue: 9-10
Volume: 42
Year: 2023
Month: 11
X-DOI: 10.1080/07474938.2023.2222634
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2222634
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:9-10:p:834-861
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2222633_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: H. Peter Boswijk
Author-X-Name-First: H. Peter
Author-X-Name-Last: Boswijk
Author-Name: Giuseppe Cavaliere
Author-X-Name-First: Giuseppe
Author-X-Name-Last: Cavaliere
Author-Name: Luca De Angelis
Author-X-Name-First: Luca
Author-X-Name-Last: De Angelis
Author-Name: A. M. Robert Taylor
Author-X-Name-First: A. M. Robert
Author-X-Name-Last: Taylor
Title: Adaptive information-based methods for determining the co-integration rank in heteroskedastic VAR models
Abstract:
Standard methods, such as sequential procedures based on Johansen’s (pseudo-)likelihood ratio (PLR) test, for determining the co-integration rank of a vector autoregressive (VAR) system of variables integrated of order one can be significantly affected, even asymptotically, by unconditional heteroskedasticity (non-stationary volatility) in the data. Known solutions to this problem include wild bootstrap implementations of the PLR test or the use of an information criterion, such as the BIC, to select the co-integration rank. Although asymptotically valid in the presence of heteroskedasticity, these methods can display very low finite sample power under some patterns of non-stationary volatility. In particular, they do not exploit potential efficiency gains that could be realized in the presence of non-stationary volatility by using adaptive inference methods. Under the assumption of a known autoregressive lag length, Boswijk and Zu develop adaptive PLR test based methods using a non-parametric estimate of the covariance matrix process. It is well-known, however, that selecting an incorrect lag length can significantly impact on the efficacy of both information criteria and bootstrap PLR tests to determine co-integration rank in finite samples. We show that adaptive information criteria-based approaches can be used to estimate the autoregressive lag order to use in connection with bootstrap adaptive PLR tests, or to jointly determine the co-integration rank and the VAR lag length and that in both cases they are weakly consistent for these parameters in the presence of non-stationary volatility provided standard conditions hold on the penalty term. Monte Carlo simulations are used to demonstrate the potential gains from using adaptive methods and an empirical application to the U.S. term structure is provided.
Journal: Econometric Reviews
Pages: 725-757
Issue: 9-10
Volume: 42
Year: 2023
Month: 11
X-DOI: 10.1080/07474938.2023.2222633
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2222633
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:9-10:p:725-757
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2241223_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Panayiotis C. Andreou
Author-X-Name-First: Panayiotis C.
Author-X-Name-Last: Andreou
Author-Name: Sofia Anyfantaki
Author-X-Name-First: Sofia
Author-X-Name-Last: Anyfantaki
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Carlo Sala
Author-X-Name-First: Carlo
Author-X-Name-Last: Sala
Title: Extremal quantiles and stock price crashes
Abstract:
We employ extreme value theory to identify stock price crashes, featuring low-probability events that produce large, idiosyncratic negative outliers in the conditional distribution. Traditional methods employ approximations under Gaussian assumptions and central moments. This is inherently imprecise and susceptible to misspecifications, especially for tail events. We instead propose new definitions and measures for crash risk based on conditional extremal quantiles (CEQ) of idiosyncratic stock returns. CEQ provide information on the quantile-specific impact of covariates, and shed light on prior empirical puzzles and shortcomings in identifying crashes. Additionally, to capture the magnitude of crashes, we provide an expected shortfall analysis of the losses due to crashes. Our findings have important implications for a burgeoning literature in financial economics that relies on traditional approximations.
Journal: Econometric Reviews
Pages: 703-724
Issue: 9-10
Volume: 42
Year: 2023
Month: 11
X-DOI: 10.1080/07474938.2023.2241223
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2241223
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:9-10:p:703-724
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2243696_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Esfandiar Maasoumi
Author-X-Name-First: Esfandiar
Author-X-Name-Last: Maasoumi
Author-Name: Robert Taylor
Author-X-Name-First: Robert
Author-X-Name-Last: Taylor
Title: In memory of Michael McAleer: special issue of Econometric Reviews
Journal: Econometric Reviews
Pages: 700-702
Issue: 9-10
Volume: 42
Year: 2023
Month: 11
X-DOI: 10.1080/07474938.2023.2243696
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2243696
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:9-10:p:700-702
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2221558_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Chaoyi Chen
Author-X-Name-First: Chaoyi
Author-X-Name-Last: Chen
Author-Name: Thanasis Stengos
Author-X-Name-First: Thanasis
Author-X-Name-Last: Stengos
Author-Name: Yiguo Sun
Author-X-Name-First: Yiguo
Author-X-Name-Last: Sun
Title: Endogeneity in semiparametric threshold regression models with two threshold variables
Abstract:
This article considers a semiparametric threshold regression model with two threshold variables. The proposed model allows endogenous threshold variables and endogenous slope regressors. Under the diminishing threshold effects framework, we derive consistency and asymptotic results of our proposed estimator for weakly dependent data. We study the finite sample performance of our proposed estimator via small Monte Carlo simulations and apply our model to classify economic growth regimes based on both national public debt and national external debt.
Journal: Econometric Reviews
Pages: 758-779
Issue: 9-10
Volume: 42
Year: 2023
Month: 11
X-DOI: 10.1080/07474938.2023.2221558
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2221558
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:9-10:p:758-779
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2227019_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Feifei Guo
Author-X-Name-First: Feifei
Author-X-Name-Last: Guo
Author-Name: Shiqing Ling
Author-X-Name-First: Shiqing
Author-X-Name-Last: Ling
Title: Inference for the VEC(1) model with heavy-tailed linear process errors
Abstract:
This article studies the first-order vector error correction (VEC(1)) model when its noise is a linear process of independent and identically distributed (i.i.d.) heavy-tailed random vectors with a tail index α∈(0,2). We show that the rate of convergence of the least squares estimator (LSE) related to the long-run parameters is n (sample size) and its limiting distribution is a stochastic integral in terms of two stable random processes, while the LSE related to the short-term parameters is not consistent. We further propose an automated approach via adaptive shrinkage techniques to determine the cointegration rank in the VEC(1) model. It is demonstrated that the cointegration rank r0 can be consistently selected even though the LSE related to the short-term parameters is not consistently estimable when the tail index α∈(1,2). Simulation studies are carried out to evaluate the performance of the proposed procedure in finite samples. Last, we use our techniques to explore the long-run and short-run behavior of the monthly prices of wheat, corn, and wheat flour in the United States.
Journal: Econometric Reviews
Pages: 806-833
Issue: 9-10
Volume: 42
Year: 2023
Month: 11
X-DOI: 10.1080/07474938.2023.2227019
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2227019
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:9-10:p:806-833
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2224175_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Kees Jan van Garderen
Author-X-Name-First: Kees Jan
Author-X-Name-Last: van Garderen
Title: Forecasting Levels in Loglinear Unit Root Models
Abstract:
This article considers unbiased prediction of levels when data series are modeled as a random walk with drift and other exogenous factors after taking natural logs. We derive the unique unbiased predictors for growth and its variance. Derivation of level forecasts is more involved because the last observation enters the conditional expectation and is highly correlated with the parameter estimates, even asymptotically. This leads to conceptual questions regarding conditioning on endogenous variables. We prove that no conditionally unbiased forecast exists. We derive forecasts that are unconditionally unbiased and take into account estimation uncertainty, non linearity of the transformations, and the correlation between the last observation and estimate, which is quantitatively more important than estimation uncertainty and future disturbances together. The exact unbiased forecasts are shown to have lower Mean Squared Forecast Error (MSFE) than usual forecasts. The results are applied to Bitcoin price levels and a disaggregated eight sector model of UK industrial production.
Journal: Econometric Reviews
Pages: 780-805
Issue: 9-10
Volume: 42
Year: 2023
Month: 11
X-DOI: 10.1080/07474938.2023.2224175
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2224175
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:42:y:2023:i:9-10:p:780-805
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2237274_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Zhongfang He
Author-X-Name-First: Zhongfang
Author-X-Name-Last: He
Title: Time-dependent shrinkage of time-varying parameter regression models
Abstract:
This article studies the time-varying parameter (TVP) regression model in which the regression coefficients are random walk latent states with time-dependent conditional variances. This TVP model is flexible enough to accommodate a wide variety of time variation patterns but requires effective shrinkage on the state variances to avoid over-fitting. A Bayesian shrinkage prior is proposed based on a reparameterization that translates the variance shrinkage problem into a variable shrinkage one in a conditionally linear regression with fixed coefficients. The proposed prior allows strong shrinkage of the state variances while maintaining the flexibility to accommodate local signals. A Bayesian estimation method is developed that employs the ancillarity-sufficiency interweaving strategy to boost sampling efficiency. A simulation study and an empirical application to forecasting the inflation rate illustrate the benefits of the proposed approach.
Journal: Econometric Reviews
Pages: 1-29
Issue: 1
Volume: 43
Year: 2024
Month: 1
X-DOI: 10.1080/07474938.2023.2237274
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2237274
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:1:p:1-29
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2280825_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Chaoxia Yuan
Author-X-Name-First: Chaoxia
Author-X-Name-Last: Yuan
Author-Name: Fang Fang
Author-X-Name-First: Fang
Author-X-Name-Last: Fang
Author-Name: Jialiang Li
Author-X-Name-First: Jialiang
Author-X-Name-Last: Li
Title: Model averaging for generalized linear models in diverging model spaces with effective model size
Abstract:
While plenty of frequentist model averaging methods have been proposed, existing weight selection criteria for generalized linear models (GLM) are usually based on a model-size-penalized Kullback-Leibler (KL) loss or simply cross-validation. In this article, when the data are generated from an exponential distribution, we propose a novel model averaging approach for GLM motivated by an asymptotically unbiased estimator of the KL loss penalized by an “effective model size” that incorporates the model misspecification. When all the candidate models are misspecified, the proposed method achieves asymptotic optimality while allowing both the number of candidate models and the dimension of covariates to diverge. Furthermore, when correct models are included in the candidate model set, we prove that the weight of wrong candidate models converges to zero, and hence the weighted regression coefficient estimator is consistent. Simulation studies and two real-data examples demonstrate the advantage of our new method over existing frequentist model averaging methods.
Journal: Econometric Reviews
Pages: 71-96
Issue: 1
Volume: 43
Year: 2024
Month: 1
X-DOI: 10.1080/07474938.2023.2280825
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2280825
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:1:p:71-96
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2246823_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Taoufik Bouezmarni
Author-X-Name-First: Taoufik
Author-X-Name-Last: Bouezmarni
Author-Name: Mohamed Doukali
Author-X-Name-First: Mohamed
Author-X-Name-Last: Doukali
Author-Name: Abderrahim Taamouti
Author-X-Name-First: Abderrahim
Author-X-Name-Last: Taamouti
Title: Testing Granger non-causality in expectiles
Abstract:
This article aims to derive a consistent test of Granger causality at a given expectile. We also propose a sup-Wald test for jointly testing Granger causality at all expectiles that has the correct asymptotic size and power properties. Expectiles have the advantage of capturing similar information as quantiles, but they also have the merit of being much more straightforward to use than quantiles, since they are defined as least squares analog of quantiles. Studying Granger causality in expectiles is practically simpler and allows us to examine the causality at all levels of the conditional distribution. Moreover, testing Granger causality at all expectiles provides a sufficient condition for testing Granger causality in distribution. A Monte Carlo simulation study reveals that our tests have good finite-sample size and power properties for a variety of data-generating processes and different sample sizes. Finally, we provide two empirical applications to illustrate the usefulness of the proposed tests.
Journal: Econometric Reviews
Pages: 30-51
Issue: 1
Volume: 43
Year: 2024
Month: 1
X-DOI: 10.1080/07474938.2023.2246823
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2246823
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:1:p:30-51
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2255438_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20230119T200553 git hash: 724830af20
Author-Name: Giampiero Marra
Author-X-Name-First: Giampiero
Author-X-Name-Last: Marra
Author-Name: Rosalba Radice
Author-X-Name-First: Rosalba
Author-X-Name-Last: Radice
Author-Name: David Zimmer
Author-X-Name-First: David
Author-X-Name-Last: Zimmer
Title: A unifying switching regime regression framework with applications in health economics
Abstract:
Motivated by three health economics-related case studies, we propose a unifying and flexible regression modeling framework that involves regime switching. The proposal can handle the peculiar distributional shapes of the considered outcomes via a vast range of marginal distributions, allows for a wide variety of copula dependence structures and permits specifying all model parameters (including the dependence parameters) as flexible functions of covariate effects. The algorithm is based on a computationally efficient and stable penalized maximum likelihood estimation approach. The proposed modeling framework is employed in three applications in health economics that use data from the Medical Expenditure Panel Survey, where novel patterns are uncovered. The framework has been incorporated in the R package GJRM, hence allowing users to fit the desired model(s) and produce easy-to-interpret numerical and visual summaries.
Journal: Econometric Reviews
Pages: 52-70
Issue: 1
Volume: 43
Year: 2024
Month: 1
X-DOI: 10.1080/07474938.2023.2255438
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2255438
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:1:p:52-70
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2315543_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: The Editors
Title: ANNOUNCEMENT
Journal: Econometric Reviews
Pages: 97-97
Issue: 2-4
Volume: 43
Year: 2024
Month: 4
X-DOI: 10.1080/07474938.2024.2315543
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2315543
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:2-4:p:97-97
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2310987_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Akanksha Negi
Author-X-Name-First: Akanksha
Author-X-Name-Last: Negi
Author-Name: Jeffrey M. Wooldridge
Author-X-Name-First: Jeffrey M.
Author-X-Name-Last: Wooldridge
Title: Doubly robust estimation of multivariate fractional outcome means with multivalued treatments
Abstract:
This article suggests a doubly robust method of estimating potential outcome means for multivariate fractional outcomes when the treatment of interest is unconfounded and can take more than two values. The method involves maximizing a propensity score weighted multinomial quasi-log-likelihood function with a multinomial logit conditional mean. We show that this estimator, which we call weighted multivariate fractional logit (wmflogit), consistently estimates the potential outcome means if either the propensity score model or the conditional mean model is misspecified. Our simulations demonstrate this double robustness property for the case of shares generated using a Dirichlet distribution. Finally, we advocate for the use of wmflogit by applying it to estimate time-use shares of women participating in the Mexican conditional cash transfer program, Progresa, using Stata’s fmlogit command developed by Buis.
Journal: Econometric Reviews
Pages: 175-196
Issue: 2-4
Volume: 43
Year: 2024
Month: 4
X-DOI: 10.1080/07474938.2024.2310987
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2310987
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:2-4:p:175-196
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2292383_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Ignace De Vos
Author-X-Name-First: Ignace
Author-X-Name-Last: De Vos
Author-Name: Gerdie Everaert
Author-X-Name-First: Gerdie
Author-X-Name-Last: Everaert
Author-Name: Vasilis Sarafidis
Author-X-Name-First: Vasilis
Author-X-Name-Last: Sarafidis
Title: A method to evaluate the rank condition for CCE estimators
Abstract:
We develop a binary classifier to evaluate whether the rank condition (RC) is satisfied or not for the Common Correlated Effects (CCE) estimator. The RC postulates that the number of unobserved factors, m, is not larger than the rank of the unobserved matrix of average factor loadings, ϱ. When this condition fails, the CCE estimator is inconsistent, in general. Despite its importance, to date this rank condition could not be verified. The difficulty lies in the fact that factor loadings are unobserved, such that ϱ cannot be directly determined. The key insight in this article is that ϱ can be consistently estimated with existing techniques through the matrix of cross-sectional averages of the data. Similarly, m can be estimated consistently from the data using existing methods. Thus, a binary classifier, constructed by comparing estimates of m and ϱ, correctly determines whether the RC is satisfied or not as (N,T)→∞. We illustrate the practical relevance of testing the RC by studying the effect of the Dodd-Frank Act on bank profitability. The RC classifier reveals that the rank condition fails for a subperiod of the sample, in which case the estimated effect of bank size on profitability appears to be biased upwards.
Journal: Econometric Reviews
Pages: 123-155
Issue: 2-4
Volume: 43
Year: 2024
Month: 4
X-DOI: 10.1080/07474938.2023.2292383
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2292383
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:2-4:p:123-155
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2306069_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Ramses H. Abul Naga
Author-X-Name-First: Ramses H.
Author-X-Name-Last: Abul Naga
Author-Name: Christopher Stapenhurst
Author-X-Name-First: Christopher
Author-X-Name-Last: Stapenhurst
Author-Name: Gaston Yalonetzky
Author-X-Name-First: Gaston
Author-X-Name-Last: Yalonetzky
Title: Inferring inequality: Testing for median-preserving spreads in ordinal data
Abstract:
The median-preserving spread (MPS) ordering for ordinal variables has become ubiquitous in the inequality literature. We devise statistical tests of the hypothesis that a distribution G is not an MPS of a distribution F. Rejecting this hypothesis enables the conclusion that G is more unequal than F according to the MPS criterion. Monte Carlo simulations and novel graphical techniques show that a simple, asymptotic Z test is sufficient for most applications. We illustrate our tests with two applications: happiness inequality in the US and self-assessed health in Europe.
Journal: Econometric Reviews
Pages: 156-174
Issue: 2-4
Volume: 43
Year: 2024
Month: 4
X-DOI: 10.1080/07474938.2024.2306069
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2306069
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:2-4:p:156-174
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2314092_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Lukang Huang
Author-X-Name-First: Lukang
Author-X-Name-Last: Huang
Author-Name: Wei Huang
Author-X-Name-First: Wei
Author-X-Name-Last: Huang
Author-Name: Oliver Linton
Author-X-Name-First: Oliver
Author-X-Name-Last: Linton
Author-Name: Zheng Zhang
Author-X-Name-First: Zheng
Author-X-Name-Last: Zhang
Title: Nonparametric estimation of mediation effects with a general treatment
Abstract:
To investigate causal mechanisms, causal mediation analysis decomposes the total treatment effect into the natural direct and indirect effects. This article examines the estimation of the direct and indirect effects in a general treatment effect model, where the treatment can be binary, multi-valued, continuous, or a mixture. We propose generalized weighting estimators with weights estimated by solving an expanding set of equations. Under some sufficient conditions, we show that the proposed estimators are consistent and asymptotically normal. Specifically, when the treatment is discrete, the proposed estimators attain semiparametric efficiency bounds. Meanwhile, when the treatment is continuous, the convergence rates of the proposed estimators are slower than N^{-1/2}; however, they are still more efficient than those constructed from the true weighting function. A simulation study reveals that our estimators exhibit satisfactory finite-sample performance, while an application shows their practical value.
Journal: Econometric Reviews
Pages: 215-237
Issue: 2-4
Volume: 43
Year: 2024
Month: 4
X-DOI: 10.1080/07474938.2024.2314092
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2314092
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:2-4:p:215-237
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2312288_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: David M. Kaplan
Author-X-Name-First: David M.
Author-X-Name-Last: Kaplan
Author-Name: Xin Liu
Author-X-Name-First: Xin
Author-X-Name-Last: Liu
Title: Confidence intervals for intentionally biased estimators
Abstract:
We propose and study three confidence intervals (CIs) centered at an estimator that is intentionally biased to reduce mean squared error. The first CI simply uses an unbiased estimator’s standard error; compared to centering at the unbiased estimator, this CI has higher coverage probability for confidence levels above 91.7%, even if the biased and unbiased estimators have equal mean squared error. The second CI trades some of this “excess” coverage for shorter length. The third CI is centered at a convex combination of the two estimators to further reduce length. Practically, these CIs apply broadly and are simple to compute.
Journal: Econometric Reviews
Pages: 197-214
Issue: 2-4
Volume: 43
Year: 2024
Month: 4
X-DOI: 10.1080/07474938.2024.2312288
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2312288
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:2-4:p:197-214
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2292377_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Dalei Yu
Author-X-Name-First: Dalei
Author-X-Name-Last: Yu
Author-Name: Heng Lian
Author-X-Name-First: Heng
Author-X-Name-Last: Lian
Author-Name: Yuying Sun
Author-X-Name-First: Yuying
Author-X-Name-Last: Sun
Author-Name: Xinyu Zhang
Author-X-Name-First: Xinyu
Author-X-Name-Last: Zhang
Author-Name: Yongmiao Hong
Author-X-Name-First: Yongmiao
Author-X-Name-Last: Hong
Title: Post-averaging inference for optimal model averaging estimator in generalized linear models
Abstract:
This article considers the problem of post-averaging inference for optimal model averaging estimators in a generalized linear model (GLM). We establish the asymptotic distributions of optimal model averaging estimators for GLMs. The asymptotic distributions of the model averaging estimators are nonstandard, depending on the configuration of the penalty term in the weight choice criterion. We also propose a feasible simulation-based confidence interval estimator and investigate its asymptotic properties rigorously. Monte Carlo simulations verify the usefulness of our theoretical results, and the proposed methods are employed to analyze a stock car racing dataset.
Journal: Econometric Reviews
Pages: 98-122
Issue: 2-4
Volume: 43
Year: 2024
Month: 4
X-DOI: 10.1080/07474938.2023.2292377
File-URL: http://hdl.handle.net/10.1080/07474938.2023.2292377
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:2-4:p:98-122
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2334119_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Juan Lin
Author-X-Name-First: Juan
Author-X-Name-Last: Lin
Author-Name: Ximing Wu
Author-X-Name-First: Ximing
Author-X-Name-Last: Wu
Title: A hybrid nonparametric multivariate density estimator with applications to risk management
Abstract:
Multivariate density estimation is plagued by the curse of dimensionality in theory and practice. We propose a hybrid density estimator of a multivariate density f that combines the strengths of the kernel estimator and the exponential series estimator. This estimator refines a preliminary kernel estimate f̂0 with a multiplicative correction that estimates the ratio r=f/f̂0 with an exponential series estimator. Thanks to the consistency of the pilot estimate, the coefficients of the series expansion approach zero asymptotically. Accordingly, we design a thresholding method for basis function selection. A major obstacle to the multivariate exponential series estimator is the calculation of its normalization factor. We resolve this difficulty with Monte Carlo integration, using the pilot kernel estimate as the trial density for importance sampling. This approach greatly enhances the practicality of the hybrid estimator. Numerical simulations demonstrate the good finite sample performance of the hybrid estimator. We present one empirical application in financial risk management.
Journal: Econometric Reviews
Pages: 301-318
Issue: 5
Volume: 43
Year: 2024
Month: 5
X-DOI: 10.1080/07474938.2024.2334119
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2334119
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:5:p:301-318
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2328905_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Jun Cai
Author-X-Name-First: Jun
Author-X-Name-Last: Cai
Author-Name: William C. Horrace
Author-X-Name-First: William C.
Author-X-Name-Last: Horrace
Author-Name: Yoonseok Lee
Author-X-Name-First: Yoonseok
Author-X-Name-Last: Lee
Title: Identification and estimation of panel semiparametric conditional heteroskedastic frontiers with dynamic inefficiency
Abstract:
We study a semiparametric panel stochastic frontier model with nonseparable unobserved heterogeneity, which allows for time-varying conditional heteroskedastic productivity components. It does not require distributional assumptions on random noise except conditional symmetry. We utilize conditional characteristic functions from Kotlarski’s Lemma to derive new moment conditions that yield the identification of the heteroskedastic variance parameters of inefficiency and random noise. Identification only requires a panel with three periods for serially correlated inefficiency. A nonparametric estimation procedure is also developed for the conditional variance of inefficiency, and its convergence rate is established. Monte Carlo simulation shows that the estimator is robust to misspecification of inefficiency distributions.
Journal: Econometric Reviews
Pages: 238-268
Issue: 5
Volume: 43
Year: 2024
Month: 5
X-DOI: 10.1080/07474938.2024.2328905
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2328905
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:5:p:238-268
Template-Type: ReDIF-Article 1.0
# input file: LECR_A_2330127_J.xml processed with: repec_from_jats12.xsl darts-xml-transformations-20240209T083504 git hash: db97ba8e3a
Author-Name: Zhongfang He
Author-X-Name-First: Zhongfang
Author-X-Name-Last: He
Title: Locally time-varying parameter regression
Abstract:
I discuss a framework to allow dynamic sparsity in time-varying parameter regression models. The conditional variances of the innovations of time-varying parameters are time varying and equal to zero adaptively via thresholding. The resulting model allows the dynamics of the time-varying parameters to mix over different frequencies of parameter changes in a data driven way and permits great flexibility while achieving model parsimony. A convenient strategy is discussed to infer if each coefficient is static or dynamic and, if dynamic, how frequent the parameter change is. An MCMC scheme is developed for model estimation. The performance of the proposed approach is illustrated in studies of both simulated and real economic data.
Journal: Econometric Reviews
Pages: 269-300
Issue: 5
Volume: 43
Year: 2024
Month: 5
X-DOI: 10.1080/07474938.2024.2330127
File-URL: http://hdl.handle.net/10.1080/07474938.2024.2330127
File-Format: text/html
File-Restriction: Access to full text is restricted to subscribers.
Handle: RePEc:taf:emetrv:v:43:y:2024:i:5:p:269-300