Creator: Geweke, John Series: New methods in business cycle research Abstract:
A simple stochastic model of the firm is constructed in which a dynamic monopolist, who maximizes a discounted profits stream subject to labor adjustment costs and given factor prices, sets output price as a distributed lag of past wages and input prices. If the observed relation of wages and prices in manufacturing arises solely from this behavior, then wages and input prices are exogenous with respect to output prices. In tests using quarterly and monthly series for the straight-time wage, an index of raw materials prices, and the wholesale price index for manufacturing and its durable and nondurable subsectors, this hypothesis cannot be refuted for the period 1955:1 to 1971:11. During the period 1926:1 to 1940:11, however, symmetrically opposite behavior is observed: manufacturing wholesale prices are exogenous with respect to the wage rate, a relation which can arise if dynamically monopsonistic firms compete in product markets. Neither structural relation has withstood direct wage and price controls.
Keyword: Wholesale, Labor, Wages, Prices, and Manufacturing Subject (JEL): E32 - Business Fluctuations; Cycles, E31 - Price Level; Inflation; Deflation, and L60 - Industry Studies: Manufacturing: General
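The exogeneity tests this abstract describes are, in essence, Granger-causality tests: wages are exogenous with respect to prices if lagged prices add no predictive power for wages. A minimal sketch of such a test follows, using simulated stand-in series rather than the paper's actual data:

```python
# A minimal sketch of a Granger-causality style exogeneity test; the
# `wages` and `prices` series are simulated stand-ins, not the paper's data.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
wages = rng.normal(size=200)                          # placeholder wage series
lagged = np.concatenate([np.zeros(2), wages[:-2]])    # wages lagged two periods
prices = 0.5 * lagged + rng.normal(size=200)          # prices respond to past wages

# Wages are exogenous with respect to prices if lagged prices add no
# predictive power for wages: test whether column 2 causes column 1.
grangercausalitytests(np.column_stack([wages, prices]), maxlag=4)
```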
Creator: Geweke, John Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 539 Abstract:
In the specification of linear regression models, it is common to indicate a list of candidate variables from which a subset enters the model with nonzero coefficients. This paper interprets this specification as a mixed continuous-discrete prior distribution for coefficient values. It then utilizes a Gibbs sampler to construct posterior moments. It is shown how this method can incorporate sign constraints and provide posterior probabilities for all possible subsets of regressors. The methods are illustrated using some standard data sets.
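A mixed continuous-discrete prior of this kind is commonly implemented as a "spike and slab" Gibbs sampler. The following is a hedged sketch in that spirit; the prior settings (slab variance tau2, inclusion probability p_in), the fixed noise variance, and the simulated data are illustrative assumptions, not the paper's specification:

```python
# Hedged sketch of Gibbs variable selection under a spike-and-slab prior.
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 5
X = rng.normal(size=(n, k))
y = X @ np.array([2.0, 0.0, -1.5, 0.0, 0.0]) + rng.normal(size=n)

tau2, sigma2, p_in = 10.0, 1.0, 0.5   # slab variance, noise variance, prior inclusion prob.
beta, gamma = np.zeros(k), np.zeros(k, dtype=bool)
inclusion = np.zeros(k)

for sweep in range(2000):
    for j in range(k):
        r = y - X @ beta + X[:, j] * beta[j]          # partial residual for variable j
        xj = X[:, j]
        v = 1.0 / (xj @ xj / sigma2 + 1.0 / tau2)     # conditional posterior variance
        m = v * (xj @ r) / sigma2                     # conditional posterior mean
        # Bayes factor for inclusion, with beta_j integrated out of the likelihood.
        log_bf = 0.5 * (np.log(v) - np.log(tau2)) + 0.5 * m * m / v
        p1 = p_in / (p_in + (1.0 - p_in) * np.exp(-log_bf))
        gamma[j] = rng.uniform() < p1
        beta[j] = rng.normal(m, np.sqrt(v)) if gamma[j] else 0.0
    if sweep >= 500:
        inclusion += gamma

print("posterior inclusion probabilities:", inclusion / 1500)
```

Averaging the inclusion indicators over post-burn-in sweeps gives the posterior probability that each regressor enters the model, and joint frequencies of the indicator vector give posterior probabilities for subsets of regressors.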
Creator: Geweke, John Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 249 Abstract:
This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.
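One concrete device behind the investigator-client communication described here is prior reweighting: a remote client can revise the investigator's prior by importance reweighting the posterior simulator output. A minimal sketch, in which both priors and the stand-in posterior draws are illustrative assumptions:

```python
# Minimal sketch: a client replaces the investigator's prior by reweighting
# posterior draws with the ratio of the two prior densities.
import numpy as np
from scipy import stats

theta = np.random.default_rng(2).normal(1.0, 0.5, size=10_000)  # stand-in posterior draws

log_w = (stats.norm(0.0, 1.0).logpdf(theta)       # client's tighter prior
         - stats.norm(0.0, 10.0).logpdf(theta))   # investigator's diffuse prior
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Expectation of any function of interest under the client's revised posterior.
print("reweighted posterior mean:", np.sum(w * theta))
```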
Creator: Geweke, John, Keane, Michael P., and Runkle, David Edward Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 177 Abstract:
Statistical inference in multinomial multiperiod probit models has been hindered in the past by the high-dimensional numerical integrations necessary to form the likelihood functions, posterior distributions, or moment conditions in these models. We describe three alternative approaches to inference that circumvent the integration problem: Bayesian inference using Gibbs sampling and data augmentation to compute posterior moments, simulated maximum likelihood (SML) estimation using the GHK recursive probability simulator, and method of simulated moments (MSM) estimation using the GHK simulator. We perform a set of Monte Carlo experiments to compare the performance of these approaches. Although all the methods perform reasonably well, some important differences emerge. The root mean square errors (RMSEs) of the SML parameter estimates around the data generating values exceed those of the MSM estimates by 21 percent on average, while the RMSEs of the MSM estimates exceed those of the posterior parameter means obtained via Gibbs sampling by 18 percent on average. While MSM produces good agreement between empirical RMSEs and asymptotic standard errors, the RMSEs of the SML estimates exceed the asymptotic standard errors by 28 percent on average. Also, the SML estimates of serial correlation parameters exhibit significant downward bias.
Keyword: Panel data, Discrete choice, Bayesian inference, Method of simulated moments, Simulated maximum likelihood, and Gibbs sampling Subject (JEL): C35 - Multiple or Simultaneous Equation Models: Discrete Regression and Qualitative Choice Models; Discrete Regressors; Proportions and C15 - Statistical Simulation Methods: General
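The GHK recursive probability simulator referenced throughout this abstract evaluates rectangle probabilities of a multivariate normal by drawing truncated normals one coordinate at a time along a Cholesky factorization. A sketch, with the covariance matrix and bounds chosen purely for illustration:

```python
# Hedged sketch of the GHK simulator for P(lower < Z < upper), Z ~ N(0, Sigma).
import numpy as np
from scipy.stats import norm

def ghk(lower, upper, Sigma, n_draws=1000, seed=0):
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(Sigma)
    d = len(lower)
    prob = np.ones(n_draws)
    eta = np.zeros((n_draws, d))
    for j in range(d):
        # Bounds on eta_j implied by the bounds on Z_j, given earlier draws.
        prior = eta[:, :j] @ L[j, :j]
        lo = norm.cdf((lower[j] - prior) / L[j, j])
        hi = norm.cdf((upper[j] - prior) / L[j, j])
        prob *= hi - lo                      # probability of the j-th interval
        u = lo + rng.uniform(size=n_draws) * (hi - lo)
        eta[:, j] = norm.ppf(u)              # truncated standard normal draw
    return prob.mean()

Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
print(ghk(np.array([-np.inf, -np.inf]), np.array([0.0, 0.0]), Sigma))  # approx 1/3
```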
Creator: Geweke, John Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 570 Abstract:
This paper surveys recently developed methods for Bayesian inference and their use in economic time series models. It begins by reviewing aspects of Bayesian inference essential to understanding the implications of the Bayesian paradigm for time series analysis. It next describes the use of posterior simulators to solve otherwise intractable analytical problems. The theory and the computational advances are brought together in setting forth a practical framework for decision-making and forecasting. These developments are illustrated in the context of vector autoregressions, stochastic volatility models, and models of changing regimes.
Creator: Geweke, John and Petrella, Lea Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 553 Abstract:
This paper provides a general and efficient method for computing density ratio class bounds on posterior moments, given the output of a posterior simulator. It shows how density ratio class bounds for posterior odds ratios may be formed in many situations, also on the basis of posterior simulator output. The computational method is used to provide density ratio class bounds in two econometric models. It is found that the exact bounds are poorly approximated by their asymptotic counterparts when the posterior distribution of the function of interest is skewed. It is also found that posterior odds ratios display substantial variation within the density ratio class, in ways that cannot be anticipated by the asymptotic approximation.
Keyword: Bayesian inference, Markov-chain Monte Carlo, Normal mixture, and Probit model Subject (JEL): C11 - Bayesian Analysis: General and C63 - Computational Techniques; Simulation Modeling
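A hedged sketch of how density ratio class bounds on a posterior moment can be computed from simulator output: the extremal member of the class up-weights draws above (or below) a cutoff by the ratio bound k, and the bound is the extreme weighted mean over cutoffs. The ratio bound and the skewed stand-in draws below are assumptions for illustration, not the paper's exact algorithm:

```python
# Sketch of density ratio class bounds on E[g] from posterior draws of g.
import numpy as np

def ratio_class_bounds(g, k):
    g = np.sort(np.asarray(g, dtype=float))
    n = len(g)
    csum = np.concatenate([[0.0], np.cumsum(g)])
    m = np.arange(n + 1)
    # For each cut m, weight draws above the cut by k (upper bound) or
    # below it by k (lower bound), then take the extreme weighted mean.
    upper = (csum[m] + k * (csum[n] - csum[m])) / (m + k * (n - m))
    lower = (k * csum[m] + (csum[n] - csum[m])) / (k * m + (n - m))
    return lower.min(), upper.max()

draws = np.random.default_rng(3).gamma(2.0, size=5000)  # skewed stand-in posterior
print(ratio_class_bounds(draws, k=2.0))
```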
Creator: Chin, Daniel M., Geweke, John, and Miller, Preston J. Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 267 Abstract:
This paper presents a new method for predicting turning points. The paper formally defines a turning point; develops a probit model for estimating the probability of a turning point; and then examines both the in-sample and out-of-sample forecasting performance of the model. The model performs better than some other methods for predicting turning points.
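A minimal sketch of the estimation step, assuming a binary turning-point indicator regressed on stand-in predictors via a standard probit; the paper's actual indicator definition and variable list are not reproduced here:

```python
# Minimal turning-point probit sketch with simulated, illustrative data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
X = sm.add_constant(rng.normal(size=(200, 3)))    # stand-in leading indicators
y = (X @ np.array([-1.0, 1.0, -0.5, 0.3])         # hypothetical latent index
     + rng.normal(size=200) > 0).astype(int)      # 1 = turning point ahead

fit = sm.Probit(y, X).fit()
print(fit.predict(X)[:5])                         # estimated turning-point probabilities
```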
Creator: Geweke, John Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 555
Creator: Geweke, John Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 526 Keyword: Econometrics, Monte Carlo, and Simulation Subject (JEL): C15 - Statistical Simulation Methods: General and C63 - Computational Techniques; Simulation Modeling
Creator: Geweke, John Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 192 Abstract:
This is a survey of simulation methods in economics, with a specific focus on integration problems. It describes acceptance methods, importance sampling procedures, and Markov chain Monte Carlo methods for simulation from univariate and multivariate distributions and their application to the approximation of integrals. The exposition gives emphasis to combinations of different approaches and assessment of the accuracy of numerical approximations to integrals and expectations. The survey illustrates these procedures with applications to simulation and integration problems in economics.
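As one example of the surveyed procedures, importance sampling approximates an expectation under a target density p using draws from a tractable proposal q, weighted by p/q. A sketch in which the target, the proposal, and the function of interest are all illustrative choices:

```python
# Importance sampling sketch: estimate E_p[x^2] for p = N(1, 1).
import numpy as np
from scipy import stats

q = stats.t(df=5, loc=0.0, scale=2.0)                 # heavy-tailed proposal
x = q.rvs(size=20_000, random_state=np.random.default_rng(5))

log_w = stats.norm(1.0, 1.0).logpdf(x) - q.logpdf(x)  # target over proposal
w = np.exp(log_w - log_w.max())
w /= w.sum()                                          # self-normalized weights

print("E[x^2] estimate:", np.sum(w * x * x))          # exact value is 2.0
```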
Creator: Geweke, John and Keane, Michael P. Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 237 Abstract:
This paper generalizes the normal probit model of dichotomous choice by introducing mixtures of normals distributions for the disturbance term. By mixing on both the mean and variance parameters and by increasing the number of distributions in the mixture, these models effectively remove the normality assumption and are much closer to semiparametric models. When a Bayesian approach is taken, there is an exact finite-sample distribution theory for the choice probability conditional on the covariates. The paper uses artificial data to show how posterior odds ratios can discriminate between normal and nonnormal distributions in probit models. The method is also applied to female labor force participation decisions in a sample with 1,555 observations from the PSID. In this application, Bayes factors strongly favor mixture of normals probit models over the conventional probit model, and the most favored models have mixtures of four normal distributions for the disturbance term.
Keyword: Normal mixture, Discrete choice, and Markov chain Monte Carlo Subject (JEL): C25 - Single Equation Models; Single Variables: Discrete Regression and Qualitative Choice Models; Discrete Regressors; Proportions; Probabilities and C11 - Bayesian Analysis: General
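Under a mixture-of-normals disturbance the choice probability has a simple closed form: a mixture of normal CDFs. A sketch with illustrative component weights, means, and scales (these are not the paper's estimates, and in estimation the components would be subject to identification constraints):

```python
# Choice probability in a mixture-of-normals probit:
# e ~ sum_j pi_j N(mu_j, s_j^2)  =>  P(y=1|x) = sum_j pi_j Phi((x'b + mu_j)/s_j).
import numpy as np
from scipy.stats import norm

pi_ = np.array([0.7, 0.3])      # component weights (illustrative)
mu = np.array([0.0, 1.5])       # component means
s = np.array([1.0, 2.0])        # component standard deviations

def choice_prob(xb):
    return np.sum(pi_ * norm.cdf((xb + mu) / s))

print(choice_prob(0.5))
```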
Creator: Geweke, John and Zhou, Guofu Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 189 Abstract:
This paper provides an exact Bayesian framework for analyzing the arbitrage pricing theory (APT). Based on the Gibbs sampler, we show how to obtain the exact posterior distributions for functions of interest in the factor model. In particular, we propose a measure of the APT pricing deviations and obtain its exact posterior distribution. Using monthly portfolio returns grouped by industry and market capitalization, we find that there is little improvement in reducing the pricing errors by including more factors beyond the first one.
Subject (JEL): G10 - General Financial Markets: General (includes Measurement and Data)
Creator: Geweke, John Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 148 Abstract:
Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper, methods from spectral analysis are used to evaluate numerical accuracy formally and to construct diagnostics for convergence. These methods are illustrated in the normal linear model with informative priors and in the Tobit-censored regression model.
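The spectral approach estimates the variance of a simulator sample mean as S(0)/n, where S(0) is the spectral density of the draws at frequency zero, which accounts for their serial dependence. A sketch using a Bartlett-tapered autocovariance estimate of S(0); the taper and truncation rule are illustrative choices among several possibilities:

```python
# Numerical standard error of an MCMC sample mean via the spectral density at zero.
import numpy as np

def numerical_se(x, max_lag=None):
    x = np.asarray(x, dtype=float)
    n = len(x)
    if max_lag is None:
        max_lag = int(n ** 0.5)
    xc = x - x.mean()
    s0 = xc @ xc / n                          # lag-0 autocovariance
    for lag in range(1, max_lag + 1):
        gamma = xc[:-lag] @ xc[lag:] / n      # lag-`lag` autocovariance
        s0 += 2.0 * (1.0 - lag / (max_lag + 1.0)) * gamma  # Bartlett taper
    return np.sqrt(s0 / n)

chain = np.random.default_rng(6).normal(size=10_000)  # stand-in simulator output
print(numerical_se(chain))                            # near 0.01 for i.i.d. draws
```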
Creator: Geweke, John Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 540 Abstract:
The reduced rank regression model arises repeatedly in theoretical and applied econometrics. To date, the only general treatments of this model have been frequentist. This paper develops general methods for Bayesian inference with noninformative reference priors in this model, based on a Markov chain sampling algorithm, together with procedures for obtaining predictive odds ratios for regression models with different ranks. These methods are used to obtain evidence on the number of factors in a capital asset pricing model.
Keyword: Factor model, Capital asset pricing model, and Predictive odds Subject (JEL): C11 - Bayesian Analysis: General and C15 - Statistical Simulation Methods: General
Creator: Geweke, John Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 552 Abstract:
The normal linear model, with sign or other linear inequality constraints on its coefficients, arises frequently in scientific applications. Given inequality constraints, Bayesian inference is much simpler than classical inference, but standard Bayesian computational methods become impractical when the posterior probability of the inequality constraints (under a diffuse prior) is small. This paper shows how the Gibbs sampling algorithm can provide an alternative, attractive approach to inference subject to linear inequality constraints in this situation, and how the GHK probability simulator may be used to assess the posterior probability of the constraints.
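A minimal sketch of the Gibbs approach for sign constraints: with beta >= 0 imposed, each coefficient's full conditional is a truncated normal. The flat prior, fixed error variance, and simulated data below are illustrative assumptions:

```python
# Gibbs sampling for a normal linear model with sign constraints (beta >= 0).
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(7)
n, k, sigma2 = 100, 3, 1.0
X = rng.normal(size=(n, k))
y = X @ np.array([1.0, 0.2, 0.0]) + rng.normal(size=n)

beta = np.ones(k)
draws = []
for sweep in range(2000):
    for j in range(k):
        r = y - X @ beta + X[:, j] * beta[j]      # partial residual
        v = sigma2 / (X[:, j] @ X[:, j])          # conditional variance (flat prior)
        m = (X[:, j] @ r) / (X[:, j] @ X[:, j])   # conditional mean
        a = (0.0 - m) / np.sqrt(v)                # truncation point for beta_j >= 0
        beta[j] = truncnorm.rvs(a, np.inf, loc=m, scale=np.sqrt(v), random_state=rng)
    draws.append(beta.copy())

print(np.mean(draws[500:], axis=0))               # posterior means after burn-in
```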
Creator: Geweke, John Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 532 Abstract:
This paper integrates and extends some recent computational advances in Bayesian inference with the objective of more fully realizing the Bayesian promise of coherent inference and model comparison in economics. It combines Markov chain Monte Carlo and independence Monte Carlo with importance sampling to provide an efficient and generic method for updating posterior distributions. It exploits the multiplicative decomposition of the marginalized likelihood into predictive factors to compute posterior odds ratios efficiently and with minimal further investment in software. It argues for the use of predictive odds ratios in model comparison in economics. Finally, it suggests procedures for public reporting that will enable remote clients to conveniently modify priors, form posterior expectations of their own functions of interest, and update the posterior distribution with new observations. A series of examples explores the practicality and efficiency of these methods.
This paper was prepared for the inaugural Colin Clark Lecture, Australasian Meetings of the Econometric Society, July 1994.
Keyword: Computation, Model comparison, Bayesian inference, and Econometric modeling Subject (JEL): C53 - Forecasting Models; Simulation Methods and C11 - Bayesian Analysis: General
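In generic notation, the multiplicative decomposition referred to here factors a model's marginalized likelihood into one-step-ahead predictive densities, so predictive odds ratios accumulate factor by factor as observations arrive (a sketch of the identity, not the paper's notation):

```latex
p(y_1,\dots,y_T \mid M) = \prod_{t=1}^{T} p(y_t \mid y_1,\dots,y_{t-1}, M),
\qquad
\frac{P(M_1 \mid y_{1:T})}{P(M_2 \mid y_{1:T})}
  = \frac{P(M_1)}{P(M_2)} \prod_{t=1}^{T}
    \frac{p(y_t \mid y_{1:t-1}, M_1)}{p(y_t \mid y_{1:t-1}, M_2)}.
```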
Creator: Geweke, John and Keane, Michael P. Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 233 Abstract:
This study uses data from the Panel Study of Income Dynamics (PSID) to address a number of questions about life cycle earnings mobility. It develops a dynamic reduced form model of earnings and marital status that is nonstationary over the life cycle. The study reaches several firm conclusions about life cycle earnings mobility. Incorporating non-Gaussian shocks makes it possible to account for transitions between low and higher earnings states, a heretofore unresolved problem. The non-Gaussian distribution substantially increases the lifetime return to post-secondary education and substantially reduces differences in lifetime wages attributable to race. In a given year, the majority of the variance in earnings not accounted for by race, education, and age is due to transitory shocks, but over a lifetime the majority is due to unobserved individual heterogeneity. Consequently, low earnings at early ages are strong predictors of low earnings later in life, even conditioning on observed individual characteristics.
Creator: Geweke, John, Keane, Michael P., and Runkle, David Edward Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 170 Abstract:
This research compares several approaches to inference in the multinomial probit model, based on Monte Carlo results for a seven-choice model. The experiment compares the simulated maximum likelihood estimator using the GHK recursive probability simulator, the method of simulated moments estimator using the GHK recursive simulator and kernel-smoothed frequency simulators, and posterior means computed using a Gibbs sampling-data augmentation algorithm. Each estimator is applied in nine different models, which have from 1 to 40 free parameters. The performance of all estimators is found to be satisfactory. However, the results indicate that the method of simulated moments estimator with the kernel-smoothed frequency simulator does not perform quite as well as the other three methods. Among those three, the Gibbs sampling-data augmentation algorithm appears to have a slight overall edge, with the relative performance of MSM and SML based on the GHK simulator difficult to determine.
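The kernel-smoothed frequency simulator compared in this experiment replaces the indicator that an alternative has maximal utility with a smooth kernel, so the simulated choice probabilities are differentiable in the parameters. A sketch using a logistic kernel; the utilities, covariance matrix, and bandwidth are illustrative assumptions:

```python
# Kernel-smoothed frequency simulator for multinomial probit choice probabilities.
import numpy as np

rng = np.random.default_rng(8)
V = np.array([0.5, 0.0, -0.3])                      # systematic utilities, 3 choices
L = np.linalg.cholesky(np.array([[1.0, 0.4, 0.2],
                                 [0.4, 1.0, 0.4],
                                 [0.2, 0.4, 1.0]]))
eps = rng.normal(size=(10_000, 3)) @ L.T            # correlated utility shocks
U = V + eps                                         # simulated total utilities

lam = 0.1                                           # smoothing bandwidth
scores = np.exp((U - U.max(axis=1, keepdims=True)) / lam)
probs = (scores / scores.sum(axis=1, keepdims=True)).mean(axis=0)
print(probs)                                        # smoothed choice probabilities
```

As the bandwidth shrinks toward zero the smoothed weights approach the crude frequency simulator's indicators, trading smoothness for bias.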