Search
Search Results

Creator: Supel, Thomas M. Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 000 Description: This paper was published with no issue number.
Keywords: Extreme value problem, Random variables, Truncated normal variate, and Probability models Subject: C10 - Econometric and Statistical Methods and Methodology: General
Creator: Hansen, Lars Peter and Jagannathan, Ravi Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 167 Abstract: In this paper we develop alternative ways to compare asset pricing models when it is understood that their implied stochastic discount factors do not price all portfolios correctly. Unlike comparisons based on chi-squared statistics associated with null hypotheses that models are correct, our measures of model performance do not reward variability of discount factor proxies. One of our measures is designed to exploit fully the implications of arbitrage-free pricing of derivative claims. We demonstrate empirically the usefulness of our methods in assessing some alternative stochastic factor models that have been proposed in the asset pricing literature.
Subject: C13 - Estimation: General, E30 - Prices, Business Fluctuations, and Cycles: General (includes Measurement and Data), C12 - Hypothesis Testing: General, C10 - Econometric and Statistical Methods and Methodology: General, G10 - General Financial Markets: General (includes Measurement and Data), and G12 - Asset Pricing; Trading Volume; Bond Interest Rates
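The variability-insensitive model-comparison measure described in this abstract is what became known as the Hansen-Jagannathan distance: the least-squares gap between a candidate discount factor proxy and the set of discount factors that correctly price a given set of payoffs. A minimal NumPy sketch (the function name, array layout, and default price vector are assumptions of mine, not the paper's notation):

```python
import numpy as np

def hj_distance(R, y, q=None):
    """Sample Hansen-Jagannathan distance between an SDF proxy y and the
    set of admissible discount factors pricing the payoffs in R.

    R : (T, n) array of asset payoffs (e.g. gross returns)
    y : (T,) array, the candidate stochastic discount factor proxy
    q : (n,) vector of payoff prices (ones for gross returns)
    """
    T, n = R.shape
    if q is None:
        q = np.ones(n)           # gross returns are priced at one
    e = R.T @ y / T - q          # sample pricing errors E[R y] - q
    G = R.T @ R / T              # second-moment matrix E[R R']
    return np.sqrt(e @ np.linalg.solve(G, e))
```

A proxy with zero pricing errors has distance zero; the distance grows with the size of the pricing errors, weighted by the inverse second-moment matrix of the payoffs.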
Creator: Bryant, John B. Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 168 Abstract: A simple model of backed money without a store of value function is presented, discussed, and defended. The function of money in the model is to replace complex contingent contracts traded on a centralized exchange with simple trades in decentralized markets.
Keywords: Fiat money, Commodity money, and Contracts Subject: C10 - Econometric and Statistical Methods and Methodology: General and E40 - Money and Interest Rates: General
Creator: Todd, Richard M. Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 207 Keywords: Time-invariant system, Time-varying system, and Convergence theorem Subject: C10 - Econometric and Statistical Methods and Methodology: General

Creator: Uhlig, Harald, 1961- Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 342 Abstract: If [0,1] is a measure space of agents and (X_i) a collection of pairwise uncorrelated random variables with common finite mean μ and variance σ², one would like to establish a law of large numbers (∗): ∫[0,1] X_i dλ = μ. In this paper we propose to interpret (∗) as a Pettis integral. Using the corresponding Riemann-type version of this integral, we establish (∗) and interpret it as an L2 law of large numbers. Intuitively, the main idea is to integrate before drawing an ω, thus avoiding well-known measurability problems. We discuss distributional properties of i.i.d. random shocks across the population. We give examples of the economic interpretability of our definition. Finally, we establish a vector-valued version of the law of large numbers for economies.
Keywords: Khinchine's law of large numbers, Pettis integral, L2 law of large numbers, Riemann integral, Large numbers, and Random variable Subject: C10 - Econometric and Statistical Methods and Methodology: General
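The L2 law of large numbers in this abstract can be illustrated with a finite population: the mean-squared deviation of the cross-sectional average of pairwise uncorrelated shocks from the common mean shrinks like σ²/n as the population grows. A minimal NumPy sketch (the function name and simulation parameters are my own, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.0

def cross_sectional_mse(n, draws=2000):
    """Mean-squared deviation of the average of n i.i.d. shocks
    (mean mu, s.d. sigma) from mu, estimated over many realizations."""
    X = rng.normal(mu, sigma, size=(draws, n))
    return np.mean((X.mean(axis=1) - mu) ** 2)

# The L2 error falls roughly like sigma**2 / n as the population grows:
# cross_sectional_mse(10) is about 0.1, cross_sectional_mse(1000) about 0.001.
```

This is the finite-n counterpart of the statement that integrating across a continuum of agents removes idiosyncratic risk exactly.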
Creator: Prescott, Edward C. Series: Staff report (Federal Reserve Bank of Minneapolis. Research Department) Number: 527 Abstract: This essay reviews the development of neoclassical growth theory, a unified theory of aggregate economic phenomena that was first used to study business cycles and aggregate labor supply. Subsequently, the theory has been used to understand asset pricing, growth miracles and disasters, monetary economics, capital accounts, aggregate public finance, economic development, and foreign direct investment.
The focus of this essay is on real business cycle (RBC) methodology. Those who employ the discipline behind the methodology to address various quantitative questions come up with essentially the same answer—evidence that the theory has a life of its own, directing researchers to essentially the same conclusions when they apply its discipline. Deviations from the theory sometimes arise and remain open for a considerable period before they are resolved by better measurement and extensions of the theory. Elements of the discipline include selecting a model economy or sometimes a set of model economies. The model used to address a specific question or issue must have a consistent set of national accounts with all the accounting identities holding. In addition, the model assumptions must be consistent across applications and be consistent with micro as well as aggregate observations. Reality is complex, and any model economy used is necessarily an abstraction and therefore false. This does not mean, however, that model economies are not useful in drawing scientific inference.
The vast number of contributions made by many researchers who have used this methodology precludes reviewing them all in this essay. Instead, the contributions reviewed here are ones that illustrate methodological points or extend the applicability of neoclassical growth theory. Of particular interest will be important developments subsequent to the Cooley (1995) volume, Frontiers of Business Cycle Research. The interaction between theory and measurement is emphasized because this is the way in which hard quantitative sciences progress.
Keywords: Aggregate financial economics, Development, Business cycle fluctuations, Prosperities, RBC methodology, Neoclassical growth theory, Depressions, Aggregate economic theory, and Aggregation Subject: B40 - Economic Methodology: General, E60 - Macroeconomic Policy, Macroeconomic Aspects of Public Finance, and General Outlook: General, E32 - Business Fluctuations; Cycles, E13 - General Aggregative Models: Neoclassical, C10 - Econometric and Statistical Methods and Methodology: General, and E00 - Macroeconomics and Monetary Economics: General
Creator: Turdaliev, Nurlan Series: Working paper (Federal Reserve Bank of Minneapolis. Research Department) Number: 596 Abstract: In a repeated game of incomplete information, myopic players form beliefs on next-period play and choose strategies to maximize next-period payoffs. Beliefs are treated as forecasts of future play. Forecast accuracy is assessed using calibration tests, which measure the asymptotic accuracy of beliefs against some realizations. Beliefs are calibrated if they pass all calibration tests. For a positive Lebesgue measure of payoff vectors, beliefs are not calibrated. But if the payoff vector and calibration test are drawn from a suitable product measure, beliefs pass the calibration test almost surely.
Subject: C10 - Econometric and Statistical Methods and Methodology: General, C72 - Noncooperative Games, and C70 - Game Theory and Bargaining Theory: General
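The calibration tests described in the abstract measure the asymptotic accuracy of forecasts against realizations. A toy version checks, on the dates where a given probability was announced, whether the event's realized frequency matches the announcement. A minimal sketch (the function name and tolerance band are hypothetical, not the paper's construction):

```python
import numpy as np

def calibration_error(forecasts, outcomes, p, tol=0.05):
    """Realized frequency of the event on the dates where the forecast
    was within tol of p, minus p itself.  A calibrated forecaster
    drives this difference to zero as the number of dates grows."""
    forecasts = np.asarray(forecasts, dtype=float)
    outcomes = np.asarray(outcomes)
    mask = np.abs(forecasts - p) <= tol
    if not mask.any():
        return 0.0               # the test is vacuous if p is never forecast
    return outcomes[mask].mean() - p

# A forecaster who always announces the true probability is calibrated:
rng = np.random.default_rng(1)
outcomes = rng.random(10_000) < 0.3
err = calibration_error(np.full(10_000, 0.3), outcomes, p=0.3)
```

Passing "all calibration tests" means this error vanishes for every probability level p (and, more generally, for every admissible selection rule on the dates).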
Creator: Fernandez-Villaverde, Jesus and Rubio-Ramírez, Juan Francisco Series: Joint committee on business and financial analysis Abstract: This paper presents a method to perform likelihood-based inference in nonlinear dynamic equilibrium economies. This type of model has become a standard tool in quantitative economics. However, the existing literature has so far been forced to use moment procedures or linearization techniques to estimate these models. This situation is unsatisfactory: moment procedures suffer from strong small-sample biases, and linearization depends crucially on the shape of the true policy functions, possibly leading to erroneous answers. We propose the use of Sequential Monte Carlo methods to evaluate the likelihood function implied by the model. Then we can perform likelihood-based inference, either searching for a maximum (Quasi-Maximum Likelihood Estimation) or simulating the posterior using a Markov Chain Monte Carlo algorithm (Bayesian Estimation). We can also compare different models even if they are non-nested and misspecified. To perform classical model selection, we follow Vuong (1989) and use the Kullback-Leibler distance to build Likelihood Ratio Tests. To perform Bayesian model comparison, we build Bayes factors. As an application, we estimate the stochastic neoclassical growth model.
Keywords: Sequential Monte Carlo methods, Nonlinear filtering, Dynamic equilibrium economies, and Likelihood-based inference Subject: C11 - Bayesian Analysis: General, C10 - Econometric and Statistical Methods and Methodology: General, C13 - Estimation: General, and C15 - Statistical Simulation Methods: General
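The Sequential Monte Carlo evaluation of the likelihood described in the abstract can be sketched with a bootstrap particle filter on a toy nonlinear state-space model (the model, parameter values, and function name here are illustrative, not the paper's growth-model application):

```python
import numpy as np

def particle_loglik(y, n_particles=1000, phi=0.9, sw=1.0, sv=1.0, seed=0):
    """Bootstrap particle filter estimate of the log-likelihood of the
    toy nonlinear state-space model
        x_t = phi * x_{t-1} + w_t,   w_t ~ N(0, sw^2)
        y_t = x_t**2 / 20 + v_t,     v_t ~ N(0, sv^2)
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sw, n_particles)                 # initial particles
    loglik = 0.0
    for yt in y:
        x = phi * x + rng.normal(0.0, sw, n_particles)   # propagate states
        w = np.exp(-0.5 * ((yt - x**2 / 20) / sv) ** 2)  # likelihood weights
        loglik += np.log(w.mean() / (sv * np.sqrt(2 * np.pi)))
        w /= w.sum()
        x = rng.choice(x, size=n_particles, p=w)         # resample
    return loglik
```

The returned log-likelihood can then be fed into a maximizer (Quasi-Maximum Likelihood) or a Markov Chain Monte Carlo sampler (Bayesian estimation), exactly the two uses the abstract names.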