This paper presents a frequency-domain technique for estimating distributed lag coefficients (the impulse-response function) when observations are randomly missed. The technique treats stationary processes with randomly missed observations as amplitude-modulated processes and estimates the transfer function accordingly. Estimates of the lag coefficients are obtained by taking the inverse transform of the estimated transfer function. Results with artificially created data show that the technique performs well even when the probability of an observation being missed is one-half and in some cases when the probability is as low as one-fifth. The approximate asymptotic variance of the estimator is also calculated in the paper.
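The core idea in this abstract, treating a series with randomly missed observations as an amplitude-modulated process, correcting its spectra for the modulation, and inverting the estimated transfer function, can be sketched as follows. This is a minimal illustration of the general approach, not the paper's estimator: the lag coefficients, sample sizes, white-noise input, and segment-averaged periodograms are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, seg = 16384, 256
b = np.array([0.5, 0.3, 0.2])           # true distributed-lag coefficients (hypothetical)

x = rng.standard_normal(n)              # stationary (white) input series
y = np.convolve(x, b)[:n] + 0.1 * rng.standard_normal(n)

obs = rng.random(n) < 0.5               # each x_t observed with probability 1/2
xm = np.where(obs, x, 0.0)              # amplitude-modulated series a_t * x_t

def avg_spectra(u, v, seg=seg):
    """Average cross-periodograms over non-overlapping segments."""
    k = len(u) // seg
    acc = np.zeros(seg // 2 + 1, dtype=complex)
    for i in range(k):
        U = np.fft.rfft(u[i * seg:(i + 1) * seg])
        V = np.fft.rfft(v[i * seg:(i + 1) * seg])
        acc += np.conj(U) * V / seg
    return acc / k

p = obs.mean()                          # estimated observation probability
var_x = xm[obs].var()                   # input variance from observed points

# Autocovariance of a_t * x_t is p^2 * gamma(h) plus a p(1-p)*gamma(0) spike
# at h = 0, so the modulated spectrum gains a flat term we subtract out.
f_mm = avg_spectra(xm, xm).real
f_my = avg_spectra(xm, y)
f_xx = (f_mm - p * (1 - p) * var_x) / p**2   # corrected input spectrum
f_xy = f_my / p                              # corrected cross-spectrum

H = f_xy / f_xx                         # transfer-function estimate
b_hat = np.fft.irfft(H)[:len(b)]        # lag coefficients via inverse transform
```

With half the observations missing, `b_hat` comes out close to the true `(0.5, 0.3, 0.2)`, mirroring the abstract's claim that the method remains usable at a missing probability of one-half.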
The purpose of this paper is to begin a reevaluation of the Free Banking Era by developing and examining individual bank information on the population of banks which existed under the free banking laws in four states. This information allows us to determine the number of free banks which failed and to estimate the resulting losses to their note holders. While the new evidence suggests there were problems with free banking, it presents a serious challenge to the prevailing view that free banking led to financial chaos.
The purpose of this paper is to begin a reevaluation of the Free Banking Era by developing and examining individual bank information on the population of banks that existed under the free banking laws in four states. This information allows us to determine the number of free banks that failed and to estimate the resulting losses to their note holders. While the new evidence suggests there were problems with free banking, it presents a serious challenge to the prevailing view that free banking led to financial chaos.
In this paper we propose and test a new explanation of bank behavior during the Free Banking Era, 1837–63. Arguing against the view that free bank failures were due to fraud, we claim that they were caused by exposure to term structure risk. Testing this new explanation with a new and extensive body of data, we find strong support for it: periods of falling bond prices correspond to the periods with most of the free bank failures. The new data do not support the view that fraud caused the failures.
The claim that bad money drives out good is one of the oldest and most cited in economics. Economists refer to this claim as Gresham’s law. Yet despite its seemingly universal acceptance, this claim does not warrant its status as a law. We find it has no convincing explanations and many overlooked exceptions. We propose an alternative hypothesis based on the costs of using a medium of exchange at a nonpar price: small-denomination currency undervalued at the mint tends to disappear from circulation, while large-denomination currency usually circulates at a premium. Examining a variety of historical episodes when market and legal prices were different, we find our “law” can explain history much better than Gresham’s.
This paper analyzes the variability of output under money supply and exchange rate rules in an open economy in which the slope of the aggregate supply curve depends on the variances of aggregate demand and market-specific innovations. It demonstrates that rankings of the two rules derived under a constant aggregate supply slope are reversed when the slope depends on the variances of the innovations and those variances are sufficiently large.
This paper explains why the risky notes of banks established during the Free Banking Era (1837–63) were demanded even when relatively safe specie (gold and silver coin) was an alternative. Free bank notes were demanded because they were priced to reflect the expected value of their backing. The empirical evidence supports this explanation. Specifically, in New York, Wisconsin, and Indiana the expected value of backing was sufficient for free bank notes to circulate at par, which they did. In Minnesota the backing for notes was very poor: they exchanged well below par, being treated as small-denomination securities.
This paper shows that there can be equilibria in which exchange rates display randomness unrelated to fundamentals. This is demonstrated in the context of a two-currency, one-good model with three agent types and cash-in-advance constraints. A crucial feature is that the type i agents, for i = 1, 2, must satisfy a cash-in-advance constraint by holding currency i, while type 3 agents can satisfy it by holding either currency. It is shown that real allocations vary across the multiple equilibria if markets for hedging exchange risk do not exist and that the randomness is innocuous if complete markets exist.
In this paper we present a consistent estimator for a linear filter (distributed lag) when the independent variable is subject to observational error. Unlike the standard errors-in-variables estimator, which uses instrumental variables, our estimator works directly with the observed data. It is based on the Hilbert transform relationship between the phase and the log gain of a minimum phase-lag linear filter. The results of using our method to estimate a known filter and to estimate the relationship between consumption and income demonstrate that the method performs quite well even when the noise-to-signal ratio for the observed independent variable is large. We also develop a criterion for determining whether an estimated phase function is minimum phase-lag.
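The Hilbert transform relationship this abstract relies on, that a minimum phase-lag filter's phase is fully determined by its log gain, can be illustrated with the standard cepstral construction. This is a textbook sketch of the underlying relation, not the paper's estimator; the example filter and grid size are illustrative assumptions.

```python
import numpy as np

n = 1024
b = np.array([1.0, -0.5])       # minimum-phase FIR filter: zero at z = 0.5, inside the unit circle
H = np.fft.fft(b, n)            # frequency response on an n-point grid
log_gain = np.log(np.abs(H))

# Real cepstrum of the log gain (it is real and even in frequency)
c = np.fft.ifft(log_gain).real

# Folding the cepstrum onto nonnegative quefrencies implements the discrete
# Hilbert-transform relation between log gain and minimum-phase phase.
m = np.zeros(n)
m[0] = c[0]
m[n // 2] = c[n // 2]
m[1:n // 2] = 2.0 * c[1:n // 2]

H_min = np.exp(np.fft.fft(m))   # response reconstructed from the gain alone
phase_rec = np.angle(H_min)     # phase recovered without ever seeing it
phase_true = np.angle(H)
```

Because the example filter really is minimum phase-lag, `phase_rec` matches `phase_true` to numerical precision; for a non-minimum-phase filter the two would disagree, which is the logic behind the abstract's criterion for checking an estimated phase function.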