Andrew Caplin - Academia.edu
Papers by Andrew Caplin
RePEc: Research Papers in Economics, Feb 1, 1990
RePEc: Research Papers in Economics, Feb 1, 1990
Social Science Research Network, 2020
for valuable feedback. All mistakes are our own. We thank the Alfred P. Sloan and NOMIS Foundations for support. The views expressed herein are those of the authors and do not necessarily reflect the views of the National Bureau of Economic Research. NBER working papers are circulated for discussion and comment purposes. They have not been peer-reviewed or been subject to the review by the NBER Board of Directors that accompanies official NBER publications.
A model of endogenous price adjustment under money growth is presented. Firms follow (s, S) pricing policies and price revisions are imperfectly synchronized. In the aggregate, price stickiness disappears and money is neutral. The connection between firm price adjustment and relative price variability in the presence of monetary growth is also investigated. The results contrast with those obtained in models with exogenous fixed timing of price adjustment.
arXiv (Cornell University), May 9, 2022
A much studied issue is the extent to which the confidence scores provided by machine learning algorithms are calibrated to ground truth probabilities. Our starting point is that calibration is seemingly incompatible with class weighting, a technique often employed when one class is less common (class imbalance) or with the hope of achieving some external objective (cost-sensitive learning). We provide a model-based explanation for this incompatibility and use our anthropomorphic model to generate a simple method of recovering likelihoods from an algorithm that is miscalibrated due to class weighting. We validate this approach in the binary pneumonia detection task of Rajpurkar, Irvin, Zhu, et al. (2017).
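The recovery step lends itself to a short illustration. Under a standard Bayes-odds argument, up-weighting the positive class by a factor w during training scales the model's posterior odds by w, so a weighted score q can be mapped back to an approximate probability p = q / (q + w(1 - q)). The sketch below (plain Python/NumPy; the function name and the 5x weight are hypothetical, and the paper's own estimator may differ) applies that mapping.

```python
import numpy as np

def deweight_scores(q: np.ndarray, pos_weight: float) -> np.ndarray:
    """Map scores from a classifier trained with the positive class up-weighted
    by `pos_weight` back to (approximately) calibrated probabilities.

    Standard odds argument (a sketch, not necessarily the paper's estimator):
    reweighting the positive class by w scales the posterior odds by w, so
        q / (1 - q) = w * p / (1 - p)   =>   p = q / (q + w * (1 - q)).
    """
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return q / (q + pos_weight * (1.0 - q))

# Example: scores from a model trained with a 5x positive-class weight
# correspond to much smaller underlying probabilities of disease.
print(deweight_scores(np.array([0.2, 0.5, 0.9]), pos_weight=5.0))
```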
SSRN Electronic Journal
Despite its centrality in monetary policy, communication is not a focus in social security reform. We investigate the potential for active communication to dissipate apparently widespread public confusion about the future of social security. We implement a simple information treatment in which we randomly provide survey respondents access to the longevity-based eligibility age implemented by a reform that Denmark launched in 2006. Absent treatment, younger workers not only have biased beliefs, expecting to become eligible for social security earlier than policy makers intend, but also are highly uncertain about the eligibility age. The information treatment eliminates the bias, suggesting it results from misunderstanding. Yet it has no influence on uncertainty, suggesting this is driven by unavoidable demographic and political uncertainties. Our results highlight the value of communication strategies and belief measurement as policy instruments outside the monetary policy arena.
SSRN Electronic Journal
How worker productivity evolves with tenure and experience is central to economics, shaping, for example, life-cycle earnings and the losses from involuntary job separation. Yet, worker-level productivity is hard to identify from observational data. This paper introduces direct measurement of worker productivity in a firm survey designed to separate the role of on-the-job tenure from total experience in determining productivity growth. A key innovation is to elicit what managers know about the productivity of their workers. Several findings emerge concerning the initial period on the job. (1) On-the-job productivity growth exceeds wage growth, consistent with wages not being allocative period-by-period. (2) Previous experience is a substitute, but a far less than perfect one, for on-the-job tenure. (3) There is substantial heterogeneity across jobs in the extent to which previous experience substitutes for tenure. The survey makes use of administrative data to construct a representative sample of firms, check for selective nonresponse, validate survey measures with administrative measures, and calibrate parameters not measured in the survey.
This paper examines the effects of inter-generational altruism on late-in-life wealth accumulation. We designed and fielded a new survey to better measure transfers from parents to descendants as part of the Vanguard Research Initiative. New survey features include a carefully designed family inventory, a breakdown of transfers into four categories, and a path of past and expected future transfers three years before and after. We also asked Strategic Survey Questions (SSQs) to identify preference parameters related to the desire to insure family risks via transfers. We use these new transfer measurements and SSQs to study the dynamics of parent-to-child giving by estimating a life-cycle consumption-savings model. Agents in our model save for consumption smoothing, uncertain medical and long-term care needs, inter-vivos transfers to cover uncertain family needs, and bequests. We find that there is a large and uncertain family need risk, and parents save in order to help when their d...
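A stylized feel for this kind of model can be conveyed with a toy backward-induction exercise: an agent with CRRA utility saves against an uncertain family-need shock and values both the transfer it enables and the terminal bequest. Everything below is a hypothetical Python sketch (all parameter values, functional forms, and the assumption that the need is always met in full are illustrative choices, not the paper's specification).

```python
import numpy as np

# Toy late-in-life savings problem with an uncertain family-need shock.
T, beta, R, gamma = 20, 0.97, 1.02, 3.0     # horizon, discounting, gross return, CRRA
theta_fam = 1.5                              # weight on helping family / bequests
p_need, need_size = 0.1, 5.0                 # chance and size of a family need
grid = np.linspace(0.1, 50.0, 200)           # wealth grid

def u(c):
    return c ** (1 - gamma) / (1 - gamma)

V = theta_fam * u(grid)                      # terminal value: warm-glow bequest motive
for t in range(T - 1, -1, -1):
    V_new = np.empty_like(V)
    for i, w in enumerate(grid):
        best = -np.inf
        for c in np.linspace(0.05, w, 60):   # candidate consumption choices
            a = w - c                        # savings carried into next period
            # Next-period wealth with and without the family-need transfer
            # (the need is assumed to be met in full, a simplification).
            w_ok = np.clip(R * a, grid[0], grid[-1])
            w_need = np.clip(R * a - need_size, grid[0], grid[-1])
            ev = (1 - p_need) * np.interp(w_ok, grid, V) + \
                 p_need * (np.interp(w_need, grid, V) + theta_fam * u(need_size))
            best = max(best, u(c) + beta * ev)
        V_new[i] = best
    V = V_new

print("value at median wealth:", np.interp(25.0, grid, V))
```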
Consumers often face an overwhelming amount of information when deciding between products, and one of the primary policymaking tools available to improve how well informed they are is the framing of this information. We introduce a general theoretical approach that characterizes when one frame is revealed to provide robustly higher welfare than another. Because it is testable, adaptable, and both necessary and sufficient, our condition determines both whether frames are robustly welfare ranked in a particular data set and the overall proportion of data sets in which frames can be so ranked.
Handbook of Experimental Economic Methodology, 2015
We outline experiments that improve our understanding of decision making by analyzing behavior in the period of contemplation that precedes commitment to a final choice. The experiments are based on axiomatic models of the decision making process that relate closely to revealed preference logic. To test the models, we artificially incentivize particular choices to be made in the pre-decision period. We show how the resulting experiments can improve our understanding not only of the decision making process, but of the decision itself. Our broad method is to make aspects of search visible while retaining the disciplined approach to data that axiomatic modeling best provides.
American Economic Review, 2011
Many everyday decisions are made without full examination of all available options, and, as a result, the best available option may be missed. We develop a search-theoretic choice experiment to study the impact of incomplete consideration on the quality of choices. We find that many decisions can be understood using the satisficing model of Herbert Simon (1955): most subjects search sequentially, stopping when a “satisficing” level of reservation utility is realized. We find that reservation utilities and search order respond systematically to changes in the decision making environment. (JEL D03, D12, D83)
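The stopping rule described here is easy to state in code: inspect options in some order and stop at the first whose utility clears a reservation level. The following is a minimal Python sketch of that behavior (the function, the uniform payoffs, and the reservation level of 80 are all hypothetical; it is not the authors' estimation procedure).

```python
import random

def satisficing_search(options, reservation_utility, utility=lambda x: x):
    """Sequential search with a satisficing stopping rule (Simon 1955).

    Options are inspected in the given order; search stops at the first option
    whose utility meets the reservation level, otherwise the best inspected
    option is taken.
    """
    best, searched = None, 0
    for x in options:
        searched += 1
        if best is None or utility(x) > utility(best):
            best = x
        if utility(x) >= reservation_utility:
            break
    return best, searched

random.seed(1)
opts = [random.randint(0, 100) for _ in range(20)]
choice, n = satisficing_search(opts, reservation_utility=80)
print(f"chose {choice} after inspecting {n} of {len(opts)} options")
```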
The Quarterly Journal of Economics, 1987
A model of endogenous price adjustment under money growth is presented. Firms follow (s, S) pricing policies and price revisions are imperfectly synchronized. In the aggregate, price stickiness disappears and money is neutral. The connection between firm price adjustment and relative price variability in the presence of monetary growth is also investigated. The results contrast with those obtained in models with exogenous fixed timing of price adjustment.
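The aggregation argument behind the neutrality result can be illustrated with a small simulation: each firm lets its price erode relative to the money stock until it hits the lower trigger s, then adjusts upward; if the cross-section of relative prices starts out uniformly spread over the band, staggered adjustment keeps it uniform, so the price level tracks money one for one. The Python sketch below is a hypothetical illustration of that mechanism (band width, money-growth process, and the wrap-around reset are illustrative choices).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: N firms, band width S - s = 1 in log terms.
N, T = 10_000, 200
s, S = 0.0, 1.0                       # (s, S) band for log(price) - log(money)
dev = rng.uniform(s, S, N)            # initial relative prices, spread uniformly

avg_dev = []
for t in range(T):
    dm = rng.uniform(0.0, 0.05)       # positive money-growth shock (< band width)
    dev -= dm                         # money growth erodes each firm's relative price
    dev[dev < s] += (S - s)           # firms at the trigger adjust; wrapping by the
                                      # band width preserves the uniform cross-section
    avg_dev.append(dev.mean())

# With staggered adjustment the average relative price stays (roughly) constant,
# so the aggregate price level moves one for one with money: neutrality.
print(f"mean relative price: {np.mean(avg_dev):.3f}  (band midpoint = {(s + S) / 2:.3f})")
print(f"std of mean relative price over time: {np.std(avg_dev):.4f}")
```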
American Economic Review, 2007
Cognitive economics studies imperfect information and decision-making mistakes. A central scientific challenge is that these can't be identified in standard choice data. Overcoming this challenge calls for data engineering, in which new data forms are introduced to separately identify preferences, beliefs, and other model constructs. I present applications to traditional areas of economic research, such as wealth accumulation, earnings, and consumer spending. I also present less traditional applications to the assessment of decision-making skills and to human-AI interactions. Methods apply both to individual and to collective decisions. I make the case for broader application of data engineering beyond cognitive economics. It allows symbiotic advances in modeling and measurement. It cuts across existing boundaries between disciplines and styles of research.
SSRN Electronic Journal, 2008
Economists do not have reliable measures of current house values, let alone housing returns. This ignorance underlies the illiquidity of mortgage-backed securities, which in turn feeds back to deepen the sub-prime crisis. Using a massive new data tape of housing transactions in L.A., we demonstrate systematic patterns in the error associated with using the ubiquitous repeat sales methodology to understand house values. In all periods, the resulting indices under-predict sales prices of less expensive homes, and over-predict prices of more expensive homes. The recent period has produced errors that are not only unprecedentedly large in absolute value, but highly systematic: after a few years in which the indices under-predicted prices, they now significantly over-predict them. We introduce new machine learning techniques from computer science to correct for prediction errors that have geographic origins. The results are striking. Accounting for geography significantly reduces the extent of the prediction error, removes many of the systematic patterns, and results in far less deterioration in model performance in the recent period.
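The basic idea of correcting an index-based prediction with geographically local information can be sketched in a few lines: predict each home's second-sale price by scaling its first-sale price with a citywide growth factor, then adjust that prediction by the average prediction error of nearby homes. The Python example below is a hypothetical, self-contained illustration on synthetic data (the k-nearest-neighbor residual correction is a stand-in for the paper's machine learning techniques, and all parameters are made up).

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic panel: each home has a location, a neighborhood appreciation effect,
# and a sale in period 0 and in period 1. All numbers are hypothetical.
n = 2_000
loc = rng.uniform(0, 10, size=(n, 2))                       # (x, y) coordinates
hood = 0.3 * np.sin(loc[:, 0]) + 0.2 * np.cos(loc[:, 1])    # spatial appreciation differences
log_p0 = rng.normal(12.5, 0.5, n)                            # log price at first sale
log_p1 = log_p0 + 0.10 + hood + rng.normal(0, 0.05, n)       # 10% citywide growth + local effects

# 1. Index-style prediction: one citywide growth factor for the period.
citywide_growth = np.mean(log_p1 - log_p0)
pred_index = log_p0 + citywide_growth
resid = log_p1 - pred_index                                  # errors with geographic structure

# 2. Geographic correction: shift each prediction by the mean residual of its
#    k nearest neighbors (a simple stand-in for the paper's ML correction).
def knn_residual_correction(k=25):
    corrected = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(loc - loc[i], axis=1)
        d[i] = np.inf                                        # exclude the home itself
        nearest = np.argpartition(d, k)[:k]
        corrected[i] = pred_index[i] + resid[nearest].mean()
    return corrected

pred_geo = knn_residual_correction()
print(f"RMSE, citywide index only : {np.sqrt(np.mean((log_p1 - pred_index) ** 2)):.4f}")
print(f"RMSE, with geo correction : {np.sqrt(np.mean((log_p1 - pred_geo) ** 2)):.4f}")
```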
The American Economic Review, Mar 4, 2004
Journal of Political Economy, 2021
We take the perspective of an econometrician who wants to determine which of two experiments provides higher expected utility but only knows the decisions under each experiment. To compare these decisions, the econometrician must make inferences about what the experiment might have been for each set of decisions. We provide a necessary and sufficient condition that identifies when every experiment consistent with one set of decisions has a higher value of information than every experiment consistent with the other set of decisions.
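The object being ranked is the ex-ante value of an experiment: the expected utility of acting optimally on each signal. For a known experiment this is a short computation, as in the hypothetical Python sketch below (the binary states, payoffs, and signal structures are illustrative; the paper's contribution is ranking experiments when only the induced decisions, not the experiments, are observed).

```python
import numpy as np

def value_of_experiment(prior, likelihood, utility):
    """Expected utility from choosing optimally after observing an experiment.

    prior:      (S,) probabilities over states
    likelihood: (S, M) array, P(signal m | state s), i.e. the experiment
    utility:    (A, S) payoff of action a in state s
    """
    joint = prior[:, None] * likelihood      # P(state, signal), shape (S, M)
    exp_payoff = utility @ joint             # (A, M): sum_s u(a, s) * P(s, m)
    return exp_payoff.max(axis=0).sum()      # pick the best action per signal

prior = np.array([0.5, 0.5])
utility = np.array([[1.0, 0.0],              # action 0 pays off in state 0
                    [0.0, 1.0]])             # action 1 pays off in state 1
perfect = np.eye(2)                          # fully informative experiment
noisy = np.array([[0.7, 0.3],
                  [0.3, 0.7]])               # partially informative experiment
print(value_of_experiment(prior, perfect, utility))  # 1.0
print(value_of_experiment(prior, noisy, utility))    # 0.7
```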