Zenodo (CERN European Organization for Nuclear Research), Feb 28, 2022
To determine the welfare implications of price changes in demand data, we introduce a revealed preference relation over prices. We show that the absence of cycles in this relation characterizes a consumer who trades off the utility of consumption against the disutility of expenditure. Our model can be applied whenever a consumer's demand over a strict subset of all available goods is being analyzed; it can also be extended to settings with discrete goods and nonlinear prices. To illustrate its use, we apply our model to a single-agent data set and to a data set with repeated cross-sections. In the latter application, we develop a novel test of linear hypotheses on partially identified parameters to estimate the proportion of the population revealed better off by a price change. This new technique can be used for nonparametric counterfactual analysis more broadly.
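A minimal sketch of how an acyclicity condition of this kind can be checked on finite data (illustrative code, not the authors' implementation; the revealing condition p^s · q^t >= p^t · q^t used below is one natural reading of a revealed preference over prices and should be treated as an assumption): build the direct weak and strict relations between observations, take the transitive closure, and look for a cycle that contains a strict comparison.

```python
import numpy as np

def price_preference_cycle(prices, quantities):
    """Look for a revealed-price-preference cycle containing a strict comparison.

    prices, quantities: arrays of shape (T, K), one row per observation.
    Observation t is taken to weakly reveal p^t preferred to p^s when
    p^s . q^t >= p^t . q^t, i.e. the bundle bought at t would cost at least
    as much at prices p^s (an illustrative reading of the relation).
    """
    P = np.asarray(prices, dtype=float)
    Q = np.asarray(quantities, dtype=float)
    T = P.shape[0]
    QP = Q @ P.T                        # QP[t, s] = p^s . q^t
    e = QP - np.diag(QP)[:, None]       # e[t, s] = p^s.q^t - p^t.q^t
    weak, strict = e >= 0, e > 0        # direct weak / strict relations
    reach = weak.copy()                 # transitive closure (Floyd-Warshall)
    for k in range(T):
        reach |= reach[:, [k]] & reach[[k], :]
    # A violating cycle exists iff some i reaches j weakly while j is
    # directly strictly preferred to i.
    return bool((reach & strict.T).any())

# toy example: each observation strictly reveals its own prices preferred
# to the other's, producing a two-cycle
prices = [[1.0, 2.0], [2.0, 1.0]]
quantities = [[2.0, 1.0], [1.0, 2.0]]
print(price_preference_cycle(prices, quantities))   # True
```

The closure-plus-strict-edge pattern is the same device commonly used to test GARP; only the defining inequality differs.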
We use unique data from seven intermediate economics courses taught at four R1 institutions to examine the effects of the COVID-19 pandemic on student learning. Because the same assessments of course knowledge mastery were administered across semesters, we can cleanly infer the impact of the unanticipated switch to remote teaching in Spring 2020. During the pandemic, total assessment scores declined by 0.2 standard deviations on average. However, we find substantial heterogeneity in learning outcomes across courses. Course instructors were surveyed about their pedagogy practices, and our analysis suggests that prior online teaching experience and teaching methods that encouraged active engagement, such as the use of small group activities and projects, played an important role in mitigating this negative effect. In contrast, we find that student characteristics, including gender, race, and first-generation status, had no significant association with the decline in student performance in the pandemic semester.
This code is shared under an MIT license. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. We present the calibrated-projection MATLAB package implementing the method to construct confidence intervals proposed by Kaido, Molinari, and Stoye (2019). This manual provides details on how to use the package for inference on projections of partially identified parameters and instructions on how to replicate the empirical application and simulation results in the paper. The version of the code included in this ZIP file is the one used to carry out the empirical application in Section 4 of Kaido et al. (2019) and the Monte Carlo simulations in Appendix C. Please visit https://molinari.economics.cornell.edu/programs.html for the most up-to-date version of the code. We gratefully acknowledge financial support through NSF Grants SES-1230071 and SES-1824344 (Kaido), SES-0922330 and SES-1824375 (Molinari), and SES-1260980 and SES-1824375 (Stoye).
Hoderlein, Stefan, and Joerg Stoye (2014), "Revealed Preferences in a Heterogeneous Population," Review of Economics and Statistics 96(2), 197-213.
With the aim of determining the welfare implications of price changes in consumption data, we introduce a revealed preference relation over prices. We show that an absence of cycles in this preference relation characterizes a model of demand in which consumers trade off the utility of consumption against the disutility of expenditure. This model is appropriate whenever a consumer's demand over a strict subset of all available goods is being analyzed. For the random utility extension of the model, we devise nonparametric statistical procedures for testing and welfare comparisons. The latter requires the development of novel tests of linear hypotheses for partially identified parameters. In doing so, we provide new algorithms for calculation and statistical inference in nonparametric counterfactual analysis for a general partially identified model. Our applications to national household expenditure data provide support for the model and yield informative bounds concerning welfare rankings across different prices. This in turn raises a natural question: if a consumer's observed demand behavior obeys GAPP (the generalized axiom of price preference), what can we say about her decision-making procedure?
This paper provides axiomatic foundations for maximin criteria in statistical decision theory. Specifically, consider a decision maker who faces a number of possible models of the world. Every model generates objective probabilities, but no probabilities of models are given. This is the classic problem captured by Wald's (1950) device of risk functions. I characterize a number of decision rules including Bayesianism (as a backdrop), maximin utility, the Hurwicz criterion, and especially several variations of minimax regret. The main contributions are the unified axiomatization of these rules in a framework tailored to statistical decision making, an axiomatic system that relaxes transitivity as well as menu-independence of preferences, and the introduction of new, regret-based decision criteria. Interestingly, the axiom that picks regret-based rules over maximin utility is independence.
This paper applies recently developed methods to robust assessment of treatment outcomes and robust treatment choice based on nonexperimental data. The substantive question is whether young offenders should be assigned to residential or nonresidential treatment in order to prevent subsequent recidivism. A large data set on past offenders exists, but treatment assignment was by judges and not by experimenters; hence counterfactual outcomes are not identified unless one imposes strong assumptions. The analysis is carried out in two steps. First, I show how to compute identified bounds on expected outcomes under various assumptions that are too weak to restore conventional identification but may be accordingly credible. The bounds are estimated, and confidence regions that take current theoretical developments into account are computed. I then ask which treatment to assign to future offenders if the identity of the best treatment will not be learned from the data. This is a decision problem under ambiguity. I characterize and compute decision rules that are asymptotically efficient under the minimax regret criterion. The substantive conclusion is that both bounds and recommended decisions vary significantly across the assumptions. The data alone do not permit conclusions or decisions that are globally robust in the sense of holding uniformly over reasonable assumptions.
This paper unifies and extends the recent axiomatic literature on minimax regret. It compares several models of minimax regret, shows how to characterize the corresponding choice correspondences in a unified setting, extends one of them to choice from sets that are convex (through randomization), and connects them by defining a behavioral notion of perceived ambiguity. Substantively, a main idea is to behaviorally identify ambiguity with failures of independence of irrelevant alternatives. Regarding proof technique, the core contribution is to uncover a dualism between choice correspondences and preferences in an environment where this dualism is not obvious. This insight can be used to generate results by importing findings from the existing literature on preference orderings.
This paper applies the minimax regret criterion to choice between two treatments conditional on observation of a finite sample. The analysis is based on exact small-sample regret and does not use asymptotic approximations or finite-sample bounds. Core results are: (i) Minimax regret treatment rules are well approximated by empirical success rules in many cases, but differ from them significantly, both in how the rules look and in the maximal regret incurred, for small sample sizes and certain sample designs. (ii) Absent prior cross-covariate restrictions on treatment outcomes, they prescribe inference that is completely separate across covariates, leading to no-data rules as the support of a covariate grows. I conclude by offering an assessment of these results.
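To make "exact small-sample regret" concrete, here is a hypothetical numerical illustration (not the paper's code) for the simplest case of binary outcomes, no covariates, and n observations per treatment arm: the regret of the empirical success rule at a state (p0, p1) is the gap in success probabilities times the exact, binomially computed probability that the rule selects the inferior treatment.

```python
import numpy as np
from scipy.stats import binom

def regret_empirical_success(p0, p1, n):
    """Exact regret of the empirical success rule at state (p0, p1), with n
    observations per arm and binary outcomes. The rule assigns treatment 1
    when arm 1 has the higher sample success rate, breaking ties with a coin.
    """
    s = np.arange(n + 1)
    joint = np.outer(binom.pmf(s, n, p0), binom.pmf(s, n, p1))  # P(S0=i, S1=j)
    p_pick_1 = np.triu(joint, 1).sum() + 0.5 * np.trace(joint)  # P(S1>S0) + 0.5*P(tie)
    if p1 >= p0:
        return (p1 - p0) * (1.0 - p_pick_1)   # loss from wrongly assigning arm 0
    return (p0 - p1) * p_pick_1               # loss from wrongly assigning arm 1

# maximal regret of the rule over a grid of states, e.g. with n = 10 per arm
grid = np.linspace(0.0, 1.0, 101)
print(max(regret_empirical_success(a, b, 10) for a in grid for b in grid))
```

Minimax regret rules would instead minimize the maximal regret over all rules; the grid search above only evaluates the empirical success rule and is meant purely to illustrate the exact finite-sample calculation.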
This paper explores the empirical content of the weak axiom of revealed preference (WARP) for repeated cross-sectional data or for panel data where individuals experience preference shocks. Specifically, in a heterogeneous population, think of the fraction of consumers violating WARP as the parameter of interest. This parameter depends on the joint distribution of choices over different budget sets. Repeated cross-sections do not reveal this distribution but only its marginals. Thus, the parameter is not point identified but can be bounded. We frame this as a copula problem and use copula techniques to analyze it. The bounds, as well as some nonparametric refinements of them, correspond to intuitive behavioral assumptions in the two-goods case. With three or more goods, these intuitions break down, and plausible assumptions can have counterintuitive implications. Inference on the bounds is an application of partial identification through moment inequalities. We implement our analysis...
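As a concrete illustration of the copula logic (a sketch, not the paper's estimator): with two budgets, a WARP violation occurs when the choice on the first budget lies in a particular region A and the choice on the second budget lies in a particular region B; repeated cross-sections identify only the marginal probabilities of A and B, so the joint probability, i.e. the fraction of violators, is bounded by the Frechet-Hoeffding inequalities. The function below simply evaluates those bounds; the region labels are illustrative.

```python
def warp_violation_bounds(p_a, p_b):
    """Frechet-Hoeffding bounds on P(A and B) given only the marginals.

    p_a: probability that the choice on budget 1 falls in region A.
    p_b: probability that the choice on budget 2 falls in region B.
    The joint event A-and-B (the WARP violation) is not identified from
    repeated cross-sections, so only these bounds are available.
    """
    lower = max(0.0, p_a + p_b - 1.0)
    upper = min(p_a, p_b)
    return lower, upper

# e.g. 30% choose in region A on budget 1 and 40% in region B on budget 2
print(warp_violation_bounds(0.3, 0.4))   # (0.0, 0.3)
```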
This paper analyzes partial identification of parameters that measure a distribution's spread, e.g., variance, Gini ratio, entropy, or interquartile range. The core result is a set of tight bounds on such parameters for cases where a distribution is known to be a mixture of a known distribution and one whose c.d.f. can be bounded. More specifically, I derive joint identification regions for the mean of an incompletely observable random variable and parameters that increase in mean-preserving spreads (e.g., variance, entropy, or Gini ratio), as well as joint bounds on quantiles and quantile contrasts. A side result is a generalization of the Rothschild-Stiglitz notion of comparative riskiness.
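For intuition, here is a brute-force numerical illustration of bounds on one such spread parameter (an assumption-laden sketch, not the paper's tight analytic characterization): if some observations are missing but known to lie in an interval, the sample variance can be bounded by searching over imputations, using the fact that the variance is a convex function of the imputed values.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

def variance_bounds(observed, n_missing, lo, hi):
    """Numerical bounds on the (population) variance of a sample in which
    n_missing values are unobserved but known to lie in [lo, hi].
    Brute-force illustration only; not the paper's analytic identification region.
    """
    obs = np.asarray(observed, dtype=float)

    def var(missing):
        return np.var(np.concatenate([obs, np.ravel(missing)]))

    # The variance is a convex quadratic in the imputed values, so its minimum
    # over the box [lo, hi]^k can be found by smooth constrained optimization ...
    x0 = np.full(n_missing, 0.5 * (lo + hi))
    lower = minimize(var, x0, bounds=[(lo, hi)] * n_missing).fun

    # ... while its maximum is attained at a vertex of the box, so enumerate them.
    upper = max(var(np.array(v)) for v in itertools.product([lo, hi], repeat=n_missing))
    return lower, upper

# e.g. three observed values and two missing ones known to lie in [0, 1]
print(variance_bounds([0.2, 0.5, 0.9], n_missing=2, lo=0.0, hi=1.0))
```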
Definition of identification. Sources of identification. Main example: nonparametric nonadditive mean regression.
This paper extends Imbens and Manski's (2004) analysis of confidence intervals for interval-identified parameters. For their final result, Imbens and Manski implicitly assume superefficient estimation of a nuisance parameter. This appears to have gone unnoticed before, and it limits the result's applicability.
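For context, a sketch of the construction being extended (my own illustrative code, following the critical-value equation commonly attributed to Imbens and Manski, 2004): the interval widens the estimated bounds [theta_l, theta_u] by C standard errors on each side, where C solves Phi(C + delta / max(se_l, se_u)) - Phi(-C) = 1 - alpha and delta = theta_u - theta_l. As I understand it, the nuisance parameter whose superefficient estimation this paper flags is the width delta entering that equation.

```python
from scipy.optimize import brentq
from scipy.stats import norm

def imbens_manski_ci(theta_l, theta_u, se_l, se_u, alpha=0.05):
    """Confidence interval for an interval-identified parameter, following the
    construction usually attributed to Imbens and Manski (2004).

    theta_l, theta_u: estimated lower and upper bounds of the identified set.
    se_l, se_u: their standard errors. The critical value C solves
        Phi(C + (theta_u - theta_l) / max(se_l, se_u)) - Phi(-C) = 1 - alpha,
    and the interval is [theta_l - C * se_l, theta_u + C * se_u].
    """
    shift = max(theta_u - theta_l, 0.0) / max(se_l, se_u)

    def gap(c):
        return norm.cdf(c + shift) - norm.cdf(-c) - (1.0 - alpha)

    # C lies between the one-sided and two-sided normal critical values.
    c = brentq(gap, norm.ppf(1.0 - alpha) - 1e-9, norm.ppf(1.0 - alpha / 2) + 1e-9)
    return theta_l - c * se_l, theta_u + c * se_u

# illustrative call: estimated bounds [0.2, 0.5] with standard errors 0.05
print(imbens_manski_ci(0.2, 0.5, 0.05, 0.05))
```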
This paper unifies and extends the recent axiomatic literature on minimax regret. It compares several models of minimax regret, shows how to characterize the corresponding choice correspondences in a unified setting, extends them to choice from convex sets, connects them by defining a behavioral notion of perceived ambiguity, and provides an axiomatization of Hannan regret. Substantively, a main idea is to behaviorally identify ambiguity with failures of independence of irrelevant alternatives. Regarding proof technique, the core contribution is to uncover a dualism between choice correspondences and preferences in an environment where this dualism is not obvious. This insight can be used to generate results by importing findings from the existing literature on preference orderings.
This paper applies the minimax regret criterion to choice between two treatments conditional on observation of a finite sample. The analysis is based on exact small-sample regret and does not use asymptotic approximations or finite-sample bounds. Core results are the following: (i) Minimax regret treatment rules are well approximated by empirical success rules in many cases, but differ from them significantly for small sample sizes and certain sample designs. (ii) They prescribe inference that is completely separate across covariates. This result can be avoided by imposing sufficient prior information. (iii) The relative performance of empirical success rules can be evaluated and is significantly lacking in very small samples. (iv) Manski's (2004) analysis of optimal sample stratification can be reproduced, with somewhat different implications, in terms of finite-sample results as opposed to large-deviations bounds. I conclude by offering some assessment of the results.