Center Director, Research Center for Group Dynamics, Institute for Social Research
Director, BioSocial Methods Collaborative, RCGD
Amos N. Tversky Collegiate Professor, Psychology and Statistics, LSA
Professor of Marketing, Stephen M. Ross School of Business
Professor of Integrative Systems and Design, College of Engineering
Using computer-adaptive methods to select the next query in a decision-making study
We extend the adaptive design optimization (ADO) approach to the domain of decision making under risk. ADO is a Bayesian method that adapts the experimental design in real time: it chooses the next “query,” or test question, to present in the study so as to maximally discriminate the predictions of competing theories. In a set of simulations, we study how well this adaptive system discriminates between popular theories of decision making under risk.
Cavagnaro, D., Gonzalez, R., Myung, J., & Pitt, M. (2013). Optimal decision stimuli for risky choice experiments: An adaptive approach. Management Science, 59, 358-375. DOI: 10.1287/mnsc.1120.1558
Collecting data to discriminate between models of risky choice requires careful selection of decision stimuli. Models of decision making aim to predict decisions across a wide range of possible stimuli, but practical limitations force experimenters to select only a handful of them for actual testing. Some stimuli are more diagnostic between models than others, so the choice of stimuli is critical. This paper provides the theoretical background and a methodological framework for adaptive selection of optimal stimuli for discriminating among models of risky choice. The approach, called adaptive design optimization, adapts the stimulus in each experimental trial based on the results of the preceding trials. We demonstrate the validity of the approach with simulation studies aiming to discriminate expected utility, weighted expected utility, original prospect theory, and cumulative prospect theory models.
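To make the adaptive loop concrete, here is a minimal sketch of the idea in Python. Everything specific in it is an assumption made for illustration, not the paper's actual setup: the two candidate models (a power-utility expected utility model and a simple probability-weighting variant with fixed parameters), the logistic choice rule, the small stimulus grid, and the choice of mutual information as the discrimination criterion. The full ADO framework additionally maintains posterior distributions over each model's parameters; here parameters are held fixed for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate stimuli: choose between gamble A (win x with probability p)
# and a sure amount s. Grid is illustrative, not from the paper.
stimuli = [(p, x, s) for p in (0.1, 0.3, 0.5, 0.7, 0.9)
                     for x in (10, 50, 100)
                     for s in (5, 20, 40)]

def p_choose_A_eu(p, x, s, alpha=0.8):
    """Expected-utility model with power utility u(z) = z**alpha (illustrative)."""
    u_a, u_s = p * x**alpha, s**alpha
    return 1.0 / (1.0 + np.exp(-(u_a - u_s)))   # logistic choice rule

def p_choose_A_pt(p, x, s, alpha=0.8, gamma=0.6):
    """Prospect-theory-style model with a probability weighting function w(p)."""
    w = p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)
    u_a, u_s = w * x**alpha, s**alpha
    return 1.0 / (1.0 + np.exp(-(u_a - u_s)))

models = [p_choose_A_eu, p_choose_A_pt]
post = np.array([0.5, 0.5])                     # prior over the two models

def expected_info_gain(stim, post):
    """Mutual information between the model indicator and the binary response."""
    probs = np.array([m(*stim) for m in models])
    p_a = post @ probs                          # marginal P(choose A)
    info = 0.0
    for r, p_r in ((1, p_a), (0, 1.0 - p_a)):
        lik = probs if r else 1.0 - probs
        post_r = post * lik / p_r               # posterior if response were r
        nz = post_r > 0
        # KL divergence from prior to posterior, weighted by P(response)
        info += p_r * np.sum(post_r[nz] * np.log(post_r[nz] / post[nz]))
    return info

for trial in range(10):
    # Select the most diagnostic stimulus under the current model posterior.
    best = max(stimuli, key=lambda s: expected_info_gain(s, post))
    # Simulate a response, using the weighting model as ground truth.
    r = rng.random() < p_choose_A_pt(*best)
    lik = np.array([m(*best) for m in models])
    lik = lik if r else 1.0 - lik
    post = post * lik / (post @ lik)            # Bayesian model update

print(post)  # posterior over [EU, weighting]; tends toward the true model
```

The key design point is the selection step: rather than cycling through a fixed list of gambles, each trial queries the stimulus whose answer is expected to move the model posterior the most, which is why a handful of adaptively chosen trials can discriminate models that many fixed trials would not.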