Ishita Dasgupta has received the Student Award for Outstanding Scientific Contribution at the International Conference on Thinking for our joint work on stochastic hypothesis generation. Below is the abstract of the presented talk:
Ishita Dasgupta, Eric Schulz & Samuel Gershman
ABSTRACT. How do humans approximate Bayesian inference when the task requires them to generate hypotheses? Previous models (e.g., Thomas et al., 2008) depend crucially on cued recall to generate hypotheses. However, this strategy is impractical in combinatorially complex hypothesis spaces, where relevant hypotheses may have to be constructed de novo rather than retrieved from memory. We present a novel algorithmic model of hypothesis generation based on Markov chain Monte Carlo sampling. The Markov chain generates hypotheses through local proposals and accepts or rejects each proposal based on its probability relative to the current hypothesis. The accepted hypotheses are then used to construct a sample-based approximation of the posterior. As the number of generated hypotheses increases, the approximation converges to the true posterior. However, following Lieder et al. (2013), we assume that humans run the chain for only a finite number of steps, which reproduces several well-known biases in probability judgment: subadditivity, superadditivity, response variability, and anchoring. Additionally, our model makes a new prediction: a context-dependent inversion of subadditivity and superadditivity. Simulations of this model suggest that resource-bounded sampling provides a plausible account of human hypothesis generation.
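The core mechanism described in the abstract can be illustrated with a minimal sketch. This is not the authors' actual model; it is a generic Metropolis sampler over a small, hypothetical discrete hypothesis space, showing how local proposals, probability-ratio acceptance, and a finite chain length yield a sample-based (and, for short chains, biased) approximation of the posterior. The hypothesis space and posterior values are invented for illustration.

```python
import random

def metropolis_sample(unnorm_post, n_steps, start, rng):
    """Metropolis sampler over integer hypotheses with local +/-1 proposals.

    unnorm_post: dict mapping each hypothesis to its unnormalized posterior.
    Proposals outside the support are rejected, which preserves detailed
    balance (those states have zero probability).
    """
    current = start
    samples = []
    for _ in range(n_steps):
        proposal = current + rng.choice([-1, 1])  # local proposal
        if proposal in unnorm_post:
            # Accept with probability min(1, p(proposal) / p(current)).
            if rng.random() < unnorm_post[proposal] / unnorm_post[current]:
                current = proposal
        samples.append(current)
    return samples

def estimate(samples, h):
    """Sample-based estimate of the posterior probability of hypothesis h."""
    return samples.count(h) / len(samples)

# Hypothetical unnormalized posterior over ten discrete hypotheses.
posterior = {h: (h + 1) ** 2 for h in range(10)}
true_mass = posterior[9] / sum(posterior.values())  # true p(h = 9)

rng = random.Random(0)
short = metropolis_sample(posterior, 20, start=0, rng=rng)     # resource-bounded chain
long_ = metropolis_sample(posterior, 20000, start=0, rng=rng)  # near-converged chain

# The short chain's estimate is anchored near its starting point, while the
# long chain's estimate approaches the true posterior mass.
print(estimate(short, 9), estimate(long_, 9), true_mass)
```

The short chain systematically under- or over-weights hypotheses depending on where it starts, which is the resource-bounded behavior the abstract links to anchoring and related judgment biases; only in the long-chain limit does the approximation converge.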