We introduce an abstract model of computation corresponding to an experiment in which identical, non-interacting bosons are sent through a non-adaptive linear-optical circuit and then measured. We show that despite the very limited nature of the model, an exact classical simulation would imply a collapse of the polynomial hierarchy. Moreover, under plausible conjectures, even a "noisy" approximate simulation would do the same. This gives evidence that quantum computers can sample from a distribution that classical computers cannot even approximate, even when restricted to using no entanglement except that arising from the particles being identical. We briefly discuss experimental prospects for realizing this model. This talk is based on "The Computational Complexity of Linear Optics" [STOC '11], which is joint work with Scott Aaronson.
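For context (central to the paper, though not spelled out in the abstract): the output amplitudes of this model are permanents of submatrices of the circuit's unitary, and computing the permanent exactly is #P-hard. The best known exact classical algorithm, Ryser's inclusion–exclusion formula, runs in O(2^n · n^2) time, which is one way to see why exact classical simulation is believed intractable. A minimal sketch (illustrative only, not the paper's algorithm):

```python
from itertools import combinations

def permanent(A):
    """Permanent of a square matrix via Ryser's inclusion-exclusion
    formula: per(A) = (-1)^n * sum over nonempty column subsets S of
    (-1)^|S| * prod_i sum_{j in S} A[i][j].  Runs in O(2^n * n^2)."""
    n = len(A)
    total = 0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# Unlike the determinant, the permanent sums products over all
# permutations with no sign alternation:
# per([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10.
print(permanent([[1, 2], [3, 4]]))
```

In the model itself, the probability of observing a given photon configuration is proportional to |per(A_S)|^2, where A_S is the submatrix of the unitary picked out by the input and output modes; the same routine applies unchanged to complex entries.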