I just looked at the actual research article. So far as I can tell it's "amazing" because the summary article referenced in the original post is wrong. These devices are not 1000x better than current state-of-the-art CCD or CMOS detectors. They are 2-3 orders of magnitude better than current graphene detectors. Which, currently, suck. The new devices reported in the paper have a quantum efficiency of ~2.3% (darned good for a device a single molecule thick).
CCDs and CMOS devices currently on the commercial market exhibit quantum efficiencies that peak at 50-95% in the green-yellow range around 550 nm. Sony was shipping interline CCDs a decade ago with 65% QE, and those devices weren't even backside-illuminated. The limiting factor in current sensors is not photon capture but noise in the pre-amp and analog-to-digital conversion stages, and even that is getting darned good.
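To put the gap in perspective, here's a quick back-of-the-envelope in Python. It uses only the QE figures already quoted above; nothing here is from the paper itself:

    # Rough size of the gap, using only the QE figures quoted above.
    graphene_qe = 0.023           # ~2.3% QE reported for the new graphene devices
    commercial_qe = (0.50, 0.95)  # peak QE range of commercial CCD/CMOS sensors

    for qe in commercial_qe:
        print(f"A QE {qe:.0%} sensor detects {qe / graphene_qe:.0f}x "
              f"more photons than the graphene device")
    # -> 22x and 41x: still a wide gap, even after a 2-3 order-of-magnitude jump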
It is not likely, in my somewhat-informed opinion, that a 1000x (almost exactly 10 stops) improvement in detection efficiency is physically possible -- at least, for devices that must operate at or slightly above room temperature. Quantum efficiency is the fraction of incident photons that get detected, so it is capped at 100%; a sensor already at 50-95% QE has at most about a 2x improvement left.
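The arithmetic, for anyone who wants to check it. This assumes "detection efficiency" means quantum efficiency, which is how the summary article seems to use the term:

    import math

    # 1000x expressed in photographic stops (each stop is a factor of 2)
    print(f"{math.log2(1000):.2f} stops")  # 9.97 -- "almost exactly 10 stops"

    # QE is a photon-counting fraction, so it is capped at 1.0; the most a
    # sensor can gain in detection efficiency is 1.0 / (its current QE).
    for qe in (0.50, 0.95):
        print(f"QE {qe:.0%}: at most {1.0 / qe:.2f}x headroom")
    # -> 2.00x and 1.05x, nowhere near 1000x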
The new graphene devices have the potential to be very power-efficient, and they are exceptionally sensitive in the infrared, which may make them very useful in certain applications.