
Now take a look at the image. The first few lines were created with a normal visible-light camera; the last line was created using the technique discussed above. Note that the last line is a bit fuzzier than the others. Here's how the MIT article explains this fuzziness (read: "lower resolution"):
The technique clearly has some limitations, not least of which is the drop in resolution that this process causes. That's largely because of the motion of rubidium atoms in the gas, which must be heated to 140 degrees C.
I think they're overcomplicating this explanation. How about this: the resolution is lower because the original image was formed at infrared wavelengths. Resolution, meaning the ability to resolve fine detail in an image, is tied to wavelength: the longer the wavelength, the coarser the finest detail an optical system can resolve (the diffraction limit). Infrared wavelengths are longer than those of visible light, so it makes sense that an image formed in the infrared will have lower resolution than one made with visible light.
This was the comment I was going to leave at the MIT Tech Review site, but I decided not to.
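
To put rough numbers on the wavelength argument, here's a quick back-of-the-envelope sketch using the Rayleigh criterion, where the smallest resolvable spot on the sensor scales as roughly 1.22 times the wavelength times the f-number. The f/2 optics and the 1550 nm infrared wavelength are illustrative assumptions on my part, not figures from the article.

```python
# Back-of-the-envelope diffraction-limited resolution comparison.
# Assumptions (mine, not the article's): hypothetical f/2 optics,
# visible light at 550 nm, infrared at 1550 nm.
# Rayleigh criterion at the sensor plane:
#   delta_x ~= 1.22 * wavelength * f_number

F_NUMBER = 2.0  # assumed f/2 lens

WAVELENGTHS_NM = {
    "visible (green, 550 nm)": 550,
    "infrared (1550 nm)": 1550,
}

for label, wavelength_nm in WAVELENGTHS_NM.items():
    wavelength_m = wavelength_nm * 1e-9
    spot_m = 1.22 * wavelength_m * F_NUMBER  # smallest resolvable spot
    print(f"{label}: ~{spot_m * 1e6:.2f} micrometres")

# Output:
# visible (green, 550 nm): ~1.34 micrometres
# infrared (1550 nm): ~3.78 micrometres
```

Under those assumptions, the diffraction-limited spot at the infrared wavelength is nearly three times larger than at the visible one, which is consistent with the last line of the image looking fuzzier than the rest.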

