Microscopy resolution, magnification, etc


First, let's consider an ideal object: a fluorescent atom, something very tiny but very bright. Even though it is extremely tiny, you can see it -- you could even see it with your naked eye if it were bright enough. So the issue is not really about being able to "see" something; it's about being able to see where it is, and to distinguish between nearby objects.
Now consider this ideal fluorescent atom. Its image in a microscope (confocal or regular optical microscope) is a spot -- more technically, an Airy disk, which looks like the picture at right. This spreading of a point source into a spot is due to diffraction.
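
For the curious, here is a rough Python sketch of that Airy pattern. The wavelength (0.5 microns) and numerical aperture (1.4) are just illustrative numbers, not the specs of any particular objective:

    import numpy as np
    from scipy.special import j1

    def airy_intensity(r_um, wavelength_um=0.5, numerical_aperture=1.4):
        """Normalized Airy-pattern intensity at radius r_um (microns) in the
        image of a point source, referred back to the sample plane."""
        x = 2 * np.pi * numerical_aperture * np.asarray(r_um, dtype=float) / wavelength_um
        out = np.ones_like(x)          # the limit of (2*J1(x)/x)**2 at x = 0 is 1
        nz = x != 0
        out[nz] = (2 * j1(x[nz]) / x[nz]) ** 2
        return out

    r = np.linspace(0.0, 1.0, 500)     # radial distance in microns
    profile = airy_intensity(r)
    # The first dark ring falls at r = 0.61 * wavelength / NA, about 0.22 microns here.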

The size of the spot determines your resolution. Resolution is the ability to tell the difference between two closely spaced bright objects and one bigger object. If two objects are closer together than your resolution, they blur together in the microscope image and it's impossible to tell that they are two points (the combined image may be about twice as bright as a single object, but you still can't measure their separation). The best resolution for an optical microscope is about 0.2 microns = 200 nm.
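
To see the blurring-together happen, here is a simple sketch that approximates each spot as a Gaussian blur about 0.1 microns wide, a rough stand-in for a 0.2 micron resolution limit (all numbers are illustrative):

    import numpy as np

    def two_spot_profile(separation_um, sigma_um=0.1, n=2001):
        """1-D image of two equally bright points whose individual images are
        approximated as Gaussian blurs of width sigma."""
        x = np.linspace(-1.0, 1.0, n)
        blur = lambda x0: np.exp(-(x - x0) ** 2 / (2 * sigma_um ** 2))
        return x, blur(-separation_um / 2) + blur(separation_um / 2)

    for sep in (0.4, 0.25, 0.1):                 # separations in microns
        x, intensity = two_spot_profile(sep)
        dip = intensity[np.argmin(np.abs(x))] / intensity.max()
        resolved = "two distinct spots" if dip < 0.99 else "one merged blob"
        print(f"separation {sep} um: center/peak = {dip:.2f} -> {resolved}")

Once the separation drops below roughly the spot width, the dip between the two peaks disappears and all you see is one blob.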

The good news is that there's a difference between resolution and the ability to locate a position.

If you have one tiny, isolated fluorescent object, you can often locate its position to better than your resolution. The image of the object shows up as an extended blob, and you can find the "center of mass" of that blob-shaped image. If the blob is N pixels wide and each pixel is M microns across, you can estimate the center of the blob to roughly M/N accuracy, which often beats the optical resolution. This is a useful trick, but it does not solve the same problem as resolution. In some cases you can use various tricks to make the spot size bigger (increase N) so that you can locate the center even better. Various experiments I've heard of have claimed to locate the centers of spots to within 10-30 nm using this sort of method. You may be interested in some software available for identifying particle positions, which implements this center-of-mass method.
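
That software does a lot more than this, but the core center-of-mass step is only a few lines. Here is a minimal sketch, with a made-up blurry spot for illustration:

    import numpy as np

    def centroid(image):
        """Intensity-weighted center of mass of a single bright spot, in pixel
        units.  Multiply by the pixel size to convert to microns."""
        img = image - image.min()              # crude background subtraction
        rows, cols = np.indices(img.shape)
        total = img.sum()
        return (rows * img).sum() / total, (cols * img).sum() / total

    # Fake data: a blurry spot whose true center (12.3, 20.7) lies between pixels.
    y, x = np.indices((32, 32))
    spot = np.exp(-((y - 12.3) ** 2 + (x - 20.7) ** 2) / (2 * 3.0 ** 2))
    print(centroid(spot))                      # close to (12.3, 20.7): sub-pixel localization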

The magnification is something different altogether. There's a technical definition which compares the apparent angular size of the image to the angular size the object would have if it were 25 cm away from your eye. This is a somewhat arbitrary definition and, in my opinion, is mainly relevant for devising problems when I teach optics in my introductory physics classes. In real life, one often takes pictures using a CCD camera on a microscope and displays them on a monitor. Using a larger monitor certainly magnifies the image further, but it will still be just as blurry or sharp as the resolution allows.
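
If you want that textbook definition spelled out, here is the standard intro-physics formula for a compound microscope. The 16 cm tube length and the focal lengths below are just illustrative textbook numbers:

    def angular_magnification(f_objective_cm, f_eyepiece_cm, tube_length_cm=16.0):
        """Textbook compound-microscope magnification: the objective contributes
        roughly (tube length / f_objective) and the eyepiece (25 cm / f_eyepiece),
        with 25 cm as the standard near-point distance."""
        return (tube_length_cm / f_objective_cm) * (25.0 / f_eyepiece_cm)

    print(angular_magnification(0.4, 2.5))     # about 400x for these example numbers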

Fortunately, higher magnification lenses generally also have better resolution. In our lab, a 10x objective has a resolution of 0.7 microns and a 100x objective has a resolution of 0.2 microns. One other tradeoff to consider: higher magnification lenses see smaller fields of view, in proportion to their magnification. For example, a 100x objective might see a 100 x 100 micron^2 field of view, while a 10x objective sees 1000 x 1000 micron^2.
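
As a rough sanity check, the field of view is basically the size of the camera sensor (or eyepiece field stop) divided by the magnification. Assuming a sensor about 10 mm across (an illustrative value) reproduces the numbers above:

    def field_of_view_um(magnification, sensor_size_mm=10.0):
        """Linear field of view at the sample, assuming the camera sensor (or the
        eyepiece field stop) is about 10 mm across -- an illustrative value."""
        return sensor_size_mm * 1000.0 / magnification

    for mag in (10, 100):
        print(f"{mag}x objective: about {field_of_view_um(mag):.0f} x "
              f"{field_of_view_um(mag):.0f} micron^2")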

So, when worrying about how good a microscope is, the most important question is what the resolution is. And in some science applications (such as my work) you care a lot about how well you can locate the centers of objects, and hopefully you can beat the resolution. Magnification is a much less useful specification (in my opinion).


Technical note

More technically, a microscope objective's resolution is quantified by its numerical aperture (NA). This webpage has a great description of this, as well as a more in-depth discussion of resolution. I'll note here that the wavelength of light you use makes a difference; shorter wavelengths improve the resolution.
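
As a quick sketch, the usual Rayleigh-criterion estimate is d = 0.61 * wavelength / NA. Plugging in some illustrative numbers shows both points above: a high-NA objective gets you to roughly 0.2 microns, and a shorter wavelength does a bit better:

    def rayleigh_resolution_um(wavelength_um, numerical_aperture):
        """Rayleigh criterion: smallest resolvable separation d = 0.61 * lambda / NA."""
        return 0.61 * wavelength_um / numerical_aperture

    print(rayleigh_resolution_um(0.55, 1.4))   # ~0.24 um: green light, oil-immersion NA
    print(rayleigh_resolution_um(0.40, 1.4))   # ~0.17 um: a shorter (violet) wavelength helps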


Links