The statement that the wavelength of a probe must be smaller than the size of the object being observed is related to a concept called the diffraction limit. It applies to various imaging techniques, such as optical microscopy.
Diffraction is a phenomenon that occurs when waves encounter obstacles or pass through small openings. When a wave encounters a feature whose size is comparable to or smaller than its wavelength, it bends around that feature rather than casting a sharp shadow. This spreading of the wavefront blurs the image and washes out fine spatial detail.
In the context of imaging, resolution refers to the ability to distinguish and capture fine details of an object, and the diffraction limit sets a fundamental bound on it. According to the Rayleigh criterion, a commonly used resolution criterion, two point sources are just resolved when the center of one falls on the first diffraction minimum of the other. For a circular aperture of diameter D, this gives a minimum resolvable angular separation of about θ ≈ 1.22 λ/D, where λ is the wavelength of the probe. In microscopy, the equivalent spatial limit is d ≈ 0.61 λ/NA, where NA is the numerical aperture of the objective.
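To make the Rayleigh limit concrete, the microscopy form d ≈ 0.61 λ/NA can be evaluated directly. The wavelength and numerical aperture below are illustrative values (green light, a high-NA oil-immersion objective), not figures from the discussion above:

```python
# Rayleigh resolution limit for an optical microscope: d ≈ 0.61 * λ / NA.
# Illustrative values: green light through an oil-immersion objective.
wavelength = 550e-9   # wavelength of green light, in metres
NA = 1.4              # numerical aperture of the objective (assumed)

d = 0.61 * wavelength / NA  # minimum resolvable separation, in metres
print(f"Resolution limit: {d * 1e9:.0f} nm")
```

This yields roughly 240 nm, which is why conventional optical microscopes cannot resolve structures much smaller than a few hundred nanometres.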
If the wavelength of the probe is larger than the features being observed, diffraction prevents the imaging system from resolving them. Conversely, when the wavelength is smaller than those features, diffraction effects become less significant, and the system can capture finer detail at higher resolution.
Therefore, to capture fine details of an object, it is desirable to use a probe whose wavelength is smaller than the features of interest. This principle applies not only in optics but in any form of wave-based imaging, such as electron microscopy and radio astronomy.
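This is why electron microscopes resolve far finer detail than optical ones: an electron's de Broglie wavelength at typical accelerating voltages is thousands of times shorter than that of visible light. A rough sketch using the non-relativistic formula λ = h / √(2 mₑ e V), with an illustrative voltage (at 100 kV, relativistic corrections would shrink the result by a few percent):

```python
import math

# De Broglie wavelength of an electron accelerated through V volts:
#   λ = h / sqrt(2 * m_e * e * V)   (non-relativistic approximation)
h = 6.626e-34     # Planck constant, J·s
m_e = 9.109e-31   # electron rest mass, kg
e = 1.602e-19     # elementary charge, C
V = 100e3         # accelerating voltage in volts (illustrative)

lam = h / math.sqrt(2 * m_e * e * V)
print(f"Electron wavelength at {V / 1e3:.0f} kV: {lam * 1e12:.1f} pm")
```

The result is a few picometres, about five orders of magnitude shorter than visible light, which is why electron microscopes can image individual atoms while optical microscopes cannot.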