Optical transfer function

The optical transfer function (OTF) of an imaging system (camera, video system, microscope, etc.) is the true measure of the resolution (image sharpness) that the system is capable of. The common practice of defining resolution in terms of pixel count is not meaningful, as it is the overall OTF of the complete system, including the lens and anti-aliasing filter as well as other factors, that defines true performance. In the most common applications (cameras and video systems) it is the Modulation Transfer Function (the magnitude of the OTF) that is most relevant, although the phase component can have a secondary effect. While resolution, as commonly used with reference to camera systems, describes only the number of pixels in an image, and hence the potential to show fine detail, the transfer function describes the ability of adjacent pixels to change from black to white in response to patterns of varying spatial frequency, and hence the actual capability to show fine detail, whether with full or reduced contrast. An image reproduced with an optical transfer function that 'rolls off' at high spatial frequencies will appear 'blurred' in everyday language. The Modulation Transfer Function, or MTF (the OTF magnitude with phase ignored), is roughly the equivalent of frequency response in an audio system, and can be represented by a graph of light amplitude (brightness) versus spatial frequency (cycles per picture width).

Example

Consider the example of a current High Definition video system with 1920 by 1080 pixels. The Nyquist theorem says that it should be possible, in a perfect system, to resolve fully (with true black to white transitions) nearly 1920 alternate black and white lines, otherwise referred to as a spatial frequency of 960 line pairs per picture width, or 960 cycles per picture width (definitions in terms of cycles per unit angle or per mm are also possible, but generally less clear when dealing with cameras and more appropriate to telescopes etc.). In practice this is far from the case, and spatial frequencies that approach the Nyquist rate will generally be reproduced with decreasing amplitude, so that fine detail, though it can be seen, is greatly reduced in contrast. This gives rise to the interesting observation that, for example, a standard definition television picture derived from a film scanner that uses oversampling, as described later, may appear sharper than a high definition picture shot on a camera with a poor Modulation Transfer Function. The two pictures show an interesting difference that is often missed: the former has full contrast on detail up to a certain point but then no really fine detail, while the latter does contain finer detail, but with such reduced contrast as to appear inferior overall.
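The arithmetic behind this limit is simple; as a minimal illustration (Python, purely restating the sampling argument above):

    # One cycle of a black-and-white line pair needs at least two samples
    # (one black pixel and one white pixel), so the highest spatial
    # frequency a 1920-pixel-wide system can represent without aliasing
    # is half the pixel count.
    horizontal_pixels = 1920
    nyquist_limit = horizontal_pixels / 2   # 960 cycles per picture width
    print(nyquist_limit)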

Factors affecting MTF in typical camera systems

In practice, many factors result in considerable blurring of a reproduced image, such that patterns with spatial frequency just below the Nyquist rate may not even be visible, and the finest patterns that can be seen appear 'washed out' as shades of grey, not black and white. A major factor is usually the impossibility of making the perfect 'brick wall' optical filter (often realised as a 'phase plate' or a lens with specific blurring properties in digital cameras and video camcorders). Such a filter is necessary to reduce aliasing by eliminating spatial frequencies above the Nyquist rate, but in practice it will have a response that 'rolls off' seriously before the Nyquist frequency is reached.
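As a rough numerical sketch of this trade-off, the filter can be modelled as a Gaussian blur (real optical low-pass filters differ in detail, but face the same constraint); a Gaussian point spread function with standard deviation sigma pixels has the MTF exp(-2 pi^2 sigma^2 f^2):

    import numpy as np

    # Illustration only: model the anti-aliasing filter as a Gaussian blur.
    # A Gaussian PSF with standard deviation sigma (in pixels) has the MTF
    # exp(-2 * pi**2 * sigma**2 * f**2), with f in cycles per pixel.
    def gaussian_mtf(f, sigma):
        return np.exp(-2 * np.pi**2 * sigma**2 * f**2)

    sigma = 0.6   # chosen so the response at Nyquist (0.5 cycles/px) is small
    for f in (0.1, 0.25, 0.5):        # well below, halfway to, and at Nyquist
        print(f, round(gaussian_mtf(f, sigma), 3))
    # 0.1 -> 0.931,  0.25 -> 0.641,  0.5 -> 0.169: a smooth filter that
    # suppresses aliasing is already well down long before Nyquist.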

Oversampling and downconversion to maintain MTF

For this reason, the only way in practice to approach the theoretical sharpness possible in a digital imaging system such as a camera is to use more pixels in the camera sensor than samples in the final image, and 'downconvert' or 'interpolate' using special digital processing which cuts off spatial frequencies above the Nyquist rate to avoid aliasing whilst maintaining a reasonably flat MTF up to that frequency. This approach was first taken in the 1970s when flying spot scanners, and later CCD line scanners, were developed which sampled more pixels than were needed and then 'downconverted', which is why movies have always looked sharper on television than material shot directly with a video camera. The only theoretically correct way to interpolate or downconvert is by use of a steep low-pass spatial filter, realised by convolution with a two-dimensional sin(x)/x ('sinc') weighting function, which requires powerful processing. In practice, various mathematical approximations to this are used to reduce the processing requirement; these approximations are now implemented widely in video editing systems and in image processing programs such as Photoshop.
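A minimal sketch of such a downconversion, assuming a simple 2:1 reduction and a Hann-windowed sinc as the practical approximation (production scalers use more elaborate resampling filters):

    import numpy as np

    def windowed_sinc_kernel(cutoff, num_taps):
        """One-dimensional low-pass FIR kernel; cutoff in cycles/sample."""
        n = np.arange(num_taps) - (num_taps - 1) / 2
        kernel = 2 * cutoff * np.sinc(2 * cutoff * n)  # ideal sinc low-pass
        kernel *= np.hanning(num_taps)                 # soften the truncation
        return kernel / kernel.sum()                   # unity gain at DC

    def downconvert_2x(image):
        """Filter down to the new Nyquist limit, then keep every second
        sample. The 2-D sinc filter is separable: rows, then columns."""
        k = windowed_sinc_kernel(cutoff=0.25, num_taps=21)  # new Nyquist
        out = np.apply_along_axis(np.convolve, 1, image, k, mode='same')
        out = np.apply_along_axis(np.convolve, 0, out, k, mode='same')
        return out[::2, ::2]

    image = np.random.rand(16, 16)    # stand-in for real pixel data
    small = downconvert_2x(image)     # 8 x 8, alias-suppressed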

Just as standard definition video with a flat MTF is only possible with oversampling, so HD television with full theoretical sharpness is only possible by starting with a camera that has at least twice as many pixels, and then digitally filtering. With movies now being shot in 4K and even 8K video for the cinema, using cameras like the Red, we can expect to see the best pictures on HDTV only from movies or material shot at the higher standard. However much we raise the number of pixels used in cameras, this will always remain true (unless a perfect optical spatial filter can be devised), and the same problem exists of course with stills cameras, where a better image can be expected when, say, a 10 megapixel image is converted to a 5 megapixel image than could ever be obtained from even the best 5 megapixel camera. Because of this problem of maintaining a flat MTF, broadcasters like the BBC did for a long time consider maintaining standard definition television, but improving its quality by shooting and viewing with many more pixels (though, as previously mentioned, such a system, though impressive, ultimately lacks the very fine detail which, though attenuated, enhances the effect of true HD viewing).
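The stills-camera case just mentioned is easy to reproduce: the windowed-sinc approximation is available as the Lanczos filter in, for example, the Python Pillow library (filenames below are hypothetical):

    from PIL import Image

    # Downconvert a hypothetical 10-megapixel image to half the pixel
    # count (10 MP -> 5 MP) with a Lanczos filter, a windowed-sinc
    # approximation of the kind described above.
    img = Image.open('photo_10mp.jpg')
    scale = 2 ** -0.5   # halving the pixel count halves area, not width
    half = img.resize((int(img.width * scale), int(img.height * scale)),
                      Image.LANCZOS)
    half.save('photo_5mp.jpg')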

Another factor in digital cameras and camcorders is lens resolution. A lens may be said to 'resolve' 1920 horizontal lines, but this does not mean that it does so with full modulation from black to white. The 'Modulation Transfer Function' (just a term for the magnitude of the optical transfer function with phase ignored) gives the true measure of lens performance, and is represented by a graph of amplitude against spatial frequency.

Lens aperture diffraction also limits MTF. Whilst reducing the aperture of a lens usually reduces aberrations and hence improves the flatness of the MTF, there is an optimum aperture for any lens and image sensor size beyond which smaller apertures reduce resolution because of diffraction, which spreads light across the image sensor. This was hardly a problem in the days of plate cameras and even 35 mm film, but has become an insurmountable limitation with the very small format sensors used in digital cameras and especially video cameras. First-generation HD consumer camcorders used 1/4-inch sensors, for which apertures smaller than about f/4 begin to limit resolution. Even professional video cameras mostly use 2/3-inch sensors, prohibiting the use of apertures around f/16 that would have been considered normal for film formats. Certain cameras (such as the Pentax K10D) feature an "MTF autoexposure" mode, where the choice of aperture is optimised for maximum sharpness; typically this means somewhere in the middle of the aperture range.[1]
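The scale of the effect can be estimated from the standard diffraction-limited MTF of a circular aperture, which falls to zero at the cutoff frequency 1/(lambda N) for wavelength lambda and f-number N. A rough calculation, in which the sensor width and pixel count are assumptions for a 1/4-inch, 1920-pixel-wide sensor:

    import numpy as np

    wavelength_mm = 550e-6        # green light, 550 nm
    sensor_width_mm = 3.2         # approx. active width of a 1/4-inch sensor
    pixels_across = 1920

    f_nyquist = pixels_across / (2 * sensor_width_mm)   # 300 cycles/mm
    for f_number in (2, 4, 8):
        f_cutoff = 1 / (wavelength_mm * f_number)       # diffraction cutoff
        s = min(f_nyquist / f_cutoff, 1.0)
        mtf = (2 / np.pi) * (np.arccos(s) - s * np.sqrt(1 - s**2))
        print(f"f/{f_number}: diffraction MTF at Nyquist = {mtf:.2f}")
    # Roughly 0.59 at f/2, 0.23 at f/4, and zero by f/8: stopping such a
    # small-sensor camera down quickly erodes the finest detail.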

The trend to digital large-format SLRs and improved MTF potential

There has recently been a shift towards the use of large image format digital single-lens reflex cameras, driven by the need for low-light sensitivity and narrow depth of field effects. This has led to such cameras becoming preferred by some film and television programme makers over even professional HD video cameras, because of their 'filmic' potential. In theory, the use of cameras with 16 and 21 megapixel sensors offers the possibility of almost perfect sharpness by downconversion within the camera, with digital filtering to eliminate aliasing. In practice such cameras currently fail in this respect, as they do not have the processing power to do what is required. The Canon EOS 5D Mark II is believed to use only every third line, and hence suffers bad aliasing, as its optical filter is optimised for stills use. The Panasonic Lumix DMC-GH2 may do some processing across pixels, producing very sharp images, but with some aliasing. Nevertheless, such cameras produce very impressive results, and appear to be leading video production towards large-format downconversion, with digital filtering becoming the standard approach to the realisation of a flat MTF with true freedom from aliasing.

Measuring Modulation Transfer Function

Although 'sharpness' is often judged on grid patterns of alternate black and white lines, it should strictly be measured using a sine-wave variation from black to white (a blurred version of the usual pattern). Where a square wave pattern is used (simple black and white lines), not only is there more risk of aliasing, but account must be taken of the fact that the fundamental component of a square wave is higher than the amplitude of the square wave itself, by a factor of 4/pi (the harmonic components reduce the peak amplitude). A square wave test chart will therefore show optimistic results (better resolution of high spatial frequencies than is actually achieved). The square wave result is sometimes referred to as the 'contrast transfer function' (CTF).
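The 4/pi factor follows directly from the Fourier series of a square wave; a quick numerical check (illustration only):

    import numpy as np

    n, cycles = 1024, 8
    period = n // cycles                             # 128 samples per cycle
    square = np.where(np.arange(n) % period < period // 2, 1.0, -1.0)
    spectrum = np.fft.rfft(square) / n
    fundamental = 2 * abs(spectrum[cycles])          # amplitude at 8 cycles
    print(fundamental, 4 / np.pi)                    # both approx. 1.2732

A measured CTF can also be converted to an estimate of the true MTF using Coltman's series.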

More Advanced Details

The OTF may be broken down into magnitude transfer function (MTF) and phase transfer function (PTF) components as follows:

\mathrm{OTF}(\xi,\eta) = \mathrm{MTF}(\xi,\eta) \cdot \mathrm{PTF}(\xi,\eta)

where

\mathrm{MTF}(\xi,\eta) = \left| \mathrm{OTF}(\xi,\eta) \right|
\mathrm{PTF}(\xi,\eta) = e^{-i\,2\pi\,\lambda(\xi,\eta)}

where ξ and η are the spatial frequencies in the x- and y-directions, respectively, and λ(ξ,η) gives the phase shift of each spatial frequency component, expressed in periods (so that the exponent is the phase in radians).

Phase is critically important to adaptive optics and holographic systems.

The OTF is the Fourier transform of the incoherent point spread function (PSF).
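A minimal numerical sketch of these relationships, using a made-up Gaussian PSF:

    import numpy as np

    x = np.arange(-16, 16)
    xx, yy = np.meshgrid(x, x)
    psf = np.exp(-(xx**2 + yy**2) / (2 * 2.0**2))   # example PSF, sigma = 2 px
    psf /= psf.sum()                                # unit total energy

    otf = np.fft.fft2(np.fft.ifftshift(psf))        # OTF = FT of the PSF
    mtf = np.abs(otf)                               # magnitude: the MTF
    ptf = np.angle(otf)                             # phase: the PTF
    print(round(mtf[0, 0], 6))                      # 1.0 at zero frequency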

The modulation transfer function is, in effect, the Bode plot of an imaging system (such as a microscope or the human eye), and thus depicts the filtering characteristic of that system. The human eye, for instance, acts as a low-pass filter, in that very high-frequency components (sharp edges) cannot be perfectly perceived.

References

