This article describes the issues involved with using cameras in Particle Tracking applications.
In fast particle-tracking systems, camera selection is critical because of the combined demands of fast framing AND short exposures. The frame rate of the camera is important and relatively easy to understand. The frame period (essentially the reciprocal of the frame rate), together with the particle speed, determines the distance (in pixels) moved by the particle from one acquired frame to the next. If the frame rate of the camera is too slow, the inter-frame displacement may be larger than the field of view: a particle visible in one frame may be outside the Field of View (FOV) in the next frame.
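As a quick sanity check, the inter-frame travel can be compared against the width of the FOV. A minimal sketch in Python (the 10mm/sec particle speed, 60f/sec frame rate, and FOV width below are hypothetical numbers, not from any specific system):

```python
def interframe_travel_um(speed_um_s: float, frame_rate_hz: float) -> float:
    """Sample-plane distance a particle travels between consecutive frames:
    speed multiplied by the frame period (1 / frame rate)."""
    return speed_um_s / frame_rate_hz

# Hypothetical: a 10mm/sec (10,000um/sec) particle imaged at 60f/sec
# travels ~167um between frames. If the imaged FOV is narrower than
# that, the particle can leave the FOV before the next frame is captured.
travel = interframe_travel_um(10_000, 60)
print(f"{travel:.0f} um between frames")
```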
Intra-frame motion blur
Another area of concern in particle-tracking is intra-frame motion blur. In a camera, the duration of exposure determines how long photoelectrons are collected in pixels. If the particle being imaged moves during the exposure, the light it reflects or scatters moves spatially across the sensor during the exposure. This results in motion blur: the particle does not appear as a sharp point at one location but as a smeared streak.
One way to reduce motion-blur is to use a short exposure duration (implemented via electronic shuttering). This is described in the next section of this article.
Short Exposures create their own problems!
Reducing the exposure helps reduce motion blur, but it can turn a reasonably bright imaging situation into one that is light-starved. Ultimately the goal of quantitative imaging is to produce an image that contains good data. In this context, the Signal-to-Noise Ratio (SNR) of the image serves as a useful figure-of-merit for image quality. If fewer photons are collected due to the short exposure, the resulting image may be swamped by the noise-floor of the camera.
For this reason, a camera used for particle-tracking has to be sensitive: it must generate a signal that is large enough (given the short exposure duration) to overcome system noise and produce a discernible image with an acceptable Signal-to-Noise Ratio (SNR). Therefore, cameras with good Quantum Efficiency (QE) at the wavelength of interest AND low noise are often used in such situations.
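The exposure/SNR trade-off can be sketched with a simple shot-noise model: signal electrons divided by the square root of (shot noise plus read noise squared). The photon rate, QE, and read-noise figures below are illustrative assumptions, and dark current is ignored:

```python
import math

def snr(photon_rate: float, qe: float, exposure_s: float,
        read_noise_e: float) -> float:
    """Simple camera SNR model: collected photoelectrons over the
    quadrature sum of shot noise and read noise (dark current ignored)."""
    signal_e = photon_rate * qe * exposure_s   # photoelectrons collected
    noise_e = math.sqrt(signal_e + read_noise_e ** 2)
    return signal_e / noise_e

# Shortening the exposure 5x with everything else fixed lowers the SNR,
# and the penalty grows as read noise starts to dominate the signal.
print(snr(1e6, 0.6, 5e-3, 5))   # 5ms exposure
print(snr(1e6, 0.6, 1e-3, 5))   # 1ms exposure: fewer photons, lower SNR
```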
Some particle tracking applications utilize a strobed illumination system, which is easier to implement if the camera is triggerable. Finally, the camera should operate in Global Shutter mode, so that all the pixels are exposed at the same time and for the same duration – cameras with “rolling shutter” will produce images with undesirable artifacts except when used in carefully controlled timing modes.
A simple example
It is necessary to estimate whether the frame rate of the camera is sufficient to capture the movement of particles from one frame to the next. For example, a camera with a max frame rate of 60f/sec provides an inter-frame time of 16.6ms, while a faster-framing camera may provide frames at up to 200f/sec, so the inter-frame time is only 5ms. Multiplying the distance moved by the particle in the inter-frame time (either 16.6ms or 5ms in our example) by the magnification (a function of the focal length of the lens and its distance from the sample plane) and dividing by the pixel size translates the displacement in the sample plane into a number of pixels on the camera.
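This sample-plane-to-pixels translation is one line of arithmetic. A minimal sketch (the 50um displacement, 2.25x magnification, and 7.4um pixel size are hypothetical values chosen for illustration):

```python
def sample_to_pixels(distance_um: float, magnification: float,
                     pixel_um: float) -> float:
    """Translate a sample-plane distance into pixels on the sensor:
    distance x magnification / pixel size."""
    return distance_um * magnification / pixel_um

# Hypothetical: a particle moves 50um at the sample plane between frames;
# with 2.25x magnification and 7.4um pixels that is ~15px of displacement.
print(f"{sample_to_pixels(50, 2.25, 7.4):.0f} px")
```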
As described above, one could reduce intra-frame “motion blur” by reducing the duration of exposure; otherwise, a particle may be imaged only as a multi-pixel blur. By setting the camera shutter to (for example) 1/1000 sec = 1ms, each exposure captures a much smaller movement of the particle, which significantly reduces the pixel blur. Note that a shorter exposure is only practical if there is sufficient light in the FOV.
In a real-world case, one would do this calculation with the ACTUAL particle speed as projected by the optics onto the image sensor. This will help determine whether the inter-frame motion is within an acceptable range.
The next step is to estimate a duration of exposure that is long enough to produce an image with acceptable SNR but not so long that the motion-caused blur rises above a desirable threshold.
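The blur constraint can also be solved in reverse: given a blur budget in pixels, compute the longest exposure that stays within it. A sketch, assuming a hypothetical 3-pixel budget, 7.4um pixels, and a 22,500um/sec projected speed:

```python
def max_exposure_s(blur_budget_px: float, pixel_um: float,
                   projected_speed_um_s: float) -> float:
    """Longest exposure that keeps intra-frame blur within a pixel budget:
    allowed sensor-plane travel (budget x pixel size) / projected speed."""
    return blur_budget_px * pixel_um / projected_speed_um_s

# Hypothetical: a 3px blur budget with 7.4um pixels and 22,500um/sec
# projected speed means the exposure must stay under ~1ms.
print(f"{max_exposure_s(3, 7.4, 22_500) * 1000:.2f} ms")
```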
In the next section, we take on a slightly more complex example in which we compare two different cameras being considered for a particle tracking application.
A more complex example
In this example, we compare the usability of two different cameras in a particle tracking experiment. For the sake of simplicity, it is assumed that the particles being imaged are moving in a straight line at a constant velocity.
- Camera “A” with 7.4um pixels and a frame rate of 200f/sec
- Camera “B” with 2.5um pixels and a frame rate of 60f/sec
Let’s also assume that both cameras have imagers with an identical optical format, for example, they are both 1/3″ cameras with a diagonal of ~6mm. The larger pixel size of Camera “A” means that it must have fewer pixels than Camera “B”.
The first thing to note is that the maximum exposure of Camera “A” is 5ms (one frame period) while the maximum exposure of Camera “B” is 16.6ms.
The max frame rate of Camera “B” is 60f/sec, so our particles travel roughly 3x further in one frame-period of this camera than they would in one frame-period of Camera “A”. The smaller pixel size of Camera “B” also factors into our thinking, because distances are resolved into pixels.
It is important to find the right balance between speed (clearly the most important requirement) and resolution (also important, in order to resolve small particles within a relatively large FOV). If we know the speed at which the particles are moving (e.g. 1cm/sec), we can translate that to the imager plane by multiplying 1cm/sec by the magnification. If the system involved is perhaps one of our Zoomable Macroscope configurations, with a magnification of 2.25x, the projected speed of the particle on the imager plane is 22.5mm/sec.
Camera “A”: On the imager plane, a particle can be estimated to move 112microns in one frame-period of 5ms. Dividing this distance by the pixel size of 7.4um tells us that the particle will move ~15pix from one frame to another.
Camera “B”: On the imager plane, a particle can be estimated to move 373microns in one frame-period of 16.6ms. Dividing this distance by the pixel size of 2.5um tells us that the particle will move ~149pix from one frame to another.
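The two camera estimates above can be reproduced with a small helper. This sketch uses the values from the example (22,500um/sec projected speed, 7.4um pixels at 200f/sec, 2.5um pixels at 60f/sec); using the exact 1/60 sec frame period gives ~150pix, versus ~149pix with the rounded 16.6ms figure:

```python
def interframe_pixels(projected_speed_um_s: float, frame_rate_hz: float,
                      pixel_um: float) -> float:
    """Pixels travelled on the sensor between consecutive frames:
    (projected speed x frame period) / pixel size."""
    return projected_speed_um_s / frame_rate_hz / pixel_um

print(f"Camera A: {interframe_pixels(22_500, 200, 7.4):.0f} px")  # ~15px
print(f"Camera B: {interframe_pixels(22_500, 60, 2.5):.0f} px")   # ~150px
```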
If both cameras were operated at an exposure of 5ms (the max exposure for Camera “A”), there would be a 112um blur in both cameras, represented as a ~15pix intra-frame blur for Camera “A” but a ~45pix blur for Camera “B”. This blur occurs because the imager integrates the image as the particle moves. Neither of these is satisfactory, so one may consider a shorter exposure of, for example, 1ms to “stop motion”.
Since the frame rates are not changed, the particle would still have an inter-frame movement of ~15pixels (Camera “A”) and ~149pixels (Camera “B”) from one frame to the next. But with an exposure of 1ms, the blur would be reduced considerably: the 22.5um of travel during the exposure corresponds to ~3pixels for Camera “A” and ~9pixels for Camera “B”.
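The 1ms blur figures follow directly from the projected speed and pixel sizes (22,500um/sec x 1ms = 22.5um of travel on the sensor during the exposure):

```python
def blur_pixels(projected_speed_um_s: float, exposure_s: float,
                pixel_um: float) -> float:
    """Intra-frame blur in pixels: sensor-plane travel during the
    exposure, divided by pixel size."""
    return projected_speed_um_s * exposure_s / pixel_um

# 22,500um/sec projected speed, 1ms exposure -> 22.5um of travel
print(f"Camera A: {blur_pixels(22_500, 0.001, 7.4):.1f} px")  # ~3px
print(f"Camera B: {blur_pixels(22_500, 0.001, 2.5):.1f} px")  # ~9px
```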
So far, we have ignored the size of the particle. Clearly, the particle size can be similarly estimated in pixels. This is useful in order to get a true sense of how the particles may appear: a 50um diameter particle would be represented by 50×2.25 = 112microns diameter on the imager plane:
For Camera “A” this means that the particle is visualized as having a diameter of ~15pix. So, one can visualize a ~15pix particle with an intra-frame blur of ~3pixels and an inter-frame displacement of ~15pixels.
For Camera “B” this means that the particle is visualized as having a diameter of ~45pix. So, one can visualize a ~45pix particle with an intra-frame blur of ~9pixels and an inter-frame displacement of ~149pixels.
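The apparent particle diameters follow the same projection arithmetic as the displacements, using the example's 50um particle and 2.25x magnification:

```python
def particle_pixels(diameter_um: float, magnification: float,
                    pixel_um: float) -> float:
    """Apparent particle diameter in pixels on the sensor:
    physical diameter x magnification / pixel size."""
    return diameter_um * magnification / pixel_um

# 50um particle at 2.25x -> 112.5um on the imager plane
print(f"Camera A: {particle_pixels(50, 2.25, 7.4):.0f} px")  # ~15px
print(f"Camera B: {particle_pixels(50, 2.25, 2.5):.0f} px")  # ~45px
```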