I. INTRODUCTION
Tracking systems that maintain an in-focus state for moving objects have been extensively studied for a wide range of applications [1-3]. For example, cells floating in a fluid are easily affected by Brownian motion, which causes them to drift in the depth direction and makes long-term cell imaging difficult. Various optical and numerical localization algorithms have been widely applied because they offer automation, reduced labor, and high accuracy. An in-focus object tracking, or autofocusing, system is generally composed of two steps. First, a depth scanning process is performed to estimate the location of the object relative to the focal plane of the imaging system. Second, the imaging optics are adjusted so that the focal plane is relocated to the object plane where the object is placed.
In a conventional focus tracking system, the defocusing distance of an object is usually estimated by moving the object along the depth direction [2]. However, this object scanning process is relatively slow because it involves mechanical movement such as the motion of a linear stage. Further, it can adversely affect the imaging quality because of the vibration caused by the stage movement. To minimize this problem, variable focusing methods have been studied, in which the depth scanning is performed by moving the focal plane of the imaging optics [4-7]. This approach has the merit of minimizing the mechanical movement, but it raises the problem of how to scan the focal plane accurately and smoothly.
In this paper, we propose an autofocus tracking system based on digital holographic microscopy (DHM) and an electrically tunable lens (ETL). The defocusing distance is estimated by taking a digital hologram (DH) of an object and numerically refocusing the image of the object [8-11], and the focal plane is adjusted by the fast and precise focal length change of an ETL. Recently, great attention has been focused on combining DHM and ETL [12-19]. An ETL has been used to modulate the phase of the illumination or the reference wave [12-14], or to compensate for the phase distortion that occurs in the sample arm of a DHM system [15-17]. On the other hand, the ETL itself has been used as a pure phase sample in a DHM system [15, 18], or used to obtain a refocused hologram for a phase retrieval experiment [19]. To the best of our knowledge, however, an autofocus tracking system based on DHM and an ETL has not been reported yet. In addition, to obtain more accurate and coincident measurements, the DH capture and the brightfield image capture are made simultaneously with a single camera. The different RGB color pixels of a color camera are used as different imaging channels. In our experiment, the red channel is used to record the DH of the object under imaging and the blue channel to take the optical image of the same object at the same time.
The proposed method enables continuous imaging without any physical scanning, allowing us to track an object such as a floating cell even as it drifts along the depth direction.
II. METHODS
We have implemented the autofocus tracking process by two steps: the defocusing distance estimation by DHM and the focal plane shifting by ETL.
2.1. Defocusing Distance Estimation by DHM
To get a DH, a Mach-Zehnder interferometry setup is configured as shown in Fig. 1. A He-Ne laser beam (Laser; HNLS008L, Thorlabs, λ = 632.8 nm) is divided into an object wave and a reference wave by a beam splitter (BS1).
The reference wave is slightly tilted with a reference mirror (M2) to make an off-axis geometry and then recombined with the object wave through the second beam splitter (BS2). The interference image, or the DH, is recorded by the red channel of a color camera (CM; MQ003CG-CM, XIMEA). By illuminating the object with a blue light-emitting diode (LED; M470L3, Thorlabs, λ = 470 nm, bandwidth = 25 nm) from the opposite side of the system, the optical image of the object can be obtained simultaneously by the blue channel of the same camera. The recorded DH is converted into the spectral domain by Fourier transform to obtain its angular spectrum. Then, Fourier-domain filtering is applied to the angular spectrum to select the region of interest corresponding only to the object spectrum [20, 21].
In Fig. 1, it is assumed that the object S located at the focal plane d0 is well imaged at the hologram plane denoted by h0, which coincides with the camera plane. When the object moves to another object plane d for some reason, the image plane shifts to h, away from the original hologram plane at h0. However, since the camera remains at h0, the image of the object at d is, in general, defocused; that is, the camera captures a defocused image.
FIG. 1. Schematic of the proposed autofocus tracking system. Laser: He-Ne laser, BS1,2: beam splitters, M1,2: mirrors, S: object, RL1,2: relay lenses, ETL: electrically tunable lens, CM: camera, CL: condenser lens, LED: light-emitting diode, ∆h: distance between the hologram plane h0 and the image plane h, ∆d: defocusing distance between the focal plane d0 and the object plane d.
However, it is well known that the complex object field \(O(x, y ; z=h)\) at z = h can be calculated from the complex field \(O\left(x, y ; z=h_{0}\right)\) taken at z = h0 with a DH. With the angular spectrum method [22], the focused image is reconstructed from the defocused image by numerically propagating the wave in free space over a propagation distance ∆h = h - h0 as
\(O(x, y, h)=F^{-1}\left\{\text {filter }\left[F\left\{O\left(x, y, h_{0}\right)\right\}\right] \exp \left[i k_{z} \Delta h\right]\right\}\), (1)
where F represents the Fourier transform, ‘filter’ represents filtering in the Fourier domain, and k is the wave number, with an axial component of \(k_{z}=\sqrt{k^{2}-k_{x}^{2}-k_{y}^{2}}\).
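As an illustration of Eq. (1), the free-space propagation can be sketched with NumPy as below. The pixel pitch `dx`, the grid size, and the function name are assumptions for illustration, not values from our setup; evanescent components (kz² < 0) are simply suppressed.

```python
import numpy as np

def angular_spectrum_propagate(field, dz, wavelength, dx):
    """Propagate a complex field O(x, y) by dz via the angular spectrum method (Eq. 1)."""
    ny, nx = field.shape
    k = 2 * np.pi / wavelength
    # Transverse wave-number grids, matching the FFT sample ordering
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    kz_sq = k**2 - KX**2 - KY**2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    # Transfer function exp(i kz dz); evanescent components are dropped
    H = np.where(kz_sq > 0, np.exp(1j * kz * dz), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Propagating forward by ∆h and then backward by -∆h returns the original field, which is a convenient sanity check of the transfer function.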
The best in-focus image plane location hbest is determined by utilizing an image contrast function C(h). At each pixel (x, y) of an image numerically reconstructed at z = h, the intensity deviation from the mean intensity 〈O〉 of the whole image is calculated [8, 23, 24] as
\(C(h)=\frac{1}{N_{x} N_{y}} \sum_{x y}[|O(x, y, h)|-\langle O\rangle]^{2}\) (2)
where Nx and Ny are the numbers of pixels in each dimension. This method is based on the statistical analysis of the gray value distribution. In other words, the image of an edge-like structure, such as a resolution target, has a higher contrast C when the image is better-focused [25, 26].
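Eq. (2) amounts to the variance of the reconstructed amplitude about its mean, which can be sketched in a few lines (the function name is ours):

```python
import numpy as np

def contrast(field):
    """Image contrast C(h) of Eq. (2): mean squared deviation of |O| from its mean."""
    amp = np.abs(field)
    return np.mean((amp - amp.mean()) ** 2)
```

A uniform image gives C = 0, while a high-contrast edge pattern gives a large C, consistent with the focus criterion described above.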
In the experiment, for a given defocusing distance ∆d of an object, the propagation distance ∆h was determined by finding the h that gives the maximum C while varying h in small steps. The defocusing distance ∆d of the object was varied in steps of 1 mm using a linear stage. Figure 2 shows the propagation distance ∆h obtained in terms of ∆d. It is well fitted with a linear curve of
FIG. 2. The propagation distance ∆h measured in terms of the defocusing distance ∆d. The red line is a linear fitting curve.
\(\Delta d=-0.94086 \times \Delta h+0.13033\) (3)
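The search described above, combined with the calibration of Eq. (3), can be sketched as follows. The candidate grid, pixel pitch, and test object below are assumptions for illustration, not our experimental values; candidate distances are in meters and the returned defocus in millimeters.

```python
import numpy as np

def find_best_dh(obj_wave, wavelength, dx, dh_candidates):
    """Propagate the object wave over candidate distances (meters), pick the one
    maximizing the contrast C of Eq. (2), and map it to the object's defocusing
    distance with the linear calibration of Eq. (3) (millimeters)."""
    ny, nx = obj_wave.shape
    k = 2 * np.pi / wavelength
    KX, KY = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx, dx),
                         2 * np.pi * np.fft.fftfreq(ny, dx))
    kz_sq = k**2 - KX**2 - KY**2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    spec = np.fft.fft2(obj_wave)
    best_dh, best_c = dh_candidates[0], -np.inf
    for dh in dh_candidates:
        # Angular-spectrum propagation (Eq. 1), evanescent components dropped
        amp = np.abs(np.fft.ifft2(spec * np.where(kz_sq > 0, np.exp(1j * kz * dh), 0)))
        c = np.mean((amp - amp.mean()) ** 2)   # contrast C(h), Eq. (2)
        if c > best_c:
            best_dh, best_c = dh, c
    dd_mm = -0.94086 * (best_dh * 1e3) + 0.13033   # Eq. (3)
    return best_dh, dd_mm
```

For a sparse bright target, the contrast metric peaks at focus because free-space propagation preserves the total intensity while spreading the amplitude, which lowers the variance of |O|.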
2.2. Focal Plane Shifting by ETL
To maintain the in-focus status, it is necessary to adjust the focal plane of the imaging system so that the camera always captures the well-focused image of the object regardless of its position in the object plane. In other words, the image plane h of the object located at d is optically shifted to the camera plane h0. For this, a 4f relay system was constructed with two relay lenses, RL1 and RL2 (AC254-100-A-ML, Thorlabs). Achromatic lenses were used to minimize the chromatic aberration that could be caused by the two light sources of different wavelengths. By placing an ETL (EL-10-30-C-VIS-LD-MV, Optotune) at the confocal plane of the system as shown in Fig. 3, the focal plane could be widely shifted with the ETL without appreciably affecting the magnification ratio of the imaging optics. The dioptric power (the reciprocal of the focal length) of the ETL, with a negative offset lens (-150 mm), could be adjusted from -1.5 to 3.5 diopters by controlling the driving current up to 250 mA. The ETL has a container filled with an optical fluid, sealed with an elastic polymer membrane. An electromagnetic actuator adjusts the pressure on the container, which changes the focal length of the ETL [27].
FIG. 3. Optics simulation of the focal plane shifting with ETL in a 4\(f\) relay system. By changing the dioptric power of the ETL with a driving current, the focal plane could be shifted up to 47.6 mm while keeping the image plane at the camera.
One important feature of the 4f relay system is that the chief ray, the ray passing through the confocal point, is parallel to the optical axis outside the two relay lenses. Therefore, by placing an ETL at the confocal plane of the two lenses, we can move the focal plane without shifting the image plane from the camera plane, or the hologram plane of Fig. 1. Thus, even when an object is located out of the focal plane of the original system, by adjusting the ETL, the camera fixed at the hologram plane can image it without changing the image magnification ratio [28]. The relationship among the focal length \(f_{ETL}\) of the ETL, the focal length \(f\) of the relay lens, and the defocusing distance ∆d of the object in the 4f relay system is given as [28, 29]:
\(\Delta d=-\frac{f^{2}}{f_{E T L}}\) (4)
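As a quick numerical illustration of Eq. (4) with the 100 mm relay lenses used here (the helper name and the diopter-based parameterization are our own), a dioptric power of +1 D on the ETL corresponds to a focal-plane shift of -10 mm:

```python
def defocus_shift_mm(f_relay_mm, etl_diopters):
    """Eq. (4): focal-plane shift for an ETL at the confocal plane of a 4f relay.
    The ETL focal length (mm) is the reciprocal of its dioptric power (1/m)."""
    if etl_diopters == 0:
        return 0.0          # flat ETL: no shift
    f_etl_mm = 1000.0 / etl_diopters
    return -(f_relay_mm ** 2) / f_etl_mm
```

With f = 100 mm, the dioptric range of -1.5 to 3.5 D quoted above corresponds to shifts between +15 mm and -35 mm, comfortably covering the ±10 mm defocus range used in the experiments.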
In the experiment, the object was initially placed at the position d0, called the focal plane, while applying the zero-diopter current to the ETL. While displacing the object in steps along the optical axis, the image of the object was captured by the camera and the image focus was adjusted by the ETL current. For each defocusing distance ∆d, the ETL current giving the best-focused image was measured with Eq. (2) and depicted in Fig. 4. The data set was well fitted with a 2nd order polynomial curve of
FIG. 4. The current change i of the ETL measured in terms of the defocusing distance ∆d. The red line is a 2nd order polynomial fitting curve.
\(i=-0.03226 \times \Delta d^{2}+6.35295 \times \Delta d-0.13588\), (5)
where i represents the electrical current change to the ETL, with an offset current of 140 mA providing zero diopters.
Further, by combining Eqs. (3) and (5), we can directly get the ETL current change necessary to compensate the defocusing distance. With the propagation distance ∆h calculated from the DH taken with an out-of-focus object, we can get the ETL current change giving the well-focused optical image of the object. As shown in Fig. 5, the data is well fitted with a 2nd order polynomial fitting curve of
FIG. 5. The relation between the propagation distance ∆h and the current change i of the ETL, obtained by combining the results of Figs. 2 and 4. The red line is a 2nd order polynomial fitting curve.
\(i=-0.02856 \times \Delta h^{2}-5.96935 \times \Delta h+0.69155\) (6)
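Eq. (6) should be consistent with simply chaining Eqs. (3) and (5); the small sketch below (function names are ours) checks that the composed calibration and the direct fit agree over the ±10 mm working range.

```python
def current_from_dh_composed(dh_mm):
    """ETL current change (mA) via Eq. (3) followed by Eq. (5)."""
    dd = -0.94086 * dh_mm + 0.13033                        # Eq. (3)
    return -0.03226 * dd**2 + 6.35295 * dd - 0.13588       # Eq. (5)

def current_from_dh_direct(dh_mm):
    """ETL current change (mA) via the direct fit of Eq. (6)."""
    return -0.02856 * dh_mm**2 - 5.96935 * dh_mm + 0.69155
```

Over ∆h from -10 mm to 10 mm the two expressions differ by well under 0.01 mA, so either form may be used to drive the ETL.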
III. RESULTS
The numerical image reconstruction has been performed with the process shown in Fig. 6. A resolution target was located at the distance d of Fig. 1 and its DH was captured with the camera at the hologram plane h0, which gave the defocused hologram O(x,y,h0) of the object as in Fig. 6(a). The Fourier transform of the hologram, F{O(x,y,h0)}, was taken as in Fig. 6(b). After filtering only the object angular spectrum as in Fig. 6(c), the filtered angular spectrum was shifted to the center of the coordinates as in Fig. 6(d). The inverse Fourier transform gave the out-of-focus object wave field of Fig. 6(e). Finally, after numerically propagating it over the distance ∆h, we obtained the well-focused image O(x,y,h) as in Fig. 6(f).
FIG. 6. Images of a resolution target, numerically reconstructed by using off-axis DHM. (a) Defocused hologram captured at z = h0; O(x,y,h0), (b) Fourier transform of the hologram; F{O(x,y,h0)}, (c) Filtered angular spectrum; filter[F{O(x,y,h0)}], (d) Shifted angular spectrum, (e) Inverse Fourier transform of the shifted angular spectrum; the out-of-focus object wave, (f) The well-focused image obtained with numerical refocusing; O(x,y,h).
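The Fourier-domain steps of Fig. 6(b)-(d), i.e. forward transform, sideband cropping, and re-centering, can be sketched as below; the sideband location `center` and the crop size are assumptions that, in practice, would be found from the peak of the spectrum.

```python
import numpy as np

def extract_object_wave(hologram, center, half_size):
    """Off-axis demodulation: Fourier-transform the hologram (Fig. 6(b)),
    crop the object sideband (Fig. 6(c)), paste it at the spectrum center
    (Fig. 6(d)), and inverse-transform to get the object wave (Fig. 6(e))."""
    F = np.fft.fftshift(np.fft.fft2(hologram))
    ny, nx = F.shape
    r, c = center
    # Crop the object's angular-spectrum sideband
    roi = F[r - half_size:r + half_size, c - half_size:c + half_size]
    # Paste it at the center of an otherwise empty spectrum
    centered = np.zeros_like(F)
    cy, cx = ny // 2, nx // 2
    centered[cy - half_size:cy + half_size, cx - half_size:cx + half_size] = roi
    return np.fft.ifft2(np.fft.ifftshift(centered))
```

The recovered wave is in general still defocused, as in Fig. 6(e); the numerical propagation of Eq. (1) then yields the focused image of Fig. 6(f).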
Further, with the numerically obtained propagation distance ∆h, the ETL current for the best-focused optical image was calculated with Eq. (6). After tuning the ETL, the focused image was captured with the blue channel of the same camera. The hologram images O(x,y,h0) obtained with the red channel are shown in the first row of Fig. 7. The experiment was performed five times while moving the defocusing distance from -10 mm to 10 mm at an interval of 5 mm. The middle row shows the optical images of the object captured with the blue channel. We can see that the images blurred as the defocusing distance increased. The last row shows the optical images well focused by the ETL with the current change of Eq. (6). The ETL current changes were -66.4 mA, -33.0 mA, 0 mA, 34.2 mA, and 65.4 mA, respectively, from the left. As a result, we can confirm that the image was well autofocused even with a wide movement of the object over a range of 20 mm. The autofocus tracking time depends on how many times the hologram is refocused. In our experiment, the hologram was refocused 10 times to estimate the refocusing distance. The average autofocus tracking time was measured as 0.707 s, averaged over 20 measurements. Though it is not fast enough for real-time tracking, we expect that program optimization can improve it.
FIG. 7. Experimentally obtained images of a resolution target. The first row shows the holograms obtained with the red channel of the camera. The second row shows the images obtained with the blue channel of the camera. The third row shows the images captured after being optically focused by the ETL. Each column was obtained with a defocusing distance of -10 mm, -5 mm, 0 mm, 5 mm, or 10 mm from the left, respectively.
It is noteworthy that the images in the first row and the second row of Fig. 7 look different. Even allowing that one is a hologram and the other is an optical image, their focusing states look rather different. This is mainly due to the chromatic aberration of the optics; the red channel and the blue channel of the system may have appreciably different optical specifications. Although achromatic lenses were used for the relay lenses, the chromatic aberrations were not removed completely. Further intensive and detailed study is necessary to fully utilize the RGB color channels of a camera.
IV. CONCLUSION
We have implemented an autofocus tracking system based on off-axis digital holographic microscopy (DHM) and an electrically tunable lens (ETL). By using the numerical refocusing of DHM, the defocusing distance of an object placed at an out-of-focus position was calculated. With the calculated defocusing distance, the focal length of the ETL was tuned directly to give a focused image of the object. Experiments have confirmed that the defocus of a drifting object could be corrected instantaneously without mechanical or optical scanning. By placing the ETL at the confocal plane of a 4f relay system, without changing the image magnification ratio, the focus tracking could be made over a wide shift of the object of up to 20 mm. In addition, the hologram and the optical image could be captured at the same time by utilizing the red and blue channels of a digital color camera. The results of this study are expected to be of great help in maintaining the in-focus status for in-vivo imaging where the sample can move or the focus is unstable.
ACKNOWLEDGEMENT
This work was supported by the Industrial Strategic Technology Development Program (No. 10048888) funded by the Ministry of Trade, Industry and Energy (MOTIE, South Korea) and by the GIST Research Institute (GRI) grant funded by the GIST in 2018.
References
- J.-M. Geusebroek, F. Cornelissen, A. W. M. Smeulders, and H. Geerts, "Robust autofocusing in microscopy," Cytometry 39, 1-9 (2000). https://doi.org/10.1002/(SICI)1097-0320(20000101)39:1<1::AID-CYTO2>3.0.CO;2-J
- G. Rabut and J. Ellenberg, "Automatic real-time three-dimensional cell tracking by fluorescence microscopy," J. Microsc. 216, 131-137 (2004). https://doi.org/10.1111/j.0022-2720.2004.01404.x
- R. Redondo, G. Cristobal, G. B. Garcia, O. Deniz, J. Salido, M. del M. Fernandez, J. Vidal, J. C. Valdiviezo, R. Nava, B. Escalante-Ramirez, and M. Garcia-Rojo, "Autofocus evaluation for brightfield microscopy pathology," J. Biomed. Opt. 17, 036008 (2012). https://doi.org/10.1117/1.JBO.17.3.036008
- M. E. Bravo-Zanoguera, C. A. Laris, L. K. Nguyen, M. Oliva, and J. H. Price, "Dynamic autofocus for continuous-scanning time-delay-and-integration image acquisition in automated microscopy," J. Biomed. Opt. 12, 034011 (2007). https://doi.org/10.1117/1.2743078
- M. Duocastella, B. Sun, and C. B. Arnold, "Simultaneous imaging of multiple focal planes for three-dimensional microscopy using ultra-high-speed adaptive optics," J. Biomed. Opt. 17, 050505 (2012). https://doi.org/10.1117/1.JBO.17.5.050505
- Z. Wang, M. Lei, B. Yao, Y. Cai, Y. Liang, Y. Yang, X. Yang, H. Li, and D. Xiong, "Compact multi-band fluorescent microscope with an electrically tunable lens for autofocusing," Biomed. Opt. Express 6, 4353-4364 (2015). https://doi.org/10.1364/BOE.6.004353
- Y. Nakai, M. Ozeki, T. Hiraiwa, R. Tanimoto, A. Funahashi, N. Hiroi, A. Taniguchi, S. Nonaka, V. Boilot, R. Shrestha, J. Clark, N. Tamura, V. M. Draviam, and H. Oku, "High-speed microscopy with an electrically tunable lens to image the dynamics of in vivo molecular complexes," Rev. Sci. Instrum. 86, 013707 (2015). https://doi.org/10.1063/1.4905330
- P. Langehanenberg, B. Kemper, D. Dirksen, and G. von Bally, "Autofocusing in digital holographic phase contrast microscopy on pure phase objects for live cell imaging," Appl. Opt. 47, D176 (2008). https://doi.org/10.1364/AO.47.00D176
- P. Langehanenberg, L. Ivanova, I. Bernhardt, S. Ketelhut, A. Vollmer, D. Dirksen, G. Georgiev, G. von Bally, and B. Kemper, "Automated three-dimensional tracking of living cells by digital holographic microscopy," J. Biomed. Opt. 14, 014018 (2009). https://doi.org/10.1117/1.3080133
- P. Gao, B. Yao, J. Min, R. Guo, B. Ma, J. Zheng, M. Lei, S. Yan, D. Dan, and T. Ye, "Autofocusing of digital holographic microscopy based on off-axis illuminations," Opt. Lett. 37, 3630 (2012). https://doi.org/10.1364/OL.37.003630
- T. Cacace, M. Paturzo, P. Memmolo, M. Vassalli, P. Ferraro, M. Fraldi, and G. Mensitieri, "Digital holography as 3D tracking tool for assessing acoustophoretic particle manipulation," Opt. Express 25, 17746 (2017). https://doi.org/10.1364/OE.25.017746
- B. Kemper, R. Schubert, S. Dartmann, A. Vollmer, S. Ketelhut, and G. von Bally, "Improved quantitative phase contrast in self-interference digital holographic microscopy and sensing dynamic refractive index changes of the cytoplasm using internalized microspheres as probes," Proc. SPIE 8589, 85890M (2013).
- R. Schubert, A. Vollmer, S. Ketelhut, and B. Kemper, "Enhanced quantitative phase imaging in self-interference digital holographic microscopy using an electrically focus tunable lens," Biomed. Opt. Express 5, 4213-4222 (2014). https://doi.org/10.1364/BOE.5.004213
- C. Trujillo, A. Doblas, G. Saavedra, M. Martinez-Corral, and J. Garcia-Sucerquia, "Phase-shifting by means of an electronically tunable lens: quantitative phase imaging of biological specimens with digital holographic microscopy," Opt. Lett. 41, 1416 (2016). https://doi.org/10.1364/OL.41.001416
- W. Qu, C. Y. Cheng, and A. Asundi, "Physical phase compensation in digital holographic microscopy by using of electrical tunable lens," Proc. SPIE 8769, 87693C (2013).
- D. Deng, J. Peng, W. Qu, Y. Wu, X. Liu, W. He, and X. Peng, "Simple and flexible phase compensation for digital holographic microscopy with electrically tunable lens," Appl. Opt. 56, 6007-6014 (2017). https://doi.org/10.1364/AO.56.006007
- Z. Wang, W. Qu, F. Yang, A. Tian, and A. Asundi, "Absolute measurement of aspheric lens with electrically tunable lens in digital holography," Opt. Lasers Eng. 88, 313-318 (2017). https://doi.org/10.1016/j.optlaseng.2016.09.002
- Z. Wang, W. Qu, F. Yang, and A. K. Asundi, "Focal length calibration of an electrically tunable lens by digital holography," Appl. Opt. 55, 749-756 (2016). https://doi.org/10.1364/AO.55.000749
- D. Deng, W. Qu, W. He, Y. Wu, X. Liu, and X. Peng, "Phase retrieval for digital holographic microscopy with defocused holograms," IEEE Photon. J. 10, 1-9 (2018).
- M. K. Kim, "Principles and techniques of digital holographic microscopy," SPIE Rev. 1, 018005 (2010).
- X. Yu, J. Hong, C. Liu, and M. K. Kim, "Review of digital holographic microscopy for three-dimensional profiling and tracking," Opt. Eng. 53, 112306 (2014). https://doi.org/10.1117/1.OE.53.11.112306
- M. K. Kim, L. Yu, and C. J. Mann, "Interference techniques in digital holography," J. Opt. A: Pure Appl. Opt. 8, S518- S523 (2006). https://doi.org/10.1088/1464-4258/8/7/S33
- J. Álvarez-Borrego, "Fast autofocus algorithm for automated microscopes," Opt. Eng. 44, 063601 (2005). https://doi.org/10.1117/1.1925119
- P. Memmolo, L. Miccio, M. Paturzo, G. D. Caprio, G. Coppola, P. A. Netti, and P. Ferraro, "Recent advances in holographic 3D particle tracking," Adv. Opt. Photonics 7, 713 (2015). https://doi.org/10.1364/AOP.7.000713
- P. Langehanenberg, B. Kemper, and G. von Bally, "Autofocus algorithms for digital-holographic microscopy," Proc. SPIE 6633, 66330E (2007).
- H. Wang, A. Qin, and M. Huang, "Autofocus method for digital holographic reconstruction of microscopic object," in Proc. Symposium on Photonics and Optoelectronics (China, Aug. 2009), pp. 1-4.
- M. Blum, M. Bueler, C. Gratzel, and M. Aschwanden, "Compact optical design solutions using focus tunable lenses," Proc. SPIE 8167, 81670W (2011).
- J. W. Kim, J. S. Ahn, J. B. Eom, and B. H. Lee, "Magnification-invariant surface profiling technique for structured illumination imaging and microscopy," Opt. Commun. 434, 257-263 (2019). https://doi.org/10.1016/j.optcom.2018.10.052
- C.-Y. Lin, W.-H. Lin, J.-H. Chien, J.-C. Tsai, and Y. Luo, "In vivo volumetric fluorescence sectioning microscopy with mechanical-scan-free hybrid illumination imaging," Biomed. Opt. Express 7, 3968-3978 (2016). https://doi.org/10.1364/BOE.7.003968