
OPTICS LETTERS / Vol. 39, No. 17 / September 1, 2014

High resolution imaging and wavefront aberration correction in plenoptic systems

J. M. Trujillo-Sevilla,1,* L. F. Rodríguez-Ramos,2 I. Montilla,2 and J. M. Rodríguez-Ramos3

1 University of La Laguna, Spain
2 Institute of Astrophysics of the Canary Islands, Spain
3 Center for Biomedical Research of the Canary Islands, University of La Laguna, Spain
*Corresponding author: [email protected]

Received June 10, 2014; revised July 18, 2014; accepted July 21, 2014; posted July 21, 2014 (Doc. ID 213857); published August 19, 2014

Plenoptic imaging systems are becoming more common since they provide capabilities unattainable in conventional imaging systems, but one of their main limitations is their poor two-dimensional resolution. By combining wavefront phase measurement with plenoptic image deconvolution, we propose a system capable of improving the resolution when a wavefront aberration is present and the image is blurred. In this work, a plenoptic system is simulated using Fourier optics, and the results show that an improved resolution is achieved, even in the presence of strong wavefront aberrations. © 2014 Optical Society of America
OCIS codes: (110.0115) Imaging through turbulent media; (100.1830) Deconvolution; (120.5050) Phase measurement.
http://dx.doi.org/10.1364/OL.39.005030

Plenoptic cameras are devices intended to capture the lightfield, or plenoptic function, which was defined by Adelson [1] as the intensity of every light ray in a volume as a function of wavelength, time, and position. The first lightfield capturing device was proposed in 1908 by G. Lippmann, who had the idea of covering a photographic film with an array of spherical lenslets in order to render several points of view in a single image [2]. A modern plenoptic camera comprises a main lens with a fixed flange focal distance, a microlens array, and an image sensor. The microlens array is placed in the focal plane of the main lens, and the sensor is placed in the focal plane of the microlenses (Fig. 1). This scheme was proposed by Ives [3] in 1930. Images formed by the microlenses would overlap if the f-numbers of the main lens and of the microlenses do not match.

Recent research has explored the capabilities of plenoptic cameras. In the computational photography field, they make a posteriori refocusing [4,5] and 3D reconstruction [6] possible. Plenoptic sensors can also work as wavefront sensors, using either point sources [7] or extended objects [8]. Wavefront recovery is not limited to a single layer; it is also possible to recover tomographic information using a plenoptic sensor [9].

One of the main limitations of plenoptic cameras is the final resolution of the images, typically limited to one pixel per microlens. Several methods to improve the final image resolution have been proposed [6,10], based on interpolation of the lightfield information. One of the most promising methods can be found in [11]: it recovers the full resolution of the original object, with the drawback of being very computationally intensive.
All the above methods exploit the lightfield capturing ability and optical characteristics of plenoptic systems to improve the final resolution, but none exploits the wavefront phase measurement to eliminate the effect of wavefront aberrations. In this work we propose a method that not only recovers the full object resolution but also eliminates the blurring caused by wavefront aberration in the final image.

Fig. 1. Schematic of a basic plenoptic sensor.

The plenoptic images used to test this technique have been simulated using Fourier optics. Depending on the parameters of the system (f-number and resolution), either Fraunhofer or Fresnel propagation can be used, as both are computationally efficient. In this work we assume Fresnel propagation, since the microlenses used in imaging systems commonly have a Fresnel number of 10 or less; in other cases, good approximate results can be obtained with Fraunhofer propagation [12].

Let U_i(s, t) be the input field, affected by a phase screen Φ placed just before the main lens (of focal length z_ml) that aberrates the wavefront (Fig. 2). The field at the microlens array plane can be written as

U_1(s,t) = \mathcal{F}^{-1}\left\{ \mathcal{F}\left\{ P(s,t)\, U_i(s,t)\, \exp\!\left[-jk\left(\frac{s^2 + t^2}{2 z_{ml}} + \Phi(s,t)\right)\right] \right\} \exp\!\left[-j\pi\lambda z_{ml}\left(f_s^2 + f_t^2\right)\right] \right\},   (1)

where \mathcal{F} and \mathcal{F}^{-1} denote the direct and inverse Fourier transforms, respectively, P(s, t) is the pupil function, k is the wavenumber, and λ is the wavelength. f_s and f_t are the spatial frequency variables associated with the coordinates s and t, respectively.

Fig. 2. Schematic of the simulated system. The phase screen layer is placed in the main lens pupil. The object is in the conjugate plane of the main lens.

For a microlens array comprising M × N lenslets (of focal length z_μl) with indexes (m, n), each lenslet illuminates P × Q pixels with coordinates (ξ, η) relative to the microlens origin. The field produced at the sensor by the (m, n)th microlens can be calculated with Eq. (2), where f_ξ and f_η are the spatial frequency variables associated with the coordinates ξ and η, respectively:

U_{m,n}(\xi,\eta) = \mathcal{F}^{-1}\left\{ \mathcal{F}\left\{ U_1\big(P(m-1)+\xi,\, Q(n-1)+\eta\big)\, \exp\!\left[\frac{-jk}{2 z_{\mu l}}\left(\xi^2 + \eta^2\right)\right] \right\} \exp\!\left[-j\pi\lambda z_{\mu l}\left(f_\xi^2 + f_\eta^2\right)\right] \right\}.   (2)

The final plenoptic image, referred to the absolute sensor coordinates (u, v), is obtained by arranging the field formed by each lenslet and computing the squared modulus:

I(u,v) = \left| U_{m,n}\big(u - (m-1)P,\, v - (n-1)Q\big) \right|^2,   (3)

where m and n range from 1 to M and N, respectively.

We used this method to generate a simulated plenoptic image. This image contains information about the phase screen, and that information can be recovered because the plenoptic sensor can also work as a wavefront sensor; indeed, the plenoptic sensor has been proposed in [8] to measure wavefront phase using extended objects as a reference. With this method, the starting point is to generate synthetic aperture images by recomposition of the plenoptic image. Every point in the pupil can be synthesized (or reimaged) by rearranging pixels of the plenoptic image: the image formed by putting together one pixel from every microlens (with the same position relative to each microlens center) is a synthetic aperture image of the pupil (Fig. 3).

From the synthesized aperture images, the wavefront phase gradients can be obtained by cross correlating every image with one of them taken as a reference. The position of the correlation peak is proportional to the relative average tilt of the wavefront over the area subtended by the synthetic aperture. The cross correlation may not work well on images lacking enough information to accurately calculate the displacements between them; the contrast and the spatial frequency content of the object intensity are the predominant factors in the quality of the recovered phase.
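As an illustration of the simulation pipeline of Eqs. (1)-(3), the following sketch propagates a field to the microlens plane and tiles the microlens sub-images. It is not the authors' code: the grid size, pixel pitch, wavelength, focal lengths, pupil, and the flat input field and phase screen are all placeholder values chosen for the example.

```python
import numpy as np

def fresnel(U, dx, lam, z):
    """Fresnel propagation of a square field U over distance z
    (transfer-function method: multiply the spectrum by the kernel
    exp(-j*pi*lam*z*(fs^2 + ft^2)))."""
    f = np.fft.fftfreq(U.shape[0], d=dx)
    fs, ft = np.meshgrid(f, f, indexing="ij")
    H = np.exp(-1j * np.pi * lam * z * (fs**2 + ft**2))
    return np.fft.ifft2(np.fft.fft2(U) * H)

# Hypothetical sampling: 64 x 64 grid, 10 um pitch, 633 nm wavelength,
# 0.4 m main lens focal length, 1 mm microlens focal length, 8 x 8
# pixels per microlens. None of these values come from the Letter.
Ngrid, dx, lam = 64, 10e-6, 633e-9
z_ml, z_ul, P, Q = 0.4, 1e-3, 8, 8
k = 2 * np.pi / lam

x = (np.arange(Ngrid) - Ngrid // 2) * dx
s, t = np.meshgrid(x, x, indexing="ij")
pupil = (s**2 + t**2 <= (Ngrid * dx / 4) ** 2).astype(float)  # circular pupil
U_i = np.ones((Ngrid, Ngrid), dtype=complex)                  # plane-wave input
Phi = np.zeros((Ngrid, Ngrid))                                # phase screen (flat here)

# Eq. (1): pupil, lens phase, and phase screen, then propagation to
# the microlens array plane.
U_1 = fresnel(pupil * U_i * np.exp(-1j * k * ((s**2 + t**2) / (2 * z_ml) + Phi)),
              dx, lam, z_ml)

# Eqs. (2) and (3): propagate each P x Q patch through its microlens
# and tile the squared moduli into the plenoptic image.
xi = (np.arange(P) - P // 2) * dx
XI, ETA = np.meshgrid(xi, xi, indexing="ij")
lens = np.exp(-1j * k * (XI**2 + ETA**2) / (2 * z_ul))  # microlens phase

I_pleno = np.zeros((Ngrid, Ngrid))
for m in range(Ngrid // P):
    for n in range(Ngrid // Q):
        patch = U_1[m*P:(m+1)*P, n*Q:(n+1)*Q]
        U_mn = fresnel(patch * lens, dx, lam, z_ul)            # Eq. (2)
        I_pleno[m*P:(m+1)*P, n*Q:(n+1)*Q] = np.abs(U_mn) ** 2  # Eq. (3)
```

Because the lens phases and the Fresnel kernel have unit modulus, the total energy of the tiled plenoptic image equals that of the field just after the pupil, which gives a quick sanity check on the sampling.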


Fig. 3. Left, aperture of the main lens. Right, sample plenoptic image. Forming an image from the pixels marked with “*” would synthesize a 4 × 4 pixel image with the point of view of point “a” on the main lens.
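The pixel rearrangement of Fig. 3 and the correlation-based slope measurement described above can be sketched as follows. The function names are hypothetical, and the sub-pixel interpolation of the correlation peak used in practice is omitted for brevity.

```python
import numpy as np

def synthetic_aperture(I_pleno, P, Q, i, j):
    """Synthetic aperture image for pupil point (i, j): take pixel
    (i, j) of every microlens image (the '*' pixels of Fig. 3)."""
    return I_pleno[i::P, j::Q]

def relative_tilt(img, ref):
    """Integer-pixel displacement of img with respect to ref, found as
    the peak of their circular cross correlation (computed with FFTs).
    The displacement is proportional to the average wavefront tilt over
    the area subtended by the synthetic aperture."""
    c = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(c), c.shape)
    # Map circular peak positions to signed shifts.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, c.shape))
```

For example, shifting a reference image by (3, -2) pixels and correlating it against the original recovers that displacement.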

A larger number of pixels P × Q illuminated by each microlens leads to a greater number of synthetic apertures, but each aperture image has lower resolution, so the object must be limited to lower spatial frequencies. A smaller number of pixels per microlens yields higher-resolution synthetic aperture images, but since fewer synthetic apertures are available, the final resolution of the recovered wavefront is also lower. Research on this topic can be found in [8], where constraints on the size of the microlenses with respect to the spatial frequency content of the object intensity are derived, resulting in a constraint on the microlens diameter d_μl:

d_{\mu l} \le \frac{1}{2\rho_c},   (4)

where ρ_c is the maximum measurable object frequency. As seen above, the quality of the recovered wavefront phase depends on the object intensity irregularities; for this reason, in this work the recovered phase is assumed to be a resampled version of the original phase screen, matched to the microlens resolution P × Q.

According to the Huygens-Fresnel principle, the object can be seen as a collection of point sources, each producing a spherical wave associated with its position (x, y). The system can then be characterized by its impulse response for each point in the object plane. Let IR_{x,y}(u, v) be the impulse response image for position (x, y) in the object plane, and I_pleno(u, v) a recorded plenoptic image of an unknown object I_object(x, y) affected by an unknown wavefront aberration. Note that the object resolution and the plenoptic image resolution need not match; here we consider an object resolution of R × S points and a sensor resolution of R × S pixels, so there are R · S impulse response images in total. The deconvolution can be accomplished by solving the linear inverse problem

[I_{pleno}] = [IR]\,[I_{object}],   (5)

where [I_pleno] is a vector created by reorganizing the plenoptic image into an R · S element vector, and [IR] is the observation matrix, obtained by reorganizing each IR_{x,y}(u, v) in the same way as [I_pleno] and concatenating the results into a large R · S × R · S matrix. This is the same method that can be found in [11], with the addition of the phase measurement obtained using only the information in the plenoptic image, so that not only the plenoptic image but also the wavefront aberration is deconvolved.

Fig. 4. Upper left, original image. Upper right, image blurred by the phase screen. Bottom left, plenoptic image. Bottom right, deconvolved image.

Fig. 5. Left, original object used for testing, based on the 1951 USAF chart. Center, image created at the microlens plane. Right, plenoptic image to be deconvolved.

In the ideal case, when the phase screen Φ can be obtained without error and there is no noise or quantization error in the sensor, the system can be solved directly:

[I_{object}] = [IR]^{-1}\,[I_{pleno}].   (6)
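When [IR] cannot be inverted exactly, Eq. (5) is instead solved as an iterative least-squares problem. The following toy sketch uses SciPy's LSQR solver [13] with a small random matrix standing in for the real observation matrix; the sizes and values are illustrative only.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)

# Toy stand-in for the observation matrix [IR]: (R*S) x (R*S) with
# R = S = 8. A real [IR] is assembled from the impulse-response images.
RS = 64
IR = rng.normal(size=(RS, RS))

I_object = rng.uniform(size=RS)   # unknown object, flattened to R*S elements
I_pleno = IR @ I_object           # simulated plenoptic measurement, Eq. (5)

# Iterative least-squares deconvolution. With sensor noise or
# quantization error added to I_pleno, this returns the least-squares
# estimate rather than the exact object.
I_rec = lsqr(IR, I_pleno, atol=1e-12, btol=1e-12, iter_lim=1000)[0]
```

For the full 512 × 512 problem of the Letter, [IR] would be stored as a sparse matrix, which is exactly the setting LSQR is designed for.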

In that case, the object intensity values can be completely restored because the system has a single solution (Fig. 4); i.e., knowing the exact phase screen leads to a full reconstruction of the object. In a more realistic case, not only does the recovered phase screen differ from the real one, but other effects such as quantization error and sensor noise are also present. For this reason it is not possible to solve the problem by simply inverting [IR]; iterative methods, such as the LSQR solver used in this work, are required [13].

To show the feasibility of this technique, several simulations have been performed varying the recovered phase screen resolution. The system comprises a 400 mm focal length main lens (f-number of 62.5) and a Kolmogorov phase screen with a Fried parameter of 0.4 mm (the Fried parameter is the diameter of a circular area over which the RMS wavefront aberration equals 1 radian). The original resolution of the object, phase screen, and plenoptic image is 512 × 512 pixels (Fig. 5). Every microlens image has a resolution of 32 × 32 pixels, which sets the maximum recovered phase screen resolution. The plenoptic image is simulated and sampled to eight bits, and it is then deconvolved using the phase screen resampled at different resolutions.

The results of the simulation can be seen in Fig. 6. Note that even using the full resolution of the phase screen, the full resolution of the original object is not achievable, because the simulated plenoptic sensor uses only eight bits and introduces quantization error. However, even with only eight-bit sampling, the final resolution is already improved with respect to the image that would be captured at the focal plane of the main lens without using any microlens

array, i.e., placing the sensor directly at that position. This improvement holds even when the resolution of the recovered phase screen is as low as 32 × 32 pixels. It should also be noted that a wavefront aberration with a larger Fried parameter has less high-frequency content and would therefore lead to better results even when the recovered phase screen resolution is lower.

Fig. 6. Deconvolved images and detail of the central groups. The resolution of the recovered phase screen varies between 256 × 256 and 8 × 8 pixels.

Figure 7 shows the quality of the deconvolved image as a function of the recovered phase screen resolution. The metric used is the structural similarity (SSIM), as it is better suited to measuring image degradation [14]; it ranges from 1 (perfect reconstruction of the image) to 0 (no reconstruction at all). The result shows that even when the phase screen is sampled at the microlens resolution, the quality of the deconvolution is near its maximum. This implies that, once the recovered phase screen resolution P × Q is high enough to sample the wavefront phase aberration, the optical resolution depends only on the object resolution R × S.

Fig. 7. Structural similarity (SSIM) versus recovered phase screen resolution.

The immunity of the method to noise has been analyzed by deconvolving the plenoptic images after adding Gaussian noise at signal-to-noise ratios (SNR) from 0 to 60 dB (Fig. 8). We found that for an SNR lower than 30 dB the image degradation is noticeable.

Fig. 8. Structural similarity (SSIM) between the object and the deconvolved image versus the SNR of the plenoptic image.

In this work, a deconvolution method has been proposed for a scenario in which a strong wavefront aberration is induced by a phase screen. The result is obtained using a single sensor, employed simultaneously for imaging and for wavefront phase sensing. To test this technique, a laboratory experiment has been designed (with the same optical parameters used in these simulations); its most difficult part is the calculation of the observation matrix of the system, which implies a calibration of the system. This will be the main objective of our future work.

This work was supported by the National R&D Program (Project AYA2012-32079) of the Ministry of Economy and Competitiveness, the European Regional Development Fund (ERDF), and the European Project FP7-REGPOT-2012-CT2012-31637-IMBRAIN.

References
1. E. H. Adelson and J. R. Bergen, Computational Models of Visual Processing (MIT, 1991), Vol. 1.
2. G. Lippmann, C. R. Acad. Sci. 146, 446 (1908).
3. H. E. Ives, J. Opt. Soc. Am. 20, 332 (1930).
4. J. G. Marichal-Hernández, J. P. Lüke, F. Rosa, F. Perez, and J. M. Rodriguez-Ramos, in 3DTV Conference: The True Vision - Capture, Transmission and Display of 3D Video (IEEE, 2009).
5. R. Ng, ACM Trans. Graph. 24, 735 (2005).
6. J. P. Lüke, F. Perez, J. G. Marichal-Hernandez, J. M. Rodriguez-Ramos, and F. Rosa, Int. J. Digital Multimedia Broadcast. 2010, 942037 (2009).
7. R. M. Clare and R. G. Lane, J. Opt. Soc. Am. A 22, 117 (2005).
8. L. F. Rodríguez-Ramos, Y. Martin, J. J. Diaz, J. Piqueras, and J. M. Rodriguez-Ramos, Proc. SPIE 7439, 74390I (2009).
9. J. M. Rodriguez-Ramos, B. Femenia, I. Montilla, L. F. Rodriguez-Ramos, J. G. Marichal-Hernandez, J. P. Lüke, R. Lopez, J. J. Diaz, and Y. Martin, in Proceedings of the 1st AO4ELT Conference - Adaptive Optics for Extremely Large Telescopes (EDP Sciences, 2010), pp. 22-26.
10. A. Lumsdaine and T. Georgiev, "Full resolution lightfield rendering," Technical Report (Indiana University and Adobe Systems, 2008).
11. S. A. Shroff and K. Berkner, Appl. Opt. 52, D22 (2013).
12. D. G. Voelz, Computational Fourier Optics: A MATLAB Tutorial (SPIE, 2011).
13. C. C. Paige and M. A. Saunders, ACM Trans. Math. Softw. 8, 43 (1982).
14. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, IEEE Trans. Image Process. 13, 600 (2004).
