Image formation analysis and high resolution image reconstruction for plenoptic imaging systems

Sapna A. Shroff* and Kathrin Berkner

Computational Optics Sensing and Visual Processing Research, Ricoh Innovations Inc., 2882 Sand Hill Road, Suite 115, Menlo Park, California 94025, USA

*Corresponding author

Received 14 November 2012; accepted 18 January 2013; posted 1 February 2013 (Doc. ID ); published 20 March 2013

Plenoptic imaging systems are often used for applications like refocusing, multimodal imaging, and multiview imaging. However, their resolution is limited to the number of lenslets. In this paper we investigate paraxial, incoherent, plenoptic image formation and develop a method to recover some of the resolution for the case of a two-dimensional (2D) in-focus object. This enables the recovery of a conventional-resolution, 2D image from the data captured in a plenoptic system. We show simulation results for a plenoptic system with a known response and Gaussian sensor noise. © 2013 Optical Society of America

1. Introduction

Plenoptic imaging systems are being explored in many computational imaging applications [1-3], such as computational axial refocusing, imaging different perspective angles, imaging with varying depths of field [4], and high dynamic range imaging. Plenoptic systems also find application in multimodal imaging [5] using a multimodal filter array in the plane of the pupil aperture. Each filter is imaged at the sensor, effectively producing a multiplexed image of the object for each imaging modality at the filter plane. The filter modalities may be spectral, polarization, neutral density, etc. A known drawback of plenoptic imaging systems is their loss of spatial resolution. A lenslet array at the conventional image plane spreads light on to the sensor. Every pixel behind a lenslet corresponds to a different angle or a different wavelength (in the case of multispectral imaging).
Therefore, each angle image, or each wavelength image, is restricted to as many pixels as there are lenslets. When the multiangle images are further processed to obtain a refocused image, the result is also low resolution. A solution to this issue has been discussed from the perspective of refocusing in prior work [6], using the repetition of an object point in multiple subimages, assuming priors on Lambertianity and texture statistics at the object surface, and using digital superresolution techniques. But that algorithm requires some amount of object defocus and does not perform when the object is in focus at the lenslet array. In the multimodal or multispectral imaging application, different pixels behind a lenslet give images with different modalities, such as different wavelength images or different dynamic range images. They are not processed further for refocusing. The object plane in this case is focused at the lenslet array, and the resultant low-resolution multimodal images do not give much perspective benefit. We did not find any literature supporting a higher resolution solution for the case of such an in-focus object. Plenoptic systems are flexible in allowing a user to obtain refocused images, multiview images, or multimodal images with minimal modifications of the same hardware. If it were possible to use the same hardware to obtain a two-dimensional (2D) image with resolution similar to that of a conventional imaging system, that would further increase the flexibility and usability of these systems. Keeping in mind the in-focus object necessary for the multimodal 2D

APPLIED OPTICS / Vol. 52, No. 10 / 1 April 2013

application, and the insufficient prior work on recovering high resolution for the in-focus object, in this paper we propose a method to extract an image with resolution approaching that obtained from a conventional imaging system. The solution we propose works for defocused objects as well. In this paper we do not discuss the multimodal, refocusing, or light-field aspects of plenoptic systems and concentrate only on in-focus resolution recovery. In order to investigate the possibility of recovering spatial resolution in this system, it is useful to understand image formation in a plenoptic system. Reference [7] is a related paper connecting Wigner distributions to light fields. Reference [8] also discusses light fields using generalized radiance and a radiance coding perspective. However, they do not discuss wavefront-propagation-based image formation. Reference [6] discusses an approximation to an image formation model, but their response matrix contains some geometric-optics-based approximations that do not support the in-focus case that interests us. Since we did not find any literature discussing detailed field propagation in a plenoptic system, in this paper we first analyze wavefront propagation for the case of paraxial, incoherent systems, such as fluorescence microscopes, use it to develop a forward image formation model, and propose a method to obtain high-resolution images. We provide simulation results for a 2D object such as may be used in multispectral plenoptic systems.

2. Incoherent Image Formation in a Paraxial Plenoptic System

We consider the layout of a general plenoptic system. Figure 1 is a schematic diagram of such a system, not drawn to scale. The object, denoted by the coordinates (ξ, η), is imaged by a main lens, which may be any general paraxial imaging system, considered, for the sake of simplicity, as a single thin lens. The spatial coordinates at the plane of the pupil aperture of the main lens are denoted by (x, y).
This lens produces a primary image of the object at the plane of the lenslet array, which is denoted by the coordinates (u, v). The main lens is considered to be a distance z1 from the object, and the lenslet array is placed at a distance z2 from the plane of the main lens. The sensor is placed a distance z3 from the lenslet array. We denote the sensor plane coordinates by (t, w). The focal lengths of the main lens and the lenslets are given by f1 and f2, and their diameters are D1 and D2, respectively.

Fig. 1. Schematic layout of a plenoptic imaging system, with the first subsystem containing the main lens (solid gray), and the second subsystem containing the microlens array (dotted gray) [10-12].

In this manuscript we define the term image as a replica of a target. The target in this paper may be the object, the pupil aperture, or filters in the pupil. We define each separately as we introduce them. The main lens produces an image of the object, henceforth called the primary image. This is considered the first imaging subsystem, marked in solid gray in Fig. 1. The lenslet array is placed approximately at the plane of this primary image. Each lenslet then images the pupil of the main lens at the sensor plane. This is the second imaging subsystem, marked in dotted gray in Fig. 1, which partially overlaps with the first subsystem, with the distance z2 in common. We refer to the data collected at the sensor behind a single lenslet as the pupil image, and the data collected behind the entire lenslet array as the plenoptic image. In order to analyze this optical layout and its impact on the data collected at the sensor, we perform a wave analysis of this system.

A.
Wave Propagation Analysis

Consider the first imaging subsystem, where the main lens has a generalized pupil function denoted as P1, and its corresponding impulse response is given as [9]

$$h_1(u,v;\xi,\eta) = \frac{e^{jkz_1} e^{jkz_2}}{\lambda^2 z_1 z_2} \exp\!\left[\frac{jk}{2z_2}(u^2+v^2)\right] \exp\!\left[\frac{jk}{2z_1}(\xi^2+\eta^2)\right] \iint dx\,dy\, P_1(x,y)\, \exp\!\left[\frac{jk}{2}\left(\frac{1}{z_1}+\frac{1}{z_2}-\frac{1}{f_1}\right)(x^2+y^2)\right] \exp\!\left\{-\frac{jk}{z_2}\left[(u-M\xi)x+(v-M\eta)y\right]\right\}, \quad (1)$$

where λ is the imaging wavelength, k = 2π/λ, and the magnification from the object to the primary image plane is M = -z2/z1. Substituting x' = x/(λz2) and y' = y/(λz2), the above equation becomes

$$h_1(u,v;\xi,\eta) = -M e^{jkz_1} e^{jkz_2} \exp\!\left[\frac{jk}{2z_2}(u^2+v^2)\right] \exp\!\left[\frac{jk}{2z_1}(\xi^2+\eta^2)\right] \iint dx'\,dy'\, P_1(\lambda z_2 x', \lambda z_2 y')\, \exp\!\left[\frac{jk}{2}\left(\frac{1}{z_1}+\frac{1}{z_2}-\frac{1}{f_1}\right)\!\left[(\lambda z_2 x')^2+(\lambda z_2 y')^2\right]\right] \exp\{-j2\pi[(u-M\xi)x' + (v-M\eta)y']\}. \quad (2)$$

Defining the Fourier transform of $P_1(\lambda z_2 x, \lambda z_2 y)\exp[\frac{jk}{2}(\frac{1}{z_1}+\frac{1}{z_2}-\frac{1}{f_1})((\lambda z_2 x)^2+(\lambda z_2 y)^2)]$ as $h_1'(u,v)$ makes Eq. (2)

$$h_1(u,v;\xi,\eta) = -M e^{jkz_1} e^{jkz_2} \exp\!\left[\frac{jk}{2z_2}(u^2+v^2)\right] \exp\!\left[\frac{jk}{2z_1}(\xi^2+\eta^2)\right] h_1'(u-M\xi,\, v-M\eta). \quad (3)$$

Substituting ξ' = Mξ and η' = Mη in Eq. (3), and imaging an object field U_o, we obtain the field of the primary image formed by the main lens,

$$U_i(u,v) = -\frac{e^{jkz_1} e^{jkz_2}}{M} \exp\!\left[\frac{jk}{2z_2}(u^2+v^2)\right] \iint d\xi'\,d\eta'\, U_o\!\left(\frac{\xi'}{M},\frac{\eta'}{M}\right) \exp\!\left[\frac{jk}{2z_1 M^2}(\xi'^2+\eta'^2)\right] h_1'(u-\xi',\, v-\eta')$$
$$= -\frac{e^{jkz_1} e^{jkz_2}}{M} \exp\!\left[\frac{jk}{2z_2}(u^2+v^2)\right] \left\{U_o\!\left(\frac{u}{M},\frac{v}{M}\right) \exp\!\left[\frac{jk}{2z_1 M^2}(u^2+v^2)\right]\right\} * h_1'(u,v), \quad (4)$$

where the symbol * denotes a convolution. In a conventional system the sensor would be placed at this plane and |U_i|^2 would give the intensity of the conventional image. In a plenoptic system [2], however, a lenslet array is placed at this primary image plane. We assume each lenslet has a diameter D2, focal length f2, a pupil function given by P2, and there are M × N such lenslets in the array. The field after traversing the lenslet array is described by

$$U_i'(u,v) = U_i(u,v) \sum_m \sum_n P_2(u-mD_2,\, v-nD_2)\, \exp\!\left[-\frac{jk}{2f_2}\left[(u-mD_2)^2+(v-nD_2)^2\right]\right]. \quad (5)$$

Assuming the lenslet imaging system is also paraxial, propagating this field to the sensor located at a distance z3 from the lenslet array and using Eq. (4) gives the following field at the sensor,

$$U_f(t,w) = -\frac{e^{jkz_1} e^{jkz_2} e^{jkz_3}}{j\lambda z_3 M} \exp\!\left[\frac{jk}{2z_3}(t^2+w^2)\right] \sum_m \sum_n \exp\!\left[-\frac{jk}{2f_2}(m^2 D_2^2 + n^2 D_2^2)\right] \iint du\,dv\, P_2(u-mD_2,\, v-nD_2)\, \exp\!\left[\frac{jk}{2}\left(\frac{1}{z_2}+\frac{1}{z_3}-\frac{1}{f_2}\right)(u^2+v^2)\right] \exp\!\left\{-jk\!\left[u\!\left(\frac{t}{z_3}-\frac{mD_2}{f_2}\right)+v\!\left(\frac{w}{z_3}-\frac{nD_2}{f_2}\right)\right]\right\} \iint d\xi'\,d\eta'\, U_o\!\left(\frac{\xi'}{M},\frac{\eta'}{M}\right) \exp\!\left[\frac{jk}{2z_1 M^2}(\xi'^2+\eta'^2)\right] h_1'(u-\xi',\, v-\eta'). \quad (6)$$

The analysis up to this part was briefly discussed in our prior work [10]. Now we introduce incoherence into the system, such as from an object with fluorescent emission,

$$\langle U_o(\xi',\eta')\, U_o^*(\tilde{\xi}',\tilde{\eta}') \rangle = I_o(\xi',\eta')\, \delta(\xi'-\tilde{\xi}',\, \eta'-\tilde{\eta}'). \quad (7)$$

Using Eqs. (6) and (7), the intensity at the sensor plane is given by

$$I_f(t,w) = \iint d\xi'\,d\eta'\, I_o\!\left(\frac{\xi'}{M},\frac{\eta'}{M}\right) \left| \frac{e^{jkz_1} e^{jkz_2} e^{jkz_3}}{j\lambda z_3 M} \exp\!\left[\frac{jk}{2z_3}(t^2+w^2)\right] \sum_m \sum_n \exp\!\left[-\frac{jk}{2f_2}(m^2 D_2^2 + n^2 D_2^2)\right] \iint du\,dv\, P_2(u-mD_2,\, v-nD_2)\, \exp\!\left[\frac{jk}{2}\left(\frac{1}{z_2}+\frac{1}{z_3}-\frac{1}{f_2}\right)(u^2+v^2)\right] \exp\!\left\{-jk\!\left[u\!\left(\frac{t}{z_3}-\frac{mD_2}{f_2}\right)+v\!\left(\frac{w}{z_3}-\frac{nD_2}{f_2}\right)\right]\right\} h_1'(u-\xi',\, v-\eta') \right|^2. \quad (8)$$
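As a numerical illustration of Eqs. (1)-(3), the first-subsystem response h1' is the Fourier transform of the generalized pupil (the pupil times the focus-error phase). The paper's own code is MATLAB and not given; this Python sketch, with an assumed grid size and FFT-based transform, uses the example system values quoted in the text:

```python
import numpy as np

# Sketch (not the paper's code): compute h1' of Eq. (3) as the FFT of the
# generalized pupil P1(x, y) * exp[j*k/2*(1/z1 + 1/z2 - 1/f1)*(x^2 + y^2)].
# Grid size N and window L are illustrative assumptions.

lam = 450e-9              # imaging wavelength [m]
f1, D1 = 40e-3, 4e-3      # main-lens focal length and diameter [m]
z1, z2 = 65e-3, 104e-3    # object and image distances [m]; 1/z1 + 1/z2 = 1/f1
k = 2 * np.pi / lam

N, L = 512, 8e-3                       # samples and physical width of the grid
x = (np.arange(N) - N // 2) * (L / N)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= (D1 / 2)**2).astype(complex)

# Focus-error phase; it vanishes here because z1, z2, f1 satisfy the lens law.
W = 0.5 * k * (1 / z1 + 1 / z2 - 1 / f1) * (X**2 + Y**2)
generalized_pupil = pupil * np.exp(1j * W)

# h1' via FFT; the conventional PSF of the first subsystem is |h1'|^2.
h1p = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(generalized_pupil)))
psf = np.abs(h1p)**2
psf /= psf.sum()
```

With the in-focus distances above the quadratic phase is zero and the result is the usual Airy-like pattern, peaked on axis.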

We introduced this analysis briefly in [11,12]. In the following section we explore this image formation model in greater detail with analyses and simulations.

B. System Response

In this paper we refer to the field of the conventional response for a point source imaged through the main lens, at the conventional or primary image plane, as the impulse response (IPR), and the square of its magnitude as the point spread function (PSF). This is typically an Airy disk for a conventional diffraction-limited imaging system with a circular pupil, and is spatially invariant within an isoplanatic region. The response of the plenoptic system must include the impact of the lenslet array and propagation to the sensor, in addition to the system response up to the plane of the lenslet array. Thus it should include the effect of the traditional PSF and contain additional information. As we show in the following work, this response does not look like an Airy disk, even when the system has no aberrations. We also show that the response of the plenoptic system is not as spatially invariant as the conventional PSF. For the sake of clarity, and the convenience of being able to distinguish clearly between the conventional PSF, which characterizes only the first subsystem, and the system response of the complete plenoptic system, we call them by different names. Since the response of the complete plenoptic system resembles an image of the pupil aperture of the main lens, in this paper we refer to it as the pupil image function (PIF) [10-12]. The term I_o(ξ'/M, η'/M) in Eq. (8) refers to the intensity at a single point in the object plane.
All the terms within the squared magnitude are related to parameters of the optical system, such as pupils, impulse responses for the main lens and lenslet, and system distances, and can be included in the overall system response, PIF, written as

$$\mathrm{PIF}_{t,w}^{\xi',\eta'} \equiv \mathrm{PIF}(t,w;\xi',\eta') = \left| \frac{e^{jkz_1} e^{jkz_2} e^{jkz_3}}{j\lambda z_3 M} \exp\!\left[\frac{jk}{2z_3}(t^2+w^2)\right] \sum_m \sum_n \exp\!\left[-\frac{jk}{2f_2}(m^2 D_2^2 + n^2 D_2^2)\right] \iint du\,dv\, P_2(u-mD_2,\, v-nD_2)\, \exp\!\left[\frac{jk}{2}\left(\frac{1}{z_2}+\frac{1}{z_3}-\frac{1}{f_2}\right)(u^2+v^2)\right] \exp\!\left\{-jk\!\left[u\!\left(\frac{t}{z_3}-\frac{mD_2}{f_2}\right)+v\!\left(\frac{w}{z_3}-\frac{nD_2}{f_2}\right)\right]\right\} h_1'(u-\xi',\, v-\eta') \right|^2. \quad (9)$$

The PIF is a four-dimensional (4D) function of the sensor plane coordinates (t, w) and the scaled object coordinates (ξ', η'). Each point (ξ', η') in the object plane has a corresponding response PIF_{t,w}^{ξ',η'} that falls across all the sensor pixels. We simulated the system response using these equations. We assumed an imaging wavelength of 450 nm and a thin, aberration-free main lens with focal length 40 mm and diameter 4 mm. In most of the paper we assumed a single thin lenslet with focal length 4 mm and diameter 0.16 mm. Toward the end of the paper we discuss a lenslet array with the same individual specifications for the focal length and diameter of each lenslet. We assumed z1 and z2 of 65 and 104 mm, respectively. This created a system where a flat object was imaged by the main lens and was perfectly focused at the primary image plane, which was also the plane of the lenslet array. The sensor was placed at a distance of 4.2 mm from the lenslet plane. We assumed a sensor with a pixel size of 1.3 μm and a limited dynamic range. We simulated the field of the conventional impulse response, IPR, using Eq. (4) with an impulse as the object. For pictorial succinctness we display its magnitude square, the PSF, in Fig. 2(a). This field is displayed cropped by the extent of a single circular on-axis lenslet. In Figs. 2(c) and 2(e) we do the same for point sources that were off-axis by 50% and 85% of the on-axis lenslet radius. Figures 2(b), 2(d), and 2(f) show the PIF responses of the entire plenoptic system for the same point sources, simulated using Eq. (9).
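The second-subsystem part of Eq. (9) — cropping of the primary-image field by one lenslet, the lenslet's thin-lens phase, and propagation by z3 — can be sketched numerically. This is an illustrative Python stand-in (the uniform input field, grid size, and the transfer-function Fresnel propagator are our assumptions, not the paper's implementation):

```python
import numpy as np

# Sketch of one lenslet of the second subsystem: crop the lenslet-plane field,
# apply the lenslet phase exp[-j*k/(2*f2)*(x^2 + y^2)], and Fresnel-propagate
# a distance z3 to the sensor. Parameters are the example values in the text.

lam = 450e-9
k = 2 * np.pi / lam
f2, D2, z3 = 4e-3, 0.16e-3, 4.2e-3

N, L = 256, 0.4e-3                   # grid over one lenslet (assumed)
dx = L / N
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x)

def fresnel_tf(U, z):
    """Fresnel propagation over distance z (transfer-function method)."""
    fx = np.fft.fftfreq(N, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(1j * k * z) * np.exp(-1j * np.pi * lam * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(U) * H)

U_lenslet_plane = np.ones((N, N), dtype=complex)   # stand-in for U_i of Eq. (4)
lenslet = (X**2 + Y**2 <= (D2 / 2)**2)
U_after = U_lenslet_plane * lenslet * np.exp(-0.5j * k / f2 * (X**2 + Y**2))
I_sensor = np.abs(fresnel_tf(U_after, z3))**2      # one "pupil image"
```

The unit-modulus transfer function conserves energy, so all the power passed by the lenslet aperture reaches the simulated sensor plane.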
Figures 2(a) and 2(b) show the case of an on-axis point source imaged through a single on-axis lenslet in a plenoptic system. The field associated with the conventional Airy disk reached the lenslet, was cropped by its extent, and propagated further to the sensor plane. Figures 2(c) and 2(d) show the case of a slightly off-axis point source. Here, some of the rings of the Airy disk were cropped by the extent of the lenslet. Figures 2(e) and 2(f) show that, as the object went more off-axis, the effect of the cropping was more severe. This caused diffraction effects and spatial variation in the effective PIF response for each object point, as seen in Figs. 2(b), 2(d), and 2(f). The plenoptic system response, PIF, in Fig. 2 showed differences in spatial extent and diffraction patterns when compared to the PSF. Also, the PIF response includes the effect of the main lens (and therefore the PSF) and the rest of the system, while the converse is not true. Therefore, we consider the PIF response when dealing with the plenoptic system. The same equations can be used to obtain PIF responses for a defocused 2D object.

C. Image Simulation

When the integral in Eq. (8) is performed over all the (ξ', η') coordinates, we effectively sum over all the points in an object and the result is a function of only the sensor coordinates (t, w). This implies that information from a single object point is spread out over multiple sensor pixels, and, conversely, a single sensor pixel contains information from multiple object points. We simulated this linear system by arranging

the object and sensor data in column vectors Iv_o and Iv_f, and the PIF_{t,w}^{ξ',η'} for a given object point (ξ', η') in a column vector, and concatenating the column vector responses for every point in the object plane in a matrix PIFv as follows:

$$Iv_f = \mathrm{PIFv}\; Iv_o, \quad (10)$$

$$\begin{bmatrix} I_f^{1,1} \\ I_f^{1,2} \\ \vdots \\ I_f^{2,1} \\ I_f^{2,2} \\ \vdots \\ I_f^{T,W} \end{bmatrix} = \begin{bmatrix} \mathrm{PIF}_{1,1}^{1,1} & \mathrm{PIF}_{1,1}^{1,2} & \cdots & \mathrm{PIF}_{1,1}^{M,N} \\ \mathrm{PIF}_{1,2}^{1,1} & \mathrm{PIF}_{1,2}^{1,2} & \cdots & \mathrm{PIF}_{1,2}^{M,N} \\ \vdots & \vdots & & \vdots \\ \mathrm{PIF}_{1,W}^{1,1} & \mathrm{PIF}_{1,W}^{1,2} & \cdots & \mathrm{PIF}_{1,W}^{M,N} \\ \mathrm{PIF}_{2,1}^{1,1} & \mathrm{PIF}_{2,1}^{1,2} & \cdots & \mathrm{PIF}_{2,1}^{M,N} \\ \vdots & \vdots & & \vdots \\ \mathrm{PIF}_{T,W}^{1,1} & \mathrm{PIF}_{T,W}^{1,2} & \cdots & \mathrm{PIF}_{T,W}^{M,N} \end{bmatrix} \begin{bmatrix} I_o^{1,1} \\ I_o^{1,2} \\ \vdots \\ I_o^{2,1} \\ \vdots \\ I_o^{M,N} \end{bmatrix}. \quad (11)$$

Note that PIFv is just a matrix-style rearrangement of the 4D quantity PIF_{t,w}^{ξ',η'}. Thus, the set of data captured by the sensor in a plenoptic system is a product of the response matrix, PIFv, and the object intensity vector, Iv_o. This formulation was used to simulate image formation in a plenoptic imaging system with a single lenslet. We used a 2D object containing a sharp slant edge, dark toward the left and bright toward the right, shown in Fig. 3(a), as the object in this simulation. Figure 3(b) shows the data at the sensor when the object was perfectly focused at the lenslet array plane. The sensor data appeared disk-like, similar to an image of the circular main lens aperture. It contained no direct appearance of the edge in the object. When the object was moved closer to the main lens, as shown in the schematic layout in Fig. 4 (not drawn to scale), the system formed a roughly defocused, upside-down image of the object. Our simulation showed this effect for different amounts of defocus in Figs. 3(c) and 3(d) with the formation of a subimage at the sensor plane, where the dark portion

Fig. 2. (a), (c), and (e) show the PSF responses on a stretched scale for impulses that are laterally shifted 0%, 50%, and 85% of the lenslet radius away from the optical axis, respectively, cropped by the extent of the on-axis lenslet. (b), (d), and (f) show the overall PIF responses of the system for the same impulses.

Fig. 3.
Simulation of incoherent plenoptic image for a single lenslet. (a) shows the pristine object, (b) is the sensor data when the object plane is focused at the plane of the lenslet array, and (c)-(f) are the sensor data when the object is defocused by 2, 5, 2, and 5 mm, respectively.

Fig. 4. Schematic layout of a plenoptic imaging system when the object is moved closer to the main lens.

is now to the right and the bright portion is to the left. Figure 3(c) still shows energy from the pupil imaging subsystem and the circular shape of the pupil. As the amount of defocus increased in Fig. 3(d), the energy from the pupil imaging subsystem was lower than that from the combined object imaging subsystem. Also, the object shown in Fig. 3(a) was surrounded by zeros. Therefore, Fig. 3(d) shows a roughly defocused version of this object and low energy around it. Figure 5 shows a similar illustration of the system when the object was moved away from the main lens. Here the primary image was formed before the lenslet array, and the lenslet array relayed an upright defocused image of the object at the sensor. Figures 3(e) and 3(f) show this was indeed the case; the dark portion of the object is seen to the left in the data and the bright part is to the right. Here again, the amount of defocus in Fig. 3(e), being less than that in Fig. 3(f), shows more effect of pupil imaging, with the object imparting only a shading of the data. Figure 3(f) shows more energy from the defocused image of the object. This appearance of flipped subimages of a defocused object was experimentally demonstrated by [1], corroborating the theory of image formation we have proposed.

3. Recovery of High Resolution Image

The image formation theory and simulations developed in the previous section indicate the possibility of estimating the object intensity by solving a linear inverse problem, namely, calculating an estimate Îv_o of the object by solving the following optimization problem,

$$\hat{Iv}_o = \arg\min_{Iv_o} \left\| \mathrm{PIFv}\; Iv_o - Iv_f \right\|, \quad (12)$$

subject to possible constraints on Iv_o. Solutions to this problem may be implemented in closed form or via iterative algorithms, depending on the choice of the norm (or combination of norms) and the constraints on Iv_o.
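The forward model of Eqs. (10)-(11) and the closed-form route to Eq. (12) can be sketched with a toy response matrix. The random responses and sizes below are illustrative stand-ins for the simulated PIF of Eq. (9), not values from the paper:

```python
import numpy as np

# Toy sketch of Eqs. (10)-(12): each object point contributes one column of
# PIFv (its vectorized sensor response); the sensor data is Iv_f = PIFv @ Iv_o.
# In the noise-free case the Moore-Penrose pseudoinverse minimizes Eq. (12).

rng = np.random.default_rng(0)
n_obj, n_sens = 16, 64               # e.g. 4x4 object points, 8x8 sensor pixels

PIFv = rng.random((n_sens, n_obj))   # one column per object point (stand-in)
Iv_o = rng.random(n_obj)             # vectorized object intensities
Iv_f = PIFv @ Iv_o                   # vectorized sensor data, Eq. (10)

# Linearity: the sensor data is the intensity-weighted sum of point responses.
superposition = sum(Iv_o[p] * PIFv[:, p] for p in range(n_obj))

# Closed-form minimizer of Eq. (12) via the pseudoinverse (noise-free).
Iv_o_hat = np.linalg.pinv(PIFv) @ Iv_f
```

With a full-column-rank PIFv and no noise, the pseudoinverse recovers the object vector essentially exactly; the interesting cases, discussed below, are degeneracies and noise.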
The system matrix, PIFv, does not act as a convolution on the object intensity when the object is focused at the plane of the lenslet array. Therefore, direct deconvolution techniques, such as Wiener filtering, are not useful for our problem. The PIFv matrix may be obtained by good system calibration, or by simulating or estimating an approximation to this response. In this paper we show results for a well-known, error-free PIFv matrix. More work could be done to incorporate approximations or errors in the knowledge of the PIFv matrix.

A. Noise-Free Case

For the results shown in this section we assumed a noise-free PIFv matrix and noise-free sensor data, and used the pseudoinverse of PIFv to compute a closed-form solution to Eq. (12). Figure 3(a) shows the object used in this simulation, a sharp slant edge, dark to the left and bright to the right of the edge. Figures 6(a) and 6(b) show a conventional image of the edge taken with just the main lens, and the same after deconvolution using a Wiener filter in MATLAB [13] with a small noise-to-signal power ratio. The sharpness of the edge was enhanced after deconvolution, but only to a finite extent, due to the low-pass nature of an imaging system. There was some ringing owing to oversharpening. The plenoptic system sensor data for the in-focus case in Fig. 3(b) showed the least similarity to the object's spatial content and promised to be the most difficult case to reconstruct. Comparatively, the sensor data in the defocused cases contained an increasing amount of direct object spatial content, like the shading and object imaging seen in Figs. 3(c) to 3(f). Therefore, here we mainly discuss the reconstruction using the more difficult, in-focus plenoptic data, shown in Fig. 3(b). To the best of our knowledge, this case has not been discussed in prior work. Figure 6(c) shows one such reconstruction, where we used a pseudoinverse to invert the matrix PIFv.
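For contrast with the plenoptic inversion, the Wiener deconvolution applied to the conventional image can be sketched in one dimension. This frequency-domain form is a standard equivalent of MATLAB's deconvwnr; the 5-sample blur kernel and the noise-to-signal ratio nsr are illustrative assumptions:

```python
import numpy as np

# Sketch of Wiener deconvolution of a conventional (main-lens-only) image.
# H is the transfer function of the assumed blur; nsr regularizes the inverse
# filter near the zeros of H, which limits sharpening and causes mild ringing.

rng = np.random.default_rng(4)
n = 256
x = np.zeros(n)
x[n // 2:] = 1.0                     # sharp edge object
h = np.zeros(n)
h[:5] = 1 / 5                        # stand-in 5-sample blur kernel
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))   # blurred image

H = np.fft.fft(h)
nsr = 1e-3                           # assumed noise-to-signal power ratio
W = np.conj(H) / (np.abs(H)**2 + nsr)                      # Wiener filter
x_hat = np.real(np.fft.ifft(np.fft.fft(y) * W))
```

The recovered edge is sharper than the blurred one but not perfect: frequencies where |H| is small stay suppressed, which is the "finite extent" of sharpening noted above.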
Since PIFv is assumed to be known and there is no noise in this simulation, the pseudoinverse worked reasonably well, giving us a clearly visible edge. But it suffered from a dominant twin-image artifact. In a plenoptic system with radially symmetric lenses, a given focused object point has a diametrically opposite object point, which gives a flipped version of almost the same optical response. Therefore, a column in the resultant PIFv matrix, which is the vectorized response for one object point, is an upside-down version of another column in the matrix,

Fig. 5. Schematic layout of a plenoptic imaging system when the object is moved farther from the main lens.

corresponding to an object point that is located diametrically opposite in the optical system at the plane of the lenslet. The similarity of plenoptic systems to holography [14] and the resemblance of the twin-image problem to a similar issue in holography and phase retrieval [15] suggested that it could be handled by using a noncentrosymmetric response constraint for object points across a given lenslet. Since our main lens here was assumed to be aberration free, we incorporated local noncentrosymmetry in the shape, amplitude, or phase of each lenslet in our lenslet array. This imparted enough variation across the columns of the matrix PIFv to avoid the twin-image reconstruction error. This reconstruction, shown in Fig. 6(e), was obtained using a lenslet with a notch at one side, shown in Fig. 6(f), making the response different for object points across the lenslet compared to the traditional circular lenslet pupil shown in Fig. 6(d). Other forms of lenslet shape, amplitude, and phase variations, shown in Fig. 7, were also able to reduce the solution space and eliminate the twin-image artifact. One could also leave the lenslets unaltered and place a filter or mask having amplitude or phase variations in a conjugate object plane to obtain such variation in the response. When the response across different parts of the object was different, the reconstruction became easier. Therefore, altering the main lens response could also help the reconstruction. The lenslet array is already in a plane conjugate to the object; therefore, adding such variations to the lenslet array was convenient for us and reduced image processing complexity. One could also avoid using such noncentrosymmetric constraints and instead use advanced image processing algorithms and prior assumptions, such as object statistics, to help with the inversion. However, in this paper, we did not make any such assumptions about the object, and used algorithms available in MATLAB libraries [13].
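The twin-image degeneracy and its cure can be illustrated with a deliberately simple toy (our construction, not the paper's simulation): if the response profile of a point is even-symmetric, the flipped response of the diametrically opposite point coincides with it, so the two columns of PIFv become indistinguishable; any noncentrosymmetric "notch" separates them.

```python
import numpy as np

# Toy twin-image illustration: column for +p is a response profile, column
# for -p is its flipped copy. An even profile makes the columns identical
# (singular matrix); an asymmetric profile restores a usable conditioning.

t = np.linspace(-1, 1, 101)
even_resp = np.exp(-t**2 / 0.05)          # even-symmetric response profile

def pifv_two_points(resp):
    # two-column PIFv: response of +p, and the flipped response for -p
    return np.stack([resp, resp[::-1]], axis=1)

A_sym = pifv_two_points(even_resp)        # identical columns -> singular
notched = even_resp * (1 + 0.3 * t)       # noncentrosymmetric "notch"
A_asym = pifv_two_points(notched)

cond_sym = np.linalg.cond(A_sym)          # effectively infinite
cond_asym = np.linalg.cond(A_asym)        # modest, invertible
```

A pseudoinverse of the symmetric matrix can only return the average of the two points (the twin image), while the notched matrix distinguishes them, mirroring the role of the notched lenslet in Fig. 6(f).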
The reconstruction obtained in Fig. 6(e) shows a clearly defined edge, similar to that in the original object. The sharpness of the reconstructed edge is no more than that seen in the deconvolved conventional image in Fig. 6(b). There are some ringing artifacts present in both cases. Issues like ringing are well known and easy to manage in a Wiener filter deconvolution, but the investigation and management of these issues in a plenoptic reconstruction provides interesting opportunities for future work. Thus, the proposed image reconstruction for the case of an in-focus 2D object, imaged through a plenoptic system with a single lenslet, a known system response, and a noise-free setting, provided a reconstruction with spatial resolution information about the object similar to that obtained in a deconvolved image from a conventional system.

B. Noise at the Sensor

There are various kinds of noise in this system that should be considered for a thorough examination of system inversion, such as sensor noise; errors, approximations, and uncertainties in the PIFv response matrix; and the three-dimensional (3D) nature of the object. All these sources of noise are worthy of extensive exploration and may involve further algorithmic modifications. In this paper, we discuss only the effect of sensor noise modeled as white Gaussian noise. Assuming N is a vector of independently distributed Gaussian random variables of zero mean and standard deviation σ, the linear forward model for image formation can be stated as

$$Iv_f = \mathrm{PIFv}\; Iv_o + N. \quad (13)$$

In this section, we show simulation results for a plenoptic system with sensor noise of SNR 40 dB. We simulated an array of lenslets. We showed results with an on-axis lenslet in the previous section. Here we show results for reconstruction using

Fig. 6. Simulation for a single lenslet. (a) Conventional image. (b) Deconvolved conventional image.
(c) Reconstruction with a twin-image error, obtained using a circle-shaped lenslet shown in (d), and (e) is a reconstruction without the twin-image error, obtained using a lenslet with a noncentrosymmetric shape, shown in (f).

data from a single (3, 3) off-axis lenslet. We show the complete array reconstruction thereafter. Figure 8(a) is the portion of the object containing blurred, alternating bright and dark, elongated patches of different thicknesses and slants. Figures 8(b) and 8(c) show a conventional, noise-free image of this object simulated with just the main lens PSF, and the same after deconvolution with a Wiener filter in MATLAB [13] with a small noise-to-signal power ratio. We obtained a reconstruction using the plenoptic system sensor data for a single lenslet, with the object plane focused at the lenslet. Figure 8(d) shows the reconstruction for the noise-free case, obtained using a pseudoinverse approach. In Subsection 3.A we used a lenslet with a noncentrosymmetric shape to eliminate a possible twin-image artifact. In the results in this subsection, we used a lenslet with an amplitude gradient across its face, as shown in Fig. 7(b), to incorporate the noncentrosymmetric constraint in the system. The reconstruction in the noise-free case was similar in content to the deconvolved conventional image in Fig. 8(c). When Gaussian noise was added to the sensor data, the pseudoinverse based solution did not work. Figure 8(e) shows a reconstruction obtained by solving the linear inverse problem in Eq. (13) with a constrained linear least squares approach, using the MATLAB library function [13] for the same, imposing upper and lower bounds on the dynamic range during optimization. Noise clearly affected the reconstruction, but the result showed recovery of some object spatial content. The quality of the reconstruction dropped beyond the extent of the lenslet, as seen by the shadow of the lenslet, which appears in the corners of the reconstruction. The amplitude gradient also affected the quality of the reconstruction.
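The noisy forward model of Eq. (13) and a box-constrained least-squares solve can be sketched in Python (SciPy's lsq_linear plays the role of the constrained MATLAB solver; the matrix, sizes, and bounds are illustrative):

```python
import numpy as np
from scipy.optimize import lsq_linear

# Sketch of the noisy-case reconstruction: Eq. (13) with additive white
# Gaussian noise at roughly 40 dB SNR, solved as linear least squares with
# dynamic-range (box) bounds on the object intensities.

rng = np.random.default_rng(2)
n_obj, n_sens = 16, 64
PIFv = rng.random((n_sens, n_obj))            # stand-in response matrix
Iv_o = rng.random(n_obj)                      # true object in [0, 1]
clean = PIFv @ Iv_o

snr_db = 40.0
sigma = np.sqrt(np.mean(clean**2) / 10**(snr_db / 10))
Iv_f = clean + rng.normal(0.0, sigma, size=n_sens)      # Eq. (13)

res = lsq_linear(PIFv, Iv_f, bounds=(0.0, 1.0))         # box-constrained LS
Iv_o_hat = res.x
```

The bounds keep the estimate inside the physical dynamic range; how much object content survives depends on the conditioning of PIFv, which is exactly what the noncentrosymmetric lenslet designs improve.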
We also applied an iterative nonlinear least squares algorithm that allowed us to incorporate smoothing, and lower and upper bounds on the dynamic range during optimization, using the MATLAB library function [13] for the same. The result in Fig. 8(f) showed less ringing and better recovery of the object spatial content than the linear least squares approach in Fig. 8(e). The amount of spatial content recovered was not equal to that in the noise-free case, but was more than in the traditional plenoptic in-focus image, where each lenslet contributes only one pixel to the image (computed as an average of all the sensor data behind a lenslet). Figure 9(a) shows the complete pristine planar object used in our simulations, which would cover the entire field of view of a lenslet array. The white circles mark the portion of the object and

Fig. 7. Noncentrosymmetry in shape as well as amplitude improved image reconstruction. (a) and (b) show lenslets with noncentrosymmetric shape and amplitude, respectively. Alternatively, phase aberrations in lenslets were also useful. (c) and (d) illustrate the real and imaginary parts of a noncentrosymmetric phase in a lenslet.

Fig. 8. Simulation for a single lenslet. (a) Pristine object used in simulation. (b) Conventional image, no noise. (c) Deconvolved noiseless conventional image. (d) Reconstruction with pseudoinverse for noiseless data. (e) and (f) are reconstructions using linear least squares and nonlinear iterative algorithms for noisy data, SNR 40 dB.

corresponding reconstructions shown in the simulation results in Fig. 8. The object was simulated to be perfectly focused at the lenslet array. This gave us the plenoptic sensor data shown in Fig. 9(b). In this paper we assume negligible lenslet crosstalk for the in-focus object. Figure 9(c) shows the traditional plenoptic image that would be obtained for this object,

Fig. 9. Simulation for an array of lenslets. (a) Pristine object used in simulation. (b) Plenoptic sensor data. (c) Traditional single plenoptic image obtained by binning light behind each lenslet into 1 pixel. (d) Reconstruction with pseudoinverse for noiseless data. (e) and (f) are reconstructions using linear least squares and nonlinear iterative algorithms for noisy data, SNR 40 dB, for the case where the object is in focus at the lenslet array. (g) and (h) are the same when the object is defocused by 2 mm.

by binning all the sensor pixels behind one lenslet into one effective image pixel. The reconstruction obtained when there is no sensor noise, using the pseudoinverse approach and an amplitude gradient across the lenslet array, is shown in Fig. 9(d). This result contained many more pixels than lenslets, with sampling equivalent to a conventional image with a sampling factor Q = 3. The reconstruction showed visible recovery of spatial content. Figures 9(e) and 9(f) show the reconstructions obtained for the case of sensor noise with SNR 40 dB using the linear least squares and nonlinear iterative algorithms discussed before. As observed, there remain distinct artifacts in the reconstruction, but more of the object's spatial content was visible in this result when compared to the traditional plenoptic image in Fig. 9(c). Our simulation results suggest potential for future research on improved reconstruction algorithms, especially in the noisy case. We also did some simulations when the object was defocused by 2 mm, with sensor noise of SNR 40 dB.
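The traditional plenoptic image of Fig. 9(c), described above, amounts to block-averaging the sensor data behind each lenslet into one pixel. A minimal sketch (the 4 × 4 array with 8 × 8 pixels per lenslet is an illustrative assumption):

```python
import numpy as np

# Sketch of the "traditional" plenoptic image: all sensor pixels behind each
# lenslet are binned (averaged) into a single image pixel, so the image has
# only as many pixels as there are lenslets.

rng = np.random.default_rng(3)
n_lens, pix = 4, 8
sensor = rng.random((n_lens * pix, n_lens * pix))   # plenoptic sensor data

# Reshape into (lenslet row, pixel row, lenslet col, pixel col), then average
# over the per-lenslet pixel axes.
binned = sensor.reshape(n_lens, pix, n_lens, pix).mean(axis=(1, 3))
```

The result is the low-resolution baseline that the proposed reconstructions in Figs. 9(d)-9(f) are compared against.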
Figures 9(g) and 9(h) show the reconstructions obtained for this defocused case. They were obtained using the same two algorithms used in the in-focus case in Figs. 9(e) and 9(f). Similar results were obtained on the other side of focus. Our results showed an improvement over the in-focus reconstructions and supported the findings in [6] that it is possible to recover spatial resolution for the defocused case. The aforementioned assumption of a known or calibrated response matrix, PIFv, in the algorithm implemented in this paper also implies prior knowledge of defocus. We note that the defocused case simulations here are preliminary because we did not incorporate magnification changes associated with defocus and the subsequent interlenslet overflow. The magnification change affects the field of view of each lenslet. Overflow of the same object point into two or more lenslets allows reimaging of the same object point under multiple lenslets. This is best handled by taking this redundancy into account and performing the response inversion at a global level, rather than using the lenslet-by-lenslet processing we have implemented in this paper. This aspect of overflow has been discussed to some extent in [6], where the object spatial content is recovered using multiframe digital superresolution techniques. While their approach is different from ours, the outcome indicates that the defocused object case does lend itself to object reconstruction. We will discuss a global-level implementation using our approach and additional image processing in a future publication.

4. Conclusion

In this paper we have introduced a detailed analysis of paraxial image formation for incoherent plenoptic imaging systems using a 2D object. We have shown simulation results for image formation that are corroborated by experimental images observed in [1]. We used our forward model, simulated images for a sample optical system with a known system

10 response, and proposed a reconstruction method that allowed the recovery of some of the object information beyond the traditional resolution limited by the number of lenslets []. We have shown simulation results for the case where there is no noise at the sensor, and when there is 40 db Gaussian noise. The solution to the linear inverse problem for the plenoptic system is challenging and could result in a twin-image problem similar to holography. In order to overcome this problem, we used a noncentrosymmetric response constraint, incorporated in the design of lenslets having noncentrosymmetric shape, amplitude, or phase. The noiseless case allowed the use of a simple pseudoinverse to obtain a reconstruction having quality comparable with a deconvolved conventional image taken with just the main lens (no lenslet array), where the deconvolution was based on the main lens blur. The reconstruction for the noisy case needed iterative optimization methods with constraints on smoothness and dynamic range of the reconstruction. The resultant reconstruction showed better resolution than the traditional plenoptic system resolution, although not comparable to a deconvolved conventional image. We are optimistic about obtaining further improvement with better image processing for the noisy case and will present work on this in the future. In this paper we do not explicitly work on multimodal imaging. So we do not show results with a partitioned pupil as would be necessary for multimodal imaging, and which would accordingly reduce the effective resolution. However, we refer to some of our prior work [16] dealing with system responses for a partitioned pupil and propose to show results on that specific topic in future. More work could also be done in the areas of response estimation, calibration, and uncertainty, and incorporation of a 3D object model or generalization of the image formation theory for other wide-angle systems, such as cameras and other imagers. 
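The two reconstruction regimes summarized above can be sketched as a small linear inverse problem. The NumPy snippet below is our own illustration, not the implementation from the paper: a random matrix A stands in for the calibrated PIFv response, the dimensions are arbitrary, and Tikhonov regularization is used as a simple substitute for the constrained iterative optimization applied in the noisy case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibrated response matrix: maps 16 object pixels to
# 64 sensor pixels. In the paper this role is played by the measured
# PIFv matrix; a random full-column-rank matrix stands in for it here.
n_obj, n_sensor = 16, 64
A = rng.standard_normal((n_sensor, n_obj))

x_true = rng.uniform(0.0, 1.0, n_obj)  # unknown object patch
y_clean = A @ x_true                   # noiseless sensor measurements

# Noiseless case: a plain pseudoinverse recovers the object.
x_pinv = np.linalg.pinv(A) @ y_clean

# Noisy case: add Gaussian noise at 40 dB SNR, then solve a
# Tikhonov-regularized least-squares problem; the raw pseudoinverse
# would amplify the noise.
sigma = np.linalg.norm(y_clean) / np.sqrt(n_sensor) / 10 ** (40 / 20)
y_noisy = y_clean + rng.normal(0.0, sigma, n_sensor)

lam = 1e-2  # regularization weight, chosen by hand for this sketch
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n_obj), A.T @ y_noisy)

print(np.max(np.abs(x_pinv - x_true)))  # essentially zero
print(np.max(np.abs(x_reg - x_true)))   # small, noise-limited error
```

The regularization weight trades noise suppression against bias, much as the smoothness and dynamic-range constraints do in the iterative method; a global reconstruction would simply use a larger A coupling all lenslets.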
Experimental verification of the theoretical framework proposed in this paper is ongoing and will be discussed in the future. We have proposed a deconvolution-like method to recover spatial resolution in plenoptic imaging systems even when the object is focused on the lenslet array, which could allow a paraxial, incoherent plenoptic system to also be used as a conventional imaging system.

References
1. T. Adelson and J. Wang, Single lens stereo with a plenoptic camera, IEEE Trans. Pattern Anal. Mach. Intell. 14, (1992).
2. R. Ng, M. Levoy, M. Brédif, G. Duval, M. Horowitz, and P. Hanrahan, Light field photography with a hand-held plenoptic camera, Tech. Rep. (Stanford University, 2005).
3. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, Light field microscopy, ACM Trans. Graph. 25, (2006).
4. C. Perwass and L. Wietzke, Single-lens 3D camera with extended depth-of-field, Proc. SPIE 8291, (2012).
5. R. Horstmeyer, G. Euliss, R. Athale, and M. Levoy, Flexible multimodal camera using a light field architecture, in Proceedings of the IEEE International Conference on Computational Photography (IEEE, 2009).
6. T. E. Bishop, S. Zanetti, and P. Favaro, Light field superresolution, in Proceedings of the IEEE International Conference on Computational Photography (IEEE, 2009).
7. Z. Zhang and M. Levoy, Wigner distributions and how they relate to the light field, in Proceedings of the IEEE International Conference on Computational Photography (IEEE, 2009).
8. D. J. Brady and D. L. Marks, Coding for compressive focal tomography, Appl. Opt. 50, (2011).
9. J. W. Goodman, Introduction to Fourier Optics (McGraw-Hill, 1986).
10. S. A. Shroff and K. Berkner, Defocus analysis for a coherent plenoptic system, in Frontiers in Optics, OSA Technical Digest (Optical Society of America, 2011), paper FThR.
11. S. A. Shroff and K. Berkner, High resolution image reconstruction for plenoptic imaging systems using system response, in Computational Optical Sensing and Imaging, OSA Technical Digest (Optical Society of America, 2012), paper CM2B.2.
12. S. A. Shroff and K. Berkner, Wave analysis of a plenoptic system and its applications, Proc. SPIE 8667, 86671L (2013).
13. The MathWorks, Natick, MA, http//
14. J. Goodman, Assessing a new imaging modality, in Optical Sensors, OSA Technical Digest (Optical Society of America, 2012), paper JM1A.
15. J. R. Fienup and C. C. Wackerman, Phase-retrieval stagnation problems and solutions, J. Opt. Soc. Am. A 3, (1986).
16. K. Berkner and S. A. Shroff, Optimization of spectrally coded mask for multi-modal plenoptic camera, in Computational Optical Sensing and Imaging/Information Photonics (Optical Society of America, 2011), paper CMD4.


More information

Current Standard: Mathematical Concepts and Applications Shape, Space, and Measurement- Primary

Current Standard: Mathematical Concepts and Applications Shape, Space, and Measurement- Primary Shape, Space, and Measurement- Primary A student shall apply concepts of shape, space, and measurement to solve problems involving two- and three-dimensional shapes by demonstrating an understanding of:

More information

An Experimental Study of the Performance of Histogram Equalization for Image Enhancement

An Experimental Study of the Performance of Histogram Equalization for Image Enhancement International Journal of Computer Sciences and Engineering Open Access Research Paper Volume-4, Special Issue-2, April 216 E-ISSN: 2347-2693 An Experimental Study of the Performance of Histogram Equalization

More information

Online refocusing algorithm for a satellite camera using stellar sources

Online refocusing algorithm for a satellite camera using stellar sources Online refocusing algorithm for a satellite camera using stellar sources Jeong-Bin Jo, Jai-Hyuk Hwang, * and Jae-Sung Bae School of Aerospace and Mechanical Engineering, Korea Aerospace University, Gyeonggi-do,

More information

Holographic data storage at 2+ Tbit/in 2

Holographic data storage at 2+ Tbit/in 2 Holographic data storage at + Tbit/in Mark R. Ayres *, Ken Anderson, Fred Askham, Brad Sissom, Adam C. Urness Akonia Holographics, LLC, Miller Dr., Longmont, CO, USA, 85 ABSTRACT The onslaught of big data

More information

View of ΣIGMA TM (Ref. 1)

View of ΣIGMA TM (Ref. 1) Overview of the FESEM system 1. Electron optical column 2. Specimen chamber 3. EDS detector [Electron Dispersive Spectroscopy] 4. Monitors 5. BSD (Back scatter detector) 6. Personal Computer 7. ON/STANDBY/OFF

More information

Palmprint Recognition. By Sree Rama Murthy kora Praveen Verma Yashwant Kashyap

Palmprint Recognition. By Sree Rama Murthy kora Praveen Verma Yashwant Kashyap Palmprint Recognition By Sree Rama Murthy kora Praveen Verma Yashwant Kashyap Palm print Palm Patterns are utilized in many applications: 1. To correlate palm patterns with medical disorders, e.g. genetic

More information

Projection Center Calibration for a Co-located Projector Camera System

Projection Center Calibration for a Co-located Projector Camera System Projection Center Calibration for a Co-located Camera System Toshiyuki Amano Department of Computer and Communication Science Faculty of Systems Engineering, Wakayama University Sakaedani 930, Wakayama,

More information

NOVEL FOCUSING OPTICS FOR IR LASERS Paper 1504

NOVEL FOCUSING OPTICS FOR IR LASERS Paper 1504 NOVEL FOCUSING OPTICS FOR IR LASERS Paper 1504 Gary Herrit 1, Alan Hedges 1, Herman Reedy 1 1 II-VI Incorporated, 375 Saxonburg Blvd., Saxonburg, PA, 16056, USA Abstract Traditional focusing optics for

More information

Theremino System Theremino Spectrometer Technology

Theremino System Theremino Spectrometer Technology Theremino System Theremino Spectrometer Technology theremino System - Theremino Spectrometer Technology - August 15, 2014 - Page 1 Operation principles By placing a digital camera with a diffraction grating

More information

JPEG compression of monochrome 2D-barcode images using DCT coefficient distributions

JPEG compression of monochrome 2D-barcode images using DCT coefficient distributions Edith Cowan University Research Online ECU Publications Pre. JPEG compression of monochrome D-barcode images using DCT coefficient distributions Keng Teong Tan Hong Kong Baptist University Douglas Chai

More information

Signal to Noise Instrumental Excel Assignment

Signal to Noise Instrumental Excel Assignment Signal to Noise Instrumental Excel Assignment Instrumental methods, as all techniques involved in physical measurements, are limited by both the precision and accuracy. The precision and accuracy of a

More information

Extreme-AO for GMT. Center for Astronomical Adaptive Optics, University of Arizona Subaru Telescope, National Astronomical Observatory of Japan

Extreme-AO for GMT. Center for Astronomical Adaptive Optics, University of Arizona Subaru Telescope, National Astronomical Observatory of Japan Extreme-AO for GMT Olivier Guyon Center for Astronomical Adaptive Optics, University of Arizona Subaru Telescope, National Astronomical Observatory of Japan guyon@naoj.org 1 Extreme-AO for GMT: Key science

More information

Introduction to reflective aberration corrected holographic diffraction gratings

Introduction to reflective aberration corrected holographic diffraction gratings Introduction to reflective aberration corrected holographic diffraction gratings By Steve Slutter, Wu Jiang, and Olivier Nicolle The reflective diffraction grating is the heart of most spectroscopy systems

More information

15.062 Data Mining: Algorithms and Applications Matrix Math Review

15.062 Data Mining: Algorithms and Applications Matrix Math Review .6 Data Mining: Algorithms and Applications Matrix Math Review The purpose of this document is to give a brief review of selected linear algebra concepts that will be useful for the course and to develop

More information

Optical Design using Fresnel Lenses

Optical Design using Fresnel Lenses Optical Design using Fresnel Lenses Basic principles and some practical examples Arthur Davis and Frank Kühnlenz Reflexite Optical Solutions Business Abstract The fresnel lens can be used in a wide variety

More information

Numerical Methods For Image Restoration

Numerical Methods For Image Restoration Numerical Methods For Image Restoration CIRAM Alessandro Lanza University of Bologna, Italy Faculty of Engineering CIRAM Outline 1. Image Restoration as an inverse problem 2. Image degradation models:

More information

Interference. Physics 102 Workshop #3. General Instructions

Interference. Physics 102 Workshop #3. General Instructions Interference Physics 102 Workshop #3 Name: Lab Partner(s): Instructor: Time of Workshop: General Instructions Workshop exercises are to be carried out in groups of three. One report per group is due by

More information

Chapter 1 High-Resolution Optical and Confocal Microscopy

Chapter 1 High-Resolution Optical and Confocal Microscopy Chapter 1 High-Resolution Optical and Confocal Microscopy Olaf Hollricher and Wolfram Ibach Abstract In this chapter, the theory of optical image formation in an optical microscope is described, and the

More information