Wide-viewing-angle three-dimensional integral imaging using a curved screen and a curved lens array

Yunhee Kim, Jae-Hyeung Park, Sung-Wook Min, and Byoungho Lee*
School of Electrical Engineering, Seoul National University, Kwanak-Gu Shinlim-Dong, Seoul 151-744, Korea
*Correspondence: Email: byoungho@snu.ac.kr; WWW: http://oeqelab.snu.ac.kr; Tel: +82-2-880-7245; Fax: +82-2-873-9953

ABSTRACT

In this paper, we propose a wide-viewing-angle three-dimensional integral imaging system that uses a curved screen and a curved lens array. Elemental images are projected onto the curved screen. Using the curved screen and the curved lens array instead of a conventional flat display panel and flat lens array expands the viewing angle remarkably. In addition, adopting barriers between the curved lens array and the curved screen eliminates the flipped images effectively without affecting the viewing angle. The principle of the proposed system is explained, and experimental results on the viewing angle of real and virtual images are presented.

Keywords: Integral imaging, three-dimensional display, viewing angle, curved lens array

1. INTRODUCTION

Integral imaging (integral photography) is a three-dimensional display technique first proposed by Lippmann in 1908. Recently, integral imaging has been attracting much attention as an autostereoscopic three-dimensional display method because of its many advantages [1-5]. It provides continuous viewpoints within the viewing angle and does not require any special glasses. It also provides full parallax and can display animated 3D images in real time owing to the advancement of display devices. Integral imaging is similar to holography in that it can record 3D information on a two-dimensional (2D) medium. However, integral imaging does not need coherent light sources and can display the image in full natural color. Figure 1 shows the basic concept of integral imaging, which is composed of pickup and display steps.

Figure 1. Basic concept of integral imaging: pickup of elemental images of an object through a lens array, and display of the reconstructed image through the lens array.

In the pickup step, light rays from an object pass through each lens in the array and form elemental images, which are recorded on a film as a 2D array. In the display step, the developed photographic plate or film is illuminated by diffused light; the elemental images then retrace their original routes and form a 3D image. Recently, the use of electronic devices such as high-definition television cameras and liquid crystal displays instead of photographic film has enabled real-time 3D integral imaging and has overcome some conventional technical problems.

However, a primary drawback of integral imaging is its narrow viewing angle. The viewing angle, i.e., the angle within which observers can see the complete image reconstructed by integral imaging, is limited by the restriction of the area on the display panel where each elemental image can be displayed.

Generally, in an integral imaging system each elemental lens has its own corresponding area on the display panel. To prevent flipping, the part of an elemental image that exceeds its corresponding area is discarded, optically in the direct pickup method [5] or electrically in the computer-generated integral imaging (CGII) method [6]. Therefore the number of elemental images is limited, and an observer outside the viewing zone cannot see the integrated image.

To overcome the limitation of the viewing angle, several methods have been studied [7-12]. One method uses a Fresnel lens array with a small f-number to widen the viewing angle [7]; however, there is a limit to how far the f-number can be decreased, and lens aberration occurs. Another method uses lens switching to enhance the viewing angle by doubling the region of each elemental image [8]. This approach, however, needs a mechanical mask that moves fast enough to exploit the afterimage effect, and it therefore suffers from problems such as air resistance and noise. Another method uses orthogonal polarization switching to realize a dynamic mask without mechanical movement [9]. This method uses orthogonally polarized elemental images with a polarization shutter screen and an orthogonal polarization sheet, which together act like the mask. By using the polarization sheet, however, the brightness of the original image is inevitably reduced by half, and the integrated image becomes dim. A method that uses volume holographic recording of the elemental images [10] has also been proposed, but it cannot implement a dynamic color display. Yet another method uses multiple display devices [11]; however, the structure is bulky and has been tested only for the double-device case. Recently we proposed a method that uses a curved lens array, as shown in Fig. 2(a) [12]. However, since a flat display panel is adopted, a gap mismatch problem occurs and sets a limit on the viewing angle, as shown in Fig. 2(b).

In this paper, we propose a wide-viewing-angle integral imaging system in which not only the lens array but also the screen is curved. In the proposed system we use a flexible screen and project the calculated elemental images onto it. Owing to the curved structure, all elemental lenses have corresponding elemental images and the viewing angle is expanded. Here we adopt a projection system, use a curved screen instead of a curved display panel, and provide the experimental results.

2. PRINCIPLE OF THE PROPOSED METHODS

In the previously proposed curved-lens-array system [12], if we could use a flexible display panel or a lens array with a tunable focal length instead of a flat display panel, we could overcome the gap mismatch problem and make all elemental images integrate at the original location of the object. However, no large flexible display panel or tunable lens is commercially available yet. Thus we curved a transmission-type screen to a certain radius of curvature and used it instead of the conventional flat display panel.

Figure 2. (a) Integral imaging using a curved lens array and a flat display panel; (b) the gap mismatch problem: the gaps g' and g'' for lenses away from the central axis differ from the central gap g.
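As a rough illustration of the gap mismatch in Fig. 2(b), the sketch below (not taken from the paper) assumes that the flat panel lies a distance g behind the central lens, perpendicular to the central axis, while every lens axis is radial and passes through the curvature center; the gap measured along a lens axis then grows with the lens's angular position. The gap value g used here is an illustrative assumption.

```python
import math

# Rough illustration (an assumption, not the paper's analysis) of the gap
# mismatch of Fig. 2(b): a lens array curved with radius r placed in front of
# a FLAT display panel a distance g behind the central lens.  Each lens axis
# is radial, so the axis of a lens at angular position alpha meets the flat
# panel at a distance
#   gap(alpha) = (r + g) / cos(alpha) - r
# from that lens, which exceeds the design gap g for alpha != 0.

r = 100.0   # radius of curvature of the lens array [mm] (value from the paper)
g = 22.0    # central gap [mm]; the focal length is used as an illustrative value

for alpha_deg in (0, 10, 20, 30, 40):
    alpha = math.radians(alpha_deg)
    gap = (r + g) / math.cos(alpha) - r
    print(f"lens at {alpha_deg:2d} deg: gap = {gap:5.1f} mm (mismatch {gap - g:+5.1f} mm)")
```

Under these assumptions the mismatch grows quickly away from the central axis, which is consistent with the flat-panel system of Ref. 12 losing image quality at large observation angles.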

Figure 3. Configuration of the proposed method for displaying (a) a real image and (b) a virtual image, using a projector, a curved screen, a curved lens array, and barriers.

Figure 3 shows the configuration of the proposed system, which uses both a curved screen and a curved lens array. The lens array and the screen are curved horizontally with certain radii of curvature. They share the same center of curvature, which keeps the gap, i.e., the distance between each lens and its corresponding elemental image on the screen, constant. For displaying a real or a virtual image with a wide viewing angle, the lens array and the screen are curved around the image as shown in Figs. 3(a) and 3(b).

We suggest redefining the elemental image regions according to the curvature of the lens array. Straight lines are drawn joining the center of curvature and the edges of each elemental lens; these lines intersect the curved screen, and the region between the intersections is the elemental image region of that lens. Along these lines, barriers are set up vertically between the lens array and the screen, as shown in Fig. 3.

The elemental images are calculated using ray optics, in the reverse manner of the pickup step. That is, for each point of the object to be integrated, we follow an imaginary ray that originates from the object point, passes through the center of each lens, and arrives at the display screen. The meeting point on the display screen is the elemental image point corresponding to that object point. We perform this process for all lenses and all points of the object; in this calculation, the computer-generated integral imaging method is used. Next, considering the projection process, we modify the elemental images for projection. The modified elemental images are projected onto the rear of the screen, and the elemental images displayed on the screen are integrated through the curved lens array.
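To make the reverse ray tracing described above concrete, the following sketch computes, for one object point, the elemental image point behind every lens of the curved system. It is a minimal horizontal-plane model under stated assumptions: the curvature center is at the origin, the lens centers act as pinholes on a circle of radius r, the screen lies on the concentric circle of radius r + g, and the gap value is illustrative; the subsequent modification of the elemental images for projection is omitted.

```python
import numpy as np

# Minimal 2-D (horizontal-plane) sketch of the reverse ray tracing of Sec. 2:
# for one object point, follow the ray through each lens centre and record
# where it meets the curved screen.  Geometry is assumed for illustration:
# curvature centre C at the origin, lens vertices on a circle of radius r,
# screen on the concentric circle of radius r + g (constant gap).

r = 0.10        # radius of curvature of the lens array [m] (from the paper)
g = 0.022       # lens-to-screen gap [m]; focal length used as an illustrative value
pitch = 0.010   # lens pitch [m] (from the paper)
n_side = 9      # lenses on each side of the central lens (19 lenses in total)

theta_lens = 2.0 * np.arctan(pitch / (2.0 * r))      # angle subtended per lens at C
lens_angles = np.arange(-n_side, n_side + 1) * theta_lens

def elemental_points(obj_xy):
    """Return, for one object point, the screen hit point behind every lens."""
    px, py = obj_xy
    hits = []
    for a in lens_angles:
        # lens centre on the arc of radius r (x along the central axis, y lateral)
        lx, ly = r * np.cos(a), r * np.sin(a)
        # ray direction from the object point through the lens centre
        dx, dy = lx - px, ly - py
        # intersect the ray (px,py) + t(dx,dy) with the screen circle |q| = r + g,
        # taking the forward root (t > 1, i.e. beyond the lens)
        A = dx * dx + dy * dy
        B = 2.0 * (px * dx + py * dy)
        C = px * px + py * py - (r + g) ** 2
        t = (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)
        sx, sy = px + t * dx, py + t * dy
        # keep the point only if it falls inside this lens's elemental image
        # region, i.e. between the two radial barrier lines of the lens
        if abs(np.arctan2(sy, sx) - a) <= theta_lens / 2.0:
            hits.append((a, sx, sy))
    return hits

# example: an object point 10 cm in front of the central lens, i.e. near C
print(len(elemental_points((0.0, 0.0))), "lenses receive an elemental image point")
```

For an object point near the curvature center, every lens receives an elemental image point inside its own region, which is the geometric reason the curved system does not run out of elemental images.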

Owing to the curved structure of the proposed system, the gap is constant and all elemental lenses have corresponding elemental images in the horizontal direction, provided the object is located in the vicinity of the curvature center of the lens array. These are the main reasons why the viewing angle is enhanced.

Figure 4 shows examples of the computer-generated elemental images for the conventional flat-lens-array system and for the proposed method, which uses both a curved lens array and a curved screen. In the conventional method, which uses a flat display panel and a flat lens array, the elemental lenses located near the central axis of the object have whole images of the object in the center areas of their corresponding elemental display regions, as shown in Fig. 4(b). As an elemental lens gets farther from the central axis, its elemental image becomes more shifted with respect to the center of the corresponding elemental image region. When a portion of an elemental image crosses into the neighboring region, it is removed to avoid flipping, and only the part that remains in its own region is displayed. As a result, beyond the viewing zone there is no image of the object, and this limitation of the number of elemental images limits the viewing angle.

In the curved-lens-array method with a flat display panel [12], however, all elemental lenses have elemental images centered in their corresponding regions along the horizontal direction, without any horizontal shift, as shown in Fig. 4(c). Thus there are as many elemental images as there are elemental lenses in the horizontal direction. However, because the lens array is curved while the display panel is flat, the gap mismatch limits the viewing angle as the observation angle increases.

In the method implemented in this paper, all elemental lenses have corresponding elemental images on the curved screen along the horizontal direction, as shown in Fig. 4(d), similar to the elemental images of the curved-lens-array method with a flat display panel. However, in the curved-screen system the area of every elemental image region on the screen is the same and the gap is constant. Thus the proposed system overcomes the gap mismatch problem effectively, and even at larger observation angles the elemental images are integrated at the exact location.

Figure 4. (a) Location of the image (10 cm in front of the lens array) and computer-generated elemental images for (b) the conventional flat lens array, (c) the curved lens array with a flat display panel, and (d) the proposed curved lens array with a curved screen.
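As a point-object illustration of why the flat system of Fig. 4(b) runs out of elemental images (a simplification for clarity, not the paper's full CGII calculation), the sketch below shifts the elemental image point of lens i by x_i·g/L, where x_i is the lens position, g the gap, and L the object distance, and discards it once it leaves that lens's region; the gap value is an illustrative assumption.

```python
# Point-object illustration of the flat-lens-array case of Fig. 4(b):
# behind a flat array the elemental image point of lens i (pinhole model)
# lands a distance x_i * g / L away from that lens's region centre and is
# discarded once it crosses into the neighbouring region (to avoid flipping).

pitch = 10.0   # lens pitch [mm] (from the paper)
g     = 22.0   # lens-to-panel gap [mm]; focal length used as an illustrative value
L     = 100.0  # object distance in front of the array [mm] (from the paper)

for i in range(10):                      # lens index counted from the central axis
    shift = i * pitch * g / L            # lateral shift on the panel [mm]
    kept = shift <= pitch / 2.0          # still inside its own elemental region?
    print(f"lens {i:2d}: shift = {shift:5.1f} mm -> "
          f"{'kept' if kept else 'clipped (would cause flipping)'}")
```

Under these assumptions only the few lenses near the central axis keep an elemental image point, whereas in the curved system of the previous sketch every lens does.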

From the standpoint that the curved-screen system overcomes the gap mismatch, the proposed system would have no limitation of viewing angle if a lens array with a larger number of lenses were used. However, as the viewing angle increases, some problems arise from the use of the curved screen. First, the brightness of the elemental images is reduced as the viewing angle increases: since the screen is curved, it becomes inclined with respect to the light rays, and the intensity decreases as an elemental image gets farther from the central axis. The resolution also decreases as the viewing angle increases. In integral imaging, the lateral and depth resolutions are affected by the resolution of the display panel that displays the elemental images and also by light diffraction [13]. In the proposed method a projector and a screen are used to display the elemental images, so the lateral and depth resolutions depend on the resolution of the projector and on the distance from the projector to the curved screen. The resolution further depends on the viewing angle because the screen is curved: as the viewing angle increases, the screen becomes more inclined with respect to the light rays from the projector, as shown in Fig. 5. In addition, some elemental images are not focused well on the curved screen because the depth of focus of the projector is limited, so the integrated image goes out of focus as the viewing angle increases. However, if multiple projectors were used, as in Ref. 15, the brightness and focus problems might be solved. Another problem is that the elemental images are distorted in the projection process; to solve it, the distortion must be corrected finely so that the elemental images land at their exact corresponding locations.

To observe the integrated image, the observers should be within the tolerance angle, which is determined by the distortion of the elemental images, the decrease in resolution and brightness, the projector's limited depth of focus, and so on. If the maximum number of elemental lenses n_max is decided by considering those problems, the viewing angle Ω (see Fig. 5) is derived as

Ω = 2n_max · 2 arctan(φ / 2r),

where φ is the lens pitch and r is the radius of curvature of the lens array.

In integral imaging the lens law determines the depth of the scene, and the reconstructed image has the best quality at the central depth plane, whose location is given by the lens law. If the image is far from the central depth plane, the quality of the 3D image is degraded; thus the thickness of the image is limited. In the proposed method the central depth plane is curved because the screen and the lens array are curved. Images are displayed around the central depth plane; we suggest calling this region the depth region. In particular, when the central depth is equal to the radius of curvature of the lens array, the depth region becomes the inner region of a circle concentric with the lens array curvature, as shown in Fig. 5. In this experiment the central depth is set equal to the radius of curvature, and the 3D images are displayed around the center, within the depth region. Thus in the proposed method the limited depth region also limits the image size, as shown in Figs. 5(a) and 5(b).

Figure 5. The depth region and the limitation of image size (by the depth region and by the barriers) for (a) a real image and (b) a virtual image.
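A quick numerical evaluation of the viewing-angle expression above, using the lens pitch and radius of curvature of the experiment in Sec. 3; n_max is not quoted in the paper, so the values tried below are illustrative assumptions.

```python
import math

# Numerical check of the viewing-angle expression of Sec. 2,
#   Omega = 2 * n_max * 2 * arctan(pitch / (2 * r)),
# with the lens pitch and radius of curvature used in the experiment.
# n_max itself is not quoted in the paper; the values below are assumed.

pitch = 10.0    # lens pitch [mm]
r = 100.0       # radius of curvature of the lens array [mm]

theta = 2.0 * math.degrees(math.atan(pitch / (2.0 * r)))   # angle per lens, ~5.7 deg

for n_max in (5, 7, 9):                  # assumed number of usable lenses per side
    omega = 2 * n_max * theta
    print(f"n_max = {n_max}: Omega ~ {omega:5.1f} deg ({omega / 2:4.1f} deg one side)")
```

With n_max around 5, the expression gives roughly 29° to one side, the same order as the approximately 30° one-side angle reported experimentally in Sec. 3.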

In the proposed system the barriers eliminate the flipped images effectively. Owing to the inherent curved structure of the system, the barriers do not affect the viewing angle, in contrast to the barriers of the conventional system. However, the barriers also limit the image size, since they prevent observers from seeing neighboring elemental images through each lens, as shown in Fig. 5. This is inevitable, because observation through the wrong lens results in flipping. The image size determined by the barriers is proportional to the radius of curvature of the lens array.

3. EXPERIMENTAL RESULTS

In the experiments, two Fresnel lens arrays are used as the lens system. One consists of 19 by 13 square elemental Fresnel lenses with a lens width of 10 mm and a focal length of 22 mm, and it is used for displaying a real image. The other, used for displaying a virtual image, consists of 11 by 13 square elemental Fresnel lenses with the same 10 mm width and 22 mm focal length. The radius of curvature of the curved lens array is 10 cm. We used transmission-type screens; the radius of curvature of the curved screen is 13 cm for displaying the real image and 8 cm for displaying the virtual image. An Epson EMP-7700 projector was used to project the elemental images onto the curved screen, and the distance from the projector to the screen was 110 cm. The barriers are thin rectangular plates with a height of 14 cm and a width of 26 mm for the real-image setup and 18 mm for the virtual-image setup. They are set up vertically between the lens array and the screen.

We displayed an image of an apple, located 10 cm in front of the lens array as shown in Fig. 4(a). Figure 4(d) shows the computer-generated elemental images calculated for projection. As an elemental image for projection gets farther from the central axis, the size of its elemental image region becomes smaller, because the screen is curved and the light rays from the projector are assumed to be parallel in the calculation. In a real situation the light rays from the projector are not parallel but spread out from a light source inside the projector. Thus the elemental images of Fig. 4(d) would not be imaged correctly at the corresponding locations on the screen and would be distorted. To correct the distortion, we measured the divergence angle of the light from the projector experimentally and adjusted the elemental images for projection according to the divergence.

To investigate the effect of the widened viewing angle, we compare the integrated images of the conventional method with those of the proposed method. Figure 6(a) shows the integrated images observed from different viewing directions in the conventional system. In the conventional scheme, which uses a flat lens array and a flat display panel, the theoretical viewing angle for the same lens specification is 9.4° along one side, and the effective angle is 7° (one side) experimentally [13]. As the observation angle increases, the integrated image begins to disappear and flipped images begin to appear, as shown in Fig. 6(a). In the proposed system, however, which uses both a curved screen and a curved lens array, we can see the image continuously over a much wider viewing angle. Figure 6(b) shows the integrated real images and Fig. 6(c) the virtual images in the proposed system. The viewing angle is expanded considerably since all elemental lenses have corresponding elemental images in the horizontal direction and the gap is constant. As the observation angle increases, however, the images are not integrated as well. The quality of the integrated image degrades because the elemental images on the screen suffer from the problems mentioned above: distortion, reduced resolution and brightness, blur due to the projector's limited depth of focus, and so on. The integrated images in Figs. 6(b) and 6(c) show this in detail: their quality becomes worse as the observation angle increases. If the observation angle increases beyond about 40°, the image becomes blurred and dim, and the correct apple image can no longer be seen. The vertical grid artifacts are due to the thickness and positioning errors of the barriers, which can be removed by making the barriers thinner and locating them more precisely. The small apple images beside the integrated apple in Fig. 6(c) are elemental images on the screen. In the virtual-image case, if more lenses are added in the horizontal direction, these elemental images are no longer visible and the apple can be seen continuously over an even larger angle. The effective maximum viewing angle of the proposed system is about 30° (one side) experimentally, as shown in Figs. 6(b) and 6(c).

Figure 6. Integrated images observed from different viewing angles: (a) real images by the conventional method (left 10° to right 10°; flipped images appear at the larger angles), (b) real images by the proposed method (left 30° to right 30°), and (c) virtual images by the proposed method (left 35° to right 35°).

As shown in the results, the viewing angle of the proposed system is increased remarkably: it is about twice as wide as that of the curved-lens-array system with a flat display panel and about four times as wide as that of the conventional system. If a flexible display panel could be used, all the problems that cause errors in the elemental images would be overcome easily and the viewing angle would be expanded further. The vertical viewing angle can be enhanced by curving the screen and the lens array vertically as well; if a spherically curved lens array and screen were used, the viewing angle would be expanded in both directions.

4. CONCLUSION

In conclusion, a method to widen the viewing angle in integral imaging has been proposed and demonstrated experimentally. By the use of a curved screen and a curved lens array together with the modified computer-generated elemental images and the barriers, the viewing angles for both real and virtual images were expanded remarkably. In the proposed system every elemental lens has a corresponding elemental image centered in its elemental image region while the gap is kept constant, which is a unique characteristic of the system. If a flexible display panel becomes available, we expect to realize a 3D display system that is largely free from the viewing-angle limitation.

ACKNOWLEDGMENT

This work was supported by the Next-Generation Information Display R&D Center, one of the 21st Century Frontier R&D Programs funded by the Ministry of Science and Technology of Korea.

REFERENCES

1. G. Lippmann, "La photographie integrale," Comptes-Rendus Acad. Sci. 146, 446-451 (1908).
2. N. Davies, M. McCormick, and M. Brewin, "Design and analysis of an image transfer system using microlens arrays," Opt. Eng. 33, 3624-3633 (1994).
3. S. Manolache, A. Aggoun, M. McCormick, N. Davies, and S.-Y. Kung, "Analytical model of a three-dimensional integral image recording system that uses circular- and hexagonal-based spherical surface microlenses," J. Opt. Soc. Am. A 18, 1814-1821 (2001).
4. M. C. Forman, N. Davies, and M. McCormick, "Continuous parallax in discrete pixelated integral three-dimensional displays," J. Opt. Soc. Am. A 20, 411-420 (2003).
5. J. Arai, F. Okano, H. Isono, and I. Yuyama, "Gradient index lens array method based on real time integral photography for three-dimensional images," Appl. Opt. 37, 2034-2045 (1998).
6. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, "Three-dimensional display system based on computer-generated integral photography," in Stereoscopic Displays and Virtual Reality Systems VIII, A. J. Woods, J. O. Merritt, and S. A. Benton, eds., Proc. SPIE 4297, 187-195 (2001).
7. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, "Study for wide-viewing integral photography using an aspheric Fresnel-lens array," Opt. Eng. 41, 2572-2576 (2002).
8. B. Lee, S. Jung, and J.-H. Park, "Viewing-angle-enhanced integral imaging using lens switching," Opt. Lett. 27, 818-820 (2002).
9. S. Jung, J.-H. Park, H. Choi, and B. Lee, "Wide-viewing integral three-dimensional imaging by use of orthogonal polarization switching," Appl. Opt. 42, 2513-2520 (2003).
10. S.-H. Shin and B. Javidi, "Viewing-angle enhancement of speckle-reduced volume holographic three-dimensional display by use of integral imaging," Appl. Opt. 41, 5562-5567 (2002).
11. S.-W. Min, B. Javidi, and B. Lee, "Enhanced three-dimensional integral imaging system by use of double display devices," Appl. Opt. 42, 4186-4195 (2003).
12. Y. Kim, J.-H. Park, H. Choi, S. Jung, S.-W. Min, and B. Lee, "Viewing-angle-enhanced integral imaging system using a curved lens array," Opt. Express 12, 421-429 (2004), http://www.opticsexpress.org/abstract.cfm?uri=opex-12-3-421.
13. J.-H. Park, S.-W. Min, S. Jung, and B. Lee, "Analysis of viewing parameters for two display methods based on integral photography," Appl. Opt. 40, 5217-5232 (2001).