Partially-overlapped viewing zone based integral imaging system with super wide viewing angle

Zhao-Long Xiong, Qiong-Hua Wang,* Shu-Li Li, Huan Deng, and Chao-Chao Ji

School of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China
*[email protected]

Abstract: In this paper, we analyze the relationship between the viewer and the viewing zones of an integral imaging (II) system and present a partially-overlapped viewing zone (POVZ) based II system with a super wide viewing angle. In the proposed system, the viewing angle is wider than that of the conventional tracking based II system. In addition, the POVZ eliminates the flipping and time delay of the 3D scene. The proposed II system achieves a super wide viewing angle of 120° without the flipping effect, about twice as wide as that of the conventional system.

©2014 Optical Society of America

OCIS codes: (100.0100) Image processing; (230.2090) Electro-optical devices; (100.3010) Image reconstruction techniques.

References and links
1. G. Lippmann, “La photographie integrale,” C. R. Acad. Sci. 146, 446–451 (1908).
2. J. Hong, Y. Kim, H. J. Choi, J. Hahn, J. H. Park, H. Kim, S. W. Min, N. Chen, and B. Lee, “Three-dimensional display technologies of recent interest: principles, status, and issues,” Appl. Opt. 50(34), H87–H115 (2011).
3. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013).
4. N. A. Dodgson, “Analysis of the viewing zone of the Cambridge autostereoscopic display,” Appl. Opt. 35(10), 1705–1710 (1996).
5. F. Okano, H. Hoshino, J. Arai, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on integral photography,” Appl. Opt. 36(7), 1598–1603 (1997).
6. G. Park, J. Hong, Y. Kim, and B. Lee, “Enhancement of viewing angle and viewing distance in integral imaging by head tracking,” in Digital Holography and Three-Dimensional Imaging, OSA Technical Digest (Optical Society of America, 2009), paper DWB27.
7. G. Park, J. H. Jung, K. Hong, Y. Kim, Y. H. Kim, S. W. Min, and B. Lee, “Multi-viewer tracking integral imaging system and its viewing zone analysis,” Opt. Express 17(20), 17895–17908 (2009).
8. D. C. Hwang, J. S. Park, S. C. Kim, D. H. Shin, and E. S. Kim, “Magnification of 3D reconstructed images in integral imaging using an intermediate-view reconstruction technique,” Appl. Opt. 45(19), 4631–4637 (2006).
9. C. C. Ji, H. Deng, and Q. H. Wang, “Pixel extraction based integral imaging with controllable viewing direction,” J. Opt. 14(9), 095401 (2012).
10. K. C. Kwon, C. Park, M. U. Erdenebat, J. S. Jeong, J. H. Choi, N. Kim, J. H. Park, Y. T. Lim, and K. H. Yoo, “High speed image space parallel processing for computer-generated integral imaging system,” Opt. Express 20(2), 732–740 (2012).
11. S. H. Jiao, X. G. Wang, M. C. Zhou, W. M. Li, T. Hong, D. Nam, J. H. Lee, E. H. Wu, H. T. Wang, and J. Y. Kim, “Multiple ray cluster rendering for interactive integral imaging system,” Opt. Express 21(8), 10070–10086 (2013).
12. Y. Kim, J. H. Park, H. Choi, S. Jung, S. W. Min, and B. Lee, “Viewing-angle-enhanced integral imaging system using a curved lens array,” Opt. Express 12(3), 421–429 (2004).
13. G. Baasantseren, J. H. Park, K. C. Kwon, and N. Kim, “Viewing angle enhanced integral imaging display using two elemental image masks,” Opt. Express 17(16), 14405–14417 (2009).
14. B. Lee, S. Jung, and J. H. Park, “Viewing-angle-enhanced integral imaging by lens switching,” Opt. Lett. 27(10), 818–820 (2002).
15. S. Jung, J. H. Park, H. Choi, and B. Lee, “Wide-viewing integral three-dimensional imaging by use of orthogonal polarization switching,” Appl. Opt. 42(14), 2513–2520 (2003).
16. H. Choi, J. H. Park, J. Kim, S. W. Cho, and B. Lee, “Wide-viewing-angle 3D/2D convertible display system using two display devices and a lens array,” Opt. Express 13(21), 8424–8432 (2005).
17. R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martínez-Corral, “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system,” Opt. Express 15(24), 16255–16260 (2007).
18. Y. Takaki, K. Tanaka, and J. Nakamura, “Super multi-view display with a lower resolution flat-panel display,” Opt. Express 19(5), 4129–4139 (2011).
19. R. Taherkhani and K. Mohammad, “Designing a high accuracy 3D auto stereoscopic eye tracking display, using a common LCD monitor,” 3D Res. 3(3), 1–7 (2012).


20. C. C. Smyth, “Apparatus for tracking the human eye with a retinal scanning display, and method thereof,” U.S. Patent 6,120,461 (19 Sep. 2000).
21. J. Nakamura, T. Takahashi, and Y. Takaki, “Enlargement of viewing freedom of reduced-view SMV display,” in IS&T/SPIE Electronic Imaging (International Society for Optics and Photonics, 2012).
22. J. C. Yang, C. S. Wu, C. H. Hsiao, R. Y. Tsai, and Y. P. Hung, “Evaluation of an eye tracking technology for 3D display applications,” in 3DTV Conference (2008), p. 345.
23. K. S. Park, S. W. Min, and Y. Cho, “Viewpoint vector rendering for efficient elemental image generation,” IEICE Trans. Inf. Syst. E90-D(1), 233–241 (2007).
24. K. C. Kwon, C. Park, M. U. Erdenebat, J. S. Jeong, J. H. Choi, N. Kim, J. H. Park, Y. T. Lim, and K. H. Yoo, “High speed image space parallel processing for computer-generated integral imaging system,” Opt. Express 20(2), 732–740 (2012).
25. Kinect, http://www.kinectfordevelopers.com/. Kinect is a registered trademark of Microsoft Corporation in the United States and/or other countries.

1. Introduction

Integral imaging (II) can reconstruct true 3D images without glasses and provides both horizontal and vertical parallaxes with continuous views [1–4]. However, several problems, such as the limited 3D image resolution, depth range, and viewing angle, still delay the practical application of II. In the past decades, many researchers have focused on solving these problems [5–8], and many technologies have been proposed, including computer graphics technology, head and eye tracking technology, and so on.

In this paper, we focus on the viewing angle issue of an II system and use computer-generated integral imaging (CGII) to capture the elemental images of the 3D scene [9–11]. The viewing angle is defined as the scope within which the viewer can observe the 3D images without obvious imperfection (such as flipping). In the conventional II system, viewers can watch 3D images only in a very narrow region, and flipping images are observed within a slightly larger angle. Many researchers have focused on modifying the optical structure of the II system [12–18]. The viewing angle is indeed enhanced by using a curved lens array [12], two elemental image masks [13], lens switching [14], and so on, but some of these structures are not practical because the specific devices are difficult to fabricate.

With the development of tracking technology, some researchers use head or eye tracking to enhance the viewing angle, and remarkable performances have been obtained [19–22]. The viewer’s position is obtained by head or eye tracking, and the tracking results are used to generate elemental images at the proper positions. A great contribution has been made by Gilbae Park et al. to enhance the viewing angle of the II system based on head tracking [6, 7]. In their methods, the viewer is always located at the central position of the viewing zone, so the viewing angle is limited by the maximum tracking angle of the tracking device [7]. Moreover, flipping and time delay of the reconstructed 3D scene may occur when the viewer moves fast.

In this paper, we propose a partially-overlapped viewing zone (POVZ) based II system with a super wide viewing angle, which consists of a conventional II system and a tracking device. In the POVZ II system, the viewing zones are rearranged within 120° to eliminate the flipping 3D images in the crosstalk zones of the conventional II system. Besides, the flipping and time delay issues of the conventional tracking based II system are also eliminated. In the POVZ II system, a tracking device obtains the viewer’s 3D position, and the system generates a corresponding adaptive elemental image array (AEIA) which reconstructs the 3D scene based on the tracking information in real time. We then introduce the generation method for the AEIA of the POVZ based on the viewer’s position. In the experiment, we build a POVZ II system with a super wide viewing angle of 120°, within which no obvious imperfection is observed.

2. Principle of the proposed POVZ II system

As shown in Fig. 1, the architecture of the proposed system is composed of four parts: the input and tracking part includes the parameter and 3D scene data input and the tracking of the viewer’s position; the calculation part obtains the POVZ and virtual camera array


information; the pixel mapping process generates the AEIA based on the parallax images; and the display process displays the AEIA for the viewer.

Fig. 1. Architecture of the proposed POVZ based II system.

In this paper, Section 2.1 explains the difference in principle between the proposed and the conventional systems and determines the region of the POVZ. Section 2.2 analyzes the relationship between the viewing zones and the AEIAs and calculates the shift of the AEIA. Section 2.3 proposes the generation method for the AEIAs.

2.1 Comparison of viewing zones in the conventional tracking based II system and the POVZ II system

In the conventional tracking based II system, as shown in Fig. 2(a), the EIA is updated according to the tracking result in real time to make sure that the viewer is always located at the central position of the viewing zone [6]. So the viewing angle θvc of the conventional tracking based II system is limited by the largest tracking angle θtr of the tracking device and the viewing angle θ0 of the conventional II system:

\[
\theta_{vc} \le \theta_{tr} + \theta_{0}. \tag{1}
\]

Most of the conventional tracking based II systems use the tracked viewer’s position to change the EIA in real time [7]. When the viewer moves out of the tracking region, if the system does not record the viewer’s last position information, it cannot display the EIA correctly. In this case, the viewing angle θvc is no more than the largest tracking angle θtr [6]. Moreover, because of the limited response time and accuracy of the tracking device, time delay degrades the viewing experience when the viewer moves fast.


Fig. 2. Comparison of viewing zones between (a) the conventional tracking based II system and (b) the proposed POVZ II system.

In our tracking based II system, by using the POVZ, the viewing angle θv can be wider than the viewing angle θvc of the conventional tracking based II system. As shown in Fig. 2(b), when the viewer moves out of the tracking region, the proposed system optimizes the viewing zone according to the last tracked information, and almost the whole viewing zone is arranged outside the tracking region, so the viewer can watch the 3D images over a wider angle without tracking.

As shown in Fig. 2(b), the viewing space is divided into several viewing zones in the horizontal and vertical directions, coded as Vi,j. V0,0 is the viewing zone of the conventional II system and is regarded as the original viewing zone in our II system. The boundary of Vi,j can be derived from V0,0 with a certain shift. The adjacent zones Vi,j and Vi-1,j have a partially-overlapped part denoted as Pi,j_i-1,j, as shown in Fig. 2(b). Pi,j_i-1,j reconstructs the same 3D scene in the overlapped zone of Vi,j and Vi-1,j. The proportion of Pi,j_i-1,j in Vi,j is denoted as the overlapped coefficient of the viewing zone Vi,j, which determines the reachable size of the POVZ. The overlapped coefficient is a variable that decreases gradually from the center of the viewing space to the edge. The overlapped coefficient of V0,0 is the largest and is denoted as the initial overlapped coefficient t.

In the proposed POVZ system, each viewing zone has a corresponding AEIA, and the AEIA of Vi,j is denoted as Ai,j. The adjacent viewing zones are separated by the angular bisector of the angle range of Pi,j_i-1,j (dotted red lines in Fig. 2(b)), which serves as a trigger line to send a signal to update the AEIA when the viewer moves from one viewing zone to the adjacent one. The tracking device detects the viewer’s position in the viewing space in real time. When the viewer moves into Vi,j and arrives at the trigger line in Pi,j_i-1,j, the AEIA is changed from Ai-1,j to Ai,j. When the viewer moves out of the maximum tracking angle, the AEIA remains unchanged. Because almost the whole region of Vi,j, not just half of it, is arranged outside the tracking region, the viewing angle θv of the proposed system is wider than θvc of the conventional tracking based II system. The viewing angle θv can be calculated by

\[
\theta_{v} \approx \theta_{tr} + 2\theta_{0} - \varepsilon_{\min}, \tag{2}
\]

where εmin is the angle range of the most marginal overlapped viewing zone Pi,j_i-1,j. In our system, εmin is a small value, and θ0 is the viewing angle of V0,0 in the POVZ II system. Owing to the finite width of the overlapped viewing zone, even if the viewer moves rapidly from one viewing zone to the adjacent one, the display system has time to change the AEIA, and the viewer observes a continuous 3D scene without any flipping or time delay. The trigger-line behavior is illustrated by the sketch below.
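As a concrete illustration of this trigger logic, the following sketch (a minimal one-dimensional model of ours, not the authors’ implementation) selects the AEIA index from the tracked horizontal angle: each trigger line is the bisector of an overlapped part, and outside the maximum tracking angle the last AEIA is simply kept. The trigger values are those later reported in the experiment (Table 2); the function and variable names are our assumptions.

```python
import bisect

def select_aeia(theta_deg, triggers_deg, last_index, theta_tr_deg=57.0):
    """Pick the AEIA index i for the current tracked horizontal angle.

    triggers_deg: trigger lines (bisectors of the overlapped parts),
    sorted ascending; n triggers separate n + 1 viewing zones whose
    indices run from -(n//2) to +(n//2). Outside the maximum tracking
    angle the last AEIA is kept, so the marginal zones remain viewable
    without tracking. The 1D geometry is our illustrative assumption.
    """
    if abs(theta_deg) > theta_tr_deg / 2.0:
        return last_index                       # tracking lost: keep last AEIA
    k = bisect.bisect_right(triggers_deg, theta_deg)
    return k - len(triggers_deg) // 2           # map slot to zone index i

# Trigger angles of Table 2 (horizontal direction):
triggers = [-27.5, -18.2, -7.6, 7.6, 18.2, 27.5]
i = 0
for theta in [0.0, 10.0, 20.0, 28.0, 40.0]:     # viewer sweeping to the right
    i = select_aeia(theta, triggers, i)
    print(f"theta = {theta:5.1f} deg  ->  A({i}, 0)")
```

Because adjacent AEIAs reconstruct the same 3D scene inside the overlapped part, a switch triggered at the bisector is invisible to the viewer even if the update lags slightly.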


2.2 Relationship between the viewing zones and the AEIAs in the POVZ II system

In the proposed system, Vi,j can be decided by the corresponding Ai,j, and Ai,j has a corresponding content update and pixel shift Δni,j compared with A0,0. In the POVZ II system, the AEIA is changed only when the viewer arrives at the trigger lines.

As shown in Fig. 3, we assume that the origin point O is at the center of the lens array, and the tracking device obtains the viewer’s 3D position P(x, y, z) in real time. The tracked viewer’s angle θ in the viewing space can be deduced as

\[
\theta = \left( \arctan\frac{x}{z},\ \arctan\frac{y}{z} \right). \tag{3}
\]

Assume that each elemental image has u × v pixels, and its size is rh × rv. According to the viewer’s position, the viewing zone Vi,j in which the viewer is located can be obtained from the following equations:

\[
i =
\begin{cases}
\operatorname{round}\!\left( \dfrac{g}{(1-t_h)\,r_h} \cdot \dfrac{x}{z} \right), & x < x_{\max} = z \tan\!\left( \dfrac{\theta_{tr}}{2} \right) \\[1.5ex]
\operatorname{round}\!\left( \dfrac{g}{(1-t_h)\,r_h} \cdot \tan\!\left( \dfrac{\theta_{tr}}{2} \right) \right), & x \ge x_{\max}
\end{cases} \tag{4}
\]

\[
j =
\begin{cases}
\operatorname{round}\!\left( \dfrac{g}{(1-t_v)\,r_v} \cdot \dfrac{y}{z} \right), & y < y_{\max} = z \tan\!\left( \dfrac{\theta_{tr}}{2} \right) \\[1.5ex]
\operatorname{round}\!\left( \dfrac{g}{(1-t_v)\,r_v} \cdot \tan\!\left( \dfrac{\theta_{tr}}{2} \right) \right), & y \ge y_{\max}
\end{cases} \tag{5}
\]

where g is the gap between the AEIA and the lens array, and th and tv are the initial overlapped coefficients of A0,0 and A-1,0 in the horizontal and vertical directions, respectively. th and tv can be expressed as the proportion of overlapped pixels in the elemental images of A0,0 and A-1,0, as shown in Fig. 3; both lie within the range (0, 1). The maximum tracking region at the viewing distance z is (-xmax, xmax) and (-ymax, ymax). A sketch of this zone-index computation follows.
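To make Eqs. (3)–(5) concrete, a minimal sketch follows. The parameter values are taken from the experiment in Section 3 (Table 1); the symmetric handling of negative x and y and all names are our assumptions, and the units of g, rh, and rv are used exactly as the paper states them.

```python
import math

def tracked_angle(x, y, z):
    """Tracked viewer angle of Eq. (3), in degrees."""
    return math.degrees(math.atan2(x, z)), math.degrees(math.atan2(y, z))

def viewing_zone_index(x, y, z, g, r_h, r_v, t_h, t_v, theta_tr):
    """Viewing-zone indices (i, j) of Eqs. (4)-(5).

    x, y, z  : tracked viewer position (same unit, so x/z is dimensionless)
    g        : gap between the AEIA and the lens (pinhole) array
    r_h, r_v : elemental image size, horizontal / vertical (same unit as g)
    t_h, t_v : initial overlapped coefficients, both in (0, 1)
    theta_tr : maximum tracking angle, in radians
    """
    half_tan = math.tan(theta_tr / 2.0)

    def index(p, t, r):
        # Clamp |p|/z at tan(theta_tr/2): beyond x_max (y_max) the marginal
        # zone index is used, as in the second rows of Eqs. (4)-(5).
        ratio = min(abs(p) / z, half_tan)
        return int(math.copysign(round(g / ((1.0 - t) * r) * ratio), p))

    return index(x, t_h, r_h), index(y, t_v, r_v)

# Experimental values of Section 3: g = 4.210 mm, elemental image
# 3.030 mm x 3.030 mm, t = 9/13, theta_tr = 57 deg; position in metres.
print(tracked_angle(1.0, 0.1, 3.1))
print(viewing_zone_index(1.0, 0.1, 3.1, g=4.210, r_h=3.030, r_v=3.030,
                         t_h=9/13, t_v=9/13, theta_tr=math.radians(57)))
```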

Fig. 3. Relationship between the viewing zones and the AEIAs in the POVZ II system.

In our system, Ai,j reconstructs the 3D images within the viewing zone Vi,j. To ensure that almost the whole viewing zone lies outside the tracking region when the viewer reaches the tracking boundary, each viewing zone Vi,j has an exact shift. The movement of Vi,j is determined by the pixel shift Δni,j of Ai,j. The pixel shift Δni,j includes a conventional shift (Δni,j)c and an additional shift (Δni,j)a. The former is the same as the pixel shift in the conventional tracking based II system, which keeps the viewer at the center of the viewing zone. The latter places the viewer at an off-center position in the corresponding viewing zone. As shown in Fig. 3, the viewer is located at P(x, y, z) and the corresponding Ai,j has a pixel shift Δni,j. The conventional shift (Δni,j)c moves the viewing zone to the viewer’s position, and the additional shift (Δni,j)a gives the viewing zone an additional movement which contributes to the wider viewing angle. Combining the conventional and additional shifts, Δni,j in the horizontal and vertical directions is denoted by

\[
\Delta n_{i,j} = (\Delta n_{i,j})_c + (\Delta n_{i,j})_a = \left( (\Delta\mu_{i,j})_c + (\Delta\mu_{i,j})_a,\ (\Delta\lambda_{i,j})_c + (\Delta\lambda_{i,j})_a \right), \tag{6}
\]

where (Δμi,j)c, (Δμi,j)a, (Δλi,j)c and (Δλi,j)a are the conventional and additional shifts in the horizontal and vertical directions, respectively, and can be deduced as:

\[
(\Delta\mu_{i,j})_c = -i \cdot (1 - t_h), \tag{7}
\]

\[
(\Delta\mu_{i,j})_a = -\operatorname{round}\!\left( \frac{u \cdot r_h \cdot (1 - t_h)}{2g \tan\!\left( \frac{\theta_{tr}}{2} \right)} \cdot i \right), \tag{8}
\]

\[
(\Delta\lambda_{i,j})_c = -j \cdot (1 - t_v), \tag{9}
\]

\[
(\Delta\lambda_{i,j})_a = -\operatorname{round}\!\left( \frac{v \cdot r_v \cdot (1 - t_v)}{2g \tan\!\left( \frac{\theta_{tr}}{2} \right)} \cdot j \right), \tag{10}
\]

where (i, j) is decided by the tracked viewer’s position and the parameters of the II system. With the pixel shift Δni,j, the region of Vi,j can be determined. Equations (4)–(10) thus give the relationship between the viewing zones and the corresponding AEIAs; a numerical sketch of this shift computation follows.
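The shift computation of Eqs. (6)–(10), together with the camera-array shift of Eq. (11) introduced in Section 2.3, can be sketched as follows. We transcribe Eqs. (7) and (9) as printed, so the conventional part is expressed in the paper’s unit convention (a fraction of an elemental image per zone index); the function names are ours.

```python
import math

def pixel_shift(i, j, u, v, r_h, r_v, t_h, t_v, g, theta_tr):
    """Pixel shift of A_{i,j} relative to A_{0,0}, Eqs. (6)-(10).

    Returns (d_mu, d_lambda), the horizontal and vertical components of
    Delta n_{i,j}, each the sum of a conventional part (subscript c) and
    an additional part (subscript a).
    """
    denom = 2.0 * g * math.tan(theta_tr / 2.0)

    d_mu_c = -i * (1.0 - t_h)                               # Eq. (7)
    d_mu_a = -round(u * r_h * (1.0 - t_h) / denom * i)      # Eq. (8)
    d_la_c = -j * (1.0 - t_v)                               # Eq. (9)
    d_la_a = -round(v * r_v * (1.0 - t_v) / denom * j)      # Eq. (10)

    return d_mu_c + d_mu_a, d_la_c + d_la_a                 # Eq. (6)

def camera_array_shift(d_mu, d_lambda, d):
    """Virtual-camera-array shift Delta D_{i,j} of Eq. (11): the pixel
    shift scaled by the camera spacing d."""
    return d_mu * d, d_lambda * d

# Illustrative call with the experimental parameters (u = v = 13,
# t = 9/13, g = 4.210 mm, elemental image 3.030 mm, theta_tr = 57 deg):
d_mu, d_la = pixel_shift(i=3, j=0, u=13, v=13, r_h=3.030, r_v=3.030,
                         t_h=9/13, t_v=9/13, g=4.210,
                         theta_tr=math.radians(57))
print(d_mu, d_la)
print(camera_array_shift(d_mu, d_la, d=5.9))   # camera spacing d assumed in mm
```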


2.3 Generation method for the AEIA of the proposed POVZ II system

In the proposed POVZ II system, we improve the viewpoint vector rendering (VVR) method [23, 24] to obtain the AEIAs efficiently. As shown in Fig. 4, after arranging the 3D scene in advance, we set a virtual camera array to pick up the 3D information, and each camera has an orthographic geometry. The number of virtual cameras is equal to the number of pixels in each elemental image of the AEIA. We suppose the number of micro-lenses in the lens array is M × N, and the size of each elemental image is u × v pixels. As shown in Fig. 4, in the horizontal direction, u virtual cameras are needed to obtain the u orthographic projection images. The orthographic projection images are then interleaved to generate A0,0 based on the VVR method.

Fig. 4. Generation process for AEIAs of the proposed POVZ II system.

We can obtain Ai,j for Vi,j as shown in Fig. 4. The virtual camera array for Ai,j has a specific shift ΔDi,j compared with that for A0,0, but both of them have the same convergence point Pcon. The shift ΔDi,j of the virtual camera array also includes horizontal and vertical components in order to pick up the wider-angle parallax images. The shift ΔDi,j can be determined by the pixel shift Δni,j of Ai,j:

\[
\Delta D_{i,j} = \left( \left( (\Delta\mu_{i,j})_c + (\Delta\mu_{i,j})_a \right) \cdot d,\ \left( (\Delta\lambda_{i,j})_c + (\Delta\lambda_{i,j})_a \right) \cdot d \right), \tag{11}
\]

where d is the distance between adjacent cameras in the virtual camera array, in both the horizontal and vertical directions. We obtain the u × v orthographic projection images for Ai,j. In the m′-th column and n′-th row orthographic projection image, the pixel in the m-th column and n-th row is denoted as I(m, n)m′,n′. The pixel I(m, n)m′,n′ is mapped to the p-th column and q-th row pixel of Ai,j, denoted as I′i,j(p, q), as shown in Fig. 4. Thus, we obtain Eq. (12):

\[
I'_{i,j}(p, q) = I(m, n)_{m', n'}. \tag{12}
\]

The relationship among p, q, m, n, m′ and n′ in Eq. (12) is given by

\[
p = (m + 1) \times u - m' - 1 + (\Delta\mu_{i,j})_c + (\Delta\mu_{i,j})_a, \tag{13}
\]

\[
q = (n + 1) \times v - n' - 1 + (\Delta\lambda_{i,j})_c + (\Delta\lambda_{i,j})_a. \tag{14}
\]

In this way, by looping m′ from 0 to u-1, n′ from 0 to v-1, m from 0 to M-1, and n from 0 to N-1, all the pixels in the orthographic projection images are mapped to Ai,j. Moreover, the proposed pixel mapping is processed in parallel on a GPU, using the compute unified device architecture (CUDA) as the development environment, to achieve real-time display. This section has described the acquisition and the pixel mapping in the generation process of the AEIAs; a serial sketch of the mapping loop is given below.
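The mapping loop of Eqs. (12)–(14) can be sketched serially as follows; the paper runs it in parallel with CUDA, so this NumPy version is illustrative only, and the array layout is our assumption.

```python
import numpy as np

def generate_aeia(proj, d_mu, d_lambda):
    """Interleave orthographic projection images into an AEIA, Eqs. (12)-(14).

    proj: array of shape (v, u, N, M) holding the u x v projection images;
    proj[n_p, m_p, n, m] is pixel (m, n) of the projection image captured
    by virtual camera (m_p, n_p) (our layout, corresponding to the paper's
    m', n'). d_mu, d_lambda are the integer pixel shifts of Eqs. (6)-(10),
    zero for A_{0,0}.
    """
    v, u, N, M = proj.shape
    aeia = np.zeros((N * v, M * u), dtype=proj.dtype)
    for n_p in range(v):
        for m_p in range(u):
            for n in range(N):
                for m in range(M):
                    p = (m + 1) * u - m_p - 1 + d_mu        # Eq. (13)
                    q = (n + 1) * v - n_p - 1 + d_lambda    # Eq. (14)
                    if 0 <= p < M * u and 0 <= q < N * v:   # shifted pixels may leave the panel
                        aeia[q, p] = proj[n_p, m_p, n, m]   # Eq. (12)
    return aeia

# Tiny example: 4 x 4 lens array, 3 x 3 pixels per elemental image.
proj = np.random.rand(3, 3, 4, 4)
print(generate_aeia(proj, d_mu=0, d_lambda=0).shape)        # (12, 12)
```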

3. Experiments and results

In our experiment, we use a Kinect® as the tracking device to obtain the 3D position of the viewer’s head [25]. The experimental setup is shown in Fig. 5.


Fig. 5. Experimental setup of the II system with super wide viewing angle.

The proposed II system is configured with the specifications in Table 1. We use a pinhole array instead of a lens array. Each elemental image contains 13 × 13 pixels, which are covered by one elemental pinhole, so we build 13 × 13 virtual cameras to pick up the AEIAs.

Table 1. Configuration parameters and experiment environment of the proposed II system

Pinhole array                     Pinhole amount                       196 × 110
                                  Interval                             3.030 mm
LCD panel                         Pixel pitch                          0.2331 mm
                                  Gap between pinhole array and LCD    4.210 mm
Rendering parameter               AEIA resolution                      2560 × 1440 pixels
                                  Elemental image size                 13 × 13 pixels
Tracking system                   Kinect                               Kinect SDK for Windows Ver. 1.7
Maximum tracking angle            In horizontal direction              57°
Proposed POVZ II viewing angle    In horizontal direction              120°

In our experiment, we build a “man head” as the 3D scene, with the central depth plane located at the center of the head, as shown in Fig. 6. The viewing angle of the conventional tracking based II display is 57° (±28.5°), which is equal to the maximum tracking angle of the Kinect when the viewing distance is about 3.1 m.

Fig. 6. 3D scene built in experiments.

When the viewer’s position is tracked, the AEIAs are obtained; two of them are shown in Figs. 7(a) and 7(b). A0,0 is obtained when the viewer is at P1(0.0 m, 0.1 m, 3.1 m), at a viewing angle of 0° in the viewing space, and A3,0 is captured when the


viewer moves to P2(3.7 m, 0.1 m, 2.4 m), at a viewing angle of about 55°. Simultaneously, the virtual camera array is shifted by (18 × 5.9) mm in the horizontal direction according to Eq. (11). The viewing zones are shown in Figs. 7(c) and 7(d); the system displays A3,0 as the most marginal AEIA when the viewing angle exceeds 27.5°.

Fig. 7. AEIAs and viewing zones: (a) A0,0 and corresponding elemental images, (b) A3,0 and corresponding elemental images, (c) the region of V0,0, (d) the region of V3,0.

In our experiment, each elemental image in A-1,0 has 9 overlapped pixels with the corresponding elemental image in A0,0, so the initial overlapped coefficient t is 9/13 in the horizontal direction. The viewing space is divided into seven viewing zones in the horizontal direction, with seven corresponding AEIAs. The region of Vi,j, the pixel shift Δni,j of Ai,j, the corresponding shift of the virtual camera array, and the trigger angles of the adjacent viewing zones are listed in Table 2.

Table 2. Region of POVZ and the trigger angle in experiment

Vi,j       Region              Δni,j (pixel)   ΔDi,j (mm)
(-3, 0)    (-60.0°, -25.5°)    18              -106.2
(-2, 0)    (-45.1°, -14.0°)    10              -59.0
(-1, 0)    (-24.6°, +10.2°)    4               -23.6
(0, 0)     (-17.5°, +17.5°)    0               0
(1, 0)     (-10.2°, +24.6°)    -4              23.6
(2, 0)     (+14.4°, +46.8°)    -10             59.0
(3, 0)     (+25.5°, +60.0°)    -18             106.2

Trigger angles between adjacent viewing zones: -27.5°, -18.2°, -7.6°, +7.6°, +18.2°, +27.5°.

When the viewer moves in front of the II display, images are captured from different positions, as shown in Fig. 8. By practical measurement, the maximum viewing angle without the flipping effect in the conventional II system is only 35°, as shown in Figs. 8(a)–8(c), whereas in the proposed POVZ II system it is 120° (±60°), as shown in Figs. 8(d)–8(h). The super wide viewing angle is achieved by the proposed POVZ. Thanks to the good response time and accuracy of the tracking device, the flipping and time delay of the 3D scene when the viewer moves fast are further reduced.


Fig. 8. Viewing angle of the conventional II system (Media 1): (a) leftmost view, (b) middle view, (c) rightmost view; and viewing angle of the POVZ II system (Media 2): (d) leftmost view, (f) middle view, (h) rightmost view, with (e) and (g) showing the comparison with the conventional II viewing angle.

4. Conclusion

An II system based on the POVZ is proposed to enhance the viewing angle effectively, without flipping and time delay even if the viewer moves quickly. The POVZs allot the viewing space according to the relationship between the AEIAs and the viewing zones. The generation method for the AEIAs of the POVZ is also proposed. In the experiment, the viewing angle of the II system is 120° without the flipping effect, which is more than twice the maximum tracking angle of the Kinect. In addition, the viewing angle of our II system can be extended with a tracking device of better performance. By applying the POVZ to each viewer in a multi-viewer tracking II system, it may be possible to display the 3D images for each viewer with a super wide viewing angle.

Acknowledgments

The work is supported by the “973” Program under Grant No. 2013CB328802, the NSFC under Grant Nos. 61225022 and 61320106015, and the “863” Program under Grant Nos. 2012AA011901 and 2012AA03A301.

