Chinese Optics Letters, 2024, 22 (1): 011101, Published Online: Jan. 9, 2024  

Optimizing depth of field in 3D light-field display by analyzing and controlling light-beam divergence angle

Author Affiliations
State Key Laboratory of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications (BUPT), Beijing 100876, China
Abstract
A concept of divergence angle of light beams (DALB) is proposed to analyze the depth of field (DOF) of a 3D light-field display system. The mathematical model between DOF and DALB is established, and the conclusion that DOF and DALB are inversely proportional is drawn. To reduce DALB and generate clear depth perception, a triple composite aspheric lens structure with a viewing angle of 100° is designed and experimentally demonstrated. The DALB-constrained 3D light-field display system significantly improves the clarity of 3D images and also performs well in imaging at a 3D scene with a DOF over 30 cm.

1. Introduction

3D light-field display (LFD) can restore the original light-field distribution of a real 3D scene[1–6] and is regarded as an ideal and promising 3D display technology, with applications in many fields, especially medical research, museum exhibitions, and military command.

In the early stages of light-field theory, the plenoptic function (a 7D function that describes light in a scene in terms of position, angle, wavelength, and time) was used to model the distribution of light rays[7]. For practical applications, the light-field model has been simplified to five dimensions[8]. This 5D model can effectively represent the set of light rays emanating from every point in 3D space in every direction, where the location of each ray is denoted by (x, y, z) and its direction by (θ, φ). As shown in Fig. 1, the real scene can be interpreted as a combination of voxels A, and the corresponding light field F can be parameterized as L(x, y, z, θ, φ). To obtain such a light field, a 2D display device (such as an LCD, LED, or projector) provides the 2D location information (x, y), while a light control device comprising a series of light control units provides the 2D directional information (θ, φ). Consequently, the 3D reconstructed scene can be interpreted as a combination of reconstructed voxels Ã, which contain light-ray information in only four dimensions, and the corresponding reconstructed light field F̃ can be parameterized as L(x, y, θ, φ).
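To make the 4D parameterization concrete, the reconstructed light field L(x, y, θ, φ) can be pictured as a discretely sampled 4D array. The following minimal sketch is our illustration, not from the paper; all array sizes are assumed.

```python
import numpy as np

# Minimal sketch (assumed sizes): a discretized 4D light field
# L(x, y, theta, phi) stored as a dense array. Spatial samples map to
# display subpixels; angular samples map to light-control directions.
nx, ny = 64, 64          # spatial resolution (x, y)
ntheta, nphi = 8, 8      # angular resolution (theta, phi)

light_field = np.zeros((nx, ny, ntheta, nphi), dtype=np.float32)

def radiance(lf, x, y, theta, phi):
    """Look up the radiance of one ray (x, y, theta, phi)."""
    return lf[x, y, theta, phi]

# A reconstructed voxel corresponds to the set of rays, one per angular
# sample, that converge at one point in front of the panel.
light_field[32, 32, :, :] = 1.0   # rays reconstructing one bright voxel
print(radiance(light_field, 32, 32, 0, 0))  # -> 1.0
```

The missing z dimension is visible here: depth is not stored, so it must be implied by how rays from neighboring (x, y) samples converge, which is exactly what limits the DOF discussed below.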

Fig. 1. 3D light field reconstruction process of the real scene and contrast at different DOF.

Depth of field (DOF) is defined as the range of depth within which a 3D image can be accurately reconstructed. Because the reconstructed light field lacks the z dimension, the DOF of the restored 3D scene is quite limited. In particular, the viewing disparity is proportional to depth in a 3D display, and more serious image aliasing occurs when the depth exceeds a certain threshold, which degrades imaging quality and lowers the visual experience. As shown in Fig. 1, at a depth of 5 cm the viewing disparity is very small, so the image is clear and acceptable. However, at a depth of 20 cm, image quality deteriorates dramatically, and depth information cannot be accurately expressed. Therefore, a key challenge for 3D LFD is to develop effective methods to improve the DOF of the reconstructed 3D scene.

In order to design a high-quality 3D LFD system, a number of studies on DOF analysis and enhancement have been performed[9–14]. Some researchers analyzed the light-intensity distribution in an amplitude-modulating pickup system and implemented an optical pickup experiment using an amplitude-modulated sensor array (SA) to generate a DOF-enhanced elemental image array (EIA), which is computationally reconstructed to produce 3D images with extended DOF[9]. This method is more suitable for enhancing the DOF of 3D scenes with small depth ranges. In another study, a gradient-amplitude-modulating (GAM) method was proposed to enhance DOF in an integral imaging (II) pickup system[10]. However, the GAM method sacrifices light efficiency and may not be practical in scenarios with strict optical-efficiency requirements. Adjusting lens parameters has also been attempted to overcome DOF limitations[11–14], but this approach still suffers from degraded light efficiency and information quantity.

Here, the divergence angle of a light beam (DALB) emitted from each light control unit and corresponding to different subpixel units is considered to be a crucial factor influencing DOF in 3D LFD. The present study examines the impact of DALB on DOF and proposes an optimization scheme that enhances DOF and ensures the display quality from different viewing perspectives.

2. Analysis

2.1. Relation between DALB and DOF

In the light-field reconstruction process, the intensity and color information of the voxels is loaded onto the subpixels of the LCD panel, while the direction and location information is provided by the light control panel. In the ideal theoretical analysis of voxel reconstruction, the chief rays emitted from the centers of subpixels a1 to an converge in free space after passing through the corresponding light control units, so that a point-like voxel "A" is reconstructed. This analysis method, which considers only the chief rays, has been widely used in voxel analysis of 3D LFD[15–20]. However, the actual light path reveals that it is not only the chief ray that participates in voxel generation. Because each subpixel emits rays from every point on its surface, a light beam (LB) is generated by the modulation of the light control units. As a result, the actual constructed voxel is not point-like but spot-like, as illustrated in Fig. 2. Considering the minimum angular resolution of the human eye, when the diameter of the speckle is less than a certain value, it is recognized as a clear image point; otherwise, aliasing appears and affects the reconstruction quality, particularly by degrading the depth perception.

Fig. 2. LBs mapping of voxel A.

The DOF is the range of distances in which speckles of voxels can be distinguished from one another and objects can be imaged with clarity. Due to the fact that the diameter of the voxel speckle varies with the DALB, the DOF is in turn affected. As shown in Fig. 2, the reconstruction voxel A is regarded as an example to demonstrate how the DALB affects the DOF of the LFD system.

In Fig. 2, a set of LBs from display pixels is focused on the depth plane of voxel A by the light control panel, forming a spot with a diameter of w, and the chief rays of the beams converge at point A. The light-control panel lies in the XY coordinate plane, and the center of voxel A lies on the Z axis. The LBs are denoted LB1 to LBN, where N is their number. For a given beam LBk, the lateral distance between the center of its corresponding light control unit and the center of voxel A is denoted Δxk, and its DALB is denoted βk. According to the geometrical relationship in Fig. 2,

$$\beta_k = \arctan\frac{\Delta x_k + \frac{1}{2}w}{D} - \arctan\frac{\Delta x_k - \frac{1}{2}w}{D}, \tag{1}$$

$$\Delta x_k = D\tan\!\left[\frac{\alpha(k-1)}{N-1} - \frac{\alpha}{2}\right]. \tag{2}$$

Fig. 3. (a) Optical path at 0° incidence angle without aberration; (b) optical path at 0° incidence angle in actual situations; (c) intensity distribution of ideal lens and actual lens at 0° incidence angle; (d) variation in DALB with increasing incidence angle.

In these equations, w is the diameter of the voxel A spot, which is the acceptable size for forming a clear reconstructed image; α is the viewing angle of the voxel A; and D is the distance from the voxel A to the light-control panel, which reflects the DOF of the light field. Since D is usually much larger than w for a better stereo experience, Eqs. (1) and (2) can be further simplified:

$$\tan\beta_k = \frac{wD}{D^2 + \Delta x_k^2 - \frac{1}{4}w^2} \approx \frac{wD}{D^2 + \Delta x_k^2} = \frac{w}{D + \Delta x_k\tan\!\left[\dfrac{\alpha(k-1)}{N-1} - \dfrac{\alpha}{2}\right]}. \tag{3}$$

Afterwards, D can be deduced from the above result and approximately expressed as a function of βk:

$$D = \frac{w}{\tan\beta_k} - \Delta x_k\tan\!\left[\frac{\alpha(k-1)}{N-1} - \frac{\alpha}{2}\right]. \tag{4}$$

The derived formula shows that the DOF of the reconstructed light field is inversely proportional to the DALB. It follows that diminishing the DALB can increase the DOF of the 3D light field.
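As a numerical sanity check of Eqs. (1)–(4), the following sketch (our illustration; the values of w, α, D, and N are assumed, not taken from the paper) computes the exact DALB of one beam and recovers the depth from it, showing that a smaller DALB corresponds to a larger recoverable depth:

```python
import math

# Illustrative values (assumed, not from the paper).
w = 0.001                   # acceptable voxel spot diameter, m
alpha = math.radians(100)   # viewing angle of the voxel
D = 0.30                    # true depth of the voxel plane, m
N = 80                      # number of contributing light beams

def dalb(k):
    """Exact divergence angle beta_k of beam k, Eqs. (1)-(2)."""
    theta = alpha * (k - 1) / (N - 1) - alpha / 2
    dx = D * math.tan(theta)                        # Eq. (2)
    return math.atan((dx + w / 2) / D) - math.atan((dx - w / 2) / D)

def depth(beta_k, k):
    """Depth recovered from beta_k via the approximation of Eq. (4)."""
    theta = alpha * (k - 1) / (N - 1) - alpha / 2
    dx = D * math.tan(theta)
    return w / math.tan(beta_k) - dx * math.tan(theta)

# Consistency check: Eq. (4) recovers the true depth, and a smaller
# DALB (here artificially halved) maps to a larger depth of field.
print(round(depth(dalb(1), 1), 4))      # -> 0.3
print(depth(dalb(1) / 2, 1) > D)        # -> True
```

The check mirrors the paper's conclusion: for fixed spot size w and geometry, depth scales roughly as w/tan βk, so halving the DALB roughly doubles the usable depth.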

2.2. Influencing factors of DALB in a lens-based 3D-LFD system

With the relation between DOF and DALB now ascertained, it becomes necessary to discuss the factors that impact DALB.

A lens array is an established choice of light-control panel. One of the basic optical properties of a lens element is its aberration, which causes the emergent light rays to deviate from their anticipated paths; aberrations must therefore be considered in the angular analysis. In the following, the optical path through a lens unit is used to demonstrate how aberrations affect the DALB in a lens-based 3D-LFD system.

Figures 3(a) and 3(b) show the optical paths of an ideal lens unit and an actual lens unit when the direction of the LB is 0°. Here, the direction of the LB refers to the angle of the chief ray from the subpixel through the center of the corresponding lens unit. The spacing between the subpixel unit and the lens unit is set to the focal length of the lens unit, denoted f. In the aberration-free case, the LB emitted from a subpixel diffuses at an angle βideal after passing through an ideal lens element, as depicted in Fig. 3(a). In practice, however, the DALB is highly vulnerable to aberrations: as shown in Fig. 3(b), the aberrations of a real lens element expand this angle to βactual. The light-intensity distribution in Fig. 3(c) likewise shows that lens aberrations significantly increase the DALB. In this simulation, the aperture and focal length of the lens unit are 10 mm and 20 mm, respectively, and the subpixel size is 2 mm.

Snell’s law dictates that aberrations become more severe for light rays with larger incident angles; thus, the direction of an LB also affects its divergence angle. Figure 3(d) illustrates the trend of the DALB through an ideal lens unit and an actual lens unit at different direction angles. As the direction angle increases, β for the ideal lens declines from 5.72° to 4.64°, while β for the actual lens rises from 10.81° to 19.71°; moreover, β of the actual lens consistently exceeds that of the ideal lens. As concluded in Section 2.1, an increase in the DALB decreases the DOF, thereby impairing depth perception and degrading the visual experience at large viewing angles.

According to the theory of optical aberrations, as the direction angle increases, the influence of spherical and chromatic aberrations remains small, whereas coma, astigmatism, and field curvature become more severe.

In order to investigate the effects of coma, astigmatism, and field curvature on the DALB, aberration simulation experiments were conducted. Zernike polynomials are commonly employed to characterize the wavefront aberrations of optical systems; therefore, specific aberrations were introduced onto an ideal lens through Zernike coefficients and modeled in Zemax (optical simulation software). The DALB was measured at different incidence angles using the RAID merit-function operand.

Figure 4 presents the results of the aberration simulation experiments, illustrating the individual effect of each of the three types of aberrations on the DALB. Δβ denotes the difference between the DALBs of the actual and ideal lenses, i.e., Δβ = βactual − βideal. The results indicate that Δβ is always greater than 0° for each of the three aberrations, and that all three cause Δβ to rise with increasing incident angle, with field curvature having the most significant effect.

Fig. 4. Influence of three types of aberrations.

3. Optimization Method

Based on the analysis above, the imaging quality of the lens unit is a crucial factor for reconstructing high-quality 3D light fields with a small DALB and a large DOF. In the subsequent design, the maximum incident angle is set to 50° (a full viewing angle of 100°). Under this condition, a single-lens unit would produce an extremely large DALB and significantly impair the depth quality of the 3D image. It therefore becomes imperative to implement an aberration-limited lens structure that suppresses coma, astigmatism, and field curvature, with the greatest optimization weight assigned to field curvature.

In classical optical design, a symmetrical lens structure is employed to suppress field curvature, and a combination of a lens and an aperture stop serves to suppress coma and astigmatism. Additionally, aberration theory indicates that reducing the optical aperture of a lens further suppresses all three aberrations. Taking these factors into account, an initial structure of the optimized compound lens is obtained, as shown in Fig. 5. This structure is composed of lens-1, lens-2, and an aperture stop: lens-2 and the aperture stop suppress coma and astigmatism, while lens-1 suppresses field curvature. To enhance the aberration suppression, both lens-1 and lens-2 use smaller optical apertures.

Fig. 5. Three optimized optical structures and the initial structure of the compound lens.

In order to minimize the impact of aberrations, we introduced aspheric models on the two surfaces to suppress coma and astigmatism while balancing the higher-order aberrations. The aspheric surface sag is given by

$$z = \frac{cr^2}{1+\sqrt{1-(1+k)c^2r^2}} + \alpha_2 r^2 + \alpha_4 r^4 + \alpha_6 r^6 + \cdots, \tag{5}$$

where r is the radial coordinate, k is the conic constant, c is the vertex curvature, and α2, α4, and α6 are the aspheric coefficients. The conic and higher-order aspheric coefficients of the two surfaces are set as variables, and the optimization goal is to make the outgoing light from each field as close to 0° as possible. After a series of optimization-weight adjustments and iterative calculations, the optical structure and parameters of the designed compound lens are displayed in Fig. 6(a) and Tables 1 and 2.
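For illustration, Eq. (5) can be evaluated directly. The sketch below is our code, not the paper's, and it assumes that the tabulated r in Table 2 is the vertex radius of curvature (so c = 1/r) and that r is sampled at an arbitrarily chosen radial coordinate:

```python
import math

def aspheric_sag(r, c, k, a2=0.0, a4=0.0, a6=0.0):
    """Surface sag z(r) of Eq. (5): conic base plus polynomial terms."""
    z = c * r**2 / (1 + math.sqrt(1 - (1 + k) * c**2 * r**2))
    return z + a2 * r**2 + a4 * r**4 + a6 * r**6

# Surface 1 of Table 2 (assuming c = 1 / 0.226 mm^-1): conic -0.518,
# polynomial coefficients a2, a4, a6 as listed; evaluated at r = 0.1 mm.
c1 = 1 / 0.226
z = aspheric_sag(0.1, c1, -0.518, a2=0.571, a4=12.355, a6=-193.26)
print(round(z, 4))  # sag in mm at r = 0.1 mm
```

Such a function is also useful for checking that the square-root argument stays positive over the full clear aperture, i.e., that Eq. (5) remains defined out to the lens edge.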

Fig. 6. Design of the optimized compound lens. (a) Parameter description; (b) light-path distribution.

Table 1. Structural Parameters for the Optimized Compound Lens

| Parameter | Value | Unit |
| Refractive index of the air (n_air) | 1 | — |
| Refractive index of lens-1 (n1) | 1.61 | — |
| Refractive index of lens-2 (n2) | 1.41 | — |
| Optical aperture of lens-1 (D1) | 0.40 | mm |
| Optical aperture of lens-2 (D2) | 0.60 | mm |
| Aperture of the aperture stop (D3) | 0.10 | mm |
| Width of the compound lens (D4) | 0.878 | mm |
| Central thickness of lens-1 (d1) | 0.209 | mm |
| Central thickness of lens-2 (d2) | 0.374 | mm |

Table 2. Aspheric Parameters of Lens Surface

| Surface | r | Conic | Second-order term | Fourth-order term | Sixth-order term |
| 1 | 0.226 | −0.518 | 0.571 | 12.355 | −193.26 |
| 2 | −0.307 | 0 | −0.127 | 0.418 | −4.84 |

Figure 6(b) illustrates the simulation results for the corrected light distribution based on the optimized compound lens elements. (Given the symmetry of the system, we only present one side of the viewing range for clarity.) We can see that, after optimization, the emergent LBs exhibit a nearly parallel distribution and hardly change with field of view.

To validate the effectiveness of the proposed method in reducing the DALB, a simulation experiment was conducted with a subpixel size of 0.06 mm (typical of one subpixel in a 65-inch 8K display). Five lenses were simulated and modeled: an ideal lens, the proposed compound lens, and standard lenses with relative diameters of 100%, 30%, and 20%. The curves of the DALB versus incident angle for these lenses are shown in Fig. 7.

Fig. 7. Change in the DALB with incident angle for five kinds of lenses with a subpixel size of 0.06 mm in the (a) central viewing area and (b) peripheral viewing area.

In the central viewing area from 0° to 25°, the optimized compound lens consistently exhibited lower DALB compared to the standard lenses, and its DALB distribution closely resembled that of the ideal lens. It should be noted that the standard lens with 100% diameter theoretically achieves the same 100° viewing angle as the compound lens. However, due to its large curvature, it introduces severe aberrations that make accurate DALB measurement unfeasible. The standard lenses with 30% and 20% diameters sacrifice viewing angle to reduce the DALB, but the difference from the compound lens remains substantial.

In the peripheral viewing area from 25° to 50°, the optical performance of all standard lenses deteriorates significantly, whereas the optimized compound lens maintains high-quality light control.

In practical engineering applications, standard lenses have a limited capability to achieve a wide viewing angle, and they cannot provide clear display at 100°. In contrast, the proposed compound lens, with its unique optical design, shows significant improvement in reducing the DALB compared to standard lenses, which was previously unattainable.

4. Experiment

To verify the feasibility of the proposed method, the compound lens array is fabricated utilizing the UV embossing process, and the 3D LFD system based on this array is established for experimental validation. The manufactured compound lens array is demonstrated in Fig. 8(b), while the configuration of the experimental 3D LFD system is exhibited in Fig. 8(a). Due to the limitations of manufacturing precision, there may be slight but acceptable deviations between the width of the manufactured compound lens and its design value.

Fig. 8. (a) Configuration of the experimental 3D LFD system; (b) structural diagram of the proposed compound lens array.

Due to the presence of an aperture stop, the proposed compound lens inevitably reduces the light efficiency (measured at about 16%) and thus lowers the image brightness. To compensate for this loss, the constructed optical system incorporates a high-intensity backlight of 40,000 nits, resulting in a final display brightness of 320 nits, which is suitable for most viewing environments and effectively addresses the optical-efficiency loss.
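The brightness figures can be checked with simple arithmetic. In the sketch below, the ~5% LCD panel transmittance is our assumption (the paper states only the 16% lens efficiency and the 40,000-nit backlight), so the numbers are illustrative:

```python
# Hedged brightness budget. lcd_transmittance is an assumed typical
# value; the paper gives only the backlight level and lens efficiency.
backlight_nits = 40_000
lens_efficiency = 0.16     # aperture-stop loss, measured in the paper
lcd_transmittance = 0.05   # typical LCD transmittance (assumed)

display_nits = backlight_nits * lens_efficiency * lcd_transmittance
print(round(display_nits))  # -> 320
```

With that assumed panel transmittance, the 16% lens efficiency and 40,000-nit backlight are consistent with the reported 320-nit display brightness.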

In the experimental 3D LFD system, the horizontal viewing angle ranges from −50° to 50°. A 65-inch flat-panel LCD with a resolution of 7680 × 4320 is used to load the synthetic images, which are constructed using the method proposed by Yu et al.[21]. With the proposed compound lens array placed 0.5 mm in front of the display device, the reconstructed 3D image can be observed when the coded image is displayed on the LCD panel. The corresponding dimension parameters of the experimental 3D LFD system are listed in Table 3.

Table 3. Dimension Parameters of the 3D LFD System

| Parameter | Value | Unit |
| FOV of the proposed system | 100 | degree |
| Size of the LCD panel | 65 | inch |
| Resolution of the LCD panel | 7680 × 4320 | — |
| Elemental image unit size (subpixel size) | 60 | µm |
| Compound lens unit size | 0.88 | mm |
| Number of compound lens units | 794 | — |
| Distance between the LCD panel and the compound lens array | 0.5 | mm |

To validate the effectiveness of our proposed methods for aberration suppression and depth enhancement, we conducted a comparative experiment by displaying 3D images on two 3D LFD systems: one with a traditional single-lens array and the other with the proposed DALB-limited compound lens array. A model of two dragon sculptures with a theoretical depth of 30.35 cm was captured by 80 virtual cameras, coded, and loaded onto the LCD panel, and the displayed 3D image was photographed at a distance of 2 m with a Canon camera. The comparison results for different horizontal views are illustrated in Fig. 9 and Visualization 1. Enlarging the 3D image at viewing angles of 0°, −50°, and 50° shows that the redesigned system produced clearer reconstructed images and more detailed depth information after the DALB had been reduced, especially at the edge perspectives. These findings confirm that the proposed light-field display method improves 3D imaging quality.

Fig. 9. Comparison of different views of the displayed 3D effects based on (a) the proposed DALB-limited compound lens array (see Visualization 1) and (b) the traditional single-lens array.

To quantify the improvement of the DOF after limiting the DALB, a depth verification experiment is then carried out. The main parameters and 3D scene layout are shown in Fig. 10(a). More specifically, the experimental 3D scene features eight letters “A” and “B” placed at equal intervals but at unequal depths. The letter A denotes an object inside the screen, while the letter B represents an object outside the screen. The DOF range of the 3D scene was set from 0 to 30 cm.

Fig. 10. Confirmation experiments of DOF. (a) Arranged 3D scene; (b) displayed light-field image based on single-lens array and proposed compound lens array.

Figure 10(b) presents the restored light-field images displayed on the 3D LFD systems based on the single-lens array and the proposed compound lens array, respectively. Comparison of the images shows that the proposed system significantly improves image quality at every depth, particularly at 30 cm: in the proposed system the letters remain in sharp focus, whereas their profiles in the comparison system are blurred. These optical experimental results prove that reducing the DALB yields a larger DOF, and that the 3D LFD system built from the designed DALB-limited optical structures is particularly suitable for displaying 3D scenes with large depth.

5. Conclusion

In this paper, a novel DOF analytical approach based on the DALB in a 3D LFD system is proposed. A mathematical model is established to analyze the relationship between the DALB and the DOF, leading to the conclusion that they are inversely proportional. Optical path simulations attribute the expansion of the DALB in standard lenses to optical aberrations, particularly coma, astigmatism, and field curvature. To improve depth quality, a novel compound lens comprising two aspheric lenses and an aperture stop is designed and tested to restrain these aberrations and reduce the DALB. A 3D LFD system based on the proposed compound lens array is established for experimental validation, and the validity of the proposed methods is demonstrated through depth verification experiments. Compared to the 3D images reconstructed by a conventional single-lens-based 3D display system, the proposed DALB-limited 3D display achieves a larger DOF and improved 3D imaging quality. Using the experimental display system, a clear 3D image with a DOF of 30.35 cm can be viewed within a 100° viewing range.

References

[1] Y. Li, N. Li, D. Wang, et al.. Tunable liquid crystal grating based holographic 3D display system with wide viewing angle and large size. Light Sci. Appl., 2022, 11: 188.

[2] D. Wang, C. Liu, C. Shen, et al.. Holographic capture and projection system of real object based on tunable zoom lens. PhotoniX, 2020, 1: 6.

[3] X. Yan, Z. Yan, T. Jing, et al.. Enhancement of effective viewable information in integral imaging display systems with holographic diffuser: quantitative characterization, analysis, and validation. Opt. Laser Technol., 2023, 161: 109101.

[4] J. Wen, X. Jiang, X. Yan, et al.. Dual-mode light field display with enhanced viewing range: from far distance viewable to near distance touchable. Optik, 2022, 252: 168403.

[5] J. Wen, X. Yan, X. Jiang, et al.. Integral imaging based light field display with holographic diffusor: principles, potentials and restrictions. Opt. Express, 2019, 27: 27441.

[6] X. Gao, X. Sang, X. Yu, et al.. 360 light field 3D display system based on a triplet lenses array and holographic functional screen. Chin. Opt. Lett., 2017, 15: 081902.

[7] P. Moon, D. E. Spencer. Theory of the photic field. J. Frank. Inst., 1953, 255: 33.

[8] E. H. Adelson, J. R. Bergen. The plenoptic function and the elements of early vision. In Computational Models of Visual Processing, 1991: 3.

[9] C. Luo, Q. Wang, H. Deng, et al.. Extended depth-of-field in integral-imaging pickup process based on amplitude-modulated sensor arrays. Opt. Eng., 2015, 54: 073108.

[10] C. Luo, H. Deng, L. Li, et al.. Integral imaging pickup method with extended depth-of-field by gradient-amplitude modulation. J. Display Technol., 2016, 12: 1205.

[11] H. Yun, A. Llavador, G. Saavedra, et al.. Three-dimensional imaging system with both improved lateral resolution and depth of field considering non-uniform system parameters. Appl. Opt., 2018, 57: 9423.

[12] K. Kwon, M. Erdenebat, Y. Lim, et al.. Enhancement of the depth-of-field of integral imaging microscope by using switchable bifocal liquid-crystalline polymer micro lens array. Opt. Express, 2017, 25: 30503.

[13] X. Shen, B. Javidi. Large depth of focus dynamic micro integral imaging for optical see-through augmented reality display using a focus-tunable lens. Appl. Opt., 2018, 57: B184.

[14] H. Du, L. Dong, M. Liu, et al.. Increasing aperture and depth of field simultaneously with wavefront coding technology. Appl. Opt., 2019, 58: 4746.

[15] X. Gao, X. Sang, W. Zhang, et al.. Viewing resolution and viewing angle enhanced tabletop 3D light field display based on voxel superimposition and collimated backlight. Opt. Commun., 2020, 474: 126157.

[16] P. Wang, X. Sang, X. Yu, et al.. A full-parallax tabletop three dimensional light-field display with high viewpoint density and large viewing angle based on space-multiplexed voxel screen. Opt. Commun., 2021, 488: 126757.

[17] S. Shi, J. Wang, J. Ding, et al.. Parametric study on light field volumetric particle image velocimetry. Flow Meas. Instrum., 2016, 49: 70.

[18] C. Gao, X. Sang, X. Yu, et al.. Space-division-multiplexed catadioptric integrated backlight and symmetrical triplet-compound lenticular array based on ORM criterion for 90-degree viewing angle and low-crosstalk directional backlight 3D light-field display. Opt. Express, 2020, 28: 35074.

[19] X. Yu, X. Sang, X. Gao, et al.. Dynamic three-dimensional light-field display with large viewing angle based on compound lenticular lens array and multi-projectors. Opt. Express, 2019, 27: 16024.

[20] Z. Wang, L. Zhu, H. Zhang, et al.. Real-time volumetric reconstruction of biological dynamics with light-field microscopy and deep learning. Nat. Methods, 2021, 18: 551.

[21] X. Yu, X. Sang, D. Chen, et al.. Autostereoscopic three-dimensional display with high dense views and the narrow structure pitch. Chin. Opt. Lett., 2014, 12: 060008.

Xunbo Yu, Yiping Wang, Xin Gao, Hanyu Li, Kexin Liu, Binbin Yan, Xinzhu Sang. Optimizing depth of field in 3D light-field display by analyzing and controlling light-beam divergence angle[J]. Chinese Optics Letters, 2024, 22(1): 011101.
