Computational 4D imaging of light-in-flight with relativistic effects
1. INTRODUCTION
Optical imaging of ultra-fast phenomena [1,2] provides critical information for understanding fundamental aspects of the world we live in [3–5].
Compared to everyday photography, imaging light-in-flight is particularly interesting because light is both the medium carrying information to the camera, in the form of scattered photons, and the object to be imaged itself. In such a scenario, the speed of light cannot be treated as infinite, as it otherwise is in everyday photography. The recording of such events is significantly observer-dependent and exhibits spatiotemporal distortions [18–...].
To explain this point further, consider the two examples in Fig. 1.
Fig. 1. Schematics of the difference between imaging (a) a moving car and (b) a flying light pulse. The marked intervals are the time during which the object moves from position A to position B, and the times of flight for the scattered photons to propagate to the camera from positions A and B, respectively.
In order to retrieve observer-independent information of light-in-flight, this relativistic effect (the finite speed of light c) needs to be compensated for, so as to determine the accurate time at which the event actually happens rather than the arrival time at which it is detected by the camera. In holographic light-in-flight imaging, this compensation can be performed using a graphical method based on the ellipsoids of the holodiagram [22]. A straightforward approach is to simply remove the times of flight of the scattered photons, i.e., the intervals marked in Fig. 1(b).
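As a minimal numeric sketch of this straightforward compensation (the geometry and values here are illustrative assumptions, not the experimental configuration), removing the time of flight amounts to subtracting the point-to-camera distance divided by c from each recorded arrival time:

```python
# Sketch: compensating arrival times by removing the photon time-of-flight.
# The geometry and numbers below are illustrative assumptions.
C = 299_792_458.0  # speed of light in m/s

def event_time(arrival_time_s, point_xyz, camera_xyz):
    """Actual event time = arrival time minus photon travel time to the camera."""
    d = sum((p - q) ** 2 for p, q in zip(point_xyz, camera_xyz)) ** 0.5
    return arrival_time_s - d / C

# A pulse scattering at two points 0.3 m apart along x, camera 1 m away on z.
# The arrival times were chosen so the true event times are ~1 ns and ~2 ns:
t_a = event_time(4.336e-9, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
t_b = event_time(5.483e-9, (0.3, 0.0, 0.0), (0.0, 0.0, 1.0))
```

After compensation, the interval t_b − t_a matches the 0.3 m / c that the pulse actually needed, whereas the raw arrival times would suggest a different apparent speed.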
Interestingly, we note that the observer-dependent data of light-in-flight contain more information than the aforementioned works have exploited. Here, we demonstrate that the relativistic effects can be compensated during the imaging of light-in-flight by further exploiting the (x, y, t) data recorded by a SPAD camera via a rigorously constructed optical model and a computational layer, so as to obtain the non-distorted time of a flying light pulse without any additional measurements or auxiliary ranging equipment. Simultaneously, the information of an extra dimension, i.e., the z dimension, can be retrieved, leading to an observer-independent space-time (x, y, z, t) four-dimensional (4D) reconstruction of light-in-flight. The proposed scheme enables the accurate visualization of transient optical phenomena such as light scattering or interaction with materials.
2. EXPERIMENTAL SETUP
Our experimental system is illustrated in Fig. 2.
Fig. 2. Experimental system for light-in-flight measurement and data processing. (a) In the experiment, the pulsed laser and the SPAD camera are synchronized via a trigger generator. A 636 nm pulsed laser emits pulses across the field of view of the SPAD camera, which is fitted with a lens of 3.5 mm focal length. The object focal plane of the camera defines the field of view of the measurement. The SPAD camera collects the scattered photons from the propagating laser pulses and records a histogram at each pixel in TCSPC mode. (b) The raw histogram data are fitted with a Gaussian distribution. Histograms with widths that are too large or too small are discarded (pixels 1 and 2). Malfunctioning pixels with abnormally large counts are also discarded (pixel 4), leaving only the effective pixels (pixel 3). (c) The arrival time of the scattered photons at each effective pixel is determined as the peak position of the fitted Gaussian distribution, yielding a map of pixel position versus arrival time. Consequently, the projection of the light path on the x–y plane, as well as the arrival times along the path, is obtained, forming the (x, y, t) three-dimensional data of light-in-flight.
The accurate estimation of the arrival time of the scattered photons at each pixel of the SPAD camera is important for the light-in-flight reconstruction. Gaussian fitting is performed on the raw histogram data at each pixel to suppress the statistical random noise of photon counting. Under the assumption that the background light and the dark counts of the SPAD camera, whose temporal distribution is quasi-flat, add a bias to the histogram, a constant term is added to the Gaussian function during the fitting in order to improve the estimation accuracy of the arrival time. During the data processing, if the width of a fitted Gaussian curve is much larger or smaller than the expected width, the corresponding pixel is assumed to be malfunctioning or extremely noisy and is therefore discarded. Furthermore, the systematic overall delay, which is mainly caused by the electronic jitter of the related devices and differs for each pixel, is compensated by a temporal offset for each effective Gaussian curve. The offset for a given pixel is determined as the temporal difference between the measured peak position of that pixel and its theoretical value, obtained when the camera is uniformly illuminated with a collimated and expanded pulsed laser.
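The per-pixel fitting step can be sketched as follows. This is a simplified illustration, not the authors' code: the bin width, the width-rejection bounds, and the synthetic histogram are our own assumptions, and SciPy's general-purpose `curve_fit` stands in for whatever fitting routine was actually used.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_offset(t, a, mu, sigma, c0):
    """Gaussian peak plus a constant bias from background light / dark counts."""
    return a * np.exp(-0.5 * ((t - mu) / sigma) ** 2) + c0

def fit_arrival_time(bins_ps, counts, width_bounds_ps=(20.0, 200.0)):
    """Fit one pixel's histogram; return the peak position (ps) or None if rejected."""
    p0 = [counts.max() - counts.min(), bins_ps[np.argmax(counts)], 50.0, counts.min()]
    try:
        (a, mu, sigma, c0), _ = curve_fit(gauss_offset, bins_ps, counts, p0=p0)
    except RuntimeError:
        return None
    sigma = abs(sigma)
    # Discard pixels whose fitted width is implausibly narrow or broad.
    if not (width_bounds_ps[0] <= sigma <= width_bounds_ps[1]):
        return None
    return mu

# Synthetic histogram: 55 ps bins, pulse centered at 2200 ps, flat background.
rng = np.random.default_rng(0)
bins = np.arange(0, 5000, 55.0)
counts = gauss_offset(bins, 120, 2200, 80, 5) + rng.poisson(2, bins.size)
peak = fit_arrival_time(bins, counts)
```

The constant term `c0` absorbs the quasi-flat background so that it does not bias the fitted peak position `mu`, mirroring the reasoning above.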
Once the raw histogram data is Gaussian fitted and the temporal delay is compensated at each pixel, the peak of the fitted histogram is determined, which represents the arrival time of the light recorded at that pixel. As shown in Fig. 2(c), the arrival times across all effective pixels form the observer-dependent (x, y, t) data of the light-in-flight measurement.
3. OPTICAL MODEL AND COMPUTATIONAL LAYER
In order to transform the arrival time distorted by the relativistic effects into the accurate time at which the light pulse actually occupies a given position, an optical model is built as shown in Fig. 3.
Fig. 3. Optical model for the computation of the propagation time. The marked angles are the propagation angles of the corresponding segments, and the marked lengths are those of BA and BE, respectively. BE is the projection of BD on the reference plane (RP).
Using the calculated propagation angle, the propagation time can be determined for each recorded pixel and arrival time. Furthermore, the corresponding z coordinate can also be retrieved via knowledge of the angle. Therefore, the observer-independent 4D (x, y, z, t) information of light-in-flight is reconstructed.
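The role of the propagation angle can be illustrated with a simplified geometry (a construction of ours, not the paper's exact model): if the angle theta between the light path and the reference plane is known, each projected point (x, y) on the plane yields both its out-of-plane coordinate z and the true propagation time along the path.

```python
import math

C_MM_PER_PS = 0.299792458  # speed of light in mm/ps

def reconstruct_point(x_mm, y_mm, start_mm, theta_deg):
    """Given a projected point (x, y), the path's starting point on the
    reference plane, and the propagation angle theta between the path and
    that plane, recover z and the time elapsed since the pulse left the
    start (simplified geometry, for illustration only)."""
    sx, sy = start_mm
    proj = math.hypot(x_mm - sx, y_mm - sy)        # in-plane distance from start
    z = proj * math.tan(math.radians(theta_deg))   # out-of-plane coordinate
    path = proj / math.cos(math.radians(theta_deg))  # distance along the path
    return z, path / C_MM_PER_PS                   # (z in mm, time in ps)

# A point 30 mm from the start along the projection, path tilted by 10 deg:
z, t = reconstruct_point(30.0, 0.0, (0.0, 0.0), 10.0)
```

The same projected distance thus maps to a longer true path length, and hence a longer propagation time, as the angle steepens.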
The procedure to reconstruct multiple light paths is illustrated in Fig. 4.
Fig. 4. Reconstruction procedure for consecutive light paths. (a) For light path 1 (LP1), the reference plane (RP1) is the plane containing the starting point 1 (S1). The spatial location of the projection (PP1), propagation angle, and ending point (E1) of LP1 are determined using the proposed geometric model. (b) E1 is used as S2 for the reconstruction of LP2, and RP2 is the plane containing S2. The equation of LP2 and the position of E2 can be obtained. (c) In the same manner, LP3 and E3 are determined with RP3.
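The chaining of consecutive paths can be sketched as a simple loop in which each reconstructed ending point seeds the next path. Here `reconstruct_path` is a hypothetical stand-in for the geometric model applied to one path's measured data; in this sketch it merely applies a known displacement.

```python
def reconstruct_path(data, start):
    """Hypothetical stand-in for the single-path geometric model: the real
    scheme applies the angle estimation of Fig. 3 to one path's measured
    (x, y, t) data. Here `data` is simply a known displacement."""
    dx, dy, dz = data
    return (start[0] + dx, start[1] + dy, start[2] + dz)

def reconstruct_consecutive_paths(segments, start=(0.0, 0.0, 0.0)):
    """Sequentially reconstruct consecutive light paths: the ending point
    E_k of path k becomes the starting point S_{k+1} of path k+1, and the
    new reference plane RP_{k+1} is the plane containing S_{k+1}."""
    paths = []
    for data in segments:
        end = reconstruct_path(data, start)
        paths.append((start, end))
        start = end  # E_k serves as S_{k+1}
    return paths

# Laser -> Mirror 1 -> Mirror 2 -> exit point (illustrative displacements):
paths = reconstruct_consecutive_paths([(60, 0, 20), (-30, 40, 0), (50, 10, -20)])
```

Only the first starting point must be known a priori; every subsequent path inherits its start, and its reference plane, from the previous reconstruction.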
4. RESULTS
4.1 Propagation Angle Estimation
An experiment is performed using the setup in Fig. 2 to estimate the propagation angle.
During the experiment, the plane containing the laser-emitting point is selected as the reference plane. The histogram of any malfunctioning pixel is discarded, and its corresponding arrival time is determined by linear interpolation of the arrival times of the neighboring pixels. It is worth mentioning that, theoretically, the propagation angle can be estimated using one or several arrival times [25,26]. However, due to the discrete nature of the temporal measurement with the SPAD camera and the noise recorded during a practical experiment, the greater the number of arrival times involved in the calculation, the more accurate the estimated angle will be. Figure 5 shows the experimental results of the propagation angle estimation.
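The benefit of using many arrival times can be sketched with a least-squares estimate. The arrival-time model below (a distant camera, so that the arrival-time slope along the projection is (1 − sin θ)/(c cos θ)) is a simplified construction of ours, not the paper's formula; the pixel count and noise level are likewise assumptions.

```python
import math
import numpy as np

C = 0.299792458  # speed of light in mm/ps

def estimate_angle_deg(s_mm, t_arr_ps):
    """Estimate the propagation angle from many (projected distance, arrival
    time) samples by a least-squares line fit. Assumes a distant camera so
    the slope is k = (1 - sin(theta)) / (c * cos(theta)); inverting via the
    identity (1 - sin x)/cos x = tan(pi/4 - x/2)."""
    k = np.polyfit(s_mm, t_arr_ps, 1)[0]  # slope in ps/mm over all samples
    return math.degrees(math.pi / 2 - 2 * math.atan(k * C))

# Synthetic data: a pulse at theta = 10 deg sampled at 32 pixels, with
# 2 ps RMS timing noise added to each arrival time.
rng = np.random.default_rng(1)
theta = math.radians(10.0)
s = np.linspace(0, 60, 32)  # projected positions along the path, in mm
t = s * (1 - math.sin(theta)) / (C * math.cos(theta)) + rng.normal(0, 2, s.size)
est = estimate_angle_deg(s, t)
```

Because the fit averages over all 32 samples, the per-pixel timing noise largely cancels, which is exactly why involving more arrival times improves the estimate.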
Fig. 5. Experimental results of the propagation angle estimation. (a) Angle error resulting from using different numbers of arrival times for the estimation of the propagation angle. (b) Calculated propagation time with respect to arrival time at different propagation angles. (c) The variation of the measured full width at half maximum of a laser pulse with respect to its propagation angle, caused by the relativistic effects.
Figures 5(b) and 5(c) further show how the relativistic effects manifest in the calculated propagation time and in the measured pulse width at different propagation angles.
4.2 Light-in-Flight Reconstruction
A second experiment is then performed to reconstruct light-in-flight in a 3D space, where the pulses are emitted from a laser and reflected by two mirrors to generate three consecutive light paths across the FOV of the SPAD camera. In the experiment, a 40 mm distance is kept between the FOV and any optical elements (e.g., the laser source and the mirrors) in order to avoid spurious scattering of light into the measurement. The emitting point of the pulsed laser is selected as the origin (0, 0, 0) of the coordinate system for the calculation. Using the same configuration as before, the SPAD camera records the observer-dependent (x, y, t) information of the three light paths inside the FOV. The reconstruction of light-in-flight is performed by sequentially determining the light paths from the laser-emitting point to Mirror 1, then to Mirror 2, and then to the exit point. The reconstruction procedure is given in the previous section.
Figure 6 shows the experimental 4D reconstruction of the three consecutive light paths.
Fig. 6. Experimental 4D reconstruction of light-in-flight. (a) A reconstruction of a laser pulse reflected by two mirrors is demonstrated. The RMSEs of the reconstruction (red line) to the ground truth (dashed line) in position and time are 1.75 mm and 3.84 ps, respectively. (b) The difference between the calculated propagation time (red line) and measured arrival time (blue line) at each recorded frame. The propagation time is in good agreement with the ground truth (dashed line), demonstrating a feasible compensation for the relativistic effects via the proposed scheme.
Figure 6(b) compares the calculated propagation time with the measured arrival time at each recorded frame, showing that the compensated propagation time agrees well with the ground truth.
5. DISCUSSION AND CONCLUSION
The estimation of the propagation angle is crucial to the light-in-flight reconstruction in this work, and we have demonstrated accurate estimations of propagation angles up to 10°. Theoretically, the proposed approach can be used to estimate any angle between −90° and 90°, excluding the endpoints. Practically, there are two major aspects, noise and diffusion, to consider when measuring a steep angle. On the noise side, the reconstruction becomes more accurate at a steeper angle because the relativistic effect is more pronounced and the difference between the measured data of two adjacent pixels is more easily recorded above the noise. An accurate reconstruction at a smaller angle is more difficult to achieve because the milder distortion can easily be buried in the systematic noise. On the diffusion side, a steep forward angle increases the signal-to-noise ratio of the measured data, while a steep backward angle decreases it.
The proposed method assumes that the light-in-flight to be reconstructed occurs in a uniform and homogeneous medium, where light propagates in a straight line at a fixed velocity. However, it is also possible to reconstruct self-bending light beams, such as Airy beams [27], in a differential manner. That is, the self-bending light path of the Airy beam propagation is viewed as a combination of many short straight segments, each of which can then be reconstructed individually by the proposed method.
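The differential idea can be sketched as follows: sample the curved trajectory, treat each chord as a straight segment, and accumulate the propagation time segment by segment. The parabolic trajectory and millimeter sampling below are illustrative assumptions, not an actual Airy beam profile.

```python
import math

C_MM_PER_PS = 0.299792458  # speed of light in mm/ps

def piecewise_times(points_mm):
    """Differential reconstruction sketch: treat a curved (e.g., Airy-like)
    trajectory as consecutive short straight segments and accumulate the
    propagation time along them. `points_mm` samples the curve in order."""
    t = [0.0]
    for (x0, y0), (x1, y1) in zip(points_mm, points_mm[1:]):
        t.append(t[-1] + math.hypot(x1 - x0, y1 - y0) / C_MM_PER_PS)
    return t

# A parabolic, self-bending-like trajectory sampled at 1 mm steps in x:
curve = [(x, 0.01 * x * x) for x in range(0, 51)]
times = piecewise_times(curve)
```

As the sampling becomes finer, the accumulated time converges to the true arc-length travel time, so each short segment can be handled by the straight-line model above.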
The position error for a reconstructed light path is mainly determined by the estimation accuracy of the propagation angle and by the recorded projection of the light path on the SPAD camera. The estimation accuracy of the propagation angle can be further improved by taking more measurements or by using a camera with lower noise. The accuracy of the recorded light path projection is limited by the pixel resolution and the fill factor (1.5%) of the SPAD camera. In particular, when a light pulse propagates in a quasi-horizontal direction, small variations in the perpendicular direction cannot be spatially resolved by the SPAD camera; this error accumulates as the light pulse propagates and degrades the resulting accuracy of the reconstruction. A newly developed SPAD camera with higher pixel resolution and a 61% fill factor [28] would improve the reconstruction accuracy of the proposed light-in-flight imaging system. A backside-illuminated multi-collection-gate silicon sensor can also be used in light-in-flight imaging [29] to provide a higher fill factor, a larger photoreceptive area, and higher spatial resolution, with a temporal resolution currently of 10 ns, although its sensitivity is not as good as that of a SPAD camera. However, the ultimate limit of the temporal resolution of these cameras implies that, in the future, sub-ns temporal resolution could be achievable, thus allowing precise light-in-flight measurements with a single laser shot, as shown in the proof-of-concept work by Etoh et al. [29].
In summary, we have proposed a computational imaging scheme to reconstruct light-in-flight in observer-independent 4D (x, y, z, t) by recording the scattered photons of the light propagation with a SPAD camera and compensating the relativistic effects via an optical-model-based computational layer. The relativistic effects in this context refer to the spatiotemporal distortion caused by the fact that the speed of light must be treated as a finite number in certain scenarios such as transient imaging. The estimation of the light propagation angle, which is crucial to the 4D light-in-flight reconstruction, has a mean error of 0.15° over the measured range up to 10°. In the reconstruction of light-in-flight in a 3D space, the temporal accuracy is improved from the 174.80 ps of the distorted arrival time to the 3.84 ps of the compensated propagation time. The spatial accuracy of the reconstruction is 1.75 mm, which is better than both the 8 mm transverse spatial resolution determined by the optical setup of the system and the 16.5 mm longitudinal spatial resolution determined by the 55 ps time resolution of the SPAD camera. The improvement is mainly achieved by the accurate estimation of the propagation angle, in which the random noise and the inaccuracy of the discrete measurement are suppressed by an estimation involving multiple measurements. The accurately estimated propagation angle can be further exploited to correct other distorted measurements.
The proposed 4D imaging scheme is applicable to the reconstruction of light-in-flight for other circumstances, such as light traveling inside a cavity or interacting with other materials. This work provides the ability to expand the recording and measuring of repeatable ultra-fast events with extremely low scattering from 3D to 4D. It can also be applied to observe optical phenomena which pose a difficulty for other imaging schemes, e.g., the behavior of light in micro- or nanostructures and the interaction between light and matter.
Yue Zheng, Ming-Jie Sun, Zhi-Guang Wang, Daniele Faccio. Computational 4D imaging of light-in-flight with relativistic effects[J]. Photonics Research, 2020, 8(7): 07001072.