ToF System Design—Part 2: Optical Design for Time of Flight Depth Sensing Cameras
Abstract

Optics plays a key role in time of flight (ToF) depth sensing cameras, and the optical design dictates the complexity and feasibility of the final system and its performance. 3D ToF cameras have certain distinct characteristics1 that drive special optics requirements. This article presents the depth sensing optical system architecture and design guidelines for the illumination and imaging sub-modules.

ToF is an emerging 3D sensing and imaging technology that has found numerous applications in areas such as autonomous vehicles, virtual and augmented reality, feature identification, and object dimensioning. ToF cameras acquire depth images by measuring the time it takes light to travel from a light source to objects in the scene and back to the pixel array. The specific type of technology that Analog Devices' ADSD3100 backside illuminated (BSI) CMOS sensor implements is called continuous wave (CW) modulation, which is an indirect ToF sensing method. In a CW ToF camera, the light from an amplitude modulated light source is backscattered by objects in the camera's field of view (FOV), and the phase shift between the emitted waveform and the reflected waveform is measured. By measuring the phase shift at multiple modulation frequencies, one can calculate a depth value for each pixel. The phase shift is obtained by measuring the correlation between the emitted waveform and the received waveform at different relative delays using in-pixel photon mixing demodulation.2 The concept of CW ToF is shown in Figure 1.

[Figure 1. The concept of ToF technology: a modulated/pulsed LED/laser source (emitter) illuminates the target; a trigger synchronizes the sensor, which produces the 3D data output.]

Depth Sensing Optical System Architecture

Figure 2 shows the optical system architecture. It can be broken down into two main sub-module categories: the imaging module (also known as receiver or Rx) and the illumination module (also known as transmitter or Tx). The following sections introduce the function of each component, requirements distinct to the ToF system, and corresponding design examples.

[Figure 2. Depth sensing optical system architecture: a light source (VCSEL + driver) and diffuser project the field of illumination (FOI); imaging optics focus the field of view (FOV) onto the ToF imager.]

Illumination Module

The illumination module consists of a light source, a driver that drives the light source at a high modulation frequency, and a diffuser that projects the optical beam from the light source into the designed field of illumination (FOI), as illustrated in Figure 2.
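As an illustrative sketch of the CW phase-to-depth relation described above (not code from this article), a single modulation frequency gives a depth d = c·φ/(4π·f_mod) and an unambiguous range of c/(2·f_mod); measuring at multiple frequencies, as noted above, resolves the wrap-around ambiguity beyond that range:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_depth(phase_rad: float, f_mod_hz: float) -> float:
    """Convert a measured phase shift (radians) at one CW modulation
    frequency into depth: round-trip time t = phase/(2*pi*f),
    depth = c*t/2, hence d = c*phase/(4*pi*f)."""
    return C * phase_rad / (4.0 * math.pi * f_mod_hz)

def unambiguous_range(f_mod_hz: float) -> float:
    """Depth at which the phase wraps past 2*pi (aliasing limit), c/(2f)."""
    return C / (2.0 * f_mod_hz)

# A phase shift of pi at 100 MHz corresponds to half the unambiguous range.
d = phase_to_depth(math.pi, 100e6)  # ~0.75 m
r = unambiguous_range(100e6)        # ~1.5 m
```

A higher modulation frequency shrinks the unambiguous range but improves depth precision, which is one motivation for combining several frequencies.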
Light Source and Driver

ToF modules normally use light sources that are narrow band with low temperature dependence of the wavelength, including vertical cavity surface emitting lasers (VCSELs) and edge emitting lasers (EELs). Light emitting diodes (LEDs) are generally too slow for ToF modulation requirements. VCSELs have gained popularity over recent years due to their lower cost, small form factor, and reliability, along with being easy to integrate into ToF modules. Compared with EELs (which emit from the side) and LEDs (which emit from the sides and top), VCSELs emit beams perpendicular to their surface, which offers better yield in production and lower fabrication cost. In addition, the desired FOI can be achieved by using a single engineered diffuser with the designed divergence and optical profile. The optimization of the laser driver, as well as the electrical design and layout of the printed circuit boards (PCBs) and light source, is critically important to achieve high modulation contrast and high optical power.

Illumination Wavelength (850 nm vs. 940 nm)

While the ToF operational principle does not depend on the wavelength (rather, it depends on the speed of light) and the wavelength therefore should not affect the accuracy, the choice of wavelength can affect the system-level performance in some use cases. The following are some considerations when choosing the wavelength.

X Sensor quantum efficiency and responsivity

Quantum efficiency (QE) and responsivity (R) are linked to each other.

■ QE measures the ability of a photodetector to convert photons into electrons.

A higher QE yields a stronger signal, which matters most when not much light returns to the sensor (which is the case for faraway or low reflectivity objects).

X Human perception

While the human eye is largely insensitive in the near infrared (NIR) wavelength range, light at 850 nm can still be perceived by the human eye. On the other hand, 940 nm is invisible to the human eye.

X Sunlight

Although the solar emission is maximum in the visible region of the spectrum, the energy in the NIR region is still significant. Sunlight (and ambient light more generally) can increase depth noise and reduce the range of a ToF camera. Fortunately, due to atmospheric absorption, there is a dip in sunlight irradiance in the 920 nm to 960 nm region, where the solar irradiance is less than half compared to the 850 nm region (see Figure 3). In outdoor applications, operating the ToF system at 940 nm provides better ambient light immunity and leads to better depth sensing performance.

[Figure 3. Solar spectral irradiance (W × m−2 × nm−1): Etr, global tilt, and direct and circumsolar curves, showing the atmospheric absorption dip near 940 nm.]
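The link between QE and responsivity can be made concrete: for a photodetector, R = QE × qλ/(hc) in A/W. The sketch below uses hypothetical QE values purely for illustration (sensors typically have lower QE at 940 nm than at 850 nm, which partially offsets the ambient light advantage of 940 nm):

```python
# Responsivity R (A/W) from quantum efficiency: R = QE * q * lambda / (h * c)
Q = 1.602176634e-19   # elementary charge, C
H = 6.62607015e-34    # Planck constant, J*s
C = 299_792_458.0     # speed of light, m/s

def responsivity(qe: float, wavelength_m: float) -> float:
    """Convert quantum efficiency (0..1) at a given wavelength to responsivity in A/W."""
    return qe * Q * wavelength_m / (H * C)

# Hypothetical QE values for illustration only (not from this article).
r_850 = responsivity(0.40, 850e-9)  # ~0.27 A/W
r_940 = responsivity(0.25, 940e-9)  # ~0.19 A/W
```

Note that for equal QE, the longer wavelength actually gives higher responsivity (each photon carries less energy), so the comparison hinges on the sensor's QE curve.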
Table 1 lists a few examples of FOI and their corresponding radiant intensity normalized to the radiant intensity of a 60° × 45° FOI. Note that the radiant intensity is calculated as optical power per rectangular solid angle.

[Figure: illumination profile I(θ) ~ cos−n(θ), showing the FOI (FWHM) relative to the FOV of the imaging lens, and the resulting radiant intensity: (a) cross-section, (b) 3D, (c) top view.]
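As a sketch of that normalization (not code from this article), the rectangular solid angle of an a × b FOI is Ω = 4·arcsin(sin(a/2)·sin(b/2)), so for a fixed optical power the radiant intensity normalized to a 60° × 45° FOI is Ω(60° × 45°)/Ω(a × b):

```python
import math

def rect_solid_angle(fov_h_deg: float, fov_v_deg: float) -> float:
    """Solid angle (steradians) of a rectangular field with the given
    full angles: Omega = 4*asin(sin(a/2)*sin(b/2))."""
    a = math.radians(fov_h_deg) / 2.0
    b = math.radians(fov_v_deg) / 2.0
    return 4.0 * math.asin(math.sin(a) * math.sin(b))

def normalized_radiant_intensity(fov_h_deg: float, fov_v_deg: float) -> float:
    """Radiant intensity (power per solid angle) for a fixed optical power,
    normalized to a 60 x 45 degree FOI."""
    return rect_solid_angle(60, 45) / rect_solid_angle(fov_h_deg, fov_v_deg)

# A wider FOI spreads the same power over a larger solid angle,
# lowering the radiant intensity; a narrower FOI concentrates it.
ri_wide = normalized_radiant_intensity(90, 70)    # < 1
ri_narrow = normalized_radiant_intensity(30, 20)  # > 1
```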
In general, the optical efficiency can be improved by having a collimator lens between the light source and the diffuser to reduce the input angle to the diffuser, or by choosing a light source with a smaller divergence angle.

Imaging Module

The imaging module consists of an imaging lens assembly, a band-pass filter (BPF), and a microlens array on the imager. The thickness and material of the backside optical stacks on the imager should be optimized for low back-reflection. Figure 6 shows an illustration of the imaging module.

[Figure 6. Illustration of the imaging module: light from the scene passes through the imaging lens and band-pass filter to the imager with its microlens array.]

ToF Imaging Lens Design Considerations

Since the ToF camera collects light generated by active illumination, the efficiency and uniformity of the light collection on the pixel array greatly affect the overall performance. The lens needs to have strong collecting power, high transmission, and low stray light. The following are design considerations for ToF lenses, which are distinct from traditional RGB camera lenses.

X Light collecting efficiency

Light collection efficiency is proportional to 1/(f/#)², where f/# = (focal length)/(aperture size). The smaller the f/#, the better the efficiency. There are some trade-offs with a small f/# optical system: as the aperture size increases, there tend to be more vignetting and aberrations, which make the optics more challenging to design. A low f/# system also tends to have a shallower depth of field.

X Relative illumination (RI) and chief ray angle (CRA)

RI is defined as:

RI = (Illumination at a single point on the sensor) / (Maximum illumination within the field)   (7)

In a distortion- and vignetting-free lens system, the sensor illuminance declines according to the cos⁴θ law, where θ is the CRA incident angle on the sensor plane. The result is a relative darkening of the image toward the sensor border. The irradiance fall-off can be reduced by introducing negative distortion in the lens system.

The max CRA at the sensor edge should be optimized based on the imager microlens array specification. A smaller CRA helps narrow the bandwidth of the BPF to achieve better ambient light immunity.

The following examples demonstrate how the CRA and the focused light cone sizes (effective f/#) across the field affect the RI. The lens system of Example 1 in Figure 7 has larger CRAs and gradually decreasing imaging cones (that is, increased f/#) as the field angle increases. The corresponding RI drops significantly with the field angle, as shown in the corresponding RI plot. Example 2 in Figure 7 demonstrates that the RI can be well maintained by minimizing the CRA as well as keeping a uniform f/# across the field.

[Figure 7. Two lens design examples (light from the scene, band-pass filter, lens system, and sensor plane shown for each), with plots of relative illumination vs. Y field (mm): Example 1 shows the RI falling off steeply across the field, while Example 2 maintains a nearly flat RI.]

X Stray light

Stray light is any unintended light in a system that could be detected by the sensor. Stray light can come from in-field or out-of-field sources that form a ghost image (for example, lens flare) through even numbers of reflections. Stray light can also emanate from opto-mechanical structures and any scattering surfaces. ToF systems are particularly sensitive to stray light because its multipath nature contributes different optical path lengths to a pixel, which leads to depth measurement inaccuracies. Several strategies need to be used in the design process to reduce stray light, such as optimization of the anti-reflection (AR) coating and the mechanical aperture, darkening the lens edges and mounting structures, and custom design of the BPF to optimize for wavelength and CRA.

The following are some items that can impact stray light in a system:

■ Vignetting

Ideally there should not be any vignetting in a ToF lens system. Vignetting cuts off the imaging rays and is sometimes used as a technique to increase image quality while trading off the brightness of the peripheral fields. However, the cutoff rays often bounce inside the lens system and tend to cause stray light issues.

■ AR coating

AR coating on the optical elements reduces the reflectance of each surface and can effectively reduce the impact of lens reflections on the depth calculation. AR coatings should be carefully designed for the light source wavelength range and the range of incident angles on the lens surfaces.

■ Number of lens elements

Although adding more lens elements provides more freedom to achieve the design specifications and better image quality in terms of resolution, it also increases the inevitable back reflections from the lens elements, as well as increasing complexity and cost.

■ Band-pass filter (BPF)

The BPF cuts off the ambient light contribution and is essential for ToF systems. The BPF design should be tailored to the following parameters to achieve the best performance:

(a) Lens parameters such as f/# and CRA across the field

(b) Light source parameters such as bandwidth, nominal wavelength tolerance, and thermal shift

(c) Substrate material properties, for low incident angle drift vs. wavelength or low thermal drift vs. wavelength

X Microlens array

A ToF backside illuminated (BSI) sensor normally has a layer of microlens array that converges rays incident on the image sensor and maximizes the number of photons that reach the pixel modulation region. The geometry of the microlens is optimized to achieve the highest absorption within the pixel region where photons are converted into electrons.

In many lens designs, the CRA of the lens increases as the image height increases toward the edge of the sensor, as shown in Figure 8. This oblique incidence leads to absorption loss in the pixel and crosstalk between adjacent pixels when the CRA is too big. It is important to design or choose an imaging lens such that the CRA of the lens matches the specifications of the microlens array it is designed for. For example, the optimal CRA that matches the ADI ToF sensor ADSD3100 is around 12° at the sensor horizontal and vertical edges.

[Figure 8. Max CRA of an imaging lens: the CRA increases with image height toward the sensor edge. (CRA: chief ray angle; RRA: rim ray angle.)]
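Two of the lens relations above, collection efficiency ∝ 1/(f/#)² and the cos⁴θ illuminance fall-off, can be sketched numerically (an illustration of the formulas, not code from this article):

```python
import math

def relative_collection_efficiency(f_number: float, ref_f_number: float = 1.0) -> float:
    """Light collection efficiency scales as 1/(f/#)^2, here relative
    to a reference f-number."""
    return (ref_f_number / f_number) ** 2

def cos4_relative_illumination(cra_deg: float) -> float:
    """Relative illumination at a field point with chief ray angle cra_deg,
    for a distortion- and vignetting-free lens (cos^4 law)."""
    return math.cos(math.radians(cra_deg)) ** 4

# An f/1.4 lens collects roughly twice the light of an f/2.0 lens.
gain = relative_collection_efficiency(1.4) / relative_collection_efficiency(2.0)

# At a 30 degree CRA, the edge illumination falls to ~56% of the center,
# illustrating why small CRAs help keep the RI flat across the field.
ri_corner = cos4_relative_illumination(30.0)
```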
Conclusion

ToF optics have unique requirements to achieve optimal performance. This article provides an overview of a 3D ToF camera optical architecture and design guidelines for the illumination and imaging sub-modules, to help design such optical systems and/or choose sub-components. For the illumination sub-module, the key factors are power efficiency, reliability, and the ability of the light source to be driven at a high modulation frequency with high modulation contrast. Wavelength selection between 850 nm and 940 nm, as well as how to specify the illumination profiles, is discussed in detail. For the imaging sub-module, the lens design considerations, including f/#, a CRA that matches the microlens specification, and stray light control, are critical to system-level performance.

References

1 Paul O'Sullivan and Nicolas Le Dortz. "Time of Flight System Design—Part 1: System Overview." Analog Dialogue, Vol. 55, No. 3, July 2021.

2 Cyrus S. Bamji, Swati Mehta, Barry Thompson, Tamer Elkhatib, Stefan Wurster, Onur Akkaya, Andrew Payne, John Godbaz, Mike Fenton, Vijay Rajasekaran, Larry Prather, Satya Nagaraja, Vishali Mogallapu, Dane Snow, Rich McCauley, Mustansir Mukadam, Iskender Agi, Shaun McCarthy, Zhanping Xu, Travis Perry, William Qian, Vei-Han Chan, Prabhu Adepu, Gazi Ali, Muneeb Ahmed, Aditya Mukherjee, Sheethal Nayak, Dave Gampell, Sunil Acharya, Lou Kordus, and Pat O'Connor. "1Mpixel 65nm BSI 320MHz demodulated TOF Image sensor with 3μm global shutter pixels and analog binning." 2018 IEEE International Solid-State Circuits Conference (ISSCC), February 2018.

3 "Reference Air Mass 1.5 Spectra." National Renewable Energy Laboratory.
©2021 Analog Devices, Inc. All rights reserved. Trademarks and registered trademarks are the property of their respective owners.