Nanophotonics For Light Detection and Ranging Technology: Review Article
https://doi.org/10.1038/s41565-021-00895-3
Light detection and ranging (LiDAR) technology, a laser-based imaging technique for accurate distance measurement, is considered one of the most crucial sensor technologies for autonomous vehicles, artificially intelligent robots and unmanned aerial vehicle reconnaissance. Until recently, LiDAR has relied on light sources and detectors mounted on multiple mechanically rotating optical transmitters and receivers to cover an entire scene. Such an architecture gives rise to limitations in terms of the imaging frame rate and resolution. In this Review, we examine how novel nanophotonic platforms could overcome the hardware restrictions of existing LiDAR technologies. After briefly introducing the basic principles of LiDAR, we present the device specifications required by the industrial sector. We then review a variety of LiDAR-relevant nanophotonic approaches such as integrated photonic circuits, optical phased antenna arrays and flat optical devices based on metasurfaces. The latter have already demonstrated exceptional functional beam manipulation properties, such as active beam deflection, point-cloud generation and device integration using scalable manufacturing methods, and are expected to disrupt modern optical technologies. In the outlook, we address the upcoming physics and engineering challenges that must be overcome from the viewpoint of incorporating nanophotonic technologies into commercially viable, fast, ultrathin and lightweight LiDAR systems.
Light detection and ranging (LiDAR) is a surveying method that emerged after the development of its older sister: radio detection and ranging (radar). The traditional LiDAR method utilizes a pulsed light source to illuminate a target object; by measuring the return time of the reflected light pulses (known as time of flight (TOF)), it is possible to calculate the object distance. LiDAR technology, traditionally classified into ground-based, airborne and spaceborne LiDAR, began to develop in earnest after the invention of the laser in the 1960s1. Airborne LiDAR is commonly mounted on aircraft and satellites mainly for measuring atmospheric conditions and environmental observations. Spaceborne LiDAR2 has been used in spacecraft for docking distance measurements on the space station or in probe robots for space exploration. Ground-based LiDAR, initially used for simple measurements (mainly distance and vehicle speed), is today considered a crucial component in diverse applications such as autonomous vehicles3, artificially intelligent robots4 and unmanned aerial vehicle reconnaissance5. Consumer electronic devices, including the iPhone and iPad from Apple, Inc., Kinect indoor motion capture sensors from the Microsoft Corporation and so forth, include LiDAR sensors for augmented or virtual reality displays. Industry-oriented applications include LiDAR sensors for self-driving cars or robot vision in manufacturing factories. Finally, aerial applications exploit LiDAR sensors in drone surveillance for terrain mapping, and space applications for planetary rovers.

Several LiDAR systems have been optimized to meet the various requirements of the application under consideration. For instance, for consumer electronics, the device cost and footprint are important, but the measurement accuracy, measurable distance and system robustness are key for precision devices. Considering the use of LiDAR in the real world and its potential economic impact, even in the short term, there has been a considerable increase in research and development in both the hardware and software sectors, with the appearance of many new startups6.

As a specific, but extremely relevant, example, LiDAR system decision times for autonomous vehicles need to be fast enough to allow them to safely come to a full stop in case of danger3. Certain demands related to human safety have yet to be simultaneously fulfilled: measurement ranges of ≥150 m (or ~1 µs TOF, Box 1), the ability to distinguish objects of 10 cm in size, a 360° real-time operational range and an optical system that overcomes poor weather conditions and is robust to different solar illumination conditions. On top of these requirements, it is expected that LiDAR systems will be manufactured as compact and affordable chip-scale sensors. As of today, however, no commercial LiDAR sensor meets all of these requirements. The majority of commercially available LiDAR systems are based on macromechanical scanners and microelectromechanical systems (MEMS) that are bulky and vulnerable to external impact.

State-of-the-art advances in nanophotonics have recently been considered as supporting or even alternative technologies to conventional LiDAR systems (Fig. 1). In particular, several miniaturized beam steering platforms, such as chip-scaled optical phased arrays and flat optical devices based on metasurfaces, can realistically shrink the device footprint. Nanophotonic LiDAR platforms can also offer improved imaging capabilities in terms of both scanning rate and image information contents.

In this Review, we highlight the recent nanophotonic approaches leading to ultracompact LiDAR sensors. In the first and second sections, we introduce the basic operation and delve into advanced
1Department of Mechanical Engineering, Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea. 2Université Côte d'Azur, Centre de Recherche sur l'Hétéro‐Epitaxie et ses Applications (CRHEA), CNRS, Valbonne, France. 3Department of Chemical Engineering, Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea. 4Department of Information and Communication Engineering, Yeungnam University, Gyeongsan, Republic of Korea. 5Advanced Technology Research Center, SL Corporation, Gyeongsan, Republic of Korea. 6These authors contributed equally: Inki Kim, Renato Juliano Martins. ✉e-mail: patrice.genevet@crhea.cnrs.fr; jsrho@postech.ac.kr
The most basic ranging system of LiDAR relies on direct TOF70,104. The concept involves monitoring the time delay Δτ between the incident pulse of light and the backscattered signal from the target (Direct TOF measurement schematic figure). The measured time delay gives the depth of the object as R = cΔτ/2 (left panel of the Working principle of direct TOF, AMCW and FMCW figure), where c is the velocity of light. Full three-dimensional (3D) scene reconstruction with full depth information is performed by scanning the two azimuthal axes.

Many automotive LiDAR systems use a pulsed 905-nm-wavelength laser as the light source. The important parameters of the illuminated pulsed laser and direct TOF measurement schematic are described in the Direct TOF measurement schematic figure. The pulse width (w) is usually a few to tens of nanoseconds, and the smaller the value of w the higher the spatial resolution of the TOF measurement. The spatial resolution Rres can be calculated as Rres = cw/2. As a simple example, a 2-ns-pulsed laser can distinguish two objects 30 cm apart from each other. Assuming a maximum distance (Rmax) to be measured, Rmax = 200 m, the round-trip time (or Δτ) should be 1.33 µs. Moreover, to remove any ambiguity from the range measurement, which affects the maximum measurable distance, the pulse period (p) should be larger than Δτ. Another important measurable factor is range accuracy. The current state-of-the-art accuracy is close to ±3 cm, which depends on the time counting error (0.2 ns).

Taking the different physical mechanisms into consideration, such as the laser emission power, light propagation, interaction of light with the scattering object (for simplicity here only instantaneous elastic scattering and zero Doppler shift), detection and the overall LiDAR efficiency, it is possible to derive a general equation that governs the LiDAR process. This is known as the LiDAR scattering form equation, given by:

NLiDAR = Nlaser × β × (A/R²) × Tf(R) × Tb(R) × η + Nb

where η represents LiDAR efficiency including the optical losses and degree of overlap between the laser beam and the receiver FOV, Tf and Tb indicate transmission of incident light and backscattered light, respectively, and Nb is background noise. The important information concerning the object can be obtained by solving the equation for β. Note that several other physical processes including inelastic and resonant scattering, absorption, Doppler shifts and so forth are neglected but could influence the detectivity and image rendering.

A second technique that utilizes a continuous source modulated at a constant frequency (known as amplitude-modulated continuous-wave, AMCW) determines the depth of a target object using modulated intensity of a light source induced by periodic phase shifts originating from the round trip to the target3. This phase difference provides a TOF of Δτ = Δϕ/(2πf). Therefore, the depth follows as R = (c/2) × Δϕ/(2πf), where Δϕ and f are the phase shift and modulation frequency of the source, respectively (middle panel of the Working principle of direct TOF, AMCW and FMCW figure). AMCW LiDAR generally uses a continuous-wave laser as a light source, which makes it promising for indoor consumer applications. A common real-life example that includes a TOF camera is in Microsoft Kinect devices.

Unlike the two techniques described above, a third one exploits a continuous wave of which the frequency is modulated in the time
Direct TOF measurement schematic. A pulsed laser is used as the light source and the time-delayed backscattered signal from an object (car) is detected by a photodiode. The round-trip time (or time delay) directly gives the TOF, and advanced measurement techniques such as FMCW can simultaneously measure the velocity of a moving object. Nlaser, number of transmitted photons; β, angular scattering probability; A/R², solid angle for the collection probability; T, light transmission in a given medium. Through proper image processing and rendering, the measured 3D object can be reconstructed.
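The direct-TOF arithmetic in Box 1 is easy to check numerically. The short sketch below (plain Python, taking c = 3 × 10⁸ m/s) reproduces the 30 cm resolution of a 2-ns pulse and the 1.33 µs round-trip time for Rmax = 200 m; the function names are illustrative, not from the text.

```python
C = 3.0e8  # speed of light (m/s)

def depth_from_tof(delta_tau):
    """Object depth from the round-trip time delay: R = c * delta_tau / 2."""
    return C * delta_tau / 2

def range_resolution(pulse_width):
    """Minimum separable depth for pulse width w: R_res = c * w / 2."""
    return C * pulse_width / 2

def round_trip_time(r_max):
    """Round-trip delay for a target at distance R_max: delta_tau = 2 * R_max / c."""
    return 2 * r_max / C

print(range_resolution(2e-9))   # 0.3 m, i.e. 30 cm for a 2-ns pulse
print(round_trip_time(200.0))   # ~1.33e-6 s for R_max = 200 m
print(depth_from_tof(1e-6))     # 150 m for a ~1 us TOF, as quoted for safety ranges
```

The same relations also recover the ≥150 m safety range quoted in the main text from a ~1 µs TOF.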
domain using a tunable laser or an electro-optic modulator (known as frequency-modulated continuous wave, FMCW)3,105. The round trip to the target results in periodic shifts of frequencies; the mixed signal of the emitted source and reflected signal produces a beat frequency, which is directly proportional to the TOF following the relation fb = (B/τchirp)Δτ, where B and τchirp are the bandwidth and chirp period time of the frequency sweep, respectively. Interestingly, the detected signal can have an additional frequency shift due to the Doppler effect when the target object is in motion with a velocity v. Owing to the Doppler frequency fd, two periodic local maxima frequencies are observed: f+ = Δf + fd and f− = Δf − fd. The distance, R = (cτchirp/4B)(f+ + f−), and relative velocity of the target, vr ≈ (λ0/2)fd = (λ0/4)(f+ − f−), can thus be retrieved, where λ0 is the operating wavelength (for the sake of convenience, a triangular frequency-modulated chirp is commonly used) (right panel of the Working principle of direct TOF, AMCW and FMCW figure). Thus, FMCW can simultaneously measure both the distance and velocity of a moving target, establishing a method of detection with an increased amount of information. The main challenges with current methods are precisely modulating the frequency at will using an electro-optic modulator and realizing a coherent detection scheme.
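These FMCW relations invert cleanly. The sketch below runs a hypothetical numerical example (the 1 GHz/10 µs/1550 nm link parameters are illustrative assumptions, not values from the text): it forward-models the up- and down-chirp beats f+ and f−, then recovers range and velocity from them.

```python
C = 3.0e8  # speed of light (m/s)

def fmcw_range_velocity(f_plus, f_minus, bandwidth, tau_chirp, wavelength):
    """Range and relative velocity from triangular-chirp FMCW beat frequencies.

    R  = (c * tau_chirp / (4 * B)) * (f+ + f-)
    vr = (lambda0 / 4) * (f+ - f-)
    """
    r = C * tau_chirp / (4 * bandwidth) * (f_plus + f_minus)
    vr = wavelength / 4 * (f_plus - f_minus)
    return r, vr

# Hypothetical link: 1 GHz sweep over 10 us at 1550 nm, target at 150 m
# approaching at 10 m/s. Forward model the two observed beat maxima:
B, T, lam = 1.0e9, 10e-6, 1550e-9
R_true, v_true = 150.0, 10.0
f_beat = 2 * R_true * B / (C * T)  # range-induced beat, fb = (B/tau)*2R/c
f_d = 2 * v_true / lam             # Doppler shift
f_p, f_m = f_beat + f_d, f_beat - f_d

print(fmcw_range_velocity(f_p, f_m, B, T, lam))  # recovers ~(150.0, 10.0)
```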
Illustrations of the working principles of direct TOF (a), AMCW (b) and FMCW (c). The solid blue and red lines represent the incident pulse and the
backscattered signal from a target, respectively. The purple line shows the period of the oscillating continuous waves.
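The AMCW phase-to-depth conversion summarized in Box 1, R = (c/2) × Δϕ/(2πf), can likewise be sketched in a few lines. The 10 MHz modulation frequency below is an illustrative choice, not a value from the text; the unambiguous-range line simply restates the phase-wrapping ambiguity discussed for the pulse period.

```python
import math

C = 3.0e8  # speed of light (m/s)

def amcw_depth(phase_shift, mod_freq):
    """Depth from the measured phase shift: R = (c/2) * dphi / (2*pi*f)."""
    return (C / 2) * phase_shift / (2 * math.pi * mod_freq)

def unambiguous_range(mod_freq):
    """The phase wraps at 2*pi, so the maximum unambiguous depth is c / (2f)."""
    return C / (2 * mod_freq)

# Illustrative 10 MHz modulation: a pi/2 phase shift maps to 3.75 m depth
print(amcw_depth(math.pi / 2, 10e6))   # 3.75 m
print(unambiguous_range(10e6))         # 15.0 m
```

The trade-off is visible directly: raising f improves phase-to-depth sensitivity but shrinks the unambiguous range, which is why AMCW suits short-range indoor sensing.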
measurement principles of LiDAR (Table 1). The third and fourth sections cover imaging strategies, sources and detectors from the perspective of the hardware components. LiDAR specifications for industrial applications (mainly autonomous vehicles) are also covered. We then discuss the limitations of existing LiDAR systems in terms of their: (1) bulk volume, (2) field of view (FOV), (3) measurement range and (4) computational capabilities needed to handle real-time big datasets; and explore how recent nanophotonic solutions could outperform conventional LiDAR. Sections five to eight introduce nanophotonic approaches, mostly based on integrated photonic circuits for optical phased arrays, and metasurfaces. Finally, we provide an outlook of the opportunities offered by these emerging approaches, highlighting the physics and engineering challenges for real-life, ultrathin and lightweight LiDAR systems.

Advanced TOF techniques
Although most LiDAR systems perceive the surroundings by measuring the TOF using conventional techniques such as the direct pulsed approach, the AMCW approach or the FMCW approach (Box 1), advanced TOF techniques of interest for solid-state LiDAR have been proposed. In particular, photonic time-stretched technology for TOF data acquisition7, relying on a continuous time signal stretched by various dispersion devices, can overcome the sampling limitation of an analogue-to-digital converter. One such time-stretched TOF method8 is a single-shot imaging and rotation-free 1D line-scanning technique with a single laser and detector. Here, the broadband laser source is temporally modulated into a discrete set of pulses of different wavelengths. Each pulse is then diffracted at a given diffraction angle such that the time-encoded echoes from the target contain information on both the depth and angle, through the TOF and diffraction signals, respectively. This enables 1D line scanning without any mechanical movement. The figures of merit for the depth and angular detection range of such a time-stretched technique are determined by the interval between the temporally discretized pulses, the spatial dispersion of the optical elements for diffraction and the bandwidth of the source. The time-stretched TOF method eliminates the mechanical rotation of LiDAR systems and may become a reliable inertia-free LiDAR solution.

Electro-optic-sampling timing detection has been adapted for ultrafast and subnanometre-precision TOF measurements9. Here, the optical pulses generated by a mode-locked laser are split into two paths: one is sent to a high-speed photodiode to convert the pulse train into photocurrent pulses, and the other experiences the TOF delay. The photocurrent pulses are utilized to detect the relative position between the peaks of the converted photocurrent pulse trains and the optical pulses before and after experiencing the TOF change. This relative position of the optical pulses is recorded as an output voltage as a function of the relative timing between the optical pulse and the photocurrent pulse, eventually providing the desired depth information. TOF detection based on electro-optic-sampling timing detection enables target depth measurement with subnanometre resolution and few-nanosecond acquisition times, which leads to dynamic real-time detection of single events in microscale devices. This method has been successfully applied to TOF-based sensor systems such as 3D surface profilometry and strainmeters by adapting the sensor head accordingly9.

To fulfil the demands of a miniaturized and compact LiDAR solution, on-chip soliton microcomb-based TOF detection methods have been introduced10–12. Soliton microcombs, comprising a series of equidistant discretized optical pulses in the time domain, can be generated using integrated silicon-based microresonators. The generated pulse train is then utilized as the source for direct TOF10,11 and FMCW12 (see 'Paths towards integrated chip-scale silicon photonic LiDAR').

Illumination methods of LiDAR
The detection performances of LiDAR systems depend on the scanning and detecting methods and the diverse approaches available for the treatment of the TOF information (Fig. 2a). These differences must thus be considered when implementing nanophotonic-based approaches.

To scan an entire 360° space, and thus recognize 3D objects through point or parallel line scanning, a mirror rotates around
Fig. 1 | A conceptual schematic of a nanophotonic LiDAR system. One promising application of LiDAR is in autonomous vehicles, where LiDAR can detect the surrounding objects on the road. Conventional LiDAR technology used in autonomous vehicles comprises the above-mentioned macromechanical scanners or microelectromechanical systems, which are bulky and vulnerable to external impacts. Newly proposed nanophotonic LiDAR systems could not only shrink the device form factor, but also significantly boost the device capabilities. With scalable chip-scale fabrication, the next breakthrough in further improving the FOV and enabling faster scanning might not be far away.
one axis with a multichannel light source and detector pair. Most commercial LiDAR systems use this rotating macromechanical approach, which includes heavy and bulky pulsed laser optics. The inertia of the mechanical components limits the frame rate to typically <50 Hz and sometimes the durability of the system. Moreover, the electric wiring and connectors in a mechanically rotating device are complex, requiring inductively coupled coils as a power supply and a photoelectric force sensor to provide feedback for motor speed (Fig. 2b).

Faster and more compact MEMS devices, which prevail in current commercial products, are still composed of movable parts (Fig. 2c). Reducing the size of mechanical parts to the microscale offers the advantages of system miniaturization and lower power consumption. However, MEMS LiDAR systems still suffer from several shortcomings. For instance, there is an inevitable trade-off between the deflection speed and the beam size owing to the finite inertia of the mirror, causing efficiency reduction. Other main deficiencies are: vulnerability to shocks and vibrations, limited FOV compared with the mechanical scanning method and the presence of spurious diffraction effects, because the periodic assembly of tens of micrometre-sized mirrors introduces a periodic modulation of the laser signal. In contrast, solid-state devices such as optical phased arrays (OPA) and flash LiDAR resolve the problem of moving parts.

The OPA approach consists of a silicon waveguide array with tunable phase delays. It can be orders of magnitude faster and more reliable than mechanical LiDAR. Directional light beam scanning at any arbitrary angle is performed by individually adjusting the phase delay in each of the waveguides. As there are no moving parts, OPA designs have many advantages, such as higher mechanical robustness and frame rates of 100 kHz or more. Above all, OPA systems can be further miniaturized down to the chip scale
Fig. 2 | A schematic of conventional macroscanner and MEMS-type LiDAR systems. a, The principle of LiDAR scanning and 3D depth recognition.
b, The hardware components of a macroscanner-type LiDAR device. The light source is generated by dozens of emitters and the device is rotated using
a motor. Additional light sources are required to increase the vertical resolution. c, The hardware components of a MEMS-type LiDAR device. The light
emitted from the diode laser is quickly scanned through the MEMS mirror, and the diffuser can improve the FOV in the horizontal or vertical direction.
IC, integrated chip.
and integrated with the laser system. However, limitations exist in terms of insertion loss and circuit integration issues. The geometry of the waveguide requires a large unit cell size and spacing between adjacent antennas, which produces multiple beams and reduces the effective steering angle along with waveguide insertion loss issues. In addition, to create a well-focused beam spot and achieve large deflection angles (that is, a large numerical aperture), the number of phased array unit cells has to be large, resulting in complicated electronics.

Flash LiDAR works by generating an angular distribution of light spots, also known as a point cloud, to illuminate a target object and recover 3D information in one shot. There are two types of flash LiDAR: one is a single-beam type and the other is a multibeam type. The single-beam-type device consists of a light source coupled to a diffractive optical component that illuminates different parts of the object. The multibeam-type device consists of a 2D vertical cavity surface emitting laser (VCSEL) and a corresponding detector array, such as a single-photon avalanche diode. The scattered signal is recorded using a multichannel detector. Each channel is aligned so that it only collects light coming from a given direction (and diffraction order). A subsequent image processing algorithm enables real-time 3D object sensing by retrieving the TOF for each single point. Flash LiDAR requires an additional divergent optical system capable of illuminating a wide FOV, a high-power pulsed laser to compensate for the signal intensity dispersion in the point cloud and sensitive detector arrays (Table 2). In particular, the laser power is strictly limited for eye safety. To overcome this limitation and achieve high accuracy3, so-called row-by-row electronic scanning was recently applied to multibeam flash LiDAR using a 2D VCSEL and single-photon avalanche diode array.
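The OPA steering principle described above (a constant phase increment between adjacent waveguides tilting the emitted wavefront) follows the standard phased-array relation θ = arcsin(λΔφ/2πd). The sketch below is not from the text; the 1550 nm source, 2 µm pitch and π/2 phase step are illustrative assumptions, and the half-wavelength pitch line restates the grating-lobe ("multiple beams") constraint.

```python
import math

def opa_steering_angle(delta_phi, pitch, wavelength):
    """Far-field steering angle (degrees) of a uniform optical phased array.

    A constant phase increment delta_phi between adjacent emitters of pitch d
    tilts the beam by theta = arcsin(lambda * delta_phi / (2 * pi * d)).
    """
    return math.degrees(math.asin(wavelength * delta_phi / (2 * math.pi * pitch)))

def grating_lobe_free_pitch(wavelength):
    """A pitch below lambda/2 suppresses grating lobes (the 'multiple beams'
    produced by large antenna spacing); waveguide crosstalk makes this hard."""
    return wavelength / 2

# Illustrative numbers: 1550 nm source, 2 um pitch, pi/2 phase step
print(opa_steering_angle(math.pi / 2, 2e-6, 1550e-9))  # ~11.2 degrees
print(grating_lobe_free_pitch(1550e-9))                # 7.75e-07 m, i.e. 775 nm
```

The tension the text describes is visible in the two functions: a larger pitch eases fabrication and loss but shrinks the grating-lobe-free steering range.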
Fig. 3 | Active beam steering metasurfaces. a, TCO material approach: beam steering via electric gating of diffraction gratings with ITO. The diffraction
angle can be tuned using the number of periodically gated nanoantennas18. The inset at the bottom of a represents the applied voltage profile. b, MQW
approach. Left: schematic of an all-dielectric MQW metasurface. The metasurfaces were fabricated on a GaAs substrate, on which a distributed
Bragg reflector and 1,230-nm-thick MQW layer were deposited. The MQW grating has double slit patterns that enhance hybrid Mie and guided
mode resonances. Right: experimental results for the far-field intensity of the scattered light under 0 V (top) and −10 V (bottom). The light intensity is
normalized to the zeroth-order non-diffracting beam intensity at 0° (ref. 26). DBR, distributed Bragg reflector; I0, intensity of the incident light; Ir, intensity of
the reflected light; ϕ0, phase of incident light; ϕr, phase of reflected light; Va, applied voltage. c, MEMS approach. Top: the deflection angle of the transmitted
light can be modulated by suspended silicon metasurfaces actuated by MEMS. Bottom: experimental results for the far-field intensity of continuously
steered beams under the indicated voltages. Scale bars, 3 μm (ref. 28). Yellow and red metasurfaces represent the device before and after applying a voltage
bias, respectively. d, Liquid crystal approach. Top: schematic of metasurfaces and liquid-crystal-based SLMs. Bottom: measured transmission intensity
of two-level (left; 0 and 8 V, represented in blue) or three-level gating (right; 0, 3.5 and 8 V, represented in green and blue). The maximum achievable
diffraction angle is 11° (ref. 38). The colours in the metasurfaces indicate the differently applied electric bias. hLC, 1,500 nm-thick LC cell; w, 1,080 nm-wide
electrode; g, 60 nm gap between the electrodes. e, Phase-change material approach. Top: schematic of actively switchable plasmonic metasurfaces in
amorphous (left) and crystalline (right) states. Bottom: depending on the state of the active layer, the specifically designed metasurface set can have
strong resonance (expressed in brighter colours) interacting with incoming light, through which the diffraction angle of deviated beam can be switched.
The lower part of panel e represents the diffracted beam passing through amorphous and crystalline states of phase change materials, respectively.
Infrared camera images are used to observe the diffracted beam43. LCP, left circularly polarized; RCP, right circularly polarized. Figure adapted with
permission from ref. 18, ACS (a); ref. 26, Springer Nature Limited (b); ref. 28, AAAS (c); ref. 38, AAAS (d); ref. 43, Springer Nature Limited (e).
electric field confinement increases further near the charge accumulation layer, which is beneficial for electrically tunable nanophotonic devices17. The ENZ region typically falls in the near-infrared (NIR). In ITO, it can be slightly adjusted by charge concentration modulation, making ITO a suitable material for active beam steering technology in that region18–20.
scanning with high spatial resolution. Beam scanning control meth- at 2.28 µm. In the crystalline state, the resonances moved to 4.1 µm
ods with two-axis MEMS mirrors (for example, Lissajous scanning) and 3.1 µm, respectively. Thus, each antenna interacted with both
can improve the temporal (several tens of megahertz) and spatial the amorphous and crystalline states, and thereby shifted a diffrac-
resolution (a few degrees) of LiDAR systems36. In addition to the tion angle of a diffracted beam. This strategy could also be exploited
beam steering applications, MEMS have been used to demonstrate to realize focal length shifting lenses.
dynamic photodetectors featuring a large FOV of 42°, with the goal Vanadium dioxide (VO2), a material that exhibits a metal–insula-
of reducing the size of photodetector pixel arrays37. tor transition, can also be used for thermally tunable metasurfaces.
The VO2 undergoes a phase change at a lower temperature than
Liquid crystals. An SLM, which can control the wavefront of light GeSbTe alloys (for example, 68 °C for VO2 and 160 °C for Ge3Sb2Te6)
by modulating the phase or intensity of incoming light with liquid and induces refractive index modulation from the visible to NIR
crystals, has been applied to video projection, hologram genera- region44–46. The index modulation of the VO2 layer itself is not very
tion and LiDAR technology, but the large pixel size (on the order large in the visible to NIR region (ΔRe(n) ≈ 0.5 and 1.4 at 600 nm
of micrometres) reduced the resolution of the projected images and and 1,200 nm, respectively), but plasmonic or dielectric metasur-
degraded the wavefront quality due to diffraction. Furthermore, the faces could improve its switching time (by around several millisec-
FOV defined by the angular coverage of the first diffractive order onds), degree of spectrum modulation and radiation pattern47–49.
was also limited to a maximum of a few degrees. However, integrat- In conclusion, most beam steering metasurface devices dem-
ing metasurfaces with liquid crystal technology could open a prom- onstrated so far provide only 1D point scanning and a maximum
ising pathway to active beam manipulating devices. FOV that is similar to the minimum standard of existing commer-
Li et al. proposed a metasurface-based miniaturized SLM tech- cially available products (around 30° to 40°). Continuous 1D line
nique that can modulate the transmitted light wavefront with min- scanning and 2D point scanning have not yet been demonstrated.
iaturized pixels38 (Fig. 3d). A TiO2 metasurface, designed according Furthermore, most ITO-based plasmonic active devices work close
to the Huygens principle, showed both high transmittance, by mini- to an absorption resonance and thus suffer from low diffraction effi-
mizing backscattered light through the optimization of the electric ciency. In addition, the modulation speed still needs to be improved
and magnetic dipole resonance, and coverage of the entire 2π phase up to several megahertz, as the tuning speed of micrometre-scale
delay. A 205-nm-high metasurface structure was inserted into a liq- liquid-crystal-based devices is limited to several kilohertz. Another
uid crystal sandwich cell, which enabled three-level phase retarda- limitation is the strong correlation between amplitude and phase
tions according to the liquid crystal molecule orientation (0°, 45° modulation, causing a low signal-to-noise ratio.
and 90°) in the wavelength range of 660–670 nm. The metasurface–
SLM device relied on a unit cell consisting of three nanodisks to Metasurfaces for structured illumination and flash LiDAR
increase the diffraction efficiency, with simulations predicting a 48% The FOV of the current state-of-the-art beam steering metasur-
diffraction efficiency for the first order at 665 nm. Experimentally, faces, ranging from 20° to 40°, is still far narrower than the scanning
a three-level phase-modulating SLM was tested for a metasurface range required for current commercially viable LiDAR systems that
with 1 μm pixel size. Applying bias voltages of 0, 3.5 and 8 V resulted demand full scene scanning. Note that full scene scanning can be
in first-order deflection at an angle of about 11° (that is, a 22° FOV) achieved with the combination of multiple LiDAR sensors, thus a
(Fig. 3d). The demonstrated diffraction efficiency was relatively range of 60° to 120° for a single sensor could still be sufficient if sys-
low, about 15%, and could be further increased to 36% with a larger pixel size and stable gating method. The intention in using metasurfaces in traditional SLM devices is to reduce both the pixel size (by about 1/3) and the liquid crystal thickness (by about 1/2). Although these reported results did not show any drastic improvement in the switching or scanning rates (limited by the liquid crystal modulation speed of several kilohertz), liquid crystal actuation in a miniaturized cell could improve the modulation speed for applications requiring high-frame-rate LiDAR imaging.

Phase-change materials. The biggest advantage of switchable and reconfigurable phase-change materials is that they can be directly incorporated within metallic or dielectric metasurfaces39–42. In particular, chalcogenide phase-change materials such as GeSbTe alloys can be switched between amorphous and crystalline states quickly (in nanoseconds) and repeatedly (~10^5 cycles) through external thermal, optical and electric stimuli42. They also possess a high refractive index contrast in the infrared region and non-volatile characteristics.

Yin et al. proposed beam switching and bifocal zoom lens devices using active plasmonic metasurfaces (Fig. 3e)43. They were composed of a 50-nm-thick active Ge3Sb2Te6 layer, a 15-nm-thick ZnS:SiO2 capping layer to prevent oxidation and a 40-nm-thick Au metasurface. The active layer was in its amorphous state when deposited and switched to the crystalline state when heated at 160 °C. At a wavelength of 3.1 µm, there was a high refractive index contrast between the amorphous (n ≈ 3.5 + 0.001i) and crystalline states (n ≈ 6.5 + 0.06i), with negligible loss. The lens can also be equipped with two different sets of plasmonic elements, each with a different plasmon resonance. In the amorphous state, one set showed a resonance at 3.1 µm, whereas the other set resonated

tem design workarounds are implemented. To overcome the limitations of the conventional LiDAR scanning approach, flash LiDAR uses a diffractive optical element (DOE)-based point-spreading component to project light at large angles. Overall, these structured illumination schemes have several advantages as they do not require beam scanning and can reduce the load on depth computation and the corresponding processing units. For a simple case, the acquisition time (Tacq) of a single shot for point/line-scanning and flash-type-scanning LiDAR with a detailed specification of 90° × 10° total angle view, 0.2° × 0.2° spatial resolution, 450 × 50 pixels (Nx × Ny) and 10 frames per second (FPS) is calculated as follows. Tacq for the point-, line- and flash-scanning methods is calculated as Tacq = 1/(FPS × Nx × Ny) = 4.44 µs, Tacq = 1/(FPS × Nx) = 222 µs and Tacq = 1/FPS = 100 ms, respectively50. Thus, compared with scanning methods demanding fast microsecond scan speeds and processing times, flash-type LiDAR (or structured illumination) can employ simpler scanning and processing units, requiring millisecond scanning or processing times. However, to precisely manipulate the structured beam path and to scan farther, additional optical components such as fisheye lenses or high-power lasers are still required. This is one specific area where metasurface-based approaches could have a major influence. The DOE manipulates the wavefront of light by modulating the phase using micro-sized etched structures, generating a point cloud in both the transmission and reflection modes. For this reason, scrambling metasurfaces, large-area beam deflectors with pixelated metasurfaces and other structured illuminating methods are promising alternatives to the conventional DOE, offering a smaller footprint and simplifying the optical system (removing the requirement for a fisheye lens, for example). Integrating metasurfaces directly on top of a light source array is of interest to adjust the number of individual pixels operating according to the scan range or purpose.
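The single-shot acquisition-time budgets quoted for the example specification (90° × 10° field of view, 0.2° × 0.2° resolution, 450 × 50 pixels, 10 FPS) can be checked with a short calculation. The function name below is illustrative, and the line-scan budget assumes the line steps across the 450 horizontal pixels:

```python
# Single-shot acquisition-time budget for the three scanning schemes,
# using the example specification from the text:
# 90° x 10° field of view, 0.2° x 0.2° resolution -> 450 x 50 pixels, 10 FPS.

def acquisition_times(fps: float, nx: int, ny: int) -> dict:
    """Return the per-shot acquisition budget (seconds) for each scheme."""
    return {
        "point": 1.0 / (fps * nx * ny),  # one pixel per shot
        "line":  1.0 / (fps * nx),       # one 50-pixel column per shot
        "flash": 1.0 / fps,              # whole frame in a single shot
    }

t = acquisition_times(fps=10, nx=450, ny=50)
print(f"point: {t['point'] * 1e6:.2f} us")  # ~4.44 us
print(f"line:  {t['line'] * 1e6:.0f} us")   # ~222 us
print(f"flash: {t['flash'] * 1e3:.0f} ms")  # 100 ms
```

The microsecond-scale point/line budgets illustrate why scanning LiDAR demands fast modulators and processing units, whereas the flash scheme relaxes the per-shot budget to milliseconds.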
Fig. 4 | Point-cloud-generating metasurfaces and device integration. a, Metasurface DOEs compared with conventional DOEs. The scanning electron
microscope (SEM) image (top right) shows fabricated 2D metasurfaces, the Fourier space image (bottom right) shows transmitted light from a
metasurface DOE. The dashed circles represent diffraction angles of 30° and 60° (ref. 51). b, Scrambling metasurfaces. Top: schematic of the experimental
set-up for scrambling metasurfaces to produce a point cloud. Bottom: experimental results for a point cloud covering 4π space at 633 nm (ref. 52).
Dashed yellow box shows a certain part of the point cloud patterns with uniform intensity and distance between neighboring points. c, LED-integrated
metasurfaces. Top: the working mechanism of the LED-integrated metasurfaces. A hybrid Bragg grating–Si metasurface, called a resonant cavity LED
(RCLED), can deflect collimating beams by about 30°. Bottom: emission pattern of hybrid RCLED with Si metasurfaces63. S-pol and P-pol indicate
transverse-electric polarized light and transverse-magnetic polarized light, respectively. The layer diagram represents the normalized (to the maximum
power) intensity of the angular emission spectrum corresponding to the diffraction angle (in the semicircle). d, VCSEL-integrated metasurfaces. Left:
optical microscope image (top) and SEM images (middle and bottom) of 10 × 10 array metasurfaces. Each array has a different deflection angle and
depending on the electric gating, wide-ranging dynamic beam steering is possible. Scale bars represent 10 mm (top), 1 mm (middle left), 200 μm (middle
right) and 100 μm (bottom right). Right: the measured far-field deflected beam intensity on the xz plane where deflecting angles θ vary up to 15° (ref. 65).
The y axis represents x axis displacement where the scale of each section is 2 mm (black arrow). Figure adapted with permission from ref. 51, ACS (a);
ref. 52, Springer Nature Limited (b); ref. 63, John Wiley and Sons (c); ref. 65, Springer Nature Limited (d).
Scrambling metasurfaces. Diffractive elements are often used to scan 3D objects with a single-shot illumination/flash. Unfortunately, conventional DOEs realized by periodic binary-phase grating structures suffer from low efficiency, poor beam intensity uniformity and a limited FOV. As a large FOV with high efficiency and beam intensity uniformity demands element periodicities that are smaller than the operating wavelength, and multiple phase level modulation is realized by controlling the etching depth of the DOE, the fabrication process can be an issue. A diffractive metasurface enhances transmission efficiency and improves the FOV and other performance metrics, such as the working wavelength, device form factor and polarized illumination. Ni et al.51 developed a metasurface DOE to achieve 2D spot array illumination with a 120° × 120° FOV (Fig. 4a). Using vectorial electromagnetic simulations, the Si metasurface structure can be optimized to operate at the telecommunication wavelength of 1,550 nm with up to 81% diffraction efficiency51.

Li et al. developed an amorphous-silicon-based scrambling metasurface inspired by Lambertian surfaces that appear uniformly bright from all viewing angles52. They overcome the drawback of most DOE and Lambertian surfaces, which only operate in the half space (that is, working as either a half-spherical-reflection or half-spherical-transmission type). The scrambling metasurface comprising well-designed nanorods diffracts light in both the transmission and reflection modes with the same intensity. Such so-called transflective metasurfaces satisfy both the appropriate phase delay condition (δr = δt = π), where δr and δt are the phase delays derived from the geometric phase in the reflection and transmission modes, respectively, and the appropriate reflection and transmission coefficient conditions (rl = rs = tl = ts), where rl, rs, tl and ts represent the reflection and transmission coefficients of light propagating with a polarization parallel to the long and short axes of the nanorods, respectively. These metasurfaces have a theoretical polarization conversion efficiency of 27% for both reflected and transmitted light, which was then experimentally verified using a blazed grating and a 2 × 2 beam splitter. The fabricated device operates between 470 and 650 nm, and the diffraction angle is close to 90° (that is, a 180° FOV)52. Using the same principle, numerous sub-diffracted beams can be generated that exhibit the same intensity in the reflection and transmission modes, with both reflected and transmitted signals covering the half-spherical space (or 2π space). Therefore, a cloud of random points can be generated to cover the entire 4π space (Fig. 4b). For example, according to the calculation, a unit pixel consisting of a 100 × 100 array of nanostructures is designed, and then arranged into 10 × 10 pixelated arrays to produce 4,044 randomly distributed diffracted sub-beams that scatter over the entire 4π space with the same intensity (Fig. 4b)52. For certain applications, uniform-backscattering half-space (2π-space) metasurfaces can also be designed53.

Large-area metasurface beam deflectors. Li et al. proposed large-area pixelated metasurfaces that can control the scattered light within each pixel subunit and adjust the far field of the scattered beams through a 21 × 21 pixel array54. A laser beam at 940 nm produced a 441 point array on a 2D screen located 1 m away from the device. The largest diffraction beam angle was about 15°. The device had several limitations, such as a small angle of diffraction, a limited number of sub-diffracted beams, a strong non-diffracted beam intensity and low diffraction efficiency. Those drawbacks can be overcome with novel design methodologies such as scrambling metasurfaces, or using low-loss dielectric materials such as GaN or TiO2. With the scrambling metasurface design technology along with large-area fabrication technology at a smaller resolution, a metasurface-based flash LiDAR able to scatter a point cloud with high efficiency over the entire 4π spherical space could be realized, in principle.

Other structured illumination approaches. Jin et al. proposed a metasurface phase mask producing a rotating point-spread function for depth sensing55. The illuminating light passes through a phase mask to form a double-helix point-spread function that features a distinctive intensity distribution (that is, a different rotating angle) depending on the distance between the light source and mask. By exploiting the structured illumination properties of the light beam, the distance of different objects can be retrieved by numerically decoding the local orientation of the double-helix point-spread function. However, the approach is still marred by a small numerical aperture of 0.02 and low spatial resolution of 10 mm (depth difference).

Guo et al. introduced single-shot metalens depth sensors that take advantage of two differently defocused images captured at different spots56. The system is composed of two complementary off-axis metalenses with distinct focal lengths. Multiple photosensor pixels capture two differently defocused images in a single shot and the optical information is exploited to retrieve the distance information using a depth-reconstruction algorithm. This approach achieves 10 cm spatial resolution with a millimetre-scale metalens. The remaining challenges here are the small FOV, low spatial resolution and limited distinguishable distance, which could be overcome using actively focusable tunable metalenses.

More conventional structured light, such as beams carrying orbital angular momentum or non-diffracting Bessel beams, has also been used in depth measurements57–61. As bulky optical components such as Q-plates or axicons are required for these systems, metasurfaces are expected to play an important role in the miniaturization of these optical devices. Overall, structured illumination has many advantages as it does not require active lighting components and can reduce the load on the depth computation and corresponding processing units62.

In conclusion, the efficiency of structured light illumination devices should be improved further (for example, to produce a polarization conversion efficiency of scrambling metasurfaces >60%). Compared with scanning methods, the incident laser beam is divided into an array of several thousands of beam spots, splitting the power into many weaker diffracted beams. In their current form, these devices can only achieve depth measurements over a range of a few metres.

Device-integrated metasurfaces
A big leap forwards in the miniaturization of LiDAR would be to integrate both the source and the beam-shaping devices into a single compound system. Depending on the application, LiDAR uses light sources such as fibre lasers, micro-chip lasers, diode lasers and LEDs. To adjust the beam properties of each source, additional components are generally needed, making the entire system bulky. Monolithic device-integrated metasurfaces combined with light sources are an alternative to control the characteristics of the emitted beams during the processing of the devices at the chip level.

LED emission control by metasurfaces. LEDs emit incoherent and Lambertian-shaped light. They have numerous advantages such as low cost, low power consumption and a long lifespan, but require additional bulky optical systems to tailor the properties of the emitted light. Owing to the large angular distribution of the LED emission, it is difficult to integrate metasurfaces, because the latter are generally designed assuming a unidirectional, coherent plane-wave source. However, two recent studies have demonstrated metasurfaces emitting directional light from LEDs.

Khaidarov et al.63 developed a technique to control the emitted beam properties from a GaP LED by integrating it with metasurfaces. This approach solved the problem of the broad k-space distribution by incorporating a Fabry–Perot resonator structure63 (Fig. 4c). A hybrid Bragg–Au RCLED was constructed by installing a Bragg mirror made up of multilayers of TiO2 and SiO2 on the front and a Au reflector on the back of a GaP LED. This design drastically reduced the divergent angle emitted from the LED, emitting a mostly collimated light source normal to the LED surface. On the LED exit facet, an amorphous-silicon-based metasurface device
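The connection between element periodicity and attainable diffraction angle follows directly from the grating equation at normal incidence, sin θ = λ/Λ: periods approaching the wavelength push the first order towards 90°, which is why near-wavelength-scale metasurface elements can reach the wide FOVs discussed above. A minimal sketch (the 633 nm wavelength matches the scrambling-metasurface experiment; the periods are hypothetical values for illustration):

```python
import math

# First-order diffraction angle at normal incidence from the grating
# equation sin(theta) = wavelength / period.

def first_order_angle_deg(wavelength_nm: float, period_nm: float) -> float:
    s = wavelength_nm / period_nm
    if s > 1.0:
        raise ValueError("first order is evanescent: period < wavelength")
    return math.degrees(math.asin(s))

for period_nm in (5000, 1000, 650):  # coarse DOE down to near-wavelength pitch
    theta = first_order_angle_deg(633, period_nm)
    print(f"period {period_nm} nm -> first order at {theta:.1f} deg")
```

A 5 µm pitch (typical of etched DOEs) deflects only a few degrees, whereas a 650 nm pitch approaches grazing diffraction, illustrating why subwavelength-pitch metasurfaces are needed for near-180° FOV illumination.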
Fig. 5 | Photonics integrated LiDAR approaches. a, On-chip design integrated LiDAR system. A distributed feedback laser was used to drive the system.
Using a two-channel procedure—transmitter (TX) and receiver (RX)—it is possible to measure speed (bottom left) and range (bottom right) at a distance
of 60 m with only 5 mW of emitted laser power using fast Fourier transform spectra to increase the range67. FM DFB, frequency-modulated distributed
feedback laser; BPD, balanced photodiode; DLI, delay line interferometer; iPH, phase-shifter channels; E, emitted field; R, reflected field;
FC, fibre circulator; LO, local oscillator. b, Microcomb-based LiDAR. Top: experimental scheme to set up dual-comb experiments. Bottom:
measured profile of a bullet moving at a speed of 150 m s−1 (ref. 10); dotted lines represent the limits of measurement. DKS, dissipative Kerr soliton states;
OCT, optical coherence tomography used to compare the results. c, Massively parallel coherent laser ranging using a soliton microcomb. Single-comb
photon tagging and applied frequency modulation function. Bottom: set-up (left) and ranging experiment (right) (ref. 12). E(t) and E(ω) correspond to
the electric field in the time and frequency domain, respectively. trep, period of the modulation; ωp, central frequency; μ, the spectral limits; AFG, arbitrary
function generator; EOM, electrical optical modulator; DEMUX, the demultiplexer; CIRC, optical circulator. d, Inversely designed Lorentzian resonator
on silicon-on-insulator to implement non-reciprocal transmission71. Left: illustration of the optical path to perform the experiment. Right: design of the
resonator. EDFA, erbium doped fibre amplifier; PD, photodiode. Figure adapted with permission from ref. 67, IEEE (a); ref. 10, AAAS (b); ref. 12, Springer
Nature Limited (c); and ref. 71, Springer Nature Limited (d).
combs10. A pair of free-running continuous-wave lasers pumped dual Kerr comb generators detuned by 96 kHz. The signal comb was then divided by a 50:50 fibre coupler, corresponding to the emitted–received signal with the other signal used as a reference in a balanced photodiode. The other local oscillator comb was again divided into two parts, one routed to the measurement photodiode
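For context, the frequency-modulated ranging principle underlying these coherent schemes maps a measured beat frequency between the emitted and received fields to distance and velocity. The sketch below is the generic textbook triangular-chirp FMCW relation with illustrative parameter values, not the processing chain of any specific system in Fig. 5:

```python
# Generic triangular-chirp FMCW range/velocity recovery: the average of
# the up- and down-chirp beat frequencies gives range, their difference
# gives the Doppler shift. Parameter values are illustrative only.

C = 299_792_458.0  # speed of light (m/s)

def fmcw_range_velocity(f_beat_up, f_beat_down, bandwidth, chirp_time, wavelength):
    """Return (distance in m, line-of-sight velocity in m/s)."""
    f_range = 0.5 * (f_beat_up + f_beat_down)      # range-induced beat (Hz)
    f_doppler = 0.5 * (f_beat_down - f_beat_up)    # Doppler shift (Hz), sign by convention
    distance = C * f_range * chirp_time / (2.0 * bandwidth)
    velocity = f_doppler * wavelength / 2.0
    return distance, velocity

# A static target near 60 m with a 1 GHz chirp over 10 us at 1.55 um:
d, v = fmcw_range_velocity(40.0e6, 40.0e6, 1e9, 10e-6, 1.55e-6)
print(f"distance ~{d:.1f} m, velocity {v:.1f} m/s")
```

With a 40 MHz beat on both chirp slopes, the relation returns a range of about 60 m and zero velocity, consistent with the scale of the on-chip demonstration described in the caption above.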
Fig. 6 | All-solid-state active metaphotonic spatial light modulator (SLM). a, Schematic of nanobeam-shaped active metasurface array. Λc, channel
period. b, Cross-section of the unit cell, comprising the top Au metasurface, active ITO middle layer and bottom Al reflector layer. Active beam steering
can be realized by adjusting the top and bottom electric bias. E, electric field; H, magnetic field; k, wave vector. c, Schematic of a 3D LiDAR experimental
set-up. d, Scanned objects. e, Measured 3D depth image using the metaphotonic SLM. The scanned angle range, spatial angle resolution and distance are
6° × 4°, 0.2° × 0.2° and 4.7 m, respectively102. Here, the horizontal and vertical directions (x and y axis labels) are relative to the ground. Figure adapted
with permission from ref. 102, Springer Nature Limited.
most current LiDAR systems consume about 10–20 W of power, and heat generation can become severe over long periods of use. In automotive applications, because the LiDAR sensors are integrated into vehicles that generate lots of heat, the sensors should be able to function between −40 °C and 150 °C, according to the Automotive Electronic Council qualification AEC-Q100, grade 0. Here, dielectric or refractory materials such as W or TiN should be used. In a similar vein, materials that are sensitive to moisture and tend to oxidize, such as certain metals, should be avoided. Wire-type heaters and anti-fog coating technology could be integrated into LiDAR systems for de-icing and defrosting. Finally, vibration stability is required to maintain precise scanning, alignment and light detection under uncontrollable external impacts. As nanophotonic LiDAR systems can be monolithically manufactured, some of these issues could be resolved.

Metasurfaces combined with liquid crystals, LEDs and VCSELs could drastically reduce the footprint of LiDAR devices, integrating the emission, scanning and receiving components into a single unit. The designer capabilities of the nanophotonic solutions discussed in this Review, in terms of beam shaping, polarization and operating wavelength, will enable real-life, fast, ultra-thin, lightweight and high-end LiDAR systems in the near future. At the moment, state-of-the-art LiDAR generally uses VCSEL array-type direct TOF sensors. The light illumination components are composed of 10,000–20,000 VCSEL arrays (but to increase the signal-to-noise ratio, around 5,000 point sources are illuminated in a single shot). To control the beam path from each laser cell, a microlens array is needed and, depending on the desired purpose, additional external lenses are required, making the entire system bulky. Ultracompact nanophotonic LiDAR should not only match the performance of commercial counterparts, but should also provide better solutions for emerging applications such as indoor robots or drones with full 4π spherical-space measurements, omnidirectional cameras, smartphones and closed-circuit television. For those applications, spherical detector arrays or nanophotonic-augmented angle-sensitive nanowire photodetectors should be developed103.

Beyond such enhanced performance, supplanting current LiDAR technologies requires new and cheap solutions with exceptional maturation speed to market from conception to delivery. This is certainly a challenge for the new startups and companies that are focusing on active beam control with nanophotonic components.

Received: 26 August 2020; Accepted: 10 March 2021;
Published online: 6 May 2021