VLSI Implementation of Motion Estimation Algorithm

Dr Mahesh1, Anupama B2, Apoorva T K3, C V Nithish Bharadwaj4, Srilakshmi N R5
Department of ECE, PESCE, Mandya, India
mahesh@pesce.ac.in, anupamaa803@gmail.com, apoorvatk123@gmail.com, nithishcv1@gmail.com, srinr2062002@gmail.com

Abstract— In the pursuit of advancing motion detection and analysis in visual scenes, this project develops a mixed-mode Application Specific Integrated Circuit (ASIC) implementation of a motion estimation algorithm. Inspired by the intricate mechanisms found in nature, particularly in insect locomotion, our approach harnesses the principles of compound eyes and time-of-travel optic flow sensing. The algorithm integrates time stamps associated with each frame, going beyond conventional optical flow estimation methodologies. Central to our endeavor is the development of photoreceptor cells and a precise time-of-traversal measurement unit. The hardware architecture captures 8-bit 2D raw optic flow, employing techniques such as logarithmic response and ambient light adaptation. The proposed ASIC architecture encompasses blocks for time stamp updating, 2D array recording, and time-of-travel calculation along the horizontal, vertical, and diagonal directions. The calculated values are converted into velocities through lookup tables and projected onto the x and y axes, resulting in high-fidelity 8-bit 2-D optic flow (Vx, Vy). The simulation results of the system are found to be promising with respect to power and area compared to analogous algorithm implementations in purely digital architectures.

Keywords— photodiode, photoreceptor, optic flow, time stamp update, time of travel, velocity projection

979-8-3503-0159-5/23/$31.00 ©2023 IEEE

I. INTRODUCTION

In the field of biomimicry, where researchers look to nature's designs for inspiration to improve technology, insects are particularly notable as examples of utility and efficiency. Even with their small size and seemingly simple architecture, insects provide important insights into the functioning of sensory systems and locomotion. The two kinds of eye structures, the compound eye and the simple eye (ocelli), are among their most distinctive characteristics. The compound eye, a marvel of biological engineering made up of several ommatidia that each act as a tiny eye unit, is of particular interest. Because of this sophisticated anatomy, insects can see a large area and have excellent motion detection abilities, which helps them navigate through challenging surroundings. Our motion sensing approach is based on this blueprint found in nature and aims to mimic the accuracy and efficiency of biological systems. The idea of optic flow, which rests on determining the temporal difference between successive visual inputs to infer object movement within the field of view, is fundamental to our methodology. To this end, we have created a model of a photoreceptor based on the biological processes underlying insect vision; our motion detection technology builds on this paradigm, which makes it possible to convert photons into electrical impulses. In addition, we designed and implemented a positioning and time-of-traversal measurement unit that uses timestamp analysis to determine how long an object takes to move across pixels in the visual field. We have verified the effectiveness and dependability of our system through stringent integration, testing, and validation. Our method, grounded in biomimicry concepts, provides a flexible framework for improving motion sensing abilities and opens new avenues for research and applications.

Unlike most creatures that move their eyes, flies have a unique kind of vision in which their retinas move under fixed lenses. Like other animals, they use this movement to detect motion and stabilize images; this is called the optokinetic reflex. Additionally, when traversing obstacles, it increases retinal overlap, which improves depth perception. This finding casts doubt on earlier theories of insect vision and raises the possibility that other insects share comparable retinal movements, pointing to a more dynamic visual system [1]. Fruit flies (Drosophila) use unique neural computations in their fixed compound eyes to sense motion. T4 and T5 cells, which use linear and nonlinear


processing stages, respectively, to detect ON and OFF movements, are important pathways. A realistic model of the motion detection circuitry has been created by functional research in conjunction with a detailed mapping of the connectome of the fly's visual system, including synaptic connections, neurotransmitters, and neuron morphology. This model highlights the sophisticated and intricate structure of the fly's visual system, demonstrating evolutionary advances and providing insights into principles of neural computation that apply to a wider range of species [2]. A computational model based on physiological investigations examines how fruit flies (Drosophila) detect object motion direction against cluttered backgrounds. It includes wide-field horizontally and vertically sensitive pathways as well as ON and OFF routes. The model can reliably identify larger, quicker, and more contrasted moving objects, which encourages the development of neuromorphic sensors for intelligent devices [3]. Avoiding quickly approaching objects is essential for both intelligent robots and animals. A computational model is inspired by the quick escape circuit of flies, which includes LPLC2 neurons that integrate motion detector inputs to recognize looming structures. Tested on virtual robots, this model demonstrates strong collision avoidance and flexibility by varying parameters according to robot size, and promises to be useful in noisy conditions for real-time applications [4].

With a small form factor, the Curved Artificial Compound Eye provides high temporal resolution and a wide field of view, drawing inspiration from insect vision. The flexible imager is made up of three flat layers that are curved: the flexible circuit board, the neuromorphic photodetector array, and the microlens array. The spatial resolution and light adaptation of insect eyes are mimicked in this prototype, which may find use in surveillance systems, medical instruments, and automobile navigation [5]. Recent developments in genetics and recording methods enable the investigation of individual neurons involved in the visual activities of Drosophila melanogaster. Motion detection, the functions of photoreceptors in color and polarization vision, and visual processing during flight are among the subjects covered. The understanding of brain-behavior relationships that these discoveries provide is essential for researching visual course control circuitry [6]. Drosophila's R7 and R8 photoreceptors show wavelength-opponent features as a result of interactions with nearby ommatidia that are regulated by the Dm9 interneuron. This provides efficient decorrelation by shaping the spectral tuning. Similar to mammalian systems, two-photon imaging reveals a complex spatio-chromatic receptive field that enhances chromatic information extraction [7]. Through R7 and R8 photoreceptors with wavelength-opponent properties, controlled by the Dm9 neuron, flies process spectral information; in doing so, chromatic information is preserved and signals are decorrelated. This model predicts a broadband surround and color-opponent center, enabling Drosophila to interpret visual information efficiently [8].

Autonomous Micro Aerial Vehicles create depth maps from the optical flow of onboard cameras to navigate safely inside passageways. Collision avoidance is made possible using an omnidirectional fisheye camera and IMU data that adjust for rotation [9]. Drawing inspiration from insect vision, Patrick A. Shoemaker, Andrew M. Hyslop, and J. Sean Humbert investigate optic flow estimation for autonomous flight. A fly-like robot uses a virtual environment and a bio-inspired control algorithm to navigate an obstacle-filled arena in a manner similar to that of insects, specifically honeybees. The study contrasts conventional computational techniques with bio-inspired motion detection systems. When actual optic flow is present, bio-inspired algorithms initially display less mutual information; however, in contrast to computational approaches, their performance dramatically improves when integrated spatially. This suggests that spatial integration, as in insect vision, yields more reliable optic flow estimation for navigation [10]. Another work addresses altitude management as an obstacle avoidance problem in ultra-light indoor airplanes by proposing a control system using translatory optic flow. Micro-flyers use lightweight cameras and MEMS rate gyros to avoid the ground and ceiling without explicitly estimating their altitude. When tested on a model micro-flyer, encouraging outcomes were seen; parameter optimization and the improvement of flight models for practical use are among the upcoming tasks [11]. A time-stamp image is produced by a proposed time-stamp-based optic flow estimator, which records the arrival times of moving objects at each pixel. An 8-bit subtractor and a lookup table for velocity conversion are used to track position changes over time in order to determine motion velocity. Inspired by the visual processing of insects, this system calculates velocities from time differences and updates time stamps to detect motion efficiently. An improved version of Tobi Delbrueck's Adaptive Photoreceptor, the Self-Biasing Adaptive Photoreceptor circuit, is analyzed in Daniel Oberhoff's research at the University of Zurich's Institute of Neuroinformatics [12]; the work entails theoretical modeling, stability analysis, and instability mitigation, with partial success [13]. Previously used mainly as a genetic model, Drosophila melanogaster now supports in-depth research on behavior and visual processing. Phototaxis and the optomotor response are examples of vision-driven behaviors that reveal detailed visual processing from photoreceptors to brain regions, influencing behaviors ranging from simple responses to sophisticated navigation [14]. Through a simple motion detector algorithm inspired by insect visual pathways, another work investigates the development of bio-inspired optic flow sensors for micro-air vehicles. Using high-pass and low-pass filters, this algorithm adapts to changing lighting conditions to improve contrast and identify motion. One major contribution is an optic flow estimation core based on time stamps that simplifies motion identification by timing the passage of visual features. For contexts where hardware is constrained, such as MAVs, this strategy works well. The design illustrates the viability of extending the 1D model to 2D and improving autonomous navigation systems by balancing hardware efficiency and power consumption [15]. The VLSI implementation of motion estimation algorithms necessitates striking a balance between power efficiency and computational precision.
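The time-stamp scheme summarized above can be illustrated with a small behavioral sketch (our own Python model, not the hardware of [15]; the pixel pitch and the reciprocal lookup table are illustrative assumptions):

```python
# Behavioral sketch of time-stamp-based optic flow estimation.
# A feature arriving at a pixel latches the current frame counter; the
# difference between neighbouring time stamps is the time of travel
# (TOT), which a lookup table maps to a velocity.

PIXEL_PITCH = 1.0  # assumed distance between neighbouring pixels (arbitrary units)

def velocity_lut(tot, max_tot=255):
    """Lookup-table-style conversion: velocity = pitch / time of travel."""
    table = [0.0] + [PIXEL_PITCH / t for t in range(1, max_tot + 1)]
    return table[tot]

# A feature seen at pixel 3 on frame 10, then at pixel 4 on frame 15:
stamp = {3: 10, 4: 15}
tot = stamp[4] - stamp[3]   # the 8-bit subtractor in hardware
v = velocity_lut(tot)       # 5 frames per pixel -> 0.2 pixels/frame
```

In hardware the division is never performed at run time: the table is precomputed, so the conversion reduces to a single memory lookup per pixel.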
Holistic techniques are often overlooked in favor of one objective over the other in current research, and there is a lack of research on innovative architectural layouts or VLSI-specific algorithm modifications. To close this gap and achieve effective, high-performance solutions, interdisciplinary efforts combining knowledge of motion estimation algorithms, VLSI design, and system-level optimization are required. Reducing circuit area through the use of VLSI technology can greatly increase overall efficiency.

The time-stamp-based optical flow estimation algorithm [15] was determined to be more effective in terms of processing area and power than the models and algorithms presented by other research groups; hence, it is suited for ASIC implementation. The design of the system and its subsystems is presented in Section II.

II. SYSTEM AND SUBSYSTEM DESIGN

Block Diagram

Fig 1. System Level Diagram

Figure 1 depicts the system-level sub-blocks of the intended ASIC. Each block operates independently based on the signal received from the preceding stage. A full explanation of each sub-block's function follows.

First sub-block: This block represents the photodiode, modeled in Verilog-A. It takes a voltage input related to the light intensity and produces a current as its output. Figure 2 depicts the photodiode's test bench, in which V0 is the diode reference voltage and V1 is the value proportional to the light intensity. The variation of the photodiode current is displayed in Table 1.

Fig 2. Testbench of Photodiode

TABLE 1. Simulation values of photodiode

Sl. No   V1 (V)   V2 (mV)   Id (fA)
1        1        2         2
2        2        2         1.43
3        3        2         1.27
4        5        2         1.15

Sub-block II: Delbrück and Mead's remarkable effort produced the first working photoreceptor model. Their model includes two essential characteristics of insect eyes: adaptation to ambient light and logarithmic response.

Logarithmic response: Natural settings can have wide variations in light intensity. The photoreceptor's logarithmic response enables it to gather data efficiently across a broad spectrum of light intensities: even in extremely bright or extremely dark environments, minute variations in light intensity can be detected. Artificial photoreceptors that incorporate this trait can function well in a range of illumination conditions, both indoors and outdoors.

Adaptation to ambient light: Many creatures, including insects, have evolved defenses against variations in ambient light. This adaptation guarantees that visual systems remain effective and responsive in a variety of illumination scenarios, and usually entails modifying the photoreceptor's sensitivity in accordance with ambient light levels. By incorporating this trait into the photoreceptor model, artificial systems can imitate the dynamic reactivity of biological vision systems.

Creating a photodiode module is the first step in developing a photoreceptor model. This module simulates how incident photons are converted into electrical impulses by a photodiode. After validation it is incorporated into the larger photoreceptor model, which adds signal amplification and adaptation to the photon-to-electron conversion. Amplification is necessary to boost the photodiode's tiny photocurrent, and adaptation ensures that the sensitivity tracks the amount of light present. This integrated approach makes it possible to simulate the complete chain accurately, from photon detection to signal processing (Fig. 4 shows the schematic). The resulting photoreceptor model provides insights into the operation and possible uses of optical sensing devices, making it a valuable tool for research and development. The model of the photoreceptor is shown in Fig. 3, and the MOSFET dimensions are displayed in Table 2.

Fig 3. Photoreceptor

Explanation of the function of each MOSFET in the circuit:
 M1 (Source follower): Buffers the voltage from the photodiode D1, isolating it from the rest of the circuit.
 M2 (Amplification): Amplifies the buffered signal from M1.
 M3 (Load): Acts as a load for the amplifier stage involving M2, ensuring proper operation.
 M4 (Inverting amplifier): Inverts and further amplifies the signal from M2.
 M5 (Current source): Provides a stable current source for the inverting amplifier stage involving M4.

Additional components:
- D1 (Photodiode): Converts light into an electrical signal.
- C1 and C2 (Capacitors): Stabilize and set the frequency response of the amplifier stages.

TABLE 2. MOSFET dimensions

Sl. No   MOSFET   Size (W/L) in µm
1        M1       6.4/9.6
2        M2       12.0/5.6
3        M3       5.6/9.6
4        M4       8.8/3.2
5        M5       8.8/9.6

Fig 4. Schematic of photoreceptor

Sub-block III: The optic flow estimation core, shown in Figure 5, is further divided into five sub-blocks. These sub-blocks compute raw 8-bit optic flow from neighbouring 3x3 time stamp data.

Fig 5. Optic flow estimation core

Time Stamp Update: The first sub-block in Figure 5, titled "Time Stamp Update", uses 1-bit feature information to update the 2D array's time stamp information for each frame. This digital subsystem consists of 64 units, each of which receives a 1-bit object detection feature as input: the feature's value is binary 1 if an object is present and binary 0 otherwise. If the 1-bit feature is binary 1, the output of the unit records the value of an 8-bit counter that is connected in parallel to every unit. The synthesized time stamp update block is displayed in Fig. 6.

Fig 6. Schematic of time stamp update

Time Stamp 2D Array: Located in the second sub-block of Figure 5, this 2D time stamp array stores the time stamp data updated by the time stamp update block. The 64 sequential 8-bit inputs of this block (the outputs of the time stamp update block) are transformed into an 8x8 2D array, so that each pixel in the optic flow sensor's field of view has its time stamp recorded in the array. The synthesis of this block is shown in Fig 7.

Fig 7. Synthesis of time stamp 2D array

Time of travel: As shown in Fig 5, four time-of-travel blocks calculate the time of travel along four directions: horizontal, vertical, and the two diagonals. Each block takes the 2D array (the output of the previous block) as input, and its output is the time of travel, a measure of the difference between neighbouring pixel values whenever any corresponding pixel value changes. Fig 8 shows the synthesis of the time-of-travel block. The measured time-of-travel values in the four directions are converted to velocities using four lookup tables. Finally, the four velocities are projected onto the x and y axes to give the 8-bit 2-D optic flow (Vx, Vy).

Fig 8. Synthesis of time of travel

Fig 9. Synthesis of velocity projection
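The behavior of the time stamp update and 2D array sub-blocks described above can be summarized in a small software model (an illustrative Python sketch of our own, not the synthesized RTL):

```python
# Behavioral model of the time stamp update block: 64 units share an
# 8-bit frame counter, and a unit latches the counter when its 1-bit
# feature input is 1; otherwise it keeps its previous time stamp.

def timestamp_update(stamps, features, counter):
    """stamps: 64 stored 8-bit values; features: 64 single-bit inputs."""
    return [counter & 0xFF if f else s for s, f in zip(stamps, features)]

def to_2d(stamps):
    """Time stamp 2D array block: reshape the 64 outputs into 8x8."""
    return [stamps[r * 8:(r + 1) * 8] for r in range(8)]

stamps = [0] * 64
# On frame 9, an object is detected at pixel 26 (row 3, column 2):
features = [1 if i == 26 else 0 for i in range(64)]
stamps = timestamp_update(stamps, features, counter=9)
matrix = to_2d(stamps)  # matrix[3][2] now holds time stamp 9
```

In the ASIC these 64 latches operate in parallel in a single clock cycle; the sequential loop here only mirrors the functional result.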
Whenever a pixel value changes, its position is considered; from this value, the magnitude of the velocity and the direction of travel are determined.

III. RESULTS

Fig 10. Simulation of photodiode

Figure 10 shows that the voltage related to intensity, V2, is kept constant at 2 mV, while the diode reference voltage V1 is varied and measured in volts. The corresponding anode and cathode currents, displayed in the waveform, are obtained for each value of V1; for instance, Id = 1.43 femtoamps if V1 = 2 V and V2 = 2 mV. Table 1 lists the corresponding Id values against the change in V1.

Fig 11. Simulation of Photoreceptor

First, we examined the case in Figure 11 where c1 = c2 = 0 and M1 is off. When vpb and vcb are off, M3 will be on and M4 will be off; under these conditions, only the photodiode current is amplified, yielding the desired result. When vpb is zero and vcb is one, so that both M3 and M4 are on, the higher current is taken into account and amplified, and this is what appears at the output.

Fig 12. Simulation of timestamp update

The time stamp update waveform is displayed in Figure 12. The randomized variable en (a 64-bit bus, one bit per unit) represents the single-bit feature. Latching refers to transferring the counter value to the output (an 8-bit value for each of the 64 units): if en is 1, the latch saves the value of the 8-bit counter.

Fig 13. Simulation (console window result) of timestamp 2D array

Figure 13 shows how the Time Stamp Update block's output is transformed into a 2D array. This enhances data manipulation and makes data retrieval easier. The outcome is displayed in a console window, with each element of the rows and columns consisting of eight bits. The rows and columns of the timestamp 2D array are simply updated with the counter values of the timestamp update block.

Fig 14. Simulation (console window result) of time of travel and velocity projection

The time of travel (TOT) simulation results are displayed in Figure 14. TOT is calculated as the difference between the values of neighboring pixels. In Figure 14, the variables a, b and m, n show how the location in the 2D array has changed; the variable tot stores the TOT, computed as the difference between the pixel values at positions (a, b) and (m, n). The velocity projected onto the x and y direction components (u and v, respectively) is then determined from these values using the following equations:

u = V_WE + (V_SW_NE + V_NW_SE) / √2
v = V_SN + (V_SW_NE − V_NW_SE) / √2

where
V_WE = velocity in the horizontal direction,
V_SN = velocity in the vertical direction,
V_SW_NE, V_NW_SE = velocities in the diagonal directions.

Consider matrix[3][2] = 8'd2 and matrix[3][3] = 8'd9, so a = 3, b = 2 and m = 3, n = 3. tot is calculated as the difference between the two pixel values:
tot = matrix[3][3] − matrix[3][2] = 8'd9 − 8'd2 = 8'd7
Hence tot is 7, as shown in the console window, where 12 is the total time-of-travel value obtained by adding all the individual tot values. The code is written such that it reports the velocity direction and the x and y coordinate values.
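The time-of-travel difference and the projection equations above can be cross-checked with a short behavioral sketch (our own Python illustration; the function names are ours, and the variable names mirror the equations):

```python
import math

# Time of travel along one direction: difference between neighbouring
# time stamps in the 8x8 array, at positions (a, b) and (m, n).
def time_of_travel(matrix, a, b, m, n):
    return matrix[m][n] - matrix[a][b]

# Project the four directional velocities onto the x and y axes:
#   u = V_WE + (V_SW_NE + V_NW_SE) / sqrt(2)
#   v = V_SN + (V_SW_NE - V_NW_SE) / sqrt(2)
def project(v_we, v_sn, v_sw_ne, v_nw_se):
    u = v_we + (v_sw_ne + v_nw_se) / math.sqrt(2)
    v = v_sn + (v_sw_ne - v_nw_se) / math.sqrt(2)
    return u, v

# Example from the text: matrix[3][2] = 2, matrix[3][3] = 9 -> tot = 7
matrix = [[0] * 8 for _ in range(8)]
matrix[3][2], matrix[3][3] = 2, 9
tot = time_of_travel(matrix, 3, 2, 3, 3)

# Pure westward motion: only the horizontal velocity is non-zero.
u, v = project(v_we=1, v_sn=0, v_sw_ne=0, v_nw_se=0)  # -> (1.0, 0.0)
```

The diagonal contributions are scaled by 1/√2 because a unit diagonal velocity contributes equally to both axes.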


Here the pixel moves towards the west, so
Vwe = 1, Vsn = Vsw_ne = Vnw_se = 0, x = 1, y = 0.
Then
u = Vwe + (Vsw_ne + Vnw_se)/√2 = 1 + (0 + 0)/√2 = 1
v = Vsn + (Vsw_ne − Vnw_se)/√2 = 0 + (0 − 0)/√2 = 0
Multiplying the x and y components by u and v:
u = x × u = 1 × 1 = 1
v = y × v = 0 × 0 = 0
So the final u and v are 1 and 0, as shown in the console output of the velocity projection.

TABLE 3. Area of all the blocks

Sl. No   Block                                    Area
1        Photodiode                               1.936 mm²
2        Photoreceptor                            2.98 mm²
3        Time Stamp Update                        1.236 mm²
4        Time Stamp 2D Array                      1.0948 mm²
5        Time of Travel and Velocity Projection   0.5089 mm²

IV. CONCLUSION

The ASIC implementation is found to be effective in terms of area, power, and speed. The overall area stands at 5.824 mm², and at the chosen clock frequency the power is estimated to be 280 mW. The results are promising, and the system can further be implemented as a complete chip. Optical-flow-based estimation in the mixed-signal domain is found to be more effective than the digital counterparts used in flow estimation.

V. ACKNOWLEDGEMENT

We extend our deepest gratitude to the dedicated staff of the Project Lab, Communication Lab, VLSI Lab, and Neuromorphic Computing Lab within the Department of Electronics and Communication Engineering (ECE) at PES College of Engineering (PESCE), Mandya, for providing us with all the resources and guidance.

VI. REFERENCES

[1] Karin Nordstrom, Andrew B. Barron (2023, January). Vision: Flies move their eyes.
[2] Alexander Borst and Lukas N. Groschner (2023, July). How Flies See Motion.
[3] Qinbing Fu, Shigang Yue (2020, July). Modelling Drosophila motion vision pathways for decoding the direction of translating objects against cluttered moving backgrounds.
[4] Junyu Zhao, Shengkai Xi, Yan Li, Aike Guo, Zhihua Wu (2023, April). A fly inspired solution to looming detection for collision avoidance.
[5] Dario Floreano, Ramon Pericet-Camara, Stéphane Viollet, Franck Ruffier, Andreas Brückner, Robert Leitel, Wolfgang Buss, Mohsine Menouni, Fabien Expert, Raphaël Juston, Michal Karol Dobrzynski, Geraud L'Eplattenier, Fabian Recktenwald, Hanspeter A. Mallot, and Nicolas Franceschini (2013, June). Miniature curved artificial compound eyes.
[6] Alexander Borst (2014, August). Fly visual course control: behaviour, algorithms and circuits.
[7] Sarah L. Heath, Matthias P. Christenson, Elie Oriol, Maia Saavedra-Weisenhaus, Jessica R. Kohn, Rudy Behnia (2020, January). Circuit Mechanisms Underlying Chromatic Encoding in Drosophila Photoreceptors.
[8] Ilias Sourikopoulos, Sara Hedayat, Christophe Loyez, François Danneville, Virginie Hoel, Eric Mercier, and Alain Cappy (2017, March). A 4-fJ/Spike Artificial Neuron in 65 nm CMOS Technology.
[9] T. Delbruck, C. A. Mead (1996, April). Analog VLSI phototransduction by continuous-time, adaptive, logarithmic photoreceptor circuits.
[10] Daniel Oberhoff (2003, March). Semesterarbeit in Neuroinformatics: An aVLSI case study on the Self-biasing Adaptive Photoreceptor.
[11] Timothy A. Currier, Michelle M. Pang, Thomas R. Clandinin (2023, March). Visual processing in the fly, from photoreceptors to behavior.
[12] Simon Zingg, Davide Scaramuzza, Stephan Weiss, Roland Siegwart (2010, May). MAV Navigation through Indoor Corridors Using Optical Flow.
[13] Patrick A. Shoemaker, Andrew M. Hyslop, J. Sean Humbert (2011, May). Optic Flow Estimation on Trajectories Generated by Bio-Inspired Closed-Loop Flight.
[14] Antoine Beyeler, Jean-Christophe Zufferey and Dario Floreano (2007, January). 3D Vision-based Navigation for Indoor Microflyers.
[15] Seok Jun Park (2014). Bio-Inspired Optic Flow Sensors for Artificial Compound Eyes.
