Dr. Mahesh (1), Anupama B (2), Apoorva T K (3)
Department of ECE, PESCE, Mandya, India
mahesh@pesce.ac.in, anupamaa803@gmail.com, apoorvatk123@gmail.com
Abstract— In pursuit of advancing motion detection and analysis in visual scenes, this project develops a mixed-mode Application Specific Integrated Circuit (ASIC) implementation of a motion estimation algorithm. Inspired by mechanisms found in nature, particularly in the locomotion of insects, our approach harnesses the principles of compound eyes and time-of-travel optic flow sensing. The algorithm integrates time stamps associated with each frame, going beyond conventional optical flow estimation methodologies. Central to this work is the development of photoreceptor cells and a precise time-of-traversal measurement unit. The hardware architecture captures 8-bit 2-D raw optic flow, employing techniques such as logarithmic response and ambient light adaptation. The proposed ASIC architecture comprises blocks for time stamp updating, 2-D array recording, and time-of-travel calculation along the horizontal, vertical, and diagonal directions. The calculated values are converted into velocities through lookup tables and projected onto the x and y axes, resulting in high-fidelity 8-bit 2-D optic flow (Vx, Vy). The simulation results of the system are found to be promising with respect to power and area compared with analogous algorithm implementations in purely digital architectures.

Keywords— photodiode, photoreceptor, optic flow, time stamp update, time of travel, velocity projection

I. INTRODUCTION

In the field of biomimicry, where researchers look to nature's designs for inspiration to improve technology, insects stand out as excellent examples of utility and efficiency. Even with their small size and seemingly simple architecture, insects provide important insights into the functioning of sensory systems and locomotion. The two different kinds of eye structures, the compound eye and the simple eye (ocelli), are among their most distinctive characteristics. The compound eye, a wonder of biological engineering made up of several ommatidia that each act as a tiny eye unit, is of particular interest. Because of this sophisticated anatomy, insects can see a large area and have excellent motion detection abilities, which helps them navigate through challenging surroundings.

Our motion sensing approach is based on this blueprint found in nature and aims to mimic the accuracy and efficiency of biological systems. Fundamental to our methodology is the idea of optic flow: determining the temporal difference between successive visual inputs to infer object movement within the field of view. To do this, we have created a photoreceptor model based on the biological processes that underlie insect vision; this model, which makes it possible to convert photons into electrical impulses, forms the basis of our motion detecting technology. In addition, we designed and implemented a positioning and time-of-traversal measurement unit that uses time stamp analysis to determine how long an item takes to move across pixels in the visual field. We have verified the effectiveness and dependability of our system by putting it through stringent integration, testing, and validation processes. Our method, based on biomimicry concepts, provides a flexible framework for improving motion sensing abilities and opens new avenues for research and industry.

Unlike most creatures that move their eyes, flies have a unique kind of vision in which their retinas move under fixed lenses. Like other animals, they use this movement, called the optokinetic reflex, to detect motion and stabilize images. Additionally, when traversing obstacles, it increases retinal overlap, which improves depth perception. This finding casts doubt on earlier theories regarding insect vision and raises the possibility that other insects may share comparable retinal movements, pointing to a more dynamic visual system [1]. Fruit flies (Drosophila) use unique neural computations in their fixed compound eyes to sense motion. T4 and T5 cells, which use linear and nonlinear
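The time-of-travel idea described above can be illustrated with a minimal sketch: each pixel records the time stamp at which its photoreceptor last fired, the time of travel (TOT) between adjacent pixels is the difference of their time stamps, and velocity follows as pixel pitch divided by TOT. The function name, array layout, and pitch value here are illustrative assumptions, not the paper's hardware.

```python
import numpy as np

def time_of_travel_flow(t_stamp, pitch=1.0):
    """Estimate horizontal velocities from a 2-D array of time stamps.

    t_stamp[r][c] holds the tick at which the photoreceptor at (r, c)
    last fired; TOT between horizontally adjacent pixels is the
    difference of their time stamps, and velocity is pitch / TOT.
    """
    t = np.asarray(t_stamp, dtype=float)
    tot_h = t[:, 1:] - t[:, :-1]          # horizontal TOT per pixel pair
    with np.errstate(divide="ignore", invalid="ignore"):
        v_h = np.where(tot_h != 0, pitch / tot_h, 0.0)  # avoid /0
    return tot_h, v_h

# A bright edge sweeping left-to-right, one pixel every 2 ticks:
stamps = [[0, 2, 4, 6],
          [0, 2, 4, 6]]
tot, vel = time_of_travel_flow(stamps)
print(tot[0])   # uniform TOT of 2 ticks between neighbors
print(vel[0])   # 0.5 pixel-pitch units per tick
```

In the paper's architecture the division is replaced by a lookup table, so no divider is needed on chip; the sketch uses a direct division only for readability.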
Fig 10. Simulation of photodiode

Figure 10 shows that the intensity-related voltage V2 is held constant at 2 mV while the diode reference voltage V1, measured in volts, is varied. The corresponding anode and cathode currents, as displayed in the waveform, are obtained for each value of V1. For instance, Id = 1.43 fA when V1 = 2 V and V2 = 2 mV. Table 1 therefore lists the equivalent Id values against the change in V1.

First, we examined Figure 11 with c1 = c2 = 0, so that M1 is off. When vpb and vcb are off, M3 is on and M4 is off; in these circumstances only the photodiode current is amplified, yielding the desired result. When vpb is zero and vcb is one, M3 and M4 are both on, the larger current is taken into account and amplified, and this produces the output.

Fig 14. Simulation (console window result) of time of travel and velocity projection

The time-of-travel (TOT) simulation results are displayed in Figure 14. TOT is calculated as the difference between the values of neighboring pixels. The variables a, b and m, n show how the location in the 2-D array has changed in Figure 14, and the variable tot stores the TOT computed as the difference between the pixel values at positions (a, b) and (m, n). The velocities projected onto the x and y direction components (u and v, respectively) are then determined from these values using the following equations:

u = V_WE + (V_SW_NE + V_NW_SE) / √2
v = V_SN + (V_SW_NE − V_NW_SE) / √2

where
V_WE = velocity in the horizontal (west–east) direction
V_SN = velocity in the vertical (south–north) direction
V_SW_NE, V_NW_SE = velocities along the two diagonal directions

Consider matrix[3][2] = 8'd2 and matrix[3][3] = 8'd9. tot is calculated as the difference between the two pixel values:

tot = matrix[3][3] − matrix[3][2] = 8'd9 − 8'd2 = 8'd7

Hence tot is 7, as shown in the console window; here a = 3, b = 2, and m = 3, n = 3. The value 12 in the console window is the total time of travel, calculated by adding all the individual tot values. Here the code is written such that we obtain both the individual and the total time-of-travel values.
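The worked example (tot = 9 − 2 = 7) and the projection equations can be checked with a short sketch. The matrix size, variable names, and the test inputs to the projection are illustrative assumptions; the paper's actual implementation is RTL with lookup tables rather than Python arithmetic.

```python
import math

# 8-bit time stamps at two horizontally adjacent pixels, as in the
# example: matrix[3][2] = 2 and matrix[3][3] = 9.
matrix = [[0] * 8 for _ in range(8)]
matrix[3][2] = 2
matrix[3][3] = 9

a, b = 3, 2          # position of the earlier time stamp
m, n = 3, 3          # position of the later time stamp
tot = matrix[m][n] - matrix[a][b]
print(tot)           # 7, matching the console-window value

def project(v_we, v_sn, v_sw_ne, v_nw_se):
    """Project the four directional velocities onto the x and y axes:
    u = V_WE + (V_SW_NE + V_NW_SE) / sqrt(2)
    v = V_SN + (V_SW_NE - V_NW_SE) / sqrt(2)
    """
    u = v_we + (v_sw_ne + v_nw_se) / math.sqrt(2)
    v = v_sn + (v_sw_ne - v_nw_se) / math.sqrt(2)
    return u, v

# Purely diagonal SW->NE motion contributes equally to u and v:
u, v = project(0.0, 0.0, 1.0, 0.0)
print(round(u, 3), round(v, 3))   # 0.707 0.707
```

The 1/√2 factor simply resolves each diagonal velocity into equal x and y components before summing with the horizontal and vertical measurements.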
TABLE 3. Area of all the blocks

Sl. No.  Block                          Area
1        Photodiode                     1.936 mm²
2        Photoreceptor                  2.98 mm²
3        Time Stamp Update              1.236 mm²
4        Time Stamp 2D Array            1.0948 mm²
5        Time of Travel and Velocity    0.5089 mm²

IV. CONCLUSION

The ASIC implementation is found to be effective in terms of area, power, and speed. The overall area stands at 5.824 mm², and at the set clock frequency the power is estimated to be 280 mW. The results are promising; further, the system can be implemented fully as a standalone chip. The mixed-mode optical-flow-based estimation is found to be more effective than the purely digital counterparts used in flow estimation.

V. ACKNOWLEDGEMENT

VI. REFERENCES

[8] Ilias Sourikopoulos, Sara Hedayat, Christophe Loyez, François Danneville, Virginie Hoel, Eric Mercier, and Alain Cappy (2017, March). A 4-fJ/Spike Artificial Neuron in 65 nm CMOS Technology.
[9] T. Delbruck and C. A. Mead (1996, April). Analog VLSI Phototransduction by Continuous-Time, Adaptive, Logarithmic Photoreceptor Circuits.
[10] Daniel Oberhof (2003, March). Semesterarbeit in Neuroinformatics: An aVLSI Case Study on the Self-Biasing Adaptive Photoreceptor.
[11] Timothy A. Currier, Michelle M. Pang, and Thomas R. Clandinin (2023, March). Visual Processing in the Fly, from Photoreceptors to Behavior.
[12] Simon Zingg, Davide Scaramuzza, Stephan Weiss, and Roland Siegwart (2010, May). MAV Navigation through Indoor Corridors Using Optical Flow.
[13] Patrick A. Shoemaker, Andrew M. Hyslop, and J. Sean Humbert (2011, May). Optic Flow Estimation on Trajectories Generated by Bio-Inspired Closed-Loop Flight.
[14] Antoine Beyeler, Jean-Christophe Zufferey, and Dario Floreano (2007, January). 3D Vision-Based Navigation for Indoor Microflyers.
[15] Seok Jun Park (2014). Bio-Inspired Optic Flow Sensors for Artificial Compound Eyes.