
Available online at www.sciencedirect.com

ScienceDirect

Engineers and scientists around the world work every day to make human life better and more comfortable. Science keeps developing in every field, paving the way for new innovations, and the fourth industrial revolution has brought automation to every field. The entire automotive industry anticipates the launch of autonomous vehicles on the road. The special feature of such a vehicle is that it can perceive its environment and surroundings and make decisions on its own, without driver assistance. This task is accomplished using various precise sensors, processors, and pre-programmed actuators. A fully autonomous vehicle can navigate its occupants to the destination requested by the user without their assistance. These vehicles are the great revolution of the era in the field of robotics; technically, they are designed by drawing on various fields of engineering such as mechanical, electrical, computer science, and control engineering.

Autonomous vehicles will have a great impact on the environment, as they are designed to be environmentally friendly in many respects. An autonomous car chooses the shortest route to the destination, prefers routes with less traffic, and stays within its lane. Ultimately, this results in lower fuel consumption, less carbon emission into the atmosphere, and reduced travel time. The precision and accuracy of autonomous vehicles also greatly reduce traffic congestion. Consider, for example, vehicles waiting at a traffic signal: because the autonomous driving algorithm knows each vehicle's physical location accurately, the gaps between vehicles can be kept very small, reducing void spaces and hence congestion. These features have a positive impact on the environment.

Road accidents are one of the major causes of death. According to the report by Deshpande, nearly 3000 people die daily in road accidents, and it has further been reported that, if safety measures are not taken, the toll will grow to 2.4 million a year, making road accidents the 5th largest cause of death in the world. This number can be greatly reduced by putting autonomous cars into action, as they are far more reliable and react more swiftly than humans. They will also reduce traffic congestion, since the efficiency of an autonomous car allows it to maintain very small gaps between vehicles and to manage speed and time exceptionally well.
The project is an ecosystem designed to use an autonomous line-following vehicle to provide a shuttle service within a closed premises. The ecosystem comprises stations spread over the campus, between which transportation is facilitated. Each station, equipped with an AI-powered monitoring system, houses the electric scooters. The user creates a temporary riding account by logging into the system at a station, which grants access to a vehicle. The user can drive manually to any station of his or her choice, park the vehicle at the station nearest the destination, and log out of the temporary riding account. The AI-powered monitoring system keeps track of the vehicles, the stations, and the vehicle log. When the demand for vehicles rises at any station, a vehicle drives itself to that station by following a path line laid along the side of the road. The vehicle can thus act as both a manual electric scooter and an autonomous line-following vehicle.

2.1 Project methodology

For the shuttle service to be effective, AI-powered vehicles capable of self-driving were developed and incorporated. A module was designed and developed to integrate the self-driving technology into the vehicle. The self-driving module is an integration of sensory devices, a processing unit, and a control system. For ease of integration of the self-driving module, an electric vehicle with the following specification was chosen.

Fig. 1. Work process

2.2 Sensory devices

Pixy 2.0, a smart CMOS (complementary metal oxide semiconductor) camera, was selected to detect the path of the vehicle. Pixy2 has algorithms that detect and track lines; the algorithms can also detect intersections and barcodes. Barcodes help the vehicle select its respective path. It does all of this at 60 frames per second.

Fig. 2. PIXY cam


The Pixy camera is connected to the Arduino Mega through the ICSP port. Video is captured at 60 frames per second. Each frame is processed and split into two halves. The path line is treated as a vector, and its degree of deviation from the centre of the frame is used to calculate an error value using a pre-defined program. The calculated error value is sent to the processing unit (VPU), based on which the other systems are controlled.
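The deviation-to-error computation described above can be sketched as follows. This is a minimal illustration, not the authors' code; the 70/30 weighting toward the vector head is an assumption, and the frame width assumes the Pixy2's 79x52-pixel line-tracking resolution.

```python
def steering_error(x_tail, x_head, frame_width=79):
    """Signed deviation of the detected path-line vector from frame centre.

    x_tail, x_head: x-coordinates of the vector endpoints in the camera
    frame. Negative means the line lies left of centre, positive means
    right; the controller steers to drive this error toward zero.
    """
    centre = frame_width / 2.0
    # Weight the head of the vector more heavily, since it represents the
    # upcoming road rather than the road already under the vehicle
    # (illustrative weighting, not taken from the paper).
    return 0.7 * (x_head - centre) + 0.3 * (x_tail - centre)
```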

Fig. 3. TF Mini LIDAR


The obstacle detection system uses a LiDAR (light detection and ranging) sensor, specifically the TF Mini. The TF Mini is a ToF (time of flight) LiDAR sensor capable of measuring the distance to an object as close as 30 centimetres and as far as 12 metres. The sensor is connected to the processing unit through the I2C port, and the measured distance is sent to the processing unit (VPU). When the LiDAR encounters an obstacle at a distance of less than 300 cm, the brakes are applied.
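The braking rule reduces to a simple threshold check, sketched below. The 300 cm threshold and the 30 cm-12 m range limits come from the text; treating out-of-range readings as invalid (ignored rather than braked on) is an assumption.

```python
BRAKE_THRESHOLD_CM = 300        # brake when an obstacle is closer than this
MIN_RANGE_CM, MAX_RANGE_CM = 30, 1200  # TF Mini measurement limits

def should_brake(distance_cm):
    """Decide whether to apply the brakes for a single LiDAR reading."""
    if not (MIN_RANGE_CM <= distance_cm <= MAX_RANGE_CM):
        return False  # out-of-range reading: ignore rather than brake
    return distance_cm < BRAKE_THRESHOLD_CM
```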

2.3 Processing unit


The processing unit is a programmable microcontroller board (Arduino Mega 2560). A decision-making algorithm is programmed into the board: it receives data from the various sensors, processes them, and, based on the decision made by the algorithm, passes the appropriate signal to the required control system. Data are transferred between the sensors and the microcontroller at 115200 bits/second.

Fig. 4. Arduino Mega 2560 with WiFi module

2.4 Control system


In the drive system, the throttle connection of the motor controller is looped through an external 5 V signal controlled by the processing unit. Two different speed settings are required for the various driving conditions; they are achieved by two separate LM2596 DC-DC buck converters, each switched by an individual relay. These relays are controlled by the processing unit.
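The two-speed scheme amounts to selecting which relay (and hence which pre-set buck converter) feeds the throttle line. A sketch under stated assumptions: the relay pins and throttle voltages below are illustrative, not values from the paper.

```python
# Each relay enables one LM2596 buck converter pre-set to a throttle voltage.
SPEEDS = {
    "slow":   {"relay_pin": 7, "throttle_v": 1.8},  # assumed values
    "cruise": {"relay_pin": 8, "throttle_v": 3.6},  # assumed values
}

def select_speed(mode):
    """Return (pin to energize, pins to release) for the requested mode,
    so that exactly one converter drives the throttle at a time."""
    if mode not in SPEEDS:
        raise ValueError("unknown speed mode: " + mode)
    active = SPEEDS[mode]["relay_pin"]
    others = [s["relay_pin"] for m, s in SPEEDS.items() if m != mode]
    return active, others
```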

Fig. 5. Drive system.


The most crucial system in autonomous driving mode is the steering system. A stepper motor is used for steering, as it is precise to within 1.6° of rotation in both directions.

The rating of the stepper motor and the gearbox ratio required to steer the vehicle are calculated as follows.

Total weight of the vehicle = 220 kg (including driver and pillion)

Front : rear axle weight distribution ratio = 1:4
Front axle load = total weight of vehicle × front-axle share
Front axle load = 220 × (1/4) = 55 kg
Diameter of gear, D1 = 20 cm
Diameter of pinion, D2 = 10 cm
Centre-to-centre distance, D0 = (D1 + D2) × 0.5
D0 = (20 + 10) × 0.5 = 15 cm
Castor angle, α = 43.6°

Torque required, T_required = front axle load × sin α × D0
T_required = 55 × sin(43.6°) × 15 = 568.9 kg-cm

A NEMA 23 stepper motor with a 1:4.25 planetary gearbox is considered.

Rated torque of the NEMA 23 = 72 kg-cm
Planetary gearbox ratio = 1:4.25
Steering gear ratio = 1:2

Torque obtained, T_obtained = rated torque × planetary gearbox ratio × steering gear ratio
T_obtained = 72 × 4.25 × 2 = 612 kg-cm
T_obtained > T_required
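The torque calculation can be checked numerically; the script below is a straightforward transcription of the figures given above.

```python
import math

total_weight = 220                      # kg, including driver and pillion
front_axle_load = total_weight * (1/4)  # kg, per the stated 1:4 split
d0 = (20 + 10) * 0.5                    # cm, gear centre-to-centre distance
castor = math.radians(43.6)             # castor angle

t_required = front_axle_load * math.sin(castor) * d0
t_obtained = 72 * 4.25 * 2              # rated torque x gearbox x steering ratio

print(round(t_required, 1))             # 568.9 kg-cm
print(t_obtained)                       # 612.0 kg-cm
assert t_obtained > t_required          # motor + gearbox combination suffices
```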

A NEMA 23 stepper motor with a 1:4.25 planetary gearbox is therefore selected. A BH-MSD-6A-N microstep driver is used to control the stepper motor.

Fig. 6. (a) Stepper motor; (b) micro step driver.

A spur gear train is used in the steering mechanism to transmit power from the stepper motor to the steering shaft. The pinion gear of the gear train is attached to the stepper motor shaft. The steering column and handlebar are coupled with a custom-made steering shaft, to which the driven gear is attached. A fabricated stepper motor housing is welded to the steering column. The stepper motor is connected to the processing unit through the microstep driver, and the processing unit controls the steering system based on processed sensor input. The stepper motor is powered by the 48 V primary power source.
Fig. 7. (a) 3D model of Steering shaft; (b) steering shaft and gear.

The existing brake mechanism on the rear wheel is used when driving manually, whereas the front brake is adapted to stop or slow the vehicle during self-driving. Based on the design considerations, a TowerPro MG996R servo motor is used in the front-wheel braking mechanism. The servo motor is mounted on the front wheel hub using customized mountings, and the brake lever is actuated by a linkage connected to the servo. Based on the decision made by the algorithm, the processing unit controls the braking system.
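The processing unit's brake command reduces to commanding a servo angle that pulls the lever linkage. A hypothetical sketch: both angles below are illustrative assumptions, not values from the paper.

```python
RELEASED_DEG, APPLIED_DEG = 0, 70   # assumed MG996R lever positions

def brake_servo_angle(obstacle_cm, threshold_cm=300):
    """Return the servo angle for one LiDAR reading: pull the brake
    lever when an obstacle is inside the 300 cm threshold."""
    return APPLIED_DEG if obstacle_cm < threshold_cm else RELEASED_DEG
```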

Fig. 8. Brake system.

2.5 Monitoring mechanism

The monitoring system allows the admin and the users to monitor vehicle status and availability using IoT. It also allows the admin to review the user log.

Vehicles are parked at the stations, and users can access a vehicle from any of the stations spread across the campus. Each station is provided with a computer and a barcode scanner through which the user interfaces with the system. A station houses a panel that displays vehicle availability with green and red LEDs: green indicates the vehicle is available and red indicates it is unavailable. Each stand in the station is provided with a charging port for the vehicle.

The admin has control over the system and the database to ensure proper functionality. The admin is provided with an interface built with HTML, CSS, PHP, and JavaScript; the database is programmed in SQL to store and manipulate the data. The admin has access to the entire database and can view and edit user information, vehicle information, the user log, the vehicle log, vehicle availability, and so on.

Software has been developed for the users to interface with the system. It requires users to create their own account with some basic personal information and to log in with that account to gain access to the vehicles. The software also lets the user check vehicle availability at any of the stations.

To access a vehicle, the user follows the sequence of steps below.
Step 1: The user scans his or her ID card to log into the account.
Step 2: The user scans the RFID tag of an available vehicle, indicated by a green light on the panel. (Once the credentials are validated, the vehicle is powered on.)
Step 3: The user rides to any station of interest and parks the vehicle at its respective stand.
Step 4: To end the ride, the user scans the ID card at the system available at that station.
The software verifies the data against the database when the user scans his or her ID card. A temporary riding account (TRA) is created for the user after the credentials are verified, and it remains open until the user ends the ride. A signal is transmitted from the system to a dedicated microprocessor in the vehicle to trigger the relay connected across the power unit, which turns the vehicle on or off.
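The TRA lifecycle described above can be sketched as a small state machine. This is a minimal illustration, not the project's PHP/SQL implementation; the user IDs and data structures are made up for the example.

```python
registered_users = {"U123"}   # IDs present in the database (illustrative)
open_tras = {}                # user ID -> assigned vehicle

def scan_id(user_id, vehicle=None):
    """First valid scan opens a TRA (vehicle powered on); a second scan,
    at the destination station, closes it (vehicle powered off)."""
    if user_id not in registered_users:
        return "credentials rejected"
    if user_id in open_tras:
        del open_tras[user_id]          # end of ride: close the TRA
        return "ride ended, vehicle powered off"
    open_tras[user_id] = vehicle        # start of ride: open a TRA
    return "TRA opened, vehicle powered on"
```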
Various tests have been performed to ensure the proper functionality of the system. Each component was bench-tested before being integrated with the vehicle. Since several systems are integrated in the project, the testing process was divided into four procedures, which resulted in effective testing:
• Vehicle performance test
• Self-driving capability test
• Obstacle detection test
• Software test

3.1 Inference of vehicle performance test

It was noted that the vehicle could reach its maximum speed of 45 km/h in 13 seconds under standard conditions. The total running distance at full charge is 32 km with a load of 60 kg (the driver). The battery takes 3.5 hours to charge fully from 0% using the standard charger. The vehicle can carry a maximum load of 220 kg with a factor of safety (FoS) of 0.5 at 0° gradeability.

3.2 Inference of self-driving capability test


It was observed that a black border line is required to distinguish the white path line from the road surface, and that a 152.4 mm (6 in) wide white line with a 25.4 mm wide black border is required for proper vector detection. Based on the test results, 18 different parameters were tuned to enhance vector detection. The reliability of the self-driving system was found to be good under various road and atmospheric conditions. Vector catching is proper when at least 40% of the path line in the field of view is visible.
3.3 Inference of obstacle detection test

The acceptance angle of the LiDAR sensor was found to be 2.3°. The side lengths of the different detection ranges of the LiDAR were determined.
Table 1. Relationship between detection range and distance

Distance (m):                          1    2     3     4     5     6
Detection-range side length (mm):     40   80   120   160   200   240
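The tabulated side lengths are consistent with the 2.3° acceptance angle. A quick geometric check, under the assumption that the detection spot widens linearly as 2·d·tan(θ/2):

```python
import math

def spot_side_length_mm(distance_m, acceptance_deg=2.3):
    """Side length of the LiDAR detection spot at a given distance,
    assuming it grows linearly with the full acceptance angle."""
    half_angle = math.radians(acceptance_deg / 2)
    return 2 * distance_m * 1000 * math.tan(half_angle)

for d in (1, 2, 3):
    print(d, round(spot_side_length_mm(d)))  # about 40 mm per metre
```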

The maximum detection range of the LiDAR was observed to be 700 cm and the minimum 30 cm. As the vehicle is operated at low speed in self-driving mode, the response time and stopping distance are negligible.

3.4 Inference of software test


The attempt to create a temporary riding account and write it to the database was successful, with no glitch observed throughout the process. It was observed that the transmitting distance between the system and the vehicle, for powering the vehicle on, should be less than 50 m.
