PLASTIC BOTTLE COLLECTING ROBOT-Semester 08
A PROJECT REPORT
Submitted by
DE ALWIS K K D T N
PERERA H P N
in
MECHANICAL ENGINEERING
FACULTY OF ENGINEERING
OCTOBER 2017
SIGNATURE SIGNATURE
SOUTH EASTERN UNIVERSITY OF SRI LANKA SOUTH EASTERN UNIVERSITY OF SRI LANKA
OLUVIL OLUVIL
SIGNATURE
ENG. R. RATHEESAN
CO-SUPERVISOR
FACULTY OF ENGINEERING
OLUVIL
ABSTRACT
This report presents the software implementation of the project, a plastic bottle collecting robot
that could be used in high-risk places such as high-rise building construction sites and highway
roads, using wireless communication. The robot is built on a moving platform with a robot arm
and a basket for gathering bottles, which it can unload at a designated place. The user can
navigate the robot with an Android-powered mobile phone via Bluetooth. A program developed in
the Arduino environment controls all the actuators. The image processing part is done in
MATLAB, which measures the distance between the object and the camera and computes the joint
angles the robot arm components need to reach the plastic bottle.
TABLE OF CONTENTS
ABSTRACT
LIST OF TABLES
LIST OF FIGURES
LIST OF SYMBOLS AND ABBREVIATIONS
ACKNOWLEDGEMENTS
1. INTRODUCTION
2. LITERATURE REVIEW
3. PROJECT OBJECTIVE
4. METHODOLOGY
5. RESULT
5.1 Platform
5.1.1 Bluetooth Connection
5.2 Object Distance Measurement Using a Single Camera
5.2.1 The Image Processing Algorithm
5.2.2 MATLAB Function
5.3 MATLAB Robotics Toolbox
5.3.1 Create the Robot Arm in MATLAB
5.4 Robotic Arm
5.5 Unloading
6. DIFFICULTIES
7. SUGGESTIONS AND FUTURE WORKS
CONCLUSION
REFERENCES
APPENDIX 1
APPENDIX 2
APPENDIX 3
LIST OF TABLES
LIST OF FIGURES
ACKNOWLEDGEMENTS
We would like to convey our heartiest thanks to the following people, who helped us in numerous
ways to complete our Final Year Project.
We convey our sincere gratitude to Senior Lecturer Dr. A M Muzathik, Head of the Department,
Department of Mechanical Engineering, Faculty of Engineering, South Eastern University of Sri
Lanka, for providing the necessary support and advice.
We thank the Final Year Project coordinator, Eng. P. Balthazar of the Department of Mechanical
Engineering, for the arrangement and coordination of the final year project.
We express our special gratitude to our main project supervisor, Eng. Ramesha Soysa, for the
most valuable guidance in supervising this project. We also express our special thanks to our
project co-supervisor, Eng. R. Ratheesan, for all other support on our project.
Further, we are thankful to the Workshop Engineer, Mr. Subrey, and the workshop staff for their
help and advice.
Moreover, we convey our sincere gratitude to our parents, friends, and everyone who cooperated
with us in doing the project.
DE ALWIS K K D T N
PERERA H P N
Department of Mechanical Engineering,
Faculty of Engineering,
South Eastern University of Sri Lanka.
1. INTRODUCTION
Environmental pollution is one of the major concerns in the world, and one of the main
environmental pollutants is plastic. The reason is that plastics do not decay through bacterial
action. Therefore, among other pollutants, plastics such as drinking water bottles, polythene
bags, beverage cans, and yogurt cups pose a great threat to the environment.
In modern daily life, people prefer drinking bottled water to avoid health issues caused by
unclean supplies of water. Other instant drinks are also packed in plastic bottles due to their
low cost, ease of handling, and lack of reaction with the water or carbonated drink contents.
This, however, causes an increase in the amount of plastic disposed of into the environment.
Although there are proper ways to dispose of plastic bottles and other items for recycling, some
people are not aware of them or do not make the effort to follow the procedure. They discard
empty water bottles everywhere despite the harm this causes to the environment we live in.
To reduce environmental pollution, these types of waste need to be collected and disposed of
properly. Currently, in most countries, especially third-world countries, the plastic waste
collection process is done manually. Although using human labor is efficient at times, there are
locations that are not safe and pose difficulties when collecting waste. Such critical areas
include highway verges, construction sites, and the surroundings of busy railway stations. Thus,
developing a robot to collect plastic waste in these critical areas will allow the task to be
performed safely.
Basically, this robot consists of a few major systems: a moving platform, a camera with an image
processing system, a robot arm, and an unloading system. All control of the robot is done by
Arduino and MATLAB. Except for image processing, all control processes are handled by Arduino.
In total, three Arduino boards (two Arduino Mega 2560 and one Arduino Uno) were used to control
the robot arm, the unloading system, and the moving platform, respectively.
The Arduino Mega and Arduino Uno microcontrollers were not capable enough for image processing,
so the image processing was done in MATLAB. The visual output of the robot's camera is observed
by MATLAB, which calculates the distance between the captured object and the camera. The
positioning angles for the robot arm to reach the object are then also computed by MATLAB.
2. LITERATURE REVIEW
Several research papers and articles relevant to the software development of the robot were
referred to while developing and improving our project (plastic bottle collecting robot). The
basic ideas and technologies observed in those publications are briefly discussed below.
2.1 Sending and Receiving Data via Bluetooth with an Android Device [1]
Android developers often need to use Bluetooth in their projects. Unfortunately, Bluetooth
can be confusing to use for those unfamiliar with the process. This application note details
a method to utilize Bluetooth to communicate with a microcontroller. This method includes
verifying Bluetooth support and status, pairing and connecting to the microcontroller’s
Bluetooth module, and sending and receiving data serially.
2.2 Bluetooth based home automation system using cell phone [2]
This paper presents the design and implementation of a low cost but flexible and secure
cell phone based home automation system. The design is based on a standalone Arduino
BT board and the home appliances are connected to the input/ output ports of this board
via relays. The communication between the cell phone and the Arduino BT board is
wireless. This system is designed to be low cost and scalable allowing variety of devices
to be controlled with minimum changes to its core.
2.3 Object Distance Measurement Using a Single Camera for Robotic Applications [3]
This thesis mainly focuses on the development of object distance measurement and feature
extraction algorithms using a single fixed camera and a single camera with variable pitch
angle based on image processing techniques. As a result, two different improved and
modified object distance measurement algorithms were proposed for cases where a camera
is fixed at a given angle in the vertical plane and when it is rotating in a vertical plane. In
the proposed algorithms, as a first step, the object distance and dimension such as length
and width were obtained using existing image processing techniques. Since the results were
not accurate due to lens distortion, noise, variable light intensity, and other uncertainties
such as deviation of the object's position from the optical axis of the camera, in the second
step the distance and dimensions of the object obtained from the existing techniques were
corrected in the X- and Y-directions, and for the orientation of the object about the Z-axis
in the object plane, using experimental data and identification techniques such as the least
squares method.
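The least-squares correction step described above can be sketched as a simple linear fit between raw camera distances and ground-truth measurements. The sketch below is in Python for clarity (the thesis used MATLAB), and the calibration data are illustrative values, not the thesis's measurements.

```python
def fit_least_squares(x, y):
    """Fit y ~= a*x + b by ordinary least squares (closed form)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = mean_y - a * mean_x
    return a, b

# Illustrative calibration data: raw distances reported by the camera (cm)
# versus tape-measured ground truth (cm).
raw = [20.0, 40.0, 60.0, 80.0]
true = [22.0, 43.0, 64.0, 85.0]
a, b = fit_least_squares(raw, true)

def corrected_distance(d_raw):
    """Correct a raw camera distance with the fitted model."""
    return a * d_raw + b
```

Once the coefficients are fitted from experimental data, every subsequent raw measurement is passed through the correction before being used.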
2.4 Vision controlled pick and place of moving object by 3R robot [4]
This paper presents a 3R robot in an eye-to-hand orientation for the application of pick and
place of a moving object using a vision system. The methodology consists of real-time video
extraction, object detection using the motion of the object as the classification criterion, and
point-based tracking to predict the future state of the moving object using a Kalman filter in
MATLAB. The tracked estimate is transferred to the robot using the Arduino platform. A 3R robot
with three degrees of freedom is subjected to the implementation of the methodology. A USB
webcam provides the visual information to the computer, where image processing is done on the
captured video; the object is detected and its future position is predicted using the Kalman
filter algorithm.
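The Kalman-filter prediction described above can be illustrated with a minimal one-dimensional constant-velocity filter, written here in plain Python. The process-noise and measurement-noise values `q` and `r` are illustrative assumptions, not taken from the cited paper.

```python
def kf_step(x, P, z, dt=1.0, q=0.01, r=1.0):
    """One predict+update step of a 1-D constant-velocity Kalman filter.
    x = [position, velocity], P = 2x2 covariance, z = measured position."""
    # Predict with F = [[1, dt], [0, 1]] and process noise q on the diagonal.
    xp = [x[0] + dt * x[1], x[1]]
    Pp = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
           P[0][1] + dt * P[1][1]],
          [P[1][0] + dt * P[1][1],
           P[1][1] + q]]
    # Update with a position measurement z (observation H = [1, 0]).
    S = Pp[0][0] + r
    K = [Pp[0][0] / S, Pp[1][0] / S]
    y = z - xp[0]
    xn = [xp[0] + K[0] * y, xp[1] + K[1] * y]
    Pn = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
          [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
    return xn, Pn

# Track an object moving 1 unit per frame, then predict its next position.
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for z in [1.0, 2.0, 3.0, 4.0]:
    x, P = kf_step(x, P, z)
predicted_next = x[0] + x[1]  # one-step-ahead prediction
```

After a few noiseless measurements the velocity estimate approaches 1 unit per frame, so the one-step-ahead prediction lands near the true next position of 5.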
2.5 Experiments in 3D measurements by using single camera and accurate motion [5]
2.6. Stereo vision images processing for real-time object distance and size
measurements [6]
In this project, it is proposed to utilize a stereo vision system to accurately measure the
distance and size (height and width) of an object in view. Object size identification is very
useful in building systems or applications, especially in autonomous system navigation. Many
recent works have started to use multiple vision sensors or cameras for different types of
application, such as 3D image construction and occlusion detection. Multiple-camera systems have
become more popular since cameras are now very cheap and easy to deploy and utilize. The
proposed measurement system consists of object detection on the stereo images, blob extraction,
distance and size calculation, and object identification.
2.7. Detection and Tracking System of Moving Objects Based on MATLAB [7]
In this thesis, video surveillance system with moving object detection and tracking
capabilities is presented. This thesis is committed to the problems of defining and
4
developing the basic building blocks of video surveillance system. The video surveillance
system requires fast, reliable and robust algorithms for moving object detection and
tracking. The system can process both color and gray images from a stationary camera. It
can handle object detection in indoor or outdoor environment and under changing
illumination conditions. This paper presents detection and tracking system of moving
objects based on MATLAB. It is described for segmenting moving objects from the scene.
2.8 Interfacing of MATLAB with Arduino for object detection algorithm implementation using serial communication [8]
This paper explains a proposed algorithm for object detection using image processing, and the
manipulation of the output pin state of an Arduino board with an ATmega8 controller by tracking
the motion of the detected object. The object detection algorithm was developed on the MATLAB
platform by combining several image processing algorithms. Using the theory of image acquisition
and the fundamentals of digital image processing, the object is detected in real time. Various
features of an object, such as shape, size and color, can be used to detect and track it. The
variation of the detected object along the vertical and horizontal axes is passed over the
serial communication port, and using serial data communication, the state of the Arduino board
pins is controlled. MATLAB programming develops a real-time computer vision system for object
detection and tracking using a camera as the image acquisition hardware, and Arduino programming
provides the interfacing of a hardware prototype with the control signals generated by real-time
object detection and tracking.
2.9 Robotics Toolbox for MATLAB
The Toolbox has always provided many functions that are useful for the study and simulation of
classical arm-type robotics, for example kinematics, dynamics, and trajectory generation. The
Toolbox is based on a very general method of representing the kinematics and dynamics of
serial-link manipulators. These parameters are encapsulated in MATLAB objects; robot objects can
be created by the user for any serial-link manipulator. The Toolbox also provides functions for
manipulating and converting between datatypes such as vectors, homogeneous transformations and
unit-quaternions, which are necessary to represent 3-dimensional position and orientation.
2.10 Inverse kinematics for underactuated robots [10]
This document describes how to determine inverse kinematics for such a robot using the Robotics
Toolbox for MATLAB. Underactuation complicates the process of finding an inverse kinematic
solution and frustrates those who are new to robotics. For a robot with 6 joints it is quite
straightforward, but underactuation requires careful thought about the problem to be solved. The
problem can be considered in two parts: first, moving the robot tool to a position; second,
moving the tool to a position and tool orientation.
3. PROJECT OBJECTIVE
This robot is designed for collecting plastic bottles in high-risk areas. All movements of the
robot towards the plastic bottles are controlled using a remote controller, and it is initially
planned to move on a fairly flat area without many obstacles. The robot carries a web camera and
a Wi-Fi module, which allows it to be moved easily to a given location by remotely observing its
vision through the web camera.
The communication between the user and the robot is done over Wi-Fi, which allows the operator
to command the robot easily over a greater range of distance. Although the robot's navigation is
currently done manually, the bottle collection is done automatically. After the user gives the
command to collect the plastic bottles, the robot manages the distance between the object and
itself automatically and collects the plastic bottles with its robot arm. The robot is designed
to collect about four bottles at a time; after collecting four bottles, it sends a message to
the user saying so. The robot is then navigated by the user to a designated unloading place,
where, on the user's command, it unloads the collected plastic bottles.
The control of the robot is done by both Arduino and MATLAB. Arduino controls all the inputs and
outputs of the robot, including the actuators, while MATLAB handles the image processing.
Mainly, the distance calculation between the camera and the object is done by MATLAB, and the
arrangement of the robot arm components needed to reach the object is calculated by the Robotics
Toolbox of MATLAB.
4. METHODOLOGY
The overall procedure is summarised in a flowchart (not reproduced here): after starting, the
robot collects bottles and checks whether the number of collected bottles is at least four; if
yes, it returns to the unloading position, and if no, it continues collecting. The software
development stages included:
I. Visual development
• Using the inverse kinematics of the Robotics Toolbox, find the angles of every joint
• Send the robot arm angles to the servo motors through the Arduino board
V. Troubleshooting
5. RESULT
5.1 Platform
The platform electronics consist of a battery, an Arduino Uno with a Bluetooth module (HC-05), a
motor driver, two DC motors, and an external power source (block diagram not reproduced here).
In the beginning, the platform was controlled using only Bluetooth. After the improvement of the
vision system, the distance between the bottle and the platform is estimated by MATLAB using the
webcam. The platform then moves towards the bottle automatically so that the robot arm can
collect it. The robot arm has maximum and minimum lengths within which it can move, so this
constrains the required distance between the platform and the bottle.
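The reach constraint described above can be sketched as a simple check: given the estimated bottle distance, decide how far the platform should still drive so the bottle falls inside the arm's workspace. The sketch is in Python, and the reach limits are illustrative placeholders, not the robot's measured values.

```python
def platform_advance(distance_cm, reach_min=12.0, reach_max=30.0):
    """Distance (cm) the platform should still drive so that the bottle
    falls inside the arm's workspace.  reach_min/reach_max are assumed
    illustrative limits.  Positive = drive forward, negative = back up,
    zero = the arm can already reach the bottle."""
    if reach_min <= distance_cm <= reach_max:
        return 0.0
    mid_reach = (reach_min + reach_max) / 2.0  # aim for a comfortable grasp
    return distance_cm - mid_reach
```

For example, a bottle estimated at 50 cm requires the platform to advance 29 cm so the bottle sits at mid-reach, while one at 20 cm needs no movement at all.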
With vision control, MATLAB is added to the same arrangement: the Arduino Uno with the Bluetooth
module (HC-05) and motor driver still drives the two DC motors from the external supply, and
MATLAB supplies the distance estimates (block diagram not reproduced here).
5.2 Object Distance Measurement Using a Single Camera
Image-based distance computation techniques have recently become an area of major research
interest in the fields of robotics and computer vision. Applying such techniques helps tackle
several other challenges, such as object detection, obstacle avoidance, and location finding.
The object distance is defined as the distance of the desired object from the center of the
lens. If the desired object is not located on the optical axis, it is called the oblique object
distance. The image distance is defined as the distance from the focused image to the center of
the lens. The proposed object distance measurement in this thesis is based on finding the
closest point from the object to the bottom-center of the camera's field of view.
5.2.1 The Image Processing Algorithm
The algorithm consists of the following stages:
• Background estimation
• Object tracking
• Feature analysis
• Depth estimating
The algorithm starts with an input video source and separates the background from the object
using a thresholding technique. Then, the desired object is tracked frame by frame. The object's
features are extracted from the image after the tracking process is complete. Finally, using the
analysed features, the object distance is calculated.
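A common single-camera way to turn an extracted feature (here, the object's apparent width in pixels) into a distance is the similar-triangles pinhole model. The sketch below, in Python, illustrates the idea; the focal length and bottle width are assumed example values, not the project's calibration.

```python
def distance_from_width(pixel_width, focal_px=800.0, real_width_cm=6.5):
    """Similar-triangles depth estimate: distance = f * W / w, where f is
    the focal length in pixels (assumed), W the real object width (assumed
    bottle diameter), and w the measured width in pixels."""
    return focal_px * real_width_cm / pixel_width

def calibrate_focal(known_distance_cm, real_width_cm, pixel_width):
    """Recover f from one reference image of the object at a known distance."""
    return pixel_width * known_distance_cm / real_width_cm
```

A single reference photo (object of known width at a known distance) calibrates the focal length once; after that, each frame's measured pixel width yields a distance estimate.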
The idea of segmentation is to simplify or divide an image into meaningful components. There are
many approaches to segmentation concerned with finding features in an image, such as
partitioning the image into homogeneous regions (object or background). When each pixel is
classified into one of the two regions, the resulting image is called a binary image. Pixels
with a gray level greater than a threshold are objects, and pixels with a gray level less than
or equal to the threshold are background. In cases where multiple objects with pixels above the
threshold are present in an image, a unique label is assigned to each connected component.
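The thresholding and connected-component labelling just described can be sketched in a few lines of Python (the project's implementation was in MATLAB; the image below is a tiny illustrative grid, not real camera data).

```python
def threshold(image, t):
    """Binarize a grayscale image (list of rows): 1 = object, 0 = background."""
    return [[1 if p > t else 0 for p in row] for row in image]

def label_components(binary):
    """4-connected component labelling by iterative flood fill.
    Returns the label image and the number of components found."""
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not labels[r][c]:
                next_label += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and binary[y][x] and not labels[y][x]:
                        labels[y][x] = next_label
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return labels, next_label

# Tiny illustrative grayscale image with two bright regions.
img = [[10, 10, 200, 10],
       [10, 10, 200, 10],
       [90, 10, 10, 10],
       [90, 90, 10, 10]]
labels, n = label_components(threshold(img, 80))
```

Here the two bright regions (gray levels 200 and 90, both above the threshold of 80) receive two distinct labels, exactly as the text describes.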
5.3 MATLAB Robotics Toolbox
The Toolbox provides many functions that are useful for the study and simulation of classical
arm-type robotics, for example kinematics, dynamics, and trajectory generation.
The toolbox contains functions and classes to represent orientation and pose in 2D and 3D (SO(2),
SE(2), SO(3), SE(3)) as matrices, quaternions, twists, triple angles, and matrix exponentials. The
Toolbox also provides functions for manipulating and converting between datatypes such as
vectors, homogeneous transformations and unit-quaternions which are necessary to represent 3-
dimensional position and orientation.
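The homogeneous-transformation datatype mentioned above can be illustrated outside the Toolbox as well. The Python sketch below builds an SE(3) transform from a rotation about Z plus a translation and applies it to a point; the particular angle and translation are arbitrary example values.

```python
import math

def se3(theta_z, t):
    """4x4 homogeneous transform: rotation theta_z about Z, then translation t."""
    c, s = math.cos(theta_z), math.sin(theta_z)
    return [[c,  -s,  0.0, t[0]],
            [s,   c,  0.0, t[1]],
            [0.0, 0.0, 1.0, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Apply transform T to a 3-D point p (appends the homogeneous 1)."""
    return [sum(T[i][j] * v for j, v in enumerate(p + [1.0])) for i in range(3)]

# Rotate (1,0,0) by 90 degrees about Z (onto +Y), then shift 1 unit along +X.
T = se3(math.pi / 2, [1.0, 0.0, 0.0])
q = apply(T, [1.0, 0.0, 0.0])
```

The transform's translation column lives in the last column and its rotation in the top-left 3x3 block, which is exactly what the Toolbox's conversion functions pack and unpack.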
The Toolbox uses a very general method of representing the kinematics and dynamics of serial-
link manipulators as MATLAB objects – robot objects can be created by the user for any serial-
link manipulator and a number of examples are provided for well-known robots from Kinova,
Universal Robotics, Rethink as well as classical robots such as the Puma 560 and the Stanford
arm.
• the code is mature and provides a point of comparison for other implementations of the
same algorithms;
• the routines are generally written in a straightforward manner which allows for easy
understanding, perhaps at the expense of computational efficiency. If you feel strongly
about computational efficiency then you can always rewrite the function to be more
efficient, compile the M-file using the MATLAB compiler, or create a MEX version;
• since source code is available there is a benefit for understanding and teaching.
3. Define the DH (Denavit–Hartenberg) table and links using the Link command.
4. Define the robot using the SerialLink command. It arranges all the values of the DH table:
Robot = SerialLink(L, 'name', 'LAB_1');
5. Define the end-effector position, create a translation matrix for the position, and add it to
the robot using the tool command:
T = transl([Pos_X Pos_Y Pos_Z]);
6. Define the base of the robot, create a translation matrix for the position, and add it to the
robot using the base command:
base = [0 0 0];
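What SerialLink encapsulates is a chain of standard DH link transforms. The Python sketch below shows the underlying computation for a 2-link planar arm with 10 cm links; these link lengths are illustrative, not the project's DH table.

```python
import math

def dh(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform as a 4x4 matrix."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [[ct, -st * ca,  st * sa, a * ct],
            [st,  ct * ca, -ct * sa, a * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(thetas, dh_rows):
    """Chain the DH transforms and return the end-effector (x, y, z)."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for theta, (d, a, alpha) in zip(thetas, dh_rows):
        T = matmul(T, dh(theta, d, a, alpha))
    return T[0][3], T[1][3], T[2][3]

# Illustrative 2-link planar arm: d = 0, a = 10 cm, alpha = 0 for both links.
table = [(0.0, 10.0, 0.0), (0.0, 10.0, 0.0)]
x, y, z = forward_kinematics([math.pi / 2, -math.pi / 2], table)
```

With the shoulder at +90 degrees and the elbow at -90 degrees, the second link points back along X, putting the end-effector at (10, 10, 0), which is easy to confirm by hand.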
In forward kinematics, the end-effector position and orientation are determined from a given set
of joint angles, whereas in inverse kinematics the reverse is done. In robotic applications, the
motion of the end-effector is given in Cartesian coordinates. However, the motion of the robotic
arm is specified in terms of the joint angles, because the dynamics of the manipulator are
described in terms of these joint parameters.
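The inverse direction has a closed form for a planar 2-link arm, which makes the idea concrete. The Python sketch below returns the elbow-down solution; the 10 cm link lengths are illustrative, not the robot's dimensions.

```python
import math

def ik_2link(x, y, l1=10.0, l2=10.0):
    """Closed-form inverse kinematics for a planar 2-link arm
    (elbow-down solution).  Link lengths are illustrative assumptions.
    Returns the two joint angles (radians) that place the tip at (x, y)."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

t1, t2 = ik_2link(10.0, 10.0)
```

For the target (10, 10) this gives a straight first link along X (t1 = 0) with a 90-degree elbow, and substituting the angles back into the forward kinematics reproduces the target, which is the standard sanity check for any IK routine.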
The robotic arm design is influenced by many variables, such as the geometry of the manipulator,
the dynamics involved, the structural characteristics of the linkage system (manipulator), and
the actuator characteristics. The robotic arm resembles a human arm. The stationary part of the
robot to which all other parts are attached is called its base. The links are designed as
slender members to reduce weight, which is crucial in reducing power consumption during
operation. The joints are simple revolute joints. The hand of the robot carries the
end-effector, which might be any tool or gripping mechanism. The design of the end-effector is
crucial to the satisfactory performance of the robotic arm, and hence it depends on the shape,
size and weight of the object to be gripped.
The robotic arm joint angles generated by the Robotics Toolbox are sent to the arm through an
Arduino Mega 2560 board. According to these angles, the arm moves to the position. The MATLAB
code that outputs the data to the servo motors through the Arduino is attached in Appendix 2.
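Appendix 2 normalizes each joint angle before writing it, because MATLAB's servo interface expects a position in [0, 1] rather than degrees. The mapping (the same `(Theta + 180)/360` formula used in the appendix) and its inverse can be sketched as:

```python
def angle_to_servo(theta_deg):
    """Map a joint angle in [-180, 180] degrees to the [0, 1] position
    expected by MATLAB's writePosition (the formula used in Appendix 2)."""
    if not -180.0 <= theta_deg <= 180.0:
        raise ValueError("angle outside servo range")
    return (theta_deg + 180.0) / 360.0

def servo_to_angle(pos):
    """Inverse of the mapping above, recovering the joint angle in degrees."""
    return pos * 360.0 - 180.0
```

So a joint angle of 0 degrees corresponds to the servo's mid position of 0.5, and the full signed range maps linearly onto [0, 1].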
Figure 14: Robotic arm, showing the base, shoulder, elbow, and gripper.
5.5 Unloading
The robot has a bucket that keeps the bottles the arm picks up. When the bucket holds four
bottles, the robot moves to the unloading position and unloads them. One servo motor is used to
unload the bottles, controlled by an Arduino Mega 2560 board. The Arduino code is attached in
Appendix 3.
6. DIFFICULTIES
• Quality of the Camera
Due to the available resources, a low-quality webcam was used. This affected the detection of
the red colour as the distance between the robot and the bottle increased, and it also produced
wrong distance estimates between the robot and the bottle.
• Wired Connection to MATLAB
The Arduino cannot store and run the MATLAB code without a connection, so the Arduino has to
stay connected to MATLAB the whole time to run the MATLAB program and take the output through
the Arduino.
• Quality of Motors
The servo motors were of poor quality: after being used several times they had a high
probability of being damaged, and then no longer moved through their accurate range of degrees.
The DC motors did not have identical quality and characteristics, so the rotating movement
varied from motor to motor, which affected the platform's movement.
7. SUGGESTIONS AND FUTURE WORKS
• Improve the vision of the robot to detect all colours of bottles
✓ Range between 1 – 3 cm
✓ Improve the bottle-to-robot distance range using a better-quality camera
• Select a better microcontroller than the Arduino to omit the wired connection and improve the
processing of the robot
• Automate the robot for use around the university to collect waste without an operator.
CONCLUSION
This project aims to develop a robot for collecting plastic bottles in high-risk areas such as
building construction sites and highway roads by incorporating navigation and sensing, and
cooperation between human and robot using wireless communication (Bluetooth HC-05). An Arduino
Uno was used as the brain for processing all commands. So far, we have developed a robot that
can move on flat and inclined flat surfaces via Bluetooth communication. We developed the
control process of this robot using Arduino and MATLAB. However, this robot still needs to be
improved to operate automatically and to be controlled from a greater distance.
REFERENCES
[1] B. Wirsing, “Sending and Receiving Data via Bluetooth with an Android Device,” IEEE
Electron., p. 23, 2014.
[2] R. Piyare and M. Tazil, “Bluetooth based home automation system using cell phone,”
2011 IEEE 15th Int. Symp. Consum. Electron., pp. 192–195, 2011.
[3] P. Alizadeh, “Object Distance Measurement Using a Single Camera for Robotic Applications,”
M.A.Sc. thesis, Natural Resources Engineering, 2015, p. 126.
[4] H. Nanduri and M. Soni, “Vision Controlled Pick and Place of Moving Object by 3R Robot,”
no. 4, pp. 352–358, 2016.
[6] Y. M. Mustafah, R. Noor, H. Hasbi, and A. W. Azma, “Stereo vision images processing
for real-time object distance and size measurements,” 2012 Int. Conf. Comput. Commun.
Eng. ICCCE 2012, no. July, pp. 659–663, 2012.
[8] P. Shah and T. Vyas, “Interfacing of MATLAB with Arduino for Object Detection
Algorithm Implementation using Serial Communication,” Int. J. Adv. Robot. Syst., vol. 3,
no. 10, pp. 1067–1071, 2014.
[10] P. Corke, “4 is harder than 6 : Inverse kinematics for underactuated robots,” Int. J. Adv.
Robot. Syst., no. February, pp. 1–6, 2014.
APPENDIX 1
// Platform control sketch (Arduino Uno).
// The recovered listing was incomplete: the declarations of motorPin1,
// motorPin3, motorPin4, time and distance, and the command characters of
// the branches after '0' were lost.  Everything marked "assumed" below is
// a reconstruction, not the original code.
//int angle = 0;
int motorPin1 = 7;   // assumed pin
int motorPin2 = 8;
int motorPin3 = 12;  // assumed pin
int motorPin4 = 13;  // assumed pin
int LD = 9;          // left-drive PWM
int RD = 10;         // right-drive PWM
int G = 2;           // gripper/indicator pin
int state;
int flag = 0;        // makes sure that the serial only prints the state once
int trigger = 4;     // ultrasonic sensor trigger
int echo = 5;        // ultrasonic sensor echo
long time;
long distance;

void setup() {
  pinMode(motorPin1, OUTPUT);
  pinMode(motorPin2, OUTPUT);
  pinMode(motorPin3, OUTPUT);
  pinMode(motorPin4, OUTPUT);
  pinMode(LD, OUTPUT);
  pinMode(RD, OUTPUT);
  pinMode(G, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  state = Serial.read();
  flag = 0;
  if (state == '0') {            // stop
    digitalWrite(motorPin3, LOW);
    digitalWrite(motorPin4, LOW);
    if (flag == 0) { Serial.println("Motor: off"); flag = 1; }
  }
  else if (state == '1') {       // assumed command character: turn right
    digitalWrite(motorPin3, HIGH);
    digitalWrite(motorPin4, LOW);
    analogWrite(LD, 60);
    analogWrite(RD, 45);
    if (flag == 0) { Serial.println("Motor: right"); flag = 1; }
  }
  else if (state == '2') {       // assumed command character: turn left
    digitalWrite(motorPin3, LOW);
    digitalWrite(motorPin4, HIGH);
    analogWrite(RD, 60);
    analogWrite(LD, 45);
    if (flag == 0) { Serial.println("Motor: left"); flag = 1; }
  }
  else if (state == '3') {       // assumed command character: backward
    digitalWrite(motorPin1, LOW);
    digitalWrite(motorPin2, HIGH);
    digitalWrite(motorPin3, LOW);
    digitalWrite(motorPin4, HIGH);
    analogWrite(LD, 150);
    analogWrite(RD, 150);
    if (flag == 0) { Serial.println("Motor: backward"); flag = 1; }
  }
  else if (state == '4') {       // assumed command character: forward
    digitalWrite(motorPin1, HIGH);
    digitalWrite(motorPin2, LOW);
    digitalWrite(motorPin3, HIGH);
    digitalWrite(motorPin4, LOW);
    analogWrite(LD, 150);
    analogWrite(RD, 150);
    if (flag == 0) { Serial.println("Motor: forward"); flag = 1; }
  }
  else if (state == '5') {       // assumed command character
    digitalWrite(G, LOW);
    if (flag == 0) { Serial.println("H"); flag = 1; }
  }
  else if (state == '6') {       // assumed command character
    digitalWrite(G, HIGH);
    if (flag == 0) { Serial.println("L"); flag = 1; }
  }
  else if (state == '7') {       // assumed: measure distance, then advance
    pinMode(trigger, OUTPUT);
    digitalWrite(trigger, LOW);
    delayMicroseconds(2);
    digitalWrite(trigger, HIGH);
    delayMicroseconds(10);
    digitalWrite(trigger, LOW);
    delayMicroseconds(2);
    pinMode(echo, INPUT);
    time = pulseIn(echo, HIGH);
    distance = time * 34 / 2000;   // echo time (us) to centimetres
    digitalWrite(motorPin1, HIGH); // creep forward
    digitalWrite(motorPin2, LOW);
    digitalWrite(motorPin3, HIGH);
    digitalWrite(motorPin4, LOW);
    analogWrite(LD, 80);
    analogWrite(RD, 80);
    digitalWrite(motorPin1, LOW);  // stop
    digitalWrite(motorPin2, LOW);
    digitalWrite(motorPin3, LOW);
    digitalWrite(motorPin4, LOW);
    Serial.print(distance);
    Serial.print("cm");
    Serial.println();
    if (flag == 0) { Serial.println("Distance"); flag = 1; }
  }
}
APPENDIX 2
function [] = Angles()
Theta=InverseKinematic;
Theta_1=(Theta(1)+180)/360;
Theta_2=(Theta(2)+180)/360;
Theta_3=(Theta(3)+180)/360;
a=arduino('COM3','Mega2560');
s1=servo(a,'D6'); %servo1 for base
s2=servo(a,'D9'); %servo2 for shoulder
s3=servo(a,'D10'); %servo 3 for elbow
s4=servo(a,'D11'); %servo 4 for gripper
writePosition(s1,0.5);
pause(2);
writePosition(s3,0.5);
pause(2);
writePosition(s2,0.5);
pause(2);
writePosition(s4,0.15);
pause(2);
writePosition(s1,Theta_1);
current_pos1 = readPosition(s1);
current_pos1 = current_pos1*180;
fprintf('Current motor position_1 is %d degrees\n', current_pos1);
pause(2);
writePosition(s2,Theta_2);
current_pos2 = readPosition(s2);
current_pos2 = current_pos2*180;
fprintf('Current motor position_2 is %d degrees\n', current_pos2);
pause(2);
writePosition(s3,Theta_3);
current_pos3 = readPosition(s3);
current_pos3 = current_pos3*180;
fprintf('Current motor position_3 is %d degrees\n', current_pos3);
pause(2);
writePosition(s4,0.25);
current_pos4 = readPosition(s4);
current_pos4 = current_pos4*180;
fprintf('Current motor position_4 is %d degrees\n', current_pos4);
pause(2);
end
APPENDIX 3
#include <Servo.h>
// The recovered listing did not declare ledPin, buttonPin, or buttonState,
// and the servo write calls inside the loops were lost; the pin numbers and
// writes below are assumed reconstructions.
Servo servo_test;
int ledPin = 13;      // assumed pin
int buttonPin = 2;    // assumed pin
int buttonState = 0;

void setup() {
  pinMode(ledPin, OUTPUT);
  pinMode(buttonPin, INPUT);
  servo_test.attach(9);
}

void loop() {
  buttonState = digitalRead(buttonPin);
  if (buttonState == HIGH) {
    digitalWrite(ledPin, HIGH);
    for (int angle = 0; angle < 80; angle += 1) {  // tilt the bucket from 0 to 80 degrees
      servo_test.write(angle);
      delay(40);
    }
    delay(2000);
    for (int angle = 80; angle >= 1; angle -= 1) { // return the bucket from 80 to 0 degrees
      servo_test.write(angle);
      delay(50);
    }
  }
}