
Article

Drawing System with Dobot Magician Manipulator Based on Image Processing

Pu-Sheng Tsai 1, Ter-Feng Wu 2,*, Jen-Yang Chen 1 and Fu-Hsing Lee 3

1 Department of Electronic Engineering, Ming Chuan University, Taoyuan 33348, Taiwan; pusheng@mail.mcu.edu.tw (P.-S.T.); jychen@mail.mcu.edu.tw (J.-Y.C.)
2 Department of Electrical Engineering, National Ilan University, Yilan 26047, Taiwan
3 Department of Electronic Engineering, China University of Science and Technology, Taipei 11581, Taiwan; irving.lee71024@gmail.com
* Correspondence: tfwu@niu.edu.tw

Abstract: In this paper, the robot arm Dobot Magician and the Raspberry Pi development platform were used to integrate image processing and robot-arm drawing. For this system, the Python language built into Raspberry Pi was used as the working platform, and the real-time image stream collected by the camera was used to determine the contour pixel coordinates of image objects. We then performed gray-scale processing, image binarization, and edge detection. This paper proposes an edge-point sequential arrangement method, which arranges the edge pixel coordinates of each object in an orderly manner and places them in a set. Orderly arrangement means that the pixels in the set are arranged counterclockwise along the closed curve of the object shape. This arrangement simplifies the complexity of subsequent image processing and the calculation of the drawing path. The number of closed curves represents the number of strokes in the drawing of the manipulator. To reduce the complexity of the drawing, as few closed curves as possible are desirable. To achieve this goal, we propose not only the 8-NN (eight-nearest-neighbor) search but also the 16-NN and 24-NN search methods. Drawing path points are then converted into drawing coordinates for the Dobot Magician through the Raspberry Pi platform. The structural design of the Dobot reduces the complexity of the experiment, and its attitude and positioning can be controlled accurately through the built-in API functions or the underlying communication protocol, which makes it more suitable for drawing applications than other fixed-point manipulators. Experimental results show that the 24-NN search method can effectively reduce the number of closed curves and, with it, the number of strokes drawn by the manipulator.

Keywords: Dobot Magician manipulator; Raspberry Pi; edge-point sequential arrangement; 8-NN search; 16-NN search; 24-NN search

Citation: Tsai, P.-S.; Wu, T.-F.; Chen, J.-Y.; Lee, F.-H. Drawing System with Dobot Magician Manipulator Based on Image Processing. Machines 2021, 9, 302. https://doi.org/10.3390/machines9120302

Academic Editors: Mingcong Deng and Marco Ceccarelli

Received: 1 October 2021; Accepted: 18 November 2021; Published: 23 November 2021

1. Introduction

Artificial-intelligence robots offer promising market potential, particularly service-oriented and entertainment robots. These include the currently popular sweeping robot, the suitcase robot, the baby-monitoring robot, and the housekeeper robot. Compared with functions such as medical diagnosis or automatic driving, learning- and entertainment-oriented robots have a much higher degree of fault tolerance. Intelligent toys are another channel for firms to use the Internet-of-Things to enter the home market. In addition to providing domestic services, early-education robots combine instructional functions with entertaining programs. Popular examples are the Star Wars BB-8 SPHERO remote-control robot [1] and Anki's Cozmo social robot [2], shown in Figures 1 and 2, respectively. Another example is the LEGO BOOST robot [3], which provides a building-block kit for children to write their own programs, shown in Figure 3. Due to their comprehensive and flexible sporty performance, omni-directional mobile platforms are widely used in entertainment-oriented robots. In [4], Xu provides a structural analysis and motion-resolving solution for a platform based on four separate Mecanum wheels, building the corresponding mathematical model in terms of its kinematics and dynamics.

Figure 1. Star Wars BB-8 SPHERO [1].

Figure 2. Anki's Cozmo [2].

Figure 3. LEGO BOOST [5].
The range of applicability of artificial intelligence is diverse and continually expanding. In 1984, the performance robot Wabot-2 [6,7], shown in Figure 4, was published by the Research Office of Ichiro Kato, Waseda University, Japan. It can recognize music scores through its camera lens and play music flexibly with both hands and feet. In 2007, the Department of Mechanical Engineering of Zhongyuan University published the "Robot String Orchestra", shown in Figure 5. This system combines turning the piano, waving the bow, and pressing and kneading the strings, and cooperates with a multi-axis synchronous string-pressing control device to realistically simulate the performance skills of human musicians [8,9].

Figure 4. Performance robot of Waseda University [6].

Figure 5. Robot string orchestra of Zhongyuan University [10].
Technological progress in artificial intelligence provides artists with new techniques, which is expected to ignite a revolution in visual art. British artists Nick and Rob Carter exhibited the "Dark Factory Portraits" in the London Gallery. These portraits were painted by an industrial robot named Kuka, which was developed for car production lines. The Carters' robot, Heidi, was adapted to paint portraits of well-known artists, as shown in Figure 6. The Heidi robot reads digital photos and uses image processing technology to convert two-dimensional (2D) pictures into corresponding code. This enables editing of each robot action, including drawing, selecting brushes, applying brushstrokes, and cleaning brushes. Robots represent tools for artists rather than substitutions for creative thinking and thus open up new possibilities for creative workers.
Manystudies
Many studies haveinvestigated
investigated theuse use of robots
robots forentertainment
entertainment purposes.In In [11],
Many studieshave have investigatedthe the useof of robotsfor for entertainmentpurposes.purposes. In[11], [11],
a robot
aarobot was
was equipped
equipped with
with aa powerful
powerful sensing
sensing ability,
ability, enabling
enabling it to
it todraw
draw a portrait
a portrait onon an
robot was equipped with a powerful sensing ability, enabling it to draw a portrait on an
uncalibrated
an uncalibrated surface
surfaceof arbitrary shape. This achievement was aided by breakthrough de-
uncalibrated surface of of arbitrary
arbitrary shape.
shape. This This achievement
achievement waswasaidedaidedby by breakthrough
breakthrough de-
velopments in in
developments terms
termsof of
mechanical
mechanical arms.arms. Robots
Robots havehave been applied
been appliedto music
to musiccomposition,
composi-
velopments in terms of mechanical arms. Robots have been applied to music composition,
whilewhile
tion, developments
developments related to 3Dtoprinting
related 3D printing and mechanical engraving are opening up pos-
while developments related to 3D printing and and mechanical
mechanical engraving
engraving are opening
are opening up
up pos-
sibilities in the
possibilities in world
the world of sculpture.
of sculpture. Robots
Robots arearealso learning
also learning calligraphy
calligraphy [12], asas
[12], shown
shown in
sibilities in the world of sculpture. Robots are also learning calligraphy [12], as shown in
Figure
in Figure7. 7.
InInthis
thisrule-bound
rule-bound artartform,
form, the
the robot
robot can
can learn
learn to
to write
write as
as fluently
fluently as
as a human.
a human.
Figure 7. In this rule-bound art form, the robot can learn to write as fluently as a human.
While font
While font and pressing
pressing are easy easy to learn,
learn, the changechange of the the shape of of thepenpen hairand and
While font and and pressing are are easyto to learn,the the changeof of theshape
shape ofthe the penhair hair and
judgment of the dryness and wetness of the pen remain difficult
judgment of the dryness and wetness of the pen remain difficult for artificial intelligence. In for artificial intelligence.
judgment of the dryness and wetness of the pen remain difficult for artificial intelligence.
In recent
recent years,years, artists
artists and and researchers
researchers havehave begun begun
applyingapplying
robotsrobots and automatic
and automatic machines ma-
In recent years, artists and researchers have begun applying robots and automatic ma-
chines
to to develop
develop novel forms novelofforms
artistic ofpainting.
artistic painting.
Interesting Interesting
examplesexamples
are givenare by given
the work by the
of
chines to develop novel forms of artistic painting. Interesting examples are given by the
work
the of thee-David,
system system e-David,
capable ofcapable of controlling
controlling a variety of a variety
painting of machines
painting machines
in order toincreate
order
work of the system e-David, capable of controlling a variety of painting machines in order
to create
robotic robotic[13].
artworks artworks
In [14],[13].
theIn [14], the non-photorealistic
non-photorealistic renderingthat
rendering techniques techniques
are applied that
to create robotic artworks [13]. In [14], the non-photorealistic rendering techniques that
Machines 2021, 9, 302 4 of 20
Machines 2021, 9, x FOR PEER REVIEW 4 of 20
Machines 2021, 9, x FOR PEER REVIEW 4 of 20

together withtogether
are applied a painting
withrobot to realize
a painting artworks
robot with
to realize originalwith
artworks styles. Further
original examples
styles. of
Further
are applied together with a painting robot to realize artworks with original styles. Further
artistic robots are the interactive system for painting artworks by regions presented
examples of artistic robots are the interactive system for painting artworks by regions pre- in [15]
examples
and the in of artistic robots are the
collaborative interactive system
shownforin painting artworks by regions pre-
sented [15] and thepainting support
collaborative systemsupport
painting [16]. shown
system in [16].
sented in [15] and the collaborative painting support system shown in [16].

Figure 6. Portrait drawing by industrial robot Heidi [17].

Figure 7. Calligraphy robot [18].

The Dobot Magician [19] is a desktop four-axis manipulator with an accuracy of 0.2 mm, based on an STM32 industrial chip. It uses Python as the working platform and cooperates with the OpenCV image-processing development kit and the Python API (application interface) to control the manipulator. The Dobot Magician has been widely used in academic research [20]. For example, Tsao [21] wrote an Android application for smartphones to drive laser engraving [22]. Images are sent from the smartphone to Raspberry Pi, which calls the OpenCV function library to carry out image processing and find contour points. These are used to plan laser-engraving path points, which are transmitted to the manipulator. In [23,24], Dobot and Raspberry Pi were applied to integrate image processing and robot-arm drawing. The plug-in camera streams the image, cooperates with the third-party OpenCV function library, carries out image processing, finds the contour points, classifies and calculates the drawing path points, and then transmits them to the manipulator for drawing. A similar set-up has been applied to perform electronic acupuncture and massages [25,26]. Acupoint recognition is achieved through image detection, and the manipulator is controlled to move to the corresponding acupoints. In [27], the Dobot Magician and machine vision were applied to the automatic optical detection of defects on production lines. The researchers in [28] integrated an ATmega328P development board with the Dobot for electromechanically integrated production-line automation. The DGAC design was proposed in [29], in which the spirit of gradient-descent training is embedded in a genetic algorithm (GA) to construct a main controller that searches for the optimum control effort under the possible occurrence of uncertainties. Its effectiveness is demonstrated by simulation results, and its advantages are indicated in comparison with other GA control schemes for four-axis manipulators.
In this paper, the drawings of images and various curves are used to verify the feasibility of the Dobot manipulator drawing system. Connected-component labeling (CCL) [30,31] is an algorithmic application of graph theory, in which subsets of connected components are uniquely labeled based on a given heuristic; it is used in computer vision to detect connected regions in binary digital images. Thus, in this paper, the method of CCL is applied to the constructed binary images in order to cut an image into multiple single objects. In addition, to reduce the complexity of robot path planning, the edge-point sequential arrangement method is explored to rearrange the image edge pixels along the drawing path. This method, also investigated in this paper, improves on the original idea of connected-component labeling. A popular expression is the parametric B-spline curve, typically defined by a set of control points and a set of B-spline basis functions, which has several beneficial characteristics: (a) it lies entirely within the convex hull of its control points; (b) it is invariant under a rotation and a translation of the coordinate axes; and (c) it can be locally refined without changing its global shape. The synthesis of the above concepts and B-spline properties leads to a successful methodology for the trajectory-planning application. In this paper, the divided B-spline function proposed by Tsai [32] is used to describe the mathematical expressions of various curves.

2. System Architecture

In this paper, we aimed to realize the drawing function of the manipulator through machine vision. In addition to the mechanism of the four-axis manipulator, the Dobot Magician also includes expansion modules (gripper kit, air pump box, conveyor belt, photoelectric switch, and color recognition sensor), an external communication interface, and a control program. It also has 13 I/O extension interfaces and provides Qt, C#, Python, Java, VB, and other APIs for development. Image vision is carried out using a Raspberry Pi 3 with an external high-resolution USB camera. Raspberry Pi 3 adopts the Broadcom BCM2837 chipset, which is a four-core 64-bit ARMv8 series CPU with a clock frequency of 1.4 GHz. It has a dual-core VideoCore IV 3D drawing core; provides 40 GPIO pins and 4 groups of USB 2.0; and supports 10/100 Ethernet, IEEE 802.11 b/g/n wireless network, the Bluetooth 4.1 network function, and a 15-pin MIPI camera connection. The built-in Python language is used as the working platform. The contour pixel coordinates of the object in the image are obtained using gray-scale processing, image binarization, edge detection, and edge-point sequential arrangement, and are transmitted to the Dobot through UART to depict the edge contour of the object. The overall architecture consists of the following five units, shown in Figure 8: the Dobot Magician (including the writing-and-drawing module), the Raspberry Pi 3 embedded microprocessor, the camera, the user interface, and the image processing algorithm.

Figure 8. System architecture.

2.1. Specifications of Dobot Magician

As shown in Figure 9, the main body of the Dobot comprises the base, base arm, rear arm, forearm, and end effector. The corresponding four joints are defined as J1, J2, J3, and J4. We take J2 as the Cartesian origin (0, 0, 0), and the position of the end tool is defined as (x, y, z). The power supply and warning lights are located above the base. The length of the forearm of the manipulator is 147 mm, the length of the rear arm is 135 mm, and the length of the base arm is 138 mm, as shown in Figure 10.

Figure 9. Image of Dobot.

Figure 10. Diagram of Dobot structure.

The maximum load of the manipulator is 500 g, and the maximum extension distance is 320 mm. The movement angle of the base-arm joint J1 is (−90°~+90°), that of the rear-arm joint J2 is (0°~+85°), that of the forearm joint J3 is (−10°~+90°), and that of the end tool J4 is (−135°~+135°). The working range of the X-Y axes of the manipulator is within the J1 activity angle and the maximum extension distance, while the activity space of the Z axis is within the J2 and J3 activity angles and the maximum extension distance. In addition, under a load of 250 g, the maximum rotation speed of the front and rear arms as well as the base is 320°/s, and the maximum rotation speed of the end tool is also 320°/s.
2.2. Writing-and-Drawing Module

The writing-and-drawing module includes a pen and a pen holder, as shown in Figure 11. For our purposes, we replaced the pen with a brush. Figure 12 shows the specification parameters of the Dobot with a pen-holder end tool.

Figure 11. Writing and drawing kit.

Figure 12. Equipment specification parameters.

2.3. Underlying Communication Protocol

The API provided with the Dobot cannot directly call and control the manipulator on Raspberry Pi. Therefore, we used the Python pySerial module to connect Raspberry Pi with the manipulator and control it through serial communication via the USB interface. The system must follow the communication protocol formulated by Dobot. The protocol frames each data packet with a header, the data length, the transmitted data, and a check code. Table 1 shows the relevant communication protocol.

Table 1. Communication protocol of Dobot Magician.

Header: 0xAA 0xAA (high byte, low byte)
Length: 1 byte
Payload ID: 1 byte
Payload Ctrl (rw): 0x10 or 0x00
Payload Ctrl (isQueued): 0x01 or 0x00
Payload Params: as directed
Checksum: calculated over the payload
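As a concrete illustration of this framing, the short Python sketch below assembles one packet in the Table 1 layout. The two's-complement checksum convention and the helper name build_packet are our assumptions based on the publicly documented Dobot protocol, not code taken from this system:

```python
def build_packet(cmd_id: int, ctrl_rw: int, ctrl_queued: int,
                 params: bytes = b"") -> bytes:
    """Frame one command: header, length, payload (ID + Ctrl + Params), checksum."""
    payload = bytes([cmd_id, ctrl_rw | ctrl_queued]) + params
    # Assumed checksum rule: two's complement of the payload sum, modulo 256.
    checksum = (-sum(payload)) & 0xFF
    return b"\xAA\xAA" + bytes([len(payload)]) + payload + bytes([checksum])
```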
The physical layer is encoded with IEEE-754 32-bit single-precision floating-point numbers. For example, if we want to control the displacement of the manipulator from a certain point to the coordinate point (x = 160, y = 90, z = −20, r = 0), the data encoding process is carried out as shown in Table 2.

Table 2. Example of signal encoding.

Displacement coordinate: x | y | z | r
Decimal system: 160 | 90 | −20 | 0
IEEE-754 (binary): 01000011 00100000 00000000 00000000 | 01000010 10110100 00000000 00000000 | 11000001 10100000 00000000 00000000 | 00000000 00000000 00000000 00000000
16-base system: 0x43 0x20 0x00 0x00 | 0x42 0xb4 0x00 0x00 | 0xc1 0xa0 0x00 0x00 | 0x00 0x00 0x00 0x00
Small-byte align type (little-endian): 0x00 0x00 0x20 0x43 | 0x00 0x00 0xb4 0x42 | 0x00 0x00 0xa0 0xc1 | 0x00 0x00 0x00 0x00

r: the end-tool rotation is used when the steering gear (servo motor) is installed. Although it is not used in this system, the value must still be supplied. Figure 13 shows the conversion results verified by a program written in Python; the resulting data can be used to control the manipulator with the PTP mode command through the write() method of pySerial.

Figure 13. Verification of communication instructions.
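The encoding of Table 2 can be reproduced with Python's struct module and written out with pySerial, as sketched below. The command ID 84 (SetPTPCmd), the leading PTP-mode byte, the Ctrl values, and the serial-port name are assumptions drawn from common Dobot usage rather than from the text; build_packet is the helper sketched in Section 2.3:

```python
import struct

import serial  # pySerial

# Little-endian ("small byte align") IEEE-754 encoding of (x, y, z, r) from Table 2.
pose = struct.pack("<4f", 160.0, 90.0, -20.0, 0.0)
assert pose[:4] == b"\x00\x00\x20\x43"  # x = 160 -> 0x43 0x20 0x00 0x00, byte-reversed

# Assumed ID/Ctrl values for a queued PTP write; consult the Dobot protocol tables.
packet = build_packet(84, ctrl_rw=0x10, ctrl_queued=0x01,
                      params=b"\x01" + pose)  # leading byte: assumed PTP motion mode

with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1) as port:
    port.write(packet)  # send the PTP mode command via pySerial's write() method
```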
2.4. Camera Lens

The image sources are files stored in the system or real-time images captured through the external CCD lens. The external lens has an automatic focusing function, as shown in Figure 8. It uses the USB 2.0 transmission interface and is interconnected with the USB interface of Raspberry Pi 3. Captured images are transmitted to the graphics window interface for image processing.

2.5. Embedded Microprocessor

Raspberry Pi offers powerful functionality, with a computing performance close to that of a small single-board computer, and advantages including small size, low cost, low power consumption, and low noise. It can be connected to a keyboard, a mouse, or a display. This high performance at low cost is achieved by using a 64-bit embedded microprocessor with I/O entity pins that communicate with external sensors through interfaces such as GPIO/I2C/UART. Specifications are as follows: (1) SOC: Broadcom BCM2837 chipset; (2) microprocessor: four-core 64-bit ARMv8 Cortex-A8, 1.4 GHz; (3) display core: dual-core VideoCore IV 3D graphics core; (4) memory: LPDDR2, 1 GB; (5) network functions: 10/100 Ethernet, IEEE 802.11 b/g/n wireless network, Bluetooth 4.1 (supporting general mode and low-power-consumption mode); (6) video output: HDMI (supporting Rev 1.3 and 1.4), composite video terminal (supporting NTSC and PAL), and 3.5 mm audio terminal; (7) USB: four groups of USB 2.0; (8) GPIO connection function: 40-pin 2.54 mm terminal, providing 27 GPIO and +3.3 V, +5 V, GND, and other power terminals; (9) camera connection function: 15-pin MIPI terminal (CSI-2); (10) display connection function: display serial interface terminal (DSI); (11) card reader: micro SD card reader, supporting SDIO; (12) operating system: boot with micro SD card, supporting Linux and Windows 10 IoT; and (13) size: 85 mm × 56 mm × 17 mm. Its appearance is shown in Figure 14.

Figure 14. Components of Raspberry Pi.
3. Machine-Vision Image Processing

3.1. Image Preprocessing

The proposed drawing system mainly uses the Python language and imports the OpenCV computer vision function library to read the image source. It is compiled into a visual window operation interface with the Tkinter graphical-user-interface library to simplify the process of operating and controlling the manipulator. Before the robot begins to draw, the input image must be preprocessed so that the result can be communicated through the underlying communication protocol of the robot.

As shown in Figure 15, the image source can be a pre-existing file in the system or a real-time image captured by the camera. Color images contain the pixel information of three primary colors: red (R), green (G), and blue (B). Color images must first be converted to grayscale to describe color brightness. Color pixels are converted to a gray-level value of 0–255, where 0 represents black and 255 is white. The YIQ conversion is as follows:

$$\begin{bmatrix} Y \\ I \\ Q \end{bmatrix} = \begin{bmatrix} 0.299 & 0.587 & 0.114 \\ 0.596 & -0.275 & -0.321 \\ 0.212 & -0.528 & 0.311 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix} \tag{1}$$
Figure 15. Flow chart of image processing (camera or stored image, gray processing, threshold estimation, binarization, edge detection, and the edge-point sequential arrangement method; in the example shown, the 8-NN search yields 187 sets [Vj] containing 2758 edge pixels [Pij]).
The brightness signal is $Y = 0.299 \times R + 0.587 \times G + 0.114 \times B$. Therefore, Y is the converted grayscale value, and I and Q are color components.
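As a minimal sketch of this step, the Y row of Equation (1) can be applied with OpenCV and NumPy (cv2.cvtColor with COLOR_BGR2GRAY performs an equivalent weighted sum); the file name here is only a placeholder:

```python
import cv2
import numpy as np

img = cv2.imread("source.png")                  # OpenCV loads pixels in B, G, R order
b, g, r = cv2.split(img.astype(np.float64))

# Brightness signal Y of Equation (1); the I and Q rows are not needed for drawing.
gray = np.clip(0.299 * r + 0.587 * g + 0.114 * b, 0, 255).astype(np.uint8)
```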
Image binarization uses trial and error or the Otsu algorithm to obtain the image threshold, which distinguishes whether each pixel belongs to black (0) or white (255). This helps to reduce noise. The grayscale values are represented by pixel points $f(x, y)$ combined with the image threshold $k$, as depicted in Figure 16. A binary operation is then performed to generate a binary image output:

$$f(x,y) = \begin{cases} 0, & \text{if } f(x,y) \le k \\ 255, & \text{otherwise} \end{cases} \tag{2}$$

Figure 16. Image threshold k.

Therefore, $f(x, y) = 0$ indicates that the pixel value is black, and $f(x, y) = 255$ indicates that the pixel value is white.
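Both thresholding routes map directly onto OpenCV's cv2.threshold; a brief sketch, assuming gray is the grayscale image from the previous step:

```python
import cv2

# Fixed threshold k chosen by trial and error, implementing Equation (2):
k = 128
_, binary = cv2.threshold(gray, k, 255, cv2.THRESH_BINARY)

# Alternatively, let the Otsu algorithm estimate k from the gray-level histogram:
k_otsu, binary_otsu = cv2.threshold(gray, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)
```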
Sobel edge detection is used to find the edges of the object in the binary image. Two 3 × 3 horizontal and vertical matrix convolution kernels (shown in Figure 17), or Sobel operators, are established in advance. The edges of the image object are then obtained after convolution of each pixel in the image, as shown in Figure 18. The convolution is as follows:

$$g = \sqrt{G_x^2 + G_y^2} = \sqrt{[(z_3 + 2z_4 + z_5) - (z_1 + 2z_8 + z_7)]^2 + [(z_7 + 2z_6 + z_5) - (z_1 + 2z_2 + z_3)]^2} \tag{3}$$

The horizontal gradient after application of the horizontal edge filter is

$$G_x = (z_3 + 2z_4 + z_5) - (z_1 + 2z_8 + z_7) \tag{4}$$

and the vertical gradient after application of the vertical edge filter is

$$G_y = (z_7 + 2z_6 + z_5) - (z_1 + 2z_2 + z_3) \tag{5}$$

Figure 17. 3 × 3 horizontal and vertical matrix convolution kernels.

Figure 18. Results of Sobel edge detection.
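The kernels of Figure 17 and the magnitude of Equation (3) correspond to the following OpenCV calls; a sketch assuming binary is the binarized image from the previous step:

```python
import cv2
import numpy as np

# Horizontal and vertical 3 x 3 Sobel responses, Equations (4) and (5).
gx = cv2.Sobel(binary, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(binary, cv2.CV_64F, 0, 1, ksize=3)

# Gradient magnitude g of Equation (3); nonzero pixels are the object edges.
g = np.sqrt(gx ** 2 + gy ** 2)
edges = (g > 0).astype(np.uint8) * 255
```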

3.2. Edge-Point Sequential Arrangement


The purpose of the edge-point sequential arrangement is to arrange the pixel coordinates of each object edge point in the image into a set array. This is carried out in order to detect the adjacency and attributes between edge-point pixels, ensuring that the edges of an object are placed in the same set. During image detection, the image is scanned from top to bottom and from left to right. A pixel content of "0" is defined as the background, and the foreground is denoted as "1". The scanning process involves object detection and edge detection; the number of objects and the number of edge coordinates of each object set are determined simultaneously. Here, the $i$th edge-point coordinate contained in the $j$th object can be defined as $P_i^j = (x_i^j, y_i^j)$, where $j = 1, \cdots, m$, indicating that $m$ objects are detected in the image, and $i = 1, \cdots, n_j$, indicating that there are $n_j$ edge points in the $j$th object. Therefore, the number of edge points contained in the image is $\sum_{j=1}^{m} n_j$. In the scanning process, once an object is detected, edge detection begins. For this, we apply a k-nearest-neighbor (k-NN) search and set k at 8, 16, and 24. Not every detected object will feature a complete closed curve, and there may be discontinuous breakpoints. If we apply only 8-NN, a complete curve cannot be obtained in such cases. Therefore, 16-NN and 24-NN expand the search to increase the allowable range of adjacency and matching attributes. As long as there are boundary points at adjacent positions, the algorithm will continue to search until a closed curve appears. Then, the edge-point coordinates are recorded in the set array $V_j$. Figure 19 shows the 8-NN, 16-NN, and 24-NN search structures, respectively. This paper takes the counterclockwise direction as the search direction, and the search order is represented by the numbers 1, 2, 3, and so on.

Figure 19. The 8-NN, 16-NN, and 24-NN search structures.
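To make the three search structures concrete, the sketch below enumerates the relative offsets of the 8-, 16-, and 24-neighbor rings of Figure 19 and probes them in a fixed circular order around a reference edge point. The helper names and the enumeration's starting position are our own illustration, not the implementation used in this paper:

```python
import math

def ring(d):
    """All (dy, dx) offsets on the square ring at Chebyshev distance d,
    sorted into a circular order resembling the numbering of Figure 19."""
    offsets = [(dy, dx) for dy in range(-d, d + 1) for dx in range(-d, d + 1)
               if max(abs(dy), abs(dx)) == d]
    offsets.sort(key=lambda p: math.atan2(p[0], p[1]))
    return offsets

RINGS = {8: ring(1), 16: ring(2), 24: ring(3)}   # 8-NN, 16-NN, and 24-NN structures

def next_edge_point(image, y, x, k=8):
    """Return the first foreground neighbor of (y, x) found in the k-NN ring."""
    h, w = len(image), len(image[0])
    for dy, dx in RINGS[k]:
        ny, nx = y + dy, x + dx
        if 0 <= ny < h and 0 <= nx < w and image[ny][nx] == 1:
            return ny, nx
    return None                                  # possible breakpoint (see Step 4)
```

The rings at Chebyshev distance 1, 2, and 3 contain exactly 8, 16, and 24 pixels, which is where the method names come from.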

This algorithm searches the whole image to obtain the edge coordinate data of all objects in the image. The detailed procedure of the algorithm is described in the following:

Step 1: The operation procedure of the 8-NN search method is illustrated by the closed curve in Figure 20A, where the foreground pixels are denoted as “a”, “b”, “c” and so on. Start object detection from the first point in the image matrix. If an object is detected, proceed to Step 2. If a scanned pixel is background, no processing is performed. In this step, the foreground pixel found by the $j$th such scan is defined as the starting edge-point coordinate of the $j$th object, expressed as $P_1^j$, and is recorded in the edge point set $V_j$. In Figure 20A, this first searched foreground coordinate is marked by a star symbol.
Figure 20. 8-NN search schematic diagram.
Step 2: With edge point $P_i^j$ ($i = 1$) as the benchmark, perform an 8-NN search counterclockwise. The search method is to judge whether there are foreground pixels at the eight points around the starred benchmark, as shown in Figure 20B. Once the foreground pixel “b” is scanned, the value of $i$ is increased by 1, and the scanned edge point coordinates are stored in the set $V_j$.

Step 3: This pixel “b” is now regarded as the reference point of the next 8-NN scan and labeled with the star marker, and the previous pixel “a” is set to background “0”. Repeat Step 2 and continue with the next scan, as shown in Figure 20C.

Step 4: If the 8-NN scan results in no detections, it may indicate a breakpoint; refer to Figure 21A. In this case, the 8-NN search is invalid, since no foreground pixel can be scanned, see Figure 21B. The 16-NN search method must then be used to overcome the problem of breakpoints on closed curves, see Figure 21C. However, the 8-NN search is still used for the next scan, and the 16-NN search is used only when a breakpoint is encountered.
Figure 21. 16-NN search schematic diagram.
Step 5: In another extreme case, two breakpoints appear in the direction of the eight nearest neighbors of the benchmark, as shown in Figure 22A. At this time, not only the 8-NN but also the 16-NN search method is futile and cannot find a foreground pixel, see Figure 22B. If 16-NN returns no results, one should implement 24-NN. It can be seen from Figure 22C that the foreground pixel “e” can be successfully found by using 24-NN, which solves the problem of two-layer breakpoints. Figure 22D illustrates that the next scan still adopts the 8-NN search with pixel “e” as the reference point.

Step 6: Repeat Step 3. If the edge of the object is a closed curve, edge detection is complete when boundary point coordinates $P_i^j$ equal the starting coordinates $P_1^j$; that is, $(x_i^j, y_i^j) = (x_1^j, y_1^j)$. If the edge of the object is not a closed curve, once all 24 of the nearest neighbors are found to be background pixels, scanning is stopped.
Figure 22. 24-NN search schematic diagram.

Step 7: Subtle noise exists in most images; therefore, a threshold value $T_b$ for the number of object edge points is defined. If the number of edge pixels owned by an object is below this threshold, i.e., $n_j < T_b$, then object $j$ is treated as noise; that is, object $j$ and its edge point records are deleted. Repeat Step 1 until all pixels in the image are scanned; that is, until the sequential arrangement of all edge points of all objects is complete. In this manner, we can find the number of objects in the image as well as the sequentially arranged edge-point coordinates of each object.

Let us consider the example presented in Figure 20B. The algorithm searches for edge-point coordinates in eight directions. When an edge point is detected in direction 3, the coordinates of this pixel are stored in the edge point set of the object. This pixel is then regarded as the reference point of the next 8-NN scan, and the initial starred pixel is set as background data. When none of the eight nearest neighbors are foreground pixels, the 16-NN search is implemented. An edge point is then detected in the fifth neighbor, see Figure 21C. These coordinates are then stored in the edge point set of the object, and this pixel is regarded as the reference point of the next scan. The previous benchmark, indicated by the orange edge point in Figure 21D, is then considered background. When the 16-NN scan does not detect any foreground pixels, 24-NN is used to detect edge points. In Figure 22C, an edge point is detected at the seventh pixel. This is stored in the edge point set of the object. Pixel “e” is regarded as the reference point of the next scan, see Figure 22D.
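To make the procedure concrete, the following minimal Python sketch traces Steps 1–7 on a binary edge map stored as a NumPy array. It is our own illustration under the paper's assumptions, not the authors' code; in particular, the square search rings and the counterclockwise visiting order within each ring are only indicative of the structures in Figure 19, and the function name is hypothetical.

import numpy as np

# Counterclockwise neighbor offsets (dy, dx) for the three search rings;
# these square rings stand in for the 8-NN, 16-NN, and 24-NN structures.
RING_8 = [(-1, 0), (-1, -1), (0, -1), (1, -1),
          (1, 0), (1, 1), (0, 1), (-1, 1)]
RING_16 = [(dy, dx) for dy in range(-2, 3) for dx in range(-2, 3)
           if max(abs(dy), abs(dx)) == 2]     # 16 cells, Chebyshev radius 2
RING_24 = [(dy, dx) for dy in range(-3, 4) for dx in range(-3, 4)
           if max(abs(dy), abs(dx)) == 3]     # 24 cells, Chebyshev radius 3

def edge_point_sequences(img, t_b=10):
    """Steps 1-7 on a binary edge map (1 = foreground edge pixel).

    Returns one counterclockwise-ordered coordinate list V_j per object;
    objects with fewer than t_b edge points are discarded as noise."""
    img = img.copy()
    h, w = img.shape
    objects = []
    for y0 in range(h):                            # Step 1: raster scan
        for x0 in range(w):
            if img[y0, x0] != 1:
                continue
            curve = [(x0, y0)]
            y, x = y0, x0
            while True:
                img[y, x] = 0                      # Step 3: consume benchmark
                nxt = None
                for ring in (RING_8, RING_16, RING_24):  # Steps 2, 4, 5
                    for dy, dx in ring:
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and img[ny, nx] == 1:
                            nxt = (ny, nx)
                            break
                    if nxt is not None:
                        break
                # Step 6: a closed curve ends next to its consumed start
                # pixel, so the trace stops when no neighbor remains.
                if nxt is None:
                    break
                y, x = nxt
                curve.append((x, y))
            if len(curve) >= t_b:                  # Step 7: noise threshold
                objects.append(curve)
    return objects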

3.3. Segmented B-Spline Curve

The spline curve has characteristics such as local control (Figure 23), the convex hull property (Figure 24), partition of unity (Figure 25), and differentiability.

The B-spline base function can be defined by the recursive deBoor-Cox equations:

$$B_{i,k}(u) = \frac{u-u_i}{u_{i+k}-u_i}\,B_{i,k-1}(u) + \frac{u_{i+k+1}-u}{u_{i+k+1}-u_{i+1}}\,B_{i+1,k-1}(u), \tag{6}$$

$$B_{i,0}(u) = \begin{cases} 1 & \text{if } u_i \le u \le u_{i+1}\\ 0 & \text{otherwise} \end{cases}, \qquad \text{and define } \frac{0}{0} = 0. \tag{7}$$

The B-polynomial of the spline curve is defined as follows:

$$S(u) = \sum_{i=0}^{n} B_{i,k}(u)\,P_i \tag{8}$$

where the node vector is $U = \{u_0, u_1, \ldots, u_{n+k+1}\}$. Its selection is significantly related to the shape described by the spline curve: even with the same control points, the shape of the curve will differ for different node vectors.

Figure 23. Schematic diagram of local control characteristics.

Figure 24. Schematic diagram of convex hull characteristics, where $k = 2$.

Figure 25. Unit division of B-spline curve.
Based on the above four characteristics, a uniform and open node vector is taken, in which adjacent nodes are equidistant and the two end nodes are repeated. Tsai [32] suggested gradually disassembling the recursive formula into a combination of zero-order base functions. We then solve the corresponding coefficients to deduce Equation (9), which represents a matrix piecewise spline curve and replaces the recursive expression of the traditional deBoor equations [33,34]. For example, for $k = 3$, $s_j(t)$ represents the $j$th segment ($j = 0, 1, \ldots, m$). Each group of control points $P_j$, $P_{j+1}$, $P_{j+2}$, $P_{j+3}$ determines the shape of one segment of the curve, which is constructed by the third-order spline base function; the segment curve is shown in Figure 26. This base function has the smoothing characteristic of cubic differentiability, which is convenient for trajectory design under constraint optimization. The constructed spline curve becomes an input source of the drawing system, while the curve trajectory is fed back to the Dobot to test the drawing function of the manipulator.

$$s_j(t) = \begin{bmatrix} P_j & P_{j+1} & P_{j+2} & P_{j+3} \end{bmatrix} \begin{bmatrix} -\frac{1}{6} & \frac{1}{2} & -\frac{1}{2} & \frac{1}{6} \\ \frac{1}{2} & -1 & 0 & \frac{2}{3} \\ -\frac{1}{2} & \frac{1}{2} & \frac{1}{2} & \frac{1}{6} \\ \frac{1}{6} & 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} t^3 \\ t^2 \\ t^1 \\ t^0 \end{bmatrix} \tag{9}$$

Figure 26. B-spline segment curve.
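The matrix form of Equation (9) is straightforward to evaluate numerically. The following NumPy sketch (our own illustration, not the authors' code; the name bspline_segments is hypothetical) samples every cubic segment of a control polygon:

import numpy as np

# Coefficient matrix of Equation (9) for the uniform cubic B-spline.
M = np.array([[-1/6,  1/2, -1/2, 1/6],
              [ 1/2, -1.0,  0.0, 2/3],
              [-1/2,  1/2,  1/2, 1/6],
              [ 1/6,  0.0,  0.0, 0.0]])

def bspline_segments(ctrl, samples=20):
    """Sample s_j(t), t in [0, 1), for every window of four control
    points P_j..P_{j+3}; ctrl is an (n, 2) array of (x, y) points."""
    t = np.linspace(0.0, 1.0, samples, endpoint=False)
    T = np.vstack([t**3, t**2, t, np.ones_like(t)])   # powers of t
    pts = []
    for j in range(len(ctrl) - 3):
        P = ctrl[j:j + 4].T          # 2 x 4 matrix [P_j ... P_{j+3}]
        pts.append((P @ M @ T).T)    # Equation (9) for one segment
    return np.concatenate(pts)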

4. Experimental Results

To verify the feasibility of the proposed system, we performed functional tests with the proposed algorithms for edge-point sequential arrangement and piecewise spline-curve representation. Figure 27 shows the module kit and the measured environment loaded in the system.

Figure 27. Loading of module kit and measured environment.

4.1. Graphical Window Interface

We imported the Tkinter GUI function library into the Python environment to write the graphical window interface. Image processing includes gray-scale processing, adjusting binarization thresholds, and edge detection; the results of image preprocessing can be seen in the image processing window. The robot drawing function is operated through the robot control keys and the image processing display area. Based on the user interface written with Tkinter, the environment comprises the following five working blocks shown in Figure 28: (1) original image display area, (2) preprocessing display area, (3) function setting area, (4) edge-point sequence arrangement, and (5) robot moving coordinates.
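As an illustration of this preprocessing chain, the sketch below reproduces the gray-scale conversion, Otsu binarization, and Sobel edge detection offered by the interface using OpenCV; the function name and return values are our own choices rather than the authors' code.

import cv2

def preprocess(path):
    """Gray-scale, Otsu binarization, and Sobel edge detection,
    mirroring the options offered in the Figure 28 interface."""
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    # Otsu picks the binarization threshold automatically; the GUI
    # also allows adjusting the threshold manually.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 3x3 Sobel kernels act as the horizontal and vertical edge filters.
    gx = cv2.Sobel(binary, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(binary, cv2.CV_64F, 0, 1, ksize=3)
    edges = cv2.convertScaleAbs(cv2.magnitude(gx, gy))
    return binary, edges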
Figure 28. Graphical window interface.

In addition, as shown in Figure 29, the coordinate system of the image is usually different from the coordinate system defined by the manipulator. It is assumed that for an image of size $C_x \times C_y$ the image coordinate axes are $(gx, gy)$. The drawing range of the manipulator is limited to $(\triangle px, \triangle py) = (125\ \text{mm}, 230\ \text{mm})$, which means its coordinate axes are $(px, py)$. We define the starting point of the manipulator as $X$; the starting x-axis coordinate is then $offsetX$, the starting y-axis coordinate is $offsetY$, and the image coordinates are converted as follows:

$$px = \frac{\triangle px}{C_y}\, gy + 180 + offsetX, \qquad py = \frac{\triangle py}{C_x}\, gx - 115 + offsetY \tag{10}$$

Figure 29. Coordinate conversion between image and manipulator.
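A direct Python transcription of Equation (10) might look as follows (the function and variable names are our own):

DPX, DPY = 125.0, 230.0          # manipulator drawing range in mm

def image_to_robot(gx, gy, cx, cy, offset_x=0.0, offset_y=0.0):
    """Map image pixel (gx, gy) of a cx-by-cy image into the
    manipulator frame according to Equation (10)."""
    px = DPX / cy * gy + 180.0 + offset_x
    py = DPY / cx * gx - 115.0 + offset_y
    return px, py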

4.2. Mechanical Drawing of Edge-Point Sequential Arrangement

After the image source is obtained, the number of closed sections in the image and the number of pixels in each closed section are obtained through image preprocessing and edge-point sequential arrangement. As shown in Figure 30, the image of Pikachu features 187 edge point sets obtained through an 8-NN search; therefore, the robot arm must make 187 strokes to complete the image of Pikachu. For a 24-NN search, only 56 edge point sets are obtained. For all k, 2758 pixels are searched. Once the system determines the closed curves and edge point coordinates, it encodes them into batch data according to the communication protocol formulated by the Dobot Magician. The pySerial module carries out serial communication through the USB interface to control the Dobot Magician. Figure 31 depicts the process of drawing Pikachu.
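Putting the pieces together, a stroke-by-stroke drawing loop can be sketched as follows. For readability this sketch uses the third-party pydobot wrapper instead of hand-encoding the Dobot communication protocol over pySerial as the paper does; the pen heights, port name, and function name are hypothetical.

import pydobot

PEN_UP, PEN_DOWN = 0.0, -45.0          # hypothetical pen heights (mm)

def draw_strokes(strokes, port="/dev/ttyUSB0"):
    """Draw each ordered edge-point set as a single pen stroke;
    strokes holds lists of (px, py) manipulator coordinates
    produced by Equation (10)."""
    device = pydobot.Dobot(port=port)
    for stroke in strokes:
        x, y = stroke[0]
        device.move_to(x, y, PEN_UP, 0, wait=True)     # travel move
        device.move_to(x, y, PEN_DOWN, 0, wait=True)   # pen down
        for x, y in stroke[1:]:
            device.move_to(x, y, PEN_DOWN, 0, wait=True)
        device.move_to(x, y, PEN_UP, 0, wait=True)     # lift between strokes
    device.close()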
Figure 30. Results of drawing Pikachu.

Figure 31. Drawing process of manipulator.

4.3. Drawing of B-Spline Curve Manipulator

According to Equation (9), the following two groups of control point coordinates are input: {(−10, 10), (−5.135, 12.194), (−1.1486, 16.7401), (0, 20), (2.1943, 15.1351), (6.7491, 11.1486), (10, 10), (5.135, 7.8057), (1.1486, 3.2599), (0, 0), (−2.1943, 4.8650), (−6.7401, 8.8514), (−10, 10)} and {(−10, 0), (−5, 5), (0, 0), (5, −5), (10, 0), (5, 5), (0, 0), (−5, −5), (−10, 0)}. Python then generates a group of four-pointed cycloid curves and a group of eight-shaped spline curves. Figures 32–34 show the results of the two groups of curves drawn by the manipulator.
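For instance, the eight-shaped path can be generated by feeding the second control-point group to the bspline_segments sketch given after Equation (9); this is again our own illustration, not the authors' code:

import numpy as np
# bspline_segments is the Equation (9) sketch from Section 3.3.

eight = np.array([(-10, 0), (-5, 5), (0, 0), (5, -5), (10, 0),
                  (5, 5), (0, 0), (-5, -5), (-10, 0)], dtype=float)
path = bspline_segments(eight)   # ordered (x, y) points, then mapped
                                 # to the manipulator by Equation (10)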

Figure 32. Drawing results of (A) quadrupole cycloid and (B) 8-shaped B-spline curve.
Figure 33. Drawing process of quadrupole cycloid manipulator.

Figure 34. Drawing process of 8-shaped curve manipulator.

5. Conclusions and Directions for Future Research

In this paper, the Dobot Magician and the Raspberry Pi development platform are used to integrate image processing and robot-arm drawing. Since the image is scanned from left to right and from top to bottom, the drawing manipulator cannot draw continuously. Our main contribution is an algorithm for edge-point sequential arrangement, which not only searches the closed curves in the drawn image but also arranges the edge pixel coordinates of each closed curve in order and places them in a set. This means that the pixels in the set are arranged in a direction counterclockwise to the closed curve. The number of closed curves is equal to the number of strokes drawn by the manipulator, which means that fewer closed curves reduce the complexity of the task. Thus, this paper proposes applying not only 8-NN but also 16-NN and 24-NN searches. Through the Raspberry Pi Python platform, the drawing path points are converted into drawing coordinates for the robot arm. Experimental results show that 24-NN effectively reduces the number of closed curves and thus the number of strokes drawn by the manipulator. After forming a quadrupole cycloid and an eight-shaped closed spline curve through the divided B-spline curve algorithm, the system establishes a group of array coordinate data for mechanical-drawing simulation, verification, and comparison. The image processing of machine vision is closely related to the drawing effect of the manipulator. The effect of binarization can be adjusted using the threshold, whereas edge detection is achieved by convolving the image with horizontal and vertical edge filters; this effect depends entirely on the 3 × 3 convolution kernel. Directions for future research include improving the Sobel edge detection algorithm and using the proposed system with other end tools, such as laser engraving or 3D printing.

Author Contributions: P.-S.T. put forward the original idea of this paper and was responsible for the planning and implementation of the whole research plan. T.-F.W. was responsible for paper writing and text editing. J.-Y.C. derived the image processing algorithms in the manipulator drawing system and implemented them in Python. F.-H.L. established the hardware architecture and carried out the functional verification of the manipulator drawing system. All authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by Ministry of Science and Technology (MOST) (https://www.
most.gov.tw, (accessed on 20 October 2021)), grant number MOST 110-2221-E-197-025.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Not applicable.
Acknowledgments: The authors would like to thank the Ministry of Science and Technology (MOST)
of the Republic of China, Taiwan, for financially supporting this research under Contract No. MOST
110-2221-E-197-025. (https://www.most.gov.tw, (accessed on 20 October 2021)).
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Akella, P.; O’Reilly, O.M.; Sreenath, K. Controlling the Locomotion of Spherical Robots or why BB-8 Works. J. Mech. Robot. 2019,
11, 024501. [CrossRef]
2. Kennington, C.; Plane, S. Symbol, Conversational, and Societal Grounding with a Toy Robot. arXiv 2017, arXiv:1709.10486.
3. Souza, I.M.L.; Andrade, W.L.; Sampaio, L.M.R.; Araujo, A.L.S.O. A Systematic Review on the use of LEGO Robotics in Education.
In Proceedings of the 2018 IEEE Frontiers in Education Conference (FIE), San Jose, CA, USA, 3–6 October 2018; pp. 1–9.
4. Jia, Y.; Song, X.; Xu, S.S.-D. Modeling and motion analysis of four-Mecanum wheel omni-directional mobile platform. In Pro-
ceedings of the CACS International Conference on Automatic Control (CACS 2013), Nantou, Taiwan, 2–4 December 2013;
pp. 328–333.
5. LEGO®BOOST Videos. Available online: https://www.lego.com/en-us/themes/boost/videos (accessed on 28 October 2021).
6. Kato, I.; Ohteru, S.; Shirai, K.; Matsushima, T.; Narita, S.; Sugano, S.; Kobayashi, T.; Fujisawa, E. The robot musician “wabot-2”
(waseda robot-2). Robotics 1987, 3, 143–155. [CrossRef]
7. Sugano, S.; Kato, I. WABOT-2: Autonomous robot with dexterous finger-arm—Finger-arm coordination control in keyboard
performance. In Proceedings of the IEEE International Conference on Robotics and Automation, Raleigh, NC, USA, 31 March–3
April 1987; pp. 90–97.
8. Ogata, T.; Shimura, A.; Shibuya, K.; Sugano, S. A violin playing algorithm considering the change of phrase impression.
In Proceedings of the 2000 IEEE International Conference on Systems, Man and Cybernetics, Nashville, TN, USA, 8–11 October
2000; Volume 2, pp. 1342–1347.
9. Huang, H.H.; Li, W.H.; Chen, Y.J.; Wen, C.C. Automatic violin player. In Proceedings of the 10th World Congress on Intelligent
Control and Automation (WCICA), Beijing, China, 6–8 July 2012; pp. 3892–3897.
10. Zhongyuan University—Robot Playing String Orchestra. Available online: https://www.facebook.com/cycuViolinRobot
(accessed on 28 October 2021).
11. Jain, S.; Gupta, P.; Kumar, V.; Sharma, K. A force-controlled portrait drawing robot. In Proceedings of the IEEE International
Conference on Industrial Technology (ICIT), Seville, Spain, 17–19 March 2015; pp. 3160–3165.
12. Sun, Y.; Xu, Y. A calligraphy robot—Callibot: Design, analysis and applications. In Proceedings of the IEEE International
Conference on Robotics and Biomimetics (ROBIO), Shenzhen, China, 12–14 December 2013; pp. 85–190.
13. Gülzow, J.M.; Paetzold, P.; Deussen, O. Recent developments regarding painting robots for research in automatic painting,
artificial creativity, and machine learning. Appl. Sci. 2020, 10, 3396. [CrossRef]
14. Scalera, L.; Seriani, S.; Gasparetto, A.; Gallina, P. Non-photorealistic rendering techniques for artistic robotic painting. Robotics
2019, 8, 10. [CrossRef]
15. Otoniel, I.; Claudia, H.A.; Alfredo, C.O.; Arturo, D.P. Interactive system for painting artworks by regions using a robot. Robot.
Auton. Syst. 2019, 121, 103263.

16. Guo, C.; Bai, T.; Lu, Y.; Lin, Y.; Xiong, G.; Wang, X.; Wang, F.Y. Skywork-daVinci: A novel CPSS-based painting support system. In
Proceedings of the IEEE 16th International Conference on Automation Science and Engineering (CASE), Hong Kong, China,
20–21 August 2020; pp. 673–678.
17. RobotDK: Lights out Robot Painting. Available online: https://robodk.com/blog/lights-out-robot-painting/ (accessed on 20
November 2021).
18. Robots Write Calligraphy. What Do You Think? Available online: https://kknews.cc/tech/alxnrrv.html (accessed on 24
August 2021).
19. Shenzhen Yuejiang Technology Co. Ltd. Dobot Magician Specifications/Shipping List/API Description. Available online:
https://www.dobot.cc/ (accessed on 24 August 2021).
20. Hock, O.; Šedo, J. Forward and Inverse Kinematics Using Pseudoinverse and Transposition Method for Robotic Arm DOBOT; IntechOpen:
Vienna, Austria, 2017.
21. Tsao, S.W. Design of Robot Arm Laser Engraving Systems Base on Android App. Master’s Thesis, National Taiwan Ocean
University, Keelung, Taiwan, 2019.
22. Truong, N.V.; Vu, D.L. Remote monitoring and control of industrial process via wireless network and Android platform.
In Proceedings of the IEEE International Conference on Control, Automation and Information Sciences, Ho Chi Minh City,
Vietnam, 26–29 November 2012; pp. 340–343.
23. Deussen, O.; Lindemeier, T.; Pirk, S.; Tautzenberger, M. Feedback-guided Stroke Placement for a Painting Machine. In Proceedings
of the Eighth Annual Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging, Annecy, France, 4–6 June
2012; pp. 25–33.
24. Avanzato, R.L. Development of a MATLAB/ROS Interface to a Low-cost Robot Arm. In Proceedings of the 2020 ASEE Virtual
Annual Conference Content Access, Virtual Online, 22 June 2020.
25. Luo, R.C.; Hsu, C.; Chen, S. Electroencephalogram signal analysis as basis for effective evaluation of robotic therapeutic massage.
In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14
October 2016; pp. 2940–2945.
26. Dhawan, A.; Honrao, V. Implementation of Hand Detection based Techniques for Human Computer Interaction. arXiv 2013,
arXiv:1312.7560.
27. University in Finland Build a Industry 4.0 Assembly Line Using Dobot Magician. Available online: https://www.youtube.com/
watch?v=razSDc79Xi4 (accessed on 24 August 2021).
28. Myint, K.M.; Htun, Z.M.M.; Tun, H.M. Position Control Method for Pick and Place. Int. J. Sci. Res. 2016, 5, 57–61.
29. Kuo, S.; Fang, H.H. Design of GA-Based Control for Electrical Servo Drive. Adv. Mater. Res. 2011, 201, 2375–2378.
30. Samet, H.; Tamminen, M. Efficient Component Labeling of Images of Arbitrary Dimension Represented by Linear Bintrees. IEEE Trans. Pattern Anal. Mach. Intell. 1988, 10, 579–586. [CrossRef]
31. He, L.; Ren, X.; Gao, Q.; Zhao, X.; Yao, B.; Chao, Y. The connected-component labeling problem: A review of state-of-the-art algorithms. Pattern Recognit. 2017, 70, 25–43.
32. Tsai, P.S. Modeling and Control for Wheeled Mobile Robots with Nonholonomic Constraints. Ph.D. Thesis, National Taiwan
University, Taipei, Taiwan, 2006.
33. de Boor, C. A Practical Guide to Splines; Springer: New York, NY, USA, 1978.
34. Shen, B.Z.; Chen, C.S. A Novel Online Time-Optimal Kinodynamic Motion Planning for Autonomous Mobile Robot Using
NURBS. In Proceedings of the International Conference on Advanced Robotics and Intelligent Systems (ARIS 2020), Taipei,
Taiwan, 19–21 August 2020.
