
2019 IEEE 15th International Conference on Automation Science and Engineering (CASE)

A Time-of-Flight On-Robot Proximity Sensing System to Achieve


Human Detection for Collaborative Robots
Odysseus Alexander Adamides1 , Anmol Saiprasad Modur2 , Shitij Kumar∗,3 , Ferat Sahin4

1 Odysseus Adamides†, Graduate Student, oaa8092@rit.edu
2 Anmol Modur†, Undergraduate Student, asm7148@rit.edu
3 Shitij Kumar†, Ph.D. Candidate, spk4422@rit.edu*
4 Ferat Sahin†, Professor, feseee@rit.edu
† Department of Electrical and Microelectronics Engineering, Rochester Institute of Technology, Rochester, NY 14623, USA
* Corresponding Author

Abstract— The sensor system presented in this article demonstrates the results of designing an exteroceptive sensing device for proximity sensing for collaborative robots. The intention of this design's application is to develop an on-robot, small-footprint proximity sensing device. The design was assembled and put through a number of benchmark tests to validate the performance of the time-of-flight (ToF) sensor system when used in proximity sensing: Single Sensor Characterization, Sensor Overlap Characterization, and Sensor Ranging Under Motion. Through these tests, the ToF sensor ring achieves real-time data throughput while minimizing blind spots.

I. INTRODUCTION

In industry, automated machinery performs a multitude of tasks in proximity to humans, and human safety during these tasks is critical. To maintain human safety, procedures and physical safety measures are put in place. The International Organization for Standardization (ISO) specifies safety measures as physical and electrical safeguards [1]. Physical safeguards consist of cages and barriers that separate humans from machinery. Electrical safeguards include sensors, such as 2-D LIDAR sensors and light curtains, that limit environment changes and human proximity to machinery.

One of the biggest applications of human safety with respect to machinery is human-robot collaboration. In this field, safety relies on the ability of the robot to detect the presence of the human. Researchers in human-robot collaboration have been investigating many different solutions to the human detection problem. Approaches fall into one of the four ISO-defined collaborative operation types: safety-rated monitored stop, hand guiding, speed and separation monitoring, and power and force limiting by inherent design of control [2]. These collaborative operation types are defined in ISO 10218-1, clauses 5.10.2 to 5.10.5. In the safety-rated monitored stop, the operation is halted when the human operator enters the workspace. In hand guiding, the robot moves only on human user input. In speed and separation monitoring, the robot's speed is controlled based on the distance between the human and the robot while both are in the collaborative workspace. Power and force limiting by inherent design acts after a collision occurs: in this last operation mode, the robot detects contact and mitigates the impact forces applied to the human. Proximity detection is especially important in safety-rated monitored stop and in speed and separation monitoring.

Outside of 2-D scanning LIDARs, there has been minimal research on devices that can be mounted on the robot to perform these collaborative tasks. Hence, we designed a small-footprint sensor array that can be mounted on-robot for proximity sensing during collaborative tasks.

In this paper, a time-of-flight (ToF) sensor ring was built for proximity sensing with no blind spots and real-time data throughput. The ring was tested for its object detection performance in static and dynamic settings. The results validated the sensor's potential use in the field of human-robot collaboration.

Prior to discussing our proposed method, we describe the current sensors used for proximity sensing. We then discuss the ring's design, as well as the testing performed on the proximity sensing system: single sensor characterization, sensor overlap characterization, and sensor ranging under motion. We end with recommendations and future applications of this sensor ring design.

II. LITERATURE SURVEY

In order to perform these high-level avoidance tasks, robots and their workspaces must be outfitted with sensor systems. These systems monitor the behavior of the robot and the state of the robot's environment. The ability to characterize these states is crucial for object and collision avoidance.

Object and collision avoidance can be performed with a number of sensor systems. The two major categories these systems fall under are proprioceptive and exteroceptive [3]. Proprioceptive sensors (e.g., motor encoders and gyroscopes) measure data internal to the robot, such as joint speed, torque, and force [3]. The data from these sensors are crucial for performing the power and force limiting and hand guiding collaborative actions. Exteroceptive sensors measure data about the robot's environment, including the distance and speed of the robot with respect to a moving or fixed object. These data are essential for the speed and separation monitoring and safety-rated monitored stop collaborative actions.

From the robot's perspective, exteroceptive sensing can be further categorized as extrinsic or intrinsic. Extrinsic sensing has been performed via 2-D LIDAR sensors, motion capture cameras, routers, and RGB-D cameras such as the Microsoft Kinect [4], [5], [6]. These devices collect data about the environment so the robot can decide how to move. In [5], a Microsoft Kinect was used to perform human motion prediction.

© 2019 IEEE
Fig. 1. (a) A setup with a Universal Robots UR10 and a human sharing the workspace, shown for reference. (b) An 8-node ToF ring mounted on the end-effector of the UR10 robot [13]. (c) The proposed 16-node ToF ring mounted on the end-effector of the UR10 robot.

At the same time, a 2-D LIDAR was mounted to the robot rail setup to detect the presence of a human within the safety radius of robot operation. In [7], Baxter, a robot by Rethink Robotics, was used to perform tasks in a hybrid workspace shared with a human. The human motion was tracked using a motion capture system: cameras mounted around the environment tracked markers placed on the human hand. The markers determined the location of the hand in the workspace and could determine whether a collision was imminent.

In contrast to extrinsic systems, intrinsic systems have sensors mounted directly on the robot. Intrinsic sensors have commonly been ultrasonic sensors, time-of-flight sensors, infrared sensors, capacitive touch sensors, tactile sensors, and on-hand depth cameras [8], [9]. In [10], IR sensors were placed on an ABB IRB140 industrial robot to perform distance monitoring and determine whether a human-robot collision was going to occur. In [11], infrared sensor rings were created and placed on an ABB FRIDA dual-arm robot; the sensor data helped the robot avoid a human in the workspace. In [12], a mobile robot was outfitted with an ultrasonic sensor, a camera, and IR sensors to perform collision avoidance.

Exteroceptive solutions bring a set of challenges to monitoring the robot's workspace. Extrinsic exteroceptive sensors are highly sensitive to changes in their placement and require continuous calibration to ensure their accuracy. Additionally, the large amount of data generated by such a system requires more processing overhead, which decreases system response time and efficiency. Intrinsic exteroceptive sensor solutions bring a wider range of flexibility to the human collision avoidance problem. A drawback of intrinsic systems, due to their specific data output, is their decreased ability to catch important details about the robot's environment. The previous study [13] shows the benefit of using on-robot ToF sensors with collaborative robots. Our proposed ToF sensor solution sheds light on the use of intrinsic exteroceptive sensors, minimizing blind spots, maximizing coverage, and improving on existing designs in flexibility, responsiveness, and data throughput (Fig. 1, Fig. 2).

III. THE SENSOR RING DESIGN

In this study we investigate the benefits of an intrinsic exteroceptive ring design. The ring's competitiveness is derived from the individual sensors' robustness, along with the coverage generated by the spacing of the sensors in the ring. Sensor spacing and positioning on the ring mitigate blind spots within a critical reaction distance of the robot.

This study revealed that the ToF ring sensor demonstrated a larger range of linearity than the Sharp IR sensors used in [10]. The IR sensors in [10] were nonlinear, highly susceptible to environmental noise, and could not detect transparent objects like glass or polycarbonate. The ToF sensors in this study have a programmable field of view (FoV). Each ToF sensor is a 16x16 Single Photon Avalanche Diode (SPAD) matrix with a twenty-seven-degree FoV [14]. During its ranging operation, the ToF sensor collects multiple samples that are weighted by the strength of the returned signal and averaged to output a single distance.

In [13], the initial ring was designed with eight ToF nodes. That setup was made with off-the-shelf parts and fitted specifically to a UR10 robot by Universal Robots. The max ranging distance of an individual sensor was 1.2 m, and the max data rate for one ToF sensor was 30 Hz (Fig. 2(a)). The overall performance with all the sensors tied together was less than this max rate. The limitation could also be attributed to post-processing performed on proximity data at each sensor node.

In the current study, the new ring was designed to increase data rate, decrease size, and adapt to multiple robot link configurations (Fig. 2(b)). The design can be set up with as few as 4 sensors and as many as 32. This flexibility expands the applications of the ring to any robot arm shape or size. Off-the-shelf parts were replaced with custom circuitry in order to control the data management and configure sensor node addresses. Additionally, data processing was moved out of the sensor ring. This design choice mitigated computation interference with the stream of real-time proximity data, yielding a higher, more stable data rate compared to the original ToF ring design.
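As a rough illustration of the signal-weighted averaging described above for the ToF sensor, the idea can be sketched as follows. This is only a sketch of the concept, not STMicroelectronics' firmware; the function name and the sample values are made up:

```python
def weighted_distance_mm(samples):
    """Combine several ToF samples into one distance, weighting each sample
    by the strength of its returned signal so that strong, nearby returns
    dominate weak background returns.

    samples: iterable of (distance_mm, signal_strength) pairs."""
    total_strength = sum(strength for _, strength in samples)
    if total_strength == 0:
        return None  # no usable return signal
    return sum(d * s for d, s in samples) / total_strength

# A strong return off a close target outweighs a weak background return:
print(weighted_distance_mm([(400.0, 9.0), (1200.0, 1.0)]))  # 480.0
```

This weighting also explains the averaging behavior noted later: when two surfaces sit inside one FoV, the reported distance lands between them, biased toward the stronger reflector.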


Fig. 2. (a) The original off-the-shelf 8-node ToF ring design [13]. (b) The new modular ring design in a 16-node implementation. The ring is mounted on top of a stepper motor for the ranging-under-motion testing.

Fig. 3. Sensor ring test setup.

IV. SENSOR RING EVALUATION

The prototype ToF ring was built to encompass a 4-inch-diameter end-effector link. The link size was chosen to replicate the final link of Universal Robots' UR10, a commonly used industrial robot and the robot of choice in [13]. This ring differs in that it uses 16 ToF sensors instead of 8, in order to minimize blind spots and optimize the ring's FoV (see Fig. 2(b)). A stepper motor was mounted to the bottom of the ToF ring to rotate the ring precisely at a constant rpm for the tests of Section IV-C (Fig. 3).

A. Single Sensor Characterization

The ToF sensors are the fundamental building blocks of the fully integrated ToF ring system. Therefore, the first tests evaluated a single sensor's behavior. Validation of a single sensor's effectiveness provided insight into the effectiveness of the overall system. The single sensor characterization consisted of collecting ranging data for targets in several different scenarios. First, a target was placed at a set distance and ranged by a single ToF sensor. This data collection was performed at distances from 20 cm to 150 cm in increments of 10 cm. Next, the procedure was repeated with varying target widths, ranging from 1 cm to 30 cm.

Data samples were collected in sets of 500 points. These collections were averaged and the standard deviation was taken at each distance for a given target width. The results showed that the precision of the ToF sensor decreases as the target moves away from the sensor. For any given distance, however, the target's width had a greater effect on this precision. This can be explained by how much of the sensor's FoV the target occupies.

The occupied FoV has a direct correlation with the precision and accuracy of the sample (as observed in Fig. 4(a), Fig. 4(c)). If multiple objects are detected within the sensor's FoV, the distance output is the average distance among the objects. In the tests with different object sizes, the closer an object was to the sensor relative to the background wall, the higher the accuracy of the estimate of the target's distance. This is because closer objects take up a higher percentage of the FoV and their reflected signals are stronger, causing the estimated distance to fall closer to the target. As objects move farther away, both metrics decrease, lowering the accuracy of the sensor. There is a point at which the object can no longer be differentiated from its background; this is the point of highest precision error. Beyond this point it is uncertain what the sensor reading represents, but its constant nature reflects the background the sensor sees (Fig. 4(b)). These observations show that the ToF sensor on this ring provides very reliable target acquisition out to 800 mm for smaller targets (20 - 100 mm) and out to 1000 mm for larger, 300 mm targets. The 300 mm target size is comparable to the average size of human body parts, including head, waist depth, and shoulder depth [15].

B. Sensor Overlap Characterization

Sensor ranging overlap is crucial to making the ToF ring a robust sensor system for human detection. Although the current 8-sensor ring is operational and greatly reduces human injury risk in the collaborative workspace, blind spots still exist, leaving open vulnerabilities in the system [13]. These blind spots occur due to the spacing of the sensors, as seen in Fig. 5. In order to eliminate blind spots, the ring configuration needs to change. The goal of the new sensor ring was to find the smallest number of sensors that could achieve near-total field-of-view coverage, and to understand how the placement and positioning of the sensors affect this coverage (Fig. 5). As evidenced in the single sensor characterization tests, proper FoV coverage was imperative to single sensor performance. Additionally, changes in FoV affect multi-sensor performance. In multi-sensor applications, performance degradation takes the form of blind spots for a given separation distance between sensors.


Fig. 4. (a) Precision of the sensor in mm when measuring different object widths at different distances. (b) Measured distance vs. actual distance for targets with different widths; error grows as the object becomes smaller and occupies less of the field of view. (c) Percentage accuracy of the sensor when ranging targets of different widths at different distances.
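The per-distance statistics behind plots like Fig. 4 reduce to a few lines. The following is an illustrative sketch (the function name and the toy sample set are ours, not the authors' analysis code):

```python
import statistics

def characterize_samples(samples_mm, true_distance_mm):
    """Summarize one ranging set the way the static tests do: the mean
    reading, the standard deviation (precision), and the percent accuracy
    of the mean against the known target distance."""
    mean = statistics.mean(samples_mm)
    precision = statistics.stdev(samples_mm)
    accuracy_pct = 100.0 * (1.0 - abs(mean - true_distance_mm) / true_distance_mm)
    return mean, precision, accuracy_pct

# Five synthetic readings of a target at 500 mm:
print(characterize_samples([498, 502, 500, 501, 499], 500))
```

In the paper's tests each set holds 500 points per distance and target width; the same three summary values are what Fig. 4 plots.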

Fig. 5. Comparison of the field of view of an 8-node sensor array and a 16-node array. The range is set to 1.2 m, the limit set for the tests. Clearly visible blind spots remain in the 8-node array but not in the 16-node array.

Fig. 6. The lost coverage, i.e. the blind spot, caused by placing ToF sensors on an arc compared to placing them co-linearly (left). Geometric model of two sensors overlapping at distance d on the circumference of a ring with radius r (right).

Starting with the characterization of a single sensor allowed us to draw a baseline approach. Each sensor placed on a cylindrical robot joint covers a field of view able to detect objects. Adding sensors creates feasibility difficulties, including portability, power consumption, and physical limitations, but increases the detectable area in which the sensor system performs. An abundance of sensors introduces overlap in FoV. This is important, as overlap minimizes blind spots but can decrease precision in target detection. The balance between overlap and number of sensors creates the intended robust system. The geometric model shown in Fig. 6 was used to mathematically derive the relation between the radius of the cylindrical link, the FoV of the sensor, and the overlapping distance. The equations are derived below:

    r + d = r cos(θ/2) + tan(γ) · r sin(θ/2)    (1a)

    θ × r = distance between ToF nodes    (1b)

and

    γ = θ/2 + 90° − φ/2    (1c)

where, as shown in Fig. 6, the variables are:
• θ is the angle from the center of the link that determines the spacing between the sensors to ensure minimum blind spots.
• φ is the FoV angle of the sensor.
• d is the overlapping distance from the arc between the two sensors to the point of overlap.
• γ is the angle between the FoV of a sensor and the line segment between the two sensor nodes (the base of the triangle formed by the two sensor nodes and the center of the link).
• b is the distance from the line segment between the two sensors to the point of overlap.

The number of sensors on the ring was determined by the minimum stopping distance of the UR-10 robot [13]. From full speed, the braking distance of the robot is 500 mm, so sensor overlap needed to happen around this critical distance. In Eq. (1a) - Eq. (1c), the desired spacing is calculated to fit 16 sensors around the UR-10's 2 in radius link (r). Each of the 16 sensors takes up 22.5° of the cylindrical link (θ). This information was used to determine the sensor spacing (Eq. (1b)) and finally an overlap distance of 517.45 mm (Eq. (1a)) (also refer to Fig. 6). This overlap distance is within 20 mm of the critical stopping distance. These calculations determined that a 16-node sensor ring satisfies the critical stopping distance requirements for the proximity sensing application [16].


Fig. 7. (a) Ideal sensor placement, with each sensor aligned radially. Left: field of view of the sensors, showing blind spots to a distance of 517 mm; top right: zoom of the sensor alignment; bottom right: zoom of the edge of the FoV. (b) Actual sensor placement, with each sensor offset by 5 mm. Left: field of view of the sensors, showing blind spots to a distance of 517 mm; top right: zoom of the sensor alignment with offset; bottom right: zoom of the edge of the FoV.

Placing the sensors at exactly 22.5 degrees apart ensures minimal blind spots in testing, but in reality matching this exactness is not possible. Small discrepancies in the placement of each sensor around the robot joint may create an unintentional blind spot, as seen in Fig. 7(b). Even a small adjustment increasing the sensor-to-sensor distance by 1 mm causes the overlap distance to vary by 23 mm. Although the average overlap distance will remain constant, this alone will not ensure proper radial coverage.

In order to test our ToF ring, all the previous requirements need to be met or addressed as well as possible. One of the inherent issues with the modular nature of the system is the difficulty of securing each sensor. To do this, the sensor system has a bracing that slides over and under each node, holding it in place. The brace was laser-cut out of acrylic to ensure precision in placing each sensor at the specified angles. It should be noted that it was difficult to place the brace such that each ToF sensor was angled radially without introducing unnecessary cable stress. To resolve this, each sensor was aligned differently, resulting in a 5 mm offset of each sensor. To validate this, a sensor ring model was generated in MATLAB using Eq. (1a) - Eq. (1c) (refer to Fig. 7(a), Fig. 7(b)). The model covered both the ideal placement case and the offset placement case. The offset case did not prove to be a drawback, as the sensor-to-sensor measurement was preserved, keeping the blind spots the same area. The offset skewed the FoV by tilting it 4.8 degrees. In the initial test, there were no significant effects caused by this tilt.

A simple test was devised to validate the overlap of the sensor ring. A 20 cm wide target was placed between two sensors and positioned at interval locations away from the sensors. This particular size was chosen because it is, on average, the width of a human head [15]. Overlap was seen at all tested distances from 300 mm to 600 mm. The target was seen at 300 mm, which is below the calculated overlap distance. It was observed that as the target moved farther away, the ranging deviation increased.

Fig. 8. Distance error measuring a 20 cm object at different distances away from the sensors.

The distance error was calculated by taking the difference between the readings of the two sensors and verifying the overlap condition. The increase in error occurred because, as the target distance increased, the strength of the signal reflected back to the SPAD matrix decreased [14]. These results match the behaviors seen in the single sensor test. As the target distance increased, the distance error increased; however, around the overlap location of 450 mm the error dipped before continuing its increasing trend. A potential cause of this behavior is the overlap region, where both sensors receive an equal amount of ranging data. Since they received a similar return signal, this could explain why the difference between the readings becomes slightly less (Fig. 8).

C. Sensor Ranging Under Motion

The final tests performed on the ToF sensor ring demonstrate the most challenging operation case. Here, the test fixture was used to replicate movement of the ring while attached to a robot arm. The test case represents a robot rotating horizontally at a constant rpm. In this test case, if a human entered the robot workspace, the sensors would register that a target had breached their FoV. To recreate this scenario, the sensor ring test fixture was built with a stepper motor as its base (Fig. 3). The sensor ring was surrounded by a target guard that allowed only one sensor at a time to collect data. This guard made data interpretation much easier by masking off unnecessary data while keeping environmental noise low. Using the guard set all non-target data to less than 100 mm (Fig. 3).
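The overlap check used in the validation test above can be expressed compactly. A small illustrative sketch follows; the function name and the 1.2 m max-range cutoff used to flag a background reading are assumptions on our part:

```python
def overlap_error_mm(reading_a_mm, reading_b_mm, max_range_mm=1200.0):
    """If both adjacent sensors range the target (rather than saturating at
    the background / max range), the overlap condition holds and the distance
    error is the difference between their readings; otherwise the target sits
    in a blind spot of at least one sensor."""
    if reading_a_mm >= max_range_mm or reading_b_mm >= max_range_mm:
        return None  # overlap condition not met
    return abs(reading_a_mm - reading_b_mm)

# Both sensors see a target near 450 mm: a small error confirms overlap.
print(overlap_error_mm(455.0, 462.0))  # 7.0
```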


The motor was spun through a range of velocities for varying target widths. Each target width was tested from 4 revolutions per minute (rpm) up to 60 rpm. This test range was determined by the slowest rate that gave each sensor an opportunity to detect the object in a 15-second test. The fastest rate was chosen to test the sensor system significantly above the fastest rate at which an industrial robot could operate.

Each set of rpm captures, for a set target width, was benchmarked using precision and accuracy evaluation, similar to the approach used during single sensor characterization (Section IV-A). A valid target capture range δ was taken as ±5σ, i.e. 5 times the standard deviation (σ), for a given target width at a mean distance μ. Any points outside this range were considered a failed target acquisition. This acquisition representation was used to determine which amplitudes were valid data points on which to perform signal analysis. The standard deviation was taken across all valid points for each speed set. The average value was taken for each valid pulse in an rpm set, and these valid pulse averages were averaged together.

Fig. 9. 8 cm wide target at 1 m away. Each color peak represents a sensor identifying the target through the target hole.

The raw data generated a periodic square wave across the 16 sensors. The 4 rpm case in Fig. 9 shows a clear visual representation of the system behavior. The height of each pulse is the distance at which the sensor saw the target. In stable operation, the period and amplitude of the square wave remained constant. The sensor system operation was stable from 4 rpm up to 20 rpm. Signal degradation began to appear beyond this speed (Fig. 10), and the sensor readings lose precision. One explanation is that the sensor ranging measurements start to miss the target: in these cases, we noticed that the rotation rate was faster than the sensor's refresh rate (27 ms in short-range continuous mode). This discrepancy resembles the behavior seen in the overlap and static sensor tests.

Fig. 10. 8 cm wide target at 1 m away. Each color peak represents a sensor identifying the target through the target hole. At this speed, sensors began to miss the target.
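The ±5σ validity gate and the per-pulse averaging described above might look like the following sketch. The function names and toy numbers are illustrative, not the authors' processing pipeline:

```python
import statistics

def valid_points(samples_mm, mu_mm, sigma_mm):
    """Keep samples within the valid capture range delta = +/-5*sigma of the
    mean target distance mu; anything outside counts as a failed target
    acquisition."""
    delta = 5 * sigma_mm
    return [s for s in samples_mm if abs(s - mu_mm) <= delta]

def pulse_average(pulses, mu_mm, sigma_mm):
    """Average each valid pulse (one sensor's pass over the target), then
    average the pulse averages together, as in the rpm benchmarks."""
    means = []
    for pulse in pulses:
        kept = valid_points(pulse, mu_mm, sigma_mm)
        if kept:  # skip pulses where the sensor missed the target entirely
            means.append(statistics.mean(kept))
    return statistics.mean(means) if means else None

# Two clean pulses near 1000 mm and one miss (the guard reads < 100 mm):
print(pulse_average([[1002, 998, 1001], [995, 1005], [60]], 1000, 3))
```

The miss in the third pulse falls outside the ±5σ band and is discarded, which mirrors how failed acquisitions were excluded before the signal analysis.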
In Fig. 11 it is observed that for 8 cm targets 1 m away, the average valid target acquisition was within 10 cm of the desired 1 m distance. This resulted in less than 10% distance error for readings from 5 to 60 rpm. All data points between 4 and 20 rpm contained more than 130 samples. This trend visually identified the end of the stable ranging motion region, which resided at 20 rpm. To highlight the significance of this value, the fastest angular velocity at which a UR-10 robot can rotate is 180 degrees per second, equivalent to 30 rpm. Though there were inconsistencies in the 30 rpm test, it did not exceed more than 3 sensor pulse skips (Fig. 10). In order to better quantify this behavior of the sensor ring, a formulation for the radial coverage is defined as:

    ζ = (1 − dropped nodes / total nodes) × 100%    (2)

With 3 dropped pulses out of 16 nodes, this results in 81.25% radial coverage ζ at any given point. Compared to the original 8-sensor ring's 62.5% radial coverage, the new sensor ring's design generated an 18.75% increase in radial workspace coverage at max operating speed near the 1.2 m sensor ranging limit.

An explanation for the decrease in precision at higher speeds can be found in the way the data from the sensors were acquired. In these tests, the system was set to report pseudo-synchronously: the system would not report a capture until all 16 sensors had new, valid data points. This limitation was applied in order to maintain the approach of testing the sensor system in the most challenging application environment. If all nodes were instead set to acquire data asynchronously, each sensor could operate at max speed, allowing each one to take more samples in the given time. This would allow the sensor averaging to acquire the minimum amount of accurate results. It is difficult to capture all of this data from a modular system, and a balanced approach is recommended.

V. CONCLUSION

The current 8-node sensing system generated significant results with regard to human detection in the robot's workspace. That system had been operated in real time on-robot with real hardware, and it performs satisfactorily even with significant blind spots between nodes. Through implementation of the new 16-node ring, the new system has the potential to exhibit exceptional performance in comparison to the already significant results collected from the current 8-node configuration.


Fig. 11. The red line represents the actual distance to the target, the blue markers are the average recorded distance with error bars, and the orange line represents the number of samples identifying the target in 15 seconds at each rpm.

The new configuration eliminates blind spots beyond 0.5 m and drastically increases sub-human-depth target recognition at 1 m. Lastly, though this new sensing system was tested in a 16-node configuration, the system can be expanded up to a 32-node ring. Maintaining the same overlap coverage and sensor distance, the 32-node system could be mounted to a link with a diameter of 232 mm (9.134 in), expanding the possibilities even further.

VI. FUTURE WORK

The next steps will involve ring fixture and robot implementation. A modular band design will be created to house the ToF ring. This band will then be outfitted onto the UR-10 robot link and tested for its effectiveness in proximity detection while the robot is in motion, as shown in our previous and current work [13], [17]. A prototype setup is shown in Fig. 12 to compare sensor density on the robot links against the previous version [13]. The band will also be tested on different arm sizes and shapes, including prismatic and organic joints. These tests will investigate the ring's blind spot mitigation in non-cylindrical scenarios.

Fig. 12. Sensor density comparison between the old ToF ring (bottom ring) and the new ToF ring (top ring).

ACKNOWLEDGMENT

The authors are grateful to the staff of the Multi Agent Bio-Robotics Laboratory (MABL) and the CM Collaborative Robotics Research (CMCR) Lab for their valuable inputs.

REFERENCES

[1] B. Matthias, "ISO/TS 15066 - Collaborative Robots - Present Status," 2015.
[2] U. B. Himmelsbach, T. M. Wendt and M. Lai, "Towards Safe Speed and Separation Monitoring in Human-Robot Collaboration with 3D-Time-of-Flight Cameras," 2018 Second IEEE International Conference on Robotic Computing (IRC), Laguna Hills, CA, 2018, pp. 197-200.
[3] B. Siciliano and O. Khatib, "Perception," in Springer Handbook of Robotics, Berlin: Springer, 2016, pp. 79-84.
[4] M. J. Rosenstrauch and J. Krüger, "Safe human robot collaboration - Operation area segmentation for dynamic adjustable distance monitoring," 2018 4th International Conference on Control, Automation and Robotics (ICCAR), Auckland, 2018, pp. 17-21.
[5] V. V. Unhelkar et al., "Human-Aware Robotic Assistant for Collaborative Assembly: Integrating Human Motion Prediction With Planning in Time," IEEE Robotics and Automation Letters, vol. 3, no. 3, pp. 2394-2401, July 2018.
[6] G. Dumonteil, G. Manfredi, M. Devy, A. Confetti and D. Sidobre, "Reactive planning on a collaborative robot for industrial applications," 2015 12th International Conference on Informatics in Control, Automation and Robotics (ICINCO), Colmar, 2015, pp. 450-457.
[7] B. Sadrfaridpour and Y. Wang, "Collaborative Assembly in Hybrid Manufacturing Cells: An Integrated Framework for Human-Robot Interaction," IEEE Transactions on Automation Science and Engineering, vol. 15, no. 3, pp. 1178-1192, July 2018.
[8] M. Ghandour, H. Liu, N. Stoll and K. Thurow, "A hybrid collision avoidance system for indoor mobile robots based on human-robot interaction," 2016 17th International Conference on Mechatronics - Mechatronika (ME), Prague, 2016, pp. 1-7.
[9] T. Schlegl, T. Kröger, A. Gaschler, O. Khatib and H. Zangl, "Virtual whiskers - Highly responsive robot collision avoidance," 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, 2013, pp. 5373-5379. doi: 10.1109/IROS.2013.6697134
[10] G. Buizza Avanzini, N. M. Ceriani, A. M. Zanchettin, P. Rocco and L. Bascetta, "Safety Control of Industrial Robots Based on a Distributed Distance Sensor," IEEE Transactions on Control Systems Technology, vol. 22, no. 6, pp. 2127-2140, Nov. 2014.
[11] N. M. Ceriani, A. M. Zanchettin, P. Rocco, A. Stolt and A. Robertsson, "Reactive Task Adaptation Based on Hierarchical Constraints Classification for Safe Industrial Robots," IEEE/ASME Transactions on Mechatronics, vol. 20, no. 6, pp. 2935-2949, Dec. 2015.
[12] A. M. Alajlan, M. M. Almasri and K. M. Elleithy, "Multi-sensor based collision avoidance algorithm for mobile robot," 2015 Long Island Systems, Applications and Technology Conference, Farmingdale, NY, 2015, pp. 1-6.
[13] S. Kumar, C. Savur and F. Sahin, "Dynamic Awareness of an Industrial Robotic Arm Using Time-of-Flight Laser-Ranging Sensors," 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 2018, pp. 2850-2857. doi: 10.1109/SMC.2018.00485
[14] STMicroelectronics, "A new generation, long distance ranging Time-of-Flight sensor based on ST's FlightSense technology," VL53L1X Datasheet, Nov. 2018.
[15] D. Wagner, M. Birt, J. P. Duncanson, and J. A. Snyder, Human Factors Design Guide for Acquisition of Commercial-Off-The-Shelf Subsystems, Non-Developmental Items, and Developmental Systems, 15-Jan-1996. [Online]. Available: https://rosap.ntl.bts.gov/view/dot/12782. [Accessed: 02-Mar-2019].
[16] O. A. Adamides, "A Time of Flight on-Robot Proximity Sensing System for Collaborative Robotics," M.S. thesis, Order No. 13882043, Rochester Institute of Technology, Ann Arbor, 2019.
[17] S. Kumar, S. Arora, and F. Sahin, "Speed and separation monitoring using On-Robot Time-of-Flight Laser-ranging Sensor Arrays," 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE), Vancouver, BC, Canada, August 22-26, 2019.


