ADAPTABLE SIMULATION
MODEL OF A TACTILE
SENSOR ARRAY
MII - MSR 2014-2015
Internship Report
Edison GERENA
2901302
SUMMARY
1. INTRODUCTION
2. DESCRIPTION OF THE INTERNSHIP
2.1. Abstract
2.2.
2.2.1. Organisation
2.2.2.
2.3.
2.4. Scheduling
3. RELATED WORK
3.1.
3.2.
3.3.
3.4. Robot Simulators
3.4.1. Gazebo simulator
3.4.2.
3.4.3.
4. METHODOLOGY
4.1.
4.2.
4.3.
4.3.1.
5. RESULTS
5.1.
5.2.
6. CONCLUSIONS
6.1.
6.2.
1. INTRODUCTION
Dexterous manipulation is a major research topic in robotics today because of its role in integrating robots into daily life, especially in cooperative tasks where objects are handled and exchanged between humans and robots.
A fundamental challenge in neuroscience is understanding how external information arrives through the different senses and how it is processed. As humans, we principally use vision and touch to interact with the environment and perform dexterous manipulation. We use vision to discern the overall shape and appearance of objects, but rely on touch to tell whether objects are rough, wet, slippery, or warm [1].
There are many studies of visual sensors, and also of their simulation, but the field is not as mature when it comes to tactile sensors.
By touching an object it is possible to measure contact properties such as contact forces, torques,
and contact position. From these, we can estimate object properties such as geometry, stiffness,
and surface condition. This information can then be used to control grasping or manipulation, to
detect slip, and also to create or improve object models [2].
For this reason, sensors that can retrieve tactile information have been developed in order to
equip robot hands with such a sense. Having tactile information about the object that is being
manipulated would certainly increase the dexterity of a robotic hand.
Many types of sensors have been used to obtain tactile information, and recently the tendency is to use pressure sensor arrays. A pressure sensor array is formed by individual pressure sensors arranged in a matrix; it can measure mechanical properties of the contact, typically sensing normal forces and contact positions. Different tactile sensor arrays are available with a variety of properties, working principles and shapes. Rigid sensors range from simple planar sensors to ones shaped to curve around a robot fingertip. Some flexible sensor types are also available, which can, for example, be wrapped around a humanoid robot arm [3].
The performance of the real tactile sensors developed until now is far from human sensing
capabilities. Nevertheless, they have been used in robot manipulation in the last few years for
different purposes in dexterous manipulation, including robot control, collision detection, object
recognition and also in rehabilitation medicine, gait analysis systems and tele-robotic operations
[4]. These various applications of tactile sensors show the importance of their use in robot
manipulation.
In this context and because robot hardware is expensive and hard to maintain, simulation is a
major tool used to support research and development of robotic systems and algorithms, adding
flexibility and reproducibility to the experiments.
In the last decades, robotics simulators have been widely used within robotics and applied in many different studies, including planning, learning and experimentation. The benefit of simulation is recognized and its use is common practice. The simulator most widely used by the community is Gazebo [14], but there are also OpenRave, RobWorkSim and GraspIt, among others.
However, the capabilities of current software packages are limited; in particular, they lack the option to simulate tactile sensors. With an increased focus on grasping using sensors, the need for tools that simulate sensor feedback is evident. Having a tactile sensor model that enables tactile sensing will be of great benefit to the robotics community. Therefore, the aim of this internship is to study the general mechanical and electrical properties of such sensors in order to create a simulated model. This model should be versatile enough to simulate the different pressure sensor arrays on the market.
ABSTRACT
Dexterous manipulation is an important research topic in robotics because of its role in integrating robots into daily life, where objects can be handled and exchanged between humans and robots. Having tactile information about the object being manipulated would certainly increase the dexterity of a robotic hand. Many types of sensors have been used to obtain tactile information, and recently the tendency is to use pressure sensor arrays: sensors formed by individual pressure sensors arranged in a matrix. The information obtained is fundamental for both object manipulation and recognition.
In order to create algorithms that use this kind of sensor, a simulated model is required. The models created until now offer restricted information and versatility.
The aim of this internship is to study the general mechanical and electrical properties of such sensors to create a simulation model. This model should be versatile enough to simulate the different pressure sensor arrays on the market.
2.2.
This internship takes place at the Institut des Systèmes Intelligents et de Robotique (ISIR). The ISIR is a multidisciplinary research laboratory that brings together researchers and academics from different disciplines of Engineering Sciences and Life Sciences.
The ISIR was created on 1 January 2007 by grouping together research teams from the Université Pierre et Marie Curie (UPMC) in the field of Engineering Sciences, with the ambition to create a multidisciplinary research institute focused on "Robotics and the Living".
Currently the ISIR is associated with the Université Pierre et Marie Curie (UPMC)1 and the Centre National de la Recherche Scientifique (CNRS), primarily through l'Institut des Sciences de l'Information et de leurs Interactions2 and secondarily through l'Institut des Sciences de l'Ingénierie et des Systèmes3 and l'Institut des Sciences Biologiques4.
2.2.1. Organisation
ISIR is headed by Philippe Bidaud and comprises four different research teams:
1 www.upmc.fr
2 www.cnrs.fr/ins2i
3 www.cnrs.fr/insis
4 www.cnrs.fr/insb
INTERACTION
The "Interaction" team aims to develop the science and technology that put agents in communication with physical and virtual worlds by optical, mechanical and acoustic means.
The team consists of four groups:
o Mechanical perception and perception of mechanics
o Micro-nano robotics
o Active Multimodal Perception
o Multimodal Integration, Interaction and Social Signal
This internship takes place in the manipulation and redundancy axis, and specifically in the theme of dexterous manipulation within the SyRoCo team.
2.3.
Realistic simulation requires accurate modelling of contacts between bodies and, at a practical level, accurate simulation of touch sensors. In order to develop this, the steps of the internship were:
- Review the technical characteristics of existing sensors in order to determine the parameters to be identified.
- Review the literature on existing simulated models in order to determine their versatility and their fidelity to reality.
- Briefly review the principal characteristics of the ODE physics engine compatible with the Gazebo simulator [6], because we propose acquiring the forces acting on the tactile sensor from this simulation environment, and these contact forces are physics-engine dependent.
- Develop the sensor simulation model as a Gazebo/ROS plugin, written in C++.
2.4. SCHEDULING
1st month: Definition of the system architecture
2nd month: System design
3rd month: Implementation of the simulation model
4th month: Experimentation and characterisation of the Weiss sensor
5th month: Validation
3. RELATED WORK
As humans, we use our vision, touch, taste, smell and hearing as a means to experience and interact with the surrounding environment. By exploiting one or a combination of these senses, humans discover new and unstructured environments.
Human dexterity is an amazing thing: people are able to interact with objects of a variety of shapes and sizes and perform complex tasks with stunning accuracy. This is due in part to the physical structure of our hands (multiple fingers with many degrees of freedom), and in part to our sophisticated control capabilities. In large measure this control capability is founded on tactile and force sensing, especially the ability to sense conditions at the finger-object contact.
For the last three decades, robotics researchers have worked to create an artificial sense of touch to give robots some of the same manipulation capabilities that humans possess. This research has focused on the creation of sensor devices and object recognition algorithms, with particular attention given to array sensors that emulate the working of the skin, but there is still much to be learned from what is known about the human sensory system.
3.1.
The human sense of touch has served as the main source of inspiration for the development of robotic tactile sensing. There is an important distinction between two different components of contact sensing in humans:
Kinaesthetic sensing: refers to the perception of body position and movement, provided by receptors in the muscles and tendons:
o Muscle spindles: respond to changes in muscle length
o Tendon organs: sense muscle tension
Cutaneous sensing: refers to perception of contact information with receptors in the skin. Tactile sensory signals due to contact events are provided by mechanoreceptors that innervate the outer layers of the skin. In summary, four different types of sensors have been identified, each with its own function and sensing range. The different properties of human cutaneous mechanoreceptors are summarized in table 1.
Receptor Type | Field Diameter | Frequency Range | Postulated Sensed Parameter
FA I  | 3-4 mm  | 10-60 Hz  | Skin stretch
SA I  | 3-4 mm  | DC-30 Hz  | Compressive stress (curvature)
FA II | > 20 mm | 50-100 Hz | Vibration
SA II | > 10 mm | DC-15 Hz  | Directional skin stretch

Table 1. Properties of human cutaneous mechanoreceptors.
The density of type I receptors is higher at the fingertips, while type II receptors are more uniformly distributed throughout the fingers and palm of the hand. There are about 17,000 mechanoreceptors in the grasping surfaces of the human hand, and the center-to-center spacing ranges from about 0.7 mm in the fingertip to 2.0 mm in the palm [5]. These two points indicate the high significance of high spatial and temporal resolution in dynamic mechanical interactions, typically during the making, breaking or variation of contact.
In summary, there are different types of receptors in the skin and muscles, and these sensors respond to a variety of stimuli and sense different parameters.
3.2.
In robot hands, as in humans, dexterous manipulation skills require a range of sensors for different parameters: tactile array sensors, force-torque sensors, joint angle sensors, actuator effort sensors, and dynamic sensors that measure vibration, stress changes, etc.
For this work, we will concentrate on tactile sensor array technology, because this is the type of sensor that will be simulated.
The tactile sensor array imitates the distributed sensory arrangement of human skin. These sensors consist of individual pressure-sensitive elements called texels, arranged in an array over the contact surface of the sensor. As an object comes into contact with the sensor, the displacement or pressure at each individual element is measured, which provides knowledge of the local surface shape and/or the pressure distribution across the contact between the sensor and the object. Different transduction technologies have been employed, but the principle is fundamentally the same.
These techniques and their relative advantages and disadvantages are reviewed by Mohsin [4] and summarized in table 2.
Transduction technique | Modulated parameter | Advantages | Disadvantages
Capacitive | Change in capacitance | Excellent sensitivity; good spatial resolution; large dynamic range | Stray capacitance; noise susceptible; complexity of measurement electronics
Piezoresistive | Change in resistance | High spatial resolution; high scanning rate in mesh-structured sensors | Lower repeatability; hysteresis; higher power consumption
Piezoelectric | Strain (stress) polarization | High frequency response; high sensitivity; high dynamic range | Poor spatial resolution; dynamic sensing only
Inductive (LVDT) | Change in magnetic coupling | Linear output; uni-directional measurement; high dynamic range | Moving parts; low spatial resolution; bulky; poor reliability; more suitable for force/torque measurement applications
Optoelectric | Light intensity/spectrum change | | Bulky in size; non-conformable
Strain gauges | Change in resistance | Sensing range; sensitivity; low cost; established product | Calibration; susceptible to temperature changes; susceptible to humidity; non-linearity; hysteresis
Multicomponent sensors | Coupling of multiple intrinsic parameters | Ability to overcome certain limitations via combination of intrinsic parameters | Design complexity; EMI induced errors; discrete assembly; higher assembly costs

Table 2. Transduction techniques and their relative advantages and disadvantages. From [4].
Tactile sensor arrays are typically covered with a soft, elastic material, such as rubber. The main purpose of this covering is to protect the sensors from damage, particularly from shear forces, and it also reduces noise through natural low-pass filtering [24]. The covering also has a major effect on the response of the sensor: it tends to spread the response to a stimulus across the sensor, so that a point load applied to the top of the covering results in a force distributed over an area at the bottom of the covering, which is in contact with the sensor.
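This spreading behaviour can be approximated by convolving the force map with a Gaussian point-spread filter. The sketch below is an illustration of that idea under assumed parameters (the function name, kernel size and sigma are hypothetical, not the report's actual code):

```cpp
#include <cmath>
#include <vector>

// Illustrative sketch: approximate the elastic covering's spreading of a
// point load by a separable Gaussian blur over the texel force map.
std::vector<double> gaussianSpread(const std::vector<double>& force,
                                   int rows, int cols,
                                   int ksize, double sigma) {
    // Build a normalized 1-D Gaussian kernel (separable filter).
    std::vector<double> k(ksize);
    int half = ksize / 2;
    double sum = 0.0;
    for (int i = 0; i < ksize; ++i) {
        double x = i - half;
        k[i] = std::exp(-x * x / (2.0 * sigma * sigma));
        sum += k[i];
    }
    for (double& v : k) v /= sum;

    // Horizontal pass, then vertical pass; out-of-range neighbours are
    // ignored (zero padding), so total force is conserved in the interior.
    std::vector<double> tmp(force.size(), 0.0), out(force.size(), 0.0);
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c)
            for (int i = 0; i < ksize; ++i) {
                int cc = c + i - half;
                if (cc >= 0 && cc < cols)
                    tmp[r * cols + c] += k[i] * force[r * cols + cc];
            }
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c)
            for (int i = 0; i < ksize; ++i) {
                int rr = r + i - half;
                if (rr >= 0 && rr < rows)
                    out[r * cols + c] += k[i] * tmp[rr * cols + c];
            }
    return out;
}
```

A point load on the central texel is thus redistributed over its neighbours while (away from the borders) the total force is preserved.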
Performance considerations in sensor array design include sensitivity, spatial resolution, bandwidth, range, accuracy, hysteresis, linearity and size. These are the parameters that our simulated sensor should be able to model.
Regarding the geometry of the sensor, there are many different rigid sensors, from the simple
planar sensors to ones shaped to curve around a robot fingertip. There are also some flexible
sensor types which can be for example wrapped around a humanoid robot arm [20].
Since the output of this type of sensor consists of a number of force measurements arranged in a grid, it is natural to think of them as tactile images, and this perspective has become the basis of many dexterous manipulation and identification techniques, because tactile images allow numerous techniques from the computer vision and image processing literature to be applied to the haptic domain.
3.3.
In the last decades, simulators have been widely used within robotics and applied in many different studies, including planning, learning and experimentation. However, the capabilities and availability of current software packages are somewhat limited. In particular, the option to simulate tactile sensors is missing from current software packages.
Simulation of tactile sensors for robot grasping is a fairly new field of research. Using the available
simulation environments and physics engines, there are some existing models.
In order to create a model of the sensor dynamics, three different areas are addressed: tactile sensor construction (geometry-based), the contact model, and friction modelling.
Table 3 summarizes the existing tactile sensor array simulators:
Name | Robot simulator | Physics engine | Contact model | Friction model | Ref.
by Tegin (2005) | GraspIt | GraspIt | Impulse method | Coulomb model | [6]
by Pezzementi (2010) | OpenGL | ----- | Geometry-based: models the spread of forces as a linear operator characterized by a Gaussian point spread function | Ignores friction effects | [7]
RobWorkSim (2010) | RobWork | ODE, Bullet, Moby | Physics engine dependent: describes the deformation as a function of the distance from the point force | Physics engine dependent | [8]
OpenGrasp (2012) | OpenRave | ODE, Bullet, PhysX | Penalty method | The LuGre model | [9]
SkinSim (2014) | Gazebo and ROS | ODE | Physics engine dependent: models the robot skin as multi-element spring-mass-damper systems | Physics engine dependent | [10]

Table 3. Existing tactile sensor array simulators.
From a scientific point of view, a simulator for robot grasping should provide primarily a realistic
simulation of dynamic properties.
3.4. ROBOT SIMULATORS
Choosing a robot simulator is not an easy task despite the numerous tools on the market, because no general-purpose simulator dominates the others in terms of performance or application.
Aaron Staranowicz [13] made a good comparison between commercial and open-source robotic
simulation tools. He presents a comprehensive and detailed overview and a comparison between
the most recent and popular robotic software for simulation and interfacing with real robots.
Serena Ivaldi [14] made a survey based on user feedback about the use of dynamics simulation in
the robotics research community. The survey report has been instrumental for choosing Gazebo
as the base for the new simulator for the iCub humanoid robot.
According to their survey, researchers stressed the stability of the simulation (with a median rating of 5/5), and the most important criteria for the adoption of a tool are a realistic simulation (32% of those surveyed), open-source software (24%) and the same code for both real and simulated robots (19%).
The results of the survey also show that Gazebo is the second best known simulator (only 15% declare they have never heard of it) and the most used one (13% use it as their main tool), followed by ODE (10% and 11% respectively).
Since our simulated tactile sensor will be used in the HANDLE project5, and because Gazebo is the predominant and most used simulator, we chose to develop our system in the Gazebo simulator using the ODE physics engine.
3.4.1. GAZEBO simulator
Gazebo6 is continually developed at the Robotics Research Lab of the University of Southern California as part of the Player project7. It is designed to create a 3D dynamic multi-robot environment capable of recreating the complex world that will be encountered by the next generation of robots. All simulated objects have mass, velocity, friction, and numerous other attributes that allow them to behave realistically when pushed, pulled, knocked over, or carried. It relies on external robot control software like ROS and Player.
The robots themselves are dynamic structures composed of rigid bodies connected via joints.
Forces, both angular and linear, can be applied to surfaces and joints to generate locomotion and
interaction with an environment. The world itself is described by landscapes, extruded buildings,
and other user created objects. Almost every aspect of the simulation is controllable, from lighting
conditions to friction coefficients.
Gazebo is completely open source and freely available. As a result, Gazebo has an active base of
contributors who are rapidly evolving the package to meet their ever-changing needs.
Figure 6. The diagram shows the connectivity between the Client, Player, Gazebo and ODE. From [13]
Gazebo offers the ability to accurately and efficiently simulate populations of robots in complex indoor and outdoor environments. Its principal features are:
Dynamics simulation: access to multiple high-performance physics engines.
Sensors: generate sensor data, optionally with noise, from laser range finders, 2D/3D cameras, Kinect-style sensors, contact sensors, force-torque sensors, and more.
Plugins: develop custom plugins for robot, sensor, and environmental control. Plugins provide direct access to Gazebo's API.
Robot models: many robots are provided, including the PR2, Pioneer2 DX, iRobot Create, TurtleBot and Shadow Hand. It is also possible to build your own robot using SDF.
The physics engines are designed to simulate the dynamics and kinematics associated with articulated rigid bodies. These engines include many features such as numerous joint types, collision detection, mass and rotational functions, and many geometries including arbitrary triangle meshes. Gazebo utilizes these features by providing a layer of abstraction situated between the physics engines and the Gazebo models.
3.4.2.
The main task of all physics engines is to solve the forward dynamics problem: finding the motion of the system given the forces acting on it. We will briefly explain the essential factors that determine the overall performance of the ODE physics engine, with emphasis on the factors that play an important role for the contact sensor8:
The integrator
The integrator is responsible for calculating a body's position given the forces acting on it. The performance of the integrator affects the accuracy of the simulation.
The numerical integrator used by ODE is Euler's method:
x(t0 + Δt) = x(t0) + Δt · ẋ(t0)
ODE's current integrator is very stable, but not particularly accurate unless the step size is small.
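The Euler update above can be sketched for a single 1-D body. This is a minimal illustration of the method, not ODE's actual implementation (the struct and function names are invented for the example):

```cpp
#include <cmath>

// Minimal sketch of an explicit Euler step for one 1-D rigid body:
// x(t0 + h) = x(t0) + h * v(t0), then v is advanced from the acceleration.
struct Body1D {
    double x;  // position
    double v;  // velocity
};

void eulerStep(Body1D& b, double force, double mass, double h) {
    b.x += h * b.v;           // position advanced with the current velocity
    b.v += h * force / mass;  // velocity advanced with a = F / m
}
```

With a large step h the trajectory drifts from the analytic solution, which is why ODE's accuracy depends on a small step size.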
Object representation
Objects are represented as rigid bodies with arbitrary mass distribution, which means that the position and velocity change over time, but the mass, the center of mass and the inertia matrix remain constant.
The shape of a rigid body is not a dynamical property; only collision detection takes the detailed shape of the body into account.
In ODE, a joint is a relationship enforced between two bodies so that they can only have certain positions and orientations relative to each other. This relationship is called a constraint (the words joint and constraint are often used interchangeably).
The three most common constraints are prismatic, revolute, and spherical constraints.
Contact determination
Contact points
If two bodies touch, or if a body touches a static feature in its environment, the contact is
represented by one or more "contact points".
Each extra contact point added to the simulation slows it down, so ODE ignores some contact points in the interest of speed: it only keeps the contact points on the edges. Every time ODE detects a contact point, it creates a contact joint.
Contact Joint
The contact joint prevents body 1 and body 2 from inter-penetrating at the contact point. It does
this by only allowing the bodies to have an "outgoing" velocity in the direction of the contact
normal.
Contact joints can simulate friction at the contact by applying special forces in the two friction
directions that are perpendicular to the normal. It can also simulate how bouncy or soft it is, and
various other properties.
The current collision primitives are sphere, box, cylinder, capsule, plane, ray, and triangular mesh.
Friction Approximation
ODE uses the Coulomb friction model, which is simple and effective for modelling friction at contact points. It is a simple relationship between the normal and tangential forces present at a contact point.
The rule is: |fT| ≤ μ|fN|, where fN and fT are the normal and tangential force vectors respectively, and μ is the friction coefficient (typically a number around 1.0).
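The friction-cone rule can be sketched as a simple clamping function. This is an illustration of the Coulomb condition only, not ODE's internal solver code (the function name and behaviour outside the cone are assumptions):

```cpp
#include <cmath>

// Sketch of the Coulomb friction cone at a contact point: the tangential
// force is admissible while |ft| <= mu * |fn|; otherwise it is clamped to
// the cone boundary, which corresponds to sliding.
double clampTangentialForce(double ft, double fn, double mu) {
    double limit = mu * std::fabs(fn);
    if (std::fabs(ft) <= limit) return ft;  // sticking: force unchanged
    return (ft > 0 ? limit : -limit);       // sliding: saturate at mu*|fn|
}
```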
3.4.3.
According to Koenig [12], Gazebo has a number of important limitations. While it is designed as an outdoor simulator, the fidelity of the simulation is limited; for example, physics models of soil, sand, grass, and other pliable surfaces normally found in nature are beyond the scope of the project, and it includes neither deformable objects nor fluid and thermal dynamics. These features are currently lacking in Gazebo due to their complexity, although some may be added as the need arises.
In particular, most of the available tools are still based on physics engines classically designed and used for videogames and computer graphics, where what matters most is qualitative, not quantitative, simulation. Indeed, current algorithms for physical simulation have several shortcomings which limit their use for problems of practical interest to the roboticist. However, for humanoid robotics, Gazebo emerges as the best choice among the open-source projects.
A tactile sensor is designed to measure mechanical properties of the contact, such as contact forces and contact positions. Acquiring the force acting on the tactile sensor from an existing simulation environment can be troublesome.
To simulate a tactile array, it is important to know the area of contact. Due to the softness of the
tactile sensors, the area may depend significantly on the deformation, which appears as a
penetration of the rigid bodies.
It is also required that contacts are distributed rather uniformly over the complete contact area. Unfortunately, ODE uses contact reduction schemes (as mentioned above) to collect the contact points into a minimal set in order to enhance performance. The contact points are strongly dependent on the triangle sizes in the contacting meshes, and ODE only produces contacts on the edge of the contacting areas. The ODE engine also gives noisy force estimates because of its rigid-body representation.
In summary, we need to know the whole contact surface, not only some points, and to simulate the softness of the tactile sensors. We will try to overcome these limitations and propose a reliable tactile simulator in the next section.
4. METHODOLOGY
After having reviewed the characteristics of tactile sensors and their main uses, studied robotics simulators with their limitations and strengths, and analysed the different parameters, we conclude that the simulated tactile sensor must meet these requirements:
i. It should allow the creation of sensors with different shapes and sizes in an easy way.
ii. Its output should reproduce the response of real sensors with sufficient fidelity to validate algorithms applied to it.
iii.
iv. In order to increase its chances of being adopted and used by the scientific community, the simulator should be open source and make use of open standards for file formats and other representations.
4.1.
As we have seen in the review of tactile sensors, different sensors are available with a variety of shapes. The idea was to create a tactile sensor that could be adapted to model any shape.
The first proposed solution is to create the geometry with a triangularized mesh, which can be obtained from a CAD model or a 3D modeller like Blender.
In order to have the contact points in the correct positions, we considered the possibility of adding cones at the position of each texel (since ODE only produces contacts on the edge of the contacting areas); placing a cone at the position of each texel thus ensures the correct positioning of the contacts. This operation can be carried out in any CAD tool.
However, this can be computationally expensive. We therefore decided to use a low-polygon mesh for the collision element and a higher-polygon mesh for the visual. The low-polygon mesh can be obtained with 3D mesh processing software like MeshLab. This approach still has drawbacks:
- a triangle mesh consumes more resources, and using the built-in shapes (box, sphere, cylinder) as the collision element is preferred,
- producing the physical representation becomes a complicated process that involves the use of three different software tools,
- the maximum number of contacts allowed between two entities in Gazebo is 30, which means we could only simulate tactile sensors with 30 texels.
The second proposed solution is inspired by a new approach to the construction of tactile array sensors based on MEMS (MicroElectroMechanical Systems) [23, 24, 25]. The chips include tightly integrated instrumentation amplifiers, analog-to-digital converters, pressure sensors, and control circuitry, which provides excellent signal quality over standard digital bus interfaces. The resulting array electronics provide robust and compliant grasping surfaces for specific hand designs.
We chose to represent each texel as a simple box-shaped collision element and to build the tactile sensor as a structure of several texels. In this way, we use a built-in shape (the box) as the collision element. This allows faster simulation and overcomes Gazebo's maximum-contacts problem. It also enables the system to model any shape in a simple way; the only constraint is to know the position of each texel in the reference frame of the sensor.
Each collision element (each texel) will have at least four contact points.
Figure 10. A. The four contact points of a texel. B. The reference frame of a collision box in Gazebo. C. The tactile sensor as a structure of several texels.
ii.
The output should reproduce the response of real sensors with sufficient fidelity to
validate algorithms applied to it.
The second condition requires an understanding of the characteristic response function of sensors
and a good contact force model.
We plan to use the forces (contact forces and friction forces) given by ODE. To overcome the problems of ODE (rigid bodies and noisy force estimates), we need to apply some signal processing.
This functionality will be added to Gazebo as a plugin: the tactile sensor array plugin extends the contact sensor. All normal forces on each box shape are summed, and the pressure is computed by dividing by the area of the box; the area is computed by multiplying the two largest box dimensions of a contact element. A node is assigned to each tactile array patch to collect this data and publish two topics.
One topic with the contact information including:
simulation time
the sensor ID
sensed force
sensed pressure
sensed output
SensorData Topic:
Header Msg: Int sequence id; Int seconds time stamp; Int nanoseconds time stamp; String frame.
SensorData Msg: Header (simulation time); String (sensor id); TexelData[] (sensor data).
TexelData Msg: Int texel id; Double force; Double pressure; Int sensor out.
Figure 11. ROS topic handling sensor data and data management.
And one topic with a contact image: each n by m tactile sensor patch encodes the force readings into an n by m 8-bit monochrome image. Each pixel of this image can take values ranging from 0 to 255, representing the normalized force sensed by the individual sensors. The contact image message uses the Image msg9 of the ROS sensor_msgs package.
Rviz10 is used to view the tactile image.
Users will be able to regulate:
the range
the saturation
the noise
iii.
iv.
The simulator should be open source and make use of open standards for file formats and
other representations.
The third and fourth points are accomplished directly by the Gazebo simulator architecture and its interaction with ROS. The only thing we need to do is make sure not to overload each simulation step with heavy, complicated calculations.
(Diagram: the Gazebo plugin gives the force and position of each contact point and publishes the sensor Data and sensor Image topics.)
4.2.
For the conception of the system, we divided the work into four principal tasks:
Task 1
In order to verify the proper functioning of the physical model, we created a planar sensor with 3x4 texels and then tested collisions with different objects.
Figure 12. Contact points between the sensor and the different objects.
The main objective of this test is to check that the physical model produces the correct contact points in every situation: when the object is bigger than the sensor, when it has the same size as one texel, when it is smaller than a texel, and when the object has different shapes (cylinder, sphere, box, mesh).
Task 2
The GazeboRosTactileSensor is added to Gazebo as a plugin, which extends the functionality of the
contact sensor.
gazebo::GazeboRosTactileSensor
Figure 13. Inheritance diagram for gazebo::GazeboRosTactileSensor
Initialization:
Using collision boxes, create the simulated sensor texels
Parametrize the collision elements using URDF/SDF
Parametrize the sensor elements using URDF/SDF
Begin:
Read the values from the SDF
Get the collision_Id of each texel
Init the ROS node and advertise the 2 topics
for each time-step do
    for each texel (collision element) do
        get all the contact points, sorted by collision element
        get the contact force of each contact point
        add the forces of the contact points to get the total contact force on
        this texel
        divide the total force by the area of the collision element to
        calculate the pressure
    end
    Apply the signal treatments
    Characterize the tactile sensor output
    Convert the tactile sensor output to an image
    Update the sensorData msg
    Publish the two topics
end
end
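The per-texel accumulation in the inner loop can be sketched as follows (a simplified, stand-alone version; the real plugin works on Gazebo contact messages, and the struct below is an assumption for illustration):

```cpp
#include <map>
#include <string>
#include <vector>

struct ContactPoint {
    std::string collisionId;  // texel (collision element) the point belongs to
    double normalForce;       // contact force at this point, in N
};

// Sum the contact forces per texel, then divide by the texel area to
// obtain the pressure, as in the per-time-step loop above. texelArea is
// assumed identical for every texel here.
std::map<std::string, double> texelPressures(
        const std::vector<ContactPoint>& contacts, double texelArea) {
    std::map<std::string, double> pressure;
    for (const ContactPoint& c : contacts)
        pressure[c.collisionId] += c.normalForce;  // total force per texel
    for (auto& kv : pressure)
        kv.second /= texelArea;                    // pressure = force / area
    return pressure;
}
```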
In line 4 of the listing we can fix the features that the plugin exposes to parametrize the
sensor:
rows: the number of rows that compose the tactile sensor
columns: the number of columns that compose the tactile sensor
rangeMin: defines the minimum force (in N) that the sensor is able to detect
rangeMax: defines the maximum force (in N) that the sensor is able to detect
satMin: the minimum value of the sensor output
satMax: the maximum value of the sensor output
tactile_rate: the frequency (in Hz) at which the sensor publishes data
liniaritySensibility: the slope of the calibration curve
liniarityOfset: the output value of the linear curve when the force equals zero
dataTopicName: the name of the topic publishing the sensor data
imageTopicName: the name of the topic publishing the sensor image
cutFrequency, orderFilter, sigmax, sigmay, ksizex and ksizey will be explained in the next section.
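As an illustration, these parameters could appear in the model's SDF roughly as below. The element names follow the parameter list above, but the exact tag layout and the sample values (a 6x14 sensor with a 12-bit output range) are assumptions, not taken verbatim from the plugin:

```xml
<plugin name="tactile_sensor_plugin" filename="libGazeboRosTactileSensor.so">
  <rows>14</rows>
  <columns>6</columns>
  <rangeMin>0.0</rangeMin>
  <rangeMax>5.0</rangeMax>
  <satMin>0</satMin>
  <satMax>4095</satMax>
  <tactile_rate>271</tactile_rate>
  <liniaritySensibility>2665.7</liniaritySensibility>
  <liniarityOfset>-493.87</liniarityOfset>
  <dataTopicName>tactile/sensor_data</dataTopicName>
  <imageTopicName>tactile/sensor_image</imageTopicName>
</plugin>
```

The slope and offset values match the linear Weiss calibration used later in the report.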
[Figure: sensor output as a function of the force F (N). The output follows out(F) = linearitySensibility · F + linearityOfset over the detectable range [rangeMin, rangeMax], saturating at satMin and satMax.]
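In code, this saturated linear characteristic might be sketched as follows (an assumption-level illustration, not the plugin source):

```cpp
#include <algorithm>

// Saturated linear sensor characteristic: out = sensibility * F + offset,
// with the force clipped to the detectable range [rangeMin, rangeMax]
// and the output clamped to [satMin, satMax].
double sensorOutput(double force,
                    double rangeMin, double rangeMax,
                    double sensibility, double offset,
                    double satMin, double satMax) {
    double f = std::max(rangeMin, std::min(rangeMax, force));
    double out = sensibility * f + offset;
    return std::max(satMin, std::min(satMax, out));
}
```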
Task 3
In order to have a proper signal output we need to overcome the two problems mentioned above:
the noisy force and the non-soft contact collision.
The noisy force comes from the hard collisions; in the real world, tactile sensor arrays are
covered with a soft material, such as rubber, which reduces the noise by natural low-pass filtering.
We propose to solve the noisy force problem in the same way, using digital low-pass filtering,
so we performed some signal analysis in order to choose our filter.
This is a typical force signal given by ODE for one collision element, together with its spectral analysis:
[Plots: contact force (N) vs. simulation steps, and its spectral analysis vs. frequency (Hz).]
Figure 16. Noisy force when an object is in collision with one texel, in a static configuration
[Plots: contact force (N) vs. simulation steps, and its spectral analysis vs. frequency (Hz).]
Figure 17. Noisy force when an object comes into collision with one texel, in a dynamic configuration
We chose to use a Butterworth filter because we need a flat response in the pass band of the
sensor in order to obtain correct tactile images. The frequency response of the Butterworth
approximation is often referred to as maximally flat (no ripples), because the pass band is
designed to have a frequency response as flat as mathematically possible from 0 Hz (DC) up to
the cut-off frequency at -3 dB, with no ripples.
Figure 18. Ideal Frequency Response for a Butterworth Filter. From http://www.electronics-tutorials.ws/filter/filter_8.html
[Plots: contact force (N) vs. simulation steps.]
Filtering the signal also allows us to perform some dynamic characterization of the sensors,
because we can modify the transient response of the system to an input by changing the
parameters of the filter.
We ran the following simulation: an object is placed in contact with a texel, then removed and
put back, which gives the following signal.
[Plot: contact force (N) vs. simulation steps for the contact, release and re-contact sequence.]
We then filtered the signal while varying the order of the filter and the cut-off frequency:
[Plots: signal of the contact force after filtering, for different filter orders and cut-off frequencies (force (N) vs. steps).]
The cutFrequency and orderFilter parameters of the plugin allow the user to choose the order
and the cut-off frequency of the Butterworth filter.
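Such a filter can be sketched as below: a standard second-order digital Butterworth low-pass obtained via the bilinear transform. This is a generic textbook formulation under stated assumptions; the plugin's actual implementation and the supported orders may differ:

```cpp
#include <cmath>

// Second-order digital Butterworth low-pass filter (bilinear transform);
// fc is the cut-off frequency and fs the sampling frequency of the
// simulation, both in Hz.
class ButterworthLowPass {
public:
    ButterworthLowPass(double fc, double fs) {
        const double pi = 3.14159265358979323846;
        const double c = 1.0 / std::tan(pi * fc / fs);
        b0_ = 1.0 / (1.0 + std::sqrt(2.0) * c + c * c);
        b1_ = 2.0 * b0_;
        b2_ = b0_;
        a1_ = 2.0 * (1.0 - c * c) * b0_;
        a2_ = (1.0 - std::sqrt(2.0) * c + c * c) * b0_;
    }
    // Filter one force sample (direct form I difference equation).
    double step(double x) {
        double y = b0_ * x + b1_ * x1_ + b2_ * x2_ - a1_ * y1_ - a2_ * y2_;
        x2_ = x1_; x1_ = x;
        y2_ = y1_; y1_ = y;
        return y;
    }
private:
    double b0_, b1_, b2_, a1_, a2_;
    double x1_ = 0, x2_ = 0, y1_ = 0, y2_ = 0;
};
```

By construction the DC gain is 1, so a constant contact force passes through unchanged once the transient has settled.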
The second problem, the non-soft contact collision, presents a difficulty for the simulation of
tactile sensor arrays, because in real life the soft covering tends to spread applied forces across the
surface of the sensor, potentially exciting adjacent sensing elements even when a point force is
applied.
We model this process as a linear operator that is characterized by a point spread function.
The point spread function (PSF) describes the response of an imaging system to a point source or
point object. A more general term for the PSF is a system's impulse response, the PSF being the
impulse response of a focused optical system.
The image of a complex object can then be seen as a convolution of the true object and the PSF.
Figure 26. Different imaging model for tactile sensing. From [7]
Pezzementi [7] has shown that the response of a tactile sensor system (the material covering
it) can be modeled by a PSF using a Gaussian function, and that nearly identical outputs can
be obtained from a finite element model (FEM) or a point spread function model.
We model the process using two 2D Gaussian functions, one for the spread along the X axis
and one for the spread along the Y axis. This allows our simulator to simulate any impulse
response.
The sigmax, sigmay, ksizex and ksizey parameters of the plugin allow the user to fix the
impulse response of our sensor.
Once a particular sensor has been characterized, any number of identical sensors may be
replicated in simulation by simply copying the characterization parameters. This reduces problems
due to lack of material, and allows, for example, covering a hand completely with sensors.
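The spreading stage can be sketched as below, assuming a normalized 1-D Gaussian kernel applied separably (only the X pass is shown; the Y pass is identical on columns). The function names are illustrative:

```cpp
#include <cmath>
#include <vector>

// Build a normalized 1-D Gaussian kernel of odd size ksize with
// standard deviation sigma (in texels).
std::vector<double> gaussianKernel(int ksize, double sigma) {
    std::vector<double> k(ksize);
    const int half = ksize / 2;
    double sum = 0.0;
    for (int i = 0; i < ksize; ++i) {
        const double d = i - half;
        k[i] = std::exp(-d * d / (2.0 * sigma * sigma));
        sum += k[i];
    }
    for (double& v : k) v /= sum;  // normalize so total force is preserved
    return k;
}

// Convolve one row of texel forces with the kernel (zero padding at the
// borders); applying this along X and then along Y gives the 2-D spread.
std::vector<double> spreadRow(const std::vector<double>& row,
                              const std::vector<double>& kernel) {
    const int half = static_cast<int>(kernel.size()) / 2;
    std::vector<double> out(row.size(), 0.0);
    for (int i = 0; i < static_cast<int>(row.size()); ++i)
        for (int j = 0; j < static_cast<int>(kernel.size()); ++j) {
            const int src = i + j - half;
            if (src >= 0 && src < static_cast<int>(row.size()))
                out[i] += kernel[j] * row[src];
        }
    return out;
}
```

A point force on one texel is thus redistributed symmetrically to its neighbours, imitating the soft covering.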
4.3.
To experimentally characterize the performance of the proposed simulated tactile array we need
to characterize the Weiss WTS 0614-34 sensor; to do this, we need the technical specification and
the calibration of the sensor.
The Weiss sensor has 6 columns and 14 rows, a spatial resolution of 3.4 mm, a
sampling rate of 271 frames/s and a weight of 10 g. All of these characteristics can easily be
parametrized in our simulated sensor. From the mechanical drawing we constructed our collision model,
and we use a 3D model for the visual.
Figure 29. Collision, visual and complete model of the simulated Weiss sensor
4.3.1.
ROS node to synchronize the Weiss sensor and the Ati Nano17 data [D]
[Plots: Weiss output vs. force (N) for several texels (15, 63, 38, 40, 39) and trials; each panel shows the data, the fit and the confidence bounds.]
Figure 32. Fitting data for all the trials and texels
In Figure 32 we see that the response of the Weiss sensor can be characterized by a
polynomial of the 5th degree. However, for our simulator we used a linear equation, which is a
correct approximation. This is the linear response of the Weiss sensor that will be used in our
validation tests.
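As a quick numerical check of this linear model (the coefficients below are the fitted values shown in Figure 33, and the function name is illustrative):

```cpp
// Linear Weiss response used in the simulator: out = 2665.7 * F - 493.87.
// Its zero crossing is at F = 493.87 / 2665.7, about 0.185 N, matching
// the intercept marked in Figure 33.
double weissOutput(double forceN) {
    return 2665.7 * forceN - 493.87;
}
```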
[Plot: Weiss output vs. force (N), linear fit out = 2665.7 · F - 493.87, crossing zero at F = 0.1853 N.]
Figure 33. Linear response of the Weiss sensor used for simulation, and the parametrized plugin SDF
5. RESULTS
5.1. STATIC EXPERIMENTATION
In the static experiments, we compared the response of the real Weiss sensor with the response
of our simulated Weiss sensor.
Figure 34. The shaped indenters and the weight (2x100g and 500g)
For this purpose we made different shaped indenters on a 3D printer (see Figure 34); these
shapes were applied to the real sensor and the simulated sensor in the same configuration.
Each shape was applied with a 100 g load and a 500 g load, and was rotated by 0°, 45° and 90°.
The results of this comparison are summarized in Figure 36.
[Images: real and simulated tactile images side by side, with the comparison between them for each case.]
Figure 35. Comparison between real and simulated tactile images. A: a hexagonal nut with 500 g load. B: two boxes with 100 g load each
[Bar chart: difference (%) between real and simulated tactile images for each indenter, with 100 g and 500 g loads.]
Figure 36. Summarized comparison between real and simulated tactile images
To compare the tactile images we take into account not only the number of differing texels but
also the percentage difference between them. In this way the biggest differences are penalized
more than the smaller ones. The indenter shapes with a load of 500 g show a higher percentage
difference than the ones with 100 g.
The median difference between a real tactile image and a simulated tactile image is 4.66%;
these results are satisfactory and allow us to validate our sensor model in the static configuration.
The simulated and real tactile images for the whole set of configurations are in the annexes.
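One plausible sketch of such a comparison metric is given below. The exact weighting used in the report may differ; this version counts the changed texels and averages the per-texel absolute difference as a percentage of the 8-bit range, so larger differences weigh more:

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

struct ImageDiff {
    int pixelsChanged;    // number of texels whose value differs
    double percentDiff;   // mean absolute difference, % of the 0-255 range
};

// Compare a real and a simulated tactile image of equal size.
ImageDiff compareTactileImages(const std::vector<uint8_t>& real,
                               const std::vector<uint8_t>& simulated) {
    ImageDiff d{0, 0.0};
    for (std::size_t i = 0; i < real.size(); ++i) {
        const int diff = std::abs(static_cast<int>(real[i]) -
                                  static_cast<int>(simulated[i]));
        if (diff > 0) ++d.pixelsChanged;
        d.percentDiff += diff / 255.0;  // big differences are penalized more
    }
    d.percentDiff = 100.0 * d.percentDiff / real.size();
    return d;
}
```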
5.2. DYNAMIC EXPERIMENTATION
The simulated sensor is used to perform typical tasks related to tactile sensors, and the results
are compared with those of the real sensors existing in the laboratory in order to establish the
performance of the simulated model.
We have done some experiments with a Shadow hand and the adaptive grasp algorithms; for these
we used the Weiss sensor in the palm, and we parametrized new sensor models for the
middle and proximal phalanges of the Shadow Hand fingers.
Figure 37. Real Shadow Robot Dextrous Hand with sensors and the simulated hand
In order to have correct results we need to run the same experiments with the real hand, but
for the moment we are still waiting for the new sensors to arrive. In the future we also need to
run some force control experiments with the real and simulated software.
Figure 38. Simulated hand with 4x1 sensor in the middle phalanges, 4x2 tactile sensors in the proximal phalanges and 6x14
sensor in the palm, making a grasping operation.
6. CONCLUSIONS
6.1.
In this report we described a new model for tactile sensor array simulation; we presented a case
study demonstrating the capability of the system and validated the model through experiments.
However, in order to fully validate the tactile model we still need to perform different experiments
covering the most common use-cases of tactile sensors in robotic grasping and manipulation tasks.
This model of the tactile sensor array could be useful in the development of any device involving
human-robot interaction, such as medical devices, rehabilitation robots or virtual reality.
It will be of great benefit to the robotics community, because it allows investigating new
algorithms without the need to implement them on physical hardware, or serves as a step in
determining performance requirements for that hardware.
Also, this tactile sensor model can enable researchers to do experiments that should be
theoretically possible but, due to the current limitations of existing hardware, are still difficult,
in this way overcoming manufacturing and economic limitations.
Therefore, arbitrarily accurate tactile sensors could be created in simulation, allowing the
investigation of the effects of spatial and force resolution on tactile image processing, tactile
object recognition, and tactile manipulation algorithms. Moreover, this system could be used to
perform regression testing to determine optimal sensor design and placement criteria.
In the short term, the work done during this internship will be used by the SIROCO
team to facilitate research in object recognition and tactile control of a dexterous hand;
in the mid-term, the results may be published and made available to the research community,
so that the efficiency and robustness of the developed sensor model can be tested.
6.2. PERSONAL ENRICHMENT
This internship has allowed me to develop many professional skills by working on a project from
beginning to end, putting into practice several tools learned during my years of training,
especially in the last 2 years of the Master MSR, as well as learning new tools like ROS,
C++ and Gazebo, which will certainly be a plus in my future professional life.
Being part of scientific research, a field that I did not know before and that I have found
very interesting, has permitted me to use state-of-the-art methods, to formalize a problem, and to
implement and conduct experiments while increasing my capacity for analysis.
All of this has been enriched by the teamwork with the PhD students and professors of the
laboratory, who have been of great help in the progress of this work.
7. BIBLIOGRAPHIC REFERENCES
[1]
Zachary Pezzementi. Object Recognition Using Tactile Array Sensors. Doctoral Dissertation,
Johns Hopkins University, Baltimore, MD, USA, 2012.
[2]
[3]
[4]
[5]
R. D. Howe. Tactile sensing and control of robotic manipulation. Advanced Robotics, 8(3),
1994.
[6]
[7]
[8]
[9]
[10]
[11]
M. Shimojo, "Spatial filtering characteristic of elastic cover for tactile sensor," in Robotics
and Automation, 1994 IEEE International Conference on, 1994, pp. 287-292. (Sensor covering
modeled by FEM.)
[12]
N. Koenig and A. Howard. Design and use paradigms for Gazebo, an open-source multi-robot simulator. Technical report, USC Center for Robotics and Embedded Systems, CRES-04-002, 2004.
[13]
Staranowicz, Aaron, and Gian Luca Mariottini. "A survey and comparison of commercial
and open-source robotic simulator software." Proceedings of the 4th International
Conference on PErvasive Technologies Related to Assistive Environments. ACM, 2011.
[14]
Ivaldi, Serena; Peters, Jan; Padois, Vincent; et al. Tools for simulating humanoid robot
dynamics: a survey based on user feedback. In: Humanoid Robots (Humanoids), 2014 14th
IEEE-RAS International Conference on. IEEE, 2014. pp. 842-849.
[15]
Erez, Tom; Tassa, Yuval; and Todorov, Emanuel. Simulation Tools for Model-Based Robotics:
Comparison of Bullet, Havok, MuJoCo, ODE and PhysX.
[16]
Roennau, Arne, Sutter, F., Heppner, G., et al. Evaluation of physics engines for robotic
simulations with a special focus on the dynamics of walking robots. In : Advanced Robotics
(ICAR), 2013 16th International Conference on. IEEE, 2013. p. 1-7.
[17]
[18]
[19]
[20]
[21]
Weiss Robotics
http://www.weiss-robotics.de/en/english/technology/tactile-sensors.html
[22]
Yaroslav Tenzer, Leif P. Jentoft and Robert D. Howe. Inexpensive and Easily Customized
Tactile Array Sensors using MEMS Barometer Chips. Harvard School of Engineering and
Applied Sciences.
[23]
Tao Mei, Wen J., et al. An integrated MEMS three-dimensional tactile sensor with large
force range. State Key Laboratories of Transducer Technology, Institute of Intelligent
Machines, Chinese Academy of Sciences, Hefei, Anhui 230031, China, 1999.
[24]
R. Volpe and P. Khosla, "A theoretical and experimental investigation of explicit force
control strategies for manipulators," Automatic Control, IEEE Transactions on, vol. 38, pp.
1634-1650, 1993.
[25]
8. ANNEXES
Comparison between real and simulated tactile images for all the configurations:
Configuration                        Pixels (total)   Pixels changed   Image changed
Small square, 100 g, 45°             84               4                0.4155%
Small square, 500 g, 45°             84               13               1.9561%
Small square, 500 g                  84               14               -
Big square, 100 g                    84               13               -
Big square, 100 g, 45°               84               17               5.1961%
Big square, 500 g, 45°               84               18               2.507%
Circle, 100 g                        84               21               4.8926%
Circle, 500 g                        84               -                -
Big rectangle, 100 g, 45°            84               32               7.4463%
Big rectangle, 500 g                 84               21               1.4706%
Big rectangle, 500 g, 90°            84               38               -
Small rectangle, 100 g, 45°          84               21               4.5565%
(unidentified configuration)         84               11               1.0831%
Big rectangle, 100 g, 90°            84               18               2.0215%
Big rectangle, 500 g, 45°            84               11               1.3585%
Small rectangle, 100 g               84               42               9.3324%
Small rectangle, 100 g, 90°          84               19               2.9272%
Small rectangle, 500 g, 90°          84               29               7.6611%
Hexagonal nut, 100 g, 90°            84               -                -
Hexagonal nut, 100 g                 84               26               12.5677%
Hexagonal nut, 500 g                 84               15               -
(unidentified configuration)         84               22               -
Hexagonal nut, 500 g, 90°            84               21               2.5397%
H-shaped, 100 g, 45°                 84               29               7.1102%
H-shaped, 100 g                      84               29               6.704%
H-shaped, 100 g, 90°                 84               -                -
H-shaped, 500 g                      84               -                -
H-shaped, 500 g, 90°                 84               43               12.6424%
Big square + small square, 100 g     84               37               13.4594%