
Noname manuscript No.

(will be inserted by the editor)

COLLABORATIVE HUMAN-ROBOT
INTERACTION INTERFACE
DEVELOPMENT FOR A SPINAL SURGERY ROBOTIC
ASSISTANT

Andres Amarillo · Emilio Sanchez · Jon Oñativia

Received: date / Accepted: date

Abstract The growing introduction of robotics in non-industrial applications,
where the environment is unstructured and changing, has created the need for
safer and more intuitive human-robot interfaces. In the surgical field, the use
of collaborative robots has potential benefits, since it combines the surgeon's
experience, knowledge and flexibility with the robot's accuracy, stiffness and
repeatability. Nevertheless, to guarantee a functional collaboration in this
environment, the interaction between surgeon and robot must be intuitive,
natural, fast and easy to use. Commercial collaborative robots are less accurate
and less stiff, while traditional industrial ones lack intuitive interaction
interfaces. This paper presents a hand-guiding methodology for functional collab-
oration between surgeon and robot, developed for a robotic surgical assistant for
transpedicular fixation procedures. It shows how a traditional industrial
robot can be used as a collaborative one when the available commercial collabora-
tive robots do not have the accuracy and stiffness required for the task. The
control algorithms, implemented in a robust architecture with industrial hardware,
are also described.
Keywords Hand-Guiding · Human-Robot Interface · Collaborative Surgical
Robotics · Transpedicular Fixation Technology

1 Introduction

The spine is a main structure of the human body, composed of 26 bones called
vertebrae separated by intervertebral discs, that protects and supports the spinal
cord and nerves. A wide number of medical conditions and injuries can affect the

A. Amarillo · E. Sanchez
Universidad de Navarra, Tecnun, Manuel Lardizabal 13, 20018, San Sebastián, Spain
E-mail: andres.amarillo@cyber-surgery.com
E. Sanchez
Ceit, Manuel Lardizabal 15, 20018,San Sebastián, Spain
A. Amarillo · J. Oñativia
Cyber Surgery S.L., Paseo Mikeletegui 71, 20009,San Sebastián, Spain.

vertebrae and spine causing pain and limiting mobility, leading to the reduction
of life quality. It is estimated that over 266 million individuals suffer some kind of
spinal Degenerative Disease [1] and that spine disorders restricting the activities
of daily living (ADLs) occurred in almost every third adult (32%) living in Europe
[2]. This incidence of spine disorders makes them a worldwide health concern.
Spinal instrumentation through pedicle screw fixation is one of the most com-
mon surgical procedures used in the treatment of spine disorders. This procedure
consists of the insertion of screws into the vertebrae through the pedicles. These
screws are used as support for rods that are tightened in a way that keeps a proper
spine alignment [3]. Due to the closeness of nerve structures, an accurate placement
of the screws into the vertebrae is required in order to avoid neural damage. There
are different techniques used for this procedure; free hand, fluoroscopy-guided and
optical navigation are the most frequently used. However, these techniques have
known issues: the dependence of accuracy on the surgeon's experience in the
free-hand technique [4]; the high exposure to radiation and the increased operating
times in the fluoroscopy-guided technique [4]; and, in the optical navigation
technique, the dependence of accuracy on the distance from the reference markers
to the cameras, and the conditioning of the surgical workspace due to the need for
a guaranteed line of sight between cameras and fiducial markers [5].
As a way to improve the surgical process,
increasing accuracy and reducing the surgeon’s experience needed to perform the
procedure [6], Robotic Assisted Surgery (RAS) systems have been introduced into
the operating rooms (OR). There are mainly four RAS systems that assist in spinal
surgeries: the Renaissance® and Mazor X® (Mazor Robotics), the ExcelsiusGPS®
(Globus Medical) and the ROSA® (Medtech). Three of these RAS systems use
optical tracking to identify the position of the spine and the vertebrae
relative to the robot. The use of these systems, as in the optical navigation
technique, conditions the surgical workspace due to the need for a guaranteed line
of sight between the cameras and the fiducial markers used to identify the position
of the patient's vertebrae and the surgical tools. The other RAS system, which does
not use optical tracking (Renaissance®), has to be mounted directly on the patient's
bone, therefore requiring more and bigger incisions. Moreover, the patient has to
support the weight of the robot.

1.1 Robotic Surgical Assistant Description

To overcome the previously mentioned limitations of state-of-the-art RAS systems, a
novel RAS system has been developed (Fig 1). This RAS system has a surgical
planning software that allows the surgeon to plan the placement of the pedicle
screws before the surgery using preoperative computerized tomography (CT) scans.
During the surgery a Clamp is attached to the patient's spine, and some fiducial
markers are used to register the Clamp position relative to the preoperative CT
scans used for the plan. The RAS system is safely connected to the patient through
a purpose-built, highly accurate electromechanical tracking device. Using the
surgical plan, the intraoperative registration and the tracking device information,
the system is able to accurately guide the insertion of the screws at the desired
location even if the patient is moved (for further details refer to [7]). The use
of the highly accurate tracking device allows the system to avoid a misplacement
that could lead to
patient's neural damage, while reducing the surgical workspace conditioning of the
RAS systems with optical tracking technology [8].

Fig. 1 Developed RAS system for transpedicular fixation procedures.

This RAS system guides surgical instruments through a Tool guide, helping the
surgeon to introduce screws into the vertebral body at a previously planned placement.
The RAS system has a 6 degree of freedom (DOF) traditional industrial robotic
arm, a Stäubli TX40. The stiffness of the traditional industrial arm is important
in this RAS because the robot has to keep an accurate position while bearing
efforts when the vertebra is being milled and when the surgical instruments are
being introduced through the robot's cannulated tool [9]. This RAS system meets
the accuracy requirements of the procedure without conditioning the surgical
workspace; nevertheless, due to the unstructured environment in which the robot
performs its tasks, a human-robot interaction interface is required.

1.2 Interaction interface type selection

The interaction between humans and robots has traditionally been made using a
programmable controller, where the user requires specific training to be able to
make the robot move in a certain sequence. That sequence is created by recording
several robot poses that are executed cyclically in a predefined order
to perform a task, sometimes taking into account external signals to allow a basic
interaction with the robot's surrounding environment. Despite the fact that this
kind of interaction is not intuitive and requires the user to be trained, it has
been useful in industrial environments to automate repetitive tasks.
Nevertheless, with the widespread adoption of robotics in other fields where the
environment and workspace of the robot are less structured, such as the surgical
field, this kind of interaction is not enough.
To introduce the robot in less structured spaces, some interfaces have
been developed with the use of joysticks and haptic devices that allow the remote
control of the robot. These interfaces are present in teleoperated robots with ap-
plications in several fields, such as industrial [10] and medical (e.g. Da Vinci®,
Intuitive Surgical). The interfaces used in teleoperation enhance the performance
of the user, allowing the execution of complex tasks that require motion scal-
ing, adapting movements performed by the user on the joystick or haptic device.
Nevertheless, the use of these interfaces is more complex, and the relative
placement of the remote interface and the robot usually affects the intuitiveness.
Also, even when the teleoperated robot improves the user's skills, the user still
has to actively perform the task, since the robot is not autonomous.
In order to get the best of both human and robot skills, the ideal scenario
would be cooperation, where both share tasks, taking action in the phases where
each one's best skills are needed. In this case the teleoperation interface is not
the best option. For this reason, and with the growing interest in the potential
benefits of active cooperation between humans and robots, recent efforts have been
made to create interfaces that allow interacting in a natural way, through the use
of gestures, voice and physical contact [11][12][13].
The most widespread physical contact interface is robot hand guidance,
which allows the user to control the robot just by directly pulling or pushing it,
enabling a more intuitive and direct interaction with the human [14]. Physical con-
tact interfaces are also useful to create a variable robot behavior using the input
of humans in shared workspaces, bringing more flexibility into human-robot
cooperative processes. This robot behavior conditioned by physical contact is of
interest in the medical field, and more specifically in the surgical one, where the
accuracy of the robot could improve the surgical results but where the
natural workflow of the procedures requires a fast and flexible adaptation of the
system behavior. The hand guidance feature is present in some commercial robotic
arms, also known as collaborative arms, such as the IIWA® (KUKA Robotics),
Franka® (Franka Emika) and Sawyer® (Rethink Robotics). These have been
developed to enhance the user experience of the hand guidance feature and
implement force/torque sensors directly in each robot joint. This improves the
behavior of the hand guidance interface, allowing physical-contact interaction
over the entire robot structure. However, because these robots are designed to
enhance the user's hand guiding experience, the mechanical stiffness of the
robotic arm is reduced compared with traditional robotic arms. For the developed
RAS system presented above, where high accuracy is required even when the robot is
subjected to mechanical efforts, this stiffness reduction is undesired, as it may
lead to accuracy reduction and therefore to misplacement of the screws.
The developed RAS system must be able to adapt to the surgical workflow and
workspace, and to their variations, in a fast and intuitive way. There are different
phases of the surgical workflow using the RAS system where the hand-guiding option
would be useful. First, the system approaching phase, in which the robot has
to approach the surgical area. From one surgery to another, patients are not
located at the exact same place. During this phase the physical interface would
be useful to allow the surgeon to bring the robotic arm to the appropriate
surgical area just by pulling it. Second, the re-planning phase, in which the
surgeon can modify the planned trajectory during the surgery without having to
create a new plan offline. This feature would allow the surgeon to react in
a flexible way, guiding the robot (inside a restricted workspace) if the plan made
before the procedure is considered not to be optimal. Third, the retraction phase,
in which the surgeon can choose, just by pushing the robot,
when and where the robotic arm should move in case the surgeon or the surgical
staff needs the entire workspace to perform a specific or an emergency procedure.
This paper presents the implementation of a hand guidance interface for the
RAS described above on a robust industrial architecture. The algorithms used to
control the Stäubli TX40 industrial arm as a collaborative robot, obtaining the
mechanical stiffness of a traditional industrial arm with a flexible and intuitive
interaction similar to that of a collaborative one, are also described.

2 HAND GUIDANCE ARCHITECTURE DESCRIPTION

The developed hand guidance interface has been implemented on a robust indus-
trial architecture. The TX40 robotic arm joints are controlled using Stäubli's
controller with uniVAL drive technology; robot trajectories are calculated in a
programmable logic controller (PLC), where the motion of each joint is interpolated
and handled using the industrial PLCopen® standard. A 6 degree of freedom
(DOF) force/torque sensor with the industrial CANopen® communication protocol
has been coupled to the RAS system with a handle that allows the surgeon to
use the hand guidance interface (Fig 2).
The PLC obtains the force measured by the force/torque sensor through a dis-
tributed CANopen® interface that communicates this information over the in-
dustrial Powerlink fieldbus. The control algorithms are processed on the PLC and
the resultant velocity commands are sent to the uniVAL drive robot controller
over the same Powerlink fieldbus in real time. This architecture allows each cycle
of force reading, joint interpolation, communication and control to be updated
every 2 ms.
Although safety concerns are out of the scope of this paper, they must be
considered since the robot is acting as a cobot (collaborative robot). Therefore,
the joint velocities and accelerations of the robot have been restricted. It is also
important to note that this architecture makes the implementation of a safety
channel easy, since safety industrial hardware can be added, as the system uses
industrial hardware, standards and communication protocols.

Fig. 2 Hand Guidance system architecture.



3 HAND GUIDING INTERFACE DEVELOPMENT

The control architecture used to implement the hand guiding interface is based on
the concept of mechanical admittance, as presented in [15], where the robot's end-
effector velocity reaction is obtained from the force exerted by the user, in this
case the surgeon. This relation can be mathematically modeled by the differential
equation of the dynamic model of the system.

3.1 ADMITTANCE CONTROLLER

For the hand guiding interface developed here, a mass-damper model has been
proposed for each spatial direction, in order to decouple the motion
of the robotic arm in all Cartesian directions. Hence, the equation that describes
the behavior of the system in the different Cartesian directions due to the forces
applied by the user is:

F(t) = M v̇(t) + b v(t)   (1)
where F(t) is the force at the end effector of the robot, M is the desired mass
for each direction, b is the damping coefficient for each direction, and v̇(t) and
v(t) are the acceleration and velocity of the mass. There is an equivalent equa-
tion, considering the Coriolis and centrifugal effects negligible, that describes
the behavior due to the torques:

T(t) = I θ̈(t) + c θ̇(t)   (2)


where T(t) is the torque at the end effector of the robot, I is the inertia for
each rotational direction, c is the damping coefficient for each rotational direction,
and θ̈(t) and θ̇(t) are the angular acceleration and velocity respectively. From
differential equations (1)(2), applying the Laplace transform, the transfer
functions that relate the linear and angular velocities to the applied force and
torque can be obtained:

F(S) = M S v(S) + b v(S)  ⇒  v(S)/F(S) = 1/(M S + b)   (3)

T(S) = I S θ̇(S) + c θ̇(S)  ⇒  θ̇(S)/T(S) = 1/(I S + c)   (4)
With these transfer functions, the robot's end-effector Cartesian velocity can
be obtained as a result of the force and torques applied by the user:

[vx, vy, vz]ᵀ = 1/(M S + b) [fx, fy, fz]ᵀ ,  [θ̇x, θ̇y, θ̇z]ᵀ = 1/(I S + c) [Tx, Ty, Tz]ᵀ   (5)
For the developed hand guiding interface, an isotropic behavior of the system
in all Cartesian directions is desired. Thus, the values used for the mass
and damping coefficients are the same for all Cartesian directions. The previous
equations assume that the forces and torques are at the robot's end effector and
relative to the robot's base frame; in the RAS where the hand guidance has been
implemented this is not the case (Fig 3).
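The per-axis transfer function 1/(M S + b) can be discretized for the 2 ms sample time of the architecture; the sketch below uses an exact zero-order-hold step and illustrative M and b values (assumptions, not the tuned system parameters), instantiated identically on each axis to obtain the isotropic behavior.

```python
import numpy as np

DT = 0.002  # 2 ms sample time

class AxisAdmittance:
    """Discrete form of v(S)/F(S) = 1/(M S + b) for one Cartesian axis."""
    def __init__(self, mass, damping):
        self.a = np.exp(-damping * DT / mass)  # discrete pole of the lag
        self.gain = (1.0 - self.a) / damping   # zero-order-hold input gain
        self.v = 0.0

    def step(self, force):
        self.v = self.a * self.v + self.gain * force
        return self.v

# Isotropic behaviour: the same M and b on every Cartesian axis.
axes = [AxisAdmittance(mass=2.0, damping=20.0) for _ in range(3)]
```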

Fig. 3 Hand Guidance system architecture.

The handle to use the hand guiding interface is not attached directly to the
robot’s end effector. It is attached to a mechanical device that allows the use of
the RAS tools and the hand guiding interface at the same time (Fig 3). The orien-
tations of the handle are not coincident with the robot’s end effector orientations.
Moreover, the forces and torques measured on the sensor are in its own reference
frame. For these reasons, the forces and torques measured on the F/T sensor
(f^ft, T^ft) have to be transformed to the robot's end effector and expressed
relative to the robot's base frame (f^R, T^R). This can be done using the rotation
matrix from the robot's base frame to the end-effector frame, ^R R_EE, which can
be obtained from the homogeneous transformation matrix calculated with the forward
kinematics of the robotic arm, and the rotation matrix that describes the orientation
of the F/T sensor relative to the end effector, ^EE R_ft, obtaining the respective
forces:

[fx^ee, fy^ee, fz^ee]ᵀ = (^EE R_ft) [fx^ft, fy^ft, fz^ft]ᵀ  ⇒  [fx^R, fy^R, fz^R]ᵀ = (^R R_ee) [fx^ee, fy^ee, fz^ee]ᵀ   (6)

where f^ee is the force expressed in the robot's end-effector frame. The proce-
dure would be similar for the torques measured on the F/T sensor. With these
transformations of the force, the velocity would be described by equations:

v = [vx, vy, vz]ᵀ = 1/(M S + b) [fx^ee, fy^ee, fz^ee]ᵀ ,  ω = [θ̇x, θ̇y, θ̇z]ᵀ = 1/(I S + c) [Tx^ee, Ty^ee, Tz^ee]ᵀ   (7)
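The frame chain of Eq. (6) can be illustrated with plain rotation matrices; the sensor mounting and end-effector pose rotations below are arbitrary examples, not the real kinematics of the system.

```python
import numpy as np

def rot_z(theta):
    """Elementary rotation about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

ee_R_ft = rot_z(np.pi / 2)  # assumed sensor mounting relative to the tool
base_R_ee = rot_z(np.pi)    # assumed end-effector pose (forward kinematics)

f_ft = np.array([1.0, 0.0, 0.0])  # force measured in the sensor frame
f_ee = ee_R_ft @ f_ft             # same force in the end-effector frame
f_base = base_R_ee @ f_ee         # same force in the robot base frame
```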

The robot’s end effector cartesian velocity vector would be:


 
v
ẋ = (8)
ω

With the robot’s end effector cartesian velocity the robot’s joint velocities (q̇) can
be calculated to move the robot according with the force/torque applied by the
surgeon (see fig 4). This is done using the Jacobian matrix (J) of the robotic arm
as stated on [16]:
q̇ = J −1 ẋ (9)

3.2 INVERSE KINEMATICS CLOSED LOOP (IKCL)

To compute the robot’s joint velocity based on the angular and translational veloc-
ities, obtained from the admittance controller presented on the previous section,
the Jacobian inverse is used, nevertheless, the use of matrix inversion could lead
to numerical instabilities. There are some methods that have been developed to
handle this problem, among the most common ones are those based on the general-
ized Jacobian inverse calculation (Moore-Penrose pseudoinverse)[17], this method
is commonly used when the Jacobian matrix is not invertible. It is the case of
redundant manipulators, where the actuated degrees of freedom (DOF) are more
than the needed to achieve a task. In the case of the RAS system being explained
in this paper the Jacobian is a square invertible matrix so the use of the pseudo
inverse method would reduce numerical errors, but it is computationally more
complex.
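In code, the numerical concern above is commonly mitigated by solving the linear system J q̇ = ẋ directly instead of forming the inverse explicitly; a minimal sketch with an arbitrary well-conditioned 6×6 Jacobian (an illustrative stand-in, not the TX40 Jacobian):

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((6, 6)) + 6.0 * np.eye(6)    # illustrative Jacobian
x_dot = np.array([0.05, 0.0, -0.02, 0.0, 0.1, 0.0])  # twist [v; w]

# Solve J q_dot = x_dot directly; better conditioned than inv(J) @ x_dot.
q_dot = np.linalg.solve(J, x_dot)
```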
For the interface described in this paper, a simplification of the Jacobian inverse
scheme for the velocity, presented by Siciliano in his proposed IKCL [18], has been
used (Fig 5). The control law used in this research is:

q̇ = J⁻¹(ẋ + Kp ė),   (10)

where q̇real is the vector of real joint velocities of the robot arm, Kp is a
proportional gain and ė = ẋ − J q̇real denotes the error between the calculated
end-effector velocity and the current end-effector velocity. ė can be partitioned as

ė = [ėp, ėω]ᵀ,   (11)
where ėp is the position error and ėω the orientation error. In the same way, J
can be partitioned into two (3×n) matrices, so the closed loop can be presented as
in Fig 5:

J = [Jp, Jω]ᵀ   (12)

In Siciliano's proposal, the IKCL feedback is the output of the control law; in
this way, the error obtained and reduced is the error caused by the Jacobian
inversion. In the method proposed in this paper, the feedback is the vector of
real joint velocities of the robotic arm. With this approach, the velocity error
obtained and reduced is the one caused by the Jacobian inversion plus the error
in the robot's internal control.
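The effect of feeding back the real joint velocities can be reproduced in a toy simulation. Here the robot's internal control is crudely modeled as a constant execution gain of 0.8, and J, Kp and ẋ are illustrative assumptions; the closed loop of Eq. (10) recovers most of the velocity lost to imperfect execution.

```python
import numpy as np

J = np.eye(6)            # identity Jacobian keeps the example transparent
Kp = 1.0                 # illustrative proportional gain
x_dot = np.full(6, 0.1)  # constant commanded end-effector twist

q_dot_real = np.zeros(6)
for _ in range(100):
    e_dot = x_dot - J @ q_dot_real                      # velocity error (Eq. 10)
    q_dot_cmd = np.linalg.solve(J, x_dot + Kp * e_dot)  # closed-loop command
    q_dot_real = 0.8 * q_dot_cmd                        # imperfect execution

closed_loop_err = np.linalg.norm(x_dot - J @ q_dot_real)
open_loop_err = np.linalg.norm(x_dot - J @ (0.8 * np.linalg.solve(J, x_dot)))
```

With the assumed 0.8 gain, the open loop leaves a 20% velocity error while the closed loop settles at 1/9 ≈ 11% of the command, illustrating the reduction the experiments in Section 4 measure.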

Fig. 4 Admittance controller scheme.



3.3 ORIENTATION RESTRICTION CONTROL

As previously mentioned, the hand guidance interface developed for the RAS sys-
tem is meant to be used in different phases of the surgical procedure. In
some of those phases it is of interest to restrict the orientation of the end
effector, allowing the user to move the robot end effector while holding a specific
orientation. This could be controlled by passing only the Cartesian linear
velocities to the controller. Nevertheless, the simulated mass of the implemented
admittance controller has been set to a low value to enhance the user experience
and, as the exerted force is proportional to the velocity of the robot, high user
forces result in high joint velocity commands. Due to the system's safety joint
velocity limitations, when these velocities are high the velocities executed by the
robotic arm are not the ones calculated by the admittance controller; this behavior
introduces an orientation drift at the end effector of the robotic arm.
To reduce this orientation drift error of the end effector when a specific ori-
entation is desired, a strategy that we have called the orientation restriction
control loop (OCL) has been implemented.
With the real orientation of the end effector (R_ee), calculated through the
forward kinematics, it is possible to obtain a rotation transformation between it
and the desired orientation (R_ref). This rotation transformation is expressed as:

^ee R_ref = R_ref⁻¹ R_ee   (13)

where ^ee R_ref is the rotation matrix between the desired orientation of the end
effector and the real one. In order to compensate the orientation error at the end
effector, the corrective angular velocities are calculated. To do so, the rotation
transformation ^ee R_ref is converted to an axis-angle representation, obtaining
the rotation axis u and the rotation angle θ. The product θ·u, multiplied by a
proportional gain, gives the respective angular velocities of the end effector
that compensate the orientation error when a reference orientation has to be held
while the hand guiding interface is used.
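A minimal sketch of this OCL correction follows, with the error rotation expressed in the base frame for simplicity and an assumed proportional gain (the gain value and the axis-angle extraction are illustrative, not the implemented PLC routine):

```python
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def orientation_correction(R_ee, R_ref, gain=2.0):
    """Corrective angular velocity from the axis-angle of the error rotation."""
    R_err = R_ref @ R_ee.T  # rotation taking the current frame to the desired one
    cos_theta = np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if np.isclose(theta, 0.0):
        return np.zeros(3)  # already at the reference orientation
    # Rotation axis u from the skew-symmetric part of R_err
    u = np.array([R_err[2, 1] - R_err[1, 2],
                  R_err[0, 2] - R_err[2, 0],
                  R_err[1, 0] - R_err[0, 1]]) / (2.0 * np.sin(theta))
    return gain * theta * u  # proportional correction on theta*u
```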

Fig. 5 Simplified closed-loop IKP tracking solution scheme.



Fig. 6 Inverse kinematics with orientation restriction control scheme.

4 CONTROL ALGORITHM RESULTS

To test the inverse kinematics closed loop, some experimental tests were performed.
The system was used through the hand guidance interface with random directions
and forces between 0 and 40 N. To test the worst case, the force was exerted in
an oscillatory way, trying to reach the maximum force in each cycle.
The values used in the tests were obtained using the control PLC's trace tool, with
a sampling time of 2 ms.
One test was performed with the IKCL deactivated and another one with the IKCL
activated. The Cartesian velocity commands and the real resultant Cartesian
velocities were monitored. Fig 7 shows the commanded Cartesian velocities
(reference) and the resultant ones (real) with the IKCL deactivated. Fig 8 shows
the resultant Cartesian velocities at the end effector of the robot compared
to the commanded ones with the IKCL activated.

Fig. 7 Cartesian velocity reference vs cartesian velocity response without IKCL.



Comparing Fig 7 and Fig 8, it can be seen that the proposed method has a smaller
tracking error between the commanded and real velocities of the end effector,
particularly when the velocities are high. High commanded velocities result from a
high exerted force and a small simulated inertia, configured in this way, as stated
before, to enhance the user experience with the hand guiding interface.


Fig. 8 Cartesian velocity reference vs cartesian response with IKCL.

The error between the real velocities and the velocities calculated by the admit-
tance controller was obtained for both cases (Fig 9), with the IKCL and without it.
The root mean square (RMS) error obtained from the tests for the Cartesian
velocities without the IKCL was 0.04116 m/s, and the RMS error obtained with the
IKCL was 0.023 m/s.
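The RMS metric used for this comparison is straightforward to reproduce; the signals below are synthetic placeholders standing in for the 2 ms PLC traces, not the experimental data.

```python
import numpy as np

def rms(error_samples):
    """Root mean square of a vector of error samples."""
    return np.sqrt(np.mean(np.square(error_samples)))

# Synthetic reference/response pair standing in for the logged velocities.
reference = np.sin(np.linspace(0.0, 2.0 * np.pi, 1000))
real = 0.9 * reference
velocity_rms_error = rms(real - reference)
```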

To test the orientation restriction control loop, the hand guiding interface was
configured to be restricted to an orientation expressed in XYZ Euler angles of
(0, 90, 0) degrees, and random forces from 0 to 40 N were exerted on the handle.
The resultant orientation of the end effector was monitored. The test was per-
formed with the orientation control loop and without it. Fig 10 shows the real end-
effector orientation for both cases while the hand guiding interface is being used.
It can be seen that the proposed orientation control loop, for those cases where
an orientation restriction is to be imposed, reduces the orientation drift error
due to the joint velocity saturation of the robotic arm. Fig 10 shows that
without the orientation control loop the orientation error increases, while with
the orientation control loop the errors are corrected while the hand guidance
interface is still being used. As in the case of the Cartesian velocity errors,
these orientation errors become significant as the exerted force increases.

Fig. 9 Cartesian Velocity and angular velocity error with and without IKCL.


Fig. 10 Orientation restricted to 0°, 0°, 90° with and without orientation control loop (OCL)

5 DISCUSSION

In this paper, a physical interaction interface with enhanced admittance controller
algorithms over a robust industrial architecture for a RAS has been proposed,
endowing a traditional industrial robot with a collaborative behavior. This is
useful in a surgical environment for interacting in an intuitive and natural way
with the robotic assistant while keeping the mechanical stiffness of the industrial
robot, essential for the accuracy of the transpedicular fixation surgical procedure.
Nevertheless, it is important to remark that, in comparison with commercial
collaborative robotic arms, the developed interface allows the control of the end
effector of the robot, while commercial collaborative robots allow the control of
both the robot end effector and the individual joints using the same physical
contact interface. This is due to their integration of a force sensor in each robot
joint, compared to the proposed integration of a single 6-DOF force/torque sensor.
The developed orientation closed-loop strategy could also be applied to a posi-
tion control closed loop, allowing the system to hold the position while modifying
the orientation through the use of the hand guidance interface described in this
paper.
It is important to emphasize that the work presented in this paper allows
the implementation of a hand guidance interface on a traditional robotic arm,
endowing the system with the robustness and stiffness of this type of robotic
arm, and presenting in this way an alternative to commercial collaborative robots,
whose lack of stiffness represents an issue for their introduction in surgical
applications that require the system to bear efforts while holding an accurate
position and orientation.
Although the interface shown in this paper has been developed for a RAS, this
architecture and these control algorithms could be applied to other fields
and applications, taking into account that the robotic arm is controlled
through an industrial PLC. Moreover, even though the proposed architecture has been
defined with a Stäubli TX40 robot, it could be applied to other robots, using the
respective technology of each manufacturer to allow the robot joint motion to be
controlled by the PLC.

References

1. Vijay M. Ravindra, et al. “Degenerative Lumbar Spine Disease: Estimating Global Incidence
and Worldwide”. Global Spine Journal. Dec 2018; Vol 8(8). p. 784–794. eISSN: 21925690.
2. Filip Raciborski, Robert Gasik, and Anna Klak. “Disorders of the spine. A major
health and social problem”. Rheumatology Journal. 2016. Vol 54(4). p.196–200. DOI:
10.5114/reum.2016.62474.
3. R Morales-Ávalos, R Elizondo-Omaña, F Vı́lchez-Cavazos, et al. “Fijación vertebral por
vı́a transpedicular. Importancia de los estudios anatómicos y de imagen”. Acta Ortopédica
Mexicana. Nov.-Dic 2012, Vol. 26(6). p.402-411.
4. V Puvanesarajah, J Liauw, S Lo et al. “Techniques and accuracy of thoracolumbar pedicle
screw placement”. World Journal of Orthopedics. April 2014. Vol 5(2). p.112-123. eISSN 2218-
5836.
5. KM Scheufler, J Franke, A Eckardt et al. “Accuracy of image-guided pedicle screw
placement using intraoperative computed tomography-based navigation with automated
referencing”. Neurosurgery Journal. December 2011; Vol 69(6): p.1307-1316. DOI:
10.1227/NEU.0b013e31822ba190.
6. C. Schizas, E. Thein, B. Kwiatkowsk, et al. “Pedicle screw insertion: Robotic assistance ver-
sus conventional c-arm fluoroscopy”. Belgian Orthopedic Act. April 2012. Vol 78(2). p.240-5.
7. A. Amarillo, E. Sanchez, C. Suarez, et al. “Diseño cinemático de dispositivo de tracking del
paciente para procedimientos quirúrgicos”. XXXV Congreso Anual de la Sociedad Espanola
de Ingenierı́a Biomedica. November 2017. P.275-278. ISBN: 978-84-9082-797-0.
8. A Amarillo, J Oñativia and E Sanchez. “RoboTracker: Collaborative robotic assistant de-
vice with electromechanical patient tracking for spinal surgery”. 2018 IEEE/RSJ Inter-
national Conference on Intelligent Robots and Systems, October 2018. p.1312-1317. DOI:
10.1109/IROS.2018.8594467.
9. V. Puvanesarajah. “Techniques and accuracy of thoracolumbar pedicle screw placement”.
World Journal of Orthopedics. 2015. Vol 5(2). p.112-123. DOI: 10.5312/wjo.v5.i2.112.
10. I. Aliaga, A. Rubio, E Sanchez. “Experimental Quantitative Comparison of Different Con-
trol Architectures for Master-Slave Teleoperation”. IEEE Transactions on Control Systems
Technology. January 2004. Vol 12(1). p.2-11. DOI: 10.1109/TCST.2003.819586.

11. M. Sathiyanarayanan, S. Azharuddin. S Kumar. et al. “Gesture controlled robot for mili-
tary purpose”. International Journal for Technological Research In Engineering. July 2014.
Vol 1(11). p. 1300-1303. ISSNe: 2347 – 4718.
12. M. Anurag, M. Pooja, K. Akshay. et al. “A voice-controlled personal assistant robot”,
International Conference on Industrial Instrumentation and Control, July 2015, DOI:
10.1109/IIC.2015.7150798.
13. M. Hanses, R. Behrens and N. Elkmann. “Hand-guiding robots along predefined geometric
paths under hard joint constraints”. IEEE International conference on emerging technologies
and factory automation. November 2016. p. DOI: 10.1109/ETFA.2016.7733600.
14. A. De Santis, B.Siciliano, A.De Luca, et al. “An atlas of physical human-robot
interaction”. Mechanism and machine Theory. 2008. Vol 43(3). p. 253-270. DOI:
10.1016/j.mechmachtheory.2007.03.003.
15. J. Melo, A. Bertelsen, D. Borro, et al. “Nuevo asistente robótico para cirugı́a:
Arquitectura y algoritmos de control”, December 2012. Vol 87(6). p. 645-654. DOI:
http://dx.doi.org/10.6036/5011
16. J Craig. “Introduction to Robotics. Mechanics and control”. Third Edition. 2005 Pearson
education. ISBN 0-13-123629-6.
17. Ben-Israel, Adi. “The Moore of the Moore-Penrose Inverse.” The Electronic Journal of
Linear Algebra, August 2002. Vol 9(1). p.150-157. DOI: 10.13001/1081-3810.1083.
18. B. Siciliano. “A closed-loop inverse kinematic scheme for on-line joint-based robot control”.
Robotica. 1990. Vol 8(1). p. 231-243. DOI:10.1017/s0263574700000096.
