Ch3 Robot Vision, Programming, Applications
The process of capturing, extracting features from and analyzing the details of a three dimensional
object in the form of a 2D picture is known as Robot Vision or Machine Vision. As the application
utilizes a computer for processing, it is also known as Computer Vision.
VIDICON CAMERA
Construction:
Lens - Focuses the image of the object onto the face plate of the camera.
Face plate - Glass cover at the front end of the camera.
Transparent metal coating - Acts as the electrode from which the electrical video signal is derived.
Photo sensitive layer - A layer of resistive particles whose resistance is inversely proportional
to the light intensity.
Wire mesh - Decelerates the beam of electrons so that they reach the photo sensitive layer with zero
velocity.
Electron gun - Generates the beam of electrons that scans the photo sensitive layer.
Beam deflection coils - Deflect the electron beam vertically and horizontally for scanning.
Focusing coil - Focuses the electron beam.
Tube pins - Act as connectors to the electric supply source.
Glass envelope - Provides housing for the above elements.
Working Principle:
A positive voltage is applied to the metal coating of the face plate.
As the electron beam strikes it, the photo sensitive layer acts like a capacitor with negative
charge on the inner surface and positive charge on the opposite side.
Light striking the photo sensitive layer reduces its resistance, so current starts flowing and
neutralizes the positive charge.
As the image is formed on the target layer, the concentration of electrons is high in dark
areas and low in lighted areas.
The electrons so formed flow through the metal layer and out through the tube pins.
Variations in current during the scanning motion of the electron beam produce a video signal
proportional to the intensity of the input image.
LIGHTING TECHNIQUE AND DEVICES
The fundamental types of lighting devices used in robot vision are classified into the
following:
a) Diffuse surface devices: exemplified by fluorescent lamps and lighted tables.
b) Condenser projectors: transform a diverging light source into a condensing (focusing)
beam.
c) Flood or spot projectors: used to flood the object surface with light or to concentrate
light on a small spot.
d) Collimators: devices which produce a parallel beam of light on the object whose
image is to be captured.
e) Imagers: for example slide projectors and optical enlargers, which produce a real
image at the object plane.
There are basically two major types of illumination techniques:
o Front lighting: the light source is on the same side of the object as the camera.
o Rear / Back lighting: the light source is on the side of the object opposite the camera.
Sampling
Let the function f(x, y) denote the two dimensional image pattern on the imaging device.
The geometric coordinates x and y of the image plane are digitized by the process known
as 'image sampling'.
After sampling, the digitized function f(x, y) in the spatial coordinates is generated,
which can be easily stored in the computer memory.
Hence the scanning rate per line, RL = R/N + Rd
Number of pixels that can be processed per line, Pn = (R + N·Rd) / (N·S)
where R is the scan time per frame, N the number of lines, Rd the line change over delay and
S the sampling time per pixel.
Generally the scan line change over is given in the percentage of the scan rate for a line.
Quantization
The digitization of the amplitude of the image function f(x , y) depending on the intensity
of the pixel is known as Quantization.
Then the quantization level spacing, L = Fr / 2^n
where Fr is the full-scale voltage range and n is the number of bits.
The digital approximation of the analog signal gives the error in quantization as
Quantization error, eq = ± (1/2) L = ± (1/2) (Fr / 2^n)
Encoding
Depending on the image created on the face plate of the camera , the intensity of the
different pixel would be different.
The digitized amplitude code is represented by a binary sequence of digits, which is
known as encoding.
Image Storage
The storage of the digitized image in the computer memory is done by the frame buffer
which can be made a part of the computer or frame grabber.
The frame grabber of a video camera accesses and acquires the image in 1/30 second.
An average camera can produce disturbance free data with a 6 bit buffer, whereas the
quantization of the frame has a specification of 8 bits. The lower level bits are rejected in the
noise cleaning operation.
As the resolution of the general human eye is about 2^6 (64) grey levels, the image grabbed
by the video camera is sufficiently good. The row and column addresses of the frame grabber are
synchronized with the electron beam of the camera. A signal sent by the computer to the position
address (x, y) of the face plate reads the information stored in the frame buffer, uniquely
addressed by sampling and quantization.
Problems
1. A raster scan vision system has a frame of 256 lines, with a frame scan time of 1/3 s.
It may be assumed that the electron beam takes 10% of the scan time to move from
one line to the other. If there are 256 pixels per line, determine the sampling rate.
Solution:
Given data:
R = 1/3 sec, N= 256 lines, Pn = 256 pixels, Rd = 10% of R/N = 0.1R/N
Using the formula Pn = (R + N·Rd) / (N·S), we get
S = (R + N·Rd) / (N·Pn) = (1/3 + 0.1/3) / (256 × 256) = 5.6 × 10⁻⁶ s/pixel.
2. The maximum voltage range of an 8 bit ADC is 18 V. Calculate the
quantization levels, quantization level spacing and quantization error.
[ Answer: Q = 256 levels, L = 0.0703 V, eq = ±0.03515 V ]
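The two worked problems can be checked with a short script; this is only a sketch that applies the sampling and quantization formulas given above.

```python
# Verifies the two worked problems using the formulas from this section.

def sampling_time(R, N, Pn, Rd):
    """Sampling time per pixel: S = (R + N*Rd) / (N * Pn)."""
    return (R + N * Rd) / (N * Pn)

def quantization(Fr, n):
    """Return (levels Q, level spacing L, quantization error eq)."""
    Q = 2 ** n          # number of quantization levels
    L = Fr / Q          # level spacing, L = Fr / 2^n
    eq = L / 2          # error magnitude, eq = +/- L/2
    return Q, L, eq

# Problem 1: frame time R = 1/3 s, 256 lines, 256 pixels per line,
# line change-over delay Rd = 10% of R/N.
R, N, Pn = 1 / 3, 256, 256
Rd = 0.1 * R / N
S = sampling_time(R, N, Pn, Rd)
print(f"S = {S:.2e} s/pixel")        # about 5.6e-06 s/pixel

# Problem 2: 8-bit ADC with an 18 V full-scale range.
Q, L, eq = quantization(18.0, 8)
print(Q, round(L, 4), round(eq, 5))  # 256 0.0703 0.03516
```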
It refers to all operations and manipulations that are carried out on the captured image
based on application specific needs.
2. SEGMENTATION
o It is the process of separating regions that have distinctive features in the image,
based on intensity levels, colours, edges or boundaries, with continuities among
the different elements in the image.
o Segmentation is the first step in the machine vision for identifying an object from
its background and its surrounding objects.
o Segmentation is done by using techniques like THRESHOLDING, EDGE
DETECTION and REGION BASED TECHNIQUES like Region growing and
Region Splitting.
Thresholding: “Thresholding is the process of setting a minimum amplitude limit for the
intensity level to identify the object from its background; the minimum value set for
comparison is known as the threshold limit.”
o A threshold limit is selected by analyzing the intensity levels in the entire image
and plotting the histogram using the statistical distribution of the pixel values in
the image.
o When the intensity level exceeds the threshold limit , the point is identified as an
OBJECT and as the BACKGROUND if it is below the threshold level.
o Let Ii(x, y) be the intensity of a point (x, y) in an image and ti the selected
threshold limit for the same image; then the thresholded image is given as,
Ti(x, y) = 1, if Ii(x, y) > ti
         = 0, if Ii(x, y) ≤ ti
o In the threshold image Ti (x, y), pixels with values ‘1’ are identified as pixels of
‘OBJECT’ and pixels with values ‘0’ are identified as ‘BACKGROUND’.
o Global threshold : Common threshold limit for entire image.
o Local threshold: For each pixel in relation to its neighboring pixels.
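As a rough illustration of global thresholding, the rule Ti(x, y) = 1 when Ii(x, y) > ti can be sketched as follows; the 4x4 grey-level image and the threshold value are made-up example data.

```python
# Minimal sketch of global thresholding on a tiny "image" held in
# plain Python lists; values above the threshold become OBJECT (1),
# the rest become BACKGROUND (0).

def threshold(image, t):
    """Return the binary image Ti: 1 where Ii(x, y) > t, else 0."""
    return [[1 if px > t else 0 for px in row] for row in image]

# Hypothetical 4x4 grey-level image (0-255) with a bright object
# occupying the lower-right region.
Ii = [
    [ 10,  12,  15,  11],
    [ 14,  13, 200, 210],
    [ 12, 205, 220, 215],
    [ 11, 198, 225, 230],
]

Ti = threshold(Ii, t=128)   # global threshold limit ti = 128
for row in Ti:
    print(row)
```

A local threshold would instead compute `t` per pixel from its neighbourhood rather than once for the whole image.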
Edge Detection:
It is a technique used to detect the outermost edges of an object of interest in a given
frame.
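One common way to realize edge detection is with gradient operators; the sketch below uses the standard 3x3 Sobel kernels (an assumption, since the section does not name a specific operator) on a made-up image containing a vertical edge.

```python
# Gradient-based edge detection sketch using the standard Sobel kernels.

SX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
SY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel

def sobel_magnitude(img):
    """Gradient magnitude |G| = sqrt(Gx^2 + Gy^2) at interior pixels."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# Dark-to-bright vertical boundary between columns 1 and 2.
img = [[0, 0, 255, 255] for _ in range(4)]
mag = sobel_magnitude(img)
print(mag[1])  # large magnitudes flag the edge columns
```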
3. FEATURE EXTRACTION
It means extracting features from an object for the purpose of recognition.
Some features that can be used in robot vision are length, breadth, diameter, perimeter,
area, centre of gravity, angle of slant edges, etc., which are used for comparison and
IDENTIFICATION of the object.
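Two of the listed features, area and centre of gravity, can be computed from a binary segmented image as in this sketch; the 0/1 image is illustrative data.

```python
# Sketch: extracting area and centre of gravity from a binary image.

def area_and_centroid(binary):
    """Area = count of object pixels; centroid = mean (x, y) of them."""
    pts = [(x, y) for y, row in enumerate(binary)
                  for x, v in enumerate(row) if v == 1]
    area = len(pts)
    cx = sum(x for x, _ in pts) / area
    cy = sum(y for _, y in pts) / area
    return area, (cx, cy)

# Made-up binary image: a 2x2 object in the middle of a 4x4 frame.
binary = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(area_and_centroid(binary))  # (4, (1.5, 1.5))
```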
4 . OBJECT RECOGNITION
* To identify the object by its real world name: the process of identifying an object and
assigning an individual name to it is known as ‘RECOGNITION’.
* For example, identifying the SPANNER and GEAR in a given frame is possible technically as
two different objects. But it is required to identify which one is a gear and which one is a spanner
and each of the objects should be given a suitable name.
* In ‘Template Matching’, the segmented window is compared digitally with a model image
stored in the memory. The model image, known as the ‘TEMPLATE’, is generated during a
training session conducted for the vision system. * A number of templates of an object are
captured at different angles in order to recognize the object from any direction. Therefore, the
memory required for storage is VERY LARGE and the processing time increases dramatically
even for recognizing a single object. The object should be simple with a clear and distinctive
shape for this technique to be employed.
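A minimal sketch of template matching: the template slides over the image and each window is scored, here by the sum of squared differences (one common measure; the section does not prescribe a particular one). The image and template are made-up data.

```python
# Template matching sketch: exhaustive sliding-window search scored
# by the sum of squared differences (SSD); lower is a better match.

def best_match(image, template):
    """Return (x, y) of the window with the lowest SSD score."""
    th, tw = len(template), len(template[0])
    best, best_xy = None, None
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            ssd = sum((image[y + j][x + i] - template[j][i]) ** 2
                      for j in range(th) for i in range(tw))
            if best is None or ssd < best:
                best, best_xy = ssd, (x, y)
    return best_xy

image = [
    [0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 0, 0, 0, 0],
]
template = [[9, 9], [9, 9]]
print(best_match(image, template))  # (1, 1)
```

The memory and time cost noted in the text shows up directly here: one stored template per viewing angle, and a full image scan per template.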
The ‘Structural Matching’ technique is carried out by comparing the physical relationships
between two or more entities of the object’s pattern.
o The segmented image is reduced to a skeletal structure from which the required
features are extracted for comparison.
o When the features extracted from the segmented image match the parameters
stored in the memory, the object is identified and assigned its name. For object
recognition, the vision system should have enough information for comparing the
image captured during real time operations.
5. INTERPRETATION
* It endows a vision system with a higher level of understanding of its environment than that
offered by any of the concepts discussed before, and encompasses all these methods as integral
parts of understanding a visual scene.
____________________________________________________________________________
INTRODUCTION:
Robot languages have been developed for ease of control of the motions of robot arms
having different structures and geometrical capabilities. The following are the major contributors
to the development of robot languages.
8. RPL:
o Developed by SRI International.
o A Compiler is used to convert program into Codes that can be interpreted by an
Interpreter.
o Unimation PUMA 500 can be controlled with the help of RPL.
o RPL uses syntax like FORTRAN.
o It is modular and flexible.
9. AUTO PASS:
o Proposed by IBM to facilitate the activities a human assembler performs.
o Has a sophisticated world modelling system which is helpful in keeping track of objects.
CLASSIFICATION:
o Here, the task is defined through a command, say ‘TIGHTEN THE NUT’.
o The robot should be capable of performing step by step functions to accomplish
the objective of tightening the nut.
o This is possible only if the robot has 3D vision and the intelligence to make
decisions: the robot must find the NUT & SPANNER, pick them up and place
them in a sequential manner, and finally tighten the nut.
The program and control methods are actuated through software running on an operating
system in which manipulation of data takes place.
1. MONITOR mode:
The programmer can define locations, load a particular piece of information into a
particular register, store information in the memory, save and transfer programs from
storage into the computer control memory, enable or disable signals, and move back
and forth between the edit and run modes.
2. EDIT mode:
o Here, the programmer can edit or change a set of instructions of existing program
or introduce a new set of information.
o Any error shown by the monitor can be corrected.
o To come out of Edit mode, command ‘E’ should be given.
3. RUN or EXECUTIVE mode:
o The programs to carry out a predefined task can be executed in the run mode. The
sequential steps as written by the programmer are followed during run mode.
o DRY RUN can be used to test the program by making the switch instruction
DISABLE.
o The signals are made non operational by the disable switch.
o After Dry run, the switch may be made operational by the instruction ENABLE.
o By DEBUGGING, the errors in the program can be rectified.
o The path or coordinate points of locations are to be redefined and corrected in
EDIT mode. Then after ending of EDIT mode, the RUN mode may be actuated.
The robot will run following the correct trajectory.
o The operating system for implementing robot language program uses either an
INTERPRETER or a COMPILER.
o An interpreter takes the source program one line at a time and generates
equivalent code that is understood by the controller.
o If there is any error in the source program, the user can correct it and the line is
reinterpreted. Every line is executed by the interpreter as and when it is written.
o VAL is the language of PUMA class of robots and source codes or instructions
are processed by an interpreter.
o Compiler is the software in the operating system that converts source code into
the object code / machine code after compilation of the whole program.
o The robot controller can then read and process machine codes.
o Execution time for compiled program is fast while editing of an interpreted
program is very fast.
1. INTRODUCTION
VAL is an example of robot programming language and an operating system
which regulates and controls a robotic system by following the user commands
and instructions. It is designed to be highly INTERACTIVE to minimize the
programming time.
o The VAL O.S is stored in ROM, when the controller is powered ‘ON’, VAL is
immediately available.
o It is flexible and versatile, can perform most of the commands even while a user
program is being executed.
o Normally, the motions and actions of the robot are controlled by a program stored
in RAM, called USER Programs or VAL Programs.
o VAL also contains an editor to edit programs. In addition, the editor has a simple
mode of operation whereby a program step and a location definition are
automatically generated each time the RECORD button on the manual control unit
(TEACH PENDANT) is pressed. Programs can also include SUBROUTINES.
2. LOCATIONS
o Robot location refers to the position and orientation of the end effector or tool.
o VAL has two ways of specifying a robot location:
1. Precision point
2. Transformation
o Precision point is to express a location in terms of the positions of the individual
joints.
o Transformation is to express in terms of Cartesian co-ordinates( X,Y,Z) and
Orientation angles of the robot tool relative to a reference frame fixed in the robot
base.
3. TRAJECTORY CONTROL
VAL uses two different methods to control the path of a robot from one location to
another.
o The methods either (i) interpolate between the initial and final positions of each
joint, producing a complicated tool-tip curve in space, or (ii) move the robot tip
along a straight line path.
o The first case is called joint interpolated motion; the total motion time is set to
that of the joint requiring the longest time to complete its motion. This provides
the fastest response of the robot.
o Straight line motion is produced by applying an interpolating function to the
world co ordinate location of the robot tool and rapidly transforming the
interpolated tool location to joint commands.
o The motion speed of the tool-tip can be accurately controlled, but it is slower
than the corresponding joint interpolated motion (a disadvantage).
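The joint interpolated timing rule described above (total motion time set by the slowest joint, with all joints finishing together) can be sketched as follows; the joint displacements and speed limits are made-up example values.

```python
# Joint interpolated motion timing sketch: the slowest joint sets the
# total time, and every other joint's speed is scaled down so that all
# joints arrive at the goal simultaneously.

def joint_interpolated_plan(deltas, max_speeds):
    """Return (total_time, per-joint speeds) so all joints arrive at once."""
    # Time each joint would need running at its own maximum speed.
    times = [abs(d) / v for d, v in zip(deltas, max_speeds)]
    T = max(times)                        # slowest joint sets the pace
    speeds = [abs(d) / T for d in deltas] # scale the others down
    return T, speeds

# Three joints moving 90, 30 and 45 degrees with 30, 60 and 45 deg/s limits.
T, speeds = joint_interpolated_plan([90, 30, 45], [30, 60, 45])
print(T)        # 3.0 s, set by joint 1 (90 deg at 30 deg/s)
print(speeds)   # [30.0, 10.0, 15.0]
```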
Program: DEMO.RG
1. APPRO PART, 50
2. MOVES PART
3. CLOSEI
4. DEPARTS 150
5. APPROS BOX, 200
6. MOVE BOX
7. OPENI
8. DEPART 75
END
Explanation:
In this example, the robot approaches to within 50 mm of the location PART, moves to
PART along a straight line, closes the gripper, departs 150 mm along a straight line,
approaches to within 200 mm of BOX along a straight line, moves to BOX, opens the
gripper and finally departs 75 mm; that is, it picks up a part and places it in a box.
MONITOR COMMANDS
To enter and execute a program, one has to give certain VAL Commands called as
‘Monitor commands’
o Location variables can be set equal to the current location or previously defined
location by HERE and POINT command. The current location can be displayed
by using WHERE command.
o TEACH command records a series of location values under the control of
RECORD button on manual control unit.
Example: How to teach a position PART by HERE command, and the resulting positions
displayed on the screen.
To define position, a command like POINT PART = P 1 may be given where a location
variable PART is equal to the value P 1.
A command say TEACH P 1 is used to record a location variable P1 when the record button on
the Teach Pendant is pressed.
Successive locations can be assigned as P1,P2…..so on, by teaching new locations on the path
and pressing record button each time.
Editing Programs:
EDIT permits the programmer to modify a program. There are several sub commands in
EDIT to properly edit a program.
The sub command E means exit from the editing mode and return to the monitor mode.
The command DIR displays the names of all user programs in the system memory.
The commands ‘LISTL’ and ‘LISTP’ display the values of location variables and the steps of
user programs respectively.
The commands that can be used for Loading the programs, Locations and both programs
and locations contained in a specified disk into the system memory are
Program Control:
o The command to specify the speed for all subsequent robot motions, SPEED 30
(30% of monitor speed)
o The commands that executes a specified user program for once, any number of
times or indefinitely are
EXECUTE
EXECUTE, 5 (Execute 5 times)
EXECUTE -1 ( Execute indefinitely)
o The command that terminates program execution after completion of the current
step is ‘ABORT’
o In VAL II, a single joint (JT2) may be changed by driving it, say 60°, at a speed of
30% of the monitor speed.
In order to grip some objects, the end effector is required to be aligned such that its Z axis is
parallel to the nearest axis of the world co-ordinate system. The command is
‘DO ALIGN’
Similarly other DO command may be ‘DO MOVE PART’ (Part is in variable location)
PROGRAM INSTRUCTIONS
Any robot configuration change is accomplished during the execution of the next motion
instruction other than straight line motion.
RIGHTY or LEFTY – Change the robot configuration to resemble a right or left human arm
respectively.
ABOVE or BELOW commands make the elbow of the robot to point up or down
respectively.
Motion Control
DRAW- moves the robot along the straight line through specified distances in X,Y & Z
directions.
APPRO- moves the robot to a location which is at an offset ( along tool Z axis ) from a
specified point.
APPROS, DEPARTS do the same as APPRO & DEPART instructions but along straight line
paths.
CIRCLE moves the robot through circular interpolation via three specified point locations.
Hand Control
OPEN and CLOSE-Indicate respectively the opening and closing of the gripper during the
next instruction
CLOSEI 75 In VAL II, if a servo controlled gripper is used, then this command causes the
gripper to close immediately to 75mm.
GRASP 20, 15 –Causes the gripper to close immediately and checks whether the opening is
less than the amount of 20 mm. If so, the program branches to the statement 15.
MOVES PART, 30 – For a servo controlled end effector, causes a straight line motion to a
point defined by PART, with the gripper opening changed to 30 mm.
MOVET PART, 30 – causes the gripper to move to position, PART with an opening of 30
mm by joint interpolated motion.
The instructions that do the same as the corresponding monitor commands are
PROMPT – In VAL II, this command displays a message (shown in quotation marks) on the
CRT and the system waits for the operator to respond by typing in the requested value and
pressing the return key; the value is assigned to the variable name (say Y1) and program
execution then continues.
GOTO 20 – performs an unconditional branch to the program step identified by a given label,
20.
GOSUB and RETURN are necessary to transfer control to a subroutine and back to the
calling program respectively.
IF ROW LT 3 THEN
ELSE
END
If the logical expression ( say Row is less than 3 in a matrix) is TRUE, then the instruction
steps between THEN and ELSE are executed.
If the logical expression is FALSE then the instruction steps between ELSE and END are
executed.
SIGNAL – Turns signals ON or OFF at the specified output channels. It is also helpful for
communicating with the peripheral equipment interfaced with the robot in the work
cell.
IFSIG and WAIT -test the states of one or more external signals.
The command say, SIGNAL 2, -3 indicates that output signal 2(positive) is to be turned ON
and output signal 3(negative) is to be turned OFF.
WAIT SIG (-1,2) will prevent the program execution until external input signal 1 is turned
OFF(Negative) and external input signal 2 is turned ON.(positive)
The additional command, REACT –VAR 2, SUB TRAY indicates that the reactions are
invoked if the external binary signal identified is a negative variable , VAR 2.
If the current state of the signal is OFF, the signal is monitored for a transition from OFF to
ON and then again to OFF.
When the reaction or specified signal transition is detected, program control is transferred
to the subroutine named TRAY.
VAL II may communicate with either digital or analog signals through input / output
modules.
IOPUT and IOGET are the commands used to send output to and receive input from a
digital I/O module, respectively.
Analog signals can also be communicated through analog input/output modules by analog to
digital converter(ADC) or digital to analog converter (DAC).
The command DAC 1 = CONST sends a constant value to analog output channel 1 through
the DAC. Similarly, the command VAR 1 = ADC(1) returns the current input at analog channel
number 1 as an integer value in the range -2048 to +2047.
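The -2048 to +2047 count range corresponds to a signed 12-bit converter; the sketch below maps such counts to a voltage. The ±10 V full-scale range is an assumed example value, not taken from the VAL II documentation.

```python
# Sketch: interpreting signed 12-bit ADC counts (-2048..+2047) as a
# voltage. V_RANGE is an assumed example full-scale value.

V_RANGE = 10.0   # assumed +/-10 V full-scale input range
COUNTS = 2048    # half of the 4096-count signed 12-bit span

def counts_to_volts(n):
    """Map an ADC reading in -2048..+2047 to a voltage."""
    if not -2048 <= n <= 2047:
        raise ValueError("out of signed 12-bit range")
    return n * V_RANGE / COUNTS

print(counts_to_volts(-2048))  # -10.0 (negative full scale)
print(counts_to_volts(0))      # 0.0
print(counts_to_volts(2047))   # just under +10 V
```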
WELDING INSTRUCTIONS
WSET- Sets the speed, welding voltage and current as a welding condition identified by a
number (1-4).
For example, WSET 1 = 13, 54.3,63 sets a welding speed of 13 mm/s , welding voltage of
54.3% and welding current of 63% as welding condition 1.
WVSET sets a weaving pattern, setting some or all of the following parameters: cycle
distance, amplitude, right end stop distance, right end stop time, center stop distance, left end
stop distance and left end stop time.
WVSET 1 = 10, 7, 2, 0, 1, 3, 0
WSTART- starts welding under present welding condition and weaving condition
1. MATERIALS HANDLING
o Parts Transfer
o Parts Sorting
o Heat Treatment
o Palletizing
2. MACHINE LOADING & UNLOADING
o Die Casting
o Injection Moulding
o Forming, Stamping & Trimming processes
o Metal cutting machine tools like lathes etc.
3. MACHINING
o De burring
o Drilling
o Grinding
o Milling
o Threading, shaping etc.
4. MAINTENANCE
o Assembly : Mating parts or Parts inserting problems
o Inspection, Welding (Spot welding, Arc welding, Seam tracking)
o Spray Painting / Finishing
Storage / Buffer
Hopper
Magazining
Transporting
Moving
Feeding
Escaping
Positioning
Orienting
Aligning
Inserting
For any automated or flexible robotic assembly certain basic rules and procedures are
required to be followed for ease of assembly
1. The components for the assembled product should be selected for ease of assembly.
2. Parts should be designed for feeding and orienting in automated assembly, for which
product simplification and, if necessary, redesign of the products are necessitated.
3. For robotic assembly, a suitable gripping device should be designed for ease of assembly.
Problems and characteristics of Assembly:
o The basic problem considered in assembly is ‘ Peg in the hole’
o During insertion of the peg into the hole, vision and tactile sensing work in
coordinated and integrated manner.
o For robot to assemble, such characteristics of vision and force sensors are very
useful.
The following are the 3 main types of robots suitable for assembly operations.
o Cartesian robots
-It has 3 D.O.F (PPP)
-Suitable for simple assembly operations
-They have high accuracy and repeatability.
o Revolute robots
-They are also used for assembly tasks, operate on the high level languages VAL &
VAL II and have 6 D.O.F.
-They have world & tool co-ordinate systems.
Eg: PUMA Robot( Programmable Universal Machine for Assembly)
o SCARA robots
-Selective Compliance Assembly Robot Arm (SCARA) robots are suitable for assembly.
-These robots are provided with direct drive motors that allow high speeds and
accelerations with backlash free, fast and accurate motions.
-The accuracy is around +/- 0.076 mm and repeatability is +/- 0.025 mm.
WELDING
Robots find wide application in welding, and the robots used for the purpose
of welding are tool handling robots.
1. SPOT WELDING
Spot welding is widely used in fastening sheet metals and in automobile body assembly,
frames, panels, fabrication of metal furniture , domestic appliances etc.
Spot welding is done by fusing two metals at spots where heat is generated by
passing an electric current through the electrodes for a specific duration of time
while pressing the joining surfaces together with the electrodes.
A spot welding robot has
o A robot manipulator with several degrees of freedom.
o A welding gun held on a robot wrist.
o Controller and power source
o Input / Output Interfaces.
The operations involved in Spot welding are,
o Squeezing the two metal surfaces between the electrodes.
o Welding by passing current for specific duration of time depending on the type of
materials and its thickness.
o Releasing the grip.
In order to perform the spot welding,
o a robot should have enough payload capacity (50-100 kg), good repeatability, good
linear speed (30-90 m/min) and good angular speed (60°-180°/sec)
o The weld gun should be properly oriented and positioned on the product.
o The power rating varies from 30-150 kVA and the secondary voltage varies from 5 to
22 V.
o Robot welding produces weldment of better quality and provides greater safety
for workers.
o The welding parameters like cycle time, voltage, current etc can be changed and
different trajectories can be programmed.
2. ARC WELDING
Arc welding is a continuous welding process in which the work pieces are joined
with an airtight seal between the pieces.
o An electric arc struck between welding electrode and the work piece produces
necessary heat that causes fusion of two metal surfaces.
o A high temperature of about 3000-3300 °C melts the metal.
o Arc welding is done with a direct current of 100 A to 300 A at 10-30 V.
o In order to prevent Oxidation, Inert gases are used and the electrodes are coated
with some flux.
Basic Components of Robotic Arc welding system
1. Power Source
o For better monitoring of welding parameters, the power source shall incorporate
Voltmeter and Ammeter to check the fluctuations in voltage & current.
2. Wire feed Unit
o It consists of a feed motor, sets of feed and drive rollers, a speed regulator and a
wire guiding arrangement.
o The wire feed rates vary from 100 mm / min to 3000 mm / min.
MATERIALS HANDLING
o The jointed arm robots with 3-5 degrees of freedom can serve the material
handling application.
o Hydraulic or pneumatic drives with manual or powered lead through teaching
provide the motion in present robot designs.
o The next generation robots are expected to use servo motors with Programmable
Automation control(PAC).
o Polar (P2R / PRR) robots, Cylindrical (2PR / PPR) robots and Jointed arm
(3R / RRR) robots with 4-5 degrees of freedom are used for such applications.
o Electronic and servo drives are the future trends in the drives as compared to
present electrical and hydraulic drives.
o PAC can replace the powered lead programs.
o Micro motion controllers with vision can be control system of next generation
rather than the present point to point or limited response systems.
SPRAY PAINTING
o The jointed arm robots with 6 D.O.Fs having hydraulic drive are operated in
continuous path controlled by manual lead through, as seen in present robots.
o Adaptive arm and PLC controls operating in complicated and unsafe atmosphere
are future trend.
INSPECTION