Beamex Book Ultimate Calibration
2nd Edition
Beamex has used reasonable efforts to ensure that this book contains both accurate and comprehensive information. Notwithstanding the foregoing, the content of this book is provided "as is", without any representations, warranties or guarantees of any kind, whether express or implied, in relation to the accuracy, completeness, adequacy, currency, quality, timeliness or fitness for a particular purpose of the content and information provided in this book. The contents of this book are for general informational purposes only. Furthermore, this book provides examples of some of the laws, regulations and standards related to calibration and is not intended to be definitive. It is the responsibility of a company to determine which laws, regulations and standards apply in its specific circumstances.
Contents
Preface by the CEO of Beamex Group 7
QUALITY, REGULATIONS AND TRACEABILITY
Quality standards and industry regulations 11
A basic quality calibration program 35
Traceable and efficient calibrations in the process industry 57
CALIBRATION MANAGEMENT AND MAINTENANCE
Why Calibrate? What is the risk of not calibrating? 73
Why use software for calibration management? 79
How often should instruments be calibrated? 89
How often should calibrators be calibrated? 97
Paperless calibration improves quality and cuts costs 101
Intelligent commissioning 107
Successfully executing a system integration project 115
CALIBRATION IN INDUSTRIAL APPLICATIONS
The benefits of using a documenting calibrator 125
Calibration of weighing instruments Part 1 131
Calibration of weighing instruments Part 2 137
Calibrating temperature instruments 143
Calculating total uncertainty of temperature calibration with a dry block 149
Fieldbus transmitters must also be calibrated 157
Configuring and calibrating smart instruments 163
Calibration in hazardous environments 169
The safest way to calibrate fieldbus instruments 175
APPENDIX: Calibration terminology A to Z 181
Preface
Quality, Regulations and Traceability

Calibration requirements according to quality standards and industry regulations
finger plus the width of the palm of the hand of the Pharaoh or King
ruling at that time.2
The Royal Cubit Master was carved out of a block of granite to
endure for all times. Workers engaged in building tombs, temples,
pyramids, etc. were supplied with cubits made of wood or granite. The
Royal Architect or Foreman of the construction site was responsible for
maintaining & transferring the unit of length to workers instruments.
They were required to bring back their cubit sticks at each full moon
to be compared to the Royal Cubit Master.
Failure to do so was punishable by death. Though the punishment
prescribed was severe, the Egyptians had anticipated the spirit of the
present day system of legal metrology, standards, traceability and
calibration recall.
With this standardization and uniformity of length, the Egyptians
achieved surprising accuracy. Thousands of workers were engaged in
building the Great Pyramid of Giza. Through the use of cubit sticks,
they achieved an accuracy of 0.05%. In roughly 756 feet or 230.36276
meters, they were within 4.5 inches or 11.43 centimeters.
The need for calibration has been around for at least 5000 years.
In today's calibration environment, there are basically two types of requirements: ISO standards and regulatory requirements. The biggest difference between the two is simple: ISO standards are voluntary, while regulatory requirements are mandatory. If an organization volunteers to meet ISO 9000 standards, it pays a company to audit it against that standard, ensuring it follows its quality manual and remains in compliance. On the other hand, if a company manufactures a drug that must meet regulatory requirements, it is inspected by government inspectors for compliance with federal regulations. In the case of ISO standards, a set of guidelines is used to write the quality manual and other standard operating procedures (SOPs), and the company shows how it complies with the standard. The federal regulations, however, specify in greater detail what a company must do to meet the requirements set forth in the Code of Federal Regulations (CFRs).
In Europe, detailed information for achieving regulatory compliance is provided in EudraLex, Volume 4 of The Rules Governing Medicinal Products in the European Union.
The Pharmaceutical Inspection Convention and Pharmaceutical
Inspection Co-operation Scheme (PIC/S) aims to improve
harmonisation of Good Manufacturing Practice (GMP) standards
and guidance documents.
Operational qualification
15. The completion of a successful Operational qualification
should allow the finalisation of calibration, operating and cleaning
procedures, operator training and preventative maintenance
requirements. It should permit a formal release of the facilities,
systems and equipment.
Qualification of established (in-use) facilities, systems and equipment
19. Evidence should be available to support and verify the operating
parameters and limits for the critical variables of the operating
equipment. Additionally, the calibration, cleaning, preventative
maintenance, operating procedures and operator training procedures
and records should be documented.
PROCESS VALIDATION
Prospective validation
24. Prospective validation should include, but not be limited to, the following:
(a) short description of the process;
(b) summary of the critical processing steps to be investigated;
(c) list of the equipment/facilities to be used (including measuring/monitoring/recording equipment) together with its calibration status;
(d) finished product specifications for release;
(e) list of analytical methods, as appropriate;
(f) proposed in-process controls with acceptance criteria;
(g) additional testing to be carried out, with acceptance criteria and analytical validation, as appropriate;
(h) sampling plan;
(i) methods for recording and evaluating results;
(j) functions and responsibilities;
(k) proposed timetable.
EU GMP Annex 11
The EU GMP Annex 11 defines EU requirements for computerised
systems, and applies to all forms of computerised systems used as part
of GMP regulated activities.
Main page for the EudraLex - Volume 4 Good manufacturing practice
(GMP) Guidelines:
http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm
EUROPEAN COMMISSION
HEALTH AND CONSUMERS DIRECTORATE-GENERAL
Public Health and Risk Assessment
Pharmaceuticals
Brussels,
SANCO/C8/AM/sl/ares(2010)1064599
EudraLex
The Rules Governing Medicinal Products in the European Union
Volume 4
Good Manufacturing Practice
Medicinal Products for Human and Veterinary Use
Annex 11: Computerised Systems
Legal basis for publishing the detailed guidelines: Article 47 of Directive
2001/83/EC on the Community code relating to medicinal products
for human use and Article 51 of Directive 2001/82/EC on the
Community code relating to veterinary medicinal products. This
document provides guidance for the interpretation of the principles
GAMP
GAMP is a Community of Practice (COP) of the International
Society for Pharmaceutical Engineering (ISPE). The GAMP COP
aims to provide guidance and understanding concerning GxP
computerized systems. COPs provide networking opportunities
for people interested in similar topics. The GAMP COP organizes discussion forums for its members, and ISPE organizes GAMP-related training courses and educational seminars.
GAMP itself was founded in 1991 in the United Kingdom to deal with evolving FDA expectations for Good Manufacturing Practice (GMP) compliance of manufacturing and related systems. In 1994, the organization entered into a partnership with ISPE and published its first GAMP guidelines.
Three regional Steering Committees (GAMP Japan, GAMP Europe, and GAMP Americas) support the GAMP Council, which oversees the operation of the COP and is the main link to ISPE.
Several local GAMP COPs, such as GAMP Americas, GAMP
Nordic, GAMP DACH (Germany, Austria, Switzerland), GAMP
Francophone, GAMP Italiano and GAMP Japan, produce technical
content and translate ISPE technical documents. They also bring the
GAMP community closer to its members, in collaboration with
ISPE's local Affiliates in these regions.
The best-known GAMP publication is GAMP 5: A Risk-Based Approach to GxP Computerized Systems. This is the latest major revision and was released in January 2008. There is also a series of related GAMP guidance documents on specific topics, including:
GAMP Good Practice Guide: A Risk-Based Approach to
Calibration Management (Second Edition)
GAMP Good Practice Guide: A Risk-Based Approach to GxP
Compliant Laboratory Computerized Systems (Second Edition)
GAMP Good Practice Guide: A Risk-Based Approach to GxP
Process Control Systems (Second Edition)
GAMP Good Practice Guide: A Risk-Based Approach to Operation
of GxP Computerized Systems - A Companion Volume to GAMP 5
GAMP Good Practice Guide: Electronic Data Archiving
GAMP Good Practice Guide: Global Information Systems Control
and Compliance
A basic quality
calibration program
The bottom line is this: all test equipment that makes a quantitative measurement requires periodic calibration. It is as simple as that. However, before we go any further, we need to clarify two definitions that are critical to this subject: calibration and traceability.
By definition:
Calibration is a comparison of two measurement devices or systems,
one of known uncertainty (your standard) and one of unknown
uncertainty (your test equipment or instrument).
Traceability is the property of the result of a measurement or the value
of a standard whereby it can be related to stated references, usually
national or international standards, through an unbroken chain of
calibrations all having stated uncertainties.
The calibration of any piece of equipment or system is simply
a comparison between the standard being used (with its known
uncertainty), and the unit under test (UUT) or test instrument that
is being calibrated (the uncertainty is unknown, and that is why it is
being calibrated). It does not make any difference if you adjust, align or
repair the item, nor if you cannot adjust or align it. The comparison to a standard that is more accurate, no matter the circumstances, is called calibration. Many people are under the misconception that an item
must be adjusted or aligned in order to be calibrated. Nothing could
be further from the truth.
Before we can get any deeper into what traceability is, we should
explain two different traceability pyramids. When we talk about
traceability to a national or international standard, the everyday
calibration technician is usually situated close to the bottom of the
pyramid, so a graphic illustration of these pyramids is important.
The two examples in figures 1 and 2 are similar, but differ depending
on where you are in the chain, or certain parts of the world.
There are basically two ways to maintain traceability during calibration: the use of an uncertainty budget (performing uncertainty calculations for each measurement), or the use of a test uncertainty ratio (TUR) of 4:1. First, let's discuss the use of uncertainty budgets.
According to the European cooperation for Accreditation of Laboratories publication EAL-G12, Traceability of Measuring and Test Equipment to National Standards, the purpose of which is to give guidance on the calibration and maintenance of measuring
Figure 1: Traceability pyramid: BIPM; NMIs; reference standards; working metrology labs; general-purpose calibration labs (inside a company); users' test equipment.
Figure 2: Traceability pyramid: SI units; primary standards; secondary standards; reference standards; working standards; users' test equipment.
Note: NMI = National Metrology Institute
was accomplished and signed off. When the proper training is not documented and signed off by the trainer and trainee, it is the same as if the training never happened.
What is a quality calibration program?
A quality calibration program consists of several broad items referred
to in the Quality System Regulation (QSR) from the Food and Drug
Administration (FDA). These items are also referred to by other
standards (ISO 9000, etc.) and regulations throughout most industries
that regulate or monitor production and manufacturing of all types
of products. One of the most stringent requirements can be found in the current Good Manufacturing Practice (cGMP) regulations.
The basic premise and foundation of a quality calibration program is: Say what you do, Do what you say, Record what you did, Check the results, and Act on the difference. Let's break these down into simple terms.
Say what you do means write in detail how to do your job. This
includes calibration procedures, work instructions and standard
operating procedures (SOPs).
Do what you say means follow the documented procedures or
instructions every time you calibrate, or perform a function that
follows specific written instructions.
Record what you did means that you must record the results of your
measurements and adjustments, including what your standard(s) read or
indicated both before and after any adjustments might be made.
Check the results means make certain the test equipment meets
the tolerances, accuracies, or upper/lower limits specified in your
procedures or instructions.
Act on the difference means that if the test equipment is out of tolerance, you're required to inform the user/owner of the equipment, because they may have to re-evaluate manufactured goods, change a process, or recall a product.3
Say what you do means write in detail how to do your job. This
includes calibration procedures, work instructions and SOPs. All
of your calibration procedures should be formatted the same as
other SOPs within your company. Here is an example of common
formatting for SOPs:
1. Purpose
2. Scope
3. Responsibilities
4. Definitions
5. Procedure
6. Related Procedures
7. Forms and Records
8. Document History
After section 4, Definitions, you should have a table listing all of the instruments or systems that would be calibrated by that procedure, along with their ranges and tolerances. After that, you should have a list of the standards to be used to calibrate the items. This table should also include the standards' ranges and specifications. The actual calibration procedure then starts in section 5, Procedure.
Manufacturers' manuals usually provide an alignment procedure that can be used as a template for writing a calibration procedure. They should show which standards accomplish the calibration of a specific range and/or function. A complete calibration must be performed prior to any adjustment or alignment. An alignment procedure and/or preventive maintenance inspection (PMI) may be incorporated into your SOP as long as it is separate from the actual calibration procedure.
There are, generally speaking, two types of calibration procedures:
Generic: temperature gages and thermometers, pressure and vacuum
gages, pipettes, micrometers, power supplies and water baths.
Specific: spectrophotometers, thermal cyclers, and balances/scales.
Generic SOPs are written to show how to calibrate a large variety of items in a general context. Specific SOPs are written to show step-by-step procedures for each different type of test instrument within a group of items. Ideally, the calibration form is designed to follow the procedure's numbered steps, removing any doubt for the calibration technician about which data goes into which data field.
Do what you say means follow the documented procedures or
instructions every time you calibrate, or perform a function that follows
specific written instructions. This means following published calibration
procedures every time you calibrate a piece of test equipment.
Have the latest edition of the procedure available for use by your
calibration technicians. Have a system in place for updating your
record the reading (on the standard and the UUT) and continue with
the rest of the calibration to the end of the calibration procedure. If one
were to stop at the point where an OOT is found, make an adjustment,
then proceed with the calibration, there is a good possibility that the
adjustment affected other ranges or parts of the calibration. This is why
the entire calibration is performed prior to adjustment or alignment.
There will be times when an instrument has a catastrophic failure.
It just dies and cannot be calibrated. This should be noted in the
calibration record. Then, once the problem is found and repaired, an
As Found calibration is performed. The UUT is treated the same as
any OOT unit, but you would not have been able to collect the original
As Found readings.
As Left readings are taken after repair, alignment, or adjustment.
Not all UUTs would be considered OOT when As Left readings are taken. In some circumstances, it might be metrology department policy to adjust an item that has drifted through much of its in-tolerance range while still meeting its specifications. In this type of situation, after the UUT is adjusted to be as close to optimum as possible, a complete calibration is again performed, collecting the As Left
readings for the final calibration record. Another example would be
when preventive maintenance inspection is going to be performed on
an item. The calibration is performed, collecting the As Found data.
Then the PMI is completed, and an As Left set of data is collected.
If the item is found to be out-of-tolerance at that time, there would
not be a problem since it was found to be in tolerance during the first
calibration. It would be obvious that something happened during the
cleaning, alignment or adjustment and that after a final adjustment
was completed to bring the unit back into tolerance, a final As Left
calibration would be performed.
The standard reading, from the working or reference standard you are
using to calibrate the UUT, will also be recorded on the calibration form.
Usually, the standard is set at a predetermined output, and the UUT is
read to see how much it deviates from the standard. This is a best practice
policy that has been in use in the metrology community since calibration
started. However, there will be times when this is not possible.
One example when it would not be practical to set the standard and
take a reading is during the calibration of water baths. The water bath
is set to a predetermined temperature, and the temperature standard is
used to record the actual reading. Compare this to the calibration of
pressure gages where a pressure standard is set to a standard pressure,
and the gage(s) under test are then read, and their pressures recorded on
the calibration record, and compared to the standard to see if they are
in or out of tolerance. In other cases, such as the calibration of autoclaves, the unit is set to complete a sterilization cycle while a temperature device records all of the temperature readings throughout the cycle, and the readings are checked to see if the autoclave met its specifications. The
same happens when calibrating thermometers. They, along with the
standard, are placed in a dry block and a particular temperature is
set. The UUT is compared to the reference after equilibration, and a
determination is made as to the in or out of tolerance of the UUT. As
can be seen by the above examples, it is not always possible to set the
standard and take a reading from the UUT.
Also on the calibration form should be an area to identify the
standard(s) that were used, along with their next calibration due
date(s), plus their specifications and range.
There should also be a place to identify which calibration procedure was used, along with the procedure's revision number. There must be a statement showing traceability to your NMI (in the case of most companies in the USA, to NIST) or to any artifact that was used as a standard.
You should include any uncertainty budgets if used, or at least a
statement that a TUR of 4:1 was met.
List environment conditions when appropriate and show if they pass
or fail. According to NCSL International Calibration Control Systems
for the Biomedical and Pharmaceutical Industry Recommended
Practice RP-6, paragraph 5.11: The calibration environment need be
controlled only to the extent required by the most environmentally
sensitive measurement performed in the area.4
According to ANSI/NCSL Z540.3-2006, paragraph 5.3.6 Influence
factors and conditions: All factors and conditions of the calibration
area that adversely influence the calibration results shall be defined,
monitored, recorded, and mitigated to meet calibration process
requirements. Note: Influencing factors and conditions may include
temperature, humidity, vibration, electromagnetic interference, dust,
etc. Calibration shall be stopped when the adverse effects of the influence factors and conditions jeopardize the results of the calibration.5
If the conditions within the area that calibrations are being performed
require monitoring according to the standard or requirements that must
be met, then a formal program must be in place for tracking those
conditions and reviewing the data. If this is the case, then there should
be a place in the calibration form for showing that those conditions were
either met, were not met, or are not applicable to that calibration.
You should indicate on the form if the calibration passed or failed.
If the UUT had an out-of-tolerance condition, then there should
be a place to show what happened to the UUT, with the following
possibilities as an example:
The user/customer was notified and the UUT was adjusted
and meets specifications.
The user/customer was notified and the UUT was given
a limited calibration with their written approval.
The user/customer was notified and the UUT was taken
out of service and tagged as unusable.
Notice that in each circumstance the user/customer must be notified of any and all OOTs. This is called for in all of the standards and regulations. The user/customer, even if internal to the company performing the calibrations, must be informed if their test equipment does not meet its specifications.
There should be an area set aside in the calibration form for making
comments or remarks. Enough space should be available for the
calibration technician to include information about the calibration,
OOT conditions, what was accomplished if an OOT was found, etc.
And finally, the calibration record must be signed and dated by
the technician performing the calibration. In some instances, the
calibration record requires a second set of eyes. This means that an
individual higher up the chain of command (supervisor, manager,
QA inspector, etc.) must review the calibration record and also sign
and date that it has been reviewed, audited, or inspected before it is
considered a completed record. If this is the case, there should be a
place on the form for the final reviewer to sign and date.
What do you do if, after recording your results, you find that you
have made an error, or transposed the wrong numbers, and want to
correct the error? For hard copy records, draw a single line through
the entry, write the correct data, and then place your initials and date
next to the data using black ink. Do not use white-out, or erase the
original data. For making corrections to electronic records (eRecords),
use whatever tracking system the software uses; or make a duplicate
record from scratch with the correct data and explain in the comments
block what happened, and date and sign accordingly.
There should be only one way to file your records, both hard copy and eRecords. No matter which system you use, put it into your written procedures.
An example for filing hard copy records:
Each record is filed by its unique ID number
Records are filed with the newest in the front
Records are filed within a specified time frame
An example for filing eRecords:
Filed by ID number, calibration certificate number and calibration
date
Placed on a secure drive that has regular backup
eRecords are filed within a specified time frame
There are many different ways to manage your calibration data since
there are a variety of ways to collect that data. Hard copy records collected
during the calibration of test instruments have been discussed in detail
already. But the collection of data by electronic means, or through the
use of calibration software, process controllers, etc., should also be
considered. Is the system validated and instrumentation qualified prior
to use? If you are using any type of computerized system, validation of
that software is mandatory. How is the data collected and stored? Is it in
its native format or dumped into a spreadsheet for analysis? All of these
need to be considered to allow for review, analysis, and/or compilation
into your forms, and eventual storage.
The use of computerized data collection brings with it not only increased productivity and savings in time and effort, but also new problems in how to collect, manage, review and store the data. The criticality of validating your software, data lines and storage systems when going entirely electronic with your calibration records and data management cannot be emphasized enough.
Check the results means make certain the test equipment meets
the tolerances, accuracies, or upper/lower limits specified in your
procedures or instructions.
There are various ways to do this. Calibration forms should have the
range and their tolerances listed for each piece of test equipment being
calibrated. In some instances it is apparent what the tolerances will be
for the items being calibrated. In other cases it is not quite so apparent.
[Figure: Typical calibration process as a flow chart. Start; perform the As Found test and save the As Found results. If no adjustment is required, the calibration ends. Otherwise, adjust as needed until the readings are within limits, perform the As Left test, save the As Left results, and end.]
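The As Found / As Left flow described earlier can be sketched in a few lines of code. This is a simplified illustration; the function names, readings and limits are hypothetical:

```python
def within_limits(readings, nominal, tolerance):
    """True when every reading falls inside nominal +/- tolerance."""
    return all(abs(r - nominal) <= tolerance for r in readings)

def calibrate(as_found, nominal, tolerance, adjust):
    """Run the As Found test; adjust and take As Left readings only if needed."""
    record = {"as_found": list(as_found)}
    if within_limits(as_found, nominal, tolerance):
        record["pass"] = True          # no adjustment required
    else:
        as_left = adjust()             # adjust, then a complete As Left test
        record["as_left"] = as_left
        record["pass"] = within_limits(as_left, nominal, tolerance)
    return record

# Hypothetical gauge reading 0.3 high at a 10.0 nominal, +/-0.2 tolerance.
result = calibrate([10.3], 10.0, 0.2, adjust=lambda: [10.05])
print(result["pass"])  # True after adjustment
```

Note that, as in the flow chart, the As Left readings are recorded only when an adjustment was actually made; an in-tolerance unit keeps its As Found record as the final result.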
WHAT IS MEASUREMENT?
In technical standards terms, the word measurement has been defined as:
A set of experimental operations for the purpose of determining the value of a quantity.
What, then, is the value of a quantity? According to the standards, the true value of a quantity is:
The value which characterizes a quantity perfectly defined during the conditions which exist at the moment when the value is observed. Note: the true value of a quantity is an ideal concept and, in general, it cannot be known.
Therefore, all instruments display false indications!
[Figure: Hierarchy of accuracy, from the true value through international and national standards, authorized laboratories, and instrument departments with house and working standards, down to process instrumentation.]
2. Why measure?
The purpose of a process plant is to convert raw material, energy,
manpower and capital into products in the best possible way. This
conversion always involves optimizing, which must be done better
than the competitors. In practice, optimization is done by means of
process automation. However, regardless of how advanced the process automation system is, the control can be no better than the quality of the measurements from the process.
3. Why calibrate?
The primary reason for calibrating is based on the fact that even the best measuring instruments lack absolute stability; in other words, they drift and lose their ability to give accurate measurements. This drift makes recalibration necessary.
[Figure: Everything is based on measurements. Production factors enter the process and products come out; measurements feed the instrumentation and controls, which make adjustments to the process.]
Environment conditions, elapsed time and type of application can
all affect the stability of an instrument. Even instruments of the same
manufacturer, type and range can show varying performance. One
unit can be found to have good stability, while another performs
differently.
Other good reasons for calibration are:
[Figure: Quality maintenance over time. The purchased quality QP degrades between successive calibrations C1 to C7, and each calibration restores the instrument to a good-as-new state. QM is the maintained quality; QZM, the zero-maintained quality, drifts down toward the lower tolerance limit.]
4. Traceability
Calibrations must be traceable. Traceability is a declaration stating
to which national standard a certain instrument has been compared.
[Figure: Traceability chain, from SI units through international, national, reference and working standards down to process standards.]
Software systems need features such as Electronic Signature, Audit Trail, User Management, and Security System to be able to comply with these regulations.
Validation
Validation of measurement and test methods (procedures) is generally
necessary to prove that the methods are suitable for the intended use.
Non-linearity
Non-linearity is the maximum deviation of a transducers output from
a defined straight line. Non-linearity is specified by the Terminal
Based method or the Best Fit Straight Line method.
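The two specification methods can be illustrated numerically: the terminal-based line passes through the endpoint readings, while the best fit straight line is a least-squares fit. A sketch with hypothetical transducer readings:

```python
# Hypothetical input/output pairs for a transducer over its range.
x = [0.0, 25.0, 50.0, 75.0, 100.0]
y = [0.00, 25.30, 50.40, 75.25, 100.00]
span = y[-1] - y[0]
n = len(x)

# Terminal-based: a straight line through the first and last readings.
def terminal(xi):
    return y[0] + (y[-1] - y[0]) * (xi - x[0]) / (x[-1] - x[0])

nl_terminal = max(abs(yi - terminal(xi)) for xi, yi in zip(x, y)) / span * 100

# Best fit straight line: ordinary least-squares slope and intercept.
mx, my = sum(x) / n, sum(y) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx
nl_bfsl = (max(abs(yi - (slope * xi + intercept)) for xi, yi in zip(x, y))
           / span * 100)

print(f"terminal-based: {nl_terminal:.2f}% FS, BFSL: {nl_bfsl:.2f}% FS")
```

For the same readings, the BFSL figure is typically the smaller of the two, which is worth remembering when comparing manufacturers' non-linearity specifications.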
Resolution
Resolution is the smallest interval that can be read between two
readings.
Sensitivity
Sensitivity is the smallest variation in input, which can be detected as
an output. Good resolution is required in order to detect sensitivity.
Hysteresis
Hysteresis is the deviation in output at any given point within the instrument's sensing range when that point is first approached with increasing values and then with decreasing values.
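A hysteresis figure can be derived from paired upscale and downscale readings taken at the same test points. A minimal sketch with hypothetical readings:

```python
# Hypothetical output readings at the same applied inputs, approached
# first with increasing values (upscale), then with decreasing (downscale).
upscale   = [0.0, 24.8, 49.7, 74.8, 100.0]
downscale = [0.1, 25.2, 50.3, 75.1, 100.0]
span = 100.0

# Hysteresis: the largest up/down disagreement, as a percentage of span.
hysteresis = max(abs(u - d) for u, d in zip(upscale, downscale)) / span * 100
print(f"hysteresis: {hysteresis:.2f}% of span")
```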
Repeatability
Repeatability is the capability of an instrument to give the same
output among repeated inputs of the same value over a period of time.
Repeatability is often expressed in the form of standard deviation.
Temperature coefficient
The temperature coefficient is the change in a calibrator's accuracy caused by changes in ambient temperature (deviation from reference conditions). It is usually expressed as % F.S./°C or % of RDG/°C.
Stability
Often referred to as drift, stability is expressed as the change in
percentage in the calibrated output of an instrument over a specified
period, usually 90 days to 12 months, under normal operating
conditions. Drift is usually given as a typical value.
Accuracy
Generally, accuracy figures state the closeness of a measured value to a known reference value. The accuracy of the reference value itself is generally not included in the figures. It must also be checked whether errors such as non-linearity, hysteresis and temperature effects are included in the accuracy figures provided.
Accuracy is usually expressed as % F.S. or as % of RDG plus an adder. The difference between these two expressions is great. The only way to compare accuracies presented in different ways is to calculate the total error at certain points.
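Calculating the total error at a few points, as suggested above, makes the comparison concrete. The two specifications below are hypothetical (0.05% of full scale versus 0.025% of reading plus a fixed adder):

```python
FULL_SCALE = 100.0

def error_fs(reading, pct_fs=0.05):
    """% F.S. spec: the error band is constant across the whole range."""
    return FULL_SCALE * pct_fs / 100

def error_rdg(reading, pct_rdg=0.025, adder=0.01):
    """% RDG + adder spec: the error band shrinks at low readings."""
    return reading * pct_rdg / 100 + adder

for reading in (10.0, 50.0, 100.0):
    print(f"at {reading:5.1f}: FS spec +/-{error_fs(reading):.4f}, "
          f"RDG spec +/-{error_rdg(reading):.4f}")
```

At 10% of scale the % RDG specification is far tighter, while near full scale the gap narrows; this is why comparing at several points across the range is the only fair test.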
Uncertainty
Uncertainty is an estimate of the limits, at a given cover factor (or
confidence level), which contain the true value.
Uncertainty is evaluated according to either a Type A or a Type B method. Type A involves the statistical analysis of a series of measurements; the components evaluated this way are measurement errors that can vary in magnitude and in sign in an unpredictable manner. The other group of components, Type B, could be said to be of a systematic nature: systematic errors or effects remain constant during the measurement. Examples of systematic effects include errors in the reference value, the set-up of the measurement, ambient conditions, etc. Type B evaluation is used when the uncertainty of a single measurement is expressed.
It should be noted that, in general, errors due to observer fallibility
cannot be accommodated within the calculation of uncertainty.
Examples of such errors include: errors in recording data, errors in
calculation, or the use of inappropriate technology.
Type A uncertainty
The type A method of calculation can be applied when several
independent measurements have been made under the same
conditions. If there is sufficient resolution in the measurement, there
will be an observable difference in the values measured.
The standard deviation, often called the root-mean-square
repeatability error, for a series of measurements under the same
conditions, is used for calculation. Standard deviation is used as a
measure of the dispersion of values.
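A minimal numeric illustration of the Type A method: the sample standard deviation of a series of repeated readings, and the standard uncertainty of their mean (s divided by the square root of n). The readings below are invented.

```python
import statistics

# Ten hypothetical repeated readings taken under the same conditions
readings = [10.02, 10.04, 9.98, 10.01, 10.03, 9.99, 10.00, 10.02, 10.01, 10.00]

n = len(readings)
mean = statistics.mean(readings)
s = statistics.stdev(readings)   # sample standard deviation (n - 1 divisor)
u_a = s / n ** 0.5               # Type A standard uncertainty of the mean
print(f"mean = {mean:.4f}, s = {s:.4f}, u_A = {u_a:.4f}")
```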
Type B uncertainty
Type B evaluation of uncertainty involves the use of other means
to calculate uncertainty, rather than applying statistical analysis of a
series of measurements.
It involves the evaluation of uncertainty using scientific judgement
based on all available information concerning the possible variables.
Values belonging to this category may be derived from:
The proper use of the available information calls for insight based
on experience and general knowledge. It is a skill that can be learnt
with practice. A well-based Type B evaluation of uncertainty can
be as reliable as a Type A evaluation of uncertainty, especially in
a measurement situation where a Type A evaluation is based only
on a comparatively small number of statistically independent
measurements.
Expanded uncertainty
The EA (European co-operation for Accreditation) has decided that
calibration laboratories accredited by members of the EA shall state
an expanded uncertainty of measurement, obtained by multiplying the
standard uncertainty by a coverage factor k. In cases where a normal
(Gaussian) distribution can be assumed, the standard coverage factor,
k = 2, should be used. The expanded uncertainty then corresponds to a
coverage probability (or confidence level) of approximately 95%.
For uncertainty specifications, there must be a clear statement of
coverage probability or confidence level. Usually one of the following
confidence levels is used:
1σ ≈ 68%
2σ ≈ 95%
3σ ≈ 99.7%
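A sketch of the arithmetic, assuming independent components that may be combined in quadrature before applying the coverage factor; the component values are invented.

```python
def expanded_uncertainty(standard_uncertainties, k=2.0):
    """Combine independent standard uncertainties in quadrature and apply
    a coverage factor k (k = 2 corresponds to ~95 % coverage for a
    normal distribution). The input values below are illustrative only."""
    u_c = sum(u ** 2 for u in standard_uncertainties) ** 0.5
    return k * u_c

# e.g. reference standard, repeatability and resolution contributions
U = expanded_uncertainty([0.010, 0.006, 0.003], k=2.0)  # ~0.0241
```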
7. CALIBRATION MANAGEMENT
Many companies do not pay enough attention to calibration
management, although it is a requirement of standards such as
ISO 9001:2008. The maintenance management system may raise an alert
when calibration is due and then open a work order. Once the job has
been done, the work order is closed and the maintenance system is
satisfied.
Unfortunately, what happens between the opening and closing of the
work order is often not documented at all. If something is documented,
it is usually a hand-written sheet that is then archived.
If the calibration results need to be examined at a later time, finding
the sheets requires a lot of effort.
Choosing professional tools for maintaining calibration records
and doing the calibrations can save a lot of time, effort and money.
An efficient calibration management system consists of calibration
management software and documenting calibrators.
Modern calibration management software can be a tool that
automates and simplifies calibration work at all levels. It automatically
creates a list of instruments waiting to be calibrated in the near future.
If the software is able to interface with other systems the scheduling
of calibrations can be done in the maintenance system from which
the work orders can be automatically loaded into the calibration
management software.
When the technician is about to calibrate an instrument, (s)he simply
downloads the instrument details from the calibration management
software into the memory of a documenting calibrator; no printed
notes are needed. The As Found and As Left results are saved in the
calibrator's memory, so nothing needs to be written down by hand.
The instrument's measurement ranges and error limits are defined in
the software and downloaded to the calibrator along with the other
details. The calibrator can therefore determine whether the calibration
passed or failed immediately after the last calibration point is recorded;
there is no need to make error calculations manually in the field.
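The check the calibrator performs can be sketched for a hypothetical linear transmitter ranged 0–100 °C with a 4–20 mA output, with the error limit expressed as % of the output span. The range, calibration points and limit below are invented for illustration.

```python
def ideal_ma(temp_c, lo=0.0, hi=100.0):
    # Ideal 4-20 mA output of a linear transmitter ranged lo..hi (assumed)
    return 4.0 + 16.0 * (temp_c - lo) / (hi - lo)

def calibration_passed(points, limit_pct_span):
    """points: (applied temperature, measured mA) pairs. Errors are
    expressed as % of the 16 mA output span and compared to the limit."""
    errors = [abs(ma - ideal_ma(t)) / 16.0 * 100.0 for t, ma in points]
    return max(errors) <= limit_pct_span, errors

ok, errs = calibration_passed([(0.0, 4.01), (50.0, 12.02), (100.0, 19.98)], 0.25)
# max error here is 0.125 % of span, so the calibration passes
```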
All this saves an extensive amount of time and prevents the user
from making mistakes. The increase in work productivity allows for
more calibrations to be carried out within the same period of time
as before. Depending on what process variable is calibrated and how
many calibration points are recorded, using automated tools can be 5
to 10 times faster compared to manual recording.
While the calibration results are uploaded onto the database, the
software automatically detects the calibrator that was used, and the
traceability chain is documented without requiring any further actions
from the user.
Calibration records, including the full calibration history of an
instrument, are stored in the same database and can be retrieved
whenever they are needed.
Implementing a modern calibration management system benefits everybody who has anything to do with instrumentation. A good, automated calibration system reduces workload.
References
[1] ISO 9001:2008, Quality management systems - Requirements
[2] 21 CFR Part 11: Electronic Records; Electronic Signatures
[3] 21 CFR Part 211: Current Good Manufacturing Practice for Finished Pharmaceuticals
Calibration Management and Maintenance
Why Calibrate? What is the risk of not calibrating?
manufacturing conditions (dust and dirt) and elapsed time are all
contributing factors here. Even instruments manufactured by the same
supplier can vary in their performance over time.
Calibration also ensures that product or batch quality remains high
and consistent over time. Quality systems such as ISO 9001, ISO 9002
and ISO 14001 require systematic, well-documented calibrations with
respect to accuracy, repeatability, uncertainty and confidence levels.
This affects all process manufacturers.
Armando Rivero Rubalcaba is head of Instrumentation at beer
producer Heineken (Spain). He comments: "For Heineken, the quality
of the beer is the number one priority. All the plants in Spain have
received ISO 9001 and ISO 14001 certifications, in addition to the BRC
certificate of food safety. We must therefore ensure that all processes
correspond to the planned characteristics. The role of calibration is very
important to ensure the quality and safety of the processes."
Pharmaceutical manufacturers must follow current Good
Manufacturing Practice (GMP), which requires that calibration records
are maintained and that calibrations are carried out in accordance
with written, approved procedures. Typically, each instrument has a
master history record and a unique ID. All product, process and safety
instruments should also be physically tagged.
Furthermore, a calibration interval and error limits should be
defined for each instrument and standards should be traceable to
national and international standards. Standards must also be more
accurate than the required accuracy of the equipment being calibrated.
On the people side, there must be documented evidence that
employees involved in the calibration process have been properly
trained and are competent. The company must also have a documented
change management system in place, with all electronic systems
complying with FDA regulations 21 CFR Part 11.
In the power generation, energy and utilities industries, instrument
calibration can help to optimize a company's production process or to
increase a plant's production capacity. For example, at the Almaraz
Nuclear Power Plant in Spain, improving the measurement of
reactor power parameters from 2% to 0.4% enabled the reactor power
in each unit to be increased by 1.6%, which has a significant effect on
annual production capacity.
Safety is another important reason to calibrate instruments.
Production environments are potentially high risk areas for employees
and can involve high temperatures and high pressures. Incorrect
measurements in a hazardous area could lead to serious consequences.
Calibration is of great importance, especially from the viewpoint of production safety and quality of the final product.
When to calibrate
Due to drift, all instruments require calibrating at set intervals. How
often they are calibrated depends on a number of factors. First, the
manufacturer of the instrument will provide a recommended calibration
interval. This interval may be decreased if the instrument is being used
in a critical process or application. Quality standards may also dictate
how often a pressure or temperature sensor needs calibrating.
The most effective method of determining when an instrument
requires calibrating is to use some sort of history trend analysis. The
optimal calibration interval for different instruments can only be
determined with software-based history trend analysis. In this way,
highly stable sensors are not calibrated as often as those sensors that
are more susceptible to drift.
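One simple form such an analysis can take is to fit a line through the observed drift history and extrapolate it to the error limit. This is only a sketch of the idea, not the algorithm of any particular software, and the drift values below are invented.

```python
def months_until_limit(history, error_limit):
    """history: (months since first calibration, observed drift) pairs.
    Fits a least-squares line through the drift history and extrapolates
    to the point where drift would reach the error limit."""
    n = len(history)
    mx = sum(t for t, _ in history) / n
    my = sum(d for _, d in history) / n
    slope = (sum((t - mx) * (d - my) for t, d in history)
             / sum((t - mx) ** 2 for t, _ in history))
    intercept = my - slope * mx
    return (error_limit - intercept) / slope

# Drift observed at 0, 6 and 12 months; error limit 0.5 units
m = months_until_limit([(0, 0.05), (6, 0.15), (12, 0.25)], 0.5)  # 27 months
```

A stable sensor (shallow slope) yields a long projected interval; a drifting one yields a short interval, which matches the intuition described above.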
Calibration software is one such tool that can be used to support and guide calibration management activities, with documentation being a critical part of this.
Regardless of industry sector, there seem to be some general challenges that companies face when it comes to calibration management.
Calibration software
With specialist calibration management software, users are provided
with an easy-to-use Windows Explorer-like interface. The software
manages and stores all instrument and calibration data. This
includes the planning and scheduling of calibration work; analysis
and optimisation of calibration frequency; production of reports,
certificates and labels; communication with smart calibrators; and
easy integration with CMM systems such as SAP and Maximo. The
result is a streamlined, automated calibration process, which improves
quality, plant productivity and efficiency.
Benefits of using calibration software
With software-based calibration management, planning and decision-making are improved. Procedures and calibration strategies can be
planned and all calibration assets managed by the software. Position,
device and calibrator databases are maintained, while automatic alerts
for scheduled calibrations can be set up.
Organisation also improves. The system no longer requires pens and
paper. Calibration instructions are created using the software to guide
engineers through the calibration process. These instructions can also
be downloaded to a technician's handheld documenting calibrator
while they are in the field.
Execution is more efficient and errors are eliminated. Using
software-based calibration management systems in conjunction with
documenting calibrators means that calibration results can be stored
in the calibrators memory, then automatically uploaded back to the
calibration software. There is no re-keying of calibration results from
a notebook to a database or spreadsheet. Human error is minimised
and engineers are freed up to perform more strategic analysis or other
important activities.
Documentation is also improved. The software generates reports
automatically and all calibration data is stored in one database rather
than multiple disparate systems. Calibration certificates, reports and
labels can all be printed out on paper or sent in electronic format.
Analysis becomes easier too, enabling engineers to optimise calibration
intervals using the software's History Trend function.
Also, when a plant is being audited, calibration software can
facilitate both the preparation and the audit itself. Locating records
and verifying that the system works is effortless when compared to
traditional calibration record keeping.
Regulatory organisations and standards such as FDA and ISO
place demanding requirements on the recording of calibration data.
Calibration software has many functions that help in meeting these
requirements, such as Change Management, Audit Trail and Electronic
Signature functions. The Change Management feature in Beamex's
CMX software, for example, complies with FDA requirements.
Business benefits
For the business, implementing software-based calibration management
means overall costs will be reduced. These savings come from the
now-paperless calibration process, with no manual documentation
procedures. Engineers can analyse calibration results to see whether
the calibration intervals on plant instruments can be altered. For
example, those instruments that perform better than expected may
well justify a reduction in their calibration frequency.
Plant efficiencies should also improve, as the entire calibration process
is now streamlined and automated. Manual procedures are replaced
with automated, validated processes, which is particularly beneficial if
the company is replacing a lot of labour-intensive calibration activities.
Costly production downtime will also be reduced.
Even if a plant has already implemented a CMM system, calibration
management software can be easily integrated to this system. If the
plant instruments are already defined on a database, the calibration
management software can utilise the records available in the CMM
system database.
The integration will save time, reduce costs and increase productivity
by preventing unnecessary duplication of effort and the re-keying of work
orders in multiple systems. Integration also enables the plant to automate its
calibration management with smart calibrators, which simply is not
possible with a standalone CMM system.
Benefits for all process plants
Beamex's suite of calibration management software can benefit all
sizes of process plant. For relatively small plants, where calibration
data is needed for only one location, only a few instruments require
calibrating and where regulatory compliance is minimal, Beamex
CMX Light is the most appropriate software.
For medium-to-large sized companies that have multiple users who
have to deal with a large number of instruments and calibration work, as
well as strict regulatory compliance, Beamex CMX Professional is ideal.
Beamex's high-end solution, CMX Enterprise, is suitable for process
manufacturers with multiple global sites, multilingual users and a very
large number of instruments that require calibration. Here, a central
calibration management database is often implemented that is used
by multiple plants across the world.
CHECKLIST
Choosing the right calibration software
Is it easy to use?
What are the specific requirements in terms of functionality?
Are there any IT requirements or restrictions for choosing the software?
Does the calibration software need to be integrated with the plant's existing systems?
Is communication with smart calibrators a requirement?
Does the supplier offer training, implementation, support and upgrades?
Does the calibration software need to be scalable?
Can data be imported to the software from the plant's current systems?
Does the software offer regulatory compliance?
What are the supplier's references and experience as a software developer?
SUMMARY
Calibration software improves calibration management tasks in all these areas: planning & decision-making, organisation, execution, documentation and analysis.
Beamex users
Beamex recently conducted a survey of its customers across all
industry sectors. The results showed that 82% of CMX Calibration
software customers said that using Beamex products had resulted in
cost savings in some part of their operations.
94% of CMX users stated that using Beamex products had improved
the efficiency of their calibration processes, whilst 92% said that using
CMX had improved the quality of their calibration system.
Summary
Every type of process plant, regardless of industry sector, can benefit
from implementing specialist calibration management software.
Compared to traditional, paper-based systems, in-house built legacy
calibration systems or calibration modules with CMM systems, using
dedicated calibration management software results in improved
quality, increased productivity and reduced costs of the entire
calibration process.
Despite these benefits, only one quarter of companies who need
to manage instrument calibrations actually use software designed for
that purpose.
Uncertainty need
One of the first things to evaluate is the customer's uncertainty need
for their particular measurement device. In fact, the initial selection
of the measurement device should also be made on the basis of this
evaluation. Uncertainty need is one of the most important things to
consider when determining the calibration period.
Stability history
When the customer has evaluated his/her needs and purchased suitable
measuring equipment, (s)he should monitor its stability history.
The stability history is an important criterion when deciding upon any
changes to the calibration period. Comparing the stability history of
measuring equipment against the specified limits and uncertainty needs
provides a feasible tool for evaluating the calibration period. Naturally,
calibration management software with a history analysis option is a
great help in making this type of analysis.
The cost of recalibration vs. consequences
of an out-of-tolerance situation
Optimizing between recalibration costs and the consequences of an
out-of-tolerance situation is important. In critical applications, the costs
of an out-of-tolerance situation can be extremely high (e.g. pharmaceutical
applications), and therefore calibrating the equipment more often is
safer. However, in some non-critical applications, where the out-of-tolerance
consequences are not serious, calibration can be performed less
frequently. Evaluating the consequences of an out-of-tolerance situation
is therefore something to be considered, and the corrective actions for
such a case should also be written into an operating procedure.
Some measurements in a factory typically have more effect on
product quality than others; these more critical measurements should
be calibrated more often.
Initial calibration period
When you purchase calibration equipment with which you are not
familiar, you still need to decide on an initial calibration period.
In some cases, cross-checking with other similar measuring equipment
is also feasible for detecting the need for calibration.
With paperless systems, workflow improves dramatically. Paperless calibration systems improve plant efficiencies because the entire calibration process is streamlined and automated.
Suitable hardware
Rather than relying on engineers in the field to key calibration
results into suitably robust laptops or PDAs, it is better to capture
the data electronically using documenting calibrators that are
specifically designed for this task.
Validation, training & education
Paperless systems also need validating in the user's own environment.
Here, Beamex provides comprehensive validation, education and
training services for customers.
Education and training for users is critical, as this will help companies
to overcome the natural resistance to change amongst the workforce,
which may be used to dealing with traditional, paper-based systems.
Case study
Beamex is helping many organisations to implement paperless
calibration management systems, including pharmaceuticals,
chemicals, power & energy, and oil, gas & petrochemicals companies.
Amongst these customers is UK firm Croda Chemicals Europe.
Based in East Yorkshire near Goole, the Croda plant uses pressurised
vessels to purify lanolin for healthcare and beauty products. Each
vessel needs to be certified at least once every two years in order
to demonstrate that the vessel is safe and structurally sound. This
includes a functionality check on all of the pressure instrumentation,
as well as the sensors that monitor the incoming chemical additives
and the outgoing effluent.
Senior Instrument Technician David Wright recalls what it was like
to perform all of those calibration operations with paper and pencil
during the company's regularly scheduled maintenance shutdowns:
"It took us one week to perform the calibrations and a month to put
together the necessary paperwork."
Today, Croda uses the CMX calibration management software
system from Beamex, which coordinates data collection tasks and
archives the results. "It's faster, easier and more accurate than our old
paper-based procedures," says Wright. "It's saving us around 80
man-hours per maintenance period and should pay for itself in less than
three years."
Intelligent commissioning
Successful commissioning of process instrumentation must be considered within the context of the overall commissioning program.
Construction
Pre-commissioning
Mechanical completion
Commissioning
Trial operation
Initial start-up
Examine product specification
Examine production performance
Acceptance of plant
damage in transit or storage. There are also many other reasons why
instruments should be calibrated during the commissioning phase
before start-up.
Assuring transmitter quality
First of all, the fact that an instrument or transmitter is new does
not automatically mean that it is within required specifications.
Calibrating a new instrument before installing or using it is a quality
assurance task. You can check the overall quality of the instrument to
see if it is defective and to ensure it has the correct, specified settings.
Reconfiguring a transmitter
Reconfiguring a transmitter
By using a documenting calibrator, the calibration results are stored automatically in the calibrator's memory during the calibration process.
SUMMARY
Calibration is beneficial during process plant commissioning
for various reasons:
Transmitter quality assurance
Reconfiguring a transmitter
Monitoring the quality and stability of a transmitter
Entering the necessary transmitter data into a calibration
database and defining the optimal calibration interval
Seamless communication
Beamex CMX Professional or Beamex CMX Enterprise software can easily be integrated with CMM systems, whether Maximo, SAP, Datastream or even a company's own in-house software for maintenance management.
1. Scope of Work
2. Development and Implementation
3. Testing
4. Installation, Verification and Training
The four main phases are also often divided into sub-phases. A
schedule is usually defined for the completion of the entire project as
well as for the completion of each project phase. Each project phase
should be approved according to the acceptance procedures defined
in the offer, agreement, project plan or other document annexed to
the offer/agreement.
Scope of work
To ensure successful integration with a satisfied customer, defining
the correct scope of work (SOW) is crucial. The scope of work should
include a brief project description, the services provided, the main roles,
partner responsibilities and the desired outcome. The scope of work
is important to make sure that both the supplier and the customer
have understood the project in question and have similar expectations
of it. The SOW is often developed through pre-studies and workshops.
Defining what is not included in the scope of work is just as important
as defining what is included in it. This means that establishing a
framework and limitations for the project is also very important,
as the resourcing, scheduling and costs of the project depend greatly
on the scope of work. If the scope of work is not defined carefully,
questions or problems may appear later in the project, directing the
project back to phase one for a review of the scope. This is an urgent
but time-consuming matter and can be avoided if the right people and
decision-makers participate in the first project phase. However, as
changes to the original scope of work may be necessary even in projects
where the SOW phase has been done carefully, it is important that the
supplier and customer agree on change management procedures as
early as the starting phase of the project.
[Figure: integration project flow. The scope of work (SOW) phase covers purpose/needs, target, supplier's responsibilities, customer's responsibilities, project management and project steering group, and change management, and produces the specifications documentation. Development and implementation produce the implementation documentation; testing produces the testing documentation; and installation, verification and training produce the instructional documentation. The project ends with follow-up and closure of the integration project.]
Integrating a CMM system with calibration management software is an important step in the right direction when it comes to EAM, Enterprise Asset Management.
Calibration in Industrial Applications
Calibration software ensures that calibration procedures are carried out at the correct time and that calibration tasks are not forgotten, overlooked or overdue.
* Reported to the Industrial Instrumentation and Controls Technology Alliance and presented at the TAMU ISA Symposium, January 2004.
SUMMARY
The benefits of using a documenting calibrator
We must remember that the quality of the evaluation of measuring tolerance depends on the information collected through calibration.
SUMMARY
Calibration (or verification) is a fundamental tool for
maintaining a measuring system. It also assists the user in
obtaining the required quality of measurements in a process.
The following must be taken into consideration:
the type of procedure to be applied in confirming measuring
tolerance
the interpretation of the information provided in the calibration
certificate
changing procedures based on received information
Quality calibration methods and data handling systems offer
state-of-the-art possibilities to any company.
Typical calibration procedures include the eccentricity test. The object
being weighed is normally placed in the middle of the load receptor as
accurately as possible; this is sometimes difficult due to the shape or
construction of the object. By weighing the same weight at the corners
of the load receptor, you can determine how much the eccentricity of
the load affects the indication on the scale.
Test for errors in indication
The weighing test examines the error of the indication on the scale for
several predefined loads. This enables you to correct the errors and to
determine the non-linearity and hysteresis.
If the scales maximum load limit is extremely large, it may be
impractical to use standard weights for calibrating the entire range.
In such a case, suitable substitution mass is used instead. Substitution
mass should also be used if the construction of the scale does not allow
the use of standard weights.
A truck scale is unsuitable for weighing letters
The purpose of the minimum weight test is to determine the minimum
weight that can be assuredly and accurately measured using the
scale in question. This condition is met if the measurement error is
less than 0.1% of the weight, with a probability of 99.73%.
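Reading that criterion as a 3-sigma condition, the minimum weight follows directly from the standard uncertainty of the indication near zero; a sketch with an invented uncertainty value:

```python
def minimum_weight(u_std_g):
    """Smallest load for which the 99.73 % (3-sigma) uncertainty stays
    below 0.1 % of the load: 3 * u <= 0.001 * m  =>  m >= 3 * u / 0.001."""
    return 3.0 * u_std_g / 0.001

# With a 0.02 g standard uncertainty, loads below about 60 g fail the test
m_min = minimum_weight(0.02)
```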
Combined standard uncertainty of the error U(E)
Knowing the error of the scale indication at each calibration point
is not sufficient. You must also know how certain you can be about
the error found at each point of calibration. There are several
sources of uncertainty of the error, e.g.:
The masses of the weights are only known with a certain uncertainty.
Air convection causes extra force on the load receptor.
Air buoyancy around the weights varies according to barometric
pressure, air temperature and humidity.
A substitute load is used in calibrating the scale.
Digital scale indications are rounded to the resolution in use.
Analogue scales have limited readability.
There are random variations in the indications as can be seen in the
Repeatability Test.
The weights are not in the exact middle of the load receptor.
The values of uncertainty determined at each point of calibration are
expressed as standard uncertainties (coverage probability: 68.27%),
which correspond to one standard deviation of a normally distributed
variable. The combined standard uncertainty of the error at a certain
point of calibration has a coverage probability of 68.27% as well.
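The combination itself is a root-sum-of-squares of the individual standard uncertainties; the component names and gram values below are invented for illustration.

```python
# Standard uncertainty components at one calibration point (grams, invented)
components = {
    "reference weights": 0.9,
    "air buoyancy": 0.4,
    "rounding / resolution": 0.3,
    "repeatability": 1.2,
    "eccentricity": 0.5,
}

u_E = sum(u ** 2 for u in components.values()) ** 0.5  # combined, 68.27 %
U_95 = 2 * u_E    # expanded uncertainty, ~95.45 % coverage
U_997 = 3 * u_E   # expanded uncertainty, ~99.73 % coverage
```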
[Figure: normal distribution of the error at one calibration point. The combined standard uncertainty u(E) corresponds to a coverage probability of 68.27%, U(E) = 2u(E) to 95.45% and U(E) = 3u(E) to 99.73%.]
While standards determine the accuracy to which manufacturers must comply, they nevertheless do not determine the permanency of that accuracy.
Temperature sensors
The devices most commonly used in industry for measuring
temperature are temperature sensors. They either convert temperature
into resistance (Resistance Temperature Detectors, RTDs) or convert
temperature into a low voltage (thermocouples, T/Cs). RTDs are based
on the predictable change in the electrical resistance of a metal,
typically platinum, with temperature.
Temperature transmitters
The low-level signal from a temperature sensor is not well suited
to transmission over the long distances found in a plant. Temperature
transmitters were therefore developed to convert the sensor signal into
a format that can be transmitted more easily. Most commonly, the
transmitter converts the signal from the temperature sensor into a
standard 4 to 20 mA signal. Nowadays, transmitters with a digital
output signal, such as fieldbus transmitters, are also being adopted.
Because the transmitter converts the sensor signal, it also affects the
total accuracy of the measurement, and therefore it must be calibrated
on a regular basis. A temperature transmitter can be calibrated using
a temperature calibrator.
Calibrating temperature instruments
To calibrate a temperature sensor, it must be placed in a known
temperature. Sensors are calibrated using either temperature dry
blocks (in the industrial field) or liquid baths (in the laboratory). The
calibration is a comparison between the sensor to be calibrated and a
reference sensor, so the essential question is how accurately the two
sensors can be brought to, and measured at, the same temperature.
The heat source may also have an internal temperature measurement
that can be used as the reference, but to achieve better accuracy and
reliability, an external reference temperature sensor is recommended.
The uncertainty of calibration is not the same as the accuracy
of the device. Many factors influence the total uncertainty, and the
way the calibration is performed is not the least of them. All heat
sources also introduce uncertainty components of their own.
Measurement uncertainty
Axial homogeneity
Axial homogeneity is the temperature distribution in the
measurement zone along the boring (axial temperature
distribution).
Radial homogeneity
Radial homogeneity can be explained as the difference in
temperature occurring between the borings.
The uncertainty of
calibration is not the same
as the accuracy of the
device.
Loading effect
When several sensors are placed in the borings of the heat
source, they will affect the accuracy. This phenomenon is called
the loading effect.
Stability
Stability means variation of the temperature in the measurement
zone over time when the system has reached equilibrium. Thirty
minutes is commonly used.
Immersion depth
To achieve a more stable calibration, the immersion depth for a
probe should be sufficient for the sensor being calibrated. Stem
conduction, heat flux along the length of the thermometer stem,
affects both the reference sensor and the unit being tested.
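As an example of turning one of these components into a number, the stability contribution can be evaluated from temperatures logged over the observation period. Treating the observed half-span as the limits of a rectangular distribution is a common evaluation choice, though not the only one; the readings below are invented.

```python
import math

def stability_std_uncertainty(readings_c):
    """Temperature readings logged (e.g. over 30 minutes) after the heat
    source has settled. The half-span of the readings is treated as the
    limit of a rectangular distribution, giving half_span / sqrt(3)."""
    half_span = (max(readings_c) - min(readings_c)) / 2.0
    return half_span / math.sqrt(3.0)

u_stab = stability_std_uncertainty([120.004, 120.010, 120.006, 120.002, 120.008])
# 0.008 degC span -> roughly 0.0023 degC standard uncertainty
```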
EURAMET
The EURAMET guideline (EURAMET /cg-13/v.01, July 2007
[previously EA-10/13]):
The Euramet calibration guide defines a normative way to calibrate
[Figure: cross-section of a dry block, showing the sensor to be calibrated, the reference sensor, the internal sensor, stem conductance, and axial and radial uniformity.]
Uncertainty components related to temperature calibration are relevant to dry blocks from all manufacturers.
Axial uniformity
Axial uniformity refers to the variation in temperature along
the vertical length of the insert. The EURAMET calibration guide
states that dry wells should have a zone of sufficient temperature
homogeneity, at least 40 mm in length, at the bottom of the
insert. The purpose of this homogeneous measurement zone is to
cover various sensor constructions. A thermocouple typically
has its hot junction close to the tip of the probe, whereas a PRT
sensing element may be 30 to 50 mm long. With this in mind, a
homogeneous zone of at least 60 mm is recommended.
Radial uniformity
Radial uniformity refers to the variation in temperature between
the holes of the insert. Related uncertainty is caused, for example,
by the placement of the heaters, thermal properties of materials
and alignment of the insert holes. Non-symmetrical loading or
probes with significantly different thermal conductivity (for
example large diameter probes) may cause additional temperature
variation.
Loading effect
Every probe in the insert conducts heat either from or into the
insert. The more the load, the more the ambient temperature
will affect the measurements. Sufficient immersion depth and
dual zone control help to reduce load-related uncertainties. The
loading effect is not visible in the control sensor indication and the
controller cannot completely compensate for this shift.
Using an external
reference sensor
enables more accurate
measurement of the
temperature of the probes
to be calibrated.
Radial uniformity
Radial uniformity is still present when using an external reference
probe and should be taken into account as specified.
Loading effect
Since the internal sensor cannot completely compensate for the
load-related temperature shift inside the insert, the external
reference sensor is placed within the same calibration volume as
the sensors to be calibrated, where it experiences the same
conditions. The loading effect is therefore usually much less
significant with an external reference sensor.
CALCULATION EXAMPLES
The standard uncertainties below are obtained from the specifications
by assuming a rectangular distribution (dividing by the square root
of 3); the combined uncertainty is the root sum of squares of the
components, and the expanded uncertainty applies a coverage factor
k = 2.

Example 1: dry block used with the internal reference sensor.

  Uncertainty component     Specification (C)   Standard uncertainty (C)
  Display accuracy          0.10                0.058
  Hysteresis                0.025               0.014
  Axial uniformity          0.02                0.012
  Radial uniformity         0.01                0.006
  Stability                 0.005               0.003
  Loading effect            0.05                0.029

  Combined uncertainty: 0.067
  Expanded uncertainty: 0.135

Example 2: dry block used with an external reference sensor
(dry block components only).

  Uncertainty component     Specification (C)   Standard uncertainty (C)
  Axial uniformity          0.02                0.012
  Radial uniformity         0.01                0.006
  Stability                 0.005               0.003
  Loading effect            0.005               0.003
  (unnamed component)       0.006               0.003

  Combined uncertainty: 0.014
  Expanded uncertainty: 0.028

Example 3: the external reference sensor itself.

  Uncertainty component     Specification (C)   Standard uncertainty (C)
  Short-term repeatability  0.007               0.004
  Drift                     0.007               0.004
  Hysteresis                0.01                0.006
  Calibration uncertainty   0.01                0.006

  Combined uncertainty: 0.010
  Expanded uncertainty: 0.020

Total for calibration with the external reference sensor
(examples 2 and 3 combined):

  Combined uncertainty: 0.017
  Expanded uncertainty: 0.034
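The combined and expanded figures above follow directly from the root-sum-of-squares rule. A minimal sketch in Python, using the internal-reference figures from the first example (the small difference from the tabulated 0.067 C comes from rounding of the component values):

```python
import math

# Standard uncertainties (degrees C) from the internal-reference example above.
components = {
    "display accuracy":  0.058,
    "hysteresis":        0.014,
    "axial uniformity":  0.012,
    "radial uniformity": 0.006,
    "stability":         0.003,
    "loading effect":    0.029,
}

# Combined uncertainty: root sum of squares of the component standard uncertainties.
combined = math.sqrt(sum(u ** 2 for u in components.values()))
# Expanded uncertainty: coverage factor k = 2 (roughly 95 % confidence).
expanded = 2 * combined

print(f"combined uncertainty: {combined:.3f} C")  # about 0.068 C
print(f"expanded uncertainty: {expanded:.3f} C")  # about 0.135 C
```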
Fieldbus transmitters
must also be calibrated
Fieldbus transmitters
must be calibrated as
well, but how can it be
done?
History of fieldbus
Back in the 1940s, instrumentation relied mainly on pneumatic signals
to transfer information from transmitters. During the 1960s, the mA
signal was introduced, making things much easier. In the 1970s,
computerized control systems began to make their arrival. The first
digital, smart transmitter was introduced in the 1980s, at first using
proprietary protocols. The first fieldbus was introduced in 1988, and
throughout the 1990s a number of fieldbuses were developed, as
manufacturers battled to see whose fieldbus would become the most
commonly used. A standard was finally set in the year 2000, when the
IEC 61158 standard was approved.
Although fieldbus hardware may cost the same as conventional
hardware, or even a little more, the total installation costs for a
fieldbus plant are far lower than for a conventional one.
A modern smart
transmitter typically
outperforms an older type
of conventional transmitter
regarding measurement
accuracy and stability.
no longer simply measure the analog output signal; they need to
be able to communicate with the transmitter and read the digital
signal. That brings a whole new challenge: how can the digital
output be read?
The opposite of a smart transmitter, i.e. a non-smart transmitter,
would be one with a purely analog (or even pneumatic) output
signal.
Smart transmitter protocols
It is crucial to remember
that although a
communicator can be
used for configuration, it is
not a reference standard
and therefore cannot be
used for metrological
calibration.
The solution
Why calibrate?
A modern transmitter is advertised as being smart and extremely
accurate, and sometimes salespeople will tell you that they don't
need to be calibrated at all because they are so smart. So why
would you calibrate them? First of all, the output protocol of a
transmitter does not change the fundamental need for calibration.
There are numerous reasons to calibrate instruments both initially
and periodically. In short, the main reasons include the following:
Calibration in
hazardous environments
  Gas group      Representative gas
  Group IIC      Acetylene
  Group IIB+H2   Hydrogen
  Group IIB      Ethylene
  Group IIA      Propane
Hazardous locations
can exist in virtually all
industries, stores,
and in the home.
Hazardous area classifications in IEC/European countries are:
Zone 0: an explosive gas and air mixture is continuously present or present for a long time.
Zone 1: an explosive gas and air mixture is likely to occur in normal operation.
Zone 2: an explosive gas and air mixture is not likely to occur in normal operation, and if it occurs it will exist only for a short time.
The differences in design and technical features were made with one
purpose in mind: to ensure that the device is safe to use and unable
to cause an ignition. The surface of the device is made of conductive
material. The battery of an intrinsically safe calibrator is usually
slower to charge and discharges more quickly. Intrinsically safe
equipment often operates only on dry batteries, but the Beamex
intrinsically safe calibrators operate with rechargeable batteries;
the battery must be charged in a non-Ex area. External pressure
modules can be used with IS calibrators, but they must also be
intrinsically safe. There are also usually small differences in
electrical ranges compared to regular industrial calibrators (e.g.
the maximum is lower).
Typical technical differences that make a calibrator safe and unable
to cause ignition:
Surface made of conductive material
Constraints in using the device (listed in the Product Safety Note)
Small differences in electrical ranges (e.g. maximum is lower)
Battery slower to charge, quicker to discharge
Battery must be charged in a non-Ex area
External pressure modules must be IS versions
There are certain aspects that need special attention when doing
service or repair on an intrinsically safe calibrator. The most important
thing to remember is that an intrinsically safe calibrator must maintain
its intrinsic safety after the service or repair. The best way to do this
is to send it to the manufacturer or to an authorized service company
for repair. Recalibration can be done by calibration laboratories (still
preferably with ISO/IEC 17025 accreditation).
Safe fieldbus calibration with the Beamex MC5-IS Intrinsically
Safe Multifunction Calibrator
The Beamex MC5-IS Intrinsically Safe Multifunction Calibrator
is a high accuracy, all-in-one calibrator for extreme environments.
Being an all-in-one calibrator, the MC5-IS replaces many individual
measurement devices and calibrators. The MC5-IS is also ATEX
and IECEx certified. The MC5-IS has calibration capabilities
for pressure, temperature, electrical and frequency signals. It is
a documenting calibrator, which means that it communicates
seamlessly with calibration software. Using documenting calibrators
with calibration software can remarkably improve the efficiency and
quality of the entire calibration process. The MC5-IS also supports
HART communication, and it can be used for calibrating Foundation
Fieldbus H1 or Profibus PA transmitters.
Calibration terminology
A to Z
1
______________
1. Bucher, Jay L. 2004. The Metrology Handbook. Milwaukee: ASQ Quality Press.
Terms that are not in this glossary may be found in one of these
primary references:
1. ISO. 1993. International vocabulary of basic and general terms in
metrology (called the VIM); BIPM, IEC, IFCC, ISO, IUPAC,
IUPAP, and OIML. Geneva: ISO.
2. ANSI/NCSL. 1997. ANSI/NCSL Z540-2-1997, U.S. Guide to the
expression of uncertainty in measurement (called the GUM). Boulder,
CO: NCSL International.
3. NCSL. 1999. NCSL Glossary of metrology-related terms. 2nd ed.
Boulder, CO: NCSL International.
Some terms may be listed in this glossary in order to expand on
the definition, but should be considered an addition to the references
listed above, not a replacement of them. (It is assumed that a calibration
or metrology activity owns copies of these as part of its basic reference
material.)
Glossary
Accreditation (of a laboratory) Formal recognition by an accreditation
body that a calibration or testing laboratory is able to competently
perform the calibrations or tests listed in the accreditation
scope document. Accreditation includes evaluation of both the
quality management system and the competence to perform the
measurements listed in the scope.
Accreditation body An organization that conducts laboratory
accreditation evaluations in conformance to ISO Guide 58.
Accreditation certificate Document issued by an accreditation
body to a laboratory that has met the conditions and criteria for
accreditation. The certificate, with the documented measurement
parameters and their best uncertainties, serves as proof of accredited
status for the time period listed. An accreditation certificate
without the documented parameters is incomplete.
Accreditation criteria Set of requirements used by an accrediting
body that a laboratory must meet in order to be accredited.
Accuracy (of a measurement) Accuracy is a qualitative indication of
how closely the result of a measurement agrees with the true value
of the parameter being measured. (VIM, 3.5) Because the true
value is always unknown, accuracy of a measurement is always
For a mean, the confidence interval is

CI = x̄ ± t · s / √n

and for a proportion

CI = p ± t · √( p(1 − p) / n )

where
CI is the confidence interval,
n is the number of items in the sample,
p is the proportion of items of a given type in the population,
s is the sample standard deviation,
x̄ is the sample mean, and
t is the Student's t value for α/2 and (n − 1) (α is the level of
significance).
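As a sketch of the mean case, the interval x̄ ± t·s/√n can be computed directly; the sample values below are hypothetical, and the t value is taken from a standard table because the Python standard library has no Student's t quantile function:

```python
import math
import statistics

# Hypothetical sample of repeat measurements (degrees C).
sample = [99.98, 100.02, 100.01, 99.99, 100.03, 100.00, 99.97, 100.04]

n = len(sample)
x_bar = statistics.mean(sample)   # sample mean
s = statistics.stdev(sample)      # sample standard deviation
# Student's t for alpha = 0.05 and n - 1 = 7 degrees of freedom, from a t table.
t = 2.365

half_width = t * s / math.sqrt(n)
print(f"CI: {x_bar:.3f} +/- {half_width:.3f} C")
```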
the calibration test indicates success of the repair. Minor repairs are
defined as repairs that take no longer than a short time as defined
by laboratory management, and where no parts have to be ordered
from external suppliers, and where substantial disassembly of the
instrument is not required. Contrast with: calibration (1), repair
Reported value One or more numerical results of a calibration
process, with the associated measurement uncertainty, as recorded
on a calibration report or certificate. The specific type and format
vary according to the type of measurement being made. In general,
most reported values will be in one of these formats:
Measurement result and uncertainty. The reported value is
usually the mean of a number of repeat measurements. The
uncertainty is usually expanded uncertainty as defined in the
GUM.
Deviation from the nominal (or reference) value and
uncertainty. The reported value is the difference between
the nominal value and the mean of a number of repeat
measurements. The uncertainty of the deviation is usually
expanded uncertainty as defined in the GUM.
Estimated systematic error and uncertainty. The value may be
reported this way when it is known that the instrument is part
of a measuring system and the systematic error will be used
to calculate a correction that will apply to the measurement
system results.
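The "deviation from nominal" format can be illustrated with a small sketch; the nominal value, repeat readings, and expanded uncertainty below are all hypothetical example figures:

```python
import statistics

# Hypothetical example: a nominal 100-ohm resistor measured five times.
nominal = 100.0
repeats = [100.012, 100.011, 100.013, 100.012, 100.010]

mean = statistics.mean(repeats)
deviation = mean - nominal   # deviation of the mean from the nominal value
expanded_u = 0.004           # expanded uncertainty (k = 2), assumed for the example

print(f"deviation from nominal: {deviation:+.4f} ohm +/- {expanded_u} ohm (k = 2)")
```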
Round robin See: Interlaboratory Comparison
Scope of accreditation For an accredited calibration or testing
laboratory, the scope is a documented list of calibration or testing
fields, parameters, specific measurements, or calibrations and
their best measurement uncertainty. The scope document is an
attachment to the certificate of accreditation and the certificate is
incomplete without it. Only the calibration or testing areas that
the laboratory is accredited for are listed in the scope document,
and only the listed areas may be offered as accredited calibrations
or tests. The accreditation body usually defines the format and
other details.
Self-calibration Self-calibration is a process performed by a user for
the purpose of making an IM&TE instrument or system ready
for use. The process may be required at intervals such as every
power-on sequence; or once per shift, day, or week of continuous
operation; or if the ambient temperature changes by a specified
PORTABLE CALIBRATORS
WORKSTATIONS
CALIBRATION SOFTWARE
PROFESSIONAL SERVICES
About Beamex
One of the world's leading providers of calibration
solutions.
Develops and manufactures high-quality
calibration equipment, software, systems and
services for the calibration and maintenance of
process instruments.
Certified in accordance with the ISO 9001:2008
quality standard.
Comprehensive product range includes portable
calibrators, workstations, calibration software,
accessories, professional services and industry-specific solutions.
Products and services available in more than
60 countries. More than 10,000 companies
worldwide utilize Beamex's calibration solutions.
Customers from a wide range of industries, such
as automotive, aviation, contractor engineering,
education, food and beverage, manufacturing,
marine, metal and mining, nuclear, oil and gas,
petrochemical and chemical, pharmaceutical,
power and energy, and pulp and paper.
For customers with requirements for accuracy,
versatility, efficiency, ease-of-use and reliability.
Beamex's Accredited Calibration Laboratory
is accredited and approved by FINAS (Finnish
Accreditation Service). FINAS is a member of all
Multilateral Recognition Agreements / Mutual
Recognition Arrangements (MLA/MRA) signed by
European and other international organizations,
i.e. European co-operation for Accreditation
(EA), International Laboratory Accreditation
Cooperation (ILAC) and International
Accreditation Forum Inc. (IAF).