Measuring Instruments and Instrumentation


MEASURING INSTRUMENTS AND

INSTRUMENTATION

By:
AMELYA AYU SARI (1801004)
SHERENINA PUTRI Q.
EKKLESIA CAHYO WIBOWO
ALFATHAN NUR WAHID

1
CALIBRATION

2
CALIBRATION
Definition

Purpose

Function

Benefit

Basic Principles of Calibration

3
DEFINITION
Calibration is the process of checking and adjusting the accuracy of a
measuring instrument by comparing it with a standard / benchmark.

PURPOSE
 Achieve measurement traceability: measurement results can be
linked / traced to a higher-accuracy standard (national and/or
international primary standards) through an unbroken chain of
comparisons.
 Determine the deviation between the conventional true value and the
value indicated by a measuring instrument.
 Guarantee that measurement results conform to national and
international standards.

4
BENEFIT
 Keep measuring instruments and measuring materials in a condition
that meets their specifications.
 Support the quality systems applied in various industries to
laboratory and production equipment.
 Reveal the difference (deviation) between the true value and the
value indicated by the measuring instrument.
FUNCTION
Maintain quality control by ensuring the performance and accuracy of
the various instruments used, by determining the deviation between the
standard value and the value indicated by the measuring instrument.
Ensure the accuracy of the measuring instrument so that the instruments
used produce accurate measurements.
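The deviation described above can be sketched in a few lines. This is a hypothetical example, not from the slides: the 5.00 bar reference value and 5.08 bar indicated reading are invented for illustration.

```python
# Sketch: calibration deviation of a unit under test against a reference
# standard, and the correction to apply to future readings.

def calibration_deviation(indicated, standard):
    """Deviation = value indicated by the instrument - reference (standard) value."""
    return indicated - standard

# Hypothetical data: a pressure gauge checked against a 5.00 bar standard.
standard_value = 5.00    # bar, from the reference standard
indicated_value = 5.08   # bar, shown by the unit under test

deviation = calibration_deviation(indicated_value, standard_value)
correction = -deviation  # add this to future readings to compensate

print(f"deviation  = {deviation:+.2f} bar")
print(f"correction = {correction:+.2f} bar")
```

A positive deviation means the instrument reads high, so the correction applied to its readings is negative.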

5
Basic Principles of Calibration
 Unit Under Test
 Measuring standard
 Conditioned environment
 Operator

6
PRECISION
Definition

Example

Conclusion

7
DEFINITION
Precision is the ability of a measuring instrument to show the same value
when used repeatedly under the same measurement conditions and on the
same object.
Precision is a measure of how close a series of measurements are to each
other.

EXAMPLE
I have a scale with a resolution of 0.01 kg and a 25 kg sack of rice to
weigh. With the proviso that the scale is not moved and every weighing is
done in the same place and under the same conditions, I weigh the sack.
First reading: 25.05 kg. I take the sack off the scale, put it back on, and
obtain a second reading, for example 25.01 kg; next 25.02 kg; then
25.11 kg. Repeating the measurement up to 30 times yields 30 data
points.
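A minimal sketch of how such repeated readings are summarized: the list below uses the four readings quoted above plus two invented ones (a real run would use all 30); the standard deviation and range measure the spread, i.e. the precision.

```python
# Sketch: summarizing repeated weighings of the same object.
# A narrow spread of readings means good precision (repeatability).
import statistics

readings_kg = [25.05, 25.01, 25.02, 25.11, 25.04, 25.03]  # subset of the 30 readings

mean = statistics.mean(readings_kg)
spread = statistics.stdev(readings_kg)              # sample standard deviation
value_range = max(readings_kg) - min(readings_kg)   # max minus min reading

print(f"mean  = {mean:.3f} kg")
print(f"stdev = {spread:.3f} kg  (smaller = better precision)")
print(f"range = {value_range:.2f} kg")
```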

8
 The spread of the readings, i.e. the width of their normal
distribution, is how we recognize the precision of a measuring
instrument.
 The narrower that width, the better the instrument's precision: its
repeatability (measurement repeatability) is better.

Conclusion :
A good measuring instrument should have both good accuracy and good
precision. This means that the average of its indicated values is always
close to the standard value (which represents closeness to the actual
value), i.e. accurate, and the repeatability of its readings is good, i.e.
precise.

9
ACCURACY
Definition

Example

10
DEFINITION
Accuracy is defined as the difference between the indicated value and the
actual value. The actual value may be a known standard, and accuracy is
obtained by comparing the indicated value with it. If the difference is
small, accuracy is high, and vice versa. Accuracy depends on several
other parameters such as hysteresis, linearity, sensitivity, offset, drift and
so on. It is usually expressed as a percentage of span, a percentage of
reading, or an absolute value. Standard values are set by the government
so as to maintain the standard.

11
Example :
For example, if a pressure gauge with a range of 0–10 bar has a quoted
inaccuracy of ±1.0% f.s. (±1% of full-scale reading), then the maximum
error to be expected in any reading is 0.1 bar. This means that when the
instrument is reading 1.0 bar, the possible error is 10% of this value. For
this reason, it is an important system design rule that instruments are
chosen such that their range is appropriate to the spread of values being
measured, so that the best possible accuracy is maintained in instrument
readings. Thus, if we were measuring pressures with expected values
between 0 and 1 bar, we would not use an instrument with a range of
0–10 bar. The term measurement uncertainty is frequently used in place
of inaccuracy.

12
THANK YOU

13
