Addis Ababa University
Addis Ababa Institute of Technology
School of Mechanical & Industrial Engineering
             Metrology
                                   Introduction
•   Metrology: is the science of measurement, embracing both experimental and
    theoretical determinations at any level of uncertainty in any field of science and
    technology.
•   Metrology: is the science and process of ensuring that measurement meets specified
    degrees of both accuracy and precision.
•   A measuring instrument is a device used for determining physical quantities.
    Examples: pressure, length, volume, etc.
•   All measuring instruments are subject to varying degrees of instrument error and
    measurement uncertainty.
•   Importance of measurements:
     ü To get accurate and precise data
     ü For measuring the composition of materials
                                   Cont…
Activities of metrology:
Ø Definition of the internationally accepted units of measurement (SI)
Ø Realization of the units of measurements in practice
Ø Establishing traceability, linking measurements made in practice, to reference
   standards
Role of metrology:
v To develop new products and processes, companies need to measure the quantity,
   quality and performance of their products
v Precision engineering components used in aircraft and spacecraft have tight
   specifications
                                          Cont…
• Metrology instruments are used in various sections of a manufacturing
   organization. Examples: tool room, machine shop, foundry unit, standards
   room, press shop, electroplating room, etc.
Objectives of Metrology:
    Ø To determine the measuring instruments needed by the plant and ensure that they
       are well maintained through periodic calibration.
    Ø To find the process capabilities of newly developed processes
    Ø Standardization of the measuring methods used with reference to the prevailing standards.
    Ø Design of gauges and special inspection fixtures. Example: rotor blade dimension
       measurements
    Ø Application of statistical quality control techniques.
                                                     Cont…
Need for measurements:
Ø To monitor process performance and ensure that the number of rejects is as small as is
  economically practicable
Ø To ensure that raw materials, purchased parts and components conform to the
  purchaser's specifications
Ø To evaluate the possibility of reworking defective parts
Ø To eliminate sources of error and deficiencies in the process
Ø To establish limit gauging (inspection)
Ø To augment the reputation of the manufacturer
Classification of inspection:
      ü Manual (visual) inspection
      ü Automatic inspection (contact, non-contact)
Depending on the area of application inspection can be:
Ø Raw material inspection
Ø Process inspection (in-process)
Ø Tool and gauge inspection
Ø Batch inspection
Ø Final inspection
Ø CAD-to-part analysis (first-piece inspection)
                     Terminologies of metrology
1. Accuracy: the degree of closeness of a measured value to the actual (true) value.
2. Precision: the degree to which repeated measurements under unchanged conditions show the
   same results (a numerical sketch follows the figure below).
3. Sensitivity: the minimum change in the value of the quantity being measured to which an
   instrument can reliably respond.
4. Resolution (discrimination): the fineness to which an instrument can be read
5. Calibration: the quantitative determination of the errors of a measuring instrument and their
   adjustment to a minimum.
6. Interchangeability: components selected at random should assemble correctly with any mating
   components. This is possible when certain standards are strictly followed.
Advantages of interchangeability:
     ü Reduces assembly cost and permits quick, cheap and satisfactory repairs
Figure: targets illustrating precision without accuracy, accuracy without precision, and precision with accuracy
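To make the distinction concrete, here is a minimal Python sketch with made-up readings of a known 10.00 mm standard; the readings are tightly grouped (precise) but systematically offset from the true value (not accurate):

```python
# Illustrative data (not from the slides): five repeated readings of a
# known 10.00 mm standard.
true_value = 10.00
readings = [10.18, 10.21, 10.19, 10.20, 10.22]

mean = sum(readings) / len(readings)
# Sample standard deviation: a measure of precision (spread of readings).
spread = (sum((r - mean) ** 2 for r in readings) / (len(readings) - 1)) ** 0.5

print(f"Mean error (accuracy):     {mean - true_value:+.3f} mm")  # +0.200 -> not accurate
print(f"Std deviation (precision):  {spread:.3f} mm")             #  0.016 -> precise
```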
                  Terminologies of metrology
7. Distortion (Abbe's principle): the scale of a linear measuring system should be collinear with the
   displacement to be measured; the axis of the instrument should align with the axis of the workpiece.
8. Zero error: the instrument does not read zero when the input is zero. This can be corrected by a
   zero-offset adjustment.
                  Terminologies of metrology
9. Parallax error: occurs when the pointer on a scale is not observed along a line normal to the scale.
    It can be reduced by decreasing the distance between the pointer and the scale.
10. Range: indicates the maximum and minimum capacity of the measuring instrument.
11. Readability: the ease with which the reading of a measuring instrument can be read.
12. Nominal (true) value: the required value of a quantity.
    Error in the measurement: its deviation from the actual (nominal) value.
                   Cont…
•   Span: is the difference between the maximum and
    minimum values of the input.
Stability
• Stability is the ability of a measuring device to give
   the same output when used to measure a constant input
   over a period of time.
                      Cont…
Error (Bias): Error is the difference between the
 result of the measurement and the true value of
 the quantity being measured.
  error = measured value − true value
Example: if a sensor gives a displacement reading of
  29.8 mm when the actual displacement is 30 mm,
  the error is −0.2 mm.
                        Cont…
Sensitivity: is the property of the measuring
  instrument to respond to changes in the
  measured quantity. It can be expressed as the
  ratio of the change in output to the change in input.
ü A highly sensitive instrument will show larger
  fluctuations in output as a result of fluctuations in
  input, including noise.
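Example (illustrative values, not from the slides): if a displacement sensor's output changes by 5 mV when the input displacement changes by 0.1 mm, its sensitivity is
  S = change in output / change in input = 5 mV / 0.1 mm = 50 mV/mm.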
 Nonlinearity
• Nonlinearity indicates the maximum
  deviation of the actual measured curve of an
  instrument from the ideal curve.
• Linearity is often specified in terms of
  percentage of nonlinearity, which is defined as:
      Nonlinearity (%) = (maximum deviation of output from the ideal curve ÷ full-scale range) × 100
Nonlinearity (%) depends upon environmental factors, including temperature,
vibration, acoustic noise level, and humidity. Therefore, it is important to know
under what conditions the specification is valid.
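A minimal Python sketch of this definition, using made-up calibration data and assuming the ideal curve is the straight line where output equals input:

```python
# Illustrative data (not from the slides): percentage nonlinearity of a
# sensor, taken as the maximum deviation of the measured output from the
# ideal curve, divided by the full-scale range.
inputs  = [0.0, 5.0, 10.0, 15.0, 20.0]        # applied input, e.g. mm
outputs = [0.00, 5.08, 10.11, 14.95, 20.00]   # measured output, e.g. mm

full_scale = max(inputs) - min(inputs)
# Ideal curve assumed here: output equals input.
max_deviation = max(abs(o - i) for i, o in zip(inputs, outputs))

nonlinearity_pct = (max_deviation / full_scale) * 100
print(f"Nonlinearity: {nonlinearity_pct:.2f} % of full scale")
```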
 Hysteresis
• Hysteresis is an error of a measuring device, defined as
   the maximum difference in output at any measurement value
   within the specified range when the point is approached
   first with increasing and then with decreasing input.
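A minimal Python sketch with hypothetical up/down readings (illustrative values, not from the slides):

```python
# Hysteresis as the maximum difference between outputs at the same input,
# approached first with increasing and then with decreasing input.
inputs      = [0, 5, 10, 15, 20]
output_up   = [0.0, 4.9, 9.8, 14.9, 20.0]   # readings, input increasing
output_down = [0.3, 5.3, 10.2, 15.1, 20.0]  # readings, input decreasing

hysteresis = max(abs(d - u) for u, d in zip(output_up, output_down))
print(f"Hysteresis: {hysteresis:.2f} (same units as output)")
```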
                Selection of instruments
• Factors for proper selection of measuring instruments:
   Ø Range of the instrument
   Ø Resolution : smallest dimensional input that can be detected
   Ø Accuracy expected
   Ø Installation requirement: mounting requirements, vibration isolation,
      compressed air requirement, wireless system, distance between the
      place of measurement and control room, ambient conditions.
   Ø Cost
   Ø Parameters to be measured: length, diameter, surface finish
   Ø Number of quantities to be measured
   Ø Final data requirements: for immediate or later use, format
              Linear measuring instruments
1. Vernier calliper
•   The initial development of the Vernier calliper was due to the invention of a scale system for
    resolution amplification by Pierre Vernier in 1631. Two centuries later, Joseph Brown, an
    American engineer, developed the scale system into what we know as the Vernier calliper.
The characteristics of the Vernier calliper:
Ø It has two types of scale: a mm scale with 1 mm resolution and a sub-mm scale (the Vernier
  scale). The sub-mm scale resolution is commonly from 0.05 mm (50 μm) to 0.01 mm (10 μm).
  A reading sketch follows the figure below.
Ø It is an end standard (there are two opposite but parallel sides for measuring)
Ø It has several reading-scale types: with manual scale, digital and dial gauge.
§ Callipers with dial-gauge reading have the highest resolution compared to other types
Figure: Vernier calliper
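A minimal Python sketch of how the two scales combine into one reading, with hypothetical scale values (12 mm on the main scale, the 7th Vernier line coinciding, 0.05 mm resolution):

```python
# Hypothetical readings: combining main-scale and Vernier-scale readings.
main_scale_mm      = 12     # last main-scale graduation before the Vernier zero
vernier_division   = 7      # Vernier line that coincides with a main-scale line
vernier_resolution = 0.05   # mm per Vernier division (a common value)

reading = main_scale_mm + vernier_division * vernier_resolution
print(f"Reading: {reading:.2f} mm")   # 12.35 mm
```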
Calibration of Vernier caliper
•    Vernier calipers fall under the category of high-precision measuring instruments,
     meaning they provide very precise measurements, often accurate to a hundredth of a
     millimetre. This accuracy is achieved by the use of a Vernier scale attached to a main
     scale, which makes the Vernier caliper different from other calipers.
•    This highly sensitive nature also makes it necessary for the Vernier caliper to be
     regularly maintained and calibrated, as even the slightest distortion in its settings could
     affect the accuracy of the readings.
Standard Vernier caliper calibration procedure (a calculation sketch follows these steps):
1.    The caliper’s jaws, which are the parts responsible for measuring distances, should
      be cleaned to make sure they are free of any dirt or grease. The sliding mechanism
      should be moved back and forth to make sure that it moves without any hindrance.
2.    Zero reading check: bring the jaws into contact with each other and check the
      reading on the dial. It should be zero. If it is not, set it manually to zero.
3.    Insert a 0.500 inch (12.7 mm) standard gauge block between the jaws used to
      measure outer diameters. Both the jaws should be in contact with the block but do not press the
      jaws too tightly on the surface. Record the reading accurate to 3 decimal places. Take at least three
      readings to eliminate any inconsistency while measuring.
4.    Repeat the measurement with a 1 inch (25.4 mm) gauge block and afterwards with
      a 4 inch (101.6 mm) block. Note the readings.
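The gauge-block checks in steps 3 and 4 can be summarized in a short Python sketch; the readings and the 0.02 mm acceptance tolerance below are assumed for illustration, not taken from any standard:

```python
# Hypothetical readings: checking caliper readings against standard gauge
# blocks, as in steps 3-4 above.
gauge_blocks_mm = {
    12.7:  [12.700, 12.705, 12.695],    # three readings per block
    25.4:  [25.400, 25.410, 25.405],
    101.6: [101.600, 101.595, 101.605],
}
tolerance_mm = 0.02   # acceptance limit; an assumed value

for nominal, readings in gauge_blocks_mm.items():
    mean = sum(readings) / len(readings)
    error = mean - nominal                      # error = measured - true
    status = "PASS" if abs(error) <= tolerance_mm else "FAIL"
    print(f"{nominal} mm block: mean {mean:.3f} mm, error {error:+.3f} mm -> {status}")
```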
Figure: Micrometer
              Linear measuring instruments
2. Micrometer: was invented by William Gascoigne in the 17th century (in
   Yorkshire, England) and is a screw-based measuring instrument. The first
   commercialisation of the micrometre was by Brown & Sharpe in 1867.
•   The characteristics of the micrometre:
     Ø Commonly, it has a resolution of 2 μm to 10 μm, obtained from its screw
        mechanism.
     Ø It has a main (coarse) resolution of 1 mm
     Ø It is an end standard (there are two opposite but parallel sides for measuring)
     Ø Like Vernier calliper, it has several reading-scale types: with manual scale, digital
        and dial gauge.
§   Micrometres with dial-gauge reading have the highest resolution compared to other types
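A minimal Python sketch of a metric micrometre reading, assuming a common 0.5 mm spindle pitch and 50 thimble divisions (hypothetical readings):

```python
# With a 0.5 mm spindle pitch and 50 thimble divisions, one thimble
# division corresponds to 0.5 / 50 = 0.01 mm.
sleeve_mm         = 7.5    # last visible sleeve graduation (0.5 mm steps)
thimble_division  = 28     # thimble line aligned with the sleeve datum
pitch_mm          = 0.5
thimble_divisions = 50

reading = sleeve_mm + thimble_division * (pitch_mm / thimble_divisions)
print(f"Reading: {reading:.2f} mm")   # 7.78 mm
```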
                 Linear measuring instruments
•   A micrometre has higher accuracy than a Vernier calliper. Its high accuracy is not mainly due to a
    higher resolution than the calliper's.
•   The fundamental difference between the micrometre and the Vernier calliper lies in the basic
    configuration of their measurement and reading-scale axes.
•   The micrometre's measurement axis and reading-scale axis are coaxial, so there is no Abbe offset in
    its structure (figure 3). The Vernier calliper, by contrast, has measurement and reading-scale axes
    that are not coaxial but offset by a certain distance. This non-coaxial arrangement produces an Abbe
    offset, leading to Abbe error in the structure of the Vernier calliper (see the sketch after figure 3).
                Figure 3: The location of the measurement and reading scale axes for micrometre (left) and Vernier calliper (right)
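As a rough illustration of why the Abbe offset matters, the first-order Abbe error can be estimated as e ≈ h · tan θ, where h is the offset between the measurement and reading-scale axes and θ is the angular error of the moving jaw. A minimal Python sketch with illustrative values (the offset and angle below are assumptions, not from the slides):

```python
# First-order Abbe error: e = h * tan(theta), with h the Abbe offset and
# theta the angular error of the moving jaw.
import math

h_mm      = 30.0                    # assumed Abbe offset between axes, mm
theta_rad = math.radians(0.01)      # assumed 0.01 degree angular error

abbe_error_mm = h_mm * math.tan(theta_rad)
print(f"Abbe error: {abbe_error_mm * 1000:.2f} micrometres")   # ~5.24 um
```

Because the micrometre's axes are coaxial (h ≈ 0), this error term vanishes, which is why its accuracy advantage is structural rather than a matter of resolution.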
Thank You!