
Unit-1 INTRODUCTION TO PROCESS INSTRUMENTATION

Importance of instrumentation in a chemical plant

• Instrumentation plays a crucial role in chemical plants by ensuring safety, efficiency, and process
control. It involves the use of sensors, transmitters, controllers, and actuators to monitor and
regulate various parameters such as temperature, pressure, flow rate, and chemical
composition.
Key Importance of Instrumentation:

Process Control and Automation
• Ensures precise control of chemical reactions.
• Maintains optimal operating conditions.
• Reduces human intervention, minimizing errors.

Safety and Hazard Prevention
• Detects leaks, overpressure, and temperature deviations.
• Prevents hazardous situations like explosions and toxic releases.
• Enables emergency shutdown systems (ESD) for accident prevention.

Quality Assurance
• Maintains consistent product quality.
• Monitors critical parameters to meet industry standards.
• Reduces variations in product batches.

Energy and Cost Efficiency
• Optimizes resource usage (e.g., raw materials, energy, and water).
• Reduces waste and operational costs.
• Enhances overall plant efficiency.

Environmental Compliance
• Monitors emissions and effluents.
• Ensures adherence to environmental regulations.
• Helps in sustainable and eco-friendly production.

Data Collection and Analysis
• Provides real-time data for better decision-making.
• Enables predictive maintenance to avoid downtime.
• Helps in process optimization and troubleshooting.

Instrumentation is indispensable for modern chemical plants, ensuring smooth operations, safety, and regulatory compliance.
Classification of instruments

Based on Functionality
• a) Indicating Instruments: Display real-time values of the measured quantity.
• b) Recording Instruments: Record data over time for monitoring and analysis.
• c) Controlling Instruments: Measure and regulate a process variable.

Based on Method of Measurement
• a) Direct Measuring Instruments: Provide measurement without requiring additional calculations.
• b) Indirect Measuring Instruments: Measure one parameter to infer another.

Based on Operating Principle
• a) Mechanical Instruments: Use mechanical components like springs, levers, or gears.
• b) Electrical Instruments: Use electrical properties (voltage, current, resistance) for measurement.
• c) Electronic Instruments: Use semiconductor and digital circuits for precision measurement.
• d) Pneumatic and Hydraulic Instruments: Operate using compressed air or liquid pressure.

Based on Type of Output
• a) Analog Instruments: Display continuous values using a needle or dial.
• b) Digital Instruments: Provide numerical output on an LCD or LED screen.

Basic elements of instruments

Consider a filled-system (pressure-spring) thermometer as an example.
• The primary element is the part of the instrument that first utilizes energy from the measured medium to produce a condition representing the value of the measured variable. Here the thermometer bulb is the primary element, because it first converts energy in the form of heat into a fluid displacement, which is proportional to the temperature at the bulb.
• The secondary element merely converts the condition produced by the primary element into a condition useful to the function of the instrument. In the example the secondary element is the pressure spring, which converts the fluid displacement into a displacement of a link.
• The manipulation element performs given operations on the condition produced by the secondary element. The motion of the pressure spring is modified by the cam in order to correct for nonlinearity in the preceding conversion processes. The manipulation element sometimes precedes the secondary element.
• The functioning element denotes the part of an instrument used for transmitting, signaling, registering, indicating, or recording.
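The element chain above can be sketched as a composition of functions, one per element (an illustrative Python sketch; the conversion factors are hypothetical, not real thermometer constants):

```python
def primary(temperature_C):
    """Bulb: converts heat into a fluid displacement (hypothetical factor)."""
    return 0.1 * temperature_C

def secondary(displacement):
    """Pressure spring: converts fluid displacement into link motion."""
    return 2.0 * displacement

def manipulation(motion):
    """Cam: corrects residual non-linearity; identity here for simplicity."""
    return motion

def functioning(motion):
    """Indicator: renders the link motion as a displayed reading."""
    return f"{motion:.1f} units"

reading = functioning(manipulation(secondary(primary(25.0))))
print(reading)  # -> 5.0 units
```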
Static & Dynamic Characteristics:
STATIC CHARACTERISTICS OF INSTRUMENTS

Static characteristics describe the behavior of measuring instruments under steady-state or slow-changing conditions. These characteristics are crucial when inputs do not change rapidly with time.
1. Accuracy
Definition: The closeness of the measured value to the true (actual) value.
Error = |Measured Value − True Value|
• High accuracy = low error.
• A thermometer showing 100.1°C instead of 100.0°C is quite accurate.
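A minimal sketch of the error calculation, using the thermometer example above (numbers are illustrative):

```python
true_value = 100.0       # reference temperature, deg C
measured_value = 100.1   # thermometer reading, deg C

# Error = |Measured Value - True Value|
error = abs(measured_value - true_value)
print(f"Absolute error: {error:.1f} deg C")  # -> Absolute error: 0.1 deg C
```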
2. Precision
Definition: The degree to which repeated measurements give the same result under
unchanged conditions.
• It does not imply accuracy.
• Example: An instrument consistently showing 98.1°C for 100.0°C has high precision
but low accuracy.
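The distinction can be shown with a few hypothetical repeated readings: the small spread between readings means high precision, while the large offset from the true value means low accuracy:

```python
import statistics

true_value = 100.0
readings = [98.1, 98.1, 98.2, 98.1, 98.0]  # hypothetical repeated readings

spread = statistics.stdev(readings)                   # small: high precision
offset = abs(statistics.mean(readings) - true_value)  # large: low accuracy

print(f"spread = {spread:.3f} deg C, offset = {offset:.3f} deg C")
```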
3. Sensitivity
Definition: The ratio of change in output to change in input.
Mathematically:
Sensitivity = ΔOutput / ΔInput
• Example: If a pressure sensor gives 2 mV for each 1 Pa change, its sensitivity is 2
mV/Pa.
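A sketch of the calculation for the pressure-sensor example above:

```python
delta_output = 2.0  # output change, mV
delta_input = 1.0   # input change, Pa

# Sensitivity = change in output / change in input
sensitivity = delta_output / delta_input
print(f"Sensitivity: {sensitivity} mV/Pa")  # -> Sensitivity: 2.0 mV/Pa
```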
4. Linearity
Definition: The degree to which the output follows a straight-line relationship with input.
• Perfect linearity is rare. Most instruments approximate linear behavior within a
certain range.
• Deviation from linearity = non-linearity error.

5. Repeatability
Definition: The ability of an instrument to give the same reading for the same input under
the same conditions.
• Important in quality control.
• Slight differences in readings for the same input mean low repeatability.
6. Reproducibility
Definition: The ability of the instrument to produce the same output when used by different
people, in different environments, or at different times.
• Wider scope than repeatability.
7. Drift
Definition: Gradual change in output with time, even if input remains constant.
Types:
• Zero drift: Shift of the entire calibration curve.
• Span drift: Change in slope of the calibration line.
• Zonal drift: Drift over a particular input range.
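Zero and span drift can be modeled as a shift and a slope change applied to the ideal reading (an illustrative sketch; the drift values are hypothetical):

```python
def drifted_reading(ideal, zero_drift=0.0, span_drift=0.0):
    """Zero drift shifts the whole curve; span drift changes its slope."""
    return (1.0 + span_drift) * ideal + zero_drift

print(drifted_reading(50.0, zero_drift=0.5))   # -> 50.5 (entire curve shifted)
print(drifted_reading(50.0, span_drift=0.02))  # -> 51.0 (slope increased by 2%)
```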
8. Static Error in Instrument
Static error is the difference between the measured value (indicated by the instrument) and
the true value of the quantity being measured when the input remains constant (i.e., under
steady-state conditions).
Static Error = Measured Value − True Value
9. Resolution
Definition: The smallest change in input that produces a detectable change in output.
• Example: A digital scale with resolution 0.01 g can’t detect changes smaller than 0.01
g.
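Resolution acts like quantization: changes smaller than one display step are invisible (a sketch for the 0.01 g scale above):

```python
def quantize(mass_g, resolution_g=0.01):
    """Round a true mass to the nearest display step of the scale."""
    return round(mass_g / resolution_g) * resolution_g

print(f"{quantize(12.3456):.2f} g")  # -> 12.35 g
print(f"{quantize(12.3490):.2f} g")  # -> 12.35 g (the 0.0034 g change is invisible)
```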
10. Dead Zone
Definition: Range of input in which there is no output response.

DYNAMIC CHARACTERISTICS OF INSTRUMENTS

These describe how an instrument responds to a time-varying input, that is, when the input is changing rapidly.
1. Speed of Response
Definition: Time taken by an instrument to begin to respond to a change in input.
• Faster instruments are better for real-time monitoring.
2. Measuring Lag
Definition: Time delay between input signal and corresponding output.
Types:
• Retardation-type: Response starts after delay.
• Time-lag type: Output starts immediately but takes time to reach full value.
3. Fidelity
Definition: The ability of an instrument to reproduce input changes without distortion.
• Poor fidelity = waveform distortion.
4. Dynamic Error
Definition: The difference between the true value of the time-varying input and the value
indicated by the instrument.
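Many simple sensors (e.g., a bare thermocouple) behave approximately as first-order systems, which ties these ideas together: the time constant sets the speed of response, and the gap between the true and indicated values is the dynamic error (an illustrative sketch; the step input and time constant are hypothetical):

```python
import math

def first_order_output(t, step=1.0, tau=2.0):
    """Indicated value of a first-order instrument after a step input at t = 0."""
    return step * (1.0 - math.exp(-t / tau))

tau = 2.0  # time constant in seconds (hypothetical)
for t in (0.0, tau, 3 * tau, 5 * tau):
    y = first_order_output(t, tau=tau)
    print(f"t = {t:4.1f} s: output = {y:.3f}, dynamic error = {1.0 - y:.3f}")
# At t = tau the output reaches about 63% of the final value; by t = 5*tau the
# measuring lag and the dynamic error are negligible.
```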

Selection criteria for various measuring devices used in the chemical industry for
temperature, pressure, level, and flow measurement:

1. Temperature Measurement Devices

Common Devices:
• Thermocouples
• Resistance Temperature Detectors (RTDs)
• Bimetallic Thermometers
• Infrared Thermometers
• Thermistors
Selection Criteria:
• Temperature Range: Choose a sensor that can withstand and accurately measure the process range.
• Accuracy & Sensitivity: RTDs provide better accuracy than thermocouples; choose based on application needs.
• Response Time: Thermocouples respond faster than RTDs; important for dynamic processes.
• Chemical Compatibility: Sensor material must resist corrosion or degradation in the chemical environment.
• Installation Constraints: Size and placement must fit into the process line or vessel.
• Maintenance & Durability: Prefer robust designs for harsh environments; bimetallic thermometers need minimal maintenance.

2. Pressure Measurement Devices

Common Devices:
• Bourdon Tube Gauges
• Diaphragm Gauges
• Strain Gauge Pressure Transducers
• Piezoelectric Sensors
• Manometers

Selection Criteria:
• Pressure Range: Must cover the expected operating and maximum pressure levels.
• Type of Pressure: Gauge, absolute, or differential pressure; select as per process requirement.
• Accuracy Requirements: Strain gauge transducers are more precise than Bourdon tubes.
• Medium Characteristics: Corrosive, viscous, or particle-laden media may require diaphragm seals.
• Temperature Effects: Some sensors drift at high temperatures; compensation may be required.
• Response Time: Important for dynamic systems (e.g., pulsating flows or fast changes).
3. Level Measurement Devices
Common Devices:
• Sight Glasses
• Float-Based Level Sensors
• Ultrasonic Sensors
• Radar (Microwave) Level Transmitters
• Differential Pressure Transmitters
• Capacitance and Conductivity Probes
Selection Criteria:
• Tank Size & Geometry: Choose sensor based on height and shape of tank or vessel.
• Type of Material: Solids, liquids, or slurries; different devices are suited for each.
• Contact vs. Non-contact: Radar and ultrasonic sensors are good for hazardous or hygienic conditions.
• Foam or Vapors Present: Radar preferred over ultrasonic if foam or vapors are present.
• Conductivity of Medium: Conductivity probes work only with conductive liquids.
• Installation Flexibility: Consider ease of mounting and wiring in confined spaces.

4. Flow Measurement Devices

Common Devices:
• Orifice Plates
• Venturi Meters
• Rotameters
• Electromagnetic Flow Meters
• Turbine Flow Meters
• Ultrasonic Flow Meters
• Coriolis Mass Flow Meters
Selection Criteria:
• Flow Rate Range: Device must match expected minimum and maximum flow rates.
• Fluid Type: Gas, liquid, slurry, or steam; affects choice of flowmeter.
• Viscosity & Conductivity: Electromagnetic meters need conductive liquids; turbine meters suit low-viscosity fluids.
• Accuracy & Repeatability: Coriolis and magnetic meters are highly accurate.
• Installation Requirements: Straight pipe lengths, mounting orientation, and space availability.
• Pressure Drop: Devices like orifice plates cause high pressure drop; avoid if energy loss is a concern.
• Maintenance Needs: Choose low-maintenance meters for hard-to-access areas.

What is Calibration?
Calibration is the process of comparing the measurement provided by an instrument with a
known standard or reference to check and correct its accuracy.
Calibration Procedure – Necessary Steps
1. Examine the construction of an instrument and identify and list all the possible
inputs
2. Decide, as best as one can, which of the inputs will be significant in the application
for which the instrument is to be calibrated.
3. Procure apparatus that will allow all significant inputs to vary over the ranges
considered necessary.
4. By holding some inputs constant, varying others and recording the output, develop
the desired static input-output relations.
Standard Operating Procedure
1. Prepare the Instrument:
o Clean, power on, stabilize conditions.
o Ensure no physical damage or drift.
2. Select Calibration Standard:
o Must be traceable to a national/international standard.
o Should be more accurate than the instrument under test.
3. Apply Known Inputs:
o Apply input (e.g., pressure, temperature, voltage) from the standard.
4. Record Output:
o Note the output reading of the instrument.
5. Compare & Adjust:
o If the difference is outside acceptable limits, adjust the instrument.
6. Document Results:
o Record calibration data: input, output, deviation, corrections, technician’s
signature, date.
7. Label Instrument:
o Tag with calibration status: date of calibration, next due date, etc.
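The compare-and-adjust step (step 5) can be sketched as a simple pass/fail check of each deviation against a tolerance (illustrative data and tolerance; a real calibration follows the documented SOP and traceable standards):

```python
def calibration_report(points, tolerance):
    """points: (standard_input, instrument_output) pairs.
    Returns rows of (input, output, deviation, within_tolerance)."""
    return [(s, m, m - s, abs(m - s) <= tolerance) for s, m in points]

points = [(0.0, 0.02), (50.0, 50.10), (100.0, 100.40)]  # illustrative readings
for s, m, dev, ok in calibration_report(points, tolerance=0.2):
    status = "PASS" if ok else "ADJUST"
    print(f"input = {s:6.2f}, output = {m:6.2f}, deviation = {dev:+.2f}, {status}")
```

Each printed row corresponds to one documented calibration point; the ADJUST flag marks readings whose deviation exceeds the acceptable limit.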
