Match the Right Temperature Sensor Configuration to the Application

Image: industrial temperature sensor with welded pad for heat conduction. Using a temperature sensor properly configured for the application will result in enhanced process performance. Image courtesy Smart Sensors, Inc.
There are more temperature-controlled operations than any of us could count in a lifetime. Each one exhibits a unique set of performance requirements and design challenges. Matching the means of temperature measurement, the control loop characteristics, and the heat delivery method to the application is essential to achieving successful operation.

Step one is to measure the process temperature. This sounds simple until you start researching products and technologies for measuring temperature. Like the temperature-controlled operations mentioned previously, there are more than you can count in a lifetime. To filter the possible candidates for temperature sensing devices, consider these aspects of your application and how well a particular sensor may fulfill your requirements.

  • Response Time - How rapidly the sensor will detect a change in process temperature is a function of how the sensor is constructed and how it is installed. Most temperature sensors are enclosed or encapsulated to protect the somewhat vulnerable sensing element. Greater mass surrounding the sensing element will slow sensor response, and whether that slower response will adversely impact process operation needs to be considered. Further consideration should be given to the manner in which the temperature sensor assembly is installed. Not all applications involve a fluid in which the sensor assembly can be conveniently immersed, and even those applications benefit from careful sensor placement. A simple model illustrating response time follows this list.
  • Accuracy - Know what your process needs to be effective. Greater levels of accuracy will generally cost more and may require more care and attention to assure the accuracy is maintained. Accuracy is mostly related to the type of sensor, be it RTD, thermocouple, or another type.
  • Sensitivity - Related to the construction, installation, and type of sensor, think of sensitivity as the smallest step change in process temperature that the sensor will reliably report. The needs of the process should dictate the level of sensitivity specified for the temperature sensor assembly.
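To make the response time discussion concrete, here is a minimal sketch, using hypothetical time constants rather than vendor data, that models a sheathed sensor as a first-order lag and estimates how long it takes to register a step change in process temperature. More mass around the sensing element translates into a larger time constant and a slower reading.

```python
import math

def sensor_reading(t_process, t_initial, tau, t_elapsed):
    """First-order lag model: the sensor exponentially approaches the
    process temperature with time constant tau (seconds)."""
    return t_process + (t_initial - t_process) * math.exp(-t_elapsed / tau)

def time_to_settle(tau, fraction=0.99):
    """Time required for the sensor to register the given fraction of a
    step change in process temperature."""
    return -tau * math.log(1.0 - fraction)

# Hypothetical comparison: a lightly sheathed element vs. a heavy thermowell assembly.
for label, tau in (("lightly sheathed element", 3.0), ("element in heavy thermowell", 45.0)):
    print(f"{label}: ~{time_to_settle(tau):.0f} s to reach 99% of a step change")
```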
Let's look at a very simple application.
Heat tracing of piping systems is a common application throughout commercial and industrial settings experiencing periods of cold weather. Electric heat trace installations benefit from having some sort of control over the energy input. This control prevents excessive heating of the piping or applying heat when none is required, a substantial energy-saving measure. A temperature sensor can be installed beneath the piping's insulation layer, strapped to the pipe outer surface. One sensor design option available to improve the performance of the sensor is a surface pad. The surface pad is a metal fixture welded to the sensing end of a temperature sensor assembly. It can be flat, for surface temperature measurements, or angled for installation on a curved surface, like a pipe. The increased surface contact achieved with the surface pad promotes the conduction of heat to the sensor element from the heated pipe in our example. This serves to reduce the response time of the sensor. Adding some thermally conductive paste between the pad and the pipe surface can further enhance performance. While the illustration is simple, the concepts apply across a broad range of potential applications that do not allow immersion of the temperature assembly in a fluid.
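As a rough illustration of the control side of that example, here is a minimal sketch of an on/off controller with a deadband for an electric heat trace circuit; the setpoints and function names are assumptions for illustration, not taken from any particular product.

```python
def heat_trace_control(pipe_temp_c, heater_on, low_setpoint_c=5.0, high_setpoint_c=10.0):
    """On/off control with a deadband: energize the trace when the pipe
    surface drops below the low setpoint, de-energize above the high
    setpoint, and otherwise hold the previous state to avoid rapid cycling."""
    if pipe_temp_c < low_setpoint_c:
        return True       # approaching freezing risk, apply heat
    if pipe_temp_c > high_setpoint_c:
        return False      # warm enough, save energy
    return heater_on      # inside the deadband, hold the last state

# Example: readings from the surface-mounted sensor strapped to the pipe.
heater_on = False
for temp in (12.0, 8.0, 4.5, 6.0, 11.0):
    heater_on = heat_trace_control(temp, heater_on)
    print(f"pipe at {temp:4.1f} degC -> heater {'ON' if heater_on else 'OFF'}")
```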

A simple modification or addition of an option to a standard sensor assembly can deliver substantially improved measurement results in many cases. Share your temperature measurement requirements and challenges with a process measurement specialist. Leverage your own process knowledge and experience with their product application expertise.


Calibration of Process Instrumentation

Image: sanitary RTD temperature transmitter. An industrial temperature transmitter requires periodic calibration to assure reliable performance. Image courtesy of Smart Sensors.
Calibration is an essential part of keeping process measurement instrumentation delivering reliable and actionable information. Every instrument utilized in process control translates an input, the sensed process condition, into an output signal. Calibration ensures the instrument is properly detecting and processing the input so that the output accurately represents the process condition. Typically, calibration involves the technician simulating a process or environmental condition and applying it to the measurement instrument. An input of known quantity is introduced to the instrument, at which point the technician observes how the instrument responds, comparing the instrument output to the known input signal.
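As a minimal sketch of that comparison, with invented test points, span, and tolerance, the following computes the error between an instrument's observed output and the value expected for each known input, expressed as a percentage of span:

```python
def percent_of_span_error(expected, measured, span):
    """Error between measured and expected output, expressed as % of span."""
    return 100.0 * (measured - expected) / span

# Hypothetical 0-100 degC transmitter checked at three known inputs.
span = 100.0          # instrument span in engineering units
tolerance = 0.25      # acceptance limit, % of span
test_points = [       # (known input applied, instrument output observed)
    (0.0, 0.1),
    (50.0, 50.3),
    (100.0, 99.8),
]

for known_input, observed_output in test_points:
    err = percent_of_span_error(known_input, observed_output, span)
    status = "PASS" if abs(err) <= tolerance else "FAIL"
    print(f"input {known_input:6.1f} -> output {observed_output:6.1f}  "
          f"error {err:+.2f}% of span  {status}")
```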

Even if instruments are designed to withstand harsh physical conditions and last for long periods of time, routine calibration as defined by manufacturer, industry, and operator standards is necessary to periodically validate measurement performance. Information provided by measurement instruments is used for process control and decision making, so a difference between an instrument's output signal and the actual process condition can impact process output, overall facility performance, and safety.

In all cases, the operation of a measurement instrument should be referenced, or traceable, to a universally recognized and verified measurement standard. Maintaining the reference path between a field instrument and a recognized physical standard requires careful attention to detail and uncompromising adherence to procedure.

Instrument ranging applies a range of simulated input conditions to an instrument and verifies that the relationship between input and output stays within a specified tolerance across the entire range of input values. Calibration and ranging differ in that calibration focuses on whether the instrument is sensing the input variable accurately, whereas ranging focuses on the correspondence between the instrument's input and output scales. The difference is important to note because re-ranging and re-calibration are distinct procedures.

In order to calibrate an instrument correctly, a reference point is necessary. In some cases, the reference point can be produced by a portable instrument, allowing in-place calibration of a transmitter or sensor. In other cases, precisely manufactured or engineered standards exist that can be used for bench calibration. Documentation of each operation, verifying that proper procedure was followed and calibration values recorded, should be maintained on file for inspection.

As measurement instruments age, they become more susceptible to drift and declining stability. Any time maintenance is performed, calibration should be a required step, confirming that the instrument still conforms to its established calibration data and that all the instruments in a system continue to function together as a coherent process control unit.

Typical calibration timetables vary depending on specifics related to equipment and use. Generally, calibration is performed at predetermined time intervals, with notable changes in instrument performance also being a reliable indicator that an instrument may need a tune-up. A typical recalibration for analog and smart instruments is the zero and span adjustment, where the zero and span values define the instrument's specific range. Accuracy checks at specific input value points may also be included, if deemed significant.
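To illustrate how zero and span define an instrument's range, here is a short sketch assuming a conventional 4-20 mA signal; the range values are hypothetical.

```python
def current_to_engineering(ma, zero_value, span_value):
    """Convert a 4-20 mA signal to engineering units, where zero_value is the
    reading at 4 mA and span_value is the width of the calibrated range."""
    return zero_value + (ma - 4.0) / 16.0 * span_value

def engineering_to_current(value, zero_value, span_value):
    """Inverse conversion: engineering units back to a 4-20 mA signal."""
    return 4.0 + (value - zero_value) / span_value * 16.0

# Hypothetical temperature transmitter ranged 0-150 degC (zero = 0, span = 150).
for ma in (4.0, 12.0, 20.0):
    print(f"{ma:4.1f} mA -> {current_to_engineering(ma, 0.0, 150.0):6.1f} degC")

# Re-ranging to 50-150 degC changes zero to 50 and span to 100; the same
# 12 mA signal now represents a different process temperature.
print(f"12.0 mA after re-ranging -> {current_to_engineering(12.0, 50.0, 100.0):6.1f} degC")
```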

The management of calibration and maintenance operations for process measurement instrumentation is a significant factor in facility and process operation. It can be performed with properly trained and equipped in-house personnel, or with the engagement of subcontractors. Calibration operations can be a significant cost center, with benefits accruing from increases in efficiency gained through the use of better calibration instrumentation that reduces task time.

Measurement of Oxygen in Processing Applications

Image: optical oxygen sensor for process measurement and control. This optical oxygen sensor is one of many oxygen measurement devices. Image courtesy Mettler-Toledo.
The measurement of oxygen is used throughout many industrial processing operations. Knowing about oxygen measurement technology can lead to better measurement performance.

Mettler-Toledo, a recognized leader in process analytical measurement technology, has authored a comprehensive guide to oxygen measurement. Some of the covered topics include:

  • Theoretical background of oxygen measurement
  • Calibration of oxygen sensors
  • Description of oxygen measurement technologies
  • Common challenges with oxygen measurements
  • And more
A copy of the guide is included below. Share your process analytical requirements and challenges with measurement experts, combining your own knowledge and experience with their product application expertise to develop effective solutions. Ask for your own copy of the guide, too.



Direct Reading Level Indicator Gauge for Process Tanks

Image: direct reading tank level gauge. A direct reading level gauge continuously indicates tank liquid level. Image courtesy Jogler.
Anytime there is a process tank, there is a need to know how full it may be. There are numerous methods and technologies that can be applied, with varying levels of complexity and accuracy, to provide a measure and indication of tank liquid level.

A direct reading tank level gauge is essentially an extension of the tank that provides a visible indication of liquid level. The level is not inferred from a pressure reading or tank weight, nor is it represented by the movement of a float or other device. The actual process liquid can be seen by an operator or technician by looking at the clear display area of the gauge.

A direct reading level gauge connects to tank fittings at suitably high and low points along the tank side wall. The connections permit process liquid to flow into the gauge, with the level in the gauge being the same as that in the tank. A scale on the gauge provides a reference point for liquid level that can be recorded or used in other ways in the process. The simple device has no moving parts, requires no calibration, and demands little to no maintenance. It can be the primary level indicating device for a manually operated fill, or act as a backup or local indicator for an automated process.

There are pressure limitations for these indicators. Higher pressure applications, or those with liquids that may foul the clear viewing area of the indicator, are better handled with a magnetic level indicator. Like all instruments, proper application is the key to getting the best performance.

Share your level measurement and indication requirements and challenges with process measurement specialists, combining your own process knowledge and experience with their product application expertise to develop effective solutions.


Heated Impulse Lines on Pressure Gauges and Transmitters

Image: self-regulating heat trace cable. Successive cutaway view of self-regulating heat trace cable showing its various layers of material. Image courtesy BriskHeat.
The temperature of the environment surrounding process equipment and instruments can sometimes have a deleterious impact on their function. A common example is cold weather impact on the impulse lines connecting pressure gauges or transmitters to process piping in outdoor or unheated locations. While the process lines may be large, with sufficient mass flow and insulation to prevent freezing, this may not be the case for small diameter impulse lines. Liquid freezing in cold weather conditions can be a threat to process operation, depending on the type of liquid being used. A common safeguard is to trace the impulse lines with a heat source, counteracting the environmental conditions and maintaining proper operation.

There are a number of ways to deliver heat to an impulse line. Recognize two essential goals, the first being to prevent freezing or other changes to the fluid in the line that would impact the response or accuracy of the instrument reading. The second goal is related to the heat tracing itself: the delivered heat must not be great enough to impact the fluid in the impulse line and generate a false pressure reading. Optimally, heat is delivered in a fashion limited to what is necessary to maintain the impulse line fluid in an ideal working state.

One example of heat tracing an impulse line is the placement of a tube or small diameter pipe, located in close proximity to the line, through which low pressure steam flows. Insulation is applied to the bundle and the steam line serves as a heat source. The tube transfers heat to the impulse line when steam flows. After the steam heats the impulse line, a steam trap accompanying the system collects condensate for return to the boiler. It is also conceivable that the steam line could ultimately vent to atmosphere, with no condensate return. There are a number of concerns that must be addressed in the design of the steam portion of this scenario, since it would be necessary to keep any condensate from freezing under all anticipated operating conditions, including process shutdown.

A second common solution for freeze protection of impulse lines is the installation of electric heat tracing. A two-wire heating cable run along the line serves as protection against the cold; when powered, heat from the cable keeps the line warm. Electric heat tracing is available in a broad range of physical configurations, including cables, tape, blankets, and other flexible and solid shapes. Control of the electric heat system can be accomplished with an external controller and sensor, or a self-regulating heat trace cable can be used. As with a steam heating system, there are some specific considerations for electric heat tracing. Thermal insulation is still considered a best practice. Electric power must be delivered to the installation, and a means of monitoring heat trace performance for faults or failure should be included in the design.

Share your heat tracing requirements and challenges for process piping and other industrial applications with a product specialist. There are many options and product variants from which to choose. A consultation can help direct you to the best solution.

Water Quality Testing - Turbidity Standards

Image: turbidity calibration standards. ProCal turbidity standards are suitable for use with instruments from other manufacturers. Image courtesy HF Scientific.
Turbidity is a commonly measured indicator of water quality. Regardless of the instrument being used, frequent and regular calibration is part of the procedure assuring accurate and traceable results that may be used as evidence of regulatory compliance.

Calibration requires the use of a prepared sample of a known value. HF Scientific, manufacturer of water quality instrumentation, reagents, and standards, provides high quality premixed turbidity standards that are suitable for use with their instruments, as well as those of several other manufacturers.

Share your water quality analysis requirements and challenges with process analytic specialists, combining your own experience and knowledge with their product application expertise to develop effective solutions.


Liquid Level Measurement Using Hydrostatic Pressure

Image: process tanks in a dairy food processing facility. Hydrostatic pressure can be used to measure liquid level in tanks or other vessels.
Pressure measurement is an inferential way to determine the height of a column of liquid in a vessel in process control. The pressure at the bottom of the column is directly proportional to the vertical height of the fluid: hydrostatic pressure equals the weight density of the liquid multiplied by the height of the column. Regardless of whether the vessel is shaped like a funnel, a tube, a rectangle, or a concave polygon, the relationship between the height of the column and the accumulated fluid pressure holds. Weight density depends on the liquid being measured, but the same method is used to determine the pressure.

A common method for measuring hydrostatic pressure is a simple gauge. The gauge is installed at the bottom of a vessel containing a column of liquid and returns a measurement in force per unit area units, such as PSI. Gauges can also be calibrated to return a measurement in units representing the height of liquid, owing to the linear relationship between liquid height and pressure. The density of a liquid allows for a calculation of specific gravity, which expresses how dense the liquid is when compared to water. Calculating the level or depth of a column of milk in a food and beverage industry storage vessel requires the hydrostatic pressure and the density of the milk. With these values, along with some constants, the depth of the liquid can be calculated.
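As a minimal sketch of that calculation, with an assumed pressure reading and an assumed specific gravity for milk, the depth follows directly from the hydrostatic pressure and the liquid density:

```python
G = 9.80665                # gravitational acceleration, m/s^2
WATER_DENSITY = 998.0      # kg/m^3, approximate at room temperature

def level_from_pressure(pressure_pa, specific_gravity):
    """Liquid depth (m) above the gauge from hydrostatic pressure:
    h = P / (rho * g), with rho derived from specific gravity."""
    density = specific_gravity * WATER_DENSITY
    return pressure_pa / (density * G)

# Hypothetical milk storage tank: gauge at the bottom reads 25 kPa,
# milk specific gravity assumed to be roughly 1.03.
depth_m = level_from_pressure(25_000.0, 1.03)
print(f"liquid depth ~ {depth_m:.2f} m")
```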

The liquid depth measurement can be combined with known dimensions of the holding vessel to calculate the volume of liquid in the container. One measurement is made and combined with a host of constants to determine liquid volume. The density of the liquid must be constant in order for this method to be effective. Density variation would render the hydrostatic pressure measurement unreliable, so the method is best applied to operations where the liquid density is known and constant.

Interestingly, changes in liquid density will have no effect on the measurement of liquid mass, as opposed to volume, as long as the cross-sectional area of the vessel storing the liquid remains constant. If a liquid inside a partially full vessel were to experience a temperature increase, resulting in an expansion of volume with correspondingly lower density, the transmitter will still calculate the correct mass of the liquid, since the increase in liquid height is offset by the proportional decrease in the liquid's density, leaving the hydrostatic pressure unchanged. The intersecting relationships between the process variables in hydrostatic pressure measurement demonstrate both the flexibility of process instrumentation and how consistently reliable measurements depend on a number of process related factors.
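That density independence can be checked with a short sketch using assumed vessel dimensions and densities: since P = rho * g * h and m = rho * A * h in a constant-area vessel, the mass reduces to m = P * A / g, and density drops out.

```python
G = 9.80665   # gravitational acceleration, m/s^2

def mass_from_pressure(pressure_pa, vessel_area_m2):
    """Mass of liquid above the gauge in a constant-area vessel:
    m = rho*A*h and P = rho*g*h, so m = P*A/g, independent of density."""
    return pressure_pa * vessel_area_m2 / G

# Same liquid before and after a temperature rise: density drops, level rises,
# and the product rho*h (hence the pressure and the computed mass) is unchanged.
area = 2.0   # m^2, assumed vessel cross-section
for density, height in ((1030.0, 2.40), (1010.0, 2.40 * 1030.0 / 1010.0)):
    pressure = density * G * height
    print(f"rho={density:6.1f} kg/m^3, h={height:5.2f} m -> "
          f"mass ~ {mass_from_pressure(pressure, area):7.1f} kg")
```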

Solutions to process instrumentation and measurement challenges are most effective when developed in concert with a product application specialist. The combination of user knowledge and experience with product application expertise will lead to a successful project.