What is an RTD sensor?

If you need to know what an RTD sensor is to potentially use in your operations, you’re in the right place. In this article, we’ll be covering not only what the sensor is, but also how it works, how to test it, and everything else you might need to be aware of when it comes to this type of sensor. 

So, what exactly is an RTD sensor? 

RTD stands for Resistance Temperature Detector: a sensor whose electrical resistance changes as its temperature changes, which is what allows it to measure temperature. The resistance of an RTD sensor goes up as the temperature increases. Many RTDs are known as wire wound, made up of fine wire wrapped around a ceramic or glass core. 

The wire itself is typically made of platinum. RTD elements sit inside a protective probe that shields them from the environment they are used in and makes them more durable. Cheaper RTDs, known as thin film RTDs, have a ceramic base with a fine platinum track laid on it. 

How does a resistance thermometer work? 

As previously mentioned, an RTD includes a resistance element, usually platinum, and insulated lead wires. In some cases, RTDs have three or four wires to improve accuracy and allow errors relating to connection lead resistance to be eliminated. The resistance element is made from platinum because it is durable, stable over long periods, chemically inert, and has a near-linear relationship between temperature and resistance across a broad temperature range. 

In terms of its function, the RTD works on a basic principle: measuring the resistance a metal presents to the flow of electricity. The higher the temperature of the metal, the higher its resistance. A small electrical current passes through the sensor and the resistance of the element is measured. As the temperature of the resistance element goes up, its electrical resistance increases too. 

Electrical resistance is measured in ohms. Once measured, the resistance value can be converted into a temperature based on the known characteristics of the element. Normally, the response time for an RTD is between 0.5 and 5 seconds, making them well-suited to a wide variety of applications. 
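As a sketch of this resistance-to-temperature conversion, the standard Callendar–Van Dusen relation for platinum elements (coefficients per IEC 60751, valid at or above 0 °C) can be inverted with the quadratic formula. Assuming a Pt100 element here is our illustration choice; the article does not name a specific element type.

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for platinum RTDs (t >= 0 C):
# R(t) = R0 * (1 + A*t + B*t^2)
A = 3.9083e-3
B = -5.775e-7
R0 = 100.0  # ohms at 0 C for a Pt100 element

def pt100_to_celsius(resistance_ohms):
    """Invert R(t) = R0*(1 + A*t + B*t^2) for t >= 0 C via the quadratic formula."""
    discriminant = A * A - 4.0 * B * (1.0 - resistance_ohms / R0)
    return (-A + math.sqrt(discriminant)) / (2.0 * B)

print(round(pt100_to_celsius(100.0), 2))   # about 0 C (ice water)
print(round(pt100_to_celsius(109.73), 1))  # about 25 C (room temperature)
```

The same relation, run forward, is what lets a transmitter tabulate expected resistances for any temperature in the element's range.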

How to test an RTD sensor 

To test your RTD heat trace sensor, first set your multimeter to resistance mode. Next, check the reading across the terminals of the RTD; at room temperature it should be about 110 ohms for a standard Pt100 element. Remember that the value will vary with the temperature of the room. 

Lastly, put the RTD temperature sensor into ice water and check the reading again after it has been in the water for a few minutes. You should now see a lower number than the room-temperature reading (roughly 100 ohms). 
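The multimeter checks above can be sketched as a quick plausibility test. The thresholds below are illustrative assumptions for a Pt100-style element, not published limits; adjust them for other element types.

```python
def classify_pt100_reading(ohms):
    """Rough plausibility check for a multimeter reading across a Pt100 RTD.
    Thresholds are illustrative: a healthy Pt100 reads about 100 ohms in
    ice water and about 110 ohms at room temperature."""
    if ohms < 5:
        return "short circuit - sensor likely damaged"
    if ohms > 200:
        return "open circuit - broken element or lead"
    if 95 <= ohms <= 105:
        return "plausible ice-water reading (about 0 C)"
    if 105 < ohms <= 115:
        return "plausible room-temperature reading"
    return "reading outside expected band - check element type"

print(classify_pt100_reading(110))  # room-temperature check
print(classify_pt100_reading(100))  # ice-water check
```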

What’s the difference between an RTD sensor and a thermocouple? 

There are several differences between RTD sensors and thermocouples. We’re going to outline the most important ones for you below. 

  • Thermocouples are typically smaller than RTDs, making them easier to use. 
  • Thermocouples have a wider operating temperature range than RTDs (roughly −200 to 2000 °C, compared with −200 to 600 °C), so they are suited to a longer list of applications. 
  • Thermocouples have a response time of between 0.1 and 10 seconds, which is quicker than RTD response times. 
  • RTDs run the risk of self-heating, whereas this is only a negligible problem with thermocouples. 
  • Thermocouples are more sensitive than RTD sensors because they react faster to variations in temperature. 
  • The relationship between output and temperature isn’t linear with thermocouples, whereas an RTD’s resistance varies almost linearly with temperature. 


For certain applications, an RTD sensor can be extremely useful in temperature measurement, but for others a thermocouple might be the better option. Contact TRM today to discuss your needs and our specialist team will be able to advise and provide you with a bespoke solution that meets the exact requirements of your operation. 

How to improve temperature measurement response time

When it comes to the four main process variables (flow, level, pressure, and temperature), temperature is the only one where a sudden change can’t be recognised and measured relatively quickly. A quick movement in one of the other variables can be detected through instrumentation within a few seconds, but an increase or drop in temperature can take a while to quantify completely. 

For most, this is just a fact of life: we recognise the characteristic and live with it, as temperature doesn’t typically change very fast anyway. However, there are some situations, such as industrial temperature measurement, where the lag can cause big problems. Fortunately, there are a number of ways to improve the response time of temperature measurement, which we’ll be looking at in this article. 

First, why is there a lag in temperature measurement response times?

The two main methods of electronic temperature measurement are resistance (RTD sensor and thermistor) and voltage (thermocouple). In both approaches the value is sensed at the tip of the sensor, which might or might not be exactly equal to the process media temperature. Ensuring an accurate measurement depends on bringing the sensing element to the same temperature as the process media. This may sound like a fairly easy problem to solve, but it can be difficult in real applications. 

Using infrared to measure optically is pretty much instantaneous; however, it does come with serious limitations when measuring the temperature of gas or liquid (the most common types of process media). Infrared can be useful for several things, but for the majority of process temperature measurement applications, its applicability is limited, so we will be ignoring it for this article. 

A sensing element is normally enclosed in a stainless steel (sometimes platinum) sheath around 0.25 inches in diameter. The length can vary, but the diameter is usually this size or smaller. The protective sheath is especially important for RTDs because of the delicacy of the sensing element. Thermocouples can come as bare wires without a sheath, but this is the exception rather than the rule. 

Heat from the process needs to be transferred through the sheath, and through whatever insulation is packed inside it, to reach the sensor, and this is what causes the delay. Stainless steel is one of the most versatile alloys ever created, but it has one big drawback: it’s a poor conductor of heat. However, its many advantages outweigh this shortfall, making it the main material used for temperature sensor sheaths. 

The alloy determines the thermal conductivity of a sheath, but the time heat takes to reach the sensor is also influenced by the sheath’s size and wall thickness. The more material there is, the longer it takes for heat to move through the sheath wall to the sensor. As mentioned above, within the sheath the sensor is encased in an insulating material that protects it electrically and physically, so the heat coming through the sheath must also warm the insulation before it reaches the sensor itself. 

Temperature difference is also a key factor: the greater the difference, the quicker the heat transfer. The rate of change slows as the measured temperature approaches equilibrium with the actual process temperature. 
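That behaviour, where heat flows faster the bigger the temperature difference, makes the sensor act roughly like a first-order lag, which we can sketch in a minimal simulation. The 5-second time constant is an assumed, illustrative value, not a property of any particular sensor.

```python
import math

def simulate_sensor(t_process, t_start, tau, duration, dt=0.001):
    """First-order lag: dT/dt = (T_process - T_sensor) / tau, Euler integration."""
    t_sensor = t_start
    for _ in range(int(duration / dt)):
        t_sensor += (t_process - t_sensor) / tau * dt
    return t_sensor

tau = 5.0  # assumed time constant in seconds
# After one time constant the sensor has covered ~63.2% of the step:
reading = simulate_sensor(t_process=100.0, t_start=20.0, tau=tau, duration=tau)
print(round(reading, 1))  # close to 20 + 0.632 * 80 = 70.6
```

Everything discussed below, such as sheath thickness, insulation, and thermowell air gaps, effectively lengthens this time constant.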

Getting past another layer 

There’s another big complication: a sensor is not usually inserted into the process by itself. In some real-world applications it can be, but for the most part a sheathed sensor is inserted into a thermowell, which is in turn inserted into the process. The thermowell is part of the process containment and enables the sensor to be removed if needed without shutting the process down. 

This has advantages, but it adds another layer of metal through which the heat has to pass to reach the sensor. 

Also, there is space between the inside of the thermowell and the sensor sheath, reducing the physical contact between the sensor and the process media. The inside of the thermowell effectively becomes an oven, and much of the heat transfer has to happen through air instead of direct metal-to-metal contact. 

How to improve response time 

So far, we have looked at the causes of slow response time, but what can you do to improve it? Firstly, you can improve the contact between the sensor sheath and the thermowell. Check that the sensor is completely inserted into the thermowell; if it isn’t, this is quick and easy to fix. In some cases, sensors can be spring-loaded to keep the tip pressed securely against the end of the thermowell. 

Look for interior debris and internal deposits in the thermowell. Thermowells are not always made from stainless steel, and more reactive alloys sometimes corrode, forming internal insulation; clean out any debris you find. 

Confirm that the sensor sheath is the right size for the thermowell. The fit should be as close as possible to maximise contact; if the thermowell bends or has debris inside, it can be tempting to compensate with an undersized sensor, but that only widens the air gap. 

Add a little silicone oil to the thermowell to improve heat transfer, as long as all the debris is cleared out and the installation is oriented so the oil can’t leak out. This reduces the effect of any internal air gap. Next, look at more involved solutions that reduce the amount of metal between the sensor and the process. It’s not really possible to change the sheath itself, so this relates mostly to the thermowell. 

Use the thinnest thermowell you can. You need to be careful doing this, as the thermowell is part of the process containment, but if it isn’t in a moving fluid stream and is quite short, don’t make it any thicker than required. You can also change the profile of the thermowell: if you’re worried about structural integrity because of fluid flow, a stepped or tapered thermowell could be an option. 


If these alterations aren’t enough to reduce the response time, more drastic measures could be needed. These might include changing the location of the sensor, adding more sensors, or rethinking the temperature regulation strategy overall. Thankfully, whatever the approach, the range of temperature measurement options can offer a workable solution. Contact TRM today to discuss your industrial temperature measurement needs. 

How accurate are infrared thermometers and how are they used?

The main use of infrared (IR) thermometers is to measure temperature across various industrial and clinical environments. They are contact-free temperature measurement devices that perform especially well when the object to be measured is fragile or dangerous to get close to, or when it’s not practical to use other types of thermometers. 

A key thing to note about infrared thermometers is that they use the concept of infrared radiation to gauge the surface temperature of objects without any physical contact. In this article, we’ll be looking in more depth at exactly how accurate infrared thermometers are and what you need to consider when selecting one. 

How does an infrared thermometer work? 

In a similar way to visible light, infrared light can be reflected, focused, or absorbed. An infrared thermometer uses a lens to focus the infrared light coming from the object onto a detector called a thermopile, which is simply several thermocouples connected in series or parallel. 

When the infrared radiation hits the thermopile surface it is absorbed and converted into heat, creating a voltage output in proportion to the incident infrared energy. The electronics use this output to work out the temperature, which is then shown on the screen. 

This might sound like a complex process but when put into practice, it only takes the thermometer a few seconds to record the temperature and display it in your chosen unit. 
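The conversion from incident energy to a displayed temperature can be sketched with the Stefan–Boltzmann law: radiated power scales with the fourth power of absolute temperature, so the electronics can solve backwards from the thermopile signal. The sensitivity constant `K_DEVICE` below is an illustrative assumption, not a real device calibration.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
K_DEVICE = 1e-4         # illustrative sensitivity, volts per W/m^2 (assumed)

def thermopile_voltage(t_target_k, t_detector_k):
    """Idealised thermopile: output proportional to the difference in
    fourth-power radiated energy between target and detector."""
    return K_DEVICE * SIGMA * (t_target_k**4 - t_detector_k**4)

def target_temperature(voltage, t_detector_k):
    """Invert the idealised model to recover the target temperature (K)."""
    return (voltage / (K_DEVICE * SIGMA) + t_detector_k**4) ** 0.25

# Boiling water (373.15 K) viewed by a detector at room temperature (298.15 K):
v = thermopile_voltage(373.15, 298.15)
print(round(target_temperature(v, 298.15), 2))  # recovers roughly 373.15 K
```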

How to get the most accurate results from an infrared thermometer 

There has been widespread discussion regarding the accuracy of infrared thermometers. To make sure that you get the most accurate results, we have put together some quick tips to help with your quality control. 

  • Check that you know your thermometer’s distance-to-spot ratio (D/S ratio) and get close enough to the target so that it only checks the area you want it to measure. 
  • Keep in mind that dust and steam can impact the accuracy of infrared thermometers. 
  • Ensure your thermometer lens is clean and free of any scratches that could alter the results. 
  • Look out for, and account for, shiny “low emissivity” objects when using your thermometer. 
  • Allow enough time for your thermometer to accurately adjust to the temperature of its surroundings. 

What do you need to consider when choosing an infrared thermometer? 


Accuracy and distance-to-spot ratio 

As we’ve just been looking at above, the most important attribute of any thermometer is its accuracy. The accuracy of an infrared thermometer depends on its D/S ratio, which gives the maximum distance from which the thermometer can accurately measure a certain surface area. 

For instance, if you are measuring the surface temperature of a 4-inch area with an infrared thermometer that has a D/S ratio of 8:1, the furthest distance from which you could get an accurate temperature reading is 32 inches (8 × 4 = 32). The bigger the ratio, therefore, the further away you can measure from. However, as the distance increases, so does the surface area being averaged. 
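The D/S arithmetic from the example above can be sketched as:

```python
def max_measuring_distance(ds_ratio, spot_diameter):
    """Maximum distance at which a thermometer with the given
    distance-to-spot ratio still resolves the target spot size."""
    return ds_ratio * spot_diameter

def spot_size_at(distance, ds_ratio):
    """Spot diameter the thermometer actually averages over at a distance."""
    return distance / ds_ratio

print(max_measuring_distance(8, 4))  # 32 inches, as in the example
print(spot_size_at(48, 8))           # at 48 inches the spot grows to 6 inches
```

The second function shows the trade-off: step back beyond the maximum distance and the spot grows larger than the target, so the reading blends in surrounding surfaces.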


Emissivity 

The emissivity setting of an infrared thermometer describes how efficiently the target surface emits infrared energy compared with a perfect emitter. Thermometers with an emissivity setting close to 1.00 can measure more materials than those with a lower fixed value. It’s beneficial to choose a thermometer with an adjustable emissivity setting, so you can compensate for the energy reflected, rather than emitted, by the material being measured. 
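As a sketch of why an adjustable emissivity setting matters: a low-emissivity surface reflects some ambient radiation, so the raw reading mixes emitted and reflected energy. The grey-body model below is a textbook idealisation with assumed example numbers, not any specific instrument's algorithm.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def observed_radiance(t_surface_k, emissivity, t_ambient_k):
    """Radiance seen by the thermometer: energy emitted by the surface
    plus ambient energy reflected off the (grey-body) surface."""
    return SIGMA * (emissivity * t_surface_k**4 +
                    (1.0 - emissivity) * t_ambient_k**4)

def true_temperature(radiance, emissivity, t_ambient_k):
    """Solve the grey-body model for the surface temperature (K)."""
    return ((radiance / SIGMA - (1.0 - emissivity) * t_ambient_k**4)
            / emissivity) ** 0.25

# A 350 K surface with emissivity 0.80 in 293 K surroundings:
w = observed_radiance(350.0, 0.80, 293.0)
print(round(true_temperature(w, 0.80, 293.0), 2))  # recovers roughly 350.0 K
```

Set the wrong emissivity in `true_temperature` and the recovered value shifts, which is exactly the error an adjustable setting lets you dial out.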

Temperature range 

An IR thermometer’s temperature range influences the work you can do with it. You might want to get a thermometer with a broad temperature range to capture different processes with different temperatures. On the other hand, a thermometer with a narrower temperature range is advantageous in cases where high resolutions are needed to ensure the proper temperature control of a specific process. 

Reading speed 

Reading speed is the time the thermometer takes to provide a clear and accurate reading. This is key when measuring the temperature of a moving object, or in situations where objects warm up quickly. 


Durability 

When it comes to industrial temperature measurement, an IR thermometer needs a rugged design. For example, no-lens and Fresnel lens thermometers are more durable thanks to their protective polymer construction. By contrast, Mica lens thermometers require a more durable shell, and a carrying case included in their design, to stop the lens from cracking. 


Infrared thermometers are invaluable for reading the temperature of a surface that is difficult, dangerous, or practically impossible to reach. When positioned and used correctly they can be highly accurate and effective, as well as quick and easy to use. However, before choosing an infrared thermometer, it’s advisable to determine the temperature range of your application. If you need help with industrial temperature measurement in your operations, contact TRM today. 


Advantages of mineral wool insulation

Mineral wool is often used as an insulation material because of its helpful properties, such as being affordable and easy to handle. There are two types of mineral wool: rock and glass. They are made from slightly different materials, and whilst they’re fairly similar and often used interchangeably in certain applications, there are other situations where one will work better than the other. 

In this guide, we’ll be looking at the key advantages of mineral wool insulation in relation to high-temperature measurement, so you know how it can work for your operations. 

Firstly, what is mineral wool insulation? 

Mineral wool consists of spun fibres made from melted glass or stone. The threads are combined in a specific way to create a woolly structure, and the wool is then pressed into boards or batts that function as insulation material. 

Loose wool can also be blown into hollow spaces such as cavity walls. It is a popular product that can be used for: 

  • Insulating walls (with a timber frame construction) 
  • Insulating cavity walls and exterior walls 
  • Thermal and acoustic insulation of partition walls and storey floors 
  • Insulating attic floors 
  • Insulating pitched roofs and flat roofs 
  • Multiple industrial applications (such as machines, air conditioners, etc) 

What are the advantages of mineral wool insulation?

Good thermal insulation

Mineral wool has an open fibre structure, which means it can hold a large amount of air, making it a great insulator for regulating heat. The lambda (thermal conductivity) value of this type of insulation is 0.030 W/mK to 0.040 W/mK, and neither rock nor glass wool is susceptible to thermal degradation, so it will maintain the same level of insulation over the lifetime of the building. Mineral wool insulation is also dimensionally stable, meaning it doesn’t expand or shrink. This keeps the joints between the material closed and thermal bridges to a minimum. 
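To illustrate what the lambda value means in practice, Fourier's law gives the steady-state heat flux through a flat layer: flux = lambda × temperature difference ÷ thickness. The 100 mm layer and 20 °C difference below are assumed example numbers.

```python
def heat_flux(lambda_w_mk, thickness_m, delta_t):
    """Steady-state conductive heat flux through a flat layer (W/m^2),
    per Fourier's law: q = lambda * dT / d."""
    return lambda_w_mk * delta_t / thickness_m

# 100 mm of mineral wool (lambda 0.035 W/mK) across a 20 C difference:
print(round(heat_flux(0.035, 0.10, 20.0), 2))  # about 7 W per square metre
```

The lower the lambda value, the lower the flux for the same layer, which is why the 0.030–0.040 W/mK range makes mineral wool an effective insulator.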

Fire safety

Mineral wool insulation is non-combustible and a poor conductor of heat, meaning it’s ideal for use in environments that put high demands on fire safety and industrial temperature measurement. It is therefore commonly used in fire-retardant products such as partition walls, fireproof doors, ceilings, and protective clothing. 

High levels of fire safety are an essential requirement for insurance companies in any type of building, and the use of fireproof insulation can in some cases be compulsory. When it comes to fire safety, mineral wool insulation is classified under Euroclass A, the best score of all insulation materials, which makes it well suited to high-temperature applications. 

Impressive soundproofing

Mineral wool insulation is effective against noise pollution due to its structure and composition. There are also special acoustic tiles for ceilings, walls, and floors that absorb sound waves, making them very useful in both industrial and consumer applications. 

For the latter, rock wool blankets are typically used in walls, floors, and ceilings. In the case of false or partition walls, a combination of plasterboard and mineral wool is often a good idea for absorbing sound waves. The frames need to be acoustically separated as much as possible to prevent contact bridges between the boards. 

Other benefits of mineral wool insulation include: it’s less expensive than other materials; it doesn’t absorb moisture, so it’s immune to mould; it’s fully recyclable; it has a minimal ecological footprint; and it has a wide range of applications. 

What’s the difference between glass wool and rock wool? 

As previously mentioned, glass and rock wool are very similar insulation materials. The main difference between them relates to the fibre structure: the fibres in rock (or stone) wool are shorter than those in glass wool, so rock wool has a higher density. 

Rock wool can withstand a higher pressure than glass wool. You can see the key differences between the two types of wool in the table below. 

Rock wool | Glass wool 
Shorter fibres | Longer fibres 
Higher density | Lower density 
Lambda value 0.032–0.044 W/mK | Lambda value 0.035–0.039 W/mK 
High fire resistance | Slightly lower fire resistance 
High elasticity | Low elasticity 
High tensile strength | Low tensile strength 
Melting temperature around 1000 °C | Melting temperature around 700 °C 


Mineral wool insulation has many advantages that make it an important part of thermal resources management. Contact our team today to discuss your heat trace and temperature measurement needs. 

