What is an RTD sensor?

If you’re considering using an RTD sensor in your operations, you’re in the right place. In this article, we’ll be covering not only what the sensor is, but also how it works, how to test it, and everything else you might need to be aware of when it comes to this type of sensor. 

So, what exactly is an RTD sensor? 

RTD stands for Resistance Temperature Detector: a sensor whose electrical resistance changes as its temperature changes, which is what allows it to be used to measure temperature. The resistance of an RTD rises as the temperature increases. Many RTDs are known as wire wound, and they are made up of fine wire wrapped around a ceramic or glass core. 

The wire itself is typically made of platinum. RTD elements sit inside a protective probe in order to protect them from the environment they are used in and to make them more durable. Cheaper RTDs are known as thin film RTDs. They have a ceramic base with a fine platinum track placed on it. 

How does a resistance thermometer work? 

As previously mentioned, an RTD consists of a resistance element and insulated connecting wires. Some RTDs have three or four wires for higher accuracy, allowing errors caused by connection lead resistance to be eliminated. The resistance element is made from platinum because it is very durable, remains stable over long periods, has a near-linear relationship between temperature and resistance, covers a broad temperature range, and is chemically inert. 

In terms of its function, the RTD works on a basic principle: measuring the resistance of a metal to the flow of electricity. The higher the temperature of the metal, the higher its resistance. A small electrical current is passed through the sensor, and the resistance of the element is measured. As the temperature of the resistance element goes up, its electrical resistance increases too. 

Electrical resistance is measured in ohms. Once measured, the resistance value can be converted to temperature based on the characteristics of the element. Normally, the response time for an RTD is between 0.5 and 5 seconds, making them well suited to a wide variety of applications. 
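As a sketch of that resistance-to-temperature conversion, the following assumes a standard Pt100 element (the most common platinum RTD) and uses the IEC 60751 Callendar-Van Dusen coefficients, which are valid for temperatures at or above 0 °C. The function name is illustrative, not from any particular library:

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for a standard Pt100
# (alpha = 0.00385); this simplified form is valid for T >= 0 degrees C.
R0 = 100.0        # resistance at 0 degrees C, in ohms
A = 3.9083e-3
B = -5.775e-7

def pt100_temperature(resistance_ohms: float) -> float:
    """Convert a measured Pt100 resistance (ohms) to temperature in degrees C."""
    # Solve R = R0 * (1 + A*T + B*T^2) for T with the quadratic formula.
    discriminant = A * A - 4 * B * (1 - resistance_ohms / R0)
    return (-A + math.sqrt(discriminant)) / (2 * B)

print(round(pt100_temperature(100.0), 2))    # 0.0  (100 ohms at 0 degrees C)
print(round(pt100_temperature(138.51), 2))   # ~100 degrees C
```

In practice, a temperature transmitter or data logger performs this conversion internally; the sketch simply shows the arithmetic behind it.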

How to test an RTD sensor 

To test your RTD heat trace sensor, first set your multimeter to resistance mode. Next, check the reading across the terminals of the RTD; at room temperature it should be about 110 ohms for a Pt100 element. Remember that the exact value will vary with the temperature of the room.  

Lastly, put the RTD temperature sensor into ice water and check the reading again after it has been in the water for a few minutes. You should now see a lower value than the room temperature reading (roughly 100 ohms). 
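The expected readings in this test can be sanity-checked with the forward Pt100 equation. This sketch assumes a standard Pt100 element per IEC 60751; the function name is illustrative:

```python
def pt100_resistance(temp_c: float) -> float:
    """Expected Pt100 resistance in ohms for T >= 0 degrees C (IEC 60751)."""
    R0, A, B = 100.0, 3.9083e-3, -5.775e-7
    return R0 * (1 + A * temp_c + B * temp_c ** 2)

print(round(pt100_resistance(25.0), 1))  # ~109.7 ohms at room temperature
print(round(pt100_resistance(0.0), 1))   # 100.0 ohms in ice water
```

A healthy sensor should track these values closely; a reading of zero (short circuit) or infinity (open circuit) indicates a faulty element or wiring.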

What’s the difference between an RTD sensor and a thermocouple? 

There are several differences between RTD sensors and thermocouples. We’re going to outline the most important ones for you below. 

  • Thermocouples are typically smaller than RTDs, making them easier to use. 
  • Thermocouples have a wider operating temperature range than RTDs (roughly -200 °C to 2000 °C, compared with -200 °C to 600 °C), so they suit a longer list of applications. 
  • Thermocouples have a response time of between 0.1 and 10 seconds, which is quicker than RTD response times. 
  • RTDs run the risk of self-heating, whereas this is only a negligible problem with thermocouples. 
  • Thermocouples are more sensitive than RTD sensors because they react faster to variations in temperature. 
  • A thermocouple’s output isn’t linear with temperature, whereas an RTD’s resistance-temperature relationship is close to linear. 

Conclusion 

For certain applications, an RTD sensor can be extremely useful in temperature measurement, but for others a thermocouple might be the better option. Contact TRM today to discuss your needs and our specialist team will be able to advise and provide you with a bespoke solution that meets the exact requirements of your operation. 

How accurate are infrared thermometers and how are they used?

The main use of infrared (IR) thermometers is to measure temperature across various industrial and clinical environments. They are contact-free temperature measurement devices that perform especially well where the object to be measured is fragile or dangerous to get close to, or when it’s not practical to use other types of thermometers. 

A key thing to note about infrared thermometers is that they use the concept of infrared radiation to gauge the surface temperature of objects without any physical contact. In this article, we’ll be looking in more depth at exactly how accurate infrared thermometers are and what you need to consider when selecting one. 

How does an infrared thermometer work? 

In a similar way to visible light, infrared light can be reflected, focused, or absorbed. An infrared thermometer uses a lens to focus the infrared light coming from the object onto a detector called a thermopile, which is simply several thermocouples connected in series or parallel. 

When infrared radiation hits the thermopile surface, it is absorbed and converted into heat, producing a voltage output in proportion to the incident infrared energy. The detector uses this output to calculate the temperature, which is then shown on the display. 

This might sound like a complex process but when put into practice, it only takes the thermometer a few seconds to record the temperature and display it in your chosen unit. 

How to get the most accurate results from an infrared thermometer 

There has been widespread discussion regarding the accuracy of infrared thermometers. To make sure that you get the most accurate results, we have put together some quick tips to help with your quality control. 

  • Check that you know your thermometer’s distance-to-spot ratio (D/S ratio) and get close enough to the target so that it only checks the area you want it to measure. 
  • Keep in mind that dust and steam can impact the accuracy of infrared thermometers. 
  • Ensure your thermometer lens is clean and free of any scratches that could alter the results. 
  • Look out for, and account for, shiny “low emissivity” objects when using your thermometer. 
  • Allow enough time for your thermometer to accurately adjust to the temperature of its surroundings. 

What do you need to consider when choosing an infrared thermometer? 

Accuracy 

As we’ve just seen, the most important characteristic of any thermometer is its accuracy. The accuracy of infrared thermometers depends on the D/S ratio, which describes the maximum distance from which the thermometer can accurately measure a given surface area. 

For instance, if you are measuring the surface temperature of a 4-inch area with an infrared thermometer that has a D/S ratio of 8:1, the furthest distance from which you could get an accurate temperature reading will be 32 inches (8 × 4). Therefore, the bigger the ratio, the greater the distance you can measure the temperature from. However, as the distance increases, so does the surface area being measured. 
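The D/S calculation above amounts to a single multiplication. A minimal sketch (the function name is illustrative):

```python
def max_measurement_distance(ds_ratio: float, spot_diameter: float) -> float:
    """Furthest distance (in the same units as spot_diameter) at which the
    thermometer's measurement spot still fits within the target area."""
    return ds_ratio * spot_diameter

# 8:1 D/S ratio, 4-inch target area:
print(max_measurement_distance(8, 4))  # 32 (inches)
```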

Emissivity 

Emissivity is a measure of how efficiently a surface emits infrared energy compared with a perfect black body (which has an emissivity of 1.00). IR thermometers designed for emissivity values close to 1.00 can measure more materials than those with a lower fixed value. It is beneficial to choose a thermometer with an adjustable emissivity setting, so you can compensate for the energy reflected by the material whose temperature you are measuring. 
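As a simplified illustration of why the emissivity setting matters: radiated power scales with emissivity times the fourth power of absolute temperature (the Stefan-Boltzmann law), so a mismatched setting skews the reading. The sketch below ignores reflected ambient radiation and detector bandwidth, so it is a first-order approximation only, and the function name is illustrative:

```python
def true_temperature_k(indicated_k: float, set_emissivity: float,
                       actual_emissivity: float) -> float:
    """First-order estimate of the true surface temperature (kelvin) when the
    thermometer's emissivity setting does not match the actual surface."""
    # Equate radiated power: set_e * T_indicated^4 = actual_e * T_true^4
    return indicated_k * (set_emissivity / actual_emissivity) ** 0.25

# A shiny metal surface (emissivity ~0.2) read with the setting left at 0.95
# reads far too low; the true temperature is much higher than indicated:
print(round(true_temperature_k(400.0, 0.95, 0.2), 1))
```

This is why adjusting the emissivity setting, or applying a matte high-emissivity target (such as tape or paint) to a shiny surface, is standard practice.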

Temperature range 

An IR thermometer’s temperature range influences the work you can do with it. You might want to get a thermometer with a broad temperature range to capture different processes with different temperatures. On the other hand, a thermometer with a narrower temperature range is advantageous in cases where high resolutions are needed to ensure the proper temperature control of a specific process. 

Reading speed 

Reading speed is the time the thermometer takes to provide a clear and accurate reading. This is key when measuring the temperature of a moving object, or in situations where objects warm up quickly. 

Design 

When it comes to industrial temperature measurement, an IR thermometer should have a rugged design if it is to keep giving accurate measurements. For example, no-lens and Fresnel lens thermometers are more durable thanks to their protective polymer structure. By contrast, mica lens thermometers require a more durable shell, and a carrying case included in their design, to stop the lens from cracking. 

Conclusion 

Infrared thermometers are invaluable for reading the temperature of a surface that is difficult, dangerous, or impractical to reach. When positioned and used correctly, they can be highly accurate and effective, as well as quick and easy to use. However, before choosing an infrared thermometer, it’s advisable to determine the temperature range of your application. If you need help with industrial temperature measurement in your operations, contact TRM today. 

  

Advantages of mineral wool insulation

Mineral wool is often used as an insulation material because of its helpful properties, such as being affordable and easy to handle. There are two types of mineral wool: rock and glass. They are made from slightly different materials, and whilst they’re fairly similar and often used interchangeably in certain applications, there are situations where one will work better than the other.  

In this guide, we’ll be looking at the key advantages of mineral wool insulation in relation to high-temperature measurement, so you know how it can work for your operations. 

Firstly, what is mineral wool insulation? 

Mineral wool consists of spun fibres made from melted glass or stone. The threads are combined in a specific way to create a woolly structure, and the wool is then compressed into boards or mineral wool batts that function as insulation material.  

Loose wool can also be blown into hollow spaces such as cavity walls. It is a popular product that can be used for: 

  • Insulating walls (with a timber frame construction) 
  • Insulating cavity walls and exterior walls 
  • Thermal and acoustic insulation of partition walls and storey floors 
  • Insulating attic floors 
  • Insulating pitched roofs and flat roofs 
  • Multiple industrial applications (such as machines, air conditioners, etc) 

What are the advantages of mineral wool insulation?

Good thermal insulation

Mineral wool has an open fibre structure, which means it can hold a large amount of air, making it a great insulator for regulating heat. The lambda (thermal conductivity) value of this type of insulation is 0.03 W/mK to 0.04 W/mK. Both rock and glass wool insulation are also not susceptible to thermal degradation, so they will maintain the same level of insulation over the lifetime of the building. In addition, mineral wool insulation is dimensionally stable, meaning it doesn’t expand or shrink; this keeps the joints between the material closed and thermal bridges to a minimum. 
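The lambda value feeds directly into the standard steady-state heat-loss calculation for a flat layer, Q = λ·A·ΔT/d. A minimal sketch with illustrative figures:

```python
def heat_loss_watts(lambda_w_mk: float, area_m2: float,
                    delta_t_k: float, thickness_m: float) -> float:
    """Steady-state conductive heat flow (watts) through a flat
    insulation layer: Q = lambda * A * dT / d."""
    return lambda_w_mk * area_m2 * delta_t_k / thickness_m

# Example: 10 m^2 wall, 100 mm of mineral wool (lambda = 0.035 W/mK),
# 20 K temperature difference across the layer:
print(round(heat_loss_watts(0.035, 10.0, 20.0, 0.1), 1))  # 70.0 W
```

Doubling the insulation thickness halves the heat flow, which is why the lambda value and layer thickness are usually considered together.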

Fire safety

Mineral wool insulation is fully resistant to fire and doesn’t conduct heat, meaning it’s ideal for use in environments that put high demands on fire safety and industrial temperature measurement. Therefore, it is most commonly used in fire-retardant products such as partition walls, fireproof doors, ceilings, and protective clothing. 

High levels of fire safety are an essential requirement for insurance companies in any type of building, and the use of fireproof insulations can in some cases be compulsory. When it comes to fire safety, mineral wool insulation is classified under Euro Class A and has the best score of all insulation materials, which is great for high temperature applications. 

Impressive soundproofing

Mineral wool insulation is effective against noise pollution due to its structure and composition. There are also special acoustic tiles for ceilings, walls, and floors that absorb sound waves, making them very useful in both industrial and consumer applications. 

For the latter, rock wool blankets are typically used in walls, floors, and ceilings. In the case of false or partition walls, a combination of plasterboard and mineral wool is often a good idea for absorbing sound waves. The frames need to be acoustically separated as much as possible to prevent contact bridges between the boards. 

Mineral wool insulation has other benefits too: it’s less expensive than many other materials; it doesn’t absorb moisture, so it is immune to mould; it is fully recyclable; and it has a minimal ecological footprint. It also has a wide range of applications. 

What’s the difference between glass wool and rock wool? 

As previously mentioned, both glass and rock wool are very similar insulation materials. The main difference between them relates to the fibre structure. The fibres in rock (or stone) wool are shorter than those in glass wool, so rock wool has a higher density.  

Rock wool can withstand a higher pressure than glass wool. You can see the key differences between the two types of wool in the table below. 

Rock wool                        Glass wool 
Shorter fibres                   Longer fibres 
Higher density                   Lower density 
Lambda value 0.032-0.044 W/mK    Lambda value 0.035-0.039 W/mK 
High fire resistance             Slightly lower fire resistance 
Lower elasticity                 Higher elasticity 
Lower tensile strength           Higher tensile strength 
Melting temp approx. 1000 °C     Melting temp approx. 700 °C 

Conclusion 

Mineral wool insulation has many advantages that make it an important part of thermal resources management. Contact our team today to discuss your heat trace and temperature measurement needs. 
