When you are looking at a spec sheet for mass flow meters and mass flow controllers, the amount of information presented can be a little overwhelming, especially if you are not familiar with what all of the terminology means or are not sure which of the specs will have the most impact on what you are trying to accomplish.
Listed below are some of the more important mass flow meter and mass flow controller specs and what they mean in plain English.
Accuracy
Accuracy is a measure of how closely an instrument's readings match the actual flow at different points in its flow range.
Accuracy is generally measured in one of two ways: percentage of full scale flow or percentage of reading.
Error as a percentage of full scale is calculated by multiplying the error percentage by the full scale flow, so the absolute error band is the same size no matter how much you are flowing. The less you flow through the device, the larger that fixed error band is relative to your reading. For that reason, you don't want to get a larger device than you need: devices with error expressed as a percentage of full scale are most accurate when flowing at full scale.
Error expressed as a percentage of reading measures error against what the device is actually flowing. Simply, if an instrument's accuracy is rated to +/-1% of reading, the instrument will be accurate to +/-1% of whatever it is flowing. At 100 SLPM the instrument will be accurate to within +/-1 SLPM, and at 10 SLPM of flow the unit will be accurate to within +/-0.1 SLPM.
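To make the difference concrete, here is a quick sketch in Python (the function names and the 100 SLPM device rated +/-1% under each spec are hypothetical):

```python
def error_full_scale(full_scale, pct_fs):
    # %FS spec: the absolute error band is fixed regardless of actual flow
    return full_scale * pct_fs / 100.0

def error_of_reading(flow, pct_rd):
    # %reading spec: the absolute error band scales with the actual flow
    return flow * pct_rd / 100.0

for flow in (100.0, 10.0):
    fs = error_full_scale(100.0, 1.0)
    rd = error_of_reading(flow, 1.0)
    print(f"{flow:5.1f} SLPM -> +/-{fs:.2f} SLPM (%FS) vs +/-{rd:.2f} SLPM (%reading)")
```

At full scale the two specs give the same band; at 10 SLPM, the %FS error band is ten times wider than the %reading band.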
Accuracy, regardless of measurement method, is generally dependent on operating conditions. Operating conditions are usually defined as the pressure and temperature of the gas flowing through the instrument. Manufacturers rate their instrument's error based on some predefined set of operating conditions, usually standard pressure and temperature. So, if your gas temperature and/or gas pressure do not match the conditions specified by the manufacturer, the accuracy of your unit could be off by quite a bit. Some units, like Alicat's, are internally compensated, which means that sensors inside the device measure temperature and pressure conditions and make real time corrections for variations in gas conditions. Those real time corrections take a lot of the worry out of maintaining consistent process conditions.
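As a rough illustration of what that compensation accomplishes, here is the ideal-gas correction from actual to standard conditions. This is a simplified sketch, not any vendor's actual firmware, and the standard-condition values assumed here (25 C and 14.696 psia) vary from manufacturer to manufacturer:

```python
def to_standard_flow(q_actual, p_actual_psia, t_actual_c,
                     p_std_psia=14.696, t_std_c=25.0):
    # Ideal-gas correction from actual to standard conditions.
    # Temperatures must be in kelvin before taking the ratio.
    t_ratio = (t_std_c + 273.15) / (t_actual_c + 273.15)
    return q_actual * (p_actual_psia / p_std_psia) * t_ratio

# 10 LPM of actual flow at 2 atm (29.392 psia) and 25 C corrects to ~20 SLPM:
print(round(to_standard_flow(10.0, 29.392, 25.0), 2))  # -> 20.0
```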
Repeatability
Repeatability measures an instrument's ability to return the same reading under the same flow conditions.
A unit's repeatability is generally measured by monitoring a flow instrument's reading at a given flow rate, turning off the flow and allowing the instrument to return to zero for a given period of time, and then resuming the same flow. The instrument's repeatability is determined by examining the difference between the original flow reading and the reading after the flow has been turned off and resumed.
Simply, repeatability measures how repeatable an instrument’s reading will be at the same flow rate.
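That procedure can be sketched as a quick calculation (the readings and full scale here are hypothetical):

```python
def repeatability_pct_fs(readings, full_scale):
    # Worst-case spread between repeated readings at the same setpoint,
    # expressed as a percentage of full scale
    return (max(readings) - min(readings)) / full_scale * 100.0

# Five hypothetical runs at a 50 SLPM setpoint on a 100 SLPM meter,
# with the flow shut off and restored between runs:
runs = [50.02, 49.97, 50.05, 49.99, 50.01]
print(f"spread: {repeatability_pct_fs(runs, 100.0):.3f}% FS")  # -> spread: 0.080% FS
```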
Turndown ratio
Turndown ratio is a measure of the usable range of an instrument, expressed as the ratio of maximum flow to minimum flow. Simply put, it determines the minimum flow that can be accurately measured by the device. Turndown ratio indicates how much of the instrument's range can produce accurate readings, which is very important when you want to measure or control a very wide flow range without having to change instruments. This article explains why turndown ratio is important.
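A quick sketch of what turndown means for the bottom of the usable range (hypothetical 100 SLPM meters):

```python
def min_measurable_flow(full_scale, turndown):
    # The smallest flow the instrument can still read accurately
    return full_scale / turndown

print(min_measurable_flow(100.0, 50))   # 50:1 turndown  -> 2.0 SLPM
print(min_measurable_flow(100.0, 200))  # 200:1 turndown -> 0.5 SLPM
```

The higher the turndown, the further down the range the same instrument stays useful.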
Warm up time
Warm up time measures the amount of time it takes for an instrument to become stable enough for use. Thermal units tend to have the longest warm up times; some can take up to 30 minutes to become stable to within 2% FS. This is an important specification if you turn your unit off at the end of the day.
Pressure drop
Pressure drop describes the loss of pressure as a fluid travels through a pipe or channel and any instruments along the way. If you blew into a mile-long pipe, it's unlikely that anything would come out the other end. As the air flows through the pipe, friction with the pipe walls and between the gas particles causes a loss of pressure. Pressure drop is approximately proportional to the distance the gas travels. Every component that comes in contact with the gas (every instrument, fitting, bend, pipe wall, etc.) induces some pressure drop. This article explains how pressure drop is measured and why it is important.
Since pressure drop is a flow killer, gas processes that have little available differential pressure are best optimized by making sure that every component in the system generates as little pressure drop as possible. Alicat’s Whisper Series of low pressure drop gas flow meters and controllers can help.
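As a rough sketch of how those component drops add up, here is a simple pressure-drop budget. The per-foot and per-component numbers are made up for illustration; real pressure drop depends on flow rate, gas, and geometry:

```python
def line_pressure_drop(psid_per_foot, length_ft, component_drops_psid):
    # Per-foot tubing loss plus a fixed loss for each component in the path
    return psid_per_foot * length_ft + sum(component_drops_psid)

# 20 ft of tubing at 0.01 psid/ft, plus a filter (0.30 psid),
# a flow meter (0.07 psid), and two elbows (0.05 psid each):
total = line_pressure_drop(0.01, 20.0, [0.30, 0.07, 0.05, 0.05])
print(f"total: {total:.2f} psid")  # -> total: 0.67 psid
```

In a system with little available differential pressure, shaving even a tenth of a psid off any one of these terms matters.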
Zero Shift or Offset error
Zero shift or offset shift is defined as how far from zero an instrument's reading will move when pressure and/or temperature change. Offset error does not affect the slope of the calibration curve; any offset error will be the same throughout the flow range. Offset error is measured in %FS (or %reading) per degree change in temperature (or per psi change in pressure). Simply, for every degree of temperature change or psi of pressure change, the calibration is offset by that percentage of error.
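A quick worked example, using a hypothetical temperature coefficient:

```python
def offset_shift_pct_fs(coeff_pct_fs_per_degc, t_cal_c, t_operating_c):
    # Zero shift is the same absolute amount at every point in the flow range
    return coeff_pct_fs_per_degc * (t_operating_c - t_cal_c)

# 0.05 %FS per degC coefficient, calibrated at 25 C, operated at 35 C:
print(f"{offset_shift_pct_fs(0.05, 25.0, 35.0):.2f}% FS")  # -> 0.50% FS
```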
Span Shift or Span error
Span shift or span error is defined as a shift in the slope of the calibration curve, with zero unchanged. Because the error scales with flow, the calibration curve of the device will be affected differently at different points in the flow range. Span error is measured in %FS (or %reading) per degree change in temperature (or per psi change in pressure). Simply, for every degree of temperature change or psi of pressure change, the slope of the calibration shifts by that percentage of error.
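A sketch of how span error scales with the reading while zero stays put (hypothetical coefficient):

```python
def apply_span_shift(true_flow, coeff_pct_rd_per_degc, t_cal_c, t_operating_c):
    # Span error scales the reading proportionally; a zero flow still reads zero
    gain_error = coeff_pct_rd_per_degc * (t_operating_c - t_cal_c) / 100.0
    return true_flow * (1.0 + gain_error)

# 0.1 %reading per degC span coefficient, operated 10 C above calibration:
for true_flow in (100.0, 10.0, 0.0):
    print(true_flow, "->", round(apply_span_shift(true_flow, 0.1, 25.0, 35.0), 3))
```

Notice that the shift at 100 is ten times the shift at 10, and zero is unaffected, which is exactly the opposite behavior of offset error.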
Zero shift and span shift can also be referred to as 'temperature coefficients' or 'pressure coefficients' and are measured the same way. Be sure to pay attention to the units of measure, as some manufacturers express span or offset error percentages per degree F or per single psi instead of per degree C or per atmosphere of pressure.
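A one-liner makes that unit trap concrete: the same physical drift looks like a smaller number when quoted per degree F than per degree C, since 1 degC spans 1.8 degF:

```python
def per_degc(coeff_per_degf):
    # 1 degC = 1.8 degF, so a per-degF coefficient grows by 1.8x per degC
    return coeff_per_degf * 1.8

print(round(per_degc(0.02), 4))  # 0.02 %FS/degF -> 0.036 %FS/degC
```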
Dead band
Dead band, as it relates to a pressure switch, is the band between the point at which the switch trips (the setpoint) and the point at which the switch resets.
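Dead band behavior can be sketched as simple hysteresis logic (the setpoint and dead band values here are hypothetical):

```python
class PressureSwitch:
    # Trips at the setpoint; resets only once pressure falls back
    # below setpoint minus the dead band
    def __init__(self, setpoint, dead_band):
        self.setpoint = setpoint
        self.reset_point = setpoint - dead_band
        self.tripped = False

    def update(self, pressure):
        if not self.tripped and pressure >= self.setpoint:
            self.tripped = True
        elif self.tripped and pressure <= self.reset_point:
            self.tripped = False
        return self.tripped

# Setpoint of 100 psi with a 10 psi dead band:
sw = PressureSwitch(100.0, 10.0)
print([sw.update(p) for p in (95, 101, 96, 91, 89)])
# -> [False, True, True, True, False]
# Trips at 101, stays tripped through 96 and 91, resets only at 89
```

Without the dead band, pressure hovering right at the setpoint would chatter the switch on and off continuously.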