By Amanda Moser, Senior Laboratory Assessor Level 2
Posted: November 2011
We experience pressure every day. Feeling the wind on your face, drinking through a straw, even sitting in your chair all give you sensations of pressure. So why can pressure be such a confusing concept to apply to laboratory testing? There are several factors. Pressure is measured in several different units – kilopascals, millimeters of mercury, pounds per square inch, and many others. Pressure can also be measured from several different perspectives, depending on the kind of equipment used. In addition, laboratory work may require the application of either a partial vacuum (also called negative pressure), or a positive pressure. Finally, there are terms for specific kinds of pressure (residual, partial, barometric, etc.) used in laboratory testing that can add to the confusion.
The Basics of Pressure
So what is pressure? Pressure is defined as the force exerted over an area. If you push your finger against a wall, you are exerting pressure. In this case, the pressure is not very strong and you are unlikely to damage the wall. However, if you push a thumbtack against the wall you can make a hole. The force you are using is the same, but the point of the thumbtack is much smaller in area, thus applying a greater pressure.
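To put some numbers on that idea, here is a small Python sketch of the pressure formula P = F/A. The force and the fingertip and thumbtack contact areas are rough, illustrative guesses rather than measured values.

```python
# Pressure is force divided by area: P = F / A (pascals = newtons per square meter).
# The contact areas below are rough, illustrative guesses, not measured values.

def pressure_pa(force_n: float, area_m2: float) -> float:
    """Return pressure in pascals for a force (N) spread over an area (m^2)."""
    return force_n / area_m2

force = 10.0              # ~10 N push, about the weight of a 1 kg mass
fingertip_area = 1.0e-4   # ~1 cm^2 fingertip contact patch
thumbtack_area = 1.0e-8   # ~0.01 mm^2 thumbtack point

print(f"Fingertip: {pressure_pa(force, fingertip_area):,.0f} Pa")   # ~100 kPa
print(f"Thumbtack: {pressure_pa(force, thumbtack_area):,.0f} Pa")   # ~1 GPa
```

The same push spread over the tiny point of the thumbtack produces a pressure roughly ten thousand times higher, which is why the wall ends up with a hole in it.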
Gas Pressure
Often, the pressures used in laboratory work are pressures exerted by a gas. Gas pressure is complex because it is influenced by the volume the gas fills, its temperature, and the amount of gas present. For instance, think about a balloon full of air – it is nearly round because the gas inside is pushing outward equally in all directions.
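For readers who want that relationship in numbers, the ideal gas law (PV = nRT) ties all three factors together. The Python sketch below uses illustrative values for a small balloon of air to show how warming the gas, squeezing it into a smaller volume, or adding more air each raises the pressure.

```python
# Ideal gas law: P = nRT / V. A rough sketch of how volume, temperature,
# and amount of gas each change the pressure of the air in a balloon.

R = 8.314  # universal gas constant, J/(mol*K)

def gas_pressure_pa(moles: float, temp_k: float, volume_m3: float) -> float:
    """Pressure (Pa) of an ideal gas from amount, temperature, and volume."""
    return moles * R * temp_k / volume_m3

# ~0.2 mol of air at room temperature in a 5-liter balloon (illustrative numbers)
print(f"Baseline:        {gas_pressure_pa(0.2, 293.15, 0.005)/1000:.1f} kPa")
print(f"Warmed to 40 C:  {gas_pressure_pa(0.2, 313.15, 0.005)/1000:.1f} kPa")
print(f"Squeezed to 4 L: {gas_pressure_pa(0.2, 293.15, 0.004)/1000:.1f} kPa")
print(f"More air added:  {gas_pressure_pa(0.25, 293.15, 0.005)/1000:.1f} kPa")
```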
Units of Pressure
We have already learned that pressure is a measure of the force applied over a specific area. The units for measuring pressure that we use today were invented at various times in history for use in different applications. The International System (SI) unit of pressure is the pascal (Pa), which is equal to one newton per square meter. This unit is very small – you’re feeling over 100,000 Pa on your skin right now, just from the atmosphere around you. The more commonly used unit is the kilopascal (kPa), which is equal to 1,000 pascals.
There are also many older units of pressure that are still in common use, especially in the United States. Pounds per square inch (psi or lb/in²) is one of the easiest units to visualize – the atmosphere around you is pushing with a force of approximately 15 pounds on every square inch of your body. Converting between the different measurement units is part of what makes working with pressure difficult.
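One way to keep the conversions straight is to route everything through a single reference unit. The Python sketch below uses kilopascals as that reference, with the standard conversion factors (1 atm = 101.325 kPa, 1 psi ≈ 6.8948 kPa, 1 mm Hg ≈ 0.13332 kPa, 1 in Hg ≈ 3.3864 kPa); the helper function and unit table are just one convenient way to organize the arithmetic.

```python
# Conversion helpers between common pressure units, using kilopascals
# as the common reference. Factors are the standard definitions.

KPA_PER_UNIT = {
    "kPa": 1.0,
    "psi": 6.894757,
    "mmHg": 0.1333224,
    "inHg": 3.386389,
    "atm": 101.325,
}

def convert(value: float, from_unit: str, to_unit: str) -> float:
    """Convert a pressure value between any two units in the table above."""
    return value * KPA_PER_UNIT[from_unit] / KPA_PER_UNIT[to_unit]

print(f"1 atm = {convert(1, 'atm', 'psi'):.1f} psi")          # ~14.7 psi
print(f"1 atm = {convert(1, 'atm', 'mmHg'):.0f} mm Hg")       # ~760 mm Hg
print(f"10 in Hg = {convert(10, 'inHg', 'mmHg'):.0f} mm Hg")  # 254 mm Hg
```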
Measuring in Mercury
The earliest units for pressure were based on the pressure exerted by a column of liquid mercury inside a glass tube, much like the mercury manometers still in use today. Mercury (chemical symbol Hg) is a very dense liquid at room temperature and evaporates very slowly, which makes it well suited to measuring pressure. To create the same pressure using water instead of mercury, the column would need to be about 13.6 times taller. Water also evaporates readily, which leads to inaccurate readings.
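That comparison comes straight from the hydrostatic pressure formula P = ρgh. The Python sketch below uses typical room-temperature densities for mercury and water to show how tall a water column would have to be to match 760 mm of mercury.

```python
# Hydrostatic pressure of a liquid column: P = rho * g * h.
# A sketch of how tall a water column must be to match 760 mm of mercury.

G = 9.80665             # standard gravity, m/s^2
RHO_MERCURY = 13_595.0  # kg/m^3, mercury near room temperature
RHO_WATER = 998.0       # kg/m^3, water near room temperature

def column_pressure_pa(density: float, height_m: float) -> float:
    """Pressure (Pa) at the base of a liquid column of the given height."""
    return density * G * height_m

p_mercury = column_pressure_pa(RHO_MERCURY, 0.760)   # 760 mm of mercury
water_height = p_mercury / (RHO_WATER * G)           # height giving the same pressure

print(f"760 mm Hg = {p_mercury/1000:.1f} kPa")            # ~101.3 kPa
print(f"Equivalent water column: {water_height:.2f} m")   # ~10.3 m, about 13.6x taller
```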
U-tube manometers are often used in laboratory testing. When both ends of the manometer are open, the liquid sits at the same level on each side of the tube: the atmosphere pushes down on both sides of the mercury column, the mercury pushes back, and the system is in equilibrium. In a closed manometer, the air is removed from one end of the tube to create a vacuum above the mercury. When a vacuum, or negative pressure, is applied to the other end of the system, the pressure pushing down on the mercury decreases while the weight of the mercury stays the same, so the mercury is drawn up that side of the tube until a new equilibrium is reached. Conversely, when a positive pressure is applied to the system, the mercury is pushed up the opposite side of the tube. Either way, the difference in height between the two sides of the mercury column can be used to measure the pressure of the system.
At sea level, the atmosphere exerts sufficient pressure to support a mercury column that is 760 millimeters high. Pressure measurements relative to mercury are still commonly used today. Millimeters of mercury (mm Hg) is still the standard unit for many pressures in medical applications, such as blood pressure. Inches of mercury (in Hg) is a common unit for pressure in laboratory settings. However, many state and local governments have restricted the use of mercury, mostly due to health concerns, and mercury manometers are not always available. Several ways of measuring pressure have been developed that do not rely on mercury. For more information on mercury reduction initiatives, see "Getting Rid of Mercury: A New Frontier for Temperature Measurement".
Laboratory Testing and Pressure Units
Multiple systems of measurement can also cause confusion because of the size difference in the units. For example, in the Theoretical Maximum Specific Gravity and Density of Hot Mix Asphalt test (commonly called the Rice test, AASHTO T 209 or ASTM D 2041), the required pressure range is 25 to 30 millimeters of mercury - a very small range and nearly a vacuum. The pressure range used for saturation of specimens for Resistance of Compacted Hot Mix Asphalt to Moisture-Induced Damage (commonly called the TSR, or Lottman test, AASHTO T 283 or ASTM D 4867) is 10 to 26 inches of mercury - a wide range with lots of room for adjustment. These ranges are frequently confused by laboratory personnel because the two tests use similar equipment and the numbers sound similar, even though the units are different. Converted to millimeters of mercury, the range required for the TSR test is 254 to 660 mm Hg, which is quite different from the range required for the Rice test!
Absolute Pressure vs. Relative Pressure
In addition to the different units, pressure can also be measured from different perspectives. For example, think of a person running a 100-meter dash. You could say the runner has already gone 75 meters or you could say there are 25 meters left to run. Either way is correct, but the answers appear to be different. Pressure poses a similar situation because it can be measured by comparison to atmospheric pressure or to a complete vacuum.
Has the runner already run 75 meters, or are there 25 meters left to run?
Relative Pressure - Relatively Useful
Relative pressure, also called gauge pressure, is measured “relative” to the pressure created by Earth’s atmosphere. A relative pressure gauge reads “0” when not in use and a positive or negative value while in use. Instruments that measure relative pressure are easier to make and less expensive than absolute pressure gauges. So why doesn’t everyone use relative pressure all the time? Unfortunately, relative pressure is less useful for most engineering calculations: the reading shifts with the atmospheric pressure at the time of the measurement, which varies with elevation, weather conditions, and several other factors. Relative pressure is useful in some applications, but absolute pressure provides a more consistent basis for measurement.
Absolute Pressure – Absolutely the Best
Absolute pressure is measured by comparing the applied pressure to an absolute vacuum, or zero pressure. Because the reference point never changes, absolute pressure readings do not have to be corrected for elevation and temperature. Closed mercury manometers measure absolute pressure. Absolute pressure gauges are usually more expensive than relative pressure gauges, and they read approximately 760 mm Hg (atmospheric pressure) at sea level when not in use.
Working with Vacuum
In construction materials testing, many test methods require the application of vacuum. Applying a vacuum means creating negative pressure, so that the pressure inside the container is less than the pressure created by the atmosphere. Test methods often direct you to “increase the vacuum,” which is the same as decreasing the pressure in the container. Applying a vacuum is like the runner from the earlier example turning around and heading back towards the starting line. We could be measuring the runner’s distance from either end of the race and the runner could be headed either direction. On top of that, many relative pressure gauges used in the laboratory do not indicate that the reading is a negative pressure. No wonder pressure is confusing!
Applying a vacuum is like running from the finish line towards the starting line.
Because of their low cost, relative pressure gauges are sometimes used for tests that specify absolute pressure measurement. For example, the degassing oven, which is used after aging asphalt using a pressure-aging vessel (AASHTO R 28 or ASTM D 6521), is equipped with a relative pressure gauge. However, the method requires that the test sample be subjected to 15 ± 5 kPa of absolute pressure. The pressure readings must be converted from a relative pressure reading to an absolute pressure measurement.
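The conversion itself is simple addition: the absolute pressure is the local barometric pressure plus the gauge reading, and under vacuum the gauge reading is negative. The Python sketch below walks through that arithmetic; the barometric pressure and gauge reading shown are illustrative example values, not numbers taken from the standard.

```python
# Converting a relative (gauge) reading to absolute pressure:
#   absolute = local barometric pressure + gauge reading
# Under vacuum the gauge reading is negative. Illustrative numbers only.

def absolute_kpa(barometric_kpa: float, gauge_kpa: float) -> float:
    """Absolute pressure from a gauge reading and the local barometric pressure."""
    return barometric_kpa + gauge_kpa

barometric = 101.3     # example sea-level barometric pressure, kPa
gauge_reading = -86.0  # example vacuum gauge reading, kPa below atmospheric

absolute = absolute_kpa(barometric, gauge_reading)
print(f"Absolute pressure: {absolute:.1f} kPa")          # 15.3 kPa
print("Within 15 +/- 5 kPa?", 10.0 <= absolute <= 20.0)  # True
```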
Barometric Pressure and Adjusting for Altitude
Barometric pressure is the same as atmospheric pressure – it is the pressure of the air around you at the time the reading is taken. Barometric pressure varies based on altitude and temperature. If you are at sea level, you are feeling about 14.7 psi on your skin from the atmosphere around you. However, if you are in mile-high Denver, Colorado, the pressure is only about 12 psi. If you hike up into the mountains outside of Denver, the pressure might go down to just 8.5 psi. Not many people build laboratories on top of mountains, but clearly the change in barometric pressure can make a big difference in test results, especially when using a relative pressure gauge. The pressure used in the degassing oven described above should be adjusted for local barometric pressure conditions so that the absolute pressure used is 15 ± 5 kPa.
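For a rough sense of how quickly pressure falls off with height, the standard-atmosphere formula estimates barometric pressure from altitude alone; actual weather will shift the numbers. The Python sketch below gives values close to the sea-level, Denver, and mountain figures mentioned above.

```python
# Standard-atmosphere approximation of barometric pressure versus altitude:
#   P = P0 * (1 - 2.25577e-5 * h) ** 5.25588   (h in meters, P0 at sea level)
# This ignores weather, so treat the results as estimates only.

P0_KPA = 101.325        # sea-level standard pressure, kPa
KPA_TO_PSI = 0.1450377  # psi per kPa

def barometric_kpa(altitude_m: float) -> float:
    """Approximate barometric pressure (kPa) at a given altitude (m)."""
    return P0_KPA * (1 - 2.25577e-5 * altitude_m) ** 5.25588

for place, altitude in [("Sea level", 0), ("Denver", 1609), ("Mountain pass", 4300)]:
    p = barometric_kpa(altitude)
    print(f"{place:>14}: {p:5.1f} kPa ({p * KPA_TO_PSI:4.1f} psi)")
```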
So What About Those Other Types of Pressure?
There are plenty of other terms for pressure besides relative, absolute, and barometric. Residual pressure is the pressure remaining in a container after you apply a vacuum. It’s the pressure that is “left over,” hence the name. Partial pressure, used in many engineering activities, is the pressure applied by just one gas in a mixture of gases. The partial pressure of oxygen is an important number for scuba divers when monitoring their breathing equipment.
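Partial pressure follows Dalton's law: each gas in a mixture contributes its fraction of the total pressure. The Python sketch below works the scuba example with round numbers, treating air as about 21 percent oxygen and each 10 meters of seawater as adding roughly one atmosphere; these are approximations for illustration, not dive-planning values.

```python
# Partial pressure (Dalton's law): each gas in a mixture contributes its
# fraction of the total pressure. Round numbers for the scuba-diver example.

def partial_pressure_atm(fraction: float, total_atm: float) -> float:
    """Partial pressure (atm) of one gas given its fraction of the mixture."""
    return fraction * total_atm

OXYGEN_FRACTION = 0.21  # air is roughly 21% oxygen

for depth_m in (0, 10, 30):
    total = 1.0 + depth_m / 10.0  # approximate absolute pressure, atm
    ppo2 = partial_pressure_atm(OXYGEN_FRACTION, total)
    print(f"Depth {depth_m:>2} m: total {total:.1f} atm, oxygen {ppo2:.2f} atm")
```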
Now the Pressure is On!
There are a lot of factors that influence gas pressure and how we measure it in the laboratory. The different units of measurement, the different ways of measuring, the issue of vacuum, and the many terms for pressure can make dealing with pressure a confusing mess. Remember to check which units are listed, determine whether the reading is relative or absolute pressure, and be careful when applying a vacuum. Following these guidelines should help reduce the pressure on you during laboratory testing!