Overtightening a threaded stud will stretch it, distorting its shape and increasing the risk of fatigue cracking. Undertightening the connection means the fastener could come loose. To prevent either risk, torque wrenches for large bolts on vehicles and industrial equipment, and torque screwdrivers for electronic assemblies and medical devices, are designed to indicate to the user when a certain pre-set turning force (torque) has been reached. Depending on their mechanism, that indication might be an audible click, or involve a needle moving against a scale, or a number rolling around a dial. Some torque devices are fixed at one setting; on others, the setting is variable, and can be adjusted by compressing a spring or adjusting a mechanism (see box, below).
Like any precision tool, these torque devices can drift over time, and so they need regular checks and calibration to make sure they remain accurate (see also box, below).
For safety’s sake, checking and (re-)calibrating a torque tool is usually carried out by an accredited calibration laboratory or service. But the subject needn’t end there: learning how torque devices are calibrated helps users better understand their strengths and limitations.
Exactly how the test and adjustment protocol should be carried out is defined by a recently revised standard, ISO 6789:2017. What is remarkable is how much more comprehensive the new version is compared with its predecessor from 14 years before, observes Ron Sangster, managing director of Advanced Witness Systems (AWS), a manufacturer of calibration machinery and instrumentation. (Sangster is also chairman of a BSI committee on torque standards.) AWS has also published a white paper on the subject, available via www.is.gd/payuju.
Sangster explains that ISO 6789:2003 focused on manufacturing compliance; the burden was on torque tool manufacturers to ensure that their torque wrenches and screwdrivers met the stated tolerances, which were given as simple tolerance bands: a deviation of less than 6% for tools rated below 10 Nm, and less than 4% above, was considered acceptable; anything outside those bands was unacceptable.
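In code, those pass/fail bands reduce to a simple check. The sketch below uses the percentages quoted in the article; the full standard's tables are more detailed, and the function name is ours:

```python
def within_tolerance_band(rated_nm: float, target_nm: float, measured_nm: float) -> bool:
    """Check one reading against the simple tolerance bands described
    in the text: deviation under 6% for tools rated below 10 Nm,
    under 4% at or above. A sketch of the article's figures only,
    not a reproduction of the standard."""
    limit = 0.06 if rated_nm < 10 else 0.04
    deviation = abs(measured_nm - target_nm) / target_nm
    return deviation <= limit

# e.g. a 5 Nm tool set to 5 Nm reading 5.2 Nm deviates 4%: inside the 6% band
```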
That approach and those figures remain in the first part of the new standard, Sangster points out, but are joined by a second part that provides a more statistically sophisticated calibration method, which is said to be only for those tools whose performance could deteriorate over time or with use (in other words, every single one).
According to Sangster, the change was driven by car manufacturers, particularly from Germany, wanting tighter controls in manufacturing to improve quality control. He observes: “There’s a lot of computing, a lot of paper. It’s treating torque wrenches as more like instruments than tools. The cost of testing has doubled; some labs aren’t doing calibration any more.”
A key concept in Part II is measurement uncertainty, a concept from measuring science, or metrology, that remains poorly understood, argues Sangster. He says it is often conflated with measurement error, which is a deviation from the expected torque; uncertainty, by contrast, he defines as how believable the tool’s measurement is. Then he poses a question: which is better, a tool with low error but high uncertainty, or one with high error but low uncertainty? The latter, because if a tool produces a believable error, it can be compensated for. Uncertainty cannot be compensated for; it sets the limit of how much information your torque tool provides.
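Sangster's distinction can be made concrete with repeated readings: the systematic offset of the mean from the target behaves like his "error" (correctable), while the scatter of the readings behaves like his "uncertainty" (not correctable). This is an illustration of the concept only; the standard's actual uncertainty budget combines many more terms:

```python
import statistics

def error_and_scatter(target_nm: float, readings: list[float]) -> tuple[float, float]:
    """Illustrative split of repeated readings into a systematic
    offset (compensatable 'error') and a spread (a stand-in for
    'uncertainty', which cannot be compensated)."""
    mean = statistics.mean(readings)
    error = mean - target_nm             # consistent offset: can be dialled out
    scatter = statistics.stdev(readings) # random spread: limits believability
    return error, scatter

# A tool reading consistently 2 Nm high is more useful than one that
# averages correctly but scatters widely around the target.
```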
ISO 6789:2017 provides mathematical formulae for calculating uncertainty and error. Observes Sangster: “If a set torque is important, it needs a tight tolerance. You’ve got to believe it’s repeatable. If it’s got a poor uncertainty, you have less belief that it’s doing the job as it should. That’s down to the quality of the engineering to decide at what level the tool torque is acceptable. Lower uncertainty is acceptable, but it is probably more expensive.”
To provide more certainty about uncertainty, Part II defines seven sources of uncertainty in a torque tool:
- Variation in the scale, dial or display resolution. This depends on the type of scale. For example, if the readout has a revolving dial, a rule of thumb is to divide the scale graduations by two. If it’s a bending beam, parallax error can be a problem, as the angle of view from the perspective of a user at the end of the handle can distort the reading
- Reproducibility of torque tools. If you calibrate a tool in one place, what happens when you use it somewhere else in a different position? This is the most important source of error, says Sangster
- Geometric effects of the output drive of the torque tool. How square is the square drive at the end of the tool? How well have the parts been machined? This often depends on manufacturing quality
- Geometric effects of the interface between the output drive of the torque tool and the calibration system. Some adapters can be poor and deform under load
- Variation of the force loading point. This is where you apply force on the handle of the tool
- Repeatability. Variation between repeated measurements of a tool at a single setting in a single position
- The measurement device at the target torque. This depends on the machine. (AWS’s torque measuring devices use their own control and feedback loop calibrated to a different standard, BS7882:2017, which allows calculation of its calibration error and uncertainty).
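The standard supplies its own formulae for combining these contributions; as a rough illustration of the principle, independent standard-uncertainty components are typically combined by root-sum-of-squares (the generic form from the GUM, the metrology guide to uncertainty). The component values below are invented for illustration, not measured data:

```python
import math

def combined_uncertainty(components: dict[str, float]) -> float:
    """Root-sum-of-squares combination of independent standard
    uncertainties. ISO 6789-2's actual formulae and component
    definitions are more detailed than this sketch."""
    return math.sqrt(sum(u ** 2 for u in components.values()))

sources = {  # illustrative relative uncertainties only
    "resolution": 0.005,
    "reproducibility": 0.012,   # the dominant term, per Sangster
    "output_drive_geometry": 0.004,
    "interface_geometry": 0.003,
    "loading_point": 0.006,
    "repeatability": 0.008,
    "measurement_device": 0.002,
}
```

Note how the largest component (reproducibility here) dominates the combined figure, which is why Sangster singles it out.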
Generally, adjustable wrenches are tested with five readings at each of three settings between the lowest adjustment graduation and 100%. To establish the levels of uncertainty, a number of calibration operations are carried out according to ISO 6789:2017; up to 150 of them might be required.
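As a sketch, one pass of that pattern, five readings at each of three settings spanning the lowest graduation up to 100%, can be laid out as below. The choice of midpoint is ours for illustration; the standard prescribes the exact setting fractions:

```python
def calibration_points(lowest_nm: float, max_nm: float,
                       readings_per_setting: int = 5) -> list[float]:
    """One calibration pass: five readings at each of three settings
    from the lowest adjustment graduation to 100% of rated torque.
    Midpoint chosen for illustration, not per the standard."""
    settings = [lowest_nm, (lowest_nm + max_nm) / 2, max_nm]
    return [s for s in settings for _ in range(readings_per_setting)]

points = calibration_points(20, 100)  # 15 loadings for a single pass
```

Repeating such passes across positions and orientations to build the uncertainty budget is what pushes the total towards the 150 operations mentioned above.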
According to AWS, dealing with measurement uncertainty requires complex statistics and formal methods that are difficult and time-consuming to apply manually. To cope with these demands, it provides both machines and software: the machines are said to take the error – and uncertainty – out of measurement, while the software calculates and records the many iterations of tests required over a tool’s lifetime.
Sangster’s company has developed two machines: one for wrenches (pictured, right), and one for screwdrivers (pictured, p15). At the core of both is an intelligent transducer, in which a twist of the shaft changes electrical resistance. That signal is amplified, digitised and processed to find the key point of the cycle, for example the first peak of torque output. In both, a stepper motor applies the rotational force to the handle of the tool. The entire operation is computer-controlled.
The torque wrench calibrator holds the wrench horizontally and drives it up and down automatically. “For click-type torque wrenches, it drives to the first click, stores the data, and then drives back to zero,” Sangster says.
Developed over the past 18 months is the company’s new torque screwdriver calibrator. Although it has some commonality of components, it mounts the tool vertically, rather than horizontally, and imparts the required twisting force on to the handle via a 3D-printed insert (specific to the tool) that fits around the handle and into the barrel of the machine.
A detail in the new standard had a big effect on the way the machine was designed, Sangster reports. It specifies that, for wrenches, the tester must carry out the last 20% of tightening up to the rated torque in a short period, ranging from 0.5 to 2 seconds depending on capacity. This requires the operation to slow down so that the tester doesn’t overshoot the target value.
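The arithmetic behind that constraint is simple: the shorter the window, the faster the final ramp must run without overshooting. A sketch, using the figures quoted in the text (the function name is ours):

```python
def final_phase_rate(target_nm: float, window_s: float) -> float:
    """Torque rate needed to cover the last 20% of the target
    torque within the allowed window (the text cites 0.5 to 2 s
    for wrenches, depending on capacity). Illustrative only."""
    last_fifth = 0.2 * target_nm
    return last_fifth / window_s  # Nm per second

# A 100 Nm target with a 1 s window needs the final 20 Nm applied at 20 Nm/s,
# which the machine must hit without overshooting the target value.
```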
But for torque screwdrivers, that window became a tighter maximum: the last 20% of torque should take between half a second and one second. That proved very difficult to accomplish in practice, Sangster reports, because of manufacturing variations in the tools themselves. There are three main types of torque screwdriver: dial-type, click-type and cam-type, the last of which is the most common. A cam-type consists of two spring-loaded dimpled plates with ball bearings between them; twisting overcomes the resistance of the balls to ride out of their indents.
Sangster reflects: “The manufacturing tolerance has such a very large effect on the timing in the travel, when rotating. There is variation in those dimples and the circular disc, and they are not at all uniform. Another variation is the depth of the indent that the ball sits in. We learned a lot about the variations of manufacturing of torque screwdrivers.”
In the end, the only solution to achieve predictable timing was to do an automatic learning cycle. For each torque screwdriver, the machine twists the tool through 360°, recording each of the positions – which might possibly number four, six or eight – and the exact timing pattern. This is said to be a unique feature.
BOX: Prevent your spring being spragged
Torque wrenches that adjust torque control by means of spring compression should be wound down to take the pressure off the spring. Sangster observes: “Mechanics and fitters invariably don’t do that; they put it back in the toolbox as-is. That’s the reality, and that’s why the standard never quite fits the real world.” If they don’t turn down the adjustment, the internal spring can ‘sprag’ – take a set. He adds that the standard proposes a way to overcome this during calibration by ‘exercising’ the wrench three times before taking a measurement – dialling the device all the way to its upper limit, and then back down again.
BOX: How often should torque tools be calibrated?
According to AWS, ISO 6789:2017 gives only one absolute rule: the maximum interval should be 24 months. That said, it advises using a risk assessment to set calibration frequency on a risk basis. Calibrations should be more frequent for high-value or safety-critical applications, where there are recall costs to consider, and also if measurement history shows that the tool tends to go out of calibration. Other relevant factors in setting a frequency include maximum permissible error; frequency of use; typical load during operation; customer and legislative requirements; and whether the tool has been overloaded.