
Like many of you, I have a bench full of electronic instruments. The newest is my Rigol oscilloscope, only a few years old, while the oldest is probably my RF signal generator that dates from some time in the early 1950s. Some of those instruments have been with me for decades, and have been crucial in the gestation of countless projects.

If I follow the manufacturer’s recommendations then, just like that PAT tester, I should have them calibrated frequently. This process involves sending them off to a specialised lab where their readings are compared to a standard and they are adjusted accordingly, and when they return I know I can trust their readings. It’s important if you work in an industry where everything must be verified; for example, I’m certain the folks down the road at Airbus use meticulously calibrated instruments when making assemblies for their aircraft, because there is no room for error in a safety-critical application at 20,000 feet.

But on my bench? Not so much; nobody is likely to face danger if my frequency counter has drifted by a few Hz.

How Many PPM Have Left The Building Over The Decades?

Fortunately my first oscilloscope, a 1940s Cossor, is now retired, so its calibration status is no longer important.

So I have never had any of my instruments calibrated, and I’ve established that there’s no real need for me to do so. But let’s look at it another way: just how far out of calibration are they likely to be?

I can make a few educated guesses based upon what I know about the instruments in question. I am working against the inevitable degradation of components over time, which changes the parameters of the circuitry inside the instrument, and my estimate is based upon the quality of those parts and the type of circuit involved.

The accuracy of most instruments depends in some way upon two parts of their circuitry: first, whatever handles the signal path, and second, whatever standard the incoming value is compared against. In both cases it’s likely that time will have done its work on the components involved, perhaps changing their values or introducing parasitic resistances.

This doesn’t matter in the case of my 1950s signal generator, as its calibration was only ever as good as that of a pointer against a dial in the first place, but for my nearly 30-year-old digital multimeter it might now be starting to show itself. Even those instruments which use references that should be beyond reproach aren’t immune; for example, while a DMM may use an on-chip bandgap reference to compare voltages, it will still only be as good as the 30-year-old variable resistor used to trim it. All I can say is that if any of my instruments have drifted over time, they haven’t done so to the extent that I have been able to notice. Perhaps it’s as well that I don’t work in aerospace.
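If you want to put a number on drift yourself, the arithmetic is simple: compare a reading against a trusted reference and express the error in parts per million. Here is a minimal Python sketch of that calculation; the readings and reference values are hypothetical stand-ins for whatever known-good standard you have to hand, such as a GPS-disciplined 10 MHz source or a precision voltage reference.

def drift_ppm(measured, reference):
    """Return the error of a measurement in parts per million
    relative to a trusted reference value."""
    return (measured - reference) / reference * 1e6

# Hypothetical spot checks: a frequency counter against a
# GPS-disciplined 10 MHz source, and a DMM against a 5 V
# precision voltage reference.
print(f"{drift_ppm(10_000_003.0, 10_000_000.0):+.2f} ppm")  # +0.30 ppm
print(f"{drift_ppm(4.9991, 5.0000):+.1f} ppm")              # -180.0 ppm

Logged against the same reference every few months, figures like these tell you whether an instrument is drifting at all, which is about as much calibration reassurance as most home benches need.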

So far then, my instruments haven’t obviously let me down despite never seeing the inside of a calibration lab. But should I have had them calibrated anyway? This is where it’s over to you: do any Hackaday readers take the time to have their instruments calibrated when they’re not required to by an exacting need at work? If so, why? Or do you have any tricks to DIY? As always, the comments are below.
