Time to Update Our Standards


Not our personal or moral standards, but rather the set of documents we rely upon as a foundation for reliability engineering tools and techniques.

We have a wide array of standards, from reporting reliability test data to calculating confidence intervals on field returns. We have standards that describe various environmental conditions and the appropriate testing levels suitable to evaluate your product. We define terms, concepts, processes, and techniques.

A Missing Element

Despite the many documents with impressive titles full of numbers and abbreviations or acronyms, most standards related to reliability engineering fail to include sufficient context and rationale concerning when and why to use or modify the standard. If a specific test is meant to determine the expected lifetime of solder joints, well, for which types of solder joint (shape, size, configuration, material, and process) is the standard appropriate, and when does it not apply? Make the boundaries of applicability clear.

No single test works for all situations.

For example, a wristwatch standard defining how to test specific water-resistance claims does not evaluate the effects of corrosion. The standard exposes the watch or similar device to a set of water conditions, then evaluates whether the system is operating, nearly immediately after the water exposure.

We know that water encourages corrosion, yet corrosion takes time to occur. Water alone on a circuit board is no big deal (much of the time); it’s when the water facilitates the creation of additional, unwanted current paths that there is a problem. Metal migration and rusting take time to occur.

If the standard for water resistance doesn’t evaluate corrosion, and corrosion is one of the ways your product fails, too bad. You can ‘pass’ the test, meet the standard, add it to your data sheet, and the customer will still experience a failure.

The same goes for many environmental testing, FMEA, life testing, field data analysis, and a range of other standards. They do not include the critical information necessary to apply the standard appropriately to your particular situation.

Connection to Value

Many, though not all, standards provide a recipe to accomplish a task or evaluation. One of the values of a standard is that different teams may replicate the results of one team by repeating the steps outlined in the standard.

One of the issues with standards is that they do not include how and why to actually accomplish the set of tasks, or what to do with the results. We need to clearly connect a task, say testing a product across a range of temperature and humidity conditions, to the decision it informs, and undertake it only if it will provide meaningful information.

Don’t run the test if the information is not needed or is meaningless.

For example, we may expect that exposure to high-temperature, humid conditions increases the chance of product failure. We may want to know

  • how many failures will occur;
  • how the product will actually fail;
  • how the failure will initiate and progress;
  • when the failures will occur under use conditions;

or any number of other reasons to use the results of the testing. Often we run a standard test with very few samples, experience no failures, and erroneously conclude all is good. Then we are surprised when failures occur anyway once the product is in use.
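When the question is “when will failures occur under use conditions,” fitted life-distribution parameters, not a pass/fail result, provide the answer. As a minimal sketch, suppose test data supported a Weibull model: the time by which a given fraction of units fails follows from t = η·(−ln R)^(1/β). The shape and scale values below are made-up illustrations, not from any real product or standard.

```python
import math

def weibull_b_life(fraction_failed: float, beta: float, eta: float) -> float:
    """Time at which `fraction_failed` of the population has failed,
    for a Weibull distribution with shape beta and scale eta."""
    if not 0.0 < fraction_failed < 1.0:
        raise ValueError("fraction_failed must be between 0 and 1")
    reliability = 1.0 - fraction_failed
    # Invert the Weibull reliability function R(t) = exp(-(t/eta)**beta)
    return eta * (-math.log(reliability)) ** (1.0 / beta)

# Hypothetical wear-out mechanism: beta = 2, characteristic life 1000 hours
b10 = weibull_b_life(0.10, beta=2.0, eta=1000.0)
print(f"B10 life: {b10:.0f} hours")  # about 325 hours
```

A calculation like this turns test results into something a decision can rest on, which is exactly the connection most standards leave out.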

The standard let us down.

The standard provided only a recipe or outline for a procedure, and not the guidance and rationale on how it may or may not help us and our team resolve very real questions. Testing 3 units that all pass does not mean your solar panel will survive hot and humid conditions for 20 years with no failures. It doesn’t.
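The success-run theorem makes the weakness of that 3-unit result concrete: with n units tested and zero failures, the demonstrated lower bound on reliability at confidence C is only (1 − C)^(1/n). A minimal sketch, with an illustrative function name:

```python
def success_run_reliability(n_units: int, confidence: float) -> float:
    """Lower one-sided confidence bound on reliability demonstrated
    by a zero-failure test of n_units samples (success-run theorem)."""
    if n_units < 1 or not 0.0 < confidence < 1.0:
        raise ValueError("need n_units >= 1 and 0 < confidence < 1")
    return (1.0 - confidence) ** (1.0 / n_units)

# Three units, all passing, at 90% confidence:
r = success_run_reliability(3, 0.90)
print(f"Demonstrated reliability: {r:.1%}")  # roughly 46%
```

Three passing units at 90% confidence demonstrate only about 46% reliability, far from a 20-year no-failure claim.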

Run a test, or work to accomplish a process, only if it is tied to answering a question. Focus on business decisions and the questions we have to resolve in order to make better decisions (i.e., be wrong less often).

Summary

Let’s change the way we read and use standards. You may need to add the how and why, the boundaries, and the connection to value for your situation. It’s not always easy. The people writing the standard often have sufficient experience to include guidelines to help you; when possible, contact them and ask what their thinking was and what the limitations are.

If enough of us go beyond simply meeting the requirements of the standard, we will

  • Enjoy reliable product performance
  • Create value for our organization with each test or task
  • And, eventually change how standards are written

Question use of reliability testing standards

Each of us has seen product life or component reliability claims on product literature or data sheets. We may even have received such claims stated as goals and been asked to support the claim with some form of an experiment. Standards bodies such as ANSI, BSI, ISO, IEC, and others from around the world provide standard methods for testing products. This includes product life testing in some cases.

The language we use matters

During RAMS this year, Wayne Nelson made the point that language matters. One specific example was the substitution of ‘convincing’ for ‘statistically significant’ in an effort to clearly convey the ability of a test result to sway the reader. As in, ‘the test data clearly demonstrates…’

As reliability professionals let’s say what we mean in a clear and unambiguous manner.

As you may suspect, this topic is related to MTBF. Simply saying …