Monday, October 1, 2007

Rules for building test systems (part 2)

Back on September 12th I posted a list of five rules for building test systems. Here are a few more...


Leave something for Phase 2.
You must resist the temptation to keep adding new features to the system before it ships. Release the test system to production, and THEN work on your plans for the upgrade.

Do NOT hard-code specs.
The specs will always change. Put the specs into a database that you can query, a spreadsheet file, or at least some sort of configuration file.
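As a minimal sketch of what that looks like in practice, here's a Python snippet that reads spec limits from a JSON config instead of burying them in the code. The measurement name, limits, and file contents are all hypothetical:

```python
import json

# Hypothetical spec data - in a real station this string would be
# read from a specs.json file, a spreadsheet export, or a database query.
SPEC_JSON = '{"output_voltage": {"min": 4.75, "max": 5.25, "units": "V"}}'
SPECS = json.loads(SPEC_JSON)

def in_spec(measurement_name, value, specs=SPECS):
    """Return True if the measured value falls within the configured limits."""
    limit = specs[measurement_name]
    return limit["min"] <= value <= limit["max"]

print(in_spec("output_voltage", 5.10))  # True
print(in_spec("output_voltage", 4.20))  # False
```

When the spec changes, you edit the config file and redeploy nothing.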

Log what happens.
Make sure the software records events as they occur - you cannot always rely on the technician running the test to write down what happened, when, and why.

Be aware of the environment.
Your test equipment, or the DUT itself, may drift as the temperature, humidity, or pressure changes. You have to take this into account.
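One common form this takes is correcting a reading for a fixture's known temperature drift. The sketch below assumes a hypothetical reference resistor characterized at 50 ppm/°C from a 23 °C calibration point - your actual coefficients would come from characterization data:

```python
# Assumed characterization values for an imaginary fixture:
TEMPCO_PPM_PER_C = 50.0   # drift of the reference, ppm per degree C
CAL_TEMP_C = 23.0         # temperature at which it was calibrated

def corrected_resistance(measured_ohms, ambient_c):
    """Remove the fixture's known temperature drift from a raw reading."""
    drift = TEMPCO_PPM_PER_C * 1e-6 * (ambient_c - CAL_TEMP_C)
    return measured_ohms / (1.0 + drift)

# At the calibration temperature the correction is a no-op:
print(corrected_resistance(1000.0, 23.0))  # 1000.0
```

The important part is not the arithmetic but the habit: log the ambient conditions with every test run so you can apply (or audit) corrections later.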

Know your accuracy.
You need to know the specs of the test equipment in your system. To be more specific, you need to know the specs for how you use the equipment, because vendors will sometimes list different specs for different conditions.
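For example, many DMM vendors quote accuracy as ±(% of reading + % of range), and the numbers differ by range and integration time. A small sketch of the worst-case arithmetic, using made-up spec figures:

```python
def dmm_uncertainty(reading, range_full_scale, pct_of_reading, pct_of_range):
    """Worst-case uncertainty for a '% of reading + % of range' style spec."""
    return (pct_of_reading / 100.0) * abs(reading) \
         + (pct_of_range / 100.0) * range_full_scale

# Hypothetical spec: 0.005 % of reading + 0.001 % of range, on the 10 V range
u = dmm_uncertainty(5.0, 10.0, 0.005, 0.001)
print(f"+/- {u * 1000:.3f} mV")  # +/- 0.350 mV
```

Run the same numbers on a different range or at a different integration time and the answer changes - which is exactly why you need the spec for how *you* use the instrument.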

2 comments:

Anonymous said...

Have you looked into using Off-The-Shelf test systems vs self-made test systems to do data management and test automation?

Unknown said...

That's a broad subject that deserves some thinking - maybe I'll post on it in a week or so. But in general, I'd rather not reinvent the wheel if I can help it. I've used several 'canned' systems, especially motion control and generic test instruments, in the past. But often the things I've had to test were unique enough that I had to build my own system, or at least use the off-the-shelf solutions as a sub-assembly for my final test station.

In fact, some test systems have their own scripting language you can use for building your own tests.

Data management is, again, a completely different topic. I'll address my experience with that sometime in the future.