Monday, September 24, 2007

Configuration control for test systems

Any firm writing software is concerned with configuration control. There has to be a way to know which version of the program is really the latest. Especially with multiple programmers, you need to know what code is safe to run. Similarly, after the software is released, the company needs to manage the configuration of the software out in the field and any revisions or upgrades it makes.

Document control is an important subject for any manufacturing firm. Work instructions, specs, hardware designs, test plans - these all have to be managed. Where I currently work we have a nice system called Omnify that performs these tasks well, but it took a lot of work to get to that point.

Hardware systems are not immune to configuration management issues, for many of the same reasons. Do a Google search for "hardware configuration control," and you will find items such as discussions about control issues on the old NASA Apollo program, telecommunications standards for configuring hardware on a network, and plenty of ads for software that helps to manage hardware configuration issues.

Having said all that, I have to conclude that similar concerns apply to test stations. For many test engineers this isn't a big issue. There is only one test system, they update it when they need to, and all is good. But what if you have several test stations that, for one reason or another, are different?

This is something I have struggled with lately. I have two test stations that test the same things and run almost the same software. Both stations use an optical spectrometer, but they use different spectrometer models, and that leads to slightly different software. When I upgrade the software on the stations, I have to maintain separate software images. When the technician does calibrations, I make sure he follows a separate work instruction for each station. If the spectrometer firmware is upgraded, it must be done separately on each station and logged as a separate activity.
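
In the meantime, one way to shrink that duplication would be to drive the model-specific differences from a small per-station configuration file, so both stations could share a single software image. Here is a rough sketch in Python - the driver classes and the station.json file are hypothetical stand-ins, not my actual code:

    # One software image, per-station configuration.
    # SpectrometerA/SpectrometerB are made-up stand-ins for two real drivers.
    import json

    class SpectrometerA:
        def acquire(self):
            return "spectrum from model A"

    class SpectrometerB:
        def acquire(self):
            return "spectrum from model B"

    DRIVERS = {"model_a": SpectrometerA, "model_b": SpectrometerB}

    def load_spectrometer(config_path):
        # Each station carries its own tiny config file, e.g.
        # {"spectrometer": "model_a"} -- the code image stays identical.
        with open(config_path) as f:
            station = json.load(f)
        return DRIVERS[station["spectrometer"]]()

    if __name__ == "__main__":
        spec = load_spectrometer("station.json")  # this file differs per station
        print(spec.acquire())

The appeal is that only the one-line config file differs between stations; everything else under configuration control becomes a single image plus two small, clearly labeled config files.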

The long-term solution is to upgrade both stations to the same model of spectrometer, and therefore the same software. In the short term, though, it requires rigorous configuration control. That's just the way it is.

Wednesday, September 19, 2007

Autotestcon 2007

I worked for about 3.5 years at a company that built subsystems for aerospace applications; the business was about half military and half civilian. Autotestcon claims to be "the United States’ largest conference focused on automatic test systems for US military systems." I never knew anyone who went to it, but that job ended over a decade ago, so my information is dated.

Anyway, the convention is happening right now and runs until tomorrow. It's at the convention center in Baltimore's Inner Harbor (a pretty nice location). I scrolled through the list of technical sessions and was impressed. I have been to trade shows with far more marketing than actual learning, but the signal-to-noise ratio appears to be higher for this event.

So, if anyone reading this went to the show this year, or has gone in past years, let me know what you thought of it. I'll probably write an entry down the road about trade shows for testing, and I'd appreciate the input.

Tuesday, September 18, 2007

Falling in love (with the system)

It is a thing of beauty. After months of meetings, charts, designs, purchase orders, assembly, phone calls, programming, and debugging...it WORKS. For an engineer, few moments are more satisfying than seeing the fruit of your labors run smoothly.

But there is a particular trap that a test engineer can fall into. Usually the system does not ship out the door - it's still there. The engineer will probably be running it, at least until he trains a technician to use it. As he uses it, he sees ways to speed it up, features to add, neat little things it could do. THIS is the trap: the test engineer spends so much time modifying/improving the system that he neglects his other tasks.


I will be honest and admit that I did this a time or two, especially when I was younger. My first love was a calorimeter I built in graduate school. I spent months getting the design right, weeks on the code (Fortran!) verifying I was extracting the data correctly, and more weeks writing software filters for the data. After all that, it was anticlimactic to just sit there, take data, and analyze it. That's what I needed to do for my thesis - but I was happier playing with the code and the hardware. Eventually I had to force myself to do the actual testing.


Test systems are like children. They grow up and become productive - you have to let them go. But if you start taking pictures for your family album... then that's just weird.

Wednesday, September 12, 2007

Rules for building test systems

I've decided to create a list of rules to follow when building a test system. Over the years I've kept an informal checklist of what I do when putting together a new system, but it's time to codify it. The items on the list are generic, not specific to any one industry. Maybe someday I'll create a separate list of rules for actively running a test system, but not today.


Know what you are testing, or work closely with someone who does.
A test system built without knowing what it will test, and how it will test it, will not work.

Document everything.
Eventually a technician will need to know how it works, unless you want to run the station yourself forever.

Respect Murphy.
Anything that can go wrong will - it's really just applied statistics. Plan for that when building the system (see the sketch after this list).

Create PM plans and schedules.
Most people think of PM (preventive maintenance) in terms of manufacturing systems: you should have a schedule for when parts need to be oiled, when accelerometers have to be recalibrated, when to clean off gear assemblies. But test systems, especially high volume ones, need this maintenance as well.

Run it with actual parts.
NEVER proclaim the system ready before you have run it through its paces thoroughly with actual components. I cannot stress this enough.
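
On the "Respect Murphy" rule, here is the sketch I promised: one concrete habit is to wrap every instrument read in a retry loop instead of assuming it succeeds. This is a minimal Python illustration - read_fn stands in for whatever your actual hardware call is, and the retry counts are made up:

    import time

    def read_with_retry(read_fn, retries=3, delay_s=1.0):
        # Murphy says the read WILL fail occasionally; retry, don't assume.
        last_error = None
        for attempt in range(retries):
            try:
                return read_fn()
            except (IOError, TimeoutError) as err:
                last_error = err
                time.sleep(delay_s)  # give the instrument a moment to recover
        # Fail loudly after all retries so the problem is logged, not hidden.
        raise RuntimeError("read failed after %d attempts" % retries) from last_error

The same idea applies to motion stages, relays, and anything else electromechanical: assume occasional failure and handle it deliberately.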



There's my list. Please comment or email me if you have suggestions of your own. I'll probably update it in a month.

Tuesday, September 11, 2007

Patents for testing

Through several conversations with patent attorneys, I’ve learned something about patenting test methods.

Patents are not cheap. If you work for a large corporation that regularly files patents, this is not a noticeable issue. But for everyone else, there has to be a very good justification before applying for one.

A company will apply for a patent for several reasons. The patent can be a barrier to a competitor trying to enter a new market. It can protect the company from competitors who might use its research without paying a license fee. Some companies may not actively use their patents but still make money by licensing them. Patents can also be a source of pride - listing all the patents a company has applied for is equivalent (in certain industries) to beating your chest.

But there is a catch: you have to be able to prove that the patent is being violated. For example, suppose you have an innovative new manufacturing process: an intermediate step that deposits certain chemicals on the product, with that layer removed later. The process saves money and improves the product, but there is no practical way to prove that anyone else uses it. Your competitors could adopt the same process (which they read about in your patent application) and simply deny doing so, and short of walking into their manufacturing facility, you can't prove otherwise. Your company might be better off classifying the process as a trade secret and not patenting it at all.

A similar conundrum applies to test methods. You have a new way to test your product. It's clever, it saves money, it's faster. But how can you prove your competitors test their products the same way?

This is something that I'm wrestling with right now. The only saving grace is that if this is the only way to reasonably test the product, then I can probably apply for the patent. We'll see.

Tuesday, September 4, 2007

There is no "test engineering" major

To my knowledge, no reputable four-year college offers a "Test Engineering" degree. In fact, I doubt most test engineers went to college planning to become test engineers. Of the many test engineers I've known, their degrees have been in EE, Physics, CS, and ME - listed roughly in order of how common each is. Are college students aware of test engineering as a specific position? Probably not.

It's different for CS majors. They have probably been exposed to software testing theory in class, there are scores of books, websites, and blogs on the subject, and they may even have interned as SQA (software quality assurance) engineers. The Microsoft testers I've met say that most newly hired programmers start out in a testing position before they do anything else.

But hardware testing depends much more on exactly what you are testing, so it is harder to teach in a classroom setting. Yes, a few books present an overview of the subject (like Test Engineering: A Concise Guide to Cost-effective Design, Development and Manufacture by O'Connor), but it's just an overview. You'll get exposure to the basic tools (oscilloscopes, DAQ cards, etc.) in EE classes, and you may learn statistical methods in a stats or industrial engineering class. But to really learn the specifics of testing in a given field, you have to dig into the details. For example, you'll never learn the details of fiber optic testing in an undergraduate class; you learn that on the job or maybe from a book (like the excellent Fiber Optic Test and Measurement by Derickson).


Of course, what you major in during college is not necessarily a predictor of what you'll do in life. I once worked with a manager of the Integrity Program for the F-22 at Lockheed Martin in Ft. Worth, TX. It was a fairly prominent engineering position with a good deal of responsibility. He had a BA in philosophy....