Thursday, December 27, 2007

Statistical Analysis packages - JMP

When I posted on DOEs, I briefly mentioned a statistical analysis package called JMP (pronounced "jump"). I want to write more about it.

For some people, test engineering is about putting together the system. You build it, you ship it out to your customer (internal or external), and you move on to the next project. For others, test engineering is about managing the test process. The job revolves around SPC chores, preventive maintenance, & setting up new test runs.

But sometimes test engineering involves analyzing the results of the testing. I've spent plenty of time putting together graphs in Excel. But that sort of work can overload you when you have huge data sets & multiple sets of variables to consider. You need a more serious piece of software than just a spreadsheet for filtering down the data, looking at box plots, and plotting trends.
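To give a sense of what I mean by "looking at box plots": a box plot boils down to a five-number summary of each group of data. Here's a rough Python sketch (hypothetical leakage-current data, standard library only) of the kind of summary a package like JMP computes behind the scenes for every group you filter out:

```python
import statistics

def five_number_summary(values):
    """Min, Q1, median, Q3, max - the numbers behind a box plot."""
    s = sorted(values)
    # quantiles() with n=4 returns the quartile cut points Q1, Q2, Q3
    q1, q2, q3 = statistics.quantiles(s, n=4)
    return (s[0], q1, q2, q3, s[-1])

# Hypothetical test data: leakage-current readings from two lots
lot_a = [1.2, 1.4, 1.1, 1.3, 5.0, 1.2, 1.5, 1.3]
lot_b = [2.1, 2.4, 2.2, 2.0, 2.3, 2.5, 2.2, 2.6]

for name, lot in (("A", lot_a), ("B", lot_b)):
    lo, q1, med, q3, hi = five_number_summary(lot)
    print(f"Lot {name}: min={lo} Q1={q1:.2f} median={med:.2f} "
          f"Q3={q3:.2f} max={hi}")
```

Doing this by hand for a handful of numbers is easy; doing it across dozens of lots and variables is exactly where a spreadsheet runs out of steam and a dedicated package earns its keep.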

I know a guy who is a great statistician. He loves Minitab. The only package like that I've ever used is JMP, which he says is a good piece of software as well. I know I like it, and more and more I've been using it instead of Excel when I need to examine data.

There are numerous books published by SAS (the company that wrote JMP) on how to use the software. I picked up one of them, Elementary Statistics Using JMP, and have read about half of it by skipping around to specific sections. The book is informative, well-organized, and worth the money.

Tuesday, December 25, 2007

Peace On Earth

It's been the Christmas season here in New England. Snow on the ground, shoppers in the malls. And while I know that plenty of people who read this post do not celebrate this holiday, I would like to wish everyone peace on earth. I don't think anyone can disagree that the world needs more of it.

Thursday, December 13, 2007

Vendor books about testing - National Instruments

Back in October I posted about recent test system manuals Keithley and Agilent had written. In a post last month I mentioned that NI had also put out a manual that I would eventually review as well. Here it is.


First Thoughts
NI is very good at marketing. They interact well with customers, get knowledgeable sales people embedded with key industries, and support their hardware and software. So when I say they excel at marketing it is truly meant as a compliment. Yet this proficiency also hurts them. Read on and you'll see.


Sections
There are four sections and 14 chapters divided amongst the sections. The first is just an introduction, the second discusses test system guidelines, the third goes over improving a system, and the last one consists of case studies.

Section 1
This single chapter reads more like a position paper arguing that NI is the best ever than an introduction to a test system guide. Pity. For example, on just a single page (1-5) the author referenced three different marketing white papers. My hopes for the manual diminished.

Section 2
There were two saving graces to this section. Chapter five has a good overview of different buses, and chapter seven reviews the PXI standard. Otherwise it is more marketing than substance.

Section 3
These three chapters were somewhat of a revelation. The marketing was minimized in favor of looking at 1) ways to speed up a test, 2) measurement accuracy, and 3) system longevity. Cool.

Section 4
I liked the first case study in this section. Describing software-defined radio testing, it was short & to the point. But the other three case studies were all but useless. Okay, so Microsoft used LV and a PXI chassis to test the Xbox - why not spend a few pages describing the test architecture or the obstacles that were overcome in the design? Each case study reads like an extended press release.


Summary
Unfortunately, this testing manual is more like the Agilent manual (bad) than the Keithley manual (good). It pushes a theme of "NI products are the best thing since sliced bread." The only time it mentions Agilent is to take them to task for their lack of support for IEEE 1394 (VEE isn't mentioned at all). The manual could have used a good editor - the exact same graph, bandwidth as a function of latency, shows up an improbable FIVE times under different titles.

In other words, if it weren't for section 3 I would write off the whole manual as a waste of space on my hard drive.

Outsourcing a test station, part 2

Last month I posted about my experiences with outsourcing test systems. Here's another update: training.

The schedule has slipped - they usually do - but this time it was a result of scheduling conflicts and money issues; the system itself was ready. Regardless, I did have an engineer from the contract manufacturer fly out to be trained on the system. He seemed like a good guy & knowledgeable, but there was a definite language gap. Furthermore, I only had two days to show him a system I've been using for a couple of years.

Speaking slowly and struggling for words, I think I eventually taught him enough that he can run the station when there are no big problems. We started with an overview of what the system does, the separate components, and the basic procedure. He spent a half day just testing devices. We also went over common maintenance issues and problem points to check when it won't run. But there were some things that didn't translate well.

But now I think I have the answer. I'm using my digital camera (a nice 5 megapixel Canon) to film common tasks and maintenance fixes. Maybe my narration will help, maybe not. But if a picture is worth a thousand words, how much is a high-res AVI file worth?

Sunday, December 9, 2007

Virtual Instruments

I said in a post last month that I would read & review Designing Next Generation Test Systems - An In-Depth Developers Guide from National Instruments. I'm practically done now & will post my thoughts in a couple of days. But parts of this manual neatly dovetailed with a conversation I had earlier this week about virtual instruments.

NI is big on the concept of a virtual instrument - use the computer in place of the benchtop instrument to do the measurements. I've used this concept for potentiometers and oscilloscopes. But I just don't think this works in all cases, or even most cases. I have two reasons to back this opinion.

Complicated real-world measurements
There are some properties that are more than just a voltage or current. You need a good deal of physical hardware to actually acquire the data. Several examples I'm familiar with include optical spectrometers, digital communications analyzers, and (more esoteric) high energy particle detectors. A good deal of additional circuitry, physical devices, and sometimes patented techniques are involved.

Test Expertise
Hardware companies that build test equipment often have a good deal of knowledge and experience making that kind of measurement. That information often is built into the desktop instrument that performs that measurement. In most of those sorts of situations I would rather have the actual instrument than spend time and effort trying to duplicate that expertise myself.


I am not saying that virtual instruments are invalid. I think they work well for uncomplicated measurements or for measurement techniques that are well-established (e.g., the modern triggered oscilloscope was invented over 60 years ago). But sometimes you need the actual hardware.
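To illustrate why well-established techniques translate to software so readily, here is a minimal sketch of the core of a triggered oscilloscope - a rising-edge trigger - implemented on digitized samples. This is hypothetical Python, not any vendor's API; a real virtual instrument would pull the sample record from a DAQ card instead of a hard-coded list:

```python
def find_rising_edge(samples, level):
    """Return the index of the first sample that crosses `level` upward,
    or None if no trigger event occurs in this record."""
    for i in range(1, len(samples)):
        # Trigger when the previous sample is below the level
        # and the current sample is at or above it
        if samples[i - 1] < level <= samples[i]:
            return i
    return None

# Fake acquisition: a ramp that crosses 0 V partway through the record
record = [-0.5, -0.3, -0.1, 0.2, 0.4, 0.6]
idx = find_rising_edge(record, level=0.0)
print("trigger at sample", idx)  # sample 3 in this record
```

Once the trigger point is found, displaying the samples around it is just plotting - which is exactly why a plain DAQ card plus software can stand in for a scope on simple signals, while the complicated real-world measurements above still need dedicated hardware.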

Monday, December 3, 2007

LabVIEW Certifications

In 1999, near the height of the internet bubble, Microsoft Press published a book titled After the Gold Rush, with the more descriptive subtitle, “Creating a True Profession of Software Engineering.” I started writing this post as a diatribe against National Instruments (NI), but ended up writing about how they are supporting what this book proposed.


The Book

Steve McConnell wrote the book. If you don’t know who McConnell is, then you don’t spend much time working on major software projects (or you do it in isolation). Several of the books he’s written over the years (Code Complete and Software Project Survival Guide to name a couple) I recommend as required reading for serious programmers. I don’t know if I would label him a genius – I’ve only talked with him at a couple of seminars about 7 years ago, so I hardly know him – but he has a unique knack for gathering the best practices of a particular subject and positioning them in an organized fashion under one tidy roof.

In other words, he knows his stuff. And his thesis in this particular book was that software engineering needs to be licensed like other engineering professions, or like dentists, doctors, attorneys, and nurses. There are too many programmers who learned how to code here and there, follow no standard software conventions, write mediocre (or worse) code, and yet still consider themselves professionals. His thesis makes good sense.


NI Certification Gripe

NI has a fairly elaborate certification process that has evolved over the past decade or so. Exams cost ~$200 and are valid for only two years - after that, you have to take the test again. My first impression was that this was a real scam: get companies to buy into the idea that they need NI-certified programmers, forcing programmers to cough up money every couple of years or risk job stagnation - a nice revenue stream for NI.

I have an MS in physics that I completed over a dozen years ago – do I have to go back to my alma mater every couple of years and re-certify myself? No, because getting that degree implies a certain level of competency. If someone wants to gauge me on those matters, they can talk with me or look at my body of work since school. I have a driver’s license that I renew every few years, but I don’t have to prove I can parallel park when I show up at the DMV. It is assumed that I drive on a regular basis and as such keep my skills up to date.

Once I’ve taken the LV certification test & proven my skills, why should I pay to take it again every two years? Prospective employers can talk with me about various LV topics, or look at my resume. Furthermore, my work history shows that I’ve been regularly employed and have kept my skills current. To take the argument further, Microsoft certifications for server admins (MCSA) or system engineers (MCSE) do not require re-certification. Yes, they roll out new systems every few years, but people with an existing certification can upgrade via a subset of the full test suite.


Counterview

But then I started relating NI’s certification efforts to McConnell’s thesis of licensing software engineers. Certification adds some legitimacy to programming in LV. It weeds out some people who write substandard code yet try to pass themselves off as experienced users. It allows those engineers with the certification to (maybe) command a better salary.

So I’ve modified my view. I still think National Instruments is exploiting the certification process, but they’re also doing a good thing for the LV development community. And since LabVIEW belongs to them, that’s how it works.