Tuesday, February 17, 2009

Optimistic skepticism

I've been deep into updating two separate test stations recently, and I almost repeated a mistake I made last fall: wanting the code to work so badly that I willingly believed a bug had just one source.  Fortunately, I dug deeper and found three separate sources of a single bad result.

For a couple of years in the late 90s I tested software for HP/Agilent.  When you test software, the goal is to break it.  You go to your job thinking, "What would a user of this software do that the programmer would not expect, and will that behavior break the program?"  It was a difficult gear change at first, but I think it made me a better programmer (that's the reason many young programmers at Microsoft get their start testing code).

A primary goal of a programmer is to get his code to work.  There's an indescribable, almost physical thrill when an algorithm you've invested so much time in comes together and performs.  To then try to break it is a real buzzkill.

Test engineers, at least the ones who also write code for their systems, are sometimes stuck between opposing priorities.  On the one hand, you enjoy building a new system that works.  On the other, the system has to work all the time - the worst result from a test system is erroneous data that you believe is correct - so you have to spend a good deal of time trying to break it.

It's a tough row to hoe, but that's just part of the job.

Wednesday, February 11, 2009

Adhering too tightly to the requirements

I've been really busy at work the past month plus - but I'm employed, so my complaints are few.

Read this article and watch the video (also on YouTube).  The part that really troubled me was when the engineer says, "this is just bad engineering," and the manager replies:

That might be, but I can’t afford to worry about that. My job is to make sure the project follows this plan from start to finish.

There are two sad parts to this.  First, that this was a real event that happened at NASA.  Second, that I've seen some of the same behavior at big companies I've worked for.  Management adheres to the process, and all the documentation they have, even in the face of contradictory data.

As a test engineer, that is especially abhorrent to me: valid data is what I work to create for a living.  For a manager to favor process over data is just wrong.