Tuesday, December 30, 2008

Still around after the storm

It's been a busy month, personally and professionally.  Aside from new hires and a reorg at work, on December 11th there was a huge ice storm in the New England area that knocked out my electricity for ten days.   Luckily I got it back before Christmas - some people weren't as lucky.

Anyway, I have several more posts that I'll be getting up this winter.  As I've said before, I usually flag emails/articles/etc. when I find something interesting to say.  I'm about due for another clearing of the in-basket.

Shot of the woods near my house the day after the storm.

Friday, December 5, 2008

The power of repeated testing

I know I've talked before about statistics in relation to test engineering (here or here, for example). One thing I haven't really discussed is Gage R&R testing, but something happened to me personally this week that reminded me how useful that testing tool can be.

Where I live is close to the bottom of a hill within a subdivision.  I have found, through repeated testing, that if I a) shift my Toyota Prius into neutral at a specific speed once I turn into the subdivision and b) follow the same path, then I can overcome the two small hills and coast all the way down to the end of my driveway.  It's kind of a geeky thing to know, but I'm an engineer with a cool car built to do stuff like that.

Last week I got new tires. A day later I took the normal coast route, but the car felt different - a little slower while coasting. I had shifted into neutral within a couple of mph of my usual speed, and the roads weren't slippery, so I didn't have to brake excessively for the curves - but it was definitely slower. So I called the shop and, as I suspected, the new tires needed to be rebalanced.

This is a perfect example of why you should run a Gage R&R on your test system every once in a while. Like my "coast to the house" run, a periodic test against a known standard will show you if there is something a little "off balance" with your system.
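In software terms, that "known standard" check can be as simple as re-measuring a golden unit and comparing the result to a baseline. Below is a minimal sketch of the idea in Python - not a full Gage R&R study (that takes multiple operators and multiple parts), and the measurement stub, limits, and names are all invented for illustration.

```python
import random
import statistics

BASELINE_MEAN = 10.0    # established when the system was known to be good
BASELINE_STDEV = 0.05   # repeatability measured at that time

def measure_golden_unit():
    # Stand-in for a real measurement of a known reference standard.
    return 10.0 + random.gauss(0, 0.05)

def check_system(n_repeats=30, mean_tol=3.0, stdev_ratio=2.0):
    readings = [measure_golden_unit() for _ in range(n_repeats)]
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    drifted = abs(mean - BASELINE_MEAN) > mean_tol * BASELINE_STDEV
    noisy = stdev > stdev_ratio * BASELINE_STDEV
    return drifted, noisy

if __name__ == "__main__":
    drifted, noisy = check_system()
    print("mean drift detected:", drifted, "| repeatability degraded:", noisy)
```

If either flag trips, it's time to "call the shop" before trusting any more production data.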

Thursday, November 20, 2008

A good test engineer studies the details, part 3

A couple of weeks ago I went on a camping trip up in the White Mountains of New Hampshire. I'm on the boy scout committee for my son's troop and one of the scout masters put together an overnight trip just for the committee adults. The plan was to hike up the side of a mountain, set up camp in an unheated cabin built back in the 30s (by the Civilian Conservation Corps, part of Roosevelt's New Deal), fix our own meals, then hike back down the next morning.

This is not something I've ever done before. I love to camp - we went on camping trips every year when I was young, and I take my kids on at least a couple of trips every summer. But when I say "camping," I assume a) it's warm weather, b) I pack lots of stuff in the car, and c) it's at a state park or somewhere similarly well-organized. For this overnight trip we were each expected to hike a couple of miles, uphill all the way, while carrying our own food, a gallon of water, sleeping gear, and very warm clothes (the forecast was ~20°F that night).

So I approached this almost like I would approach designing a new test: look at previous designs, step through the new test algorithm detail by detail, examine my constraints and design with them in mind, and finally buy the equipment I need. I started with the list I usually use for summer camping trips.  I shopped around and bought a good internal frame backpack with plenty of room for a sleeping bag, clothing, water, etc.   I did a preliminary pack of what I had to see how much room I had left.  Then I went through a mental checklist of what could go wrong & what I would need if it did.  Finally, I went out and bought the remainder of my equipment.

This level of detailed preparation paid off. My folding saw came in very handy cutting the wood we burned for heat that night. I stayed nice and toasty with my thermal underwear and sleeping bag (rated to 0°F). And my spare plastic baggies were invaluable in making the chocolate pudding we had for dessert.

 

The point I am trying to make here is that my attention to detail meant I had everything I needed to be comfortable on an arduous trip. Similarly, that level of detailed planning will usually ensure that your test system gives you valid data and recovers from difficult situations.

A good test engineer studies the details, part 2

I've always been a big fan of PDAs. I bought a Casio Telememo watch as a "congratulations on graduating college" present for myself. It held up to 100 phone numbers, plus it had calendar and memo functions. I was hooked. Since then I've had a host of such portable devices, including writing a program on my HP48 calculator to hold contact info. My next device will be either the new BlackBerry Storm or an iPod touch.

I view such a device as a necessity for a test engineer.  Sure, it can be a time-wasting toy at times, but I think there are 4 good reasons for using one:
  1. Having a calculator with good scientific functions in your pocket is really handy when checking your test results.
  2. I have often kept technical specs (often in PDF format) on my Palm - this can be a lot handier than looking them up online.
  3. Writing down notes at any time in electronic form is very helpful when you have to refresh your memory 6 months later via a search function.
  4. More so than other engineers, test engineers are mobile - commuting between the office, the test lab, meetings, a customer's facility, a remote site, etc. But sometimes a laptop is cumbersome, especially if you need to fit into tight spaces.
How does this fit in with my thesis that a good test engineer is detail-oriented?  Referring to the reasons above, I would say that having a tool like this allows you to focus more on the details.

 

Chronological list of my PDAs (1989-2008)

Casio Timebank,  HP48S, HP Jornada 680, HP Jornada 620, Palm TX

 


Monday, November 17, 2008

Testing Intel's latest chip

It's not every day that your profession is spotlighted in the New York Times. It's even rarer that it is cast in a favorable light. And yet, today there is a nice piece about Intel's testing of their latest chip, codenamed Nehalem. The writer interviewed the guy in charge of product testing, and the article references the difficulty of designing the tests. Of course, they had to talk about the Pentium floating point bug (I certainly remember that one), but that was way back in 1994.

Anyway, I liked reading that article & I wish John Barton (the test chief) luck.  Testing something of that magnitude has to be keeping him up nights.

Thursday, November 13, 2008

A good test engineer studies the details, part 1

Back about 14 years ago I created my first website.  It had maybe a dozen different pages, talked about different interests, had a few jokes on it - the usual stuff that people were doing on the web way back then.  In 1997 I decided to upgrade my computer & ran into all sorts of grief debugging it.  Feeling the need to gripe about it afterwards (blogs weren't invented back then), I wrote a page about it and added it to my website.

I applied for a job at Hewlett Packard a year later. During the interview I found out that the hiring manager had looked me up online - Google was brand new back then - and found my website and the description of the upgrade. One of the reasons he brought me in for an interview was that he was impressed by how I had a) dug into the problem, b) tried out several different solutions, c) identified the problem (a bad EIDE plug on the motherboard), and d) fixed it. It was that kind of attention to detail and problem-solving skill that he needed in a test engineer. I got the job, moved to California, worked there a couple of years, and loved it.

That website is long gone, but last week I dug around and found the original HTML files I had created.  So, here's the complete page that helped me land the job:

=================================================================

The Upgrade From Heck

Early in 1997 I decided that my poor 486DX had seen its last. Fancying myself a clever fellow, I decided to upgrade on my own. I mail-ordered a motherboard, a Cyrix 686, 72-pin memory, and a PCI video card, carefully shopping around to get the best price on each.

Tragedy struck. First, I found that screw hole locations on motherboards are not standardized. Solution: epoxy plastic standoffs to my sturdy metal case. Then, the board would not recognize my hard drive. Here are the steps it took to fix this:

  1. Played around with the BIOS settings. No luck.
  2. Borrowed a brand new Western Digital drive from a friend. Still nada.
  3. Called tech support, who returned my call several days later with no answer.
  4. Borrowed a plug-in IDE card from work that went into an ISA slot. When I plugged the old hard drive into that card (rather than the EIDE plug built into the motherboard), the computer did detect the drive.
  5. Faxed all these steps in a long memo to tech support, who shipped out a new motherboard free of charge. Problem solved.

Below is a picture I took of my version of dual-processing. During the month I was working on this problem, I still wanted to work on my computer. So I would have to unplug the new board (bottom), plug in the old board (top), turn on the computer, change the BIOS settings, and go.

Okay, I'm done venting now.

Wednesday, November 5, 2008

Relieved

Thank goodness it is, finally, November 5th.

Friday, October 31, 2008

A good test engineer studies the details (prologue)

Different professions require different skillsets.  To be good in that profession, it also helps to have certain character traits.  I think that being a good test engineer requires attention to detail.  Over the next week, I'll write about three different personal examples that I think illustrate this point.

Wednesday, October 15, 2008

Blog Action Day - Poverty

Today, October 15, 2008, is Blog Action Day, and the subject is Poverty.


Growing up, my family was at the lower end of middle class, a couple steps up from being really poor.  So I have always been strongly motivated to study hard, get a good job & have a good amount of cash.  Partly because of that, I have historically believed that if you are focused then you can pull yourself up by your bootstraps.  Not having lots of money doesn't prevent you from going to the library.  You can take advantage of what the public schools teach you, provided you study hard.  Loans and scholarships are available for college if you work for them.  Don't get distracted by parties, games, TV, etc. while in school & you should be just fine.

That was how I felt until my senior year in college.  That fall I participated in a mentoring program at a local elementary school.  I was paired up with a 4th grade boy, David, who needed help with math.  So once a week I would go to the school and we'd go through his math work for an hour or so after classes were over.  Once or twice we just hung out and threw a baseball outside as well.

What did this have to do with why my opinions changed? I guess I just started realizing how being poor can handicap a kid. Working with David, I realized that the things my mom had helped me learn when I was his age were never an option for him - his mom had to work a lot since his parents were divorced. When I was a kid my parents bought a set of children's encyclopedias (published by the World Book people), and I had ready access to them at any time of the day. The library is only open at limited times, and when you're a kid you can't just hop in a car and drive there. Also, he was often hungry. I remember bringing snacks once to a study session - I had skipped lunch to study for a test - and he wolfed down the food in a heartbeat. It's kind of hard to focus on learning when you're hungry.

Years later, I would think even more about David. He lived in a rural area, where the schools just weren't very good. The library was in the middle of town, so he couldn't get to it easily. He didn't have a lot of time for school work after hours because he had a lot of chores. Then I thought about how all of that related to my dad. I'm pretty sure that if he had been born in a different part of the country, and his family had had more money, then he would've been an engineer of some sort. But he grew up in the woods of Kentucky, quit school in 8th grade because he had to get a job, and bounced around from one factory job to another. Sure, he managed to become an electrician, but that took great effort and didn't happen until he was well into his 30s.

There are other ways that being poor can make it tough. There is a strong link between nutrition and brain development: if a child doesn't get good food, his brain suffers along with the rest of his body. If a child has to work a lot to help make ends meet, she can't spend that time learning the things she'll need to make her own life better as an adult. And if you live in a geographical area that is dirt poor, then public amenities like schools and libraries are probably either scant or nonexistent.

I think the point I'm trying to make is if your family is poor then sometimes there aren't any bootstraps worth pulling.

Tuesday, October 14, 2008

Testing very small stuff

I was cleaning out my inbox (again) and started reading a recent issue of Evaluation Engineering. I realized that I had referenced articles from them before, did a search, and found three separate times (one, two, three). So the next box car in my train of thought ran, "I wonder what they say about nanotech testing?"

For the last 9 months or so I've been involved in testing devices that involve either MEMS structures or nanoscale features. This has required a certain evolution in my thinking. For example, a few years ago I never thought I'd have to automate and analyze the data from an interferometer that imaged micro-scale shutters. I did just that this past spring.

I think the first time I was really aware of nanotechnology as a going concern was back around 1991 when I read Great Mambo Chicken and the Transhuman Condition. Of course, that book is somewhat out of date 18 years later, but at the time it was a great read - I still have my copy.

So I've been reading more about testing at this level lately. Here are a few articles:
Battery development & testing
Testing a nanotech system

Keithley has been particularly active in this area. Two years ago they introduced a nanotech testing blog. A couple of weeks ago I received a Nanotechnology Test & Measurement Resource Guide. Good for them.


After reading more details about nanotech testing over the past couple of months, I've come up with two conclusions. One, I've barely scratched the surface. Two, nanotechnology is rapidly expanding, and I think the need to test it will be key in the 21st century.

Saturday, September 20, 2008

Congrats to the LHC

In the early 1990s I was in grad school doing research for the Superconducting Super Collider (SSC).  I worked at the Texas Accelerator Center, where the "first foot" of the SSC was built.  Then I went to Fermilab, where I spent a couple years learning high energy physics and building a new type of calorimeter that would work at low-angle regions after the collision (where the radiation levels were particularly high).  

Congress started reducing the funding for the project, then cut it entirely in 1993.   My project lost its funding, and I escaped with my MS in Physics.  For years I was bitter over that whole episode, but eventually I realized that my life would have turned out very differently, perhaps for the worse, if I had stayed with the SSC.  Also, my experiences there - writing programs for data analysis, building a test system - started me down my current career path.  I can't be bitter about that.

In the 15 years since then, I've lost track of the cutting edge of high energy physics, but I still try to read up on it once in a while.  I was very excited on September 10th when they activated the Large Hadron Collider (the New York Times wrote a nice piece about it from a layman's perspective).  

So, cheers to everyone involved with the LHC.  When I was at Fermilab there was a sort of rivalry between us and CERN (where the LHC was being built), but I'm pleased that at least someone will be slamming protons into each other.  

Wednesday, September 17, 2008

Structural mechanics kit

Last month I received an email offering a "complimentary copy" of an Introduction to Structural Mechanics Simulations CD. It comes from COMSOL, a company that writes software for physical modeling. According to the email, the contents include tutorials on simulating:

- Static linear analysis
- Thermal stress
- Fluid-structure interaction
- Fatigue analysis
- Nonlinear materials
- Multiphysics user stories
- Tour of COMSOL Multiphysics

Now, I am in no way a mechanical engineer.  One of my very first jobs was as an engineer in an aerospace firm, and yes I did do some analyses of stress levels, gear matching exercises, and other mechanical engineering 'grunt work'.  But that job also taught me that if you dig into the details, then mechanical engineering is a serious discipline.  

But, if any of you out there are interested in this, here's the website:

Tuesday, September 16, 2008

Optimizing LabVIEW

Last month I attended a local NI-sponsored LabVIEW seminar. Usually I don't bother with those things. They are aimed at novice users and are usually frequented by consultants looking for new work. But our NI rep assured me that this one was aimed at experienced users. Since they were also serving free lunch, I figured it wouldn't hurt.

It was actually pretty good. The title was Advanced Performance Optimization in LabVIEW, and the focus was on improving programs through memory management. Because of the unique ways that LV treats data, there are some tricks that can speed up your programs. If all you want to do is run an instrument and take data, these tips won't help much. But if you're controlling half a dozen instruments and dealing with reams of data, then you probably need all the help you can get.
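LabVIEW block diagrams don't paste into a blog post, but here is the flavor of one classic memory trick from that world - preallocate your output array and write into it (Replace Array Subset) instead of growing it inside a loop (Build Array) - sketched in Python. I'm not claiming this exact example was in the presentation; the array size and the timing harness are arbitrary.

```python
import time

N = 10_000  # arbitrary size, just big enough to show the difference

def grow_in_loop():
    data = []
    for i in range(N):
        data = data + [i * 2]   # builds a brand-new list every pass (think Build Array in a loop)
    return data

def preallocate():
    data = [0] * N              # allocate the whole array once up front
    for i in range(N):
        data[i] = i * 2         # write in place (think Replace Array Subset)
    return data

for fn in (grow_in_loop, preallocate):
    t0 = time.perf_counter()
    fn()
    print(f"{fn.__name__}: {time.perf_counter() - t0:.4f} s")
```

The preallocated version does the same work without constantly reallocating and copying, which is the general principle behind most of these memory tricks.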

The presenter was a guy named Brian Powell. I listed his blog, Open Measurements, as a useful one this past summer. He obviously knows his work well.

Anyway, I have a PDF of the presentation (including his notes).  If anyone wants it, email me and I'll forward it on to you.

Sunday, August 31, 2008

Coming this fall

With all the things going on this summer (new job, summer camps for the kids, big vacation trip), it has been over a month since my last post on here. In that time, I've had several ideas I've wanted to blog about. So I made a note/flag/task for it and kept working. I just got back from taking my kids on a 2400 mile trip (driving my Prius, which I love) to see some friends and family. That excursion marks the official end of my summer activities, so now I have more time for other things.

Here's a quick list of things I plan to write about over the next few months. In no particular order they are:

  • Testing small stuff
  • Intranets
  • LabVIEW v8.6
  • LabVIEW optimizations
  • Blog Action Day
  • Being organized
  • Contractors

Saturday, July 19, 2008

Clearing my mailbox, pt 3: VB6

I subscribe to various magazines online. Yes, the subscriptions are free, and yes most of those articles are written by the marketing arm of a company trying to peddle something. But sometimes you can still find interesting stuff.

One case in point is an article in the June edition of Evaluation Engineering. A marketing person from NI wrote a piece about life after VB6, since Microsoft is ending support for Visual Basic 6 at the end of this year. The question the author addressed was 'what do I need to know to upgrade from VB6 to VB.Net?' And I would say she did a credible job of covering the high points, as well as listing other references for further reading.

Back in 2000-2001 I programmed almost exclusively in VB6, with a smattering of C++ as well. When the time came to consider upgrading to VB.Net, the VP of engineering at my firm decided it was too expensive at the time. A contractor I worked with left after that - he felt that he needed to keep his skills current - and I started thinking about migrating back to LabVIEW. So this article had a personal historical interest for me.

Clearing my mailbox, pt 2: joys of dual monitors

Someone forwarded me an article from the New York Times about the productivity improvements from using dual monitors. The article is a little dated (2006), but it makes a valid point. And if you program in LabVIEW, once you go dual you'll never want to go back. The advantage of having the front panel on one screen and the block diagram on the other is huge, in my opinion. I liked it so much at work that I found a cheap monitor on craigslist so I could do it at home as well.

Of course, now I also feel a little like an evil genius in a secret lair....

Cleaning my mailbox - LabVIEW vs C

My various email inboxes have been filling up with things I want to keep until I've had a chance to look at them. Today (Saturday) I had time to go over a couple.

National Instruments has a semi-monthly email newsletter where they publish details about new white papers, new products, etc. Some of the time it's just marketing and not useful, sometimes it's good. The article that made me keep this latest issue was about comparing LabVIEW with C. I was particularly interested in this white paper since I had blogged about this subject a year ago.

I judged this paper on two different levels. The first level is marketing: its apparent goal is to convince C programmers that LabVIEW can do about anything that C can do for test engineering. Does it convince? Ehh, maybe. It certainly goes through a list of typical things that a C person might ask about. So on that level I would give it good marks.

The second level I consider more philosophical: does it address why LabVIEW would be better than C, at least for test engineering? On this level it really falls flat. After I read it, I couldn't point to a single reason it gave for why LabVIEW would be preferable. But maybe I was asking too much.

Friday, July 4, 2008

One year anniversary, pt 2

As I said in my last post, I've been writing this blog for a full year now. One of the things I said in my first post was that I hadn't found any blogs about regular test engineering. One year later, I should add a caveat to that statement. There are some blogs about test engineering, but they are usually written by someone who either a) works at NI (or some other huge test engineering firm), b) gets paid to write about test engineering (i.e. - a magazine), or c) is a consultant (and for a consultant, a blog is a form of advertising). This blog is still one of very few in test engineering written neither for profit nor for self-promotion. I write it just because I want to.

Having said all that, here's a list of test and engineering blogs that I read from time to time (arranged alphabetically). Maybe you'll find them useful and/or entertaining. But be warned - most of them focus on LabVIEW usage.


The Automated Test Blog
http://automatedtestblog.com/
A director of marketing at NI posts here once a month or so.

Blogs on Test & Measurement World
http://www.tmworld.com/blogs.html
This page is for a group of blogs written by people at that magazine, rather than a single blog. But it's useful to look at the summaries on this one page. These are often worth a quick read.

Expression Flow
http://expressionflow.com/
This LabVIEW programming blog has several different consultants posting to it from time to time.

Ideas in Wiring
http://ideasinwiring.blogspot.com/
The author works at NI, but the blog is more of an independent piece.

Open Measurements
http://openmeas.blogspot.com/
Brian Powell is a senior R&D guy at NI. I've attended talks he's given, and he seems like a great guy. As you'd expect, his blog is specifically about LabVIEW.

Testing with NI
http://nitesting.blogspot.com/
Another test engineer writing a blog on his own. Good for him.

Thinking In G
http://thinkinging.com/
This blog is run by Jim Kring, who co-wrote the great LabVIEW for Everyone book. He runs a consulting company.

VI Shots
http://vishots.com/
This is written by a pretty active consultant, Michael Aivaliotis. I see his name pop up quite a bit in LabVIEW circles.

Monday, June 30, 2008

One year anniversary, part 1

It was a year ago tomorrow that I wrote my first post to this blog. Technically, I created the blog a few days before that, and I started thinking about creating it several months before. But the first post sounds like a good enough date to use. In that year I've written 66 posts - a little over one a week, which is about what I had originally aimed for, so I'm happy with that.

Anyway, I wanted to formally mark the occasion of one year. Note in the header that this is "part 1" - I'll write a related post tomorrow (or the day after at the latest).

Sunday, June 29, 2008

Boston testing is hiring

I touched on this over a month ago when I talked about moving to my new job. But it really bears repeating: there are plenty of test jobs available in the greater Boston/north of Boston area. Over the past 5 to 6 months I have averaged at least a call a week from a recruiter - just this past week I had four. And I'm not counting the emails I get from recruiters.

Now, I could be full of myself and say that I get all this attention because I'm just a great guy... But in reality, recruiters are looking for specific skill sets. For most of the contacts I've had, they want someone with a) several years of experience as a test engineer, b) good knowledge of LabVIEW, c) some sort of engineering education, and d) residence in the area (relocation is expensive right now). So, if you fit those criteria, come to the Boston area and you'll get an interview.

What do I base this observation on? First, the two companies I have left in the last 6 months are still trying to fill my old positions. Second, over the last two weeks I've asked several recruiters if the test engineering field is busy. They have all replied that the market is tight and they are having trouble filling open reqs. So, there you go.

Tuesday, June 10, 2008

Test Engineer Humor

While looking for software drivers I came across this page at Cal-Bay Systems.

I cracked up, especially at the shot of the PXI chassis with a tap in the middle. And the "BXI Chug 'n Play" standard on the specifications page was even better. It struck just the right tone to mimic an NI press release.

This is a great example of humor developed by a test engineer: obvious reference to the industry, plenty of spec details, and playful.

Monday, May 26, 2008

Test engineering in a startup

As I wrote a couple of weeks ago, I start work with a new company this week. This will be my fourth startup in the past 9 years. About a month ago I wrote about something I had learned regarding startup funding, and that post got me thinking about what I wrote last year about the different types of test engineering. Specifically, is a test engineer who works in high-tech startup companies a separate type of test engineer?

A successful test engineer in a startup needs a broad set of skills. First of all, you're probably the only test engineer in the company, so of course you need to program. At first you can set up some rudimentary manual test stations, but soon after that you'll want to automate. People won't have the patience to sit and run a manual station for very long.

You're putting together test systems, but in a startup time is often at a premium so you'll usually hire contractors for specialized assistance. Knowing something about mechanical as well as electrical engineering will help when working with those contractors.

Furthermore, a high tech startup has a strong need for data. The test engineer that knows how to handle a database - writing to it, reading from it, designing it - has a leg up. And if you can do decent data analysis (I blogged about JMP here), then you help out the other engineers as well as have some fun.

Finally, you have to be a bit of a people person. You're in a small group of people, most of them engineers, and those engineers rely on the data from testing. You cannot just sit in your cube and program, or sit in the lab and build your systems. You have to communicate with the engineers to find out what kind of testing they need, how the data should look, and a dozen other issues. Things can change fast in a startup, so the test requirements drawn up a month ago might have already changed. You need to stay on top of those changes.

The skills I have described are of course used by test engineers everywhere. But it is the breadth of skills, rather than deep expertise in any particular one, that matters in a young company. So, back to my original question: is "startup specialist" a separate type of test engineer, or is this more of a 'jack of all trades, master of none' issue?

I don't have an answer to that question right now, but I'll think about it more. There is one thing I can add, though. Since I've built my career on having a broad set of skills, I've never really liked the negative connotation of the "jack of all trades" epithet. In reality, the full quote is, "Jack of all trades, master of none, though ofttimes better than master of one." When you think about it, that's a compliment I can live with.

Saturday, May 24, 2008

Working at Google

I wrote in December about a friend I worked with who is a statistics master. Well, a couple of months ago he left that company and traveled across the country to work at Google.

Now, there are a hundred things I can easily think of that Google would do that involve statistics. So I'm sure he will not lack for work, and I believe that Google is lucky to have him. So, Bhairav, I wish you all the luck in the world out there on the west coast.

Monday, May 19, 2008

The Power of Complaining

Complain (Verb) - To express feelings of pain, dissatisfaction, or resentment


About three years ago I went to the yearly National Instruments Technical Symposium. Held every fall in Massachusetts, it is a mix of companies selling things (roughly 20 booths), programmers getting in touch with each other, and NI showcasing the newest updates to LabVIEW that they promoted at NI Week the previous August.

Well, three years ago I was disgusted with the state of the symposium's presentations. Without fail, all the presentations had a high percentage of marketing and a low percentage of actual technical content. And the technical content seemed to be pitched at either a) a beginner's level or b) extolling the great new things that had been added to LabVIEW (in other words, more marketing).

Most of the seminars and conventions I've attended over the years have a comment/rating sheet where you can grade your experience. Usually I check off a few things, write one or two sentences on what I liked, and that's it. This time, I roasted them. I wrote what I really thought of the day's events, and it wasn't pretty. I went into graphic detail on each talk I attended and why I felt it sucked. I also wrote that other people I had talked with had a similar opinion.

It must've hit a nerve. I received a call from an NI marketing guy in Austin a couple of weeks later. He wanted to talk in more detail about what I disliked (his wording - mine was stronger) and what I felt should've been done differently. The next time I talked with the local NI rep, he mentioned that he had heard about my comments.

Well, over the last couple of years the LabVIEW symposiums I attended definitely had more technical content. A couple of weeks ago I attended the LabVIEW Developer Education Seminar. It's similar in spirit to the technical symposium but without the booths. And I have to say that this time NI did a great job of presenting good technical content. Every seminar I attended had solid information that I can use. Even better, there are actual notes with the presentation materials. I may have to keep that booklet.

So sometimes it pays to complain.

Thursday, May 8, 2008

Moving yet again

Back in January I wrote that I was moving to a new job. Well, about four months later, I'm doing it again. I start another job by the end of the month.

Supposedly the US economy is in a recession, but it hasn't hit the Boston area that hard - probably because most of the jobs lost have been in real estate and finance, while this area is a hotbed for technology. If I had to guess, I'd say the engines of academic research (MIT, Harvard, UMass, Tufts, etc.) combine with heavy military tech work (BAE, Raytheon, MIT Lincoln Lab) and a puritanical tradition of hard work to constantly churn up new applied science. That's partly why this will be my 4th startup in the greater Boston area.

That, plus I'm a sucker for new tech.

Anyway, the same caveat that applied in January applies now. For the next month or so I don't expect to write much on this blog, since I will be learning the new company. But after that, I'm sure I'll have more to say - I am rarely at a loss for words.

The right tool for the job

My dad was an electrician and something of a general purpose handyman. He had tools everywhere - from the shed to the basement to a fully-stocked work van. One of the many things I learned from him is that any job you do is a lot easier if you have the right tool. To that end he had a lot of different kinds of tools. As a kid and then a teenager (when I used to help him on weekends and the summer) it amazed me how inventive the people who designed those tools were.

I started carrying a pocketknife when I was about 12 or 13 years old, probably because it was such a useful tool. I bought my first Swiss Army knife in college and loved it. I used it constantly. Knife, screwdriver, bottle opener, even a little saw - what else could an engineer-in-training want?

I found that answer when I bought my first Leatherman tool. In the last dozen years I've used that tool all across the country in clean rooms, trade shows, customer visits, and even at parties opening beer bottles. I still have it, I still use it, and any young engineers I encounter eventually hear that they should buy their own (and stop using mine).

So the other day I pulled it out to adjust a screw on a cabinet and wondered how long these tools had been around. It has been an essential part of my career for a dozen years, but how much longer than that? Turns out that 2008 is the Leatherman tool's 25th anniversary. So this is my official toast to 25 years of the right tool for many jobs.

Monday, May 5, 2008

Practice Makes Perfect


I have always told my kids that the only way to really be good at something is to practice. You could be a gifted athlete, a natural genius, or a musical prodigy, but that natural talent can only take you so far. History is full of geniuses who disappointed as well as overachievers who thrived. As the band Rush put it:

You won't get wise
With the sleep still in your eyes
No matter what
Your dreams might be


Engineering can be the same way. During the last 10 to 12 months at my previous company my job mostly consisted of a) training new people, b) building and qualifying copies of existing test stations, and c) maintaining those existing stations. There was nothing inherently wrong with this activity. It needed to be done, and I was the only test engineer to do it. But that meant that my LabVIEW programming skills atrophied. Well, maybe not atrophied, but they certainly were not improved.

I've now been at my current company for close to 4 months, and it has been heavy on the programming. As a result, I have had to stretch my skills. I've read up on certain programming techniques, played around with new ways to present data, and developed some of the most complicated algorithms of my career.

I'll pat my own back a little and say that I'm really happy with some of this work. I've created new data structures I really like. I'm working with some aspects of LV code I haven't used before. I've improved my LV knowledge in a small but noticeable fashion.

It almost makes me want to go back and rewrite test code I did a couple of years ago. Almost.

Tuesday, April 29, 2008

Learn something new every day: startup funding

I learned something new a week or so ago about venture capitalists and startup funding. It was an uncomfortable nugget of information, but I now understand more about the dynamics of a previous employer.

I've worked at three startups over my career - five if you count small self-sustained groups within larger organizations. Startups are a natural home for test engineering: if you're building a new widget, you have to test it a LOT. Perhaps that's a subject for a separate post.

One of those startups spent money like it was going out of style: buying expensive equipment, relocation packages for people that weren't really suited for the job, building a brand new building. This profligate burn rate was a primary cause of the company's death. I had heard from several people in management that the board of directors encouraged the spending. The reasoning was that if you spend enough money then eventually you'll solve the problem. It's a quasi-statistical approach to R&D.

But in a conversation with the CEO of another startup I found an ulterior motive, the "slash and burn" method. It probably only works with a management team that isn't that bright, but that describes that particular company well. It goes like this:
  1. The board of directors is composed primarily of the initial investors in the company. They've invested money and own a certain percentage of the company.
  2. All startups go through multiple rounds of funding. In each round, the company's value is estimated based on certain milestones: design wins at major customers, working prototypes, improvements to the production process, etc. Based on that estimate, the new share price is set and investors get a number of shares based on how much they invest at that price.
  3. If the company spent its previous funds without hitting those milestones, then its valuation will not be as high.
  4. So, when the company goes looking for another round of funding, the investors can put in more money at a lower share price. This is good for the investors - they'll own a higher percentage of the company - but bad for the employees, because their stock options are now worth less through dilution.

So, if the board can convince the management team to spend cash wastefully, then (assuming the company succeeds, of course) the board can end up making a lot more money in the end. It's kind of perverse, but true.
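To put rough numbers on it - every figure below is invented - here is the arithmetic of a down round: the lower the valuation going into the next round, the more of the company the same check buys, and the more everyone else's stake is diluted.

```python
def next_round(pre_money, new_cash, existing_shares):
    """Return the new share price, the stake the new cash buys, and the
    fraction of their old ownership that existing holders keep."""
    price = pre_money / existing_shares      # new share price
    new_shares = new_cash / price            # shares issued for the new cash
    total = existing_shares + new_shares
    return price, new_shares / total, existing_shares / total

EXISTING_SHARES = 10_000_000
NEW_CASH = 10_000_000

for label, pre_money in [("milestones hit", 40_000_000),
                         ("cash burned, milestones missed", 10_000_000)]:
    price, investor_pct, keep = next_round(pre_money, NEW_CASH, EXISTING_SHARES)
    print(f"{label}: price ${price:.2f}, the new $10M buys {investor_pct:.0%}, "
          f"existing holders keep {keep:.0%} of their prior stake")
```

With these toy numbers, hitting the milestones means the new $10M buys 20% of the company; burning the cash without results means the same $10M buys 50%, and everyone already holding shares or options sees their stake cut in half.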

Tuesday, April 15, 2008

Embedded testing

I recently downloaded a document called "Embedded Design Guide" from Tektronix. It's a 54-page PDF file that serves as an introduction to testing embedded systems. You can find it here or here.

My experience with embedded systems work is very limited, confined mostly to writing code for a microcontroller or debugging software written on an embedded system. But I like to have at least a passing familiarity with most technology I have to use. I haven't finished reading the file, but it appears to provide just that: a passing familiarity.

Thursday, March 27, 2008

Linux on test systems, pt 5


In July of 2007 I started reviewing app notes that Agilent published about using Linux on test systems. They've put out 5 papers on the subject (the full list is here). This post reviews #5, the last paper in the series.



Tips for Optimizing Test System Performance in Linux Soft Real-Time Applications


One of the first things the paper does is discuss the difference between soft real-time and hard real-time applications. To be honest, I didn't realize there was a meaningful distinction. But I found it referenced in Wikipedia, so it must be real... My experience with real-time systems has covered both varieties, but I never quantified the difference. So, now I know something new.

One of the things I liked is the list of tips for optimizing response times. These tips aren't really specific to Linux, and they may seem obvious, but sometimes it's good to see those "obvious" ideas listed:

■ Avoid it if you can
■ Put the burden of real-time control on your instruments
■ Use a fast PC with plenty of memory
■ Shut down unused services
■ Isolate the real-time part of your application

The author then goes on to discuss specific Linux techniques like time slices, the Linux scheduler, preemptive multitasking, and virtual memory & paging. Each of these discussions is paired with code, diagrams, and graphs that dive into the technical details. Most of those details were admittedly beyond my skill level - I'm not a Linux guy - but the paper was written well enough for me to understand the subject matter.
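I can't reproduce the paper's own examples here, but as a rough illustration of the "isolate the real-time part of your application" theme: on Linux a process can ask the scheduler for a real-time policy like SCHED_FIFO so a time-critical loop isn't preempted by ordinary time-shared tasks. This sketch is mine, not the paper's; it only runs on Linux and normally needs root (or CAP_SYS_NICE) to succeed.

```python
import os

# Request the highest FIFO (soft real-time) priority for this process (Linux only).
priority = os.sched_get_priority_max(os.SCHED_FIFO)
try:
    os.sched_setscheduler(os.getpid(), os.SCHED_FIFO, os.sched_param(priority))
    print("running under SCHED_FIFO at priority", priority)
except PermissionError:
    print("need root (or CAP_SYS_NICE) to switch to a real-time policy")
```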

----------------------

This is the last paper in the series, so I don't expect to blog much more on the subject, at least not until I install Linux on that computer at home. I'm still working hard at my new job, so it may be another couple of weeks before my next post.

Wednesday, March 19, 2008

Starting A New Job

I've been at my latest job now for almost two months. In that time I've averaged about 1.5 emails a week and 1 phone call every two weeks with new job possibilities. Most of that activity I chalk up to three items:

• I still have my resume listed on Monster.com. Many headhunters have programs that troll through piles of resumes and auto-generate emails if you match certain keywords. I'd say a third of the jobs I've seen were obviously not a fit for me, but my resume had the right keywords.

• I live in a fantastic technology environment: the greater Boston area. Yes, if you do pure software or web work, then the San Francisco Bay Area is probably better, but for cutting-edge hardware work (electronics, optics, pharmaceuticals, MEMS) this area can't be beat.

• Test engineering is often a recession-proof career. We live in an imperfect world, so you always have to test your products. Coincidentally, US News & World Report had a special feature on good careers for a recession this week. Test engineering was not specifically mentioned, though engineering in general was listed.


I wrote those tidbits as a preface to my main point: what do you do before you start a new test engineering position? A couple of weeks ago I wrote about my list of things to do before I leave a position, so this list is complementary to that one. First impressions are important, and it always helps to start out on the right foot.


Familiarize yourself with the company
Yes, you already did this before you interviewed. You read their website, checked out their competitors, read their white papers, etc. Now it’s time to dig deeper. Read up on their technological practices, review the company’s history, and find information on how well their products work. Buy a book on the technology they use.

Project list
Email your new boss and see if you can get a list of what projects will be most important when you start. Having this list ahead of time will help you get a handle on what you can start doing the first couple weeks on the job when you need a break from filling out those new employee forms.

Equipment
Get a list of the current equipment they use. If you’ve used it before, then great. If not, download the manuals and start reading. Download the LabVIEW drivers for it. Find some example code using it.

People
Learn about who you'll be working with: Facebook, LinkedIn, or just google them. Knowing about their professional backgrounds (i.e. - papers they've written, patents they've filed, other companies they've worked at) helps you understand what they can do and, consequently, how you can fit in with them.

Tuesday, March 18, 2008

Linux on test systems, pt 4


In July of 2007 I started reviewing app notes that Agilent published about using Linux on test systems. They've put out 5 papers on the subject (the full list is here). This post reviews #4.


Using Linux to Control USB Instruments
As I have written before, I sometimes take a dim view of these white papers. It seems that their actual value is inversely proportional to how much input comes from marketing - I just haven't determined whether the falloff is linear, quadratic, or exponential. Maybe it depends on the company itself.

Having said that, when I started reading this latest paper I learned about the "USB Test and Measurement Class (USBTMC) specification," which I didn't know about before. Any white paper that actually teaches me something must have something going for it. Plus, the author provides sample code for creating a generic USB driver that works with current Linux distributions - even better.

I should point out that I haven't had a chance to test this code. I have an older computer at home that I am converting over to a Linux box, so I plan to try it then. But aside from that caveat, I think this paper was pretty well written.
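For what it's worth, newer Linux kernels also ship a usbtmc driver that exposes a USBTMC instrument as a character device, so a first smoke test doesn't require any custom driver code. The sketch below is my own guess at that kind of quick check, not the paper's sample code; the /dev/usbtmc0 path assumes the kernel module is loaded and an instrument is plugged in.

```python
# Quick USBTMC smoke test via the kernel's usbtmc character device (Linux only).
DEVICE = "/dev/usbtmc0"   # assumed device node; adjust for your setup

with open(DEVICE, "r+b", buffering=0) as inst:
    inst.write(b"*IDN?\n")            # standard SCPI identification query
    print(inst.read(256).decode().strip())
```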

----------------------

The remaining paper in the series is "Tips for Optimizing Test System Performance in Linux Soft Real-Time Applications." I'll review this one next week.

Monday, March 17, 2008

Linux on test systems, note

In July of 2007 I started reviewing app notes that Agilent published about using Linux on test systems. I reviewed the third of five back in November. In the past couple of months, while I was busy ending one job and starting another, Agilent went and published numbers four and five. So I'll review #4 this week and #5 in another week or two.

The full list of papers is here.

Thursday, March 6, 2008

Leaving your job

Over the past 15+ years I've worked for a lot of different companies - 8 of them, counting the company I joined in January. I don't look at myself as a job hopper; I think that's just the nature of the tech marketplace in today's economy. With one exception, those jumps have been voluntary, and over time I've developed a list of things to always do before I even give notice. Below are the highlights of that list.

Clean your computer
Y'know that collection of 200 albums you got from BitTorrent and now you listen to at work? Burn it to CD/DVD and delete it from the hard drive. The same goes for those pictures you took last year in Washington DC, chat logs that you've saved, and those blue-humor jokes your friends sent you.

Archive your contacts
I have hundreds of business contacts in my PDA from people I've worked with over the years. Sometimes that list comes in real handy. I have that list because I always make sure I download my list of contacts from Outlook (or whatever mail app your company uses) before I leave. I also make sure I have cell phone numbers & personal email addresses from people I want to keep in touch with afterwards.

Back up your work
Burn a CD or two of what you've done at your company. I'm not advocating stealing company secrets, but most engineers will have collected a lot of stuff on their hard drives over time: PDF files of interesting papers and "how-to's", PDF technical manuals, install files for useful utilities. Also, I see nothing wrong with copying LabVIEW subVIs (or portions of them) that you spent a lot of time figuring out how to make work. They represent your experience, and whether you copy them from work or just recreate them on your own at home, that knowledge is yours. I use subVIs at work that I originally wrote 2, 4, and even 10 years ago somewhere else.

The flip side to this is you shouldn't take someone else's work as your own, nor should you take entire projects and use that at a competitor. That's stealing, no matter how you rationalize it.

Transition your work
This is maybe the most important one. Some people may look at it as a "don't burn bridges" policy, but I also think of it as "don't screw your friends." Until the company hires someone to replace you, your previous coworkers will have to take up the slack. So make a list of your current projects and their status, list potential people who could take them over, and list things you do on a regular basis (i.e. - preventive maintenance on equipment, software backups, etc.)


---------------
4-19-2008
Postscript
I had a conversation the other day with someone about this post, and he pointed out a couple of things that could get people into trouble. So let me clarify.
Archive your contacts. This is a tricky subject. I've read online about senior sales guys who have been sued for taking contact lists they built while they were being paid to build them. Even worse, some of the business contacts you've made on company time could legally be considered that company's property. So tread carefully here. I am certainly not a lawyer, and the best advice I can give is to do the ethical thing: if you consider the person a friend, it's reasonable to keep in touch after you leave the company. If you only relate to that person on a business level, take care.

Backing up your work. Do NOT take anything that could be considered company property. What I was referring to were - for example - PDF files of papers you found online, installation files of shareware programs you've downloaded, or sample code you found on a discussion board. BUT if you take anything that was written by the company, written by someone in the company, or paid for by the company, you might get sued. Basically, if you couldn't have gotten it on your own time at home, don't touch it. If you're an engineer, you probably downloaded some of that stuff at home the night before and brought it to work anyway.

Wednesday, February 27, 2008

Storing large files

Usually data storage is the province of the IT department, but sometimes a test engineer has to get involved. Here's a good example:

A few years ago I started work on a system that would take pictures of lit LEDs at the wafer level. It would analyze the image and save analysis data to the database. But we also had a need to save the image itself.

The images were large (over a megabyte), and even as PNG files (a lossless format) they were typically over 300 KB. I had thought about saving them as BLOBs in the database - that seemed like a simple solution. After discussing it with our IT consultant (who later became our IT manager), I decided to save the images on the local network and record each image's location in the DB. A year later the test system was complete, and saving images worked very well. The images were often used for post-mortem analyses of bad wafer lots or mask issues.

Well, this issue has come up again at my new job. In this case it's not just image files but large dataset files as well. Again, we've decided to save them as separate files and just record their locations. This time I actually have justification for the decision: a paper from Microsoft Research ("To BLOB or Not To BLOB" - cute) states that, assuming you're using MS SQL Server 2005, any data set larger than 256KB should be saved as a file instead of in the DB. In more detail, they state that there is actually a gray area between 256KB and 1MB, but you get the idea.
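For what it's worth, the "save the file, record the pointer" pattern is only a few lines of code. Here's a minimal sketch - SQLite is used purely to keep the example self-contained, and the share path, table, and column names are all made up; the real database and storage location would be whatever your system already uses.

```python
import os
import shutil
import sqlite3
from datetime import datetime

ARCHIVE_ROOT = "/mnt/fileserver/wafer_images"   # hypothetical network share

def archive_image(db, wafer_id, local_png):
    # Copy the big file to the shared storage location...
    dest = os.path.join(ARCHIVE_ROOT, f"{wafer_id}_{os.path.basename(local_png)}")
    shutil.copy2(local_png, dest)
    # ...and store only the pointer plus a little metadata in the database.
    db.execute(
        "INSERT INTO wafer_images (wafer_id, image_path, size_bytes, saved_at) "
        "VALUES (?, ?, ?, ?)",
        (wafer_id, dest, os.path.getsize(dest), datetime.now().isoformat()),
    )
    db.commit()

conn = sqlite3.connect("test_results.db")
conn.execute("""CREATE TABLE IF NOT EXISTS wafer_images
                (wafer_id TEXT, image_path TEXT, size_bytes INTEGER, saved_at TEXT)""")
# archive_image(conn, "W1234", "led_shot.png")   # example call
```

Post-mortem analysis then just reads image_path out of the table and opens the file, and the database itself stays small.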

In summary, sometimes it's useful to know something about other disciplines. Especially if you are a test engineer.

Tuesday, February 26, 2008

New job plus one month

As I said over a month ago, I started a new job and would need some time to settle in. Well, I've been here almost a month now, and it's been pretty good so far. I just wrapped up phase 1 of my first project, I'm in the planning stages of at least one other project, and I have a bit of time now to get back to blogging. In the next week or so I want to write a bit about the following topics:

- Data storage
- Leaving your old job
- Starting a new job
- Outsourcing

These are all things that have come up in the past two months. Stay tuned.

Saturday, February 2, 2008

Training young engineers

One of my original rules for this blog is "keep the personal to a minimum." I'm going to write a few lines about my sons, but I think this post also applies - in a broader context - to anyone who has kids (or nieces/nephews) with an interest in engineering.

I'm sure that thousands of pages have been written about the "science crisis" in America and the shrinking number of children who become interested in science and pursue it as a career. I am not about to offer my two cents on why it is happening or even whether it is real. And I'm not the sort of father to push my kids into the same field as me. But I do have kids who are interested in science and, like other fathers, I want to encourage them to learn. So I'm going to write a few paragraphs about that.

TOYS
For Christmas both of my boys got electronics kits. They've been interested in opening up some of their toys to see how they work, and they've been asking me lots of questions. The younger one got a Snap Circuits set. It's kind of a cross between Legos and electronics. So far he really likes it.


My older son got an Electronics Learning Lab. He played with it some when he first got it, but the jury is still out on whether he likes it. The manual for it is written by Forrest Mims, and while I like his work and have a couple of the notebooks he has written, I'm not sure if his style is suitable for kids learning electronics.

On the flip side, I liked this kit when I first saw it - it reminded me of the breadboard setup when I had my first electronics class in college.



FLL
A couple of years ago my older son joined the robotics club at his middle school. They participate in FIRST Lego League competitions: they use Lego Mindstorms robots to complete specific tasks in a set amount of time, they research a specific topic and present their results, and they learn. My son has enjoyed it a lot, and he was actually excited when I showed him some of my LabVIEW code, since the Mindstorms use a simplified version of LV for their programming.



So, there are a few ideas for anyone who has kids interested in electronics and science. Hopefully they're useful for you.

Friday, January 25, 2008

New Job

As I mentioned in my last post, I am leaving my current job. Since giving my two week notice, I've been very busy a) training people in tasks I normally do, b) showing them the details of current and past work and c) wrapping up loose ends in general. I haven't had time to post on the blog in over a week. Furthermore, I start my new job next week & expect that to tie me up for a couple of weeks during the "settling in" period.

Nevertheless, I expect to at least post a couple times in the next month or so. As usual, I often have ideas for posts that I've written down weeks or months ago - I'll polish up a couple and fit them into my schedule.

Friday, January 11, 2008

Changing jobs in the field

At the end of my last post, I noted that 67% of test engineers surveyed would recommend test engineering as a career to kids or friends. The recent turbulence in my own test group made me wonder: what about the other 33%?

Three people, including myself, have left or are leaving to go to new companies. I'm going to be doing test engineering for a startup. So it's still in test engineering but with a different emphasis. The second person is a manufacturing test engineer at a big firm - basically the same work he did previously. The third person will be an electrical engineer for a test instrumentation company.

So that's three people. Two of them will still be in test engineering, and the third is (happily) going back to EE work (for test equipment). In an odd way that mirrors the 67% number.


On a related note, when I was testing equipment software for HP I went to several software conferences - most of them around San Jose, CA or Seattle, WA. Once I sat at a lunch table with about 4 software testers from Microsoft. During a conversation about programming, they said that a common path for programmers up in Redmond was to spend a couple of years just testing software before being allowed to write it.

Over the past dozen years I've met plenty of people who have moved in and out of test engineering. Some of them did something similar to the guys from Microsoft: get their EE degree, spend a couple of years testing circuits, and then "graduate" to designing circuits. But a more common path I've seen (and one I followed for two years) is from testing to sales. If you spend several years testing a product you get to know it very well, and that knowledge serves you well as an applications engineer helping customers use the product.


But more often than not, once you are in test engineering you stay there. You're good at it, you like it, it pays well, or some other reason. As Michael Corleone said, "Just when I thought that I was out they pull me back in."

Wednesday, January 2, 2008

Test Engineer Salaries

Happy New Year.


Back in August of last year I created a list of topics to cover. I have now written about each one except salaries. I hate to leave anything hanging, so now I'll post about that. This topic is only tangentially related to test engineering, since a major point I'll make applies to other careers as well.

Test and Measurement World publishes a yearly salary survey for the field. It's a good piece of information for comparing compensation - at least for those of you who live in the US - but it completely leaves out a crucial component: location. For example, when I worked at HP/Agilent out in California there were three different salary scales for engineers. Silicon Valley (Palo Alto, San Jose, etc.) was at the top of the list, followed by Sonoma County (CA) and Boston (MA) on the second level, and then everywhere else. There was a 5-10% salary difference between each level - some places just cost more to live in.

If you want to look at pay levels and include geography, use Salary.com's calculator. Also, there are several cost of living calculators out there you can use. For example, according to Salary.com's calculator, it costs about 7% more to live north of Boston than it does in Austin, TX.


I should also point out that the T&MW salary survey includes some interesting tidbits in the Job responsibilities and career satisfaction section. In particular, 67% of those polled would recommend test engineering to their kids or a friend. That makes me feel pretty good about my profession.