Sunday, December 15, 2013

The Republican Brain

Science and technology are very important to me.  I look back on the thousands of years of human civilization, of failed empires, of the rise and fall of governments and it just makes me sad.  Then I look at how far civilization has come in just the past few hundred years, and I'm hopeful.  Maybe it's a little bit religious, but I do have faith that most people are basically good and, if we continue to emphasize scientific and engineering advancements, humanity will prosper.

Then I spend a little time watching Fox News, or reading a website like this, and I get depressed again.  The earth is getting warmer.  Mankind evolved from apes, which evolved from earlier species, all the way back to primordial ooze.  The Earth is a few billion years old, not 6000.  All these statements are supported by mountains of evidence and theories.  Why don't people accept this?  I finally got around to reading "The Republican Brain," and I have to say it was very convincing.

Saturday, December 7, 2013

LabVIEW versioning hell, pt 2

Back in July I wanted to look at some fairly old code I had stumbled across - in the past I've had to do this a time or three.  This code was too old to up-convert with the version of LV I had, and I didn't have access to any older versions of LV.  While I worked on posting the code online to ask someone to convert it for me, I posted a suggestion to the LabVIEW Idea Exchange.

Basically, I was asking for a way to at least view very old code, and maybe include reasons for why it couldn't be up-converted.  The response I got was, basically, "If you are an SSP member you can download older LV versions.  If you don't pay to keep up your SSP, you're SOL."

Needless to say, I found that less than satisfying.  First of all, why would I want to spend the time and effort to download and keep track of older versions of LabVIEW?  Second, it sounds like just another way to make owners of the LV software continually pay money to NI...

Sunday, December 1, 2013

Test your code

So at my new company I'm back to software testing.  What goes around comes around, I suppose - I first learned the ins and outs of software testing at HP about 15 years ago.  Even though most of my jobs since then have been hardware testing, I still enjoy reading about it (here, there, and way back then).  As my dad has said many times, it never hurts to learn something new.

Speaking of which, I recently found two articles about the costs of NOT testing your software that I enjoyed, in a perverse sort of way.  The first article is a bit esoteric unless you've done serious code testing before.  Basically, it explains how Kaspersky released an update to their software that wasn't regression-tested.  In other words, they made changes to the software, and, while they may have tested their changes, they didn't test whether those changes would mess up the base code.  That's the whole point of a regression test suite.
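
To make the distinction concrete, here's a minimal sketch of what a regression suite buys you.  It's Python/pytest, and the parse_config() function and its tests are entirely hypothetical stand-ins for the "base code":

    # Toy stand-in for long-standing base functionality.
    def parse_config(text):
        key, _, value = text.partition("=")
        return {key.strip(): value.strip()}

    # These tests capture EXISTING behavior.  The point of a regression
    # suite is that they run on every change - not just on the code you
    # think you touched.
    def test_basic_parse():
        assert parse_config("host = 10.0.0.1") == {"host": "10.0.0.1"}

    def test_empty_value():
        assert parse_config("host =") == {"host": ""}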

The second article is of somewhat more personal importance.  For a long time I owned only Toyota cars.  But once I started reading about braking problems back in 2009, I decided to buy Ford instead (here and here).  This past October a court finally ruled against Toyota, and central to the case was the Engine Control Module's firmware.

Will companies forever keep neglecting software testing in favor of releasing product ASAP?  I mean, just reading a paper like this from NASA should convince anyone with half a brain that software testing is of paramount importance.  Jeez.

Saturday, November 23, 2013

More salary surveys

I have a thing for salary surveys (here, there, and back then).  I will admit to using them as a gauge of how fairly I'm paid.  But I also think they're interesting because...

  • A good salary survey will break the data out into several interesting trends, and I'm a geek for data analysis.
  • I like to see how engineers feel about their own circumstances.
  • It's interesting to see how the results track the state of the economy.


So, here are a couple of surveys that came out in the past month or so.  Enjoy.

Design News 2013 Salary Survey

Dr. Dobb's Developer Survey


Wednesday, November 20, 2013

Getting back to the blog

I'm getting a handle on things at my new job, so I'm looking to start posting on here at least a few times a month.  When I've come back from a break in the past, I usually create a list of what I want to post on.  This time around I'll just wing it.

Thursday, August 1, 2013

Yet another startup

I'm sure there's a 12-step program for this somewhere, but I think I may be addicted to start-up companies.  Last month I joined my seventh (or maybe eighth) high-tech startup firm.  As I've written before (here, there, and back then, to name a few), I have landed at numerous early-stage companies over the past dozen years or so.  That's one upside of living in the Boston area - there is never a shortage of smart engineers wanting to start a new high-tech company.  And they all need test engineers.

At any rate, I may have to put this blog on hiatus for a while, again.  The startup experience can be intense at times.

But now that I think about it, start-up companies could be fodder for a couple more posts....


Wednesday, July 24, 2013

Test Executives - part 3


I started writing about test executive software last month, and then a couple of weeks ago I wrote about off-the-shelf software.  Now I want to write about my experiences with "rolling your own" test executive.

I've worked with homegrown test executives at two different companies.  At the first company, the test executive evolved out of a couple of different programs for testing different features of the same product - I wrote most of it myself.  At the second company, I worked with a test executive that had been written several years before I started.  I'll address each one in turn.

The executive I wrote myself was somewhat rudimentary.  For the products we were manufacturing, there were six distinct tests you could perform.  Each of those tests had from about 3 to 20 different numerical parameters that could be adjusted.  This just screamed out for scripting, so that's what I did.  The end result had the configurability I needed, and it was often used by test technicians, but it was missing several other features of an off-the-shelf test executive (a sketch of the basic scripting idea follows the list):

  • It was only usable within the LabVIEW framework I had written.  For example, it couldn't interface with any modules written in C++ without a lot of extra coding.
  • There was no programmatic logic in the scripting.  The scripts consisted entirely of which tests to perform under which conditions - no if-then or optional looping allowed.
  • Some reporting tools existed, but only in simple formats (saving CSV files or bitmaps of graphs).
  • It was written around testing one specific product, and had to be overhauled to test any other type of product.

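Here is that basic idea in Python, purely as an illustration - the real system was LabVIEW, and the test names, parameters, and CSV script format here are all made up:

    import csv

    # Hypothetical stand-ins for the six product tests (only two shown).
    def test_gain(gain=1.0, freq_hz=1000.0):
        print(f"gain test: gain={gain}, freq={freq_hz} Hz")

    def test_noise(bandwidth_hz=100.0):
        print(f"noise test: bandwidth={bandwidth_hz} Hz")

    TESTS = {"gain": test_gain, "noise": test_noise}

    def run_script(path):
        # Each CSV row: a test name followed by "param=value" pairs.
        # Note the limitation called out above: the script is strictly
        # a flat list - no if-then, no looping.
        with open(path, newline="") as f:
            for row in csv.reader(f):
                name, *params = row
                kwargs = {}
                for p in params:
                    key, _, value = p.partition("=")
                    kwargs[key.strip()] = float(value)
                TESTS[name.strip()](**kwargs)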

The other test executive was written in .NET and made use of Measurement Studio for certain graphical presentations.  The scripting tools had quite a bit of programmatic control, and the reporting tools were more extensive.  But there were different problems.

  • Because it had been designed as a reconfigurable tool, it was horribly complicated to use.  We didn't let technicians near it without a lot of training.
  • It was only usable within the .NET framework and didn't play well with other modules.
  • It was written around testing one specific product, and had to be overhauled to test any other type of product.


That's a brief synopsis of the two in-house test executives.  So, what is the point I'm trying to make?  I'm not really saying that you should run out and buy a copy of TestStand.  It's certainly a nice piece of software, but it's expensive and may be overkill for what you need.  I guess the lesson I've learned is that if you get to the point where you want to develop something you can reuse over and over, learn from how the OTS software does it.  Specifically:

  1. Don't make it overly complicated.
  2. Make it generic enough to use for different tasks.
  3. Use programmable logic in the scripting (see the sketch below).
  4. Have plenty of reporting tools.
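
To pick on #3 a little: the difference between my flat scripts and real programmatic control is small in code but big in practice.  A hypothetical sketch in Python (the step format and field names are invented):

    # Each step is a dict; a step can repeat, and a failing step can
    # abort the rest of the sequence - the two things my flat scripts
    # couldn't express.
    def run_sequence(steps, measure):
        for step in steps:
            for _ in range(step.get("repeat", 1)):
                passed = measure(step["test"], step.get("params", {}))
                if not passed and step.get("stop_on_fail", False):
                    print(f"{step['test']} failed - aborting sequence")
                    return False
        return True

    # Toy measure() so the sketch runs standalone.
    def measure(name, params):
        print(f"running {name} with {params}")
        return True

    # Gate everything on the gain test, then loop the noise test.
    run_sequence([
        {"test": "gain", "params": {"gain": 2.0}, "stop_on_fail": True},
        {"test": "noise", "repeat": 3},
    ], measure)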

Wednesday, July 3, 2013

Test Executives - part 2

A couple of weeks ago I started writing about test executive software.  I've decided that this topic is another three-parter, so in this 2nd post I'll write about off the shelf (OTS) software.  The third post will cover in-house test executives.

I mentioned last time that I have my own definition of test executive software, but I never actually wrote that definition down.  Well, for me the software is defined by what I expect it to do.  The three things I expect at a minimum are:
  1. Configurable testing.  I need the capability to switch up the order in which specific parameters are measured and under what conditions they are measured.  This is typically done with scripting or configurable sequence files.
  2. A solid, usable GUI.  A test executive is often used by a test technician instead of an engineer, so you need software that doesn't require a lot of care and feeding.
  3. Easy to see what's happening.  This is somewhat related to #2 above but more specific.  I like graphs, reports, and charts.
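
As a concrete (and entirely hypothetical) example of #1, a configurable sequence file might be as simple as this - the format and field names are invented, but commercial tools carry the same kind of information in their sequence files:

    [
      {"test": "power_on", "limit_low_v": 4.75, "limit_high_v": 5.25},
      {"test": "gain",     "freq_hz": 1000,     "limit_low_db": 19.5},
      {"test": "gain",     "freq_hz": 10000,    "limit_low_db": 18.0}
    ]

Change the order of the entries, or the parameters inside them, and the system runs a different test - no reprogramming required.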

The OTS platforms I have experience with are TestStand and ATEasy.  I wrote about my experiences with TestStand a year ago (a three-parter, starting here and continuing with this and that).  To be honest, my knowledge of ATEasy is a lot more nebulous and consists of three data points:
  • I sat through a long seminar on it once.
  • I worked with a guy who had used it before and liked it.
  • I evaluated it for about a week at a previous company.
Given that limited knowledge, I think it would serve as a cheaper, more limited alternative to TestStand.


On a final note, NI published a checklist for evaluating test executives.  While the list is a little self-serving and obviously slanted towards NI software, it's worth considering as a starting point.

Monday, July 1, 2013

Happy SIX years

Six long years ago today I wrote my first post on this blog.  Since then I've written close to 200 posts on various topics.  Based on some of the feedback I've gotten, at least a few of them have been helpful.  So here's to six more.

Monday, June 24, 2013

Stumper questions - vindicated

Hmmm, I wrote this post last week about trick questions in interviews.  I decided I wanted a little support for it.  Synchronicity struck: THIS was published in the NYT just last week.

I particularly agreed with the statement, "On the hiring side, we found that brainteasers are a complete waste of time."

Sunday, June 23, 2013

Interviewing - stumper questions

In the process of digging through the list of partially-completed posts, I found a link on EDN that talked about technical questions you can ask prospective employees.  Rereading that little tidbit made me think about my stint phone-screening college graduates for HP in the late 90s.  The company had a long list of pre-vetted questions that I could ask these poor souls.  The questions ranged from basic EE problems ("describe how you would implement a low pass filter") to programming issues ("what is a linked list?") to twisty questions (e.g. - the infamous "water in a locked room with a dead body").

Anyway, the point I wanted to make was that in today's world it seems somewhat silly to pepper a candidate with questions like this.  I mean, if you want to check their credentials then go take a look at their LinkedIn page.  Heck, just google them.  What's more important, in the long run, is what kind of person you're hiring and how he/she will fit into the group.

Thursday, June 20, 2013

Test Executives - part 1

For many years now I've been a little obsessed with the subject of test executives.  I'll explain why in a minute, but first let me try to define what I think of as a test executive.

Wikipedia has its own description of test executives, but my definition is more specific to my experiences.

So why do I want to write about this topic?  Ten years ago I was working for a small division of a Fortune 500 company.  We had just bought a complicated test/manufacturing tool developed by a consulting firm.  This firm had written a script driver that they used for most of their hardware projects.

That was an epiphany moment.  I had written a script driver for a specific test instrument when I worked at HP back in the late 90s, but I'd never really thought about writing one that I could reuse across projects.  Nor had I considered buying one off the shelf.  In the ten years since that moment I've worked at four other companies that developed four separate solutions.  So over the next couple posts I will write about different test executives - commercial as well as "roll-your-own" - and compare and contrast my experiences with each.

Tuesday, June 4, 2013

Historian Software

A week ago I sat in on a presentation for a type of software I had never had dealings with before: historian software.  It was GE's version of the software, called Proficy.  Before the demo I did a quick check on what the software was, and it really just sounded like a database with a nice GUI.  

But after the presentation I have to admit that it's really a lot more than that.  I'm not sure if I'll ever use anything like this, since it seems geared towards enterprise-level applications and I gravitate to startup companies.  But it's still neat. 

It's always cool to learn about something new in this field.  Maybe I'll check out NI's historian software, NI Citadel.

Tuesday, May 28, 2013

The latest software test tools

As I have written before, for about two years in the late 90s I worked for Hewlett Packard/Agilent in software testing.  Granted, my group tested firmware that was getting installed into complicated test systems, rather than testing huge web apps or desktop programs.  But it was software testing.

HP was thorough.  When I started, the test group was just finding its feet.  So management sent us to in-house training classes and paid for us to go to several training seminars.  I even went to three different software test conferences.

Since I left Agilent all my jobs have been testing hardware.  But I still like to pay attention now and then to software testing issues.  That's why the Jolt Awards for Testing Tools popped up on my radar.  More interestingly, there was a commentary on those awards asking why there hasn't been a big leap in the kinds of testing tools developers have.  Andrew Binstock, the author, makes the point that the types of tools being awarded (code inspection, unit testing, UI testing, browser testing, load and performance testing) are the same types of tools that were being used a decade ago.  Indeed, they are the same things I used back in 1998 and '99.

This question reminded me of something I read last week.  I've been working my way through the book Physics of the Future by Michio Kaku.  It's an interesting read, if a little overly hyped at times.  One of the topics covered in the book is the related subject of AI and the Singularity.  Dr. Kaku makes a point in this section that, while computer processor speed and memory keep improving by leaps and bounds, the software is still being written by people.  Visionaries who predict the coming robotic revolution or the end of history as we know it miss the point that the code still requires the creativity of people to be developed.

I think that's what is happening with the software test tools.  The hardware keeps improving, but the software improvements just don't follow Moore's Law in the same way.



Monday, May 27, 2013

Prepping for the CLD-R (part 3 of 3)


Last month I wrote about refreshing my LabVIEW skills before I passed the CLD-R exam.  This topic turned into a set of three additional posts.  The first two were about available seminar materials and online help topics.

Now I'll write about the third (and last) thing I did to prep for the test:

Study the sample tests

I found two sample tests that NI had posted on their website for the CLD-R exam.  Who knows - if you dig hard enough maybe there are more.  The first thing I did was take one of the tests as if it were real - no cheating.  Then I followed these steps:

  1. Examined every right answer I got to make sure I understood why it was right.
  2. Looked at every wrong answer to figure out the correct answer.
  3. Researched the specific topics that were a little fuzzy.
  4. Looked online to see if anyone else had worked out answers.  A few examples are here and here.  You can try looking at LAVA as well.
  5. Worked out IN DETAIL every single problem that had code attached to it.  
  6. Took the second test as if it were real.
  7. Repeated steps 1 through 5 for the second test.
  8. Took the first test again.  A couple weeks had passed since I first tried it, which was enough time for me to gauge whether I had improved.
  9. Took a couple days to digest my second pass.
  10. Repeated steps 8 and 9 for the second test.

I know all that effort may seem excessive, but what can I say?  Engineers tend to be like that.  Regardless of the effort involved, following those steps definitely helped me understand the topics NI thought were important for the test.

The fifth step - working the problems out - was particularly useful.  In my opinion just memorizing answers stimulates only one part of your brain.  I created a separate VI for each problem, added as much detail as I needed, and ran the VI until I got a satisfactory (and understandable) answer.  Going through that effort helped to create a sort of "muscle memory" that I could call on during the test itself.  Besides which, I think going over those topics helped me improve my programming skills in general, even if only a little.

So that's all I'm going to write about the CLD-R exam.  Good luck to anyone taking the exam, and I hope this helped.


Monday, May 20, 2013

Prepping for the CLD-R (part 2 of 3)


In a previous post I wrote about the tasks I undertook to refresh my LabVIEW skills before I passed the CLD-R exam.  My first post on this topic discussed NI seminar materials.  Now I want to write about something else I focused on:

Review online help info

NI publishes a list of about a dozen items the test covers.  Of those, I found the following to be worth my time reviewing: events, error handling, timing, recursion, reentrancy, shared variables, and file I/O.  So I went into NI's online help, bookmarked the pages relevant to each topic, and then reviewed them in detail.

For example, for timing alone there were several distinct help topics to work through.

I also went through the examples I could find online as well as the example finder in LV itself.  There was quite a bit of information available.

=====================

I'll write one more post on this topic of CLD-R review, probably in another week.  Then I'll hopefully get more current with my posts again.


Monday, May 13, 2013

Prepping for the CLD-R (part 1 of 3)

In my previous post I wrote about refreshing my LabVIEW skills before I passed the CLD-R exam.  This post will talk about one of the things I focused on:

Seminar materials

I attended the LabVIEW Developer Education Day back in March.  This accomplished three things related to my study for the CLD-R:

  1. NI will often give out discounts for training or tests at these events, so that saved me some money.
  2. Attending an all-day seminar sparked my memories in an entirely different way than just reading up on LabVIEW tricks or working through code examples.  I think listening to other people talk about code issues stimulates your brain in a different way.
  3. I got to see presentations worked out in detail.

Number 3 on that list related to something that I had already been doing on and off since the beginning of the year: going through presentation examples.  Just go to NI.com and search for "Developer Days" or "Technical Symposium."  I did it just a few minutes ago, and one of the first items that popped up was the set of presentations from the 2013 Developer Days.  In fact, with some digging you can find presentations going back to at least 2009.  Even better, with some more work you can find the examples that go along with those presentations.

So I did exactly that.  I went dumpster diving into NI's archives, digging around until I found about 4 GB worth of data (including actual videos of some presentations).  Then I read through them all, skipping the duplicates and the presentations aimed at introductory or hardware-specific material.  And I worked through every single example, making sure I understood it.  It helped.

So that's the first of three things I did.  I'll try to post the second one sometime this weekend.


Wednesday, April 17, 2013

Status update (CLD-R passed)

Well, it's been another few months since I've posted here and it will be another couple weeks before I post again, since I'm FINALLY going on vacation.  But at least I know what my next post will be.

I finally took (and passed) the Certified LabVIEW Developer Recertification exam (CLD-R).  I took the original exam three years ago.  After a year or so I debated whether I'd take the recert, branch out to the TestStand exam (since I'd been using TestStand quite a bit in a previous job), or try the LabVIEW Architect exam (CLA).

That decision was made for me by my company.  I received a project that, for a long period of time, required little programming and a lot of other types of test engineering: project management, design work, assembly, debug, and data analysis.  By the time I came up for air well over six months had gone by and my skills were a little rusty.  So I decided to just go for the CLD-R.

I went on a month-long effort to get myself back into programming shape.  My next post will detail the steps I took, in the hopes that it might help someone else with their studying.  It might take more than one post...


Wednesday, January 9, 2013

IT Salary Survey

For anyone who's interested, Information Week is doing a salary survey for 2013.  I won't take it, since I can't really consider myself an IT professional.  But I sometimes look at these things (here or here), so I figured I'd pass it on.

Monday, January 7, 2013

Testing and mobile devices

Below are links for two recent articles.  Both articles discuss mobile devices and testing, but they cover very different subjects: testing apps for mobile devices and using a mobile device for testing.

The first article is sort of an editorial lament/challenge about the issues mobile app developers face.  Dr. Dobb's has a top-rate reputation for covering software development, and I'm not going to try to paraphrase what they published.

The Evaluation Engineering article is something of a mini-survey of software and hardware available for mobile devices (mostly Android and iOS devices).  A couple of these apps I already have on my iPad, but there were some new ones I hadn't seen before.  I particularly liked the spectrum analyzer app and the LogisScope app-and-hardware.  Neat stuff.


Sometimes I bookmark similar articles and read them together when I have time.  After doing that with these two, it struck me how the mobile side of testing - which didn't even exist ten years ago - is incredibly new and cool.  And yet it's still the same discipline, just in a different (and cooler) package.

Friday, January 4, 2013

Test Engineer Resolutions

It's been a few months since I posted anything, though I've started several partial blog posts.  One of my resolutions for 2013 is to be more consistent on finishing up those incomplete ideas.

And since I'm on the subject of new year resolutions, I started thinking about them in a work context.  For the past five years or so I have made a point of tallying a list of personal resolutions for the new year.  They are things I want to do better, topics I want to learn about, new things to try.  Some are very personal, some are work-specific, some are general.  At the end of the year I'll take time to record how I did and create a list for the next year (often with the same or similar resolutions).  My process is number-oriented, which shouldn't surprise anybody.

So when I looked at how I scored for 2012 and what I would resolve for 2013, I thought I should do the same for test engineering in general (as opposed to specific to me).  After thinking about it over the holidays, I came up with a top five list.

  1. Learn more about what I'm testing.  The first pure testing job I had was at HP, running optical test equipment through its paces to find glitches in the firmware.  The hiring manager loved my experience using their hardware (or competitor hardware) in real-world environments.  Partly because of the years I spent doing that, I think the more you know about what you're testing, the better you are as a test engineer.
  2. Avoid "question the test" issues.  A natural response to unexpected or disappointing test data is to question the test system.  You can raise concerns about the accuracy of the system, bugs in the code, the person running it, how the test is performed, etc.  The best way to avoid this is to be proactive with the test systems: discuss issues with design engineers in advance; run regular GR&Rs on the systems; look at the data for your golden samples and see if it makes sense.  And if the issue is raised, don't get offended - just approach it like any other problem to solve.
  3. Data dive once in a while.  If all you do is build/modify/run test systems, then you're missing out on what the data can tell you.  I've talked multiple times about using JMP to analyze data, but regardless of whether you use that, Minitab, or just Excel, it's important to sometimes take a look at the data your systems are generating.  Don't rely on just letting the design engineers analyze the data.  Do it for yourself and you might just learn something interesting (see #1 above, and the sketch after this list).
  4. Plan your career for the year.  You could argue that everyone should do this, and I wouldn't disagree.  Having a plan is important.  As a test engineer, some items to plan would be training classes, certifications (like the CLD-R), and new tools or software to learn.
  5. Socialize.  Again, this one could be generalized for anyone in an office setting.  But test engineers can sometimes get stuck back in the lab, running our systems and analyzing the data, oblivious to changes going on in manufacturing or R&D.  It's good to keep lines of information open, and sometimes going out to lunch with people from other groups helps.
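
Since resolution #3 is the one that's easiest to put off, here's how low the bar can be - a minimal sketch in Python/pandas, where the file name and column names are all hypothetical:

    import pandas as pd

    # Load a pile of test-station results and poke at them.
    df = pd.read_csv("station_results.csv")  # hypothetical log file

    # Quick health checks: yield by fixture, and parameter drift by month.
    print(df.groupby("fixture")["passed"].mean())
    print(df.groupby(df["date"].str[:7])["gain_db"].mean())

Twenty minutes with something like this once a month will surface drifting fixtures and marginal parameters long before a design engineer comes asking about them.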

Wow, that was a long-winded post.  We'll see if I can stick to my resolution of writing more here throughout the year.