Sunday, December 14, 2008

BBST and me

I haven’t updated this blog in a while, but with good reason. I have been sharpening my testing skills by participating in the Black Box Software Testing (BBST) online course offered through the Association for Software Testing.

Cem Kaner is the lead instructor and main developer of the course. The BBST Foundations course is an introduction to the BBST series of courses that will eventually be offered through an online format.

If you want to have a consciousness-raising experience in the area of software testing you should take the BBST Foundations course. If you want to be challenged like you have never been challenged before by any other structured learning experience you should take the BBST Foundations course.

The subjects covered seem at first glance easy and maybe rudimentary. Now that I have completed the course I have a new appreciation of the richness of the material covered. I hope to express my thoughts about the material over the next few posts.

I will caution you that this is a serious course that requires a substantial time commitment. For me it was very difficult to engage fully during the week; I had to schedule a few hours away from my day job to finish key assignments. My point is that this course takes time (they recommend eight hours a week) and the deadlines are unforgiving. The instructors spend as much time administering the course, if not more, so they expect students to put in the necessary time.

In my next post I will address the first topic of the BBST course, where the learning objectives and key terms are discussed.

In the meantime, sign up for the BBST if you have the time and want to participate in this exciting, new, and innovative learning endeavor.

Saturday, November 1, 2008

Resource for effective writing

Written communication is an especially important skill for any software tester. Most of us know good writing when we see it but probably haven't thought much about the elements of good writing. Two podcasts from the Manager Tools site contain actionable information to help you write better.

Write More Effectively Part 1
Write More Effectively Part 2

Tuesday, October 28, 2008

Making assumptions based on incomplete information

Making assumptions and forming conclusions based on those assumptions can lead to wasted time and effort on the part of your project team. Anytime you are ready to form a conclusion about a cause and effect, ask yourself: do I have all of the information I need to be 100% sure that this is the correct conclusion? If the answer is no, then ask yourself how sure you are percentage-wise: 50-60%, 70-80%, 90-95%? If you are about 90% sure, that may not be enough. Ask yourself what it will take to get as close to 100% sure as you possibly can. Is there someone you need to contact? Is there another test you need to run? Very often there is another test you can run, but the effort to set it up may not be worth the additional information gained. You need to weigh the relative cost against the benefit and make a decision based on testing intuition. Below is a story about an assumption I made and how it cost me extra time and effort to get my TV remote control to work.
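That cost-versus-benefit reasoning can be sketched as a back-of-the-envelope calculation. This is just an illustration of the idea above, not a formula from any testing methodology, and all of the numbers are made up:

```python
# A rough sketch of the cost/benefit reasoning: run the extra test only if
# the expected cost of a wrong conclusion exceeds the cost of the test.
def worth_running(confidence, cost_if_wrong, cost_of_test):
    """Return True if the expected loss from being wrong exceeds the test's cost.

    confidence   -- how sure you are of your conclusion (0.0 to 1.0)
    cost_if_wrong -- hours the team would waste if the conclusion is wrong
    cost_of_test  -- hours needed to set up and run the extra test
    """
    expected_loss = (1 - confidence) * cost_if_wrong
    return expected_loss > cost_of_test

# 90% sure, being wrong wastes 8 hours of team time, the extra test takes
# half an hour: the expected loss (~0.8 hours) beats the test's cost.
print(worth_running(0.90, 8.0, 0.5))
```

At 99% confidence the same test would no longer be worth the half hour, which matches the intuition that "90% sure may not be enough" depends on what is at stake.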

One Sunday afternoon my wife and I were watching a movie. She got up to go to the kitchen. I pressed pause on the DVD remote and decided to switch to the TV to see the score of a basketball game. I hit the "Input" button on the TV remote, but the DVD was still showing on the screen. I hit the button again and it still didn't switch to the regular TV. I then tried other buttons on the remote, attempting to adjust the volume and change the channels, and still got no response. None of the buttons worked. I decided (without complete information) that the remote was the source of the problem. So I took the batteries out, put them back in, and tried the remote again; that didn't work. I swapped the batteries around and tried again; this time the volume adjusted, but the rest of the buttons didn't work. I decided it must be the batteries, so I got up (notice my goal was not to get up), went to the kitchen, found some new batteries, and tried the remote again. It still didn't work. Now I determined it wasn't the remote at all, so I investigated the TV and discovered that my daughter’s toy was covering the TV's infrared sensor. I felt pretty silly, because I had a funny feeling it wasn't the remote, but all of the information I had gathered led me to the conclusion that it was. New information led me to the correct conclusion.

How many of us have been burned by this tendency to assume without complete information? How many times have we wasted our own time, or worse, someone else's, chasing solutions to problems that didn't really exist? My feeling is that software testing is an adventure made easier by paying close attention to details and to our tendency to buy into false assumptions. By maintaining a mindset of skepticism about the assumptions behind the cause and effect of application behavior, we can reduce time spent on non-value-added tasks and dramatically increase our test effectiveness.

Sunday, October 26, 2008

The future of software testing

I have been reading JW on Test since I learned that James Whittaker had started to blog. In his post titled "the future of software testing (part 8)", he states that "test code should ship with the binary". That is an interesting concept, and it relates directly to an idea I've been working on that I call "On Demand Regression Testing". Shipping test code would be fine for new code, but what about legacy code?

Recently, at the IWST, I heard a very interesting quote in a presentation on Test-Driven Development given by Anthony Panozzo. The quote is from Michael Feathers's book "Working Effectively with Legacy Code": it basically states that legacy code is code without tests. I would say that legacy code is code without tests and/or documentation, but that is essentially the same idea.

In my current position I deal mainly with legacy code. That is, code written over a period of decades with little documentation and certainly no tests. Sure, it was tested when it was originally installed, but those tests were not kept as a corporate asset, and now they are gone.

To regression test this type of code, we start with a functional decomposition of the particular function we are testing. Then we look at the inputs, process, and outputs within the function. We create a logic table for the inputs and also examine any interfaces that the function may depend on. While we are doing this software engineering analysis, we try to understand the business process and the types of users involved. We document our findings and publish them along with our tests, which generally run automatically.
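A logic table of the kind described above can be built mechanically once the inputs are identified. Here is a minimal sketch in Python; the discount function, its input names, and its rules are all invented for illustration, not taken from any real system:

```python
from itertools import product

# Hypothetical function under test, decomposed into inputs, process, and
# output: two boolean inputs, one numeric output (the discount rate).
def discount(is_member, order_over_100):
    """Return the discount rate for an order (invented business rules)."""
    if is_member and order_over_100:
        return 0.15
    if is_member:
        return 0.10
    if order_over_100:
        return 0.05
    return 0.0

# Build the logic table: every combination of the boolean inputs,
# paired with the output the function actually produces.
logic_table = [
    (member, over_100, discount(member, over_100))
    for member, over_100 in product([False, True], repeat=2)
]

for row in logic_table:
    print(row)
```

The point of the table is that it makes the expected behavior explicit and reviewable; once the rows are documented, each one becomes a regression test case.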

The main roadblock to "On Demand Regression Testing" is the data. If we can find a way to create data with certain characteristics and use it in our automated suite, we will be much further along. We would then be able to wrap our application with tests rather than embed tests within our code; we don't have the skills, nor does the organization have the will, to do the latter, at least for now.
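Creating data "based on certain characteristics" might look something like the following sketch. The record fields, value pools, and characteristics are all hypothetical placeholders; the idea is simply that a suite can pin down the characteristics it cares about and randomize the rest, with a fixed seed so runs are repeatable:

```python
import random

def make_record(rng, member=None, region=None):
    """Generate one customer record; unspecified characteristics are randomized."""
    return {
        "member": rng.choice([True, False]) if member is None else member,
        "region": rng.choice(["north", "south", "east", "west"]) if region is None else region,
        "order_total": round(rng.uniform(5.0, 500.0), 2),
    }

# Fixed seed so the automated suite sees the same data on every run.
rng = random.Random(42)

# Ask for ten member records in the "north" region for a targeted regression run.
data = [make_record(rng, member=True, region="north") for _ in range(10)]
```

Generating data this way, instead of copying production data, also sidesteps the problem of production records drifting out from under the tests.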

September 2008 IWST - Testers who write code

I attended the Indianapolis Workshops on Software Testing in September along with a colleague of mine. We found the experience to be very worthwhile and informative. There were a good number of software engineering/testing professionals and it was great to see what they were doing in their shops. I gave a presentation on integrating HP Quick Test Pro and Empirix Hammer applications. Mike Kelly blogged about it here. The notes from the presentation can be found here.

Saturday, October 25, 2008

First post

The purpose of this blog is to provide information about software engineering, project management, and testing, and how to do these activities in a more efficient and effective manner. I will share my experiences, strategies, and philosophies, and a few stories along the way. The title of the blog represents my feeling of how software projects should be viewed. Like a cruise through the Northern California Wine Country, software testing should not be a grind; it should be enjoyable, even when it is a struggle. To do it effectively and consistently you need the right tools. I am not talking about automated software testing tools, either. I am talking about tools such as the proper frame of mind, environment, experience, knowledge, and support. These are the tools I hope to help you develop so that we can all build better software and have fun doing it.