Wednesday, September 22, 2010

JSON Test Tools

I'm working on a piece concerning JSON testing so I thought I'd put together a list of tools that I'm going to use in the piece. Here they are...


Fiddler - Fiddler is a web debugging proxy that captures HTTP traffic and allows you to inspect, analyze, and manipulate data going back and forth between browsers and servers.
FireBug - FireBug, a FireFox addon, is essential for anyone doing web development. It is, among other things, the best JavaScript debugger available, and it's free.
JSONovich - JSONovich is also a FireFox addon that allows you to view formatted JSON in the browser.
JSON Diff - JSON Diff is a web app that allows you to perform diffs on JSON - it works great!!

The piece I am writing will demonstrate the use of all these tools to do data integrity tests on JSON responses. Stay tuned...
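As a taste of what a data integrity test on a JSON response might look like, here is a minimal sketch in plain Python - not tied to any of the tools above, and the function and field names are just illustrative - that recursively diffs an expected JSON document against an actual response, in the spirit of what JSON Diff does in the browser:

```python
import json

def json_diff(expected, actual, path="$"):
    """Recursively compare two parsed JSON values; return a list of
    human-readable differences (an empty list means they match)."""
    diffs = []
    if type(expected) is not type(actual):
        diffs.append(f"{path}: type {type(expected).__name__} != {type(actual).__name__}")
    elif isinstance(expected, dict):
        # Walk the union of keys so missing and unexpected keys both show up.
        for key in expected.keys() | actual.keys():
            if key not in actual:
                diffs.append(f"{path}.{key}: missing in actual")
            elif key not in expected:
                diffs.append(f"{path}.{key}: unexpected key in actual")
            else:
                diffs.extend(json_diff(expected[key], actual[key], f"{path}.{key}"))
    elif isinstance(expected, list):
        if len(expected) != len(actual):
            diffs.append(f"{path}: length {len(expected)} != {len(actual)}")
        for i, (e, a) in enumerate(zip(expected, actual)):
            diffs.extend(json_diff(e, a, f"{path}[{i}]"))
    elif expected != actual:
        diffs.append(f"{path}: {expected!r} != {actual!r}")
    return diffs

# Hypothetical expected vs. actual API responses for illustration.
expected = json.loads('{"id": 7, "tags": ["a", "b"], "name": "widget"}')
actual   = json.loads('{"id": 7, "tags": ["a", "c"], "name": "gadget"}')
for d in json_diff(expected, actual):
    print(d)
```

In practice you would capture the actual response with a proxy like Fiddler and keep the expected document under version control alongside your tests.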

Golden Scenarios

Hello, it's been some time since I posted... I had a piece published on the quick testing tips site on golden scenarios. I'm interested in your thoughts on how to accelerate test design. Email me with your suggestions at cruisinqa@gmail.com

Wednesday, August 12, 2009

New view on exploratory testing

In the past I was reluctant to embrace exploratory testing as a means to test a product during development. At the time, all I knew was the strict waterfall environment I worked in at a major financial firm.

Now I find myself in a completely different environment (agile) and responsible for the quality of a complete rewrite of a major web application. I do not see any other way for my team to be successful in driving quality than to do major exploratory-type testing along with automation.

I am experimenting with different ways to combine structure with exploratory-type testing. That is what I will be blogging about for at least the next few posts.

I am looking at a few sites right now to gauge what others are doing in the field such as Mike Kelly's series on session-based testing and Jonathan Kohl's site for ideas on how to manage this type of testing. Stay tuned...

Sunday, December 14, 2008

BBST and me

I haven’t updated this blog in a while but with good reason. I have been sharpening my testing skills by participating in Black Box Software Testing (BBST) online course offered through the Association for Software Testing.

Cem Kaner is the lead instructor and main developer of the course. The BBST Foundations course is an introduction to the BBST series of courses that will eventually be offered through an online format.

If you want to have a consciousness-raising experience in the area of software testing you should take the BBST Foundations course. If you want to be challenged like you have never been challenged before by any other structured learning experience you should take the BBST Foundations course.

The subjects covered seem at first glance easy and maybe rudimentary. Now that I have completed the course I have a new appreciation of the richness of the material covered. I hope to express my thoughts about the material over the next few posts.

I will caution you that this is a serious course that requires a substantial time commitment. For me it was very difficult to engage fully during the week. I had to schedule a few hours away from my day job to finish key assignments. My point is that this course does take time (they recommend 8 hours a week) and the deadlines are unforgiving. The instructors spend as much time, if not more, administering the course, so they expect students to put in the necessary time.

In my next post I will address the first topic of the BBST where the learning objectives and key terms are discussed.

In the meantime, sign up for the BBST if you have the time and want to participate in this exciting, new, and innovative learning endeavor.

Saturday, November 1, 2008

Resource for effective writing

Written communication is an especially important skill for any software tester. Most of us know good writing when we see it but probably haven't thought much about what its elements are. Two podcasts from the Manager Tools site contain actionable information to help you write better.

Write More Effectively Part 1
Write More Effectively Part 2

Tuesday, October 28, 2008

Making assumptions based on incomplete information

Making assumptions and forming conclusions based on those assumptions can lead to wasted time and effort on the part of your project team. Anytime you are ready to form a conclusion about a cause and effect, ask yourself: do I have all of the information I need to be 100% sure that this is the correct conclusion? If the answer is no, then ask yourself how sure you are percentage-wise: 50-60%, 70-80%, 90-95%? If you are about 90% sure, that may not be enough. Ask yourself, what will it take to be as close to 100% sure as I can possibly be? Is there someone I need to contact? Is there another test I need to run? Very often there is another test you can run, but the effort to set it up may not be worth the additional information gained. You need to think about the relative cost versus benefit and make a decision based on testing intuition. Below is a story about an assumption I made and how it cost me extra time and effort to get my TV remote control to work.

One Sunday afternoon my wife and I were watching a movie. She got up to go to the kitchen. I pressed pause on the DVD remote and decided to switch to the TV to see the score of a basketball game. I hit the "Input" button on the TV remote, but the DVD was still showing on the screen. I hit the button again and it still didn't switch to the regular TV. I then tried hitting other buttons on the remote, attempting to adjust the volume and change the channels, and still got no response. I tried all of the other buttons and none of them worked. I decided (without complete information) that the remote was the source of the problem. So I took the batteries out and put them back in, then tried the remote; that didn't work. I swapped the batteries and tried again; this time the volume adjusted but the rest of the buttons didn't work. I decided it must be the batteries, so I got up (notice my goal was not to get up), went to the kitchen, found some new batteries, and tried the remote again. The remote still didn't work. Now I determined it wasn't the remote at all, so I investigated the TV and discovered that my daughter's toy was covering the TV's infrared sensor. I felt pretty silly, because I had a funny feeling it wasn't the remote, but all of the information I had gathered led me to the conclusion that it was - new information led me to the correct conclusion.

How many of us have been burned by this tendency to assume without complete information? How many times have we wasted our time, or worse - someone else's time - chasing solutions to problems that didn't really exist? My feeling is that software testing is an adventure that is made easier by paying close attention to details and to our tendency to buy into false assumptions. By having a mindset of skepticism about the assumptions behind the cause and effect of application behavior, we can reduce time spent doing non-value-added tasks and greatly increase our test effectiveness.