April 28, 2011

Interviews of Thought Leaders - Freddy Gustavsson


Having started his career as a software developer, my dear friend and a key member of the Software Testing Space group, Freddy Gustavsson, made the leap into testing in 2001 after realizing that a carefully designed test approach greatly increases the chance of success in any software project. He is interested in all parts of the test process and has experience from several international projects. He works as a consultant for System Verification in Gothenburg, Sweden, where he specializes in test strategy, test design, test automation, test process improvement and test education. Freddy is an ISEB/ISTQB certified test analyst who also teaches courses on software testing and serves as a member of the internal Senior Advisory Board.

Here is the interview. Enjoy and learn.


[Inder] Did you start your career with software testing or have you held other software roles as well?
[Freddy] I actually made somewhat of a detour to reach my current position in software testing. At first I had no intention at all of working in the IT industry. Instead, I wanted to pursue a career as a teacher of languages at the college level. So I studied German linguistics for two years at the university and prepared for some advanced English studies. However, my passion for web development grew stronger, and I decided to make a shift. In 1999 I started working as a web developer for one of Germany's e-business providers. There I learned tons of good stuff about development, projects, teamwork and testing. In 2001 I joined the QA team and became a tester. Later, I completed a three-year university education and a B.Sc. in Software Engineering because I wanted to truly understand my profession. I've also taken complementary courses in project management, usability, accessibility, etc. Additionally, I spent one year teaching programming courses at the university level, which was useful.


[Inder] Not all people appreciate software testing as highly intellectual work; they feel that it is quite easy to test. Can you talk about a few technical challenges that you have faced in your career?
[Freddy] It's a known fact that some employers view testing as an entry point to development. It's as if they told their candidates: "We're sorry, but you're not qualified to play with the developers yet. But let's move you into testing. If you work hard and prove yourself worthy, then some day you might actually be allowed to work in development." Even worse is the situation where employees who were rejected by other disciplines are brought into testing because managers feel that they "do the least harm" there. This approach is just so wrong.
Testing is by no means a trivial task. Anyone who has worked as a professional tester knows this. However, not all people coming from other disciplines will understand testing well enough to realize its challenges. They might think of testing as "happy testing" or "randomly clicking around", not realizing that testing requires a controlled process. The presence of structure is vital to our job. At the core of the tester's job is the comparison of actual and expected results. However, this is surrounded by numerous items such as plans, strategies, procedures, methods and tools. Designing and implementing all of this in an effective and efficient way requires excellent business, technical, administrative and social skills.
Common technical challenges include quickly learning new applications and tools and understanding the environment in which each tool operates. Sometimes you also need to learn new protocols or scripting languages, or get an understanding of complex system architectures. This is part of the tester's job. Being curious and eager to learn new things is definitely helpful.


[Inder] Looking back at your software testing career, what have been some of the highlights?
[Freddy] The first highlight would be the recognition of testing as a profession. The second would be the insight that some of my carefully crafted test cases were actually detecting failures. The developers would lovingly refer to me as The Merciless Tester. Their indignation over each found problem would not last long. Instead they asked me to do more testing on their code. Although I have occasionally, and unintentionally, upset a few people through my work, in most cases the work with other disciplines has been both interesting and rewarding. The third highlight would be the move into consulting and a focus on test strategy, which might be the best job ever. Number four is the mentoring and educative role where I get to spread the word and (hopefully) make other people interested in learning more about our craft. Highlight number five would doubtless be this interview. ;-)


[Inder] Do you find testing enjoyable? How can software testing professionals derive more satisfaction from their craft?
[Freddy] Yes. Just like for other professionals, there are a number of things that motivate me to do a great job. The most important is probably the feeling of making a valuable contribution to the client, and sometimes also to the community. For instance, a few years ago I was on the system test team for the national Swedish command and control system. The system was built for operation in emergency situations where ambulances, police or fire brigades might be needed. Any failure in the software might have catastrophic (life or death) consequences. In that case, every defect found and removed would benefit the whole community. As a tester you knew your work made a difference.
Having a good, respectful working relationship with coworkers and managers also makes it to the top of my list. Another important thing is the possibility to control your job, to be able to suggest ideas and improvements.


[Inder] What are some good resources for people wanting to enter software testing or establish themselves in this career? What activities can one do in their spare time to enhance their testing skills?
[Freddy] I want to refer to a good seminar on this topic, which I attended at the EuroSTAR 2010 conference in Copenhagen. Markus Gärtner from Germany gave a presentation on alternative paths for self-education in software testing. A number of options were presented:
  • Using social media (e.g. Twitter, LinkedIn, web sites, forums)
  • Learning to program (e.g. scripting languages, design patterns)
  • Reading books on testing
  • Joining online testing courses
  • Participating in testing challenges
  • Participating in organized problem solving activities like testing dojos, weekend testing or the Miagi-Do school of testing.

[Inder] What are your future career plans?
[Freddy] I look forward to an exciting future in which software testing will undoubtedly play an important role. Personally, while keeping an eye on the entire field of testing, I plan to specialize further in the areas that interest me most: test strategies, test process improvement and test design methodologies. I also plan to extend my teaching assignments in the future, since lecturing is a great way to spread knowledge about testing while learning a lot myself. As they say: you learn as long as you teach.

April 23, 2011

Team productivity - How not to bring it down?

If you are fortunate enough to lead or manage a team of software test engineers, consider it a valuable responsibility. Check whether you find yourself adopting any of the following tactics. If you do, you can be assured that your team is working far below its potential. And that would not be the only problem you face. I am going to refer to the hypothetical Lead or Manager as LM in my examples below.

1. Poor communication
The LM knows about the incoming project or test run. She has been involved in planning and meetings with other stakeholders and knows a lot about it. But the quantity of the information passed to the team members is so low, or the quality so poor, that they get stuck at every step. They make assumptions and their work derails. Unless, of course, they check with her at each step, which takes their time away from project work.
A variant of poor communication is an overdose of information. The LM floods the team members with documents shortly before the start of the project. Worse, the info flood takes place just as the project starts. Now a team member has to make one of two poor choices - stick to the schedule unprepared OR prepare and lag behind schedule.

2. Stalling
The LM is too process-oriented. She just loves checklists. A team member can consider himself successful only if he completes each of the 1,000 tasks in the checklist successfully. Did I mention that this is in addition to his testing tasks?

3. Unnatural competition
The LM promotes competition among the team members. Who writes test cases fastest? Who creates bug reports that developers always agree with? Who provides the LM the test results in the exact format that she likes? All this means that not only does each team member have to do his own work; he also needs to watch what the others are doing. Nobody wants to be the last one, so the team members take short-cuts. Project work suffers.

4. Back-stabbing
The LM talks sweetly. After receiving so much appreciation (verbal and written - but always one-to-one) from her, a team member can never suspect the true feelings the LM has bottled up inside. These feelings are released in the reports the LM gives to her manager, other managers, customers and HR. The team member does come to know about the real feedback provided by the LM, but only when the damage is already done.

5. Feedback given unequally
Her praise is concise and given in private. Reprimands are public affairs listing the mistakes committed in the past, the current lapse and hopelessness for the future. Needless to say, such reprimands make the team member quite uncomfortable and unsure of how to look his co-workers in the eye again.
A variant is when the LM provides no feedback. The team member has no idea whether he is performing well or poorly. He does not know what is around the corner. He does not know what he will be doing tomorrow.

6. False promises
The LM says anything to get the work done. It works, but only the first few times. In time, the team member is disillusioned.

7. Taking undue credit
The team members take the difficult project as a challenge. They work professionally and very hard. They surmount the obstacles. Even though tired, they don't cut corners. The LM takes all the credit for her "leadership" in time of need.

8. Scape-goating
The LM schemes. She already knows the personalities of her team members. She creates a Plan B (and a Plan C) in case things go south. When they do, she lets one problem develop, gathers substantial "data" incriminating a chosen team member and dumps the entire situation on his head.

9. Unwillingness to change
The LM is aware of the negative effects some of her actions have on the team. But it has worked for her in the past, and she sees no incentive to change. So the team member either makes the necessary mental adjustments, if he is able to work with her, or starts looking elsewhere in time.

Final words - As human beings, we tend to make the best possible choice. As I mentioned above, if you lead or manage a team, you have a big responsibility. Keep in mind that your actions affect not only yourself but also your team members. Take the correct action, even if it causes pain in the short term and is difficult. It will benefit both you and your team.

April 14, 2011

What is software regression?

Things are not as good as they used to be.

Before one can do informed regression testing, it is important to understand software regression, which can happen after an event that changes the system. Software regression is deterioration in the software. Such decay can be functional, meaning one or more functions that worked earlier no longer do so. Or it can be non-functional; for example, the software becomes slower, produces less output or becomes (more) vulnerable to security threats.
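
As a minimal illustration, consider the hedged Python sketch below (the module, function and timing budget are invented for the example). A regression check simply re-runs something that used to work and fails if the behavior has deteriorated, functionally or non-functionally:

  # Hypothetical sketch of regression checks: re-run earlier checks after a
  # change and fail if the software has deteriorated.
  import time
  import unittest

  from invoicing import calculate_total  # invented module under test


  class RegressionTests(unittest.TestCase):
      def test_total_is_still_correct(self):
          # Functional regression: a result that was correct in the previous
          # release must still be correct after the change.
          self.assertAlmostEqual(calculate_total(items=3, unit_price=10.0), 30.0)

      def test_total_is_still_fast_enough(self):
          # Non-functional regression: the call must not have become slower
          # than an agreed budget (an assumed 50 milliseconds here).
          start = time.perf_counter()
          calculate_total(items=3, unit_price=10.0)
          self.assertLess(time.perf_counter() - start, 0.05)


  if __name__ == "__main__":
      unittest.main()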

April 11, 2011

How to do end to end exhaustive testing?

Testing a software application exhaustively (except maybe a very simple program a few lines long) may well be an impossible task due to the sheer number of the following (a small sketch after this list shows how quickly the combinations explode):

1. All possible inputs
2. All possible input validations
3. All possible logic paths within the application
4. All possible outputs
5. All possible sequences of operations
6. All possible sequences of workflows
7. All possible speeds of execution
And the above is just with a single user
8. All combinations of types of users
9. All possible number of users
10. All possible lengths of time each user may operate the application
And so on (we have not even touched the types of test environments on which the tests could be run).
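
To get a feel for the numbers, here is a small, hypothetical Python sketch (the form fields and value counts are made up) that counts the input combinations of a single, tiny screen:

  # Hypothetical sketch: count the raw input combinations of one small form
  # to see why exhaustive testing explodes before users, sequences and
  # environments are even considered.
  from math import prod

  field_value_counts = {
      "country": 200,       # assumed number of selectable countries
      "age": 120,           # assumed valid ages 0-119
      "newsletter": 2,      # yes/no
      "comment": 26 ** 10,  # only 10-letter lowercase comments!
  }

  combinations = prod(field_value_counts.values())
  print(f"Input combinations for one screen: {combinations:,}")
  # Roughly 6.8e18 combinations - for a single screen, a single user and no
  # sequences of operations at all.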

However, it is possible to exhaustively execute your test suite using the following tips (a small sketch after the list illustrates tips 1, 2 and 6):

1. Your test suite should have test cases covering each documented requirement. Here my assumption is that each requirement is documented clearly.
2. The test cases should be specific, concise and efficient. Each test case should have clear and unambiguous steps and expected results.
3. The configuration data, input test data and output test data should be clearly specified.
4. You should have a clean and stable test environment in which to execute your test suite.
5. In a perfectly working application, it should be possible to execute each test case in the suite.
6. Each confirmed bug (found during testing or found by the client) should result in a new test case being written or an existing test case being updated.
7. Important: You should not assume the correctness and completeness of your test suite on your own. Review of the test suite by peers, business people, managers, clients and users may provide you valuable input to correct it.
8. Discipline in maintaining your test suite and executing it will go a long way in preventing bugs from leaking to the clients/users of your application.
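
As an illustration of tips 1, 2 and 6, here is a hedged pytest-style sketch (the requirement IDs, bug ID, module and function names are all invented) showing how each test case can be traced to a documented requirement and how a confirmed bug turns into a permanent test case:

  # Hypothetical sketch: trace each test case to a documented requirement
  # (tip 1), keep the expected results specific and unambiguous (tip 2), and
  # turn each confirmed bug into a test case (tip 6).
  import pytest

  from login import authenticate  # invented module under test


  def test_req_01_valid_credentials_allow_login():
      """REQ-01: A registered user with a correct password can log in."""
      assert authenticate("alice", "correct-password") is True


  def test_req_02_invalid_password_is_rejected():
      """REQ-02: A wrong password must be rejected."""
      assert authenticate("alice", "wrong-password") is False


  def test_bug_1234_empty_password_must_not_crash():
      """BUG-1234 (confirmed by the client): an empty password used to raise
      an unhandled exception; it must simply be rejected."""
      assert authenticate("alice", "") is False


  if __name__ == "__main__":
      pytest.main([__file__])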

April 09, 2011

Conventional wisdom applied to software testing

Here are some conventional English proverbs as I see them applied to the field of software testing:

1. A journey of a 1000 miles begins with a single step.
A test run of a 1000 test cases begins with a sanity test.

2. Genius is 10% inspiration and 90% perspiration.
Software testing is 10% test preparation and 90% test execution.

3. As you make your bed, so you must lie in it.
As you write your test cases (good or poor; complete or incomplete), so you must execute them.

4. A bird in the hand is worth two in the bush.
A confirmed bug is worth two suspects.

5. A chain is no stronger than its weakest link.
The quality of an application is no better than that of its most buggy component.

6. Discretion is the better part of valor.
Review is the better part of executing a task.

7. Empty vessels make the loudest noise.
People who least understand software testing speak of it in the most black-and-white terms.

8. An ounce of prevention is worth a pound of cure.
One bug fixed in the beginning is worth 16 updates.

9. A penny saved is a penny gained.
One test case re-used is a test case written.

10. Where there's a will, there's a way.
Where there's an application, there are bugs yet to be found.

Hope you enjoyed these :). If you like any other English saying, let me know. I will translate it to software testing.

April 03, 2011

Checklist for application release notes

It is the norm to distribute release notes along with a software release. Software release notes are similar to the product literature that you get when you buy a physical product. You may vary the contents depending on whether the software release is a major release or a minor one, an upgrade to the application or just an update, customized for a specific customer or available generally. Answer the following questions to test the release notes that you receive before software testing begins. A small sketch after the checklist shows how part of this review can be automated.

Main information
  1. Does it mention the application name and the correct version number?
  2. Does it correctly list the features released and/or bugs fixed in the software?
  3. Does it mention the correct software vendor name and date of the release?
  4. Does it mention the release numbers it requires as prerequisite or the release numbers it supersedes?
  5. Does it mention each of the available distribution modes (e.g. website download, email and disc)?
  6. Does it mention the correct distributed media (e.g. install package, executable programs or scripts) in which the software release exists?

Release feature information
  1. Does it explain each new feature (summary, benefits and details of how it works)?
  2. Does it explain each bug fixed in this release?

Supporting information
  1. Does it list the minimum and recommended hardware, software and network requirements to deploy and use the release?
  2. Does it provide steps to backup the existing configuration and user data in the system?
  3. Does it provide detailed steps to install the release?
  4. Does it provide steps to restore the configuration and user data of the previous release?
  5. Does it provide the contact channels in case the software does not install or installs with problems?

Others
  1. Are the release notes available in a file format that can be viewed by the customer?
  2. Is the file format consistent with prior releases?
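
Parts of this checklist can also be automated. Below is a small, hedged Python sketch (the expected section names are assumptions for illustration, not a standard) that flags missing sections in a plain-text release notes file:

  # Hypothetical sketch: report which expected sections are missing from a
  # plain-text release notes file. The section names are illustrative only.
  import sys

  EXPECTED_SECTIONS = [
      "Application name",
      "Version",
      "Release date",
      "New features",
      "Bugs fixed",
      "System requirements",
      "Installation steps",
      "Support contacts",
  ]


  def missing_sections(path):
      """Return the expected sections that do not appear in the file."""
      with open(path, encoding="utf-8") as f:
          text = f.read().lower()
      return [s for s in EXPECTED_SECTIONS if s.lower() not in text]


  if __name__ == "__main__":
      missing = missing_sections(sys.argv[1])
      if missing:
          print("Check these sections manually; they were not found:", ", ".join(missing))
      else:
          print("All expected sections found.")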