Monday, 17 May 2010

Bangalore Testers meet-up (15-May-2010)

I thought I would write a brief blog post about a meet-up of testers that took place whilst I was in Bangalore, India.

It started with me sending out a tweet asking if any testers in Bangalore were interested in meeting up at the weekend. After a short while Santhosh Tuppad (@santhoshst) responded by putting an invite on his blog site (–-may-15th-2010/), and all of a sudden there was a flurry of activity on Twitter as people asked about when and where.

So on the Saturday I made my way down to the Forum and met up with seven testers from Bangalore.

Pradeep Soundararajan (@testertested)
Ajay Balamurugadas (@ajay184f)
Dhananjay Kumar (@dhantester)
Santhosh Tuppad (@santhoshst)
+ two others – if you remember who they are, please add their details as a comment.

We had an agenda, or at least some points to talk about:

  • Should we record our emotions when testing?
  • Exploratory testing – good and bad points – how can we solve the bad parts?
  • Measuring testing quality – how?

Whether we kept to these themes was a different matter.

After some brief introductions everyone was keen to know about me. I explained that I had been in IT for over 24 years – longer than some of those attending have been alive! Pradeep pointed out that I did not look that old!

We then started to discuss some testing issues and the following is what I remember of the conversations that took place. I am sure some of the others who attended will correct any details I have got wrong :o)

One testing issue we looked at was teams being managed remotely, where managers are not in the same location as the testing team. Pradeep said this becomes a problem because managers want a breakdown of the testing activity all the time, and the testing team spend a great deal of time answering questions from managers about what they are testing.

My thought on this was that anything that actually stops a tester from testing should be recorded and reported. Then, at the end of each test phase/cycle, when reporting back to management, if a lot of time has been spent reporting test activities to managers rather than testing, this can be clearly seen, and a ‘good’ manager will try to rectify it. One idea suggested was that managers should become more hands-on and take part in the testing activities; the exploratory testing approach makes this very easy to do.
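One lightweight way to make this visible is to keep a simple log of each interruption with its cause and duration, then summarise the time lost per cause at the end of a cycle. Here is a minimal sketch in Python; the causes and numbers are invented for illustration, not anything we agreed at the meet-up:

```python
from collections import defaultdict

# Each entry: (cause, minutes lost). In practice this could come from a
# simple text log the tester keeps during the day.
interruptions = [
    ("status question from manager", 20),
    ("environment down", 45),
    ("status question from manager", 15),
    ("waiting for build", 30),
]

def time_lost_by_cause(entries):
    """Total the minutes lost to each cause of not-testing."""
    totals = defaultdict(int)
    for cause, minutes in entries:
        totals[cause] += minutes
    # Sort so the biggest drain on testing time comes first.
    return sorted(totals.items(), key=lambda item: -item[1])

for cause, minutes in time_lost_by_cause(interruptions):
    print(f"{cause}: {minutes} min")
```

A summary like this, shown at the end of a cycle, lets a manager see at a glance how much testing time their own questions are consuming.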

There was a lot of interest in reporting what is stopping testing rather than anything else – this led on to a discussion of measuring testing and testing quality. This is always a difficult question and I am not sure we fully answered it in our discussions. I raised the question of whether, when testing, we should record our feelings and emotions about what we are testing. I suggested we use a happy/sad face system: when we feel sad about testing we record a sad face, when we feel angry an angry face, and so on. It would then be very easy to see a trend for an area under test. If there were lots of angry faces then someone would spot the trend and start asking the testers why they feel this way. Pradeep pointed out that developers may try to influence the testers to always record happy faces. I stated that testers should still be independent of developers, and that while we should support and help developers, we would not tell them how to write an algorithm – so why should they tell us how to test?
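To make the face idea concrete, here is a minimal sketch (my own illustration; the log entries, labels, and threshold are arbitrary assumptions) that counts the recorded faces per area under test and flags areas where angry faces dominate:

```python
from collections import Counter

# Hypothetical log of (area under test, recorded emotion) entries.
session_log = [
    ("login", "happy"), ("login", "happy"),
    ("payments", "angry"), ("payments", "angry"),
    ("payments", "sad"), ("reports", "happy"),
]

def flag_trends(log, threshold=0.5):
    """Return areas where at least `threshold` of the recorded faces are angry."""
    per_area = {}
    for area, emotion in log:
        per_area.setdefault(area, Counter())[emotion] += 1
    flagged = []
    for area, counts in per_area.items():
        total = sum(counts.values())
        if counts["angry"] / total >= threshold:
            flagged.append(area)
    return flagged

print(flag_trends(session_log))  # "payments" dominates with 2 of 3 faces angry
```

Even something this crude would surface the trend; the harder part, as Pradeep's objection shows, is keeping the recorded faces honest.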

On the subject of measuring testing the discussion revolved around looking for trends rather than looking at numbers. The numbers could still be wrong but that is where talking helps to understand what the numbers and trends are saying.

Another point was made about using tech support people as a resource for testers. I posted this on Twitter as a question and got the following responses (they may not be in time/date order):

testingqa: @steveo1967 Tech support people hold some of the skills, but I wouldn't immediately say it implies they will be a good tester too #testing

testingqa @QualityFrog @michaelbolton My 1st role in IT was Tech Support role,those I worked alongside were slack & passed along cases for me to solve

testingqa @QualityFrog @michaelbolton So my faith in Tech Support staff isn't great ;) But I do agree that a good tech support person can hold an... appropriate background useful for testing, but even then Tech Support doesn't require an eye for detail often.

qualityfrog @michaelbolton @testingqa I've seen good ppl move fr tech supt 2 testing & vice-versa. there r similarities in foundation in what makes good

qualityfrog @michaelbolton @testingqa And, sadly, some in tech support are about as astute at faking support as some testers are in faking testing.

michaelbolton @testingqa Not automatic, but the skills and experience of support greatly overlap those for #testing. Great foundation, I'd argue.

michaelbolton @QualityFrog @testingqa I took *good* tech support person as read. Your pessimistic interpretation is also valid, alas.

So even here there are two sides to the argument, but it appears that ‘good’ tech support people could be a valuable resource for testing. Maybe I am a little biased on this, since my background is technical support, and I believe that if you have a good grounding in how users think, plus technical knowledge (preferably at a coding level), then you have the right attributes to look at becoming a tester.

It was interesting, whilst talking to the group (and I was aware I was doing a lot of the talking), how much interest was shown in the subjects we were discussing, and more importantly how much passion everyone at the meeting had for testing. I could see in everyone’s faces a passion for learning and understanding: key attributes of ‘excellent’ testers. I tweeted about meeting the future teachers of software testing, and I still believe that these people hold the key to the next 10 to 20 years of software testing.

The next subject that came up was certification. I explained that on the way to the office the other day I had seen a sign that said “Learn software testing in two weeks”, and everyone in the group laughed. We talked about the fact that software testing is not like programming, where once you have learnt the foundations of the language you have the building blocks to create code. Software testing is not just computer science but a mixture of many disciplines, including psychology and philosophy. It is not an exact science and there is plenty of scope to get it very wrong, as we have all seen. The question of certification seems a very interesting one considering the latest blog posts coming from James Bach and Stuart Reid.

I do not understand the need why fellow professionals need to attack each others ideas, it is getting very political. (This is just my own personal viewpoint and nothing to do with the meet-up.) I believe in the need to debate and discuss ideas and opinions, but when it appears that someone is personally attacking someone else it looks like a personality clash and distracts from a meaningful debate on the issues. I really want to attend EuroSTAR now just to hear the debates.

This leads nicely back to the certification discussion at the meet-up. We all agreed that it appeared to be a money-making scheme that gave no benefit to experienced testers. The concern of the group was that agencies would now demand that testers have this certification, and that people would take the exam worried that if they did not they would not get a job in testing. The group felt this was very wrong: to be pressured into taking an exam because there is no other measurable way to prove you are an ‘excellent’ tester. As a group we came up with some ideas that may offer a way forward, though it needs everyone in the testing community to push them forward.

  • All testers should have an online presence.
  • They should be involved in writing blogs.
  • They should be actively involved in testing discussions (Software Testing Club, Twitter).
  • They should try to meet fellow testers a couple of times a year at testing meet-ups – the internet makes arranging this very easy, as this meet-up has proved.

Once this is done, when you apply for a job and the agency asks for your certification, point to your active online presence in testing and ask them to talk to peers who have met you and who can vouch for you. If we ALL did this, the need to pay organisations to prove you can test would go away. Let us as a testing community certify each other.

So with this last thought I tried a little experiment with the people at the meet-up: the calculator exercise as taught by Michael Bolton. If you have experienced this from Michael then you will know what I mean; if not, you will need to find someone who knows, because I am not going to spoil it by explaining it here. Thank you again, Michael, for giving me a useful way to demonstrate a key point.

So, all too quickly, the meet-up had to end. Did we all learn something? I hope so. I know I did, and I had a wonderful time and left feeling encouraged and motivated. I still think I talked far too much, and for that I apologise; hopefully, if there is a next time, I will encourage others to speak even more – you have been warned……

1 comment:

  1. "I do not understand the need why fellow professionals need to attack each others ideas, it is getting very political."

    Professionals need to attack each other's ideas precisely because it's getting political. The political dimension is that a small group of people are bullying testers and managers into handing over money for certifications that are clearly bogus. Certification, as Tom DeMarco points out, is really about disqualifying the majority. If that's not political, I don't know what is.

    Note that we don't mind people making money on teaching or talking about software testing. That's what we do for a living. What we object to is people peddling training and certifications in this coercive manner.

    Personal attacks against unrepentant bullies are warranted, in my view.

    Another reason for professionals to attack each other's ideas is to sharpen the good ones and reject the bad. James Bach and I do this to each other all the time.

    In addition, we invite feedback and criticism from those who agree or disagree with us, since we believe in actual ongoing improvement of the craft, rather than in repetition of folklore. This means that we will be wrong as time goes by, and we'll need that feedback from others to learn and to move forward.

    ---Michael B.