
PSQC 2018 Conference Talk

April 15th, 2018

The #PSQC18 conference on April 07, 2018 was built around the theme of ‘adapting to change’. There were around 11 talks on topics ranging from testing, automation and agile to security and data analytics, plus a segment called ‘My two cents in two minutes’ where participants were invited to share the challenges they had faced in their testing careers in two minutes.

Background to the talk

For the past 11 years I have been trying to understand why testers are treated as second-class citizens in the software industry. One of the changes we need to make is ‘Technological Excellence’; for more, read here. In that spirit, this talk was about how to become a ‘technical tester’: a hypothetical fresh graduate shares his testing career adventures and the lessons he learns while solving each challenge along the way.

Testers must be technical

I argue that testers must be technical through a story our hypothetical tester shares with us: the story of his first testing job, where features were left untested due to a lack of technical depth, with huge repercussions for the company.

Understanding the Technology stack

Our tester’s next job involves testing a hybrid application. This time he shares his journey of learning about technology stacks and how that knowledge helps him in testing.

Aptitude for designing algorithms

The next challenge is to learn automation, given the amount of testing he has to do. He learns that it is not tools or languages that make a good automation engineer, but something even more fundamental.

Develop the right attitude

Over time, the once peaceful workdays turn into stressful debugging marathons. On the run again to learn how to deal with this, he turns to his guru tester, who explains the dilemma of a programmer’s life and how to approach troubleshooting.

The talk seemed to be well received, so I made a separate video for the YouTube channel, where you can enjoy the insightful journey of our young tester:

TALK MAIN PARTS:

  • Introduction: 0:15 – 1:00
  • Perceived vs actual career path of a tester: 1:00 – 3:19
  • Learning the importance of being technical: 3:20 – 7:50
  • Maximum test coverage through learning the Technology stack: 7:50 – 22:05
  • The Keystone of learning automation: 22:05 – 31:55
  • The bane of programming and the holy grail: 31:55 – 38:10
  • Sharing his adventures with the Guru tester (Recap): 38:10 – 40:20


The presentation slides can be downloaded here.

Automation Guild 2018 Round Table Q&A

January 26th, 2018

The Automation Guild 2018 conference was a nice event this year, with people from around the world coming together to learn and improve their automation efforts.

I had the opportunity to be part of the expert round table panel along with Angie Jones and Oren Rubin, which was a great experience. There were a lot of great questions, which showed how awesome the community is. I am sharing some of those questions here, along with teasers from the answers.


How to catch bugs earlier

If you want to catch issues earlier in the cycle, then essentially you are talking about shifting left: get tests executed earlier in the cycle to get early feedback. Divide the tests to execute at different stages of the development life cycle; some can be executed during development, some during patch testing, some during regression testing, with basic tests after deployment to production, and so on. Automated checks would go a long way in achieving this, but ultimately it’s going to be testers who really catch the issues.
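As a rough illustration of splitting checks across stages, here is a minimal TypeScript sketch that tags each check with the pipeline stage(s) it should run in. The stage names, the runStage helper, and the example checks are hypothetical assumptions for the sketch, not a prescribed setup.

```typescript
// Sketch: tag each automated check with the life-cycle stage(s) it belongs to,
// then run only the relevant subset at each point in the pipeline.
type Stage = 'development' | 'patch' | 'regression' | 'production-smoke';

interface Check {
  name: string;
  stages: Stage[];
  run: () => Promise<void>;
}

// Hypothetical checks; the run bodies are placeholders for real test logic.
const checks: Check[] = [
  { name: 'unit: price calculation', stages: ['development', 'patch', 'regression'], run: async () => {} },
  { name: 'api: order endpoint contract', stages: ['patch', 'regression'], run: async () => {} },
  { name: 'ui: end-to-end checkout', stages: ['regression'], run: async () => {} },
  { name: 'smoke: homepage loads', stages: ['production-smoke'], run: async () => {} },
];

// Run only the checks tagged for the given stage, e.g. one CI job per stage.
async function runStage(stage: Stage): Promise<void> {
  for (const check of checks.filter((c) => c.stages.includes(stage))) {
    console.log(`Running [${stage}] ${check.name}`);
    await check.run();
  }
}

runStage('patch').catch(console.error);
```

Most test runners already offer an equivalent of this filtering through tags, groups, or name patterns, so the same idea can usually be expressed directly in whatever framework you use.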

The purpose of automation is not to capture bugs, but rather to take over the mundane tasks that keep testers from spending their time ‘checking’ functionality. Automation is ultimately not going to report a lot of bugs; rather, it enables testers to capture more bugs, a great point from Angie.


Which coding language is recommended

In my opinion, more important than selecting a language is developing the aptitude for designing algorithms. Learn the fundamentals of programming, which are common across all languages. Once you can develop an algorithm to solve a problem, language selection becomes less important. Plus, languages keep changing, and you can never stick to one language forever. If you start by learning a language per se, for instance Java, you might get tangled up in syntax issues and debugging problems rather than actually learning to code.

A great point from Oren was about JavaScript: it’s a very easy language to begin with and is also very popular (in fact, the most popular) these days. More importantly, I think it makes sense for automation engineers, since UI automation is usually a large part of the work and JavaScript offers synergies with client-side scripts and the browser’s native functions as well. Promises, another very powerful concept Oren mentioned, can certainly help with dynamic delays.
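To show what a promise-based dynamic delay can look like, here is a minimal TypeScript sketch of a generic polling helper; the waitFor name, the polling interval, and the fetchAppStatus condition are my own placeholders rather than anything specific Oren described.

```typescript
// Minimal sketch of a promise-based dynamic wait: poll a condition until it
// becomes true or a timeout expires, instead of sleeping for a fixed time.
async function waitFor(
  condition: () => Promise<boolean> | boolean,
  timeoutMs = 10_000,
  pollMs = 250,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (await condition()) {
      return; // condition met, stop waiting immediately
    }
    await new Promise((resolve) => setTimeout(resolve, pollMs)); // short pause before the next poll
  }
  throw new Error(`Condition not met within ${timeoutMs} ms`);
}

// Hypothetical usage: wait until the app reports it is ready, then continue.
// fetchAppStatus stands in for whatever client-side call your AUT exposes.
async function fetchAppStatus(): Promise<string> {
  return 'ready'; // placeholder implementation for the sketch
}

async function example(): Promise<void> {
  await waitFor(async () => (await fetchAppStatus()) === 'ready');
  console.log('App is ready, safe to continue the test step');
}

example().catch(console.error);
```

The point of the pattern is that the test resumes as soon as the application is actually ready, rather than waiting out a fixed sleep that is either too short (flaky) or too long (slow).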


Retry failing tests

Rerunning a flaky test is sometimes suggested as a solution to flakiness: if it does not fail consistently across multiple tries, the reasoning goes, there is no bug here, just a script issue.

All of us on the panel thought retrying flaky tests is not a great idea. It adds a lot of execution time and pushes aside the goal of avoiding flakiness in the first place. Rather, avoid flakiness at all costs. There are a lot of ways to do that; use what works best for you. Apart from tips like using dynamic delays and the AUT’s client-side functions, try breaking the test into smaller tests. Longer scripts have a greater tendency to fail and cause problems.
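To make the ‘smaller tests’ point concrete, here is a hedged TypeScript sketch of one long end-to-end script split into shorter, independent checks that share a setup step; the checkout flow and function names are purely illustrative assumptions.

```typescript
// Sketch: instead of one long script (login -> search -> add to cart -> checkout)
// where an early hiccup fails everything after it, split the flow into smaller,
// independent checks that reuse a common setup step.

// Placeholder helpers standing in for real page/API actions against your AUT.
async function login(): Promise<void> {}
async function searchProduct(term: string): Promise<void> {}
async function addToCart(term: string): Promise<void> {}
async function checkout(): Promise<void> {}

// Shared setup brings the app to a known state for each smaller check.
async function setupLoggedInSession(): Promise<void> {
  await login();
}

// Each check is short, so a failure points at one area and costs less to rerun.
async function checkSearch(): Promise<void> {
  await setupLoggedInSession();
  await searchProduct('laptop');
}

async function checkAddToCart(): Promise<void> {
  await setupLoggedInSession();
  await addToCart('laptop');
}

async function checkCheckout(): Promise<void> {
  await setupLoggedInSession();
  await addToCart('laptop');
  await checkout();
}

async function main(): Promise<void> {
  await checkSearch();
  await checkAddToCart();
  await checkCheckout();
}

main().catch(console.error);
```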

The ultimate solution for flakiness is to have a great automation framework built to support the execution and handle unexpected states of the application.


How to elevate automation’s success to management

The goal of automation is to identify risks and flag any potential issues before they go into the field. Unfortunately, many times we end up working on things that do not directly translate into value added towards this goal. It’s also hard to get metrics that reflect this, but we should not stop trying. The ‘metric black hole’, a term from the book ‘Deep Work’, talks about this concept as well: metrics on productivity tend to be subjective and very hard to measure.

One of the values at Quality Spectrum is generating ‘business value’. This point alludes to that goal: getting tangible results proportionate to the effort invested. It’s a long and complex road for which we certainly don’t have all the answers yet, but it is an important enough problem to solve. For starters, I feel measuring how much more test coverage we have is a better metric; not in the sense of the number of tests executed, but how much of the code base we have tested. Tools from companies like SeaLights can show which areas of the code were executed while you were running your tests and which areas remain untested. Another angle is coverage from the user’s perspective. This topic deserves a separate post, so I’ll pause the discussion here.

A great point from Oren was the value of seeing passing test coverage. Getting the assurance of certain tests passing does add to the confidence in the build and allows testers to focus on other areas. Seeing a green mark on your automation results does have value and should not be discarded.

Management, for the most part, does not have a clear idea of how to measure value from automation and is often looking at the wrong metric. This comment from Angie was so true; I too cannot recall management anywhere having a clear vision of how to assess automation’s success, unless the person at the top has done automation themselves.


If you did not have the opportunity to join the Guild this year, I’d recommend doing so. It gives a broad idea of what’s going on in the community and an introduction to lots of new concepts and tools that you might otherwise not have known were important, or where to start with them.


A lot of questions went unanswered at the first day’s round table I was on, and I am in the process of creating video replies for them. The answers are uploaded to this playlist.


If you have any questions you’d like me to talk about, send them to questions@quality-spectrum.com.

Introduction

June 14th, 2017

The software quality assurance industry has, in general, been dragging along behind the rest of the software industry. Quality Spectrum was formed to research and enhance testing practices. This book is a step in that direction: creating effective and efficient automation initiatives that enhance testing efforts for software products.

Why another automation blog?

There are many great ones out there, some of which we love to read and comment on. However, I find navigating these blogs really cumbersome; the relationships between posts are hard to work out. That’s where the idea of a “Blog Book” came in, a term we at Quality Spectrum have coined: instead of just a blog, all posts collectively form a book-like structure, making navigation easier and giving the reader a general sense of the topics discussed.

Secondly, we intend to make this a community of people pooling their knowledge, fast-forwarding the growth of ‘automation acumen’.

Contributing to the blog

Share your thoughts by commenting. We’d love to know your opinion on the topics. Eventually we hope thought leaders in our industry will help us evolve the automation spectrum and help us collectively create ‘the next generation tester’.