Management

The passionate knowledge worker

November 3rd, 2018 | Management

One of the biggest problems employers have is that they cannot find, or develop, a team that is passionate about what it does. And the biggest problem employees have is that they are not passionate about their 9-to-5 either. Quite silly if you think about it, isn't it?

So, I guess everyone should take a chill pill and get a compass to find what they really like. Of course that is easier said than done, but it's not as hard as we often think it is either.

 

Finding what makes you happy

While we would all assume that spending days at a resort sipping margaritas would make us happy, I think that happiness would not last long. We are driven by purpose, and just having fun is no purpose. As human beings we are wired to work towards a higher purpose; that's how our species survived and how success is achieved.

To recap, we need a purpose, and it has to be selfless. So how do we find that purpose? I wouldn't imagine there is a silver bullet for this. However, here are a few ways that might help.

Find what activities you absolutely dislike or hate doing. I'm mentioning these early on because they are easily spotted. For example, brain-dead work is something I hate. How do I know this? I tried doing brain-dead work on multiple occasions across very diverse activities; the only thing they had in common was being brain-dead, and it absolutely pissed me off. So, anything that does not involve deep thinking is a big 'no' for me.

Next, try finding what things you like to do. These are sometimes not as easily spotted but can be found with a little attention. Again, I tried to figure out what activities made me happy (apart from watching Dwayne Johnson and Leonardo DiCaprio's acting). As a child I remember I loved to play with Lego and board games like Monopoly. Growing up, real-time strategy games were my thing. And during my engineering years I found my true love: programming. What was common in all of these, though, was building things (which of course required thinking).

Having these few points, it was rather easy to find something I was passionate about. I must admit I was not assertive about always doing what I loved; rather, I played the hand I was given, but kept drawing on the understanding I had subconsciously developed of what I disliked and what I loved to do. Over time I eventually found an industry I could be passionate about and loved working in every minute of my life: becoming a technical tester. That meant not just learning automation and programming, but also learning inside out how different software products are developed and how they work.

 

The employer’s dilemma

Now for the two groups we talked about earlier, the employee and the employer, how should they deal with this? Here’s my experience:

When hiring, employers tend to publish a 'thesis' of skills they'd like to see in the unicorn candidate, even though most of the skills mentioned might never be used in the candidate's entire tenure. I find this very destructive and a waste of time for everyone.

Focus on the very few skills you need candidates to be good at, and the ONE they have to be passionate about. If they love doing the job you want them to do, they'll be self-driven and motivated. The problem is that not everyone has enough self-awareness to know what they are passionate about. So you'll have to judge for yourself whether they are passionate about the subject matter you are interested in. My thoughts on hiring automation engineers can be read here.

 

For employees

Giving a motivational talk about following your passion sounds very nice, but walking the talk is quite different. Finding a career that you love working in is not easy. I know passion does not pay the bills. You have to play the hand you are given, but never lose sight of where you want to go.

A great example of this is James Dicks. I stumbled upon his profile by chance on LinkedIn and was confused for a while going through his career history. On reaching out to him I found he had always been passionate about flying, but to get there he needed a lot of money for flight school training. So he started working as a software developer. After years of writing code and injecting cash into his pilot-training dream, he finally managed to complete his flying hours and start flying commercially with Emirates and now British Airways!

While this fairy tale might seem far-fetched, it's less so if you think about it. The human body is highly adaptive to any circumstance. As long as you are moving towards your goal, your brain's own reward chemistry will give you the motivation you need from time to time.

 

Bridging the gap

Hiring a person, or getting hired, is not about whether someone is good enough; it's about aligning values and aspirations. Employers should give more weight to attitude, which means looking for aligned values. For candidates, understanding the company's vision, values and culture will help them decide.

Monetary compensation is not mentioned here because that's a given. While it is a tricky thing to manage on both sides of the fence, fair compensation for the skill set needed and brought to the table is a good general guideline to follow.

While hiring passionate knowledge workers is the hardest part of running a business, it is the most crucial too. In the age of information, knowledge, experience and skill are king. The thing that offsets the difference is attitude and passion. For an employer, having a team passionate about your goals, and for an employee, working with a team in alignment with your values, is the ultimate prize.

 

Does Automation Save Money?

May 3rd, 2018 | Management

Like lots of folks, I used to calculate automation ROI by measuring the 'hours saved' if a person were to run these checks instead of a machine. Perhaps that's how the market trend evolved, and it's one way for vendors to sell their products and services. After working in the industry for years and listening to other thought leaders and folks sharing in the community, I feel the 'cost cutting' might be there, but not in the way most of us think about it, and that should change the way we think about automation.

To make that a bit more obvious: what would you say is the ROI of a piano for an average user? It's not easy to quantify the return on investment for a 'tool', but that does not mean the tool is any less important, depending on the circumstance.

The cost saving silver bullet

For years, and to this day, automation tools and services have been sold as a way of reducing cost. In theory it sounds logical; however, after working in the industry for years, I don't know of anyone, myself included, who has really 'seen' these cost savings. Let's dissect the cost-reduction calculation in detail and try to pinpoint the discrepancies.

The Formula

The story goes something like this:

"Savings per test cycle = (number of tests/checks automated) × (manual execution effort per check, in man-hours)"

We would then calculate the break-even point: when the accumulated savings equal the initial investment in preparing the automation suite, plus any other costs. For an accountant this would make perfect sense, except the "effort per check" cost does not exist! Let me explain.
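To make the arithmetic concrete, here is a minimal sketch of that traditional calculation in Python. All the numbers are made up purely for illustration; the argument of this post is precisely that the "manual effort per check" figure this formula depends on rarely reflects reality.

```python
# A minimal sketch of the "traditional" ROI arithmetic, with made-up numbers.

checks_automated = 500            # hypothetical number of automated checks
manual_hours_per_check = 0.25     # hypothetical manual execution effort per check
initial_investment_hours = 1200   # hypothetical effort to build the suite
maintenance_hours_per_cycle = 40  # hypothetical upkeep per regression cycle

savings_per_cycle = checks_automated * manual_hours_per_check   # 125 "saved" hours
net_per_cycle = savings_per_cycle - maintenance_hours_per_cycle # 85 hours per cycle

# Break-even: how many regression cycles until the "savings" cover the investment?
cycles_to_break_even = initial_investment_hours / net_per_cycle
print(f"Break-even after ~{cycles_to_break_even:.1f} cycles")
```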

 

Automated checking vs testing

The first problem is equating automated check execution time with a tester's man-hours. The way a machine runs a script is not the same as how a person would test that feature. There is a lot of background to this concept if you are not familiar with methodologies like Rapid Software Testing and related ideas. For those who are not, let me try to summarize the required concept quickly.

The verb "testing" describes an act of "thinking" and "communicating" about how to test a specific feature. Once the tester decides what to test, he or she then executes the scenarios. A machine is incapable of "testing", since it can neither "think" nor "communicate" like a human. It can only "execute" what it is instructed to check.

(Thanks to the RST community, James Bach, Michael Bolton and others for articulating this so clearly.)

 

The missing effort

Let's take a candidate application which would hypothetically require around 1,000 man-hours to test completely (by the way, many products fit this description). How many testers would be needed to regress this application within two weeks? Around 13 full-time testers (1,000 hours ÷ roughly 80 working hours per tester over two weeks ≈ 12.5). Do you think the team would actually have 13 testers? Mostly not; they have fewer people than needed and make do with whatever time they get.

Now, half the effort of "testing" was the thinking part, which a machine cannot do (some would argue, me included, a lot more than half). The other half is supposed to be spent on "execution", of which only a small percentage is actually spent, since the team is ALWAYS smaller than needed.

So there 'might' have been 'some' savings in man-hours on paper, but in practice there are next to none, because most teams are not operating under the assumptions baked into the ROI calculation.

 

Then why Automate?

Increased test coverage

From our example, we were not able to test the complete application. And in my practical experience, many products are 'way' less tested than they should be. Adding a dozen more testers does not seem practical either.

To cover more ground, testers can program a dumb machine to do the basic 'execution' that they otherwise have to do unwillingly (since it's boring doing the same thing again) every time a release goes out. This frees up their time for intelligent work while the repetitive checks are done by a machine.

 

Testers focus on important areas

This might seem a repetition of the point above, but there is a slight and important difference. Testers don't just free up their time; they can now also leverage the dumb grunt worker, focusing on the thinking part and delegating as much of the 'execution' as possible. A high percentage might not be possible, but if automation is leveraged properly, test quality can improve significantly, since most time is spent on 'thinking' rather than on repetitive work. More on test scenarios that are ideal candidates for automation here.

 

Quick feedback – Find problems earlier

How many times has it happened that after a bug fix an important feature stops working altogether, and this only comes to light at the 11th hour, when there isn't enough time to regress the fix properly?

There is a lot of value in getting feedback quickly. Different checks running at different stages of the development process can provide the feedback needed at that point. As an example, a possible plan could be to run unit tests and high-level checks during development, a complete regression in the QA stage, and user acceptance tests on production, or whatever process suits your product and team.
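As a rough illustration, here is a minimal sketch (in Python, with hypothetical stage and suite names) of how such a staged plan might be expressed, so that each stage only runs the checks whose feedback is useful at that point:

```python
# Hypothetical mapping of development stages to the checks that run there.
# Stage names and suite names are made up for illustration.
FEEDBACK_PLAN = {
    "commit":     ["unit_tests", "smoke_ui_checks"],   # minutes, on every push
    "qa":         ["full_regression_suite"],           # hours, per QA build
    "production": ["user_acceptance_checks"],          # post-deployment sanity
}

def suites_for(stage: str) -> list[str]:
    """Return the check suites that should run at a given stage."""
    return FEEDBACK_PLAN.get(stage, [])

if __name__ == "__main__":
    for stage, suites in FEEDBACK_PLAN.items():
        print(f"{stage}: {', '.join(suites)}")
```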

 

Quick feedback – A big step towards CD

The most successful companies are the ones that take an idea from the drawing board into the consumer's hands most quickly. This is where continuous delivery comes in. The race to minimize 'time to market' can become a huge factor, giving a first-mover advantage. This video gives more detail on how automation facilitates that.

 

Increased confidence in the product

An inherent problem with exploratory testing is that "it's done by humans", which is a good thing too, but people tend to forget this. A tester might not test the same function the exact same way every time, or might forget to test it altogether. With automated checks, we can be certain which checks were executed and whether they passed.

This makes the decision to ship a release much easier and provides some quantitative measures to base decisions on. This alone is not enough to make the call, but coupled with decent exploratory testing, it makes a difference.

It's not just the team: customers of the product can also take some satisfaction in knowing that certain checks are automated, so the functionality has most probably gone through a checking process.

 

Commitment to quality

From our example and my experience, most teams do not have enough testing staff to completely regress the application every time a change is made; some would argue it's impractical anyway. Having automation in place shows a commitment to ensuring that as much of the application as possible gets tested or checked before shipping to customers.

This is where the phrase 'quality is a mindset' comes in. When we hold ourselves and our product to a high standard, some form of automation becomes a necessity, because most modern applications cannot be tested adequately by a team of a cost-effective size.

 

There are savings, but not the way we calculate them

Equating man-hours to machine hours of execution is not the right formula for finding the return on investment of an automation project. The returns do not come as testers' man-hours saved, but in different forms that are by no means less important, just less obvious.

The real value comes from increased test coverage; from letting testers focus on what really matters while delegating grunt work to a machine; from quicker feedback on fundamental problems and features; from taking a major step towards reducing time to market; and from the increased confidence of the team and its customers in the product's quality, a visible commitment to quality that reaches the end consumer.

 

Feel free to share what other benefits you feel automation brings to the table.

 

The ‘not’ so small code changes

November 29th, 2017 | Management

I recently made the good old 'one-liner change' in our automation framework, which has a 22-hour batch run for a single permutation. A week before regression we discovered my 'small change' had caused a straight 25% false failures. When my awesome team suggested reverting the change, I realized I had not recognized the demon at the time. This demon goes by many names: 'one-liner change', 'small change', 'localized change' and so on (and will be referred to as the demon from here on, in letter and spirit).

 

Cause of the trap

Have you ever sat in a car with a friend driving and felt a bit unsafe, as if he or she doesn't have full control over the car? And how is the driver feeling in that moment? Pretty confident, for the most part. I feel something similar happens when one person on the team makes a 'small change' in code. In the moment of making the change, it feels like we are just putting a bandage on a little bunny's scratch. Only later does it turn out it was a lion we just misfired a pump action at, and now we are running for our lives.

 

The demon effect

The demon is lethal mainly because it does not look like it will create much of a problem. Teams often end up making such changes close to regression or release, leaving less time to react. Eventually a decision has to be made between releasing with bugs and delaying the release, and no one likes either. I would equate this with 'throwing good money after bad': adding more time to salvage the situation hardly ever works and mostly just gifts you more frustration (my published thesis on the subject here).

 

Small and ‘not so small’ change

While small or localized changes do exist, sometimes the demon is mistaken for one. What defines a demon, then? The rule of thumb I like to use: a change is localized only if it is encapsulated within a small area of the application, a low-level module or a single class, and affects the functionality of just that module or class. The mistake I have seen is confusing a change to functionality that is not encapsulated within a confined class or module, but that we feel is used in only one or a few specific places, with a localized change. Unless there is no way the change can be reached from, or directly affect, anything outside its scope, it is not a localized change.
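A hedged illustration of the distinction, with entirely hypothetical names: a change inside a private helper of one class is localized; a tweak to a shared utility only looks localized, because every caller silently inherits the new behaviour.

```python
# Hypothetical example: class and function names are made up for illustration.

class InvoicePrinter:
    def _format_header(self) -> str:
        # Localized change: this private helper is only used inside
        # InvoicePrinter, so adjusting it affects just this class.
        return "INVOICE"

# By contrast, a "one-liner" in a shared utility is a demon in disguise:
def format_date(d) -> str:
    # Used by reports, emails, exports, the UI... a tweak here ("just change
    # the date format") quietly changes behaviour everywhere it is called.
    return d.strftime("%d-%b-%Y")
```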

 

Quarantine the demon

The change must be made, no doubt, but it can be made in a time of peace, not at the 11th hour a day before regression. Implement it in the early sprints of a release, so there is time to react and adjust before the deadline hits. Firstly, while implementing it you hopefully will not be in a rush and can think it through. Secondly, there will be time to test around just that change, making it easier to identify the cause of any bugs we see.

 

A general best practice is to make changes in increments rather than doing the whole big change at once. If you have automation running at the UI or API level, that makes things easier: as you push the big change in increments, you learn right away about any outright effects it has. And if you don't have automation, I would say think about getting that first; otherwise, focus manual testing as much as possible on testing the ripple effects.

 

Big decisions take time

As in life, big decisions take time not just to research, but to think about and reflect upon. Similarly, architectural changes need time to 'sink in' and be internalized. Doubling the people working on one will not get it done in half the time. For big changes, I usually start thinking about them ahead of time with my team, so that when we do start, we have things in perspective and can make better judgements.

 

Care to share what else you would do to make an architectural change transition smoothly?

Manual Tests & Automated Scripts Traceability

June 29th, 2017 | Management

The product had been tested for years by the testing team, doing exploratory tests and writing test cases for important areas. The application grew day by day, eventually reaching more than a thousand test cases. That is when the testing team thought of delegating the 'checking' part of regression to automated scripts, to free up some time for real testing.

Many product teams moving towards automation have reached this stage and are looking for a way to shift written manual tests to automated scripts. In many cases the tests include rich scenarios the team wants to leverage by scripting an exact copy. Naturally this comes with inherent challenges, and I am about to share how we managed some of them for one particular product.

Before moving on: some tools claim to automate manual tests straight from a Word document and the like; that is not what is being discussed here (plus, I have yet to see that work!).

 

Test case to script mapping

Ideally, all manual tests would be part of the automation suite as they are. However, differences are bound to creep in. To maintain traceability between tests and automated scripts, creating a mapping document is a good idea: essentially, map every manual test to an automated test. Where a test scenario differs, mention the reasons with appropriate tags (for ease of filtering).

As the application evolves, changes to the manual tests come in and the scripts need to be updated. Having this document helps in two ways (a sketch of a mapping entry follows the list):

  • Updating scripts becomes much easier when any prior discrepancy is already written down, with its reasoning readily available.
  • During regression it is very clear which areas automation is not covering and where the manual tests might want to look.
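As a rough illustration, here is a minimal sketch of what one entry of such a mapping document might look like, kept as a simple spreadsheet or CSV. All field names and values are hypothetical, not a prescription.

```python
import csv

# Hypothetical columns for a manual-test-to-script mapping document.
FIELDS = [
    "manual_test_id", "automated_script", "discrepancy",
    "discrepancy_reason", "tags", "script_status",
]

rows = [
    {
        "manual_test_id": "TC-0412",
        "automated_script": "checkout/test_apply_discount_code.py",
        "discrepancy": "partial",
        "discrepancy_reason": "Step 6 (print receipt) not automatable; covered manually",
        "tags": "not-automatable,needs-manual",
        "script_status": "stable",
    },
]

with open("test_mapping.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```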

 

Scripts incapability vs sentient beings

There are always some steps in manual testing that the tool is not able to perform. It could be a physical activity outside the product, a portion of the application that is not automatable, or a very complex bunch of scripts needed to improvise in different application states. Instead of leaving the test out altogether, I usually recommend one of the following:

  • Alter the scenario to suit the script: salvage whatever you can, and forgo what cannot be done.
  • Break the test in two. For the second test, use pre-populated data or a prepared test scenario to avoid the area that is not automatable.

The mapping document comes in very handy here.

 

Manual test steps in report

Test reports generated from automated scripts should be readable primarily by the manual testing team. I usually see teams whose test reports show all the automation mumbo-jumbo right off the bat, creating lots of confusion for anyone not involved in automation.

I strongly advise including the test steps, exactly as written in the manual test case, in the automation test report. Under each step should be the read/write details of what the tool is doing. Non-automation folk can then make sense of the report, and it also makes it much easier for the automation team to fix issues.
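As a rough sketch of the idea (step text, locator details and statuses are all hypothetical), a report structured this way nests the tool-level detail under each human-readable manual step:

```python
# Hypothetical report entry: tool-level details are nested under the manual
# test steps, so non-automation folk can read the report top-down.
report = {
    "test_case": "TC-0412: Apply discount code at checkout",
    "steps": [
        {
            "manual_step": "1. Log in as a registered customer",
            "tool_details": [
                "type 'demo_user' into #username",
                "type '********' into #password",
                "click 'Sign in'",
            ],
            "status": "passed",
        },
        {
            "manual_step": "2. Add a product to the cart and apply code SAVE10",
            "tool_details": [
                "click 'Add to cart' on first product tile",
                "type 'SAVE10' into #discount-code",
                "assert cart total reduced by 10%",
            ],
            "status": "failed",
        },
    ],
}

for step in report["steps"]:
    print(f"{step['manual_step']}  [{step['status']}]")
    for detail in step["tool_details"]:
        print(f"    - {detail}")
```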

 

Dual purpose

Apart from mapping differences from the manual tests, we used this document to keep an overview of the complete automation suite's health. Scripts we knew were faulty and needed updates, scripts needing in-depth investigation, scripts failing due to a reported issue: all these status updates were appended to this document.

Even if you don't have manual tests to map to, every automation project should still have one spreadsheet with at least these fields. It is a huge time saver when managing batch runs and daily runs.

Care to share what you did to map manual tests?

Till next time, Happy automating!

How to Hire an Automation Engineer

June 29th, 2017 | Management

Recently I had the opportunity to go through the cycle of hiring an automation engineer to join our team. It had been a while since I did this regularly, a few years ago. This time I stumbled upon a few 'new' realizations, especially about hiring automation engineers, which had never occurred to me before. With so many folks I see struggling to find suitable candidates for this title, I thought of jotting down some lessons learned through the process. The best part: it's really one step (to begin with).

Cut the crap

Candidates undervalue their skill set (especially the ones you want to hire). Furthermore, a huge job description stuffed with every buzzword the hiring manager could find on the internet scares potential candidates away from applying. There are usually only a few key skills for any given position that are vital for the team. With a lot of name-dropping, candidates get confused and hesitate to apply because of lines like 'Expert with 3+ years experience in SQL Server', when in many cases there is maybe a 5% chance the new hire will ever write complex queries.

Our job ad was not performing well. The candidates we got were not even a match on paper. HR suggested having another look at the job description, and she was right. Hesitantly I reviewed it and found the ad was scary, to say the least. No doubt the position had considerable requirements, but the essentials were few. While cutting down the content, I settled on the few skills I felt were essential for an automation engineer / SDET (NOT a lead position); they are presented here.

Programming aptitude

Not '10+ years of experience in Java', someone who could practically code a talking parrot. If your project is in Java, don't necessarily look for a Java guru. If a person has the aptitude for constructing algorithms and can demonstrate good programming skills in any language, he or she can learn a new language or framework. When the technology changes (which it will), they will be the most comfortable adapting, the most willing to learn new tools and languages and to leave their comfort zone.

Testing acumen

A term I use for a tester's mindset: someone who can craft test scenarios covering different aspects of the AUT (application under test), has the process-related knowledge, the skill to extract requirements, the ability to tell a great story when writing up an issue, and so on. I have always believed an automation project is only as good as the scenarios being automated. If the scripts are checking trivial functionality, they are not going to create the difference we want.

The counter-argument I've heard is that since test cases are being provided, automation folks don't need in-depth testing experience or knowledge. Well, a person with technical insight and a tester's eye will be better placed to suggest which areas are not being tested and how to test them. Even if a team has unit tests, integration tests and UI tests, someone still needs to create the system-level scenarios that verify the business logic end to end.

Attitude

The one thing an automation team might never get rid of is 'technical problems'. Each day is going to be a new day. Either you make mistakes and learn (for the most part), or you are wise and spend considerable time learning how to avoid the mistake in the first place. Through all these endeavours, the only thing that keeps you going is the right attitude.

In any job, having the right attitude is the most fundamental thing, and that is especially true for automation folks. They never run out of problems, and they can never get tired of solving them!

Smart creative

Among non-technical skills there are many qualities hiring managers aspire to find; this is one I found lacking in some cases and most relevant for this type of position. 'Smart creative' is a term coined at Google, an enhanced version of Peter Drucker's knowledge worker. I understood it as 'a person with the fire to learn new things, who is technically and business savvy and has a creative personality'. Imagine what a smart-creative technical tester would do: put development on DEFCON 1! Actually no; instead, I feel he or she should bridge the gap between the two camps and get the best of both worlds.

Automation exposure

I had some candidates who were not very savvy with the traditional automation frameworks, but who had great programming and learning skills, could develop algorithms and had picked up other programming languages. Those candidates were equally good in my book, and I would definitely hire such a person.

If your project is in Selenium with Cucumber, it would be unwise to hire someone who is already working on exactly that technology, because there is no new learning for them; the increased salary will keep them content only for so long. Look for people who have the level of exposure you need, not necessarily in the same tool. For elementary positions, a basic understanding of how UI automation generally works, the common problems faced in automation, and how the DOM works would suffice.
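As a rough benchmark of that 'basic understanding', a candidate should be able to reason about something like the following minimal Selenium sketch (the page URL and locators are hypothetical), explaining how elements are located in the DOM and why explicit waits matter:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Hypothetical page and locators, purely for illustration.
driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")

    # Locate elements through the DOM (by id and CSS selector) and interact.
    driver.find_element(By.ID, "username").send_keys("demo_user")
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    # Explicit wait: a common answer to the classic "element not found yet" problem.
    WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, ".dashboard-title"))
    )
finally:
    driver.quit()
```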

 The predicament

This position is fundamentally hard to fill. QA folks with the 'testing acumen' have traditionally kept their distance from the technical aspects of developing software. Development folks have the technical exposure but lack the testing acumen. Where to find this mixed breed!

Hence, focus on the fundamentals. I feel that by digging ever deeper for testing knowledge or a technical skill set, the job gets even harder. That is not to say hire someone with less of either; instead, hire someone with the 'aptitude', not necessarily x+ years against a huge checklist.

The job description should revolve around the fundamentals: not more than a handful of bullets. As the founding smart creatives put it, 'widen the aperture'. Employers tend to narrow the field to candidates with very specific backgrounds only. When we widened the aperture we found an interesting candidate: he had an accounting background and also demonstrated great skill in several development technologies. A lot of hidden gems are left out when the résumé filters are too tight.

Care to share what else you would consider while hiring an automation engineer?