In the old days of waterfall, we were heavy on documentation and had hefty test strategy documents as well. Traditional test strategy documents held mountains of information, much of which was never really used in practice. They were all great writing in theory but hardly followed.
As agile came along, many teams pivoted towards not writing a test strategy at all, perhaps on the assumption that being agile means no documentation. This never sat right with me, and I wasn’t happy with the approach. Ever felt like you have a lot of ideas in your head but, because you can’t see them on paper, your sixth sense keeps telling you ‘I’m missing something’? That’s what it was.
What was needed
Without a test strategy document, it was difficult to keep hold of the important ingredients needed to bake quality into the product. So to me it was always clear we needed a test strategy, but having written the old-school test strategy documents, I knew something like that was of no use either.
I started listing what I’d like an ideal test strategy document to be like. It should help with:
Defining the direction and vision of testing
Consolidate quality related attributes we need to consider
What to focus on testing, where in the tech stack & how?
What to automate and how?
Some practices / references engineers can refer to in day to day practice
These would give the whole team a sense of alignment towards a common goal, vital for baking quality into the product rather than trying to ‘gauge’ it after the fact.
DevOps Test strategy
After transforming teams’ quality practices for a while, I noticed certain things I kept preaching to most teams and ended up writing down in some shape or form. Meanwhile, a good training program was also direly needed to train engineers on testing and automation practices. While designing the program, I saw a perfect opportunity to build a course on creating a test strategy for agile teams, and that’s when I came up with the “DevOps Test Strategy Mindmap”.
The mind map depicts the target state for common web applications running in an enterprise and how to plan your quality practices to get there. The purpose was to give teams a sample target state from which to develop one that suits their own needs.
An important aspect of this test strategy was that it should be easy to update and maintain. Of all the documenting techniques I’ve used, I love mind maps the most, so I made one for the test strategy. It helps keep the strategy:
Concise – To the point information
Complete – Covering important areas teams transforming into DevOps ways of working should consider
Easy to read & maintain – In the form of a mind map so readers can see the whole landscape at once
What skill set would you like the people driving the quality culture in an organization to have? I have always been excited to find out what the skill set of a veteran tester would be: someone who can interface with senior executives and at the same time lead and mentor testers in quality best practices. This is an attempt to classify, at a very high level, the skills I’d like to see.
This will certainly vary from one person or organization to another, and I can’t imagine anyone being an expert in all these skills. But it helps to draw out the important ones.
Ways of working
Understanding of what a DevOps culture is
Designing and developing quality practices which are efficient and effective
Understanding Agile principles and implementing them practically in a team
Implementing scrum best practices
Experience in driving desired behavior in teams
Leading by example / servant leadership
KPIs, Reporting, Metrics
Designing quality metrics which provide indication of a build’s health
Developing team KPIs
Pitfalls in metrics and how to mitigate them
Expose and report risks in large product solutions
Facilitating Product Development
Understanding core problem the product is solving
Building and communicating product context for testers
Make sure testing activities are in line with the core problem to solve
Facilitate UAT and collaborate to making the process effective
Socialize & collaborate with Senior Execs
Voice quality related concerns
Ability to make a point and get agreement from C-level executives
Oral and written communication skills
Designing vendor contracts
Acceptance of test schedules
Managing offshore vendors goals and day to day activities
Design test strategy in line with Tech stack, product / business use case & project constraints
Identify test coverage gaps / unexplored potential risky areas
How to push tests down to lower levels of tech stack
Strategy to leverage automation
Prioritize test scenarios
Design bug reporting flow
Using BDD to increase collaboration
Best practices for writing feature files
Cucumber / Serenity
Any other BDD tool
Questioning requirements and assumptions
Developing testing heuristics
Using developed testing heuristics
Teaching testing heuristics
Usage of effective documentation methods (e.g. Mind Maps)
Writing test cases (efficient and easily maintainable)
Understanding of which test to automate
Using testing heuristics to develop test scenarios
Test management tools
Automation in test
Automation architecture design
Designing API automation frameworks
Designing UI automation frameworks
Developing test harnesses
Test data creation tools / programs
Developing synergies between automation teams
Automation best practices, design patterns and anti-patterns
Fundamentals of framework design
Develop Maintainability in framework design
Develop Reusability in framework design
Develop Scalability in framework design
Develop Robustness in framework design
Writing clean and professional code
Seasoned practitioner of coding patterns
Developing coding guidelines and principles for teams to follow
Usage of static analysis tools (e.g. SonarQube)
Skilled in any one strongly typed language (Java, C# etc.)
Operational Acceptance Testing
Hands on experience solving API automation challenges
In-depth understanding of HTTP methods
Any other API automation tools
Hands on experience solving typical UI automation challenges
In depth understanding of how browser automation tools work
When I switched jobs recently, it was a very surreal experience. The stakes were high and there was a lot of ‘be careful’ advice. Luckily I trusted my gut and my values, which made it such a great experience.
Here’s how the story unfolds and some important lessons I would like to pass on to you.
It was another day at the office; I was working late, trying to get some kinks out of a new test harness we were creating. I got a call from a company asking if I would be interested in exploring a new position. I almost refused, since I was not interested in a new job, but I liked the caller’s demeanor and reluctantly agreed to ‘scope out’ the offer. Things started to work out, and during the process I felt this might work.
The tough decision
I worked in a matrix reporting hierarchy with multiple stakeholders. Over the years we had managed to build great trust among ourselves. I knew that if I left abruptly, it could harm my team’s future goals and lose the momentum of the progress we were making. Plus, I would have less time to ‘pass on’ the wisdom I had acquired. The timing of all this was also very unfortunate: it coincided with some changes in the organization and how we operated, and I could sense the expectations on me and our team.
Against the advice I was getting from some, I decided to inform my current employer about the new position under discussion. At the time no offer had been placed, and it was very uncertain what the offer would be. I ended up telling my manager, and some senior managers, about the position. I made it very clear that no formal offer had yet been placed, but that this was the blueprint of what was going on.
I have to be honest: after the fact, I felt this might have been a big mistake. But then I was able to reconcile it, knowing I did what I thought was right to preserve the relationship we had. And no matter what happened next, I am glad I took a leap of faith for the good.
The biggest weapon you can have is empathy. It’s easy to feel empathy towards an individual close to us, but towards a ‘company’? No way. Most people hate corporations, and I would not blame them. But think about a company like this: a bunch of people like you and me who have to operate under certain restrictions. When I say be empathetic towards the company, I don’t mean the ‘LLC’ entity incorporated with the SEC; I mean the ‘people’ you have worked with.
I always say: The most important thing you will take away from your job is what you learned and the relationships you built
It’s hard to be ‘nice’ to a capitalistic face that we feel will strike us down at the first opportunity. I don’t want to justify how most corporations run these days, but I do want to distinguish between ‘the company’ and the folks like us who work there. If you can’t come to terms with ‘the company’, think of the people you have spent so much time with. Make it an easy and pleasant experience for them. If you are moving on, it does not mean your relationship with them has to end too.
After informing management about the potential offer, I also started to delegate and train my team on the few missing pieces I had been handling. Again, at this time no formal offer had been given, nor had any formal transition started. The training was not just for my employer; it was for my team. We had shared tough times together, and I wanted to leave knowing they would be alright and well equipped before I left.
After some time, the offer was finally signed off. By then it had been almost two months since I had mentioned the position to management, which gave them enough time to plan. It also gave me enough time to cover the bulk of what I wanted to train my team on. With the news, the formal handover process started, in which we did a lot of documentation, created videos, and even built some last-minute features that had to be done in the test harness.
The goodbye email
Most people send a generic goodbye email on the last day with some general lines about having had a good time. For me this was different. These were not an abstract set of names known without any human emotion. These were people I cared about, and would continue to care about, since I knew they were good people who cared about me in return.
The most precious commodity we all have is time, and our subconscious knows that. Whenever we see a genuine effort by someone in sharing their time with us, we recognize that and respond differently.
So instead of a generic email, I met each person separately and went through what I had learned from and admired about them. For those I could not meet but who were close to me, I created individual videos and sent them by email. For some I wrote individual emails, thanking them and extending my support. What followed was something I was not expecting.
What goes around comes around
I was not expecting much of a reaction, but to my surprise I got many times more love back from everyone. I may never have felt such an emotional experience as I did in those few weeks. The belief I had in spreading knowledge, love and good revealed itself so beautifully. There were so many farewells, emails, calls, follow-ups, kind words and just unreal responses from my co-workers, managers and senior managers, which I will always cherish.
After all was said and done, the tough decision I took did not look like much of a tough decision at all. It felt like it was ‘exactly’ the right thing to do.
What if it all went south?
What if it didn’t work out? My employer might have felt I was actively looking for new positions while I had no real desire or intention of switching. That’s where trust comes in. The way they trusted me, and I trusted them back, I’m sure things would have worked out just fine. They understood I did this because I care. Unless someone is a really twisted freak, they will want to reciprocate with the same care.
We constantly undermine the good in others. Our first impulse to every new thought is mostly of fear, scarcity and negativity. Learning to trust people can be the greatest asset you can have.
All boils down to trust
We humans take this trust thing very seriously; it’s in our genetics. I have to admit you don’t succeed at building it every time, but most of the time it works out just fine.
But how to build trust? There are no shortcuts, you cannot fool people all the time. Building trust needs hard work and genuinely being ‘empathetic’. You care for others, they care back for you. It’s just that simple. You just have to take the leap of faith. Trust in each other, trust the good in others and around you. There is a lot more good than bad. We all are just more interested in the bad than the good, hence we create and attract the bad.
Did I just get lucky?
It can be argued I just got lucky. I happened to ‘charm’ my way into a few good books, not everyone is that lucky, and this might not go so well again.
I knew this would work because this was not my first rodeo. I had run this experiment many times before, sometimes intentionally and many times unintentionally. I cannot say it has always worked, but it has always been worth it. Even when it failed, there was a lot I salvaged that helped me become who I am.
If you do something good and it backfires, remember: you did what you did because of what you believe in. What the other person does is on them. If you spread good, you WILL eventually attract good in return. Just have faith; miracles happen all the time.
One of the biggest problems employers have is that they are not able to find, or make, their team passionate about what they do. And the biggest problem employees have is that they are not passionate about their 9-to-5 either. Quite silly if you think about it, isn’t it?
So, I guess everyone should take a chill pill and get a compass to find what they really like. Of course that is easier said than done, but it’s not as hard as we often think it is either.
Finding what makes you happy
While we would all assume that spending days at a resort sipping margaritas would make us happy, I think that happiness would not last long. We are driven by purpose, and just having fun is no purpose. As human beings we are wired to work towards a higher purpose; that’s how our species survived and how success is achieved.
To recap, we need a purpose, and it has to be selfless. So how do we find that purpose? I don’t imagine there is a silver bullet for this, but here are a few ways that might help.
Find what activities you absolutely dislike or hate doing. I mention these early on as they are easily spotted. For example, I hate doing brain-dead work. How do I know? I tried brain-dead work on multiple occasions, across very diverse activities; the only thing they had in common was being brain-dead, and it absolutely pissed me off. So anything that doesn’t involve deep thinking is a big ‘no’ for me.
Next, try finding what you like to do. These are sometimes not as easily spotted, but it can be done with a little attention. Again, I tried to figure out what activities made me happy (apart from watching Dwayne Johnson’s and Leonardo DiCaprio’s acting). As a child I loved playing with Lego and board games like Monopoly. Growing up, real-time strategy games were my thing. And during my engineering years I found my true love: programming. What was common in all of these was building things (which of course required thinking).
With these few points, it was rather easy to find something I was passionate about. I must admit I was not always assertive in doing what I loved; rather, I played the hand I was given, while trying to find something that fit the understanding I had developed, subconsciously, of the things I disliked and loved doing. Over time I eventually found an industry I could be passionate about and loved working in every minute: becoming a technical tester. That meant not just learning automation and programming, but also learning how different software products were developed and how they worked, inside out.
The employer’s dilemma
Now for the two groups we talked about earlier, the employee and the employer, how should they deal with this? Here’s my experience:
When hiring, employers tend to publish a ‘thesis’ of skills they’d like to see in a unicorn candidate, even though most of the skills mentioned might never be used in the candidate’s entire tenure. I find this very destructive and a waste of everyone’s time.
Focus on the very few skills you want the candidates to be good at, and ONE they have to be passionate about. If they love doing the job you want them to do, they’ll be self driven and motivated. But the problem there is, not everyone has enough self awareness to know what they are passionate about. So, you’d have to judge for yourself if they are passionate about the subject matter you are interested in. My thoughts on hiring automation engineers can be read here.
Giving a motivational talk about following your passion sounds very nice, but walking the talk is quite different. Finding a career you love working in is not easy. I know passion does not pay the bills. You have to play the hand you are given, but never lose sight of where you want to go.
A great example of this is James Dicks. I stumbled upon his profile by chance on LinkedIn and was confused for a while going through his career history. On reaching out to him, I found he had always been passionate about flying. But to get there he needed a lot of money for flight school training, so he started working as a software developer. After years of writing code and injecting cash into his pilot-training dream, he finally managed to complete his flying hours and start flying commercially with Emirates, and now British Airways!
While this fairy tale might seem far-fetched, it isn’t really if you think about it. The human body is highly adaptive to circumstance. As long as you keep moving towards your goal, your brain’s reward chemistry will supply the motivation you need from time to time.
Bridging the gap
Hiring a person, or getting hired, is not about whether someone is good enough; it’s about aligning values and aspirations. Employers should give more weight to attitude, which means looking for aligned values. For candidates, understanding the company’s vision, values and culture will help them decide.
Monetary compensation is not mentioned here, because that’s a given. While this is a tricky thing to manage on both sides of the fence, fair compensation for the skill set needed and brought to the table can be used as a general guideline to follow.
While hiring passionate knowledge workers is the hardest part of running a business, it is also the most crucial. In the age of information, knowledge, experience and skill are king. The thing that offsets the difference is attitude and passion. For an employer, having a team passionate about your goals, and for an employee, working with a team aligned with your values, is the ultimate prize.
Like lots of folks, I used to calculate automation ROI by measuring the ‘hours saved’ when a machine performs checks instead of a person. Perhaps that’s how the market trend evolved, and it gave vendors a way to sell their products and services. After years of working in the industry and listening to thought leaders and folks sharing in the community, I feel the ‘cost cutting’ might be there, but not in the way most of us think about it, and that should change the way we think about automation.
To make that a bit more obvious: what would you say is the ROI of a piano for an average user? It’s not easy to quantify the return on investment for a ‘tool’, but that does not make it any less important in the right circumstances.
The cost saving silver bullet
For years, and to this date, automation tools and services have been sold as a way to reduce cost. In theory it sounds logical; however, after working in the industry for years, I don’t know of anyone who has really ‘seen’ these cost savings, myself included. Let’s dissect the cost-reduction calculation in detail and try to pinpoint the discrepancies.
The story goes something like this:
“Savings per test cycle = checks automated × execution effort (man-hours) per check”
And then we’d calculate the break-even point, where the savings equal the initial investment in preparing that automation suite plus any other costs. To an accountant this would make perfect sense, except the “effort per check” saving does not exist! Let me explain.
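To make the arithmetic concrete, here is a minimal sketch of that traditional calculation; all the numbers (500 checks, 0.2 man-hours per check, and so on) are illustrative assumptions, not figures from any real project:

```python
# A sketch of the "traditional" ROI arithmetic this section critiques.
# Every input below is an illustrative assumption.

def break_even_cycles(checks_automated: int,
                      manual_hours_per_check: float,
                      initial_investment_hours: float,
                      maintenance_hours_per_cycle: float) -> float:
    """Cycles until the claimed savings equal the automation investment."""
    savings_per_cycle = checks_automated * manual_hours_per_check
    net_per_cycle = savings_per_cycle - maintenance_hours_per_cycle
    return initial_investment_hours / net_per_cycle

# e.g. 500 checks, 0.2 man-hours each, 800 hours to build, 20 hours upkeep per cycle
cycles = break_even_cycles(500, 0.2, 800, 20)
print(round(cycles, 1))  # 10.0
```

Under these assumptions the suite ‘pays for itself’ after ten cycles, which is exactly the reasoning the rest of this section pushes back on: the manual hours per check supposedly being saved rarely existed in the first place.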
Automated checking Vs Testing
The first problem is equating an automated check’s execution time to a tester’s man-hours. The way a machine runs a script is not the same as how a person would test that feature. There is a lot of background to this concept in methodologies like Rapid Software Testing; for those not familiar with them, let me quickly summarize the relevant idea.
The verb “testing” is an act of “thinking” and “communicating” about how to test a specific feature. Once testers decide what to test, they execute the scenarios. A machine is incapable of “testing”, since it can neither “think” nor “communicate” like a human. It can only “execute” what it is instructed to check.
(Thanks to the RST community, James Bach, Michael Bolton and folks for articulating this clarity)
The missing effort
Let’s take an example of a candidate application that would hypothetically require around 1,000 man-hours to test completely (by the way, many products fit this description). How many testers would be needed to regress the application within 2 weeks? Around 13 full-time testers. Do you think the team would have 13 testers? Mostly not; they would have fewer people than needed and make do with whatever time they get.
Now, half the effort of “Testing” was the thinking part which a machine cannot do (Some would argue, including me, a lot more than half). The other half is supposed to be spent on “Execution”, where only a small percentage is actually being spent since the team size is ALWAYS smaller than needed.
So while there ‘might’ have been ‘some’ savings in terms of man-hours, in practice there are next to none, because most teams are not operating under the assumptions the ROI calculation relies on.
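A quick back-of-the-envelope check of the 1,000-hour example above (assuming a 40-hour work week):

```python
# Ceiling division gives the headcount needed to fit 1,000 hypothetical
# man-hours of testing into a 2-week regression window.
total_test_hours = 1000
hours_per_tester = 2 * 40  # 2 weeks at 40 hours/week
testers_needed = -(-total_test_hours // hours_per_tester)  # ceil(1000 / 80)
print(testers_needed)  # 13
```

Few teams staff anywhere near that number, which is where the ‘saved’ hours quietly disappear.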
Then why Automate?
Increased test coverage
From our example, we were not able to test the complete application. And from my practical experience, many products are tested ‘way’ less than they should be. Adding a dozen more testers does not seem practical either.
To cover more ground, testers can program a dumb machine to do the basic ‘execution’ they otherwise have to do unwillingly (since repeating the same thing is boring) every time a release goes out. This frees up their time for intelligent work while the repetitive checks are done by a machine.
Testers focus on important areas
This might seem a repetition of the point above, but there is a slight and important difference. Testers don’t just free up their time; they can also leverage the dumb grunt by focusing on the thinking part and delegating as much of the ‘execution’ as possible. A high percentage might not be achievable, but if automation is leveraged properly, test quality can improve significantly, since most of the time is spent ‘thinking’ rather than doing repetitive work. More on test scenarios that are ideal candidates for automation here.
Quick feedback – Find problems earlier
How many times has it happened that after a bug fix an important feature stops working altogether, and this comes to light at the 11th hour, when there isn’t enough time to regress the fix properly either?
There is a lot of value in getting feedback quickly. Different checks running at different stages of the development process can provide the feedback needed at each point. As an example, a possible plan could be: run unit tests and high-level checks during development, complete regression in the QA stage, and user acceptance tests on production, or any process that suits your product and team.
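One way to sketch this staging, without assuming any particular framework, is to tag each check with the pipeline stage it belongs to; the stage names and checks below are illustrative (with pytest you would get the same effect using markers):

```python
# Minimal stage-tagging sketch: register each check under a pipeline stage,
# then run only the checks that belong to the current stage.
STAGES = {}

def stage(name):
    """Decorator that registers a check under a stage like 'commit' or 'regression'."""
    def wrap(fn):
        STAGES.setdefault(name, []).append(fn)
        return fn
    return wrap

@stage("commit")
def check_login_page_loads():
    return True  # placeholder for a fast, high-level check run during development

@stage("regression")
def check_full_checkout_flow():
    return True  # placeholder for a slower end-to-end scenario run in the QA stage

def run_stage(name):
    """Run every check registered for this stage; True if all pass."""
    return all(fn() for fn in STAGES.get(name, []))

print(run_stage("commit"))  # True
```

A CI pipeline could then invoke `run_stage("commit")` on every push and `run_stage("regression")` nightly or before a release, so each stage gets feedback at the speed it needs.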
Quick feedback – A big step towards CD
The most successful companies are the ones that take an idea from the drawing board into the consumer’s hands most quickly. This is where continuous delivery comes in. The race to minimize ‘time to market’ can become a huge factor, giving a first-mover advantage. This video will give more detail on how automation facilitates that.
Increased confidence in the product
An inherent problem with exploratory testing is that “it’s done by humans”, which is also a good thing people tend to forget. A tester might not test the same function the exact same way every time, or might forget to test it altogether. With automated checks, we can be certain of which features were checked and whether they are working.
This makes the decision to ship a release much easier and provides some quantitative measures on which to base that decision. This alone is not enough to make the call, but coupled with decent exploratory testing, it makes a difference.
It’s not just the team: the product’s customers can also take satisfaction in knowing that certain checks are automated, ensuring the functionality has most probably gone through a checking process.
Commitment to quality
From our example and my experience, most teams do not have enough testing staff to completely regress the application every time a change is made. Some would argue that is also impractical. Having automation in place shows a commitment to ensuring that as much of the application as possible gets tested or checked before shipping to customers.
This is where the phrase ‘quality is a mindset’ comes in. When we hold ourselves and our product to a high standard, some form of automation becomes a necessity, because most modern applications cannot be tested adequately by a cost-effectively sized team.
There are savings, but not the way we calculate them
Equating man-hours to machine hours of execution is not the correct formula for finding the return on investment of an automation project. The returns do not come as testers’ man-hours saved, but in different forms which are by no means less important, just less obvious.
The real value comes from increased test coverage; from letting testers focus on what really matters while delegating grunt work to a machine; from quicker feedback on fundamental problems; from a major milestone towards reducing time to market; and from increased confidence in the product’s quality, for both the team and the customers, reflecting a commitment to quality that reaches the end consumer.
Feel free to share what other benefits you feel automation brings to the table.
I recently made the good old ‘one-liner change’ in our automation framework, which has a 22-hour batch run with one permutation only. A week before regression we discovered my ‘small change’ had caused a straight 25% of false failures. When my awesome team suggested reverting the change, I realized I had not recognized the demon at the time. This demon goes by many names: ‘one-liner change’, ‘small change’, ‘localized change’ and so on (and will be referred to as the demon from here on, in letter and spirit).
Cause of the trap
Have you ever sat in a car with a friend driving and felt a bit unsafe, as if they did not have full control over the car? And how is the driver feeling in that moment? Pretty confident, for the most part. I feel something similar happens when one person on the team makes a ‘small change’ in code. In the moment of making the change, it feels like putting a bandage on a little bunny’s scratch. Later it turns out that was a lion we just misfired a pump-action at, and now we are running for our lives.
The demon effect
The demon is lethal mainly because it does not look like it will create much of a problem. Teams often end up making such changes close to regression or release, leaving little time to react. Eventually, the decision between releasing with bugs and delaying the release has to be made, which no one likes. I would equate this with ‘throwing good money after bad’: adding more time to salvage the situation hardly ever works, except to gift you more frustration (my published thesis on the subject here).
Small and ‘not so small’ change
While small or localized changes do exist, sometimes the demon is mistaken for one. What defines a localized change, then? A formula I like to use: the change is encapsulated within a small area of the application, a low-level module, or one class, affecting the functionality of just that module or class. The mistake I have seen is confusing a localized change with functionality that is not encapsulated within a confined class or module but which we feel is used only in one or a few specific areas. Unless there is no way the change can be accessed by, or directly affect, anything outside its scope, it is not a localized change.
Quarantine the demon
The change must be made, no doubt, but it can be made in a time of peace, not at the 11th hour a day before regression. Implement it in the early sprints of a release, so there is time to react and adjust before the deadline hits. Firstly, while implementing, you hopefully will not be in a rush and can think it through. Secondly, there will be time to test issues around just that change, making it easier to identify the cause of any bugs we see.
A general best practice is to make changes in increments rather than the whole big change at once. If you have automation running at the UI or API level, that makes things easier: as you push the big change in increments, you learn right away about any outright effects it might have. And if you don’t have automation, I would say first think about adding it; otherwise, focus manual testing as much as possible on testing the ripple effects.
Big decisions take time
As with big decisions in life, architectural changes take time, not just to research but to think about and reflect upon; they need time to ‘sink in’. Doubling the resources working on a change will not get it done in half the time. For big changes, I usually start thinking about them ahead of time with my team, so when we do start we have things in perspective and can make a better judgement.
Care to share what else you would do to make an architectural change transition smoothly?
The product had been tested for years by the testing team, doing exploratory tests and writing test cases for important areas. The application grew day by day, eventually reaching more than a thousand test cases. That is when the testing team thought of delegating the ‘checking’ part of regression to automated scripts, to free up some time for real testing.
Many product teams coming to automation have reached this stage and are looking for a way to turn written manual tests into automated scripts. In many cases the tests include rich scenarios the team wants to leverage, and they are looking to script an exact copy. Naturally this comes with inherent challenges, some of which I am about to share from how we managed one particular product.
Before moving on: some tools claim to automate manual tests straight from a Word document; that is not what is being discussed here (plus I have yet to see it work!).
Test case to script mapping
Ideally, all manual tests should be part of the automation suite as-is. However, differences are bound to creep in. To maintain traceability between tests and automated scripts, creating a mapping document is a good idea: essentially, map every manual test to an automated test. For any discrepancy between scenarios, record the reasons with appropriate tags (for ease of filtering).
As the application evolves, manual tests change and scripts need to be updated. Having this document helps in two ways:
First, the change becomes much easier for the person updating scripts if any prior discrepancy was written down, with the reasoning readily available.
Secondly, during regression it is very clear which areas automation is not covering and the manual testers might want to look into.
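As a minimal sketch of such a mapping document kept in code rather than a spreadsheet (the record fields, IDs, and tags below are my own illustrations, not from the original post):

```python
# Hypothetical sketch of a manual-test-to-script mapping record.
# Field names, IDs, and tags are illustrative only.
from dataclasses import dataclass, field

@dataclass
class MappingEntry:
    manual_test_id: str                        # ID in the manual test repository
    script_id: str                             # automated script covering it ("" = not automated)
    discrepancy: str = ""                      # how the script deviates from the manual test
    tags: list = field(default_factory=list)   # e.g. ["data-preseeded", "not-automatable"]

entries = [
    MappingEntry("TC-101", "test_login_valid"),
    MappingEntry("TC-102", "test_checkout_card",
                 discrepancy="payment step uses a pre-populated order",
                 tags=["data-preseeded"]),
    MappingEntry("TC-103", "", tags=["not-automatable", "physical-device"]),
]

# Filtering by the tags answers "what must regression still cover manually?"
not_automated = [e.manual_test_id for e in entries if not e.script_id]
print(not_automated)  # ['TC-103']
```

A plain spreadsheet works just as well; the point is that every manual test has exactly one row, and discrepancies carry both a reason and a filterable tag.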
Script incapability vs. sentient beings
There are always some steps in manual testing which the testing tool is not able to perform: a physical activity outside the product, a portion of the application that is not automatable, or a state where a very complex bunch of scripts would be needed to improvise. Instead of leaving the test out altogether, I usually recommend one of the following:
Alter the scenario to suit the script: salvage whatever you can, and forego what cannot be done.
Break the test into two. For the second test, use pre-populated data / a test scenario that avoids the non-automatable area.
The mapping document comes in very handy here.
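The "break the test into two" option can be sketched like this. The scenario (a physical barcode scan the tool cannot perform) and all function and data names are hypothetical, just to show the shape of the split:

```python
# Hypothetical sketch: one manual test split into two scripts around a step
# the tool cannot perform (here, a physical barcode scan). All names and
# data are illustrative, not from any real product.

def part_one_create_order():
    """Script 1: everything up to (but not including) the physical scan."""
    order = {"order_id": "ORD-42", "barcode_scanned": False}
    # ... UI / API steps to create the order would go here ...
    return order

def preseeded_order():
    """Pre-populated data representing the application state *after* the
    manual scan, so script 2 never has to perform it."""
    return {"order_id": "ORD-42", "barcode_scanned": True}

def part_two_fulfil_order(order):
    """Script 2: picks up from the pre-populated state and checks fulfilment."""
    assert order["barcode_scanned"], "script 2 assumes the scan already happened"
    order["status"] = "fulfilled"
    return order

result = part_two_fulfil_order(preseeded_order())
print(result["status"])  # fulfilled
```

The design choice is that the second script owns its own starting data, so it stays runnable even though the step in between is forever manual.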
Manual test steps in report
Test reports generated from automated scripts should be readable primarily by the manual testing team. Usually I see teams with test reports showing all the automation mumbo-jumbo right off the bat, creating a lot of confusion for anyone not involved in automation.
I strongly advise including the test steps, as they are, from the manual test case in the automation test report. Under each step should be the read / write details the script is performing. Non-automation folk can then make sense of the report, and it also makes it much easier for the automation team to fix issues.
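A minimal sketch of that report layout, with the manual step wording as the heading and the tool's actions nested under it (the step text, locators, and helper name are all made up for illustration):

```python
# Hypothetical sketch of a report entry where each manual test step is the
# heading and the script's read/write details are indented under it.

def report_step(step_text, actions):
    """Render one manual step with the script's actions nested below it."""
    lines = [f"STEP: {step_text}"]
    lines += [f"    - {a}" for a in actions]
    return "\n".join(lines)

out = report_step(
    "Login with a valid user",            # wording copied from the manual test case
    ["type 'jdoe' into #username",        # automation detail, indented under it
     "type '****' into #password",
     "click #login, wait for dashboard"],
)
print(out)
```

A manual tester scanning the report reads only the STEP lines; the automation engineer debugging a failure drills into the indented detail under the step that broke.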
Apart from mapping differences from manual tests, we used this document for an overview of the complete automation suite's health. Scripts we knew were faulty and needed updates, scripts needing in-depth investigation, scripts failing due to a reported issue: all these status updates were appended to the document.
Even if you don't have manual tests to map to, every automation project should still have one spreadsheet with at least the fields listed. These are a huge time saver when managing batch runs / daily runs.
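The original field list is not reproduced here, but a minimal health-tracking sheet might look like the following; the column names and statuses are my own guesses at the kind of fields described, not the post's actual list:

```python
# Hypothetical suite-health sheet written as CSV. Column names, statuses,
# and script/issue IDs are illustrative assumptions.
import csv
import io

COLUMNS = ["script_id", "status", "reason", "linked_issue"]
rows = [
    ["test_login_valid",   "passing", "",                        ""],
    ["test_checkout_card", "faulty",  "locator changed in v2.3", ""],
    ["test_refund_flow",   "failing", "known product bug",       "BUG-118"],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(COLUMNS)
writer.writerows(rows)

# A quick summary someone managing the daily run might eyeball first.
needs_attention = [r[0] for r in rows if r[1] != "passing"]
print(needs_attention)  # ['test_checkout_card', 'test_refund_flow']
```

Whatever the exact columns, the value is the same: before triaging a batch run, you already know which failures are expected and why.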
Recently I had the opportunity to go through the cycle of hiring an automation engineer to join our team. It had been a while; I used to do this much more frequently a few years ago. This time I stumbled upon a few 'new' realizations, especially for hiring automation engineers, which had never occurred to me before. With many folks struggling to find suitable candidates for this title, I thought of jotting down some lessons learned through the process. The best part: it's really one step (to begin with).
Cut the Crap
Candidates undervalue their skill set (especially the ones you want to hire). Moreover, a huge job description stuffed with every buzzword the hiring manager could find on the internet scares potential candidates away from applying. For any given position there are usually only a few key skills that are vital for the team. With a lot of 'name dropping', candidates get confused and hesitate to apply because of lines like 'Expert with 3+ years experience in SQL Server', when in many cases there is maybe a 5% chance the new hire will ever write complex queries.
Our job ad was not performing well. The candidates we got were not even a match on paper. HR suggested taking another look at the job description, and she was right. Hesitantly I reviewed it and found the job ad was scary, to say the least. No doubt the position had considerable requirements, but the essentials were few. While cutting down the content, I settled on the few skills I felt were essential for an automation engineer / SDET (NOT a lead position); they are presented here.
Not '10+ years of experience in Java' from someone who could code a talking parrot. If your project is in Java, don't necessarily look for a Java guru. If a person has the aptitude for constructing algorithms and can demonstrate good programming skills in any language, he/she can learn the new language or framework. When the technology changes (which it will), such people will be the most comfortable adapting, more willing to learn new tools / languages and leave their comfort zone.
'Testing acumen' is a term I use for a tester's mindset: someone able to craft test scenarios covering different aspects of the AUT, who has the process-related knowledge, the skill to extract requirements, and the ability to tell a great story when writing up an issue, and so on. I have always believed an automation project is only as good as the scenarios being automated. If the scripts are checking trivial functionality, they are not going to make the difference we want.
The counter argument I have heard is that since test cases are being provided, automation folks don't need in-depth testing experience or knowledge. Well, a person with technical insight and a tester's eye is better placed to suggest which areas are not being tested and how to test them. Even if a team has unit tests, integration tests and UI tests, someone still needs to create scenarios for the system-level tests verifying the business logic at the system level.
The one thing an automation team might never get rid of is 'technical problems'. Each day is going to be a new day: either you make mistakes and learn (for the most part), or you are wise and spend considerable time learning how to avoid the mistakes. Through all these endeavors, the only thing that keeps you going is the right attitude.
In any job, having the right attitude is the most fundamental aspect, and that is especially true for automation folks. They never run out of problems, and they can never get tired of solving them!
Among non-technical skills there are many qualities hiring managers aspire to; this is one I found lacking in some cases and most relevant for this type of position. The term 'smart creative' comes from Google, an enhanced version of Peter Drucker's knowledge workers. I understood it as 'a person with the fire to learn new things, who is technically and business savvy and has a creative personality'. Imagine what a smart creative technical tester would do: put development on DEFCON 1! Actually no; instead I feel he / she should bridge the gap between both camps and get the best of both worlds.
I had some candidates who were not very savvy with the traditional automation frameworks, but had great programming and learning skills, could develop algorithms and had learned other programming languages. Those candidates were equally good in my book, and I would definitely hire such a person.
If your project uses Selenium with Cucumber, it would be unwise to hire someone already working on exactly that stack, because there is no new learning in it for them; the increased salary will keep them content only for so long. Look for people who have the level of exposure you need, not necessarily in the same tool. For elementary positions, a basic understanding of how UI automation generally works, the common problems faced in automation, and how the DOM works would suffice.
This position is fundamentally hard to fill. QA folks with the 'testing acumen' have traditionally kept their distance from the technical aspects of developing software. Development folks have the technical exposure but lack the testing acumen. Where to find this mixed breed!
Hence, focus on the fundamentals. I feel that by digging too deep for either testing knowledge or technical skill set, the job gets even harder. That is not to say hire someone with less of either of the two; instead, hire someone with the 'aptitude', not necessarily x+ years against a huge checklist.
The job description should revolve around the fundamentals, no more than a handful of bullets. As the 'smart creatives' authors put it, 'widen the aperture': employers tend to narrow candidates down to very specific backgrounds only. We had an interesting candidate once we widened the aperture. He had an accounting background, yet demonstrated great skill in different development technologies too. A lot of hidden gems are left out when resume filters are too tight.
Care to share what else you would consider while hiring an automation engineer?