The product had been tested for years by the testing team, doing exploratory testing and writing test cases for the important areas. The application grew day by day, and the suite eventually reached more than a thousand test cases. That is when the testing team thought of delegating the 'checking' part of regression to automated scripts, to free up time for real testing.
Many product teams moving towards automation reach this stage and look for a way to shift their written manual tests to automated scripts. In many cases the tests include rich scenarios the team wants to leverage, so they aim to script an exact copy. Naturally this comes with its own challenges; here is how we managed some of them for one particular product.
Before moving on: some tools claim to automate manual tests straight from a Word document and the like. That is not what is being discussed here (plus I have yet to see it work!).
Ideally, all manual tests should be part of the automation suite as they are. However, differences are bound to creep in. To maintain traceability between tests and automated scripts, creating a mapping document is a good idea: essentially, map every manual test to an automated test. Where there is a discrepancy between the scenarios, note the reasons with appropriate tags (for ease of filtering).
As the application evolves, manual tests change and scripts need to be updated. Having this document makes it easy to see which scripts a change affects and which ones are already out of date.
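To make this concrete, here is a minimal sketch of what such a mapping document could look like when kept as a plain CSV, along with a small script that flags un-automated tests and tagged discrepancies. The file name and column names are illustrative assumptions, not a prescribed format.

```python
# Minimal sketch: a manual-test-to-script mapping kept as a CSV.
# "test_mapping.csv" and its columns (manual_test_id, automated_script,
# discrepancy_tag, reason) are assumed names for illustration only.
import csv

def load_mapping(path="test_mapping.csv"):
    """Return rows mapping each manual test to its automated script (if any)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def report_gaps(rows):
    """List manual tests with no script yet, or whose script diverges from the test."""
    for row in rows:
        if not row["automated_script"]:
            print(f"{row['manual_test_id']}: NOT AUTOMATED - {row['reason']}")
        elif row["discrepancy_tag"]:
            print(f"{row['manual_test_id']}: [{row['discrepancy_tag']}] {row['reason']}")

if __name__ == "__main__":
    report_gaps(load_mapping())
```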
There are always some steps in a manual test which the testing tool cannot perform. It could be a physical activity outside the product, a portion of the application that is not automatable, or a state that would need a very complex set of scripts to reach. Instead of leaving the test out altogether, I usually recommend automating the steps you can and flagging the rest as explicit manual follow-ups, as sketched below.
The mapping document comes in very handy here.
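As one way to keep such a test in the suite, the sketch below (using pytest purely as an illustration) runs the automatable steps as code and records the rest as explicit manual follow-ups in the log. The marker name and the helper are assumptions for illustration, not part of any standard API.

```python
# Sketch: keep a partially automatable test instead of dropping it.
# Automatable steps run as code; steps the tool cannot perform are logged
# as manual follow-ups so they appear in the report.
import logging
import pytest

log = logging.getLogger("suite")

def manual_step(description):
    """Record a step the tool cannot perform so it shows up in the report."""
    log.warning("MANUAL STEP REQUIRED: %s", description)

@pytest.mark.manual_followup  # custom marker; register it in pytest.ini to avoid warnings
def test_export_report_to_printer():
    # Automatable part: drive the product to produce the report.
    # ... driver / API calls would go here ...
    manual_step("Verify the printed copy matches the on-screen report layout.")
```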
Test reports generated from automated scripts should be readable first and foremost by the manual testing team. Too often I see test reports showing all the automation mumbo-jumbo right off the bat, which creates a lot of confusion for anyone not involved in automation.
I strongly advise including the test steps from the manual test case verbatim in the automation test report. Under each step should be the read / write details the tool is performing. Non-automation folk can then make sense of the report, and it also becomes much easier for the automation team to fix issues.
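A minimal sketch of this idea, assuming a simple home-grown reporting helper rather than any particular framework: each manual-test step is printed verbatim, and the tool-level actions are indented underneath it. The step texts and actions are made up for illustration.

```python
# Sketch: nest the tool's read/write actions under the wording of the manual step,
# so the report reads like the original test case.
from contextlib import contextmanager

class StepReport:
    def __init__(self):
        self.lines = []

    @contextmanager
    def step(self, text):
        """Print the manual-test step verbatim, then indent whatever the tool does."""
        self.lines.append(f"STEP: {text}")
        yield self

    def action(self, detail):
        self.lines.append(f"    tool: {detail}")

    def render(self):
        return "\n".join(self.lines)

report = StepReport()
with report.step("Log in as an administrator"):
    report.action("type 'admin' into #username")
    report.action("click the 'Sign in' button")
with report.step("Open the monthly sales report"):
    report.action("GET /reports/monthly returned 200")
print(report.render())
```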
Apart from mapping differences from the manual tests, we used this document to keep an overview of the whole automation suite's health. Scripts we knew were faulty and needed updates, scripts that needed in-depth investigation, scripts failing due to a reported issue: all these status updates were appended to the document.
Even if you don't have manual tests to map to, every automation project should still have one spreadsheet with at least fields like these: the test or script name, any discrepancy tags and their reasons, and the script's current status. It is a huge time saver when managing batch runs / daily runs.
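As a rough sketch, the columns below reflect the uses described in this post (mapping, discrepancy tags, script health). The exact field names are illustrative assumptions, not the template we used.

```python
# Sketch: minimum columns for a suite-tracking spreadsheet, written out as a CSV header.
import csv

FIELDS = [
    "manual_test_id",      # or test name, if there is no formal ID
    "automated_script",    # blank when the test is not automated
    "discrepancy_tag",     # e.g. NOT_AUTOMATABLE, PARTIAL, DIFFERS
    "reason",              # why the script differs from the manual test
    "status",              # e.g. OK, NEEDS_UPDATE, UNDER_INVESTIGATION, KNOWN_ISSUE
    "linked_issue",        # defect id when the script fails due to a reported bug
]

with open("automation_suite_tracker.csv", "w", newline="") as f:
    csv.writer(f).writerow(FIELDS)
```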
Care to share what you did to map manual tests?
Till next time, Happy automating!