How much time would you guess an average programmer spends debugging code? The few studies I've looked at estimate around 50% of the time. This is a valuable insight for improving one's efficiency. If extra thought is put in while writing a feature or script the first time, it can cut the time spent debugging problems several times over. And if we get better at debugging itself, that too can have a large impact on a programmer's productivity.
In this article, with reference to UI automation, I'll briefly go over how to avoid having to debug problems in the first place, followed by some tips from my experience on making fixes faster and more effective.
Before we talk about how to debug, let's take a step back and discuss how to reduce the number of times we have to debug in the first place. For UI automation, that largely means reducing flakiness, which keeps maintenance time in check and leaves fewer problems to debug. Apart from flakiness, a few other common factors cause issues as well; these are discussed further on.
It is impossible to write code that will never have a problem, so debugging is inevitable. However, if a program is written with the attitude that issues can be sorted out during the 'inevitable' debugging exercise later, you and your team are in for a long and bumpy road. It is far better to spend extra time thinking through the algorithm and the architectural impact of a change than to jump straight into the implementation.
Here are a few general tips on reducing 'inherent' flakiness and coding problems down the road.
Luckily, I started my career working on safety-critical devices. One of the practices I picked up there was measuring code complexity, a concept I don't see many software development teams using these days.
"Cyclomatic complexity" is a software metric, developed by Thomas J. McCabe, Sr. in 1976, used to measure the complexity of a piece of code [1]. Using this (and other metrics), complexity is calculated with the intent of keeping the code below a specific benchmark value (for firmware written for embedded / IoT devices, the benchmark is 30).

The premise of the concept is that once the code goes above a certain threshold, the software becomes more prone to defects [1] and can go into unknown states, so we should limit the amount of complexity we add to our code. Going into the details would require a separate article; to summarize how to reduce it: as the number of decisions within a method increases, break it into multiple methods / classes / modules.
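To make this concrete, here is a minimal sketch of that kind of refactoring in Python (my own illustration; the article does not prescribe a language): a validation function whose stacked conditions push its cyclomatic complexity up, followed by an equivalent version split into small, single-purpose helpers.

```python
# Hypothetical example: each `if` adds a decision point, so this single
# function carries a noticeably higher cyclomatic complexity.
def validate_order(order):
    if order is None:
        return False
    if not order.get("items"):
        return False
    if order.get("total", 0) <= 0:
        return False
    if order.get("currency") not in ("USD", "EUR"):
        return False
    if not order.get("customer_id"):
        return False
    return True


# Splitting the decisions into small, named helpers keeps the complexity
# of every individual function low and makes each rule easy to test.
def has_items(order):
    return bool(order.get("items"))

def has_valid_total(order):
    return order.get("total", 0) > 0 and order.get("currency") in ("USD", "EUR")

def has_customer(order):
    return bool(order.get("customer_id"))

def validate_order_v2(order):
    if order is None:
        return False
    checks = (has_items, has_valid_total, has_customer)
    return all(check(order) for check in checks)
```

The behaviour is identical; the difference is that each decision now lives in a function small enough to reason about and test on its own.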
## Framework design principles

### Pillars of framework design

Quality is built into the design; it cannot be painted onto a product once it is developed. The same goes for automation code: if you don't have a carefully thought-out framework, there is no way you can have a stable automation run. Adding bandages to deeply infected wounds does not solve the problem; the root cause must be fixed.

In my experience, there are four "pillars of automation framework design" to follow, specifically for UI automation projects: maintainability, reusability, scalability and robustness. This is an extensive and important subject; to get started, you can go over a brief description of all four pillars in this article. Following these best practices while architecting the framework reduces the number of issues we have to debug; a small sketch of what this looks like in practice follows below.

Most flakiness in automated scripts is a direct result of poor framework design. You can either face a problem, learn from the mistake and then correct it, or learn from others' experience and avoid introducing the problem at all. It is best to learn from others' mistakes.
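As an illustration of the kind of structure those pillars encourage (my own hedged sketch, assuming a Python + Selenium stack, which the article does not prescribe), a simple page object keeps locators and actions in one reusable, maintainable place instead of scattering them across test scripts.

```python
# Hypothetical page-object sketch assuming Selenium WebDriver for Python.
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC


class LoginPage:
    """Encapsulates locators and actions for a hypothetical login screen."""

    USERNAME = (By.ID, "username")
    PASSWORD = (By.ID, "password")
    SUBMIT = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver, timeout=10):
        self.driver = driver
        self.wait = WebDriverWait(driver, timeout)

    def login(self, username, password):
        # An explicit wait makes the action robust against slow page loads.
        self.wait.until(EC.visibility_of_element_located(self.USERNAME)).send_keys(username)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()


# A test only talks to the page object, never to raw locators.
def test_login(driver):
    LoginPage(driver).login("demo_user", "demo_password")
```

Because every script goes through the same object, a UI change becomes a one-line fix in the page object rather than a debugging session across dozens of tests.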
### Environment stability

An automation environment can include a lot of things; while all of its components are important, the few with the most impact on flakiness are covered here.
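One generally helpful habit, shown here as a hypothetical sketch (my own illustration, not a recommendation from the article), is to keep environment details such as base URL, browser and timeouts in configuration rather than hard-coded in tests, so the same scripts behave consistently across environments.

```python
# Hypothetical sketch: environment details live in one JSON file per
# environment (e.g. env/staging.json), never hard-coded in test scripts.
import json
import os
from dataclasses import dataclass


@dataclass
class EnvConfig:
    base_url: str
    browser: str = "chrome"
    default_timeout: int = 10


def load_env_config(env_name=None):
    """Load settings for the environment named by TEST_ENV (default: staging)."""
    env_name = env_name or os.environ.get("TEST_ENV", "staging")
    with open(f"env/{env_name}.json") as fh:
        return EnvConfig(**json.load(fh))


# Usage: every test pulls the same, explicit configuration.
# config = load_env_config()
# driver.get(config.base_url)
```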
### Test data