Most of us understand the advantages of automation in testing: automation helps you run checks faster, repeat them as often as needed, and keep them running over long periods.
However, while automation can help you save time and money, there are a few hidden costs that many miss.
Learning costs
Most tools are demoed with a login, a few swipes, and some basic data entry. The tool is shown on the simplest of use cases, and everyone assumes it can be learned quickly.
But bring the tool into your actual project, and you soon realize it is not as straightforward as it appeared. Don't forget the dependencies on third-party plugins and on specific versions of each.
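To make the version dependencies concrete, here is a minimal sketch, not taken from any particular tool, of a startup check that compares installed packages against the versions a suite was built with. The package names and pins are hypothetical placeholders.

```python
# Minimal sketch: fail fast when installed plugin/library versions drift
# from the ones the automation suite was developed against.
# Package names and version pins below are hypothetical examples.
from importlib.metadata import version, PackageNotFoundError

PINNED = {
    "selenium": "4.21.0",
    "pytest": "8.2.0",
    "some-vendor-plugin": "1.3.5",
}

def check_pins(pins: dict[str, str]) -> list[str]:
    """Return human-readable mismatches between pinned and installed versions."""
    problems = []
    for name, expected in pins.items():
        try:
            installed = version(name)
        except PackageNotFoundError:
            problems.append(f"{name}: not installed (expected {expected})")
            continue
        if installed != expected:
            problems.append(f"{name}: installed {installed}, expected {expected}")
    return problems

if __name__ == "__main__":
    for problem in check_pins(PINNED):
        print(problem)
```

Running a check like this before the suite starts surfaces the "it works on my machine" class of surprises during the trial phase rather than mid-project.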
Once everything is set up, do not test the tool only on the use cases built into it. Try it on use cases similar to your real projects. Observe whether you need help at every step or can learn the tool on your own. Try it on multiple machines and networks as well. Ideally, pick a few cases from your automation suite and have the team members who will work on the project build them with the tool, just as they would on the real project.
There have been multiple instances across teams where a tool looked simple to learn but did poorly when put through integration cases. It is a good idea to check forums, reviews, and the resources section on the tool's website to understand the types of problems others have already solved.
Keep in mind that the learning cost is never a one-time cost. It is a recurring cost every time new features are added in a release and new team members start working on the project.
Adoption costs
Change is hard, and people might not be willing to let go of their existing tools. Switching to a different tool is not straightforward: people have to set aside their biases and the effort already invested in the project.
Orders handed down from the top have rarely worked wonders in the long run. If the team on the ground doesn't embrace the tool, the journey to successful automation will be hard.
In one of the projects I worked on, there was a skill set mismatch. The tool was new to the market and supported only a limited set of languages, none of which the team was familiar with. However, the client was reluctant to use any other tool because they had spent a lot of money on this one, so the team had to learn the supported languages. This kind of mismatch usually happens when the team is not involved in the purchasing decision.
Talk to people, look at the details beyond the numbers, and ask for honest opinions on the adoption costs. A small proof of concept on the actual project use cases is worth the investment. You will also get an idea of the support for the tool from the people who will actually be using it.
Maintenance costs
Even if you can offset licensing costs by opting for open-source or free tools, there are maintenance costs to be aware of. Look for tools that keep up with changes such as new browser versions and other advances in technology.
Automation projects are like marathons: they need to keep working continuously for a long time. Almost any technological change demands a lot of rework, and the tool must accommodate that. There is hardly a project where the automation team wrote the scripts once and never had to touch them again.
However, I remember a project where the Ruby version was never upgraded beyond 2.4, because the scripts and framework would have needed a lot of rework to be compatible with the latest version, 2.7. Everyone was comfortable with the decision, even though it meant giving up a lot of newer capabilities.
There have been projects where every line of code worked as expected until a simple change, be it a new browser version or a flow change, broke the scripts beyond repair. A lack of testability and flexibility can stall progress. Be aware of such costs; a sketch of one mitigation follows.
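As an illustration of the flexibility that keeps such changes from being fatal, here is a minimal sketch assuming a Selenium-based Python suite; the URL, locators, and credentials are hypothetical. Keeping locators and steps in one page object means a changed login flow is a one-class edit rather than a rewrite of every script.

```python
# Minimal page-object sketch: tests talk to LoginPage, not to raw locators,
# so a UI or flow change is absorbed in one place.
from selenium import webdriver
from selenium.webdriver.common.by import By


class LoginPage:
    URL = "https://example.com/login"              # hypothetical
    USERNAME = (By.ID, "username")                 # update here if the UI changes
    PASSWORD = (By.ID, "password")
    SUBMIT = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def log_in(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()


def test_login():
    driver = webdriver.Chrome()
    try:
        LoginPage(driver).open().log_in("demo-user", "demo-pass")
        assert "dashboard" in driver.current_url   # hypothetical post-login URL
    finally:
        driver.quit()
```

This is not the only way to add resilience, but some such layer of indirection is usually what separates a suite that survives a flow change from one that has to be rewritten.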
Maturity costs
Force-fitting automation into testing has become routine for many teams, driven by the fear of missing out. There is no check on whether automation is needed, whether the existing processes are mature enough, or how the automation would add value to the overall project.
A proof of concept, when it happens, is often done only to evaluate how easy the tool is to learn and how well it suits the problem at hand. Then the procurement cost overrules the team's pick, and a different tool is chosen.
I saw an instance where the automation project was a known quarterly joke in the company. Every three months, the product shipped a major version and the automation suites broke. The cycle continued, and more money was burned on rewrites every six months.
Poor planning and a lack of long-term thinking hurt the team more than the choice of tool. Some teams are just not mature enough for automation, even if they practice agile development.
If you want to know if your team is ready for automation, find other teams—perhaps within your own organization—with good practices for automation, and see if you can adopt what they do. Could you learn from their mistakes? Could you save some costs by incorporating those lessons instead of reinventing the wheel?
10 Questions to Ask before Starting Automation
Before jumping into a new automation initiative, teams should ask the following questions:
- Is the tool ready for our real challenges?
- Is there enough help available to guide the teams on their journey?
- Was there a proof of concept done using our actual use cases?
- Are the teams willing to adopt the required changes?
- Have the teams chosen the tool, or is the tool being imposed on them?
- Is there a skill set match between the tool and the teams?
- Has enough planning been done for this project? Is it one-time planning or long-term planning?
- Is automation really necessary for our current projects?
- Has anyone else in the company tried anything similar before? Can we learn from them?
- Do we really need to pay for a tool, or would open-source or free tools serve our purposes?
By thinking through and answering these questions, teams can surface some of automation's hidden costs before those costs take them by surprise.