Avoid the Automation Maintenance Trap

Don’t save your problems until later.

Samuel Ferguson and Ali Rad

Implementing a test automation framework is one thing; ensuring that the framework is resilient and cost-effective in the long term is another.

Many organisations want automation frameworks up and running quickly and covering many business scenarios. But the investment in your automation suite will only be worthwhile if it outlasts your first release. Why? Achieving "quick wins" can be deceptively cheap and easy, but it is not cost-effective if the resulting system is expensive to maintain.

Naïve automation means that you become a victim of:

  • Changes in the GUI
  • Changes to application functionality
  • Changes in test case logic
  • Changes in data
  • Environmental changes
  • Adding new mobile device, desktop or browser support

It’s time to take a step back and think about how we fall into the maintenance trap.

  • Are you copy and pasting code?
  • Are you hard coding test data?
  • Are you not using dynamic element locators?
  • Are you not organising your code into manageable chunks?
  • Have you got long unwieldy scripts?

To ensure that you and your automation testers are working to full effectiveness, you should plan your automation strategy.

We can leave it until next sprint.

Re-use and resilience to changes in the SUT

Implementing a BDD automation suite considerably improves the re-use of methods by utilising feature file scenarios. Step definitions, if implemented correctly, allow the rapid creation of a comprehensive regression suite.
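As a minimal sketch of the idea, the registry below maps Gherkin-style step text to a single implementation that many scenarios can share. A real suite would use a BDD framework such as Cucumber, behave or pytest-bdd; the step wording and function names here are invented for illustration.

```python
import re

STEPS = []

def step(pattern):
    """Register a step implementation against a Gherkin step pattern."""
    def decorator(func):
        STEPS.append((re.compile(pattern), func))
        return func
    return decorator

@step(r'the user logs in as "(\w+)"')
def log_in(context, username):
    context["user"] = username  # one definition, reused by every scenario

@step(r'the basket contains (\d+) items?')
def set_basket(context, count):
    context["basket"] = int(count)

def run_step(context, text):
    """Match a scenario line to its step definition and run it."""
    for pattern, func in STEPS:
        match = pattern.fullmatch(text)
        if match:
            return func(context, *match.groups())
    raise LookupError(f"No step definition for: {text}")

# Two lines from different feature files reuse the same definitions:
ctx = {}
run_step(ctx, 'the user logs in as "alice"')
run_step(ctx, 'the basket contains 3 items')
```

Because each piece of behaviour lives behind exactly one step definition, adding a new scenario is mostly a matter of composing existing steps rather than writing new code.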

Ensure that any re-used code is implemented as a function – e.g. text boxes are automatically cleared before data is entered.

Ensure that when locating UI or DOM elements you use XPath where possible, preferring relative expressions anchored on stable attributes, as this helps code continue to work after common changes to the UI.
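The contrast can be shown with Python's standard-library `xml.etree.ElementTree`, which supports a small XPath subset; the page fragment below is a hypothetical login form modelled as XML for brevity:

```python
import xml.etree.ElementTree as ET

page = ET.fromstring(
    "<html><body>"
    "<div><form id='login'>"
    "<input name='user'/><input name='pass'/>"
    "</form></div>"
    "</body></html>"
)

# Brittle: an absolute path breaks as soon as a wrapper <div> is
# added or removed by a UI redesign.
brittle = "./body/div/form/input"

# Resilient: a relative XPath anchored on a stable attribute still
# matches wherever the element moves in the DOM.
resilient = ".//input[@name='user']"

user_field = page.find(resilient)
```

A full browser driver would accept richer XPath than ElementTree does, but the maintenance argument is the same: anchor on what is stable (ids, names, data attributes), not on document position.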

Management of data

Data can be imported in various ways, but in a resilient framework it should never (or rarely) be hard-coded. BDD frameworks allow for Scenario Outlines, which extract their data from an Examples table. The table is easily understood and edited by the end user of the suite. This method is usually preferred when your data does not require unique values.
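To make the mechanism concrete, the sketch below parses a Gherkin-style Examples table (embedded as a string) and runs the same scenario body once per row; in a real BDD framework the table would live in the feature file and the parsing would be done for you. The column names and greeting logic are invented for illustration.

```python
# An Examples table as an end user of the suite would edit it:
EXAMPLES = """\
| username | expected_greeting |
| alice    | Hello, alice!     |
| bob      | Hello, bob!       |
"""

def parse_examples(table):
    """Turn a Gherkin-style Examples table into a list of row dicts."""
    rows = [
        [cell.strip() for cell in line.strip().strip("|").split("|")]
        for line in table.strip().splitlines()
    ]
    header, *data = rows
    return [dict(zip(header, row)) for row in data]

def greet(username):
    """Stand-in for the behaviour under test."""
    return f"Hello, {username}!"

# The same scenario body runs once per Examples row:
for row in parse_examples(EXAMPLES):
    assert greet(row["username"]) == row["expected_greeting"]
```

Adding a new test case is then a one-line table edit, with no code change at all.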

Sharing data between tests during parallel execution, or between phases of execution, is often necessary and is vital when testing asynchronous calls.

A Shared Data Server provides essential functionality to:

  • Archive created data
  • Extract and re-use archived data between tests
    • e.g. for asynchronous calls
  • Record test status to allow conditional behaviour by the test framework
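A minimal in-memory sketch of those three responsibilities is shown below. A real shared data server would sit behind a network API so that parallel test processes can reach it; the class and method names here are invented for illustration.

```python
import threading

class SharedDataStore:
    """In-memory sketch of a shared data server."""
    def __init__(self):
        self._lock = threading.Lock()   # safe under parallel execution
        self._records = {}
        self._statuses = {}

    def archive(self, key, record):
        """Archive data created by one test for later re-use."""
        with self._lock:
            self._records[key] = record

    def extract(self, key):
        """Re-use archived data, e.g. to check the outcome of an async call."""
        with self._lock:
            return self._records.get(key)

    def record_status(self, test_name, status):
        """Record test status so the framework can behave conditionally."""
        with self._lock:
            self._statuses[test_name] = status

    def status(self, test_name):
        with self._lock:
            return self._statuses.get(test_name)

store = SharedDataStore()
store.archive("order-42", {"id": 42, "state": "submitted"})
store.record_status("create_order", "passed")

# A later phase only continues if the earlier phase succeeded:
order = None
if store.status("create_order") == "passed":
    order = store.extract("order-42")
```

The conditional at the end is the "conditional behaviour" from the list above: downstream tests can skip, retry or proceed based on recorded status instead of blindly failing.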

Organising your code into manageable chunks

Always ensure that you are using the Page Object Model (Page Factory) to its full potential by breaking up SUT designs into re-usable components. One example would be navigation functionality shown on every page.
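The navigation example can be sketched as follows, assuming a Selenium-like driver with a `get` method; `FakeDriver` and the `example.test` URLs are invented so the snippet is self-contained:

```python
class FakeDriver:
    """Stand-in for a browser driver; a real suite would use Selenium."""
    def __init__(self):
        self.url = None
    def get(self, url):
        self.url = url

class NavigationBar:
    """Shared component: the nav bar appears on every page, so it is
    modelled once and re-used by every page object."""
    def __init__(self, driver):
        self.driver = driver
    def go_home(self):
        self.driver.get("https://example.test/home")  # hypothetical URL

class BasePage:
    def __init__(self, driver):
        self.driver = driver
        self.nav = NavigationBar(driver)   # composed, not copy-pasted

class LoginPage(BasePage):
    def open(self):
        self.driver.get("https://example.test/login")

class AccountPage(BasePage):
    def open(self):
        self.driver.get("https://example.test/account")

driver = FakeDriver()
page = LoginPage(driver)
page.open()
page.nav.go_home()   # same navigation code, whichever page we are on
```

When the navigation bar changes, only `NavigationBar` changes; every page object and every test that uses it picks up the fix for free.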

When automation tests are first created, they’re often quite lengthy. Smaller scripts remove the chain effect of failure, where one early failure blocks every step that follows it.

Using the shared data server to archive data and re-use it helps you keep a backlog of successful records from which to continue a test.

We’ve got more important things to do.


When asked to create an automation framework in a short time span, you can build up a lot of technical debt. This gives the IT manager a false impression of progress, and the maintenance requirements for future releases may become unmanageable. Implementing BDD, the Page Object Model (Page Factory), smart functions, a shared data server and robust locators will significantly help the battle against the maintenance trap!