Functional Test Automation with DevOps Capabilities for a Travel Software Company


The Client provides sales and back-office software solutions for travel agents and tour operators. They build on the Microsoft stack, developing their software with Visual Studio, SQL Server and TFS. Testhouse had been engaged to review their software processes, and that review identified a need for functional test automation support to assist their ongoing drive for quality improvements. They decided to explore Selenium for automating tests as a replacement for their existing Telerik test automation suite, which was not meeting their demands for a scalable and easy-to-maintain test automation solution.


The Challenge

The client had a clear view of the automation solution they wanted to achieve. It needed to provide a wide range of test coverage and integrate into their CI process (using TFS), so that automation scripts could be run against every CI build (3+ times per day) and against their main QA environment (daily).

They also required detailed reporting that could be easily understood by their QA team, who had limited automation or development skills. As a small business, budget was also a constraint: the solution needed to compete on cost with their previous approach of using internal staff for automated testing.
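To give a sense of the CI cadence described above: builds of this kind were typically configured through the TFS build definition UI, but as a rough, hedged sketch, an equivalent modern Azure Pipelines YAML definition might look like the following. All names and paths here are illustrative assumptions, not the client's actual configuration.

```yaml
# Illustrative sketch only -- run the Selenium pack on every CI build,
# plus a scheduled daily run against the main QA environment.
# Solution name, branch name and test filter are assumptions.
trigger:
  branches:
    include:
      - main            # fires on every CI build (3+ times per day)

schedules:
  - cron: "0 6 * * *"   # daily run against the main QA environment
    displayName: Daily QA run
    branches:
      include:
        - main
    always: true

pool:
  vmImage: windows-latest

steps:
  - task: VSBuild@1
    inputs:
      solution: "AutomationPack.sln"   # assumed solution name
  - task: VSTest@2
    inputs:
      testAssemblyVer2: |
        **\*Tests*.dll                 # assumed test assembly pattern
```

The same split (per-build trigger plus daily schedule) is what allows one automation pack to serve both the CI builds and the QA environment runs.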


The Solution

Testhouse provided a free proof of concept (POC) to demonstrate the capability of its custom-built Selenium frameworks. The client was given a choice of implementation language and chose C#, as this fitted with their internal development processes. Testhouse deployed a team of offshore automation testers to automate a pack of test cases pre-defined by the client. The team included Microsoft experts, so code could be stored in TFS and Testhouse could assist in wiring the automated tests into the client's CI process.

During the automation build phase, we held weekly progress calls with the client team to ensure that what was delivered met their expectations. This allowed us to identify any script issues at an early stage and to refine the reporting standards to meet the needs of the client's QA team. Scripts were delivered incrementally throughout the project so that value could be derived from the automation pack early on.

Impact and Outcomes

  • Testhouse worked closely with the client to ensure that they received an automation solution that would align with their agile processes and fit their ongoing requirements.
  • Using Testhouse’s Selenium framework saved the client time and money and created a solution that can be maintained easily as part of their fast-paced release schedule.
  • Testhouse’s offshore team provided first class automation and DevOps capability whilst also aligning with the client’s budget constraints.
  • The client is now running 27 automated test flows against their application daily. Previously, these tests could only be run once per 40-day development cycle.
  • The client engaged with Testhouse to provide ongoing flexible support to maintain their automated test pack.


“The key to our success with Testhouse was taking a collaborative approach: we provided detailed requirements from the outset, which the Testhouse team accurately converted into the end tests we sought. We met regularly to ensure the project was meeting our expectations and so any issues could be detected and rectified quickly and early in the process. We talked through any concerns we had and, on such occasions, reached a resolution both parties agreed with.

We now have a suite of automated high-level test flows that run daily. These will also be integrated into our CI process, enabling us to run against multiple environments and providing a daily benchmark of quality. We fully expect to extend the suite to provide even greater coverage of the critical flows within our web-based applications, giving us an even broader indicator of product quality.

Previously, workflow tests were only run once per development cycle, during our regression period, so the benefit gained has been two-fold. First, we are now able to catch issues as soon as they are introduced and, secondly, these tests can be run repeatedly, with the reporting feature requiring minimal manual effort when reviewing test run results.”

Test Manager