Saturday, May 21, 2016

A Test Automation approach: Object Oriented frameworks - Part-I [ Introduction ]

B.J. Rollison recently posted about programming paradigms in test automation. This inspired me to write up some of the work my team has been doing in automation. My team has been working on a rather "non-traditional" approach to test automation of a product. I'm terming this "non-traditional" as I've not seen the approach discussed in the test automation community and forums. There have been plenty of discussions on keyword-driven and the widely accepted data-driven approaches.

For lack of a better term, I'm calling this the "Object Oriented approach" to test automation. Considering the elegance of this approach, I'm sure it has been practiced by testing teams; it just hasn't been discussed in the testing community and forums.

The "Object Oriented approach" considers the system under test as single object or a series object depending upon the complexity of the application and business functionality. The user action performed on the system are considered as methods/functions of these object. This approach also makes a clear distinction between the Test automation framework and the test suites that consumes this framework. The code that implements the encapsulation of the application into the methods and classes are part of the framework. Test Suites instantiates the classes and invokes the methods of these classes to simulate specific sequence of user actions performed by this application.

To explain further, I'll summarize the project we've been working on. This is a business analytics application performing complex statistical analysis on a data warehouse and publishing the results. At a very high level, users create projects. Within these projects, users define data filters based on complex queries. There are up to 16 different types of data filters and up to 12 types of statistical analysis supported. Being a data warehouse scenario, the underlying data undergoes periodic updates, and the application needs to handle historic as well as current data conditions.

A traditional automation approach of scripting the screen-level user actions would have to take into account the innumerable combinations. Automating the screen-level entities would have been a maintenance nightmare, considering that the application GUI changes frequently under an agile development methodology.

The approach we took was to bifurcate the test automation development into two distinct entities: framework development and test suite development. The framework provides the classes and methods, and the test suites strictly use these classes and methods for test development. For example, in our approach the framework provides classes and methods such as Project.create(...), Project.edit(...), Project.delete(...), Filter.create(...), Filter.edit(...), Filter.delete(...), Analysis.create(...), Analysis.verify(...), Application.close(...) etc.
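To make that concrete, here's a minimal sketch of what a test written against such a framework could look like. The example is in Java, and the class names, method signatures and arguments are illustrative only, not our actual framework API:

// Hypothetical sketch of a test suite consuming the framework classes.
// Class names, method signatures and arguments are illustrative only.
public class ProjectSmokeTest {
    public static void main(String[] args) {
        Project project = Project.create("SalesAnalysis");          // framework performs all GUI steps
        Filter filter = Filter.create(project, "Region = 'APAC'");  // filter defined through the framework
        Analysis analysis = Analysis.create(project, filter, "TrendAnalysis");
        analysis.verify();                                          // framework compares results against benchmarks
        Filter.delete(filter);
        Project.delete(project);
        Application.close();
    }
}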

I'll explain the design philosophy, advantages and disadvantages of this approach further in my next post.

Test Automation : Challenges On Internationalizing Test Suites

After having automated several projects using the OO framework approach, we've been asked to internationalize our test suites. This means the test suites developed by us should also execute when the application under test (AUT) is running in a language other than English. Not only should the test suites 'execute' on a localized version of the AUT, but they should also find defects!
For a software developer of the AUT, the task of internationalization/localization is relatively straightforward, and the strategy is well documented. It involves moving the language-specific strings to resource files. Popular languages such as Java and C++ have built-in support for this strategy.
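As a small illustration of Java's built-in support (the bundle name and key below are made up), the code asks a ResourceBundle for a string by key, and the translated value is picked up from the locale-specific .properties file:

import java.util.Locale;
import java.util.ResourceBundle;

public class GreetingDemo {
    public static void main(String[] args) {
        // Loads Messages_fr_FR.properties if present, falling back to Messages.properties.
        ResourceBundle bundle = ResourceBundle.getBundle("Messages", Locale.FRANCE);
        // The key stays locale-neutral; only the value in the .properties file is translated.
        System.out.println(bundle.getString("login.title"));
    }
}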
When it comes to internationalizing automated test suites, the strategy is relatively complex. Most GUI automation tools work on the basis of 'object recognition', i.e. they identify GUI objects by a combination of visible strings and object properties. Since the visible strings appear in the local language in a non-English AUT, test scripts developed in English will fail to recognize the objects. This is challenge number one! Thankfully, many of the commercially available tools such as RFT and QTP support internationalizing your testware.
Assuming that as a test developer you are able to recognize the GUI objects, the second challenge is ensuring that the tests developed in English actually pass against a localized version. Generally, automated tests compare results with benchmarks (stored as text files, image files or strings embedded in the script itself). These benchmarks are, in most cases, language-dependent. When internationalizing your tests, the task of the test developer is to 'externalize' the benchmarks. What's more, the inputs you provide to your AUT will also have to be in the local language. Assuming that the test suites are data-driven, you are expected to have a 'language version' of each of the data sources.
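One way to externalize benchmarks is to treat them exactly like the AUT's own strings and key the expected values by locale. A hedged sketch, with invented file names and keys:

import java.util.Locale;
import java.util.ResourceBundle;

public class BenchmarkLookup {
    // Benchmarks_de_DE.properties, Benchmarks_ja_JP.properties etc. would hold the translated expected values.
    static String expected(String key, Locale testLocale) {
        return ResourceBundle.getBundle("Benchmarks", testLocale).getString(key);
    }

    public static void main(String[] args) {
        Locale testLocale = Locale.GERMANY;            // would normally come from the test configuration
        String actual = "Konto erfolgreich erstellt";  // value read from the localized AUT
        System.out.println(actual.equals(expected("account.created", testLocale)) ? "PASS" : "FAIL");
    }
}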
In most cases, you'd not know in advance which language you'd be expected to run the tests against. That means, as a test developer, your test suites should be 'truly language independent' and not, say, coded for English, Chinese, Italian and French only! That's challenge number three.
In the next blog post, I'll explain some strategies to overcome these challenges.

A Test Automation approach: Object Oriented frameworks - Part-II [ Objectives ]

In my previous post I explained what the "Object Oriented" approach in test automation is. To clarify again, this approach is not incompatible with the data-driven approach. In fact, it is complementary: in our framework, all the input data comes from Excel-based template files, so the Object Oriented and data-driven approaches coexist. However, the same cannot be said about the keyword/action-word driven automation approach. The keyword-driven approach defines keywords and the actions associated with them; it encapsulates the underlying complexity of the automation tool into readable actions that can be automated by a non-programmer.
The Object Oriented approach, however, focuses on specific user actions performed on the application. It is the complexity of the product that is encapsulated by this approach, rather than the complexity of the automation tool. This means that an automation framework developed with the Object Oriented approach can never be a generic one; it always has to be specific to the product you are automating. The Object Oriented approach also assumes the test developer has a sound understanding of object-oriented concepts and of the programming language.
The design philosophy for the Object Oriented framework is the following:
  • Model the application under test (AUT) as classes and the user actions as methods/functions of these classes.
  • The test suite developers will use these classes and methods, and only these classes and methods, for test development.
  • Exceptions in the application should be handled by the framework, which should return the application to a base state after an error.
  • Any test case developed with the framework should not exceed 100 lines of code.
  • Test suite development should be able to use all the object-oriented features supported by the language.
  • The test suites should be tool independent. This means that the framework should encapsulate all the tool dependency within itself. In case a tool migration is required, the framework can be rewritten and the test cases will not need to be altered (see the sketch after this list).
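To illustrate the tool-independence point, here's a minimal hypothetical sketch in Java. GuiTool is a stand-in for whatever automation tool is actually in use (RFT, QTP, ...), stubbed out so the shape of the framework is visible; it is not a real tool API:

// The tool-specific calls live only inside the framework classes.
class GuiTool {
    static void click(String control) { /* tool-specific click goes here */ }
    static void type(String control, String text) { /* tool-specific typing goes here */ }
}

public class Project {
    private final String name;
    private Project(String name) { this.name = name; }

    // Test cases call only this method; the tool-specific calls never leak out,
    // so migrating to another tool means rewriting GuiTool, not the test cases.
    public static Project create(String projectName) {
        GuiTool.click("New Project");
        GuiTool.type("projectNameField", projectName);
        GuiTool.click("OK");
        return new Project(projectName);
    }
}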
In the next post, I'll explain the design details and approach of the framework we've developed and the benefits we've achieved with this approach.

A Test Automation approach: Object Oriented frameworks - Part-III [ Design ]

In this post, I'll try to explain the design of the Object Oriented framework with a simple example.


To explain the process, I'll describe a fictitious banking application. Assume the banking application has three modules: a 'Savings Bank module', a 'Current Account module' and a 'Loans module'. The banking application supports user actions such as 'create a savings bank account', 'create a current account', 'create a loan account', 'close account', 'credit cash to an account', 'deposit a cheque', 'enquire balance', 'check transactions' and 'debit an account'. The banking application also has administrative modules such as a login module, a customer maintenance module etc.


In a traditional automation approach, the design of the automation begins with identifying the use cases/test cases that need to be automated. The next step is to identify the GUI components associated with those use cases and to code these flows. This code performs user actions such as entering values in text boxes, clicking buttons, reading values from the GUI etc. A simple test case for opening an account and depositing cash will look similar to the code below after automation. The code below is written in a tool/language independent format.

Login:

Enter BankEmployeeName :'user1'
Enter password :'*******'
Create Account:
Verify home page
Click on 'create account link'
Enter radio button: 'savings bank'
Enter account holder name: myCustomerName
Enter address: xxxxx .
...
....

Click Ok
Verify Account creation:

Click on 'Search account'
Enter account holder name: myCustomerName
Click Ok
Read account holder name
Compare value with 'myCustomerName'
Read account holder address value
Compare value with the address entered
...
...

Deposit Cash:
Click 'Cash Deposit'
Enter account number: xxxxx
Enter amount: xxxxx
Click OK
Verify Cash Deposit:
Click 'Check Balance'
Read account balance value
Compare value with 'yyyyy'
As you may have noticed, the test case depends on the detailed design of the GUI. Each step in the test case relies on individual components of the GUI. For the sake of simplicity, I've not included exception handling in the above code.
With the Object Oriented approach, the design does not start with the set of test cases/use cases that need to be automated. The design of the framework views the banking application in terms of how an end user will perceive it. For the Object Oriented framework approach, the banking application is considered as a set of objects with actions on these objects. For the sake of the example, the following would be the objects and the methods on them.
Class Login:
Method: Login(user_name, password)
Method: Logout
Class Account:
Method: InitAccountObject()
Method: CreateAccount(.....)
Method: DepositCash(amount)
Method: EnquireBalance()
Method: CloseAccount()
Method: GetAccountDetails()
Method: TransferBalance(amount, otherAccountObject)
Method: VerifyAccount()
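Before looking at the test case, here's a minimal Java sketch of how one of these framework classes might be implemented. Control names are invented, and GuiTool again stands in for the real automation tool's API:

// Hypothetical implementation sketch of the framework's Account class.
class GuiTool {
    static void click(String control) { /* tool-specific click */ }
    static void type(String control, String text) { /* tool-specific typing */ }
    static void select(String control, String option) { /* tool-specific selection */ }
    static String read(String control) { return ""; /* tool-specific read of a GUI value */ }
}

public class Account {
    private String accountNumber;

    public void createAccount(String accountType, String holderName, String address) {
        GuiTool.click("create account link");
        GuiTool.select("accountTypeRadio", accountType);
        GuiTool.type("holderNameField", holderName);
        GuiTool.type("addressField", address);
        GuiTool.click("OK");
        accountNumber = GuiTool.read("newAccountNumberLabel");  // remember the account for later calls
    }

    public void depositCash(int amount) {
        GuiTool.click("Cash Deposit");
        GuiTool.type("accountNumberField", accountNumber);
        GuiTool.type("amountField", String.valueOf(amount));
        GuiTool.click("OK");
    }

    public int enquireBalance() {
        GuiTool.click("Check Balance");
        return Integer.parseInt(GuiTool.read("accountBalanceLabel"));
    }
}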
With this approach, the complexity of the GUI and the actual intricacies of the workflow are encapsulated in the business classes that are defined as part of the framework. With an Object Oriented framework approach, the above test case will look like the code below.
Class : TestCaseOne:
Method: InitAccountObject(accountType)
Method: TestSteps:
employee = New Object Login('user1', '*******')
myAccount = New Object Account();
myAccount.CreateAccount(accountType.......)
myAccount.VerifyAccount()
integer myOldBalance = myAccount.EnquireBalance()
myAccount.DepositCash(amount)
integer myNewBalance = myAccount.EnquireBalance()
Verify myNewBalance == myOldBalance + amount
As you may notice, in the above test case there's a very clear demarcation between the GUI components and the business components. All the intricacies of the GUI are hidden away inside the business classes/methods. The automated test case, in this case 'TestCaseOne', only deals with the business aspects of the test. With this approach, the automated test case is insulated from the details of the GUI: when the GUI changes, the test case does not have to be redesigned. This translates into very good maintainability of the test cases. The second and most important advantage of the Object Oriented framework is test case extensibility. We could 'extend' the above test case to test various related scenarios. For example, we can extend this test case to cover a savings bank account, a current account and a loan account as below.
Class SavingsBankAccountTest extends TestCaseOne
Method: InitAccountObject('SavingsBankAccount')
Class CurrentAccountTest extends TestCaseOne
Method: InitAccountObject('CurrentAccount')
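In Java terms, the pattern above could be rendered roughly as follows, assuming the Login and Account framework classes sketched earlier exist with these methods; the test data values are placeholders:

// Hypothetical Java rendering of the inheritance pattern shown above.
public class TestCaseOne {
    // Subclasses override this to pick the account type under test.
    protected String initAccountObject() { return "SavingsBankAccount"; }

    public void testSteps() {
        Login employee = new Login("user1", "*******");
        Account myAccount = new Account();
        myAccount.createAccount(initAccountObject(), "myCustomerName", "myCustomerAddress");
        int myOldBalance = myAccount.enquireBalance();
        myAccount.depositCash(500);
        int myNewBalance = myAccount.enquireBalance();
        if (myNewBalance != myOldBalance + 500) {
            throw new AssertionError("balance mismatch after deposit");
        }
        employee.logout();
    }
}

// Each additional account type becomes a tiny subclass.
class CurrentAccountTest extends TestCaseOne {
    @Override protected String initAccountObject() { return "CurrentAccount"; }
}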
With this approach, manual and exploratory testers can automate regression test cases without technical knowledge of the automation tool. Testers, however, should have a fundamental knowledge of object-oriented concepts and of the programming language.
In my next post, I'll explain the real benefits our team has achieved by implementing this approach to automation, and also delve into how we've been analysing and evaluating the results of the test executions.

A Test Automation approach: Object Oriented frameworks - Part-IV [ Benefits ]

The product we work on is an enterprise-scale solution, typically deployed across multiple physical machines and supported on heterogeneous operating systems. The product has to work on about six operating systems, three web application servers and three databases. The complete platform support matrix typically has 12 to 15 unique test configurations that need to be tested. This is where the need for control over regression testing comes into the picture.
A completely manual approach to covering all these test configurations is not really a feasible plan. This is where the need for automated test execution comes into the picture.
The approach we prefer is to identify the “primary test configurations”, which are typically 2 or 3 configurations. Functional testers typically do exploratory testing on these configurations. At the same time, they also identify the tests they’d like to get included in the regression suite. These tests are documented at a ‘business level’, i.e. in terms of the use cases that the end user will perform. What’s avoided in the test document is GUI-level detail. At this time, we have a dedicated tester who develops the automation suite; these documented tests are the input the “automation tester” works from. Since the automation tester only uses the business classes and methods that the automation framework exposes, development of these automation suites is very fast. Our future plan is to have the functional testers take up the automation suite development. This approach makes sense since the functional testers have functional and domain knowledge of the application that the automation tester does not have. It will also avoid the need for functional testers to document tests specifically for automation.
An important benefit of the Object Oriented framework is the speed of automated test development. Before the framework, development of a typical test automation suite used to take over a week. This has been reduced to less than a day. What’s more, these tests are much more readable, and the automation code implements purely the business logic and not the intricacies of the GUI or the automation tool. This leads to efficient maintenance of the automation suite.
The real benefit of an object-oriented style of development is the use of inheritance and polymorphism. With the Object Oriented framework, we are in a position to use the inheritance concept on test cases. In functional testing, testers typically take an existing test scenario and explore further by making focused changes to it. The same testing approach can be extended through the use of inheritance in an object-oriented approach: we “inherit” an existing automated test, and with very minimal changes in code this results in new test cases. In our product, one of the important features is called ‘analysis’. With the Object Oriented approach we created a single test case class and extended it to implement over 500 test cases. With a traditional test automation approach we’d have taken several months to complete this suite; with the Object Oriented approach, the entire effort was less than 2 weeks.
Another major benefit of this approach is bringing down the time taken to execute tests on the additional test configurations. Manual execution of the regression tests used to take about 1 week for each configuration. Considering that we have about 12 to 15 configurations, this effort was the most time-consuming and monotonous part of our testing process. With automation we’re able to bring the execution time down to about 2 days per configuration. However, even with the automation suite in place, there are still some manual tests that need to be executed.
There’s a whole class of tests, practically impossible to execute manually, that we’re planning to implement with this framework. In an application like ours there are literally millions of combinations of scenarios that a typical user can exercise in the field, and functional testers only focus on a small subset of these. With an automation framework that works at the business level, we have the possibility of achieving a very high level of coverage, so we are now focusing on high-volume test automation with this framework. To give an example, we have 12 types of ‘analysis’ in our application. Each of these analyses can have 12 sub-analyses, and this can extend to any level. The ‘analysis’ works on data filters that can be of 16 groups. This leads to around 12*12*16 test scenarios. The complexity, although immense if tested manually, is only a matter of coding three nested loops iterating over each type when implemented with the framework (see the sketch below).
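A rough sketch of how that enumeration might look when driven through the framework (the type names below are placeholders standing in for the real application's lists):

// Hypothetical sketch: enumerate analysis type x sub-analysis type x filter group.
// With 12 x 12 x 16 entries this would generate 2,304 scenarios from three nested loops.
public class AnalysisCombinationSuite {
    public static void main(String[] args) {
        String[] analysisTypes = {"Trend", "Forecast", "Correlation"};   // 12 entries in the real application
        String[] subAnalysisTypes = {"Monthly", "Quarterly", "Yearly"};  // 12 entries
        String[] filterGroups = {"Region", "Product", "Channel"};        // 16 groups

        for (String analysis : analysisTypes) {
            for (String subAnalysis : subAnalysisTypes) {
                for (String filterGroup : filterGroups) {
                    runScenario(analysis, subAnalysis, filterGroup);
                }
            }
        }
    }

    static void runScenario(String analysis, String subAnalysis, String filterGroup) {
        // In the real suite this would call the framework: Filter.create(...),
        // Analysis.create(...) and Analysis.verify(...) for the given combination.
        System.out.printf("Scenario: %s / %s / %s%n", analysis, subAnalysis, filterGroup);
    }
}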
Analysis and evaluation of results is certainly a challenge in any type of testing. The framework is designed to provide a pass/fail result for each verification point coded in the test suite. However, completely relying on the automated results is not sufficient in many cases. After the automated execution, functional testers further do exploratory testing on the results of the execution.
As with any testing effort, this type of automation approach is intended to improve and enhance the productivity of functional testers.
This is the concluding part of this series of posts.

Thursday, December 8, 2011

GTAC 2011: Opening Keynote Address - Test is Dead


A thought-provoking keynote address from GTAC.

Wednesday, December 7, 2011

Managing the Test People : Judy McKay



I'd blogged previously about the lack of availability of test management resources and information. Here's an excellent book by Judy McKay on this subject.
Judy's style of writing is casual yet very detail-oriented. It's an excellent resource for new managers and leads as well as experienced hands. Much of what's mentioned in this book is applicable to any management position. However, there are specific challenges in managing testers, and Judy has provided practical approaches to dealing with these challenges.


 