
Ask a layman what he understands by "automation", and the most likely answer is "doing something automatically".

Right! When something is done without the intervention of a human, it is automation. And how would you answer "Why automation?" Is it because we trust machines more than humans, because machines can work tirelessly, or because they can do the same job tenfold faster?

The answer is "all of that, and much more".

Automation helps us with all of this. But keep in mind that we are human, and 'to err is human'. What if the creator of this unit of automation (in our context, the automated script) does it the wrong way? The "wrong" also gets multiplied, and it multiplies faster than we can realize something is not right. The whole idea is to do it the right way, right from the beginning. It is the small things we ignore in the initial stages that later manifest as huge problems when automation happens at a large scale. Everything multiplies, including the mistakes we have made, and it becomes very difficult to correct them.

This is one of the reasons why some people still prefer manual testing: they feel that, on top of creating and executing scripts, too much time goes into maintaining and correcting them.

The power of automation has often been undermined by a lack of organization, structure and method in script creation. An automated script is best utilized when it is reliable and reusable. These two factors contribute towards easy execution (once the scripts are ready), easy maintenance (whenever there is a change in the application) and accurate results (when the scripts are executed).

A reliable script can be created only when the tester has a good understanding of the application, its usage and the configuration behind it. This requires a lot of reading and investigation to learn how the application behaves under a given set of circumstances. Once this is done, the script can be built to handle all possible application flows.

A reusable script truly captures the meaning and purpose of automation. With a perfectly reusable script, automating upcoming applications becomes easier and faster, and easier maintenance is another takeaway of this attribute. Reusability is the result of standardizing a script in all aspects, such as structure and naming convention. Let us look at these individually and see how they add to a script's reusability.

Structure of an automated script: A well-structured script is easy to understand and adapt, especially for those who take it over from someone else. It keeps the script crisp, without unwanted coding. It is important to limit the script strictly to its purpose and keep only what is absolutely necessary.

For example, validation can be done in many ways (message check, status check, table check, field check and so on), but not every check is required in every case. Also remember that a DB table check requires extra effort from the script to connect to the system and read the table. One execution may not make a difference, but in a large-scale execution it does matter.
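As a rough illustration, validation can be kept as an optional, pluggable step so that the expensive checks are switched off for mass runs. The sketch below is plain Python rather than an eCATT or ABAP script, and the names (run_transaction, check_status, check_db_table) are hypothetical stand-ins for whatever your test framework actually provides.

```python
# A minimal sketch of keeping expensive validation optional.
# All function names here are illustrative placeholders, not real framework calls.

def run_transaction(tcode, data):
    """Pretend execution of a transaction; returns a status and a message."""
    return {"tcode": tcode, "status": "S", "message": "Document posted"}

def check_status(result):
    """Cheap validation: inspect only the returned status."""
    return result["status"] == "S"

def check_db_table(result):
    """Expensive validation: would connect to the system and read the table.
    Kept behind a flag so mass executions can skip it."""
    return True  # placeholder for the actual SELECT and comparison

def execute_case(tcode, data, with_db_check=False):
    result = run_transaction(tcode, data)
    ok = check_status(result)          # always cheap
    if with_db_check:                  # only when the run size allows it
        ok = ok and check_db_table(result)
    return ok

# A single functional run can afford the full validation ...
execute_case("VA01", {"customer": "100001"}, with_db_check=True)
# ... while a mass run skips the costly table reads.
for i in range(1000):
    execute_case("VA01", {"customer": str(100000 + i)})
```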

Such additional coding needs to be identified and eliminated. Let us analyze which coding is actually necessary, according to the purpose of the test (a sketch of this purpose-driven structure follows the list):

1. Functional correctness: Validation is required before and after the execution of the application, to see how the transaction has affected the existing data.

                         Validation before test --> Execution of tcode under test  --> Validation after test

2. Performance measurement: Performance is considered only after the application has been tested for functional correctness. Validation has no purpose here, as the focus of the test is non-functional.

         Execution of tcode under test

3. Creation of data for performance: Massive amounts of data are usually required for performance measurement.

For example, 1000 customers with 150 line items each; the same could be repeated for vendors, cost centers, and so on. Table checks at this scale of execution would create a huge load on the system, and creating such data already takes hours, maybe even days in some cases. It is best to avoid validation or table reads of any kind. Another point to keep in mind here is that using a function module or a BAPI to create data saves a lot of time and effort; a TCD recording or a SAPGUI recording should only be the last option.

                        Execution of tcode for data creation

4. Creation of data for system setup: This is usually done on a fresh system, with no data. Hence a verification only at the end would suffice.

                         Execution of tcode for data creation --> Validation after test
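
To make the purpose-driven structure above concrete, here is a minimal sketch, again in plain Python rather than eCATT and with illustrative placeholder functions. It only decides which validation steps run, based on the purpose of the test.

```python
# A minimal sketch of driving the validation steps from the purpose of the test.
# The step names are assumptions for illustration, not real framework calls.
from enum import Enum, auto

class Purpose(Enum):
    FUNCTIONAL = auto()    # 1. functional correctness
    PERFORMANCE = auto()   # 2. performance measurement
    PERF_DATA = auto()     # 3. data creation for performance
    SETUP_DATA = auto()    # 4. data creation for system setup

def validate(stage):
    print(f"validation {stage} test")

def execute_tcode(tcode):
    print(f"executing {tcode}")

def run(tcode, purpose):
    # Validation before the test only when functional correctness is the goal.
    if purpose is Purpose.FUNCTIONAL:
        validate("before")
    execute_tcode(tcode)
    # Validation afterwards for functional tests and fresh-system setup;
    # skipped entirely for performance runs and mass data creation.
    if purpose in (Purpose.FUNCTIONAL, Purpose.SETUP_DATA):
        validate("after")

run("VA01", Purpose.FUNCTIONAL)   # validation before and after
run("VA01", Purpose.PERF_DATA)    # no validation at all
```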

There is also a subtler aspect of being structured: the naming convention.

Testers usually tend to name their scripts to suit their own needs, ignoring the fact that these transactions can be used by anyone in an integrated environment. Searching for existing scripts becomes easy when proper rules are followed while naming them. It may also happen that more than one script exists for the same purpose; such duplication has to be avoided. Attributes like the purpose (unit, integration, customizing or performance testing), the tcode executed, the action (create, change or delete) and the release need to be kept in mind while setting up the rules.

The same goes for parameters. With a common convention, binding the called script to the calling script (script references) becomes easy, and quick log analysis is another takeaway of consistently named parameters. A small sketch of such a convention is shown below.
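As an illustration only, a convention could compose the script name from the attributes just listed (purpose, tcode, action, release). The exact pattern below is an assumption for the sketch, not a prescribed SAP standard.

```python
# A minimal sketch of a script-naming convention; the pattern (order,
# separators, release format) is assumed and should be adapted to project rules.
import re

def script_name(purpose, tcode, action, release):
    """Compose a name like UNIT_VA01_CREATE_R750 from the agreed attributes."""
    return f"{purpose}_{tcode}_{action}_{release}".upper()

NAME_PATTERN = re.compile(
    r"^(UNIT|INT|CUST|PERF)_[A-Z0-9]+_(CREATE|CHANGE|DELETE)_R\d{3}$"
)

def is_valid_name(name):
    """Check a proposed name against the convention before saving the script,
    which also makes duplicates easy to spot with a simple search."""
    return NAME_PATTERN.match(name) is not None

print(script_name("unit", "va01", "create", "R750"))  # UNIT_VA01_CREATE_R750
print(is_valid_name("UNIT_VA01_CREATE_R750"))         # True
print(is_valid_name("my_test_final_v2"))              # False
```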

There is another factor that makes automation more meaningful and complete in every sense: documentation. Documentation is a record of what exactly is expected of the script. Its importance is realized at the time of handover, maintenance and adaptation. However, 'document creation' itself can be dealt with as a separate topic; the point here is that it should not be disregarded as unimportant.

Having done all this, we still need to watch the scope of the test. With new functionality being developed on top of the old (e.g. enhancements of existing features), re-prioritization needs to be done regularly. Old functionality may no longer be relevant, or it may be stable enough to be omitted from the focus topics. This way the new features and developments get tested better.
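One hedged way to picture this re-prioritization (not something prescribed in the post): tag each case with the release since which its feature has been stable, and let the focus run skip the stable ones.

```python
# A sketch of re-prioritizing test scope: cases whose feature has been stable
# since an older release are dropped from the focus run so that new
# developments get more execution time. The structure is assumed for illustration.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestCase:
    name: str
    feature: str
    stable_since: Optional[str]   # None means the feature is still changing

def focus_scope(cases: List[TestCase], current_release: str) -> List[TestCase]:
    """Keep only the cases covering features that are new or still changing."""
    return [c for c in cases
            if c.stable_since is None or c.stable_since == current_release]

cases = [
    TestCase("UNIT_VA01_CREATE_R750", "sales order", None),
    TestCase("UNIT_XD01_CREATE_R700", "customer master", "R700"),
]
for case in focus_scope(cases, "R750"):
    print(case.name)   # only the sales-order case stays in focus
```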

Now let us summarize the write-up. None of the aspects mentioned above is strictly indispensable; automation can still happen without any of them. However, the benefits they bring can make a huge difference to the time and effort spent on both automation and maintenance. Understanding a script authored by someone else, knowledge transfer, adaptation, corrections... these are just a few of the advantages.

The world of automation is vast, and many of its benefits still remain unexplored.
