USER ACCEPTANCE TEST
In order to highlight the UAT more vividly, let’s begin by asking what the difference is between a System Test and a User Acceptance Test.
A System Test is executed to ensure that the System is performing to specifications stated in the System Design and the Functional Specifications, whereas a User Acceptance Test ensures that the Application addresses and satisfies the Business Requirements and the Workflows.
The User Acceptance Test (UAT) will in fact certify that the applied technology has correctly automated the business processes in a given business environment. Therefore, a User Acceptance Test Plan will consist of the following set of artifacts, repeated as many times as there are test cases:
· Business Requirement
o Business Process
The UAT Test Plan will depend heavily on the Functional Specification, to the extent that entire portions of it can be copied and pasted into the Test Plan. In addition to the artifacts listed above, a good Test Plan will also list the software program that pertains to each Business Process, the Test Cases listed against it in chronological order, and the relevant Screens.
A Business Process consists of Sub-processes, Workflows, Functions and Data. These business artifacts are translated into their automated components so that each Business Process is an Application, each Sub-process is a Screen and each Function under a specific Sub-process is a Tab. Therefore, when a Workflow is executed it weaves its way through these screens and there will be as many ways to execute the Workflow as there are Tabs (Functions). Each Tab will then execute a specific “what-if” Test Scenario.
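The mapping above can be sketched as a small data model. This is a minimal illustration only; the class and field names (`Application`, `Screen`, `Tab`) are hypothetical, chosen to mirror the Business Process → Application, Sub-process → Screen, and Function → Tab translation described in the text.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Tab:                 # one Function; drives one "what-if" Test Scenario
    function_name: str

@dataclass
class Screen:              # one Sub-process
    screen_id: str
    tabs: List[Tab] = field(default_factory=list)

@dataclass
class Application:         # one Business Process
    business_process: str
    screens: List[Screen] = field(default_factory=list)

def what_if_scenarios(app: Application) -> List[str]:
    """Enumerate one Test Scenario per Tab, since there are as many
    ways to execute a Workflow as there are Tabs (Functions)."""
    return [f"{s.screen_id}/{t.function_name}"
            for s in app.screens for t in s.tabs]

# Hypothetical example: an "Order Entry" Business Process.
app = Application("Order Entry", [
    Screen("SCR-01", [Tab("Create Order"), Tab("Amend Order")]),
    Screen("SCR-02", [Tab("Approve Order")]),
])
print(what_if_scenarios(app))
# -> ['SCR-01/Create Order', 'SCR-01/Amend Order', 'SCR-02/Approve Order']
```

Enumerating one scenario per Tab this way gives the Test Plan a simple completeness check: every Function of every Sub-process appears exactly once in the scenario list.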
The User/Tester will therefore be executing the many variations and “what-if” scenarios of an automated workflow when running a test case.
A typical UAT is initiated by a Login. If the User/Tester is allowed access to the system after authentication and authorization, the UAT is ready to proceed. The first item on the agenda is to check off the presentation of the correct screen against the screen IDs listed in the Test Case. This task validates the Screen Flow Logic that was tested during System Test. The next steps are as follows:
· Enter data into the fields presented in the screen.
· Ensure that the results presented in the Output screen that follows are identical to the expected results stated in the Test Case.
· Test for field edits with actual “erroneous” data. This task will again validate the edits tested during System Test.
· Ensure the correct error message is displayed.
· Ensure the quality of the error message by checking how clearly its description communicates the error. An error message should consist of:
o A display of the error.
o Instructions on next steps.
o Warnings (if any).
o Alerts (if any).
And none of this has to be displayed in flashing strobe lights and reverse video! A different font color should do the trick.
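A field-edit test case pairs deliberately “erroneous” input with the expected error message, structured into the four parts listed above. The sketch below is a hypothetical example: the `validate_quantity` function and the "Quantity" field are assumptions for illustration, not part of any particular application.

```python
def validate_quantity(raw: str) -> dict:
    """Field edit for a hypothetical numeric 'Quantity' field."""
    if raw.isdigit() and int(raw) > 0:
        return {"ok": True}
    # The error message carries the four parts named above:
    # the error itself, next steps, warnings, and alerts.
    return {
        "ok": False,
        "error": f"'{raw}' is not a valid quantity.",
        "next_steps": "Enter a whole number greater than zero.",
        "warnings": [],          # e.g. "Order exceeds credit limit"
        "alerts": [],            # e.g. "Item nearing stock-out"
    }

# The User/Tester enters the "erroneous" data from the Test Case
# and compares the actual message against the expected one.
result = validate_quantity("12x")
assert result["ok"] is False
print(result["error"])
```

Keeping the expected message text in the Test Case itself lets the tester verify not only that an error was raised, but that its wording matches what was specified.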
· At the end of testing a Business Process, it is important to note where the User/Tester is taken.
· The attributes of each field on the screen to be tested should be documented in the Test Case. For example, an alpha field versus an alpha-numeric field, number of characters, etc.
· User Friendliness: Test the fields to ensure that the auto-skip feature has been coded in the software, i.e., after data is entered into one field, the cursor should automatically skip to the next field in line without the User having to move it manually.
· Fault Tolerance: Record the level of fault tolerance. Test cases should be specifically designed in the UAT Test Plan to do so. This aspect is not the exclusive domain of mission-critical systems. There are several ways in which recovery blocks and self-checking mechanisms can be constructed within an application to handle faults in input data, allowing the User to continue executing a Workflow after a delay of just a few seconds. Checkpoint/Recovery is one of them; Exception Handling is another.
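The two mechanisms named in the last bullet can be combined in a few lines: save a checkpoint of the Workflow state before a risky step, and use an exception handler to roll back to it when the input data is faulty. This is a minimal sketch under assumed names (`run_step`, `execute_with_recovery`, a `state` dictionary); a real application would persist the checkpoint rather than hold it in memory.

```python
import copy
import time

def run_step(state: dict, raw_input: str) -> None:
    """A workflow step that may fail on bad input data."""
    state["total"] = state["total"] + int(raw_input)  # raises ValueError on bad data

def execute_with_recovery(state: dict, raw_input: str) -> bool:
    checkpoint = copy.deepcopy(state)      # checkpoint before the step
    try:
        run_step(state, raw_input)
        return True
    except ValueError:                     # exception handling on faulty input
        state.clear()
        state.update(checkpoint)           # roll back to the checkpoint
        time.sleep(0.1)                    # brief recovery delay
        return False                       # User re-enters data and continues

state = {"total": 100}
assert execute_with_recovery(state, "abc") is False   # bad input: rolled back
assert state == {"total": 100}                        # state unchanged
assert execute_with_recovery(state, "25") is True     # good input: applied
assert state == {"total": 125}
```

A UAT test case for fault tolerance would feed the bad input first and confirm that the Workflow can still be completed afterwards, rather than forcing the User to restart.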