FALLSEM2013 14 - CP0600 - 10 Sep 2013 - RM01 - 3 Test Data
Objective
The objective of this phase is to determine whether a software system performs correctly in an executable mode. Depending on the severity of the problems uncovered, changes may need to be made to the software before it is placed in production status. If the problems are extensive, it may be necessary to stop testing completely and return the software to the developers.
Concerns
The software is not in a testable mode.
Time and resources are inadequate.
Significant problems will not be uncovered during testing.
Workbench
Input
The testing during this phase must rely on the adequacy of the work performed during the earlier phases. The deliverables available during validation testing include:
System test plan (may include a unit test plan)
Test data and/or test scripts
Results of previous verification tests
Inputs from third-party sources, such as computer operators
Do Procedures
This step involves the following three tasks: 1. Build the test data. 2. Execute tests. 3. Record test results.
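The three tasks can be sketched as a minimal harness; the transactions and the system under test here are invented examples, not part of any specific tool.

```python
def build_test_data():
    """Task 1: build a small set of test transactions (hypothetical data)."""
    return [{"input": 2, "expected": 4}, {"input": 3, "expected": 9}]

def execute_tests(transactions, system_under_test):
    """Task 2: run each transaction through the system under test."""
    return [system_under_test(t["input"]) for t in transactions]

def record_results(transactions, actuals):
    """Task 3: record pass/fail for each transaction."""
    return [
        {"input": t["input"], "expected": t["expected"],
         "actual": a, "passed": a == t["expected"]}
        for t, a in zip(transactions, actuals)
    ]

data = build_test_data()
actuals = execute_tests(data, lambda x: x * x)  # stand-in system under test
results = record_results(data, actuals)
```

The recorded results then feed the analysis and documentation steps described later in this section.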
Creating Test Data for Stress/Load Testing
The aim is to verify that the system can perform properly when internal program or system limitations have been exceeded. The first step is to identify the input data used by the program.
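One common way to build such data is to generate volumes at, just below, and beyond an internal limit; the limit value here is an assumed example, not taken from any real system.

```python
MAX_RECORDS = 100  # assumed internal limit of the system under test

def make_stress_volumes(limit):
    """Generate record counts around and far beyond the stated limit."""
    return [limit - 1, limit, limit + 1, limit * 10]

volumes = make_stress_volumes(MAX_RECORDS)
```

Running the system at each of these volumes shows whether it degrades gracefully or fails once the limit is exceeded.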
Creating Test Scripts
Test scripts must account for:
Data entry procedures required
Use of software packages (e.g., capture/playback)
Sequencing of events
Stop procedures
To develop, use, and maintain test scripts, testers should perform the following five steps:
1. Determine testing levels.
2. Develop test scripts.
3. Execute test scripts.
4. Analyze the results.
5. Maintain test scripts.
2. Develop Test Scripts
Script development is normally done using a capture/playback tool. A script is a series of related actions, and developing one involves a number of considerations:
Files involved
Script components
Terminal input/output, etc.
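A script as a series of related actions can be sketched as follows, in the capture/playback style; the action names and handlers are hypothetical, not from any specific tool.

```python
# A recorded script: each step is one related action with its input.
script = [
    {"action": "login", "input": "tester01"},
    {"action": "enter_order", "input": {"item": "A100", "qty": 2}},
    {"action": "logout", "input": None},
]

log = []  # stands in for terminal output captured during playback

handlers = {
    "login": lambda user: log.append(f"login:{user}"),
    "enter_order": lambda o: log.append(f"order:{o['item']}x{o['qty']}"),
    "logout": lambda _: log.append("logout"),
}

def playback(script, handlers):
    """Replay each recorded step through its handler, in sequence."""
    for step in script:
        handlers[step["action"]](step["input"])

playback(script, handlers)
```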
3. Execute Test Scripts
Scripts can be executed manually or with capture/playback tools. Considerations during script execution include:
Environmental setup
Program library
Date and time
Security
Serial dependencies
Processing options, etc.
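The environmental considerations above can be sketched by pinning the environment before a script runs, so results are repeatable; all names and values here are hypothetical.

```python
import datetime

test_env = {
    "run_date": datetime.date(2013, 9, 10),  # fixed date, not "today"
    "program_library": "TEST.LIB",           # hypothetical library name
    "processing_option": "BATCH",
}

def run_with_env(env, script_fn):
    """Execute a script function under the controlled environment."""
    return script_fn(env)

result = run_with_env(
    test_env,
    lambda env: f"ran {env['processing_option']} against {env['program_library']}",
)
```

Fixing the date and library in the environment is what makes a later re-run comparable to this one.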
4. Analyze the Results
After executing a test script, the results must be analyzed. The analysis covers:
System components
Terminal outputs (screens)
File contents
Environment variables, such as the status of logs and performance data (stress results)
For on-screen outputs, check:
Order of output processing
Compliance of screens to specifications
Ability to process actions
Ability to browse through data
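The comparison of actual outputs against expectations can be sketched as a simple variance report; the fields and values are invented examples.

```python
def analyze(expected, actual):
    """Return the fields whose actual value deviates from the expectation."""
    return {k: {"expected": v, "actual": actual.get(k)}
            for k, v in expected.items() if actual.get(k) != v}

expected = {"screen": "ORDER OK", "log_status": "CLOSED", "records": 3}
actual = {"screen": "ORDER OK", "log_status": "OPEN", "records": 3}
variances = analyze(expected, actual)
```

Only the deviating fields survive into the variance report, which is what the recording step documents.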
5. Maintain Scripts
Test scripts need to be maintained so that they can be used throughout development. The following areas should be incorporated into the script maintenance procedure:
Identifiers for each script
Purpose of the script
Programs/units tested by the script
Version of the development data used to prepare the script
Test cases included in the script
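The maintenance attributes above can be kept as a metadata record attached to each script; all field values here are invented examples.

```python
REQUIRED_FIELDS = {"identifier", "purpose", "programs_tested",
                   "data_version", "test_cases"}

script_record = {
    "identifier": "TS-042",
    "purpose": "Validate the order-entry screen",
    "programs_tested": ["ORDENT01"],
    "data_version": "v3.1",
    "test_cases": ["TC-101", "TC-102"],
}

def is_maintainable(record):
    """A script record is maintainable only if every required field is present."""
    return REQUIRED_FIELDS <= set(record)
```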
Record Test Results
Testers must document the results of testing so that they know what was and was not achieved. The following attributes should be developed for each test case:
Condition. Tells what is.
Criteria. Tells what should be.
Effect. Tells why the difference between what is and what should be is significant.
Cause. Tells the reasons for the deviation.
Documenting a statement of a user problem involves three tasks: documenting the deviation, documenting the effect, and documenting the cause (the origin of the problem).
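The four attributes can be captured for one test case as a simple record; the example values are invented for illustration.

```python
def document_deviation(condition, criteria, effect, cause):
    """Bundle the four attributes of a documented test result."""
    return {"condition": condition, "criteria": criteria,
            "effect": effect, "cause": cause}

deviation = document_deviation(
    condition="Order total shows 105.00",                      # what is
    criteria="Order total should show 100.00",                 # what should be
    effect="Customers are overcharged on every order",         # why it matters
    cause="Sales tax is applied twice in the pricing module",  # origin
)
```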
Output
Validation testing has the following three outputs:
The test transactions to validate the software system
The results from executing those transactions
Variances from the expected results