Software Testing Concepts (Manual Testing) : Definitions
Definitions
Project: It is something developed based on a particular customer's requirements and used by that customer only.
Product: A product is something developed based on the company's own specifications and used by multiple customers.
Quality: Quality is defined as not only the satisfaction of the requirements but also the presence of added value (user-friendliness).
BIDDING THE PROJECT: Bidding the project is defined as the request for proposal, estimation and sign-off.
KICK-OFF MEETING:
It is an initial meeting conducted in the software company soon after the project is signed off, in order to discuss the overview of the project and to select a project manager for it.
Usually the High Level Manager, Project Manager, Technical Manager, Quality Managers, Test Leads and Project Leads will be involved in this meeting.
Design phase
Tasks: HLD (High Level Design) and LLD (Low Level Design).
Roles: HLD is done by the CA (Chief Architect); LLD is done by the TL (Technical Lead).
Process
The chief architect will divide the whole project into modules by drawing some diagrams, and the technical lead will divide each module into sub-modules by drawing some diagrams using UML (Unified Modeling Language).
The technical lead will also prepare the pseudo code.
Testing Phase
Task: Testing.
Roles: Test Engineer.
Process
First of all the test engineers will receive the requirement documents and review them in order to understand the requirements.
If at all they get any doubts while understanding the requirements, they will prepare a Review Report (RR) with the list of all doubts.
Once the clarifications are given, and after understanding the requirements clearly, they will take the test case template and write the test cases.
Once the build is released, they will execute the test cases.
After execution, if at all they find any defects, they will list them out in a defect profile document.
Then they will send the defect profile to the developers and wait for the next build.
Once the next build is released, they will once again execute the test cases.
If they find any defects, they will follow the above procedure again and again till the product is defect free.
Once they feel the product is defect free, they will stop the process.
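The execute → report → retest cycle described above can be sketched in a few lines. The test cases, the two "builds", and the add feature below are all made-up examples, not part of any real project:

```python
def run_test_cases(build, test_cases):
    """Execute each test case against the build and collect failures
    into a defect profile, as described in the process above."""
    defect_profile = []
    for case_id, inputs, expected in test_cases:
        actual = build(*inputs)
        if actual != expected:
            defect_profile.append((case_id, inputs, expected, actual))
    return defect_profile

# Hypothetical test cases for a simple "add" feature.
test_cases = [
    ("TC01", (2, 3), 5),
    ("TC02", (-1, 1), 0),
]

def buggy_build(a, b):      # build 1: contains a defect
    return a + b + 1

def fixed_build(a, b):      # build 2: defect rectified by the developers
    return a + b

print(run_test_cases(buggy_build, test_cases))  # defects -> sent to developers
print(run_test_cases(fixed_build, test_cases))  # empty -> product is defect free
```

Each build is re-run against the same test cases until the defect profile comes back empty.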
2) White Box Testing (Or) Glass Box Testing (Or) Clear Box Testing
It is a method of testing in which one will perform testing on the structural part of an application. Usually developers or white box testers perform it.
LEVELS OF TESTING
There are 5 levels of testing.
1) Unit level testing
Unit: It is defined as the smallest part of an application.
If one performs testing on a unit, then that level of testing is known as unit level testing. It is white box testing; usually developers perform it.
2) Module level testing
If one performs testing on a module, that is known as module level testing. It is black box testing; usually test engineers perform it.
3) Integration level testing
Once the modules are developed, the developers will develop some interfaces and integrate the modules with the help of those interfaces. While integrating, they will check whether the interfaces are working fine or not. It is white box testing and usually developers or white box testers perform it.
The developers will integrate the modules in any one of the following approaches:
i) Top Down Approach (TDA)
In this approach the parent modules are developed first and then integrated with the child modules.
ii) Bottom Up Approach (BUA)
In this approach the child modules are developed first and then integrated with the corresponding parent modules.
iii) Hybrid Approach
This approach is a mix of both the top down and bottom up approaches.
iv) Big Bang Approach
Integrating all the modules in one go, once all of them are ready, is known as the big bang approach.
STUB
While integrating the modules in the top down approach, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a STUB.
DRIVER
While integrating the modules in the bottom up approach, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a DRIVER.
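As a sketch, a stub and a driver might look like this in code; the order/payment modules here are hypothetical stand-ins for missing modules:

```python
def payment_stub(amount):
    """STUB: temporary stand-in for an unfinished child payment module
    (used in the top down approach). It returns a canned response."""
    return "payment-ok"

def order_module(amount, pay=payment_stub):
    """Parent module under test; normally it would call the real
    payment module, but here the stub is wired in."""
    return "order-placed" if pay(amount) == "payment-ok" else "order-failed"

def order_driver():
    """DRIVER: temporary caller used in the bottom up approach to
    exercise a child module whose real parent is not ready yet."""
    return payment_stub(100)

print(order_module(100))   # top down: parent tested against the stub
print(order_driver())      # bottom up: child exercised by the driver
```

The stub fakes a missing child; the driver fakes a missing parent.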
4) System level testing
Once the application is deployed into the environment, if one performs testing on the system it is known as system level testing. It is black box testing and usually done by the test engineers.
5) User acceptance testing
The same system testing done in the presence of the user is known as user acceptance testing. It is black box testing, usually done by the test engineers.
ENVIRONMENT
Environment is a combination of 3 layers.
Presentation Layer.
Business Layer.
Database Layer.
Types of Environment
There are 4 types of environments.
1. Stand-alone Environment / One-tier Architecture.
2. Client–Server Environment / Two-tier Architecture.
3. Web Environment / Three-tier Architecture.
4. Distributed Environment / N-tier Architecture.
1) Stand-alone Environment (Or) One-Tier Architecture
This environment contains all three layers, that is the Presentation layer, Business layer and Database layer, in a single tier.
2) Client–Server Environment (Or) Two-Tier Architecture
In this environment there will be two tiers: one tier is for the clients and the other tier is for the database server. The presentation layer and business layer will be present in each and every client, and the database will be present in the database server.
3) Web Environment
In this environment there will be three tiers: the client resides in one tier, the application server resides in the middle tier and the database server resides in the last tier. Every client will have the presentation layer, the application server will have the business layer and the database server will have the database layer.
4) Distributed Environment
It is the same as the web environment, but the business logic is distributed among multiple application servers in order to distribute the load.
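The three layers can be pictured as three small components. The classes and data below are hypothetical; in a real 3-tier web environment each layer would run on its own tier (client, application server, database server):

```python
class DatabaseLayer:
    """Database layer: stores and fetches the data (a dict stands in
    for a real database here)."""
    def __init__(self):
        self._rows = {"u1": "Alice"}
    def fetch(self, key):
        return self._rows.get(key)

class BusinessLayer:
    """Business layer: the application logic, hosted on the
    application server in a 3-tier environment."""
    def __init__(self, db):
        self.db = db
    def greet_user(self, user_id):
        name = self.db.fetch(user_id)
        return f"Hello, {name}" if name else "Unknown user"

class PresentationLayer:
    """Presentation layer: what the client renders for the user."""
    def __init__(self, logic):
        self.logic = logic
    def render(self, user_id):
        return f"<p>{self.logic.greet_user(user_id)}</p>"

app = PresentationLayer(BusinessLayer(DatabaseLayer()))
print(app.render("u1"))
```

In a stand-alone environment all three objects live in one process, as here; splitting them across machines gives the two-, three- and N-tier architectures described above.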
Web Server: It is software that provides web services to the clients.
Application Server: It is a server that holds the business logic.
Ex: Tomcat, WebLogic, WebSphere, etc.
1) Waterfall Model
[Diagram: the phases flow one after another, ending with UAT (and the UAT report) and delivery & maintenance to the client.]
Advantages:
It is a simple model and easy to maintain; project implementation is very easy.
Drawbacks:
New changes cannot be incorporated in the middle of the project development.
2) Prototype Model
[Diagram: unclear requirements → software prototype → demo to the client in the client environment → confirmation → baselined SRS document.]
Advantages:
Whenever the customer is not clear with the requirements, this is the best model to gather the clear requirements.
Drawbacks:
It is not a complete model.
It is a time-consuming model.
The prototype has to be built at the company's cost.
The user may stick to the prototype and limit his requirements.
3) Evolutionary Model
[Diagram: initial requirements → development → application given to the user → feedback; if the user values and accepts the application it is baselined, otherwise the cycle repeats with the new requirements.]
Advantages
Whenever the customer's requirements are evolving, this is the best suitable model.
Drawbacks
4) Spiral Model
[Diagram: each cycle covers risk root cause analysis, planning for the next cycle, estimation and contingencies, and implementation, with re-funding between cycles.]
Advantages
This is the best-suited model for highly risk-based projects.
Drawbacks
It is a time-consuming and costly model, and project monitoring and maintenance are difficult.
5) Fish Model
Verification:
Verification is a process of checking conducted on each and every role of an organization, in order to check whether they are doing their tasks in the right manner according to the guidelines or not, right from the start of the process till the end of the process. Usually the documents are verified in this process of checking.
Validation
Validation is a process of checking conducted on the developed product in order to check whether it is working
according to the requirements or not.
[Diagram: Fish model — Requirements gathering (BRS), Analysis (SRS), Design (HLD, LLD), Coding (SCD), System Testing (black box testing), Delivery & Maintenance; the reviews of the documents form verification, and the testing forms validation.]
Advantages
As both verification and validation are done, the outcome of the Fish model is a quality product.
6) V-Model
[Diagram: verification on the left arm and validation on the right arm — the initial BRS maps to preparing the project plan and the test plan, and the analysis SRS maps to requirement phase testing.]
Advantages
As verification and validation are done along with test management, the outcome of the V-Model is a quality product.
Drawback
TYPES OF TESTING
1) Smoke Testing & Sanitary Testing
Some companies call it Sanitary Testing and also Smoke Testing. But some companies will say that, just before the release of the build, the developers will conduct an overall testing in order to check whether the build is proper for detailed testing or not; that is known as Smoke Testing. Once the build is released, the testers will once again conduct an overall testing in order to check whether the build is proper for further detailed testing or not; that is known as Sanitary Testing.
2) Regression Testing
It is a type of testing in which one will perform testing on already tested functionality again and again. This is usually done in two scenarios (situations).
Scenario 1:
Whenever the defects raised by the test engineers are rectified by the developers and the next build is released to the testing department, the test engineers will test the defect's functionality and its related functionalities once again.
Scenario 2:
Whenever some new changes are requested by the customer, those new features are incorporated by the developers and the next build is released to the testing department, the test engineers will once again test the already tested functionalities related to the new features. That is also known as regression testing.
Note:
Testing the new features for the first time is new testing, not regression testing.
3) Re-Testing:
It is a type of testing in which one will perform testing on the same function again and again with multiple sets of data, in order to come to a conclusion whether the functionality is working fine or not.
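Re-testing the same function with multiple sets of data can be sketched like this; the password rule and the data sets are invented for illustration:

```python
def is_strong_password(pw):
    """Hypothetical function under re-test: at least 8 characters
    and at least one digit."""
    return len(pw) >= 8 and any(c.isdigit() for c in pw)

# Multiple sets of data for the same function: (input, expected result).
data_sets = [
    ("abcdefg1", True),      # exactly 8 chars, has a digit
    ("abcdefgh", False),     # long enough, but no digit
    ("a1", False),           # too short
    ("longpassword9", True), # comfortably valid
]

results = [(pw, is_strong_password(pw) == expected) for pw, expected in data_sets]
print(all(ok for _, ok in results))  # True -> functionality is working fine
```

Only when every data set passes does the tester conclude the functionality is working fine.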
4) α-Testing:
It is a type of testing in which one (i.e., our test engineers) will perform user acceptance testing in our company in the presence of the customer.
Advantages:
If at all any defects are found, there is a chance of rectifying them immediately.
5) β-Testing:
It is a type of testing in which either third-party testers or end users will perform user acceptance testing at the client's place before the actual implementation.
6) Static Testing:
It is a type of testing in which one will perform testing on an application or its related factors without performing any actions.
Ex: GUI testing, document testing, code reviewing, etc.
7) Dynamic Testing:
It is a type of testing in which one will perform testing on the application by performing some actions.
Ex: Functional testing.
8) Installation Testing:
It is a type of testing in which one will install the application into the environment by following the guidelines given in the deployment document. If the installation is successful, one will come to the conclusion that the guidelines are correct; otherwise the guidelines are not correct.
9) Compatibility Testing:
It is a type of testing in which one may have to install the application into multiple environments, prepared with different combinations of environmental components, in order to check whether the application is suitable for those environments or not. This is usually done for products.
10) Mutation Testing:
For example, the developers will make many changes to the program and check its performance; that is known as mutation testing.
i) Authentication Testing:
It is a type of testing in which a test engineer will enter different combinations of user names and passwords in order to check whether only the authorized persons are able to access the application or not.
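A minimal sketch of such a check, with a made-up credential store:

```python
# Hypothetical store of authorized credentials (illustration only;
# real systems would never keep plain-text passwords).
AUTHORIZED = {"admin": "s3cret"}

def login(user, password):
    """Accept only an authorized user with the matching password."""
    return AUTHORIZED.get(user) == password

# Different combinations of user names and passwords, as in the text.
combinations = [
    ("admin", "s3cret"),   # valid user, valid password -> accepted
    ("admin", "wrong"),    # valid user, invalid password -> rejected
    ("guest", "s3cret"),   # invalid user -> rejected
    ("", ""),              # blank credentials -> rejected
]

for user, pw in combinations:
    print(user or "<blank>", "->", "accepted" if login(user, pw) else "rejected")
```

Only the first combination should be accepted; every other one must be rejected for the check to pass.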
Ad-hoc Testing:
It is a type of testing in which one will perform testing on the application in his own style after understanding the requirements clearly.
SOFTWARE TESTING PROCESS
It contains 6 phases.
1. TEST PLANNING.
2. TEST DEVELOPMENT.
3. TEST EXECUTION.
4. RESULT ANALYSIS.
5. BUG TRACKING.
6. REPORTING.
1) TEST PLANNING
Plan:
A plan is a strategic document which describes how to perform a task in an effective, efficient and optimized way.
Optimization:
Optimization is a process of reducing or fully utilizing the input resources while getting the maximum possible output.
Test Plan:
It is a strategic document which describes how to perform testing on an application in an effective, efficient and optimized way. The test lead prepares the test plan.
1.0 INTRODUCTION.
1.1 Objective.
1.2 Reference Documents.
2.0 COVERAGE OF TESTING.
2.1 Features to be Tested.
2.2 Features not to be Tested.
3.0 TEST STRATEGY.
3.1 Levels of Testing.
3.2 Types of Testing.
3.3 Test Design Techniques.
3.4 Configuration Management.
3.5 Test Metrics.
3.6 Terminology.
3.7 Automation Plan.
3.8 List of Automated Tools.
4.0 BASE CRITERIA.
4.1 Acceptance Criteria.
4.2 Suspension Criteria.
5.0 TEST DELIVERABLES.
6.0 TEST ENVIRONMENT.
7.0 RESOURCE PLANNING.
8.0 SCHEDULING.
9.0 STAFFING AND TRAINING.
10.0 RISKS AND CONTINGENCIES.
11.0 ASSUMPTIONS.
12.0 APPROVAL INFORMATION.
1.0 INTRODUCTION.
1.1 Objective.
The main purpose of the document is clearly described in this section.
TEST PLAN
It is defined as a project-level term which describes how to test a particular project in an organization.
Note:
The test strategy is common for all the projects, but the test plan varies from project to project.
3.6 Terminology
The list of all the terms and their corresponding meanings is given in this section.
8.0 SCHEDULING.
The starting and ending dates of each and every task are clearly described in this section.
Contingencies
1. Ensure a proper plan.
2. People need to be maintained on the bench.
3. What is not to be tested has to be planned properly.
4. Severity- and priority-based execution.
5. Proper training needs to be provided.
11.0 ASSUMPTIONS.
The list of all the assumptions that are to be made by the test engineer is given in this section.
2. TEST DEVELOPMENT
1. Test Objective:
The purpose of the document is clearly described in this section.
2. Test Scenarios:
The list of all the situations that are to be tested is given in this section.
3. Test Procedure:
A test procedure is a functionality-level term which describes how to test a functionality. So in this section one will describe the plan for testing the functionality.
4. Test Data:
The data that is required for testing is made available in this section.
5. Test Cases:
The list of all the detailed test cases is given in this section.
Note:
Some companies maintain all the above five fields individually for each and every scenario, but some companies maintain them commonly for all the scenarios.
3. TEST EXECUTION.
During the test execution phase the test engineer will execute the test cases on the released build.
4. RESULT ANALYSIS.
In this phase the test engineer will compare the expected value with the actual value, and mention the result as pass if both match; otherwise the result is mentioned as fail.
5. BUG TRACKING.
Bug tracking is a process in which the defects are identified, isolated and managed.
DEFECT PROFILE DOCUMENT
Defect ID:
The sequence of defect numbers is listed in this section.
Steps to Reproduce:
The list of all the steps followed by the test engineer to identify the defect is given in this section.
Submitter:
The name of the test engineer who submitted the defect is mentioned in this section.
Date of Submission:
The date on which the defect was submitted is mentioned in this section.
Version Number:
The corresponding version number is mentioned in this section.
Build Number:
The corresponding build number is mentioned in this section.
Assigned to:
The project lead or development lead will mention the name of the developer to whom the defect is assigned.
Severity:
How serious the defect is, is described in terms of severity. It is classified into 4 types:
1. FATAL (Sev1 / S1)
2. MAJOR (Sev2 / S2)
3. MINOR (Sev3 / S3)
4. SUGGESTION (Sev4 / S4)
FATAL:
If the problems are related to navigational blocks or unavailability of functionality, then such problems are treated as FATAL defects.
Usually the FATAL defects are given CRITICAL priority, MAJOR defects are given HIGH priority, MINOR defects are given MEDIUM priority and SUGGESTION defects are given LOW priority; but depending upon the situation the priority may be changed by the project lead or development lead.
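The default severity-to-priority mapping described above, together with the lead's case-by-case override, can be sketched as:

```python
# Default mapping from the four severity levels to priorities,
# as described in the text.
DEFAULT_PRIORITY = {
    "FATAL": "CRITICAL",
    "MAJOR": "HIGH",
    "MINOR": "MEDIUM",
    "SUGGESTION": "LOW",
}

def assign_priority(severity, override=None):
    """Return the default priority for a severity, unless the project
    lead or development lead overrides it for the situation."""
    return override or DEFAULT_PRIORITY[severity]

print(assign_priority("FATAL"))                   # CRITICAL
print(assign_priority("MINOR", override="HIGH"))  # low severity, high priority
```

The second call mirrors the "low severity, high priority" case discussed below.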
Ex:
Low Severity, High Priority Case:
In the case of a customer visit, all the look-and-feel defects, which are usually less severe, are given the highest priority.
BUG LIFE CYCLE
[Diagram: a defect found in testing is set to New/Open; the developers check whether it is really a defect — if not, it may be a tester's mistake or as-per-design, and testing continues; if yes, they rectify it and the tester retests it in the next build; if it is really rectified the status is set to Closed, otherwise it is Reopened and the cycle repeats.]
New / Open:
Whenever the defect is found for the first time, the test engineer will set the status as New/Open. But some companies will say to set the status as only New in this situation, and once the developers accept the defect they will set the status as Open.
If they feel it is rectified, they will set the status as Closed; otherwise they will set the status as Reopen.
Fault / Failure:
If the customer identifies the problem after delivery, it is called a Fault/Failure.
6. BUG REPORTING.
1) Classical Bug Reporting Process:
[Diagram: the test engineers report defects to the Test Lead (TL), who forwards them to the Project Lead (PL) and on to the developers, through a common repository.]
2) Bug Tracking Tool (BTT):
It is a software application that can be accessed only by authorized persons, and is used for managing the complete bug tracking process by providing all the facilities along with a defect profile template.
Note:
At the end of the testing process, usually the test lead will prepare the test summary report, which is also called the test closure.
TEST DESIGN TECHNIQUES:
While developing the test cases, if at all the test engineer finds complexity in some areas, then to overcome that complexity the test engineer will usually use the test design techniques.
Ex: Develop the test cases for an e-mail text box whose validations are as follows.
Requirements:
1. It should accept a minimum of 4 characters and a maximum of 20 characters.
2. It should accept only small (lower-case) characters.
3. It should accept only the @ and _ special symbols.

Valid: 4 characters, 5 characters, 12 characters, 19 characters, 20 characters, a–z, @, _
Invalid: 3 characters, 21 characters, A–Z, 0–9, all the special symbols apart from @ and _, alphanumeric values, blank space, decimal numbers.
Test Case ID | Type | Test Case Description                        | Expected Value
1            | +ve  | Enter the values as per the valid input table (VIT).   | It should accept.
2            | -ve  | Enter the values as per the invalid input table (IIT). | It should not accept.

Sl No | Valid Input           | Sl No | Invalid Input
1     | abcd                  | 1     | abc
2     | ab@zx                 | 2     | ABCD
3     | abcdabcd@ab_          | 3     | ABCD123
4     | abcdabcddcbaaccd_@z   | 4     | 12345.5
6     | abcdabcdabcdabcd_xyz  | 6     | abcdabcd-----abc*#)
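The three requirements above collapse into a single rule: 4 to 20 characters drawn from a–z, @ and _. The regex-based checker below is only an illustrative sketch (not part of the original example), run against the valid and invalid input tables:

```python
import re

def is_valid(value):
    """E-mail text box rule from the requirements: 4-20 characters,
    lower-case letters only, plus the @ and _ special symbols."""
    return re.fullmatch(r"[a-z@_]{4,20}", value) is not None

# Inputs taken from the valid/invalid tables above.
valid_inputs = ["abcd", "ab@zx", "abcdabcd@ab_",
                "abcdabcddcbaaccd_@z", "abcdabcdabcdabcd_xyz"]
invalid_inputs = ["abc", "ABCD", "ABCD123", "12345.5", "abcdabcd-----abc*#)"]

print(all(is_valid(v) for v in valid_inputs))    # every valid input accepted
print(any(is_valid(v) for v in invalid_inputs))  # no invalid input accepted
```

If any entry in the invalid column were accepted, the corresponding negative test case (-ve) would fail.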
TESTING-TYPES
Acceptance Testing: Testing conducted to enable a user/customer to determine whether to accept a software
product. Normally performed to validate the software meets a set of agreed acceptance criteria.
Accessibility Testing: Verifying a product is accessible to people with disabilities (deaf, blind, mentally disabled, etc.).
Ad Hoc Testing: Similar to exploratory testing, but often taken to mean that the testers have significant
understanding of the software before testing it.
Agile Testing: Testing practice for projects using agile methodologies, treating development as the customer of
testing and emphasizing a test-first design paradigm.
Alpha testing: Testing of an application when development is nearing completion; minor design changes may
still be made as a result of such testing. Typically done by end-users or others, not by programmers or testers.
Application Binary Interface (ABI): A specification defining requirements for portability of applications in binary
forms across different system platforms and environments.
Application Programming Interface (API): A formalized set of software calls and routines that can be
referenced by an application program in order to access supporting system or network services.
Automated Software Quality (ASQ): The use of software tools, such as automated testing tools, to improve
software quality.
Automated Testing:
Testing employing software tools which execute tests without manual intervention. Can be applied in
GUI, performance, API, etc. testing.
The use of software to control the execution of tests, the comparison of actual outcomes to predicted
outcomes, the setting up of test preconditions, and other test control and test reporting functions.
Backus-Naur Form: A metalanguage used to formally describe the syntax of a language.
Basic Block: A sequence of one or more consecutive, executable statements containing no branches.
Basis Path Testing: A white box test case design technique that uses the algorithmic flow of the program to
design tests.
Basis Set: The set of tests derived using basis path testing.
Baseline: The point at which some deliverable produced during the software engineering process is put under
formal change control.
Beta Testing: Testing when development and testing are essentially completed and final bugs and problems
need to be found before final release. Typically done by end-users or others, not by programmers or testers.
Binary Portability Testing: Testing an executable application for portability across system platforms and environments, usually for conformance to an ABI specification.
Black Box Testing: Testing based on an analysis of the specification of a piece of software without reference to
its internal workings. The goal is to test how well the component conforms to the published requirements for the
component.
Bottom Up Testing: An approach to integration testing where the lowest level components are tested first, then
used to facilitate the testing of higher level components. The process is repeated until the component at the top
of the hierarchy is tested.
Boundary Testing: Tests which focus on the boundary or limit conditions of the software being tested. (Some of these tests are stress tests.)
Bug: A fault in a program, which causes the program to perform in an unintended or unanticipated manner.
Boundary Value Analysis: BVA is similar to Equivalence Partitioning but focuses on "corner cases" or values
that are usually out of range as defined by the specification. It means that if a function expects all values in range
of negative 100 to positive 1000, test inputs would include negative 101 and positive 1001.
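The -100 to +1000 example can be sketched as a small boundary-value generator; the `in_range` function below is a hypothetical stand-in for the real component under test:

```python
def bva_inputs(low, high):
    """Classic boundary value analysis inputs for a numeric range:
    just outside, exactly on, and just inside each boundary."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

def in_range(x, low=-100, high=1000):
    """Stand-in for the function under test, which should accept
    all values in the range low..high."""
    return low <= x <= high

for x in bva_inputs(-100, 1000):
    print(x, "->", "accept" if in_range(x) else "reject")
# -101 and 1001 should be rejected; the other four values accepted.
```

Only the two out-of-range values (-101 and 1001) fall outside the specification, which is exactly what BVA is designed to probe.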
Branch Testing: Testing in which all branches in the program source code are tested at least once.
Breadth Testing: A test suite that exercises the full functionality of a product but does not test features in detail.
Capture/Replay Tool: A test tool that records test input as it is sent to the software under test. The input cases
stored can then be used to reproduce the test at a later time. Most commonly applied to GUI test tools.
CMM: The Capability Maturity Model for Software (CMM or SW-CMM) is a model for judging the maturity of the
software processes of an organization and for identifying the key practices that are required to increase the
maturity of these processes.
Cause Effect Graph: A graphical representation of inputs and their associated output effects, which can be used to design test cases.
Code Complete: Phase of development where functionality is implemented in entirety; bug fixes are all that are
left. All functions found in the Functional Specifications have been implemented.
Code Coverage: An analysis method that determines which parts of the software have been executed
(covered) by the test case suite and which parts have not been executed and therefore may require additional
attention.
Code Inspection: A formal testing technique where the programmer reviews source code with a group who ask
questions analyzing the program logic, analyzing the code with respect to a checklist of historically common
programming errors, and analyzing its compliance with coding standards.
Code Walkthrough: A formal testing technique where source code is traced by a group with a small set of test
cases, while the state of program variables is manually monitored, to analyze the programmer's logic and
assumptions.
Compatibility Testing: Testing whether software is compatible with other elements of a system with which it
should operate, e.g. browsers, Operating Systems, or hardware.
Concurrency Testing: Multi-user testing geared towards determining the effects of accessing the same
application code, module or database records. Identifies and measures the level of locking, deadlocking and use
of single-threaded code and locking semaphores.
Conformance Testing: The process of testing that an implementation conforms to the specification on which it
is based. Usually applied to testing conformance to a formal standard.
Context Driven Testing: Context-driven testing is a flavor of Agile Testing that advocates continuous and creative evaluation of testing opportunities in light of the potential information revealed and the value of that information to the organization right now. It can also be defined as testing driven by an understanding of the environment, culture, and intended use of software. For example, the testing approach for life-critical medical equipment software would be completely different from that for a low-cost computer game.
Conversion Testing: Testing of programs or procedures used to convert data from existing systems for use in
replacement systems.
Cyclomatic Complexity: A measure of the logical complexity of an algorithm, used in white-box testing.
Data Flow Diagram: A modeling notation that represents a functional decomposition of a system.
Data Driven Testing: Testing in which the action of a test case is parameterized by externally defined data
values, maintained as a file or spreadsheet. A common technique in Automated Testing.
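A minimal data-driven sketch, using an in-memory CSV in place of a real file or spreadsheet and a made-up `add` function under test:

```python
import csv
import io

# Externally defined data values; io.StringIO stands in for a real
# CSV file or exported spreadsheet.
test_data = io.StringIO("a,b,expected\n2,3,5\n10,-4,6\n0,0,0\n")

def add(a, b):
    """Hypothetical function under test."""
    return a + b

# The test action is the same for every row; only the data varies.
failures = []
for row in csv.DictReader(test_data):
    a, b, expected = int(row["a"]), int(row["b"]), int(row["expected"])
    if add(a, b) != expected:
        failures.append(row)

print("failures:", failures)
```

Adding a new test case is then just a matter of appending a row of data, with no change to the test code itself.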
Dependency Testing: Examines an application's requirements for pre-existing software, initial states and
configuration in order to maintain proper functionality.
Dynamic Testing: Testing software through executing it. See also Static Testing.
Endurance Testing: Checks for memory leaks or other problems that may occur with prolonged execution.
End-to-End testing: Testing a complete application environment in a situation that mimics real-world use, such
as interacting with a database, using network communications, or interacting with other hardware, applications,
or systems if appropriate.
Equivalence Class: A portion of a component's input or output domains for which the component's behavior is
assumed to be the same from the component's specification.
Equivalence Partitioning: A test case design technique for a component in which test cases are designed to
execute representatives from equivalence classes.
Exhaustive Testing: Testing which covers all combinations of input values and preconditions for an element of
the software under test.
Exploratory testing: Often taken to mean a creative, informal software test that is not based on formal test
plans or test cases; testers may be learning the software as they test it.
Failover Tests: Failover tests verify redundancy mechanisms while under load. For example, such testing determines what will happen if multiple web servers are being used under anticipated peak load, and one of them dies. Does the load balancer react quickly enough? Can the other web servers handle the sudden dumping of extra load? This sort of testing allows technicians to address problems in advance, in the comfort of a testing situation, rather than in the heat of a production outage.
Functional Decomposition: A technique used during planning, analysis and design; creates a functional
hierarchy for the software.
Functional Specification: A document that describes in detail the characteristics of the product with regard to its intended features.
Functional Testing:
Testing the features and operational behavior of a product to ensure they correspond to its specifications.
Testing that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions.
Gray Box Testing: A combination of Black Box and White Box testing methodologies: testing a piece of
software against its specification but using some knowledge of its internal workings.
Incremental Integration testing: Continuous testing of an application as new functionality is added; requires
that various aspects of an application's functionality be independent enough to work separately before all parts
of the program are completed, or that test drivers be developed as needed; done by programmers or by testers.
Independent Test Group (ITG): A group of people whose primary responsibility is software testing.
Inspection: A group review quality improvement process for written material. It consists of two aspects: product improvement (of the document itself) and process improvement (of both document production and inspection).
Integration Testing: Testing of combined parts of an application to determine if they function together correctly.
Usually performed after unit and functional testing. This type of testing is especially relevant to client/server and
distributed systems.
Installation Testing: Confirms that the application under test recovers from expected or unexpected events
without loss of data or functionality. Events can include shortage of disk space, unexpected loss of
communication, or power out conditions.
Load Testing: Load Tests are end to end performance tests under anticipated production load. The primary
objective of this test is to determine the response times for various time critical transactions and business
processes and that they are within documented expectations (or Service Level Agreements - SLAs). The test
also measures the capability of the application to function correctly under load, by measuring transaction
pass/fail/error rates.
This is a major test, requiring substantial input from the business, so that anticipated activity can be accurately
simulated in a test situation. If the project has a pilot in production then logs from the pilot can be used to
generate ‘usage profiles’ that can be used as part of the testing process, and can even be used to ‘drive’ large
portions of the Load Test.
Load testing must be executed on “today’s” production size database, and optionally with a “projected”
database. If some database tables will be much larger in some months time, then Load testing should also be
conducted against a projected database. It is important that such tests are repeatable as they may need to be
executed several times in the first year of wide scale deployment, to ensure that new releases and changes in
database size do not push response times beyond prescribed SLAs.
Localization Testing: This term refers to adapting software for a specific locality.
Loop Testing: A white box testing technique that exercises program loops.
Metric: A standard of measurement. Software metrics are the statistics describing the structure or content of a
program. A metric should be a real objective measurement of something such as number of bugs per lines of
code.
Monkey Testing: Testing a system or an application on the fly, i.e. with just a few tests here and there, to ensure the system or application does not crash.
Mutation testing: A method for determining if a set of test data or test cases is useful, by deliberately
introducing various code changes ('bugs') and retesting with the original test data/cases to determine if the 'bugs'
are detected. Proper implementation requires large computational resources.
Network Sensitivity Tests: Network sensitivity tests are tests that set up scenarios of varying types of network
activity (traffic, error rates...), and then measure the impact of that traffic on various applications that are
bandwidth dependant. Very 'chatty' applications can appear to be more prone to response time degradation
under certain conditions than other applications that actually use more bandwidth. For example, some
applications may degrade to unacceptable levels of response time when a certain pattern of network traffic uses
50% of available bandwidth, while other applications are virtually un-changed in response time even with 85% of
available bandwidth consumed elsewhere.
This is a particularly important test for deployment of a time critical application over a WAN.
Negative Testing: Testing aimed at showing software does not work. Also known as "test to fail".
N+1 Testing: A variation of Regression Testing. Testing conducted with multiple cycles in which errors found in
test cycle N are resolved and the solution is retested in test cycle N+1. The cycles are typically repeated until the
solution reaches a steady state and there are no errors. See also Regression Testing.
Path Testing: Testing in which all paths in the program source code are tested at least once.
Performance Testing: Testing conducted to evaluate the compliance of a system or component with specified
performance requirements. Often this is performed using an automated test tool to simulate a large number of
users. Also known as "Load Testing".
Performance Tests are tests that determine end to end timing (benchmarking) of various time critical business
processes and transactions, while the system is under low load, but with a production sized database. This sets
‘best possible’ performance expectation under a given configuration of infrastructure. It also highlights very early
in the testing process if changes need to be made before load testing should be undertaken. For example, a
customer search may take 15 seconds in a full sized database if indexes had not been applied correctly, or if an
SQL 'hint' was incorporated in a statement that had been optimized with a much smaller database. Such
performance testing would highlight such a slow customer search transaction, which could be remediated prior to
a full end to end load test.
Positive Testing: Testing aimed at showing software works. Also known as "test to pass".
Protocol Tests: Protocol tests involve the mechanisms used in an application, rather than the applications
themselves. For example, a protocol test of a web server will typically involve a number of HTTP interactions that
would typically occur if a web browser were to interact with a web server - but the test would not be done using a
web browser. LoadRunner is usually used to drive load into a system using VUGen at a protocol level, so that a
small number of computers (Load Generators) can be used to simulate many thousands of users.
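A protocol-level test can be sketched without any browser at all. The example below is illustrative (it is not LoadRunner/VUGen itself): it starts a tiny in-process HTTP server and then exercises it over raw HTTP, the way a protocol-level tool would:

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):      # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)   # port 0 picks a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Protocol-level check: raw HTTP interaction, no browser involved
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/status")
resp = conn.getresponse()
body = resp.read()
print(resp.status, body)
server.shutdown()
```

Because only the protocol is driven, thousands of such sessions can be multiplexed from a single machine, which is exactly why load generators work at this level.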
Quality Assurance: All those planned or systematic actions necessary to provide adequate confidence that a
product or service is of the type and quality needed and expected by the customer.
Quality Audit: A systematic and independent examination to determine whether quality activities and related
results comply with planned arrangements and whether these arrangements are implemented effectively and are
suitable to achieve objectives.
Quality Circle: A group of individuals with related interests that meet at regular intervals to consider problems or
other matters related to the quality of outputs of a process and to the correction of problems or to the
improvement of quality.
Quality Control: The operational techniques and the activities used to fulfill and verify requirements of quality.
Quality Management: That aspect of the overall management function that determines and implements the
quality policy.
Quality Policy: The overall intentions and direction of an organization as regards quality as formally expressed
by top management.
Quality System: The organizational structure, responsibilities, procedures, processes, and resources for
implementing quality management.
Race Condition: A cause of concurrency problems. Multiple accesses to a shared resource, at least one of
which is a write, with no mechanism used by either to moderate simultaneous access.
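A minimal sketch of the moderating mechanism: a lock serializes access to the shared counter so concurrent read-modify-write operations are not lost (remove the lock and the final count may come up short):

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:          # the moderating mechanism; without it, concurrent
            counter += 1    # read-modify-writes on `counter` can be lost

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)   # 400000, deterministic only because of the lock
```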
Ramp Testing: Continuously raising an input signal until the system breaks down.
Recovery Testing: Confirms that the program recovers from expected or unexpected events without loss of
data or functionality. Events can include shortage of disk space, unexpected loss of communication, or power
out conditions.
Regression Testing: Retesting a previously tested program following modification to ensure that faults have not
been introduced or uncovered as a result of the changes made.
Release Candidate: A pre-release version, which contains the desired functionality of the final version, but
which needs to be tested for bugs (which ideally should be removed before the final version is released).
Sanity Testing: Brief test of major functional elements of a piece of software to determine if it’s basically
operational. See also Smoke Testing.
Scalability Testing: Performance testing focused on ensuring the application under test gracefully handles
increases in work load.
Security Testing: Testing which confirms that the program can restrict access to authorized personnel and that
the authorized personnel can access the functions available to their security level.
Smoke Testing: Typically an initial testing effort to determine if a new software version is performing well
enough to accept it for a major testing effort. For example, if the new software is crashing systems every 5
minutes, bogging down systems to a crawl, or corrupting databases, the software may not be in a 'sane' enough
condition to warrant further testing in its current state.
Soak Testing: Soak testing is running a system at high levels of load for prolonged periods of time. A soak test
would normally execute several times more transactions in an entire day (or night) than would be expected in a
busy day, to identify any performance problems that appear after a large number of transactions have been
executed. Also, due to memory leaks and other defects, it is possible that a system may ‘stop’ working after a
certain number of transactions have been processed. It is important to identify such situations in a test
environment.
Sociability (sensitivity) Tests: Sensitivity analysis testing can determine impact of activities in one system on
another related system. Such testing involves a mathematical approach to determine the impact that one
system will have on another system. For example, web enabling a customer 'order status' facility may impact on
performance of telemarketing screens that interrogate the same tables in the same database. The issue of web
enabling can be that it is more successful than anticipated and can result in many more enquiries than originally
envisioned, which loads the IT systems with more work than had been planned.
Static Analysis: Analysis of a program carried out without executing the program.
Static Testing: Analysis of a program carried out without executing the program.
Storage Testing: Testing that verifies the program under test stores data files in the correct directories and that
it reserves sufficient space to prevent unexpected termination resulting from lack of space. This is external
storage as opposed to internal storage.
Stress Testing: Stress Tests determine the load under which a system fails, and how it fails. This is in contrast
to Load Testing, which attempts to simulate anticipated load. It is important to know in advance if a ‘stress’
situation will result in a catastrophic system failure, or if everything just “goes really slow”. There are various
varieties of Stress Tests, including spike, stepped and gradual ramp-up tests. Catastrophic failures require
restarting various infrastructure components and contribute to downtime, a stressful environment for support staff and
managers, as well as possible financial losses. This test is one of the most fundamental load and performance
tests.
Structural Testing: Testing based on an analysis of internal workings and structure of a piece of software. See
also White Box Testing.
System Testing: Testing that attempts to discover defects that are properties of the entire system rather than of
its individual components. It’s a black-box type testing that is based on overall requirements specifications;
covers all combined parts of a system.
Targeted Infrastructure Test: Targeted Infrastructure Tests are isolated tests of each layer and/or component
in an end to end application configuration. It includes communications infrastructure, Load Balancers, Web
Servers, Application Servers, Crypto cards, Citrix Servers, Database… allowing for identification of any
performance issues that would fundamentally limit the overall ability of a system to deliver at a given
performance level.
Each test can be quite simple. For example, a test ensuring that 500 concurrent (idle) sessions can be
maintained by Web Servers and related equipment should be executed prior to a full 500 user end to end
performance test, as a configuration file somewhere in the system may limit the number of users to less than
500. It is much easier to identify such a configuration issue in a Targeted Infrastructure Test than in a full end to
end test.
Testability: The degree to which a system or component facilitates the establishment of test criteria and the
performance of tests to determine whether those criteria have been met.
Testing:
The process of exercising software to verify that it satisfies specified requirements and to detect errors.
The process of analyzing a software item to detect the differences between existing and required
conditions (that is, bugs), and to evaluate the features of the software item.
The process of operating a system or component under specified conditions, observing or recording the
results, and making an evaluation of some aspect of the system or component.
Test Bed: An execution environment configured for testing. May consist of specific hardware, OS, network
topology, configuration of the product under test, other application or system software, etc. The Test Plan for a
project should enumerate the test bed(s) to be used.
Test Case:
Test Case is a commonly used term for a specific test. This is usually the smallest unit of testing. A Test
Case will consist of information such as requirements testing, test steps, verification steps, prerequisites,
outputs, test environment, etc.
A set of inputs, execution preconditions, and expected outcomes developed for a particular objective,
such as to exercise a particular program path or to verify compliance with a specific requirement.
Test Driven Development: Testing methodology associated with Agile Programming in which every chunk of
code is covered by unit tests, which must all pass all the time, in an effort to eliminate unit-level and regression
bugs during development. Practitioners of TDD write a lot of tests, often roughly as many lines of test code as
production code.
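The test-first rhythm can be illustrated in miniature. The sketch below assumes a hypothetical `slugify` function as the unit under development: the tests are written first, then just enough production code is written to make them pass:

```python
import unittest

# Step 1 (test first): specify the behaviour before writing the production code.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_joins_words(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_surrounding_spaces(self):
        self.assertEqual(slugify("  Trim Me  "), "trim-me")

# Step 2: write just enough production code to make the tests pass.
def slugify(text: str) -> str:
    return "-".join(text.strip().lower().split())

unittest.main(argv=["tdd-demo"], exit=False, verbosity=0)
```

In real TDD the cycle repeats per small behaviour: add a failing test, make it pass, refactor, keeping all tests green at all times.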
Test Driver: A program or test tool used to execute a test. Also known as a Test Harness.
Test Environment: The hardware and software environment in which tests will be run, and any other software
with which the software under test interacts when under test including stubs and test drivers.
Test First Design: Test-first design is one of the mandatory practices of Extreme Programming (XP). It requires
that programmers do not write any production code until they have first written a unit test.
Test Harness: A program or test tool used to execute a test. Also known as a Test Driver.
Test Plan: A document describing the scope, approach, resources, and schedule of intended testing activities. It
identifies test items, the features to be tested, the testing tasks, who will do each task, and any risks requiring
contingency planning.
Test Procedure: A document providing detailed instructions for the execution of one or more test cases.
Test Script: Commonly used to refer to the instructions for a particular test that will be carried out by an
automated test tool.
Test Specification: A document specifying the test approach for a software feature or combination of features
and the inputs, predicted results and execution conditions for the associated tests.
Test Suite: A collection of tests used to validate the behavior of a product. The scope of a Test Suite varies from
organization to organization. There may be several Test Suites for a particular product for example. In most
cases however a Test Suite is a high level concept, grouping together hundreds or thousands of tests related by
what they are intended to test.
Test Tools: Computer programs used in the testing of a system, a component of the system, or its
documentation.
Thread Testing: A variation of top-down testing where the progressive integration of components follows the
implementation of subsets of the requirements, as opposed to the integration of components by successively
lower levels.
Thick Client Application Tests: A Thick Client (also referred to as a fat client) is a purpose built piece of
software that has been developed to work as a client with a server. It often has substantial business logic
embedded within it, beyond the simple validation that is able to be achieved through a web browser. A thick
client is often able to be very efficient with the amount of data that is transferred between it and its server, but is
also often sensitive to any poor communications links. Testing tools such as WinRunner can be used to
drive a Thick Client, so that response time can be measured under a variety of circumstances within a testing
regime.
Developing a load test based on thick client activity usually requires significantly more effort for the coding stage
of testing, as VUGen must be used to simulate the protocol between the client and the server. That protocol
may be database connection based, COM/DCOM based, a proprietary communications protocol or even a
combination of protocols.
Thin Client Application Tests: An internet browser that is used to run an application is said to be a thin client.
But even thin clients can consume substantial amounts of CPU time on the computer that they are running on.
This is particularly the case with complex web pages that utilize many recently introduced features to liven up a
web page. Rendering a page after hitting a SUBMIT button may take several seconds even though the server
may have responded to the request in less than one second. Testing tools such as WinRunner are able to be
used to drive a Thin Client, so that response time can be measured from a user’s perspective, rather than from a
protocol level.
Top Down Testing: An approach to integration testing where the component at the top of the component
hierarchy is tested first, with lower level components being simulated by stubs. Tested components are then
used to test lower level components. The process is repeated until the lowest level components have been
tested.
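A tiny sketch of the idea, with all names illustrative: the top-level component is tested first, with the lower-level component simulated by a stub that returns canned data:

```python
# Top-level component under test; `fetch_orders` is the lower-level component.
def report_summary(fetch_orders):
    """Summarize order totals from whatever order source is supplied."""
    orders = fetch_orders()
    return {"count": len(orders), "total": sum(o["amount"] for o in orders)}

# Stub simulating the not-yet-integrated lower-level component
def stub_fetch_orders():
    return [{"amount": 10.0}, {"amount": 32.5}]

summary = report_summary(stub_fetch_orders)
print(summary)   # {'count': 2, 'total': 42.5}
```

Once the real `fetch_orders` component exists and passes its own tests, it replaces the stub and the process repeats one level down.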
Traceability Matrix: A document showing the relationship between Test Requirements and Test Cases.
Tuning Cycle Tests: A series of test cycles can be executed with a primary purpose of identifying tuning
opportunities. Tests can be refined and re-targeted 'on the fly' to allow technology support staff to make
configuration changes so that the impact of those changes can be immediately measured.
Usability Testing: Testing the ease with which users can learn and use a product.
Use Case: The specification of tests that are conducted from the end-user perspective. Use cases tend to focus
on operating software as an end-user would conduct their day-to-day activities.
User Acceptance Testing: A formal product evaluation performed by a customer as a condition of purchase.
Unit Testing: The most 'micro' scale of testing; to test particular functions or code modules. Typically done by
the programmer and not by testers, as it requires detailed knowledge of the internal program design and code.
Not always easily done unless the application has a well-designed architecture with tight code; may require
developing test driver modules or test harnesses.
Verification: The process of determining whether or not the products of a given phase of the software
development cycle meet the implementation steps and can be traced to the incoming objectives established
during the previous phase. The techniques for verification are testing, inspection and reviewing.
Volume Testing: Testing which confirms that any values that may become large over time (such as
accumulated counts, logs, and data files), can be accommodated by the program and will not cause the program
to stop working or degrade its operation in any manner.
Volume Tests are often most appropriate to Messaging, Batch and Conversion processing type situations. In a
Volume Test, there is often no such measure as Response time. Instead, there is usually a concept of
Throughput.
A key to effective volume testing is the identification of the relevant capacity drivers. A capacity driver is
something that directly impacts on the total processing capacity. For a messaging system, a capacity driver may
well be the size of messages being processed. For batch processing, the type of records in the batch as well as
the size of the database that the batch process interfaces with will have an impact on the number of batch
records that can be processed per second.
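Since the key measure is throughput rather than response time, a Volume Test harness can be sketched as below (the record-processing step is a placeholder for the real batch work, and all names are illustrative):

```python
import time

def process_record(record: dict) -> None:
    # placeholder for the real batch-processing work
    _ = record["id"] * 2

def measure_throughput(records) -> float:
    """Return records processed per second, the key Volume Test measure."""
    start = time.perf_counter()
    for r in records:
        process_record(r)
    elapsed = time.perf_counter() - start
    return len(records) / elapsed if elapsed > 0 else float("inf")

batch = [{"id": i} for i in range(50_000)]
print(f"throughput: {measure_throughput(batch):,.0f} records/sec")
```

Varying the capacity drivers (record size, record type, database size) and re-measuring throughput is what turns this into a proper Volume Test.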
Walkthrough: A review of requirements, designs or code characterized by the author of the material under
review guiding the progression of the review.
White Box Testing: Testing based on an analysis of internal workings and structure of a piece of software.
Includes techniques such as Branch Testing and Path Testing. Also known as Structural Testing and Glass Box
Testing. Contrast with Black Box Testing.
Workflow Testing: Scripted end-to-end testing which duplicates specific workflows which are expected to be
utilized by the end-user.
A common type of automated tool is the 'record/playback' type. For example, a tester could click through
all combinations of menu choices, dialog box choices, buttons, etc. in an application GUI and have them
'recorded' and the results logged by a tool. The 'recording' is typically in the form of text based on a
scripting language that is interpretable by the testing tool. If new buttons are added, or some underlying
code in the application is changed, etc. the application might then be retested by just 'playing back' the
'recorded' actions, and comparing the logging results to check effects of the changes. The problem with
such tools is that if there are continual changes to the system being tested, the 'recordings' may have to
be changed so much that it becomes very time-consuming to continuously update the scripts.
Additionally, interpretation and analysis of results (screens, data, logs, etc.) can be a difficult task. Note
that there are record/playback tools for text-based interfaces also, and for all types of platforms.
Another common type of approach for automation of functional testing is 'data-driven' or 'keyword-driven'
automated testing, in which the test drivers are separated from the data and/or actions utilized in testing
(an 'action' would be something like 'enter a value in a text box'). Test drivers can be in the form of
automated test tools or custom-written testing software. The data and actions can be more easily
maintained - such as via a spreadsheet - since they are separate from the test drivers. The test drivers
'read' the data/action information to perform specified tests. This approach can enable more efficient
control, development, documentation, and maintenance of automated tests/test cases.
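A minimal data-driven sketch: the test data lives in a CSV-style table, separate from the driver, which reads each row and applies the named action. The action set and the data here are hypothetical, standing in for a real spreadsheet and a real application driver:

```python
import csv
import io

# Test data kept separate from the driver (e.g. maintained in a spreadsheet)
TEST_DATA = """action,field,value,expected
enter,username,alice,ok
enter,password,,error
"""

def apply_action(action, field, value):
    """Hypothetical driver step: only 'enter' is implemented in this sketch."""
    if action == "enter":
        return "error" if value == "" else "ok"
    raise ValueError(f"unknown action: {action}")

results = []
for row in csv.DictReader(io.StringIO(TEST_DATA)):
    outcome = apply_action(row["action"], row["field"], row["value"])
    results.append(outcome == row["expected"])

print(all(results))   # True: every data row behaved as expected
```

Because the data rows are plain text, testers can add or maintain cases without touching the driver code, which is the main benefit the passage describes.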
Use Case
It is a description of functionality of certain feature of an application in terms of actors, actions and response.
Snap Shot: [Login screen mock-up showing User Name, Password and Connect To fields, with OK, Clear and Cancel buttons]
Functional Requirements
1. The login screen should contain user name, password and connect to fields, and login, clear and cancel buttons.
2. The Connect To field is not mandatory, but it should allow the user to select a database option if requested,
so that the user can connect to the selected database while logging in.
3. Upon entering a valid username and a valid password and clicking on the login button, the corresponding page must be displayed.
4. Upon entering some information into any of the fields and clicking on clear button all the fields must be cleared and
the cursor should be placed in the user name field.
5. Upon clicking on cancel button login screen must be closed.
Pre – Conditions:
1. Initially, whenever the login screen is invoked (opened), the Login and Clear buttons must be disabled.
2. Cancel button must be always enabled.
3. Upon entering user name and password the login button must be enabled.
4. Upon entering some information into any of the fields, the Clear button must be enabled.
5. The tabbing order must be: User name, Password, Connect To, Login, Clear and Cancel.
Post – Conditions:
Either home page or admin page for valid users and error message for invalid users.
Flow of events:
Main Flow
Action: Actor invokes the application.
Response: Login screen is displayed with the following fields: username, password, connect to, login, clear and cancel.

Action: Actor enters a valid username, a valid password and clicks on the login button.
Response: Authentication; either the home page or the admin page is displayed, depending on the actor who entered.

Action: Actor enters a valid username, a valid password, selects a database option and clicks on the login button.
Response: Authentication; either the home page or the admin page is displayed with the mentioned database connection, depending on the actor who entered.

Action: Actor enters an invalid username, a valid password and clicks on the login button.
Response: Go to Alternative Flow Table 1.

Action: Actor enters a valid username and an invalid password and clicks on the login button.
Response: Go to Alternative Flow Table 2.

Action: Actor enters an invalid username and an invalid password and clicks on the login button.
Response: Go to Alternative Flow Table 3.

Action: Actor enters some information into any of the fields and clicks on the clear button.
Response: Go to Alternative Flow Table 4.
Alternative Flow Table 1 (Invalid Username)
Action: Actor enters an invalid username, a valid password and clicks on the login button.
Response: Authenticates; an error message is displayed: "Invalid Username Please Try Again".

Alternative Flow Table 2 (Invalid Password)
Action: Actor enters a valid username, an invalid password and clicks on the login button.
Response: Authenticates; an error message is displayed: "Invalid Password Please Try Again".

Alternative Flow Table 3 (Invalid Username and Password)
Action: Actor enters an invalid username, an invalid password and clicks on the login button.
Response: Authenticates; an error message is displayed: "Invalid password and username".

Alternative Flow Table 4 (Clear)
Action: Actor enters some information into any of the fields and clicks on the clear button.
Response: All the fields are cleared and the cursor is placed in the user name field.

Alternative Flow Table 5 (Cancel)
Action: Actor clicks on the cancel button.
Response: Login screen is closed.
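The flows above translate naturally into table-driven test cases. The sketch below uses a hypothetical `login` function standing in for the application's authentication logic, with one row per flow (the combined invalid username/password flow is omitted because this simplified stand-in reports the username error first):

```python
# Hypothetical stand-in for the application's authentication logic
VALID_USERS = {"admin": "secret"}

def login(username, password):
    if username not in VALID_USERS:
        return "Invalid Username Please Try Again"
    if VALID_USERS[username] != password:
        return "Invalid Password Please Try Again"
    return "home page"

# One row per flow in the use case tables above
flows = [
    ("admin", "secret", "home page"),                           # main flow
    ("nobody", "secret", "Invalid Username Please Try Again"),  # alt flow 1
    ("admin", "wrong", "Invalid Password Please Try Again"),    # alt flow 2
]
for user, pwd, expected in flows:
    assert login(user, pwd) == expected
print("all login flows pass")
```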
Guidelines to be followed by a test engineer once the use case document is given to him.
Functional points
A point where a user can perform some action is known as a functional point.
Traceability Matrix
It is a document which contains a table of linking information, used for tracing back for reference in any kind of
questionable or confusing situation.
[Example tables: one linking UCID (Use Case ID) to TCID (Test Case ID), another linking TCID to DPD.]
Quality Standards
Quality Standards are defined as a set of guidelines given to an organization to accomplish each and every task
in an effective, efficient and optimized way.
The BVQI prepares the Quality Standards for ISO as mentioned below:
1. All the companies are invited.
2. All the experts are asked to write their own procedures.
3. All the experts are requested to read all the procedures.
4. Voting is conducted.
5. The procedure with the highest votes is picked up, customized and finalized as the Quality Standard.
Assessment of Procedures: [diagram showing the Assessment Team Report (ATR) flowing from the company to BVQI]
BVQI: Bureau Veritas Quality International
ATL: Assessment Team Lead
ATM: Assessment Team Member
ATR: Assessment Team Report
BVQI will appoint ATLs and ATMs. The ATMs come to the company to perform the external audit, prepare a
report (ATR) and submit it to the ATL. The ATL reviews it and in turn sends it to BVQI. After finalizing
the report, BVQI gives the ISO certification.
ISO 9000: It is used for startup companies. A company which has started newly but does not yet have all the
resources can get ISO 9000.
ISO 9001: It is given to full-fledged companies. The company should have all five of the above departments of
its own; such companies are given ISO 9001. They should also follow the set of guidelines. These terms apply
to both IT and non-IT companies.
ISO 9002: In this certification, for the Planning/Designing department the company takes the help of an ISO
9001 company.
ISO 9003: It is purely meant for testing; only the Test department is available. Pure third-party testing
companies are certified under ISO 9003.
ISO 9004: Research & Development and the continual growth of the company are the focus.
CMM (Capability Maturity Model)
SEI CMM (Software Engineering Institute): A five-level model initially developed by Ph.D. researchers from the
Software Engineering Institute in the USA, which is why it is named SEI CMM.
CMM Level 1: INITIAL: The starting level, where processes are ad hoc and no defined guidelines exist.
CMM Level 2: REPEATABLE: At this level two guidelines are given: the organization should have well-defined
guidelines, and it should focus on reusability. The team should be technically very strong.
(Strong team, well-defined guidelines, and focus on reusability.)
CMM Level 3: DEFINED: Defined means documented; at CMM Level 3 the focus is on documentation.
(Strong team, well-defined guidelines, focus on reusability, and focus on documents.)
CMM Level 4: MANAGED: Each and every work item in the organization must be measured (metrics). The focus
is on metrics, on top of Levels 1, 2 and 3.
(Strong team, well-defined guidelines, focus on reusability, focus on documents, and focus on metrics.)
CMM Level 5: OPTIMIZING: The focus is on continuous process improvement.
CMM-P or PCMM: P stands for the capability and maturity of people; it focuses on people.
6-Sigma:
[Graph: quality plotted for each development cycle on a 10-unit six-sigma axis; the product is checked once the
plot spans 6 units.]
Six Sigma companies develop their products in multiple cycles, and for each cycle they plot a graph. Whenever
the graph spans 6 units on the six-sigma axis, they stop and check the product for quality, so that the quality
level is 99.99966% (about 3.4 defects per million opportunities) or above.
1. Define
2. Measure
3. Analyze
4. Improve
5. Control
Every company has to Define its own production process. After production, it should Measure the output against
the defined values, and Analyze the results against those defined values. If any mismatched items were
produced, it should Improve the production process so as to get output with less deviation. To do this it must
also Control the whole process, so that the output plots a graph with a 6-unit span.
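The Measure step often boils down to defects per million opportunities (DPMO), the core Six Sigma measure. A small sketch with illustrative figures:

```python
def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities, the core Six Sigma measure."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def yield_percent(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Percentage of opportunities that were defect-free."""
    return 100 * (1 - defects / (units * opportunities_per_unit))

# 7 defects across 1,000 units inspected at 5 opportunities each
print(dpmo(7, 1000, 5))            # 1400.0 DPMO
print(yield_percent(7, 1000, 5))   # about 99.86% defect-free
```

Tracking DPMO per cycle is one concrete way to plot the quality graph the process describes.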
Quality Center
Introduction :
Tool name: Old version - Test Director, by Mercury
New version - Quality Center, by HP
Type of the tool: Management tool
Company: Mercury Interactive Incorporation (now HP)
Version: 9.2 (but we are learning on 9.0)
Scripting language: No scripting language is used
1. Requirements Module: In this module, first we understand all the requirements. Once we have understood
what is to be tested, we list out all the main requirements as well as the sub-requirements (child requirements).
This module is used for building the requirement tree. For this, the module provides two options:
a) New Requirement
b) New Child Requirement
With these options:
i) One can attach any kind of attachment to a requirement.
ii) The module automatically shows the author name.
iii) It generates IDs automatically for each and every requirement.
iv) One can give the direct cover status of a requirement, such as:
whether the test is covered or not
whether it is executed or not
if executed, whether it passed or failed
MR : Main Requirement
CR : Child Requirement
MT : Main Test
CT : Child Test
3. Test Lab Module: In this module, first of all we need to identify all the end-to-end scenarios; then for each
end-to-end scenario we create a Test Set.
1. To do this, one has to create the folders and create the corresponding test sets, with the help of all the
available tests, based on the different end-to-end scenarios.
2. Once the test sets are built, one can execute them using either the RUN or RUN ALL options provided in this
module.
RUN: used for running a single selected test in the test set.
RUN ALL: used for running all the tests in the test set.
3. Once execution is completed, one can analyze the results in the functional tool (QTP) itself; if one identifies
defects and needs to post them, this can be done from the functional tool itself, or else in the next module, the
Defects Module.
4. Defects Module: This module acts like a bug tracking tool and provides all the facilities to manage defects,
such as adding defects and changing the status of defects, with a complete bug tracking facility.
Once the defects (if any) have been sent to the developers, they will re-build the application by removing those
defects and send it again for testing.
This is called the 2nd build; testing done from the 2nd build onwards is called Re-Testing or Regression Testing.
Here we repeat testing from the 3rd module (Test Lab Module) to the 4th module (Defects Module) till all the
defects have been fixed.
In this way we have a proof for every related task (the above 4 modules), and then we can say that we have
done good testing.
Summary of the four modules:
Requirements Module: Creating MR; Creating CR.
Test Plan Module: Plan a test by creating the corresponding test plans for the requirements; Creating MT;
Creating CT.
Test Lab Module: Creating a test set for each end-to-end scenario; building the test set; executing the test set.
Defects Module: Bug tracking (adding defects, changing status, etc.).
Due to cost, most companies are not using the Quality Center tool, but maintain their own systems for all of the
above 4 operations.
5. Business Component*: It is a special feature or module provided by Quality Center in order to perform
Business Process Testing. In this type of testing, the subject-matter experts (i.e. business analysts) develop the
business components based on the business flow of the application, and then automation test engineers build
the scripts in them, so that testing can be performed in an even more effective manner.
6. Dashboard*: It is a special component provided by Quality Center which is used for generating reports
across multiple projects at a time. Usually this facility is used by managers.
Site Administrator
Site Administrator is a very important component provided by Quality Center which is used for the following:
1. Creating new users
2. Deleting old users
3. Modifying the information of the users
4. Creating the Domain
5. Creating the projects, i.e. allocating the space for new projects in the server/common resources area
6. Assigning the users for the project and all other administrative activities.
* We can write functions in Quality Center in such a way that the test script opens in QTP, and when we save it,
it is saved in Quality Center.
* Scripts can't be written in Quality Center.
* Scripts can't be executed in Quality Center.
1. MANUAL TESTING.
2. AUTOMATION TESTING.
1). MANUAL TESTING
It is a process in which all the phases of the Software Testing Life Cycle, like Test Planning, Test
Development, Test Execution, Result Analysis, Bug Tracking and Reporting, are accomplished manually with
human effort.
2). AUTOMATION TESTING
It is a process in which all the drawbacks of manual testing are addressed (overcome) properly, providing
speed and accuracy to the existing testing phases.
Note:
Automation Testing is not a replacement for manual testing; it is just a continuation of manual testing in order to
provide speed and accuracy.
AUTOMATION TOOL
An automated tool is an assistant to test engineers, which works based on the instructions and information
given to it.
A test engineer should learn the following to work with any automated tool.
1. How to give the instruction.
2. How to give the information.
3. How to use its recording facility.
4. How to use its play back facility.
5. How to analyze the results.
Introduction
Anatomy of Q.T.P
Add – In – Manager
Add-In Manager is a feature provided by Q.T.P, used for making Q.T.P compatible with the specified
environment.
1. Visual Basic
2. Active x.
3. Web.
Q.T.P is always compatible with the standard Windows environment. Apart from the above add-ins, if any other
add-in is required, one needs to purchase it by paying an extra cost.
1) Test Pane
Test pane is an area provided by Q.T.P, which is used for developing, viewing and modifying the test
script.
1. Expert view
2. Keyword view
Expert view.
Expert view represents the script in VB script format.
Keyword View
It represents the script using a graphical user interface, which is further divided into 4 parts.
1. Item
2. Operation
3. Value
4. Documentation
2) Active Screen
Active Screen is a feature provided by Q.T.P which holds the snapshots related to each and every script
statement and is used for understanding the script easily as well as enhancing it easily.
3) Data Table
Data table is also called a Formula One sheet; it was developed by a third party and integrated with
Q.T.P.
Features: -
5) Tool Options
The options available in the Menu bar, File toolbar and Testing toolbar are known as Tool options.
1. It will generate the corresponding test script statement for every user action.
2. It will also store the required related information in the object repository.
Recording Modes
1) Normal recording
It is used for recording the operations performed in different contexts on the standard GUI objects.
2) Analog Recording
It is a special recording mode provided by Q.T.P, which is used for recording mouse operations on
non-supported environments also.
1. Test plan
2. Generating the basic test
3. Enhancing the test
4. Debugging the test
5. Executing the test
6. Analyzing the results.
1) Test Planning
In this phase the automation test lead will do the following
Types of Checkpoints
1. Standard checkpoint
2. Bit map checkpoint
3. Text checkpoint
4. Text area checkpoint
5. Data base checkpoint
6. XML checkpoint
7. Page checkpoint
8. Table checkpoint
9. Image checkpoint
10. Accessibility checkpoint
1) Standard checkpoint
It is used for checking the property values of an object. It can be inserted in 2 ways:
Through the application
Through the active screen
2) Bitmap checkpoint.
It is used for checking the complete bitmap or a part of a Bitmap.
It can be inserted in 2 ways.
Through the application.
Through the active screen.
3) Text Checkpoint
It is used for checking the text present on an object. It can be inserted through the application as well as
through the active screen.
4) Parameterization
It is a process of replacing the constant values with variables or parameters in order to increase the scope of the
test.
5) Measuring Transactions:
It is a concept provided by Q.T.P, which is used for calculating the time taken by an application to perform a specific
task or the execution time of a block of script statements.
To do the same Q.T.P has provided 2 options.
1. Start Transactions.
2. End Transactions.
Navigation: Insert -> Transaction points.
In order to avoid the above navigation, one can insert the following statements directly into the script:
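A minimal sketch of those two statements (the transaction name "InsertOrder" and the surrounding step are illustrative, not from the original):

```vbscript
' Start measuring time for a hypothetical "InsertOrder" transaction
Services.StartTransaction "InsertOrder"

' ... the block of script statements to be timed, e.g. inserting an order ...
VbWindow("Form1").VbButton("InsertOrder").Click

' Stop measuring; QTP reports the elapsed time for "InsertOrder" in the test results
Services.EndTransaction "InsertOrder"
```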
Using the values of the properties in the Object Repository, QTP will identify the object in the AUT;
hence, we can't change the property values of that object.
Usually we should not change the property value of an object, but when the property value of the original
object is changed by the customer, then we can change its property value in the repository.
We can identify an object with a minimum number of properties, so the execution speed will be fast.
If those minimum properties are not sufficient to find an object in the AUT, then we go for some more
properties to identify that object.
The Highlight button is used for highlighting the corresponding object in the AUT. Sometimes QTP will
give objects non-meaningful names; by using this button we can easily find that object in the
AUT and rename it to a meaningful name which we can easily remember. In other words,
whenever a test engineer does not understand an object, he will use the Highlight button
so that the corresponding object is highlighted in the AUT. Thereby, one can easily
understand which object it is.
Per-Action Repository :- If at all this type of repository is selected, for every action a separate
individual repository is created automatically and managed by QTP.
For a sample test we go for a per-action repository.
A per-action repository can't be re-used.
The space required for storage is less.
The execution speed is fast.
Shared Repository :- If at all this type of repository is selected, then one needs to create the shared
repository manually and associate it to the corresponding test manually.
For a long run we go for a shared repository, even though we need to create it manually.
A shared repository can be re-used.
The space required for storage is more.
The execution speed is slow.
It is easy to maintain.
Object Identification :
Object Identification concept is based on 4-types of properties and an Ordinal Identifier.
Those 4 types of properties are:
1. Mandatory Properties (MP)
2. Assistive Properties (AP)
3. Base Filter Properties (BFP)
4. Optional Filter Properties (OFP)
(The original diagram showing how QTP combines these properties with the Ordinal Identifier (OI) did not
survive extraction.)
First of all, QTP will use all the properties present in the Object Repository except the Ordinal
Identifier and try to identify the object. If it fails, it will check whether the Ordinal Identifier is available
or not. If it is available, QTP will identify the object roughly; otherwise it STOPs.
Ordinal Identifier
* Generally we should not encourage use of the Ordinal Identifier, but when the application is stable
we may use it.
* Only once the application is stable do we go for automation; until then we do only
manual testing.
1. Location :- If at all Location is selected as the OI, then QTP will generate the numbers
0,1,2,3… based on the sequence of the objects located in the application.
2. Index :- If at all Index is selected as the OI, then QTP will generate the numbers
0,1,2,3… based on the sequence of the objects in the program (source) of the application.
Runtime Object :- A runtime object is the original object present in the AUT.
Test Object :- A test object is the reference object for the runtime object, created by QTP and used by QTP
to identify the runtime object during the execution.
Recovery Scenario
*** When you want to recover from any problem: first, face the problem manually, find the solution
manually, and recover from that problem manually.
Then implement the same for automation (in QTP).
------------------------------------------------------
Open a new test , Type Function () , you will get
Function
End Function
Modify the above code as below
Function popup_recovery()
End Function
------ Copy it, open a new Notepad and paste it.
Save it with the .vbs (VBScript) extension as a library file -------
Run the test, which will pass the results, of course with warnings (you may ignore them)
=================================================================
Recovery Scenario with Object State
( When the object is disabled )
------------------------------------------------------
Open a new test , Type Function () , you will get
Function
End Function
Modify the above code as below
Function popup_recovery()
End Function
------ Copy it, open a new Notepad and paste it.
Save it with the .vbs (VBScript) extension as a library file -------
Run the test, which will pass the results, of course with warnings (you may ignore them)
=================================================================
Recovery Scenario with Test Run Error
During execution one step may not execute properly; QTP should ignore it and run from the next step onwards.
For that we just need to call an empty function.
After some time, version 2.0 of the application is released with some changes (the city list now shows only
Chennai and Delhi).
1. Open the version 2.0 application
2. Use the same generated script. But when you run this script,
it executes the first city name (Chennai);
when it comes to the second city name (Hyderabad),
the test stops and FAILs.
Though the item is missed, in order to continue the execution from the next step onwards,
we just call an empty function which is stored in a library file.
3. Keep the cursor where the city name seems to be missed
4. Activate tool menu item Tools
5. Open recovery scenario manager wizard
6. select trigger event as Test Run error
7. choose the error type from the drop-down box as Item in list or menu not found
8. Click NEXT
9. Click NEXT to specify the recovery operation
10. select operation type as Function call
11. Click NEXT
12. choose library file path where an empty function was stored
13. Click NEXT
14. De-select the check box ( of add another recovery scenario )
15. Click NEXT
16. Select post-recovery as Proceed to Next Step
17. Give name and description for this scenario
18. Click NEXT
19. Select add scenario to current test
20. click on Finish
21. Save it with .qrs extension
When you run it, it executes normally until it finds an error; when it finds an error, it stops for a while
(meaning it calls the empty function) and then continues.
Environment Variables :-
The variables that are commonly used across the environment in many tests by different resources are known as
Environment Variables.
1. Built-in variables :- These variables will be available by default in every test and can be used directly
in any test with the help of the following syntax.
Syntax : Environment.Value(“Built-in variable name”)
Example :
var=environment.Value("OS") : to display the Operating System
msgbox var
2. User-Defined Variables :- The variables which are commonly required in a number of tests, apart from
the built-in variables, need to be created by the user; these are known as User-Defined Variables.
User-defined variables are created in an environment file; anybody in that
environment can associate this file and use the variables in it.
3. External User-Defined Variables :- those which are imported from another file.
Example :
Open the Cal application
Put the tool under recording mode
Capture the objects properties of Cal application to Object Repository
Stop recording
Declaring the Environment Variables
Activate the menu item Test
Go to Settings
Select the Environment tab
Select variable type as User-defined
Select the check box ‘load variables and values from an external file’
If you want, you can make use of exported data, or you can create your own data in a file
with the .xml extension in the Environment folder
Browse that file
Click on Apply
Click on OK
Associating the Environment Variables ( by parameterizing )
Develop the script in test pane as below
' Setting the declared environment value (a) into the value1 edit box
VbWindow("Form1").VbEdit("val1").Set environment.Value("a")
' Setting the declared environment value (b) into the value2 edit box
VbWindow("Form1").VbEdit("val2").Set environment.Value("b")
' clicking on ADD button
VbWindow("Form1").VbButton("ADD").Click
Framework : A framework is a generic work designed by an expert and followed by many people to perform a
particular task in an effective, efficient and optimized way.
1. Linear Framework: This is a general and old framework that can be used by many people.
Steps to follow ….
a). Generate the Basic Test
b). Enhance the test
c). Debug the test
d). Execute the test
e). Analyze the result
AUT
Example :
Tasks : Login
        Insert order
        Open existing order
        Logout
Note : Here all the tasks are put together in one test pane and the job is done.
Put the tool under recording mode
Open the flight application
Login with username and password
Click on OK
Insert an order by keying all the required info therein
Click on insert order button
That order will be inserted successfully. After inserting the order
Open existing order by clicking on open folder icon
An Open Order window will appear; check the order number check box
Input the existing order number ( say 9)
Click on OK
The order will open; if necessary you may update / delete the opened order
Logout will be done by going to the menu bar of the application and selecting File, then
Exit.
The application will close
Stop recording
Run the test
Analyze the result
2. Modular Framework : This is also a general framework that can be used by some people.
Steps to follow ….
a). Prepare the individual components for the different tasks
b). Make the required components re-usable
c). Prepare the desired driver based on the end-to-end scenario
d). Execute the driver
e). Analyze the results
Example :
Tasks : Login
Insert order
Open existing order
Logout
Note 1 : After preparing the complete action, split it into different individual actions and call them in a driver.
Note 2 : In other words, in this framework we are calling the actions ( Ex. Call login )
Components : Login, Insert order, Open order, Logout ( each as a re-usable action )
Driver : Call login
         Call insert
         Call open
         Call logout
Put the tool under recording mode
Open flight application
Login with username and password
Click on OK
Insert Order by keying all the required info therein
Click on insert order button
That order will be inserted successfully. After inserting the order
Open order by clicking on open folder icon
A open order window will appear, check the order number check box
Input the existing order number ( say 9)
Click on OK
The order will open, if necessary you may update / delete the opened order
Logout will be done by closing the window/application
Stop recording
Save the script (say fl_application; no extension is required)
Split the script into 4 tasks ( login, insert_order, open_order and logout )
Keep the cursor at the beginning of 1st line of 2nd part ( i.e starting of insert order line)
Go to menu bar, click on Step , select Split Action
The split action window will appear
choose action type as independent of each other, give the 1st action name ( say
login) and leave the 2nd action name as it is ( because, again we are going to split the
2nd part )
Click OK
Save the changes. Next,
Keep the cursor at the beginning of 1st line of existing 2nd part ( i.e starting of insert
order line)
Go to menu bar, click on Step , select Split Action
The split action window will appear
choose action type as independent of each other, give the 1st action name ( say
insert_order) and leave the 2nd action name as it is ( because, again we are going to
split the 2nd part )
Click OK
Save the changes. Next,
Keep the cursor at the beginning of 1st line of existing 2nd part ( i.e starting of open
order line)
Go to menu bar, click on Step , select Split Action
The split action window will appear
choose action type as independent of each other, give the 1st action name ( say
open_order) and give the 2nd action name as logout ( because, its end of splits )
Click OK
Save the changes.
So, we have split all 4 tasks/actions successfully.
Now make them as re-usable components
Open the just created action i.e login from drop-down box on the tool
Go to menu bar , click on Step , select Action Properties
Action properties window will appear
Select General tab
Check the Reusable action check box
Click on OK
Do the same for other actions too i.e insert_order, open_order and logout. Next
Open new Test
Re-name the action as Driver : go to the menu bar, click on Step, select Action
Properties. The Action Properties window will appear; select the General tab and change the
action name to Driver.
Click on OK
Here we can call any or all those re-usable actions
Go to menu bar, click on Insert, select Call to Existing Action
Select action window will appear ,
Batch Testing :
Batch testing is a process of executing a group of tests at a time.
To do the same, QTP has provided a separate tool named “Test Batch Runner”, and we have to
configure the tool settings as below…..
QTP -> Tools -> Options -> Run -> Check Allow other mercury products to run tests and components.
Regression testing : Testing the functionality of a function and all its related functionalities at a time is called
regression testing. Or, it is the process of executing a number of tests at a time.
Example : ??
3. Keyword Driven frame work: This is also a general frame work that can be used by most of people.
Steps to follow ….
1. First of all create the folder structure as follows ( short form : TRL RET L )
ProjectName_Automation
    Script
    Test Data
    Repository
    Library
    Recovery
    Environment
    Test
    Keyword
    Log
2. Create the required Test Data files and save them in the corresponding folder ( Test Data folder).
3. Create the required Shared Repository files and save them in corresponding folder ( Repository folder)
4. Create the required Library files and save them in corresponding folder ( Library folder)
5. Create the required Recovery files and save them in corresponding folder ( Recovery folder)
6. Create the required Environment files and save them in corresponding folder ( Environment folder)
7. Open the Main Test and associate all the Required resources like Test Data files, Repository files,
library files, recovery files and environment files.
8. Develop the script in such way that it executes based on the keyword given in the data table.
9. Save the Test in the corresponding folder ( Test folder)
10. Whenever required, open it, execute it and analyze the results.
Example :
Tasks : Login
Insert order
Open existing order
Logout
Note 1 : Here, after preparing the complete action, split it into different individual functions and call them in a test.
Note 2 : In other words, in this framework we are calling the functions ( Ex. Call login() )
AUT tasks : Login, Insert order, Open order, Logout
Test (driver) :
    Select Case key
        Case “l1” : login()
        Case “l2” : insert()
        Case “l3” : open()
        Case “l4” : logout()
    End Select
Functions stored in the Library folder :
So, for the above 4 tasks, we need to create 4 functions and store them in a library file
under the Library folder. For that …
Open a new notepad and write down the below script in it
Function login()
* Here, paste the login code from the test script
End function
Function insert_order()
* Here, paste the insert_order code from the test script
End function
Function open_order()
* Here, paste the open_order code from the test script
End function
Function logout()
* Here, paste the logout code from the test script
End function
Recording properties into object repository
Put the tool under recording mode
Activate flight application
Login with username and password
Click on OK
Insert Order by keying all the required info therein
Click on insert order button
That order will be inserted successfully. After inserting the order
Open order by clicking on open folder icon
An Open Order window will appear; check the order number check box
Input the existing order number ( say 9)
Click on OK
The order will open, if necessary you may update / delete the opened order
Logout will be done by closing the window/application
Stop recording
Creating Shared Repository
Open Object Repository, where in which you can see all the properties of objects
Click on Export
Browse to Repository folder and Save it with .tsr ( Test Script Repository )
extension.
Click on OK
Copying and pasting the corresponding scripts into the library file from the test script
Copy only the login script from the test script and paste it into the opened Notepad under
Function login() ……*…. End Function.
Do the same for other functions too (i.e insert_order, open_order, and logout )
Now Save that file with .vbs extension under Library folder
With this your currently open test area will be empty.
Now , open a new Test
Associate the required files to new test (into Repository ) as below
Menu -> Test -> settings -> Resource Tab -> select object repository type as Shared
Browse the saved repository file from Repository folder
Click on Apply and OK
In the same manner associate the required library files to new test
Associate the required files to new test (into Repository ) as below
Menu -> Test -> settings -> Resource Tab -> click on + (add) button
Browse the saved library file from Library folder
Click on Apply and OK
Creating data in data table
Activate data table of the test
Rename the 1st column as “keys” ( by double clicking on it and typing keys).
Enter data like l1, l2, and l3 in each row of the table ( specifying the key values so that the
script can pick up the relevant keyword from the data table ).
Now Develop / write the script in test area in such way that it uses some or all
resources and execute based on key values given in the data table.
For example : to login- insert order – open order and logout
var = DataTable("keys", 1)   ' pick up the value from the "keys" column of sheet 1 and assign it to var
Select Case var
    Case "l1"
        login()
    Case "l2"
        insert_order()
    Case "l3"
        open_order()
    Case "l4"
        logout()
End Select
Note : SystemUtil.Run “path” can also be used to invoke the application
4. Hybrid Framework : A hybrid framework is a mixture of two or more frameworks.
Methods
Method : A method is something which is used by QTP to perform an action on an object.
Example : Click, set …. Etc
1. CaptureBitmap() : This method is used for capturing a snapshot of an object during execution and storing it
in a desired location. When you want to send a defect to the developer, this captured bitmap
can also be sent in order to understand the error/defect easily.
Syntax : Object Hierarchy.CaptureBitmap “path of the location with a file name and .bmp extension”
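A minimal sketch of the syntax above (the window/button names and file path are illustrative):

```vbscript
' Capture a snapshot of the OK button and save it as a .bmp file
VbWindow("Form1").VbButton("OK").CaptureBitmap "C:\Screenshots\ok_button.bmp"
```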
2. Exist() : The main purpose of this method is to check whether the object/window exists or not.
If the object exists, it will return TRUE and continue the execution. Otherwise, it will wait until the object
exists, or up to the maximum time. If at any point during that time the object comes into existence, it will
return TRUE and continue the execution; otherwise, after the maximum time has elapsed, it will return
FALSE and continue the execution.
* The Exist method will make QTP wait up to the default time plus any extra time specified.
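A sketch of the behaviour described above (the window name and report messages are illustrative):

```vbscript
' Wait up to 10 extra seconds for the window; Exist returns TRUE or FALSE
If VbWindow("Flight Reservation").Exist(10) Then
    Reporter.ReportEvent 0, "myreporter", "Window exists"      ' 0 = pass
Else
    Reporter.ReportEvent 1, "myreporter", "Window not found"   ' 1 = fail
End If
```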
3. WaitProperty() : This method is used for making the tool wait based on an object property's value, or
up to the maximum time.
Syntax : Object Hierarchy.WaitProperty “property name”, property value, extra time in milliseconds
Example :
Open the flight application and put tool in recoding mode
Open the order by clicking on Open Order; it will display the Open Order window
Enter an existing order number and click on OK. That order will be opened.
Stop recording.
Now, if you want the tool to wait even after clicking on the OK button,
take the property name ( as text ) and value ( as OK ) from the object repository and add extra
time in milliseconds.
For example : OH.WaitProperty "text", "OK", 10000
Put the above statement after the OK-button-click statement in the script
Run the test
Analyze the results
4. Wait() : This method is used for making the tool wait until the maximum time has elapsed
Example :
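A minimal sketch: wait takes the number of seconds to pause unconditionally.

```vbscript
' Make the tool pause for 10 seconds before executing the next statement
wait(10)
```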
3. Set Method : The Set method is mainly used on 3 objects:
a). Edit box
b). Check box
c). Radio button
a) Edit Box : Set is used for entering a value into an edit box
Syntax : Object Hierarchy.Set “value”
Example : VbWindow(“Emp”).VbEdit(“Ename”).Set “ak”
b) Check Box : Set is used for selecting/de-selecting a check box
Syntax : Object Hierarchy.Set “ON”/“OFF”
c) Radio Button : Set is used for selecting a radio button in a group
Syntax : Object Hierarchy.Set
Example : VbWindow(“Emp”).VbRadioButton(“Location”).Set
Example :
5. SetSecure Method : This is used for setting encrypted data into an edit box.
* The encrypted string can be generated with the help of the tool “Password Encoder”.
Navigation for Password Encoder : Start -> Programs -> QTP -> Tools -> Password Encoder
Example : *********
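A sketch of the statement (the dialog/edit names are illustrative, and the string argument stands in for an actual encrypted value produced by Password Encoder):

```vbscript
' Set an encrypted password into the password edit box;
' the literal below is a placeholder for the real encoder output
Dialog("Login").WinEdit("Password:").SetSecure "encrypted-string-from-password-encoder"
```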
Example :
8. Type Method : This is used for performing any kind of keyboard-related operations.
Syntax : Object Hierarchy.Type keyvalue
Example : ggggg
Important Methods
RO ( Runtime Object ) : The runtime object is the original object present in the application (AUT).
TO ( Test Object ) : The test object is the reference of the original object, stored in the object repository and
used for identifying the original object in the AUT during the execution.
1. GetROProperty : This method is used for getting a runtime object property's value, e.g.
Text
Width etc…..
Note : By using the GetROProperty method we can avoid standard checkpoints, because checkpoints
create an “internal file” which is an extra burden on the execution and takes memory to store those files.
OK
Else
Reporter.ReportEvent 3, “myreporter”, “OK button does not exist”   ' 3 = warning
End If
* This is the only script that was generated by the tool; the remaining part is the enhancement to that script,
done manually by the test engineer.
2. GetTOProperty : This method is used for getting a test object property's value, e.g.
Text
Width etc…..
Example :
var = VbWindow(“form1”).VbButton(“ADD”).GetTOproperty(“Text”)
Note : The properties that are learnt during the learning (recording) time are called test object properties.
Test object properties = properties present in the object repository
+
properties present in the secret place (the smart brain), if any.
A test object is a reference to a runtime object, stored in the object repository and in
the smart brain (if Smart Identification is enabled).
Whatever properties are learnt during the learning time are called test object properties.
When, with the help of GetROProperty itself, we can get all the property values of the objects in the AUT,
why both GetROProperty and GetTOProperty?
GetTOProperty returns the property values of the objects as they were captured during the
learning time (i.e., the values stored in the object repository),
whereas GetROProperty returns the actual property values of the objects
in the AUT during the execution time.
For example : in an application the OK button is disabled by default (this is the TO property value), but after
entering some data the OK button becomes enabled (this is the RO property value).
3. SetTOProperty : This method is used for setting a test object property's value temporarily
during the execution, e.g. Text, Width etc…..
Example : clicking the same button for Start-Stop-Start. Assume that initially the button is named Start
and should become Stop when it is clicked; and if it is clicked again, it should become Start.
First rename the button as B1 in the object repository so as not to affect other buttons.
VbWindow(“form1”).VbButton(“B1”).Click
VbWindow(“form1”).VbButton(“B1”).SetTOProperty “text”, “Stop”
VbWindow(“form1”).VbButton(“B1”).Click
VbWindow(“form1”).VbButton(“B1”).SetTOProperty “text”, “Start”
Script :
All data table methods are used on the run-time data table ONLY, i.e., we can't do anything with the
design-time data table.
For every action a corresponding sheet will be created in both data tables, but there is only one Global sheet
in each data table.
In order to maintain the test data clearly, simply and in a user-friendly way, we maintain one sheet for each
corresponding action.
A sheet can be inserted along with an action only.
If you want to import data into the run-time data table, you need to add the sheet manually in the same.
Local sheet : In a local sheet the focus will remain on the 1st row only.
Global sheet : In the global sheet, for every iteration the focus moves on to the next row, and so on.
Methods :-
1. AddSheet() :- It is used for adding an extra sheet to the Run Time data table
Syntax : Datatable.AddSheet “Sheet Name”
2. DeleteSheet() :- It is used for deleting a specified sheet from the Run Time data table
Syntax : Datatable.Deletesheet “sheet name”
3. Import() :- It is used for importing the data present in all the sheets of an Excel file to the run-time data table
Syntax : Datatable.import “path of the .xls file”
4. ImportSheet() :- It is used for importing the data present in a specified sheet of an Excel file to a
specified sheet in the run-time data table.
Syntax : datatable.ImportSheet “path of the xl file”, Source sheet ID, Destination Sheet ID
5. Export() :- It is used for exporting the data present in the run-time data table to a specified location.
Syntax : datatable.Export “path of the location with a file name .xls extension”
6. ExportSheet() :- It is used for exporting the data present in a specified sheet of the run-time data table
to a specified location
Syntax : datatable.ExportSheet “Path of the location with a file name .xls extension” , Sheet ID to be exported
7. SetCurrentRow() :- It is used for making the QTP focus on a specified row
Syntax : datatable.SetCurrentRow(Row number)
8. SetNextRow() :- It is used for making the QTP focus on the Next row of Currently focused row
Syntax : datatable.SetNextRow
9. SetPrevRow() :- It is used for making QTP focus on the previous row of the currently focused row.
When this method executes, the cursor will go to the last row of the table and move
upward row by row till it reaches the 1st row of the table.
Syntax : datatable.SetPrevRow
10. GetSheet() :- It is used for making QTP focus on a specified sheet.
Syntax : datatable.GetSheet( Sheet Id)
11. Value() :- It is used for getting a value from a specified sheet and specified column at the currently focused row.
Syntax: datatable.value(“column name”,sheet Id)
12. GetRowCount() :- It is used for getting the row count of the global sheet by default. If at all one wants
the row count of a specified sheet, then first he needs to focus on that sheet and
then get the row count.
Syntax 1 : variable=datatable.GetRowCount
Syntax 2 : variable= datatable.GetSheet.GetRowCount
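The row-focusing and value methods above can be combined into a simple data-driven loop; a sketch, assuming a hypothetical sheet named "orders" with a column "OrderNo":

```vbscript
' Iterate over every row of the "orders" sheet in the run-time data table
Dim rc, i
rc = DataTable.GetSheet("orders").GetRowCount
For i = 1 To rc
    DataTable.GetSheet("orders").SetCurrentRow(i)   ' focus the i-th row
    msgbox DataTable.Value("OrderNo", "orders")     ' read the value at the focused row
Next
```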
Example : add a data sheet, import data into the data sheet, ADD the two values, pass the result to the data
sheet, and export the data sheet to the Log folder.
Run
Analyze the result
Test data may be in a database, so we should connect our test to the database, retrieve the data
and use the same in the test.
1. How to connect 2. How to establish the connection 3. How to retrieve and use the data
Record Set : It is a temporary location where we can store the data retrieved from the database at a time.
From that temporary location we can use the data one by one, or as per the requirement of our
testing.
Connection : Connects the application and database.
* We need to create Object Instances for both Record Set and Connection.
Example : the test data is stored as below; write the database connection script.
Testdata.mdb
v1 v2 res
10 20 30
30 30 60
30 20 50
90 90 180
2 8 10
For MSACCESS :
=============================
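A sketch of the MS Access connection using ADODB object instances (the file path and table name "addtable" are illustrative assumptions):

```vbscript
' Create object instances for the connection and the record set
Dim con, rs
Set con = CreateObject("ADODB.Connection")
Set rs  = CreateObject("ADODB.Recordset")

' Open the Access database and retrieve the test data into the record set
con.Open "provider=Microsoft.Jet.OLEDB.4.0;data source=C:\Testdata\Testdata.mdb"
rs.Open "select * from addtable", con

' Walk through the retrieved rows and use the data in the test
While Not rs.EOF
    msgbox rs.Fields("v1") & " + " & rs.Fields("v2") & " = " & rs.Fields("res")
    rs.MoveNext
Wend

rs.Close
con.Close
```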
* For Oracle and SQL Server we write both the provider name and the connection details in one line; the rest is the same.
For Oracle
con.open “provider=oraoledb.1;server=localhost;uid=userID;pwd=password;database=database name”
For SQL Server
con.open “provider=sqloledb.1;server=localhost;uid=userID;pwd=password;database=database name”
Advanced Topics
1. Regular Expressions :-
During execution, QTP may sometimes not identify an object due to regular
changes in its property values. To overcome this situation, one needs to replace the
corresponding constant value in the object repository with a suitable regular
expression.
Example : To send number of orders thru fax
Example :
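A sketch of the idea: if the fax window's text property changes with every order ("Fax Order No. 12", "Fax Order No. 13", …), replacing the constant number with \d+ gives one pattern that matches them all. The VBScript RegExp check below just demonstrates the pattern (the window text is illustrative):

```vbscript
' One pattern matches every order number in the changing window text
Dim re
Set re = New RegExp
re.Pattern = "Fax Order No\. \d+"
msgbox re.Test("Fax Order No. 12")   ' True
msgbox re.Test("Fax Order No. 347")  ' True
```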
Function file
Function add(a,b)      ' the function name is add
Total = a + b          ' adding the two values into Total
add = Total            ' returning the result via the function name itself
End function
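A usage sketch, assuming the library file containing add() has been associated with the test:

```vbscript
' Call the library function and display its return value
Dim res
res = add(10, 20)
msgbox res   ' displays 30
```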
Example :
Open the cal application
Add the cal application’s object properties to Object Repository
Activate menu item Step
Select the option Action properties
Select Parameters tab
Declare the input parameters values by clicking (+) add button
Give the name (a) , Type ( number) and some default number
Similarly Give the name (b) , Type ( number) and some default number
Declare the output parameter by clicking the (+) add button
Give the name (c) and Type ( number )
Click on OK
Write the script as below
' setting the input parameter to val1
VbWindow("Form1").VbEdit("val1").Set parameter("a")
' setting the input parameter to val2
VbWindow("Form1").VbEdit("val2").Set parameter("b")
VbWindow("Form1").VbButton("ADD").Click
' getting the value of Result and passing to var1
var1=vbwindow("Form1").VbEdit("res").GetROProperty("text")
' passing that var1 to Output parameter c
parameter("c")=var1
Make this test as re-usable
Activate menu item Step
Select the option Action properties
Select the check box Reusable action
Click on OK
Save it, say as myTest (no extension is required)
Open the new test
Call that saved re-usable action
** Once we create a virtual object, it is available to all tests unless we delete it.
Example :
Open the virtual object application ( a window with colorbutton shape)
Activate menu item , select Virtual Objects option
Select the option New virtual Object
Virtual Object wizard will appear, click on Next
Select the standard class that nearly match to your object ( say button )
Click Next
Mark the object on the AUT screen, Height and width will be captured
Click Next
Select the option for parent of the virtual object
Select Next
Specify the name (button) and collection name ( Button_collections)
Click OK
Put the tool in recording mode
Click on the area you selected as the button, and on any other area of the screen
Stop recording
Analyze the results
Virtual Object Manager : A feature provided by QTP for creating and maintaining virtual
objects.
Whereas in 9.0 we can avoid the above two gaps and create one or more shared repositories plus one local
repository for a test. No connection between the test and the shared repositories is needed.
* One test -> more shared repositories + one local repository.
* That is the main / major change in 9.0 from 8.2.
[Diagram: Section I shows Actions 1, 2 and 3, each associated with shared object repositories (OR1-OR4); Section II shows a Test combining shared repositories plus one local repository.]
As shown in Section I, we can create any number of object repositories independently for each action.
Say Action 1 has OR1 (say the Calculator application) and OR3 (say the Login application) as shared repositories.
Similarly, Action 2 has OR2 (say the Flight application) and OR4 (say the mindQcal application) as shared
repositories, and Action 3 has OR1 and OR2 as shared repositories.
In Section II, we can make use of those shared repositories as required, plus one local repository
created for this test.
Final word : in QTP 9.0 one can associate one or more shared repositories to an Action, apart from its local
repository.
Information Pane :- This pane shows syntax-related information during the syntax check.
Missing Resources Pane : While opening a test, if any associated resources such as repository, recovery
or library files are missing, that information is clearly shown in this
Missing Resources pane.
QTP 9.1
Four changes were made in version 9.1 compared to 9.0:
A new option named Navigate & Learn is introduced in the Object Repository Manager, which is
used for learning object information across multiple pages or windows continuously while navigating.
A new option named Object Repository Comparison Tool is provided in the Object Repository Manager,
which is used to compare two object repositories.
QTP 9.1 is compatible with the Windows Vista operating system and the .NET Framework 2.0 environment.
From QTP 9.1 onwards, the company has announced that it may provide a single license covering all the
add-ins as well.
QTP 9.2
Four changes were made in version 9.2 compared to 9.1:
A new feature named Screen Recorder is introduced, which records a complete movie of the
execution that can be viewed / played back in the result window.
One can handle object repositories dynamically in QTP 9.2 with the help of the new utility object
RepositoriesCollection.
The Object Spy feature is enhanced with a facility that shows object information continuously when
pointing the hand icon at an object.
When to use?
This method can be used when an operation has to be performed on all objects that meet specific criteria; for
example, checking all the check boxes available in a table.
Also, there may be a case where a base filter property keeps changing, and hence the test fails. In that case, we can
use Programmatic Description to access the object.
How to Use?
There are two ways to use Programmatic Description:
1. Entering the Programmatic Description directly into test statements
2. Creating an object description
The first method is relatively simpler, but creating an object description and accessing the object through it is more powerful.
In the first method, the object to be referenced is described in detail as the test object description, in the test step
itself. Specify the property:=value pairs of the object while addressing it.
Syntax:
<TestObject>(“<PropertyName1>:= <PropertyValue1>”, …, “<PropertyNameN>:= <PropertyValueN>”)
TestObject: Test Object class.
PropertyName1: Property name which has to be matched.
PropertyValue1: Property value which has to be matched.
Example:
Dialog("text:=Remote Desktop Connection","regexpwndtitle:=Remote Desktop
Connection").WinEdit("nativeclass:=Edit","attached text:=&Computer:").Set “testMachine”
The minimum number of properties which are needed to uniquely identify the object has to be given as the
“PropertyName := PropertyValue” pair.
Also, property value can be a variable which can be the run-time property retrieved with GetROProperty.
For example, the above line can be split into two to retrieve the text value dynamically and that variable can be
substituted in the place of property value.
1. RemoteMachineName = Dialog("regexpwndtitle:=Remote Desktop Connection").GetROProperty("text")
2. Dialog("text:=" & RemoteMachineName,"regexpwndtitle:=Remote Desktop
Connection").WinEdit("nativeclass:=Edit","attached text:=&Computer:").Set “testMachine”
RemoteMachineName will get the text property at run-time. This property is used in the next line as the property
value.
For the reason of code readability, the long line of description can be replaced by a variable. In the above example,
Dialog("text:=Remote Desktop Connection","regexpwndtitle:=Remote Desktop Connection") can be replaced by a
variable, say dlgRemoteDskTop and the line of script can be rewritten as,
1. Set dlgRemoteDskTop = Dialog("text:=Remote Desktop Connection","regexpwndtitle:=Remote Desktop Connection")
2. dlgRemoteDskTop.WinEdit("nativeclass:=Edit","attached text:=&Computer:").Set “testMachine”
A description object has to be first created. Then the object should be described with it properties and property
values in detail. Once a description object has been created, properties can be added, edited and removed. It is
also possible to retrieve the property and value contained in it.
Syntax:
Set <Description_Name> = Description.Create()
Example: The statement
Dialog("text:=Remote Desktop Connection","regexpwndtitle:=Remote Desktop
Connection").WinEdit("nativeclass:=Edit","attached text:=&Computer:").Set “testMachine”
can be re-written using Description objects as:
1. Set objDesc = Description.Create()
2. objDesc(“text”).Value = “Remote Desktop Connection"
3. objDesc(“regexpwndtitle”).Value = “Remote Desktop Connection”
4. Dialog(objDesc).WinEdit("nativeclass:=Edit","attached text:=&Computer:").Set “testMachine”
Note: Here again, variable can be substituted in the place of property value.
Example:
1. Set objDesc = Description.Create()
2. RemoteMachineName = Dialog("regexpwndtitle:=Remote Desktop Connection").GetROProperty("text")
3. objDesc(“text”).Value = RemoteMachineName
4. objDesc(“regexpwndtitle”).Value = “Remote Desktop Connection”
5. Dialog(objDesc).WinEdit("nativeclass:=Edit","attached text:=&Computer:").Set “testMachine”
Retrieving Child Objects
Child objects contained inside a parent object can be retrieved with the ChildObjects method. First, a description of the
child object type to be retrieved should be created. When this description is passed to ChildObjects, all
the objects that match the specified criteria are retrieved.
Syntax:
Set <return_Value> = TestObject.ChildObjects(<object_Description>)
Note:
The parameter to ChildObjects() should be a description object that has already been created. It does not accept direct
inline descriptive programming (<Property_Name> := <Property_Value>) in the same line of script.
Example:
‘Create a Description object for the parent
Set objOE6Wnd = Description.Create
objOE6Wnd("regexpwndtitle").Value = "Outlook Express"
objOE6Wnd("regexpwndclass").Value = "Outlook Express Browser Class"
‘Create a Description object for the child objects and retrieve them
‘(the child class "Button" used here is only illustrative)
Set objChildDesc = Description.Create
objChildDesc("nativeclass").Value = "Button"
Set colChildren = Window(objOE6Wnd).ChildObjects(objChildDesc)
msgbox colChildren.Count ' number of matching child objects
6) Display the employee name and annual salary for all employees.
SQL>select ename, 12*(sal+nvl(comm,0)) as "annual Sal" from emp
7) Display the names of all the employees who are working in depart number 10.
SQL>select ename from emp where deptno=10;
8) Display the names of all the employees who are working as clerks and a salary > 3000.
SQL>select ename from emp where job='CLERK' and sal>3000;
9) Display the employee number and name who are earning comm.
SQL>select empno,ename from emp where comm is not null;
10) Display the employee number and name who do not earn any comm.
SQL>select empno,ename from emp where comm is null;
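Queries 9 and 10 work only because they use IS NOT NULL / IS NULL; an ordinary = NULL comparison never matches any row in SQL. A small runnable sketch with Python's sqlite3 (toy rows; the names are invented) demonstrates this:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table emp (empno integer, ename text, comm integer)")
con.executemany("insert into emp values (?, ?, ?)",
                [(1, 'ALLEN', 300), (2, 'SMITH', None), (3, 'WARD', 500)])

# IS NOT NULL correctly finds the employees who earn commission.
with_comm = [r[0] for r in con.execute(
    "select ename from emp where comm is not null order by empno")]

# "= null" compares against NULL, which is never true, so no rows come back.
equals_null = [r[0] for r in con.execute(
    "select ename from emp where comm = null")]

print(with_comm)     # the employees with a commission
print(equals_null)   # always empty: NULL tests need IS NULL / IS NOT NULL
```

The same rule applies in Oracle: always test NULL columns with IS NULL or IS NOT NULL.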
11) Display the names of employees who are working as clerks,salesman or analyst and drawing a salary
more than 3000.
SQL>select ename from emp where job in ('CLERK','SALESMAN','ANALYST') and sal>3000;
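A caution on query 11: AND binds more tightly than OR, so a condition chain like job='CLERK' OR job='SALESMAN' OR job='ANALYST' AND sal>3000 applies the salary test only to the ANALYST branch. This sqlite3 sketch with toy rows (names and salaries invented) shows the difference between the loose and the correctly grouped form:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table emp (ename text, job text, sal integer)")
con.executemany("insert into emp values (?, ?, ?)", [
    ('SMITH', 'CLERK',     800),   # clerk, low salary
    ('FORD',  'ANALYST',  3200),   # analyst, high salary
    ('ALLEN', 'SALESMAN', 1600),   # salesman, low salary
])

# Without parentheses AND binds first:
# CLERK OR SALESMAN OR (ANALYST AND sal>3000)
loose = [r[0] for r in con.execute(
    "select ename from emp where job='CLERK' or job='SALESMAN' "
    "or job='ANALYST' and sal>3000")]

# With IN (or parentheses) the salary condition applies to every job.
strict = [r[0] for r in con.execute(
    "select ename from emp where job in ('CLERK','SALESMAN','ANALYST') "
    "and sal>3000")]

print(loose)    # low-paid clerks and salesmen slip through
print(strict)   # only the high-paid analyst remains
```

The IN form (or explicit parentheses) is the safe way to combine the two conditions.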
12) Display the names of the employees who are working in the company for the past 5 years;
SQL>select ename from emp where to_char(sysdate,'YYYY')-to_char(hiredate,'YYYY')>=5;
13) Display the list of employees who have joined the company before
30-JUN-90 or after 31-DEC-90.
SQL>select ename from emp where hiredate < '30-JUN-90' or hiredate > '31-DEC-90';
14) Display current Date.
SQL>select sysdate from dual;
15) Display the list of all users in your database(use catalog table).
SQL>select username from all_users;
18) Display the names of employees working in depart number 10 or 20 or 40 or employees working as
CLERKS,SALESMAN or ANALYST.
SQL>select ename from emp where deptno in(10,20,40) or job in ('CLERK','SALESMAN','ANALYST');
19) Display the names of employees whose name starts with the alphabet S.
SQL>select ename from emp where ename like 'S%';
20) Display the employee names for employees whose name ends with the alphabet S.
SQL>select ename from emp where ename like '%S';
21) Display the names of employees whose names have second alphabet A in their names.
SQL>select ename from emp where ename like '_A%';
22) Select the names of the employees whose name is exactly five characters in length.
SQL>select ename from emp where length(ename)=5;
23) Display the names of the employee who are not working as MANAGERS.
SQL>select ename from emp where job not in('MANAGER');
24) Display the names of the employee who are not working as SALESMAN OR CLERK OR ANALYST.
SQL>select ename from emp where job not in('SALESMAN','CLERK','ANALYST');
25) Display all rows from the emp table. The system should wait after every screen full of information.
SQL>set pause on
SQL>select * from emp;
32) Display the maximum salary being paid to depart number 20.
SQL>select max(sal) from emp where deptno=20;
35) Display the total salary drawn by ANALYST working in depart number 40.
SQL>select sum(sal) from emp where job='ANALYST' and deptno=40;
36) Display the names of the employees in order of salary, i.e. the name of the employee earning the lowest
salary should appear first.
SQL>select ename from emp order by sal;
39) Display empno, ename, deptno and sal. Sort the output first on name, within name by deptno, and
within deptno by sal.
SQL>select empno,ename,deptno,sal from emp order by ename,deptno,sal;
40) Display the name of the employee along with their annual salary (sal*12). The name of the employee
earning the highest annual salary should appear first.
SQL>select ename,sal*12 from emp order by sal desc;
41) Display name, salary, hra, pf, da and total salary for each employee, in the order of total
salary; hra is 15% of salary, da is 10% of salary, pf is 5% of salary, and total salary is (salary+hra+da)-pf.
SQL>select ename,sal,sal/100*15 as hra,sal/100*5 as pf,sal/100*10 as
da, sal+sal/100*15+sal/100*10-sal/100*5 as total from emp;
42) Display depart numbers and total number of employees working in each department.
SQL>select deptno,count(deptno)from emp group by deptno;
43) Display the various jobs and total number of employees within each job group.
SQL>select job,count(job)from emp group by job;
44) Display the depart numbers and total salary for each department.
SQL>select deptno,sum(sal) from emp group by deptno;
45) Display the depart numbers and max salary for each department.
SQL>select deptno,max(sal) from emp group by deptno;
46) Display the various jobs and total salary for each job
SQL>select job,sum(sal) from emp group by job;
47) Display the various jobs and the minimum salary for each job.
SQL>select job,min(sal) from emp group by job;
48) Display the depart numbers with more than three employees in each dept.
SQL>select deptno,count(deptno) from emp group by deptno having count(*)>3;
49) Display the various jobs along with total salary for each of the jobs
where total salary is greater than 40000.
SQL>select job,sum(sal) from emp group by job having sum(sal)>40000;
50) Display the various jobs along with total number of employees in each
job.The output should contain only those jobs with more than three employees.
SQL>select job,count(empno) from emp group by job having count(job)>3
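Queries 48-50 all follow the same pattern: GROUP BY forms the groups and HAVING filters whole groups after aggregation, whereas WHERE filters individual rows before grouping. A runnable sqlite3 sketch with toy rows (jobs and salaries invented) shows the HAVING step:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table emp (empno integer, job text, sal integer)")
con.executemany("insert into emp values (?, ?, ?)", [
    (1, 'CLERK', 1000), (2, 'CLERK', 1200), (3, 'CLERK',  900),
    (4, 'CLERK', 1100), (5, 'MANAGER', 5000), (6, 'ANALYST', 3000),
    (7, 'ANALYST', 3200),
])

# GROUP BY builds one group per job; HAVING keeps only groups
# whose row count exceeds 3 (here, only the four clerks).
busy_jobs = con.execute(
    "select job, count(empno) from emp group by job having count(*)>3"
).fetchall()

print(busy_jobs)
```

Aggregate conditions like count(*) or sum(sal) can only appear in HAVING, never in WHERE, which is why these three queries all use the HAVING clause.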
51) Display the name of the employee who earns the highest salary.
SQL>select ename from emp where sal=(select max(sal) from emp);
52) Display the employee number and name for employee working as clerk and
earning highest salary among clerks.
SQL>select empno,ename from emp where job='CLERK'
and sal=(select max(sal) from emp where job='CLERK');
53) Display the names of salesman who earns a salary more than the highest
salary of any clerk.
SQL>select ename,sal from emp where job='SALESMAN' and sal>(select
max(sal) from emp where job='CLERK');
54) Display the names of clerks who earn a salary more than the lowest
salary of any salesman.
SQL>select ename from emp where job='CLERK' and sal>(select min(sal)
from emp where job='SALESMAN');
Display the names of employees who earn a salary greater than that of
JONES and greater than that of SCOTT.
SQL>select ename,sal from emp where sal> (select sal from emp where ename='JONES')and sal> (select
sal from emp where ename='SCOTT');
55) Display the names of the employees who earn highest salary in their respective departments.
SQL>select ename,sal,deptno from emp where sal in(select max(sal) from emp group by deptno);
56) Display the names of the employees who earn highest salaries in their respective job groups.
SQL>select ename,sal,job from emp where sal in(select max(sal) from emp group by job)
57) Display the employee names who are working in accounting department.
SQL>select ename from emp where deptno=(select deptno from dept where
dname='ACCOUNTING')
59) Display the Job groups having total salary greater than the maximum
salary for managers.
SQL>SELECT JOB,SUM(SAL) FROM EMP GROUP BY JOB HAVING SUM(SAL)>(SELECT
MAX(SAL) FROM EMP WHERE JOB='MANAGER');
60) Display the names of employees from department number 10 with salary
greater than that of any employee working in another department.
SQL>select ename from emp where deptno=10 and sal>any(select sal from
emp where deptno<>10);
61) Display the names of the employees from department number 10 with
salary greater than that of all employees working in other departments.
SQL>select ename from emp where deptno=10 and sal>all(select sal from
emp where deptno<>10);
69) Find the first occurrence of the character 'a' in the string
'Computer Maintenance Corporation'.
SQL>SELECT INSTR('Computer Maintenance Corporation','a',1) FROM DUAL
71) Display the information from the emp table. Where the job MANAGER is found, it
should be displayed as BOSS (use the replace function).
SQL>select replace(JOB,'MANAGER','BOSS') FROM EMP;
75) Display the current date as 15th August Friday Nineteen Ninety Seven.
SQL>select to_char(sysdate,'ddth Month day year') from dual
76) Display the following output for each row from emp table.
scott has joined the company on wednesday 13th August nineteen ninety.
SQL>select ENAME||' HAS JOINED THE COMPANY ON '||to_char(HIREDATE,'day
ddth Month year') from EMP;
77) Find the date for nearest saturday after current date.
SQL>SELECT NEXT_DAY(SYSDATE,'SATURDAY')FROM DUAL;
79) Display the date three months before the current date.
SQL>select add_months(sysdate,-3) from dual;
80) Display the common jobs from department number 10 and 20.
SQL>select job from emp where deptno=10 and job in(select job from emp
where deptno=20);
81) Display the jobs found in department 10 and 20 Eliminate duplicate jobs.
SQL>select distinct(job) from emp where deptno=10 or deptno=20
(or)
SQL>select distinct(job) from emp where deptno in(10,20);
83) Display the details of those who do not have any person working under them.
SQL>select ename from emp where empno not in(select mgr from emp
where mgr is not null);
84) Display the details of those employees who are in sales department and
grade is 3.
SQL>select * from emp where deptno=(select deptno from dept where
dname='SALES')and sal between(select losal from salgrade where grade=3)and
(select hisal from salgrade where grade=3);
85) Display those who are managers and those who are not.
i) Display the managers' names
SQL>select distinct(m.ename) from emp e,emp m where m.empno=e.mgr;
86) Display those employees whose name contains not less than 4 characters.
SQL>select ename from emp where length(ename)>=4;
87) Display those department whose name start with "S" while the location
name ends with "K".
SQL>select dname from dept where dname like 'S%' and loc like '%K';
89) Display those employees whose salary is more than 3000 after giving 20%
increment.
SQL>select ename,sal from emp where (sal+sal*.2)>3000;
92) Display the employee name, deptname, salary and comm for those whose sal is between
2000 and 5000 and whose location is Chicago.
SQL>select ename,dname,sal,comm from emp,dept where sal between 2000
and 5000 and loc='CHICAGO' and emp.deptno=dept.deptno;
93) Display those employees whose salary is greater than their manager's salary.
SQL>select p.ename from emp e,emp p where e.empno=p.mgr and p.sal>e.sal
94) Display those employees who are working in the same dept where their
manager works.
SQL>select p.ename from emp e,emp p where e.empno=p.mgr and
p.deptno=e.deptno;
95) Display those employees who are not working under any manager.
SQL>select ename from emp where mgr is null
96) Display the grade and employee name for dept no 10 or 30, where the grade is
not 4 and the employee joined the company before 31-DEC-82.
SQL>select ename,grade from emp,salgrade where sal between losal and
hisal and deptno in(10,30) and grade<>4 and hiredate<'31-DEC-82';
97) Update the salary of each employee by a 10% increment for those who are not
eligible for commission.
SQL>update emp set sal=sal+sal*10/100 where comm is null;
98) Select those employees who joined the company before 31-DEC-82 and whose
dept location is New York or Chicago.
SQL>SELECT EMPNO,ENAME,HIREDATE,DNAME,LOC FROM EMP,DEPT
WHERE (EMP.DEPTNO=DEPT.DEPTNO) AND HIREDATE <'31-DEC-82' AND DEPT.LOC IN('CHICAGO','NEW
YORK');
101) Display name and salary of ford if his salary is equal to hisal of his grade
SQL>select ename,sal,grade from emp,salgrade where sal between losal and hisal
and ename ='FORD' AND HISAL=SAL;
102) Display the employee name, job, department name, manager name and grade, and
order the output department-wise.
SQL>SELECT E.ENAME,E.JOB,DNAME,EMP.ENAME,GRADE FROM EMP,EMP
E,SALGRADE,DEPT
WHERE EMP.SAL BETWEEN LOSAL AND HISAL AND EMP.EMPNO=E.MGR
AND EMP.DEPTNO=DEPT.DEPTNO ORDER BY DNAME
103) List out the name, job, salary, grade and department name for everyone
in the company except 'CLERK'. Sort on salary, displaying the highest salary first.
SQL>SELECT ENAME,JOB,DNAME,SAL,GRADE FROM EMP,SALGRADE,DEPT WHERE
SAL BETWEEN LOSAL AND HISAL AND EMP.DEPTNO=DEPT.DEPTNO AND JOB
NOT IN('CLERK') ORDER BY SAL DESC;
104) Display the employee name, job and manager. Also display employees who
are without a manager.
SQL>select e.ename,e.job,eMP.ename AS Manager from emp,emp e where
emp.empno(+)=e.mgr
106) Display name of those employee who are getting the highest salary?
SQL>select ename from emp where sal=(select max(sal) from emp);
107) Display those employees whose salary is equal to the average of the maximum
and minimum salaries.
SQL>select ename from emp where sal=(select (max(sal)+min(sal))/2 from emp);
108) Select the count of employees in each department where the count is greater than 3.
SQL>select deptno,count(*) from emp group by deptno having count(*)>3;
109) Display the dname where at least 3 employees are working; display only the department name.
SQL>select d.dname from dept d,emp e where d.deptno=e.deptno
group by d.dname having count(*)>=3;
110) Display the names of those managers whose salary is more than the average
salary of the company.
SQL>SELECT E.ENAME,EMP.ENAME FROM EMP,EMP E
WHERE EMP.EMPNO=E.MGR AND E.SAL>(SELECT AVG(SAL) FROM EMP);
111) Display those managers whose salary is more than the average salary of
their employees.
SQL>SELECT DISTINCT EMP.ENAME FROM EMP,EMP E WHERE
E.SAL <(SELECT AVG(EMP.SAL) FROM EMP
WHERE EMP.EMPNO=E.MGR GROUP BY EMP.ENAME) AND EMP.EMPNO=E.MGR;
112) Display the employee name, sal, comm and net pay for those employees whose net pay is greater than or
equal to any other employee's salary in the company.
SQL>select ename,sal,comm,sal+nvl(comm,0) as NetPay from emp
where sal+nvl(comm,0) >any (select sal from emp)
113) Display all employee names along with the total salary of the company
against each name.
SQL>SELECT ENAME,(SELECT SUM(SAL) FROM EMP) FROM EMP;
115) Find out the number of employees whose salary is greater than their manager's salary.
SQL>SELECT COUNT(*) FROM EMP, EMP E WHERE EMP.EMPNO=E.MGR
AND EMP.SAL<E.SAL;
119) Display those employees who joined the company in the month of December.
SQL>select ename from emp where to_char(hiredate,'MON')='DEC';
123) Display those employees for whom 10% of salary equals the year of joining.
SQL>select ename from emp where to_char(hiredate,'YY')=sal*0.1;
126) Display those employees who joined the company before the 15th of the month.
SQL>select ename from emp where to_char(hiredate,'DD')<15;
131) Display those employees whose grade is equal to any number of sal but
not equal to first number of sal?
SQL> SELECT ENAME,GRADE FROM EMP,SALGRADE
WHERE GRADE NOT IN(SELECT SUBSTR(SAL,0,1)FROM EMP)
132) Print the details of all the employees who are Sub-ordinate to BLAKE?
SQL>select emp.ename from emp, emp e where emp.mgr=e.empno and
e.ename='BLAKE';
133) Display the employee name and salary where the salary is greater than the
highest departmental average.
SQL>SELECT ENAME,SAL FROM EMP WHERE SAL>(SELECT MAX(AVG(SAL)) FROM EMP
GROUP BY DEPTNO);
135) Display the first half of each ename in upper case and the remaining half in lower case.
SQL>SELECT
SUBSTR(UPPER(ENAME),1,LENGTH(ENAME)/2)||SUBSTR(LOWER(ENAME),LENGTH(ENAME)/2+1) FROM EMP;
136) Display the 10th record of emp table without using group by and rowid?
SQL>SELECT * FROM EMP WHERE ROWNUM<11
MINUS
SELECT * FROM EMP WHERE ROWNUM<10
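The same "first N minus first N-1" idea from query 136 can be tried in sqlite3, which has no ROWNUM or MINUS but supports LIMIT and EXCEPT (toy rows; an explicit ORDER BY is added because row order is otherwise not guaranteed):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table emp (empno integer, ename text)")
con.executemany("insert into emp values (?, ?)",
                [(i, 'EMP%02d' % i) for i in range(1, 16)])

# Take the first 10 rows, subtract the first 9 -- the 10th row remains.
# (EXCEPT plays the role of Oracle's MINUS; LIMIT plays the role of ROWNUM.)
tenth = con.execute(
    "select * from (select * from emp order by empno limit 10) "
    "except "
    "select * from (select * from emp order by empno limit 9)"
).fetchall()

print(tenth)
```

The set-difference trick works because the first 9 rows of both subqueries are identical, so only the extra 10th row survives the EXCEPT/MINUS.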
140) Display those employees whose month of joining equals their grade.
SQL>SELECT ENAME FROM EMP WHERE SAL BETWEEN
(SELECT LOSAL FROM SALGRADE WHERE
GRADE=TO_CHAR(HIREDATE,'MM')) AND
(SELECT HISAL FROM SALGRADE WHERE
GRADE=TO_CHAR(HIREDATE,'MM'));
146) Oops, I forgot to give the primary key constraint. Add it now.
SQL>alter table emp add primary key(empno);
149) I want to give a validation saying that salary cannot be greater than 10,000
(note: give a name to this constraint).
SQL>alter table emp add constraint chk_001 check(sal<=10000)
150) For the time being I have decided not to impose this validation. My boss has agreed to pay
more than 10,000.
SQL>alter table emp drop constraint chk_001; (or, to keep the constraint for later,)
SQL>alter table emp modify constraint chk_001 disable;
151) My boss has changed his mind. Now he doesn't want to pay more than
10,000, so re-enable that salary constraint.
SQL>alter table emp modify constraint chk_001 enable;
153) Oh! This column should be related to empno. Give a command to add this constraint.
SQL>ALTER TABLE EMP ADD CONSTRAINT MGR_DEPT FOREIGN KEY(MGR) REFERENCES
EMP(EMPNO)
155) This deptno column should be related to deptno column of dept table;
SQL>alter table emp add constraint dept_001 foreign key(deptno)
references dept(deptno) [deptno should be the primary key of dept]
157) Create a table called newemp. Using a single command, create the table
as well as populate it with data (use create table as).
SQL>create table newemp as select * from emp;
158) Delete the rows of employees who are working in the company for more
than 2 years.
SQL>delete from emp where (sysdate-hiredate)/365>2;
159) Provide a commission (10% of sal) to employees who are not earning
any commission.
SQL>update emp set comm=sal*0.1 where comm is null;
161) Display the employee name and department name for each employee.
SQL>select ename,dname from emp,dept where emp.deptno=dept.deptno;
165) Display the department name and total number of employees in each
department.
SQL>select dname,count(ename) from emp,dept where
emp.deptno=dept.deptno group by dname;
166)Display the department name along with total salary in each department.
SQL>select dname,sum(sal) from emp,dept where emp.deptno=dept.deptno group by dname;
167) Display itemname and total sales amount for each item.
SQL>select itemname,sum(amount) from item group by itemname;
168) Write a query to delete the repeated rows from the emp table.
SQL>Delete from emp where rowid not in(select min(rowid)from emp group
by ename)
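Query 168's rowid trick can be exercised directly in sqlite3, which also exposes a rowid pseudo-column (toy rows; names invented). The subquery keeps the smallest rowid per name, and the DELETE removes every other copy:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table emp (ename text)")
con.executemany("insert into emp values (?)",
                [('SMITH',), ('ALLEN',), ('SMITH',), ('WARD',), ('ALLEN',)])

# Keep the row with the smallest rowid for each name; delete the repeats.
con.execute("delete from emp where rowid not in "
            "(select min(rowid) from emp group by ename)")

remaining = [r[0] for r in con.execute("select ename from emp order by rowid")]
print(remaining)   # each name survives exactly once
```

Because rowid uniquely identifies physical rows even when all visible columns are identical, this pattern removes duplicates that a plain DELETE ... WHERE could not distinguish.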
------******------
2. Check if you are able to access and view all the documents from VSS.
Every TE must be able to perform the following operations with VSS.
Create a working folder in the local machine using the following steps:---
i) Select the root folder ($/) and right click on it and click on “Set working folder” option.
ii) Select ‘E’ drive and enter the folder name as E:\VSS.
iii) Click on create folder button and click on OK.
3. Check if you are able to copy any one of the documents (say BDD) to
your local software structure.
4. Check if you are able to check in the sample document with the following
naming convention.
Steps to Check-in:
i) Go to File -> Add Files -> Select the drive in which the document is present -> Select the file and click on the
Add button.
ii) Add the comment (optional) -> Click on OK.
5. Check if you are able to check out the document.
Steps to Check-out:
i) Select the folder in the VSS tree and select the document to be checked out.
ii) Right-click on the document and select the Checkout option.
iii) Enter the comment for the items that need to be checked out and click on OK.
6. Check if you are able to observe the red icon for the checked-out document.
7. Check if you are able to edit the checked-out document with VSS.
11. Select the document, right-click on it and click on the Delete button.