InfoVista Application Monitoring - Newtest
Seeing IT
InfoVista SA reserves the right to revise this documentation and to make changes in content from time to
time without obligation on the part of InfoVista to provide notification of such revision or change.
InfoVista provides this documentation without warranty of any kind, either implied or expressed,
including but not limited to, the implied warranties of merchantability and fitness for a particular purpose.
InfoVista may make improvements or changes to the product(s) and/or the program(s) described in this
document at any time.
If there is any software on removable media described in this documentation, it is furnished under a
license agreement included with the product in the installation procedure of the product. If you are
unable to locate a copy, please contact InfoVista and a copy will be provided to you.
USA:
InfoVista Corp
10440 Little Patuxent Parkway
Columbia MD 21044
USA
tel: +1 410 997 4470
fax: +1 410 997 4607

Europe:
InfoVista SA
6, rue de la Terre de Feu
91952 Courtabœuf Cedex
France
tel: +33 1 64 86 79 00
fax: +33 1 64 86 79 79

web: www.infovista.com
email: support@infovista.com
InfoVista documentation
The InfoVista documentation set comprises the following volumes:
InfoVista Installation Guide
This manual describes the installation of the product.
InfoVista Quick Start
This manual gives quick, default procedures for installing and starting the system
and describes how to produce your first reports using templates from the standard
libraries.
InfoVista User’s Guide
This manual describes the client graphic interface, the methodology of modeling the
Information System, and the tasks which can be performed with the product.
InfoVista Reference Manual
This manual documents the syntax of all commands used by the client software in
text mode.
VistaViews (of which this book describes one)
This is a set of manuals describing each of the standard libraries provided for
InfoVista.
Documentation media
The InfoVista documents listed above are provided in print and also as PDF files on
the software distribution media. The PDF files can be viewed with any PDF reader.
This book
This manual is divided into the following sections:
• Chapter 1 - Getting Started. This chapter provides an overview of the
product and its main functionalities. It also explains how to model your
resources and get reports up and running as quickly as possible.
• Chapter 2 - The Reports. This chapter describes individual reports in detail.
Contents
The Reports
  Table of Report Templates
  Newtest Probe Device View
    What can it do?
    The Tables
  Scenario Performance and Availability
    What can it do?
    The Graphs
  Order Performance and Availability
    What can it do?
    The Graphs
  Order Group Performance and Availability
    What can it do?
    The Graphs
  Order Group Summary
    What can it do?
    The Table
Overview
Newtest technology
Newtest products measure the performance of a service or application and help
determine the cause of errors, unavailability or response-time degradation.
Newtest software simulates user actions (for example, downloading a web page or
backing up a file) and measures various related parameters. Different scenarios are
assigned to each probe. The role of the probe is to carry out different end-to-end
measurements.
Prerequisites
You must install version 50 or later of the InfoVista SLM library on your server. The
auditec MIB is installed automatically with the VistaView.
In addition, you’ll need to set some basic parameters in the Property Values dialogue
box for the Instances you create.
• The IP address of the NT server (or the DNS name - InfoVista then
automatically looks up the IP address)
• The SNMP read/write community settings.
• The port number of the SNMP agent (161 by default). This port number
must always match the probe port number.
Quick Start
This section is for experienced users of InfoVista. If you are new to InfoVista, we
recommend you read the whole of Chapter 1 before starting.
Installation
The VistaView for Application Monitoring - Newtest is delivered as an InfoVista
library file called InfoVista_Application_Monitoring_-_Newtest.ivl.
During a first-time installation of InfoVista, this file is copied to disk but not
installed on the InfoVista server. During an update or a migration, the library file is
simply copied to disk. You must install it manually using File/Install VistaViews... in
the InfoVista main window.
After installation, your InfoVista object tree will contain the InfoVista Application
Monitoring - Newtest library.
See the InfoVista Installation Guide for more details.
Getting started
• port – the port number of the SNMP agent. If in doubt, use the default
value. The port number must always be the same as the probe port number.
• snmprd – the SNMP read community of the SNMP agent. If in doubt, use the
default value public, which is the most common setting.
Check that the Probe and the SNMP agent are configured and enabled and
authorized to accept SNMP requests coming from your InfoVista server.
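One quick way to verify this, assuming the Net-SNMP command-line tools are available on the InfoVista server (an assumption; any SNMP test utility will do), is to issue a test GET against the probe. The host address below is a placeholder:

```shell
# Query the standard sysDescr object from the probe's SNMP agent.
# 192.0.2.10, port 161 and the community string "public" are placeholders;
# substitute your probe's address, port and read community.
snmpget -v 1 -c public -t 2 -r 1 192.0.2.10:161 SNMPv2-MIB::sysDescr.0
```

A reply confirms the agent is reachable. A timeout usually means the agent is disabled, the port number is wrong, or requests from the InfoVista server's address are not authorized.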
Step 4
Start a Newtest Probe Device View report. This report template is in the InfoVista
Application Monitoring - Newtest library. It gives you:
• a description of each Scenario in terms of its Queue index, Scenario Index
and Scenario Name,
• a description of each Order in terms of its Queue index, Scenario Index,
Order Index and Order Name,
• the Value for each Order (the result of the Order execution),
• a Coefficient,
• the Unit used to measure the Order (usually milliseconds).
For details of this report, refer to Chapter 2.
Step 5
Create Instances for Newtest Scenario and Newtest Order Vistas. For each Instance
we need to:
1 Drag and drop the Probe Device View Instance into the relevant Contents
box.
2 Give the Instance a name.
3 Define the property values for each Instance. These values are provided in
the Probe Device View Report.
For Newtest Scenario Performance and Availability reports, enter the following
property values:
• Scenario Index,
• Scenario Queue Index.
For Newtest Order Performance and Availability reports, enter the following
property values:
• Order Index,
• Order Queue Index,
• Order Scenario Index,
• a description (optional).
Step 6
Create Newtest Group Order Performance and Availability reports and Order Group
Summary reports by dragging and dropping the relevant Instances into the relevant
Contents boxes.
Step 7
Create reports by dropping the relevant Instances into the report template you wish
to generate. Alternatively, you can use other report generation methods.
For more details on the report templates available in the VistaView for Application
Monitoring - Newtest, refer to Chapter 2.
The InfoVista Application Monitoring - Newtest library contains the following report
templates:
Template:  Newtest Probe Device View
Type:      Real Time
Purpose:   Obtaining order and scenario indexes and values.
Contents:  Order and scenario names, indexes, coefficients and time units.
MIB:       auditec MIB
The Reports
Newtest Probe Device View
The Tables
There are two tables in this report: the Scenario list and the Order list. This report
provides an overview of the applications monitored by a Newtest probe. The various
indexes, names and values associated with each application are clearly displayed.
The parameters shown in this Device View report are the ones used to create
Instances and reports on monitored applications.
Scenario Performance and Availability
The Graphs
The Scenario Performance and Availability graphs provide an overview of key
parameters for one application. Each scenario groups together a maximum of 10
orders, and the graphs in this report reflect the results produced by those orders.
Availability
This graph shows the percentage of scenarios that were successful. In other words,
we are measuring the availability of the application at the moment the scenario was
executed. The availability graph is calculated as the total number of executions
minus the number of failed scenarios expressed as a percentage.
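The availability calculation described above can be sketched as follows. This is a minimal illustration of the formula, not the library's internal Indicator code:

```python
def availability(total_executions: int, failed: int) -> float:
    """Percentage of scenario executions that succeeded:
    (total - failed) / total, expressed as a percentage."""
    if total_executions == 0:
        return 0.0  # no samples collected yet
    return 100.0 * (total_executions - failed) / total_executions

# For example, 48 executions with 2 failed scenarios:
print(round(availability(48, 2), 2))  # 95.83
```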
Query Statistics
This graph provides a breakdown of the actual number of queries that succeeded or
failed. For failed query statistics, this graph uses the Indicator, ban - Failed Scenario
which calculates the number of unsuccessful scenario executions. The number of
successful queries is calculated as the total number of executions minus the number
of failed executions.
Mean Response Time
This graph provides an average of all order response times for a scenario. Because a
scenario can simulate a number of different operations (loading, exiting, searching,
connecting etc.), it is sometimes useful to have an overall picture of all response
times. The graph is calculated using the Indicator (ban - Mean Response Time) which
averages the response times for each order’s results. See also below, Response Time
Details.
Response Time Details
This graph provides details of response times for each application scenario. In the
example used above, we can see that only one of the four applications has been
monitored. The response time for this application is given here in milliseconds.
Note that the Mean Response Time graph still averages over all four applications
that can be monitored, including those without monitored response times. If the
Response Time Details for one application show 1,000 milliseconds and the three
other applications have no monitored response time, the Mean Response Time will
be 250 milliseconds (1,000 divided by four).
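This pitfall is easy to reproduce. The sketch below illustrates the averaging behaviour described above; it is not the product's Indicator implementation:

```python
def mean_response_time(order_times_ms: list[float]) -> float:
    """Average over ALL orders, including unmonitored ones, which
    report 0 ms and therefore drag the mean down."""
    if not order_times_ms:
        return 0.0
    return sum(order_times_ms) / len(order_times_ms)

# One monitored application at 1,000 ms, three unmonitored (0 ms):
print(mean_response_time([1000.0, 0.0, 0.0, 0.0]))  # 250.0
```

When interpreting the Mean Response Time graph, keep in mind how many of the scenario's orders actually have monitored response times.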
Order Performance and Availability
The Graphs
The graphs provided in this report give a more detailed view of the results produced
by one specific order. With scenario reports, we were interested in an overall view of
a group of orders. In this report, we are measuring the availability and response
times for one application.
Availability
This graph shows the percentage of orders that were successful. In other words, we
are measuring the availability of an application at the moment an order transaction
was executed. Availability is calculated as the total number of samples minus the
number of failed transactions, expressed as a percentage of the total.
Query Statistics
This graph provides a breakdown of the actual number of queries that have
succeeded or failed. For failed query statistics, this graph uses the Indicator, ban -
Failed Transaction which calculates the number of unsuccessful order transactions.
The number of successful queries is calculated as the total number of transactions
less the number of failed transactions.
Response Time
This graph shows an application response time to an order. Indicators used to
calculate this graph take the values from the last six orders and display an average of
these values.
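The "last six orders" averaging described above behaves like a sliding window: each new sample pushes out the oldest one. A sketch of the principle (the six-sample depth is taken from the text above; the class name is illustrative):

```python
from collections import deque

class ResponseTimeAverage:
    """Keep the last six order response times and expose their mean."""
    def __init__(self, window: int = 6):
        self.samples = deque(maxlen=window)  # oldest sample is evicted automatically

    def add(self, value_ms: float) -> None:
        self.samples.append(value_ms)

    def mean(self) -> float:
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

rta = ResponseTimeAverage()
for v in [120, 130, 110, 500, 125, 115, 118]:  # seven samples: the first is dropped
    rta.add(v)
print(rta.mean())  # 183.0 — mean of the last six values
```

The windowing smooths out isolated spikes (like the 500 ms sample above) while still letting sustained degradation show through.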
Response Time Details
This graph shows the maximum and minimum response times to an order. This is
useful when troubleshooting (isolating specific ftp download problems, for
example).
Order Group Performance and Availability
The Graphs
These graphs give users an overall picture of a group of orders. This is useful if we
wish to view the performance of a particular type of order group together.
For example, we might wish to evaluate the performance of different server
connections, web browser loads or TELNET sessions. Users define which Instances
to drag and drop to create Group Instances reflecting a particular group.
Mean Availability (%)
This graph shows the average availability for all applications in a specific group. In
other words, we are measuring the availability of a group of applications at the
moment an order transaction was executed. The Mean Availability graph is
calculated as the total number of samples minus the number of failed transactions,
expressed as a percentage of the total.
Availability (%) - Bottom 10
This graph ranks the bottom ten performers in terms of availability. It lets us
immediately see which applications are problematic in terms of availability. This
graph uses the same Indicators as the Mean Availability graph to display the
information.
Mean Response Time
This graph groups together the average response times for each order and thus for
each aspect of an application that is being tested.
Response Time - Top 10
This graph returns response time information on the Top 10 applications. We can
see at a glance which of the monitored parameters is performing fastest.
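The bottom-ten ranking described above amounts to sorting Instances by availability and keeping the worst entries. A sketch with hypothetical application names (this is not the report engine's code):

```python
def bottom_n(availability_by_app: dict[str, float], n: int = 10) -> list[tuple[str, float]]:
    """Rank applications from least to most available and keep the worst n."""
    return sorted(availability_by_app.items(), key=lambda kv: kv[1])[:n]

# Hypothetical group of monitored applications and their availability (%):
apps = {"web-login": 99.2, "ftp-backup": 87.5, "mail-send": 94.0, "db-query": 99.9}
print(bottom_n(apps, 2))  # [('ftp-backup', 87.5), ('mail-send', 94.0)]
```

The Top 10 response time graph is the mirror image: sort descending on response time and keep the first ten entries.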
Order Group Summary
The Table
This table provides a summary overview of order group results. It contains
information that is not necessarily given in the Order Group Performance and
Availability reports. Information is presented for each order and classified by group
(i.e. order 1, order 2, etc.).
Availability (%)
This column shows the availability, expressed as a percentage, for the applications
monitored as part of a specific order. Availability is calculated as the total number of
samples minus the number of failed transactions, expressed as a percentage of the total.
Response Time (ms)
This column provides the response time for each order sample. Indicators used to
calculate this information take their values from the last six orders and display an
average of these values.
Maximum Response Time (ms)
The results of each order sample in terms of Maximum Response Time are shown in
this column. A zero reading means that the order result has not been produced.
Minimum Response Time (ms)
The results of each order sample in terms of Minimum Response Time are shown in
this column. A zero reading means that the order result has not been produced.
Success Transaction
The number of successful transactions per order sample. The number of successful
transactions is calculated as the total number of transactions less the number of
failed transactions.
Failed Transaction
The number of failed transactions per order sample. For failed transaction statistics,
this column uses the Indicator ban - Failed Transaction, which calculates the number
of unsuccessful order transactions.
We welcome any comments you may have on our product and its documentation. Your remarks
will be examined thoroughly and taken into account for future versions of InfoVista.