Configuring Tivoli Workload Scheduler in a firewalled network
Understanding how the components connect
The purpose of this document is to provide an overview of the connections used in a typical Tivoli
Workload Scheduler environment and to describe the most common firewall configuration scenarios
available across a Tivoli Workload Scheduler network. The environment pictured below is made up of a
master domain manager installed on the same system as the database server instance, a backup master
domain manager installed on the same system as the database client instance, and an agent. All the
available types of user interfaces - Tivoli Dynamic Workload Console, Job Scheduling Console, remote
command line client, and local composer and conman CLIs - are also shown.

[Figure: diagram showing a master domain manager (WAS, composer, conman, database), a backup master domain manager, an agent, the Dynamic Workload Console, the Job Scheduling Console, the remote command line client, and a web browser, linked by the NATIVE, HTTP/HTTPS, TDWC, JSC, LOCAL, DATABASE, EVENT, and DYNAMIC connection types.]

Figure 1. Tivoli Workload Scheduler connections overview.

Components overview
The master domain manager and its backup require access to a database to store scheduling object
definitions and historical report data. The following components are used to work with these objects:
v The native command lines (composer and conman) define objects through a connection via the
WebSphere Application Server (WAS).
Composer is used to create object definitions (jobs, job streams, schedules, workstations, event rules, and so on).
Conman is used to browse objects and run actions in the Symphony file (which contains production
plan information). It connects directly to the local copy of Symphony to run all actions except for
submitting jobs and job streams: these require access to the database through WAS, which is installed
as an embedded component on the master or backup master (see the example after this list).
On the agents, no composer command line and no direct access to the database are available, but WAS
can be installed alongside the agent to serve GUI connections.
v The command line client (Remote CLI) can run all composer commands and a few other commands
used on the master domain manager. It can be installed on workstations where no other Tivoli
Workload Scheduler components are installed and communicates by TCP/IP with the command line
server, which is part of the master domain manager. The information required to make the connection
with the master domain manager is defined either in the local options file or supplied as parameters to
the command.
v The Dynamic Workload Console (TDWC) is a web-based interface from which you can run the
product. It covers all the tasks that can be run from the native command lines. You can use the console
from any system in the network where a supported web browser is installed.
v The Job Scheduling Console (JSC) is a legacy Java console that has been replaced by the TDWC and is
no longer packaged with the product. It does not provide the functions introduced since Tivoli
Workload Scheduler 8.4 and is covered in this document only for the benefit of those using older
versions.
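The following is a minimal illustration of the difference between local plan actions and submissions; the workstation and job stream names (MDM84, PAYROLL) are made up:

conman "ss @"               # show job streams in the plan: reads the local Symphony copy, no WAS needed
conman "sbs MDM84#PAYROLL"  # submit a predefined job stream: its definition is read from the database
                            # through the WAS HTTP/HTTPS connection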

TCP connections description


The Tivoli Workload Scheduler components communicate through the following connection types:
Table 1. TCP connection overview.

NATIVE
    Is used by all workstations to exchange information about workload execution. The destinations are
    the netman ports (default is 31111) of the master, backup master, and agents. The established
    connection must be acknowledged.
    Native connections are also used by agents to notify the event processor (which by default resides on
    the master) about event triggering. The destination is the EIF port (default is 31131). No
    acknowledgment of the established connection is required.

HTTP/HTTPS
    Is used by the composer command line and by the command line client to define objects in the
    database through the WebSphere Application Server. It is also used by the conman command line (on
    the master, backup master, and agents) to run job and job stream submissions that require access to
    the database.
    Monitoring agents running on the workstations for event-driven workload automation use this kind of
    connection to download new event rule configuration packages issued by the event processor.
    The destinations on the master and the event processor (which are on the same system by default)
    are the HTTP port (default is 31115) or the HTTPS port (default is 31116). The established connection
    must be acknowledged.

TDWC
    Java Remote Method Invocation (RMI) used by the Dynamic Workload Console. The destination is any
    target workstation with a WAS connector installed, on the boot port (default is 31117). The established
    connection must be acknowledged. The default URL for the Dynamic Workload Console is:
    https://tdwc_server_IP_address:29043/ibm/console

JSC
    Java Remote Method Invocation (RMI) used by the Job Scheduling Console. The destination is any
    target workstation with a WAS connector installed, on the boot port (default is 31117). The established
    connection must be acknowledged. In addition, the Job Scheduling Console needs to open a
    connection on the target machine with ports defaulted as 31120 or 31121 for authentication purposes.

LOCAL
    Local command line interfaces use HTTP or HTTPS communication pointing to the local server
    address 127.0.0.1.

DATABASE
    The DB2 server normally listens on port 50000 while the Oracle server normally listens on port 1521.
    The WebSphere Application Server by default uses type-4 JDBC drivers to connect with the database.
    For reporting tasks, the Dynamic Workload Console contacts the DB server URL directly on one of the
    ports specified above.

EVENT
    When the event-driven workload automation feature is enabled, this connection type is used by the
    monman process of the monitoring agents to open a connection with the event processor EIF port
    (default is 31131, or 31132 for SSL) each time an event is triggered on the agent.

DYNAMIC
    When the dynamic scheduling feature is enabled, this connection type is used by the workload broker
    component on the master or domain manager to dispatch a job to the JobManager component port of
    the workstation selected for running the job.
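On a given workstation, a quick way to check which of these ports are actually in use is to look at the listening sockets; for example, on UNIX (311 matches the default port range used in this document):

netstat -an | grep LISTEN | grep 311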

Static scheduling interprocess communication
In a static Tivoli Workload Scheduler network, a job is run by the workstation specified in the job
definition, as opposed to dynamic scheduling where the workload broker component chooses the best
available workstation at runtime.

This figure shows the connections used:

[Figure: diagram of a master domain manager, backup master domain manager, child domain manager, and agents, with three numbered flows: (1) Symphony delivery and status updates, (2) conman remote commands (start, stop, retrieve job log) over NATIVE connections, and (3) submission of predefined jobs and job streams from a remote agent over HTTP/HTTPS.]

Figure 2. TCP connection flows used for static scheduling.

where:
1. The Symphony file is generated on the master domain manager and delivered to the rest of the
network hierarchically across domains. Any job and workstation status change is propagated across the
network hierarchically, depending on the agent's role in the domain.
2. Some conman commands such as start, stop and job log retrieval are directly delivered from the
issuing workstation to the target workstation. If the workstation is defined with the “behind firewall”
option, the command is delivered following the network hierarchy as in point 1.
For each incoming connection the netman process on the target workstation spawns a local writer
process to satisfy the request. All of these connections are targeted to the netman port on the remote
workstation.
3. From conman it is possible to submit predefined jobs and job streams into the plan directly from the
database. This requires a connection to the HTTP/HTTPS port of the WebSphere Application Server
on the master domain manager. This is possible only if the “Attributes for CLI connections” section in
the localopts file of the workstation is properly defined:

#----------------------------------------------------------------------------
# Attributes for CLI connections
#
HOST = 127.0.0.1 # Master hostname used when attempting a connection
PROTOCOL = https # Protocol used to establish a connection
PORT = 31116 # Protocol port
#PROXY = proxy.userlocal
#PROXYPORT = 3333
TIMEOUT = 3600 # Timeout in seconds to wait for server response
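The values above are the defaults found on the master itself (HOST points to the local address). On an agent that needs to submit predefined jobs and job streams, these attributes typically point to the master domain manager instead; the following sketch uses a made-up hostname:

HOST = mdm84.example.com   # hostname of the master domain manager (made-up value)
PROTOCOL = https           # must match the protocol enabled on the WAS of the master
PORT = 31116               # default HTTPS port; 31115 for HTTP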

Connections overview for event-driven workload automation


When this feature is enabled, the following processes must be up and running:
v The event processor on the master domain manager or backup master.
v Monman on the agents.

The connections used in event-driven workload automation are shown in the following figure:

[Figure: diagram of the master domain manager and an agent with three numbered flows: (1) notification that a new rule package is available (NATIVE), (2) download of the package (HTTP/HTTPS), and (3) event triggering (EVENT).]

Figure 3. TCP connection flows used in event-driven workload automation.

where:

1. The event processor (located on the WebSphere Application Server instance that runs on the master or
backup master) communicates to the target agent that a new rule configuration package is available.
The message follows the connection path described for static scheduling. The target process is
monman.
2. The monman process establishes an HTTP/HTTPS connection with the event processor to download
the event rule package (in pull mode) and starts monitoring events.
3. When an event is triggered on the agent, monman opens a connection to the event processor EIF port
and forwards the event.
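On the agent, conman also provides commands to control the monitoring engine involved in these flows; a minimal sketch follows, and availability and syntax can vary slightly across versions:

conman startmon      # start the monman monitoring engine on this workstation
conman stopmon       # stop it
conman deployconf    # request the download of the latest rule configuration package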

Connections overview for dynamic scheduling


When this feature is enabled, the following processes must be up and running:
v The dynamic workload broker on the master domain manager, backup master, or domain manager.
v Job manager on the agents.

The connections used in dynamic scheduling are shown in the following figure:

[Figure: diagram of the master domain manager and an agent with two numbered flows: (1) system scan results and job status updates sent by the agent (HTTP/HTTPS), and (2) job dispatching from the workload broker to the agent (DYNAMIC).]

Figure 4. TCP connection flows used in dynamic scheduling.

where:

1. On agents where dynamic scheduling capabilities are enabled, the JobManager process is started.
From the agent, JobManager sends system and job status information to the dynamic workload broker
component that runs on the master or on a domain manager. A connection is opened to the
HTTP/HTTPS port of the WebSphere Application Server instance running on that workstation.
2. The dynamic workload broker component is an application that runs on the WebSphere Application
Server instance on the master or on a domain manager. It receives data from the dynamic agents and
stores it in the database. When the time comes to run a job, the workload broker uses this data to
select the best available agent based on the requirements specified in the job definition. To dispatch
the job, it opens a connection to the JobManager port of the selected agent.

A practical approach to configuring firewalls


You might not be running all the components and features described so far in this document, depending
also on which Tivoli Workload Scheduler version you are using. For instance:
v You are very likely to be running the Dynamic Workload Console as your user interface. You might
also enter commands from the command lines, or have a few remote command line clients somewhere
in your network. You should not be using the Job Scheduling Console unless you are still running a
scheduler version older than 8.4.
v You might not be using event-driven workload automation or dynamic scheduling.
In any case, the final firewall configuration is the sum of the accesses required by the features enabled on
the nodes involved.

The following step-by-step approach will help you complete the task of configuring your network across
a firewall:
1. Break down your Tivoli Workload Scheduler network into pairs of systems divided by a firewall.
2. For each pair, examine which ports need to be configured for the systems.
3. Evaluate each point-to-point scenario applicable to the systems involved and build an ACL (Access
Control List) table for each scenario.
4. Merge all the calculated ACL tables together.

ACL table format
Tivoli Workload Scheduler administrators should work with network administrators so that information
about the needed firewall rules is exchanged in a form that can be easily written by people who are not
network specialists and easily applied to a router.

The following table is proposed as a simple tool for Tivoli Workload Scheduler administrators to gather
data about the connections that must be allowed through a firewall so that Tivoli Workload Scheduler
can function properly:

FROM: source IP address
TO: target workstation IP address
DESTINATION PORT: protocol, or port number if packet filtering is applied
CONNECTION: connection name label (used only by the TWS administrator to easily check results)
The next sections show where to retrieve the port numbers you need to fill in your ACL tables.

Retrieving the WebSphere Application Server ports


To find the port numbers used by the WebSphere Application Server, run the showHostProperties
command.
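For example, on a UNIX master you might run the command from the wastools directory of the installation; the path shown here (TWA_home is a placeholder) and the .sh extension are assumptions that can differ by platform and version:

cd <TWA_home>/wastools
./showHostProperties.sh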

The following is a sample host properties file showing the port numbers you might need to enter in your
ACL tables:
################################################################
Host Configuration Panel
################################################################
oldHostname=localhost
newHostname=localhost
################################################################
Ports Configuration Panel
################################################################
bootPort=31117
bootHost=myhost.ibm.com
soapPort=31118
soapHost= myhost.ibm.com
httpPort=31115
httpHost=*
httpsPort=31116
httpsHost=*

adminPort=31123
adminHost=*
adminSecurePort=31124
adminSecureHost=*
sasPort=31119
sasHost= myhost.ibm.com
csiServerAuthPort=31120
csiServerAuthHost= myhost.ibm.com
csiMuthualAuthPort=31121
csiMuthualAuthHost= myhost.ibm.com
orbPort=31122
orbHost= myhost.ibm.com

Retrieving the netman ports used for native communication


To find the port numbers used by every workstation in Tivoli Workload Scheduler (master, backup
master, domain managers, and agents), look at the localopts file on each workstation.

The following is a sample localopts file showing the port numbers you need:
#----------------------------------------------------------------------------
# The attributes of this workstation for the netman process:
#
nm mortal =no
nm port =31111 ######## netman port ########
nm read =10
nm retry =800
#
#----------------------------------------------------------------------------
[...]
#----------------------------------------------------------------------------
# Attributes for CLI connections
#
HOST = 127.0.0.1 # Master hostname used when attempting a connection.
PROTOCOL = https # Protocol used to establish a connection
PORT = 31116 # Protocol port
#PROXY = proxy.userlocal
#PROXYPORT = 3333
TIMEOUT = 3600 # Timeout in seconds to wait a server response
#CLISSLSERVERAUTH = yes
#CLISSLCIPHER = MD5
#CLISSLSERVERCERTIFICATE =
#CLISSLTRUSTEDDIR =
DEFAULTWS = mdm84
USEROPTS = useropts_mdm84
#----------------------------------------------------------------------------

Retrieving the event processor ports used in event-driven workload automation

To find the port numbers used by the event processor, look at the global options on the master domain
manager. Run the optman ls command to display them and look for the following options:
enEventProcessorHttpsProtocol / eh = YES
eventProcessorEIFPort / ee = 31131
eventProcessorEIFSSLPort / ef = 31132
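For example, you might filter the optman listing for just these options (the grep pattern is only illustrative):

optman ls | grep -i eventprocessor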

Retrieving the job manager ports used in dynamic scheduling


To find the port number used by job manager, look at the ita.ini file on every dynamic agent.

The following is a section of a sample ita.ini file showing the port number you need. Note that HTTP and
HTTPS are not supported at the same time: the port for the unused protocol must be set to 0.
# ITA Properties file written Thu Aug 18 10:44:13 2011
...
[ITA]
net_backlog = 64
agents =
instance_name = tws_cpa_agent_mdm86
net_if = 0.0.0.0
net_if_ipv6 = ::
tcp_port = 0 #HTTP JobManager port number
ssl_port = 31114 #HTTPS JobManager port number
net_addr_monitor = yes
reregistration_delay = 30
agent_stop_timeout = 5
agent_restart_delay = 5
agent_info_wakeup = 600
custom_info_wakeup = 3200
...
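To extract just the two values on an agent, you might grep the file; the location of ita.ini shown here (under the agent installation directory) is an assumption and can differ in your environment:

grep -E "tcp_port|ssl_port" <TWA_home>/TWS/ITA/cpa/ita/ita.ini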

Firewall connection scenarios


The following sections describe common firewall configuration scenarios. Each scenario includes the ACL
table of connections that need to be allowed to ensure that Tivoli Workload Scheduler works properly in
a firewalled network. The ports listed in the tables are identified by their names. To map them to the
specific values configured in your environment, see the preceding Retrieving... sections.

Also, remember that:


v The master domain manager can switch to the backup master domain manager.
v The event processor can switch to the one on the backup master domain manager even if the master
domain manager is not switching.
v The active dynamic workload broker component server can switch to the one on the backup (master)
domain manager only when the hosting (master) domain manager does.
Therefore, remember to build firewall rules also for all backup Tivoli Workload Scheduler instances.

Each scenario takes into consideration only one firewall at a time. If there are multiple firewalls across the
network, two or more scenarios can be merged.

The scenarios are:


v “Firewall between the master and the rest of the Tivoli Workload Scheduler network”
v “Firewall between the event processor and an agent”
v “Firewall between dynamic workload broker and a dynamic agent”
v “Firewall between the Dynamic Workload Console and the Tivoli Workload Scheduler network”
v “Firewall between the browser and the Dynamic Workload Console”
v “Firewall between the command line client and the master domain manager”
v “Firewall between the Job Brokering Definition Console and the dynamic workload broker”
v “Firewall between the Job Scheduling Console and the Tivoli Workload Scheduler network”

Firewall between the master and the rest of the Tivoli Workload
Scheduler network
The scenario in the next figure shows a firewall separating the master domain manager from the rest of
the Tivoli Workload Scheduler network.


Figure 5. Firewall between the master and the rest of the network.

In this scenario all the systems located on the other side of the firewall must have the BEHINDFIREWALL
option set to ON in their workstation (CPU) definition. The impacted connections are the native (netman)
connections. HTTP/HTTPS connections are also impacted if submit job and submit job stream actions
must be allowed from the agent. Event-driven functionality and dynamic scheduling are not taken into
consideration in this scenario.
Table 2. Connections that must be allowed through a firewall between the MDM and the rest of the network.

From     To       Destination port         Connection
MDM      agent    netman                   native
agent    MDM      netman                   native
agent    MDM      HTTP/HTTPS               submit job/job stream
MDM      agent    established connection   submit job/job stream (reply)
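As an illustration only, on a Linux firewall using iptables the rows of Table 2 might translate into rules like the following; the IP addresses are made up (10.0.1.10 for the MDM, 10.0.2.20 for the agent), the ports are the defaults used in this document, and your firewall product and policy will differ. The "established connection" rows map to a rule that accepts packets belonging to connections that are already established. The ACL tables in the remaining scenarios can be translated in the same way.

# traffic from the MDM to the agent netman port
iptables -A FORWARD -p tcp -s 10.0.1.10 -d 10.0.2.20 --dport 31111 -j ACCEPT
# traffic from the agent to the MDM: netman and HTTPS (job/job stream submission)
iptables -A FORWARD -p tcp -s 10.0.2.20 -d 10.0.1.10 --dport 31111 -j ACCEPT
iptables -A FORWARD -p tcp -s 10.0.2.20 -d 10.0.1.10 --dport 31116 -j ACCEPT
# replies on already established connections (covers the "(reply)" rows)
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT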

Firewall between the event processor and an agent


The scenario in the next figure shows a firewall separating the event processor and a Tivoli Workload
Scheduler agent.


Figure 6. Firewall between the event processor and a Tivoli Workload Scheduler agent.

In this scenario a firewall separates the agent from the event processor. The impacted connections are:

v EIF connections from the agent to the event processor to update the ongoing flow of events (one-way
connection)
v HTTP/HTTPS connections from the agent to the event processor to download new rule configuration
packages

By default the event processor runs on the master domain manager so, unless it was installed elsewhere,
the table below should be merged with the one in the “Firewall between the master and the rest of the
Tivoli Workload Scheduler network” scenario.
Table 3. Connections that must be allowed through a firewall between the event processor and an agent running event-driven workload automation.

From               To                 Destination port         Connection
agent              event processor    EIF                      event triggering
agent              event processor    HTTP/HTTPS               configuration download
event processor    agent              established connection   configuration download (reply)

Firewall between dynamic workload broker and a dynamic agent


The scenario in the next figure shows a firewall separating the dynamic workload broker from a dynamic
agent.


Figure 7. Firewall between the dynamic workload broker and a dynamic agent.

In this scenario a firewall separates an agent installed with dynamic scheduling capabilities from the
dynamic workload broker component. The impacted connections are:
v The connection opened for job dispatching by the dynamic workload broker component (which we
assume runs on the master) to the JobManager process on the agent.
v The HTTP/HTTPS connection opened from the agent to send workstation updates to the dynamic
workload broker.

You should allow the following connections to make sure that the scheduler works properly:
Table 4. Connections that must be allowed through a firewall between the dynamic workload broker component and a dynamic agent.

From     To       Destination port         Connection
MDM      agent    JobManager               dispatching of dynamic job
agent    MDM      HTTP/HTTPS               workstation update
agent    MDM      established connection   dispatching of dynamic job (reply)
MDM      agent    established connection   workstation update (reply)

Firewall between the Dynamic Workload Console and the Tivoli Workload Scheduler network

The scenario in the next figure shows a firewall separating the Dynamic Workload Console from the
Tivoli Workload Scheduler network.


Figure 8. Firewall separating the Dynamic Workload Console from the rest of the Tivoli Workload Scheduler network.

The Dynamic Workload Console links to a WebSphere Application Server connector to perform object and
event-rule modeling and actions on plan objects. It also requires a direct connection to the database server
to run reporting actions. If you use dynamic scheduling, it must be connected by HTTP/HTTPS to the
dynamic workload broker component. The impacted connections are:
v Java RMI connection to WAS.
v Connection to the database.
v HTTP/HTTPS connection to dynamic workload broker.

You should allow the following connections to make sure that the Dynamic Workload Console works
properly:

Table 5. Connections that must be allowed through a firewall between the Dynamic Workload Console and a master domain manager.

From                        To                          Destination port         Connection
Dynamic Workload Console    MDM (or other)              boot and HTTP/HTTPS      RMI and HTTP/HTTPS
Dynamic Workload Console    MDM (or other)              database                 database - reporting
MDM (or other)              Dynamic Workload Console    established connection   RMI and HTTP/HTTPS (reply)
MDM (or other)              Dynamic Workload Console    established connection   database - reporting (reply)

Firewall between the browser and the Dynamic Workload Console



The scenario in the next figure shows a firewall separating a browser from the Dynamic Workload
Console.


Figure 9. Firewall separating the browser and the Dynamic Workload Console.

The browser connects to the Dynamic Workload Console at one of the following URLs:
v https://TDWC_hostname:AdminSecurePort/ibm/console
v http://TDWC_hostname:AdminPort/ibm/console

You should allow the following connections to make sure that the Dynamic Workload Console works
properly:

Table 6. Connections that must be allowed through a firewall between a browser and the Dynamic Workload Console.

From                        To                          Destination port         Connection
browser                     Dynamic Workload Console    Admin/AdminSecure        HTTP/HTTPS
Dynamic Workload Console    browser                     established connection   HTTP/HTTPS (reply)

Firewall between the command line client and the master domain
manager
The scenario in the next figure shows a firewall separating a command line client (remote CLI) from the
rest of a Tivoli Workload Scheduler network.


Figure 10. Firewall between a command line client and the master domain manager.

The command line client can connect only to the master domain manager or to the backup master
because it requires access to the database. The impacted connection is HTTP/HTTPS.

You should allow the following connections to ensure that the command line client works properly:

Table 7. Connections that must be allowed through a firewall between the command line client and a master domain manager (or backup master).

From            To              Destination port         Connection
Remote CLI      MDM (or BKM)    HTTP/HTTPS               HTTP/HTTPS
MDM (or BKM)    Remote CLI      established connection   HTTP/HTTPS (reply)

Firewall between the Job Brokering Definition Console and the dynamic workload broker

The scenario in the next figure shows a firewall separating a Job Brokering Definition Console (JBDC) and
the dynamic workload broker component.


Figure 11. Firewall between a Job Brokering Definition Console and the dynamic workload broker.

The Job Brokering Definition Console is a graphical user interface used to model dynamic job definitions
(JSDL). It requires an HTTP/HTTPS connection to the database through the WAS instance where the
dynamic workload broker component is running.

You should allow the following connections to ensure that the JBDC can be used properly:

Table 8. Connections that must be allowed through a firewall between the Job Brokering Definition Console and the dynamic workload broker.

From                            To                              Destination port         Connection
JBDC                            MDM (or BKM, domain manager)    HTTP/HTTPS               HTTP/HTTPS
MDM (or BKM, domain manager)    JBDC                            established connection   HTTP/HTTPS (reply)

Firewall between the Job Scheduling Console and the Tivoli Workload
Scheduler network
The scenario in the next figure shows a firewall separating the Job Scheduling Console from the Tivoli
Workload Scheduler network.

Figure 12. Firewall separating the Job Scheduling Console from the rest of the Tivoli Workload Scheduler network.

In this scenario the Job Scheduling Console is linked to a Tivoli Workload Scheduler instance (a master
domain manager, a backup master, or a fault-tolerant agent) attached to a connector. If the instance is a
fault-tolerant agent, only browsing and actions on plan objects are allowed. The impacted connections are
Java Remote Method Invocation connections to both the connector and the target Tivoli Workload
Scheduler instance.

You should allow the following connections to make sure that the scheduler works properly:
Table 9. Connections that must be allowed through a firewall between the Job Scheduling Console and a Tivoli Workload Scheduler instance.

From                      To                        Destination port         Connection
Job Scheduling Console    agent (or other)          boot                     RMI
Job Scheduling Console    agent (or other)          csiServerAuth            RMI - authentication
Job Scheduling Console    agent (or other)          orb                      RMI - authentication
agent (or other)          Job Scheduling Console    established connection   RMI (reply)

Configuring Tivoli Workload Scheduler in a firewalled network


© Copyright IBM Corporation 2011.
US Government Users Restricted Rights – Use, duplication or disclosure restricted by GSA ADP Schedule Contract
with IBM Corp.
