T24 Data Migration Tool - DMIG T24 - Reference Guide V1.3
Reference Guide
Temenos T24™
2 Aug 2012
About this Document
This document explains the functionality of a non-core T24™ Data Migration Tool that is used to load data into T24™ from any legacy system.
Table of Contents
About this Document
1. Introduction
2. Product Overview
3. Tables
3.1 DM.INFORMATION
3.2 DM.DATA.TRANSLATION
3.3 DM.PREREQUISITE
3.4 DATA.MAPPING.VERIFICATION
3.5 DM.MAPPING.DEFINITION
3.6 DM.SERVICE.CONTROL
3.7 DM.SERVICE.SCHEDULER
3.8 DM.AA.MAPPING
3.9 DM.DATA.COMPARE
3.10 DM.EXTRACTOR
3.11 DM.MAPPING.EXTRACT
4. How does it work
4.1 Menus
4.1.1 Migration Pre-Requisite Check
4.1.1.1 Check Migration Table Record Count
4.1.2 Client System to T24 Migration
4.1.2.1 T24 Data Loading
4.1.2.1 Data Migration Load Preparation
4.1.2.2 Data Mapping Definition
4.1.2.3 DM Service Control
4.1.2.4 Service Control
4.1.2.5 Application backup
4.1.2.6 AA.MAPPING
4.1.2.2 T24 Recon Data Extract
4.1.3.1 DM.PREREQUISITE
4.1.3.2 T24 Data Extract for Reconciliation
4.1.3 T24 to T24 Migration
5.1 DM Tool Additional Features
5.1.1 DM.REF.TABLE
5.1.2 DM.AA.CUSTOMER.PROXY
1. Introduction
The T24 Data Migration tool was originally written using the EB.PHANTOM option. Phantom processing in T24 is gradually moving from EB.PHANTOM to TSA.SERVICE, and the Data Migration tool has accordingly been modified to run as a Service. The original functionality, namely multi-threaded transaction takeover and takeover performance reporting, has been preserved in the latest version.
2. Product Overview
The T24™ Data Migration tool is used during the pre-Live phase of any site implementation. It covers the following functionality:
Uses the standard OFS module to perform the load. The option is given to use either OFS or the standard jBase WRITE function.
Validates the data prior to update.
Accepts data in any layout and format, including double-byte characters (provided the jBase used for the implementation is 4.1 or higher).
Requires little manual intervention during mapping of the incoming file to T24™ applications, which is a one-time setup.
Supports loading of site-specific local tables.
Performs all standard T24™ validations and can also handle special cases requiring local validation of the data being loaded.
Supports scaling, using the multi-threading capabilities of TSA.SERVICE; throughput, however, depends largely on the hardware capabilities.
Provides Stop/Resume options in case the server, T24™ or database connectivity is lost in the middle of the load; in this case, the operation resumes from where it was paused. Standard TSA.SERVICE transaction management capabilities are available.
Reports any erroneous data present in the data file through exception handling; the report includes a detailed description of each error raised during the load.
Supports sequential load of data in multi-thread, for example in the Drawings table, where data must be loaded in ID sequence.
Supports Read Replacement for multi-company.
3. Tables
The T24™ Data Migration tool package consists of the following tables:
3.1 DM.INFORMATION
Counting the number of records for every application, before and after migration, is a vital part of data migration. The information on the number of counts per application is available in a live file, DATA.MIG.COUNT. This application keeps track of the changes in the load count. The entire operation can be successfully completed using the Browser.
1.1 CLIENT.NAME: The name of the client for which the migration is to be done.
3.2 DM.DATA.TRANSLATION
The data provided by the client will contain values in the legacy data format, which cannot be loaded into T24 without translation. A valid translation must therefore be defined so that the data is converted into T24 format before it is loaded. The application DM.DATA.TRANSLATION is used for converting legacy data into T24 format.
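As an illustration only, the translation step can be sketched outside T24 as a simple lookup. The actual mechanism is a T24 application driven by DM.DATA.TRANSLATION records; the table contents below are invented examples, not real translation data.

```python
# Illustrative sketch (not the actual T24 routine): translate legacy codes
# to T24 values via a lookup table, as DM.DATA.TRANSLATION is described
# as doing. The legacy codes and T24 values below are hypothetical.
LEGACY_TO_T24 = {
    "M": "MALE",    # hypothetical legacy gender code -> T24 value
    "F": "FEMALE",
}

def translate(value, table, default=None):
    """Return the T24 equivalent of a legacy value.

    If no translation exists, fall back to the supplied default, or to
    the original value when no default is given.
    """
    if value in table:
        return table[value]
    return default if default is not None else value
```

A missing translation here simply passes the value through; in a real migration such a value would typically be rejected by T24 validation during the load.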
3.3 DM.PREREQUISITE
This table is used to create folders, copy data files to the respective folders and create the DM.SERVICE.CONTROL (DSC) records that are necessary for the load of data to T24. Using the application DM.PREREQUISITE we can create the necessary folders and move the data files to their respective paths. We can also create folders for placing reconciliation extracts from T24.
The ID for this application must be a 3 digit number followed by an asterisk (*), followed by a valid user name that must be present in the USER table. (The format is mandatory.)
1 DESCRIPTION: Free-text description.
2 PROCESS: The task performed when the application is verified: creation of DSCs only, creation of directories only, or both.
3 DMD.PREFIX: The prefix used to identify the DM.MAPPING.DEFINITION records for creation of the respective DM.SERVICE.CONTROL.
4 DIRECTORY.NAME: The directory where the data files are stored; the other folders used as Path Name in DM.SERVICE.CONTROL are created under it. The value DATA.IN is defaulted and can be changed to any other directory as needed.
5 SELECT.LIST: An optional field where a valid &SAVEDLISTS& name can be given.
6.1 DMD.NAME: An associated multivalue set holding the DMD name for which the DM.SERVICE.CONTROL will be created, followed by a dot (.) and a two-digit sequence number. (The format is mandatory.) The folders created for the data file are based on the input to this field.
7.1 FILE.NAME: An associated multivalue set. The data file used for the load is selected automatically based on the input in the previous field. The file must be present in the directory specified in the field Directory Name and must be named as follows:
<dmd.name>.<2 digit number>.<yyyymmddhhmm>.TXT
E.g.: CUSTOM.01.201203031653.txt
12 RECONCILIATION: Specifies whether reconciliation folders are to be created. The value "No" is defaulted.
13 RECON.DIR: The directory where the reconciliation folders will be created. By default RECON.OUT is populated in this field.
14.1 RECON.SPEC.NAME: A multivalue field used to provide the names of the folders to be created under the directory specified in the field Recon Dir.
15.1 DM.EXTRACTOR.NAME: A multivalue field containing a DM Extractor id. The value entered must exist in the DM.EXTRACTOR table.
16 FILE.CONV.REQD: File conversion is done if this field is set to YES.
17 FILE.LOAD.TYPE: Specifies whether the file load is from the server or the local host.
18 CONVERSION.TYPE: The conversion type, NT or Unix.
3.4 DATA.MAPPING.VERIFICATION
Verifying the correctness of the Data Mapping Table and the DM.MAPPING.DEFINITION is a vital part of data migration; without it, the data load could go wrong. DATA.MAPPING.VERIFICATION verifies whether the Data Mapping Table and the DM.MAPPING.DEFINITION are synchronised.
This application also produces an output file in the specified DMV directory with the status of the verification.
3.5 DM.MAPPING.DEFINITION
This application allows the user to define the format of the data in the incoming tape.
The following provides the list of fields, positions and the associated descriptions.
4. DM.OFS.SRC.VAL.ORD, with FIELD.VAL set to 'YES', is provided by default for cases where field validation processing must be triggered, e.g. AC.ACCOUNT.LINK.
3.6 DM.SERVICE.CONTROL
This application is used:
To define the company for which the data is loaded
To define the location of the incoming data
To control the execution of the data load process
The following are the fields and the associated descriptions.
3.7 DM.SERVICE.SCHEDULER
The services for validating, loading and authorising data in T24 need to be scheduled and automated. Most of these tasks are performed manually, and this manual effort needs to be eliminated. The multi-threaded service performs the tasks in an automated way, eliminating the manual work to a large extent.
3.8 DM.AA.MAPPING
This is the table where the parameters are provided.
The ID for this application must be in the format:
3.9 DM.DATA.COMPARE
The reconciliation extract is taken from T24 after completion of the load as per the process, and the reconciliation extract from the legacy system is also extracted and provided in the DATA.IN directory.
This table is used to compare both extracts and to provide the output of the comparison.
6 SOURCE.FILE.PATH: The directory name where the extract provided by the bank is located.
7 SOURCE.FILE.NAME: The file name of the extract provided by the bank.
8 TARGET.FILE.PATH: The directory name where the Recon Extract from T24 is located.
9 TARGET.FILE.NAME: The file name of the Recon Extract from T24.
10 OUTPUT.FILE.PATH: The output file path where the reconciliation extract is available after the comparison.
11 OUTPUT.FILE.NAME: The output file name of the reconciliation extract after the comparison.
15.1 SOURCE.FIELD: This field specifies the first field of the source data file.
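The comparison that DM.DATA.COMPARE is described as performing can be sketched at its simplest as a two-way set difference between the source (bank/legacy) and target (T24) extracts. This is an illustrative Python stand-in; the real tool works field by field inside T24, and all names here are invented.

```python
# Minimal sketch of a source-vs-target extract comparison: report records
# present on one side but missing on the other. Each extract is modelled
# as a list of record strings (one line per record).
def compare_extracts(source_lines, target_lines):
    """Return records missing from either side of the comparison."""
    source, target = set(source_lines), set(target_lines)
    return {
        "missing_in_target": sorted(source - target),  # in legacy, not in T24
        "missing_in_source": sorted(target - source),  # in T24, not in legacy
    }
```

In the tool the result would be written to the file named by OUTPUT.FILE.PATH / OUTPUT.FILE.NAME rather than returned in memory.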
3.10 DM.EXTRACTOR
Extraction of data from T24 is necessary when migrating from an older release of T24 to a newer release, or for reconciliation extracts.
The application DM.EXTRACTOR is used for extracting data from T24.
3.11 DM.MAPPING.EXTRACT
4. How does it work
The DM tool has three menus, namely Migration Pre-Requisite Check, Client System to T24 Migration and T24 to T24 Migration, to facilitate efficient working with the tool.
4.1 Menus
System Dates
Table
DATES
Purpose
The DATES application is checked for the current system Date before start of Data Migration.
Table
DM.INFORMATION
Purpose
The application is used to summarise the number of records loaded. The information on the number of counts per application is available in a live file, DATA.MIG.COUNT. This application keeps track of the changes in the load count.
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
The ID for this application must be only SYSTEM.
After selecting the correct ID you will get a screen as below. Input the application name and commit it.
DM.INFORMATION – Verify
1. The count for each application is done and updated in the fields Live Count, Nau Count and His Count.
After committing the record we need to verify it. We select the record name to be verified and click the button as shown.
The latest count for the application will appear in the first multivalue set.
Table
DM.PREREQUISITE
Purpose
Creation of folders, copying of data files to the respective folders and creation of the DM.SERVICE.CONTROL (DSC) records are necessary for the load of data to T24. DM.PREREQUISITE is the application by which we can create the necessary folders and move the data files to their respective paths. We can also create folders for reconciliation.
The DSC is an integral part of the data load. This application also creates the specified DSCs with appropriate field values through Open Financial Service (OFS).
OFS.SOURCE
The following OFS.SOURCE record, DSC.CREATE, must be available in the deployed area.
DM.PREREQUISITE – Parameters
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
The ID for this application must be in the format:
A 3 digit number followed by an asterisk (*), followed by a valid user name that must be
present in the USER table. (The format is mandatory).
DM.PREREQUISITE – Verify
After doing the initial setup of parameters, DM.PREREQUISITE is verified. On verification, four processes are triggered:
Creation of folders VERIFY01, VERIFY02, VERIFY03, LOAD01, AUTH01, DMV under the
specified folders.
Copying of the data file to the above folders in their respective paths.
Creation of DSC records via OFS.
Creation of Reconciliation folders.
If DMD records with function 'R' or 'D' are not available, the DSC will not be created; this is captured as an override. For other DMDs, with function 'V', 'I' or 'A', a missing record is treated as an error.
After committing the record we need to verify it. We type the record name to be verified and click the button as shown.
The folders for Validate, Input and Authorise, as well as the reconciliation folders, are created as shown below, and the data file is copied to the respective paths.
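The folder-creation and file-copy processing described above can be sketched outside T24 as follows. This is an illustrative Python stand-in, not the jBase service itself; the folder names come from the guide, while the path handling and the unconditional copy are assumptions.

```python
import shutil
from pathlib import Path

# Folders the guide says are created on verification of DM.PREREQUISITE.
FOLDERS = ["VERIFY01", "VERIFY02", "VERIFY03", "LOAD01", "AUTH01", "DMV"]

def prepare_folders(base_dir, data_file):
    """Create the verification/load/auth folders and copy the data file in.

    base_dir plays the role of the DIRECTORY.NAME directory (e.g. DATA.IN);
    returns the list of folder paths that were created.
    """
    base = Path(base_dir)
    for name in FOLDERS:
        target = base / name
        target.mkdir(parents=True, exist_ok=True)      # create folder if absent
        shutil.copy(data_file, target / Path(data_file).name)
    return [str(base / name) for name in FOLDERS]
```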
Table
DATA.MAPPING.VERIFICATION
Purpose
The menu is used to do the initial setup for the verification of the Data Mapping Sheet and the DM.MAPPING.DEFINITION.
Detailed Working Procedure
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
The ID for this application can be up to 65 characters.
After committing the record we need to verify it. We type the record name to be verified and click the button as shown.
The compared DMV fields and the DMD fields are populated in the file as below.
If only the data file name is provided, the values in the data file are validated and the PASS/FAIL status is updated based on the validation status of each record. The validation status of each record is stored in a separate file.
The record values in the data file can be validated according to the verification type provided by the user. The verification types are:
FIRST
POSITION
RANDOM
1. FIRST
If the data file name is given and the verification type is "FIRST", only the first string in the data file is validated, and the result for that particular record is stored in a separate file in the DMV folder. A sample record is provided below for reference.
On verifying the DMV, the PASS/FAIL status is updated and the result is stored in the DMV folder. Only the file for the data file is created, as field verification is not provided in this case.
The status of the validated data file is populated in the file as below.
2. POSITION
If the data file name is given and the verification type is "POSITION", it is mandatory to provide values for the fields "FIRST.POSITION" and "LAST.POSITION". The strings present between the two position values in the data file are validated, and the result for those records is stored in a separate file in the DMV folder.
On verifying the DMV, the PASS/FAIL status is updated and the result is stored in the DMV folder. Only the file for the data file is created, as field verification is not provided in this case.
3. RANDOM
If the data file name is given and the verification type is "RANDOM", a string is picked at random from the data file and validated. The result for that particular record is stored in a separate file in the DMV folder. A sample record is provided below for reference.
On verifying the DMV, the PASS/FAIL status is updated and the result is stored in the DMV folder. Only the file for the data file is created, as field verification is not provided in this case.
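The record-selection behaviour of the three verification types can be sketched as follows. This is an illustrative Python reading of the descriptions above, not the tool's code; the 1-based, inclusive interpretation of FIRST.POSITION/LAST.POSITION is an assumption, and the validation step itself is out of scope here.

```python
import random

def select_records(records, vtype, first_pos=None, last_pos=None, rng=random):
    """Pick which records of the data file a DMV verification would validate.

    FIRST    -> only the first string in the file
    POSITION -> the strings between FIRST.POSITION and LAST.POSITION
                (assumed 1-based and inclusive)
    RANDOM   -> one string picked at random
    """
    if vtype == "FIRST":
        return records[:1]
    if vtype == "POSITION":
        if first_pos is None or last_pos is None:
            # The guide says both fields are mandatory for POSITION.
            raise ValueError("FIRST.POSITION and LAST.POSITION are mandatory")
        return records[first_pos - 1:last_pos]
    if vtype == "RANDOM":
        return [rng.choice(records)]
    raise ValueError("unknown verification type: " + vtype)
```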
Table
DM.MAPPING.DEFINITION
Purpose
This is the base mapping definition table, which maps the incoming data with T24™ fields.
The following items are defined here.
The T24™ table to be loaded
The type of load to be done (example: OFS or flat jBase write)
The type of action to be performed (example: only validate the data, or actually update it)
The function to be used (allowed functions are Input, Authorise, Delete and Reverse); if not mentioned, the Input function is used by default
The way to identify each field in a transaction/record (example: by delimiter or by position)
Associated multivalue sets to map each T24™ field with the data, by giving the position or delimiter that identifies the data in the source file
Any post-update routine to be called for every transaction/record being loaded
done through a flat write rather than going through the OFS route.
2. This parameter is governed by the field "LOAD.TYPE". This is a mandatory field, and the possible values are OFS.LOAD, WRITE and READ.REPLACEMENT. When OFS.LOAD is selected, data loading is done through the OFS route, whereas selecting WRITE enables direct writes to the tables. The READ.REPLACEMENT option updates certain fields of an already loaded record through a direct write of only those fields.
4. File Type – In the case of a flat write, the data can be written directly to the $NAU, $HIS or LIVE files. This is determined by the value in the field "FILE.TYPE". The possible values in this field are NULL, $NAU and $HIS. If NULL is selected, the write is to the LIVE file; otherwise the direct write is done to the respective file ($NAU or $HIS).
5. Defining Incoming Data – Generally, the incoming data received from the client in the data file comprises several records in the form of text strings. For loading purposes these records have to be segregated, and this segregation can be done using different methods: (i) use of delimiters for identifying the records and the fields within a record; (ii) use of the position of the different fields, e.g. field 7 in a record can be of length 6 and start at position 25. In this way a field can be defined in a record. In the positional case it is not possible to define multi-value and sub-value fields, since the mapping positions are fixed and the variable field lengths of multi-value and sub-value fields cannot be supported.
This option is handled through the mandatory field "IN.DATA.DEF", which has two options, DELIM and POSITION. The option DELIM is used for defining the delimiters to be used, whereas POSITION is used for defining the positions of the fields.
5. Delimiter Definitions – It is possible to parameterise the different delimiters used for identifying records, multi-values and sub-values. These definitions can be provided in the fields FM.DELIM, VM.DELIM and SM.DELIM, each of which represents one type of delimitation.
6. ID Generation:
In T24™, for some applications the ID has to be system generated, whereas for certain other applications the ID has to be provided manually. Accordingly, three options are provided in the mandatory field "ID.TYPE": AUTO, DATA and ROUTINE.
When AUTO is selected, the ID is generated automatically by the system.
For DATA, the record ID has to be part of the incoming data.
Where an application requires a routine for generating the IDs, the ROUTINE option can be used. In this case a single argument is passed to, and returned from, the routine used for forming the record ID. The name of the ID routine can be defined in the field ID.ROUTINE.
The ID position has to be defined where the data is segregated by fixed positions; the field ID.POSITION becomes mandatory if the field IN.DATA.DEF has the value POSITION.
7. Escape Sequence:
There is a provision to change certain characters in the data from one kind to another. For example, if a "," has to be changed to ":", this can be done using the associated multi-value fields "ESC.SEQ.FR" and "ESC.SEQ.TO". The value defined in the field "ESC.SEQ.FR" is changed to the value in the field "ESC.SEQ.TO".
8. Mapping:
These are a set of associated multi-value fields, from "APPL.FIELD.NAME" to "FIELD.VALUE", used for mapping in either OFS or flat-write mode. The set comprises APPL.FIELD.NAME, FIELD.POSITION, FIELD.LENGTH, FIELD.ATTRIB and FIELD.VALUE.
9. The field "APPL.FIELD.NAME" defines the name of the field in the given T24™ application (either core or local). It can take up to 65 characters. The field has to be defined in the STANDARD.SELECTION record of the application; either the field name or the field number can be input.
It is possible to provide the company for which the record is loaded. To do this, the field name must be set to LOAD.COMPANY and the company code must be provided in the associated field position.
For LD.LOANS.AND.DEPOSITS, an option to provide both the LD.LOANS.AND.DEPOSITS record and the LD.SCHEDULE.DEFINE record is available. The values must be separated using a message separator, defined through 'MESSAGE.SEPARATOR'.
10. The field "FIELD.POSITION" defines the position of the field in the record given by the client. For example, if a local reference field in the customer record is defined as field number 40.2 (MERCHANT.NUMBER) and is provided at the 25th position in the data given by the client, then the field "APPL.FIELD.NAME" should have the value MERCHANT.NUMBER and the field "FIELD.POSITION" the value 25. This is used when either "DELIM" or "POSITION" is chosen in the field IN.DATA.DEF.
11. The field "FIELD.LENGTH" is mandatory when the field "IN.DATA.DEF" has the value "POSITION"; in other cases it is a no-input field.
12. The field FIELD.ATTRIB has two options, CONSTANT and ROUTINE. When "CONSTANT" is selected, the value in the field "FIELD.VALUE" is mapped to the application field as a constant value. When "ROUTINE" is selected, the name of the routine should be defined in the field FIELD.VALUE, preceded by '@' and defined in PGM.FILE as type "S".
14. The field POST.UPDATE.ROUTINE is used to perform post-update processing for a record. In applications like LD, which are single-authoriser, this routine is used to perform the authorisation of an unauthorised record. If any routine is attached to the field POST.UPDATE.RTN, it can be removed even after the record is authorised.
15. The field OFS.SOURCE defines the OFS.SOURCE record to be used when loading the data into T24 via OFS. Four OFS.SOURCE records are provided: DM.OFS.SRC, DM.OFS.SRC.VAL, DM.OFS.SRC.ORD and DM.OFS.SRC.VAL.ORD. Where field validation needs to be triggered for the incoming OFS message, the DM.OFS.SRC.VAL option should be used; where OFS.REQUEST.DETAIL has to be updated as well, the DM.OFS.SRC.VAL.ORD option should be used.
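The parsing steps described in items 5-8 above (escape-sequence substitution, then DELIM- or POSITION-based field segregation) can be consolidated into one sketch. This is illustrative Python, not the jBase routine; the sample delimiters, field positions and record contents are invented, and multi-value/sub-value handling is omitted for brevity.

```python
def apply_escape_sequences(record, pairs):
    """Apply ESC.SEQ.FR -> ESC.SEQ.TO substitutions to a raw record."""
    for src, dst in pairs:
        record = record.replace(src, dst)
    return record

def parse_delim(record, fm_delim="|"):
    """DELIM style: split the record into fields on the field delimiter."""
    return record.split(fm_delim)

def parse_position(record, fields):
    """POSITION style: cut fixed-length fields.

    fields is a list of (start, length) pairs; start is assumed 1-based,
    matching how positions are quoted in the guide (e.g. "position 25").
    """
    return [record[start - 1:start - 1 + length].strip()
            for start, length in fields]
```

In DELIM mode the field order in the record corresponds to FIELD.POSITION values; in POSITION mode the (start, length) pairs correspond to FIELD.POSITION and FIELD.LENGTH.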
DMD Record:
Purpose
This work-file application is used to control the actual load process. The following details must be provided when the DM.SERVICE.CONTROL is set up.
Load Company – The company for which the data is loaded. If this information is not provided in the incoming tape, it must be provided here.
Data file directory and file name – A valid directory name and file name.
Number of Sessions – The number of sessions with which the TSA.SERVICE must be run.
User – The user name used for the load process. This user name is updated in the TSA.SERVICE record related to the DM.SERVICE.CONTROL. Initially DMUSER was hardcoded as the USER.
Error Details, No Of Error and Type Of Error – The details of the errors generated during validation/processing are updated in these fields.
Date, Started, Stopped and Elapsed – The current system date, the start time, the end time and the elapsed time of the validation/processing activity are updated in these fields.
After setting up the above-mentioned fields, the record must be committed. This must be done in INPUT mode.
When running for the first time, the system checks whether the BATCH, TSA.SERVICE and TSA.WORKLOAD.PROFILE records are present; this is checked in the file DM.SERVICE.CONTROL.CONCAT. If they do not exist, they are created by the system. The following components are created.
Batch
A batch record with the id DM.SERVICE-<<DM.MAPPING.DEFINITION name>> is created. This contains three jobs, namely
DM.SERVICE – A multi-threaded job which reads the DM.SERVICE.DATA.FILE and updates the job list. Each record is then processed using OFS.GLOBUS.MANAGER, and the error output is logged to <<DM.MAPPING.DEFINITION name>>.LOG.<<Session Number>>
TSA.WORKLOAD.PROFILE record
TSA.SERVICE record
As can be seen, the user in the TSA.SERVICE record is updated from the
DM.SERVICE.CONTROL.
Data File:
Output:
To stop the DM.SERVICE, the DM.SERVICE.CONTROL record must be opened in Input mode and the RUN.STATUS set to STOP.
The TSA.SERVICE is then immediately marked for STOP, and the agents stop after processing their current records.
spawn more or fewer agents as required (in Phantom mode) or request the running of more or fewer agents (in Debug mode).
At present this option is available only in Classic.
Table
TSA.SERVICE
Purpose
The menu is used to start or stop the TSA.SERVICE record TSM, or any other service that needs to be started or stopped manually.
Service Status
Enquiry
ENQUIRY DM.ENQ.TSA.STATUS
Purpose
This enquiry provides information for all Data Migration services as well as the TSM status.
Once the TSA.SERVICE has been marked as START, if the TSM is running in phantom mode the tSA (agent) is started automatically. If it is running in debug mode, the agents have to be initiated manually to start the actual process.
We need to refresh the enquiry regularly to know the status of the services. For this we enable auto-refresh as shown below.
The result will be as below and the query will refresh every 5 seconds.
Table
DM.APP.BACKUP
Purpose
Taking a backup of the application is a necessary step in the migration process. The following steps explain the usage of the application DM.APP.BACKUP, by which we can take a backup of any application.
DM.APP.BACKUP - parameters
This is the table where the parameters are provided. It can be accessed from the MENU line
of the Browser.
Backup Folder: The folder to which the APPLICATION records will be moved is defaulted here.
No Of Agents: The number of agents required to run the service that clears and backs up the APPLICATION.
Local Ref: Currently non-inputtable; reserved for future developments.
Override: No overrides are generated by this application.
The folder for taking the application backup will be created under DM.BACKUP.
2. Start the TSM by inputting and authorising the TSA.SERVICE record TSM to "Start".
3. Initiate the TSM in classic mode as below.
4. The records from the APPLICATION will be moved to the backup folder.
Before:
After:
Count the number of records in the backup folder to verify that all the records have been moved.
4.1.2.6 AA.MAPPING
Table
DM.AA.MAPPING
Purpose
This application is used to extract the AA products with their respective Properties and
Property class.
Input a record
The output will be stored in the AA.DMS.OUT directory, under the particular product line name.
Service Control
Table
TSA.SERVICE
Purpose
To start and stop the services during extraction process.
Service Status
Table
DM.PREREQUISITE
Purpose
To monitor the services status during extraction process.
4.1.3.1 DM.PREREQUISITE
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
The ID is a 3-digit number followed by an asterisk (*), followed by a valid user name that must be
present in the USER table. (The format is mandatory.)
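As a hedged illustration, the mandatory ID format above can be checked with a simple pattern. The function name and the set standing in for the USER table are assumptions for the sketch, not part of the tool:

```python
import re

# Hypothetical checker for the DM.PREREQUISITE ID format described above:
# a 3-digit number, an asterisk, then a user name. The user name must also
# exist in the USER table, represented here by a plain set for illustration.
ID_PATTERN = re.compile(r"^\d{3}\*[A-Z0-9.]+$")

def is_valid_prerequisite_id(record_id: str, user_table: set) -> bool:
    """Return True if the ID matches NNN*USERNAME and the user exists."""
    if not ID_PATTERN.match(record_id):
        return False
    user_name = record_id.split("*", 1)[1]
    return user_name in user_table

users = {"INPUTTER", "MIGUSER"}          # stand-in for the USER table
print(is_valid_prerequisite_id("001*MIGUSER", users))   # True
print(is_valid_prerequisite_id("1*MIGUSER", users))     # False
```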
Dmd Name: This is an associated multivalue set which will hold the DMD name for
which the DM.SERVICE.CONTROL will be created, followed by a dot (.) and a two-digit
sequence number. (The format is mandatory.) The folders that need to be
created for the data file will be based on the input to this field.
File Name: The data file that will be used for the load will be selected automatically
based on the input in the previous field. The data file must be present in the directory
that has been specified in the field Directory Name. The format of the data file must
be as follows.
Recon Dir: The Directory name must be specified here where the reconciliation
folders will be created. By default RECON.OUT is populated in this field.
Recon Spec Name: This is a multivalue field that is used to provide the Names of the
folders that need to be created under the Directory specified in the field Recon Dir.
Local Ref: Currently not inputtable; can be used for future developments.
DM.PREREQUISITE – verify
This is the table where the parameters are provided. It can be accessed from the Command
line of the Browser.
There are no specifications for the ID of this application.
Sel Comb: The Selection Combination must be specified here. This is an associated
multivalue field.
Select List: The save list name must be entered in this field.
Target File: The output filename for the Extraction.
Output Dir: The output directory where the extracted file will reside.
FM Delim: The Field Marker that will be used in the extract to separate the fields.
VM Delim: The Value Marker that will be used in the extract to separate the Multi
values.
SM Delim: The Sub value Marker that will be used in the extract to separate the Sub
values.
Fld Label: The field description for the field to be extracted.
Fld Type: The extract type for the field is specified here.
Fld Name: The field name from the application which is to be extracted.
Fld Rtn: A subroutine can be attached to this field that will be used for different
extraction logic.
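A minimal sketch of how the FM, VM and SM delimiters described above combine into one extracted record. This is illustrative only; the function name and the nested-list representation are assumptions, not the extractor's internals:

```python
# Illustrative sketch of how an extract line is assembled from the
# FM/VM/SM delimiters described above. Each field is a list of
# multivalues, and each multivalue is a list of subvalues.
def build_extract_line(fields, fm="|", vm="::", sm="!!"):
    out_fields = []
    for field in fields:                 # join subvalues with the SM delimiter
        out_values = [sm.join(value) for value in field]
        out_fields.append(vm.join(out_values))   # then multivalues with VM
    return fm.join(out_fields)                   # then fields with FM

record = [[["100069"]], [["JOHN"], ["SMITH"]], [["GB", "LONDON"]]]
print(build_extract_line(record))   # 100069|JOHN::SMITH|GB!!LONDON
```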
DM.EXTRACTOR – authorize
TSA.WORKLOAD.PROFILE
TSA.SERVICE
BATCH
The output file specified in the application is generated in the output directory.
After authorising the record, the service has to be started as per the below screen shot.
Once the service is complete, the OFS will be listed as shown below.
OUTPUT:
TSA.SERVICE record:
On authorisation of the DM.EXTRACTOR record a record in TSA.SERVICE will be created.
Sample Output :
TSA.SERVICE record:
On authorisation of the DM.EXTRACTOR record a record in TSA.SERVICE will be created.
Sample Output :
Append:
Append is the condition which is used to add a value in the first position or in the last position of the
field value. A value can also be added at a specified position within the field value.
The field values are provided as given in the below screen shot to extract from more than one
application using the append condition.
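The append condition can be sketched as follows. This is a hedged illustration; the function name and the exact position semantics (1-based character position) are assumptions based on the description above:

```python
def append_value(field_value: str, extra: str, position: str = "last") -> str:
    """Add `extra` at the first position, the last position, or a
    1-based character position within the field value."""
    if position == "first":
        return extra + field_value
    if position == "last":
        return field_value + extra
    pos = int(position)                  # a position inside the value
    return field_value[:pos] + extra + field_value[pos:]

print(append_value("100069", "GB", "first"))  # GB100069
print(append_value("100069", "GB", "last"))   # 100069GB
print(append_value("100069", "-", "3"))       # 100-069
```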
After running the service, the target file is created in the output directory.
Extracted file
Trim:
Trim is the condition which is used to remove a value from the position which is mentioned within the
field value.
The field values are provided as given in the below screen shot to extract from more than one
application using the trim condition.
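The trim condition can be sketched in the same style. Again a hedged illustration; the function name and the 1-based position-and-length interpretation are assumptions based on the description above:

```python
def trim_value(field_value: str, start: int, length: int = 1) -> str:
    """Remove `length` characters starting at the 1-based position
    `start` within the field value."""
    i = start - 1
    return field_value[:i] + field_value[i + length:]

print(trim_value("100-069", 4))        # 100069
print(trim_value("GB100069", 1, 2))    # 100069
```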
After running the service, the target file is created in the output directory.
Extracted file
Replace:
Replace is the condition which is used to replace one value with another value in the position
mentioned in the field value.
The format is VALUE[POSITION], e.g. VALUE[number1,number2] or VALUE[number].
The field values are provided as given in the below screen shot to extract from more than one
application using the replace condition.
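The replace condition can be sketched as follows. This is a hedged illustration of the VALUE[number1,number2] / VALUE[number] format described above; the exact semantics (check the value at the 1-based range before replacing) are an assumption:

```python
def replace_value(field_value: str, old: str, new: str,
                  start: int, end: int = None) -> str:
    """Replace `old` with `new` at the given 1-based position range,
    mirroring the VALUE[number1,number2] / VALUE[number] format.
    If the value is not found at that position, leave the field unchanged."""
    i = start - 1
    j = end if end is not None else i + len(old)
    if field_value[i:j] == old:
        return field_value[:i] + new + field_value[j:]
    return field_value

print(replace_value("GB100069", "GB", "US", 1, 2))  # US100069
print(replace_value("100069", "0", "9", 2))         # 190069
```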
After running the service, the target file is created in the output directory.
Extracted file
Service Control
Table
TSA.SERVICE
Purpose
The Menu is used to Start or Stop the TSA.SERVICE record TSM or any other service that
needs to be started or stopped manually.
Service Status
Enquiry
ENQUIRY DM.ENQ.TSA.STATUS
Purpose
This enquiry provides the information for all Data Migration services as well as the TSM status.
Once the TSA.SERVICE has been marked as START, if the TSM is running in phantom mode
the TSA (agent) will be started automatically. If it is running in debug mode, the agents
will have to be manually initiated to start the actual process.
We need to refresh the enquiry regularly to know the status of the services. For this we need
to enable auto refresh as shown below.
The result will be as below and the query will refresh every 5 seconds.
T24 to T24 migration has the same menus as in Client System to T24 Migration and the procedure is
also the same as seen above.
5.1.1 DM.REF.TABLE
It is the data migration reference table for AA Migration. It contains three fields: DESCRIPTION,
APPLICATION and FIELD.NAME. If we want to extract any specific data (field value) from any
valid T24 table, we need to specify the application name and the field names for which the values
have to be extracted.
5.1.2 DM.AA.CUSTOMER.PROXY
Description:
The DM.AA.CUSTOMER.PROXY is a concat table containing the arrangement id and proxy
customer id. It is a multivalue set. The concat file id must be the customer id provided in
the AA.ARRANGEMENT.ACTIVITY record. Before we update the concat file, we need to
specify the required fields in DM.REF.TABLE (Ref 4.4.1). Based on the reference table, it
updates the DM.AA.CUSTOMER.PROXY table.
Functional Process:
The arrangement id and customer id are extracted from the AA.ARRANGEMENT.ACTIVITY table
based on certain conditions. First the AA records are selected where the ACTIVITY field has
‘PROXY.SERVICES-NEW-ARRANGEMENT’ and the FIELD.NAME field has ‘PROXY’.
If all the records have the value ‘PROXY’ in the field FIELD.NAME, the corresponding proxy
customer and arrangement id will be updated in the concat file.
5.1.3 T24.AA.INT.ACT.DEFAULT
The subroutine will be attached as an input routine in the specified version with the creation of a PGM.FILE
and EB.API entry. This routine will default the value ‘PROXY.ARRANGEMENT’ in the
FIELD.NAME field and updates the corresponding arrangement id in the FIELD.VALUE field
in the AA.ARRANGEMENT.ACTIVITY table (selected from DM.AA.CUSTOMER.PROXY). It is a
multivalue set.
Prerequisites:
While inputting or loading the data via this version, the specified fields will be defaulted from the
concat table DM.AA.CUSTOMER.PROXY.
5.1.4 DM.AA.CUSTOMER
Description:
The DM.AA.CUSTOMER table is a concat file containing the arrangement id for the
corresponding PROXY CUSTOMER. The arrangement id field is a multivalue set. The concat
file id must be the customer id provided in the AA.ARRANGEMENT.ACTIVITY (Proxy)
record. Before we update the concat file, we need to specify the required fields in
DM.REF.TABLE (Ref 4.4.1). Based on this reference table, it updates the
DM.AA.CUSTOMER table.
Functional Process:
The arrangement id is extracted from the AA.ARRANGEMENT.ACTIVITY table based on
certain conditions. First the AA records are selected where the ACTIVITY field has
‘INTERNET.SERVICES-NEW-ARRANGEMENT’, and the arrangement id is updated in the concat
table.
5.1.5 T24.EB.EXT.ARR.ACT.DEFAULT
The subroutine will be attached as input routine in specified version with creation of PGM.FILE
and EB.API entry. This routine will default the values in the fields ARRANGEMENT from the
concat table DM.AA.CUSTOMER in EB.EXTERNAL.USER table.
Prerequisites
Create the PGM.FILE entry for the T24.EB.EXT.ARR.ACT.DEFAULT.
While inputting or loading the data via this version, the arrangement id will be defaulted from
the concat table DM.AA.CUSTOMER.
This is used to convert the data file of SC.PERF.DETAIL into the required utility format file and
to update the concat file SC.PERF.DETAIL.CONCAT.
Routine Updation
5.1.6.1 DM.SC.FORMAT:
This utility is used to convert the provided data file into RUN.PERFORMANCE.TAKEON
execution format.
Required Input for DM.SC.FORMAT:
Steps to be followed for Program execution.
1. Create a directory PERF.DATA in ...bnk.run path
2. Enter the path and file name
3. In PERF.DATA directory create a file data.txt and save it
4. Run the program DM.SC.FORMAT; it will create a new data file from data.txt.
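The steps above can be sketched as follows. This is a hypothetical stand-in for DM.SC.FORMAT: the directory layout follows steps 1-4, but the actual conversion applied (stripping blank lines and upper-casing) is illustrative only, since the RUN.PERFORMANCE.TAKEON layout is not shown in this document:

```python
import os, tempfile

def dm_sc_format_sketch(run_dir: str) -> str:
    """Hypothetical sketch of the DM.SC.FORMAT flow: read PERF.DATA/data.txt
    and write a converted copy next to it."""
    perf_dir = os.path.join(run_dir, "PERF.DATA")
    os.makedirs(perf_dir, exist_ok=True)          # step 1: create the directory
    src = os.path.join(perf_dir, "data.txt")      # step 3: the input data file
    out = os.path.join(perf_dir, "data.converted.txt")
    with open(src) as f:                          # step 4: convert the file
        lines = [ln.strip().upper() for ln in f if ln.strip()]
    with open(out, "w") as f:
        f.write("\n".join(lines) + "\n")
    return out

# Demo on a throwaway directory: create PERF.DATA/data.txt, then convert it.
run = tempfile.mkdtemp()
os.makedirs(os.path.join(run, "PERF.DATA"))
with open(os.path.join(run, "PERF.DATA", "data.txt"), "w") as f:
    f.write("perf line one\n\nperf line two\n")
print(open(dm_sc_format_sketch(run)).read())
```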
Expected output
Before processing, the DM.SC.FORMAT data file is in the following format:
5.1.6.2 DM.SC.PERF.CONCAT.UPDATE
The routine will update the concat file SC.PERF.DETAIL.CONCAT based on the loaded
o Field Position.1: This field specifies the position of the field and applicable only for
environment compare.
o Source Field.1: This field specifies the first field of source data file.
Data File:
In the below source file, the first two string ids end with 004,005, but in the target file the first
two string ids end with 005,004.
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
Output:
The source-new and target-new records are created in separate files.
Mismatch records
In the below source file, the first two string ids end with 004,005, but in the target file the first
two string ids end with 005,004.
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
The source-new and target-new records are created in separate files.
Mismatch records
In the below source file, the first two string ids end with 004,005, but in the target file the first
string is in the second position.
SOURCE.txt
TARGET.txt
DM.DATA.COMPARE record
Record ID: TEST
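The comparison behaviour in these examples can be sketched as follows. This is illustrative only; the function name, the id lists standing in for SOURCE.txt/TARGET.txt, and the exact classification rules are assumptions based on the examples above:

```python
def compare_ids(source_ids, target_ids):
    """Split two id lists into source-only ('source new'), target-only
    ('target new') and position mismatches, loosely following the
    DM.DATA.COMPARE behaviour described above."""
    source_only = [i for i in source_ids if i not in target_ids]
    target_only = [i for i in target_ids if i not in source_ids]
    mismatched = [i for i in source_ids
                  if i in target_ids
                  and source_ids.index(i) != target_ids.index(i)]
    return source_only, target_only, mismatched

# The first two ids are swapped between source and target, as in the example.
src = ["STRING.004", "STRING.005", "STRING.006"]
tgt = ["STRING.005", "STRING.004", "STRING.006"]
print(compare_ids(src, tgt))   # ([], [], ['STRING.004', 'STRING.005'])
```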
After updating the error log file, the source and target concat files are cleared by the routine.
5.1.8 DM Scheduler
Table
DM.SERVICE.SCHEDULER
Purpose
o The services for validating, loading and authorising of data in T24 are required
to be scheduled and automated. Most of the tasks are performed manually,
which needs to be eliminated. The multithread service will perform the tasks
in an automated way to eliminate the manual tasks to an extent.
Data File:
Output:
Once the Scheduler is completed, the DSS is updated with Scheduler Status as Scheduler
Completed, and also indicates No Of Errors as 0 for both (positive test case).
5.1.9 DM.DATA.TRANSLATION
Table
DM.DATA.TRANSLATION
Purpose
The data provided by the Client will contain values that correspond to the
legacy data format, which cannot be loaded into T24 without any TRANSLATION.
Hence a valid TRANSLATION must be done so that the data is converted into a T24
data format and can be loaded.
o Application: The application for which the TRANSLATION is carried out.
o Source Dir: The source directory of the data file consisting of Legacy ID is provided in this
field.
(Please note that any error log generated from the
TRANSLATION will be available in this directory)
o Source Fname: The Datafile name consisting of Legacy ID is given.
o Target Dir: The name of the output directory where the converted file will be available after
the TRANSLATION is over. Any error log generated will be available as mentioned above.
By default DATA.BP is populated, which can be modified by the user.
o Target Fname: The name of the converted data file with T24 Ids must be given here.
o FM Delim: The Field Marker that is used in the datafile to delimit field values. By default ‘|’ is
used, which can be modified by the user.
o VM Delim: The Value Marker that is used in the datafile to delimit multivalues. By default ‘::’ is
used, which can be modified by the user.
o SM Delim: The Sub Value Marker that is used in the datafile to delimit subvalues. By default
‘!!’ is used, which can be modified by the user.
o Mapping Table: Any Concat file name that will be used for the TRANSLATION can be
provided in this field. This is a multivalue field.
o Pos Convert: The positions in the data file where the TRANSLATION has to be done and
replaced with T24 Ids. This is a sub value field associated with Mapping Table field.
o Pos Value: Fixed values can be specified for the positions in the data file where the TRANSLATION has to
be done and the fixed value replaced. This is an associated sub value set with the Pos Check field.
o Pos Routine: A subroutine can be attached to this field that will be used for TRANSLATION
logic. This is an associated sub value set with Pos Check field.
o No. Of Service: The number of agents that will be required to run the TRANSLATION can be
mentioned in this field.
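A minimal sketch of the position-based TRANSLATION described above. This is illustrative; the plain dict stands in for a T24 mapping concat file, and the function name is an assumption:

```python
def translate_line(line, mapping, positions, fm="|"):
    """Replace legacy ids with T24 ids at the given 1-based field
    positions of an FM-delimited data line, as DM.DATA.TRANSLATION
    is described to do. Unmapped values are left unchanged."""
    fields = line.split(fm)
    for pos in positions:
        legacy = fields[pos - 1]
        fields[pos - 1] = mapping.get(legacy, legacy)
    return fm.join(fields)

legacy_to_t24 = {"CUST001": "100069"}   # stand-in for a mapping concat file
print(translate_line("CUST001|JOHN|GB", legacy_to_t24, [1]))
# 100069|JOHN|GB
```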
Input a record.
Output
Thus any changes that are to be done in the data file can be made through this table.
After starting the service, while listing the OFS.REQUEST.DETAIL, it is found that there is no
breakage of records even if they contain special characters or more than 1024 characters.
In the above screenshots, it is found that there is no truncation of records even if there are
special characters or more than 1024 characters.
5.1.11 DM.AA.LENDING.BAL.EXTRACT
Balances can be extracted for Reconciliation Purpose using the routine
DM.AA.LENDING.BAL.EXTRACT.
Execution:
OUTPUT File:
Sample Output:
5.1.12 DM.AA.LENDING.SCHEDULE.EXTRACT
Execution:
OUTPUT File:
Sample Output:
5.1.13 DM.DC.BATCH.UPDATE
DM.DC.BATCH.UPDATE can be done via DSC, and it also checks for the update of
DC.BATCH.CONTROL and DC.DEPT.BATCH.CONTROL.
It also checks the ID format for Data Capture, such as the number of characters, the DC prefix
and the current Julian date in T24.
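The ID format check can be sketched as follows. Only the DC prefix and the Julian-date check come from the description above; the YYDDD Julian form, the total length and the trailing sequence number are assumptions for illustration:

```python
import datetime

def is_valid_dc_batch_id(batch_id: str, today=None) -> bool:
    """Check a Data Capture batch id: 'DC' prefix followed by the current
    Julian date (assumed YYDDD form here), then a numeric sequence."""
    today = today or datetime.date.today()
    julian = f"{today.year % 100:02d}{today.timetuple().tm_yday:03d}"
    return (batch_id.startswith("DC")
            and batch_id[2:7] == julian
            and batch_id[7:].isdigit())

d = datetime.date(2012, 8, 2)      # the 215th day of 2012, so YYDDD = 12215
print(is_valid_dc_batch_id("DC1221500001", d))   # True
print(is_valid_dc_batch_id("BC1221500001", d))   # False
```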
DSC record:
If the batch already exists in the environment, then the below error will be thrown.
During AA load, if the AA product is proofed but not published, an error log is created.
After running the service, if the product used in the data file is SMALL.BUSINESS.LOAN, then an
error log is created as shown below.
5.1.16 TDM.DATA.STRUCTURE.EXTRACT
It is a program which is used to extract all the fields and properties of any table given in the
saved list.
Once the saved list name is given, it will extract the field properties of that application and store
them in the T24.DATA.STRUCTURE.LIST directory in .xls format.
Selection Criteria
Output of the selection showing the number of errors, the number of records validated and the error details
After verifying the DMP, we need to specify the value ‘Yes’ for converting the data file, and the
converted data file is generated in the same path.
5.1.19 DM.CLEANUP
1. The purpose of the routine is to clean up the existing DMDs, DSCs, TSAs
(DM.SERVICE) and BATCH records related to DM SERVICE and VERSIONS. The
lib & bin files (GR0800005bin, GR0800005lib) and the DM.BP and T24MIG.BP
folders are decataloged and deleted.
2. For this we need to run program DM.CLEANUP.
Prerequisites
To export the corresponding LIB and BIN :
export JBCDEV_LIB=GR0800005lib
Run the routine; it reads the DMD and gets the version ID from the DMD
(OFS.VERSION).
Then it reads the version record and gets the CURR.NO; based on the CURR.NO, the version
record will be deleted from the LIVE, NAU and HISTORY paths.
The DM.BP and T24MIG.BP routines get decataloged and the folders get deleted.
DM.CLEANUP Process:
6 Conclusion:
Thus the document explains the functionality of a non-core T24™ Data Migration Tool that is used
for data extract from T24 and load into T24 from any client system. This could be used as a base for
using the Data Migration tool in any T24 data extract & load process.