
Load / Replicate complex Objects

Based on DMIS 2011 SP7

Define Configuration

A configuration defines the connection between the source and the target system. As a first step, this
configuration must be defined in the central system using transaction LTR.

Press the New button to create a new configuration

In the first step you have to enter a configuration name (without any spaces). In addition, a
description can be set.

In the second step the connection data to the source system must be defined. In this case we use an
existing RFC connection to the source system.

By default, data is read from all clients. If the data selection should be limited to the RFC client, the
flag Read from Single Client must be activated. This option is not relevant here, as the source tables don't
have a client field (typed as CLNT).
In the third step the connection data to the target system must be defined. Here, too, choose RFC
connection and enter the existing RFC connection to the target system. As scenario, choose Standard
RFC Scenario.

In the fourth step you can define the initial load mode. It is preferable to use Performance Optimized,
but you will need approx. 10% additional storage on the source system during the initial load. Enter 1 as
No. of Data Transfer Jobs, as you can increase it later on (if required).

In the next screen you can review your settings and press the Create Configuration button if everything is
fine.
Define Complex Object

As an example we have two source tables (ZTCLIENT & ZTCLNTSUB) that are mapped to three
different target tables (BUT000, BUT020 & ADRC).

As the source and target header tables differ, we first have to define the deviating target table name in
transaction LTRS.

When you enter the transaction you have to select your mass transfer ID from the list (you
can find it via the configuration name in the overview list, or you can look up your MT ID in the overview
screen of transaction LTR). Via the context menu on node Table Settings you can add a special setting
for your source table (ZTCLIENT). Enter the target table name (BUT000) in field Deviating Table Name
and save.
To define the complex object, start program IUUC_REPL_PREDEF_OBJECTS and enter the mass
transfer ID. Press the Create New Entry button, enter your source table name (ZTCLIENT), and activate
the option to create a load and replication template.

Afterwards you have a load and a replication object with the header tables, and you can enhance and
modify those objects. Click on the object name to get to the MWB UI (transaction MWB).

Press the Change button to enhance the corresponding object.


Define Structures
First of all we enhance the receiver structure in work step 3 (Recipient Range: Edit Structures and
Fields). Via the context menu on an existing table you can create additional structures on the same, a lower,
or a higher level. Right-click on R_BUT000 and choose Create Recipient Structure on Lower Level.

As recommended syntax use R_ as prefix for receiver structures. Therefore enter R_BUT020 as
recipient structure and BUT020 as DDIC name.

Right-click on R_BUT020 and create table ADRC on a lower level.

It should finally look as follows


Now we enhance source structure in work step 4 (Sender Range: Edit Structures and Fields).
Right click on S_ZTCLIENT and choose Create Sender Structure on Lower Level.

As recommended syntax use S_ as prefix for sender structures. Therefore enter S_ZTCLNTSUB as
sender structure and ZTCLNTSUB as DDIC name.

Select the newly created table and click on the Key button to define the foreign key relation.
Define Structure Relations
In work step 5 (Define Structure Relations) we define the structure relations. As we need table
ZTCLIENT to fill fields in all target tables, we need to assign this table to every target table. You can
assign a source table to a target table via drag (source table) and drop (on target table). Fields of
table ZTCLNTSUB are only required for table BUT000; therefore it is only assigned to the header table.

If you double-click on a line you can define detailed settings for the structure relation. For R_BUT000
we have to adapt the generated default settings, as we don't want two nested loops but instead want to
read the corresponding ZTCLNTSUB record for every ZTCLIENT record. Therefore we unflag Primary
Relation for S_ZTCLNTSUB and change the DataPathType to User-Defined Path.

Click on the Data Path button and enter the WITH KEY condition for the READ TABLE statement.
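
As an illustration (the key fields of ZTCLIENT and ZTCLNTSUB are not listed in this guide, so CLIENT_ID is only an assumed common key field, and the internal table and work area names are simplified placeholders for the MWB-internal names), the condition maintained as data path ends up in a READ TABLE statement of roughly this form:

  READ TABLE it_s_ztclntsub INTO wa_s_ztclntsub
       WITH KEY client_id = wa_s_ztclient-client_id.  " assumed key field CLIENT_ID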

As we want to create one record in each target table for every record in ZTCLIENT, we adjust the
structure relations and set the DataPathType to Suppress Read to avoid the generation of additional
LOOP statements.
You can check the defined structure relations via a high-level preview of the generated runtime
objects. To do so, press the Visualize Structure Relations button to see a preview of the LOOP and READ
TABLE statements.

For every primary relation that is not marked with Suppress Read a LOOP statement is generated;
for every secondary relation a READ TABLE statement is created.
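
With the settings described above, the visualized runtime logic corresponds roughly to the following skeleton (a simplified sketch; table, work area and field names are placeholders for the names the MWB actually generates):

  " Primary relation without Suppress Read -> LOOP over the header table
  LOOP AT it_s_ztclient INTO wa_s_ztclient.
    " Secondary relation -> READ TABLE with the user-defined data path
    READ TABLE it_s_ztclntsub INTO wa_s_ztclntsub
         WITH KEY client_id = wa_s_ztclient-client_id.
    " Within the same loop pass one record each is created for BUT000, BUT020 and ADRC;
    " relations set to Suppress Read generate no additional LOOP or READ TABLE.
  ENDLOOP.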
Define Field Relations
In work step 6 (Define Field Relations) we define the field relations. You can define field relations via
drag (source fields) and drop (on target fields), or by entering the corresponding transfer rule after
double-clicking on a target field name.

Via drag and drop a simple move rule is applied. If the target field is not compatible, you get a Mapping
Conflict, which informs you that the source field does not completely fit into the target field. You have to
check whether this is acceptable (e.g. because not all characters are used) or whether you require a mapping rule.

If you want to assign a fixed value, double-click on the receiver field, assign transfer rule MOVE and
define your constant with single quotes as the actual parameter.

Press the Transfer Mapping button to assign the rule to the selected field.
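
The effect of such a MOVE rule with a constant is a plain assignment to the receiver field; with a hypothetical receiver field TYPE and the constant '2' (the work area name is a placeholder), it corresponds to:

  " Constant '2' (entered with single quotes) moved to the receiver field
  wa_r_but000-type = '2'.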
For more complex transformations you can create your own individual transformation rule. You can
create a rule in the field relation step via the pull-down menu (Goto -> Create Rule). Enter the rule ID and
define on which level the rule should be created:

- A rule on project level can be used in all migration objects within a project
- A rule on subproject level can be used in all migration objects within a subproject
- A rule on conversion object level can only be used in the current migration object

If the rule should be assigned to an event (e.g. Begin of Record), choose Event-Related; if it should be
assigned to a receiver field, choose Processing-Related.

For a rule you have to define the parameters. An event rule does not have any export parameters
whereas a field rule must have exactly one export parameter. In addition you can specify the
required import parameters.
In the Variant tab you can create a variant of type Free Code to write your individual transformation
logic.

If you press the source code button at the end of the line, you enter the editor where you can define
your rule code. Based on the parameter definition, a corresponding FORM routine will be generated
later on, as you can see in the header area of this screen.

In this example a simple transformation is implemented: all import parameters are just concatenated
into the target field.
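
A sketch of the FORM routine that would be generated for such a field rule is shown below (the rule name, parameter names and types are examples and not taken from this guide; only the body between FORM and ENDFORM is the free code you maintain):

  FORM z_build_name
    USING    i_firstname TYPE string        " import parameter 1
             i_lastname  TYPE string        " import parameter 2
    CHANGING e_name      TYPE string.       " exactly one export parameter (field rule)
    " Free Code variant: concatenate all import parameters into the target field
    CONCATENATE i_firstname i_lastname INTO e_name SEPARATED BY space.
  ENDFORM.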
Once the rule definition is complete, you need to change the Development Status to Released, as
otherwise you cannot assign the rule to an event or field.

In the field mapping UI you can assign the previously created rule and map the corresponding
sender fields to the import parameters.

If you also want to replicate data in the same way, you need to repeat the steps for the replication
object. The only difference is that you will have the logging table as the header table. If you add
additional tables on the source side, you have to add them below the original table.
Control Load / Replication via SLT
Once the required definition is done, set the relevant objects to status Active (in program
IUUC_REPL_PREDEF_OBJECTS) so that they are used during load or replication. As soon as you
activate a predefined object for a table, this object will be used as a template and copied to the
respective load or replication object. If you deactivate or delete a predefined object and then start the load
for that table, a standard SLT load object will be created (with the header table only).

When the objects are activated you can use SLT to control your load and replication. To do so, call
transaction LTRC and enter your mass transfer ID. In the Table Overview tab you can stop or start a
table via the Data Provisioning UI (called via the respective button).

Enter the table for which you have defined your predefined objects and choose the required action
(replication consists of load plus replication).

You can monitor the load and replication via the LTRC UI. In the Data Transfer Monitor you can see the
table name once the load or replication object is created. Be aware that this object is a copy of your
predefined object: if you change the load or replication object, the predefined object remains unchanged
and your changes are lost when the load or replication object is deleted. Also, if you change your predefined
object, this will only take effect in your load or replication if you restart the table or delete the load or
replication object.
Replication of Header and Children
If you start a table in replication mode, delta recording (a DB trigger) will be activated on the
header table (e.g. ZTCLIENT). Therefore only changes to this table will be recognized. If any other
table defined in the object changes but no corresponding change on the header takes place, SLT will
not recognize this change. This is sufficient in cases where the header is always updated as well,
e.g. via a last change date.

In cases where the header does not get updated but the child tables receive isolated updates, the children
need to be recorded as well, and the whole data instance needs to be transferred.

Activate Trigger without replication:

In the SLT system, go to the menu System -> User Profile -> Own Data, set the parameter SLT_EXPERT to
X, and save.

If you now call transaction LTRC, you have additional options in the Data Provisioning UI:

Enter the name of the table for which you want to activate delta recording (but do not start load or
replication) and choose the option Start Recording.

In addition, you need a program that writes the changes recorded in this logging table to the header
logging table, so that the complete data instance is transferred again.
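
A minimal sketch of such a program is shown below. All table and field names are placeholders: the real logging tables are generated by SLT with technical names and contain additional technical fields (e.g. a sequence number) that must be filled according to the generated table definition, and the common key field CLIENT_ID is only assumed here. Such a program would typically be scheduled periodically.

  REPORT z_fill_header_logging.

  " Placeholder names:
  "   zlog_ztclntsub - generated logging table of the child table ZTCLNTSUB
  "   zlog_ztclient  - generated logging table of the header table ZTCLIENT
  DATA lt_header TYPE STANDARD TABLE OF zlog_ztclient.

  " Determine the header keys of all child changes recorded by the trigger
  SELECT DISTINCT client_id
    FROM zlog_ztclntsub
    INTO CORRESPONDING FIELDS OF TABLE lt_header.

  " Write one entry per affected header key into the header logging table so that
  " SLT picks up the complete data instance with the next replication cycle
  " (technical fields such as the sequence number are omitted in this sketch).
  INSERT zlog_ztclient FROM TABLE lt_header ACCEPTING DUPLICATE KEYS.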
