
SAP BODS - Beginners guide

By Alok Ranjan

AIM:- The purpose of this tutorial is to give a novice good hands-on experience with the BODS product: to be able to create his/her own local repository, configure a job server, start basic job development and execute jobs that extract data from source systems and load it into target systems after performing transformations, look-ups and validations.

What is BODS:- It is an ETL tool, now owned by SAP, used for integration with all types of disparate systems: extracting data from them, transforming it into meaningful information and loading it into all types of target systems.

It is tightly integrated with SAP systems and hence a really good tool for migrating data from legacy systems to SAP systems with ease and less development effort, along with effective debugging and monitoring capabilities.

How to get started:-

After BODS has been installed on the machine, you can see the following under Start>All Programs

There are several components installed as part of SAP BODS 3.2 installation.

In more recent releases like SAP BODS 4.0, there have been improvements to the architecture and components, and enhancements to several transforms.

However, the BODS Designer looks pretty much the same in both the 3.2 and 4.0 versions.

The major enhancement is in the Validation transform in 4.0, which allows developers to easily switch the validation flags on and off for one or several columns.

What to launch next:- Normally, developers would like to launch the BODS Designer and start developing jobs straightaway, but it is always good to create your own local repository first so that you have full control over your own development environment without any dependency on others.

In order to create your own local repository, it is important to know the database where you have been allocated a space quota to create it.
This depends on which database vendor (Oracle, SQL Server, Informix, DB2, etc.) you currently have access to.

However, the BODS installation comes with a default MySQL database, which you should have installed on your system.

A user/schema and a database need to be created on MySQL, which can easily be done by the DBA.

Once the user/schema and database name information is assigned to you, you can create the local repository.

The next screen shows the BODS Repository Manager, which allows you to create the local repository.

As you need to create your own local repository to do your own development activities, you need to select
the Repository type as ‘Local’.
Now you need to choose the database type/vendor on which you want to create the repository.

As mentioned earlier, the MySQL database comes shipped with the BODS product, so it is convenient to create the repository on it.

Hence, choose ‘MySQL’ in Database type.


The MySQL database type requires you to choose an ODBC data source.

Therefore, you just need to configure ODBC with the necessary parameters.

All you need to know is your user name, password and database schema name.

The next screen shows how to configure the ODBC data source:-

Launch ODBC from Control Panel>Administrative Tools>ODBC


Go to ‘System DSN’ tab.
You will find several DSN names that were added by default.

If you want a dedicated DSN for yourself, you should be able to add yours.

Click on Add button.

Select the ‘MySQL ODBC’ driver. (Please note that the version can differ, so ask the admin team if there is any discrepancy; all you want is the MySQL driver.)
Now, you need to supply the information below:

Data Source Name - (any name of your choice)

Server name - local

User - (as given by the DBA)

Password - (as given by the DBA)

Database - (as given by the DBA)

Having provided all the information, you will want to test the connection.

So, click on the Test button.


You can see your newly created DSN name in the screen below.

Now, launch the Repository Manager again and check the ODBC drop-down for the newly created DSN name.
If the name is not visible, you need to go back to ODBC and re-configure.

If it is listed now, you can proceed with the next steps.


You again need to provide all the MySQL login credentials and the database name above, as you did earlier for the DSN configuration.

Select the ‘Show details’ flag. It gives you all the information, such as the version and other details, when you start creating the local repository.

Now, hit the ‘Create’ button.

You need to wait for some time for the repository to be created.

Follow the next screen.


You should see that the ‘local repository was successfully created’.
You can check for the version of the repository as well by hitting ‘Get Version’.

Now, it is time for the most important work.

You need to launch BODS Designer to start your development activities.

Go to All Programs>SAP BusinessObjects XI 3.2>SAP BusinessObjects Data Services>Data Services Designer.
Provide all the information you used while creating the local repository a while ago.

Now, you get the screen layout.

There are basically three important areas you need to understand.


1) Local Object Library:- It contains all the necessary objects used for the development of jobs.

2) Project Area:- It contains all the jobs that are ready for execution or that you are currently working on.

3) Workspace area:- The bigger space, where you actually design your jobs.

Start development

Right-click on empty space in the local object library, or click on ‘Create Project’ under Getting Started on the main page.

Create it with any name, like the one above.

You can find it in the top left pane (project area).


Right-click on the project name and select ‘New Batch Job’.

Give any name to the new job as shown.


The job is created.

No, it is not!!!

You need to create a dataflow under it.

On the right-hand side, you can see the list of icons.

Click once on the dataflow icon and then click once on the workspace area (the bigger area).
Name the dataflow as per your wish. It should start with DF_.

Now, you need to define the source structure.

In this example, we are using the flat file as source. So, we will create flat file format.

Go to the ‘Formats’ tab in the local object library, as below:-


Right-click on ‘Flat Files’ and select ‘New’.
You will get the File Format Editor as shown above.

Now, you need to define the structure of the input file.


In this example, the input file has 3 fields: EMP_ID, EMP_NAME and EMP_DEPT.

Click under ‘Field name’ and add the field names.

Under ‘Data type’, choose the relevant datatype, like ‘int’ for EMP_ID, ‘varchar’ for EMP_NAME, etc.

On the left-hand side, click on the ‘Root directory’ folder to browse to the correct path where the input file is placed.

Click on ‘File name(s)’ and you should be able to browse to the input file.

Follow the subsequent screens.


In this example, the input file is a text file which has 2 records as shown below:-
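For illustration, a matching file might look like this (sample values, comma-delimited; the exact contents are hypothetical):

EMP_ID,EMP_NAME,EMP_DEPT
1,John,HR
2,Smith,Finance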
The text file contains a header as its first row, listing the field names.

The next 2 rows contain data.

Since we have already defined the flat file format structure with field names, we can ignore the first-row header (shown in the next screen).
When you try to save and close the file format editor, it gives you the above warning.

If you have defined the structure with field names in your File Format Editor, you should select ‘No’; otherwise your structure will be overwritten with the one from the input file given in the file name.

Here, you select ‘No’.

Ignoring the row header

Now, since we have already defined the file format with our field names, we do not require the row header
provided in the input file.

So, ignore the row header by setting ‘Skip row header’ to ‘Yes’. See the next screen.


You find the flat file format created as shown above.
Now, you need to use this file format as source in your data flow.

Drag and drop the file format into the workspace area and make it a source.

Click once on the ‘Query’ icon on the right-hand side and click once on the workspace.
Name the query and connect it to the source file format.

Now, you need to define the target.

In this example, we are loading into a table.

So, we need to create a datastore (a logical connection to the database).


Go to the ‘Datastores’ tab in the local object library and right-click on the blank space.

You will get the above screen where you need to provide inputs.

Datastore name:- as per your choice.

Datastore type:- Database

Database type:- MySQL (in this case, as we are using the MySQL database for the staging tables)
Once again, you need to provide all the login credentials and the database name, as you provided earlier when creating the local repository.

If you are using another database for the staging tables, you can use the information provided by the DBA.
You will see the datastore created.

Now, click once on the template table icon on the right-hand side and click once on the workspace.
Define the name of the target table as per your wish.

Click on OK

Now, connect the query transform to the target.

But we still have not mapped the fields from source to target.

Click on the name of the query.


You will find that there are three fields in the source.

You need to map all three fields from source to target.

Drag and drop each field from the left pane to the right pane, one by one.

Now, validate whether the design has any errors or warnings.

Click on the ‘Validate Current’ icon.


You may get the following warning:

[Source File:"EMPLOYEE.txt"(EMPLOYEE)]

Job Server error (the Job Server may not be responding). The directory for the file <C:/EMPLOYEE.txt>
cannot be validated to ensure that it is valid at design time. Ensure that the directory is valid by run-time.
(BODI-1110017)

This was expected: the job server has not been configured yet.

Did you see the icon?

Please remember that you have the BODS server components installed on your machine, so you are responsible for configuring the job server as well.

Launch the Server Manager.


Click on ‘Edit Job server config’.
Click on ‘Add’ and then ‘OK’.
Provide Job server name as per your wish.

Click on ‘Add’.
In the ‘Repository Information’ section, provide the local repository details.

Click on ‘Apply’.
In the associated repositories, the repository name will be reflected.

Click on ‘OK’.
You can now see the name of the Job server configured.

Click on OK.
Click on ‘Restart’ in the window.
Click on OK to restart the Data Services service.

Close the designer.

Re-launch the BODS Designer.

Double-click the project where you have your job developed.


In the project area, right click on the job and select ‘Execute’.
You will see the job server that you configured earlier listed there.

Click on OK.
You find the job executed successfully!!!

The real challenge comes when a job fails.

Now, let us see what records have been added.

Go back to the dataflow.

Click on the small view-data (magnifying lens) icon on both the source and the target.

Source first:-
Target now:-

So, we find that the data has been loaded from the flat file to the database table.
SAP-BODS Integration using BAPI

1. Introduction:

We can call SAP ERP or R/3 Remote Function Call (RFC) enabled functions, including Business Application Programming Interface (BAPI) functions, from queries inside data flows. A Remote Function Call is really what the words suggest: you call a function, but it is remote rather than part of your code. This is the standard way third-party tools access SAP, both for reads and writes. All RFCs meant to be used for business purposes have function names starting with BAPI_, like BAPI_ABSENCE_CREATE or BAPI_ABSENCE_GETDETAIL, which respectively create and display employees' absence records in HR.

If you design data flows with BAPI calls against one version of SAP ERP or R/3 and then change the datastores to a later version of SAP ERP or R/3, Data Services allows this without the need to re-import the BAPI. Any new parameters added to the function call, including additional columns in table parameters, are added automatically to the call and filled with NULL values. Thus Data Services allows you to design jobs that are portable between SAP ERP or R/3 systems. After you import the metadata for an SAP ERP or R/3 function, the function is listed in the Functions category of the ERP or R/3 datastore. You will also see the function in the function wizard, listed under the datastore name.

Data Services supports tables as input and output parameters for SAP ERP or R/3 RFC and BAPI
functions. The function import process automatically includes the metadata for tables included as function
parameters.

2. BAPI Call:

Which BAPI to call depends on the requirement; in this document we consider the example of creating a contract using ‘BAPI_CONTRACT_CREATEFROMDATA’. For this BAPI, we need to send the mandatory import and tables parameters to create a contract.

To specify a table as an input parameter to a function, the table must be an input to a query, either as a
top-level input or nested under the top-level. The table must also be available in the FROM clause of the
context where you call the function. Data Services maps columns in the input schema by name to the
columns in the table used as the function input parameter. You need only supply the columns that are
required by the function. At validation, if Data Services encounters type mismatches between supplied
columns and the function signature, it attempts to convert the given type to the expected type. For type
mismatches that it cannot resolve, Data Services produces validation errors.

3. Importing the BAPI:

Note: This entire document refers to an IDES SAP system; refer to the screenshot below for details.
The following process shows how to import the BAPI.
4. Processing the BAPI:

For this BAPI, the mandatory import and tables parameters required to create a contract are listed below, followed by an illustrative sketch of the call:

1. Sales Document Type (Importing Parameter)

2. Sales Organization (Importing Parameter)

3. Distribution Channel (Importing Parameter)

4. Division (Importing Parameter)

5. CONTRACT_PARTNERS (Tables Parameter)
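For orientation, here is a minimal ABAP sketch of the equivalent call. This is illustrative only: the actual call is generated by Data Services at run time, the field values are assumed examples, and the four import parameters above correspond to fields of the CONTRACT_HEADER_IN header structure (BAPISDHD1).

* Minimal sketch of the BAPI call, with example values.
DATA: ls_header   TYPE bapisdhd1,                    " contract header
      lv_contract TYPE bapivbeln-vbeln,              " created document number
      ls_partner  TYPE bapiparnr,
      lt_partners TYPE STANDARD TABLE OF bapiparnr,  " CONTRACT_PARTNERS
      lt_return   TYPE STANDARD TABLE OF bapiret2.   " messages

ls_header-doc_type   = 'WK1'.   " sales document type (example value)
ls_header-sales_org  = '1000'.  " sales organization (example value)
ls_header-distr_chan = '10'.    " distribution channel (example value)
ls_header-division   = '00'.    " division (example value)

ls_partner-partn_role = 'AG'.          " sold-to party
ls_partner-partn_numb = '0000000310'.  " customer number (example value)
APPEND ls_partner TO lt_partners.

CALL FUNCTION 'BAPI_CONTRACT_CREATEFROMDATA'
  EXPORTING
    contract_header_in = ls_header
  IMPORTING
    salesdocument      = lv_contract
  TABLES
    contract_partners  = lt_partners
    return             = lt_return.

* BAPIs do not commit on their own; the commit must be triggered
* explicitly after a successful call.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'.

In BODS you do not write this code yourself; the query's function-call wizard builds the call, but the parameter names above are the ones you will see there.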

We can call this BAPI in a query transform and give the inputs as shown below. Before calling the BAPI, create an output schema for passing the tables parameter, as shown below.
The mapping inside the query is shown in the above screenshot.

Call the BAPI inside Query_1 as shown below.


Select the required BAPI and pass the mandatory parameters as shown below.
To pass the tables parameter, click the arrow as shown below and map the correct schema defined for the partners.
Double-click on the query to map it with the schema.
Pass the required return parameters to check the output.
As the structure is NRDM (nested relational data model), we need to unnest it as shown below.
Finally, execute the job to check the contract that was created.
As shown above, contract ‘40000227’ has been created successfully. Check the same in the SAP system using transaction ‘VA43’.
5. Considerations in BAPI:

Consider the following issues:

• All character values must be uppercase

• Padding values (see the sketch after this list)

• Assumed decimal values (QTY)

• Codes are language-specific


• Automatic type conversion

• SAP ERP or R/3 version-specific behavior
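To illustrate the padding point from the list above: SAP stores many key values (customer and material numbers, for example) left-padded with zeros, so values sent from BODS usually need the same treatment. A minimal ABAP sketch using the standard conversion exit, with the customer number from later in this document as an assumed example:

* '310' as a user enters it becomes '0000000310' as SAP stores it.
DATA: lv_kunnr TYPE kunnr.   " customer number, CHAR 10

CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = '310'
  IMPORTING
    output = lv_kunnr.
* lv_kunnr now contains '0000000310'.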

To determine the data requirements of various SAP ERP or R/3 functions, you can read the function requirements in the SAP GUI transaction screens:

• BAPI list by functional area: transaction BAPI

• BAPI and RFC source code and input and output parameters: transaction SE37

You can also determine appropriate values, such as the language-specific code values, by looking at the
table where the data is ultimately stored.

6. Calling an RFC-enabled Function Module:

One solution would be to RFC-enable the function itself. But as it is an SAP-provided function, the better idea is to write a new function which does nothing other than call this function, and to RFC-enable that. For example, to call the function module ‘READ_TEXT’, which is a normal (non-RFC) function module, copy the function module and make the copy RFC-enabled. The following steps are required to enable the RFC.
Create a new function module ‘ZREAD_TEXT’ in transaction SE37 and assign it to the function group ‘ZTEST_GRP’ as shown.

In the Attributes tab, mark it as RFC-enabled, as shown below.


Call the normal function module ‘READ_TEXT’ in the source code, as sketched below. Save it and activate it.
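A minimal sketch of what the wrapper's source code might look like (illustrative: the interface is reduced to the essential READ_TEXT parameters, all passed by value as RFC requires, and the error handling is simplified):

FUNCTION zread_text.
*"--------------------------------------------------------------------
*"  IMPORTING
*"     VALUE(ID)       TYPE  THEAD-TDID
*"     VALUE(LANGUAGE) TYPE  THEAD-TDSPRAS
*"     VALUE(NAME)     TYPE  THEAD-TDNAME
*"     VALUE(OBJECT)   TYPE  THEAD-TDOBJECT
*"  TABLES
*"     LINES STRUCTURE  TLINE
*"--------------------------------------------------------------------
* Do nothing other than delegate to the standard, non-RFC function.
  CALL FUNCTION 'READ_TEXT'
    EXPORTING
      id        = id
      language  = language
      name      = name
      object    = object
    TABLES
      lines     = lines
    EXCEPTIONS
      not_found = 1
      OTHERS    = 2.

* Simplified handling: return an empty table if the text is missing,
* so the remote call does not abort.
  IF sy-subrc <> 0.
    REFRESH lines.
  ENDIF.
ENDFUNCTION.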

Now the RFC-enabled function module ‘ZREAD_TEXT’ is ready.

Now let us see how to call this from the BODS Designer.


After dragging in the source and connecting it to a query, the RFC function call is made from there.
In the query editor, a ‘New Function Call’ option is available as shown below; give the required import parameters.

As we have a tables parameter in this function module, we need to normalize the schema by calling one more query, as shown below.
Once the job is executed, we can see the result that has been extracted from the function module.
SAP-BODS Integration using IDOCs

1. Introduction:

Imagine you want to build a reporting solution, not a data warehouse in the pure sense. Somebody opens the balance sheet report and does not like the way it looks. So a booking is created in SAP to correct it and then... he has to wait the entire night until the data warehouse gets refreshed. Another option would be to configure SAP so that it sends all changes to the reporting database immediately. And that is what IDOCs are for.

On the downside though, configuring SAP to actually send changes is quite a challenge. If an IDOC is already provided by SAP, it is not that bad; but if you want to distribute changes for data SAP never thought about, you have to write your IDOC from scratch and hook it into every single application dealing with that data.

The basic problem is the IDOC design. It is not like a database trigger, which is independent of all the applications and fires no matter who made a change and how. It happens at the application level, so there is a common ABAP call that has to be made in every single application.

Inside SAP, a couple of settings have to be made; this configuration is mandatory for both sending and receiving IDOCs. The following configuration steps are involved in sending IDOCs from BODS to SAP ECC.

1. Define Logical Systems

2. Define RFC Destination


3. Define RFC Port (Transactional RFC)

4. Define Partner Profile

Let us see in brief how to configure the above steps:

2. SAP Configuration settings:


2.1 Define Logical Systems:

Defining logical systems is done in SAP ECC with the transaction ‘SALE’.

Note: This entire document refers to an SAP IDES system; refer to the screenshot below for details.

Enter transaction ‘SALE’ to define the logical system and its assignment. As this configuration requires authorization that we do not have, we will request Basis to create it. Hence we used the logical system shown below.
Click on the node shown above to define the logical system. When the popup below appears, just continue.

The logical system created by Basis is ‘ID3CLNT801’, as shown below.


2.2 Define RFC Destination:

Enter transaction ‘SM59’ to configure the RFC destination in SAP ECC, as shown below.

Click on ‘TCP/IP connections’ as shown above and then create it.


The RFC destination is created with the Registered Server Program (Program ID) ‘DI_RFC’, as shown here. With the program ID defined, configure this on the BODS side from the Management Console, as shown below.
Log on to the Management Console and click on Administrator.

The RFC server configuration is done as shown below.


The configured server needs to be started from the interface, as shown below.

Now check the connection from the SAP side, as shown below. Click on ‘Connection Test’ as shown.
If the connection is fine, the following screen will appear.

2.3 Define RFC Port (Transactional RFC):

Enter transaction ‘WE21’ to create the port as shown. The RFC destination defined earlier is assigned here.
2.4 Define Partner Profile:

And finally, we can configure SAP to route all IDOCs of a given type to DI by running transaction ‘WE20’. Click on ‘New’, type in the partner number created via ‘/nSALE’ (we called it ID3CLNT801) with partner type ‘LS’ (Logical System). Then click on save.
Configure the inbound parameters, as SAP needs to receive the IDOCs.

Note: If SAP is to send IDOCs to BODS, configure outbound parameters; for SAP to receive IDOCs, configure inbound parameters.

In this case, the customer to be created comes from BODS; hence the configuration should be inbound, as shown below. Click on Add.
Enter the message type ‘DEBMAS’ and the process code ‘DEBM’, then click on save.
Note: The following are the message types, IDoc types and BAPIs, listed by category (Message type / IDoc type / BAPI):

General:
Vendor - CREMAS / CREMAS02
Customer - DEBMAS / DEBMAS03
Material master - MATMAS / MATMAS03

Accounting:
G/L account - GLMAST / GLMAST01
Cost center - COSMAS / COSMAS01
Cost element - COELEM / COELEM01
Cost center group - COGRP1 / COGRP01
Cost element group - COGRP1 / COGRP01
Activity type - COAMAS / COAMAS01
Activity group - COGRP1 / COGRP01
Activity price - COACTV / COACTV01
Profit center - PRCMAS / PRCMAS01
Profit center group - COGRP1 / COGRP01
Profit center account group - COGRP1 / COGRP01

Logistics:
Article master - ARTMAS / ARTMAS02 / RetailMaterial.Clone
Additional - MMADDI / MMADDI01 / RetailMaterial.SaveAdditionalReplicas
Product catalog - PRDCAT / PRDCAT01
Product catalog item - PRDPOS / PRDPOS01
Price list - PRICAT / PRICAT01
Assortment - ASSORTMENT / ASSORTMENT01 / Assortment.SaveReplica
Service master - SRVMAS / SRVMAS01
Characteristic - CHRMAS / CHRMAS02
Class - CLSMAS / CLSMAS03
Classification - CLFMAS / CLFMAS01
Document - DOCMAS / DOCMAS04
Purchasing info record - INFREC / INFREC01
Conditions - COND_A / COND_A01
Order book - SRCLST / SRCLST01
Change master - ECMMAS / ECMMAS01
Bill of material - BOMMAT / BOMMAT01
Document BOM - BOMDOC / BOMDOC01
Work breakdown structure - PROJECT / PROJECT01

Human Resources:
PA object type person - HRMD_A / HRMD_A03
PD object types - HRMD_A / HRMD_A03

3. Sending IDOCs:
Sending an IDOC to SAP is an asynchronous task. One or many IDOCs are posted to SAP, just like inserting into a database table, and get stored in the SAP database. Later, SAP starts to process the IDOC and, for example, creates the new material record. Once the processing is done, successful or not, the status of the IDOC is updated in the SAP table EDIDC.
As a consequence, unlike with BAPIs, you get no return code from the processing. Your only choice is to periodically query the above status table and copy the status back into the sender's table; see the sketch below. If an immediate return code or the actual error message is required, you have to use BAPIs. On the other hand, IDOCs are supposed to be faster while still running full data-consistency checks.
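A minimal sketch of such a status check on the SAP side (illustrative; it could be run as a small report via SE38, and it assumes the DEBMAS message type used below):

* List today's DEBMAS IDocs with their current status.
* Status '53' means the application document was posted successfully.
REPORT zidoc_status_check.

DATA: lt_edidc TYPE STANDARD TABLE OF edidc,
      ls_edidc TYPE edidc.

SELECT * FROM edidc INTO TABLE lt_edidc
  WHERE mestyp = 'DEBMAS'
    AND credat = sy-datum.

LOOP AT lt_edidc INTO ls_edidc.
  WRITE: / ls_edidc-docnum, ls_edidc-status.
ENDLOOP.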
Before sending the IDOCs, we need to have an SAP datastore created and then import the required IDOC; in our case, to create the customer, that is ‘DEBMAS06’.
The datastore created here is named ‘SAP_ECC’, as shown above; then import the IDOC ‘DEBMAS06’ as shown below.

Click on ‘Import By Name’ and give the IDOC name as shown below.
The following flat file information is required to create the customer through IDOCs. A row generator is required to input the configuration data.

The flat file information is shown below.

The configuration information is shown below.


Once the job has completed, check the generated IDOC using the transaction code ‘WE02’.
Status 53 shows that the IDOC was successfully received by SAP from BODS and posted.
Checking transaction ‘XD03’ confirms that the customer (310) was created successfully.
