Tips Transportes, Jerarquias, Transacciones, Etc V1


Error: Object ELEM 006EI3MHTBIYTZ8BNQ63XJY6S requires obj. ELEM 46K8V08583BVN57D240C381YJ: the latter is not in any transport request. (Explanatory text available.)

Activating DataSources and transfer rules after a transport

Thanks all for your help. We had to take SAP's help as we are going live this week. SAP has given the following solution.

 1. Ran the program "RSDS_DATASOURCE_ACTIVATE_ALL" for the source system, which re-activated all the InfoSources in the system.

 2. Ran "RS_TRANSTRU_ACTIVATE_ALL" to activate all the transfer structures.

 We were able to load the data after we ran these programs.

Re-importing transports

You can re-import the required transport request using STMS.

These are the steps:
First, go to the Import Queue of the system into which you want to import the transport request.
Then follow the menu path: Extras -> Other Requests -> Add.
Specify the exact released transport request number that you want to re-import into that import queue.
Also, turn on the "Import Again" checkbox.
You will then find the transport request again in the import queue.
Afterwards, you can re-import it with the "Import Transport Request Again" option in the import options (turn on this checkbox).

Hierarchies in Authorizations

RSECADMIN is the transaction for maintaining the authorization system. You use the standard InfoObjects 0TCAIPROV, 0TCAVALID and 0TCAACTVT to set up your analysis authorizations. When you work with an authorization-relevant characteristic that has a hierarchy, you also have to transport the hierarchies for each InfoObject from the development system to the quality system.

If you want to build left outer join with DSO on the left side and InfoCube on the right,
do the following.

1. Build InfoSet based on DSO and InfoCube, utilizing Inner Join.

2. Build Multiprovider that will include both the above InfoSet and original DSO. In
characteristic assignment of Multiprovider assign the same characteristics to both
InfoSet and DSO.
As a MultiProvider is a union, you will end up with a union between the original DSO and the inner join of the original DSO and the InfoCube. Effectively this constitutes a left outer join between the DSO and the InfoCube, with the DSO on the left side.

Setting modification options per object

So that a transport request is requested every time something is modified.

How the SCC4 configuration should look for development environments.


Error in the 0MATERIAL DTP: BMG135, material number conversion

Go to SPRO in BI and enter values for SAP NetWeaver -> Business Intelligence ->
General BI Settings -> Set Material Number Display.

Tip: the value here should match the value displayed in transaction OMSL in the source system.

Transaction to view InfoProvider parameters (read mode, cache, etc.)

RSDIPROP

Displaying prompts in Webi

=UserResponse("Enter Tran Date(Start):") + " " + UserResponse("Enter Tran Date(End):")

Analysis Authorizations

Recommendation

In principle, all authorization-relevant characteristics are checked for existing authorizations if they occur in a query or in an InfoProvider that is being used. You should therefore avoid flagging too many characteristics as authorization-relevant. This will keep the administrative effort to a minimum and ensure satisfactory performance.

To prevent performance from being impaired, we recommend having no more than 10 authorization-relevant characteristics in a query. Authorization-relevant characteristics with asterisk (*) authorization are an exception. You can include more authorization-relevant characteristics of this type in a query.

Features

Analysis authorizations are not based on authorization objects. You create authorizations that include a group of characteristics instead. You restrict the values for these characteristics.

The authorizations can include any authorization-relevant characteristics, and treat single values, intervals and hierarchy authorizations in the same way. Navigation attributes can also be flagged as authorization-relevant in the attribute maintenance for characteristics and can be added to authorizations as separate characteristics.

You can then assign this authorization to one or more users. Since the authorizations are
TLOGO objects (analytics security objects), they can also be transported to other
systems.

All characteristics flagged as authorization-relevant are checked when a query is executed.

A query always selects a set of data from the database. If authorization-relevant characteristics are part of this data, you have to make sure that the user who is executing
the query has sufficient authorization for the complete selection. Otherwise, an error
message is displayed indicating that the user does not have the required authorization. In
general, the authorizations do not work as filters. Very limited exceptions to this rule are
hierarchies in the drilldown and variables that are filled from authorizations. Hierarchies
are mostly restricted to the authorized nodes, and variables that are filled from
authorizations act like filters for the authorized values for the characteristic in question.

Clearing the Webi report / Dashboard before opening it

We are using SAP Authentication.

 We are using OLAP connections with SSO.

 We have enabled authorization relevance on a number of characteristics, including Funds Center.

 Solution?

The test query is on a MultiProvider (FM data) with Funds Center in the characteristic restrictions area and in the rows. In the characteristic restriction area, Funds Center has an authorization variable on it (with the selection-options setting on the details tab; note this). We have added a simple key figure to the rows.

We've created WEBIs on Bex Queries, we're using Web Services (created on the webi
in the WEB Intelligence Rich Client) to prepare the data for transmission to the
dashboard. Within the dashboard, we're using Web Service Query (query as a web
service) (data --> new connection) to connect the web services.  We set our components
to refresh before opening so we assumed it would refresh the data for the user logging
into the dashboard and only bring values permitted by the underlying BW Analysis
Authorizations on the query.

 We were wrong.  We determined we need to set the Refresh on Open checkbox that
appears under the advanced button in the webi save as dialog.  As a result, our simple
test query now refreshes from within the WEBI Rich Client and displays only the values
permitted by the Analysis Authorization.

The Webi Report using this connection prompts correctly and correctly only shows the
Values they have access for the Filter. 

But it fails when executed: BW System BIQ returned state: USER_NOT_AUTHORIZED, Message = WARNING EYE (007): You do not have sufficient authorization. The trace showed the user was required to have 0BI_ALL access, which defeats the purpose of the authorization.

I put in an SAP ticket, but the only thing they could tell me was that only Analysis for OLAP uses BW authorizations correctly, and they couldn't give me any direction on the other tools, or say whether they even consider it a problem that would be corrected when going through WEBI. It was kind of strange, considering the InfoObject filter on the query did work correctly in WEBI by only showing what they have access to, but failed during report execution. It seems like it would be a simple fix on their end.

I also hit the same error USER_NOT_AUTHORIZED.

(The restricted values in prompt LOV were correct, but if I refreshed it without the
restricted value (*) I got the error. If I filtered one of the values, everything was ok.)

Now I have two BICS variables, one for the prompt as before and one authorization
variable.

Now everything is working fine.

SAP published the note 1812327 for this issue. However I didn't use any authorization
object and solved the issue by user exit. Check the following steps:

 - Extract authorization data (user-characteristics value mapping) to ODS in BW.

- Create a BEx variable of customer-exit type, not ready for input, in the characteristic restrictions area in BEx.

- Write a customer exit to read the mapping data from ODS in CMOD.

- Try with a user to see how the data is displayed in BEX.

- Create an olap connection with SSO type in CMC.

- Create a webi report using this olap connection and using this bex query.

- Run the Webi report and you can see that the data in BEx and Webi is the same.

 by the way, I haven't tried this method with hierarchy.
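The customer-exit step above can be sketched roughly as follows for the CMOD include. This is only an illustration: the variable name ZV_AUTH_VALUES and the mapping table ZAUTH_MAP (standing in for the active table of the mapping ODS) are hypothetical placeholders, not names from the original post.

        DATA: l_s_range TYPE rrrangesid.

        CASE i_vnam.
          WHEN 'ZV_AUTH_VALUES'.           "hypothetical exit variable name
            IF i_step = 1.                 "processed before variable entry
*             ZAUTH_MAP stands in for the active table of the mapping ODS
              SELECT value FROM zauth_map INTO l_s_range-low
                WHERE uname = sy-uname.
                l_s_range-sign = 'I'.
                l_s_range-opt  = 'EQ'.
                APPEND l_s_range TO e_t_range.
              ENDSELECT.
            ENDIF.
        ENDCASE.

The variable then acts as a filter with the values mapped to the executing user, which is what makes the BEx and Webi results match.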

As per SAP Note 1812327  - Error:" BW System <system ID> returned state:
USER_NOT_AUTHORIZED (WIS 00000)" is displayed in Webi when restricted SAP
user doesn't provide an input value for an optional BW variable with analysis
authorization
1) Turn on logging within RSECADMIN for the ID you're testing. That log will be
infinitely more helpful than SU53, which will always show a failure on 0BI_ALL. The
log will usually show you exactly where the problem is.

2) If the log doesn't help, try changing the restriction in the ZEMPLOYEE analysis auth
to a star. If the query still fails, then the problem is likely some other field you didn't
realize was set to auth-relevant. If the query runs, then the variable in the query isn't set
up to work correctly with the restricted variable in the analysis auth. That happens a lot.

1) Also, if the user has to be restricted to only one characteristic (e.g. cost element = 1000*), you have to put in the authorization built in RSECADMIN all the authorization-relevant InfoObjects that you have in your InfoCube (the value in RSECADMIN will be *).

2) You have to create an authorization-relevant variable in BEx for 0COSTELEMENT that is not ready for input if the user does not have to select the values in a pop-up; if the user has to select values, make it ready for input, but the variable itself is mandatory.

3) Add also the infoobject 0tcakyfnm in authorization RSECADMIN and make the
value = *

In this way it works great

Here is the solution we got for the analysis authorization check from Webi report.

It was a known bug with SAP.

First SAP asked us to upgrade to SP8, but company-wide they agreed to move to 4.1 directly. After moving it was still not stable. Then we figured out that we need the authorization variables in the "Characteristic Restrictions" tab rather than the "Default Values" tab of the BEx query.


SAP Note 1812327

1. Create a BICS connection on a BEx query with analysis authorization, using a non-SAP_ALL authorization account.
2. Create a Webi report based on that connection with any user.
3. Run the query and don't provide any input value for the optional variable.

Workarounds:

 Include the hierarchy which contains authorized nodes in the Webi query ResultObjects.
 Provide a default value for the variable which is accessible by all the users.
 Disable the optional variable in the query panel each time the user decides not to provide an input value.

https://wiki.scn.sap.com/wiki/display/BI/The+authorization+Log+in+RSECADMIN

EXCELLENT: see the case of ":" authorizations for 0HRPOSITION, even when the query does not use it but the InfoProvider contains it.

http://scn.sap.com/people/kamaljeet.kharbanda/blog/2009/02/26/step-by-step-sap-bi-
security

http://www.sapsecuritypages.com/authorization-trace-in-bw/

RSEC320: see this error (empty log)

0tcakyfnm

SAP Note  2212143


Reproducing the Issue

1. Create a BEx query which includes an authorization variable on a hierarchy object.
2. Make sure to define the authorization variable in the "Default Values" area.
3. Create a Webi report that is based on a BICS connection to this BEx query.
4. Run a query using the hierarchy-restricted object.
5. The following error is shown: "BW System <SID> USER_NOT_AUTHORIZED. Message = WARNING EYE (007): You do not have sufficient authorization".

Cause

 The authorization variable should be implemented in the "Characteristic Restrictions" area, not "Default Values".
 Indeed, this is BY DESIGN behavior (as per KBA 1854601 - BEx variables that were specified as "Default Values" are no longer being treated as filters in Webi reports). The non-support of this functionality (in default values) means that the DSL semantic layer will retrieve an empty ResultSet when executing the query, and in the case of an authorization variable, it will treat this empty result as an authorization error.

Resolution

In BEx Query Designer, define the authorization variable in the "Characteristic Restrictions" area.


2117325 - Error: "The action cannot be performed. (WIS 30650)" while opening the
Web Intelligence report based on UNX universe in Web Intelligence Rich Client.

Symptom

 The error "The action cannot be performed. (WIS 30650)" is thrown while opening a report based on a UNX universe.
 While creating a new report based on a .UNX universe, the error "An error occurred while calling 'getSessionInfosEX' API (Error: ERR_WIS_30270) (WIS 30270)" is thrown.
 These errors are only seen from Web Intelligence Rich Client.
 Only Web Intelligence Rich Client is installed on the client machine (no other client tools).
Environment

SAP BusinessObjects Business Intelligence platform 4.1

Reproducing the Issue

Scenario 1:

1. Launch Web Intelligence Rich Client on the client machine.
2. Navigate to the Webi report which is based on a .UNX universe.
3. Try to open the report and observe the error: "The action cannot be performed. (WIS 30650)".

Scenario 2:

1. Launch Web Intelligence Rich Client on the client machine.
2. Try to create a new report.
3. When the list of universes pops up, click on a .UNX universe and observe the error: "An error occurred while calling 'getSessionInfosEX' API (Error: ERR_WIS_30270) (WIS 30270)".

Cause

During the installation of the client tools, some of the "Database Access and Security" components were not selected.

Resolution

Perform a "Modify" installation on the client machine and, on the 'Select Features' page, select all the "Database Access and Security" components.


Activating process chains after a transport

SE37: function module RSPC_CHAIN_ACTIVATE_REMOTE

In the test environment: simple mode

Before transporting chains, first change the scheduling options in the start process to "Start Using Meta Chain or API", so that the chain is not executed when it is transported and activated in the target system.
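Chains whose start process is set to "Start Using Meta Chain or API" can then be triggered programmatically through the standard API. A minimal sketch (the chain name ZMY_CHAIN is a placeholder):

    DATA: lv_logid TYPE rspc_logid.

*   Start the process chain via the standard API function module
    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'ZMY_CHAIN'      "placeholder chain ID
      IMPORTING
        e_logid = lv_logid.

The returned log ID can then be used to look up the chain run in the log view.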

Process chains: deleting PSA

DSO loads cannot be delta loads, because then the PSA is not deleted.
Variants in APD queries

After transporting, the variants must be created again via RSRT in the target system.

Parameters required by authorization objects

http://scn.sap.com/docs/DOC-48579

Tables of query variants

VARID

RSRVARIANT

Activating DataSources and transfer structures

SE38: RSDS_DATASOURCE_ACTIVATE_ALL

RS_TRANSTRU_ACTIVATE_ALL

Schedule :

You use this function to schedule and execute the process chain in the background, in
accordance with the settings in the start process.

Execute synchronously:

You use this function to schedule and execute the process in dialog mode, instead of in
the background.

The processes in the chain are processed serially using a dialog process. The system
writes a log entry for each process. You can display the chain run in the log view.

Global query variants

They must be created via BEx Analyzer. Clear the user-variant checkmark and the variant will appear in table RSRPARAMETRIZA.

ZPI_BW_C4C_TRACKING_CONF
Mass modification of users

Transaction SU10

Suppressing runtime errors in Dashboards

https://apps.support.sap.com/sap/support/knowledge/public/en/1922053

First make sure you have no old Java installs or add-ons lingering on your system: check Programs and Features (assuming you're using Windows 7) and uninstall Java completely. Check the add-ons in IE, show 'All add-ons', and remove them manually if required. Do a clean-up of the computer's temp files and cache (I used a program called CCleaner). Also check c:\users\<username>\app data\locallow\ and delete the 'Sun' folder. Then reboot the machine.

- Install latest Java 1.7.0_51

- Go to Java control panel in windows control panel

- add the exception for your BOBJ server e.g. http://<SERVER>:8080

- click continue

- go to the advanced tab

- scroll down and tick 'use TLS 1.2'

- apply or OK to exit java control panel

- open 'Internet Options'

- go to the 'advanced' tab

- scroll down and tick 'use TLS 1.2'

- OK everything and close any IE windows

 Now try again; you will need to enable the add-on when prompted on opening IE for the first time. Also, when accessing BI Web Intelligence for the first time, I placed a tick in 'do not show this again for apps from this publisher'.

Removing zero rows in Webi

https://scn.sap.com/thread/1879196 ; if that doesn't work, remove rows with zero values in the Webi properties.
Deleting table records in the ERP

Transaction SE16N or SM30

/H (debugging mode)

GD-EDIT

GD-SAPEDIT

Set these two variables to X. Continue to the end of the debugging session, delete the required lines, and save.

NODIM does NOT work in APD with queries

Duplicate queries (same name)

You can also fix this by directly editing table RSRREPDIR in SE16.

Just put your query ID in the field COMPID and execute. Then select the duplicate record for the specific query in the table and delete the record (Shift+F2).

Authorizations in a USER EXIT

If '*' is used, CP (not EQ) must be set in l_s_range-opt:

        CASE sy-uname.
          WHEN 'LATDNUNES'.
*           '*' is a pattern, so the option must be CP, not EQ
            l_s_range-low  = '*'.
*           l_s_range-high = '*'.
            l_s_range-sign = 'I'.
            l_s_range-opt  = 'CP'.
            APPEND l_s_range TO e_t_range.
        ENDCASE.

Using ALL in combo boxes

ADDING A TIMESTAMP TO CUBES

WEBI: DATE HANDLING

Creating files with dynamic names

https://blogs.sap.com/2012/06/27/dynamic-determination-of-file-name-in-ohdsapds/

https://scn.sap.com/thread/1720012

http://scn.sap.com/docs/DOC-29556
HANDLING DYNAMIC FILES IN AL11 in InfoPackages

program filename_routine.
* Global code
*$*$ begin of global - insert your declaration only below this line  *-*
* Enter here global variables and type declarations
* as well as additional form routines, which you may call from the
* main routine COMPUTE_FLAT_FILE_FILENAME below
*TABLES: ...
* DATA:   ...
*$*$ end of global - insert your declaration only before this line   *-*

* -------------------------------------------------------------------
form compute_flat_file_filename
  using    p_request      type RSREQUID
           p_infopackage  type rslogdpid
           p_datasource   type rsoltpsourcer
           p_logsys       type rsslogsys
  changing p_filename     type RSFILENM
           p_subrc like sy-subrc.
*$*$ begin of routine - insert your code only below this line        *-*
* This routine will be called by the adapter,
* when the infopackage is executed.
  DATA: lit_dir_list  TYPE STANDARD TABLE OF epsfili INITIAL SIZE 0,
  "Files Table
        wa_dir_list   LIKE LINE OF lit_dir_list.
  DATA: lv_dir_name   TYPE epsf-epsdirnam. "Directory Name

   lv_dir_name = '\\BEHNWBI011\sapmnt\BWFILES\'.

*** Fetch all the files stored in the directory.
    CALL FUNCTION 'EPS_GET_DIRECTORY_LISTING'
      EXPORTING
        dir_name                     = lv_dir_name
        FILE_MASK                    = 'PresupuestoVenta_*'
* IMPORTING
*   DIR_NAME                     =
*   FILE_COUNTER                 =
*   ERROR_COUNTER                =
      TABLES
        dir_list                     = lit_dir_list[]
     EXCEPTIONS
       invalid_eps_subdir           = 1
       sapgparam_failed             = 2
       build_directory_failed       = 3
       no_authorization             = 4
       read_directory_failed        = 5
       too_many_read_errors         = 6
       empty_directory_list         = 7
       OTHERS                       = 8.

    IF sy-subrc <> 0.
*     Directory listing failed: pass the error back to the adapter
      p_subrc = sy-subrc.
      EXIT.
    ENDIF.

    SORT lit_dir_list DESCENDING BY NAME.

    READ TABLE lit_dir_list
    INTO wa_dir_list INDEX 1.
    IF sy-subrc = 0.
      concatenate  lv_dir_name wa_dir_list-NAME into p_filename.
    ENDIF.
    p_subrc = 0.
*$*$ end of routine - insert your code only before this line         *-*
endform.

PROGRAMS TO RENAME FILES IN AL11


*&---------------------------------------------------------------------*
*& Report  ZCREATE_TRACK_CONF_S
*&---------------------------------------------------------------------*
REPORT ZCREATE_TRACK_CONF_S.

DATA: BEGIN OF itab OCCURS 0,
        data(100000) TYPE c,
      END OF itab,

      wa LIKE itab.

DATA: FILE_INPUT  TYPE string,
      FILE_OUTPUT TYPE string,
      TIMESTAMP1  TYPE string,
      TIMESTAMP   TYPE timestamp.

    GET TIME STAMP FIELD TIMESTAMP.
    MOVE TIMESTAMP TO TIMESTAMP1.

    FILE_INPUT = '\\172.16.5.201\sapmnt\BWFILES\TrackingConfirmado.csv'.

*   Build the timestamped target file name
    CONCATENATE '\\172.16.5.201\sapmnt\BWFILES\TrackingConfirmado_'
                TIMESTAMP1 '.csv' INTO FILE_OUTPUT.

    OPEN DATASET FILE_INPUT FOR INPUT IN TEXT MODE ENCODING DEFAULT.

    IF sy-subrc NE 0.
      MESSAGE e999(za) WITH 'Error opening file' FILE_INPUT.
    ENDIF.

    DO.
*     Reads each line of the source file individually
      READ DATASET FILE_INPUT INTO wa.
      IF sy-subrc NE 0.
        EXIT.
      ELSE.
        APPEND wa TO itab.
      ENDIF.
*     Perform processing here
*     .....
    ENDDO.
    CLOSE DATASET FILE_INPUT.

*   Write the buffered lines to the timestamped copy
    OPEN DATASET FILE_OUTPUT FOR OUTPUT IN TEXT MODE ENCODING NON-UNICODE.
    LOOP AT itab INTO wa.
      TRANSFER wa TO FILE_OUTPUT.
    ENDLOOP.
    CLOSE DATASET FILE_OUTPUT.

*   Delete the original file only after both datasets are closed
    DELETE DATASET FILE_INPUT.

How to create files with dynamic names

https://blogs.sap.com/2012/06/27/dynamic-determination-of-file-name-in-ohdsapds/

Applies to:

SAP BI 7.0. Will also work on SAP BI 3.5. For more information, visit the Business
Intelligence homepage.

Summary

This article gives you a way of generating files with dynamic file names using Open
Hub Destination (OHD)/ Analysis Process Designer (APD) in the application server.

Author: Prasath Vettrinathan

Company: Infosys Limited

Created on: 27 – Jun – 2012

Author Bio

Prasath Vettrinathan is an SAP BI/BW consultant currently working with Infosys Technologies. He has rich experience in various areas of SAP BI and has worked on various implementation/support projects.

Table of contents

Introduction

Scenario

Steps to be followed

           Step 1: Create Logical File Path

           Step 2: Create File Exit Function module

           Step 3: Create Logical File Name

           Step 4: Creating OHD


Results

Introduction

Often, data from BW must be sent to various legacy systems or to business users in the form of flat files. These flat files are generated as per the requirements and placed in the application server of BW in a specific directory with a specific naming convention. Here we deal with the dynamic determination of the file path or the file name using a logical file name in OHDs/APDs.

Scenario

When using OHDs or APDs, there might be a requirement to place the generated file in a specific directory on the application server; also, the naming convention of the file may contain a dynamic parameter such as a timestamp, counter etc. to distinguish the files that are generated by the same OHD or APD every day. The timestamp used in the file name should be logical instead of simply the system date and time.

For example, consider the flow

When the loads to the cube are completed, the delta/full data is to be put on the
application server. OHD is used for generation of this file. The file generated by OHD
will have the following naming convention. “ZXXXX_21062011163045.csv” where
ZXXXX is the fixed component and 21062011163045 is the dynamic component, i.e.,
the time when the last load to the cube has happened.

Steps to be followed

1. Create a logical file path: the path where the file needs to be stored in the application server.

2. Create a file exit function module, which is used for filling the dynamic parameter in the file name.

3. Create a logical file name.

4. Use the logical file name in the OHD to generate files in the specified path in the application server with the specific naming convention.

Step 1: Create logical file path


The logical file path definition is used to associate a physical path in the application
sever. Hence when a logical file name containing this logical file path is used, the file is
stored in the actual physical path that is defined.

To define the logical path name, go to transaction FILE. Select "Logical File Path Definition" and select "New Entries".

Add an entry with the logical file path and a description

To assign the logical file path to physical path, select the new entry added and click on
“Assignment of Physical Paths to Logical Path” and select “New Entries”

Enter the syntax group, which has to be the same as that of the OS of the application server (for example, Windows NT, UNIX, etc.); also enter the physical path where the file has to be stored on the application server. The reserved name <FILENAME> is to be used in the physical path as a placeholder for the file name. There is a list of other reserved words, such as <OPSYS>, <SYSID>, <DATE>, which are replaced with their current value at runtime. Save the entry.

Now, if this logical file path is used in the logical file name, the files that are generated will be placed in the physical path, i.e. DIR_HOME.

When generating the file with an OHD/APD without using a logical file name in the
application server, by default the files are stored in DIR_HOME directory. Logical file
name gives us flexibility to store the files in any directory we require. In this example,
the files are saved in DIR_HOME directory itself.

Step 2: Create File Exit Function Module

The physical file name must contain the timestamp, which is a runtime parameter that needs to be filled during execution. To do this, we will use the reserved word <F=name> while defining the logical file name, and a corresponding user-defined file name exit function module must be created with the naming convention FILENAME_EXIT_name. The return value of this function module will replace the reserved word in the file name.

Following is a list of reserved words and the corresponding function modules that are to
be created.

<Z=name>   ->   Z_FILENAME_EXIT_name

<Y=name>   ->   Y_FILENAME_EXIT_name

The function module that is written must satisfy the following requirements:

An export parameter with the name "OUTPUT" must exist. No structure may exist for this parameter.

Import parameters are only supported if they have default values.

Below is the code used for obtaining the timestamp.
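The screenshot with the original code is not reproduced in this copy. As an illustration only, a minimal file exit returning a timestamp could look like the sketch below; the function name FILENAME_EXIT_ZCNTTS follows the <F=ZCNTTS> convention used later in this article, and using the current system time (instead of the last-load lookup from RSBKREQUEST described in the Results section) is an assumption of this sketch.

    FUNCTION filename_exit_zcntts.
    *"------------------------------------------------------------------
    *"  EXPORTING
    *"     VALUE(OUTPUT) TYPE  C
    *"------------------------------------------------------------------
    * Returns the dynamic component substituted for <F=ZCNTTS>
      DATA: lv_ts TYPE timestamp.

      GET TIME STAMP FIELD lv_ts.
      output = lv_ts.
    ENDFUNCTION.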


Note:

The FM must be created before creating the logical file name; otherwise we will get an error message.

Step 3: Create Logical File Name

Click on “Logical File Name Definition” and select “New Entries”


Add an entry with the logical file name and a description. The physical file must consist of the name of the file with the desired naming convention, where ZCNTTS is the name used in the file name exit function module.

Save the entry.

Step 4: Creating OHD

OHDs are created with the required fields. Logical file name is used in the OHD that is
created.

Give “Type of the file name” as “Logical file name” and “Appl Server File Name” as
the logical file name that is created (Y_FILE_MULTIPLE) 
Results

On executing the OHD, the files are generated in the application server. The application
server can be accessed using tcode AL11. Go to the directory DIR_HOME. The file is
generated with the file name as ZOPT_20120611141959.7140000_G1.csv

Where the timestamp is the last load timestamp of the cube, which is present in the
RSBKREQUEST table.

Note:

1. The logical file name created can be used in APDs when the data target of the APD is "File".

2. The reserved words <F=name>, <Z=name>, <Y=name> can also be used in the logical file path definition along with a corresponding file exit function module. By doing so, the path in which the file has to be stored can be determined dynamically.
HOW TO RECOVER DELTA DATA FROM Logistics

https://blogs.sap.com/2013/10/10/full-repair-requests-for-lo-extractors/

Full Repair Requests for LO Extractors

Full Repair Requests: A Full Repair extraction request is a data request for data that has previously been extracted and loaded into a target InfoProvider. These types of requests can be updated into every target InfoProvider, even if that target already has the data from an initialization or an Init w/o Data Transfer for this DataSource.

Need for Full Repair Requests: Sometimes delta extractions don't extract all the delta records, or the process fails and can't be restarted, and to fill this gap we need Full Repair requests. We might also need to re-synchronize the data between BW and the source system. These types of requests are also useful when we are initially extracting large volumes of data: we execute an Init w/o Data Transfer and then execute multiple parallel InfoPackages that are Full Repair requests with specific selection criteria.

Before moving ahead let us have a brief overview on LO Extractors.

LO Extractors

Logistics involves the movement/flow of material, people, money, energy and information from a point of origin to a point of consumption, so as to meet the customer requirements for goods and services. SAP ERP has a number of Logistics (LO) applications, so SAP provides a number of DataSources that are related to Logistics. The extraction for these DataSources follows a well-defined process that uses a special framework: the Logistics Cockpit.

LO Cockpit:

                                                 

DataSource: A DataSource is a structure in the source system that specifies the method for transferring data out of the source system. It is created in the source system and replicated to the BW system. With SAP ERP and other SAP systems as sources, we have a large number of DataSources offered by SAP which support a wide range of analysis requirements. But when our requirements are not met by SAP Business Content, we can create a DataSource based on our own logic. Such a DataSource is known as a Generic DataSource.
All the DataSources belonging to Logistics can be found in the LO Cockpit, grouped by their respective application areas. Also, if the DataSource is provided in the LO Cockpit, changes can be made there too, depending on how the extraction of data is done for that DataSource.

                                                      

The DataSource for LO extraction is delivered by SAP as a part of Business Content. It has the naming convention:
2LIS_<Application_Component>_<Event>_<Suffix>

Application Component: Application components are sets of DataSources grouped logically. Here these are 2-digit numbers, which can be checked in the LO Cockpit, e.g. application 11 refers to SD Sales.

Event: specifies the transaction that provides the data for the application specified, and is optional in the
naming convention.

E.g.:

VA: creating, changing or deleting sales orders

VB: creating, changing or deleting quotations

VC: creating, changing or deleting deliveries

VD: creating, changing or deleting billing doc

Suffix: details the DataSource, i.e. what level of data is extracted:

HDR represents Header data

ITM represents Item data

SCL represents Schedule line data

KON represents Conditions data

So, 2LIS_11_VASCL will extract Sales Order Schedule Line data from SAP ECC system.
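As an illustration only, the naming convention can be taken apart programmatically; this is a sketch and the variable names are made up:

```abap
* Sketch: splitting an LO DataSource name into its parts.
DATA: lv_dsource TYPE string VALUE '2LIS_11_VASCL',
      lv_prefix  TYPE string,
      lv_appl    TYPE string,
      lv_rest    TYPE string.

SPLIT lv_dsource AT '_' INTO lv_prefix lv_appl lv_rest.
* lv_prefix = '2LIS', lv_appl = '11' (SD Sales),
* lv_rest = 'VASCL' = event VA (sales order) + suffix SCL (schedule line)
WRITE: / 'Application:', lv_appl, 'Event/Suffix:', lv_rest.
```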

The DataSource can be activated in transaction RSA5, and the activated DataSources can be viewed in
transaction RSA6. Upon activation of the Business Content DataSources, all components like the extract
structure, extractor program etc. also get activated in the system.
An extract structure generated will have the naming convention:

MC<Application><Event/group of events>0<Suffix>, where the suffix is optional.

Thus, MC11VA0SCL: Extraction structure for the DataSource 2LIS_11_VASCL.

Initial and Delta Data Extraction for Logistics Transaction DataSources

Initialization/Full Upload: To differentiate new data from historical data, the process of
initialization is used. It sets a point of reference for the historical data and also provides the option to
load the historical data into SAP NetWeaver BW. The initialization must also be carried out in order to use
the delta capability of the DataSource; new data generated after initialization is identified as delta
data.

Setup Tables: To load data into the BW system for the first time, the setup tables have to be filled. The
restructuring/setup tables are cluster tables that hold the respective application data. The BI system
extracts this data as a one-time activity during the initialization process or in full update mode, and
the data can be deleted from the setup tables after successful extraction into BW to avoid redundant
storage. The setup tables are filled for the entire application component, not for individual
DataSources.

Naming Convention for Setup Tables : <Extraction structure>SETUP

Thus, the DataSource 2LIS_11_VASCL having extract structure MC11VA0SCL has the set up table
MC11VA0SCLSETUP.
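The convention above can be sketched in a few lines (illustrative only; variable names are made up):

```abap
* Sketch: deriving the setup-table name from the extract structure,
* following the <extract structure> + 'SETUP' convention above.
DATA: lv_extstruct TYPE string VALUE 'MC11VA0SCL',
      lv_setuptab  TYPE string.

CONCATENATE lv_extstruct 'SETUP' INTO lv_setuptab.
* lv_setuptab = 'MC11VA0SCLSETUP'
```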

Once the initialization is completed successfully, we don’t need the data in the setup table. It can be
deleted using transaction LBWG.

Delta Loads: After the successful initialization, delta records get captured and are passed to an area
known as delta queue, which stores records that are ready to be extracted by the BW system.

There are three different methods for processing this data, but before proceeding it is necessary to
understand the various update modes.

Update Modes: While carrying out a transaction, e.g. the creation of a sales order, the user enters
data and saves the transaction. From a logistics application perspective, the data entered is used
directly to create the order, and indirectly it also forms part of the information for management
reporting. The logistics application serves both purposes, but the former, i.e. the creation of the
order, takes a higher priority than the result calculations triggered by the entry; the latter are often
termed statistical updates. The SAP system treats these two events generated by the creation of the
order with different priorities by using different update modes.

V1 Update: A V1 update is carried out for critical or primary changes that affect objects with a
controlling function in the SAP system, for example the creation of a sales order. These
updates are time-critical and synchronous.

V2 Update: A V2 update, in contrast with V1, is executed for less critical secondary changes; these are
purely statistical updates resulting from the transaction.

V3 Update: The V3 update consists of collective-run function modules. Compared to the V1 and V2
updates, the V3 update is an asynchronous batch update, which is carried out when report RSM13005
starts the update (in background mode). Unlike the V1 and V2 updates, the V3 update does not happen
automatically.

Delta Load Methods: SAP provides different mechanisms for pushing data into the delta
queue; the chosen mechanism is called the update mode. In transaction LBWE, we can see the various update modes available.
                                                      

Direct Delta: This method stores the posted documents in the delta queue using V1 update. V1 update
accords the highest priority to this process over others and is generally used for most critical updates. It is
recommended only when a small number of documents are being posted.

Queued Delta: This method stores the posted document data in an extraction queue using the V1 update.
A collective run is required to read the data from the extraction queue and transfer it to the delta queue.
This method can be used when the number of documents posted is large, and serialization is also
required.

Unserialized V3: This method stores posted document data in an update table by means of the V3
update. A V3 collective run then reads the data
from the update table without considering the sequence, and transfers it to the delta queue. Since
this does not retain the sequence in which records were generated, data loaded with this update
type should not go directly into a DSO. This method is used when serialization of data is not required.

Full Repair For DataSources in LO

If there is an issue in the data extracted through a delta load from an LO DataSource and the data has to be
reloaded into BW from R/3, a full repair has to be used. To use the full repair option for LO
DataSources, the setup tables need to be filled from the source tables in SAP R/3. This is required
because delta load data moves directly into the delta queue only, not into the setup tables.

In case it is required to do a full repair for an LO DataSource:

1) Identify the extract structure corresponding to the LO DataSource using transaction code LBWE.

2) Identify the corresponding setup table (the setup table is the extract structure name suffixed by SETUP).

3) Identify the transaction to fill the setup table (depends on the setup table).

4) Identify the selection criteria for filling the setup table and the selection options available in the Full Repair InfoPackage.

5) Delete the setup table using transaction code LBWG and fill it based on the decided selection criteria.

6) Reload the required data from the setup table by executing a Full Repair InfoPackage.

To explain this process, let us consider an example to reload the data for 2LIS_11_VASCL datasource
which uses LO Extractor.
We can check all the active extract structures in transaction LBWE (LO Data Extraction: Customizing
Cockpit).

                                            

Now, the name of the setup table as explained before will be MC11VA0SCLSETUP for the corresponding
datasource.

To move ahead with the reload, first we need to identify the selection criteria to fill the setup table.

Transaction to fill the setup table is OLI*BW, where “*” depends upon the application area. Some
examples are given below:

OLI1BW: Article Movements

OLI2BW: Stocks

OLI3BW: Purchasing documents

OLI7BW: Sales orders

OLI8BW: Deliveries

OLI9BW: Billing documents

In our case it will be OLI7BW.
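The mapping above can be sketched as a simple lookup (illustrative only; the authoritative mapping is the one shown in the LO Cockpit):

```abap
* Sketch: determine the statistical setup transaction
* for a given application component.
DATA: lv_appl  TYPE string VALUE '11',
      lv_tcode TYPE string.

CASE lv_appl.
  WHEN '02'. lv_tcode = 'OLI3BW'.  " Purchasing documents
  WHEN '03'. lv_tcode = 'OLI1BW'.  " Material/article movements
  WHEN '11'. lv_tcode = 'OLI7BW'.  " Sales orders
  WHEN '12'. lv_tcode = 'OLI8BW'.  " Deliveries
  WHEN '13'. lv_tcode = 'OLI9BW'.  " Billing documents
ENDCASE.
WRITE: / 'Setup transaction:', lv_tcode.
```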

So the selection screen has Sales Organization, Company Code and SD Document (sales document
number), which means that only data matching this selection can be filled into the setup tables. If we
need other fields for selection, we must find the corresponding values for these columns in the
application tables before proceeding.

Before filling the setup tables, it is good practice to delete them first. We can use transaction
LBWG to delete the contents of the setup tables. Deleting is not mandatory, but we follow
this practice to maintain consistency, since the setup tables are additive: refilling without deleting would duplicate records.
                                            

In our case we delete the setup table for application component 11 i.e. SD Sales BW.

Before filling the setup table, we must ensure that the system is locked for postings, to avoid losing
records posted during the run.

Now we enter the desired selection criteria and the details in the other fields, where date and time of
termination is the estimated time by which the setup table should be filled, and then execute. On clicking
Continue, the selected records are transferred to the setup table.

                                            

Once we have filled the setup table, we can go to transaction RSA3 and check the number of records in
2LIS_11_VASCL. It should be equal to the number of records we have loaded.

Now, we need to execute a Full Repair Infopackage to load the data from setup table. Below steps define
the complete procedure to execute the Infopackage:

          1) Create/find a Full Repair InfoPackage for the 2LIS_11_VASCL DataSource in the BW system.

         
                                                 

     

          2) From the Scheduler dropdown, select the Repair Full Request option.

                                                   

          3) Check the Indicate Request as Repair Request option.


                                                      

          4) Make sure the Full Update mode is selected in the Update tab.

                                                      

          5) Execute the InfoPackage by clicking Start. We can execute the InfoPackage immediately or schedule it
for a future date as per our requirements.
                                                      

Once the load completes successfully, check the data in the InfoProvider.

Thanks!!

CONFIGURATION FOR OBJECT CHANGEABILITY ACROSS ENVIRONMENTS

In RSA1, under Transport Connection, the object changeability options can be modified.

How to remove NULL columns in a column chart in a DASHBOARD

https://archive.sap.com/discussions/thread/3196262

Deleting or un-releasing transports

Of course, you can.

But you have to change the status of the request:

1. Use transaction SE38 or SA38 and run program RDDIT076
2. Enter your request number and run the program
3. After the request is shown, double-click on it
4. You should be able to change the status from R (released) to D (modifiable)

And now you can delete the transport, for example in transaction SE10...

Comparing a current 0FISCPER interval vs. the same interval of the previous year

Hi Vam.

You can do it like this:


For the periods of the current year, restrict with an interval on 0FISCPER where both limits
are variables: [1st period of current year; current period].

For the periods of the previous year, restrict with an interval where you use
variable offsets: [1st period of current year - 12; current period - 12].

I don't know if there are standard variables for this, but I'd be surprised if the "current
period" variable is not delivered in Business Content... if not, you'll have to create your own exits.

good luck!

Jacob
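The offset idea from this reply can be sketched as follows (illustrative only; a 12-period fiscal year variant is assumed, and the variable names are made up):

```abap
* Sketch: subtracting 12 periods from a 0FISCPER value (format YYYYPPP)
* to reach the same period of the previous year.
DATA: lv_fiscper(7)  TYPE n VALUE '2023004',
      lv_year(4)     TYPE n,
      lv_period(3)   TYPE n,
      lv_previous(7) TYPE n.

lv_year   = lv_fiscper(4).
lv_period = lv_fiscper+4(3).
lv_year   = lv_year - 1.     " an offset of -12 periods = previous year
CONCATENATE lv_year lv_period INTO lv_previous.
* lv_previous = '2022004'
```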

Program to un-release transport requests

Via SE38, run RDDIT076.

Enhancement 2LIS_02_ITM etc Purchasing

Enter via CMOD, exit EXIT_SAPLRSAP_001 and include ZXRSAU01. If the include does not exist, it is
created from CMOD itself (double-click, accepting the warning about the name).

* Fill the company code (BUKRS) from EKPO for purchasing items
DATA: l_s_mc02m_0itm LIKE mc02m_0itm.
DATA: ti_bukrs LIKE ekpo-bukrs,
      l_tabix  LIKE sy-tabix.

CASE i_datasource.
  WHEN '2LIS_02_ITM'.
    LOOP AT c_t_data INTO l_s_mc02m_0itm.
      l_tabix = sy-tabix.               " remember the row index
      CLEAR ti_bukrs.
*     Company code of the purchase order
      SELECT SINGLE bukrs FROM ekpo INTO ti_bukrs
        WHERE ebeln EQ l_s_mc02m_0itm-ebeln.
      l_s_mc02m_0itm-bukrs = ti_bukrs.
      MODIFY c_t_data FROM l_s_mc02m_0itm INDEX l_tabix.
    ENDLOOP.
ENDCASE.
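A possible performance variant (my suggestion, not from the original post): the SELECT SINGLE inside the loop issues one database round trip per row, which can be slow for large data packages. A sketch that reads EKPO once per package instead, reusing the work area and index variable declared in the exit above:

```abap
* Performance variant: buffer EKPO once, then read from memory.
TYPES: BEGIN OF ty_ekpo,
         ebeln TYPE ekpo-ebeln,
         bukrs TYPE ekpo-bukrs,
       END OF ty_ekpo.
DATA: lt_keys TYPE STANDARD TABLE OF ty_ekpo,
      lt_ekpo TYPE STANDARD TABLE OF ty_ekpo,
      ls_ekpo TYPE ty_ekpo.

* Collect the purchase order numbers of this data package
LOOP AT c_t_data INTO l_s_mc02m_0itm.
  ls_ekpo-ebeln = l_s_mc02m_0itm-ebeln.
  APPEND ls_ekpo TO lt_keys.
ENDLOOP.
SORT lt_keys BY ebeln.
DELETE ADJACENT DUPLICATES FROM lt_keys COMPARING ebeln.

* One database access for the whole package
IF lt_keys IS NOT INITIAL.
  SELECT ebeln bukrs FROM ekpo
    INTO TABLE lt_ekpo
    FOR ALL ENTRIES IN lt_keys
    WHERE ebeln = lt_keys-ebeln.
  SORT lt_ekpo BY ebeln.
ENDIF.

* Fill the company code from the buffered table
LOOP AT c_t_data INTO l_s_mc02m_0itm.
  l_tabix = sy-tabix.
  READ TABLE lt_ekpo INTO ls_ekpo
       WITH KEY ebeln = l_s_mc02m_0itm-ebeln BINARY SEARCH.
  IF sy-subrc = 0.
    l_s_mc02m_0itm-bukrs = ls_ekpo-bukrs.
    MODIFY c_t_data FROM l_s_mc02m_0itm INDEX l_tabix.
  ENDIF.
ENDLOOP.
```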

Converting 0FISCPER to 0CALMONTH

As a formula:

FISCPER_CALMONTH( Fiscal year/period, Fiscal Year Variant, Fiscal year, 0 )

Or in a routine:

DATA: cal TYPE /bic/oi0calmonth.

* Use this function module to convert 0FISCPER to 0CALMONTH.
CALL FUNCTION 'UMC_FISCPER_TO_CALMONTH'
  EXPORTING
    i_periv           = trans_structure-/bic/0fiscvarnt
    i_fiscper         = trans_structure-/bic/0fiscper
    i_calmonth_ichanm = ''
  IMPORTING
    es_calmonth       = cal
  EXCEPTIONS
    date_invalid      = 1
    OTHERS            = 2.

result = cal.
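For fiscal year variants where the periods map 1:1 to calendar months (e.g. K4), the conversion can also be done with plain string handling; this is only a sketch with made-up names:

```abap
* Sketch: 0FISCPER (YYYYPPP) -> 0CALMONTH (YYYYMM),
* assuming periods equal calendar months.
DATA: lv_fiscper(7)  TYPE n VALUE '2023004',
      lv_calmonth(6) TYPE n.

CONCATENATE lv_fiscper(4) lv_fiscper+5(2) INTO lv_calmonth.
* lv_calmonth = '202304'
```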

Broadcasting and user ID changes

https://blogs.sap.com/2014/01/18/user-id-changes-in-sap-bw/

Information Broadcasting:

When a setting for Information Broadcasting is created, a background job with the user
name is created and ABAP program RSRD_BROADCAST_STARTER is triggered under
the same user name. This program reads the authorization user from
table RSRD_SETT_NODE_A.

Because of this, we have to change the user in the background job as well as the
authorization user. The background user can be changed using TBTCP, and the authorization user
can be changed in table RSRD_SETT_NODE_A.

Before changing the user IDs, we have to implement the changes mentioned above so that BW
users are not affected after their IDs are changed. We will have to design ABAP programs
accordingly so that the required changes can be made in the standard tables mentioned above.
The logic that needs to be implemented is discussed below.
