Demantra Solutions
Future Demantra Solutions Advisor Webcasts!!
Don't miss out.. Register today!!
Wednesday, May 6, 2009 - Troubleshooting Techniques for Demantra Data Load Issues (Note 791239.1)
Wednesday, June 3, 2009 - The Key to Optimizing Demantra Worksheet Performance (Note 797933.1)
Teleconference Access:
North America: xxxx
International: xxxx
Password: Advisor
1-866-627-3315
• A dialog box will open. Enter your question and select “Send.”
• During the Q&A session your question will be read and an answer will
follow.
Thank you.
© 2009 Oracle Corporation – Proprietary and Confidential
Safe Harbor Statement
Agenda
Please note that all three files are available in the Demantra Solutions Metalink Forum
The material presented below is of a technical nature. Little attention has been
given to functional navigation or functional demonstrations. The vast majority of
data load issues are investigated and solved using SQL. To that end, this presentation
focuses almost exclusively on problem investigation and resolution.
This material has been assembled from hundreds of bugs in which DEV gave us techniques
to drill into source/destination data.
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
* These are the T_SRC_% and error tables. ep_load_main procedure.
2. Moving data from the collection staging tables into Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
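A quick way to see what actually landed at step 1 is to count the staging and error tables; a sketch using the T_SRC_% table names from the standard model:

```sql
-- Row counts for the collection staging tables and their error tables
SELECT 'items' AS tbl, COUNT(*) AS staged FROM t_src_item_tmpl
UNION ALL
SELECT 'locations', COUNT(*) FROM t_src_loc_tmpl
UNION ALL
SELECT 'sales', COUNT(*) FROM t_src_sales_tmpl
UNION ALL
SELECT 'sales_err', COUNT(*) FROM t_src_sales_tmpl_err;
```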
next
================================
Creating a new series through a data model upgrade from the Business Modeler, bringing in
the data through custom hooks and ep_load.
Please see:
next
================================
Please set the profile MSD_DEM: Debug Mode to Yes, then run the
shipment and booking history program again and provide the following:
1. Log files and Output Files of all the concurrent programs launched.
2. Trace file and the tkprof output of the trace file for the DB session.
Also upload the modified custom hooks package spec and body.
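If you need the DB session trace yourself rather than via the profile, one way is the standard Oracle DBMS_MONITOR API (the session_id/serial_num values below are placeholders you must look up in v$session):

```sql
-- Enable extended SQL trace (waits and binds) for the loading session;
-- sid/serial# are placeholders from v$session
BEGIN
  DBMS_MONITOR.session_trace_enable(session_id => 123,
                                    serial_num => 456,
                                    waits      => TRUE,
                                    binds      => TRUE);
END;
```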
next
================================
- You do not want the demand class item level to be imported to Demantra.
- You want to update the demand class column in the item staging table to N/A.
- To accomplish this, you can modify the sales history custom hook package
on the EBS side so that it is executed during EBS collections.
- The SQL statement inserted into the custom hook program is below.
----
UPDATE MSDEM.t_src_sales_tmpl
SET EBS_DEMAND_CLASS_SR_PK = '-777', EBS_DEMAND_CLASS_CODE = '0';
COMMIT;
----
- After running the Shipping/Booking History Collections, the SQL statement will
update the T_SRC_SALES_TMPL table. All other custom hooks placed
for updating T_SRC_ITEMS_TMPL and T_SRC_LOC_TMPL will also be executed.
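After the collection run, the effect of the hook can be verified directly, using the schema and column names from the statement above:

```sql
-- All rows should now carry the overridden demand class values
SELECT ebs_demand_class_sr_pk, ebs_demand_class_code, COUNT(*)
  FROM msdem.t_src_sales_tmpl
 GROUP BY ebs_demand_class_sr_pk, ebs_demand_class_code;
```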
next
================================
SLC:M1:Seattle Manufacturing
- Computer Service:1006:Chattanooga (OPS):Vision Operations -> CN974444
5. Rerun the worksheet and see that the old value is shown instead of the value entered.
6. Close and reopen the worksheet - "Forecast Dependent Demand Override" still shows the old value.
8. See that the new value finally shows up in "Forecast Dependent Demand Override".
2. Therefore, the solution will be: add the GL Data table LUD column (e.g. -
t_ep_cto_data_lud) into the POPU Sql in the Expression which will check
if a combination was changed.
4. The solution will be: Do not clear the Sub Comb & Combination Maps, but
only synch them with the latest requested Sub Combs & Combinations.
next
================================
I installed the latest build on my local environment, but I still have a problem with
the worksheet.
When I update a base model in the "zzz. CTO: My BOM view", the "parent demand" series
should change when the update hook runs.
2) In the page level browser, click on APAC, then APAC Site 1, and then D530
3) For the row item 10/27/2008 | D530 | D530, put the number 22 in the "Base
Override" Series. Press update.
4) Wait 10 seconds and hit refresh. Notice that the "parent demand" series
has not changed. Several of the values in this series should have the number 22
as a value.
You should see several records that have the value 22 for the parent
6) Close the worksheet, then reopen it. Only now will the value 22
appear in the "parent demand" series.
Customer Solution
-----------------
- The problem here is in the Update Hook code. Upon examination of the CALC_WORKSHEET
procedure I noticed that you update the LAST_UPDATE_DATE column in the T_EP_CTO_DATA table.
- While running the worksheet we do not check the LAST_UPDATE_DATE column in the
T_EP_CTO_DATA table, but the LUD columns in the T_EP_CTO_MATRIX table.
- This is due to a new feature named 'GL_MATRIX Support', which was added in version 7.3.
- Each GL Matrix table holds the LUD columns of all the levels that belong to the current
GL (General Level):
* Plus the LAST_UPDATE_DATE column of this table
* Plus the LUD column of the GL DATA table (T_EP_CTO_DATA).
After adjusting the update, the customer worksheet displayed correct results.
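The kind of adjustment implied above can be sketched as follows; the join key is an assumption, not the customer's actual code, since the real matrix keys are schema-specific:

```sql
-- Hypothetical hook adjustment: stamp the LUD column that the worksheet
-- actually checks (T_EP_CTO_MATRIX), not only T_EP_CTO_DATA.
-- The join key below is an assumption.
UPDATE t_ep_cto_matrix m
   SET m.t_ep_cto_data_lud = SYSDATE
 WHERE EXISTS (SELECT 1
                 FROM t_ep_cto_data d
                WHERE d.t_ep_cto_data_ep_id = m.t_ep_cto_data_ep_id);
```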
next
================================================
Object MSD_DEM_QUERIES
* I have little information regarding the table that stores the dynamic SQL used for Demantra.
If there is interest, we can speak with DEV and develop a white paper geared to diagnostics.
next
================================================
I have seen numerous errors related to which source instance is being
collected.
Check the t_src_loc_tmpl (or other staging table) table to determine which instances and
organizations are being collected.
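A distinct listing over the staging table shows exactly which instances and organizations arrived; the column names below are assumptions and should be adjusted to your staging table layout:

```sql
-- Which source instances and organizations were actually collected?
-- (column names assumed; check DESC t_src_loc_tmpl first)
SELECT DISTINCT sr_instance_code, dm_org_code
  FROM t_src_loc_tmpl;
```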
next
================================
There have been a few issues reported concerning the publishing of the Demantra forecast to source. Here
are a few pointers:
The workflow has completed without error, yet there are no rows in the table
MSD_DP_SCN_ENTRIES_DENORM.
next
================================
1. Item-Org
2. Item - Customer-customer site
3. Item - Zone
When we publish the constrained forecast from SNO, we publish all these 3 types.
ASCP handles these 3 types of constrained forecasts and plans successfully.
next
================================
Please set the profile MSD_DEM: Debug Mode to Yes, then run the
shipment and booking history program again and provide the following:
1. Log files and Output Files of all the concurrent programs launched.
2. Trace file and the tkprof output of the trace file for the DB session.
next
================================================
- We are assembling a note that will cover install procedures for R12 and Demantra 7.2.0.2.
next
================================================
Wrap Up
-------
Please reference the following notes available on Metalink or email jeffery.goulette@oracle.com
if the article is being moderated. These are available in a .zip file.
==================================
Data Loading into Demantra EP_Load
==================================
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from the engine back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will not focus on the tools, which appear to be intuitive, but instead discuss methods of investigation
to identify, explain and fix the load result.
next
================================
EP_LOAD download procedures are used for booking history streams and level members.
- For example, the EP_LOAD procedures are used to load booking history by
organization-site-sales channel and item-demand class into staging tables.
- If the Download Now check box was not selected during the collections
process, run EP_LOAD and Import Integration Profiles to move data from the
staging tables into the Oracle Demantra Demand Management schema.
next
================================
Launch EP LOAD.
- Historical information and level data are imported into Oracle Demantra via
the EP_LOAD procedure.
- All other series data are imported into Oracle Demantra via Import Integration Profiles.
An assumption of the EP LOAD procedure is that the series are populated into the Oracle
Demantra staging tables before the load process begins.
- To ensure this occurs, the collection programs for all eight historical series have been
merged so that these streams are always collected simultaneously.
next
================================
For members and history series, which are downloaded via the EP_LOAD mechanism,
the mode of update is:
- If a member has been deleted in the E-Business Suite source, it stays in Oracle
Demantra along with all series data for combinations that include the member.
* The administrative user must manually delete the member in Oracle Demantra.
- Series data in the staging area overwrites the series data in Oracle Demantra, for the
combinations that are represented in the staging area.
- Series data in Oracle Demantra for combinations that are not in the staging area are
left unchanged.
- All series data in Oracle Demantra, for all combinations, are set to null before the
download actions take place.
* There are a total of three EP_LOAD workflows, one for each of the following:
- Item members
- Location members
- Shipment and Booking History
* We are not covering the EP_LOAD_MAIN procedure, which loads data into the data
model from staging tables or from files, according to your choices as set up in the Data Model Wizard.
next
================================
EP_LOAD process. All future information (that is, forecast data) is loaded using
integration profiles or other loading mechanisms. This mechanism controls the dates
marked as end of history for the Forecasting Engine and the Collaborator Workbench.
With the addition of the MaxSalesGen parameter, you can now use the EP_LOAD
process to load future dates into Demantra. This parameter determines how data after
the end of history is populated.
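The current value can be read from the SYS_PARAMS table; the column names below are assumptions:

```sql
-- Inspect EP_LOAD-related parameters (sys_params column names are assumed)
SELECT pname, pval
  FROM sys_params
 WHERE pname IN ('MaxSalesGen', 'accumulatedOrUpdate');
```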
next
================================
T_SRC_ITEM_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to a
unique item entity based on all lowest item levels in the model.
T_SRC_LOC_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to a
unique location entity based on all lowest location levels in the model.
T_SRC_SALES_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to
sales data for a given item and location combination, based on all lowest item levels in
the model.
next
================================
This pre-seeded WF should include all required steps to enable a successful
download of items/location/sales data into Demantra tables. Currently the WF
runs the following processes:
EP_LOAD_ITEMS
EP_LOAD_LOCATION
EP_LOAD_SALES
However, in order to complete the loading process you may need to run the MDP_ADD
procedure to make sure new combinations are added into MDP_MATRIX.
next
================================
Technical Investigation
-----------------------
On 7.2.0.1 in Production:
We find that after running ep_load_sales, the procedure completed without any error
message, but there are no records in the sales_data table.
EXPECTED BEHAVIOR
-----------------------------------
We expect data to be loaded
Steps To Reproduce:
1. load data into t_src_sales_tmpl
2. run ep_load_sales
3. check integ_status
4. check t_src_sales_tmpl_err
5. check sales_data
We have just one record in t_src_sales_tmpl. After running ep_load_sales, the procedure
completed without any error message, and there is no error row in t_src_sales_tmpl_err.
Upon checking sales_data and integ_status, there are no rows in these tables.
Debugging Steps:
2. Verify that there is no 'new' data. The existing row could have simply been updated.
* You would need to change the value coming from the source to successfully load without
manipulating staging table data.
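Counts over the tables named in the steps above show where the record stopped:

```sql
-- How many rows are waiting in staging, were rejected, or landed?
SELECT COUNT(*) FROM t_src_sales_tmpl;       -- rows staged from source
SELECT COUNT(*) FROM t_src_sales_tmpl_err;   -- rows rejected by ep_load_sales
SELECT COUNT(*) FROM integ_status;           -- load bookkeeping rows
SELECT COUNT(*) FROM sales_data;             -- rows that reached the data store
```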
next
================================
After launching Collection, Collect Shipment and Booking history with the
parameter 'EP_LOAD'=N and running ep_load after collection, the
actual_quantity loaded into sales_data is different from the actual_qty in
t_src_sales_tmpl.
5. Verified the data for item 130-0290-910 in the bucket starting 05/26/2008 = 18333
- Result: 18333 <----
Debugging
---------
Please check the SYS_PARAMS 'accumulatedOrUpdate' setting:
'update' or 'accumulate'
EP_LOAD_SALES should aggregate the sales in the T_SRC_SALES table by combination and
sales date.
- accumulatedOrUpdate = update
If the sale already exists, the actual_quantity value should be replaced.
- accumulatedOrUpdate = accumulate
Should add the new value to any existing values (provided the series is set
as proportional).
Debugging Step 2
----------------
This is one method of comparing source data to mdp_matrix. In this case we
are verifying the sums for a given date range / item.
If there is a difference, the sql can be adjusted to weed out extra rows or
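One comparison of this shape can surface the delta; this is a sketch, with the item code and date range as placeholders and the column names taken from queries elsewhere in this note:

```sql
-- Compare staged vs. loaded quantity for one item over a date range
-- (item code and dates are placeholders; adjust to your data)
SELECT 'staging' AS src, SUM(actual_qty) AS qty
  FROM t_src_sales_tmpl
 WHERE dm_item_code = '130-0290-910'
   AND sales_date BETWEEN DATE '2008-05-26' AND DATE '2008-06-01'
UNION ALL
SELECT 'loaded', SUM(sd.actual_quantity)
  FROM sales_data sd
 WHERE sd.item_id IN (SELECT i.item_id
                        FROM items i, t_ep_item t
                       WHERE i.t_ep_item_ep_id = t.t_ep_item_ep_id
                         AND t.item = '130-0290-910')
   AND sd.sales_date BETWEEN DATE '2008-05-26' AND DATE '2008-06-01';
```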
next
================================
I created an integration interface, Variable Cost, and a workflow with a transfer step
to load data for that integration interface. When the staging table for the integration
interface is populated with data and the workflow is run, the errored-out records move
to the err table and the correct records vanish from the staging table, but nothing is
populated in Demantra internal tables.
If all the records in the staging table are correct, then the data is
populated into the base tables of Demantra.
Investigation
-------------
The staging table (integ_inf_var_cost) seems to contain dirty data. By this
I mean that in addition to missing members (that are being handled by the
system and can be noted in the error table), some of the combinations do not
have valid item-location combination in the MDP_MATRIX table. Hence, the
update process had failed to find any valid rows to update.
For any error that is found, a row in the error table indicates and holds
information about the problem.
In v7.1.1 validation DID NOT include population validation (i.e., that the
specified combination indeed exists in MDP_MATRIX). This feature was added
in the v7.2.0 release.
Nonetheless, please find the following SQL statements; either one will reveal those notorious
combinations in the staging table:
SELECT i.*
FROM integ_inf_var_cost i,
t_ep_e1_it_br_cat_3 e,
t_ep_region r,
t_ep_finproductgroup f
WHERE i.level1 = e.e1_it_br_cat_3
AND i.level2 = r.region
AND i.level3 = f.finproductgroup
AND NOT EXISTS (
SELECT *
FROM mdp_matrix m
WHERE m.t_ep_e1_it_br_cat_3_ep_id = e.t_ep_e1_it_br_cat_3_ep_id
AND m.t_ep_region_ep_id = r.t_ep_region_ep_id
AND m.t_ep_finproductgroup_ep_id = f.t_ep_finproductgroup_ep_id);
next
================================
This is a sound approach to test the complete collection and presentation of
data to MDP_MATRIX and Demantra.
-- Reset
TRUNCATE TABLE integ_inf_var_cost_err;
TRUNCATE TABLE integ_inf_var_cost;
UPDATE mdp_matrix
-- Checked that all data from the staging table was handled
SELECT COUNT (*)
FROM integ_inf_var_cost;
b) Verified that the mdp_matrix table had new rows using
the following:
next
================================
An export file which contains your data model can be exported by running
the Business Modeler and selecting the menu "Data Model --> Open Data
Model". Then select the active model (it is yellow) and click on the
"Export" button. Then specify the location to save the *.dmw file.
next
================================
1. Load a single record (for the purpose of this test case) into each of
t_src_item_tmpl,t_src_loc_tmpl and t_src_sales_tmpl
3. Look for errors in the %_err tables and find that t_src_sales_tmpl_err has an
error and t_src_sales_tmpl is empty
Determined Cause:
When loading SALES_DATA all of the levels must match existing levels in
the hierarchy, not just the item code. This is ensured by a large JOIN near
the beginning of the EP_LOAD_SALES procedure.
We have also seen this strange result as a product of bug 6520853, "EP_LOAD_ITEMS DOESN'T
LOAD ITEM AND EP_CHECK_ITEMS DOESN'T GIVE ERROR", which occurred when the case of the
item code did not match.
next
================================
After running the EP_Load procedure the worksheet does not show the data.
The TO_DATE & FROM_DATE columns are not populated.
Steps to Reproduce:
1. Run Build Model.
- My results are that the levels, members and data exist in the data base.
- The levels and members are seen in the worksheet wizard and can be selected.
- The levels, members and data are not shown in the worksheet.
Update mdp_matrix from_date and until_date based on the table created above:
UPDATE mdp_matrix m
set (from_date,until_date) = (
select from_date,until_date
from temp_matrix_dates WHERE item_id = m.item_id
AND location_id = m.location_id)
WHERE item_id = m.item_id
AND location_id = m.location_id;
This will update the date columns in mdp_matrix for all combinations not just
the ones you loaded. Use the following SQL to identify which combinations in
MDP_MATRIX are missing the date values.
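To touch only the combinations you loaded, the update can be restricted with an EXISTS clause; a sketch assuming the same temp_matrix_dates table from the step above:

```sql
-- Update only combinations present in temp_matrix_dates,
-- leaving all other mdp_matrix rows untouched
UPDATE mdp_matrix m
   SET (from_date, until_date) =
       (SELECT t.from_date, t.until_date
          FROM temp_matrix_dates t
         WHERE t.item_id = m.item_id
           AND t.location_id = m.location_id)
 WHERE EXISTS (SELECT 1
                 FROM temp_matrix_dates t
                WHERE t.item_id = m.item_id
                  AND t.location_id = m.location_id);
```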
next
================================
The following investigation can be used to verify that data from the source actually
loads into the Demantra tables. The item hhm-1 is missing in the Demantra sales_data table
but there were no errors. This customer is using .ctl files to load the dmtra_template
schema. You can change the default schema name; see note 551455.1.
Following are the steps followed and the details of the issue.
Load your data into flat files matching the .ctl and perform the following:
1. Performed booking and shipping history collections with the Auto download option.
2. Found that the data is not collected into Demantra. This was confirmed after
verifying the data is not present in the worksheets.
3. Performed the EBS full download workflow. This collected the location data into the staging table.
select *
from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) = 'hse-1';
4. Then performed booking and shipping history collections with auto download
5. SELECT DISTINCT
dm_item_code,
dm_org_code,
dm_site_code,
t_ep_lr1,
t_ep_ls1,
t_ep_p1,
ebs_demand_class_code,
ebs_sales_channel_code,
aggre_sd -- filter column
FROM ep_T_SRC_SALES_TMPL_ld
WHERE ep_T_SRC_SALES_TMPL_ld.actual_qty IS NOT NULL
and dm_item_code ='HHM-1'
ORDER BY
dm_item_code, dm_org_code, dm_site_code, t_ep_lr1, t_ep_ls1, t_ep_p1,
ebs_demand_class_code, ebs_sales_channel_code, aggre_sd;
6. SELECT ITEMS.item_id,LOCATION.location_id
FROM ITEMS, LOCATION,
t_ep_item,
t_ep_organization,
t_ep_site,
t_ep_lr1,
t_ep_ls1,
t_ep_p1,
t_ep_ebs_demand_class,
t_ep_ebs_sales_ch
WHERE 1 = 1
AND items.t_ep_item_ep_id = t_ep_item.t_ep_item_ep_id
AND t_ep_item.item = 'HHM-1'
AND location.t_ep_organization_ep_id = t_ep_organization.t_ep_organization_ep_id
AND t_ep_organization.organization = 'TST:M1'
AND location.t_ep_site_ep_id = t_ep_site.t_ep_site_ep_id
AND t_ep_site.site = 'ABC Corporation Americas:2637:7233:Vision Operations'
AND location.t_ep_lr1_ep_id = t_ep_lr1.t_ep_lr1_ep_id
AND t_ep_lr1.lr1 = 'N/A'
AND location.t_ep_ls1_ep_id = t_ep_ls1.t_ep_ls1_ep_id
AND t_ep_ls1.ls1 = 'N/A'
AND items.t_ep_p1_ep_id = t_ep_p1.t_ep_p1_ep_id
AND t_ep_p1.p1 = 'N/A'
AND items.t_ep_ebs_demand_class_ep_id = t_ep_ebs_demand_class.t_ep_ebs_demand_class_ep_id
AND t_ep_ebs_demand_class.ebs_demand_class = '0'
AND location.t_ep_ebs_sales_ch_ep_id = t_ep_ebs_sales_ch.t_ep_ebs_sales_ch_ep_id
AND t_ep_ebs_sales_ch.ebs_sales_ch = 'Direct';
7. What you now see is only the 2 rows inserted / updated into SALES_DATA
because the 'aggre_sd' date is used as a filter from the actual_qty NOT NULL select.
SELECT COUNT(*)
from sales_data
where item_id = 565
and location_id = 607
and TRUNC(sales_date) = TO_DATE('05-FEB-07','DD-MON-RR')
* The actual_qty is NOT NULL list is aggregated by the aggre_sd filter date and the rows
are inserted / updated in SALES_DATA.
* The quantity column values are averaged across all the distinct rows, including the
NOT NULL actual_qty rows, by 'aggre_sd'.
From T_SRC_SALES_TMPL:
Total rows: 12,476
Distinct IDs with actual_qty NULL: 807 <-- These are the problem rows
Distinct IDs incl. aggre_sd and actual_qty NOT NULL: 11,669
Why are there 807 problem rows? Why are they not in the err table?
select i.t_ep_item_ep_id,
i.item_id,t.item,
i.is_fictive
from dmtra_template.items i, t_ep_item t
where lower(item) ='hse-1'
and i.t_ep_item_ep_id = t.t_ep_item_ep_id
order by 1 ;
The above will deliver the item_id to be used to verify that the data is in
the sales_data table:
It is unclear why ep_load_sales neither loads the sales nor inserts the errored rows
into the error tables.
SELECT *
FROM dmtra_template.t_ep_item
WHERE item = 'HSE-1';
Output:t_ep_item_ep_id = 188
11. What are the actual_qty values and sales_date in the source table?
How many rows where the actual_qty values are NULL in the source table?
12. SELECT *
FROM dmtra_template.items
WHERE t_ep_item_ep_id = 188;
Output :
item_id demand_class
------- ----------------------------------
654 Unassicaoted
671 Australia Sales Region
727 East US Sales Region
823 West US Sales Region
13. What is the corresponding item id values from the ITEMS table?
As you can see where quantity is not null there is data with correct
corresponding values and dates in SALES_DATA.
Expected Behaviour
-------------------
We should be able to see the data for history for these series as well, even
when the actual_quantity IS NULL:
Booking History - booked items - booked date,
Booking History - requested items - booked date,
Booking History - booked items - requested date,
Booking History - requested items - requested date,
Shipment History - shipped items - shipped date,
Shipment History - shipped items - requested date,
Shipment History - requested items - requested date
next
================================================
2. Collect Data and Download to Three Staging Tables. See Implementation Guide.
* EP_LOAD
* Import Integration Profiles
4. Generate forecasts
next
================================================
We will not focus on the tools, which appear to be intuitive, but instead discuss methods of investigation
to identify, explain and fix the load result.
Load Methods
------------
- Integration Interface Wizard
- Data Model Wizard
- Demantra Import Tool
- SQL*Loader
next
================================================
The following table summarizes the core Demantra import and export tools:
next
================================================
The core Demantra tools allow you to do the following:
• Import lowest-level item, location, and sales data
• Import or export series data at any aggregation level, with optional filtering
• Import promotions and promotional series
• Export members of any aggregation level
• Import supplementary data into supporting tables as needed
next
================================================
Object Manipulation
-------------------
The Demantra Business Modeler contains many useful tools to perform maintenance and
development tasks.
- Create Table
- Alter Table
- Recompile
- View the Procedure Error Log
- Cleanup Demantra temporary tables
- Oracle Sessions Monitor
- Please see user guide for complete list
next
================================================
Oracle Demantra Workflows
- EBS Full Download: Downloads items, locations, and shipment and booking history.
- EBS Return History Download: Downloads return history
- EBS Price List Download: Downloads Price Lists
next
================================================
WF Technical
------------
Monitor the WF for errors:
I have errors in the collaborator.log file resulting from the "Download Plan Scenario Data"
workflow.
- They were caused by the "Notify" email step in the workflow; these errors were being
generated because the Demantra environment was not configured with the proper details for sending
email notifications to users.
- These errors might prevent the workflow from completing the rest of the steps in the flow.
- The User Guide explains proper setup quite well.
Additional WF Checks
--------------------
1. Is there more than a single instance of the Demantra application
running (i.e., with different context roots, on different machines, etc.)?
2. What is the name of the problematic workflow? Is there another workflow schema by that name?
3. How were schemas removed? Using Demantra application or directly from
the database (using the DELETE statement)? Use the application whenever possible.
Poor WF Performance?
--------------------
- Ask yourself, is this necessary data? While historic data is important, not all historic
data needs to be included.
- Are most of the quantities in these records valid quantities?
- Have you considered changing the plan settings to limit out quantities that
are below a threshold?
- Can older scenario revision data be deleted if it's no longer required?
- What are the output levels (including time) of the scenarios?
SELECT scenario_id, COUNT(*)
FROM msd_dp_scn_entries_denorm
WHERE demand_plan_id = 6031
GROUP BY scenario_id;
next
================================================
For performance reasons, the updates are not executed immediately; they are
accumulated and, once the configured limit is exceeded, written
as a chunk or block.
* defined in the ImportBlockSize property in appserver.properties file
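The property lives in appserver.properties; a fragment of this shape (the value shown is illustrative, not a recommended setting):

```properties
# Number of accumulated updates written per chunk (illustrative value)
ImportBlockSize=1000
```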
next
================================================
When fully configured, Demantra imports the following data, at a minimum, from your
enterprise systems:
- Location data, which describes each location to which you sell or ship your products.
- Sales history, which describes each sale made at each location. Specifically this
includes the items sold and the quantity of those items, in each sale.
next
================================================
Functional Considerations
-------------------------
There are many functional setup issues relating directly to successful data load and execution.
For example, the MSC:Organization containing generic BOM for forecast explosion profile
can be set to one of the organizations at the site level. This will limit the entry of rows into
MSD_DP_SCENARIO_ENTRIES and MSD_DP_SCN_ENTRIES_DENORM.
Changing the profile to the Global Item Master Organization will make the records available for loading.
next
================================================
Imported Data
-------------
You can collect internal sales orders. They appear in the customer dimension; the customer
appears as the organization code.
Seeded collections from EBS into Oracle Demantra Demand Management include:
- Shipment history, booking history, and returns history
- Manufacturing and fiscal calendars
- Price lists, currencies, and currency conversion factors
- Dimensions, levels, hierarchies, and level values for demand analysis.
- Customer class: Site > Account > Customer > Customer class
- Business group: Organization > Operating unit > Business group
- Sales channel
next
================================================
Make a note of the names of your tables, as displayed within the Integration Interface Wizard.
(not a complete list)
- biio_supply_plans
- biio_supply_plans_pop
- biio_other_plan_data
- biio_PURGE_PLAN
- biio_scenario_resources
Troubleshooting
---------------
Look to the logs
- The _Err table will contain the rows that were not successfully added to the staging tables.
NOTE: it is advised that you review the contents of the following for errored rows:
- UPDATE_BATCH_TRAIL
- UPDATE_BATCH_VALUES
- UPDATE_BATCH_TRAIL_ERR
- UPDATE_BATCH_VALUES_ERR
Please run the following script in the demantra schema to verify data
---------------------------------------------------------------------
begin
apps.msd_dem_sop.load_plan_data (:supply_plan_id);
end;
The parameter to the procedure is the supply_plan_id column value from the
supply_plan table. The correct ID is for the supply plan whose data is being
loaded from ASCP to Demantra. Use SQL to obtain it.
For example, if the supply plan id for plan 'ASCP-DEM' is 134, the script should be run as:
begin
apps.msd_dem_sop.load_plan_data (134);
end;
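The ID can be obtained with a lookup of this shape; the plan-name column is an assumption, so check the supply_plan table layout first:

```sql
-- Find the supply_plan_id for the plan being published
-- (plan_name column is an assumption)
SELECT supply_plan_id, plan_name
  FROM supply_plan
 WHERE plan_name = 'ASCP-DEM';
```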
If you must log an SR, please enable trace for the session and provide the tkprof
output of the trace file after the execution.
Also provide the row count of the following tables in Demantra schema after
running the script:
1. biio_other_plan_data
2. biio_resource_capacity
3. biio_supply_plans
4. biio_supply_plans_pop
5. biio_scenario_resources
6. biio_scenario_resource_pop
7. biio_resources
8. t_src_item_tmpl
9. t_src_loc_tmpl
10. BIIO_PURGE_PLAN
next
================================================
T_SRC_LOC_TMPL,
T_SRC_SALES_TMPL,
TABLE_NAME,
WF_GROUPS_SCHEMAS,
WF_PROCESS_LOG,
WF_SCHEDULER,
WF_SCHEMAS,
WF_SCHEMA_GROUPS,
WF_SCHEMA_TEMPLATES.
next
================================================
Parameter checks used for source data load and EP Load. Be familiar
with the contents of the following tables:
reaches a sleep period limit > 128 seconds (current default). This gives
a total attempt time of 255 seconds. This limit can be changed via the
new parameter in DB_PARAMS 'check_and_drop_sleep_limit'.
- aps_params contains the password the Business Modeler should use to connect to
the database, as well as other important operational settings.
next
================================================
Scenario: Shipment and booking history are not completely collected when
we establish a new schema. Though the collections for shipment and booking
history is successful with no error or warning message, we could not see the
items in Demantra. If there is unclean data in t_src_item_tmpl and
t_src_loc_tmpl tables, EBS full download moves the unclean data from these
tables to the error table. This works fine but the process is not moving
clean data to the Demantra base tables.
* You will need to find your item_id organization_id as well as other data required
to run the following sql.
Steps to Reproduce:
1. Performed booking and shipping history collections with the Auto download option.
2. Found that the data is not collected into Demantra. This was confirmed after
verifying the data in worksheets.
3. Verified that this issue is related to locations via simple SQL checks.
4. Performed the EBS full download workflow. This collected the location data into the
staging table, but not the sales data, since last time the location data was not collected.
Hence nothing in the staging table.
5. Then performed booking and shipping history collections with auto download. Still
the sales data was not collected into Demantra. (not sure why the customer did this)
Step 1
------
The table INTEG_STATUS shows that when DMTRA_TEMPLATE runs the EP_LOAD
it succeeds, but when APPS runs it, it fails. This could be a permissions issue.
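To compare the two runs, INTEG_STATUS can be queried directly; a minimal sketch (no column names are assumed, adjust the ordering to your schema):

```sql
-- Recent EP_LOAD runs; compare the rows written under DMTRA_TEMPLATE vs. APPS
select *
from integ_status
order by 1;
```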
Step 2
------
select * from dmtra_template.sales_data where item_id in (654,671,727,823);
Step 3
------
Confirmed that items HSE-1, HSE-2, HSE-3, HSE-4, and HSE-5 are in the staging tables:
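A check along these lines can confirm that (a sketch; the dm_item_code column name is an assumption, adjust to your staging template):

```sql
-- Count the staged rows per item (dm_item_code is an assumed column name)
select upper(dm_item_code) as item, count(*)
from t_src_item_tmpl
where upper(dm_item_code) in ('HSE-1', 'HSE-2', 'HSE-3', 'HSE-4', 'HSE-5')
group by upper(dm_item_code);
```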
Step 4
------
Also confirmed items are loaded into the system:
select i.t_ep_item_ep_id, i.item_id, t.item, i.is_fictive
from dmtra_template.items i, dmtra_template.t_ep_item t
where lower(t.item) in ('hse-1', 'hse-2', 'hse-3', 'hse-4', 'hse-5')
and i.t_ep_item_ep_id = t.t_ep_item_ep_id
order by 1;
Step 5
------
Launched complete refresh collections for Shipment and booking history with
Auto-download set to 'yes'.
Once the above data is available, you should be able to determine what is missing.
In this case the customer was missing location data; location is one of the major keys.
next
================================================
Demand Class
------------
The customer noted that the load was successful; however, sales orders were
missing.
For example, the demand class code '1-WLKLE' exists in the order lines, but it
is not present in the lookup 'DEMAND_CLASS'. The following query returns zero
rows:
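A sketch of such a check, assuming EBS demand classes live in the seeded FND_COMMON_LOOKUPS lookup type 'DEMAND_CLASS' (verify the lookup table for your release):

```sql
-- Demand class codes present on order lines but absent from the lookup
select distinct ol.demand_class_code
from oe_order_lines_all ol
where ol.demand_class_code is not null
and not exists (
  select 1
  from fnd_common_lookups l
  where l.lookup_type = 'DEMAND_CLASS'
  and l.lookup_code = ol.demand_class_code
);
```

Any rows returned are demand classes that will not collect correctly until the lookup is brought back in sync.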
As a temporary workaround, you could modify the shipment history query so that it
does not bring in the demand class code when the demand class is missing from the
lookup 'DEMAND_CLASS'.
The root cause is demand classes missing from the lookup. If the demand classes
in the lookup 'DEMAND_CLASS' and in the order lines are in sync with each other,
the sales history quantities will be loaded correctly into the sales_data table.
You must investigate why the demand classes are missing from the lookup
DEMAND_CLASS while present in the order lines; these need to be in sync.
next
================================================
1. Verify that the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL' are set
to the correct values. Also make sure the Demantra URL is up and accessible.
2. Set the profile 'MSD_DEM: Debug Mode' to 'Yes'. Launch collection with
'Launch Download' set to 'Yes'.
3. Upload log as well as output files for the 'Launch EP LOAD' stage.
4. Also provide values for the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL'.
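The profile values can also be read from the database; a sketch using the standard EBS profile tables (site-level values only):

```sql
-- Current site-level values of the MSD_DEM profiles
select o.profile_option_name, v.profile_option_value
from fnd_profile_options o, fnd_profile_option_values v
where o.profile_option_id = v.profile_option_id
and o.profile_option_name in ('MSD_DEM_SCHEMA', 'MSD_DEM_HOST_URL')
and v.level_id = 10001;  -- 10001 = site level
```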
next
================================================
Version Control
- Downgrading is not supported. The schema upgrade mechanism only knows how to
rev forward, not backward.
next
================================================
EBS Data Load
Please note that all successfully loaded rows are loaded into the
MSD_DP_SCN_ENTRIES_DENORM table. The MSD_DP_SCENARIO_ENTRIES table contains
only the records that errored out. Errored records from a previous load are
deleted in the next load.
next
================================================
Error Check
-----------
- DB_EXCEPTION_LOG table
select * from db_exception_log order by err_date;
- Please provide the log file for the respective 'request set' collection program.
After running ASCP -> Demand Management System Administrator -> Collect Shipment
and Booking History - Flat File, the concurrent program completed successfully.
- integ_inf_var_cost_err
- biio_SUPPLY_PLANS_POP / _err
- biio_supply_plans / _err
- biio_other_plan_data / _err
- BIIO_PURGE_PLAN
next
================================================
Setup issue:
1. You must select the "Planning Method" and "Forecast Control" for the Master Org.
2. The customer built a new data model, which deleted all the seeded
display units in Demantra. Because of this, price lists could not be
loaded into Demantra. You would have to recreate display units like the
original seeded units.
minus
select distinct display_units_id
from DEM.DCM_PRODUCTS_UNITS;
next
================================================
Problem: Legacy Collections for Shipment and Booking History errored out at Collect
Level Type stage with the following error in the log:
* Workaround: Run ERP collections for future dates, which will wipe out
the sales staging tables. Then run the legacy collections with proper
values for demand class and sales channel (or your desired data group).
next
==================================
EBS to Demantra collection with download = YES does not insert data into the
Demantra tables. Data moves into the Demantra staging tables but not into the
base tables. There is no error in the log file.
Steps followed:
2. Ran the standard collections and EP_LOAD (with download = Yes). The
concurrent programs completed successfully with no errors.
3. The item and sales data were inserted into the Demantra staging tables.
4. Ran the workflow "EBS Full Download" manually; then the data was moved into
5. However, we could not see the data in the worksheet. We expect EP_LOAD to put
the data into the Demantra base tables.
Solution Explanation
--------------------
AppServerURL
1. The first check to make is the proper setting of the parameter 'AppServerURL'
in Demantra. This can be verified from the Business Modeler or from the back end.
Use the following query to get the value of this parameter from the Demantra schema:
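For example, assuming the standard Demantra SYS_PARAMS parameter table:

```sql
-- Application server URL as configured in the Demantra schema
select pname, pval
from sys_params
where pname = 'AppServerURL';
```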
- You have to start the application server before running the Business Modeler wizard.
- In most cases, if the application server is up, the problem is with the application
server URL.
2. Check whether the profiles 'MSD_DEM_SCHEMA' and 'MSD_DEM_HOST_URL' are set properly.
* For more information regarding MSD_DEM_HOST_URL, see note 431301.1
-------------------------------------------------------------
Profile Name - Value
-------------------------------------------------------------
Profile MSD_DEM_CATEGORY_SET_NAME -
Profile MSD_DEM_CONVERSION_TYPE -
Profile MSD_DEM_CURRENCY_CODE -
Profile MSD_DEM_MASTER_ORG - 204
Profile MSD_DEM_CUSTOMER_ATTRIBUTE - NONE
Profile MSD_DEM_TWO_LEVEL_PLANNING - 2
Profile MSD_DEM_SCHEMA - MSDEM
-------------------------------------------------------------
next
==================================
After running the "Shipment and booking history" request from the EBS application,
only partial data is loaded into the Demantra TMPL tables.
Here is my log:
+---------------------------------------------------------------------------+
Demand Planning: Version : 12.0.0
+---------------------------------------------------------------------------+
**Starts**22-AUG-2008 18:19:53
**Ends**22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
Start of log messages from FND_FILE
+---------------------------------------------------------------------------+
Exception: msd_dem_collect_history_data.run_load - 22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
End of log messages from FND_FILE
+---------------------------------------------------------------------------+
+---------------------------------------------------------------------------+
Executing request completion options...
+---------------------------------------------------------------------------+
+---------------------------------------------------------------------------+
Concurrent request completed
Current system time is 22-AUG-2008 18:24:40
+---------------------------------------------------------------------------+
1. The issue might be that the parameter 'AppServerURL' is not set properly.
Please run the following query from the Demantra schema:
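A sketch of that check, assuming the standard Demantra SYS_PARAMS parameter table:

```sql
-- Application server URL used by the UTL_HTTP callback
select pname, pval
from sys_params
where pname = 'AppServerURL';
```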
next
================================================
1. Delete t_src_loc_tmpl
2. Execute request set "Standard Collection".
3. Execute request set "Shipment and Booking History".
4. Review the process log from EBS
5. Review the collaboration.log
6. Log in to Workflow in the Demantra environment
SELECT booked_date
FROM oe_order_headers_all
Also:
Do you have a shipped date populated for new combinations?
Series data must be populated into the Demantra staging tables before the load
process begins. To ensure this occurs, the collection programs for all eight
historical series have been merged:
next
================================================
Regarding the issue of data downloaded into Demantra not showing up in a worksheet:
this is caused by the date range specified in the worksheet settings.
CONFIGURE LOADING TEXT FILES DOES NOT LOAD DATA FOR MORE THAN 2000
COLUMN WIDTH.
During a data load from a text file, if the total width of the columns is more
than 2000, it fails with the error "ORA-12899: value too large for column
"DEMANTRA"."DM_WIZ_IMPORT_FILE_DEF"."SRC_CTL_SYNTAX" (actual: 3037,
maximum: 2000)".
* This is a limitation of the functionality. The change is covered in enhancement
request 6879562.
next
================================================
1. Make sure that the SQLLDR.EXE utility exists under the Oracle client bin
directory and that the system path points there. Verify by opening a CMD
window and executing SQLLDR.EXE; if it is not found, add it to the system path
and restart.
2. Otherwise, please provide the full engine log from that run, along with
all the *.log/*.bad/*.txt files that should be under the engine's bin
directory.