
How to Setup and Run Data Collections

Contents
Purpose
Understanding Centralized vs. Distributed Installation
Understanding the Data Collections Process
3 Key Stages to the Data Collections Process
Refresh Snapshots Process
Planning Data Pull
ODS Load
Complete, Net Change and Targeted Refresh
Setup Requests for Data Collections
Purge Staging Tables - A Special Note
ATP Data Collections vs. Standard Data Collections
Fresh Installation or Upgrade - Basic Steps
For 12.1
For 12.0.6
For 11.5.10.2
Upgrading EBS Applications - Special Considerations
DBA Checklist
VCP Applications Patch Requirements
RDBMS Patches and Setups for Known Issues
Other RDBMS Setups / Maintenance
RAC Database - Special Requirements
System Administrator Setups
Applications Super User Checklist
VCP Applications Patch Requirements
Fresh Install/Upgrade Setup Steps
Running Data Collections for the First Time in a New Instance
Parameters for First Run of Data Collections
Preventing Data Collections Performance Issues
How Can I Check on Performance When Data Collections is Running?
Cloning and Setup after Cloning for Data Collections
I Have a Centralized Installation – What should I do?
Steps to Consider after Cloning when Using Distributed Installation
What Happens when I Setup a New Instance in the Instances Form to Connect an APS Destination to the EBS Source?
Key Points
I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?
Upgrade and Patching
DBA's Information
Applications Super User's Information
Appendix A
List of Setup Requests
References
Revisions


Purpose
This document offers recommendations for customers using Value Chain Planning -
VCP (aka Advanced Planning and Scheduling - APS) who are performing:

1. Fresh Installation of EBS Applications

2. Upgrading EBS Applications


OR

3. Cloning the EBS Source Instance and/or APS Destination instance.

We will explain the difference between Centralized and Distributed installations and
other terminology used in this document.
We will help you ensure that the patching and setups are correct so you can launch and
complete Data Collections successfully.

We will address this functionality for 11.5.10 and above, with a concentration on R12.1.

Understanding Centralized vs. Distributed Installation


Centralized Installation - a single database where the EBS Source applications for the
OLTP transactions and the setups used by VCP applications are all on the same database.
Data Collections moves the data for INV, PO, BOM, WIP, OE, etc. from these base tables
to the MSC tables, where it can be used by the VCP applications.

Patching for a Centralized Installation - All patches are applied to the single
instance. You can ignore references in the patch readme for centralized and
distributed installation and apply all patches to your single instance. Most
patches will mention to just apply the patch to your instance when Centralized.

Distributed Installation - Two (or more) databases located on different machines. One is
the EBS Source instance where all OLTP transactions and setups are performed. The APS
Destination is where the Data Collections process is launched; it uses database links to
communicate and move the data into the MSC tables on the APS Destination.

Why do many customers use Distributed Installation?

1. The processing performed for Planning in ASCP, IO, DRP, etc. can take huge
amounts of memory and disk IO. This can slow down the OLTP processing while
the plan processes are running. It is not unusual for large plans to consume
anywhere from 5-20+ GB of memory while running and move millions of rows
into and out of tables to complete the planning process.

2. It is possible to have the APS Destination on a different release. The APS
Destination could be 12.1 with the EBS Source running 11.5.10 or 12.0.6.

1. Note that you may NOT have an APS Destination on a lower version than the
EBS Source (example - APS Destination on 12.0 or 11.5 with EBS Source
on 12.1)

3. It is possible to have more than one EBS Source connected to the APS
Destination. This requires careful consideration, as patching for all instances
must be maintained.
Example: 2 EBS Source instances connected to 1 APS Destination instance, all
on the same EBS applications release. For VCP patches that affect
Collections, all three instances must be patched at the same time in order to
prevent breaking this integration. Or a customer may have 2 EBS Sources on
different releases: APS Destination on 12.1 with one EBS Source on 12.1 and
another on 11.5.10.
Patching for a Distributed Installation - This requires careful attention to the
patch readme to make sure each patch, along with its pre-requisite and
post-requisite patches, is applied to the correct instance. The patch readme
should be clear on where the patch is to be applied. Usually the
patch will refer to the EBS (or ERP) Source (or 'source instance') and to the APS
Destination (or 'destination instance') for where the patch is to be applied.

12.1.x – For our 12.1 CU (cumulative) patches, we release only a
single patch, and it is applied to both the EBS and APS
instances.
Demantra Integration patches will have a separate EBS Source Side
patch requirement.

12.0.x and 11.5.10 – For these releases we have separate patches for
different parts of the VCP applications.

1. ASCP Engine/UI patches are applied only to the APS Destination.


There is also a Source Side UI patch to be applied to the EBS
Source only - see the readme.

2. Data Collections patches are applied to BOTH the EBS and APS
instances. The readme may list specific patches to be applied to
the EBS Source for certain functionality.

3. GOP (or ATP) and Collaborative Planning are applied to BOTH


the EBS and APS instances.

4. For Demantra, the main patch is applied to the APS Destination


instance. And there is usually a Source side patch listed in the
readme.

Understanding the Data Collections Process 


Note 179522.1 has an attachment showing critical information on Data Collections
parameters, profiles, examples of requests launched, and the objects used for Data
Collections.

3 Key Stages to the Data Collections Process

Refresh Collection Snapshots


1. This process is launched by the stage Planning Data Pull to refresh the
snapshots used to collect the data.
2. The snapshots work in conjunction with the views to provide the Planning
Data Pull with the information that will be loaded into the staging tables.
3. If the setup profile MSC: Source Setup Required = Yes (or Y in 11.5.10 and
below), then this process will also launch the Setup Requests that create all
the objects used by Data Collections in the EBS Source applications.

4. When deployed with a Distributed installation, these processes are launched
on the EBS Source instance. In a Centralized - single instance - installation, all
requests are launched in the same instance.

5. The process performs a Fast Refresh of the snapshots (with a couple of
exceptions where a Complete Refresh is performed, OR if it encounters an
error during Fast Refresh, it may in some cases be able to recover by
performing a Complete Refresh).

6. You can launch this as a standalone process and specify either Fast,
Complete or Automatic Refresh (Refer to Note 211121.1 for more
information)

7. Processes Launched by Refresh Collection Snapshots

1. Standard processes

1. Refresh Collection Snapshots

2. Refresh Collection Snapshots Thread – R12.0 and above – we
added a new request that refreshes individual snapshots
concurrently as sub-requests of the main Refresh Collection
Snapshots request. This was done to improve performance, as we
have around 50 snapshots that must be refreshed. For example,
in 12.1.1 we see 49 child requests launched when Data
Collections is launched with a Complete Refresh. These requests
perform a Fast Refresh of each snapshot on the EBS Source
instance, then report back to the main refresh process,
which performs the Fast Refresh of all snapshots again as
one atomic transaction to ensure read consistency.

3. With this new 12.1 technology, when running Data Collections
with a Complete Refresh, you could benefit from having a very
large number – 25 to 40 – of work shift processes on the
Standard Manager where these will run.

2. Setup Requests

1. These are requests that create all the objects required for
Collections process.

2. Download the spreadsheet attachment from Note 179522.1 and
review the sheet for your particular release for more specific
information on the setup requests.
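Under the covers these snapshots are ordinary Oracle materialized views, so the Fast/Complete distinction above maps directly to the standard refresh API. As an illustration only (the supported path is always the Refresh Collection Snapshots request), and using MRP.MRP_FORECAST_DATES_SN as an assumed example name, a DBA could examine or refresh a single snapshot from SQL*Plus:

```sql
-- Illustration only: Data Collections normally drives these refreshes
-- through the Refresh Collection Snapshots concurrent request.
-- 'F' = Fast (incremental) refresh, 'C' = Complete refresh.
-- The snapshot name below is an assumed example; yours may differ.
EXEC DBMS_MVIEW.REFRESH('MRP.MRP_FORECAST_DATES_SN', 'F');

-- List the Data Collections snapshots and when/how they were last refreshed:
SELECT owner, mview_name, last_refresh_type, last_refresh_date
  FROM all_mviews
 WHERE mview_name LIKE '%_SN'
 ORDER BY last_refresh_date;
```

The second query is a quick way to see which snapshots fell back to a Complete refresh on the last run.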
Planning Data Pull
1. When the Snapshot process is finished, then the Planning Data Pull will
move the data from the Snapshots and Views to the Staging Tables - all
names starting with MSC_ST%
2. This will spawn workers to help manage the many tasks (40+) that gather
the data into the staging tables.

3. While a larger number of workers than the default of 2 can improve
performance, the maximum number of workers that contributes to
improved performance is about 8.

4. Processes Launched by Planning Data Pull

1. This launches the Refresh Collection Snapshots request, which must
complete before any other Planning Data Pull worker processes can
launch. The Planning Data Pull will go to status Pending while waiting
for this to complete.

2. Note that the timeout parameter (default 180 minutes) does not count
time used for Refresh Collection Snapshots request – so if Refresh takes
45 minutes and Data Pull takes 150 minutes, it will not time out.

3. When the Snapshot process completes, it launches the Planning Data Pull
Worker requests, and it performs some of those same tasks as well.

4. When all workers are finished, each reports back to the
main request, which will launch the Planning ODS Load for the next
stage of collections.

Planning ODS Load


1. This will launch and perform Key Transformations of critical data into unique
keys for the VCP applications. For instance, since we can collect from
multiple instances, the INVENTORY_ITEM_ID in MSC_SYSTEM_ITEMS must
be given a unique value. This happens for several entities. We store the
Source instance key values in SR_ columns, like
SR_INVENTORY_ITEM_ID in MSC_SYSTEM_ITEMS.
1. These Key Transformations are split into separate requests starting
in 12.1:
Generate Items and Category Sets Keys
Generate Trading Partner Keys

2. The ODS Load then launches Planning ODS Loader Worker requests to handle
the many different load tasks that move data from the MSC_ST staging
tables to the base MSC tables.

3. During complete refresh we create TEMP tables to copy data from MSC_ST
staging tables into the TEMP tables, then use Exchange Partition technology
to flip this temp table into the partitioned table.
1. For instance, if your instance code is TST, you can see in the log file
where we create temp table SYSTEM_ITEMS_TST.

2. Then we move data into this table from MSC_ST_SYSTEM_ITEMS.

3. Then we exchange this table with the partition of
MSC_SYSTEM_ITEMS used for collected data (this example shows
the partition name if the instance_id is 2021 - partition name is
SYSTEM_ITEMS__2021).
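The exchange-partition step described above can be pictured as plain DDL. This is only a sketch of what the ODS Load does internally - not something to run by hand - using the example names above (instance code TST, instance_id 2021, assumed MSC schema owner):

```sql
-- Sketch of the ODS Load's exchange-partition technique (do not run manually).
-- 1. Data is first loaded from the staging table into the temp table:
--      INSERT INTO SYSTEM_ITEMS_TST (...) SELECT ... FROM MSC_ST_SYSTEM_ITEMS ...;
-- 2. The loaded temp table is then swapped into the partitioned base table
--    as a near-instant dictionary operation:
ALTER TABLE MSC.MSC_SYSTEM_ITEMS
  EXCHANGE PARTITION SYSTEM_ITEMS__2021
  WITH TABLE MSC.SYSTEM_ITEMS_TST;
```

The exchange swaps segment pointers rather than copying rows, which is why a Complete Refresh can replace an entire instance partition quickly.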

4. Some of these requests can be handled by a separate
process to improve performance; in R12 and above we launch
Alter Temporary ODS Tables.

4. In 12.1, we have a lot of new integration functionality and have introduced


APS Collections Miscellaneous Task for new processes that can be handled
outside the standard ODS Loader Workers; so these are done in a separate
request to help performance. How many of these are launched can depend
on settings for the Planning Data Pull and/or ODS Load.

5. The ODS Load will launch Planning Data Collections - Purge Staging Tables
to remove data from the MSC_ST staging tables. If Data Collections fails
at any stage, then you should run Planning Data Collections - Purge Staging
Tables with parameter Validation = No to reset the process and get ready
to launch it again. If you miss this step, then it will error with messages like:

1. Another planning data pull process is running.

2. Staging tables data not collected. Launch ODS Load

6. Additionally, the ODS Load can launch related processes depending on
setups. These are not part of getting data collected to run your plans in
VCP, so if they fail, you could still run a plan, but the related applications
would have problems.

1. If Collaborative Planning is setup (Profile MSC: Configuration = APS
& CP [or CP]), then we launch the Collaboration ODS Load.

2. In 12.1, we introduced the Advanced Planning Command Center
application. If it is installed and you are on 12.1.3 and above,
then the ODS Load parameter Refresh Collected Data In APCC = Yes will
launch the process Refresh Collected Data in APCC Repository to
refresh data for APCC to use in analyzing ASCP Plan results. This was
added as part of the performance improvements for this new
application.

Complete, Net Change and Targeted Refresh


1. Complete Refresh replaces all data for the entity, with a few exceptions:
1. Items are never deleted.

2. Trading Partners

3. Sales Orders are a special entity. We default the Sales Orders parameter
to No in a Complete Refresh and will collect sales orders in Net Change mode.
This is done for performance reasons. If you need to refresh sales orders
in Complete mode, then you will have to set this parameter to Yes
explicitly. OR you can run a standalone Targeted Refresh of only Sales
Orders.

2. In order to choose Refresh Mode = Net Change or Targeted, you must set
Planning Data Pull parameter Purge Previously Collected Data = No.

3. Net Change picks up only the changes to the entities.

1. This can only be run for transactional entities, like WIP, PO, OM and
some setup entities like, Items, BOM, etc. The XLS attachment in Note
179522.1 explains which entities can be collected in Net Change mode.

2. If you set parameter Yes for a setup entity - like ATP Rules - it is ignored
when running Net Change collections.

4. Targeted Refresh is used to collect one or more entities in complete refresh
mode. This is very useful for synching data for certain business purposes,
collecting certain setup data, or quickly collecting data to set up a test case.
Examples:

1. Business Purpose - A customer runs an ASCP Plan intraday at 1 pm; after
running Net Change collections, they run a Targeted Refresh of
Customers, Suppliers, and Orgs to get all new customers created that
morning.

2. We run Net Change Collections during the week and Complete
Collections on weekends, but during the week we need to collect new
Sourcing Rules and Customers daily, so we run a Targeted Refresh each
night.

3. I need to change Sourcing Rules, set up a Supplier Capacity calendar,
and run a test case. You can collect Approved Supplier List, Calendars,
and Sourcing Rules to get these changes quickly into the MSC tables for
testing, without running a Complete Refresh of all entities.

5. OPM customers cannot use Targeted Refresh. They must use Complete
Refresh to collect OPM data.

Setup Requests for Data Collections


1. Profile MSC: Source Setup Required = Yes (or Y in 11.5.10) is set by patch
installation, or by the user if required.
1. The profile should be set on the EBS Source in a distributed installation.
Refresh Snapshots and the Setup Requests will launch on the EBS
Source.

2. Data Collections will launch the Refresh Snapshots process, which will detect
this setting and launch the Setup Requests. See the list in Appendix A.

3. When the Setup Requests complete successfully, the profile will
automatically be set to No (or N in 11.5.10).

4. If running Refresh Snapshots as a standalone request, then choose Refresh type
Fast or Complete. If you use Automatic, then this profile is not checked.

5. The Request ID and Request Name will be listed in the Refresh Collection
Snapshots log file when these requests are launched as part of Data Collections

6. If a Setup Request fails, then run Planning Data Collections - Purge Staging
Tables with parameter Validation = No, then launch again. We have seen many
times that initial failures in the Setup Requests are resolved by running 2 or 3
times.
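To confirm whether the setup profile is still pending, you can check it from SQL*Plus on the EBS Source. This is a sketch; the internal profile name MSC_SOURCE_SETUP_REQUIRED is an assumption - verify it in the System Profile form if the query returns NULL:

```sql
-- Check the current value of MSC: Source Setup Required.
-- The internal name MSC_SOURCE_SETUP_REQUIRED is assumed; confirm it
-- under System Administrator / Profile / System if NULL is returned.
SELECT fnd_profile.value('MSC_SOURCE_SETUP_REQUIRED') AS setup_required
  FROM dual;
```

A value of Yes (or Y) means the Setup Requests will be launched on the next run of Data Collections.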

Purge Staging Tables - A Special Note


1. When Data Collections fails, you must run this request, Planning Data
Collections - Purge Staging Tables, to:

1. Reset the value in the ST_STATUS column of MSC_APPS_INSTANCES.

2. Remove the data from the MSC_ST% staging tables that are
populated by the Planning Data Pull.

2. Always use parameter Validation = No.
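A quick way to see whether an instance appears stuck mid-collection is to look at the status column this request resets. A minimal sketch, assuming you are connected as APPS:

```sql
-- ST_STATUS is the column reset by Purge Staging Tables; a non-zero value
-- typically indicates a collection cycle is (or appears to be) in progress.
SELECT instance_code, instance_id, st_status
  FROM msc_apps_instances
 ORDER BY instance_code;
```

If Data Collections is not actually running but ST_STATUS is non-zero, that is the "Another planning data pull process is running" symptom described above.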

ATP Data Collections vs. Standard Data Collections


1. ATP Data Collections can only be used by customers who are:

1. Running a Centralized Installation

2. Only using ATP for Order Management and NOT using ANY other Value
Chain Planning applications

3. If you run ATP Data Collections in Mode - Complete Refresh when you
are using Value Chain Planning Applications, you will corrupt the data in
the MSC tables. You will have to run Standard Data Collections with a
Complete Refresh to recover from this mistake.

2. The ATP Data Collections section in the Note 179522.1 XLS attachment includes
information on how to run Standard Data Collections to mimic ATP Data
Collections.
1. This can be useful if there is a problem with ATP Data Collections. You
can try standard collections to see if this works to overcome the issue.

2. Understanding how to run a Targeted Refresh using Standard Data


Collections can be useful to quickly collect changes for only certain
entities.

3. Running a Targeted Refresh of just Sales Orders can be useful to synch


changes if it appears that sales order changes are not being collected
properly.

4. A Targeted Refresh is like a Complete Refresh, but only collects data for the
entities set to Yes. CAUTION: if you make a mistake and run Complete
Refresh instead of Targeted Refresh with the wrong entities set to No, then
data will be lost and errors will occur. A Complete Refresh would then
be required to recover from this mistake.

3. Note 436771.1 is an ATP FAQ which will be helpful to any customer using
GOP/ATP.

1. For understanding when ATP will not be available during Data
Collections, see Q: Why do we get the message 'Try ATP again later'?

Fresh Installation or Upgrade - Basic Steps


When installing and using VCP applications, you are installing the entire EBS Applications
suite. There is no standalone install package limited to VCP applications.

For 12.1

Customers must install the latest release of EBS Applications, and if a new release of
EBS becomes available during the implementation cycle, plan to install that release
(example: you installed EBS 12.1.2, and three months later EBS 12.1.3 becomes
available; it is very important to plan a move to this latest release).

For the applications install, review Note 806593.1 Oracle E-Business Suite Release 12.1 -
Info Center.

For VCP Applications the following Critical Notes are required

1. Note 746824.1 12.1 - Latest Patches and Installation Requirements for Value
Chain Planning. Use this note to reference:

1. The latest Installation Notes

2. The Latest available VCP patch for your release must be applied after
the installation.
3. Understand the Patching requirements if you have a separate EBS
Source instance.

1. This includes an R12.1 EBS Source

2. OR if you intend to run the EBS Source applications on R12.0.6 or
11.5.10.2

4. Links to other critical notes about 12.1 VCP Applications

5. List of limitations on new functionality if using R12.0.6 or 11.5.10.2 for
the EBS Source instance

2. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data
Collections Instance in R12

3. Latest User Guides are in Note 118086.1

4. Note 763631.1 Getting Started With R12.1 Value Chain Planning (aka Advanced
Planning and Scheduling APS) - Support Enhanced Release Content Document
(RCD)

1. If you also need to review new features for R12.0, then also see Note
414332.1

5. If you are installing to use Rapid Planning, then see Note 964316.1 - Oracle
Rapid Planning Documentation Library

For 12.0.6
Customers should NOT be planning to use this release, but should move to 12.1.

Note 401740.1 - R12 Info Center is available for customers who must perform this
installation

For VCP Applications, critical notes are:

1. Note 421097.1 R12 - Latest Patches and Critical Information for VCP - Value
Chain Planning (aka APS - Advanced Planning and Scheduling)

1. You must apply the latest CU Patches available after the installation
and before releasing the instance to users.

2. If you are using only ATP Data Collections, then apply the GOP Patch and
Data Collections patches.

2. Note 412702.1 Getting Started with R12 - Advanced Planning and Scheduling
Suite FAQ

3. Note 414332.1 Getting Started With R12 - Advanced Planning Support Enhanced
RCD
4. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data
Collections Instance in R12

For 11.5.10.2
In the rare case that an upgrade to 11.5.10.2 is being considered, AND it is not an
intermediate step on the way to R12.1, then it is very critical that the latest patches be
applied.

Review Note 223026.1 and apply the latest Rollup patches for at minimum:

1. Using ASCP - then apply ASCP Engine/UI and Data Collections rollup

2. Using any other applications, then also apply those patches.

3. Using ATP Data Collections only - then apply GOP Patch and Data Collections
Patch

Also review Note 883202.1 - Minimum Baseline Patch Requirements for Extended
Support on Oracle E-Business Suite 11.5.10

Upgrading EBS Applications - Special Considerations


1. If upgrading from 11.5.10.2 WITH the latest rollup patches applied, then you should not
encounter the INSTALL ALERT mentioned in Note 552415.1.

1. You can run the SQL in the note to verify that tables MSC_NET_RES_INST_AVAIL
and MSC_SYSTEM_ITEMS have matching partitions.

2. After the upgrade, applying the latest patches is a very critical step that cannot be
postponed. Users should not start setup and testing on the base release of your upgrade.
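The SQL in Note 552415.1 is the authoritative check; as a rough equivalent sketch, the data dictionary can show whether the two tables carry matching instance partitions:

```sql
-- Rough check (see Note 552415.1 for the authoritative SQL): both tables
-- should show a corresponding partition for each collected planning instance.
SELECT table_name, partition_name
  FROM all_tab_partitions
 WHERE table_name IN ('MSC_SYSTEM_ITEMS', 'MSC_NET_RES_INST_AVAIL')
 ORDER BY table_name, partition_name;
```

A partition present for one table but missing for the other is the mismatch the INSTALL ALERT describes.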

DBA Checklist

VCP Applications Patch Requirements


1. Latest VCP Patches applied for your release as noted above in Fresh Installation
or Upgrade - Basic Steps

2. If Demantra integration is required, then review Note 470574.1 List Of High
Priority Patches For Oracle Demantra Including EBS Integration.

Note: After patching, there may be invalid or missing objects used for Data
Collections.
The patch should set profile MSC: Source Setup Required = Yes (or Y in 11.5.10).
Then, with the first run of Data Collections, the Setup Requests will be launched and
build all the required objects for Data Collections, which will resolve any invalid or
missing objects used for Data Collections. As noted above, you can find more
information on the Data Collections objects, the Setup Requests, etc. in Note
179522.1.

RDBMS Patches and Setups for Known Issues


In addition to any RDBMS patches recommended in the Patch Readme, please log an
RDBMS SR to get patches for the following RDBMS bugs.

All three of these RDBMS bugs relate to some data not being collected, and the issue has
been seen in different entities under different conditions in different database versions,
so we recommend all of these patches.

If you have this issue NOW, execute the NOPARALLEL workaround under item #3 below
immediately. It has resolved most issues, but there will still be exposure to the other
RDBMS bugs below.

1. RDBMS bug 6720194 fix should be applied - Wrong results from SQL with
semijoin and complex view merging

1. Fixed in version - 10.2.0.5 and 11.1.0.7 and 11g R2

2. RDBMS bug 6044413 fix should be applied

1. Fixed in version - 10.2.0.4 and 11g R1

2. Final fix for 'Table Prefetching causes intermittent Wrong Results in
9iR2, 10gR1 and 10gR2' - see Note 406966.1

3. RDBMS bug 5097836 and 6617866 fixes should be applied

1. Both relate to queries run in parallel not fetching all rows.

2. 5097836 Fixed in version - 10.2.0.4 and 11g R1

3. 6617866 Fixed in version - 10.2.0.5 and 11g R2

4. As a workaround, you can also set all snapshots for Data Collections to
use NOPARALLEL to prevent missing data.

5. Run this SQL


select 'alter table '|| owner||'.'|| table_name|| ' NOPARALLEL;'
from all_tables
where table_name like '%_SN'
and degree > 1;

6. Then, if any rows are returned, run the output; it will set DEGREE = 1.
Example: alter table MRP.MRP_FORECAST_DATES_SN NOPARALLEL;
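After running the generated ALTER statements, you can re-run a similar check to confirm the workaround took effect. This mirrors the generator query above and should return no rows once all snapshot tables are serial:

```sql
-- Verify the workaround: no Data Collections snapshot table should
-- remain with a parallel degree greater than 1.
SELECT owner, table_name, degree
  FROM all_tables
 WHERE table_name LIKE '%_SN'
   AND degree > 1;
```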

4. Set init.ora parameter _mv_refresh_use_stats = FALSE

1. This is part of our performance tuning for MLOGs per Note 1063953.1

Other RDBMS Setups / Maintenance
1. Ensure that the RDBMS setups for Oracle Applications in Note 396009.1 (or for
11i - Note 216205.1) are completed and that all relevant settings for the RDBMS
are correct.
In the section for Database Initialization Parameter Sizing, for large data
volumes we recommend at minimum the sizing settings for 101-500 users.

2. Database links are required for the distributed installation

1. See Note 813231.1 Understanding DB Links Setup for APS Applications -
ASCP and ATP Functionality.

2. Test and confirm that the database links are working for the APPS user in
SQL*Plus.

3. These database links will be used in the Advanced Planning Administrator /
Admin / Instances form when setting up the application.
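A minimal link test from SQL*Plus follows; the link name APS_TO_EBS is only an assumed example - substitute the name created for your installation:

```sql
-- Connect as APPS on the APS Destination and probe the link to the
-- EBS Source. The link name APS_TO_EBS is an example only.
SELECT instance_name, host_name
  FROM v$instance@APS_TO_EBS;
```

A row coming back confirms the link resolves, the remote database is open, and the connected account can reach it; an ORA- error here should be resolved before setting up the Instances form.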

3. Note 137293.1 - How to Manage Partitions in the MSC Schema

1. This note is required reading for both DBAs and Applications Super
Users.

2. When cloning distributed databases, manual manipulation of certain
tables is required in order to maintain proper database links and
instance information.

4. Note 1063953.1 Refresh Collection Snapshots Performance - Managing MLOG$
Tables and Snapshots for Data Collections

1. There will be performance problems in the Refresh Collection Snapshots
process if this note is ignored.

2. Review the note to understand and manage the MLOGs for Data
Collections and the related applications that share the same MLOGs we use
for Data Collections.

3. When many data changes happen to the base tables, we get
large MLOGs that must be managed.
AND/OR

4. If the ENI and OZF snapshots mentioned in the note are on the system,
then they must be manually refreshed to keep the MLOG tables
maintained and prevent performance issues.

5. You may see the initial run of Refresh Snapshots take a long time. During
the setup phases, this will usually settle down after the first 2-3 runs of
Data Collections, unless there is a lot of data processing happening
between runs of Data Collections and the MLOGs grow rapidly.
1. Review the note and use the steps in section - I Want to Check
My System and Prevent a Performance Problem (ANSWER #1) -
to prepare the system and avoid performance problems.

2. These steps are usually best accomplished after the first 2-3
runs of Data Collections have been completed, but can be
executed earlier if required.
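One quick way to spot oversized snapshot logs before they slow down Refresh Collection Snapshots is to check their segment sizes; any threshold is a judgment call for your data volumes, and Note 1063953.1 remains the authoritative procedure:

```sql
-- Largest snapshot logs first; very large MLOG$ segments on busy base
-- tables are a common cause of slow snapshot refreshes (Note 1063953.1).
SELECT owner, segment_name, ROUND(bytes / 1024 / 1024) AS size_mb
  FROM dba_segments
 WHERE segment_name LIKE 'MLOG$%'
 ORDER BY bytes DESC;
```

An MLOG$ segment that keeps growing between collections usually means some consumer of that log (such as the ENI/OZF snapshots mentioned above) is not being refreshed.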

5. PLEASE NOTE that we expect Data Collections to take significant time during the
initial run. This is because we could be moving millions of
rows of data from the OLTP tables to empty MSC tables.

1. Once Data Collections has completed for the first time and Gather
Schema Statistics has been run for the MSC schema (and the MSD schema
if using Demantra or Oracle Demand Planning), the process should settle
down and have better timing.

2. It is not unusual to have to set the Timeout parameter for Data
Collections to 900 – 1600 for the first run to complete without a timeout
error.

RAC Database - Special Requirements


1. Review Note 279156.1 if you have a RAC database. Note that if this is a Centralized installation, these setups are mandatory.

2. When using ATP with a Distributed installation where the EBS Source is a RAC database, you must review Note 266125.1 and perform additional setups.

System Administrator Setups


1. Set up the Standard Manager (or the specialized manager, if one is used) to have as many processes as possible (15 is the minimum recommendation; 25 or higher is recommended), since Data Collections and other VCP processes launch many requests, and many other processes also use the Standard Manager.

1. NAV: System Administrator / Concurrent / Manager / Administer to check the setups

2. NAV: System Administrator / Concurrent / Manager / Define; query the Standard Manager, then click Workshifts to increase the number of processes.

Applications Super User Checklist

VCP Applications Patch Requirements


1. The Super User must also be aware of the patching requirements mentioned in
Fresh Installation or Upgrade - Basic Steps
2. AND it is critical to review the Patch Readme of the latest patches applied to check for any user-required steps/setups listed in the patches.

Fresh Install/Upgrade Setup Steps


1. For a Distributed installation, you must be familiar with Note 137293.1

2. Note 179522.1 - Data Collection Use and Parameters explains:

1. The parameters used for Data Collections

2. Performance tips for Data Collections based on parameters

3. Setup of the profiles - ref Step #8 below

4. Examples of the requests that run during Data Collections

5. Details on the objects created for Data Collections

3. Confirm that you have resolved the installation issue described in Note 552415.1 - INSTALL ALERT: Setting Up The APS Partitions and Data Collections Instance in R12

1. Running the SQL in the note should confirm that both tables have matching partitions.

4. If you have RAC Database then see RAC Database - Special Requirements

5. Confirm that Standard Manager (or if used, the specialized manager) has
workshifts/processes as defined above in System Administrator Setups

6. Make sure profile MSC: Share Plan Partitions is set to No

7. Run Request - Create ATP Partitions (APS Destination)

1. This request has no parameters - do not confuse it with the Create APS Partitions request

2. This will ensure that all partitions are created properly for the ATP tables; if they are already created, it does no harm.

8. Download the attachment in Note 179522.1 - see the Profiles Tab to setup the
profiles for Data Collections.

9. Decide if you need to be running the Planning Manager in your Instance

1. The Planning Manager is an EBS Source process that performs interface tasks between the Order Management, WIP, Purchasing, and MRP applications.

2. Under certain setups, this process must be running in order to collect data properly.

3. Review Note 790125.1 - When Does The Planning Manager Need To Be Running? - and make a decision for your implementation.

10. Setup the Instance Form

1. For a distributed installation, see Note 813231.1 and confirm the database links are working properly.

2. More complete information on the Instances form is in Note 137293.1 - Section IX; download the Viewlet to review along with the steps in the note.

11. Organizations setup in the instances form:

1. You MUST include the Master Organization

2. You should collect from ALL Orgs for the first run of Data Collections, so do NOT enter any extra Orgs that you plan to collect later in the implementation.

3. You can set up Collection Groups, but do NOT use a Collection Group for the first run of Data Collections.

Running Data Collections for the First Time in a New Instance


We expect that Data Collections will take an extraordinary amount of time for
the first run. Collecting what can be millions of rows from the OLTP tables into
the MSC tables - which are empty - is a very hard process for the database.

It is not unusual for the first run to require a Timeout parameter of 600-1600 minutes to get all data collected from the EBS Source tables into the APS Destination MSC tables. It does not matter whether you are using database links and two instances or a single instance. Be sure to set this high parameter for both the Planning Data Pull and the ODS Load.

Do not use a Collection Group during the first run; collect using parameter = All. If you are not collecting from all Orgs defined in the EBS Source, then define only the required Orgs in the Instances form. Be sure to include the Master Org in the Instances form.

Once Data Collections has completed successfully for the first time, run Gather Schema Statistics for the MSC schema; the timing should then settle down and be reasonable, and the default Timeout of 180 minutes for the Planning Data Pull and 60 minutes for the ODS Load can be maintained.
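The Gather Schema Statistics step above is normally submitted as a concurrent request. As a hedged sketch only, the equivalent work can be done from SQL*Plus with the standard EBS statistics package (this assumes the FND_STATS package is available to the APPS user, and the concurrent request remains the recommended method):

```sql
-- Sketch only: gather statistics for the MSC schema from SQL*Plus.
-- In practice, run the Gather Schema Statistics concurrent request instead.
BEGIN
  fnd_stats.gather_schema_statistics('MSC');
END;
/
```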

Parameters for First Run of Data Collections


1. Check that profile MSC: Source Setup Required = Yes (or Y in 11.5.10)
2. Note 179522.1 explains the parameters and the Setup Requests that
create all the objects used for Data Collections

3. NAV: Data Collection / Oracle Systems / Standard Data Collections

4. If your EBS Source instance is 11.5.10 and you have just applied Data Collections Rollup #32 or higher for the first time, then special steps are required: run Data Collections using a Targeted Refresh for Planners only. This is a one-time setup step for the very first run of collections after the patch is applied.
Key Parameters are:

1. Purge Previously Collected Data - No

2. Collection Method - Targeted

3. Set only Planners = Yes and all others No (or equivalent)

5. If you have a lot of data to collect, then the Refresh Snapshots process
could take several hours to run for this initial run.

6. Planning Data Pull Parameters - special considerations

1. Set Timeout parameter very high - suggest 900 - 1600 for the
first run

2. Set the Sales Order parameter explicitly to Yes if using the Order Management application.

1. When running a Complete Refresh, this parameter defaults to No and runs a Net Change collection of Sales Orders for performance reasons.

2. This is normal and recommended during normal operations.

3. All other parameters should be left at their defaults for the first run of collections.

4. If other parameters will need to be Yes for your implementation, plan to set those to Yes after the initial run of collections has completed successfully.

7. ODS Load Parameters - special considerations

1. Set Timeout parameter very high - suggest 900 - 1600 for the
first run

2. Set all other parameters to No for this first run.

3. If you need to set these to Yes, do so later, after the first run has completed successfully and the timing for Data Collections is stable.

8. When the first run is completed, then run Gather Schema Statistics for
the MSC schema.

9. If a Setup Request fails, run Planning Data Collections - Purge Staging Tables with parameter Validation = No, then launch again. We have seen many times that initial failures in the Setup Requests are resolved by running 2 or 3 times.

10. If the Planning Data Pull or ODS Load fails, check for a Timeout error or other specific error in the log file of the main request and in the log files of all the Workers. You must run Planning Data Collections - Purge Staging Tables with parameter Validation = No before you launch Data Collections again. If you fail to run this request, Data Collections will error with the message text shown in Note 730037.1.

Preventing Data Collections Performance Issues


After the initial run to collect the data into the empty tables has completed, the timing of Data Collections should settle down.
The profile settings in #1 can be applied after the initial run of Data Collections.
When a large volume of changes is made to source data via data loads or transactional activity, active management steps are required - see #2.

1. The profiles in Note 179522.1 should be set up correctly for your instance. Below are some suggestions, BUT you must review all the profiles for proper settings.
Notes on the Profile Setups:

1. MSC: Collection Window for Trading Partner Changes (Days) - must be NULL or 0 (zero) for the first run, but should then be set to a smaller number. Generally 7 days is sufficient. As long as you run a Complete Refresh at least once every 7 days, this will collect the changes.

2. Make sure the staging-table purge profile is set to Yes if you are collecting from only one EBS Source instance - MSC: Purge Staging and Entity Key Translation Tables / MSC: Purge Stg Tbl Cntrl

1. If you are collecting from multiple instances, then you must set this profile to No.

2. In this case, the MSC_ST staging tables may grow very large in dba_segments, and the high water mark will not be reset when we delete from these tables.

3. If you must set the profile to No, we suggest that the DBA periodically schedule a truncate of the MSC_ST% tables when Data Collections is NOT running, to recover this space and help performance.
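As a hedged sketch of the DBA step above, the following generates the truncate statements from the data dictionary. The MSC_ST% naming comes from this document; verify the generated list before executing it, and run it only while Data Collections is not running:

```sql
-- Sketch: generate TRUNCATE statements for the MSC staging tables.
-- Review the output before running it, and run ONLY while Data
-- Collections is not executing.
SELECT 'TRUNCATE TABLE ' || owner || '.' || table_name || ';' AS stmt
  FROM dba_tables
 WHERE owner = 'MSC'
   AND table_name LIKE 'MSC\_ST%' ESCAPE '\';
```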

3. MSC: Refresh Site to Region Mappings Event - this profile is relevant when setting up Regions and Zones in the Shipping application. When new Regions or Zones are defined, the mapping that occurs during Refresh Collection Snapshots can take a long time. Otherwise, we just collect new supplier information, which is not performance intensive.

4. Check to make sure Debug profiles are not set except where they are required for investigation of an issue.

5. If you use Projects and Tasks and have many records, and the profile MSC: Project Task Collection Window Days is available, it can be set to help performance.

2. MLOG management is the key to the performance of Refresh Collection Snapshots. Note 1063953.1 discusses the important steps for managing MLOGs to keep the program tuned to perform as fast as possible.

1. For customers actively using Data Collections, these performance problems occur for two primary reasons:

1. MLOG tables are shared between VCP applications and other applications - usually Daily Business Intelligence applications or custom reporting

2. Large source data changes via data loads and/or large transactional data volumes.

2. Once the initial collection has been performed and most data collected, use the steps in section - I Want to Check My System and Prevent a Performance Problem - ANSWER #1: Yes, we are using Data Collections now.

3. When you execute these steps, you will be setting up the MLOGs to provide the best performance.

4. You must also be aware of any MLOGs that are shared with other applications. In that case, the DBA will need to actively manage those other snapshots, refreshing them manually - a cron job may be the best method.
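As a hedged sketch of the monitoring this implies, the following lists the largest snapshot logs so shared or unrefreshed MLOGs stand out (MLOG$ is the standard Oracle naming for materialized view logs, as discussed in Note 1063953.1):

```sql
-- Sketch: find the largest MLOG$ (snapshot log) segments.
-- Large, steadily growing MLOGs point to snapshots that are not
-- being refreshed, or to heavy change volume on the base tables.
SELECT owner, segment_name, ROUND(bytes / 1024 / 1024) AS size_mb
  FROM dba_segments
 WHERE segment_name LIKE 'MLOG$%'
 ORDER BY bytes DESC;
```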

3. If you have very large BOM tables and plan to use Collection Groups to collect information only for certain Orgs, be aware that this can consume very significant TEMP space during the ODS Load. We have seen reports where collecting All Orgs takes only 6-12 GB of TEMP space, but when using a Collection Group, the TEMP space requirement exceeds 30 GB.
How Can I Check on Performance When Data Collections is Running?
1. Note 186472.1 has SQL to check on specific requests that are running and to show the SQL that is running on the RDBMS.

1. When you run this SQL, you provide the Request ID and it finds the SQL being run for that session.

2. If you have several Planning Data Pull workers (or ODS Load Workers)
running, then you may need to check several different requests.

3. Run this SQL every 10-15 minutes and check if the output changes.

2. Note 280295.1 has the requests.sql script, which will provide details on all the requests in a set.

1. Enter the request ID of the request that launched the set and it will provide details of all the requests. The default request name for the set is Planning Data Collection (Request Set Planning Data Collection).

3. Note 245974.1 - #7 and #11 show how to set up trace for requests, should this be required to get details on a performance issue.

1. #11 in that note is very useful for setting trace for Refresh Collection Snapshots on the EBS Source when it is launched by Data Collections from the APS Destination.

2. You can download guided navigation for both sections to better understand the steps involved.

Cloning and Setup after Cloning for Data Collections

I Have a Centralized Installation – What should I do?


When you have a Centralized Installation, then you do not have to consider any
steps here after cloning your instance and can proceed with normal operations
of running Data Collections and most Planning applications.

However, if you want to change the Instance Code in the Instances form, then
you must follow steps in Note 137293.1 - Section X to clean up all the old data
BEFORE you create a new instance. If you just change the instance code in the
form, then this will cause data corruption.

Steps to Consider after Cloning when Using Distributed Installation


There are many different strategies for cloning when you have a Distributed Installation with separate databases for the VCP and EBS applications. It would be easy to find 10-20 different ways that configurations can be manipulated by cloning, so no single document can cover all the strategies.
What Happens when I Setup a New Instance in the Instances Form to Connect an APS
Destination to the EBS Source?
When you create the line in the Instances form on the APS Destination (NAV:
Advanced Planning Administrator / Admin / Instances) you are:

1. Using an instance_id from MSC_INST_PARTITIONS where FREE_FLAG = 1 to store the information on the setups you define in the form.

2. Inserting a row into MSC_APPS_INSTANCES on the APS Destination instance with the Instance Code and Instance_id.

3. Inserting a row into the MRP_AP_APPS_INSTANCES_ALL table on the EBS Source instance.

4. Then, when creating Orgs in the Organizations region of the Instances form, you are inserting rows into the APS Destination table MSC_APPS_INSTANCE_ORGS for that instance_id.

5. In MSC_INST_PARTITIONS, FREE_FLAG = 1 means that the instance_id is not yet used. Once the instance_id is used, FREE_FLAG = 2.

6. You should NOT manually change the FREE_FLAG using SQL. Instead, create a new instance partition using the Create APS Partitions request with parameters Instance partition = 1 and Plan partition = 0. We do not recommend creating extra instance partitions, since this creates many database objects that are not required.

7. If the previously collected data and plan data from the old instance are not required, they should be removed from the system.
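A hedged, read-only sketch of the flags described above (query only - as noted, never update FREE_FLAG manually):

```sql
-- Sketch: on the APS Destination, list instance partitions and whether
-- each is free (FREE_FLAG = 1) or in use (FREE_FLAG = 2).
SELECT instance_id, free_flag
  FROM msc_inst_partitions
 ORDER BY instance_id;
```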

Key Points
1. You will have to MANUALLY manipulate the EBS Source instance table
MRP_AP_APPS_INSTANCES_ALL.

2. You cannot have TWO instances which have the same INSTANCE_ID or
INSTANCE_CODE in EBS Source table MRP_AP_APPS_INSTANCES_ALL

3. You cannot have TWO instances which have the same INSTANCE_ID or
INSTANCE_CODE in APS Destination table MSC_APPS_INSTANCES

4. You can always use steps in Note 137293.1 - Section X to clean up the
data in the APS Destination and reset the EBS Source and collect all data
fresh and new into a new instance.

5. Once you have a VERY CLEAR and intimate understanding of Note 137293.1 and the ways to manipulate Instance Codes and Instance IDs, you can use Section XV of Note 137293.1 to come up with a custom flow for manipulating and setting up your instances after cloning.

6. AGAIN, if you have problems, you can always reset using Section X and collect all data fresh into a new instance.
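The uniqueness rules in Key Points #2 and #3 can be checked with a read-only sketch like this (run the first query on the EBS Source and the second on the APS Destination; the column names are taken from the table descriptions above):

```sql
-- Sketch: each INSTANCE_ID / INSTANCE_CODE pair should appear only once.
-- On the EBS Source:
SELECT instance_id, instance_code FROM mrp_ap_apps_instances_all;
-- On the APS Destination:
SELECT instance_id, instance_code FROM msc_apps_instances;
```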

I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?
For ATP Data Collections, the installation must be Centralized on a single database.
You cannot run ATP Data Collections for a Distributed installation.
You cannot run ATP Data Collections if you are using any Value Chain applications besides ATP for Order Management.

Upgrade and Patching


Make sure the latest patches are applied for 12.1.x; for 12.0.6 or 11.5.10, both the Data Collections and GOP patches must be applied to your instance.

Make sure the INSTALL ALERT Note 552415.1 is resolved for R12 and R12.1 installations.

DBA’s Information
All steps in the DBA Checklist apply, EXCEPT that no database links are required, and Note 137293.1 will not be as important unless you move to a Distributed installation.
Patching is simpler, since all patches are applied to the same instance.

Applications Super User’s Information


All steps in the Applications Super User Checklist apply, with the following exceptions:

1. Note 137293.1 will not be as important.

2. When setting up profiles, we have noted which profiles need to be considered when running only ATP Data Collections.

3. Review section ATP Data Collections vs. Standard Data Collections for more
information

Appendix A
Note 179522.1 should be the primary resource for information about the requests and
other information on data collections objects, parameters, profiles, etc.

List of Setup Requests


The Request ID and Request Name will be listed in the Refresh Collection
Snapshots log file when these requests are launched as part of Data Collections

Shortname Request Name New in 12.1 New in 12.0


MSCDROPS Drop Changed Snapshots    
MSCWSMSN Create WSM Snapshots    
MSCBOMSN Create BOM Snapshots    
MSCINVSN Create INV Snapshots    
MSCCSPSN Create CSP Snapshots x  
MSCMRPSN Create MRP Snapshots    
MSCPOXSN Create PO Snapshots    
MSCONTSN Create OE Snapshots    
MSCWSHSN Create WSH Snapshots    
MSCAHLSN Create AHL Snapshots    
MSCEAMSN Create EAM Snapshots   x
MSCWIPSN Create WIP Snapshots    
MSCSYNMS Collections Synonyms    
MSCVIEWS Collections Views    
MSCTRIGS Collections Triggers    
MSCTRITM Create Collections Item Snapshot Triggers x  
MSCTRBOM Create Collections BOM Snapshot Triggers x  
MSCTRRTG Create Collections Routing Snapshot Triggers x  
MSCTRWIP Create Collections WIP Snapshot Triggers x  
MSCTRDEM Create Collections Demand Snapshot Triggers x  
MSCTRSUP Create Collections Supply Snapshot Triggers x  
MSCTROTH Create Collections Other Snapshot Triggers x  
MSCTRRPO Create Collections Repair Order Snapshot Triggers x  
MSCVWSTP Create Collections Setup Views x  
MSCVWITM Create Collections Item Views x  
MSCVWBOM Create Collections BOM Views x  
MSCVWRTG Create Collections Routing Views x  
MSCVWWIP Create Collections WIP Views x  
MSCVWDEM Create Collections Demand Views x  
MSCVWSUP Create Collections Supply Views x  
MSCVWOTH Create Collections Other Views x  
MSCVWRPO Create Collections Repair Order Views x  

Goal
What are the necessary steps to setup an Item so it appears in the Advanced Supply Chain
Planning - Planner's Workbench - and can be released into Purchasing as a Requisition and then
have the Requisition automatically created into a Purchase Order?

The Solution below contains the following steps to achieve this Goal. Click on any of the links
below to go directly to a specific step.

Summary of Steps
Part 1 - ASCP Item to Purchasing Requisition

1. Step 1-Create the Item


2. Step 2-Enter a Forecast Set

3. Step 3-Create the Master Demand Schedule

4. Step 4-Run the Collection

5. Step 5-Enter the Plan

6. Step 6-Run the Supply Chain Planning Process

7. Step 7-Locate the Requisition

Part 2 - Setup Sourcing to Automatically Create a Purchase Order from the Requisition

8. Step 8-Create the Supplier


9. Step 9-Create the Quote

10. Step 10-Create the ASL

11. Step 11-Enter the Sourcing Rule

12. Step 12-Assign Sourcing Rule/Assignment Set

13. Step 13-Modify the Createpo Workflow

14. Step 14-Modify the Requisition Workflow

15. Step 15-Supply/Demand Plan

16. Step 16-Find the Requisition

17. Step 17-Find the Purchase Order

Solution

This example is based on the Vision Demonstration Database - using the Login of MFG/Welcome
and two responsibilities - already assigned to the User.  The MFG login is tied to the employee
Jonathan Smith.

1. Manufacturing and Distribution Manager


2. Advanced Supply Chain Planner

Part 1 - ASCP Item to Purchasing Requisition


Step 1-Create the Item
Manufacturing and Distribution Manager
Navigate: Inventory - Items - Master Items

Define a new item - enter the Item and a Description - choosing the M1 Inventory Organization when the form opens.
- From the top tool bar choose Tools/Copy From - and use the template 'Purchased Item'
- Choose Apply - then Done
- Enter an Item Name
- Assign the Item to the M1 Inventory Organization

From the Purchasing tab ensure that you have a List price entered as well as a Default Buyer. 
The list price is what will be used when executing Requisition Import.

In the General Planning tab - ensure the Make or Buy option is set to Buy and also ensure there
is a Planner entered.  Notice this is set at the Inventory Organization Level for M1.   Save the
item.

Step 2-Enter a Forecast Set


Manufacturing and Distribution Manager
Navigate: Material Planning - Forecast - Sets
Enter a Forecast Set and Description
 - Click down into the Forecast Lines and enter an actual Forecast Name. Make the names different so they are distinguishable. Choose the Forecast Items button off to the right.

As the new screen opens - enter the Item which was created in Step 1.  Choose the Bucketed
Button in the lower right hand corner of the form.

Then choose the Detail button, enter in the Bucket as Weeks, give a date of today or in the near
future and span the End Date a couple of months out. Enter 25 in the Current field.  The forecast
is essentially pre-calculated demand.  In the example above - the system is being told that there
will be a demand of 25 quantity - per week - from the 26-Mar through 02-Jul.  Save your entry.

Step 3-Create the Master Demand Schedule


Manufacturing and Distribution Manager
Navigate: Material Planning - MDS - Names

Create a new Master Demand Schedule by entering a new record and then entering a Name. 
Save the name and then choose the Load/Copy/Merge button.

The parameter screen will appear for the Load/Copy/Merge MDS request.
   - For Destination Schedule - choose the MDS name that was just entered
   - For Source Type - choose Specific Forecast
   - For Source Name - give the Forecast Name (not the Forecast Set) of the forecast created previously
   - Overwrite - All Entries
   - Set the Cut Off Date a week or two past the date that was entered in the Forecast
Submit the request and ensure it completes before moving to the next step.

In this step you are loading the Forecast into the Demand schedule. 

Step 4-Run the Collection


Advanced Supply Chain Planner
Navigate: Collections - Oracle Systems - Standard Collection

The collection process is responsible for pulling the data into the MSC schema. Essentially, the Advanced Supply Chain Planning product co-exists with the Oracle E-Business Suite application in its own schema. The data collections process is responsible for collecting the data necessary to properly run a plan.

For instance TST (seeded with the Vision Demonstration Database), double-click on the Parameters field on the Planning Data Pull line, then choose:
Collection Method - Complete Refresh

Take the defaults for the rest and submit this request.  It will take some time to complete, but
do not move to the next step until the process completes.

Step 5-Enter the Plan


Advanced Supply Chain Planner
Navigate: Supply Chain Plan - Names

As the form opens, choose the TST:M1 organization.

Enter a Plan Name - and then choose the Plan Options button at the bottom of the form.

Set the following parameters in the Main tab.


   - Planned Items - Demand Schedule Item Only
   - Material Scheduling - Order Start Date
   - Overwrite - All

From the Organization tab enter the Org as TST:M1 - and then choose the MDS (Demand
Schedule) created previously.  It is available as it was pulled during the Collections process.  
Save then close the Plan Options form.

Once the plan options are entered, close the Options and choose the Launch Plan button.
 - Choose the Plan Name if not already present - and submit the plan.
 - Ensure that the plan runs to completion.

Step 6-Run the Supply Chain Planning Process


Advanced Supply Chain Planner
Navigate: Supply Chain Plan - Workbench
In the top of the form - change the View By to 'Items'

 - Choose the '+' next to Plan


 - A list of the ASCP plans is presented - double click on the plan which was just created
 - Scroll down to locate 'NEW.MISC'. Choose the '+' next to 'NEW.MISC'
 - The item should appear under that category

Right Click on the Item - to bring up the shortcut menu and navigate to the Supply/Demand.

The Supply Demand screen will appear. In this screen there are two order types:
 - Planned Order and Forecast MDS
 - The Forecast MDS is the demand that was imposed by the forecast which was created
 - The Planned Order is what is being suggested to the Planner as an action to take to meet the demand

Still in the Supply/Demand form - choose the Release Properties tab.


 - Check the box 'For Release' - and the information should fill in with the recommendation
 - Choose Plan/Release from the top menu bar so that the release process is engaged
Choosing View/Requests will show that the Requisition Import process has been kicked off.
This is going to create the requisition back in the E-Business Suite.

Step 7-Locate the Requisition


Manufacturing and Distribution Manager
Navigate: Purchasing - Requisitions - Requisition Summary

Enter the item in Item field of the Requisition Find form - and search to locate the newly created
requisition.

The requisition is now created, based on the release of the planned order from Advanced Supply Chain Planning. If desired, sourcing rules and approved supplier list entries can be set up using the Manufacturing and Distribution Manager to create Purchase Orders automatically from ASCP.

The steps will follow here:

Part 2 - Setup Sourcing to Automatically Create a Purchase Order from the Requisition

Step 8-Create the Supplier

Manufacturing and Distribution Manager


Navigate: Purchasing - Supply Base - Suppliers

Create a Supplier by entering the Supplier name, then click on the Sites button.

Enter the appropriate site information for the Supplier and save.

Step 9-Create the Quote


Manufacturing and Distribution Manager
Navigate: Purchasing - RFQs and Quotations - Quotations
Create a standard Quotation using the Supplier created in the last step and the ASCP item
previously created.  Change the Status field to Active and save.

From the Price Breaks button approve the line.

Step 10-Create the ASL


Manufacturing and Distribution Manager
Navigate: Purchasing - Supply Base - Approved Supplier List

Enter the following fields:


-Item-enter the ASCP item created previously
-Business-Direct
-Supplier-enter the Supplier created previously
-Site-enter the corresponding Supplier Site
-Status-enter Approved
-Choose the Record Details tab
Save your entry

Enter Yes on the Global field and choose the Attributes tab.

Enter the following fields:


-Release Method-enter Release Using Autocreate

-Type-Quotation

-Number-enter the Quotation number previously created

-Line-enter the line number on the Quotation that corresponds to the ASCP item created

Save your entry

Step 11-Enter the Sourcing Rule


Manufacturing and Distribution Manager
Navigate: Purchasing - Supply Base - Sourcing Rules

Enter a Sourcing Rule name and Effective From and To Dates.  Choose to have the Sourcing rule
for All Orgs or a single Org.  Choose Buy From under Type and enter the Supplier and Site in the
appropriate fields.  Allocation should be 100 with Rank entered as 1. 

Save your entry.

Step 12-Assign Sourcing Rule/Assignment Set


Manufacturing and Distribution Manager
Navigate: Purchasing - Supply Base - Assign Sourcing Rules

Enter a new Assignment Set name to be entered in the profile MRP: Default Sourcing Assignment Set, or query the assignment set already entered in the profile. Enter the ASCP item in the Item/Category field. Type should be Sourcing Rule; enter under the Rule/BoD field the Sourcing Rule name entered in Step 11. Save your entry.

Step 13-Modify the Createpo Workflow


Workflow Builder

Navigate: Open the PO Create Document Workflow

Double click on the + next to PO Create Documents then double click on the + next to Attributes.

Right click on 'Is Automatic Creation Allowed?' and change the Value to Y.

Move down to 'Is Automatic Approval Allowed?' and right click. Change the Value to Y.

Step 14-Modify the Requisition Workflow


Workflow Builder
Navigate: Open the PO Requisition Approval Workflow
Open the Attributes and on 'Send PO Autocreation to Background' right click.

Change Value to N.  Exit from Workflow Builder.

Step 15-Supply/Demand Plan


Manufacturing and Distribution Manager
Navigate: Advanced Planning - Advanced Supply Chain Planning - Supply Chain Plan - Workbench

Right click on the plan previously created to choose Supply/Demand - Supply/Demand

Search on the ASCP item in the Item field.

Choose For Release on the Planned Order line.

Save your entry.

Choose Plan - Release from the toolbar.

Choose View>Request

In Request ID 2799708 it shows the Requisition Import was automatically called.

In Request ID 2799709 it shows the Create Releases program was automatically called.

Step 16-Find the Requisition

Manufacturing and Distribution Manager


Navigate: Purchasing - Requisition - Requisition Summary
Query in the Item field on the ASCP item and press Find.

We can then see the Requisition was automatically created in Approved status.

Step 17-Find the Purchase Order


Manufacturing and Distribution Manager
Navigate: Purchasing - Purchase Orders - Purchase Order Summary

Choose the Related Documents tab and enter the Requisition number created in the previous
step and press Find.

We can see above that Purchase Order 4619 was created from the requisition automatically
using our ASCP item.

Introduction

The Planning Data Collection process involves three separate concurrent programs:

 Refresh Collection Snapshots


 Planning Data Pull

 Planning ODS Load


The Planning Data Collection collects data from the EBS source instance tables to the
APS/Planning destination instance tables (MSC_ tables).  When the Planning Data Collection
process is run, the first program that is launched is the Planning Data Pull.  This program in turn
spawns the Refresh Collection Snapshots program.  This program looks at the changes that have
taken place in the EBS source instance and updates the associated snapshots.  Once all the
snapshots are updated, control is returned to the Planning Data Pull and data is pulled into the
staging tables (MSC_ST_ tables) on the APS/Planning destination instance through the related
planning views (MRP_AP_ views) on the EBS source instance.  The planning views are based on
the snapshots that were just refreshed.  After the Planning Data Pull completes, the Planning
ODS Load program runs and moves the data from the staging tables to the planning tables.

Customers often face issues where certain entities are not collected into the APS/Planning
destination instance.  The goal of this note is to provide diagnostic information to customers and
support engineers to track the various data collection entities from the EBS source tables to the
APS/Planning destination tables.  Each script provides all the data from the related EBS table,
snapshot, synonym, planning view, staging table, and planning table.  If data is not collected into
the planning table, we can trace back through the output to see where the data went missing. 
For example, the data may be present in the snapshot but not in the planning view; investigation
would then be needed to understand why the view did not pick up the data.  Or the data from
the EBS source table may not be in the snapshot, in which case investigation would be needed
to understand why the Refresh Collection Snapshots program did not update the corresponding
snapshot.
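The trace-back described above can be sketched as a small helper that, given the stage CSVs produced by one of the dc_*.sql scripts in chain order, reports the first stage at which a value stops appearing. This is a hypothetical helper, not one of the note's scripts; the key column and file names are illustrative:

```python
# Hypothetical helper: walk the stage CSVs in chain order and report the
# first file in which the value of interest no longer appears.
import csv

def first_missing_stage(stage_files, key_column, key_value):
    """Return the first CSV in which key_value is absent from key_column,
    or None if the value survives the whole chain."""
    for path in stage_files:
        with open(path, newline="") as fh:
            rows = csv.DictReader(fh)
            if not any(row.get(key_column) == key_value for row in rows):
                return path
    return None
```

For example, for the ATP Rules entity the chain would be MTL_ATP_RULES.csv, MRP_AP_ATP_RULES_V.csv, MSC_ST_ATP_RULES.csv, MSC_ATP_RULES.csv; a return value of MSC_ST_ATP_RULES.csv would mean the rule reached the planning view but was never pulled into the staging table.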

Instructions

Click here to download dc_entity.zip

Below is a list of the data collection entities that are seen when you run the Planning Data
Collection concurrent request.  The entities correspond to the parameters of the Planning Data
Pull program.  Click the link for the desired entity to go to the corresponding section of the
note.  There you will find the name of the script, its description and parameters, the name of
the output file, and how to run it.  All the scripts have been zipped into a single file and can be
downloaded here.  Unzip the files and then select the folder that corresponds to the desired
entity.

The scripts can be run in both a centralized and a distributed APS configuration, but are
ALWAYS run from the EBS source instance.  At the beginning of each script you are provided
with a list of instances and database links (if applicable).  Enter the corresponding database link
with an '@' sign at the beginning.  For example, if the database link is ERP_TO_APS, you would
enter '@ERP_TO_APS'.  If no database link is listed, just press the 'ENTER' key and continue
to the other parameters for that particular script.  Each script creates a set of .csv files
corresponding to the Base Table, Source Snapshot, Source Synonym, etc.  Zip the files together
and upload them to the SR.
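The final packaging step can also be scripted. This is a hypothetical sketch using the Python standard library (the archive name dc_output.zip is illustrative):

```python
# Hypothetical packaging step: bundle the generated .csv files from the
# current directory into a single zip archive for upload to the SR.
import glob
import zipfile

def bundle_outputs(pattern="*.csv", archive="dc_output.zip"):
    files = sorted(glob.glob(pattern))
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in files:
            zf.write(path)  # store each CSV at the archive root
    return files
```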
Note:  At this time, not all entities have scripts.  Only the most commonly used ones were
created.  If you find that there is an entity without a script, please email brett.fox@oracle.com
and I will work to get you a script and the note updated.

Data Collection Entities

Approved Supplier Lists
ATP Rules
Bills Of Resources
Calendars - Dates
End Item Substitutes
Forecasts
Key Performance Indicator Targets
Master Demand Schedules
On Hand
Planning Parameters
Project/Tasks
Purchase Orders/Purchase Requisitions
Resource Availability
Safety Stock
Sourcing Rules
Sourcing History
Supplier Responses
Suppliers/Customers/Orgs
Unit Numbers
Units of Measure
Work In Process
Sales Channel
Payback Demand/Supply
Approved Supplier Lists [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
PO_APPROVED_SUPPLIER_LIST | N/A | N/A | MRP_AP_PO_SUPPLIERS_V | MSC_ST_ITEM_SUPPLIERS | MSC_ITEM_SUPPLIERS

Script: dc_asl.sql

Description: Tracks Approved Supplier List information from EBS to APS.

Parameters: Organization Code, Item Name

Usage: sqlplus <apps username>/<apps password> @dc_asl.sql

Example: sqlplus apps/apps @dc_asl.sql

Output Files:
PO_APPROVED_SUPPLIER_LIST.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_PO_SUPPLIERS_V.csv
MSC_ST_ITEM_SUPPLIERS.csv
MSC_ITEM_SUPPLIERS.csv

 
ATP Rules [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MTL_ATP_RULES | N/A | N/A | MRP_AP_ATP_RULES_V | MSC_ST_ATP_RULES | MSC_ATP_RULES

Script: dc_atp.sql

Description: Tracks ATP rules from EBS to APS.

Parameters: ATP Rule Name

Usage: sqlplus <apps username>/<apps password> @dc_atp.sql

Example: sqlplus apps/apps @dc_atp.sql

Output Files:
MTL_ATP_RULES.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_ATP_RULES_V.csv
MSC_ST_ATP_RULES.csv
MSC_ATP_RULES.csv

 
Bills of Materials/Routings/Resources [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
BOM_STRUCTURES_B | BOM_BOMS_SN | MRP_SN_BOMS | MRP_AP_BOMS_V | MSC_ST_BOMS | MSC_BOMS
BOM_COMPONENTS_B | BOM_INV_COMPS_SN | MRP_SN_INV_COMPS | MRP_AP_BOM_COMPONENTS_V | MSC_ST_BOM_COMPONENTS | MSC_BOM_COMPONENTS
BOM_SUBSTITUTE_COMPONENTS | BOM_SUB_COMPS_SN | MRP_SN_SUB_COMPS | MRP_AP_COMPONENT_SUBSTITUTES_V | MSC_ST_COMPONENT_SUBSTITUTES | MSC_COMPONENT_SUBSTITUTES
BOM_OPERATIONAL_ROUTINGS | BOM_OPR_RTNS_SN | MRP_SN_OPR_RTNS | MRP_AP_ROUTINGS_V | MSC_ST_ROUTINGS | MSC_ROUTINGS
BOM_OPERATIONAL_ROUTINGS | BOM_OPR_SEQS_SN | MRP_SN_OPR_SEQS | MRP_AP_ROUTING_OPERATIONS_V | MSC_ST_ROUTING_OPERATIONS | MSC_ROUTING_OPERATIONS
BOM_OPERATION_SEQUENCES | BOM_OPR_RESS_SN | MRP_SN_OPR_RESS | MRP_AP_OP_RESOURCE_SEQS_V | MSC_ST_OPERATION_RESOURCE_SEQS | MSC_OPERATION_RESOURCE_SEQS

Script: dc_bom.sql

Description: Tracks bill of material information from EBS to APS.

Parameters: Organization Code, Assembly Name

Usage: sqlplus <apps username>/<apps password> @dc_bom.sql

Example: sqlplus apps/apps @dc_bom.sql

Output Files:
BOM_STRUCTURES_B.csv
BOM_BOMS_SN.csv
MRP_SN_BOMS.csv
MRP_AP_BOMS_V.csv
MSC_ST_BOMS.csv
MSC_BOMS.csv
BOM_COMPONENTS_B.csv
BOM_INV_COMPS_SN.csv
MRP_SN_INV_COMPS.csv
MRP_AP_BOM_COMPONENTS_V.csv
MSC_ST_BOM_COMPONENTS.csv
MSC_BOM_COMPONENTS.csv
BOM_SUBSTITUTE_COMPONENTS.csv
BOM_SUB_COMPS_SN.csv
MRP_SN_SUB_COMPS.csv
MRP_AP_COMPONENT_SUBSTITUTES_V.csv
MSC_ST_COMPONENT_SUBSTITUTES.csv
MSC_COMPONENT_SUBSTITUTES.csv

Script: dc_rtg.sql

Description: Tracks routing information from EBS to APS.

Parameters: Organization Code, Assembly Name

Usage: sqlplus <apps username>/<apps password> @dc_rtg.sql

Example: sqlplus apps/apps @dc_rtg.sql

Output Files:
BOM_OPERATIONAL_ROUTINGS.csv
BOM_OPR_RTNS_SN.csv
MRP_SN_OPR_RTNS.csv
MRP_AP_ROUTINGS_V.csv
MSC_ST_ROUTINGS.csv
MSC_ROUTINGS.csv
BOM_OPERATIONAL_ROUTINGS.csv
BOM_OPR_SEQS_SN.csv
MRP_SN_OPR_SEQS.csv
MRP_AP_ROUTING_OPERATIONS_V.csv
MSC_ST_ROUTING_OPERATIONS.csv
MSC_ROUTING_OPERATIONS.csv
BOM_OPERATION_SEQUENCES.csv
BOM_OPR_RESS_SN.csv
MRP_SN_OPR_RESS.csv
MRP_AP_OP_RESOURCE_SEQS_V.csv
MSC_ST_OPERATION_RESOURCE_SEQS.csv
MSC_OPERATION_RESOURCE_SEQS.csv

 
Bills Of Resources [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
CRP_BILLS_OF_RESOURCES | N/A | N/A | MRP_AP_BILL_OF_RESOURCES_V | MSC_ST_BILL_OF_RESOURCES | MSC_BILL_OF_RESOURCES
CRP_RESOURCE_HOURS | N/A | N/A | MRP_AP_CRP_RESOURCE_HOURS_V | MSC_ST_BOR_REQUIREMENTS | MSC_BOR_REQUIREMENTS

Script: dc_bor.sql

Description: Tracks Bill of Resource information from EBS to APS.

Parameters: Organization Code, Bill of Resource Name

Usage: sqlplus <apps username>/<apps password> @dc_bor.sql

Example: sqlplus apps/apps @dc_bor.sql

Output Files:
CRP_BILLS_OF_RESOURCES.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_BILL_OF_RESOURCES_V.csv
MSC_ST_BILL_OF_RESOURCES.csv
MSC_BILL_OF_RESOURCES.csv
CRP_RESOURCE_HOURS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_CRP_RESOURCE_HOURS_V.csv
MSC_ST_BOR_REQUIREMENTS.csv
MSC_BOR_REQUIREMENTS.csv

 
Calendars - Dates [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
WSH_CALENDAR_ASSIGNMENTS | N/A | N/A | MRP_AP_CAL_ASSIGNMENTS_V | MSC_ST_CALENDAR_ASSIGNMENTS | MSC_CALENDAR_ASSIGNMENTS
BOM_CALENDAR_DATES | N/A | N/A | MRP_AP_CALENDAR_DATES_V | MSC_ST_CALENDAR_DATES | MSC_CALENDAR_DATES
BOM_PERIOD_START_DATES | N/A | N/A | MRP_AP_PERIOD_START_DATES_V | MSC_ST_PERIOD_START_DATES | MSC_PERIOD_START_DATES
BOM_CAL_YEAR_START_DATES | N/A | N/A | MRP_AP_CAL_YEAR_START_DATES_V | MSC_ST_CAL_YEAR_START_DATES | MSC_CAL_YEAR_START_DATES
BOM_CAL_WEEK_START_DATES | N/A | N/A | MRP_AP_CAL_WEEK_START_DATES_V | MSC_ST_CAL_WEEK_START_DATES | MSC_CAL_WEEK_START_DATES

Script: dc_cal.sql

Description: Tracks calendar information from EBS to APS.

Parameters: Organization Code, Calendar Name, Instance Code

Usage: sqlplus <apps username>/<apps password> @dc_cal.sql

Example: sqlplus apps/apps @dc_cal.sql

Output Files:
WSH_CALENDAR_ASSIGNMENTS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_CAL_ASSIGNMENTS_V.csv
MSC_ST_CALENDAR_ASSIGNMENTS.csv
MSC_CALENDAR_ASSIGNMENTS.csv
BOM_CALENDAR_DATES.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_CALENDAR_DATES_V.csv
MSC_ST_CALENDAR_DATES.csv
MSC_CALENDAR_DATES.csv
BOM_PERIOD_START_DATES.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_PERIOD_START_DATES_V.csv
MSC_ST_PERIOD_START_DATES.csv
MSC_PERIOD_START_DATES.csv
BOM_CAL_YEAR_START_DATES.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_CAL_YEAR_START_DATES_V.csv
MSC_ST_CAL_YEAR_START_DATES.csv
MSC_CAL_YEAR_START_DATES.csv
BOM_CAL_WEEK_START_DATES.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_CAL_WEEK_START_DATES_V.csv
MSC_ST_CAL_WEEK_START_DATES.csv
MSC_CAL_WEEK_START_DATES.csv

 
Demand Classes [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
FND_COMMON_LOOKUPS | N/A | N/A | MRP_AP_DEMAND_CLASSES_V | MSC_ST_DEMAND_CLASSES | MSC_DEMAND_CLASSES

Script: dc_dc.sql

Description: Tracks demand classes from EBS to APS.

Parameters: Demand Class

Usage: sqlplus <apps username>/<apps password> @dc_dc.sql

Example: sqlplus apps/apps @dc_dc.sql

Output Files:
FND_COMMON_LOOKUPS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_DEMAND_CLASSES_V.csv
MSC_ST_DEMAND_CLASSES.csv
MSC_DEMAND_CLASSES.csv

 
End Item Substitutes [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MTL_RELATED_ITEMS | N/A | N/A | MRP_AP_ITEM_SUBSTITUTES_V | MSC_ST_ITEM_SUBSTITUTES | MSC_ITEM_SUBSTITUTES

Script: dc_eis.sql

Description: Tracks end item substitute information from EBS to APS.

Parameters: Organization Code, Item Name

Usage: sqlplus <apps username>/<apps password> @dc_eis.sql

Example: sqlplus apps/apps @dc_eis.sql

Output Files:
MTL_RELATED_ITEMS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_ITEM_SUBSTITUTES_V.csv
MSC_ST_ITEM_SUBSTITUTES.csv
MSC_ITEM_SUBSTITUTES.csv

 
Forecasts [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MRP_FORECAST_DESIGNATORS | MRP_FORECAST_DSGN_SN | MRP_SN_FORECAST_DSGN | MRP_AP_FORECAST_DSGN_V | MSC_ST_DESIGNATORS | MSC_DESIGNATORS
MRP_FORECAST_ITEMS | MRP_FORECAST_ITEMS_SN | MRP_SN_FORECAST_ITEMS | MRP_AP_FORECAST_DEMAND_V | MSC_ST_DEMANDS | MSC_DEMANDS
MRP_FORECAST_DATES | MRP_FORECAST_DATES_SN | MRP_SN_FORECAST_DATES | MRP_AP_FORECAST_DEMAND_V | MSC_ST_DEMANDS | MSC_DEMANDS

Script: dc_fcst.sql

Description: Tracks item forecast information from EBS to APS.

Parameters: Forecast Name, Organization Code, Item Name

Usage: sqlplus <apps username>/<apps password> @dc_fcst.sql

Example: sqlplus apps/apps @dc_fcst.sql

Output Files:
MRP_FORECAST_DESIGNATORS.csv
MRP_FORECAST_DSGN_SN.csv
MRP_SN_FORECAST_DSGN.csv
MRP_AP_FORECAST_DSGN_V.csv
MSC_ST_DESIGNATORS.csv
MSC_DESIGNATORS.csv
MRP_FORECAST_ITEMS.csv
MRP_FORECAST_ITEMS_SN.csv
MRP_SN_FORECAST_ITEMS.csv
MRP_AP_FORECAST_DEMAND_V.csv
MSC_ST_DEMANDS.csv
MSC_DEMANDS.csv
MRP_FORECAST_DATES.csv
MRP_FORECAST_DATES_SN.csv
MRP_SN_FORECAST_DATES.csv
MRP_AP_FORECAST_DEMAND_V.csv
MSC_ST_DEMANDS.csv
MSC_DEMANDS.csv

 
Items [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MTL_SYSTEM_ITEMS_B | MTL_SYS_ITEMS_SN | MRP_SN_SYS_ITEMS | MRP_AP_SYS_ITEMS_V | MSC_ST_SYSTEM_ITEMS | MSC_SYSTEM_ITEMS
MTL_CATEGORY_SETS | N/A | N/A | MRP_AP_CATEGORY_SETS_V | MSC_ST_CATEGORY_SETS | MSC_CATEGORY_SETS
MTL_ITEM_CATEGORIES | MTL_ITEM_CATS_SN | MRP_SN_ITEM_CATS | MRP_AP_ITEM_CATEGORIES_V | MSC_ST_ITEM_CATEGORIES | MSC_ITEM_CATEGORIES

Script: dc_items.sql

Description: Tracks item and category information from EBS to APS.

Parameters: Organization Code, Item Name, Category Set Name

Usage: sqlplus <apps username>/<apps password> @dc_items.sql

Example: sqlplus apps/apps @dc_items.sql

Output Files:
MTL_SYSTEM_ITEMS_B.csv
MTL_SYS_ITEMS_SN.csv
MRP_SN_SYS_ITEMS.csv
MRP_AP_SYS_ITEMS_V.csv
MSC_ST_SYSTEM_ITEMS.csv
MSC_SYSTEM_ITEMS.csv
MTL_CATEGORY_SETS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_CATEGORY_SETS_V.csv
MSC_ST_CATEGORY_SETS.csv
MSC_CATEGORY_SETS.csv
MTL_ITEM_CATEGORIES.csv
MTL_ITEM_CATS_SN.csv
MRP_SN_ITEM_CATS.csv
MRP_AP_ITEM_CATEGORIES_V.csv
MSC_ST_ITEM_CATEGORIES.csv
MSC_ITEM_CATEGORIES.csv

 
Key Performance Indicator Targets [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
GL_PERIODS | N/A | N/A | MRP_AP_BIS_PERIODS_V | MSC_ST_BIS_PERIODS | MSC_BIS_PERIODS
BIS_INDICATORS | N/A | N/A | BISBV_PERFORMANCE_MEASURES | MSC_ST_BIS_PFMC_MEASURES | MSC_BIS_PFMC_MEASURES
BIS_TARGET_LEVELS | N/A | N/A | BISBV_TARGET_LEVELS | MSC_ST_BIS_TARGET_LEVELS | MSC_BIS_TARGET_LEVELS
BIS_TARGET_VALUES | N/A | N/A | BISBV_TARGETS | MSC_ST_BIS_TARGETS | MSC_BIS_TARGETS
BIS_BUSINESS_PLANS | N/A | N/A | BISBV_BUSINESS_PLANS | MSC_ST_BIS_BUSINESS_PLANS | MSC_BIS_BUSINESS_PLANS

   
Script: No script is currently available for this entity.

 
Master Demand Schedules [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MRP_SCHEDULE_DESIGNATORS | N/A | N/A | MRP_AP_DESIGNATORS_V | MSC_ST_DESIGNATORS | MSC_DESIGNATORS
MRP_SCHEDULE_DATES | MRP_SCHD_DATES_SN | MRP_SN_SCHD_DATES | MRP_AP_MDS_DEMANDS_V | MSC_ST_DEMANDS | MSC_DEMANDS

Script: dc_mds.sql

Description: Tracks master demand schedule information from EBS to APS.

Parameters: MDS Name, Organization Code, Item Name

Usage: sqlplus <apps username>/<apps password> @dc_mds.sql

Example: sqlplus apps/apps @dc_mds.sql

Output Files:
MRP_SCHEDULE_DESIGNATORS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_DESIGNATORS_V.csv
MSC_ST_DESIGNATORS.csv
MSC_DESIGNATORS.csv
MRP_SCHEDULE_DATES.csv
MRP_SCHD_DATES_SN.csv
MRP_SN_SCHD_DATES.csv
MRP_AP_MDS_DEMANDS_V.csv
MSC_ST_DEMANDS.csv
MSC_DEMANDS.csv

 
Master Production Schedules [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MRP_SCHEDULE_DATES | MRP_SCHD_DATES_SN | MRP_SN_SCHD_DATES | MRP_AP_MPS_SUPPLIES_V | MSC_ST_SUPPLIES | MSC_SUPPLIES

Script: dc_mps.sql

Description: Tracks Master Production Schedules from EBS to APS.

Parameters: MPS Name, Organization Code, Item Name

Usage: sqlplus <apps username>/<apps password> @dc_mps.sql

Example: sqlplus apps/apps @dc_mps.sql

Output Files:
MRP_SCHEDULE_DATES.csv
MRP_SCHD_DATES_SN.csv
MRP_SN_SCHD_DATES.csv
MRP_AP_MPS_SUPPLIES_V.csv
MSC_ST_SUPPLIES.csv
MSC_SUPPLIES.csv

 
On Hand [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MTL_ONHAND_QUANTITIES_DETAIL | MTL_OH_QTYS_SN | MRP_SN_OH_QTYS | MRP_AP_ONHAND_SUPPLIES_V | MSC_ST_SUPPLIES | MSC_SUPPLIES

Script: dc_ohq.sql

Description: Tracks onhand quantity information from EBS to APS.

Parameters: Organization Code, Item Name

Usage: sqlplus <apps username>/<apps password> @dc_ohq.sql

Example: sqlplus apps/apps @dc_ohq.sql

Output Files:
MTL_ONHAND_QUANTITIES_DETAIL.csv
MTL_OH_QTYS_SN.csv
MRP_SN_OH_QTYS.csv
MRP_AP_ONHAND_SUPPLIES_V.csv
MSC_ST_SUPPLIES.csv
MSC_SUPPLIES.csv

 
Planning Parameters [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MRP_PARAMETERS | N/A | N/A | MRP_AP_PARAMETERS_V | MSC_ST_PARAMETERS | MSC_PARAMETERS

Script: dc_params.sql

Description: Tracks Planning Parameters from EBS to APS.

Parameters: Organization Code

Usage: sqlplus <apps username>/<apps password> @dc_params.sql

Example: sqlplus apps/apps @dc_params.sql

Output Files:
MRP_PARAMETERS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_PARAMETERS_V.csv
MSC_ST_PARAMETERS.csv
MSC_PARAMETERS.csv

 
Planners [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
MTL_PLANNERS | N/A | N/A | MRP_AP_PLANNERS_V | MSC_ST_PLANNERS | MSC_PLANNERS

Script: dc_planr.sql

Description: Tracks Planners from EBS to APS.

Parameters: Organization Code, Planner Code

Usage: sqlplus <apps username>/<apps password> @dc_planr.sql

Example: sqlplus apps/apps @dc_planr.sql

Output Files:
MTL_PLANNERS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_PLANNERS_V.csv
MSC_ST_PLANNERS.csv
MSC_PLANNERS.csv

 
Project/Tasks [top]

Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
PJM_BORROW_TRANSACTIONS | N/A | N/A | MRP_AP_OPEN_PAYBACK_QTY_V | MSC_ST_OPEN_PAYBACKS | MSC_OPEN_PAYBACKS

   

Script: No script is currently available for this entity.

 
 

Attachments

dc_entity.zip (51.98 KB)
