Data Collection Process in ASCP
Contents
PDF for Printing This Document
Purpose
Understanding Centralized vs. Distributed Installation
Understanding the Data Collections Process
3 Key Stages to the Data Collections Process
Refresh Snapshots Process
Planning Data Pull
ODS Load
Complete, Net Change and Targeted Refresh
Setup Requests for Data Collections
Purge Staging Tables - A Special Note
ATP Data Collections vs. Standard Data Collections
Fresh Installation or Upgrade - Basic Steps
For 12.1
For 12.0.6
For 11.5.10.2
Upgrading EBS Applications - Special Considerations
DBA Checklist
VCP Applications Patch Requirements
RDBMS Patches and Setups for Known Issues
Other RDBMS Setups / Maintenance
RAC Database - Special Requirements
System Administrator Setups
Applications Super User Checklist
VCP Applications Patch Requirements
Fresh Install/Upgrade Setup Steps
Running Data Collections for the First Time in a New Instance
Parameters for First Run of Data Collections
Preventing Data Collections Performance Issues
How Can I Check on Performance When Data Collections is Running?
Cloning and Setup after Cloning for Data Collections
I Have a Centralized Installation – What should I do?
Steps to Consider after Cloning when Using Distributed Installation
What Happens when I Setup a New Instance in the Instances Form to Connect an APS Destination to the EBS Source?
Key Points
I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?
Upgrade and Patching
DBA’s Information
Applications Super User’s Information
Appendix A
List of Setup Requests
References
Revisions
Purpose
This document offers recommendations for customers using Value Chain Planning -
VCP (aka Advanced Planning and Scheduling - APS) who are performing a fresh
installation, an upgrade, or cloning of their instances.
We will explain the difference between Centralized and Distributed installations and
other terminology used in this document.
We will help you ensure the patching and setups are correct so you can launch and
complete Data Collections successfully.
We will address this functionality for 11.5.10 and above, with a concentration on R12.1.
Patching for a Centralized Installation - All patches are applied to the single
instance. You can ignore the separate centralized/distributed instructions in the
patch readme and apply all patches to your single instance. Most patch readmes
simply state that the patch is applied to your instance when Centralized.
Distributed Installation - Two (or more) databases located on different machines. One is
the EBS Source Instance, where all OLTP transactions and setups are performed. The APS
Destination is where the Data Collections process is launched; it uses database links to
communicate and move the data into the MSC tables on the APS Destination.
1. The processing performed for Planning in ASCP, IO, DRP, etc. can consume huge
amounts of memory and disk I/O. This can slow down the OLTP processing while
the plan processes are running. It is not unusual for large plans to consume
anywhere from 5-20+ GB of memory while running and to move millions of rows
into and out of tables to complete the planning process.
1. Note that you may NOT have an APS Destination with a lower version than the
EBS Source (example - APS Destination on 12.0 or 11.5 with EBS Source
on 12.1).
3. It is possible to have more than one EBS Source connected to the APS
Destination. This requires careful consideration, as patching for all instances
must be maintained.
Example: 2 EBS Source instances connected to 1 APS Destination instance, all
on the same EBS applications release. For VCP patches that affect
Collections, all three instances must be patched at the same time in order to
prevent breaking this integration. Or a customer may have 2 EBS Sources on
different releases, for example an APS Destination on 12.1 with one EBS Source
on 12.1 and another on 11.5.10.
Patching for a Distributed Installation - This requires careful attention to the
patch readme to make sure the patches are applied to the correct instance and
that pre-requisite and post-requisite patches are applied to the correct instance.
The patch readme should be clear about where the patch is to be applied. Usually the
patch will refer to the EBS (or ERP) Source (or 'source instance') and to the APS
Destination (or 'destination instance') for where the patch is to be applied.
12.1.x – For our 12.1 CU (or cumulative patches), we are only releasing a
single patch and the patch will be applied to both the EBS and APS
instances.
Demantra Integration patches will have a separate EBS Source Side
patch requirement.
12.0.x and 11.5.10 – For these releases we have separate patches for
different parts of the VCP applications.
2. Data Collections patches are applied to BOTH the EBS and APS
instances. The readme may list specific patches to be applied to
the EBS Source for certain functionality.
6. You can launch Refresh Collection Snapshots as a standalone process and specify
either Fast, Complete or Automatic Refresh (refer to Note 211121.1 for more
information).
1. Standard processes
2. Setup Requests
1. These are requests that create all the objects required for the
Collections process.
2. Note that the timeout parameter (default 180 minutes) does not count
time used for Refresh Collection Snapshots request – so if Refresh takes
45 minutes and Data Pull takes 150 minutes, it will not time out.
4. When all workers are finished, the Planning Data Pull completes and reports back
to the main request, which then launches the Planning ODS Load for the next
stage of collections.
2. The ODS Load then launches Planning ODS Loader Worker requests to handle
the many different load tasks that move data from the MSC_ST staging
tables to the base MSC tables.
3. During a complete refresh we create TEMP tables, copy the data from the MSC_ST
staging tables into the TEMP tables, and then use Exchange Partition technology
to flip each temp table into the partitioned base table.
1. For instance, if your instance code is TST, you can see in the log file
where we create the temp table SYSTEM_ITEMS_TST (an illustrative statement
is sketched below).
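As a hedged illustration of what this exchange looks like (the partition and temp table names here are hypothetical examples; the real names appear in the ODS Load log file), the swap is a DDL statement of this form:

-- Illustrative only: partition and temp table names are hypothetical.
-- The temp table (e.g. SYSTEM_ITEMS_TST for instance code TST) is loaded from
-- the MSC_ST staging data, then swapped into the partitioned base table.
ALTER TABLE msc.msc_system_items
  EXCHANGE PARTITION system_items__21
  WITH TABLE msc.system_items_tst
  INCLUDING INDEXES
  WITHOUT VALIDATION;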
5. The ODS Load will launch Planning Data Collections - Purge Staging Tables
to remove data from the MSC_ST staging tables. If Data Collections fails
at any stage, you should run Planning Data Collections - Purge Staging
Tables with parameter Validation = No to reset the process and get ready
to launch it again. If you miss this step, the next run will error with
messages like those shown in Note 730037.1.
2. Trading Partners
2. In order to choose Refresh Mode = Net Change or Targeted, you must set
Planning Data Pull parameter Purge Previously Collected Data = No.
1. This can only be run for transactional entities, like WIP, PO, OM, and
some setup entities like Items, BOM, etc. The XLS attachment in Note
179522.1 explains which entities can be collected in Net Change mode.
2. If you set parameter Yes for a setup entity - like ATP Rules - it is ignored
when running Net Change collections.
5. OPM customers cannot use Targeted Refresh. They must use Complete
Refresh to collect OPM data.
2. Data Collections will launch the Refresh Snapshots process, which will detect this
setting and launch the Setup Requests. See the list in Appendix A.
3. When the Setup Requests have completed successfully, the profile will
automatically be set to No (or N in 11.5.10).
5. The Request ID and Request Name will be listed in the Refresh Collection
Snapshots log file when these requests are launched as part of Data Collections
6. If a Setup Request fails, run Planning Data Collections - Purge Staging
Tables with parameter Validation = No, then launch again. We have seen many
cases where initial failures in the Setup Requests are resolved by running 2 or 3
times.
3. This will also remove the data from the MSC_ST% staging tables that are
populated by the Planning Data Pull.
1. Centralized Installation
2. Only using ATP for Order Management and NOT using ANY other Value
Chain Planning Applications
3. If you run ATP Data Collections in Mode - Complete Refresh when you
are using Value Chain Planning Applications, you will corrupt the data in
the MSC tables. You will have to run Standard Data Collections with a
Complete Refresh to recover from this mistake.
2. The ATP Data Collections section in the Note 179522.1 XLS attachment includes
information on how to run Standard Data Collections to mimic ATP Data
Collections.
1. This can be useful if there is a problem with ATP Data Collections. You
can try standard collections to see if this works to overcome the issue.
4. A Targeted Refresh is like Complete, but only collects data for the entities
set to Yes. CAUTION: if you make a mistake and run Complete
Refresh instead of Targeted Refresh with the wrong entities set to No, then
data will be lost and errors will occur. A Complete Refresh would then
be required to recover from this mistake.
3. Note 436771.1 is an ATP FAQ which will be helpful to any customer using
GOP/ATP.
For 12.1
Customers must install the latest release of EBS Applications and during the
implementation cycle, if a new release of EBS becomes available, plan to install that
release (example: Installed EBS 12.1.2, and three months later EBS 12.1.3 becomes
available, then it is very important to plan a move to this latest release.)
For the applications install, review Note 806593.1 Oracle E-Business Suite Release 12.1 -
Info Center.
1. Note 746824.1 12.1 - Latest Patches and Installation Requirements for Value
Chain Planning. Use this note to reference:
2. The Latest available VCP patch for your release must be applied after
the installation.
3. Understand the Patching requirements if you have a separate EBS
Source instance.
2. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data
Collections Instance in R12
4. Note 763631.1 Getting Started With R12.1 Value Chain Planning (aka Advanced
Planning and Scheduling APS) - Support Enhanced Release Content Document
(RCD)
1. If you also need to review new features for R12.0, then also see Note
414332.1
5. If you are installing to use Rapid Planning, then see Note 964316.1 - Oracle
Rapid Planning Documentation Library
For 12.0.6
Customers should NOT be planning to use this release, but should move to 12.1.
Note 401740.1 - R12 Info Center is available for customers who must perform this
installation
1. Note 421097.1 R12 - Latest Patches and Critical Information for VCP - Value
Chain Planning (aka APS - Advanced Planning and Scheduling)
1. You must apply the latest CU Patches available after the installation
and before releasing the instance to users.
2. If you are using only ATP Data Collections, then apply the GOP Patch and
Data Collections patches.
2. Note 412702.1 Getting Started with R12 - Advanced Planning and Scheduling
Suite FAQ
3. Note 414332.1 Getting Started With R12 - Advanced Planning Support Enhanced
RCD
4. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data
Collections Instance in R12
For 11.5.10.2
In the rare case that an upgrade to 11.5.10.2 is being considered, AND it is not an
intermediate step on the way to R12.1, then it is very critical that the latest patches be
applied.
Review Note 223026.1 and apply the latest Rollup patches for at minimum:
1. Using ASCP - then apply ASCP Engine/UI and Data Collections rollup
3. Using ATP Data Collections only - then apply GOP Patch and Data Collections
Patch
Also review Note 883202.1 - Minimum Baseline Patch Requirements for Extended
Support on Oracle E-Business Suite 11.5.10
1. You can run the SQL in the note to verify that tables MSC_NET_RES_INST_AVAIL
and MSC_SYSTEM_ITEMS have matching partitions.
2. After the upgrade, applying the latest patches is a very critical step that cannot be
postponed. Users should not start setup and testing on the base release of your upgrade.
DBA Checklist
Note: After patching, there may be invalid or missing objects used for Data
Collections.
The patch should set profile MSC: Source Setup Required = Yes (or Y in 11.5.10).
Then, with the first run of Data Collections, the Setup Requests will be launched and
build all the required objects for Data Collections, so this will resolve any invalid or
missing objects used for Data Collections. As noted above, you can find more
information on the Data Collection objects and the Setup Requests, etc. in Note
179522.1.
All three of these RDBMS bugs relate to some data not being collected; the issue has
been seen in different entities under different conditions in different database versions,
so we recommend applying all of these patches.
If you have this issue NOW, see step #3.c and execute this as a workaround
immediately. It has resolved most issues, but you will still be exposed to the other
RDBMS bugs below.
1. RDBMS bug 6720194 fix should be applied - Wrong results from SQL with
semijoin and complex view merging
4. As a workaround, you can also set all snapshots for Data Collections to
use NOPARALLEL to prevent missing data.
6. Then, if any rows are returned, run the generated output and it will set DEGREE = 1.
Example: alter table MRP.MRP_FORECAST_DATES_SN NOPARALLEL;
(An illustrative query that generates such statements is sketched below.)
1. This is part of our performance tuning for MLOGs per Note 1063953.1
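The following is an illustrative query only, not the exact SQL delivered with the note; the %_SN name filter is an assumption based on the snapshot naming used for Data Collections, so review the output before running it.

-- Illustrative sketch: find snapshot tables with a parallel degree set and
-- generate the statements to set them NOPARALLEL.
SELECT 'alter table ' || owner || '.' || table_name || ' noparallel;' AS fix_sql
  FROM dba_tables
 WHERE table_name LIKE '%\_SN' ESCAPE '\'
   AND TRIM(degree) NOT IN ('0', '1');

If any rows are returned, run the generated statements and then re-run the query to confirm that nothing is returned.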
Other RDBMS Setups / Maintenance
1. Ensure that the RDBMS setups for Oracle Applications in Note 396009.1 (or for
11i - Note 216205.1) are completed and all relevant settings for the RDBMS are
correct.
In the section for Database Initialization Parameter Sizing, for large data
volumes we recommend using at least the sizing settings for 101-500 users.
2. Test and confirm that database links are working for the APPS user in
SQL*Plus (see the example below).
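A minimal check, assuming an example link name of ERP_TO_APS (the same example name used later in this document); substitute the link names defined in your Instances form and test the links on both the EBS Source and the APS Destination:

-- Run as APPS; a single row ('X') proves the link is working.
SELECT * FROM dual@ERP_TO_APS;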
1. This note is required reading for both DBAs and Applications Super
Users.
2. Review the note to understand and manage the MLOGs for Data
Collections and the related applications that share the same MLOGs we use
for Data Collections.
3. When a lot of data changes happen to the base tables, the MLOGs grow
large and must be managed.
AND/OR
4. If the ENI and OZF snapshots mentioned in the note are on the system,
then they must be manually refreshed to keep the MLOG tables
maintained and prevent performance issues.
5. You may see the initial run of Refresh Snapshots taking a long time. During
the setup phases, this will usually settle down after the first 2-3 runs of
Data Collections, unless there is a lot of data processing happening
between runs of Data Collections and the MLOGs grow rapidly.
1. Review the note and use the steps in section - I Want to Check
My System and Prevent a Performance Problem (ANSWER #1) -
to prepare the system and avoid performance problems.
5. PLEASE NOTE that we expect Data Collections to take significant time during the
initial run of Data Collections. This is because we could be moving millions of
rows of data from the OLTP tables to the empty MSC tables.
1. Once Data Collections has completed for the first time and Gather
Schema Statistics is run for the MSC schema (and the MSD schema if using
Demantra or Oracle Demand Planning), the process should settle
down and have better timing.
2. When using ATP with a Distributed Installation and the EBS Source is a RAC
database, you must see Note 266125.1 and perform additional setups.
3. Confirm that you have resolved the install issue mentioned in Note 552415.1
INSTALL ALERT - Setting Up The APS Partitions and Data Collections Instance in
R12.
1. Running the SQL in the note should confirm that both tables have
matching partitions (an illustrative check is sketched below).
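An illustrative check (not the SQL from Note 552415.1) that compares the partition counts of the two tables named earlier in this document:

-- Both tables should return the same number of partitions;
-- if they differ, follow the steps in Note 552415.1.
SELECT table_name, COUNT(*) AS partition_count
  FROM dba_tab_partitions
 WHERE table_owner = 'MSC'
   AND table_name IN ('MSC_SYSTEM_ITEMS', 'MSC_NET_RES_INST_AVAIL')
 GROUP BY table_name;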
4. If you have RAC Database then see RAC Database - Special Requirements
5. Confirm that Standard Manager (or if used, the specialized manager) has
workshifts/processes as defined above in System Administrator Setups
2. This will ensure that all partitions are created properly for the ATP tables;
if they are already created, then it does no harm.
8. Download the attachment in Note 179522.1 - see the Profiles Tab to setup the
profiles for Data Collections.
2. You should only collect from ALL Orgs for the first run of Data Collections, so
do NOT enter any extra Orgs that you plan to collect later in the
implementation.
3. You can set up Collection Groups, just do NOT use a Collection Group for
the first run of Data Collections.
It is not unusual for the first run to require that you use Timeout parameter of
600-1600 minutes to get all data collected from the EBS Source tables to the
APS Destination MSC tables. It does not matter if you are using database links
and two instances or a single instance. Be sure to set this high parameter for
both the Planning Data Pull and ODS Load.
Do not use a Collection Group during the first run, collect using parameter = All
If you are not collecting from all Orgs defined in the EBS source, then only define
the required orgs in the Instances form. Be sure to include the Master Org in the
Instances form.
Once Data Collections has completed successfully for the first time, run Gather
Schema Stats for the MSC schema (see the sketch below); the timing should then
settle down and be reasonable. The default timeout of 180 minutes for the Planning
Data Pull and 60 minutes for the ODS Load can then be maintained.
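Gather Schema Statistics is normally submitted as a concurrent request from the System Administrator responsibility; as a sketch, an equivalent call from SQL*Plus as APPS, assuming the standard FND_STATS wrapper is available, would be:

-- Gather statistics for the MSC schema after the first successful collection.
-- Also gather the MSD schema if Demantra or Oracle Demand Planning is used.
BEGIN
   fnd_stats.gather_schema_stats('MSC');
END;
/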
4. If your EBS Source instance is 11.5.10 and you have just applied Data
Collections Rollup #32 or higher for the first time, then special steps are
required.
Run Data Collections using Targeted Refresh for Planners only. This is a
one-time setup step for the very first run of Collections after the patch is
applied.
Key Parameters are:
5. If you have a lot of data to collect, then the Refresh Snapshots process
could take several hours to run for this initial run.
1. Set Timeout parameter very high - suggest 900 - 1600 for the
first run
3. All other parameters should be set to default for the first run of
collections
4. If there are other parameters that will need to be Yes for your
implementation, then plan to set those Yes after you get the
initial run of collections completed successfully.
1. Set Timeout parameter very high - suggest 900 - 1600 for the
first run
8. When the first run is completed, then run Gather Schema Statistics for
the MSC schema.
10. IF the Planning Data Pull or ODS Load fails, check for a Timeout error or
other specific error in the log file of the Main request and in all the log files
for the Workers as well. You must run Planning Data Collections - Purge
Staging Tables with parameter Validation = No before you launch
Data Collections again. If you fail to run this request, then Data
Collections will error with message text as seen in Note 730037.1.
1. The profiles in Note 179522.1 should be set up correctly for your
instance. Below we have some suggestions, BUT you must review all the
profiles for proper settings.
Notes on the Profile Setups:
2. Make sure the Staging Tables profile is set to Yes if you are collecting from
only 1 EBS Source instance - MSC: Purge Staging and Entity Key
Translation Tables / MSC: Purge Stg Tbl Cntrl.
1. If you are collecting from Multiple Instances, then you must set
this profile to No.
2. In this case, the MSC_ST staging tables may grow very large in
dba_segments and the high water mark will not be reset when we
delete from these tables.
3. If you must set the profile to No, then we suggest that the DBA
periodically schedule a truncate of the MSC_ST% tables when
Data Collections is NOT running, to recover this space and help
performance (a sketch that generates the statements follows these profile notes).
4. Check to make sure Debug profiles are not set except in cases where
they are required for investigation of an issue.
5. IF you are using Projects and Tasks and have many records, then if the profile MSC:
Project Task Collection Window Days is available, it can be set to help
performance.
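As suggested in the profile notes above, a minimal sketch of how the DBA could generate the truncate statements for the staging tables (review the list before executing, and only run while Data Collections is NOT running):

-- Generate TRUNCATE statements for the MSC staging tables to reset the
-- high water mark and recover space.
SELECT 'truncate table ' || owner || '.' || table_name || ';' AS stmt
  FROM dba_tables
 WHERE owner = 'MSC'
   AND table_name LIKE 'MSC\_ST\_%' ESCAPE '\';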
2. Once the initial collection has been performed and most data collected,
then use the steps in section -
I Want to Check My System and Prevent a Performance Problem - and
ANSWER #1: Yes, we are using Data Collections now.
3. When you execute these steps, you will be setting up the MLOGs to
provide the best performance.
4. You must also be aware of any MLOGs that are shared with
other applications. In this case, the DBA will need to actively manage
those other Snapshots, refreshing them manually; a cron job may be
the best method.
3. If you have very large BOM tables and plan to use Collection Groups to
collect information only for certain Orgs, then be aware that this can take
very significant TEMP space during the ODS Load. We have seen reports
where All Orgs takes only 6-12 GB of TEMP space, but when using a
Collection Group, TEMP space requirements exceed 30 GB.
How Can I Check on Performance When Data Collections is Running?
1. Note 186472.1 has SQL to check on specific requests that are running and show
the SQL that is running on the RDBMS (an illustrative query of this kind is
sketched after this list).
1. When you run this SQL you provide the Request ID and it checks to find
the SQL being run by this session.
2. If you have several Planning Data Pull Workers (or ODS Load Workers)
running, then you may need to check several different requests.
3. Run this SQL every 10-15 minutes and check if the output changes.
2. Note 280295.1 has the requests.sql script, which will provide you with details on
all the requests in a set.
1. Enter the request id for the request that launched the set and it will
provide details of all the requests. The default request name for the set is
Planning Data Collection (Request Set Planning Data Collection).
3. Note 245974.1 - #7 and #11 show how to setup trace for requests, should this
be required to get details on a performance issue.
1. #11 in that note is very useful for setting trace for Refresh Collection
Snapshots on the EBS Source when it is launched by Data Collections
from the APS Destination.
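For reference, here is an illustrative query of the same kind as the Note 186472.1 script (it is not that script; the join of FND_CONCURRENT_REQUESTS.ORACLE_PROCESS_ID to V$PROCESS.SPID is an assumption about your release, so prefer the note's own SQL where possible):

-- Show the SQL currently being executed by a running concurrent request.
-- Replace &request_id with the Planning Data Pull / ODS Load (worker) request id.
SELECT s.sid, s.serial#, s.sql_id, q.sql_text
  FROM fnd_concurrent_requests r,
       v$process p,
       v$session s,
       v$sql q
 WHERE r.request_id   = &request_id
   AND p.spid         = r.oracle_process_id
   AND s.paddr        = p.addr
   AND q.sql_id       = s.sql_id
   AND q.child_number = s.sql_child_number;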
However, if you want to change the Instance Code in the Instances form, then
you must follow steps in Note 137293.1 - Section X to clean up all the old data
BEFORE you create a new instance. If you just change the instance code in the
form, then this will cause data corruption.
6. You should NOT manually change the FREE_FLAG using SQL. You
should create a new Instance partition using Create APS Partition
request with parameters Instance partition = 1 and Plan partition = 0.
We do not recommend creating extra Instance Partitions, since this
creates many objects in the database that are not required.
7. If the previous collected data and plan data from the old instance is not
required, then it should be removed from the system.
Key Points
1. You will have to MANUALLY manipulate the EBS Source instance table
MRP_AP_APPS_INSTANCES_ALL.
2. You cannot have TWO instances which have the same INSTANCE_ID or
INSTANCE_CODE in EBS Source table MRP_AP_APPS_INSTANCES_ALL
3. You cannot have TWO instances which have the same INSTANCE_ID or
INSTANCE_CODE in APS Destination table MSC_APPS_INSTANCES
4. You can always use steps in Note 137293.1 - Section X to clean up the
data in the APS Destination and reset the EBS Source and collect all data
fresh and new into a new instance.
I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?
For ATP Data Collections, the instance must be Centralized installation on a single
database.
You cannot run ATP Data Collections for a distributed installation.
You cannot run ATP Data Collections if you are using any Value Chain Applications
besides ATP for Order Management.
Make sure the INSTALL ALERT Note 552415.1 is resolved for R12 and R12.1 installations.
DBA’s Information
All steps in the DBA Checklist apply, EXCEPT that no database links are required, and Note
137293.1 will not be so important unless you move to a Distributed installation.
Patching is simpler, since all patches are applied to the same instance.
3. Review section ATP Data Collections vs. Standard Data Collections for more
information
Appendix A
Note 179522.1 should be the primary resource for information about the requests and
other information on data collections objects, parameters, profiles, etc.
Goal
What are the necessary steps to setup an Item so it appears in the Advanced Supply Chain
Planning - Planner's Workbench - and can be released into Purchasing as a Requisition and then
have the Requisition automatically created into a Purchase Order?
The Solution below contains the following steps to achieve this Goal. Click on any of the links
below to go directly to a specific step.
Summary of Steps
Part 1 - ASCP Item to Purchasing Requisition
Part 2 - Setup Sourcing to Automatically Create a Purchase Order from the Requisition
Solution
This example is based on the Vision Demonstration Database - using the Login of MFG/Welcome
and two responsibilities - already assigned to the User. The MFG login is tied to the employee
Jonathan Smith.
Define a new item - Enter the Item and a Description - choosing M1 Inventory Organization
when the form opens.
- From the top tool bar choose Tools/Copy From - and use the Template 'Purchased Item'
- Choose Apply - then Done
- Enter an Item Name
- Assign the Item to the M1 Inventory Organization
From the Purchasing tab ensure that you have a List price entered as well as a Default Buyer.
The list price is what will be used when executing Requisition Import.
In the General Planning tab - ensure the Make or Buy option is set to Buy and also ensure there
is a Planner entered. Notice this is set at the Inventory Organization Level for M1. Save the
item.
As the new screen opens - enter the Item which was created in Step 1. Choose the Bucketed
Button in the lower right hand corner of the form.
Then choose the Detail button, enter in the Bucket as Weeks, give a date of today or in the near
future and span the End Date a couple of months out. Enter 25 in the Current field. The forecast
is essentially pre-calculated demand. In this example the system is being told that there
will be a demand of 25 quantity - per week - from the 26-Mar through 02-Jul. Save your entry.
Create a new Master Demand Schedule by entering a new record and then entering a Name.
Save the name and then choose the Load/Copy/Merge button.
-
The parameter screen will appear for the Load/Copy/Merge MDS request.
- For Destination Schedule - choose the MDS name that was just entered
- For Source Type - choose Specific Forecast
- For Source Name - give the Forecast Name (not forecast set) of the forecast created
previously
- Overwrite - All Entries
- Set the Cut Off Date a week or two past the date that was entered in the Forecast
Submit the request and ensure it completes before moving to the next step.
In this step you are loading the Forecast into the Demand schedule.
Collections is responsible for pulling the data into the MSC schema. Essentially, the
Advanced Supply Chain Planning product co-exists with the Oracle E-Business Suite application
in its own schema. The data collections process is responsible for collecting the data necessary
in order to properly run a plan.
For instance - Double click on the Parameters field on the Planning Data Pull line then choose
TST (Seeded with Vision Demonstration Database):
Collection Method - Complete Refresh
Take the defaults for the rest and submit this request. It will take some time to complete, but
do not move to the next step until the process completes.
Enter a Plan Name - and then choose the Plan Options button at the bottom of the form.
From the Organization tab enter the Org as TST:M1 - and then choose the MDS (Demand
Schedule) created previously. It is available as it was pulled during the Collections process.
Save then close the Plan Options form.
Once the plan options are entered, close the Options and choose the Launch Plan button.
- Choose the Plan Name if not already present - and submit the plan.
- Ensure that the plan runs to completion.
Right Click on the Item - to bring up the shortcut menu and navigate to the Supply/Demand.
The Supply Demand screen will appear. In this screen - there are two order types.
- Planned Order and Forecast MDS
- The forecast MDS is the demand that was imposed by the forecast which was created
- The planned order is what is being suggested to the Planner for an action to take to alleviate
the demand
Enter the item in Item field of the Requisition Find form - and search to locate the newly created
requisition.
The requisition is now created - based on the release of the planned order from the Advanced
Supply Chain Planning. If desired, sourcing rules and approved supplier list entries can be setup
using the Manufacturing and Distribution Manager to create Purchase Orders automatically
from ASCP.
Part 2 - Setup Sourcing to Automatically Create a Purchase Order from the Requisition
Create a Supplier by entering the Supplier name, then click on the Sites button.
Enter the appropriate site information for the Supplier and save.
- Type - Quotation
- Line - enter the line number on the Quotation that corresponds to the ASCP item created
Enter a Sourcing Rule name and Effective From and To Dates. Choose to have the Sourcing rule
for All Orgs or a single Org. Choose Buy From under Type and enter the Supplier and Site in the
appropriate fields. Allocation should be 100 with Rank entered as 1.
Enter a new Assignment set name to be entered in the Profile - MRP: Default Sourcing
Assignment Set or query the assignment set already entered in the profile. Enter the ASCP item
in the Item/Category field. Type should be Sourcing Rule and enter under the Rule/BoD field the
Sourcing Rule name entered in Step 10. Save your entry.
Double click on the + next to PO Create Documents then double click on the + next to Attributes.
Move down to 'Is Automatic Approval Allowed?' and right click. Change Value to Y.
Choose View>Request
We can then see the Requisition was automatically created in Approved status.
Choose the Related Documents tab and enter the Requisition number created in the previous
step and press Find.
We can see above that Purchase Order 4619 was created from the requisition automatically
using our ASCP item.
Introduction
The Planning Data Collection process involves three separate concurrent programs:
Refresh Collection Snapshots, Planning Data Pull, and Planning ODS Load.
Many times customers face issues where certain entities are not collected into the APS/Planning
destination instance. The goal of this note is to provide diagnostic information to customers and
support engineers to track the various data collection entities from the EBS source tables to the
APS/Planning destination tables. Each script will provide all the data from the related EBS table,
snapshot, synonym, planning view, staging table, and planning table. If data is not collected into
the planning table, we can trace back through the output to see where the data went missing.
For example, the problem could be that the data was in the snapshot, but not in the planning
view. Investigation would then be needed to understand why the view did not pick up the data.
Or maybe the issue is that the data from the EBS source table is not in the snapshot. In this
case, investigation would be needed to understand why the Refresh Collection Snapshot
program did not update the corresponding snapshot.
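As a minimal manual sketch of the same layer-by-layer tracing idea (this is not one of the delivered scripts), the Demand Classes entity could be traced like this; ERP_TO_APS is an example database link name, and the link is omitted on a centralized installation:

-- Run from the EBS source as APPS. A drop in the count between layers shows
-- where the data went missing. Note: the staging table is normally purged at
-- the end of a successful run, so a zero count there is not by itself an error.
SELECT 'SOURCE VIEW' AS layer, COUNT(*) AS row_count FROM mrp_ap_demand_classes_v
UNION ALL
SELECT 'STAGING TABLE', COUNT(*) FROM msc_st_demand_classes@ERP_TO_APS
UNION ALL
SELECT 'PLANNING TABLE', COUNT(*) FROM msc_demand_classes@ERP_TO_APS;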
Instructions
Below is a list of the data collection entities that are seen when you run the Planning Data
Collection concurrent request. The entities correspond to the parameters for the Planning Data
Pull program. Click on the link of the desired entity and it will take you to the corresponding
section in the note. There you will find the name of the script, its description and parameters,
the name of the output file, and how to run it. All the scripts have been zipped into a single file
and can be downloaded here. Unzip the files and then select the folder that corresponds to the
desired entity.
The scripts can be run in both a centralized and a decentralized APS configuration, but are
ALWAYS run from the EBS Source instance. At the beginning of each script you are provided
with a list of instances and database links (if applicable). Enter the corresponding database link
with an '@' sign at the beginning. For example, if the database link is ERP_TO_APS, you would
enter '@ERP_TO_APS'. If no database link is listed, then just press the 'ENTER' key and continue
to the other parameters for that particular script. Each script will create a list of .csv files
corresponding to the Base Table, Source Snapshot, Source Synonym, etc. Zip the files together
and upload them to the SR.
Note: At this time, not all entities have scripts. Only the most commonly used ones were
created. If you find that there is an entity without a script, please email brett.fox@oracle.com
and I will work to get you a script and the note updated.
Payback Demand/Supply
Approved Supplier Lists [top]
Script: dc_asl.sql
Output File:
PO_APPROVED_SUPPLIER_LIST.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_PO_SUPPLIERS_V.csv
MSC_ST_ITEM_SUPPLIERS.csv
MSC_ITEM_SUPPLIERS.csv
ATP Rules [top]
Script: dc_atp.sql
Bills of Materials/Routings/Resources [top]
Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
BOM_COMPONENTS_B | BOM_INV_COMPS_SN | MRP_SN_INV_COMPS | MRP_AP_BOM_COMPONENTS_V | MSC_ST_BOM_COMPONENTS | MSC_BOM_COMPONENTS
BOM_OPERATIONAL_ROUTINGS | BOM_OPR_RTNS_SN | MRP_SN_OPR_RTNS | MRP_AP_ROUTINGS_V | MSC_ST_ROUTINGS | MSC_ROUTINGS
BOM_OPERATIONAL_ROUTINGS | BOM_OPR_SEQS_SN | MRP_SN_OPR_SEQS | MRP_AP_ROUTING_OPERATIONS_V | MSC_ST_ROUTING_OPERATIONS | MSC_ROUTING_OPERATIONS
Script: dc_bom.sql
Output File:
BOM_STRUCTURES_B.csv
BOM_BOMS_SN.csv
MRP_SN_BOMS.csv
MRP_AP_BOMS_V.csv
MSC_ST_BOMS.csv
MSC_BOMS.csv
BOM_COMPONENTS_B.csv
BOM_INV_COMPS_SN.csv
MRP_SN_INV_COMPS.csv
MRP_AP_BOM_COMPONENTS_V.csv
MSC_ST_BOM_COMPONENTS.csv
MSC_BOM_COMPONENTS.csv
BOM_SUBSTITUTE_COMPONENTS.csv
BOM_SUB_COMPS_SN.csv
MRP_SN_SUB_COMPS.csv
MRP_AP_COMPONENT_SUBSTITUTES_V.csv
MSC_ST_COMPONENT_SUBSTITUTES.csv
MSC_COMPONENT_SUBSTITUTES.csv
Script: dc_rtg.sql
Bills Of Resources [top]
Script: dc_bor.sql
Calendars - Dates [top]
Script: dc_cal.sql
Demand Classes [top]
Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
FND_COMMON_LOOKUPS | N/A | N/A | MRP_AP_DEMAND_CLASSES_V | MSC_ST_DEMAND_CLASSES | MSC_DEMAND_CLASSES
Script: dc_dc.sql
Output File:
FND_COMMON_LOOKUPS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_DEMAND_CLASSES_V.csv
MSC_ST_DEMAND_CLASSES.csv
MSC_DEMAND_CLASSES.csv
End Item Substitutes [top]
Script: dc_eis.sql
Forecasts [top]
Script: dc_fcst.sql
Items [top]
Script: dc_items.sql
Key Performance Indicator Targets [top]
Base Table | Source Snapshot | Source Synonym | Source View | Staging Table | Planning Table
GL_PERIODS | N/A | N/A | MRP_AP_BIS_PERIODS_V | MSC_ST_BIS_PERIODS | MSC_BIS_PERIODS
BIS_TARGET_VALUES | N/A | N/A | BISBV_TARGETS | MSC_ST_BIS_TARGETS | MSC_BIS_TARGETS
Script:
Description:
Parameters:
Output File:
Master Demand Schedules [top]
Script: dc_mds.sql
Master Production Schedules [top]
Script: dc_mps.sql
Output File:
MRP_SCHEDULE_DATES.csv
MRP_SCHD_DATES_SN.csv
MRP_SN_SCHD_DATES.csv
MRP_AP_MPS_SUPPLIES_V.csv
MSC_ST_SUPPLIES.csv
MSC_SUPPLIES.csv
On Hand [top]
Script: dc_ohq.sql
Output File:
MTL_ONHAND_QUANTITIES_DETAIL.csv
MTL_OH_QTYS_SN.csv
MRP_SN_OH_QTYS.csv
MRP_AP_ONHAND_SUPPLIES_V.csv
MSC_ST_SUPPLIES.csv
MSC_SUPPLIES.csv
Planning Parameters [top]
Script: dc_params.sql
Output File:
MRP_PARAMETERS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_PARAMETERS_V.csv
MSC_ST_PARAMETERS.csv
MSC_PARAMETERS.csv
Planners [top]
Script: dc_planr.sql
Output File:
MTL_PLANNERS.csv
NO_SOURCE_SNAPSHOT.csv
NO_SOURCE_SYNONYM.csv
MRP_AP_PLANNERS_V.csv
MSC_ST_PLANNERS.csv
MSC_PLANNERS.csv
Project/Tasks [top]
Script:
Description:
Parameters:
Output File: