
White Paper on Oracle Time and Labor (OTL) Archive and Purge Process

www.Fullinterview.com Page 1 of 17











OTL Tables Archive and Purge
WHITE PAPER




1. Introduction
2. Archiving Timecards Process Flow
3. Technical Details
   3.1 Points to Remember
   3.2 Package and Procedure Details
   3.3 Algorithm and Technical Details
4. Screen Shots Showing the Timecard
1. Introduction

Oracle Time & Labor is an E-Business Suite solution for time capture,
validation, time management, and approval.

You can archive timecards you no longer access to release disk storage space
and potentially improve OTL performance. When you archive a timecard, its
detail is moved to archive tables, while its summary information remains
accessible. The summary information appears in summary windows and in
reports, assuring users that the timecard exists even if they cannot view or
update the timecard details.

Archiving timecards involves identifying timecards you no longer need to access
and moving them to archive tables in OTL. Restoring timecards involves
identifying timecards you want to reinstate and restoring them to live tables in
OTL. You need to restore archived data only if you have archived a data set in
error or you need to bring the data back for audit or legal reasons.


Intended Audience
This paper is intended for readers who want to understand the concept and
architecture of the OTL tables. Readers should possess a fair knowledge of
Oracle and of Oracle Time and Labor concepts to understand this document.
The scope of this document is to detail the requirements for the OTL tables
Archive and Purge process.
The OTL processes are standard processes, and one must be aware of how and
when to run them and know the prerequisite setups. The document will help
Oracle Projects/OTL practitioners understand the process and the packages
and procedures being called. This process is required for better performance
of daily running jobs in a production environment.

How do workers enter their time using Oracle Time & Labor?

Workers use a self service web page in the format of a timecard. You can create
different timecard layouts for each group of workers so that they only see the
fields that are relevant to them. You can rearrange the fields on the layouts, edit
the lists of values, and rewrite the instruction text to make it easy for your
workers to fill out their timecards quickly and accurately.

Workers on the move can enter their time and labor data in a spreadsheet and
upload it later to the application.

Workers can also use other entry devices such as time clocks, telephones, or
legacy entry systems. You can easily enter this data into Oracle Time & Labor
using the robust timecard API.

Effective Management of Tablespace after Archive
(DBA Specific Activity)

By moving the data off the live tables to the backup tables, the Archive Data
Set process safely creates a backup copy of your old data. While the backup
tablespace can be taken offline and stored on cheaper storage, we recommend
that the online tables be effectively reorganized and the space fully reused.
While reorganizing tables is a DBA activity, ensure that all dependent objects
involved with the above tables are considered as part of the reorganization.

2. Archiving Timecards Process Flow



3. Technical Details
1. Set up the OTL Advanced Process Administrator responsibility. You use this
responsibility to define data sets and run the Archive and Restore processes.
The programs are available to the standard responsibility. For business needs,
the processes/programs can be added to any custom request group (ALL
Projects and OTL Programs) attached to any custom responsibility; this way,
the standard responsibility need not be requested.
2. Set the following profile options at responsibility level for the OTL
Advanced Process Administrator responsibility (or the custom responsibility
to which the programs have been added). These profile options can be given
any value, depending on business needs:
OTL: Archive Restore Chunk Size: 150
OTL: Minimum Age of Data Set for Archiving: 6 (This is the minimum
value from Profile values LOV)
OTL: Max Errors: 100
3. For the Custom Responsibility, the profile options should have the Hierarchy
Type Access Level set to VISIBLE and UPDATABLE. This is to ensure that the
profile options can be set at responsibility level.
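A quick way to verify these values is to query the FND profile tables directly. This is a hedged sketch using the standard E-Business Suite profile tables; level_id 10003 denotes the responsibility level:

```sql
-- Check OTL-related profile option values set at responsibility level.
-- level_id 10003 corresponds to the Responsibility level in FND profiles.
SELECT fpo.user_profile_option_name,
       fpov.profile_option_value,
       fpov.level_value AS responsibility_id
  FROM fnd_profile_options_vl    fpo,
       fnd_profile_option_values fpov
 WHERE fpo.profile_option_id = fpov.profile_option_id
   AND fpo.user_profile_option_name LIKE 'OTL:%'
   AND fpov.level_id = 10003;
```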












3.1 Points to Remember
1. Defining a data set depends on the week ending date: by default, it picks up
   all weekly periods in which the timecards lie.
   For example, this query shows the number of records that would be fetched
   initially (the count will differ because of the scope column). Here the
   start date is 10-JUN-2008 and the end date is 14-JUN-2008:

   SELECT *
     FROM hxc_time_building_blocks a
    WHERE TRUNC(stop_time) BETWEEN to_date('10-JUN-2008','DD-MON-YYYY')
                               AND to_date('14-JUN-2008','DD-MON-YYYY')
      AND scope = 'TIMECARD';
2. In case the Validate Data Set program completes with WARNING, check the
   log message for a description of the exception records. If they are not
   corrected and the next step, Archive Data Set, is run with Ignore
   Validation Warnings set to Yes, the valid records will be archived along
   with the exception records.
3. The Archive Data Set and Restore Data Set processes are DML-intensive
   programs and work best when the load on the application is minimal.
4. It is important that you include one complete period of approval and
   retrieval within a data set. If your approval period spans multiple data
   sets, you might have an issue when one of these data sets is offline and
   the other online.
5. Good data sets are small data sets, because they let you Archive and
Restore easily.
6. Since the processes are DML intensive, it is best to have the schema
   statistics gathered at the maximum possible percentage after each run.
   After every couple of executions of the Archive or Restore process, it is
   recommended that you run the Gather Schema Statistics program for the
   entire HXC schema, or run Gather Table Statistics for both the live and
   archive tables, for best results.
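As a sketch, the statistics gathering recommended above can also be run from SQL*Plus using the standard FND_STATS package (the schema, table names, and parameter style here are illustrative; verify the signatures on your release):

```sql
-- Gather statistics for the entire HXC schema.
BEGIN
  fnd_stats.gather_schema_statistics(schemaname => 'HXC');
END;
/

-- Or gather statistics for a specific live/archive table pair.
BEGIN
  fnd_stats.gather_table_stats(ownname => 'HXC',
                               tabname => 'HXC_TIME_BUILDING_BLOCKS');
  fnd_stats.gather_table_stats(ownname => 'HXC',
                               tabname => 'HXC_TIME_BUILDING_BLOCKS_AR');
END;
/
```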
7. Tables that get archived
HXC_AP_DETAIL_LINKS
HXC_APP_PERIOD_SUMMARY
HXC_TC_AP_LINKS
HXC_TIME_ATTRIBUTE_USAGES
HXC_TIME_ATTRIBUTES
HXC_TIME_BUILDING_BLOCKS
HXC_TRANSACTION_DETAILS
HXC_TRANSACTIONS

into

HXC_AP_DETAIL_LINKS_AR
HXC_APP_PERIOD_SUMMARY_AR
HXC_TC_AP_LINKS_AR
HXC_TIME_ATTRIBUTE_USAGES_AR
HXC_TIME_ATTRIBUTES_AR
HXC_TIME_BUILDING_BLOCKS_AR
HXC_TRANSACTION_DETAILS_AR
HXC_TRANSACTIONS_AR
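To see how rows are distributed between each live table and its _AR counterpart, a simple count comparison such as the following can be used — a sketch covering two of the table pairs; extend the UNION ALL for the rest:

```sql
-- Compare row counts between live tables and their archive counterparts.
SELECT 'HXC_TIME_BUILDING_BLOCKS' AS table_name,
       (SELECT COUNT(*) FROM hxc_time_building_blocks)    AS live_rows,
       (SELECT COUNT(*) FROM hxc_time_building_blocks_ar) AS archived_rows
  FROM dual
UNION ALL
SELECT 'HXC_TIME_ATTRIBUTES',
       (SELECT COUNT(*) FROM hxc_time_attributes),
       (SELECT COUNT(*) FROM hxc_time_attributes_ar)
  FROM dual;
```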
8. After a set of data has been marked for archival and then restored later,
   the same dates cannot be used again for defining another data set. That
   will result in the error: "There is an existing Data Set whose range
   overlaps with the period specified."
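Before defining a new data set, it helps to review the ranges already in use. Assuming the usual columns on hxc_data_sets (column names hedged; verify against your instance), a query such as this lists existing data sets and their ranges:

```sql
-- List existing data sets with their date ranges and status, to avoid
-- the "existing Data Set whose range overlaps" error.
SELECT data_set_id,
       data_set_name,
       start_date,
       end_date,
       status,
       validation_status
  FROM hxc_data_sets
 ORDER BY start_date;
```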

9. In case the process errors out due to an unforeseen error, ensure you
   rerun the process to successful completion.


3.2 Package and Procedure Details

Module Name: Define Data Set
Package/Procedure: hxc_archive_restore_process.define_data_set
Description: This procedure defines the data set which has to be archived and
marks the records with a data set id. This uniquely identifies the set of
records to be archived as a group in future.

Module Name: Validate Data Set
Package/Procedure: hxc_archive_restore_process.validate_data_set
Description: This procedure validates whether the records are fit for
archival.

Module Name: Archive Data Set
Package/Procedure: hxc_archive_restore_process.archive_data_set
Description: This procedure deletes the records from the live tables and
inserts them into the archive tables.

Module Name: Restore Data Set
Package/Procedure: hxc_archive_restore_process.restore_data_set
Description: This procedure restores the data from the archive tables to the
live tables.

Module Name: Undo Define Data Set
Package/Procedure: hxc_archive_restore_process.undo_define_data_set
Description: This procedure is used to undo the defined data set.

3.3 Algorithm and Technical Details
1. Run the Define Data Set Process:
To run the Define Data Set process:
• In the Submit Request window, select Define Data Set.
• Enter a unique name for the data set, for example, March 2003. You use
  this defined data set name throughout the archiving process.
• Enter the date range. For example, to archive all timecards for March
  2003, enter 01-Mar-2003 in the Data Set Start Date field and 31-Mar-2003
  in the Data Set End Date field.
Brief Description
• Calls hxc_data_set.validate_data_set_range: validates that the start date
  of the data set is not greater than or equal to the stop date of the data
  set.
• Validates that no data set is already present in the hxc_data_sets table
  whose date range overlaps with this data set's date range.
• Validates that a data set with the same name does not exist in the
  hxc_data_sets table.
• Marks all the tables with a value of data_set_id.
More Details:
Calls hxc_archive_restore_process.define_data_set
1. Validates that the data set date range is valid.
2. Calls hxc_data_set.insert_into_data_set and creates the record for the
   data set id in the hxc_data_sets table.
3. Calls hxc_data_set.mark_tables_with_data_set:
   selects records from hxc_time_building_blocks for the data set within the
   date range, inserts them into hxc_temp_timecard_chunks, and updates all
   related HXC tables with the data_set_id.
Parameter Name   List of Values (If any)   Validations   Comments
Data Set Name    -                         -             E.g. Q2FY_Test1
Start Date       -                         -             10-Jun-2008
End Date         -                         -             14-Jun-2008
After running the program you should notice some of the following points:
1) DATA_SET_ID in hxc_time_building_blocks inserted with 1.
2) New record inserted in hxc_data_sets with status = 'ON_LINE'.
Other initial updates/inserts: check the log for more details.
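The marking done by Define Data Set can be verified with a count of the rows now carrying the new data_set_id (a sketch; bind :data_set_id to the id shown in the log):

```sql
-- Confirm how many timecard building blocks were marked by Define Data Set.
SELECT COUNT(*) AS marked_rows
  FROM hxc_time_building_blocks
 WHERE data_set_id = :data_set_id;

-- Confirm the data set record itself is online.
SELECT data_set_name, status
  FROM hxc_data_sets
 WHERE data_set_id = :data_set_id;
```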
2. Run the Validate Data Set Process:
To run the Validate Data Set process:
1. In the Submit Request window, select Validate Data Set.
2. Select the data set you want to validate.
3. Click Submit.
4. View the error log for warnings. If there are errors, then review
the timecards to find out whether they are still active, and
run the validate process again.

Brief Description:
• Validates how many unretrieved timecards there are in submitted,
  approved, rejected, and working status.
• Validates how many un-notified timecards there are.
• Validates how many rejected timecards there are.
• If the number of errors exceeds the value set for the profile option
  OTL: Max Errors in Validate Data Set, sets the validation status in
  hxc_data_sets to 'E'.
More Details:
Calls hxc_archive_restore_process.validate_data_set
1. Calls hxc_data_set.validate_data_set to validate the data set.
2. Fetches all unretrieved timecards in a non-errored status from all HXC
   tables, with hxc_transactions.status <> 'SUCCESS' and
   approval_status <> 'ERROR'.
3. From the marked data set, fetches and displays all non-retrieved records
   within the date range, except ERROR records, and gets counts for:
   1) APPROVED
   2) REJECTED
   3) SUBMITTED
   4) WORKING
4. Validates whether there are any errored timecards in the range.
5. Validates whether there are any un-notified timecards.
6. After all validations, updates validation_status := 'V' in hxc_data_sets
   for the data set id.
7. If there are any records which have not been retrieved yet,
   validation_status is marked with 'E'.
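The per-status counts gathered in step 3 can be approximated with a query like the following (a sketch; the approval_status column and scope filter follow the query shown in section 3.1):

```sql
-- Count unretrieved timecards in the data set by approval status.
SELECT approval_status,
       COUNT(*) AS timecard_count
  FROM hxc_time_building_blocks
 WHERE data_set_id = :data_set_id
   AND scope = 'TIMECARD'
   AND approval_status IN ('APPROVED', 'REJECTED', 'SUBMITTED', 'WORKING')
 GROUP BY approval_status;
```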

Parameter Name   List of Values (If any)   Validations   Comments
Data Set Name    -                         -             E.g. Q2FY_Test1


3. Run the Archive Data Set Process:
To run the Archive Data Set process:
1. In the Submit Request window, select Archive Data Set.
2. Select the data set you want to archive.
3. Indicate whether the archive process is to ignore any validation errors.
Warning: This option enables you to archive timecards that are in error. Use
this option only when you are certain the errors are acceptable.
4. Click Submit.

Brief Description:
• Checks the Data Set Name passed to the archive program.
• Checks whether the time range satisfies the OTL: Minimum Age of Data Set
  for Archiving condition; if not, it throws an error. Validates that the
  Validate Data Set process has been run at least once; otherwise it forces
  you to run the Validate Data Set process first.
• If there are warnings shown by the Validate Data Set process, asks you to
  correct them first or set the Ignore Validation Warnings flag to Yes.
• Takes a count of the live and archive tables before the archival.
• Starts the archival process. First deletes all records from the temp
  table hxc_temp_timecard_chunks. Then inserts records from the live tables
  (one by one, all tables mentioned above), limited to the chunk size, into
  the temp table. Then inserts into the corresponding archive tables.
More Details:
Calls hxc_archive_restore_process.archive_data_set
1. Gets the data set info by calling hxc_data_set.get_data_set_info.
2. Checks for the minimum gap between sysdate and the end_date of the
   archival period marked for records in the data set, and validates the
   date ranges.
3. Calls hxc_data_set.mark_tables_with_data_set (makes sure that all the
   timecards have been marked correctly).
4. Calls hxc_data_set.lock_data_set (locks all the timecards first), which
   in turn calls hxc_lock_api.request_lock to lock the selected records
   from hxc_time_building_blocks with the data_set_id.
5. Validates again by calling hxc_data_set.validate_data_set to show all
   errors.
6. Calls hxc_archive_restore_utils.core_table_count_snapshot (takes a table
   count before archiving).
7. Calls hxc_archive_restore_utils.bkup_table_count_snapshot (this
   procedure gives a snapshot of the backup tables).
8. Calls the archival process, hxc_archive.archive_process.
   This procedure is called during the Archive Data Set process. For a
   given data set, it copies the records from the base tables to the
   archive tables and deletes the records in the base tables. It removes
   the links from hxc_tc_ap_links and hxc_ap_detail_links. It cancels open
   notifications. This process is done in chunks.
9. Releases locks by calling hxc_data_set.release_lock_data_set.
10. After archiving, repeats the snapshot process:
    -- Calls hxc_archive_restore_utils.core_table_count_snapshot
    -- Calls hxc_archive_restore_utils.bkup_table_count_snapshot
    -- Calls hxc_archive_restore_utils.count_snapshot_check (this procedure
       checks the count snapshots of the live and backup tables)


After Running the Program you should notice some of the following:
(View Log for more details)
1. For example, inserted records:
   • hxc_time_building_blocks_ar: 6544
   • hxc_time_attribute_usages_ar: 10972
   • hxc_time_attributes_ar: 10160
   • hxc_transaction_details_ar: 5041
   • hxc_transactions_ar: 150
   • hxc_tc_ap_links_ar: 110
   • hxc_ap_detail_links_ar: 1514
   • hxc_app_period_summary_ar: 110
2. Table hxc_data_sets updated with status = 'OFF_LINE'.
3. Records deleted from hxc_time_building_blocks (where data_set_id = 1)
   and other HXC tables.
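The counts in the log can be cross-checked against the tables themselves. This is a sketch: it assumes the _AR tables carry the same data_set_id column as the live tables, which should be verified on your instance:

```sql
-- After archival, the live table should hold no rows for the data set,
-- and the archive table should hold them all.
SELECT (SELECT COUNT(*)
          FROM hxc_time_building_blocks
         WHERE data_set_id = :data_set_id) AS remaining_live_rows,
       (SELECT COUNT(*)
          FROM hxc_time_building_blocks_ar
         WHERE data_set_id = :data_set_id) AS archived_rows
  FROM dual;
```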


4. Run the Restore Data Set Process:
Run the Restore Data Set process to move archived timecard data from the
archive tables back to the active tables in the OTL application.

Parameter Name              List of Values (If any)   Validations   Comments
Data Set Name               -                         -             E.g. Q2FY_Test1
Ignore Validation Warnings  -                         -             Yes/No

To run the Restore Data Set process:
1. In the Submit Request window, select Restore Data Set.
2. Select the name of the data set you want to restore.
3. Click Submit. The process restores the archived data and makes it
   available in the OTL application.

More Details
Calls hxc_archive_restore_process.restore_data_set
1. Call hxc_data_set.get_data_set_info and get the data set to be restored.
2. Call:
   • hxc_archive_restore_utils.core_table_count_snapshot
   • hxc_archive_restore_utils.bkup_table_count_snapshot
   • hxc_restore.restore_process for the restore.
     This procedure is called during the Restore Data Set process. For a
     given data set, it copies the records from the archive tables to the
     base tables and deletes the records in the archive tables. It restores
     the links in hxc_tc_ap_links and hxc_ap_detail_links. It starts the
     approval process again for eligible application period ids. This
     process is done in chunks.
3. After the restore, take a table count again and compare:
   • hxc_archive_restore_utils.core_table_count_snapshot
   • hxc_archive_restore_utils.bkup_table_count_snapshot
   • hxc_archive_restore_utils.count_snapshot_check
Parameter Name   List of Values (If any)   Validations   Comments
Data Set Name    -                         -             E.g. Q2FY_Test2
5. Run the Undo Data Set Process:
The Undo Data Set process ungroups the timecard data you defined in the
Define Data Set process. Run this process if the Define Data Set process
returns errors. Typically, errors occur in the Define Data Set process if the
date range includes too much timecard data for the temporary tables to
accommodate.

To run the Undo Data Set process:
1. In the Submit Request window, select Undo Data Set.
2. Select the data set name you defined in the Define Data Set process.
3. Click Submit.

More Details
Calls hxc_archive_restore_process.undo_define_data_set
1) Get the data set information by calling hxc_data_set.get_data_set_info
2) Pass the data set to hxc_data_set.undo_define_data_set
3) Update the data set id to null in chunks of 100 for the following tables:
   HXC_TIME_BUILDING_BLOCKS
   HXC_TIME_ATTRIBUTE_USAGES
   HXC_TIME_ATTRIBUTES
   HXC_TRANSACTION_DETAILS
   HXC_TRANSACTIONS
   HXC_TIMECARD_SUMMARY
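A successful undo leaves no rows carrying the data set id. A quick check across a few of the tables above can confirm this (a sketch; extend to the remaining tables as needed):

```sql
-- After Undo Data Set, these counts should all be zero.
SELECT (SELECT COUNT(*) FROM hxc_time_building_blocks
         WHERE data_set_id = :data_set_id) AS tbb_rows,
       (SELECT COUNT(*) FROM hxc_time_attributes
         WHERE data_set_id = :data_set_id) AS attr_rows,
       (SELECT COUNT(*) FROM hxc_transactions
         WHERE data_set_id = :data_set_id) AS txn_rows
  FROM dual;
```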

Parameter Name   List of Values (If any)   Validations   Comments
Data Set Name    -                         -             E.g. Q2FY_Test2



















4. Screen Shots showing the Timecard:

Before Archival
Note that Details icon is enabled.






















Details of the Timecard
























After Archival

Note that Details icon is disabled.
