2016.01 Audit Work On Information Produced by
AUDIT WORK ON INFORMATION PRODUCED BY THE ENTITY
Introduction
ISA 500, para. 9 states, “When using information produced by the entity [(IPE) as audit evidence during the execution phases of the
audit], the auditor shall evaluate whether the information is sufficiently reliable for the auditor’s purposes, including, as necessary in the
circumstances:
a. Obtaining audit evidence about the accuracy and completeness of the information; and
b. Evaluating whether the information is sufficiently precise and detailed for the auditor’s purposes.”
All IPE used as audit evidence is to be documented and evaluated. The documentation includes: the audit procedures in which the
particular IPE will be used; the related IT systems; and the approach to assessing the reliability of the IPE (whether by testing controls
or substantively). It is to be completed regardless of the audit phase during which the IPE is identified and received.
This guidance focuses on approaches to gaining assurance over the reliability of system-generated IPE. Other procedures may be
performed over other types of IPE.
The amount of testing required to assess whether the IPE is ‘sufficiently reliable’ depends on two main factors:
1. Assurance planned from the audit procedure in which the IPE is being used; and
2. The IPE’s inherent reliability based on its characteristics.
At least a minimal level of work has to be performed for any IPE used as audit evidence.
The amount of assurance we plan to obtain from an audit test that uses the IPE, combined with the inherent reliability of the IPE that
we are using, will impact how much testing we consider necessary to perform on the IPE to evaluate whether it is sufficiently reliable
for our purposes.
[Figure: Amount of IPE Testing to be Performed]
Examples of minimal procedures that could be performed for system-generated IPE may include the following (as appropriate):
• Completeness checks (agreeing to other reports / support; ensuring rows / columns / sheets are not missing);
• Observation of the IPE's production, or reproduction of the IPE, to confirm that the report mechanics work and that the relevant
data has been captured in the report;
• Review of the selection criteria used to produce the report (to verify that the relevant accounting period, date and population were
used);
• Comparison of the IPE to the previous year (by inquiry or by comparison to the previous year's report);
• Recalculations (of the report as a whole or of a sample of relevant fields);
• Review of system reliability checks / reports.
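As an illustration only, a completeness check of a system-generated report against supporting data might compare record counts and a control total on a key field. The function and field names below are hypothetical and not part of any prescribed methodology:

```python
def completeness_check(report_rows, source_rows, field="amount"):
    """Agree the report to supporting data: same number of rows
    and the same control total for a key field."""
    same_count = len(report_rows) == len(source_rows)
    same_total = (sum(r[field] for r in report_rows)
                  == sum(r[field] for r in source_rows))
    return same_count and same_total

# Report agreed to its supporting detail: counts and totals match.
report = [{"amount": 100.0}, {"amount": 250.0}]
support = [{"amount": 100.0}, {"amount": 250.0}]
assert completeness_check(report, support)
```

In practice such checks are usually performed in the audit file or with CAATs rather than ad hoc scripts; the sketch only makes the comparison explicit.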
For example, when IPE is a standard report from an off-the-shelf application, appropriate testing of the reliability of the IPE may be to
perform a completeness check and reproduce the IPE ourselves.
To assess the amount of assurance planned from the audit procedure that uses the IPE, refer to the relevant R factor, i.e. the
assurance to be obtained from that procedure.
For example, where IPE is used in an Other Substantive Procedure (OSP), the amount of assurance obtained from that OSP can vary,
thus influencing the level of reliability needed from the IPE. If the FSA and assertion have a Significant RMM Level, and no other
procedures are planned for that FSA and assertion other than the OSP which uses the IPE, then the planned assurance level from that
OSP is 3.0. In contrast, if Tests of Controls (TOCs) or Substantive Analytical Procedures (SAPs) are planned to achieve an assurance
level of 2.0 on that FSA and assertion, then the assurance required from the OSP that uses the IPE is 1.0. The higher the assurance we
want to obtain from the test, the more reliable the IPE needs to be.
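The assurance arithmetic in the example above can be sketched as follows; the function name and structure are illustrative only, not part of the methodology:

```python
def required_osp_assurance(planned_total: float, other_assurance: float = 0.0) -> float:
    """Assurance required from the OSP that uses the IPE, given the total
    planned assurance for the FSA/assertion and the assurance already
    achieved by other procedures (e.g., TOCs or SAPs)."""
    return planned_total - other_assurance

# Significant RMM level, no other procedures planned for the FSA/assertion:
print(required_osp_assurance(3.0))        # 3.0
# TOCs / SAPs planned to achieve an assurance level of 2.0:
print(required_osp_assurance(3.0, 2.0))   # 1.0
```

The higher the result, the more extensive the testing over the IPE's reliability needs to be.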
The following report types are listed in order of decreasing IPE reliability:
• Standard report - a built-in report produced from a commercial off-the-shelf application that end users cannot
change or customise;
• Vendor customised report - a report from a commercial off-the-shelf application that was customised specially
for the client (e.g., fields were added / removed, value lists populated);
• Client customised report - a custom-built report according to user predefined requirements; or
• User customised report - reports produced by the client using Report Writing and Design Tools or Query
Language (e.g., SQL queries).
iii. Test data - input test transactions or data for which we can predict the outcome and match this to the actual report (e.g. by
creating exceptions that should be captured in an exception report).
iv. Code review of report queries to analyse the report logic and other parameters.
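The test-data approach in (iii) can be illustrated with a minimal sketch. The exception report below is hypothetical (an approval-limit check on invoices); the point is that we seed transactions whose outcome we can predict and match the prediction to the actual report output:

```python
def exception_report(invoices, approval_limit):
    """Hypothetical exception report: flags invoices above an approval limit."""
    return [inv for inv in invoices if inv["amount"] > approval_limit]

# Input test transactions, including one designed to trigger an exception.
test_invoices = [
    {"id": "T1", "amount": 500.0},    # within limit - should not be flagged
    {"id": "T2", "amount": 15000.0},  # above limit - should be flagged
]

flagged = exception_report(test_invoices, approval_limit=10000.0)
# The predicted outcome matches the actual report output.
assert [inv["id"] for inv in flagged] == ["T2"]
```

If the report fails to capture the seeded exception, the report logic is not operating as expected and the IPE cannot be treated as reliable without further work.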
To assist audit teams in summarising IPE used as audit evidence, an optional form is provided in the October 2015 APT
library (document 3.01.02, Summary of Information Produced by the Entity) (APT BDO World pages). Work performed to assess the
reliability of the IPE is documented in the appropriate places in the CAW and cross-referenced to this summary. A link to this optional
form is provided for your convenience.
CONTACT
Questions related to this publication are to be directed to your Regional Audit Adviser (RAA) or submitted via email to
audit@bdo.global.