Report Testing
BI Testing
Data loaded into the data warehouse is used for reporting. There are many
business intelligence reporting tools on the market that retrieve data from the data
warehouse and help business people manage the facts of their business data.
For business people to make decisions using the reports, the data presented
in the reports must be accurate. All reports should therefore be tested thoroughly before
they are used by the business users.
The drilling path is validated by drilling down every possible path and validating the data.
All the subtotals along the different paths are checked to ensure they are accurate.
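As a sketch, the subtotal check can be automated once a drill path's detail rows and displayed subtotals have been extracted from the report. The row layout and field names (`region`, `amount`) below are hypothetical:

```python
from collections import defaultdict

def check_subtotals(detail_rows, subtotals, group_key, measure="amount"):
    """Recompute subtotals from the detail rows of a drill path and
    compare them with the subtotals the report displays."""
    computed = defaultdict(float)
    for row in detail_rows:
        computed[row[group_key]] += row[measure]
    mismatches = []
    for group, reported in subtotals.items():
        if abs(computed.get(group, 0.0) - reported) > 1e-6:
            mismatches.append((group, computed.get(group, 0.0), reported))
    return mismatches

# Hypothetical drill-down data: detail rows grouped by region.
rows = [
    {"region": "East", "amount": 100.0},
    {"region": "East", "amount": 50.0},
    {"region": "West", "amount": 75.0},
]
report_subtotals = {"East": 150.0, "West": 70.0}  # "West" is wrong on purpose
mismatches = check_subtotals(rows, report_subtotals, "region")
```

Each mismatch names the group, the recomputed subtotal, and the subtotal the report showed, which makes failures easy to trace back to a drill path.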
1.4 Prompting reports
Some of the reports that users run are prompt reports, which accept different
data inputs. All prompt reports are validated with different input data.
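One way to drive this validation, sketched with invented row and field names: capture the report's output for each prompt value and compare it with a direct filter on the warehouse data.

```python
def validate_prompt_report(report_output_by_prompt, warehouse_rows, key="year"):
    """For each prompt value, compare the rows the report returned with a
    direct filter on the warehouse rows; record pass/fail per prompt."""
    results = {}
    for prompt, report_rows in report_output_by_prompt.items():
        expected = sorted(r["id"] for r in warehouse_rows if r[key] == prompt)
        actual = sorted(r["id"] for r in report_rows)
        results[prompt] = (actual == expected)
    return results

warehouse = [
    {"id": 1, "year": 2022},
    {"id": 2, "year": 2022},
    {"id": 3, "year": 2023},
]
# Report output captured for two prompt inputs; the 2023 run dropped a row.
report_output = {
    2022: [{"id": 1, "year": 2022}, {"id": 2, "year": 2022}],
    2023: [],
}
outcome = validate_prompt_report(report_output, warehouse)
```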
Business Objects is generally used by business users to analyze data and make business-critical
decisions. Testing therefore needs to be a high priority: it should ensure that the tool works
as expected across different scenarios and that the data obtained in each scenario is in sync
with the data warehouse or any other data source.
A Business Objects universe deals with complex data models and joins between tables and
databases, so testing a universe requires expertise in that particular domain. Universe testing
mainly involves testing every fact table and the joins from each fact table to its dimension tables.
Business Objects report testing basically deals with front-end validation of the tool, such as
errors while preparing or running a report.
Testing can also verify the functionality of every feature within a report wherever
necessary.
This section deals with verification of classes, objects, and measures to check whether they are
mapped to the correct fields.
We can perform these checks by opening the data warehouse and comparing the table
structure with the class structure of the report.
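This comparison can be scripted once the warehouse column list and the universe's object-to-field mappings have been exported; the table and object names below are invented for the sketch.

```python
def compare_structures(warehouse_columns, universe_objects):
    """Compare warehouse column names with the fields the universe
    objects map to, reporting anything unmapped or mis-mapped."""
    wh = set(warehouse_columns)
    mapped = set(universe_objects.values())
    return {
        "unmapped_columns": sorted(wh - mapped),
        "objects_without_source": sorted(
            obj for obj, col in universe_objects.items() if col not in wh
        ),
    }

result = compare_structures(
    ["cust_id", "cust_name", "order_total"],
    {"Customer Id": "cust_id", "Customer Name": "cust_nme"},  # typo on purpose
)
```

Here the deliberately misspelled mapping `cust_nme` surfaces as an object without a source column, exactly the kind of wrong mapping this check is meant to catch.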
This is the main area of testing. Data is the soul of every report, so data validation verifies
that the correct data has come into the report.
The universe is a very large, complex data model with many joins, and because of this we
might get extra, missing, or duplicate records in the report.
There can also be cases where the record count matches but some records still fall into the
missing, extra, or duplicate category. We can identify such records in this validation
step.
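A key-level diff makes these cases visible even when the row counts agree. A sketch using multisets of primary-key values (the keys here are made up):

```python
from collections import Counter

def diff_records(warehouse_keys, report_keys):
    """Classify report rows as missing or extra/duplicate relative to the
    warehouse by comparing key multisets, not just row counts."""
    wh, rp = Counter(warehouse_keys), Counter(report_keys)
    missing = sorted((wh - rp).elements())  # in warehouse, absent from report
    extra = sorted((rp - wh).elements())    # in report more often than warehouse
    return missing, extra

# Both sides have 4 rows, yet key 3 is missing and key 2 is duplicated.
missing, extra = diff_records([1, 2, 3, 4], [1, 2, 2, 4])
```

Counter subtraction keeps multiplicities, so a duplicated key shows up under `extra` even when a simple `COUNT(*)` comparison would pass.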
Here we can take a good sample of records (around 5,000) from the data warehouse tables
and check whether these records appear as expected in the report. We can export the report
data to Excel and compare it there with the help of functions or macros.
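The same sample comparison can also be scripted instead of done with Excel macros. In this sketch the exported report is a CSV text stand-in and the column names are hypothetical:

```python
import csv
import io

def compare_sample(warehouse_rows, exported_csv_text, key="id"):
    """Compare sampled warehouse rows with report data exported as CSV;
    return the keys whose rows are missing or differ."""
    reader = csv.DictReader(io.StringIO(exported_csv_text))
    report = {row[key]: row for row in reader}
    problems = []
    for row in warehouse_rows:
        exported = report.get(str(row[key]))
        if exported is None:
            problems.append((row[key], "missing from report"))
        elif exported["amount"] != str(row["amount"]):
            problems.append((row[key], "value mismatch"))
    return problems

sample = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]
exported = "id,amount\n1,100\n2,250\n"  # report shows a wrong amount for id 2
problems = compare_sample(sample, exported)
```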
Source Verification:
This check can be done once the above two validations are complete.
Those validations are based on data warehouse data, but the data in the warehouse itself may
be incorrect, so it is good practice to also verify the report data directly against the source.
Moreover, reports are generally used by business users who can only see the data in the source
and in the report, so verification against the source ensures the report matches what is present
in the source.
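One lightweight form of source verification is to aggregate the same measure by the same key in both the source extract and the report and compare the totals. The field names (`region`, `amt`) are assumptions for the sketch:

```python
def totals_match(source_rows, report_rows, key, measure, tolerance=0.01):
    """Aggregate a measure by key in both the source system and the report
    and check that the totals agree within a tolerance."""
    def aggregate(rows):
        totals = {}
        for r in rows:
            totals[r[key]] = totals.get(r[key], 0.0) + float(r[measure])
        return totals
    src, rpt = aggregate(source_rows), aggregate(report_rows)
    return {k: abs(src.get(k, 0.0) - rpt.get(k, 0.0)) <= tolerance
            for k in set(src) | set(rpt)}

source = [{"region": "East", "amt": 100}, {"region": "East", "amt": 50}]
report = [{"region": "East", "amt": 150}, {"region": "West", "amt": 10}]
verdict = totals_match(source, report, "region", "amt")
```

Comparing aggregates first is cheap; only the keys that fail need a row-level investigation against the source.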
Context Verification:
The contexts technique comes into the picture when two tables can be joined in different ways.
The universe by itself cannot identify which join to choose, so the report user is given options
for the different possible joins.
As part of context verification, we need to make sure that each context has the correct join
and fetches data based on that particular join.
This technique may not be used in every project, so this validation is optional.
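To make the idea concrete, here is a sketch of two contexts as two different join paths, using an in-memory SQLite database; the table and column names are invented. Each context's query must use its intended join and return only that join's rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE region(region_id INTEGER, name TEXT);
CREATE TABLE sales(region_id INTEGER, amount REAL);
CREATE TABLE targets(region_id INTEGER, target REAL);
INSERT INTO region VALUES (1, 'East'), (2, 'West');
INSERT INTO sales VALUES (1, 100.0), (1, 50.0);
INSERT INTO targets VALUES (1, 120.0), (2, 80.0);
""")

# Two contexts: region joined via sales, or region joined via targets.
contexts = {
    "sales": """SELECT r.name, SUM(s.amount) FROM region r
                JOIN sales s ON s.region_id = r.region_id
                GROUP BY r.name ORDER BY r.name""",
    "targets": """SELECT r.name, SUM(t.target) FROM region r
                  JOIN targets t ON t.region_id = r.region_id
                  GROUP BY r.name ORDER BY r.name""",
}
results = {name: conn.execute(sql).fetchall() for name, sql in contexts.items()}
```

The two contexts return different row sets from the same schema, which is exactly why the tester must confirm each context fetches data through its own join and not the other's.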
Verification of Performance:
The performance of reports is a major concern. Performance issues are usually caused by the
two reasons below:
- The report tries to access a huge volume of data. This can be resolved by using query filters.
- Joins between tables may be incorrect or not fine-tuned, resulting in a huge number of
records. This can be resolved by fine-tuning the joins and queries.
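The first point can be illustrated with an in-memory SQLite table (synthetic data): pushing the filter into the SQL keeps the report from pulling every row back before filtering client-side.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales(year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [(2020 + (i % 5), float(i)) for i in range(10_000)],
)

# Without a filter the report fetches every row and filters client-side.
all_rows = conn.execute("SELECT year, amount FROM sales").fetchall()

# A query filter limits the data the report has to access.
filtered = conn.execute(
    "SELECT year, amount FROM sales WHERE year = ?", (2024,)
).fetchall()
```

Here the filtered query moves a fifth as many rows; against a real warehouse the reduction in transferred and rendered data is where the performance gain comes from.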