
1. BI Testing
Data loaded into the data warehouse is used for reporting purposes. There are many
business intelligence reporting tools in the market that retrieve data from the data
warehouse and help business people manage the facts of their business data.

For business people to make decisions using the reports, the data presented
in the reports must be accurate. All reports should be tested thoroughly before they are used
by the business users.

The following methodology is used for validating the BI reports:


1. Schema objects validation
2. Report data validation
3. Drill path validation
4. Prompting reports testing
5. Other validations

1.1 Schema Objects validation


All the schema objects that are created are validated to ensure that they are mapped to the
correct tables and that the joins are made with the appropriate tables. This validation helps in
identifying defects at an early stage of report development.
The main advantage of this validation is that if any object is mapped to the wrong table, all the
reports that use that object will give incorrect data, resulting in significant time and cost to fix
the defect.

1.2 Report data validation


The data retrieved by the report is validated against the data warehouse to ensure that accurate
data appears in the report.

1.3 Drill Path validation


Some of the business intelligence reporting tools have the functionality of drilling down
the data to the lowest level. When a business user looking at sales at the country level is
interested in the state level, he can drill down to that level. This is the area where
the report is most likely to have errors.

The drill path is validated by drilling down all the possible paths and validating the data.
All the subtotals along the different paths are validated to ensure that they are accurate.
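As an illustration of the subtotal check, the following is a minimal Python sketch. It assumes the same report has been exported at two drill levels to CSV files, with hypothetical column names country, state, and sales:

import pandas as pd

# Hypothetical exports of the same report at two drill levels.
country_level = pd.read_csv("report_country_level.csv")  # columns: country, sales
state_level = pd.read_csv("report_state_level.csv")      # columns: country, state, sales

# Roll the state-level figures up to country level.
rolled_up = state_level.groupby("country", as_index=False)["sales"].sum()

# Compare the rolled-up subtotals against the country-level report.
check = country_level.merge(rolled_up, on="country", suffixes=("_report", "_rollup"))
mismatches = check[check["sales_report"].round(2) != check["sales_rollup"].round(2)]

if mismatches.empty:
    print("All state-level subtotals match the country-level totals.")
else:
    print("Subtotal mismatches found:")
    print(mismatches)

Any country that appears in the mismatch output points to a drill path whose subtotals do not roll up correctly.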

1.4 Prompting reports

Some of the reports that users run are prompted reports, which accept different
data inputs. All the prompted reports are validated with different sets of input data.
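One way to exercise a prompted report systematically is to run it once per prompt value and compare each result set with an equivalent query on the data warehouse. The sketch below is only illustrative: run_prompt_report and query_warehouse are hypothetical helpers standing in for the reporting tool's API and a warehouse query, and the prompt values are assumed.

# Hypothetical helpers: replace with calls to the reporting tool's SDK
# and to the data warehouse (e.g. through a database driver).
def run_prompt_report(region):
    """Run the prompted report for one prompt value and return its rows."""
    raise NotImplementedError

def query_warehouse(region):
    """Run the equivalent query directly against the data warehouse."""
    raise NotImplementedError

prompt_values = ["EMEA", "APAC", "AMERICAS"]  # assumed prompt inputs

for region in prompt_values:
    report_rows = sorted(run_prompt_report(region))
    warehouse_rows = sorted(query_warehouse(region))
    status = "OK" if report_rows == warehouse_rows else "MISMATCH"
    print(f"Prompt value {region}: {status}")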

1.5 Other report validations


1. Derived metrics validation
2. Layout of report
3. Color and format of the headers
4. Sorting of data in report
5. Movement of data in the report
6. Performance of report generation

2. Testing Business Objects

Business Objects is generally used by business users to analyze data and make business-critical
decisions. Hence testing needs to be of high priority, and it should ensure that the tool works
as expected for different scenarios and that the data obtained for different scenarios is in sync
with the data warehouse or any other data source.

Business Objects testing can be subdivided into two sections.


1) Testing Business Objects Universe
2) Testing the reporting tool

The Business Objects universe deals with complex data models and joins between tables and
databases. Hence, testing the universe needs expertise in that particular domain. Universe testing
mainly deals with testing every fact table and the joins of fact tables to their dimension tables.

Business Objects report testing basically deals with front-end validation of the tool, such as
errors while preparing or running a report.
Also, testing can be done to verify the functionality of every feature of the report wherever
necessary.

a. Business Objects Universe

We will concentrate only on testing the basic features of the Business Objects universe.


At a high level, the below scenarios can be considered for validation:

• Schema Object Validation
• Report Data Validation
• Other business requirements Validation

Now let us understand these validation scenarios in detail.


b. Schema Object Validation

This section deals with the verification of classes, objects, and measures to check whether they
are mapped to the correct fields.

As discussed in section 2.3:


Classes are mapped to tables / views in the data warehouse.
Objects are mapped to fields in the tables in the data warehouse.
Measures are mapped to calculated fields in the tables in the data warehouse.

Now, as part of schema object validation, we need to check whether:
1) All the tables required for reporting are present as classes.
2) All the fields in the above tables are present as objects.
3) All the calculated fields are present as measures.

We can perform the above checks by opening the data warehouse and comparing the table
structure with the class structure of the report.
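Part of this comparison can be scripted. The sketch below assumes the universe objects have been exported to a hypothetical CSV file with class_name and object_name columns, that class names match the warehouse table names, and that the warehouse is reachable through a DB-API connection (SQLite is used here only as a stand-in for the actual warehouse driver):

import csv
import sqlite3  # stand-in for the warehouse driver (e.g. pyodbc, cx_Oracle)

conn = sqlite3.connect("warehouse.db")
cursor = conn.cursor()

# Load the universe export: hypothetical CSV with class_name, object_name columns.
universe_objects = {}
with open("universe_objects.csv", newline="") as f:
    for row in csv.DictReader(f):
        universe_objects.setdefault(row["class_name"], set()).add(row["object_name"])

# For each class, compare its objects against the columns of the matching table.
for class_name, objects in universe_objects.items():
    cursor.execute(f"SELECT * FROM {class_name} LIMIT 1")
    table_columns = {column[0] for column in cursor.description}

    unmapped_columns = table_columns - objects  # table fields with no universe object
    orphan_objects = objects - table_columns    # objects pointing at no table field

    if unmapped_columns or orphan_objects:
        print(f"{class_name}: fields without objects {unmapped_columns}, "
              f"objects without fields {orphan_objects}")
    else:
        print(f"{class_name}: OK")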

c. Report Data Validation

This is the main area of testing. Data is the soul of every report; hence data validation deals
with verification to make sure that the correct data has come into the report.

The following scenarios need to be verified as part of report data validation:

- Verification of record count against DW tables to check for duplicate and filtered records (check inner and outer joins)
- Verification of a good sample of data against data warehouse tables
- Verification of data against the source itself
- Verification of contexts

Let us cover each one in detail.

Record Count Verification:


This scenario is easy to test and is very effective. As part of record count verification, get the
expected count of records from the source tables, then check this count against the record count
obtained by the report. A small scripted version of this check is sketched after the list below.

The universe is a very big, complex data model and hence has many joins. Due to the reasons
below, we might get extra, missing, or duplicate records in the report:

- No join between two tables.
- Incorrect join between two tables.
- Outer join is used in places where an equijoin is required.
- Equijoin is used in places where an outer join is required.
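The sketch below illustrates the record count check mentioned above, assuming the expected count can be expressed as a SQL query against the warehouse and the report output has been exported to a CSV file (the query, table, and file names are illustrative):

import sqlite3  # stand-in for the warehouse driver
import pandas as pd

# Expected count from the warehouse side (illustrative query).
conn = sqlite3.connect("warehouse.db")
expected_count = conn.execute(
    "SELECT COUNT(*) FROM sales_fact WHERE sale_year = 2023"
).fetchone()[0]

# Actual count from the report, exported to CSV for comparison.
report_df = pd.read_csv("sales_report_2023.csv")
actual_count = len(report_df)

if expected_count == actual_count:
    print(f"Record counts match: {actual_count}")
else:
    print(f"Count mismatch: warehouse={expected_count}, report={actual_count} "
          "- check for missing, incorrect, or outer/equi join problems")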

Sample Records Verification:

This verification should be done after record count verification. Once we are sure that the
record count matches, we can go ahead and check whether the data for each object is coming
through as expected.

Also, there can be cases where the record count matches but a few records still fall under the
category of missing, extra, or duplicate. We can identify such records in this validation step.

Here we can take a good sample of records (around 5,000) from the data warehouse tables
and check whether these records appear as expected in the report. We can export the report
data into Excel and compare it against the warehouse sample with the help of Excel functions
or macros.
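Instead of (or in addition to) Excel functions and macros, the same comparison can be scripted. The sketch below assumes the warehouse sample and the report export are both available as CSV files sharing a key column and a value column (order_id and amount are hypothetical names):

import pandas as pd

# Illustrative files: a ~5,000-row warehouse sample and the report export.
dw_sample = pd.read_csv("dw_sample.csv")        # key column: order_id
report_data = pd.read_csv("report_export.csv")  # key column: order_id

# A full outer merge on the key shows missing, extra, and differing records at once.
merged = dw_sample.merge(report_data, on="order_id", how="outer",
                         suffixes=("_dw", "_report"), indicator=True)

missing_in_report = merged[merged["_merge"] == "left_only"]
extra_in_report = merged[merged["_merge"] == "right_only"]
print(f"Missing in report: {len(missing_in_report)}, extra in report: {len(extra_in_report)}")

# For records present on both sides, compare a value column (NULL and
# truncation issues show up here as differences).
both = merged[merged["_merge"] == "both"]
differing = both[both["amount_dw"].ne(both["amount_report"])]
print(f"Records with differing amounts: {len(differing)}")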

Possible issue cases are:


- Objects having values in the data warehouse table but coming as NULL in the report.
- Objects having NULL in the data warehouse table but having some values in the report.
- Objects having different values in the data warehouse table and in the report. This happens
due to outer joins.
- Object data getting truncated in the report.

Source Verification:

This check can be done once the above two validations are complete.
The above two validations are based on data warehouse data, but there is a chance that the data
itself may be incorrect in the data warehouse. Hence, it is good practice to verify the report
data directly against the source.

Also, reports are generally used by business users, and they can only see the data in the source
and in the report. Hence, verification against the source will ensure that the report data matches
what is present in the source.

General issues identified under this section are:


- The data warehouse itself having wrong values.
- The source having NULL or default values while the report shows a different value.
- Source data and report data being similar but not the same. Such cases are due to the Type II
(slowly changing dimension) functionality of the data warehouse and incorrect dimension joins.

Context Verification:

The contexts technique comes into the picture when two tables can be joined in different ways.
The universe by itself cannot identify which join to choose, so the report user is given options
for the different possible joins.
Now, as part of context verification, we need to make sure that each context has the correct join
and is fetching the data based on that particular join.

Issues identified during this verification:


- Joins for each context are not identified properly.
- Outer joins and equijoins replacing each other.
- Some WHERE conditions are missing for a particular context.

This technique may not be used in every project and hence this validation is optional.

d. Other Business Validation


Based on the requirements of the project, there can be other validation cases. The most common
among them are:

Verification of Performance:

Report performance is a major concern. Performance issues can be caused by the two reasons
below (a small timing sketch follows the list):
- The report is trying to access a huge amount of data. This issue can be resolved by using query filters.
- Joins between tables may be incorrect or not fine-tuned, resulting in a huge number of
records. This issue can be resolved by fine-tuning the joins and queries.
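As a rough illustration of the first point, the underlying warehouse query can be timed with and without a filter to confirm that the query filter actually reduces the work the report triggers (the table, column, and connection details are illustrative):

import sqlite3  # stand-in for the warehouse driver
import time

conn = sqlite3.connect("warehouse.db")

def timed_count(sql):
    """Run a query and return its result together with the elapsed time."""
    start = time.perf_counter()
    count = conn.execute(sql).fetchone()[0]
    return count, time.perf_counter() - start

# Unfiltered: the report effectively pulls the whole fact table.
rows_all, t_all = timed_count("SELECT COUNT(*) FROM sales_fact")

# With a query filter: only the period the report actually needs.
rows_filtered, t_filtered = timed_count(
    "SELECT COUNT(*) FROM sales_fact WHERE sale_year = 2023")

print(f"Unfiltered: {rows_all} rows in {t_all:.2f}s")
print(f"Filtered:   {rows_filtered} rows in {t_filtered:.2f}s")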

Verification of synchronization errors during context selection:

Sometimes reports may not run when we are pulling objects from three or four classes. This
happens because the classes are not joined properly in the universe; in such cases we get
synchronization errors.

Verification of existing user reports:

When the universe is modified due to issues or the addition of new functionality, it is best to
verify the existing user reports to make sure that everything still works as expected.
