A Report Test Plan allows you to group and execute a set of BI reports, Documents (Web Intelligence), or Views (Tableau) for regression testing. The main goal of this Test Plan is to find differences in the BI report data when compared with the benchmarked data. Users can drag reports from a BI connection available in the Test Plans & Connections tree or add pages with prompts using an intuitive user interface.

You can select the format type of the report for comparison purposes. Depending on the BI connection, the format options available for benchmarking and report comparison are PDF, Data, and Excel. For example, the format options for a Tableau connection are PDF and Data. When a report is benchmarked, the report is downloaded and saved in the selected format. Users can change the format and re-benchmark the report. For OBIEE, users can also add dashboard pages to the Report Test Plan. BI Validator automatically scans for reports in the dashboard page and adds them along with any parameters.

When the BI Report Test Plan is executed, the benchmarked report file is compared with the latest report page. If any differences are found between the current and the benchmarked file, they are shown in red. The result data is saved whenever the Test Plan is executed. A history of previous runs is preserved and can be tracked from the Run History page. Users can also view detailed statistics in the form of charts.

For information on the BI tools supported by the Report Test Plan, please see here.


High-Level Steps for Adding a Report Test Plan

Below are the high-level steps for adding a Report Test Plan.

  1. On the side menu, click Test Plans.  
  2. Click Add New.
  3. In the New Test Plan page, select Report Test Plan.
  4. In the Properties page, do the following:
    1. In the Basic Information screen, enter all the required details and click Next.
    2. In the Options screen, complete the desired fields and click Save.
  5. In the Home page, do the following:
    1. Click Add test cases to add reports or dashboard pages to the test plan.
    2. Click Run to execute the test plan. 
  6. Click the > arrow menu button (next to Home) and select Run History to view the test plan results and manage test run history.
  7. Click the > arrow menu button (next to Home) and select Statistics to view detailed statistics of previous runs in graphical charts.

Setting up and Working with Report Test Plan

This section walks you through the pages of a Report Test Plan and provides a detailed explanation of the available options.

Set up Properties

The Properties are split into Basic Information and Options.

Basic Information

In Basic Information, complete the following:

Test Plan Name

Enter a name for the Report Test Plan.

Label

Create a label for easy identification and grouping of test plans. To create a label, click the Click to add label link, enter a name, and press Enter.

Connection

Select a connection from the drop-down list.

BI Validator Groups

Select one or more groups to which the test plan should be assigned.

Select Default Format

Select a report format for the connection type from the drop-down list. Depending on the connection type, the different formats available for selection are Data, PDF, and Excel.

Options

In Options, complete any of the following:

  • Parallel Pages. The number of pages within the Test Plan that are executed in parallel when the Test Plan is benchmarking (baselining) the reports.
  • Report Timeout. The time limit for a report run; the request is cancelled if no response is received within this time.
  • Row count. The number of records to fetch for a report.
  • Drill down count. Allows you to open drill-down reports to the level specified here. The default is 0. For information on how drill-down works, read the knowledgebase article here.
  • PDF Compare Type. There are two comparison options: Appearance and Text. The Appearance option compares the Benchmark and Result PDF as images (pixel to pixel) and marks the test run as 'Fail' if there are any differences. The Text option compares the text in the report pages (including data) while ignoring differences due to 'Date Run' and marks the test case 'Fail' only if there is a text difference.
  • Run Time Variation Allowed. This is the percentage (%) of variation in the dashboard page run time that is allowed from the benchmarked (baselined) value. For example, if the dashboard page took 10 seconds during the benchmark but took 21 seconds to run, the status of the dashboard page is marked as 'Warning', provided there are no other failures. A message stating the same is also recorded in the Message column. The default variation is 100%. (See the sketch after this list.)
  • Ignore data differences. Ignores differences in the report data (for the selected format type). If any differences are found in the report data, the data comparison is still marked as Pass even though it has actually failed.
  • Include report XML. Select this checkbox to check for differences between the report XML and the benchmarked report XML data.
  • Include logical query. Select this checkbox to check for differences between the logical query and the benchmarked logical query data.
  • Copy parent report parameters for drill report. Allows you to copy the same parameters from parent reports to drill-down reports.
  • Use combination of selectors. Auto-creates test plans based on the combinations of available view and column selectors.
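
The run time variation check amounts to a simple percentage comparison. The following minimal Python sketch illustrates the idea using the 10-second / 21-second example above; the function and variable names are hypothetical, and this is an illustration only, not BI Validator's actual implementation.

  def run_time_status(benchmark_seconds, current_seconds, allowed_variation_pct=100):
      # Percentage increase of the current run time over the benchmarked run time.
      variation_pct = (current_seconds - benchmark_seconds) / benchmark_seconds * 100
      if variation_pct <= allowed_variation_pct:
          return "Pass"
      # Exceeds the allowed variation: flagged as a Warning (assuming no other failures).
      return "Warning"

  # Benchmarked at 10 seconds, current run 21 seconds: a 110% increase,
  # which exceeds the default 100%, so the page is marked as a Warning.
  print(run_time_status(10, 21))   # Warning
  print(run_time_status(10, 19))   # Pass (a 90% increase is within the default 100%)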

Add Test Cases

In this page, you can add reports, views, or documents to the test plan. Users can select the format type with which to compare the reports. The formats available for benchmarking and report comparison are PDF, Data, and Excel. When a report is benchmarked, the report is downloaded and saved in the selected format. Users can change the format and re-benchmark the report, if required. For OBIEE, users can also add dashboard pages to the Report Test Plan. BI Validator automatically scans for reports in the dashboard page and adds them along with any parameters. For Tableau, the Data format is available only at the worksheet level (summary and full data).

How to Add Test Cases?

  1. Ensure that you are on the Home page. If not, click the Home button.
  2. Do any one of the following:
  • Drag reports (or Documents) from the Connection Explorer (available on the left-hand side) of the BI Connection.
  • Click Add test cases. Add the report from within the browser window that opens, click Capture, and then click Add Test Page(s) to Test Plan.
  • Click the + icon to add pages from within a browser window that opens. This method allows users to select prompts for the dashboards and capture the dashboard page URLs before adding them to the Test Plan. This feature is only available for OBIEE connections.
  • Copy dashboard prompted links from the browser window of your BI application into a text file (for example, in Notepad) and import them.
Information: For Tableau connections, you can also add worksheets when the default format type is PDF.

Additional Tasks Post Test Case Addition 

After the test cases are added, you can perform the following additional tasks:

Add Description

Click the Edit link under the Description column if there are any details to be added. 

User selection

Select a user from the drop-down list under the User column to run the report as that particular user.

Benchmark Report(s)

Click the View link under the Benchmark Result column to view the benchmarked report. The Update link can be used to benchmark the report.

Selectors

Applies to OBIEE connections only. A Create link is displayed when a report is added. When the user clicks the Create link, a View / Update link is shown in place of the Create link if view or column selectors are present for that particular report. The user can click the View link to select different options in the selectors. The Update link is used to re-fetch the selectors.

Add / Edit Parameters

Click the Edit link under the Parameters column to open a dialog containing the parameters applicable to specific reports. Users can add, edit, or delete these parameters. For information about page-specific parameters, see the how-to procedure here.

Set Variance

Variance allows a test case to be marked as Pass even when the data in a particular column differs, provided the difference stays within the allowed variance percentage for that column; otherwise the test case is marked as Fail. Use this option to set the variance for different columns. The Set Variance link under the Variance column is enabled when the format type is Data. For information about how to set variance, see the how-to procedure here.
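
As an illustration of how such a column-level tolerance can be evaluated, the following Python sketch compares benchmark and result values for a single column and treats a difference as acceptable when it stays within the allowed variance percentage. The names and sample values are hypothetical; this is not BI Validator's actual code.

  def column_within_variance(benchmark_values, result_values, allowed_variance_pct):
      # A difference is tolerated if it stays within the allowed variance
      # percentage of the benchmarked value.
      for benchmark, result in zip(benchmark_values, result_values):
          if benchmark == 0:
              if result != 0:
                  return False
              continue
          difference_pct = abs(result - benchmark) / abs(benchmark) * 100
          if difference_pct > allowed_variance_pct:
              return False
      return True

  # A 5% variance tolerates 102 against a benchmarked 100, but not 110.
  print(column_within_variance([100, 200], [102, 205], allowed_variance_pct=5))   # True
  print(column_within_variance([100, 200], [110, 205], allowed_variance_pct=5))   # False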

Filters

For information about how to set filters in a Tableau report, see the how-to procedure here.

Delete

Click the Delete icon to delete selected reports.

More Options

Clicking More displays the following options:

Refresh

Allows you to refresh the page (if any benchmark schedule is running in the background).

Import 

Allows you to import reports into the test plan using a text file containing report URLs. For import information, please refer to the how-to procedure here.
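
As an illustration, the import file is a plain text file containing the report URLs, typically one per line (copied as prompted links from your BI application, as described under How to Add Test Cases). The URLs below are hypothetical placeholders; the exact URL structure depends on your BI tool and environment.

  http://bi.example.com:9704/analytics/saw.dll?Dashboard&PortalPath=%2Fshared%2FSales%2F_portal%2FSales&Page=Overview
  http://bi.example.com:9704/analytics/saw.dll?Dashboard&PortalPath=%2Fshared%2FSales%2F_portal%2FSales&Page=Regional%20Detail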

Export

Allows you to export the selected reports. For information on where the export options are available in a test plan and how to use them, please see the how-to procedure here.

Run Report Test Plan

After adding the test cases, click the Run button on the Home page to execute the test cases. When you click Run, you are navigated to the Run page, where you need to click either Run or Run Test Cases. If you are not on the Home page, click More beside the Home button on the right-hand side and select Run. At any point, you can click the Stop button to stop the ongoing execution of test cases.

When the BI Report Test Plan is executed, the benchmarked report is compared with the latest report page. If any differences are found between the current and the benchmarked file, they are shown in red. The result data is saved whenever the test plan is executed.

Information: Users in a multi-user environment can also run the test plan using the Command Line Interface.

Filter differences

Select this checkbox to compare the filters, as well as the filter-related data, with the previously benchmarked filter data. This is useful for verifying whether a new filter has been added to the report or new data has been added to an existing filter.
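
Conceptually, this comparison detects filter values that were added or removed relative to the benchmark. The short Python sketch below illustrates the idea with hypothetical filter data; it is not BI Validator's actual implementation.

  benchmark_filters = {"Region": {"East", "West", "North"}}
  current_filters = {"Region": {"East", "West", "North", "South"}}

  for filter_name, benchmark_values in benchmark_filters.items():
      current_values = current_filters.get(filter_name, set())
      added = current_values - benchmark_values      # values added since the benchmark
      removed = benchmark_values - current_values    # values no longer present
      if added or removed:
          print(f"{filter_name}: added {sorted(added)}, removed {sorted(removed)}")
  # Output: Region: added ['South'], removed []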

Configure Notifications

Allows you to set up the email addresses of recipients to whom you want to send notifications about test run details. To configure notifications, click More beside the Home button on the right-hand side and select Notify.

Schedule Test Plan

This option allows you to schedule the test plan to run at a specified date and time selected in the schedule window. To schedule the test plan, click More beside the Home button on the right-hand side and select Schedule.

View Run History

In this page, the latest test plan run details are shown. Details of previous runs are also shown when a particular time stamp is selected from the bottom navigation bar.

In addition, counts for all statuses (Passed, Failed, Warning, Error, and Running) are shown. When any status count is selected, the reports are displayed based on the selected status category.

To view the Run History, click More beside the Home button on the right-hand side and select Run History.

The Run History page contains the following:

Refresh

Click the Refresh icon to refresh test plan results.

Delete

Select the time stamp and click the Delete icon to remove the test run.

Delete All

Click the Delete All icon to remove the complete test run history.

View Statistics

The View Statistics page displays the detailed statistics of the previous runs in graphical charts. To view statistics, click More beside the Home button on the right-hand side and select Statistics.

