The Catalog Test Plan provides a means to group and execute a set of BI catalog objects for regression testing. Its main goal is to find differences in catalog data between environments. A user selects a catalog folder of a BI connection from the BI Catalog tree, benchmarks the XML of the catalog objects, and then runs the Catalog Test Plan.
When the test plan is run, the following occurs:
The benchmark catalog XML is compared with the latest catalog XML, and any differences found are shown. Users can select a different environment in the run step to compare the catalog benchmarked in one environment with the catalog in another environment.
The result data is saved.
The run history is saved.
For information on BI tools supported by the Catalog Test Plan, please see here.
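Conceptually, the comparison step works like a text diff between the benchmarked XML snapshot and the latest XML. The following is a minimal sketch of that idea (not the product's actual implementation), using Python's standard `difflib`; the function name and sample XML are illustrative assumptions.

```python
# Conceptual sketch only: comparing a benchmarked catalog XML snapshot
# against the latest catalog XML and reporting line-level differences,
# similar in spirit to what a Catalog Test Plan run does.
import difflib


def diff_catalog_xml(benchmark_xml: str, latest_xml: str) -> list[str]:
    """Return unified-diff lines between two catalog XML snapshots.

    An empty list means the snapshots match (no differences).
    """
    return list(difflib.unified_diff(
        benchmark_xml.splitlines(),
        latest_xml.splitlines(),
        fromfile="benchmark",   # previously benchmarked XML
        tofile="latest",        # XML fetched at run time
        lineterm="",
    ))


if __name__ == "__main__":
    benchmark = "<report><title>Sales</title></report>"
    latest = "<report><title>Sales FY24</title></report>"
    for line in diff_catalog_xml(benchmark, latest):
        print(line)
```

An empty diff corresponds to a matching catalog object; any output lines correspond to differences that the test plan would flag.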
High-Level Steps for Adding a Catalog Test Plan
Below are the high-level steps for adding a Catalog Test Plan.
- On the side menu, click Test Plans.
- Click Add New.
- On the New Test Plan page, select Catalog Test Plan.
- On the Properties page, do the following:
  - On the Basic Information screen, enter all the required details and click Next.
  - On the Options screen, complete the desired fields and click Save.
- On the Home page, do the following:
  - Click Add test cases to add catalog folders/objects to the test plan.
  - Click Run to execute the test plan.
  - Click the > arrow menu button (next to Home) and select Run History to view the test plan results and manage test run history.
Setting up and Working with Catalog Test Plan
The pages you need to set up in the Catalog Test Plan wizard are described in the sequence below.
Set up Properties
The Properties are divided into Basic Information and Options.
In Basic Information, complete the following:
Test Plan Name
Enter a name for the Catalog Test Plan.
Create a label for easy identification and grouping of test plans. To create a label, click the Click to add label link, enter a name, and press Enter.
Select a connection for OBIEE or Tableau from the drop-down list. To add a new connection, click the + plus icon.
BI Validator Groups
Assign a group to this test plan.
In Options, complete any of the following:
Parallel count. Enter the number of test cases to run in parallel.
Catalog XML. Allows you to capture/benchmark the catalog XML. When the test plan is run, differences between the benchmarked Catalog XML and the latest results are shown.
Permissions. Allows you to benchmark the permissions for reports, dashboards, and folders. If this option is not selected, permissions data is not considered when the test plan is run. Permissions present in both the benchmarked data and the results are shown as Matched; a newly assigned permission is shown as Unmatched.
Audit data. Allows you to benchmark audit data, which records when the catalog was created, last accessed, and last updated. If this option is not selected, audit data is not considered when the test plan is run. If there are changes between the benchmarked data and the results, the differences are shown as Unmatched; otherwise, the results are shown as Matched.
Run Connection. Allows you to select a different BI connection and compare the catalog objects between two environments.
Run User. Select the user in the BI tool with which you want to run the test plan.
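The Matched/Unmatched classification described for the Permissions and Audit data options can be pictured as a simple set comparison. Below is an illustrative sketch (a hypothetical helper, not product code) showing how entries present in both snapshots are Matched while entries that were added or removed are Unmatched.

```python
# Illustrative sketch: classifying permission entries as Matched or
# Unmatched by comparing a benchmarked snapshot with the latest results.
# The "resource:privilege" string format is an assumption for the example.
def classify_permissions(benchmarked: set[str], latest: set[str]) -> dict[str, set[str]]:
    """Split permissions into Matched (in both) and Unmatched (added/removed)."""
    return {
        "Matched": benchmarked & latest,                               # in both snapshots
        "Unmatched": (latest - benchmarked) | (benchmarked - latest),  # new or removed
    }


if __name__ == "__main__":
    result = classify_permissions(
        {"Sales:read", "Sales:write"},
        {"Sales:read", "Sales:write", "Finance:read"},  # Finance:read newly assigned
    )
    print(result)  # "Finance:read" lands in the Unmatched set
```

The same idea applies to audit data: identical created/accessed/updated values are Matched, and any change between benchmark and result surfaces as Unmatched.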
Add Test Cases
On this page, you add catalog folders or objects to the test plan.
This page contains the following options:
How to add Test Cases?
Catalog folders or objects can be added using the following steps:
Ensure that you are on the Home page. If not, click the Home button.
From the Connection Explorer on the left-hand side, expand the BI connection, and drag and drop the required catalog folders/objects.
Additional Tasks After Adding Test Cases
After the test cases are added, you can perform the following additional tasks:
Click the Edit link under the Description column if there are any details to be added. When finished, click Save.
Select the user from the drop-down list under the User column to run the report for that particular user.
Click the Delete icon to remove the selected object(s).
Click More and select Refresh to refresh the page.
Notify
Allows you to set up the email addresses of recipients to whom notifications about the test plan are sent. To configure notifications, click More (beside the Home button on the right-hand side) and select Notify.
Schedule Test Plan
Allows you to schedule the test plan run at a specified date/time selected from the Schedule window. To schedule the test plan, click More (beside the Home button on the right-hand side) and select Schedule.
Run Catalog Test Plan
On this page, you run the test cases (objects). After adding the test cases, click the Run button on the Home page to execute them; you are navigated to the Run page, where you click either Run or Run Test Cases. If you are not on the Home page, click More (beside the Home button on the right-hand side) and select Run. At any point, click Stop to halt the ongoing test case execution.
Users in a multi-user environment can also run the test plan using the Command Line Interface.
When the test plan is run, the benchmark catalog XML is compared with the latest catalog XML, and any differences are highlighted. You can select a different environment in Properties > Options to compare the catalog benchmarked in one environment with the catalog from another environment. The result data is saved whenever the Catalog Test Plan is executed.
After the Catalog Test Plan is executed, you can see the differences for Catalog XML, Permissions, and Audit Data. In the grid, View links are shown for the previously benchmarked data (labeled One), the newly benchmarked data or a different environment (labeled Two), and the differences between the two. If there are differences, the View link appears in the Catalog XML Differences, Permissions Differences, and Audit Data Differences columns.
If you select the Permissions and Audit data options without benchmarking permissions and audit data, you will notice differences in the result: because there was no permissions or audit data when the test plan was benchmarked, the run shows these differences as Unmatched.
View Run History
On this page, the run history of the test plan is shown. Time stamps of the test plan runs are listed on the left-hand side; each time stamp indicates when the test plan was run. By default, the latest run details are shown. To view the details of a previous run, select the corresponding time stamp. Each run time stamp displays counts for all statuses: passed, failed, warning, error, and running. When you select a status count, the results are filtered by the selected status category.
To view the Run History, click More (beside the Home button on the right-hand side) and select Run History.
This page contains the following options:
Click the Refresh icon to refresh test plan results.
Select the time stamp and click the Delete icon to remove the test run.
Click the Delete All icon to remove the complete test run history.
© Datagaps. All rights reserved.
Send feedback on this topic to Datagaps Support