A Dashboard Test Plan helps you group and execute a set of Dashboard pages or Document requests for regression testing. The main goal of this test plan is to find differences in the data of BI Dashboard pages (or Documents) by comparing it with the benchmarked data.

For information on the BI tools supported by the Dashboard Test Plan, please see here.


High-Level Steps for Creating a Dashboard Test Plan

Below are the high-level steps for adding a Dashboard Test Plan.

  1. On the side menu, click Test Plans.  
  2. Click Add New.
  3. In the New Test Plan page, select Dashboard Test Plan.
  4. In the Properties page, do the following:
    1. In the Basic Information screen, enter all the required details and click Next.
    2. In the Options screen, complete the desired fields and click Save.
  5. In the Home page, do the following:
    1. Click Add test cases to add dashboards to the test plan.
    2. Click Run to execute the test plan. 
  6. Click the arrow menu button (next to Home) and select Run History to view the test plan results and manage test run history.
  7. Click the arrow menu button (next to Home) and select Statistics to view detailed statistics of previous runs in graphical charts.

Setting up and Working with a Dashboard Test Plan

This section walks you through the pages of a Dashboard Test Plan and explains its options in detail.

Setting up Properties

The Properties are divided into Basic Information and Options.

Basic Information

In Basic Information, complete the following:

Test Plan name

Enter a name for the Dashboard Test Plan.

Label

Create a label for easy identification and grouping of test plans. To create a label, click the Click to add label link, enter a name, and press Enter.

Connection 

Select a connection from the drop-down list.

BI Validator Groups

Select the groups that can access the test plan. Only users within the selected groups can view and work on this test plan.

Options

In Options, complete any of the following:

  • No of Pages in Parallel. The number of pages within the test plan that are executed in parallel when the test plan benchmarks (baselines) the dashboard pages.
  • Dashboard Timeout. The time limit for a dashboard page run; the request is terminated if no response is received within this limit.
  • Run Time Variation Allowed. The percentage of variation in the dashboard page run time that is allowed from the benchmarked (or baselined) value. For example, with the default variation of 100%, a page benchmarked at 10 seconds may take up to 20 seconds; if it takes 21 seconds, the Dashboard Page is marked as 'Warning' (provided there are no other failures), and a message stating this is recorded in the 'Message' column. See the sketch after this list for an illustration of this check.
  • Compare. There are two comparison options - Appearance and Text. The Appearance option compares the Benchmark and Result dashboard PDFs as images (pixel to pixel) and marks the test run as 'Fail' if any difference is found (see the image-comparison sketch after this list). The Text option compares the text in the dashboard pages (including data) while ignoring differences due to 'Date Run' and marks the test case as 'Fail' only if there is a 'Text' difference. It also checks for 'ODBC' errors in the dashboard page and marks the test run as 'Fail' accordingly.
  • Different Environment. Select this option to find differences by comparing the benchmarked data against a different BI environment.
    • Connection. The BI connection setting in the Run page options can be used to change the target connection for comparing differences. For example, you can benchmark against the development environment and run the test against the 'Test' environment by selecting a different connection in the run options.
    • User. Select the BI user that will be used to run the test plan.
  • Both Dashboard and Prompts. When the test plan is run, selecting this option shows the result, benchmark, and differences for both the Dashboard and the Prompts. Here, both the Dashboard and the Prompts (PDF snapshots) are considered when marking the result as Pass.
  • Only Dashboard. When the test plan is run, selecting this option shows the result, benchmark, and differences for the Dashboard only. Here, the Prompts PDF snapshot is not considered when marking the test result as Pass.
  • Only Prompts. When the test plan is run, selecting this option shows the result, benchmark, and differences for the Prompts only. Here, the Dashboard data is not considered when marking the test result as Pass.
  • Generate Prompt Data. When the test plan is run, selecting this option includes prompt data in the results.
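
The following sketch illustrates the Run Time Variation Allowed check described above. It is written in Python purely as an illustration; the function name and the assumption that only 'Pass' and 'Warning' statuses are involved are ours, not BI Validator's actual implementation.

    def run_time_status(benchmark_seconds, current_seconds, variation_allowed_pct=100):
        """Return 'Pass' or 'Warning' based on the allowed run time variation.

        Illustrative only; assumes the page has no other failures.
        """
        allowed_max = benchmark_seconds * (1 + variation_allowed_pct / 100.0)
        if current_seconds > allowed_max:
            # e.g. a 10-second benchmark with 100% variation allows up to 20 seconds,
            # so a 21-second run is marked as 'Warning'
            return "Warning"
        return "Pass"

    print(run_time_status(10, 21))  # Warning
    print(run_time_status(10, 19))  # Pass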
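
Similarly, the Appearance option is conceptually a pixel-to-pixel image comparison. The sketch below shows that idea using the Pillow library on two page images; it is only an illustration of the technique with placeholder file names, not BI Validator's actual comparison logic.

    from PIL import Image, ImageChops

    def pages_match(benchmark_png, result_png):
        """Return True if the two page images are pixel-identical (illustrative)."""
        benchmark = Image.open(benchmark_png).convert("RGB")
        result = Image.open(result_png).convert("RGB")
        if benchmark.size != result.size:
            return False
        # The difference image is all zeros (empty bounding box) only when
        # every pixel matches; any mismatch would mark the run as 'Fail'.
        diff = ImageChops.difference(benchmark, result)
        return diff.getbbox() is None

    print(pages_match("benchmark_page.png", "result_page.png"))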

Add Test Cases 

In the Home page, you add BI Dashboard pages (or Documents) and then benchmark them. A PDF version of the Dashboard page is downloaded and saved in the BI Validator repository when a Dashboard page is benchmarked.

How to add Test Cases?

Dashboard pages (or Documents) can be added to a Dashboard Test Plan using the following steps:

  1. Ensure that you are on the Home page. If not, click the Home button.
  2. Do any one of the following:
  • Drag Dashboard pages (or Documents) from the Catalog tree (available on the left-hand side) of the BI Connection.
  • Click Add test cases. Add the dashboard from within the browser window that opens, click Capture, and then click Add Test Page(s) to Test Plan.
  • Click the + icon to add pages from within a browser window that opens. This method allows users to select prompts for the dashboards and capture the dashboard page URLs before adding them to the Dashboard Test Plan. This feature is available only for OBIEE connections.
  • Copy dashboard prompted links from the browser window into a text file (for example, in Notepad) and import them.

Additional Tasks After Adding Test Cases

After the test cases are added, you can perform the following additional tasks:

Add Description

Click the Edit link under the Description column if any details need to be added. After adding the details, click Save to save the changes.

User

Select the user from the drop-down list under the User column to run the dashboard for that particular user.

Benchmark Dashboard(s)

Click the View link under the Benchmark Result column to view the benchmarked dashboard. The Update link can be used to re-benchmark the dashboard. You can also click the Benchmark button available at the bottom to benchmark the selected dashboards.

Prompts

Click the View link to view the prompts present in any particular dashboard.

Selectors

For OBIEE connections, the Create link is displayed when a dashboard is added. When you click the Create link, a View / Update link is shown in its place if any View and Column selectors are present for that particular dashboard. You can click the View link to select different option(s) in the selectors. The Update link is used to update the selectors in the dashboard.

Add / Edit Parameters

Click the Edit link under the Parameters column to open a pop-up listing the parameters present in that particular dashboard. For more information, please see the how-to procedure here.

Delete

Click the Delete icon to remove the selected dashboards.

More Options

Clicking More displays the following options:

Refresh

Allows you to refresh the page (if any benchmark schedule is running in the background).

Import 

Allows you to import reports into the test plan using a text file containing report URLs. For more information on importing, please see the how-to procedure here.
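
The exact file format is described in the linked how-to procedure; assuming one URL per line, a minimal example file (with placeholder OBIEE-style URLs, not real ones) might look like this:

    http://your-bi-server:9704/analytics/saw.dll?dashboard&PortalPath=%2Fshared%2FSales%2F_portal%2FSales
    http://your-bi-server:9704/analytics/saw.dll?dashboard&PortalPath=%2Fshared%2FFinance%2F_portal%2FRevenue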

Export

Allows you to export the selected dashboard(s). For information on where the export options are available in a test plan and how to use them, please see the how-to procedure here.

Run Dashboard Test Plan

After adding the test cases, click the Run button on the Home page to execute them. When you click Run, you are taken to the Run page, where you need to click either Run or Run Test Cases. If you are not on the Home page, click More (available beside the Home button on the right-hand side) and select Run.

After the Dashboard Test Plan is executed, the benchmarked PDF is compared with the latest dashboard PDF. If any difference is found between the current and the benchmarked PDFs, the differences are shown side by side in a highlighted color. The result data is saved every time the test plan is executed.

Information: Users of the Multi-User Edition can also run the test plan using the Command Line Interface.

For test cases that have differences, a View link is shown along with a red bubble under the Differences column. Clicking the View link opens a window displaying the current and benchmarked PDFs with the differences highlighted. If there are no differences, a "No differences" label appears along with a green bubble.

Stop

Allows you to stop the ongoing execution of a test run.

Configure Notifications

Allows you to add the email addresses of recipients to whom you want to send notifications about the test plan results. To configure notifications, click More (available beside the Home button on the right-hand side) and select Notify.

Schedule Test Plan

This option allows you to schedule the test plan to run at a specified date/time selected in the schedule window. To schedule the test plan, click More (available beside the Home button on the right-hand side) and select Schedule.

View Run History 

This page shows the details of the latest test plan run. Details of previous runs are shown when a particular time stamp is selected from the bottom navigation bar. In addition, counts are shown for all statuses: Passed, Failed, Warning, Error, and Running. When any status count is selected, dashboards are shown based on the selected status category.

To view the Run History, click More (available beside the Home button on the right-hand side) and select Run History.

The Results page contains the following options:

Refresh

Click the Refresh icon to refresh test plan results.

Delete

Select the time stamp and click the Delete icon to remove the test run.

Delete All

Click the Delete All icon to remove the complete test run history.

View Statistics

This page shows detailed statistics of the previous runs in graphical charts. To view the Statistics, click More (available beside the Home button on the right-hand side) and select Statistics.
