The purpose of the BI Stress Test Plan is to simulate concurrent users and test the performance of BI dashboard pages under varying user load. A Stress Test Plan can be instrumental in identifying bottlenecks in the system and validating the environment for increased user load.

BI Validator simulates load on the BI environment based on the number of parallel users selected for each run. You can configure up to five runs, with a different number of users in each run. The results of a BI Stress Test Plan are shown in both tabular and graphical formats. The execution results are saved whenever the test plan is run and are available for download. The history of test plan runs is also saved for further analysis.

For information on BI tools supported by the Stress Test Plan, please see here.


High-Level Steps for Adding a Stress Test Plan

Below are the high-level steps for adding a Stress Test Plan.

  1. On the side menu, click Test Plans.  
  2. Click Add New.
  3. In the New Test Plan page, select Stress Test Plan.
  4. In the Properties page, do the following:
    1. In the Basic Information screen, enter all the required details and click Next.
    2. In the Options screen, complete the desired fields and click Save.
  5. Click Home and do the following:
    1. Click Add test cases to add reports/views/documents to the test plan.
    2. Click Run to execute the test plan. 
  6. Click the > arrow menu button (next to Home) and select Run History to view the test plan results and manage test run history.
  7. Click the > arrow menu button (next to Home) and select Statistics to view detailed statistics of previous runs in graphical charts.

Setting Up and Working with a Stress Test Plan

The pages you need to set up in the Stress Test Plan wizard are described in the sequence below.

Set up Properties

This section walks you through the pages of a Stress Test Plan and provides a detailed explanation of the options.

Basic Information

In Basic Information, complete the following:

Test Plan Name

Enter a name for the test plan.

Label

Create a label for easy identification and grouping of test plans. To create a label, click the Click to add label link, enter a name, and press Enter.

Connection

Select a connection from the drop-down list.

BI Validator Groups

Assign one or more groups to the test plan.

Options

In Options, complete any of the following:

  • Number of Users. Specify the number of parallel users for each run (Run 1, Run 2, and so on). Whatever count is specified for a run, that number of users is logged in in parallel. If both Run 1 and Run 2 are specified, Run 1 first completes with its user count, and then Run 2 executes with its user count. (How these options interact is illustrated in the sketch after this list.) 
  • Page visits per user. The number of pages each user in a run visits. For example, if Run 1 specifies 5 users and Page visits per user is 2, the run produces 10 page visits in total. Each user navigates to the specified number of pages. 
  • Ramp Up Time (sec). The time within which all the users specified in a run log in to the environment. For example, if the ramp-up time is 50 and Run 1 specifies 5 users, all 5 users in Run 1 log in to the environment within 50 seconds. 
  • Think Time (sec). The wait time when a user navigates from one page to another. For example, if Run 1 = 2 and Think Time is 5, each user in Run 1 navigates to the first page, waits 5 seconds, and then navigates to the second page. 
  • Timeout (sec). If navigating to a particular page takes longer than the time specified here, a time-out message appears. 
  • SLA (sec). If loading a page takes longer than the specified time, a warning message appears.
  • Select Display Group. Select or clear the following statistics to display or hide them in the test plan run: 
    • Minimum Runtime in sec
    • Maximum Runtime in sec
    • RunCount
    • Number of Failures
    • SLA Failures
    • Status Message
    • Web Response Code
    • Web Response Details
  • Purge settings. Select any one of the following settings:
    • Do not purge cache. This option does not purge the cache at all.
    • Purge all cache before each run. This option always purges cache prior to running the test plan.
    • Do not use cache for all requests (does not support prompts). This option does not use cache while sending requests to the server.
  • Configure Users. Select one of the following: 
    • None. The test plan runs with the user given in the Dashboard page.
    • Users with password. You need to import or browse the user names and passwords in order to run the test plan. 
    • Users only. Available for OBIEE connections only. Select any one of the following options: 
      • Impersonate. The Impersonate user type impersonates (acts as) the OBIEE user and performs the required actions on that user's behalf, without needing their password. 
      • Act as user. This user has the ability to access other user accounts. The "Act As" (proxy) functionality gives users the opportunity to run dashboards and reports as someone else.

For more information, please refer to this article in our Support Portal. 
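
To make these options concrete, below is a minimal Python sketch of the load model they describe: a run's users started in parallel across a ramp-up window, a fixed number of page visits per user with think time between visits, and timeout/SLA classification of each visit. This is an illustrative assumption, not BI Validator's implementation; fetch_page() and all constant values are hypothetical stand-ins, scaled down so the demo finishes quickly.

    # Illustrative sketch only -- NOT BI Validator's implementation.
    # fetch_page() is a hypothetical stand-in for navigating to a dashboard page.
    import random
    import threading
    import time

    RUN_USERS = [5, 2]         # Number of Users for Run 1 and Run 2; runs execute one after another
    PAGE_VISITS_PER_USER = 2   # e.g. 5 users in Run 1 x 2 visits = 10 page visits
    RAMP_UP_TIME = 0.5         # seconds within which all of a run's users log in (e.g. 50 in the example above)
    THINK_TIME = 0.1           # wait between one page and the next (e.g. 5 in the example above)
    TIMEOUT = 12.0             # navigation slower than this counts as a time-out
    SLA = 8.0                  # navigation slower than this raises an SLA warning

    def fetch_page(user: int, visit: int) -> float:
        """Hypothetical page navigation; returns a simulated load time in seconds."""
        return random.uniform(1.0, 15.0)

    def user_session(run: int, user: int) -> None:
        for visit in range(1, PAGE_VISITS_PER_USER + 1):
            elapsed = fetch_page(user, visit)
            if elapsed > TIMEOUT:
                status = "timed out"      # Timeout (sec) exceeded
            elif elapsed > SLA:
                status = "SLA warning"    # SLA (sec) exceeded
            else:
                status = "ok"
            print(f"run {run} user {user} visit {visit}: {elapsed:.1f}s -> {status}")
            if visit < PAGE_VISITS_PER_USER:
                time.sleep(THINK_TIME)    # Think Time between consecutive pages

    for run, users in enumerate(RUN_USERS, start=1):
        threads = []
        for user in range(1, users + 1):
            t = threading.Thread(target=user_session, args=(run, user))
            t.start()
            threads.append(t)
            time.sleep(RAMP_UP_TIME / users)   # stagger logins across the ramp-up window
        for t in threads:
            t.join()                           # Run 1 completes before Run 2 starts

With 5 users in Run 1 and 2 page visits per user, the sketch reports 10 visits for the first run, matching the arithmetic in the Page visits per user example above.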

Add Test Cases

On this page, you add test cases and then benchmark them.

How to add Test Cases?

Reports, dashboards, workbooks, views, and documents are added by dragging them from the BI Connections tree or by using the + button or the Import button. For a Tableau connection, if you add a workbook, all the views underneath it are also added. 

  1. Ensure that you are on the Home page. If not, click the Home button.
  2. Do any one of the following:
  • Drag dashboard pages (or documents) from the BI connection in the Connection Explorer on the left-hand side.
  • Click Add test cases. Add the dashboard from within the browser window that opens, click Capture, and then click Add Test Page(s) to Test Plan.
  • Click the + icon to add pages from within a browser window that opens. This method allows you to select prompts for the dashboards and capture the dashboard page URLs before adding them to the Stress Test Plan. This feature is only available for OBIEE connections.
  • Copy prompted dashboard links from the browser window into a text file and import them.

Additional Tasks Post Test Case Addition 

After the test cases are added, you can perform the following additional tasks:

Add Description 

Click the Edit link under the Description column if any details need to be added. When finished, click Save.

User

Select the user from the drop-down list under the User column to run the dashboard for that particular user.

Parameters 

Clicking the Edit link under the Parameters column opens a pop-up containing a list of the parameters, where you can add, edit, or delete parameters. For more information, see the how-to procedure here.

Delete

Click the Delete icon to remove the selected dashboard(s).

More Options

Clicking More displays the following options:

Refresh

Allows you to refresh the page (if any benchmark schedule is running in the background).

Import 

Allows you to import reports into the test plan using a text file containing report URLs. For import information, see the how-to procedure here.
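
As a purely illustrative example, such a file might list one report URL per line; the URLs below are hypothetical, and the exact format BI Validator expects is described in the how-to procedure.

    https://bi.example.com/analytics/saw.dll?Dashboard&PortalPath=%2Fshared%2FSales%2F_portal%2FSales&Page=Overview
    https://bi.example.com/analytics/saw.dll?Dashboard&PortalPath=%2Fshared%2FSales%2F_portal%2FSales&Page=Regional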

Export

The Export button can be used to export the selected dashboard(s). For information on where the export options are available in a test plan and how to use them, see the how-to procedure here.

Configure Notifications

Allows you to set up the email addresses of the recipients to whom you want to send notifications about the test plan. To configure notifications, click More (beside the Home button on the right-hand side) and select Notify.

Schedule

Allows you to schedule the test plan to run at a specified time/date selected from the schedule window. To schedule a test plan, click More (beside the Home button on the right-hand side) and select Schedule.

Run Stress Test Plan

After adding the test cases, click the Run button on the Home page to execute them. When you click Run, you are taken to the Run page, where you click either Run or Run Test Cases. If you are not on the Home page, click More (beside the Home button on the right-hand side) and select Run. At any point, you can click the Stop button to stop the ongoing execution of the test cases.

Note: Users in a multi-user environment can also run the test plan using the Command Line Interface.

View Run History

This page shows the run history of the test plan. The time stamps of previous runs are listed on the left-hand side; each time stamp indicates when the test plan was run. By default, the details of the latest run are shown. To view the details of a previous run, select the time stamp that interests you. Each run time stamp displays counts for all statuses: passed, failed, warning, error, and running. When you select a status count, the dashboards with the selected status are shown.

To view Run History, click More (beside the Home button on the right-hand side) and select Run History.

Additional Tasks Post Test Plan Execution

After the test plan is run, you can do the following:

View Web response details

Allows you to view the web response details of each report/dashboard. 
You can also use the View link under the "Web response details" column. Clicking the View link opens a window in which the "View Response File" link, under the "View Response File" column, is available for you to see the response.

If the Web Response Details option (under Select Display Group) is not selected, there will be no data to show.

Refresh

Click the Refresh icon to refresh test plan results.

Delete

Select the time stamp and click the Delete icon to remove the test run.

Delete All

Click the Delete All icon to remove the complete test run history.

View Statistics

On this page, you can view detailed statistics of the previous runs in graphical charts. To view test plan run statistics, click More (beside the Home button on the right-hand side) and select Statistics.

