Open System Testing Architecture




Results Display


Results Display Overview

HTTP/S Load provides a variety of data collection and display options to assist you in the analysis of Test results. Running a Test and displaying the results enables you to identify whether the Web Application Environments (WAEs) under test are able to meet the processing demands you anticipate will be placed on them. After a Test-run is complete, use Commander to control which results are displayed and how they are presented, in order to help you analyze the performance of target WAEs and the network used to run the Test.

Open the Test you want from the Repository Window and click the Results tab in the Test Pane, then choose the results you want to display using the Results Window. Depending on the category of results you select, data is displayed in graph or table format. You can choose from a wide range of tables and customizable graphs to display your results, which can be filtered and exported for further analysis and printing. Use the Results Window to view multiple graphs and tables simultaneously to compare results from different Test-runs.

When a Test is run, a wide range of results data is collected automatically. Virtual User response times and resource utilization information are recorded from all Web sites under test, along with performance data from WAE components and the Hosts used to run the Test. Results categories include the Test Configuration option, which presents a brief description of the Test and the Task Group settings that applied during a Test-run. The Test Audit log records significant events that occur during a Test-run, and the HTTP Data List records the HTTP/S requests issued, including the response times and codes for every request. The Timer List option records the length of time taken to load each Web page defined in the Scripts referenced by a Test.

Creating and referencing Collectors in a Test helps to improve the quality and extend the range of the results data produced during a Test-run. Collectors give you the ability to target the Host computers and devices used to run a Test and the back-end database components of WAEs under test, with user-defined data collection queries. Use NT Performance and SNMP Collectors to collect data from Host devices within target WAEs or the test network.

The range of results produced during a Test-run can depend on the content of the Scripts referenced by a Test. For example, Report and History logs are only produced if the included Scripts have been modeled to incorporate the SCL commands used to generate the data content for these logs.

See also:

Results Tab

General Results Display Procedures

Test Audit Log

Test Report Log

Test History Log

HTTP Data List

HTTP Data Graphs

Timer List

SNMP and NT Performance Collector Graphs

Results Tab

Results are stored in the Repository after a Test-run is complete. To view them, open the Test you want from the Repository Window, then click the Results tab in the Test Pane. Use the Results Window to select the results you want to view in the workspace of the Test Pane. You can reposition the Results Window by floating it over the Main Window to give yourself more room for results display, or close it once you have selected the results options you want to view.

The Results Tab of the Test Pane

Results Tab Display Options

Graphs can be customized to improve the presentation of data by right-clicking within a graph then selecting Customize. This function includes options that enable you to modify the graph style from the default line plot to a vertical bar, as well as controlling the color of elements within the graph display.

You can control the information displayed in some graphs and tables by filtering the data they represent. Right-click within a graph or table, then select Filter or Filter URLs, or click the Filter button in the toolbar and make your selection. You can also opt to export results data for further analysis and printing. Right-click and select Export to Excel or Export from the menu.

You can also zoom in on a graph by clicking and dragging over the area of the graph you want to study. Use the Windows option to control the presentation of results options in the Test Pane, or right-click within the empty workspace of the Test Pane to access these functions as illustrated in the diagram above.

See also:

The Results Window

Display Test Results

Customize Graph Display

Zoom In and Out of a Graph

Export Test Results

Close Test Results

Delete Test Results

The Results Window

When you click on the Results tab, the Results Window opens automatically. Its default location is on the right-hand side of the Test Pane where it is docked. Use it to select and display results from any of the Test-runs associated with the current Test.

Test-runs are stored in date and time stamped folders, which you can double-click to open. When you open a Test-run folder, the available results are listed below it. Display the results you want by ticking the check boxes to the left of the results options. The results you choose are displayed in the Test Pane.

Multiple graphs and tables from different Test-runs associated with the current Test can be displayed concurrently. Use the Results Window to select additional Test-runs and equivalent results options to compare Test results and help evaluate performance.

Results Window Display Options

The Results Window is located on the right-hand side of the Test Pane. It can be closed to increase the workspace area available, or moved to a new position by floating it over the Main Window.

See also:

Hide/Display The Results Window

Move The Results Window

Resize The Results Window

Display Test Results

Results Tab

Hide/Display The Results Window
Move The Results Window
  1. Click on the double bar at the top of the Results Window.
  2. Drag, then drop it in the new position within the Main Window.
    Note: The Results Window docks with the Main Window's borders if it contacts them. Hold down the Control key while you reposition the Results Window to avoid this.
Resize The Results Window
  1. Move your cursor over part of the window edge.
  2. Click and drag, then drop the border in the required position.

General Results Display Procedures

Display Test Results
  1. In the Repository Window, double-click Tests to expand the directory structure.
  2. Double-click the Test whose results you want to display.
  3. In the Test Pane click the Results tab.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  4. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and view a list of results display options and Task Group results folders.
  5. Tick the check box next to a results option to display your selection in the Test Pane, or open a Task Group folder and select from the display options listed.
    A ticked check box to the left of a display option indicates that it is open in the Test Pane.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
    Tip: All available results have display and output options associated with them. These options may include filtering, customizing and exporting. Right-click within a graph or table to display and select from the choices available.
    Use the Windows option in the Menu Bar to control the display of graphs and tables. Or, right-click within the empty workspace of the Test Pane to access these functions.
See also:

Customize Graph Display

Zoom In and Out of a Graph

Export Test Results

Customize Graph Display
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, double-click a Test-run folder, or click the expand button next to it, to open it and display the available results.
  3. Click on a graph display results option to open your selection in the Test Pane.
  4. Right-click inside the graph and select Customize.
  5. Select the Graph Type you want.
  6. Click OK to apply your choices.
    Note: The customize settings you select are not saved when you close a graph.
See also:

Display HTTP Data Graphs

Display Custom Collector Graphs

Zoom In and Out of a Graph
  1. Open a Test and click the Results tab of the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and view a list of results display options and Task Group results folders.
  3. Tick the check box next to a graph option to display your selection in the Test Pane.
  4. Click and drag over the area of the graph you want to zoom in on and release your mouse button.
    The data range you select is magnified to fill the graph window.
  5. Double-click anywhere in the graph to zoom out and return to the full graph display.
Export Test Results
  1. Open a Test and click the Results tab of the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and view a list of results display options and Task Group results folders.
  3. Tick the check box next to a results option to display your selection in the Test Pane.
  4. Right-click inside the graph or table and select either Export to Excel (graphs), or Export (tables and lists).
    Note: The Export to Excel option automatically launches Excel and converts the data into Microsoft Excel Workbook format. Save and edit your results as required.
    The Export option enables you to export results as a .CSV file. The Test Configuration results option only supports text file format for exporting data.
Close Test Results
Delete Test Results
  1. Open a Test and click the Results tab of the Test Pane.
  2. Click the delete button in the toolbar.
  3. In the Delete Test-runs dialog box, select the Test-runs you want to delete.
    Note: Test-runs are labelled with a date and time stamp to help you identify them.
  4. Click Delete to remove the results from the Repository.

Test Configuration

The Test Configuration display option consists of a summary of data collected during a Test-run. It provides data relating to the Task Groups, Scripts, Hosts and Virtual Users that comprised the Test-run.

See also:

Display Test Configuration

Display Test Configuration
  1. Open a Test and click the Results tab in the Test Pane.
  2. In the Results Window, double-click a Test-run folder, or click the expand button next to it, to open it and display the available results.
  3. Click the Test Configuration results option in the list.
    Test configuration information is displayed in the Results tab.
    Tip: Display multiple graphs and tables concurrently to compare results using the Results Window.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
See also:

Test Configuration

Test Audit Log

The Test Audit log contains a list of significant events that have occurred during a Test-run. These include the times and details of Test initiation and completion, errors that may have occurred and Virtual User details.

Additional Audit log entries may be written to the log if the Scripts included in the Test have been modeled to incorporate the appropriate SCL code. Use the LOG SCL command in a Script to generate the data content for the Test Audit log. For more information on SCL refer to the SCL Reference Guide; an on-line copy is available within the Script Modeler, Help menu.
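As a sketch of how a Script might feed the Audit log, the fragment below uses the LOG command inside a Script's Code section. The structure is abbreviated and illustrative only; refer to the SCL Reference Guide for the exact command syntax.

```
! Illustrative SCL fragment -- not a complete Script.
Code
    ! Write a user-defined entry to the Test Audit log
    ! when the Virtual User executes this line during a Test-run.
    LOG "Login sequence started"
```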

See also:

Display Test Audit Log Data

Error Reporting and Tracing

Display Test Audit Log Data
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click the Test Audit Log results option in the list.
    Audit information is displayed in the Results tab in table format.
    Tip: Display multiple graphs and tables concurrently to compare results using the Results Window.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
    Tip: You can export the data displayed in the Test Audit Log by right-clicking within the table and selecting Export. The data is exported in CSV format.
See also:

Test Audit Log

Error Reporting and Tracing

Test Report Log

The Test Report log is a sequential text file that is used to record information about a single Test-run. Usually, a single record is written to the Report log whenever a Test case passes or fails.

Additional Report log entries may be written to the log if the Scripts included in the Test have been modeled to incorporate the appropriate SCL code. Use the REPORT SCL command in a Script to generate the data content for the Test Report log. For more information on SCL refer to the SCL Reference Guide; an on-line copy is available within the Script Modeler, Help menu.

See also:

Display Test Report Log Data

Error Reporting and Tracing

Display Test Report Log Data
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click the Test Report Log results option in the list.
    Report information is displayed in the Results tab in table format.
    Tip: Display multiple graphs and tables concurrently to compare results using the Results Window.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
    Tip: You can export the data displayed in the Test Report Log by right-clicking within the table and selecting Export. The data is exported in CSV format.
See also:

Test Report Log

Error Reporting and Tracing

Test History Log

The Test History log is a sequential text file that is used to maintain a chronological history of each occasion on which the Test was run, together with the results of that Test. Usually, a single record is written to the History log when the Test-run is complete.

In addition, further Test History log entries may be written to the log if the Scripts included in the Test have been modeled to incorporate the appropriate SCL code. Use the HISTORY SCL command in a Script to generate the data content for the Test History log. For more information on SCL refer to the SCL Reference Guide; an on-line copy is available within the Script Modeler, Help menu.

See also:

Display Test History Log Data

Error Reporting and Tracing

Display Test History Log Data
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click the Test History Log results option in the list.
    History information is displayed in the Results tab in table format.
    Tip: Display multiple graphs and tables concurrently to compare results using the Results Window.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
    Tip: You can export the data displayed in the Test History Log by right-clicking within the table and selecting Export. The data is exported in CSV format.
See also:

Test History Log

Error Reporting and Tracing

Test Error Log

The Test Error Log records all significant error messages from the Test Manager, Task Group Executers and OpenSTA Daemon.

Data included in the log are: Time Stamp, Test Name, Location and Message.

See also:

Display the Test Error Log

Error Reporting and Tracing

Display the Test Error Log
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click the Test Error Log display option in the list to open it in the Test Pane.
    Test Error Log data is displayed in table format.
    Tip: Display multiple graphs and tables concurrently to compare results using the Results Window.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
    Tip: You can export the data displayed in the Test Error Log by right-clicking within the table and selecting Export. The data is exported in CSV format.
See also:

Test Error Log

Error Reporting and Tracing

Test Summary Snapshots

The Test Summary Snapshots option displays a variety of Test summary data captured during a Test-run. Snapshots of Test activity are recorded at defined intervals and summarized in table format. You can set this interval in seconds using the Task Monitoring Interval option.

The Test statistics provided relate mainly to Task and HTTP request behavior. They are particularly useful in determining the number of HTTP requests issued, request duration and the time elapsed between request issue and response receipt during Test-runs.

See also:

Display Test Summary Snapshots

Task Monitoring Interval

Display Test Summary Snapshots
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click the Test Summary Snapshots display option in the list to open it in the Test Pane.
    Test Summary Snapshots data is displayed in table format, with a column for each data category.
See also:

Test Summary Snapshots

HTTP Data List

The HTTP Data List stores details of the HTTP requests issued by the Scripts included in a Test when it is run. This data includes the response times and codes for all the HTTP requests issued. The amount of HTTP data recorded depends on the Logging level specified for a Script-based Task Group when you created the Test and defined the Virtual User settings to be applied. The Logging level setting controls the number of Virtual Users that statistics are gathered for and can be edited from the Configuration tab of the Test Pane.

The data is presented in a table and can be sorted by clicking on the column headings to reverse the display order of the data entries. These results can also be filtered by right-clicking inside the table and selecting the Filter option. Use the Export right-click menu option to export data in .CSV text file format, which allows it to be imported into other data analysis and report generating tools.
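To illustrate the kind of follow-on analysis the .CSV export makes possible, this Python sketch parses rows shaped like an exported HTTP Data List and computes the average response time per request URL. The column headings and sample values are assumptions for the example, not necessarily the exact headers OpenSTA writes.

```python
import csv
import io
from collections import defaultdict

# Illustrative rows in the shape of an exported HTTP Data List .CSV file.
# Column headings and values are invented for this sketch.
SAMPLE = """URL,Response Code,Response Time (ms)
http://example.com/index.html,200,120
http://example.com/index.html,200,80
http://example.com/login,500,340
"""

def average_response_times(csv_text):
    """Return {url: mean response time in ms} from exported HTTP data."""
    totals = defaultdict(lambda: [0.0, 0])
    for row in csv.DictReader(io.StringIO(csv_text)):
        entry = totals[row["URL"]]
        entry[0] += float(row["Response Time (ms)"])
        entry[1] += 1
    return {url: total / count for url, (total, count) in totals.items()}

print(average_response_times(SAMPLE))
# {'http://example.com/index.html': 100.0, 'http://example.com/login': 340.0}
```

The same shape of script works for any tool that accepts .CSV input; spreadsheet applications such as Excel open the exported file directly.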

See also:

Display the HTTP Data List

Filter HTTP Data List

Virtual User Settings

Display the HTTP Data List
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click the HTTP Data List display option in the list to open it in the Test Pane.
    HTTP Data List information is displayed in table format.
    Tip: Right-click within the table and use the menu options to Filter and Export the data.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
See also:

Filter HTTP Data List

Export Test Results

HTTP Data List

Filter HTTP Data List
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click the HTTP Data List results option to open it in the Test Pane.
  4. Click the Filter button in the toolbar, or right-click inside the table and select Filter.
    The Filter dialog box offers a variety of selection criteria.
  5. The filter criteria available correspond to the column categories in the HTTP Data List table.
  6. Select the filter options you want, then click OK to apply them.
    Note: Click the Defaults button to restore the original filter settings, which reflect the full range of data measurements in the HTTP Data List.
    Note: The filter settings you apply are not saved when you close the table.
See also:

Display the HTTP Data List

HTTP Data List

HTTP Data Graphs

The volume of HTTP data recorded is controlled by the Logging level you set for a Task Group's Virtual Users. The Logging level determines the number of Virtual Users that data is collected for and controls the quality of the data displayed in the graphs. The HTTP data collected relates only to responses to HTTP requests issued as part of a Test.

The HTTP data collected during a Test-run can be displayed in a number of different graphs where you can scrutinize your Test results. There are seven graphs in total which you can display using the Results Window.

Right-click within a graph and select Customize, Export to Excel or Filter URLs.

See also:

Display HTTP Data Graphs

Filter URLs in HTTP Data Graphs

Customize Graph Display

HTTP Response Time (Average per Second) v Number of Responses Graph

HTTP Errors v Active Users Graph

HTTP Errors v Elapsed Time Graph

HTTP Responses v Elapsed Time Graph

HTTP Response Time v Elapsed Time Graph

HTTP Active Users v Elapsed Time Graph

Virtual User Settings

Display HTTP Data Graphs
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, click the expand button next to a Test-run folder, or double-click it, to open the folder and display the available results.
  3. Click an HTTP data graph option, such as HTTP Monitored Bytes / Second v Elapsed Time, to open your selection in the Test Pane.
    This graph shows the total number of bytes per second returned during the Test-run.
    Note: Graphs are displayed in the default line plot style.
    Tip: Right-click within the graph and use the menu options to Customize, Filter URLs and Export to Excel.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
See also:

Filter HTTP Data List

Customize Graph Display

HTTP Data Graphs

Filter URLs in HTTP Data Graphs
  1. Open a Test and display an HTTP data graph in the Test Pane.
  2. Click the Filter button in the toolbar, or right-click inside a graph and select Filter URLs.
    Use the Filter URLs dialog box to select the URLs you want to display. Click Select All to display all the URLs.
  3. In the Filter URLs dialog box select the URLs you want to view.
  4. Click OK to display the selected URLs.
    Note: The filter settings you apply are not saved when you close the graph.
See also:

Display HTTP Data Graphs

Customize Graph Display

HTTP Data Graphs

HTTP Response Time (Average per Second) v Number of Responses Graph

This graph displays the average response time for requests grouped by the number of requests per second during a Test-run.

Tip: Right-click within the graph and use the menu options to Customize, Filter URLs and Export to Excel.

HTTP Errors v Active Users Graph

This graph is used to display the effect on performance measured by the number of HTTP server errors returned as the number of active Virtual Users varies during a Test-run.

Note: This graph has been customized to display data points as vertical bars. Right-click within a graph and select Customize, then select Graph Type, Vertical bars.

Make use of the Filter URLs and Export to Excel options associated with this graph by right-clicking within it.

HTTP Errors v Elapsed Time Graph

This graph displays a cumulative count of the number of HTTP server errors returned during the Test-run.

Note: This graph has been customized to display the area under the data points as a solid. Right-click within a graph and select Customize > Area under points from the menu to change the appearance of your graphs.

HTTP Responses v Elapsed Time Graph

This graph displays the total number of HTTP responses per second during the Test-run.

Right-click within a graph and select Customize or Export to Excel.

HTTP Response Time v Elapsed Time Graph

This graph displays the average response time per second of all the requests issued during the Test-run.

Use the right-click menu options to Customize, Export to Excel or Filter URLs.

HTTP Active Users v Elapsed Time Graph

This graph displays the total number of active Virtual Users sampled at fixed intervals during a Test-run.

Right-click within the graph and use the menu options to Customize or Export to Excel.

Single Step Results

During Test development it is important to check that a Test runs correctly. You can run a single stepping session to help verify a Test by monitoring Task Group replay to check that the WAE responses are appropriate. Then use the Single Step Results option to analyze the results data obtained. The data includes the HTTP requests issued to a target WAE and the HTTP data returned in response during a single stepping session.

Single stepping a Test is a useful method to help you verify that a Test with a modular structure runs as you expect. A modular Test incorporates two or more Scripts in one Task Group to simulate a continuous Web browser session when the Test is run and requires some modeling of the Scripts included. After single stepping the Task Group that contains the Script sequence, open up the Single Step Results option and double-click on an HTTP request to display the request details.

View the details of the HTTP request in response to which the first cookie was issued during a Test-run. In the Response Header section of the Request Details window, look for the Set-Cookie entry and make a note of the cookie ID, including its name and value. Then view the first request included in the next Script in the sequence and look in the Request section of the Request Details window for the Cookie entry. The cookie ID recorded here should be the same as the last cookie value issued at the end of the previous Script. Ensure that the value of the last cookie issued in each Script is handed on to the next Script in the sequence, for all the Scripts in the Task Group.
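The cookie hand-off check described above can also be expressed mechanically. This hypothetical Python sketch compares the cookie ID from the last Set-Cookie response header of one Script with the Cookie request header of the next Script's first request; the header values are invented for illustration.

```python
def cookie_handed_on(prev_set_cookie, next_cookie_header):
    """Check that the cookie ID (name=value) issued by the previous Script's
    last Set-Cookie response header reappears in the next Script's first
    Cookie request header."""
    # "session=abc123; Path=/; HttpOnly" -> keep only "session=abc123"
    cookie_id = prev_set_cookie.split(";")[0].strip()
    sent = {c.strip() for c in next_cookie_header.split(";")}
    return cookie_id in sent

# Illustrative header values, not captured from a real Test-run
print(cookie_handed_on("session=abc123; Path=/; HttpOnly",
                       "session=abc123; lang=en"))   # True
print(cookie_handed_on("session=abc123; Path=/",
                       "session=zzz999"))            # False
```

A False result at any Script boundary indicates the session state was not carried forward, which would break the simulated continuous browser session.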

See also:

Display Single Step Results

Single Stepping

Timer List

Developing a Modular Test Structure

Display Single Step Results
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, double-click a single stepping Test-run folder, or click the expand button next to it, to open it and display the available results.
  3. Click the Single Step Results display option to open your selection in the Test Pane.
    Single step results are displayed in table format, with a column for each data category.
  4. Double-click a request to display more details about your selection.
    The HTTP data in the Response Body section is the same data displayed in the HTTP section when you replay a Task Group during a single stepping session.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
See also:

Single Stepping

Timer List

The Timer List file gives details of the Timers recorded during a Test-run. Timer results data records the time taken to load each Web page specified by a Script, for every Virtual User running the Script during a Test-run. The level of Timer information recorded is controlled by adjusting the Virtual User settings in the Test's Script-based Task Groups. Open the Test with the Configuration tab of the Test Pane displayed, click a VUs table cell in a Task Group, then activate the Generate Timers for each page option in the Properties Window. The Logging level you select here controls the volume of HTTP data and the number of timers recorded.

The information collected is presented in a table and can be sorted by clicking on the column headings to reverse the display order of the data entries.

The Timer List can be exported as a .CSV text file, which allows results to be imported into other data analysis and report generating tools.
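As a sketch of what that import might look like, the following Python fragment summarizes rows shaped like an exported Timer List, reporting the minimum, mean and maximum load time per timer. The column headings and sample values are assumptions for the example, not OpenSTA's exact output.

```python
import csv
import io
from statistics import mean

# Illustrative rows in the shape of an exported Timer List .CSV file.
# Column headings and values are invented for this sketch.
SAMPLE = """Timer Name,Value (ms)
HomePage,450
HomePage,610
Checkout,1200
"""

def summarize_timers(csv_text):
    """Return {timer name: (min, mean, max)} of recorded values in ms."""
    values = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        values.setdefault(row["Timer Name"], []).append(float(row["Value (ms)"]))
    return {name: (min(v), mean(v), max(v)) for name, v in values.items()}

print(summarize_timers(SAMPLE))
# {'HomePage': (450.0, 530.0, 610.0), 'Checkout': (1200.0, 1200.0, 1200.0)}
```

A per-timer spread like this quickly shows which Web pages vary most in load time across Virtual Users.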

See also:

Display the Timer List

Timer Values v Active Users Graph

Timer Values v Elapsed Time Graph

Virtual User Settings

Display the Timer List
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, double-click a Test-run folder, or click the expand button next to it, to open it and display the available results.
  3. Click the Timer List display option to open your selection in the Test Pane.
    Timer List information is displayed in table format.
    Note: Right-click within the table and select Export to save the data to a .CSV text file, which allows results to be imported into other data analysis and report generating tools.
    Tip: Display multiple graphs and tables concurrently to compare results using the Results Window.
    Note: Click the close button in the Title Bar of a graph or table to close it, or deselect the display option in the Results Window.
    Tip: To improve the display of your results, use the Customize option to display your data in vertical bar style. If the timer names and color coding key are not displayed, you can maximize the display area by double-clicking in the title bar of the graph.

Timer Values v Active Users Graph

This graph is used to display the effect on performance as measured by timers, as the number of Virtual Users varies.

You can control the information displayed by filtering the timers. The Select Timers to display dialog box appears when you choose this option from the Results Window. Use it to select the timers you want to view, then click OK to proceed.

Right-click within a graph and select Customize, Export to Excel or Filter URLs.

Timer Values v Elapsed Time Graph

This graph is used to display the average timer values per second.

You can control the information displayed by filtering the timers. The Select Timers to display dialog box appears when you choose this option from the Results Window. Use it to select the timers you want to view, then click OK to proceed.

Right-click within a graph and select Customize, Export to Excel or Filter URLs.

SNMP and NT Performance Collector Graphs

The data collection queries defined in a Collector generate results data that can be displayed in custom graphs. A maximum of two custom graphs are produced per Test-run. All NT Performance Collector data is displayed in the Custom NT Performance graph. All SNMP Collector data is displayed in the Custom SNMP graph.

If your Test includes more than one NT Performance or SNMP Collector, the appropriate custom results graph combines the data collection queries from all Collectors of the same type and displays them in one graph which you can then filter to display the data you require.
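This merging behavior can be pictured as combining the query series from every Collector of one type, keyed by query name and Host IP address, into a single structure for the graph. The sketch below is only an illustration of that idea; the data layout is an assumption, not OpenSTA's internal format.

```python
# Illustrative sketch: merge data from several Collectors of the same
# type into one structure for a custom graph. Each collector yields
# (query name, host IP, samples) tuples -- an assumed layout.
def merge_collector_data(collectors):
    """Combine query series from all Collectors into one keyed mapping."""
    merged = {}
    for collector in collectors:
        for query_name, host_ip, samples in collector:
            # The graph key pairs the query name with the Host IP,
            # mirroring the color coded key shown below the graph.
            key = f"{query_name} ({host_ip})"
            merged.setdefault(key, []).extend(samples)
    return merged
```

Filtering the graph then amounts to selecting which keys from this merged mapping are plotted.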

Use the Filter option to select and display specific data collection queries defined in the Collectors. The unique names you assigned to each query are displayed below the graph in a color coded key. The IP address of the Host used to run the Collector Task during a Test-run is automatically listed alongside the query name.

Right-click within a graph to select the Customize, Export to Excel and Filter options.

See Also:

Display Custom Collector Graphs

Filter Custom Collector Graphs

Custom SNMP Graph

Display Custom Collector Graphs
  1. Open a Test and click the Results tab in the Test Pane.
    The Results Window opens automatically listing all Test-runs associated with the current Test. Results are stored in date and time stamped folders.
  2. In the Results Window, double-click on a Test-run folder to open it and display the available results.
  3. Click the Custom NT Performance or Custom SNMP option in the results list to open your selection in the Test Pane.
    The Custom NT Performance graph is displayed in the Test Pane.
    Note: Graphs are displayed in the default line plot style. Right-click within a graph and select Customize from the menu to change their appearance.
    Tip: Right-click within the graph and use the menu options to Customize, Export to Excel and Filter the data.
    Tip: Display multiple graphs and tables concurrently to compare results using the Results Window.
    Note: To close a graph or table, click the close button in its Title Bar or deselect the display option in the Results Window.
See Also:

Filter Custom Collector Graphs

Custom SNMP Graph

Filter Custom Collector Graphs
  1. Open a Test and display a Custom Collector graph in the Test Pane.
  2. Click the Filter button in the toolbar, or right-click inside a custom graph and select Filter.
    Use the Filter dialog box to select the data collection queries you want to display.
    Note: If you have more than one Collector of the same type referenced in a Test, all the results collected are merged and displayed in one custom graph.
    The Filter dialog box displays each data collection query alongside the Task Group name, indicating which Collector the query belongs to.
  3. In the Filter dialog box select the data collection queries you want to view.
  4. Click OK to display the selected queries.
    Note: The filter settings you apply are not saved when you close a graph.
See Also:

Display Custom Collector Graphs

Custom SNMP Graph

Custom SNMP Graph

The Custom SNMP graph displays results returned by all the SNMP Collectors executed during a Test-run. You can filter the data collection queries to control the amount of data displayed.

The data collection queries defined in the Collectors referenced by a Test are color coded for easy identification. Each query displays the IP address of the Host targeted during a Test-run.

A right-click menu is associated with the custom graph. Use the Customize option to change the appearance of the graph. Other options include Export to Excel, which enables you to convert data for further analysis and output, and Filter, which displays specific data collection queries.

See Also:

Display Custom Collector Graphs

Filter Custom Collector Graphs

