HPE Software Products: StormRunner Load Practitioner Forum
Software Content on HPE Enterprise Community in Read-Only May 9 - 15
As you may have seen in the last few months, Hewlett Packard Enterprise is combining some software assets with Micro Focus. As part of this spin-merge with Micro Focus, a new Software instance of an online community will go live on May 16, 2017.
All boards within the current Software category will be relocated to a new community. All URLs will redirect to the new community, and all current users will be migrated to the new community as well. Please be sure to update your bookmarks after May 16.
As part of the migration to a new Software instance, all software content will be in read-only mode from May 9 – 15. We apologize for any inconvenience.
We will continue to provide further updates in this News board.
In the StormRunner report, the SLA status is showing as Failed even though the 90th percentile value is less than the SLA threshold. In these cases, the % Breakers value is greater than 10%.
Where % Breakers is less than 10%, the SLA status for these transactions is showing as Passed.
Does the report fail the SLA status when % Breakers exceeds some threshold? I'm guessing 10%.
I don't get how a transaction can be considered to have a failed SLA status when the 90th Percentile is less than the SLA threshold.
SLA Status = Failed, 90th% = 2.622, SLA Threshold = 3.000, % Breakers = 17.647
How can this be considered to be a failure if 90% of transactions have a response time of less than 2.622 secs?
How is % Breakers calculated? The documentation doesn't say.
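Since the documentation doesn't define % Breakers, here is one hypothetical reading, sketched in Python: the percentile is computed over all transaction samples, while % Breakers counts measurement intervals whose average breached the threshold. Under that (unconfirmed) assumption, a few large outliers can fail intervals without moving the 90th percentile, reproducing numbers like those above:

```python
# Hypothetical sketch, NOT StormRunner's actual calculation: p90 over all
# samples, % Breakers over per-interval averages against the SLA threshold.

THRESHOLD = 3.0  # seconds, the SLA threshold from the report above

def nearest_rank_percentile(samples, p):
    """Nearest-rank percentile: value at ceil(p/100 * n) in sorted order."""
    ordered = sorted(samples)
    rank = max(1, -(-p * len(ordered) // 100))  # ceil without math.ceil
    return ordered[rank - 1]

# 17 measurement intervals of 6 samples each; three intervals contain a
# single 9-second outlier, so their average (19/6 ~ 3.17 s) breaches 3.0 s
fast = [2.0] * 6
spiky = [2.0] * 5 + [9.0]
intervals = [spiky if i in (5, 9, 13) else fast for i in range(17)]

samples = [s for interval in intervals for s in interval]
p90 = nearest_rank_percentile(samples, 90)  # 2.0 s, well under the threshold
breaker_pct = 100 * sum(
    sum(iv) / len(iv) > THRESHOLD for iv in intervals) / len(intervals)
# breaker_pct = 3/17 * 100 ~ 17.647%, over a 10% limit
status = "Failed" if breaker_pct > 10 else "Passed"
```

So a report could legitimately show p90 below the threshold yet a Failed status, if the failure is driven by the breaker percentage rather than the percentile itself; whether StormRunner actually works this way would need confirmation from support.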
As mentioned already, you should move or cross-post your question to the LoadRunner Forum.
HP StormRunner is incorrectly reporting on errors.
When on the "Analyzing Test Results" page, clicking on "script errors" should bring up a list of all errors (see attached), but the error ID doesn't match the ID in the error message. In this example only two types of errors (and hence only two screenshots) appear, but if I download the CSV with all the error messages in it, there are seven different errors, only two of which match the Error ID; the other five don't.
To make things worse, after a few days the real errors (visible only in the CSV file) no longer appear in the downloaded CSV file, making them impossible to review (as Firefox saves these files in a temp folder).
Has anyone else experienced this?
I recently added 6 test scenarios to the same project.
I set up 3 test scenarios to start with, did some runs, and then later configured a further three test scenarios.
3 of the scenarios share the same scripts, so there are two sets of identical scripts (uploaded zip files): one set is shared by 3 scenarios, the other set by the remaining 3 scenarios.
The first three scenarios appear to have linked runtime settings; that is, if I adjust the pacing in one of these 3 test scenarios, the pacing also changes in the other two. I can't see how these 3 tests are linked. The second set of 3 test scenarios does the same.
The 2 sets of 3 test scenarios do share a similar name, apart from one word (high/medium/low). They also share the same 'additional attributes'. I have tried renaming them and still they appear to be 'linked'.
I see there is a disclaimer in the runtime settings box: 'Note! Changes will affect the script in related tests'. What does this mean, and what are 'related' tests? Is this the cause of the problem, or am I missing something here? Do I have to have 6 projects with 1 test in each? None of the tests can actually share the same runtime settings in my configuration; all 6 have a unique configuration.
Join us for the Best of Show on Tour, where we will present the latest trends and developments in Application Delivery Management. Providing high-quality applications quickly and reliably, and creating a positive digital user experience: these are the declared goals of IT and development departments, and they can be achieved with Continuous Testing, agile development and DevOps. We will show you how on the Best of Show on Tour. At the same time, we would like to take the opportunity to kick off the HPE German Testing Community, a format which is already well received in Switzerland and which we would like to offer our German customers as a platform for knowledge exchange, best-practice sharing, networking and collaboration among peers. Don't miss this opportunity in a city near you and register now! (The events will be conducted in German.)
I am configuring a test using StormRunner with 6 on-premises load injectors.
What firewall rules are needed to allow the traffic to pass through? Is there a range of ports?
I am configuring a test for next week on StormRunner. I have 2 scripts: one uses cloud load generators and the other uses multiple on-premises load generators.
When creating the test I can only select either "Cloud" or "On premise" load generators. How can I use a combination of both? At the moment I am just using a trial version.
I have created scripts using VuGen 12.53, recording with WinInet instead of sockets.
The LoadRunner Controller replays the script with the option "WinInet replay instead of Sockets" enabled, with no issues.
As the runtime settings in StormRunner are very limited, will it take the other extended settings from the script? Is there an option to enable WinInet?
When attempting to execute a StormRunner test scenario that contains a Java VUser script I'm seeing the following error message.
Error ID: -1017
One or more of the scripts included in this test appear to be in an invalid format.
The script in question currently works through Performance Center.
Any suggestions on what would be causing StormRunner to say that this script is in an invalid format?
When I was at HPE Discover I got an introduction to HPE StormRunner;
as I understood it, the form of licensing is new for StormRunner.
Also, what is the big difference if we compare StormRunner Load with LoadRunner?
Thank you in advance.
HPE Software R&D constantly strives to improve its products, and values your feedback.
How can you help? Participate in our annual survey and earn your chance to win one of five $100 Amazon gift cards!
The survey covers HPE’s performance testing suite of tools, including Performance Center, LoadRunner, StormRunner Load, Network Virtualization & Service Virtualization.
To increase your chance of winning, forward the survey to relevant colleagues, and copy us at email@example.com - your name will be entered again in the raffle!
Thank you and good luck,
Hi all,
We are trying to find out whether StormRunner can be used for testing SOAP-based WCF services. We could not find any articles on this.
I preset the percentile value in my SLA to 90 prior to starting the test. After the test is finished, how do I check the 95th percentile values of the transactions evaluated in my test?
I have made a zip file of the TruClient script, and when I try to upload the script to StormRunner, the upload fails after a few minutes with the message "Cancelled uploading script xxxxx.zip". I am trying to evaluate the product. Thanks.
What is ChatOps?
ChatOps, a term widely credited to GitHub, is all about collaboration in the Dev/IT workforce. By bringing your tools into your conversations and using a chat bot modified to work with key plugins and scripts, teams can automate tasks and collaborate, working better and faster (see the full article).
What is HPE StormRunner Load?
Hewlett Packard Enterprise StormRunner Load is a Software as a Service (SaaS) solution for Web and mobile application performance and cloud testing, for both internal and external applications. Its capabilities include:
It’s only natural to want a ChatOps bot through which we can communicate with StormRunner Load. After doing some research, I found that there wasn’t a bot created for StormRunner Load, so I decided to develop one myself.
Beginning the creation process
This is where the StormRunner Load Public API came in handy. The API exposes many capabilities of the product that you can drive programmatically. It is very well documented and easy to use. In addition, it allows you to try the API for yourself in your browser to examine how the REST calls work.
This is only a portion of the capabilities offered by StormRunner Load Public API. Get more information on StormRunner Load Public API here.
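To give a feel for how a bot wraps the API, here is a minimal Python sketch of the kind of helper a chat command could call. The base URL, path and payload below are placeholders invented for illustration; take the real endpoints, fields and authentication scheme from the API's interactive documentation:

```python
# Illustrative only: the URL, path segments and auth header are assumptions,
# not the documented StormRunner Load Public API surface.
import json
import urllib.request

BASE_URL = "https://example.com/storm/api"  # placeholder base URL
API_TOKEN = "<your-api-token>"              # placeholder credential

def build_start_test_request(tenant_id: str, test_id: int) -> urllib.request.Request:
    """Build (but do not send) a POST that would start a load test run."""
    url = f"{BASE_URL}/tenants/{tenant_id}/tests/{test_id}/runs"
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode(),
        method="POST",
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
    )

req = build_start_test_request("my-tenant", 42)
# urllib.request.urlopen(req) would fire the call; a ChatOps bot simply
# maps chat commands such as "run test 42" onto helpers like this one.
```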
Don't forget to follow us on Twitter @HPE_LoadRunner.
***This is an open source project (under the Apache 2.0 license)
This post was written by Wei Sun, Lynn Liu from the StormRunner Load R&D team
As a performance engineer, you may have many test scripts for the various projects you are working on. To manage your testing artifacts efficiently, you need a version control system such as Git or SVN. Git's distributed nature is a perfect fit for today's Agile testing: it allows users to be more flexible in how they collaborate on projects.
In the recent StormRunner 2.2 release, Git integration was introduced, allowing you to connect your Git repository to your StormRunner tenant: not only an internally hosted Git repository, but also GitHub, GitLab and Atlassian Stash (Bitbucket). Using the integration, you can upload your scripts to StormRunner directly from your Git repository. If the scripts have been updated in your Git repository, you can easily synchronize the changes with one click.
Here we use a small example to demonstrate how the Git integration simplifies your script management. The demonstration shows how to seamlessly sync a script between VuGen and StormRunner. More about ‘VuGen now connects you to Git repository’.
1. Configure the StormRunner Git Agent and connect it to your tenant. See how to configure the StormRunner Git Agent.
2. Choose Upload from Git. Click Add to upload scripts via the selected Git Agent.
3. VuGen Git integration
-> Note: you can configure a Git ignore list or untrack files to reduce the size of your script.
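As an illustration of that note, a minimal .gitignore for a VuGen script repository might look like the following; the patterns are only typical replay artifacts and should be adjusted to whatever files your own replays actually produce:

```
# Illustrative .gitignore for a VuGen script repository
# local replay results and logs
result*/
output.txt
*.log
# editor backups
*.bak
```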
That’s it! We urge you to try out this new capability to see how it will help you shorten your scripting and uploading process.
Do we have snapshots from the runs for the error?
This post was written by Huan Feng (Ramsey), Lynn Liu, Wei Sun from the StormRunner Load R&D team
Today’s engineer is likely to be working on code that is continuously integrated into the main source code repository. Continuous Integration systems such as Jenkins help ensure that unit tests and other tasks are automatically run whenever a build takes place. A task that more and more developers are running as part of their continuous integration suite is load testing their software. Users of StormRunner Load can take advantage of HPE’s StormRunner Jenkins plugin to run their tests automatically. This post describes the four simple steps you can take to make this happen.
1. Install the latest version of the plugin to your Jenkins server
2. Configure StormRunner plugin in Jenkins
-> Note: to validate that the credential inputs are correct, you can click Test Connection.
3. Create and configure a new job to run a StormRunner test
-> Note: to find your test’s TestID, navigate to StormRunner Load and go to Load Tests > General:
4. Build the job and view the output
-> Note that the StormRunner Jenkins plugin generates both a .csv and an .xml file in the workspace folder after the build completes:
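As a sketch of how a later build step might consume that .csv, here is a small Python example. The column names and file layout are hypothetical (invented for illustration); check the header row of the file your build actually produces before wiring anything like this into a job:

```python
# Parse a results CSV and fail the build if any transaction missed its SLA.
# The columns below ("Transaction", "Avg", "P90", "SLA") are made up for
# this sketch; substitute the real header names from the plugin's output.
import csv
import io

# Stand-in for open("<workspace>/results.csv") so the sketch is self-contained
sample_csv = io.StringIO(
    "Transaction,Avg,P90,SLA\n"
    "login,1.2,2.1,Passed\n"
    "checkout,3.8,5.6,Failed\n"
)

failed = [row["Transaction"] for row in csv.DictReader(sample_csv)
          if row["SLA"] == "Failed"]

# A post-build script could exit non-zero to mark the Jenkins job as failed
exit_code = 1 if failed else 0
```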
That’s all there is to it: four simple steps to run your StormRunner Load tests as part of your continuous integration! Take StormRunner out for a free spin; here’s where you can sign up for a free trial now.
I have created a load test in which I want to import my JMX file, but I am unable to import it, i.e. I do not see any option. When I click Upload it says "add custom files", whereas when I explicitly click on the JMX it says "unsupported format".