JMeter is an open-source testing tool, written 100% in Java, that enables performance testing of our web application.
The main objective of this kind of test is to assess whether our system fulfills the established performance requirements by simulating a certain number of concurrent users sending requests to the system.

JMeter enables us to assess the response time of those requests by generating graphs and chart reports.
It can be downloaded from http://jmeter.apache.org/

BlazeMeter is a web tool that, like JMeter, allows us to record and run performance tests against our application in a simple way, though it is not as configurable and flexible as JMeter, and its reports tend to be fairly basic.

The first step of the test is to record the different requests whose performance we want to assess, and this is where the BlazeMeter Chrome extension shows its main advantage: requests can be easily recorded into a list that is then exported and used in JMeter to run the performance test and generate the desired reports.

The extension can be installed from the Chrome Web Store.

Recording requests with BlazeMeter

Once the filters and options are configured in the extension, we just press the record button and browse our web page while BlazeMeter collects the data about every request.

All the recorded requests can also be viewed and edited by pressing the edit button.

Once the data is collected, we simply press the export button to download a file with the .jmx extension.
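
The exported file is just an XML test plan. As a rough, trimmed-down sketch of what it contains (element names follow JMeter's JMX format; real files carry more attributes, which depend on the JMeter version and the recorded session):

    <?xml version="1.0" encoding="UTF-8"?>
    <jmeterTestPlan version="1.2" properties="5.0">
      <hashTree>
        <TestPlan guiclass="TestPlanGui" testclass="TestPlan" testname="BlazeMeter Recording"/>
        <hashTree>
          <!-- The recorded Thread Group, requests, headers and timers hang from here -->
        </hashTree>
      </hashTree>
    </jmeterTestPlan>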

Performance test configuration with JMeter

First, we have to open the previously downloaded .jmx file in JMeter.
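
Alternatively, the plan can be loaded while launching JMeter from the command line with the -t option; the file name here is just an example:

    jmeter -t recording.jmx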

Our test will have several components by default. The most important one is the “Thread Group”, which contains all the previously recorded requests and their configuration.
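
Inside the .jmx file, the Thread Group is a plain XML element. A minimal sketch, with example values (10 concurrent users, started over 5 seconds, one iteration each):

    <ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup" testname="Thread Group">
      <stringProp name="ThreadGroup.num_threads">10</stringProp> <!-- concurrent users -->
      <stringProp name="ThreadGroup.ramp_time">5</stringProp>    <!-- seconds to start them all -->
      <elementProp name="ThreadGroup.main_controller" elementType="LoopController">
        <stringProp name="LoopController.loops">1</stringProp>   <!-- iterations per user -->
      </elementProp>
    </ThreadGroup>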

Every request can be individually configured.
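
Each recorded request ends up as an “HTTP Request” sampler. A sketch of how one is stored in the .jmx file, with a placeholder host and path:

    <HTTPSamplerProxy guiclass="HttpTestSampleGui" testclass="HTTPSamplerProxy" testname="GET /home">
      <stringProp name="HTTPSampler.protocol">https</stringProp>
      <stringProp name="HTTPSampler.domain">www.example.com</stringProp>
      <stringProp name="HTTPSampler.path">/home</stringProp>
      <stringProp name="HTTPSampler.method">GET</stringProp>
    </HTTPSamplerProxy>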

Requests have sub-elements. By default, BlazeMeter adds an “HTTP Header Manager” with the configuration of the request headers, and a “Uniform Random Timer” that introduces a configurable delay before each request, once the previous one has finished. It is advisable to configure this delay manually so that the test closely simulates a real user interacting with the system.
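
As a sketch, these two sub-elements look like this in the .jmx file (the header and the 1000 ms fixed delay plus up to 500 ms of random delay are example values):

    <HeaderManager guiclass="HeaderPanel" testclass="HeaderManager" testname="HTTP Header Manager">
      <collectionProp name="HeaderManager.headers">
        <elementProp name="" elementType="Header">
          <stringProp name="Header.name">Accept</stringProp>
          <stringProp name="Header.value">text/html</stringProp>
        </elementProp>
      </collectionProp>
    </HeaderManager>

    <UniformRandomTimer guiclass="UniformRandomTimerGui" testclass="UniformRandomTimer" testname="Uniform Random Timer">
      <stringProp name="ConstantTimer.delay">1000</stringProp> <!-- fixed part of the delay, in ms -->
      <stringProp name="RandomTimer.range">500</stringProp>    <!-- random part added on top, in ms -->
    </UniformRandomTimer>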

We can add new configuration elements or request assertions to define when a request should be marked as “failed”.

By default, JMeter only validates the response code, but we can configure further validations by adding a “Response Assertion”, which lets us inspect the server's response and validate its content.
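
A sketch of a Response Assertion that, besides the response code check, requires the body to contain a given text (the “Welcome” pattern is just an example):

    <ResponseAssertion guiclass="AssertionGui" testclass="ResponseAssertion" testname="Response Assertion">
      <collectionProp name="Asserion.test_strings"> <!-- sic: this typo is part of the JMX format -->
        <stringProp name="0">Welcome</stringProp>   <!-- text the response body must contain -->
      </collectionProp>
      <stringProp name="Assertion.test_field">Assertion.response_data</stringProp>
      <intProp name="Assertion.test_type">2</intProp> <!-- 2 = "Contains" -->
    </ResponseAssertion>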

We can also add size and duration validations, among others.
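
For instance, a sketch of a Duration Assertion (fail any request slower than 2 seconds) and a Size Assertion (fail responses of 100 KB or more); the thresholds and the numeric operator code are example values:

    <DurationAssertion guiclass="DurationAssertionGui" testclass="DurationAssertion" testname="Duration Assertion">
      <stringProp name="DurationAssertion.duration">2000</stringProp> <!-- maximum allowed time, in ms -->
    </DurationAssertion>

    <SizeAssertion guiclass="SizeAssertionGui" testclass="SizeAssertion" testname="Size Assertion">
      <stringProp name="SizeAssertion.size">102400</stringProp> <!-- size threshold, in bytes -->
      <intProp name="SizeAssertion.operator">4</intProp>        <!-- 4 = "<" (response must be smaller) -->
    </SizeAssertion>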

Report configuration and performance test execution

In order to generate reports, we have to add “listeners” to our Thread Group. Listeners collect information about the requests and their response times as the test runs, and then present a detailed report.

The most useful listeners are listed below:

  • View Results Tree:

It lets us inspect all the information about each executed request and its response. A request is marked as failed if it does not pass the assertions we added or if it returns an error code.

  • Summary Report:

It shows a table with fully configurable columns. We can export the table to a CSV file, or configure the listener to write all the data to a given file automatically when the test finishes (see the sketch after this list).

  • Graph:

It generates a fully configurable graph in which we can select which bars to draw. In our example, each of the 10 users makes 3 requests. The graph shows:

– Grey: shortest request time  

– Green: longest request time  

– Red: average time of the 10 requests of each type
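
As with the other components, listeners live in the .jmx file as ResultCollector elements. A minimal sketch of a Summary Report listener that writes its data to a CSV file as the test runs (the file name is an example):

    <ResultCollector guiclass="SummaryReport" testclass="ResultCollector" testname="Summary Report">
      <stringProp name="filename">results/summary.csv</stringProp> <!-- data is written here while the test runs -->
      <boolProp name="ResultCollector.error_logging">false</boolProp>
    </ResultCollector>

The same kind of results file can also be produced without the GUI by running the test in non-GUI mode: jmeter -n -t recording.jmx -l results/summary.csv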

In a future post, we will see how to create a more advanced test and report configuration.