Performance Testing Terminology

Performance testing is the process of determining the speed, responsiveness, and stability of a computer, network, software program, or device under a workload. To conduct and analyze a good load test, you need to understand performance testing terminology, including terms such as connect time and latency.

Let’s elaborate on what they mean.

Connect Time

Connect time is the time taken to establish a TCP connection between the client and the server. TCP guarantees delivery of data by its nature, and every connection begins with the TCP handshake; only if the handshake succeeds can the client send further requests. This handshake happens below the HTTP layer. If no TCP connection can be made between the client and the server, the client can't talk to the server at all. This can happen if the server is down or busy responding to other requests.
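To make connect time concrete, here is a minimal Python sketch, not Loadium or JMeter code, that times only the TCP handshake; the host and port are placeholder values:

```python
# A minimal sketch, not Loadium/JMeter code: timing only the TCP handshake.
# "example.com" and port 443 are placeholder values for illustration.
import socket
import time

def measure_connect_time(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Return the time in seconds spent establishing the TCP connection."""
    start = time.perf_counter()
    sock = socket.create_connection((host, port), timeout=timeout)  # TCP handshake happens here
    connect_time = time.perf_counter() - start
    sock.close()
    return connect_time

print(f"Connect time: {measure_connect_time('example.com') * 1000:.1f} ms")
```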

Latency Time

Latency is the time taken for information to reach its target and come back again; it is a round trip. Latency is often experienced as delay, which becomes a serious problem when working with remote data centers: data hops through intermediate nodes on its way from the server, so the greater the distance, the longer the delay. Those hops increase response time and can violate your service level agreements (SLAs), which is why latency is often the hardest metric to deal with. JMeter measures latency from the first moment of sending the request until the first byte of the response is received, so in JMeter, connect time is included when calculating latency. There is also network latency, which expresses the time for a packet of data to travel from one designated point to another.
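As an illustration only, the following Python sketch measures latency in the same spirit: the clock starts just before the request is sent (with the connect phase included, mirroring the description above) and stops when the first response byte arrives. The host is a placeholder:

```python
# A minimal sketch, not JMeter internals: latency measured from just before the
# request is sent until the first response byte arrives, with connect time included.
import socket
import time

def measure_latency(host: str, port: int = 80) -> float:
    start = time.perf_counter()  # clock starts before the connect
    sock = socket.create_connection((host, port), timeout=5.0)
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode())
    sock.recv(1)                 # block until the first byte of the response
    latency = time.perf_counter() - start
    sock.close()
    return latency

print(f"Latency (time to first byte): {measure_latency('example.com') * 1000:.1f} ms")
```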

Elapsed Time

Elapsed time is measured from the first moment the request is sent until the last byte of the response is received.

So basic math tells us that:

Latency time - Connect time = Server processing time

Elapsed time - Latency time = Download time
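As a hypothetical worked example with made-up numbers, the sketch below applies these two formulas:

```python
# A hypothetical worked example with made-up numbers, applying the formulas above.
connect_time = 0.050   # seconds spent on the TCP handshake
latency_time = 0.200   # seconds until the first byte of the response
elapsed_time = 0.450   # seconds until the last byte of the response

server_processing_time = latency_time - connect_time   # 0.200 - 0.050 = 0.150 s
download_time = elapsed_time - latency_time             # 0.450 - 0.200 = 0.250 s

print(f"Server processing time: {server_processing_time:.3f} s")
print(f"Download time: {download_time:.3f} s")
```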

Throughput

Throughput is the number of units of work that can be handled per unit of time. From a performance testing perspective, it can be expressed as requests per second, request calls per day, hits per second, or bytes per second. JMeter also lets you create assertions on response time, so you can set a request's pass/fail status according to its result.
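The sketch below shows the idea behind the requests-per-second flavor of throughput; send_request is a hypothetical placeholder for whatever call your test actually makes:

```python
# A minimal sketch: throughput computed as completed requests per unit of time.
# send_request is a hypothetical placeholder, not a real load-testing API.
import time

def send_request() -> None:
    time.sleep(0.01)  # stand-in for a real HTTP request

start = time.perf_counter()
completed = 0
while time.perf_counter() - start < 5.0:   # measure over a fixed 5-second window
    send_request()
    completed += 1

throughput = completed / (time.perf_counter() - start)
print(f"Throughput: {throughput:.1f} requests/second")
```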

Performance Thresholds

Performance thresholds are the KPIs, that is, the maximum acceptable values of a metric. Many metrics need to be identified for a performance test project: response time, throughput, and resource-utilization levels such as processor capacity, memory, disk I/O, and network I/O. A simple check against such thresholds is sketched after the examples below.

Example Thresholds:

The login service shouldn't take more than 3 seconds.

The server should handle 40 payment requests per second.

CPU utilization shouldn't be higher than 60%.
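As an illustration, here is a minimal Python sketch that checks hypothetical measured values against these example thresholds:

```python
# A minimal sketch: checking measured metrics against the example thresholds above.
# All numbers in `measured` are hypothetical results from a single test run.
thresholds = {
    "login_response_time_s": 3.0,     # login should complete within 3 seconds
    "payment_throughput_rps": 40.0,   # at least 40 payment requests per second
    "cpu_utilization_pct": 60.0,      # CPU should stay at or below 60%
}

measured = {
    "login_response_time_s": 2.4,
    "payment_throughput_rps": 45.2,
    "cpu_utilization_pct": 71.0,
}

checks = {
    "login response time": measured["login_response_time_s"] <= thresholds["login_response_time_s"],
    "payment throughput": measured["payment_throughput_rps"] >= thresholds["payment_throughput_rps"],
    "CPU utilization": measured["cpu_utilization_pct"] <= thresholds["cpu_utilization_pct"],
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```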

Saturation

Saturation is the point at which a resource reaches maximum utilization. For a server, this is the point where it cannot respond to any more requests.

We hope this information clears up performance testing terminology and helps you analyze performance test reports more efficiently.

Check out Loadium blog to see more great content!

Happy Load Testing!
