Compression, Decompression, Mobile Performance and LoadRunner

4 Feb

Recently I inherited some LR scripts from one of my colleagues. They were all about building JSON calls to stress the backend Spring Security framework, which was the first layer of entry into the mobile infrastructure. The scripts were simple, built using web_custom_request with a JSON string as the body. One thing that really surprised me during this effort was that web_custom_request itself was taking close to 100 ms to 300 ms just to decompress the server response during load testing.
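
To give a sense of the shape of these scripts, here is a minimal sketch of that kind of call (the URL and JSON body are hypothetical placeholders, not the actual application's):

```c
web_custom_request("auth_login",
    "URL=https://mobile-gateway.example.com/auth/login",      // placeholder endpoint
    "Method=POST",
    "EncType=application/json",
    "Body={\"username\":\"user1\",\"password\":\"secret\"}",  // sample JSON body
    LAST);
```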

Okay, first let me give you some background. The servers were configured to send responses compressed in gzip format, with the Content-Encoding header set to gzip. The functionality in scope had an SLA of 1 second max, and quite a few functions had SLAs below 500 ms. Quite challenging SLAs, I would say. But then again, this functionality was meant to be accessed from mobile devices, so the lower the response time, the better for users.
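
If you want VuGen to receive that compressed traffic the same way the mobile clients do, the request header can be set explicitly (this is standard LR usage, not something specific to these scripts):

```c
// Ask the server for gzip-compressed responses on every subsequent request
web_add_auto_header("Accept-Encoding", "gzip, deflate");
```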

Most of the responses coming from the server were served as chunked bytes. What that means is: the server first sends some bytes of the response in compressed gzip format, LR decompresses those bytes in 5 to 10 ms, then the server sends the next range of bytes as a chunked gzip response, LR again spends close to 5 to 10 ms decompressing them, and the process continues until the final set of bytes arrives. All of this happens over a single connection, and the connection with the server never closes. If you also have some server response validation in place, expect it to add another 10 ms or so.
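
Here is a hypothetical sketch (endpoint and search text are placeholders) of where all that time lands: the per-chunk gunzip work and the registered validation both happen inside the request, so they are counted within the transaction markers:

```c
// Registered validation runs against the decoded body; per the timings
// above, expect it to add roughly 10 ms on top of the decompression.
web_reg_find("Text=\"status\":\"OK\"", LAST);

lr_start_transaction("get_profile");

// Download + per-chunk decompression (5-10 ms per chunk) + validation
// are all measured between these two markers.
web_custom_request("get_profile",
    "URL=https://mobile-gateway.example.com/api/v1/profile",  // placeholder
    "Method=GET",
    LAST);

lr_end_transaction("get_profile", LR_AUTO);
```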

Now, I measured all these times in a single VuGen iteration; they grow considerably larger when running the load test from the Controller or Performance Center, and this overhead of decoding the gzip content becomes quite an issue when the response time SLAs are in milliseconds.

Here is how the behavior looks in LR VuGen with decompression turned on in the script. You can see that it takes 5 ms to decode 154 bytes of response. Now imagine that a normal web page carries about 2 MB of gzipped data: the impact of this decoding grows with page size, especially when the response arrives as chunked bytes with no fixed Content-Length from the server. As a rough, back-of-the-envelope extrapolation (assuming the same per-chunk cost and, say, 8 KB chunks), a 2 MB page would be around 256 chunks, putting the decode time alone on the order of a second or more.

[Figure: VuGen replay log showing 5 ms spent decoding a 154-byte gzipped response]


I think the HP LR team is probably aware of this behavior, and that is likely why they came up with a function to disable it: use web_set_option with the DecodeContent flag turned off if your scripts do not require validation and have response time SLAs in milliseconds (a sketch of the call follows). The drawback of disabling this feature is that all your correlations and other checks on the server response will fail, since the response will show up as binary content, like the screenshot below the sketch.
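
Here is what that call looks like, as far as I know (a minimal sketch; double-check the exact value strings against your LR version's documentation):

```c
// Turn off response decoding; responses stay as raw gzip bytes,
// which also skips the 5-10 ms gunzip cost per chunk.
web_set_option("DecodeContent", "0", LAST);
```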

[Figure: VuGen log showing the server response as unreadable binary content once decoding is disabled]


I would suggest disabling this feature if you can, and doing the response validation through other techniques, such as verifying server logs. By disabling it you will gain close to a 15 to 20% reduction in the response time reported by LR.

Is this expected behavior for LoadRunner? I think they have to do it: unless LR decodes the response, none of the other functions like web_reg_save_param or web_reg_find will work, and those are core LoadRunner functions. The right fix is probably for LR not to include these decompression timings inside the transaction markers, since they really pollute the results, especially for web applications; alternatively, they could speed up the decompression library LoadRunner uses.
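
To make that dependency concrete: the registration functions search the decoded, plain-text body, so against raw gzip bytes they would simply match nothing (the boundaries here are hypothetical examples):

```c
// Correlation: capture a token between left/right text boundaries.
// web_reg_find works the same way; both need the decoded body to match.
web_reg_save_param("sessionToken",
    "LB=\"token\":\"",
    "RB=\"",
    LAST);
```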
