(J)Metering a Jazz server – Part I

Tox wrote (is writing) a series on monitoring Jazz performance. Timely indeed, as I've been working out just how a CLM server performs under load: a single-tier server holding about 3 million Work Items, with 300+ project areas, 3000 registered users (up to 500 concurrent, with 20% of them raising ~5 work items an hour), and 50+ GB of SCM content (120 components, 300 streams).

There's plenty of excellent material on CLM performance on Jazz.net (articles 790, 720, 641, 814), but what I wanted was to see what happened with the specific repository size described above.

Two problems presented themselves:

1) where to get my hands on a repository with such a large number of artefacts

2) how to run some simple tests against the repository that would give me some indication of performance and response times

It turned out that both the problems could be solved with the one tool: JMeter.

I'd used Rational Performance Tester briefly in a past life, but the prospect of setting it up for just a few simple tests was too daunting, and a customer I'd worked with liked JMeter. So, never having used JMeter before, I set out to figure out whether I could use it or whether I had to go down the RPT path after all.

The aim of my very first Test Plan was simple: authenticate with the Jazz server. After a bit of mucking about and reading, I had a basic working plan consisting of the elements described below.

I added a couple of variables to the top-level Test Plan element: "testhost" and "testport". The "Server Defaults" element uses these two variables, so if my server name or port changes I only have to change it in one spot.

I need to authenticate just once per thread (user), so "Login Once" is a Once Only Controller that encompasses a request to the Jazz login page and a login action which passes a username and password (j_username and j_password) as parameters.
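For anyone who wants to replicate that login request outside JMeter, here is a minimal sketch using only Python's standard library. The host, port and credentials are placeholders, and /ccm/j_security_check is the usual Java EE form-login endpoint Jazz uses for form authentication — verify the path against your own server.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholders standing in for the JMeter ${testhost}/${testport} variables.
TESTHOST = "clm.example.com"
TESTPORT = 9443

def build_login_request(username: str, password: str) -> Request:
    """Build the form POST that carries j_username/j_password."""
    body = urlencode({"j_username": username, "j_password": password})
    return Request(
        url=f"https://{TESTHOST}:{TESTPORT}/ccm/j_security_check",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

req = build_login_request("testuser", "secret")
print(req.data)  # b'j_username=testuser&j_password=secret'
```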

I use a Response Assertion that assumes the login succeeds if the response header does not contain "X-com-ibm-team-repository-web-auth-msg: authfailed".
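The assertion logic amounts to "fail only if the tell-tale header is present". Here it is reduced to a function; the header dicts below are invented samples, not real server responses.

```python
# Treat login as successful unless the auth-failure header carries
# the value "authfailed" -- the inverse of JMeter's Response Assertion.
AUTH_FAIL_HEADER = "X-com-ibm-team-repository-web-auth-msg"

def login_succeeded(response_headers: dict) -> bool:
    return response_headers.get(AUTH_FAIL_HEADER) != "authfailed"

print(login_succeeded({"Content-Type": "text/html"}))     # True
print(login_succeeded({AUTH_FAIL_HEADER: "authfailed"}))  # False
```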

Pleased with myself, I ran the test and… well, nothing happened. Or at least I could not *see* anything happening, except some messages in the Log Viewer window. So I added a "View Results Tree" listener and a "Summary Report" listener so I could see/monitor the requests, the response data, the results and the response times easily.

Now that I was able to log in, I needed to add a request that would run a query, but only if the login succeeded. An "If Controller" which evaluates the condition "${JMeterThread.last_sample_ok}" serves as the gatekeeper.

For the query, I'll start by being lazy and using the URL for a pre-defined query, adding it to the plan under the If Controller.

Running this doesn't do anything exciting, but I can see some response times coming back from the server.

I'll add another HTTP request, this time an OSLC query that returns all work items in a project area (/ccm/oslc/contexts/_rc-1kLanEeGNAOF-ZCganQ/workitems). Since I need to set some headers (OSLC-Core-Version: 2.0, Accept: application/xml), I use an HTTP Header Manager to store those values.
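Assembled as a plain URL plus headers, the request looks like the sketch below. The host and port are placeholders; the context UUID is the one from the post.

```python
# The OSLC query the second HTTP request issues, with the headers
# the HTTP Header Manager supplies.
CONTEXT = "_rc-1kLanEeGNAOF-ZCganQ"
OSLC_HEADERS = {
    "OSLC-Core-Version": "2.0",   # ask for OSLC 2.0 representations
    "Accept": "application/xml",  # XML rather than JSON or HTML
}

def oslc_workitems_url(host: str = "clm.example.com", port: int = 9443) -> str:
    return f"https://{host}:{port}/ccm/oslc/contexts/{CONTEXT}/workitems"

url = oslc_workitems_url()
print(url)
# An actual fetch (after authenticating) would be roughly:
#   urllib.request.urlopen(urllib.request.Request(url, headers=OSLC_HEADERS))
```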

Running the updated plan gives me the results I expect, but I add a couple of other useful JMeter constructs. First, a "Regular Expression Extractor" to set a user-defined variable called "WICOUNT" to the value of "oslc:totalCount", which is returned in the response to the OSLC query.
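The extractor is just a regular expression run over the response body. Here is the same extraction in Python against a trimmed-down, invented sample response (a real response carries the work items as well as the count):

```python
import re

# Invented, minimal OSLC query response for illustration only.
sample_response = """<oslc:ResponseInfo xmlns:oslc="http://open-services.net/ns/core#">
  <oslc:totalCount>2931847</oslc:totalCount>
</oslc:ResponseInfo>"""

# Equivalent of the Regular Expression Extractor feeding WICOUNT.
match = re.search(r"<oslc:totalCount>(\d+)</oslc:totalCount>", sample_response)
WICOUNT = match.group(1) if match else None
print(WICOUNT)  # 2931847
```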

Second, I add a “Debug Sampler” to show the value of the WICOUNT variable, a simple way of checking that my query is returning what I expect it to.

Running this augmented plan runs both queries and I can see the results, response times and Work Item count.

I add one more query, this time using the OSLC interface to search for a keyword in the work items in a project area. No rocket surgery here: I just duplicate the "OSLC Query all workitems in project area" HTTP Request and modify the request to include a search term (https://clm.jkebanking.net:9443/ccm/oslc/contexts/_rc-1kLanEeGNAOF-ZCganQ/workitems?oslc.searchTerms="exception").
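One detail worth noting: the quotes around the search term have to survive URL encoding. A quick sketch of building the encoded URL (host and context UUID taken from the post):

```python
from urllib.parse import urlencode

base = ("https://clm.jkebanking.net:9443/ccm/oslc/contexts/"
        "_rc-1kLanEeGNAOF-ZCganQ/workitems")

# The double quotes in the search term are percent-encoded as %22.
params = urlencode({"oslc.searchTerms": '"exception"'})
url = f"{base}?{params}"
print(url)
```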

Now that I have a basic framework for testing query performance, I turn my attention to creating Work Items. Again I make use of an HTTP Request and OSLC. The Path for the HTTP POST Request I use is that of the OSLC Change Request Creation Factory service (/ccm/oslc/contexts/_rc-1kLanEeGNAOF-ZCganQ/workitems). The HTTP Header Manager has an additional value "Content-Type: application/xml". The Post Body contains the data used to create the Work Item: its title, description, type and the category it is filed against.
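As a rough guess at the shape of that Post Body, the sketch below sticks to title and description (the type and filed-against category are server-specific, so they are omitted here). The element names follow the Dublin Core terms OSLC CM uses, but check a real creation-factory response for the exact vocabulary your server expects.

```python
import xml.etree.ElementTree as ET

NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dcterms": "http://purl.org/dc/terms/",
}

# Hypothetical creation-factory POST body; real payloads also set the
# work item type and the category it is filed against.
body = f"""<rdf:RDF xmlns:rdf="{NS['rdf']}" xmlns:dcterms="{NS['dcterms']}">
  <rdf:Description>
    <dcterms:title>JMeter-created work item</dcterms:title>
    <dcterms:description>Created via the OSLC creation factory.</dcterms:description>
  </rdf:Description>
</rdf:RDF>"""

root = ET.fromstring(body)  # sanity-check that the body is well-formed XML
title = root.find("rdf:Description/dcterms:title", NS).text
print(title)  # JMeter-created work item
```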

The response code returned on successful work item creation is "201", so I'll check for that with a Response Assertion.

Executing the test plan shows the Work Item created successfully with the representation of the newly created resource returned by the server.

As it stands, this test plan lets me do what I set out to do. I can now change the Number of Threads and Loop Count properties of the Thread Group to simulate multiple users creating Work Items or running queries.

I’m not entirely happy with this though, as there are a bunch of other JMeter features I’d like to try out that could make this more useful and generic.

That is (J)Metering a Jazz Server: Part II.

4 thoughts on “(J)Metering a Jazz server – Part I”

  1. I am trying to setup the Jmeter test plan as described in your post. Could you please elaborate on the “Server Defaults” element?

  1. Thanks. Got it.
        The WICOUNT is not set correctly. I run the OSLC query in the browser and it seems OK.
        Tried to change the tags, to no avail.
