
From November 2007 through February 2008, the University of Michigan performance testing team conducted a series of tests on the 2.4.x and 2.5 versions of the Resources tool. The tests were designed to provide a point of performance comparison between the two builds. The 2.4 test version was based on Sakai 2.4.x v37145. Note that, due to an oversight on our part, quota calculations were turned off during all testing; this is important because quota-related data gathering is one of the most improved parts of the 2.5 release.

Both builds were stress tested to find the limits of the Resources tool. We executed a test scenario designed to create a user load equal to 500% of Resources' projected peak usage at the University of Michigan. This translates to 180 simulated users uploading, downloading, and revising resource files at the same time. Both stress tests generated 500 hits/second on the application servers and 12Mb/sec of bandwidth throughput during peak periods. The following five scripts were used:
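The original test scripts are not reproduced here, but the shape of the scenario they implement can be sketched in a few lines. In this stand-in, the HTTP calls to the Sakai server are replaced by a stub that sleeps briefly, and all names and parameters (the user count is scaled down from the 180 used in the real test) are illustrative only:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

VIRTUAL_USERS = 18     # scaled down from the 180 simulated users in the real test
ACTIONS = ("upload", "download", "revise")

def simulated_request(action: str) -> float:
    """Stand-in for an HTTP request to the Resources tool.

    Sleeps a short random interval in place of real network I/O and
    returns the elapsed time in seconds, as a load tool would record it.
    """
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # placeholder for the real request
    return time.perf_counter() - start

def user_scenario() -> list:
    """One virtual user: upload a file, download it, then revise it."""
    return [simulated_request(a) for a in ACTIONS]

def run_load_test() -> dict:
    """Run all virtual users concurrently and summarize response times."""
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        results = list(pool.map(lambda _: user_scenario(), range(VIRTUAL_USERS)))
    times = [t for user_times in results for t in user_times]
    return {
        "samples": len(times),
        "avg_sec": sum(times) / len(times),
        "peak_sec": max(times),
    }

if __name__ == "__main__":
    stats = run_load_test()
    print(f"{stats['samples']} requests, "
          f"avg {stats['avg_sec']:.4f}s, peak {stats['peak_sec']:.4f}s")
```

A real load test would replace the stub with authenticated HTTP requests and run the scenario in a loop for the duration of the test, which is how the average and peak figures reported below were gathered.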

The 2.5 version of Resources showed marked improvement in file uploads during stress testing. Using the 2.5 version, uploading a 10Mb file during high stress averaged roughly 6 seconds and peaked at 11 seconds. Using the 2.4 version of Resources, the same transaction averaged 16 seconds and peaked at 27 seconds. Faster uploads led to a slight throughput increase at peak periods. In the following graphs, the file upload response time is the top gold line, and opening the Resources tool is the light blue line. In comparing times, please note the scale change on the left axis between the 2.4 and 2.5 graphs. In all graphs below, the dark stepped line shows the number of concurrent users being tested at that point.

Both stress tests experienced difficulty after the load peaked and continued at peak for several hours. During both tests, the time to open the Resources tool from the course menu increased throughout this peak period. For the 2.5 version, opening the tool during the test's ramp-up required less than a second. During the first hour of peak stress, opening Resources required roughly 3 seconds. During the second hour of peak stress, opening the tool took 6 seconds, with spikes of over a minute. We found similar results with the 2.4 release. Please keep in mind that stress testing expects some level of attrition and degradation. Attached are the throughput graphs which detail the system slowdown throughout the test.

Both builds were also stress tested using the Drop Box tool. We wanted to evaluate any performance impact on Drop Box from the upgraded Resources tool. This test involved 300 simulated users opening the Drop Box tool, uploading a file, waiting a few seconds, then deleting the file. This generated 650-950 hits/second and 6Mb/second of throughput. The following script was used to evaluate Drop Box:
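As with the Resources scenario, the actual script is not included in this page; the open-upload-wait-delete loop it performs can be sketched as below. The HTTP calls are again stubbed with short sleeps, and the think time is illustrative rather than the value used in the real test:

```python
import random
import time

def dropbox_iteration(think_time_sec: float = 0.002) -> dict:
    """One iteration of the Drop Box scenario: open the tool,
    upload a file, pause briefly, then delete the file.

    Each step is a stub that sleeps in place of a real HTTP request
    and records its elapsed time in seconds, keyed by step name.
    """
    def step() -> float:
        start = time.perf_counter()
        time.sleep(random.uniform(0.001, 0.003))  # placeholder for the real request
        return time.perf_counter() - start

    times = {"open": step(), "upload": step()}
    time.sleep(think_time_sec)  # the simulated user's "wait a few seconds"
    times["delete"] = step()
    return times
```

Running this loop across 300 concurrent virtual users, each iterating for the duration of the test, produces the sustained hits/second and throughput figures cited above.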

All response times were faster in this test than in the Resources-specific stress test. The 2.5 release performed better on average: opening the Drop Box tool from a course menu and deleting files performed roughly 10% faster than in the 2.4 release, and file uploads performed 30% faster. In the following graphs, the file upload response time is the top gold line, opening the Drop Box tool is the brown line typically in the middle, and deleting a file from the Drop Box is the darkest line at the bottom, denoting the fastest response time.

Please let me know if you have any questions or concerns. I can be reached at: ckretler@umich.edu


1 Comment

  1. Chris, so that one can see the difference between 2.4 and 2.5, might it be possible to plot the 2.4 and 2.5 data on the same graph, or use the same vertical axis scale (e.g., Average Response Time) in the pairs of plots?