The applications were introduced over an extended period and ranged from those with a huge user base to those used by a smaller community of specialists. Some were implemented with little change; others had bespoke extensions, and some were built to order. The user communities were distributed across several locations and changed frequently. Successful adoption of many of these applications depended on confidence in their quality and ease of use.
To meet this need, Acutest set up a virtual performance testing centre. Whenever a new software application was scheduled for introduction, a specialist from the testing centre worked with the client's business and technical staff to identify the most effective way to conduct performance testing. For the limited number of internal developments, we added functional and usability testing to the project.
In all cases, the time available to construct and execute the tests was very short. As a first step, Acutest selected Mercury Interactive's LoadRunner, which had been enhanced to provide advanced testing and tuning for Citrix environments as well as the mixed environments required by several of the projects. The tool was supplemented by experienced performance testers with strong technical and communication skills. The result was a service that could be delivered quickly and produced clear results, allowing client management to take effective decisions.
In one case, a packaged application being introduced had never supported as many concurrent users as the client required, and initial tests showed that it could not operate at that level. The detailed test results provided to the supplier enabled them to locate and correct the limiting factors quickly, and the implementation met the level of demand when it went live. In another case, testing identified bottlenecks in the infrastructure, which was then modified to address them. A third example was an application that needed a huge up-front investment but would save £4m annually; the performance assessment proved that the application would work and gave the business the confidence to commit to its plans.
All of the applications introduced met their performance goals without disrupting existing services. There was no slippage to the timetable from applications being unavailable on time or running too slowly. This gave the user community confidence in the quality of delivery from the IT department and allowed quicker roll-out and adoption.