
Georgia Institute of Technology
Founded in 1885, the Georgia Institute of Technology has 25,000 students and a staff of over 5,600. The school is organized into 31 departments, with an emphasis on science and technology. Georgia Institute of Technology is ranked 7th among all public national universities and 36th among all colleges and universities in the US News & World Report rankings.
Challenge
Dr. James Logan is QA Manager at the Georgia Institute of Technology (Georgia Tech). He and his team of three analysts are responsible for ensuring that all IT assets are live 24/7, 365 days a year. Among the IT assets Dr. Logan and his team are responsible for are the school’s Ellucian Banner system, the Research Portal web site, Oracle PeopleSoft, Drupal websites, and even homegrown ColdFusion applications.
The Research Portal web site is a campus-wide system used by the entire research community. Georgia Tech relies on the Research Portal to disseminate critical research data. The university’s Ellucian Banner system is the core system at Georgia Tech; it serves tens of thousands of visitors a day, and nothing at the school happens without it. The university cannot afford a slowdown or crash on either system and needs to know about potential performance issues ahead of time.
Keeping these systems live and running smoothly is a huge obligation for Dr. Logan and his team.
Solution
Georgia Tech standardized their performance testing on RadView WebLOAD. According to Dr. Logan, “We needed a performance testing tool that could run complex simulations. Like many universities, we have a mix of ERP and web portals with high traffic. Registration week is our version of Cyber Monday. Our ERP system demands the resources of a mid-cap enterprise. Downtime is not an option at Georgia Tech.”
Georgia Tech uses WebLOAD to test Ellucian Banner and Oracle PeopleSoft, along with its homegrown Adobe ColdFusion applications and Drupal applications.
“This technology allowed my team to effortlessly run regression analysis on several experiments at once and study users’ transactions over time, while the users were accessing the portal. It also allows us to test and compare the performance of our Ellucian Banner and Research Portal web site by varying the number of users authenticated to the portal, displaying the dashboard portlets page, and measuring response times.”
Methodology
Dr. Logan wrote a detailed refereed technical paper on his experience with RadView WebLOAD. The section below is an excerpt from his paper, and it is a treasure trove for QA managers!
Workload
After gathering requirements from the web portal’s steward, the author designed, created, and executed the following automated scripted user scenario:
- Launch a browser, enter credentials, click Submit, and display the dashboard of portlets.
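For illustration, the scenario can be approximated in plain Python, as in the minimal sketch below. This is not the actual WebLOAD script; the portal base URL, endpoint paths, and form field names are hypothetical placeholders.

```python
# Illustrative approximation of the scripted scenario (not the WebLOAD
# script itself). PORTAL, the endpoint paths, and the form field names
# are hypothetical.
import time
import requests

PORTAL = "https://portal.example.edu"  # hypothetical portal base URL

def run_scenario(username: str, password: str) -> float:
    """Log in and display the portlet dashboard; return elapsed seconds."""
    session = requests.Session()
    start = time.perf_counter()

    # Open the login page, then submit credentials.
    session.get(f"{PORTAL}/login")
    resp = session.post(f"{PORTAL}/login",
                        data={"username": username, "password": password})
    resp.raise_for_status()

    # Request the dashboard page that renders all portlets.
    resp = session.get(f"{PORTAL}/dashboard")
    resp.raise_for_status()

    return time.perf_counter() - start
```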
Little’s Law, U = V / (60/D) [Little 2011], was used to determine the number of virtual users for the experimental runs, where V is the number of real-world users per hour and D is the average visit duration in minutes. Hence 3, 8, and 16 virtual users, each running a single script, simulated 45, 100, and 200 real-world users, respectively. The average visit duration for each experiment was 5 minutes.
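To make the sizing arithmetic concrete, the calculation can be written in a few lines of Python; truncating (rounding down) the result reproduces the virtual-user counts reported in the paper.

```python
# Little's Law sizing: U = V / (60 / D), where V is the number of
# real-world users per hour and D is the average visit duration in
# minutes. Rounding down matches the paper's figures.
def virtual_users(real_users_per_hour: float, visit_minutes: float) -> int:
    return int(real_users_per_hour / (60 / visit_minutes))

for v in (45, 100, 200):
    print(f"{v} real-world users -> {virtual_users(v, 5)} virtual users")
# Prints 3, 8, and 16 virtual users, matching the experimental runs.
```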
Approach
Analyzing the transaction response time metric over time surfaced a profile of how well the research portal web site performed. The author believes the transaction response time can shed light on the end user’s perceived response time when displaying the portlets page.
RadView WebLOAD and Analytics are unobtrusive performance measurement tools; by “unobtrusive”, the author means that no WebLOAD or Analytics software is installed on the research portal web site itself. The WebLOAD component was configured to use a ramp-up load profile. Each experimental run executed for 10 minutes after ramping up and reaching a steady state. The three runs represent small, medium, and large load-test experiments.
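A minimal sketch of this measurement discipline, assuming a fixed ramp-up length (the paper does not state one): samples collected during ramp-up are discarded, and only the 10-minute steady-state window is summarized.

```python
# Summarize response times over the steady-state window only,
# discarding ramp-up samples. The ramp-up length is an assumption.
from statistics import mean

RAMP_UP_SECONDS = 120        # assumed ramp-up length
STEADY_STATE_SECONDS = 600   # 10-minute steady state, per the paper

def steady_state_mean(samples: list[tuple[float, float]]) -> float:
    """samples: (timestamp_seconds, response_time_seconds) pairs,
    timestamps measured from the start of the run."""
    window_end = RAMP_UP_SECONDS + STEADY_STATE_SECONDS
    in_window = [rt for ts, rt in samples
                 if RAMP_UP_SECONDS <= ts < window_end]
    return mean(in_window)
```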
Evaluation
The author’s evaluation of the data gathered with RadView WebLOAD and Analytics is based on the approach described in the sections above. The analysis centers on processing the session-file data captured during each experimental run. The author captured performance metrics such as Response Time, Login Time, All Portlets Display Time, CPU Idle, and Memory Free. The research portal web site performed well: ‘All Portlets Display Time’ did not degrade in any of the experimental runs. The high ‘Login Time, Average’ for the 45 real-world-user run is attributed to an unknown network or system event during that run and was treated as an outlier. The data was collected, stored, and analyzed in a centralized repository, where the Analytics tool performed regression analysis on the raw session data and reduced it.
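As an illustration of how an anomalous measurement such as the high ‘Login Time, Average’ might be set aside, here is a simple z-score outlier check; the method, threshold, and sample data are assumptions, as the paper does not describe how the outlier was identified.

```python
# Simple z-score outlier check (illustrative; the paper does not
# specify the method used). Values more than z_threshold standard
# deviations from the sample mean are flagged.
from statistics import mean, stdev

def flag_outliers(values: list[float], z_threshold: float = 2.0) -> list[bool]:
    m, s = mean(values), stdev(values)
    return [abs(v - m) / s > z_threshold for v in values]

# Made-up login times (seconds) with one anomalous spike:
login_times = [0.85, 0.9, 0.88, 0.92, 0.87, 0.91, 0.89, 0.9, 0.86, 4.2]
print(flag_outliers(login_times))  # only the 4.2 s sample is flagged
```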
Conclusion
Georgia Tech has standardized its performance testing on RadView WebLOAD and is not looking back. WebLOAD helps keep the large university’s ERP and portals live 365 days a year.
991 Highway 22 West, Suite 200
Bridgewater, NJ 08807
908-526-7756
Email: sales@radview.com