Testing Project Info
- Testers: Thousands (simulated)
- Geographic Coverage: Global
- Testing Type: Simulated Load
- App Type: Web
- Duration: Two tests
- Location: New York, NY
- Industry: Marketing Analytics
- Company Size: 30+ employees
- Engineering Team: 5+ members (software developed in-house)
Located in the heart of New York City, RevTrax helps companies from all over the world track their digital advertising campaigns. Since 2006, the company’s unique SaaS platform has provided a one-stop shop for creating, distributing and tracking promotions that lead to in-store sales.
Over the years, as the application grew in complexity, so too did the number of users. This trend eventually prompted RevTrax CTO Greg Hansen to search for improved ways to conduct load testing of their web application.
“We really wanted an objective third party to come and verify how much activity we could handle and where the bottlenecks were,” said Hansen. “uTest turned out to be a great choice for us. They were reputable, had a good price, and took the time to understand our goals and objectives.”
This case study takes a closer look at how RevTrax used uTest for their latest rounds of simulated load testing. Along the way, we’ll show how uTest met RevTrax’s testing requirements, the role of uTest’s project management team, the quality of its test reports, and more.
RevTrax: Defining Load Test Requirements
Working with their uTest project manager (there’s one assigned to every account), Hansen and his team started by defining their load testing requirements. After a quick consultation, Hansen and his project manager decided that uTest’s simulated load testing services would be the best option in this particular instance. This service, which involves thousands of simulated users, is commonly used by web companies looking to see how their app performs under unusual amounts of stress (i.e., traffic).
Once RevTrax had settled on a plan, the specifics came into focus.
“We used what’s called a step-up approach,” said Hansen. “We started out by simulating a typical day and then stepped up the number of concurrent users until we knew how much we could tax the system. We also ended up using a lot of use cases – simulating promos, printing and purchases – to see where and when things slowed down.”
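The step-up approach Hansen describes can be sketched in a few lines of Python. This is a minimal illustration, not RevTrax’s or uTest’s actual tooling: the `simulated_request` function is a hypothetical stand-in for one user transaction (in a real test it would issue an HTTP request against the app’s promo, print, or purchase endpoints), and the step sizes are arbitrary example values.

```python
import time
from concurrent.futures import ThreadPoolExecutor


def simulated_request() -> float:
    """Stand-in for one user transaction (e.g., generating a promo code).

    A real load test would make an HTTP request here; we sleep briefly
    to mimic server work and return the observed latency in seconds.
    """
    start = time.perf_counter()
    time.sleep(0.01)  # pretend the backend takes ~10 ms
    return time.perf_counter() - start


def step_up_test(steps=(10, 50, 100), requests_per_user=5):
    """Run each load step in turn and record the mean latency observed.

    Each step simulates `users` concurrent users, stepping up the
    concurrency until the point where latency starts to degrade.
    """
    results = {}
    for users in steps:
        with ThreadPoolExecutor(max_workers=users) as pool:
            latencies = list(
                pool.map(lambda _: simulated_request(),
                         range(users * requests_per_user))
            )
        results[users] = sum(latencies) / len(latencies)
    return results


if __name__ == "__main__":
    for users, mean_latency in step_up_test().items():
        print(f"{users:>4} concurrent users -> "
              f"mean latency {mean_latency * 1000:.1f} ms")
```

Comparing the mean latency across steps shows where the system begins to slow down, which is exactly the bottleneck-finding goal Hansen describes; a production test would also step through distinct use cases (promos, printing, purchases) rather than a single request type.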