Testing Project Info
- Testers: 20+ per release
- Geographic Coverage: Global
- Testing Type: Live Load
- App Type: Web
- Duration: Weekly
- Location: San Francisco, CA
- Industry: Gaming
- Company Size: 35+ employees (4-person QA team)
- Dev Methods: Agile
An industry-leading publisher and accelerator of social games, Rocket Ninja combines a mix of proprietary technologies and top talent to help companies enhance third-party games and maximize revenues. Focusing primarily on “mid-core” gamers seeking a more sophisticated experience, the company leverages its game design expertise and 3D browser-based engine, Shr3d™, to form lasting connections between players and the games they love.
But before these games could be launched on social networking sites like Facebook, Howard Lo – the company’s Head of Quality Assurance – needed to expand the company’s testing coverage. Specifically, he needed more real-world testers to verify the load and performance capabilities of their gaming platform.
“We didn’t have enough testers who could be online simultaneously, and we needed many of them to test our live event feature,” said Howard. “uTest was able to add a team of testers for us in a very short period of time, and the rest was history.”
This case study demonstrates how uTest provided Rocket Ninja with real-world “live event” testing using select members of its global community. Additional topics include the role of uTest’s project management team, bug reporting, feedback and other aspects of the uTest customer experience.
Rocket Ninja: Testing Goals and Objectives
For this specific assignment, Howard needed to extend testing coverage for the company’s Wrestler: Unstoppable™ product – a game that lets users chop, pile-drive and pin their social media friends (all in good fun, of course). Since the game consists of several tournaments, with multiple users participating simultaneously, Howard needed insight into the load and performance of the multi-player format while the app was under heavy stress.
“When the app is deployed, there are going to be thousands of users participating in tournaments at the same time, and if our engine can’t hold up, there’s going to be a lot of angry users,” said Howard. “So we needed to know more about support for concurrent users, how the engine was responding and how well the platform recorded results. This is hard to do with a small in-house team of testers.”
To start, Howard would get set up with a select team of professional testers from within the uTest community. With the help of his dedicated uTest project manager (there’s one assigned to every customer account), Howard would upload a series of test cases to the uTest platform – tasks designed to mimic the actions of real-world users. To ensure better insight, Howard would also create a chat room to easily communicate with testers.
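To illustrate the kind of concurrent “live event” check such test cases are designed to exercise, here is a minimal, hypothetical Python sketch – not Rocket Ninja’s or uTest’s actual harness – in which many simulated users act at the same time and the harness counts how many of their results were recorded:

```python
import concurrent.futures
import random
import time

def simulated_user(user_id: int) -> bool:
    """Stand-in for one scripted test case: join a tournament, play,
    and verify the outcome was recorded. In a real run this would call
    the game's API; here it simply reports success."""
    # Stagger start times slightly, as real testers would.
    time.sleep(random.uniform(0.0, 0.01))
    return True

def run_concurrent_load(num_users: int = 50) -> int:
    """Run all simulated users in parallel and count recorded results."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=num_users) as pool:
        results = list(pool.map(simulated_user, range(num_users)))
    return sum(results)

if __name__ == "__main__":
    print(run_concurrent_load(50))
```

A real live-load session replaces `simulated_user` with scripted calls against the staging platform, so any gap between users started and results recorded surfaces immediately.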