Testing Project Info
- Testers: 12-20 per release
- Geographic Coverage: UK only
- Testing Type: Usability
- App Type: Mobile
- Browser/OS: iPhone, Android
- Location: London, England
- Industry: Health care
- Company Size: Enterprise
- Product: Symptom Checker app
NHS Direct, an extension of the UK’s National Health Service, is a service that gives patients the ability to assess symptoms, get self-care advice, or have a nurse call them back if their symptoms require it. Originally designed to be used via landline phones, the service eventually migrated to the web, and most recently, to smartphone devices.
Enter Charlie Young. As the lead consultant for NHS Direct, Charlie was responsible for creating the company’s mobile strategy, managing the development consortium, and, of course, overseeing testing of the application across the iPhone and Android operating systems. Lacking the in-house resources needed for this type of project, Young turned his attention to uTest and their survey-based usability testing services.
“The safety of the application wasn’t an issue at that point,” said Charlie, who mentioned that the clinical requirements had already been met. “What we really needed to know was how humans would use this app. People use smartphones in many different ways, so we needed to make sure the user experience lived up to expectations – and that’s where uTest provided such value.”
This case study will detail the usability test cycles run by NHS Direct – covering the process of defining requirements, creating usability surveys and other aspects of uTest’s usability services.
Test Cycle Set Up: Defining Requirements, Selecting Testers
To start, Charlie and his team would explain their testing goals to their uTest project manager, who would select a test team that met their geographic criteria (i.e. located within the UK) and help them devise a test plan or “user journey” that would yield the highest level of feedback.
The first test cycle, as Charlie explained, would begin with testers submitting some basic information about their hardware and software, including handset model, operating system and Wi-Fi connection. From there, testers were asked to browse the application without any stated requirements, meaning they were free to explore the app at their own discretion. Once they had a handle on the basic functionality, they were then asked a series of questions which touched upon areas such as:
- User interface (e.g. scrolling, vertical view, physical layout)
- Navigation (e.g. awareness of location within the app, ease of finding what you need)
- Results (e.g. whether results were easy to find and understand)
- General feedback (e.g. what they liked or disliked, and whether they would recommend the app to a friend)
“The level of feedback was incredibly high,” said Charlie. “The testers showed a lot of credibility.”