Successful UX Tests: How to Write a Test Script
If you want to conduct a successful UX test, one of the most important steps is writing the test script – without a proper script, your results will be useless.
Before you can start writing the actual script, you first have to decide which testing method you want to use. This is because the test script will differ depending on whether you want to do a Usability and UX Study, a Quantitative Usability Study, a Competitor Analysis, or Remote Interviews, for example.
In our masterclass in partnership with Business Reporter, our CEO and Co-Founder Philipp Benkler talks about the importance of finding the right testing methods, figuring out the right target group, and creating a firm test plan.
Get a first glimpse in the following video:
It’s likely you’ve already come across one of our previous blog posts in this series:
How to Write a Perfect Test Script
In this article I want to draw your attention to the third step: writing the test script.
It’s important to define clear and unbiased use cases and enrich them with qualitative and quantitative questions. It’s equally important to explain exactly what you want testers to do and which steps need to be taken (of course you can also leave this point open, depending on the scope of your test).
To make it even clearer what’s important when writing a test script, I’d like to illustrate this with an example from a real customer. In a Usability Study for a German mobile provider, the goal of the test was to check the ordering process and find out how to improve the conversion rate.
20 testers checked the entire process from choosing a product to finalising the order and were asked to give extensive feedback on the process afterwards. The mobile provider wanted to identify weaknesses and disruptive factors in the process to determine recommended courses of action.
Let me show you how our test cases are structured.
Before You Jump In: Give Users a Scenario
Surprisingly, you can’t just start by telling the testers what to do. You have to give them an idea of what’s expected of them and what situation they should imagine themselves in while conducting the test.
Sometimes this can be easy – in our example we asked testers to look for a new mobile phone tariff, something we knew all of the testers had done before. But sometimes, scenarios are a bit tricky and need to be described in more detail.
That’s why our testing experts from the Project Management team always write a short introduction for the testers. In this case, it was rather short, because we knew we had testers who wouldn’t need a lot of guidance.
This is what it looked like:
Are you interested in using your feedback to improve the usability of products and processes? Then this test is the right one for you!
In this usability study, we will ask you to go through the complete ordering process of a mobile phone provider. The ordering process is already live, but you don’t have to provide any personal data or payment information and the test ends before the order is complete.
The test will take between 45 and 60 minutes and it is all about your personal impression!
Have fun testing!
Writing the Test Script: How to Define Tasks
According to the Nielsen Norman Group, there are three things you need to consider when writing the tasks for your test script:
- Make the task realistic
- Make the task actionable
- Avoid giving clues or describing steps
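As a purely illustrative sketch (the phrase list and function name are hypothetical, not part of any Testbirds tooling), the third rule can even be approximated as a simple wording check that flags task descriptions which reveal where to click:

```python
# Hypothetical sketch: flag task wording that gives testers clues,
# such as UI locations or exact click paths.
CLUE_PHRASES = [
    "upper left", "upper right", "top of the page",
    "in the menu", "scroll down to",
]

def find_clues(task_text: str) -> list[str]:
    """Return the clue phrases found in a task description."""
    lower = task_text.lower()
    return [phrase for phrase in CLUE_PHRASES if phrase in lower]

# A task that breaks the rule by pointing testers to the element:
task = ("Choose one of the tariffs. Then click on 'To Tariff' "
        "in the upper left corner.")
print(find_clues(task))  # → ['upper left']
```

A check like this is no substitute for an experienced project manager reading the script, but it shows how concrete the "no clues" rule can be made.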
Let me show you how we did this in the test for the mobile phone provider by walking through the tasks.
You have decided to buy a prepaid mobile phone plan and have chosen TheBestProviderEver as your provider. Please open the page http://www.thebestprovider.com and have a look at the tariff overview on the start page. Choose one of the tariffs. Then click on “To Tariff” and take a look at the product information provided.
Why is the task realistic? Because it leaves the decision up to the testers: they can choose any of the available phone plans. This is important because we want real insights into the thoughts of the target group. You might find out, for example, that your target group doesn’t need the super fancy unlimited plan with the newest iPhone because it’s out of their price range. And that’s knowledge you would miss if you set constraints such as telling testers which tariff to choose.
Why is it actionable? We always ask testers to do things – not just talk about how they would do them. We also ask them to take screenshots during the testing process so we can actually see what they did and how they completed the given task. To make this even more insightful, we offer so-called Remote Usability Videos, where the tester’s screen is recorded while they work on their tasks. This gives a deeper understanding of user behaviour, thoughts, and the problems testers have to overcome.
How does it avoid giving clues? You might feel the urge to help testers along by telling them where to find things – for example, we could have written “click on ‘To Tariff’ in the upper left corner”. But that would ruin the test case, since that’s exactly one of the things we want to find out during the test: do users find what they’re looking for without external help? Let me give you some more examples from our test, and you can check for yourself whether we managed to follow the three rules 🙂
After you have made sufficient enquiries about the tariff you have chosen, you decide to book it. Click on “Order now” and enter all mandatory data. Important: you don’t have to enter your real data in the mandatory fields – feel free to use imaginary data. However, make sure you enter a valid date of birth (age 18 or older).
You can find a dummy IBAN for the test under the tab “Access data”. Please also be aware that our customer has access to your screenshots and thus to all data visible in them.
As you can see, our project managers create realistic scenarios so that every tester can figure out what they are supposed to do. They also leave testers enough freedom to complete the task as they please – not giving them a specific tariff to choose, for example. Additionally, we asked open and closed questions for every task, such as “Can you find all the product information you would expect here?” or “How would you rate the structure of this site? (1=very good; 5=very bad)”.
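Closed questions on a fixed scale like this are easy to summarise across testers. As a minimal sketch (the answer values and function name below are made up, not real study data), the 1 (very good) to 5 (very bad) ratings could be aggregated like so:

```python
from statistics import mean

def summarise(ratings: list[int]) -> dict:
    """Summarise answers to a closed question on a 1-5 scale
    (1 = very good, 5 = very bad)."""
    assert all(1 <= r <= 5 for r in ratings), "scale is 1-5"
    return {
        "mean": round(mean(ratings), 2),
        # share of testers who rated "very good" or "good" (1 or 2)
        "share_good": sum(r <= 2 for r in ratings) / len(ratings),
    }

# Hypothetical answers from 8 testers:
ratings = [1, 2, 2, 1, 3, 2, 1, 2]
print(summarise(ratings))  # → {'mean': 1.75, 'share_good': 0.875}
```

Pairing a summary like this with the answers to the open questions is what turns individual tester feedback into concrete recommendations.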
Test Script Design: Don’t Forget the Feedback Channel
Besides the fact that your test script has to clarify what testers should and shouldn’t do, there’s another point that’s often missed.
What if testers struggle during the test or have additional questions?
Most people underestimate the time and resources it takes to manage testers during an ongoing test. You need experienced people who monitor the test, implement an easy-to-use feedback channel (such as email), and not only give testers the opportunity to ask questions but actively encourage them to do so.
Do you want to know more about conducting successful UX tests? Then check out our masterclass with Philipp Benkler.
About the author
Content Marketing Manager
When Simone is not working on superb texts for Testbirds, she and her horse live it up on the tournament areas in Bavaria.