HDX guerrilla usability testing
Goal: Observe users trying to accomplish tasks on HDX, without assistance, to discover what parts of the UI and UX need improvement.
User research vs Usability testing: User research comes before implementation, and predicts what will work for users; usability testing comes after implementation, and tests how well the research and implementation met their goals. The testing provides input for the next iteration.
Who: Anyone on the team can conduct tests, anywhere, any time, as long as they can take a video with a mobile phone (and have the patience to stay quiet while the user struggles with the site). The term comes from Martin Belam's article Changing the Guardian through guerrilla usability testing. Most of our insights will emerge after only a small number of tests on the same task.
How to test
A good test should be short, perhaps 5 minutes at most:
- Choose a simple, self-contained task.
- Describe the goal to the user, but give no instructions on how to accomplish it.
- Seat the user in front of a computer displaying the HDX home page (or another suitable starting page).
- Take a video of the user attempting to complete the task. Do not provide help (you can explain how it should have worked after the test is complete).
- Ask for general feedback.
Events that humanitarians and/or data people attend are good opportunities for testing: we can set up a laptop and pull people aside during breaks. We can also grab people in coffee shops, airport departure lounges, or a UN cafeteria. With committed users, we can schedule a longer series of tests; in that case, the tester shouldn't provide any explanations until all the tests are complete.
Sample single-test script
Here's a sample user test for finding data on the site.
Approach: Hi, I'm working with a UN project to share data about humanitarian crises, and we're looking for feedback to help improve our web application. Would you be interested in sitting down for five minutes and trying a couple of simple tasks? We'd like to record the test, but it will be for internal use only.
(If user accepts.)
Set up: Thanks for agreeing to help. I'm going to give you a short task, then I'd like you to try it without any help. Don't worry if you can't figure it out: that's good feedback for us.
(Start video)
Background: If you'd like, you can tell us about yourself in a few words—what you do, how technical you are, etc.—but it's entirely optional.
(User gives background; don't let it go too long.)
Task: You need to find and download data about water access in Mali, a country in West Africa. Here is your starting page (note: could be the HDX home page or the Mali country page): please jump in and tell us what you find.
(User takes 2–4 minutes to attempt task. If the user gets too frustrated to continue, try "It looks like we have work to do on our interface. How do you think it should have worked?")
Completion: Thank you very much. Please enjoy this doughnut as a token of our appreciation. Do you have any general comments you'd like to add?
(Stop video)
(If the person didn't complete the task and wants to know how it could have worked, offer an explanation. Provide contact info for followup if the user is interested.)
Follow up
Once we have about 5 of these videos for the same task, we make them available internally for the entire team to watch, then have a call to discuss what we learned and what changes we should schedule for the site. People with different specialities (coding, design, user research, communications, data team, etc.) will likely take away different insights from watching the videos.
Repeat the entire process regularly.