Usability Testing Basics:

How does a basic user test go down?

Overview: Sure, you know that you're supposed to go and test some users. But how? What is the actual flow of events in the user testing process? This document gives you a nice basic outline. Of course, you may want to add or modify steps in your particular scenario, but start with this model as your "default" process.

Phase 0: Preparation

Before participants arrive, you'll need to have everything prepared and ready to go. Nothing is more embarrassing than having the people there while you're still figuring out why something isn't working right! Here are the basic prep items:

  1. Informed Consent forms printed and ready to sign.
  2. Lab manuals (the workbooks of tasks) printed, with working pencils.
  3. The app you're testing installed, launched, and in its starting state.
  4. Camera positioned and checked; lapel mics and audio tested.
  5. Recording software up and confirmed working.
  6. Your own note-taking materials ready in the observation room.

Phase 1: When participants arrive

Ok, so you have everything prepared and set up, and the participants are knocking at the lab door. Time to rock-n-roll! Here is the basic flow:

  1. Welcome the participants. Be extra nice!
  2. Make them each read and sign the Informed Consent form that you have ready.
  3. Explain a little about how it will go down: we'll put you in there in front of the app we're testing, you'll get a workbook of tasks to work through using the app, it should take around X minutes...but there is no time limit, it's not a race, just take your time and work through things. It's super-important to emphasize that (1) YOU, the designer, are the dummy: you've made a feeble attempt at designing a great app, but you really need their help to get a real end-user perspective; and (2) there is no "wrong" and no "screwing up". Any difficulty or weirdness that happens is YOUR (the designer's) fault...by just being patient and trying to get through despite the poor interface design, they are really helping you. The idea is to fight the psychopathology that users tend to have that they are "just too dumb". Make it clear that you are trying to learn from them.
  4. Ok, now take them into the lab room, get them seated, attach the lapel mics, make sure that everything (video, audio, your app) is up and working well. Make sure the camera didn't get bumped out of position. Start the recording rolling now, before you leave the room.
  5. Finally, give them the lab manual and tell them they can start. Make sure you emphasize that you want them to actually write down things in the lab manual as they go. Make sure they have a working pencil. Then leave the room, close the door. You're off and running.
  6. Now observe the test in progress. Take notes on locations: it helps to jot the minute/second within the recording (look at the recorder software screen) next to each note, so you can find the place later when you go back and do close analysis (a tiny note-logging sketch follows this list). You should be quiet and just observe. As much as you'd like to, DO NOT jump up and go in there to help them. There are only a few conditions under which you should enter the room: when the test is over (obviously); when something really bad happens, e.g., something crashes or they knock over the camera; and when they get massively stuck and have been thrashing in a dead end for a long time. In that last case, do not tell them how to do it or give hints...just move them on to the next task in the manual.
  7. When the test ends, get in there FAST, before they stand up. They will forget they are wired to lapel mics, and you want to detach those before they run off and break something. Get them unhooked, take them into the observation room (104 proper), and take care of loose ends. If you are having your users do a post-test interview or survey, this is where you'd fit it in. Otherwise, just thank them and send them off.
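
A quick aside on the note-taking in step 6: if you'd rather type than scribble, a small script can stamp each note with the elapsed time for you. Below is a minimal sketch in Python; everything about it (the file name, the prompt) is just an illustration, not part of the lab setup. Start it at the same moment you start the recording so the stamps line up with positions in the video.

    # note_logger.py -- minimal timestamped note-taker (hypothetical helper).
    # Start it when you start the recording so elapsed times match the video.
    import time

    def main():
        start = time.monotonic()  # reference point = recording start
        print("Note logger running. Type a note, press Enter; Ctrl-D to quit.")
        with open("session_notes.txt", "a") as log:
            try:
                while True:
                    note = input("> ")
                    minutes, seconds = divmod(int(time.monotonic() - start), 60)
                    log.write(f"[{minutes:02d}:{seconds:02d}] {note}\n")
                    log.flush()  # don't lose notes if the session ends abruptly
            except EOFError:
                pass

    if __name__ == "__main__":
        main()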

Phase 2: Analysis and write-up

This phase involves the deeper analysis of the test: figuring out where the breakdowns were and (!!) what it was about your UI that caused them. Some pointers:

  1. Pop your recordings on a USB stick or ftp them to your laptop. Emailing doesn't work well...they are too big. Check that the copies made it over intact (see the copy-and-verify sketch after this list).
  2. There is no need to put off this stage until all your pairs are done and tested. You can start right away, as soon as you have data from the first pair!
  3. Now do your analysis. Here's the basic process that you repeat for each pair/session: each team member works through the recording, using the timestamped observation notes to jump to the trouble spots; for each breakdown, note when it happened, what went wrong, and what it was about your UI that caused it; and pull out clips of the key moments to show the team. (One possible way to structure these findings is sketched after this list.)
  4. Come together as a team to discuss what each of you has found. Talk through each analysis, review some clips, and try to recognize the same issue occurring across multiple pairs. Consider the potential solutions, and decide what changes (if any) you are going to make to improve things. If there are several, prioritize by severity.
  5. Go off and make the changes. This isn't pure science, it's practical; you don't need to keep "conditions unchanged throughout the experiment"!! So if you see a major problem, especially one that several test pairs have hit, go ahead and change the software midstream...you may as well get good data for the upcoming tests!
  6. Write up your results and prepare your usability presentation.
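
On item 1, recordings are big enough that a botched copy is easy to miss, so it's worth verifying the transfer. Here's a minimal sketch in Python that copies recordings and checks each copy with a hash; the paths and the .mp4 extension are assumptions, so adjust them for your setup.

    # copy_recordings.py -- copy recordings off the lab machine and verify
    # integrity. Source/destination paths are hypothetical examples.
    import hashlib
    import shutil
    from pathlib import Path

    def sha256(path: Path) -> str:
        # Hash large video files in 1 MB chunks to keep memory use small.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    src = Path("/lab/recordings/pair03")        # hypothetical lab-machine path
    dst = Path("/media/usb/recordings/pair03")  # hypothetical USB-stick path
    dst.mkdir(parents=True, exist_ok=True)

    for video in src.glob("*.mp4"):
        copy = dst / video.name
        shutil.copy2(video, copy)  # copy2 also preserves timestamps
        assert sha256(video) == sha256(copy), f"corrupt copy: {copy}"
        print(f"ok: {video.name}")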
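
And on items 3 and 4, it helps to keep each person's findings in a uniform shape so the team meeting can merge them and spot the same issue across pairs. Here's one possible structure, sketched in Python; the fields, the 1-4 severity scale, and the sample entries are all illustrative assumptions, not a prescribed format.

    # breakdown_log.py -- one possible structure for per-session findings
    # (field names, severity scale, and sample data are illustrative only).
    from dataclasses import dataclass

    @dataclass
    class Breakdown:
        session: str    # which pair/session
        timestamp: str  # mm:ss into the recording, from your observation notes
        task: str       # which lab-manual task they were on
        what: str       # what went wrong
        ui_cause: str   # the part of YOUR UI that caused it
        severity: int   # 1 = cosmetic ... 4 = blocks the task

    findings = [
        Breakdown("pair01", "04:12", "task 2", "couldn't find Save",
                  "icon-only toolbar", 3),
        Breakdown("pair02", "06:40", "task 2", "clicked the wrong icon",
                  "icon-only toolbar", 3),
        Breakdown("pair01", "11:05", "task 5", "misread the dialog wording",
                  "confirm dialog text", 1),
    ]

    # Worst problems first; the same ui_cause showing up across pairs is a
    # strong signal the problem is real, not a one-off.
    for b in sorted(findings, key=lambda b: b.severity, reverse=True):
        print(f"[sev {b.severity}] {b.session} @ {b.timestamp}: "
              f"{b.what} <- {b.ui_cause}")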