Usability Testing Module
Version: 8 August 2005
This version of the Usability Testing module is based on the original format taught by Mark Ardis at the Rose-Hulman Institute of Technology. In this format, the schedule included four lectures per week for two weeks. The site for the course as taught during the Spring 2005 semester is http://www.rose-hulman.edu/class/csse/csse376/. The relevant lectures are those from April 14 through April 26. [NOTE: We also plan to design two variants of this module. One variant will be a one-day lecture covering the most important points. The other variant will be an intermediate version: more than one lecture but fewer than eight.]
This module includes materials for teaching the Usability Testing module as part of a two-week block consisting of 8 lectures. After reading through these teaching notes, use the schedule table to work methodically through the lecture notes, quizzes, and homework assignments. The schedule table shows the mapping between the materials in this module and the lecture meetings of the original format.
The 2-week version of the Usability Testing module is intended to give the students practical experience with performing usability testing. They will learn the importance of this type of testing, how to design an effective test, how to plan and execute such a test, and how to report the results of testing. This hands-on approach to the material can help students understand the challenges and rewards of carefully planning and executing usability testing.
This 2-week version of the Usability Testing module includes the following materials (which you should be able to click and reach if the materials are downloaded and organized as recommended in the README-FIRST file):
- Lecture notes (seven PowerPoint lectures)
- Quizzes
- We recommend giving out the quizzes at the beginning of the lecture for students to fill in as the lecture proceeds. This is a useful teaching tool for helping students focus on the points from the lecture.
- Homework assignments
- includes examples, sample solutions, and example grading comments
- Extra materials useful in preparing for and conducting the various activities.
We have identified two texts as viable choices for supporting this module. Both are excellent books.
During the Spring 2005 offering of the course, Mark Ardis used the textbook by Rubin. He chose the Rubin textbook because it is shorter and more direct. He has never used the Dumas and Redish text, so he cannot offer further advice to guide the choice between the two.
- Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests
Jeffrey Rubin
Wiley; 1st edition (April 15, 1994)
- A Practical Guide to Usability Testing
Joseph S. Dumas and Janice C. Redish
Intellect, Ltd (UK); Revised edition (October, 1999)
Preparation and Follow-up for Teaching the Module
These are tasks the instructor must complete in order to teach this module using the same format as the original. The key tasks are to adapt the materials, set up a demonstration, select a product for the team-run testing sessions, find volunteers to serve as participants in the team-run sessions, set up a schedule for the team-run sessions, find someone to operate the video recorder, prepare the setting for each of the sessions, perform the appropriate follow-up, and grade the results.
Each of these items is elaborated below:
- Adapt the lectures, quizzes, and homework assignments to fit your format
- In all but one of the original lectures, the instructor included applicable cartoons at one or more points during the lecture to provide an opportunity for amused reflection. Due to copyright restrictions, we do not include cartoons as part of this module.
- Set up the demonstration of a usability testing session:
- Prepare the product and a set of tasks to be used during the demonstration; the examples here are based on an Excel spreadsheet prepared with data from conducting the Fitts Experiment given elsewhere in this module.
- Prepare the spreadsheet or other file that will provide the basis for the demonstration. Our sample file gives the Fitts Experiment data (in extras/ both as an Excel file and a PDF file).
- The task list is the set of instructions for the test subject to follow during the demonstration session. While there is no standard format, a numbered list of steps is common. The task file (in the extras/ folder as chartingTaskList.html) gives the tasks for the sample spreadsheet demonstration: create two charts, a bar chart and a line chart, from the spreadsheet.
- Each student receives copies of two forms to fill out during the demonstration:
- The Data Log form is used to record the time at which each event occurs and errors that the test subject makes. As an example, we include a data log for the Charting demonstration (in the quizzes folder in both Word and PDF form).
- The Test Monitor Evaluation form is used to rate the performance of the test monitor under the second and third conditions described below. As an example, we include an evaluation form for the Charting demonstration (in the quizzes folder in both Word and PDF form).
- Recruit a volunteer to work through the tasks as the students observe. The volunteer should be prepared to remain through the full class period.
- The test monitor will be either the instructor or a second volunteer. The test monitor will run the testing three times, in three different roles:
- an effective test monitor,
- an overly controlling test monitor, and
- an uninvolved, somewhat distracted test monitor.
- Conduct the demonstration during a single class period (the third lecture of the eight we propose in this module).
- Select a product to test.
- Under the original format, the course was set up to use the RoseyCalendar project, which we plan to offer as a separate module within SWENET. The extras/ folder has a sample file in both Word and PDF that gives a list of twenty tasks to complete with the RoseyCalendar.
- Whatever the product, it must be easy to learn, since the participants brought in for the team-run testing sessions will generally have had no prior training with it when the test is performed. At the same time, the product should not be so familiar that some test participants come in as complete experts.
- It is not necessary to provide the source code for the product, since the students do not need to modify it.
- Recruit participants for the team-run testing sessions. This includes the following steps:
- Ask for volunteers from various sources. It is a good idea to include test participants from disparate groups, such as secretarial staff, faculty, and students. It is a good idea to provide incentives, for example, free food or a gift certificate. (Be sure to check university regulations to ensure that whatever the incentive, it falls within university guidelines.)
- Schedule the volunteers so that each testing session includes a mix of participants (e.g., one secretary, one faculty member, and one student).
- Remind the volunteers of their appointments the day before the session for which they are scheduled.
- After the session is complete, send a follow-up questionnaire to each volunteer. The responses can assist the instructor in grading the performance of students. We include a sample follow-up note in extras/ as volunteer-follow-up.html.
- Send a thank-you note to the immediate supervisor of each volunteer (or the advisor for student volunteers). We include a sample follow-up note in extras/ as supervisor-follow-up.html.
- Set up the team schedule for the team-run testing sessions.
- In the original format, the team-run testing sessions were conducted during two lectures. In each testing session, two teams were scheduled: one to conduct the usability test and one to observe. With four teams, each team attended one lecture and participated in two testing sessions: once as the team conducting the usability test and once as the team observing.
- Arrange to have the team-run testing sessions videotaped.
- Each team-run session should be recorded on a separate videotape (30-45 minutes long), so that the team conducting the test can review their performance later.
- It is best to bring in an outside volunteer to operate the videotaping equipment so the instructor can focus on other aspects of the session.
- Instruct the video equipment operator to focus on the test participant and the computer monitor, rather than on the projected image of the computer.
- Ensure that the video equipment operator knows the schedule and will arrive a few minutes before class is to begin in order to get everything set up.
- Prepare the setting for the lectures when the team-run usability tests are conducted:
- Set up video projection of the computer used for the testing.
- If possible, place a microphone near the test participant to better capture the participant's think-aloud commentary during the test.
- Hand out Test Monitor Score Sheets to the members of the team that is observing (not conducting) the tests that day. You should use the same form to record your own observations. The Test Monitor Score Sheets can be adapted from the one used during the demonstration (associated with quiz #3, found in the quizzes/ folder in Word and PDF format).
- If desired, hand out the Test Monitor Rating Scale (found in both Word and PDF format in the extras/ folder).
- Bring food for the volunteer test participants.
- At the end of class:
- Collect the score sheets from the observing team.
- Give the testing team their videotape so that they can review their performance. Among other things, their homework assignment asks them to evaluate how well they performed their roles.
- After the team-run testing sessions are complete:
- After all teams have turned in their testing results, grade the performance of each testing team using:
- the comments from the team that was observing
- feedback from the test participants
- the videotape
- your own observations
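Since the demonstration spreadsheet is built from Fitts Experiment data, instructors unfamiliar with the underlying analysis may find a brief sketch helpful. The snippet below is a hypothetical illustration, not the analysis shipped with this module: the (distance, width, time) trial values are made up. It computes each trial's index of difficulty, ID = log2(2D/W), and fits Fitts' law, MT = a + b * ID, with a simple least-squares line.

```python
import math

# Made-up example trials: (target distance D, target width W, movement time in s)
trials = [
    (100, 50, 0.45),
    (200, 50, 0.60),
    (400, 50, 0.80),
    (400, 25, 0.95),
    (800, 25, 1.15),
]

# Index of difficulty for each trial (classic Fitts formulation)
points = [(math.log2(2 * d / w), t) for d, w, t in trials]

# Ordinary least-squares fit of MT = a + b * ID
n = len(points)
sx = sum(x for x, _ in points)
sy = sum(y for _, y in points)
sxx = sum(x * x for x, _ in points)
sxy = sum(x * y for x, y in points)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope (s per bit)
a = (sy - b * sx) / n                          # intercept (s)
print(f"MT = {a:.3f} + {b:.3f} * ID (seconds)")
```

A positive slope b indicates that movement time grows with task difficulty, which is the pattern students should see charted in the demonstration spreadsheet.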
Additional advice on teaching this material:
- It is a good idea to ask students to record how long they spend on each homework assignment. If possible, ask them to first estimate how long the assignment will take. They can report both figures at the top of their homework or submit the figures in an email.
- The quizzes are meant to be distributed at the beginning of the class and filled in during the class as the lecture progresses. Of course, this means that most students will get all the right answers. The purpose of the quiz is not to test their mastery of the material, but to focus their attention on the important topics of the day. Each quiz provides useful feedback to the students and to the instructor. For example, if several students provide the wrong answer to one of the questions, then the instructor might spend some time in a future class clarifying the topic where students were confused. A more complete explanation of this teaching strategy may be found in:
- M. Ardis and C. Dugas, "Test-First Teaching: Extreme Programming Meets Instructional Design in Software Engineering Courses," 34th ASEE/IEEE Frontiers in Education Conference, October 20-23, 2004.