Design Project 5: Usability Testing
Final Write-up: 6/5 (Wed) 11:59 PM
7% of your total grade
[Submission Link]
What do we do?
Now it’s time to show your working prototype to representative users and get their feedback. Find at least 3 users representative of the target population, let them test your prototype, and learn how you might improve your interface. Note that participants should not be students taking this class. We suggest that your testing process follow the three steps outlined below:
- Planning: Written protocol
You should prepare a written protocol – a planning document – before you run sessions. It is important to plan and rehearse in advance so that your participants receive a consistent experimental treatment, which in turn leads to valid and reliable data. A sketch of a possible protocol outline appears after this three-step list. Your protocol document should include:
- Instructions for preparation and setting up the testing environment: Anyone on your team should be able to set up and run the experiment just by referring to this document. Example information includes any URL to open, which web browser to use, example data to load, etc.
- Introduction and informed consent: Make sure you read this introduction exactly as written during each study session. For informed consent, we don’t require formal paperwork, but you should explain what information you’re collecting (e.g., photos or voice recordings) and get participants’ consent.
- Tutorial or training (if needed): If your tasks could benefit from participants’ familiarity with the interface, we advise adding a tutorial or a training session. It’s a good idea to use a simple, concrete example task to walk through the interface.
- Task list & instruction: What are the tasks participants should perform, and what instruction should they be given? This is another part that should be delivered exactly as written to all participants.
- Recording strategy: How will you record your observations and findings? We advise against video recording; instead, we recommend sketches, photos, voice recordings, screen captures, screen recordings, and written notes as you see fit. Also discuss what role each member of your team will play in each session: facilitator, observer, and optionally a tech person in charge of monitoring whether the system is working properly and data is being collected correctly.
- Questionnaires and interview questions: After the session, it’s a good idea to get qualitative comments about the task experience from participants. We suggest preparing a list of questions to ask in advance. For interviews, consider a semi-structured format: follow the initial plan you came up with, but ask additional questions based on participants’ responses.
- Debrief prompt: After the session, thank your participants and give them a short debrief about the experience and the project.
- Testing: Session observations
Before running actual sessions, do a couple of dry runs within your team or with your friends. Carefully follow the protocol you have written up and debug it. As you make observations, focus on usability issues and whether participants find the interface useful.
- Reflection: Usability lessons
What did you learn about the usability of your interface? Revisit your interface design, tasks, personas, POV, and user needs, and review if any of them needs revision or additional work.
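To make the planning step more concrete, here is a minimal sketch of what a protocol document might look like in Markdown. The structure mirrors the items listed above; every detail in it (the sample wording, setup steps, and role assignments) is purely illustrative and should be replaced with your own.

```markdown
# Usability Test Protocol: <team name>

## Preparation & setup
<!-- Illustrative details only; replace with your own -->
- Open the prototype URL in the agreed-upon browser and load the example data.
- Reset any state left over from the previous session.

## Introduction & informed consent (read verbatim)
"Thank you for participating. We are evaluating our interface, not you.
We would like to take notes and photos during the session. Is that okay with you?"

## Tutorial / training (if needed)
- Walk through one simple, concrete example task together.

## Tasks (read instructions verbatim)
1. Task 1: …
2. Task 2: …
3. Task 3: …

## Recording strategy & roles
- Facilitator: …, Observer: …, Tech person (optional): …
- Written notes, photos, and screen captures.

## Post-session questionnaire / interview (semi-structured)
1. …

## Debrief (read after the session)
"Thank you for your time! This session is part of a course project on …"
```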
NOTE: You may keep improving your prototype during this period. You can incorporate users’ feedback into your interface and keep iterating; the goal is to have the best version ready by the final presentation.
NOTE: To give you more time to prepare for usability testing and improve the UI, the DP5 studio will only require you to prepare a written protocol. In the studio, you will run pilot sessions of usability testing with peer students.
Your Report
- POV: Clearly state the final version of your project’s POV.
- Tasks: List three core tasks your prototype supports.
- Connection with final HMWs & Solutions: Explain how each task is linked to your final HMWs and solutions. If your HMWs and solutions have changed since the last report, use the final versions here and explain the changes in Updates from DP4.
- Updates from DP4: If your team revisited the previous DPs (e.g., HMWs, solutions) and made updates or improvements to your ideation results, briefly describe what has changed.
- Written protocol
- Session observations
- Participants: Who are they? How did you recruit them? Why are they representative target users? Add brief demographic and contextual information about each participant that is relevant to your interface.
- Use at least one photo or sketch for each participant and provide a summary description of each session (e.g., What was unique about this participant? What was the main takeaway from this participant’s session?).
- Usability lessons
- List at least 10 usability problems you discovered. Organize them by high-level task or theme, not by participant or by time, but mention which participants ran into each problem by referring to them as P1, P2, … (e.g., search results did not show the price information (P1, P3)). For each problem, indicate how critical it is: high, medium, or low. Finally, show how you plan to address each problem.
- High-level reflections: What did you learn overall from the user testing experience? What would you do differently for better results and insights?
- Studio Reflections: Summarize the feedback from the studio session, and mention how you addressed it or will address it later in the process.
- Plan for iteration: You’ll have an additional week to finalize your overall design process. Discuss as a team how to use this time most effectively, set a few concrete goals, and justify them.
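For reference, here is a minimal sketch of how the required sections above could be laid out in the Markdown report. The headings follow the list above; the example issue entry reuses the price-information example from the Usability lessons item, and its plan line is purely illustrative.

```markdown
# DP5 Report: <team name>

## POV
## Tasks
## Connection with final HMWs & Solutions
## Updates from DP4
## Written protocol
## Session observations
### Participants (P1, P2, P3)
## Usability lessons
<!-- Illustrative entry format -->
- Theme: Search
  - [High] Search results did not show the price information (P1, P3).
    Plan: show the price next to each search result.
## High-level reflections
## Studio Reflections
## Plan for iteration
```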
Grading
- Written protocol (30%)
- Is the preparation and setup information detailed enough?
- Is the introduction concise, to the point, and readable?
- Is the tutorial or training (if any) well designed?
- Are the task list and instructions clearly written and readable?
- Is the recording strategy carefully planned, in a way that can capture useful insights?
- Are the questionnaires and interview questions carefully thought out to elicit deep insights and participants’ thought processes?
- Is the debrief prompt well written, polite, and informative?
- Session observations (20%)
- Are the participants representative?
- Is the participant information descriptive and contextual?
- Are visual aids (photos or sketches) included?
- Is a summary of each session provided?
- Usability lessons (20%)
- Are at least 10 usability issues submitted?
- Are the usability issues described concretely and clearly?
- Are they organized by high-level task or theme?
- Is the level of criticality indicated for each issue?
- Are the high-level reflections thoughtful and insightful?
- Plan for iteration (10%)
- Is the plan for improvement reasonable and concrete?
- Are the goals well-justified?
- Studio Reflections (10%)
- Is all of the feedback received during the studio accurately captured in the summary?
- Is the summary of feedback well structured (e.g., according to high-level themes and/or criticality)?
- Studio Presentation (10%)
- Preparation and organization?
- Articulation and clear delivery?
- Effective use of visual aids?
- Time management?
Deliverables
Studio Presentation: In studio, your team will briefly explain your usability test protocol. Prepare a Google Slides presentation by editing the template slides in your team folder. The slides should be ready by 10:29 am on Tuesday, before the studio begins. This time, not every team member needs to participate in the presentation; instead, we will do a peer pilot run of usability testing in the studio.
Team Report: One report per team. Your report should be submitted as a zip file. The main report should be written in Markdown (please use the .md extension). Name the file with your team name and submit it via the submission link provided. You can submit multiple times until the deadline; we’ll use the most recent version for grading.