Understanding How College Students Manage Their Daily Activities, If at All.
JAN - MAR 2017
The market for time management tools is saturated, yet people still struggle to find something that just works. As part of a formative research study, I worked with Microsoft researchers to study how undergraduate students manage their time and tasks, and to understand their behaviors, motivations, and mental models.
Due to an NDA, only the research process is shown below.
Lab Usability Testing
Research to Enable Actions
Throughout the research, we kept our research questions in mind. From survey questions to data collection, we intentionally chose what to ask and what data to collect, making sure the research would produce actionable next steps and information to support design decisions in the long term.
Planning it out
At the beginning of the project, we discussed with the stakeholders how our research would be carried out. We created a study toolkit that included the following:
Consent form & gratuity form
Scripts used during facilitation
Data-logging / note-taking forms
In our kick-off meeting, we also discussed each member's strengths and areas to improve, expectations, and communication methods within the team. Clarifying logistics like these ensured smooth collaboration later on.
Collected primarily quantitative data
Received 61 responses within 4 days
Adjusted interview questions accordingly
When creating the survey, I made sure to
Keep the survey short
Use wording that is unambiguous and easy to understand
Compose questions that align with our research goals
I also ran a pilot survey to ensure the questions flowed well and made sense from the survey takers' perspective.
Usability Testing Sessions
We made sure to recruit the right group of participants for our testing sessions. We aimed to test with at least 5 participants, which is generally enough to uncover the main usability problems in a design. However, we recruited 8 participants to account for no-shows and last-minute cancellations, and we sent multiple reminders before each session to keep attendance high.
Semi-structured Interview Questions
10 prepared user tasks to finish
On the scheduled testing day, 5 participants showed up for our interviews and usability testing. Each session lasted around 1 hour in total. I moderated 2 sessions and took notes for 2 others.
When moderating a session, I let users do most of the talking. I encouraged them to "think aloud" so I could follow their thought process, and I stayed patient to let them figure out tasks on their own. When needed, I asked "why" to understand the motivations behind their behaviors.
Video capture of me moderating a testing session
After the testing, we consolidated our notes and analyzed the results. By studying the qualitative and quantitative data we collected, we found patterns in users' behavior.
Information in the table is dummy data for presentation purposes
S: Success; F: Failure; P: Partial Success
I proposed using success rate as a metric to evaluate the design. It is a fast and simple way to measure usability, and it comes in handy when communicating with stakeholders because the data clearly shows trends and the direction of next steps.
Nonetheless, I was aware of the limitations of success rate: it does not reveal "how" or "why". The qualitative information from the interviews and testing helped compensate for that.
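To illustrate how a success-rate metric can be tallied from S/F/P outcomes, here is a minimal sketch. The scoring scheme (partial success counted as half credit) is a common convention, not necessarily the one used in the study, and the task data below is dummy data in the spirit of the presentation table:

```python
# Task outcomes per participant: S = success, P = partial success, F = failure.
# Partial successes are scored as 0.5 here -- a common convention, assumed for
# illustration only; the actual study's scoring scheme may differ.
SCORES = {"S": 1.0, "P": 0.5, "F": 0.0}

def success_rate(outcomes):
    """Average score across all recorded attempts at a task."""
    return sum(SCORES[o] for o in outcomes) / len(outcomes)

# Dummy data: 5 participants attempting 3 tasks.
results = {
    "task_1": ["S", "S", "P", "F", "S"],
    "task_2": ["P", "S", "S", "S", "F"],
    "task_3": ["F", "P", "S", "P", "S"],
}

for task, outcomes in results.items():
    print(f"{task}: {success_rate(outcomes):.0%}")
```

A single percentage per task like this is easy to chart and compare across design iterations, which is why it communicates well with stakeholders, even though it says nothing about why a task failed.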
Based on our findings from the survey, interviews, and usability studies, we came up with a series of design recommendations marked with high, medium, and low priorities.
Overall, the Microsoft researchers, PMs, and designers were very pleased with our work. Details will not be discussed due to the NDA. At a high level, they found it particularly valuable to have someone outside their team provide independent findings that offered both validation and new insights for their ongoing research.
Throughout the project, I kept a weekly journal to document my learning and progress. Below, I have summarized a few important takeaways from this experience.
Always keep the research question in mind. A lot of work goes into research, and it is easy to get lost in nitty-gritty tasks and forget the ultimate goal. Whatever decisions we make, it is critical to ask ourselves: will this help solve the problem? Does this help achieve the goal? This takes conscious work, but it is essential for conducting valid research.
Document the process. Although it sounds trivial, proper and timely documentation will save a lot of headaches later on. Having a track record of the decisions that were made eliminates repeated arguments and thus helps push research forward.
When doing interviews or usability studies, it is helpful to have multiple backup plans. Participants very often behave differently in interviews and testing than researchers expect. In those cases, having alternative plans helps ensure we still get the most out of a testing session.
When presenting, consider your audience. We had two rounds of presentations at the end of the project: one in front of UW students, and one with Microsoft stakeholders. We tailored our presentations according to our audience. For students, we shared mostly our research process and high-level contents. For Microsoft stakeholders, we went much more into the details of our research findings and recommendations, with consideration of technical and budget constraints.