USABILITY TEST ON SAN FRANCISCO PUBLIC LIBRARY'S WEBSITE
The San Francisco Public Library (SFPL) was preparing to release a new version of its event calendar with a more comprehensive display of its event list. Our team's responsibility was to conduct a usability test focusing on specific features of the calendar, with special emphasis on the search filters for location, age, and topic (indicated by the arrows in the picture below).
We went through several stages of evaluation to cover all aspects of the event filters. Below is a brief description of our research questions, evaluation methods, results, analysis, and findings.
Methods We Used
The team used Jakob Nielsen’s “Ten Usability Heuristics” as the basis for evaluating the site and applied several types of formative analysis to draw our initial conclusions. We looked at six characteristics of usability – usefulness, effectiveness, efficiency, learnability, satisfaction, and accessibility.
We used a variety of individual and group tests to evaluate the site, including team heuristic evaluations, expert heuristic evaluations, a one-on-one evaluation, task analyses, and user field tests.
Our Goals
Our UX research objectives were:
1- Determining whether, and how, the calendar filters effectively help users find information relevant to their needs.
2- Identifying possible usability issues with the calendar filters, such as the efficiency of finding events, the effectiveness of the age categories, and the ease of finding more information about particular events.
Research Questions
Our primary research questions included:
- Is the functionality of the search filters efficient and learnable?
- Is the display of search results efficient?
- Can users find the event that they are looking for?
Project Management and Timeframe
Working on a long-term project with a team can become unmanageable without structure. It is important for each task to be completed on time and for the team to communicate plans and changes. I created a project plan and coordinated the tasks, which helped us deliver on time without a last-minute cram session.
Heuristic Analyses
After forming our research questions, the next step was to evaluate the calendar and address those questions with our team and two outside experts before conducting user sessions. The heuristic evaluations were instrumental in shaping our research questions and focused our testing on certain features. Interestingly, each evaluator had different comments, but in the end clear trends emerged.
Task Analysis
Before conducting user testing, we analyzed the tasks designed for the user field tests and created flow charts indicating a path to completing each one.
Focus Group Analysis
We used the following process to complete our Focus Group Analysis:
We held our focus group sessions in private rooms. Before the test sessions, we briefly re-introduced the study and described our general expectations of the volunteers.
For each test session, the volunteer sat in a private room with the moderator and observer and was provided a MacBook to test the calendar tool. The MacBook was set up in advance, with the calendar page already on screen before each volunteer sat down.
We used the following testing formats:
Sit-by session: Our test moderator sat next to the user at a comfortable, appropriate distance: close enough that the user could at least peripherally see the moderator, but not so close as to be distracted by their presence. This format allowed the moderator to directly observe the user’s process.
Think-aloud: We asked our volunteers to think aloud and narrate their thought processes as they attempted the user tasks, so we could better assess the logic behind their actions.
Follow a script: We had our moderator follow a script to ensure uniformity across test sessions, but allowed the moderator to improvise as needed to keep the interaction between the moderator and the volunteer natural and appropriate.
Analysis of Results & Usability Criteria
The following three tables illustrate comparisons of task performance for each of our three volunteers in our Focus Group Analysis.
Summary of Findings
In general, our tests showed that the calendar functions well, but could benefit from some minor adjustments. Although some of the recommendations are outside the scope of our original research questions, the consistency with which they emerged in our study gave us cause to include them. The general issues that emerged were as follows:
- Default filter choices
- Specificity of filter choices
- Searching for multiple topics within one search
- Event view vs. Calendar view
- New tabs for event information
- Filter reset when switching views
Recommendations to the Client
We submitted a usability report with our recommendations, focusing on the most important issues we observed. Here are two examples:
Issue: Each filter allows for only one choice to be selected.
Recommendation: Use checkboxes to allow users to choose more than one option. This would be particularly helpful within the “Topics” filter.
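To make the multi-select behavior concrete, here is a minimal TypeScript sketch. The LibraryEvent shape, the filterByTopics helper, and the sample topics are hypothetical illustrations, not SFPL's actual data model: with checkboxes the selection becomes a set, an empty selection applies no topic restriction, and an event matches if it carries at least one checked topic.

```typescript
// Minimal sketch of a multi-select topic filter (hypothetical data model).
interface LibraryEvent {
  title: string;
  location: string;
  age: string;
  topics: string[];
}

// With checkboxes, the selection is a set of topics rather than a single value.
// An empty selection means "no topic restriction"; otherwise an event matches
// if it carries at least one of the checked topics.
function filterByTopics(events: LibraryEvent[], checked: Set<string>): LibraryEvent[] {
  if (checked.size === 0) return events;
  return events.filter((e) => e.topics.some((t) => checked.has(t)));
}

// Example: a user checks both "Arts & Crafts" and "Computers" in one search.
const events: LibraryEvent[] = [
  { title: "Watercolor Basics", location: "Main Library", age: "Adults", topics: ["Arts & Crafts"] },
  { title: "Intro to Excel", location: "Mission Bay", age: "Adults", topics: ["Computers"] },
  { title: "Toddler Storytime", location: "Richmond", age: "Kids", topics: ["Storytime"] },
];
console.log(filterByTopics(events, new Set(["Arts & Crafts", "Computers"])));
// -> the watercolor and Excel events returned in a single search
```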
Issue: Default filters are not universally set to “All”, leading to confusion about how to search for a single event type at multiple locations.
Recommendation: The location, age, and topic filters should default to “All”.
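Similarly, here is a minimal sketch of the defaulting behavior we recommended, again with hypothetical names: every filter starts at an “All” sentinel, and a filter left at “All” places no constraint on its dimension, so a user can search for one event type across all locations without touching the other filters.

```typescript
// Sketch of "All" as the universal default for every filter (hypothetical names).
interface FilterState {
  location: string; // "All" or a branch name
  age: string;      // "All" or an age group
  topic: string;    // "All" or a topic
}

// Every filter starts at "All", so a fresh search is unconstrained.
const defaultFilters: FilterState = { location: "All", age: "All", topic: "All" };

// A filter left at "All" places no constraint on its dimension.
function matches(
  event: { location: string; age: string; topic: string },
  f: FilterState
): boolean {
  return (
    (f.location === "All" || event.location === f.location) &&
    (f.age === "All" || event.age === f.age) &&
    (f.topic === "All" || event.topic === f.topic)
  );
}

// Searching for storytimes at every branch: only the topic filter changes.
const storytimesEverywhere: FilterState = { ...defaultFilters, topic: "Storytime" };
```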
Reflection
Working on the San Francisco Public Library project was one of the most informative and exciting experiences I have had in the UX field. First, this was UX research conducted on an existing product. Doing UX work on an existing product is challenging because the product comes with pre-existing issues and overlapping interests. A few issues needed to be solved but were beyond the scope of our usability research; in the end, that was a good exercise in staying focused on the project's objectives.
Finally, this was also a data-driven, research-based project. It was important to analyze the problem within the frame of the collected data, rather than with arguments like “this is not how filters work on other websites.” I used results-based observations such as “50% of users in our test did not understand how this calendar works.” I was excited to see how accurate our hypotheses were, and I learned a lot from directly observing users in a practical rather than theoretical environment. For example, I was surprised by how impatient users can be: if they did not quickly see the right option in a filter list, they skipped it and spent more time browsing other events to find what they were looking for. Overall, it was an educational experience in UX and usability research, and one I would be happy to repeat.