Briefing

You are a junior marketing analyst (or any other non-software job you’ve dreamed of having) who has recently been asked by your boss to learn how to code. “Your next assignment will involve considerable coding,” your boss adds. You have never coded a day in your life and don’t quite know where to start. You also have no other information about what you might be asked to code in the assignment. Having taken a MOOC before, you head over to rethinkED to see if there’s a relevant computer science course that suits your needs. You hope the class is not a lot of unnecessary work and gives you a good introduction to “coding”. Finally, impressed with the activity on the website, you decide to contribute by reviewing 7.00X (the MOOC you recently enrolled in) as well.

Scenario Tasks

Task 1:
Search for relevant computer science courses. You find too many results and decide to refine your search to find computer science courses offered only by MIT, Stanford, or Harvard. Find the three highest-rated courses (CS50X, 6.00X, and Intro to CS) and compare them to decide which course you like best.

Task 2:
A friend told you about “Startup Engineering”, a course he enrolled in and really likes. You want to find more information about the course in case you find it interesting. Search for “Startup Engineering” to find out more about it.

Task 3:
You are extremely happy with rethinkED and decide to contribute to the reviews. Review 7.00X and submit a comment describing your experience.

Observations and Changes

Each user observation below is followed by the change we made for the next iteration.

Observation: Users had some trouble understanding the difference between the “Upcoming” and “Announced” filters.

Change: Learnability issue -- to fix this, we changed the options in the Date filter to “Ongoing”, “Starting Soon”, and “Just Announced”. This clarified the distinction: “Upcoming” courses were those starting soon, while “Announced” courses were those starting later on, i.e., “not soon”.

Observation: Another user tried to find classes that were scheduled in May, two months from now. He wasn’t sure if that was “Starting Soon”, and clicked on both “Announced” and “Starting Soon”.

Change: Our recent quick fix had already failed, so we decided to revise “Starting Soon”. We changed the label of the “Dates” filter to “Start Date”, and changed the options to “Ongoing”, “This month”, “Next 3 months”, and “To Be Announced”. These options are much more specific in relation to the calendar and resolve the ambiguity about what “soon” meant.
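
Since the new options correspond to concrete date ranges, here is a minimal sketch of how the “Start Date” buckets could be implemented. This is purely illustrative -- the type and function names are ours, not part of the actual design -- and it assumes each course exposes an optional start date:

```typescript
// Hypothetical sketch: mapping the "Start Date" filter options to
// concrete date ranges. All names are illustrative, not from the design.
type StartDateFilter = "Ongoing" | "This month" | "Next 3 months" | "To Be Announced";

interface Course {
  title: string;
  startDate?: Date; // undefined until a start date is announced
}

function matchesStartDate(course: Course, filter: StartDateFilter, now: Date = new Date()): boolean {
  switch (filter) {
    case "To Be Announced":
      // No published start date yet.
      return course.startDate === undefined;
    case "Ongoing":
      // The course has already started.
      return course.startDate !== undefined && course.startDate <= now;
    case "This month":
      // Starts later than today but within the current calendar month.
      return (
        course.startDate !== undefined &&
        course.startDate > now &&
        course.startDate.getMonth() === now.getMonth() &&
        course.startDate.getFullYear() === now.getFullYear()
      );
    case "Next 3 months": {
      // Starts after today but no more than three months out.
      if (course.startDate === undefined || course.startDate <= now) return false;
      const limit = new Date(now.getFullYear(), now.getMonth() + 3, now.getDate());
      return course.startDate <= limit;
    }
  }
}
```

Under this mapping, the course from the observation above (starting in May, two months out) unambiguously matches “Next 3 months”.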

Observation: The user encountered a problem while trying to close the filters menu. We had forgotten to include a close button, so the user had to click “Apply” even though he didn’t want to!

Change: As a simple fix, we added a close button to the filter menu. A side note: at this point, our observers discussed the possibility of changing the filter menu interface for round 2 (user feedback suggested that category is probably the most important filter). We also noticed an efficiency issue -- users had to go through several unnecessary clicks to reach filter categories, clicks that could be eliminated.

Observation: A user had some difficulty understanding what the “Initiative” filter did, hesitated, then clicked on it instead of “Institute” by mistake.

Change: We changed “Initiative” to say “MOOC Provider”. “MOOC Provider” was in our initial design, but we were unsure which label was better, so we switched to “Initiative” for user testing to observe the users’ response. It turns out that test users prefer “MOOC Provider”.

Observation: Users 2 and 3 tried typing into the filter results, since our design said “Search for...” and then presented search results.

Change: We removed “Search for...” and replaced it with “Filter by...”.

Observation: Once the results were listed, the user had some trouble realizing that he had to select a few courses and then click “Compare Selected”. He kept trying to click “Compare” next to the checkbox for each course. He said that he was expecting a new page or interface for comparing courses.

Change: Differentiating a label from hyperlinked text will be much easier in the actual design. As an added safety measure, we decided to add a dialog box that informs the user of the necessary steps in case he makes a mistake.

Change: A simple fix that we had overlooked was placing a close button (again!) next to each review. Additionally, we realized that we had forgotten to include a “Reset Filter” button in the first place.

Observation: One of the users searched for the keywords “Computer Science MIT Upcoming”.

Change: We still presented the correct results, but some of the courses were not “upcoming”. Observers noted that all courses needed tags with as many synonyms as possible, including tags like “upcoming”. Observers flagged this as a safety issue and noted: remember tags for start dates during implementation.
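
To make that implementation reminder concrete, here is a minimal sketch of keyword search over course tags, assuming each course carries a tag set that includes synonyms and start-date tags such as “upcoming”. All names here are hypothetical, not from the actual rethinkED implementation:

```typescript
// Hypothetical sketch: keyword search that matches against a course's
// title and its synonym tags (including start-date tags like "upcoming").
interface TaggedCourse {
  title: string;
  tags: Set<string>; // e.g. "computer science", "mit", "upcoming"
}

function searchCourses(courses: TaggedCourse[], query: string): TaggedCourse[] {
  const keywords = query.toLowerCase().split(/\s+/).filter((k) => k.length > 0);
  // A course matches only if every keyword appears in its title or tags.
  return courses.filter((course) =>
    keywords.every(
      (kw) =>
        course.title.toLowerCase().includes(kw) ||
        Array.from(course.tags).some((tag) => tag.includes(kw))
    )
  );
}

// With start-date tags in place, "Computer Science MIT Upcoming" returns
// only courses that are actually tagged "upcoming".
const results = searchCourses(
  [
    { title: "6.00X", tags: new Set(["computer science", "mit", "upcoming"]) },
    { title: "CS50X", tags: new Set(["computer science", "harvard", "ongoing"]) },
  ],
  "Computer Science MIT Upcoming"
);
console.log(results.map((c) => c.title)); // ["6.00X"]
```

Requiring every keyword to match keeps precision high, while the synonym tags keep recall up for queries like this one.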

Observation: User 2 had some trouble navigating back to the main page after comparing the results.

Change: We added a menu option that navigates to the home page.

Observation: During task 2, User 1 had some trouble knowing where to click to find more information about the displayed course.

Change: Differentiating a label from hyperlinked text will be much easier in the actual design. We noted the importance of sufficient feedback and affordances when indicating clickability.

Observation: User 2 initially found 7.00X on the main page (under popular courses) and navigated to the course review page to write a review, but did not find an option to write the review -- the “Write Review” option was present only on the homepage.

Change: We fixed this by adding a “Write Review” button to individual course pages; previously it was present only on the main page.

Observation: One user pointed out that he was unsure what “Score” meant -- how was it calculated?

Change: Learnability issue -- we added a “How We Calculate Scores” option on the main page.

Apart from these minor changes, for iteration 2 we changed the filter to a more efficient checkbox-list interface similar to Amazon’s. Additionally, we discussed the possibility of adding a “filter by difficulty” option.

Paper Prototype Images

1 Comment

  1. Unknown User (meelap@mit.edu)

    Briefing and Tasks: The briefing should provide a scenario in which you can frame your tasks, rather than present the tasks themselves. Some of your tasks aren't high level (they're tied to your specific interface design). For example, the first task could be "Find computer science courses offered by MIT, Harvard, or Stanford."

    User Testing: It looks like you didn't test your second iteration on three more users to evaluate the changes you made since you didn't list additional observations after you made changes. I really like that you described how you responded to each usability issue that you observed.

    Prototype: You included a lot of pictures of your prototype, so it's hard to identify the ones relevant to each of your observations.