GR6 - User Testing

Design

For the most part, our final design followed the proposed design: a grid containing eight semesters of courses, with each semester containing a button for adding new classes. Next to the grid is a panel for editing the selected course. Our initial vision for the interface centered on direct manipulation of courses, making it easy to shuffle around and add prospective courses while keeping a visible handle on the course map. The entire application is prefaced by a user login screen.

Demonstration of the Direct Manipulation feature:

Login Screen:

Those three features (grid, panel, direct manipulation) drove the other design decisions we needed to make. One of the more notable undertakings was autoscrolling. Without a very high-resolution monitor, it is impossible to fit an entire course map onto the screen while maintaining readability, so we needed to embed the course map in a scroll viewer. Mixing a scroll viewer with direct manipulation creates a demand for autoscrolling whenever the user drags a course toward the left or right of the visible area. Through several iterations, we were successful in implementing effective autoscrolling.
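
At its core, the behavior is simple: when a drag nears the edge of the viewport, nudge the scroll offset in that direction. Below is a minimal sketch of this logic, with invented names (courseMap, EdgeZone, ScrollStep) and thresholds rather than our exact code:

```csharp
using System.Windows;
using System.Windows.Controls;

// Minimal sketch of edge-triggered autoscrolling, assuming the course map
// grid is hosted in a Silverlight ScrollViewer. 'courseMap', 'EdgeZone',
// and 'ScrollStep' are illustrative names, not our exact implementation.
public partial class CourseMapPage
{
    private const double EdgeZone = 40;    // px from the viewport edge that triggers scrolling
    private const double ScrollStep = 15;  // px scrolled per drag-move event

    private ScrollViewer courseMap;        // assigned from XAML in practice

    // Called with the pointer position (relative to courseMap) as a course is dragged.
    private void OnCourseDragMoved(Point dragPos)
    {
        if (dragPos.X < EdgeZone)
            courseMap.ScrollToHorizontalOffset(courseMap.HorizontalOffset - ScrollStep);
        else if (dragPos.X > courseMap.ViewportWidth - EdgeZone)
            courseMap.ScrollToHorizontalOffset(courseMap.HorizontalOffset + ScrollStep);
    }
}
```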

Autoscrolling was a feature we chose to implement to improve the efficiency of our application. An additional feature we implemented for efficiency was the autocomplete combobox. These comboboxes (not native to Silverlight) search through all of their contained items for those whose key contains the entered text. This is useful for students who know they want to take a class that begins with ‘6.8’ but do not know the rest of the course number; it prevents them from needing to scroll through the entire contents of course 6. It is also handy for students who are filtering by major and know they want a chemistry course, but do not realize chemistry is course 5.
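
The filtering itself amounts to a contains-match over course IDs. A minimal sketch of that logic follows, with placeholder names (the real control also repopulates the dropdown as the user types):

```csharp
using System.Collections.Generic;
using System.Linq;

// Illustrative contains-based filter like the one behind our combobox;
// the class and method names here are placeholders.
public static class CourseFilter
{
    // Returns every course ID containing the typed fragment, so typing
    // "6.8" surfaces 6.813 without scrolling through all of course 6.
    public static IEnumerable<string> Matches(IEnumerable<string> courseIds, string typed)
    {
        return courseIds.Where(id => id.Contains(typed)).OrderBy(id => id);
    }
}
```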

Autocomplete Combobox:

Through various iterations of user testing, there was a high demand for better visibility of the course map's status, in particular which requirements were met. This concern raised several design questions: Should we show the student which requirements are met? In how much detail? Should we explain how the requirements are met (with which courses)? Because one of the main objectives of our utility is simplicity, we chose a minimalist approach and avoided overcrowding the interface with too much information. We accomplished this by adding two status bars to the utility. The first lives in the grid itself and keeps track of how many units are being taken each semester. As the user adds and moves courses, these unit counters are updated accordingly.
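
Keeping the counters current is just a matter of re-summing a semester's units whenever a course is added, moved, or removed. A minimal sketch, using a placeholder Course shape rather than our actual model class:

```csharp
using System.Collections.Generic;
using System.Linq;

// Minimal sketch of the per-semester unit counter; this Course shape is a
// placeholder, not our actual model class.
public class Course { public int Units { get; set; } }

public static class UnitCounter
{
    // Re-run whenever a course is added to, moved within, or removed from
    // a semester; the result is written into that semester's status bar.
    public static int TotalUnits(IEnumerable<Course> semesterCourses)
    {
        return semesterCourses.Sum(c => c.Units);
    }
}
```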

Credit Counters:

The second status bar served three purposes. First, it kept track of how many of each requirement were being met. Second, it provided visual feedback if a requirement was not being satisfied or if other errors occurred somewhere in the map. Lastly, it served as a legend for the course tile coloring scheme. In early iterations of the interface, many students were confused about what the various color assignments on our course objects meant; the legend helped solve that problem.

Requirements Status Bar:

A feature that ties in closely with the status bars is our handling of scheduling errors. Errors currently occur in two ways: when a class is duplicated within a semester, and when a class is placed in a semester in which it is not offered. To handle errors, we felt it was important to notify the user and to provide optional information about why the error is occurring. Originally, we sought to prevent users from making these errors at all, for instance by graying out a semester when they tried placing an invalid course into it. We realized, however, that there are sometimes extenuating circumstances that allow classes to be taken in contradiction to our imported course data. As a result, we chose to leave control with the user, focusing instead on providing adequate visibility. We accomplished this by coloring error courses red and providing an ‘Ignore Error’ option. Additionally, for users who are unclear why an error is occurring, tooltips are activated on error courses, and hovering over one provides a brief explanation of the cause.
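
Behind the red coloring, the checks reduce to two predicates per course, short-circuited by the user's ‘Ignore Error’ choice. A hedged sketch with placeholder types rather than our actual model:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hedged sketch of the two error checks; the Course and Term shapes below
// are placeholders rather than our actual model.
public enum Term { Fall, Spring }

public class Course
{
    public string Id { get; set; }
    public bool IgnoreError { get; set; }            // set via 'Ignore Error'
    public HashSet<Term> OfferedTerms { get; set; }  // derived from course data
}

public static class ErrorChecker
{
    public static bool HasError(Course course, IList<Course> semesterCourses, Term term)
    {
        if (course.IgnoreError)
            return false;  // the user chose to override this warning

        bool duplicated = semesterCourses.Count(c => c.Id == course.Id) > 1;
        bool notOffered = !course.OfferedTerms.Contains(term);
        return duplicated || notOffered;
    }
}
```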

Sample Error Class with Tooltip:

Another issue we observed while running usability tests was that, even with significant prompting, users had trouble editing course data. They did not realize that the panel to the left of the grid held controls for modifying the currently selected course. As a result, when asked to modify a selected course, they spent a significant chunk of time clicking around the course tile and in some cases tried deleting and re-adding the course. One adaptation we made was to change the color of the course information panel whenever a course was selected. Since different course types are assigned different colors, we set the color of the course panel to match the color of the currently selected class. This helps both to notify the user of a change in mode and to draw their focus away from the grid and over to the course modification panel.
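
The recoloring itself amounts to swapping the panel's background brush when the selection changes. A sketch with invented names:

```csharp
using System.Windows.Controls;
using System.Windows.Media;

// Sketch of the mode-change cue: the editing panel takes on the color of the
// selected course tile. 'coursePanel' and 'typeColor' are invented names.
public partial class CourseMapPage
{
    private Panel coursePanel;  // the course modification panel, from XAML

    private void OnCourseSelected(Color typeColor)
    {
        // Match the panel to the selected tile's course-type color.
        coursePanel.Background = new SolidColorBrush(typeColor);
    }
}
```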

Course Panel Color Change:

Finishing the project, we realized that the concept of direct manipulation was still not as intuitive as we had hoped. Even with all our added visibility features, many users faced a steep learning curve in discovering what they could do with our interface. Once they were aware of the features they could manipulate, the tasks became incredibly simple, but it was this initial awareness we needed to account for. As a result, our application presents an information screen to all first-time users. This screen lists several key features of the application that are not immediately apparent. With the aid of this screen (for those users who actually took the time to read it), the initial learning curve was greatly reduced.

Information Screen Displayed at Start-Up:

Overall, the greatest design challenge we faced was improving interface visibility. This was to be expected, because we elected to take a direct-manipulation approach, which gains its power from good visibility. We had to tie all of the components of our UI together to ensure that user manipulation provided adequate feedback.

Implementation

Our interface was built as a small MVC design implemented in C# using the Silverlight web framework. The model was loaded at runtime from preprocessed text data about courses at MIT and modified as the user edited his or her schedule. The view was a static collection of control widgets from the Silverlight toolkit plus two custom controls: a search-while-you-type combobox and a draggable course control. The controller managed the C# event model and handled updating the model and view when necessary.

Our model was built on real course data available from IS&T. We began by scraping the last year’s worth of course data and preprocessing it into a text file distributed with our application. On load, our application read in the course data and created a mapping of course IDs to objects known as course references. This database was kept as an immutable reference collection for other objects, known as courses, each representing a particular instance of a class the user could manipulate. In this manner, a proposed schedule was stored as a collection of courses organized into semesters, each carrying user-centric state, such as grades, and an ID linking it to the immutable course reference containing its academic information.
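
In rough outline, that split looks like the following sketch; all names here are illustrative rather than our exact class definitions:

```csharp
using System.Collections.Generic;

// Sketch of the model split described above; all names are illustrative
// rather than our exact class definitions.
public class CourseReference
{
    public string Id { get; private set; }   // e.g. "6.813"
    public int Units { get; private set; }   // immutable academic data

    public CourseReference(string id, int units)
    {
        Id = id;
        Units = units;
    }
}

public class Course
{
    public string ReferenceId { get; set; }  // key into the reference database
    public string Grade { get; set; }        // user-centric state
    public bool Completed { get; set; }
}

// Built once at startup from the preprocessed text file:
// Dictionary<string, CourseReference> courseDatabase;
```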

The view was constructed using standard components from the Silverlight toolkit, most notably the grid layout control. This control enabled us to build our own drag-and-drop system, as well as autoscrolling, to present the courses in an organized fashion. In this manner we implemented a widely supported interface, since any browser with a Silverlight plugin is capable of running our website.

One important decision we made in implementing this system was to use real course data. While this was reasonably easy to do, since MIT publishes its course information, it also introduced some added complexity for the end user. The real course data is often incomplete and inadequate for describing what new users, particularly freshmen, might need to know about courses. Consider that 7.012 and its many variants are identical for degree purposes, except for the semester in which they are offered. The course data is also incomplete for the other 7.01X courses, instead counting on the user to read the 7.012 description. In this manner we created some hidden knowledge that a novice user might not have, where a totally faked backend could have avoided it. This is, however, an important lesson for us as implementers: if we use real data, we get the real complications that come with real data.

Another key design decision was to give each course a user might place on their schedule a unique identifier. While not strictly necessary for conforming schedules, users often created impossible schedules (duplicate courses in the same semester) that would break a model representation that did not consider these cases. Consequently, we needed to expand our interface and model to account for potentially aberrant user behavior so that the user experience would stay consistent. Earlier prototypes without this improvement suffered from unexpected direct-manipulation behavior when users attempted to manipulate duplicated courses.
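
Conceptually, this amounts to giving each placed course its own instance identifier in addition to its shared course ID. The sketch below illustrates the idea; the use of a Guid is an assumption rather than our actual scheme:

```csharp
using System;

// Extending the model sketch above: each placed course carries its own
// instance identifier alongside the shared course ID, so two copies of the
// same class in one semester stay distinguishable. Using a Guid here is an
// assumption for illustration.
public class Course
{
    public readonly Guid InstanceId = Guid.NewGuid();  // unique per placement
    public string ReferenceId { get; set; }            // shared ID, e.g. "7.012"
    public string Grade { get; set; }
}
```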

Evaluation

We recruited users who were current MIT students and were therefore familiar with the course requirements and the process of completing a degree at MIT.

We began by giving users the following oral briefing:

Incoming MIT freshmen are faced with the daunting task of deciding which classes to take for the next four years. The task is complicated by institute graduation requirements, HASS requirements, major requirements, and overall credit requirements. Additionally, some classes require prerequisites and some are only offered during one semester.
FourPlan seeks to help students build valid course maps with as little effort as possible, by managing most of these constraints and keeping track of satisfied requirements.
Complete the following tasks, trying to resolve “Error Classes” whenever they occur (rather than just ignoring errors).
You have the right to stop at any time.
The users were then asked to complete a series of tasks drawn directly from our paper prototype assignment (GR3). We did not give the users a demo of the system beforehand, since our interface includes a help screen and we wanted to gauge its effectiveness. The tasks were as follows:

  1. Let's say you are a freshman entering MIT in fall 2010 and you know you probably want to be course 6-3. Please log in to FourPlan to start creating your schedule.
  2. You've heard awesome things about Prof. Sadoway and really want to take 3.091 instead of 5.111 in order to meet the chemistry requirement. Please change your schedule accordingly.
  3. You've already received AP credit for 18.01, so you want to take 18.02 and 18.03 instead of 18.01 and 18.02 during your freshman year. Please make the appropriate changes.
  4. You decide you'd rather get started on Course 6 classes early, so you want to move 6.01 to Spring 2011 and postpone Bio to Fall 2011.
  5. Congratulations! It's now May 2011, and you just completed your freshman year. Please mark all your freshman classes as "completed" and enter the appropriate grades.

Most of the problems that users encountered we had either noticed ourselves during development (and hadn't had time to address) or anticipated based on classmate feedback. A brief list of the issues is below, separated by category and accompanied by comments.

Critical

  • Subjects don't initially realize they must edit courses using the sidebar (Visibility and Efficiency) - We'd added the color-changing sidebar based on earlier feedback, but that didn't seem to help enough. This ties into the fact that people REALLY want to be able to edit courses directly on the boxes, a feature we'd planned to implement if there were more time.

Major

  • How to edit new courses isn't obvious (Visibility) - New courses are white, so selecting one doesn't change the sidebar color to indicate that something new was selected. This would be simple to fix by choosing another color for new courses.
  • Tooltips for errors are not obvious (Learnability) - Even with the info screen explaining the tooltips, subjects had a hard time figuring out how to find out what errors meant. Need a better method for explaining errors. (*see below)
  • More course info should be shown on the boxes (Visibility) - Adding ‘semester offered’ is probably most critical.

Minor

  • Subjects didn't realize they could type in the autocomplete box (Visibility) - Affordances could be improved, maybe by providing a blinking cursor when the box is selected.
  • Initial course schedule is confusing. (Learnability) - This should probably be explained in the opening info screen as an example for their selected major. Also, when students don’t put all the classes on the schedule themselves, they aren’t aware of what already exists.
  • No Undo (Error Prevention) - This only comes up occasionally, but might be useful.
  • Warnings don’t occur when a course is added to the schedule twice. (Visibility) - This was a planned feature that we ran out of time to implement.
  • The meaning of the AP credit column is unclear (Learnability) - This could be explained at the start. It could also show up in an explanation of which GIRs are missing (*see below).

Cosmetic

  • Entering grades is tedious (Efficiency) - We could set up a separate interface to enter all the grades for a semester at once, for instance.

*Note on explaining errors and requirements: A good suggestion we received from a user was that, to help explain some of the errors and requirements, we could allow users to click on the requirement categories to see a brief explanation of each, along with a list of any issues (not meeting the bio requirement, for example, or how many classes were in the wrong semester). This is something we would explore if we had more time.

Reflection

First of all, we learned a great deal about properly scoping a project. We came into this project with a grandiose vision of what we wanted to accomplish. Unfortunately, many of our features did not make the final cut, simply because of time constraints. Some of these, like the constraint solver, had initially been tagged as "reach features"; we'd hoped to get that far, but recognized that it might be beyond what we could do in one semester. As we continued prototyping, we discovered other features we hadn't initially considered. Errors, for example, were something we never really prototyped. We had been more concerned with the behavior of our interface, so error notifications and error checking were not really present until our final implementation for GR5. If we had another round of iteration, we would work on increasing the effectiveness of our error notifications based on feedback from later user tests.

There were also features we received feedback on that we simply didn't have time to reimplement. The best example in this category is direct editing of the course object boxes. This seemed to be something users really wanted, and it would indeed have been a very intuitive way to work with the system. However, given that we had many other things to fix and improve for GR5, we elected to focus on making the features we'd already selected work as well as possible. If we were to redesign this project, we might remove the sidebar entirely and see how users react to that interface.

One of the most critical lessons we learned was the importance of building a shared mental model. We had figured that after completing the paper prototype it would be straightforward to go forward with the computer prototype implementation. However, when we sat down to draw out the interface and divide up the implementation tasks, we realized that each of us had a different idea of how the interface would behave in different scenarios. Particularly since we were designing a direct-manipulation interface, there were a lot of nuances to work out. For instance, what happens when a user clicks on a class? Where does the focus go? What has to happen before the class is “saved” in that position? Sitting down and working through these issues really made us appreciate the complexity of a well-designed interface. If we were to do this again, we would probably try to work out some of these issues earlier and work them into our paper prototype.


1 Comment

  1. - Excellent report.

    - Great feedback from user testing. For the Major issue "How to edit new courses," I think choosing a new color rather than white may not be enough. Maybe you should consider a highlighting animation on the sidebar or moving key focus to the sidebar.

    - I have some comments: Your login screen is pretty clear, so I don't think you need the explanations 1-4 on the left; they look redundant. Also, on the "Information Screen Displayed at Start-Up," lines 1-5 are indented differently and in different colors, which is a bit confusing and may convey the wrong message (e.g., that some tasks are more important than others). They should be kept consistent.

    - In reflection, I agree that you should carefully test paper prototypes with and without the sidebar on users.

    - It was a very good lesson learned that even after paper prototyping, it's important to build the same shared mental model.

    - Hope you guys enjoyed the class!