
Design

Login Page

Our login screen features a minimalist design, with the JobTracker logo, a brief paragraph on what services JobTracker provides, and a field where users can type their username. We decided to leave out a password field because there is no reason to ask our users to remember a password if we aren't designing for security.

We do, however, require that our users provide a username so that we can load their data! An error message appears every time a user tries to log in with an empty username.
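
A minimal sketch of this check on the frontend, assuming hypothetical element IDs rather than our actual markup:

    // Block the login form when the username field is empty and show the reminder.
    $("#login-form").submit(function (event) {
        var username = $.trim($("#username").val());
        if (username === "") {
            event.preventDefault();   // keep the form from submitting
            $("#login-error").text("Please enter your email address so we can load your data.").show();
        }
    });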

We ask for an email address as a username because email addresses are usually unique identifiers which the user has to remember anyway. Our login page, therefore, places little memory burden on the user.

Homepage

Our homepage features a two-column layout. In one column, the user can view all their tasks; in the other column, the user can view all their companies. At the bottom of the page are buttons that allow the user to add tasks or companies. At the top of the page are buttons that allow the user to add a document and to log out. On the homepage, as with the rest of JobTracker, we used green to highlight things we wanted people to pay attention to. Task and Company names are green, as are parts of the Add Document and Logout icons.

When a user logs in for the first time, the task box and the company box contain text that describes how to operate the interface and explains what "task" and "company" mean in the JobTracker context. The text disappears from a box the first time the user refreshes the page after successfully adding data to that box, and it reappears if the user ever deletes all her tasks or companies. That way, the user always has a very visible reminder of how to add a task or company, but the instruction text doesn't clutter the interface when it isn't needed.
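
A sketch of the logic behind this behavior, run on page load; the selectors and instruction markup here are illustrative assumptions:

    // Show the instruction text only in boxes that currently have no items.
    $(function () {
        $(".task-box, .company-box").each(function () {
            var box = $(this);
            if (box.find("li").length === 0) {
                box.find(".instructions").show();   // empty box: explain how to add items
            } else {
                box.find(".instructions").hide();   // box has data: keep the interface uncluttered
            }
        });
    });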

The two-column layout gives visibility to the two pieces of information the user is likely to be most concerned about: her upcoming tasks, and links to all her application materials for each company. The tasks are sorted by due date (with the nearest deadline appearing first in the list). They can also be filtered by company - so if a user wants to see just the tasks for one company, she doesn't need to leave the main page. Companies are sorted in alphabetical order.
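
Roughly, the sorting and filtering work as sketched below; the task and company objects shown are an assumed data shape, not our actual representation:

    // Sort tasks so the nearest deadline comes first, then optionally filter by company.
    function visibleTasks(tasks, companyFilter) {
        var sorted = tasks.slice().sort(function (a, b) {
            return new Date(a.dueDate) - new Date(b.dueDate);   // earliest due date first
        });
        if (companyFilter) {
            sorted = $.grep(sorted, function (task) {
                return task.company === companyFilter;          // keep only the selected company
            });
        }
        return sorted;
    }

    // Companies are simply sorted alphabetically by name.
    function sortedCompanies(companies) {
        return companies.slice().sort(function (a, b) {
            return a.name.localeCompare(b.name);
        });
    }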

The homepage has two kinds of buttons. We use default text buttons at the bottom of the screen. These buttons are fairly learnable, as they say exactly what they are for. They are not as large as the buttons at the top of the page, but unlike those buttons, they can also be operated by tabbing until they are selected and then hitting the "enter" key. We did not take much trouble to make this way of operating the buttons visible to our users, but it is consistent with the behavior of other websites, and by default browsers highlight buttons when they are selected.

The buttons at the top of the page have a different appearance not only for aesthetic reasons but to differentiate them from the buttons that affect the content displayed on the homepage. The icons were chosen from an internet search for open-source icons. Both icons were fairly representative of the results of that search, and are similar to icons in widely-used applications  (examples!).

Documents

Documents have been reduced in importance as we have iterated our design. In our original design, documents were alongside companies on the homepage. By our first round of paper prototyping, however, we put tasks on the homepage instead, reasoning that users would want to view all their tasks much more frequently than they would want to view all their documents. The "Add Documents" button moved to the top left corner of the page, where it has stayed ever since.

Originally we let users type plaintext documents directly into the "Add Documents" form. By our second iteration of paper prototypes, however, we dispensed with this feature, since our user tests found it to be confusing and because it needlessly complicated our interface.

Tasks

Companies

Groups

Contacts

Notes

Implementation

JobTracker was implemented as a webapp. The frontend was written in HTML5, jQuery, JavaScript, and CSS. The backend was implemented with Ruby CGI scripts and a SQLite database.

We chose to use CGI scripts because of the steep learning curve involved in learning a web framework such as Ruby on Rails. We also considered writing different pages using different languages for the backend scripts, and figured CGI scripts would be the most straightforward way to allow this. We chose a SQLite database for the same reason: a minimal learning curve on the backend, allowing us to focus on the user interface.
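
As a rough illustration of how the pieces fit together, the frontend talks to the CGI scripts with ordinary AJAX calls like the one below. The script name, parameters, and currentUser variable are hypothetical, not our actual endpoints:

    // Ask a Ruby CGI script for the logged-in user's tasks and render them into
    // the task box; the backend reads the SQLite database and returns JSON.
    $.getJSON("/cgi-bin/get_tasks.rb", { username: currentUser }, function (tasks) {
        $("#task-list").empty();
        $.each(tasks, function (i, task) {
            $("#task-list").append(
                $("<li>").text(task.name + " (due " + task.dueDate + ")")
            );
        });
    });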

We used quite a few jQuery widgets to implement our interface, including the accordion widget and the datepicker widget, to give the user familiar and useful functionality without having to write large amounts of code.
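
Both widgets take very little code to set up, which is most of their appeal; the selectors below are illustrative:

    // jQuery UI widgets: the company list collapses and expands as an accordion,
    // and the task due-date field gets a pop-up calendar.
    $("#company-list").accordion({ collapsible: true });
    $("#task-due-date").datepicker();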

The webapp implementation has some disadvantages that adversely affect the user interface. It depends heavily on jQuery and JavaScript, which can run slowly depending on the user's browser and hardware. In addition, cross-browser compatibility is a problem for a webapp: our system works fine in Firefox and Chrome, but doesn't work very well in Internet Explorer.

However, for an app that was updated as rapidly as JobTracker, a webapp lets us push updates instantly, without requiring the end user to download anything. Also, part of the purpose of JobTracker is to allow people to access their information from anywhere, which is most easily accomplished on a web-based platform.

Evaluation

Users

The pool of test users is quite representative of the target audience. We built the website mainly targeting college students, and because of this the bulk of the test users were college students. We had two other users who belong to different populations; while the site was not directly targeted at them, we wanted to make sure that it was still usable for them.

The users we tested on are the following:

- course 5 sophomore at MIT

- course 6 junior at MIT

- course 7 junior at MIT

- high school CS teacher

- course 6 MIT alum '06

Briefing

Hi, I'm _____ and these are my partners _____ and _________.
Thanks for helping us out! We're testing out a system to help people
manage a job search. The system, named JobTracker, is a website that
people can use to manage documents and keep track of tasks related to
their job search.

We'll show you a prototype of the website. You'll get a few index
cards with tasks written on them. Try to execute these tasks on our
prototype. After we start, my partners and I will be taking notes on
how we can improve our interface design.

Remember, this is a test of the interface and not a test of you. The
interface is in a preliminary stage and might have problems that make
it difficult to use. Your input is important to fixing these problems.
While you execute the tasks, feel free to think out loud so we can
understand your thought process. Also, you are free to stop the test
at any time. Before we get started, do you have any questions for me
or my partners?

Tasks

The tasks we gave the test users are listed below:

- Log in using your email address
- Add a new Company, "BBN"
- Add a new Document, "resume" (which is a file on your computer) and link it with BBN
- Add a new Contact to BBN, "Jane Doe jdoe@bbn.com"
- Add a Task called "Cover Letter" for BBN due 5/15/11
- View all upcoming tasks
- Delete the "Cover Letter" task for BBN

Demo

Users were not given a demo. We felt a demo would hurt learnability testing for this site, and we wouldn't find out whether the design choices we made had or had not improved learnability. Since the target audience would not normally get a demo from the designers of the site, we felt that the best way to find any issues was to let test users approach the site the same way we thought new users would.

Usability issues
  1. While not directly a design issue, the "View all upcoming tasks" task seemed to be universally difficult. It is possible that, since there was only one company in the testing environment, all tasks already belonged to that company. One approach to testing this would be to pre-populate the site with data; however, since we wanted to recreate what a new user would see, that would not be the best approach. The best approach would probably have been to have users add multiple companies and multiple tasks before giving them this task, but since the test users' time was precious we kept the task list short. One user was confused by the word "upcoming", thought it might mean something different from just viewing all the tasks, tried to use the company selection box, and ended up more confused.
  2. It wasn't clear which fields on the forms (Add Company, Add Task, Add Contact) were required; this caused some users to make up information, while others just tried submitting without them. We could resolve this issue by adding "(Required)" to the placeholder text, or labels with asterisks could be re-added.
  3. The icons in the upper left-hand corner (Home, Add Document) were hard to locate and took users a long time to find. One user went to the company details page to add a document, while the majority eventually found the Add Document button. To fix this issue, it would make sense to move the logo to the top-left corner and move all the other buttons to the top-right corner, creating a navigation bar. Since most sites use this design, it would be easier for users to find them.
  4. It was not apparent to one user what the "home" icon did, and they thought the "logout" icon was the way to get back to the main page. One solution would be to replace the home icon with the logo and have the logo link back to the homepage, or to add a back icon at the bottom of the page like some sites do.
  5. The dialog alerting users about unsaved data confused most users, while others just ignored it and clicked through. One possible solution would be to save changes automatically when leaving the page, or to simply discard them without a warning. The alert itself didn't seem to help convey that something was unsaved.
  6. "Details" link is not visible enough for most users. It's hard to find, small,and did not line up with the delete icon. Users expanded the company view attempting to get to the company page. A solution would be to make the details link larger, and have the company name link to the companies page also. 
  7. Many users were unclear about what a "task" was. One possible solution would be to change the title of the box to "To do list" or something to that effect.
  8. One user attempted to click on the filler text to add a company. One solution to this problem would be to make that text clickable, so that clicking it opens the Add Company form.
  9. The submit button remains disabled until the text boxes are no longer selected. This confused several users. To fix this, we could enable the button as soon as enough data has been entered (see the sketch after this list), or simply never disable it.
  10. One user forgot the purpose of the "Name" field in the "Add Contact" form after selecting it, and since it was selected the placeholder text was no longer visible. One solution would be to add tooltips, so that hovering over the field would also give that information, or to re-add labels to the form fields.
  11. One user entered a date directly into the text box and didn't use the date picker. He ended up entering 5/15/11, which will confuse the database ordering. A solution would be to validate and normalize date inputs before storing them in the database (see the sketch after this list).
  12. In the expanded company box, the notes text area looks editable when it isn't. A solution would be to use a div instead of a textarea element for that part of the widget.
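
For issues 9 and 11 above, the fixes are small. A sketch, with illustrative selectors rather than our actual markup:

    // Issue 9: enable the submit button as soon as the user has typed a name,
    // instead of waiting for the text box to lose focus.
    $("#task-name").keyup(function () {
        var ready = $.trim($(this).val()) !== "";
        $("#add-task-submit").prop("disabled", !ready);
    });

    // Issue 11: have the date picker write an unambiguous yyyy-mm-dd value, so
    // that string ordering in the database matches chronological ordering.
    $("#task-due-date").datepicker({ dateFormat: "yy-mm-dd" });   // "yy" means a four-digit year in jQuery UI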

Reflection (or lessons learned)

Give users as much rope as possible - but don't let them hang themselves.

We found that users do not always know what is best for themselves. Our design once included groups, which were displayed with user-chosen colors. Users could choose any color from a color swatch - their freedom was enormous! We imagined users carefully picking colors that were easy to distinguish.

Instead, as at least two of our heuristic evaluations commented, some color choices made the name of the company incredibly difficult to read. Our testers weren't employing the careful consideration we'd imagined, and our interface lost usability.

We know now that a better way to implement the colors would have been to give our users more help in not making this sort of mistake - perhaps automatically adjusting the font color, or restricting the colors to those that make a black font readable, or showing the user a preview. User freedom takes more planning than we thought!
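
The "automatically adjusting the font color" idea, for instance, comes down to a few lines: pick black or white text depending on how bright the chosen background is. The helper name and the 0.5 threshold below are our own illustrative choices:

    // Choose a readable font color for a user-picked background color (r, g, b in 0-255).
    function readableFontColor(r, g, b) {
        // Approximate perceived brightness of the background.
        var luminance = (0.299 * r + 0.587 * g + 0.114 * b) / 255;
        return luminance > 0.5 ? "black" : "white";   // dark text on light backgrounds, and vice versa
    }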

The more testing, the better.

We noticed that our testers varied dramatically in their ability to interact with the interface. Some sailed through our tasks like they'd been using the interface for months; others made many mistakes. Similarly, some of our evaluators were alone in strongly disliking certain aspects of our interface. While some individuals raised very helpful points on their own, we found that most of our major changes came about as a result of noticing trends in the way people interacted with our interface.

Prioritize what your target users actually want - let "wouldn't that be cool?" features come later.

Our "groups" again provide a good example of where we went wrong. In our initial computer prototype, we spent a fair bit of time implementing groups, only to find that our testers and evaluators found them confusing and unhelpful. When we interviewed potential users of what they would like in a system like JobTracker, not one of them mentioned wanting a way to group companies. We thought it would be helpful, but it wasn't, and our time could have been better spent by providing more useful (and requested) features.

The iterative design process actually works - and we could probably use another iteration...

Over the semester, we saw JobTracker greatly improve and significantly evolve as a result of the feedback we got on each iteration of the design process. One gratifying comment from our final user testing was "it seems pretty straightforward".

Even so, each iteration seemed to bring on a whole new onslaught of problems. Even now we have a list of improvements for our interface that came up in discussion with our users.

...so make the most of the strengths of each design step.

One mistake we made was focusing too much on the backend of our interface for our first computer prototype. By the due date, we had a fairly functional interface, but we hadn't put as much effort as we could have into the visual design of our website. Our second computer prototype had a much more polished frontend, but we could still make improvements, and we might have ended up further along if we'd focused our efforts more carefully.
