Design

We will present the final design of our interface by walking through a sample user interaction.

The user is first presented with the splash screen that has persisted since our first iteration. This splash screen is consistent with other Android applications and generally serves as a confirmation to the user that they clicked the right application and that everything is working correctly.

The user clicks the button to continue to their music library, which lists the compositions stored in the phone’s internal memory. The compositions are sorted in alphabetical order. When the library is empty, an indication is shown on the screen (“User Library (Empty)”), based on feedback from the heuristic evaluations.

Another request from the heuristic evaluations was “long-click” functionality that shows additional information about the selected song.

These details can only be edited in Edit Mode.

The new song settings screen is displayed when the user chooses to create a new composition. Default values were requested in the heuristic evaluations.

In Edit Mode, the user can add and remove notes as well as change clef, meter, key signature, title, and tempo. The ability to modify all of these song fields was requested in heuristic evaluation.

The user is given feedback when the buttons are clicked to add notes or flats / sharps / naturals to the screen. This was an original design choice: the feedback hints to the user which mode the application is in.

When the user adds notes to the song, they snap vertically and horizontally for compact and consistent spacing. Accidentals can only be added to notes (not rests) and snap to a location near their note.
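The snapping described above amounts to rounding raw touch coordinates to a fixed grid. A minimal sketch of that logic in plain Java, with hypothetical staff geometry (line spacing, column width) rather than our actual drawing code:

```java
// Sketch of snap-to-staff logic; the geometry values are hypothetical.
public class StaffSnapper {
    private final int topLineY;     // y of the top staff line (pixels)
    private final int halfSpace;    // half the distance between staff lines
    private final int columnWidth;  // horizontal slot width for compact spacing

    public StaffSnapper(int topLineY, int halfSpace, int columnWidth) {
        this.topLineY = topLineY;
        this.halfSpace = halfSpace;
        this.columnWidth = columnWidth;
    }

    // Snap a raw touch y to the nearest line or space on the staff.
    public int snapY(int touchY) {
        int steps = Math.round((touchY - topLineY) / (float) halfSpace);
        return topLineY + steps * halfSpace;
    }

    // Snap a raw touch x to the nearest note column.
    public int snapX(int touchX) {
        int col = Math.round(touchX / (float) columnWidth);
        return col * columnWidth;
    }
}
```

Accidentals would reuse the same vertical snap as the note they attach to, offset horizontally by a fixed amount.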

When the user places enough notes on the screen, they can move to a subsequent screen and continue placing notes. We originally wanted the user to scroll the score to the left using a finger swipe, but we feel the current implementation is more consistent with the moded note placement.

The application supports menu screens that are consistent with all other Android application screens and are prompted by the phone’s physical menu button. The “Back” button is disabled in both Playback and Edit modes to preserve implementation invariants.

Placing dynamics (p, pp, mp, etc.) is achieved by clicking the dynamics button, choosing a particular dynamic from the list, and placing it on the screen. The screen greys out the areas where a dynamic is not allowed to be placed, making it very obvious to the user where to press.

Playback Mode gives a simplified view consistent with Edit Mode. An arrow points to each note as it is played.

Implementation

Android Limitations

In Android development, each screen is represented by a separate Activity in the code. There is a defined Activity Lifecycle that gives a general structure to each of these files: some code is executed on opening and other code on closing. All other actions are defined by listeners attached to View objects on-screen, so the application can react to clicks, drags, etc. The interfaces themselves are statically defined in XML files and can be updated dynamically at runtime.
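The listener pattern described above can be sketched in plain Java. The classes below are simplified stand-ins for the android.view API (not the real classes): a view exposes a callback interface, and the application reacts to events by attaching a listener.

```java
// Simplified stand-in for Android's View.OnClickListener pattern.
interface OnClickListener {
    void onClick();
}

class FakeButton {
    private OnClickListener listener;

    // The app attaches behavior to the view by registering a listener.
    void setOnClickListener(OnClickListener l) {
        this.listener = l;
    }

    // In a real app the OS dispatches this when the user taps the view.
    void performClick() {
        if (listener != null) {
            listener.onClick();
        }
    }
}
```

In the real framework, the OS calls back into the registered listener when the user interacts with the view; the application never polls for input.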

Implementation Specifics

The application consumes minimal resources when resting (i.e. when the user is not actively clicking or listening). The heavy lifting is done during Activity switches and when user actions trigger the attached listeners.

We loosely followed the Model-View-Controller pattern. Our Model (data representation) is completely blind to the fact that it is holding data that is displayed to the user. The model simply holds the song data.
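As a rough sketch (class and field names here are hypothetical, not our exact code), the model is a song object holding global settings and an ordered collection of note events, with no knowledge of the screen:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the Model: pure data, no display logic.
class Note {
    final int pitch;     // e.g. a MIDI pitch number
    final int duration;  // e.g. in eighth-note units

    Note(int pitch, int duration) {
        this.pitch = pitch;
        this.duration = duration;
    }
}

class Song {
    String title;
    int tempo;           // beats per minute
    String keySignature;
    final List<Note> notes = new ArrayList<>();

    Song(String title, int tempo, String keySignature) {
        this.title = title;
        this.tempo = tempo;
        this.keySignature = keySignature;
    }

    void addNote(Note n) {
        notes.add(n);
    }
}
```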

The View is mostly handled by the Android OS, which is responsible for drawing and updating the screen when items are added, removed, or modified. The OS dispatches events when the user interacts with the device.

Our Controller is divided between different listeners that wait for user events and code that updates the View and Model as necessary. The application is a giant state machine, which makes the listeners completely state-dependent. Their implementation takes the shape of large case blocks with different actions at each possible state.
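A stripped-down sketch of one such state-dependent listener (the state names and actions are illustrative, not our exact code):

```java
// Sketch of a state-dependent touch handler: the same event does
// different things depending on the application's current mode.
enum EditorState { IDLE, PLACING_NOTE, PLACING_ACCIDENTAL }

class StaffTouchHandler {
    EditorState state = EditorState.IDLE;

    // Called by the touch listener; branches on the current state.
    String onStaffTouched(int x, int y) {
        switch (state) {
            case PLACING_NOTE:
                return "place note at (" + x + "," + y + ")";
            case PLACING_ACCIDENTAL:
                return "attach accidental near the nearest note";
            case IDLE:
            default:
                return "select element at (" + x + "," + y + ")";
        }
    }
}
```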

Our Thoughts

The biggest implementation problems we had were due to the Android API. The Android Developer’s site was a lifesaver, but some of the quirks were difficult to work around. For example, when the screen is rotated on the device, the current Activity is killed and restarted. This means that our screens need to be robust and display consistent information even after being restarted.

We also ran into trouble with multiple API levels. Our first implementation used methods that are only available on the newest version of the OS, which no phones run currently. This meant that we had to scrap our drag-and-drop functionality and rewrite most of the code to use older method calls.

In all, we are very happy with the implementation and believe that after implementing what we learned in the user testing, our application will be up to par with publicly available applications.

Code is available on GitHub.

Evaluation

Our Users

We initially scheduled three Berklee students (Aubrey’s friends) to test our app, but two of the appointments fell through. At that point, we contacted MIT friends who had music composition experience.
Berklee student: A junior in Film Scoring and Music Composition.
MIT students: Two sophomores, both of whom had taken Harmony and Counterpoint I. The Harmony and Counterpoint classes teach music theory and require a composition as a final project. One student is in Concert Choir and taking Harmony and Counterpoint II. The other is an aspiring music minor and one of the directors of Syncopasian, an a cappella group for which she also arranges music.

The Berklee student is representative of our target user population: busy music students who have to compose a lot of music. The MIT students are representative because they know a lot about music and composition and could see uses for the app in their own lives.

User Test

Users sat with an Android phone and were asked to complete three tasks, each given after the previous was completed, while observers took notes on their think-aloud comments as they used the app.

Briefing:
After explaining the purposes of a usability test, the students were told that they were busy music students who did not know when inspiration would strike. Suddenly they had an idea come to them, but they didn’t have access to their computer or a piece of paper. They did have their Android phone.

Tasks:
1. Create a piece of music
2. Browse through files to find the sketch
3. Listen to the sketch

Usability problems

Ranked in order of importance, followed by proposed solutions:

  • Fat fingers. Our users struggled to accurately place notes on the staff (despite having rather thin fingers), something we did not discover while testing the app ourselves because we were not concerned with placing specific notes on the screen. The small motions required to move a note up a space or two were too difficult.
    • Allow the user to zoom in on the screen to accurately place notes.
    • Use an adjustment box that allows the user to drag up and down until the note is in the right place.
    • Add arrows that can be tapped to step the note up and down the staff.
    • Put the buttons on the side of the screen, instead of at the top and bottom. Then the staff could be bigger in the main part of the screen.
    • One of the users shook the phone to see if that would change where the note was, which could be implemented, although we don’t think it would be very intuitive or accurate.
    • Finale shows a shadow note before the note is placed. Alternatively, we could implement something similar to SmartGo, which shows crosshairs before a piece is placed. If the board is too large for the screen, the crosshairs are supplemented with a zoomed-in bubble, which makes accurately placing the pieces very easy.
  • No sound when placing notes in Edit mode. Users would like to hear the notes they placed on the screen when they place them so they can adjust them instead of switching between Edit and Playback modes.
    • Implement playing the note value when the staff canvas receives a note.
  • No measure lines? Users commented that music programs automatically add bar lines (our app is inconsistent with the market). As one user pointed out, having no defined measures, even in a sketching app, means accidentals don’t automatically “cancel” at the bar line. Another user was confused because he didn’t know if he had filled up a measure; he was very used to composing inside of measures and didn’t like the lack of divisions.
    • Add measure lines automatically as notes are placed on the staff.
    • Place notes in pre-measured bars, with each measure taking the majority of the screen space and clicking between measures (suggested by a user in the paper prototype testing), with an option to view the whole sketch. This could also help alleviate the fat finger problem.
  • No chords. Our Berklee tester pointed out that chords are very important for the work that he does, and spent about a minute trying to place a note on top of another note.
    • Implement chords. This could be done by layering images on top of each other and then playing the two frequencies at the same time in playback mode.
  • Moded note placement not obvious. Most users expected drag and drop, but once that did not work, they easily figured out that the note buttons were moded.
    • Implement drag-and-drop note capabilities (currently limited to Android SDK 15, Ice Cream Sandwich).
  • Couldn’t tell when a note or accidental was selected. We changed the selection color from Red to Yellow, based on the heuristic evaluation comments. However, on the phone we used for testing, users could barely see the yellow square around a selected note/accidental.
    • Use a different color (perhaps a less jarring red) that is visible on more/all Android phones.
    • Make the colored square more opaque.
  • Incomplete note set. Though this is a sketching app, only eighth, quarter, and half notes are available to the user.
    • Add more notes (sixteenth, 32nd, whole, triplet).
    • Add dotted capability to notes.
  • Bass clef as the default? Users commented that they would not normally compose a melody in bass clef, so they thought it was interesting that it was the default and also was the first element in the clef list.
    • Place treble clef before bass clef in the selector.
  • Odd tabbing order. In the settings screen, pressing Next on the keyboard (like pressing Tab) to go between fields switches between the two keyboard inputs: Piece Name and Tempo. These two fields are at the top and bottom of the settings screen.
    • Place the two inputs close together on the screen.
    • Change the tabbing order to also tab through the fields in between as well.
  • All Major keys? The Keys list is quite long (there are 24 keys in Western music).
    • Only use the letter names in the list and add a toggle button to the side for Major/Minor, much like an AM/PM toggle button in the Android alarm clock settings. This would also be consistent with Finale key selection.
  • Forward and Rewind buttons confusing in playback. A user was confused about the functionality of the buttons. She thought they turned the page instead of moving between the notes. Another user thought it was a back button. 
    • Use infinite screen scrolling for the staff instead of the page method.
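Several of the fixes above (playing a note as it is placed, playing chord frequencies together) boil down to converting a pitch to a frequency that a tone generator can play. The standard equal-temperament formula, sketched here in plain Java (illustrative, not taken from our code):

```java
// Equal-temperament pitch-to-frequency conversion: A4 (MIDI 69) = 440 Hz,
// and each semitone multiplies the frequency by the twelfth root of 2.
class Pitch {
    static double frequency(int midiPitch) {
        return 440.0 * Math.pow(2.0, (midiPitch - 69) / 12.0);
    }
}
```

For example, MIDI pitch 81 (A5, one octave above A4) comes out to 880 Hz; playing a chord would mean synthesizing several such frequencies at once.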

Reflection

What We Learned

We learned how to program for Android and use the different APIs. We should have better applied what we learned in 6.005: designing the implementation before jumping straight in. Though we did sketch data structures and different classes and methods that we would need, we did not complete the code design before we started writing.
We learned how to design GUIs and how important human interaction with the product is during the design process. Iterative design is really important, and we benefitted a lot from the feedback we received from users. We liked prototyping our product, showing people what we were working on, and letting them play with it.

We enjoyed writing an app for a different user population. In the past, most of the software we have written was built to teach concepts and not very useful in the real world, or we were our own users. Designing for someone else made us stretch; it was especially interesting to design for music students, who use computers very differently from how we use computers. Additionally, we learned how to properly administer a user test to efficiently get the most information. The tests emphasized that the user is always right, and we had to take their input into account when making design decisions.

External consistency is very important! We made several design decisions for our app that were not consistent with other music software suites. Though our app was specifically designed for sketching and not composing symphonies, we cut out some features that ultimately were very important to our users (such as measures).

Also, some of our less musically-inclined group members (Stephen) learned a lot about music!

What We Would Do Differently: Research and Testing

One of our major challenges was working with the Android development environment, and we would have benefitted from looking into code from other complete Android applications, as well as reading through the developer’s guides before beginning the implementation process. In the same vein, it would have been very helpful to have ~10 different Android phones for testing on different versions of the OS, screen sizes, etc. We were rudely awakened the first time we deployed the application to a phone and found that the emulator was a very weak simulator of the actual phone environment. During the development process we only had reliable access to one Android phone, which severely hindered our progress.
We also would have benefitted from another iteration in the design process. Our user tests pointed out several usability problems that we could have fixed if we had another week to develop the final implementation.

All in all, this project was a very positive experience for our group. We got the chance to go through a few iterations of the spiral model and learned the struggles of pleasing a user population we know nothing about. The app that we created is something that we are all proud of and would definitely download for our own personal use. Several people (friends, family, professors) have expressed interest in the app because of its novel idea. This class taught us about the iterative design process for user interfaces and we had the chance to use that knowledge in our project.
