August 2008

Preliminary discussions have begun on creating a measurement and quality assurance process for ISDA.

Notes thus far:

Need better tools for measuring (a sketch of this kind of measurement follows this list)
        Metrics
        •    Page visits
        •    Web trends
        •    Link tracking
        •    Path analysis
        Need a run-time production tool (Derek will lead this)
        Pat reached out to CSAIL, the Libraries, and OSP
        •    The only tool they have used is Google Analytics
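
A minimal sketch of what this kind of page-visit and path measurement could look like, assuming a standard "combined" web server access log; the log file name, field layout, and the idea of treating each client IP as one visit path are assumptions for illustration, not a tool choice:

    # Sketch: count page visits and rough per-visitor paths from an access log.
    # "access.log" and the combined log format are assumptions for illustration.
    import re
    from collections import Counter, defaultdict

    LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+) [^"]*"')

    visits = Counter()            # page -> number of hits
    paths = defaultdict(list)     # client IP -> ordered list of pages viewed

    with open("access.log") as log:
        for line in log:
            match = LOG_LINE.match(line)
            if match:
                ip, page = match.groups()
                visits[page] += 1
                paths[ip].append(page)

    print("Top pages:", visits.most_common(5))
    # Most common two-page transitions, as a crude form of path analysis.
    transitions = Counter(
        (a, b) for steps in paths.values() for a, b in zip(steps, steps[1:])
    )
    print("Top transitions:", transitions.most_common(5))
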
Need real-time numbers to do accurate load testing (a load-test sketch follows this list)
        •    Get an idea of what the threshold is
        •    Collect the data
        •    Determine what that data means
        •    Create a profile from the metrics
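
A minimal sketch of the threshold-finding idea behind these notes, using only the Python standard library; the target URL, concurrency levels, and request counts are placeholders, not measured numbers:

    # Sketch: ramp up concurrent requests and watch where latency degrades.
    # TARGET and the load levels below are placeholders for illustration.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TARGET = "http://example.com/"

    def timed_request(url):
        start = time.time()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        return time.time() - start

    for workers in (1, 5, 10, 25, 50):
        with ThreadPoolExecutor(max_workers=workers) as pool:
            latencies = sorted(pool.map(timed_request, [TARGET] * workers * 4))
        median = latencies[len(latencies) // 2]
        print(f"{workers:>3} concurrent: median {median:.3f}s, max {latencies[-1]:.3f}s")
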
Individual project teams would use these tools
        •    MIT community, not just IS&T
        •    Every project should be involved in stress testing
Choose possible tool candidates based on:
        •    Ease of use
        •    Support model
Demo tools for as many people as possible
        •    Get feedback from staff
        •    Pending feedback, move forward with a purchase

September 2008

Met with Wendy Bastos in SAIS to get an overview of the work her group is doing and some pointers on how we could structure quality assurance for ISDA. Here are the highlights thus far:

    * QA resource should be part of requirements gathering
          o this helps with building test plans and strategy while the design is being developed
    * always define what you want your deliverables to be (especially with external QA vendors)
          o agreed-upon test plan
          o agreed-upon test cases (a sample test case sketch follows this list)
    * always do follow-up with QA vendors on deliverables
          o track tasks in Jira, etc.
    * QA "good enough" standards should be part of the requirements
    * always smart to visit QA vendor facilities to make sure they provide a good environment, equipment, support, etc.
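
A minimal sketch of what an agreed-upon test case might look like once it is captured as an automated check; the application URL and the behavior being checked are hypothetical examples, and real cases would come out of the requirements-gathering step noted above:

    # Sketch: one agreed-upon test case captured as an automated check.
    # The base URL and the login-page behavior are hypothetical examples.
    import unittest
    import urllib.request

    class LoginPageTests(unittest.TestCase):
        BASE_URL = "http://example.com"    # hypothetical application URL

        def test_login_page_loads(self):
            """Agreed-upon case: the login page responds successfully."""
            with urllib.request.urlopen(self.BASE_URL + "/login") as response:
                self.assertEqual(response.status, 200)

    if __name__ == "__main__":
        unittest.main()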

There are also several helpful documents/templates on the Quality Management wiki space: https://wikis.mit.edu/confluence/display/QA/Home
