...

  1. The measures that we have on that page that, as they stand, actually look like performance measures to Jerry include:
    1. Timeliness of Helpdesk Consulting (#1)
    2. Uptime of IS&T Servers (#7) [but state our goal in text, since the goal line is shown in the graph] [done] -- Jerry'd like to see the count of servers that contribute to this number; that much uptime is harder for a large pool of servers than for just one (see the uptime sketch after this outline)
    3. Timeliness of CD Distributions (#4) (but he wonders if it's important enough -- how much volume do we still do in CDs?  Rob would say that several key distributions are only done by CD and those customers would care)
  2. Capability Measures that could be performance measures, since they have stated goals
    1. Collocation utilization, if we have a marketing program and our goal is to increase utilization by x% per year
      (Rob gets the impression Jerry doesn't think that the condition is met; Joanne thinks this too.  Rob thinks that offering the service at all has an implicit assumption that someone, call them early adopters, will want to use it -- but maybe we can't call that a quantifiable goal in a PM sense.)
    2. TSM utilization, if we have a marketing program and our goal is to increase utilization by x% per year
      (Same concern as for collocation above.  I'd say we offer TSM as the Recommended product in this line of business -- that's our marketing.  80-100% of market share must be our goal, or we'd be recommending something else, like external hard drives or CD-RW.)
    3. Copyright Infringement (#10), looks like a volume measure but Rob said it had a goal of reduction compared to the same time a year before.  Rob would argue that if you're going to have an awareness campaign about infringement, as we do, then we need to set a goal of "some sort of response to the campaign", i.e., make the trend start to go down.
  3. Good Capability Measures to keep as is
    1. Spam email # (add % of total email volume) (#2)
    2. Virus email # (add % of total email volume) (#3)
    3. Web Self-Service software distributions (#5)
    4. Collocation (#6), if we don't establish it as a performance measure (see above)
    5. TSM Utilization (#8), if we don't establish it as a performance measure (see above)
    6. Network Security incidents (#9)
    7. Copyright infringement (#10), if we don't accept "establish a downward trend" as a legitimate goal (see above)
    8. MIT Mailboxes using SpamScreen Auto-filtering (#11)
    9. Wireless Clients, unique users per day (#12)
    10. On-campus Utilization of IS&T Self-Help web pages (#13)
  4. IDEAS for performance measures not reported yet
    1. The degree of shift in software distribution from CD to Web download.  Count of product lines offered in each, ratio of CD to Web.  Or count of licenses distributed -- this is a little dodgy because a single "order" in a CD ticket could be multiple CDs and licenses, while the N of Web downloads seems to include a lot of repeated downloads per license (given that you have no CD to draw upon if you need to do a reinstall, say).  See the ratio sketch after this outline.
      We know we'll have a shift in the ratio for FY07 as Filemaker becomes available over download (with CD/DVD manufacturable if you really need one). 
      Reflects Theresa's very clear goal of encouraging 24x7 access worldwide to software when you need it.
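
Uptime sketch for item 1.2: a minimal illustration, assuming (hypothetically) that each server in the pool fails independently with the same availability, of why a given uptime number is harder to hit for a pool of servers than for one machine.  The helper name and the numbers are illustrative only.

    def pool_all_up_availability(per_server: float, n_servers: int) -> float:
        """Probability that every server in the pool is up at the same time."""
        return per_server ** n_servers

    # One server at 99.9% uptime meets a 99.9% goal by definition.
    print(pool_all_up_availability(0.999, 1))    # 0.999
    # A pool of 100 such servers is all-up only about 90.5% of the time,
    # which is why the count of contributing servers matters.
    print(pool_all_up_availability(0.999, 100))  # ~0.905

(If the published number is instead the average uptime across servers, pool size doesn't change it -- another reason to state how the measure is computed.)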
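
Ratio sketch for item 4.1: a minimal sketch of the proposed CD-to-Web shift measure.  The counts and the normalization caveat here are hypothetical placeholders, not real distribution data.

    def cd_to_web_ratio(cd_count: int, web_count: int) -> float:
        """Ratio of CD distributions to Web downloads over a reporting period."""
        if web_count == 0:
            raise ValueError("no Web downloads recorded for this period")
        return cd_count / web_count

    # Product lines offered per channel is the cleaner form of the measure:
    print(cd_to_web_ratio(cd_count=12, web_count=48))  # 0.25

    # License counts would need normalizing first: one CD "order" can bundle
    # several CDs and licenses, and one license can generate many repeat Web
    # downloads, so raw counts overstate different things in each channel.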

...