...
- The measures on that page that already look like performance measures to Jerry, as they stand, include:
- Timeliness of Helpdesk Consulting. (#1)
- Uptime of IS&T Servers (#7) [but state our goal in text, since the goal line is shown in the graph] [done] -- Jerry would like to see the count of servers that contribute to this number (that much uptime is harder for a large pool of servers than for just one)
- Timeliness of CD Distributions (#4) (but he wonders if it's important enough -- how much volume do we still do in CDs? Rob would say that several key distributions are only done by CD and those customers would care)
- Capability Measures that could be performance measures, since they have stated goals
- Collocation utilization, if we have a marketing program and our goal is to increase utilization by x% per year. (Rob gets the impression Jerry doesn't think that the condition is met; Joanne thinks this too. Rob thinks that offering the service at all carries an implicit assumption that someone, call them early adopters, will want to use it -- but maybe we can't call that a quantifiable goal in a PM sense.)
- TSM utilization, if we have a marketing program and our goal is to increase utilization by x% per year. (Same reservation as for Collocation above. I'd say we offer TSM as the Recommended product in this line of business -- that's our marketing. 80-100% of market share must be our goal, or we'd be recommending something else, like external hard drives or CD-RW.)
- Copyright Infringement (#10): looks like a volume measure, but Rob said it has a goal of reduction compared to the same time a year before. Rob would argue that if you're going to have an awareness campaign about infringement, as we do, then we need to set a goal of "some sort of response to the campaign", i.e., making the trend start to go down.
- Good Capability Measures to keep as is
- Spam email # (add % of total email volume) (#2)
- Virus email # (add % of total email volume) (#3)
- Web Self-Service software distributions. (#5)
- Collocation (#6), if we don't establish it as a performance measure (see above)
- TSM Utilization (#8), if we don't establish it as a performance measure (see above)
- Network Security incidents (#9)
- Copyright infringement (#10), if we don't accept "establish a downward trend" as a legitimate goal (see above)
- MIT Mailboxes using SpamScreen Auto-filtering. (#11).
- Wireless Clients, unique users per day (#12)
- On-campus Utilization of IS&T Self-Help web pages (#13)
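The "add % of total email volume" suggestion for the spam (#2) and virus (#3) counts above amounts to a simple share-of-total calculation. A minimal sketch, with made-up figures and a hypothetical helper name (not real IS&T reporting code):

```python
# Hypothetical sketch: report spam/virus counts as a share of total
# email volume, as suggested for measures #2 and #3. All numbers here
# are illustrative, not actual mail statistics.

def as_percent_of_total(count, total_volume):
    """Return count as a percentage of total email volume."""
    if total_volume == 0:
        return 0.0
    return 100.0 * count / total_volume

# Illustrative monthly figures:
total_email = 2_000_000
spam = 1_200_000
virus = 40_000

print(f"Spam:  {as_percent_of_total(spam, total_email):.1f}% of total")   # 60.0%
print(f"Virus: {as_percent_of_total(virus, total_email):.1f}% of total")  # 2.0%
```

Reporting the percentage alongside the raw count makes the measure robust to overall growth in mail volume.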
- IDEAS for performance measures not reported yet
- The degree of shift in software distribution from CD to Web download. Count of product lines offered in each, ratio of CD to Web. Or count of licenses distributed -- this is a little dodgy because a single "order" in a CD ticket could be multiple CDs and licenses, while the N of Web downloads seems to include a lot of repeated downloads per license (given that you have no CD to draw upon if you need to do a reinstall, say).
We know we'll have a shift in the ratio for FY07 as Filemaker becomes available over download (with CD/DVD manufacturable if you really need one).
Reflects Theresa's very clear goal of encouraging 24x7 access worldwide to software when you need it.
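The CD-to-Web shift above could be tracked as the Web share of total distributions. A rough sketch under the stated caveats (one CD "order" may cover several licenses, and Web counts include repeat downloads); the function name and figures are hypothetical:

```python
# Hypothetical sketch of the CD-to-Web distribution-shift measure.
# Sample figures are illustrative, not real distribution data.

def web_share(cd_orders, web_downloads):
    """Return the Web fraction of total distributions.

    Caveat from the notes: CD orders and Web downloads are not
    like-for-like units, so this is only a rough trend indicator.
    """
    total = cd_orders + web_downloads
    if total == 0:
        return 0.0
    return web_downloads / total

# Illustrative quarter-over-quarter comparison:
q1 = web_share(cd_orders=400, web_downloads=600)   # 0.6
q2 = web_share(cd_orders=300, web_downloads=900)   # 0.75
assert q2 > q1  # the goal would be an increasing Web share
```

A rising value would reflect Theresa's goal of 24x7 worldwide access; the FY07 FileMaker move to download should show up as a step in this series.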