ISTMeasuresInventory

IS&T has been reporting measures in its quarterly reports for years, though perhaps not as systematically as one would like. Recently, each of the IS&T Directorates has been asked to develop a list of measures that would be meaningful in guiding its business and that could be collected systematically for review at different levels of management.

To support this new attention to measures, we have developed an ISTMeasuresArchitecture that defines our measures and their meaning and aids in tracking them over time. In addition, we have inventoried the measures that have emerged in past quarterly reports and that seem to be candidates for tracking over time.

Some of these measures are also meaningful to our client community and can be reported on a page of ISTPublicFacingMeasures, under development in 2006. (Stanford University is well along in the development and use of its own public-facing measures to prioritize its process improvement efforts; see http://www.stanford.edu/services/itmetrics/ .) To sustain this project at MIT, we will need to develop more transparent business processes for knowing who has the raw data and how to obtain it reliably and repeatably. Ideally these processes would require only a minimum of incremental effort.

Here is the roster of measures found in Quarterly Reports to date:

Academic Comp

Derived from the Q1 and Q2 reports of FY2006.

[Machine Deployments]

  1. Public Cluster machines deployed
  2. Residential Cluster machines deployed
  3. Lecture Hall machines deployed
  4. Electronic Classrooms machines deployed
  5. DLC cluster machines deployed

[Applications Usage]

  1. OpenOffice / StarOffice
  2. Matlab
  3. Mathematica
  4. Acrobat
  5. Xess
  6. Stata
  7. Student license server Matlab

[Community Outreach]

  1. IAP classes offered by AcadComp staff
  2. N of laptops loaned out
  3. N of courses for which laptops were loaned out

Admin Comp

Derived from the Q1 and Q2 reports of FY2006.

  1. N of SAP Production System Interactive Dialog Steps
  2. SAP Production System Average Interactive Response Time
  3. SAP Production System Primetime availability
  4. SAP Production System Planned Service Outages (N and hrs)
  5. SAP Production System Unplanned Outages (N and hrs)
  6. Total number of SAP transports
  7. N of Unique SAP (gui + web) Users [New in Q2 2006]
  8. N of Unique SAP GUI Users [New in Q2 2006]
  9. N of Unique SAP Logon Sessions [New in Q2 2006]
  10. Web hits to "Classic" sapweb pages for a series of functions (about 4) [modified for Q2 2006]
  11. Web hits to ITS (new) sapweb pages for a series of functions (about 6) [modified for Q2 2006]

CSS

  1. Client satisfaction with the Helpdesk (7 variables, plus overall % satisfied).
  2. Consultant availability and performance in the ACD system. (7 variables).
  3. Cases created in CSS categories in Casetracker (28 categories at the moment).
  4. Topics underlying Consult cases in pertinent Helpdesk categories.
  5. Topics underlying cases in the Athena OLC/RCC cases.
  6. Cases generated in each of 9 demographic cohorts (admin, faculty, other academic, student, etc.)
  7. Copyright cases and Total cases in the StopIt category of RT.
  8. Proportions of web hits by browser brand, by operating system, by platform brand.
  9. Counts of individuals served in Helpdesk consulting cases, per cohort (see above).
  10. Count of apparently repeat customers by demographic cohort.
  11. Hits to official self-help repositories.
  12. Training classes offered, seats filled, plus client satisfaction with classes.

OIS

Reading my way through the Q1 FY06 Quarterly Report (http://web.mit.edu/ist/about/reports.html) produced this set.

[TNIS]

  1. jack installations
  2. mitnet activations
  3. billable hours
  4. turnaround time
  5. server availability (nagios monitored)
  6. mitvma/c availability

[DW]

  1. increase in registered users
  2. increase in records stored
  3. increase in file size
  4. project completions
  5. projects open
  6. n of client DLCs

[NIST]

  1. hits to win.mit.edu and www.mit.edu
  2. Unique win.mit.edu users per day
  3. Unique wireless clients per day
  4. Unique logins per day on Wireless
  5. techtime unique logins per day
  6. alum.mit.edu messages per day
  7. citrix unique logins per day
  8. vpn unique logins per day
  9. mit mailboxes using spamscreen
  10. mit mailboxes using auto-purge
  11. mit mail registered as spam
  12. mit mail with viruses
  13. Software downloads from NIST servers
  14. Matlab downloads
  15. Matlab launches against license server
  16. Matlab sessions denied
  17. Matlab concurrent users

[DOST]

  1. N of collocation servers
  2. Listserv lists
  3. Listserv subscribers
  4. TSM nodes registered
  5. TSM data backed up
  6. TSM restores attempted
  7. TSM restores succeeded

SSIT

In the FY2006 Q2 report, some numerical measures appeared that seem to be candidates for tracking.

  1. Support cases open/closed in RequestTracker queues "in support of the infrastructure".

Another section tracks performance within the MITSIS system:

[MITSIS]

  1. Interactive Logins
  2. Batch processes
  3. Print requests

[J2EE Application Metrics]

  1. Web page hits to 6 different applications: Undergraduate Admissions, Degree Tracking & Test Scores, UROP, Association of Student Activities, Web Grad Aid & SISTIM, and Academic (advisory email, exploratory subject download, reg form, reg control list)

[WebSIS]

  1. Page hits to WebSIS.

TSS

[Telephony] Allison Dolan reported in a Dec 14, 2005 email that she regularly reports these telephony statistics. They don't currently appear in the quarterly report.

  1. # of lines (analog, isdn, compare to budget)
  2. # / cost of Long Distance and International (interesting stat: > 90% of LD/international is less than 1 per call); compare to budget
  3. # call to operators; # handled by NameConnector
  4. # moves/adds/changes; billable MACs; compare to budget
  5. # telephone helpdesk/repair (response average has been under 1 day, so we haven't reported time to repair).
  6. Uptime has consistently been > 99.999%, so that has not been reported on a quarterly basis.

[Staffing Metrics] These statistics actually appear in a table in the FY06 Q1 report.

  1. # positions posted
  2. # positions closed
  3. # closed positions that are incrementally new
  4. # open/posted positions at close of quarter (a pending queue of sorts)
  5. average days between posting and closing
  6. average days between initial discussion and closing date for posted positions
  7. # of new-to-MIT hires (based on hire date)
  8. # MIT transfers
  9. # IS&T transfers
  10. % women new-to-MIT hires
  11. % minorities new-to-MIT hires
  12. headcount at qtr end
  13. # departures (based on IS&T termination date)
  14. annualized attrition
  15. # of departures that were MIT transfers
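The report does not spell out how "annualized attrition" (item 14 above) is computed. A minimal sketch, assuming the common convention of scaling the quarterly departure rate against average headcount up to a full year — the actual formula used by TSS may differ:

```python
def annualized_attrition(departures_in_quarter: int, avg_headcount: float) -> float:
    """Estimate annualized attrition from one quarter's data.

    Assumes simple linear annualization (quarterly rate x 4);
    the function name and formula are illustrative, not taken
    from the quarterly report.
    """
    quarterly_rate = departures_in_quarter / avg_headcount
    return quarterly_rate * 4  # four quarters per year

# Hypothetical example: 3 departures in a quarter, headcount ~280
print(f"{annualized_attrition(3, 280):.2%}")
```

A compounded variant, `1 - (1 - quarterly_rate) ** 4`, gives a slightly lower figure; which convention applies here is not stated in the source.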