
FY10 Q2

Oliver Thomas's email reply to Robert Smyser's email listed below (dated 1/21/2010)

Thanks Rob! Taking the summary sentence about Quickstations out is fine.

Cheers,

Oliver


Oliver Thomas
Information Services and Technology
Massachusetts Institute of Technology

+1 617 253 9682
+1 617 835 9682 /m
othomas@mit.edu

On Jan 21, 2010, at 10:16 AM, Robert W Smyser wrote:

Robert Smyser’s email (dated 1/21/2010) to Oliver Thomas:
Athena updates to RE: CSS Q2 Report Draft
Hi folks,
I’ve updated the tables and prose in the attached metrics section of the report.
• As Oliver observed, the draft's workstation numbers didn't reflect Jon Reed's 1/4/2010 update, so I'm glad Oliver caught that one. With Oliver's change, Personal/DLC workstations drop to 76% of the total workstation installations, still a big ratio.
• The Quickstation change would say that logins there are in proportion to their % deployment, but by the same token, there are seven times as many student computing lab machines accounting for only twice as many logins (see the rough arithmetic after this list). Is this explained by the different session lengths expected of the high-turnover Quickstations vs. sit-and-stay lab machines? We probably need deeper analysis of session length to see if there's anything interesting to say. Given that, perhaps we should take the Quickstation sentence out (but leave the dialup sentence in).
• "Student Computing Labs" is substituted for "Cluster" in the tables and text.
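
For context, here's a rough back-of-the-envelope on the login proportions above. It is a sketch only: the 43 Quickstations, the 14% login share, and the 12% share of public machines come from the draft and Oliver's note, while the lab-machine count and lab login share are derived from those figures rather than reported.

    # Rough arithmetic on Quickstation vs. student-lab logins, using figures quoted
    # in this thread. The lab-machine count is derived, not a reported number.
    quickstations = 43                  # high-turnover Quickstations (from the draft)
    quickstation_login_share = 0.14     # share of total Athena logins (from the draft)
    quickstation_machine_share = 0.12   # share of publicly deployed machines (Oliver's note)

    public_machines = quickstations / quickstation_machine_share   # ~358 public machines
    lab_machines = public_machines - quickstations                  # ~315, roughly 7x the Quickstations
    lab_login_share = 2 * quickstation_login_share                  # "only twice as many logins" -> ~0.28

    # Logins per machine: Quickstations see roughly 3.7x the logins of a lab machine,
    # which is consistent with shorter, high-turnover sessions.
    print((quickstation_login_share / quickstations) / (lab_login_share / lab_machines))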

The tables needed me to repaste them as bitmaps from the Excel original, and they probably won’t change again in this report. Additional changes to the text can be done by Pat / Elaine up to the moment the report goes in.
Rob

From: Oliver Thomas
Sent: Wednesday, January 20, 2010 5:59 PM
To: Robert W Smyser
Cc: Oliver Thomas; CSS Managers <css-managers@mit.edu>; Elaine Elizabeth Aufiero; Patricia Sheppard
Subject: Re: CSS Q2 Report Draft

Hi Rob,

A few quick corrections to the Athena and printing metrics:

  • In the summary, the total number of Athena workstations is 1472 (not 2434). In the table, the number of Personal/DLC-owned Workstations is 1113. Jon had sent a correction back in early January that didn't make it into the final version.
  • Can we take out the "A relatively large" in "A relatively large 14% of total Athena logins..."? They're actually 12% of all of our publicly deployed machines (cluster + quickstations), so 14% of logins on them is pretty normal (the arithmetic is sketched after this list), especially since they encourage higher throughput of users.
  • Can we replace "Cluster Workstations" in the table with "Student Computing Labs"? It seems like we're moving to that more generic terminology in the president's report, so might as well keep it consistent. The fact that they're Athena workstations is covered by the column heading.
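
For reference, a quick check of the percentages in the bullets above. The public-machine count is derived by subtracting Personal/DLC machines from the corrected total, on the assumption that the remainder is the public pool (labs plus Quickstations):

    # Quick check of the corrected percentages, using the numbers in Oliver's bullets.
    total_workstations = 1472          # corrected total Athena workstations
    personal_dlc = 1113                # Personal/DLC-owned workstations
    quickstations = 43                 # from the draft's Quickstation sentence

    print(personal_dlc / total_workstations)          # ~0.756 -> the 76% figure
    public_pool = total_workstations - personal_dlc   # labs + Quickstations, ~359 (derived)
    print(quickstations / public_pool)                # ~0.12  -> Quickstations are ~12% of public machines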

It looks like the utilization data didn't make it into the QR, which is okay, but Jon asked me to follow up to see whether it was a formatting issue (no graphs allowed) or there was some other reason.

Thanks!

Oliver


Oliver Thomas
Information Services and Technology
Massachusetts Institute of Technology

+1 617 253 9682
+1 617 835 9682 /m
othomas@mit.edu

On Jan 12, 2010, at 2:25 PM, Patricia Sheppard wrote:

Hi all,

As discussed in our meeting earlier today, we have some time to review and revise our Q2 report before submitting it to Marilyn’s office. Attached is the first draft to work from. The narrative has been heavily edited to accommodate space restrictions, and the teams have been pulled out, with accomplishments listed by service. Also, new metrics are being piloted, and we will need to do some editing of either the analysis or the particular metrics we report to get it down to two pages.

Ideally, we would like to have your feedback by this Friday (Monday at the latest). Please note what you think is missing in your feedback to Elaine, Rob, and me - we want to make sure we are representing all the great work that is happening across CSS and IS&T. Thanks so much!

Patricia Sheppard
Business Manager
Client Support Services
Infrastructure Software Development and Architecture
MIT Information Services & Technology
617-253-3179
http://ist.mit.edu
<CSS FY2010 Q2 Draft.docx>

Summary of Strategic Metrics
CSS maintained high levels of call center performance in Q2, with Client Satisfaction only slightly below its 4.5 goal, and Abandon Rate well below its goal of 10%.
Repair Center and Telephone Help satisfaction scores show steady improvement over the year and are now above target for the last two quarters. Telephone Analog and ISDN orders are high, driven by requests to remove such phones. Telephone Help kept its recent gain in % of tickets resolved in Tier 1, up 15% this year.
Ticket Topics are assigned by a largely automated process using keyword matching and hinting. Adding User Accounts to the pool for the first time greatly increased volume in the Accounts topic and contributed to the 35% increase in Email tickets, otherwise caused by Exchange-related tickets created in the Call Center. The Backup topic was impacted by the annual cleanup of mostly inactive TSM accounts. Two other lower-volume topics – Security and Services – showed large % increases related to phishing and malware, and increased use of services like wikis, respectively. Business tickets are down by nearly half compared to a year ago, and Hardware tickets are down by 22%. In those areas, newer hardware and easier-to-use software have a positive impact on our clients’ computing environment.
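
As a rough illustration of the keyword matching and hinting described above, here is a minimal sketch. The topic names echo the report, but the keyword lists and the hint rule are hypothetical stand-ins, not the Service Desk's actual production logic:

    # Minimal sketch of keyword-based ticket topic assignment with hinting.
    # Keyword lists and the hint rule are hypothetical examples only.
    TOPIC_KEYWORDS = {
        "Accounts": ["kerberos", "password", "user account"],
        "Email":    ["exchange", "outlook", "imap"],
        "Backup":   ["tsm", "backup", "restore"],
        "Security": ["phishing", "malware", "virus"],
    }

    def assign_topic(ticket_text, hint=None):
        """Return a topic for a ticket, preferring an explicit hint over keyword matches."""
        if hint in TOPIC_KEYWORDS:          # a hint (e.g. chosen on an intake form) wins outright
            return hint
        text = ticket_text.lower()
        for topic, keywords in TOPIC_KEYWORDS.items():
            if any(kw in text for kw in keywords):
                return topic
        return "Other"

    print(assign_topic("Can't log into Outlook after the Exchange migration"))   # -> Email
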
As a consequence of adding User Accounts, Service Desk ticket volume nearly doubled over last year. Over 8300 total tickets, or 126 tickets per day, were created by more than 4700 unique IDs in a roughly 3:2 ratio of employees to students.

Training offered 14% more classes and attracted 8% more attendees than a year ago. A third of the training taken in Q2 was about Exchange. Attendees are pleased with their training, as client satisfaction scores remain high.

Accessibility delivered 39% more Adaptive Technology consultations to the community, and showed the same increase in student users of the 24x7 lab. Software Usability consultations nearly doubled over last year as well.

DDM’s Preventative Maintenance visits and Desktop Deployments have surged over last quarter even as DDM looks to revamp the program later in the year. Declines in MDS metrics across the board are consistent with recent DLC belt-tightening. DCAD shows sustained gains in the SLA business, while new projects are down compared to a year ago but rising steadily in the last three cycles. The DS PLUS reporting is new this quarter and shows a 42% increase over last year in activities related to supporting its 72 client individuals.

Athena and Printing metrics are new this quarter. Athena Workstations by Type reveals that 76% of the 1472 known Athena installations are not in our public Student Computing Labs but are privately owned or in DLCs. Our 8 dialup servers account for 28% of total logins, likely from Windows and Macintosh machines running terminal software and an X client. The 27 public Athena printers seem heavily used at over 91000 pages per quarter altogether, but 55% of the pages printed were in paper-saving duplex mode.

Finally, DMCA notices were in a year-long decline until Q2, now up 100% over Q1 and 57% over Q2 of FY2009.

_________________________________

Steve Winig's email note regarding the Metrics contribution that was submitted by Rob Smyser.

Certainly in my area, both comparisons are relevant and potentially provide different information.

Out of curiosity, what happened to the metrics graphs from earlier reports? I felt that the graphs conveyed a lot of information in a relatively small amount of space. In fact, AUX has been talking with Rob about not only creating some graphs for our operational work but also annotating peaks and valleys in the graphs to provide additional information without using any more real estate.

Based on some of the examples in the report, which are very operationally focused, I'm wondering if Accessibility should be included as a service. Additionally, does it make sense to include ATIC accomplishments under Athena (since ATIC is the Athena lab for students with disabilities)?

-Steve

________________________________________
From: Chris Lavallee <lavallch@MIT.EDU>
Sent: Wednesday, January 13, 2010 9:47 AM
To: Patricia Sheppard; CSS Managers
Cc: Elaine Elizabeth Aufiero; Robert W Smyser
Subject: Re: CSS Q2 Report Draft
Hi all-
I had a discussion yesterday with Rob re: the metrics tables and analysis and would like your take on this idea.

Marilyn wants a comparison of this Q to the same Q last FY. However, there are also interesting and important trends from Q to Q within the same FY. Some of these are already being pointed out in the analysis, at least in the DS one. This is a bit confusing, since the % change column relates to the comparison with the same Q last FY, not with last Q.

I would like to propose adding another column for % change from last Q. This will allow us to show and discuss trends both against last FY and from Q to Q within the same FY without confusing the readers.
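
To make the proposal concrete, a row of such a table might look like this (purely illustrative layout and numbers, not figures from the report):

    Metric               FY09 Q2    FY10 Q1    FY10 Q2    % chg vs FY09 Q2    % chg vs FY10 Q1
    (example measure)        100        120        132                +32%                +10%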

Thoughts?
– Chris

-------------

Summary of Strategic Metrics (submitted by Robert Smyser):

Metrics in this report are presented in a more tabular form, showing five quarters of readings (where available) plus % change over last year. CSS’s Metrics Refresh effort has led to some new metrics in this quarter, with more to come in Q3. These metrics and more are available in the Measures Roster section of this wiki page: https://wikis.mit.edu/confluence/display/CSS/Metrics+Clearinghouse.

Looking at highlights from the table, the Service Desk maintained high levels of call center performance in Q2, with Abandon Rate well below the 10% goal and Wait to Answer time the same as a year ago and better than Q1. Agent availability had its usual seasonal decline (as student staff are affected by the academic calendar), but it fell to an 11% lower level than last year. Its impact on client satisfaction seems muted, as Q2 Overall Satisfaction of 4.43 came in just under the goal level of 4.5 and is only 3% lower than a year ago. The Overall Satisfaction score is loosely coupled to the timeliness dimension, which sagged in October when the Service Desk was at its busiest. Repair Center and Telephone Help satisfaction scores show steady improvement over the year and are now above target for the last two quarters. The increase in Telephone Analog and ISDN orders comes from requests for removal of those phones. Telephone Help kept its recent gain in % of tickets resolved in Tier 1, up 15% this year.

As a consequence of adding User Accounts to the Service Desk, ticket volume nearly doubled over last year. Over 8300 total tickets, or 126 tickets per day, were created by more than 4700 unique IDs in a roughly 3:2 ratio of employees to students. Ticket Topics are assigned by a largely automated process using keyword matching and hinting. Adding User Accounts to the pool for the first time greatly increased volume in the Accounts topic and contributed to the 35% increase in Email tickets, otherwise caused by Exchange-related tickets created in the Call Center. The Backup topic was impacted by the annual cleanup of mostly inactive TSM accounts. Two other lower-volume topics – Security and Services – showed large % increases related to phishing and malware, and increased use of services like wikis, respectively. Business tickets are down by nearly half compared to a year ago, and Hardware tickets are down by 22%. In those areas, newer hardware and easier-to-use software have a positive impact on our clients’ computing environment.

Training offered 14% more classes and attracted 8% more attendees than a year ago. Those that attend training are pleased with it, as client satisfaction scores remain high at 4.8 out of 5. A third of the training taken in Q2 was for Exchange. The Accessibility team delivered 39% more Adaptive Technology consultations to the community, and showed the same increase in student users of the 24x7 lab. Software Usability consultations nearly doubled over last year as well.

DDM’s Preventative Maintenance visits and Desktop Deployments have surged over last quarter even as DDM looks to revamp the program later in the year. Declines in MDS metrics across the board are consistent with recent DLC belt-tightening. DCAD shows sustained gains in the SLA business, while new projects are down compared to a year ago but rising steadily in the last three cycles. The DS PLUS reporting is new this quarter and shows a 42% increase over last year in activities related to supporting its 72 client individuals.

Athena and Printing metrics are new this quarter. Athena Workstations by Type reveals that 85% of the 2434 known Athena installations are not in our clusters but are privately owned or in DLCs. A relatively large 14% of total Athena logins occur on the 43 high-turnover Quickstations, while just 8 dialup servers account for 28% of total logins, likely from Windows and Macintosh machines running terminal software and an X client. The 27 public Athena printers seem heavily used at over 91000 pages per quarter altogether, but 55% of the pages printed were in paper-saving duplex mode.

Finally, DMCA notices were in a year-long decline until Q2, up 100% over Q1 and 57% over Q2 of FY2009.

For the Metrics Table, please see pages 2 and 3 of the attached document: https://wikis.mit.edu/confluence/download/attachments/58241539/CSS+FY2010+Q2+Report+draft+measures+portion.doc

Robert Smyser’s email to Patricia Sheppard and Elaine Aufiero (dated 1/7/2010)
SUBJECT: draft version of CSS FY2010 Q2 Report draft measures portion.doc
Pat and Elaine,
Attached you should find my pass at the metrics section of the report. It doesn’t quite fit into two pages, but I want to collaborate on how to change it or what to leave out in order to make it fit. I’ll be seeing you tomorrow at noon if not before. I do have a physical at 9:30 tomorrow, so I can’t be in until late morning.
Rob
Summary of Strategic Metrics
Metrics in this report are presented in a more tabular form, showing five quarters of readings (where available) plus % change over last year. CSS’s Metrics Refresh effort has led to some new metrics in this quarter, with more to come in Q3. These metrics and more are available in the Measures Roster section of this wiki page: https://wikis.mit.edu/confluence/display/CSS/Metrics+Clearinghouse.
Looking at highlights from the table, the Service Desk maintained high levels of call center performance in Q2, with Abandon Rate well below the 10% goal and Wait to Answer time the same as a year ago and better than Q1. Agent availability had its usual seasonal decline (as student staff are affected by the academic calendar), but it fell to an 11% lower level than last year. Its impact on client satisfaction seems muted, as Q2 Overall Satisfaction of 4.43 came in just under the goal level of 4.5 and is only 3% lower than a year ago. The Overall Satisfaction score is loosely coupled to the timeliness dimension, which sagged in October when the Service Desk was at its busiest. Repair Center and Telephone Help satisfaction scores show steady improvement over the year and are now above target for the last two quarters. The increase in Telephone Analog and ISDN orders comes from requests for removal of those phones. Telephone Help kept its recent gain in % of tickets resolved in Tier 1, up 15% this year.
As a consequence of adding User Accounts to the Service Desk, ticket volume nearly doubled over last year. Over 8300 total tickets, or 126 tickets per day, were created by more than 4700 unique IDs in a roughly 3:2 ratio of employees to students. Ticket Topics are assigned by a largely automated process using keyword matching and hinting. Adding User Accounts to the pool for the first time greatly increased volume in the Accounts topic and contributed to the 35% increase in Email tickets, otherwise caused by Exchange-related tickets created in the Call Center. The Backup topic was impacted by the annual cleanup of mostly inactive TSM accounts. Two other lower-volume topics – Security and Services – showed large % increases related to phishing and malware, and increased use of services like wikis, respectively. Business tickets are down by nearly half compared to a year ago, and Hardware tickets are down by 22%. In those areas, newer hardware and easier-to-use software have a positive impact on our clients’ computing environment.
Training offered 14% more classes and attracted 8% more attendees than a year ago. Those that attend training are pleased with it, as client satisfaction scores remain high at 4.8 out of 5. A third of the training taken in Q2 was for Exchange. The Accessibility team delivered 39% more Adaptive Technology consultations to the community, and showed the same increase in student users of the 24x7 lab. Software Usability consultations nearly doubled over last year as well.
DDM’s Preventative Maintenance visits and Desktop Deployments have surged over last quarter even as DDM looks to revamp the program later in the year. Declines in MDS metrics across the board are consistent with recent DLC belt-tightening. DCAD shows sustained gains in the SLA business, while new projects are down compared to a year ago but rising steadily in the last three cycles. The DS PLUS reporting is new this quarter and shows a 42% increase over last year in activities related to supporting its 72 client individuals.
Athena and Printing metrics are new this quarter. Athena Workstations by Type reveals that 85% of the 2434 known Athena installations are not in our clusters but are privately owned or in DLCs. A relatively large 14% of total Athena logins occur on the 43 high-turnover Quickstations, while just 8 dialup servers account for 28% of total logins, likely from Windows and Macintosh machines running terminal software and an X client. The 27 public Athena printers seem heavily used at over 91000 pages per quarter altogether, but 55% of the pages printed were in paper-saving duplex mode.
Finally, DMCA notices were in a year-long decline until Q2, up 100% over Q1 and 57% over Q2 of FY2009.

_________________________________________________________________________________________________

FY10 Q1

Need to input this information.
