Last week I was speaking with a Director of Operations in charge of a Help Desk Customer Service Center. This director was interested in metrics he should be using for his operation. One issue that he mentioned at the beginning of the conversation was that he did not want typical call center statistics.
Our conversation explored his customer service center and his goals for it. The differences between in-house and outsourced customer service centers are interesting: I have found that in-house centers focus on the number of tickets opened and closed, while outsourced centers fixate on SLAs. Given their respective reporting responsibilities, that makes sense.
For this outsourced service center I suggested the following metrics to report and track. They are not listed in any particular order of priority.
Total Number of Tickets Logged per Product – real time and for the day
Total Time Worked per Ticket and per Product
Average Number of Tickets per Product
Average Time Worked per Ticket per Product
Tickets Opened per Product – real time and for the day (this differs from Tickets Logged)
Service Level – real time and for the day, per Product / All Products / per Group
Tickets Closed per Product – real time / for the day
Average Handle Time per Ticket per Product (Can be a misleading metric)
Max Handle Time per Ticket per Product
Max Handle Time all Tickets / Products
First Call Resolution per Product
Tickets resolved in X hours since Logged per Product
Tickets resolved in 8 hours since Logged per Product
Tickets resolved in 24 hours since Logged per Product
Tickets unresolved for the day / week per Product
Tickets escalated per Product for the day
Customer Satisfaction Rate per Product / Total
Ticket Closure Ratio per Product / Total
Average Time to Close vs. SLA
Average Response Time per Product
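To make a few of these metrics concrete, here is a minimal sketch of how they could be computed from raw ticket records. The record fields (product, opened, resolved, handle_secs, escalated) and the sample data are my own assumptions for illustration, not any particular ticketing system's schema:

```python
from datetime import datetime, timedelta
from collections import defaultdict

# Hypothetical ticket records; field names and values are assumptions for illustration.
tickets = [
    {"product": "A", "opened": datetime(2011, 5, 2, 9, 0),
     "resolved": datetime(2011, 5, 2, 11, 30), "handle_secs": 900, "escalated": False},
    {"product": "A", "opened": datetime(2011, 5, 2, 10, 0),
     "resolved": None, "handle_secs": 600, "escalated": True},
    {"product": "B", "opened": datetime(2011, 5, 2, 9, 30),
     "resolved": datetime(2011, 5, 3, 12, 0), "handle_secs": 1800, "escalated": False},
]

def metrics_per_product(tickets, window_hours=8):
    """Roll up a handful of the listed metrics per product."""
    by_product = defaultdict(list)
    for t in tickets:
        by_product[t["product"]].append(t)
    out = {}
    for product, rows in by_product.items():
        handled = [t["handle_secs"] for t in rows]
        # Tickets resolved within window_hours of being logged.
        resolved_in_window = sum(
            1 for t in rows
            if t["resolved"] is not None
            and t["resolved"] - t["opened"] <= timedelta(hours=window_hours)
        )
        out[product] = {
            "tickets_logged": len(rows),
            "avg_handle_secs": sum(handled) / len(handled),
            "max_handle_secs": max(handled),
            f"resolved_within_{window_hours}h": resolved_in_window,
            "unresolved": sum(1 for t in rows if t["resolved"] is None),
            "escalated": sum(1 for t in rows if t["escalated"]),
        }
    return out

print(metrics_per_product(tickets))
```

Swapping window_hours between 8 and 24 gives the two resolution-window metrics above; the same pattern extends to the rest of the list.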
All of these metrics should also be run by group and by agent. That level of detail shows which groups are performing well and which CSRs need training. There are many idiosyncrasies in help desk metrics that can be misleading, so the data should be analyzed carefully before it is used in performance reviews.
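The per-group and per-agent cuts are the same roll-up keyed on a different field. A sketch, again with hypothetical field names (group, agent, handle_secs):

```python
from collections import defaultdict

# Hypothetical per-ticket records tagged with group and agent (assumed fields).
tickets = [
    {"group": "Tier1", "agent": "csr_01", "handle_secs": 900},
    {"group": "Tier1", "agent": "csr_02", "handle_secs": 600},
    {"group": "Tier2", "agent": "csr_03", "handle_secs": 1800},
]

def avg_handle_by(tickets, key):
    """Average handle time grouped by any ticket field (e.g. 'group' or 'agent')."""
    buckets = defaultdict(list)
    for t in tickets:
        buckets[t[key]].append(t["handle_secs"])
    return {k: sum(v) / len(v) for k, v in buckets.items()}

print(avg_handle_by(tickets, "group"))  # per-group averages
print(avg_handle_by(tickets, "agent"))  # per-agent averages
```

Comparing the per-agent numbers against the group average is one simple way to spot the outliers the paragraph above refers to, though as noted, handle-time figures need context before they go into a performance review.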