Thursday, November 30, 2006

Top 5 Obstacles to Reviewing Performance in the Call Center

Metrics, reports, and dashboards. Nowhere in an organization is the mania for measurement so pronounced as in the modern call center. And yet, with all of this measurement, why do so many call center managers privately admit that overall performance hasn't really improved?

The answer lies partly in what organizations are measuring. Too often, call centers develop activity metrics like number of calls per day and average length of call, rather than outcome metrics like percent of calls redirected to self-service and percent of customers who rate themselves "highly satisfied." Unified contact centers must shift their focus from efficiency (doing work the right way) to effectiveness (doing the right work).
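To make the distinction concrete, here is a minimal sketch, in Python purely for illustration, that computes both kinds of metrics over the same set of call records. The field names and figures are invented, not taken from any particular system; the point is the contrast, not the code.

```python
# Minimal sketch (hypothetical data and field names): activity metrics
# versus outcome metrics computed over the same call records.

calls = [
    # each record: call duration in seconds, whether the caller was
    # redirected to self-service, and the caller's satisfaction rating (1-5)
    {"duration_sec": 240, "redirected_to_self_service": False, "satisfaction": 5},
    {"duration_sec": 180, "redirected_to_self_service": True,  "satisfaction": 4},
    {"duration_sec": 600, "redirected_to_self_service": False, "satisfaction": 2},
    {"duration_sec": 120, "redirected_to_self_service": True,  "satisfaction": 5},
]

# Activity metrics: how much work was done, and how fast.
calls_handled = len(calls)
avg_call_length = sum(c["duration_sec"] for c in calls) / calls_handled

# Outcome metrics: whether the work produced the results the business wants.
pct_self_service = 100 * sum(c["redirected_to_self_service"] for c in calls) / calls_handled
pct_highly_satisfied = 100 * sum(c["satisfaction"] >= 5 for c in calls) / calls_handled

print(f"Calls handled:              {calls_handled}")
print(f"Avg call length:            {avg_call_length:.0f} sec")
print(f"Redirected to self-service: {pct_self_service:.0f}%")
print(f"Rated 'highly satisfied':   {pct_highly_satisfied:.0f}%")
```

The first two numbers say how busy the center was; the last two say whether callers actually got what they needed.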

But even organizations that choose appropriate metrics often make potentially damaging mistakes when reporting on and reviewing their performance. Surprisingly, the culprit is often the weekly meeting itself. Issues arise in preparing for the meeting, during the meeting, and in disseminating decisions afterward. Experience shows the top five obstacles to reviewing performance in the call center are as follows:
There is too much human intervention required.
Virtually all meetings use MS PowerPoint or Word to document performance in "briefing books." People run reports, export data, merge it with other data sources, and add color commentary to explain results and trends. One executive was shocked to discover that his analysts were spending nearly two full days getting ready for their weekly operational review. Moreover, given the manual nature of these tasks, the likelihood of unintentional error is high.
Information is inconsistent from one group to another.
A call center director with six managers will likely get performance reports in six different formats. Even if the director dictates a standard, people will interpret outcomes differently. These variations range from different definitions of the metrics being used (should average delay include callers who hang up during their wait?) to different expectations of progress (when is a new hire considered trained?) to different interpretations of outcomes (is a satisfied customer one who doesn't call back with a problem, or one who buys more?).
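To see how much a definitional choice like this can matter, here is a minimal sketch with invented numbers: the same queue data yields two quite different "average delay" figures depending on whether abandoned callers are counted.

```python
# Minimal sketch (hypothetical numbers): the same queue data produces two
# different "average delay" figures depending on whether callers who hung
# up before being answered are included.

waits = [
    # wait time in seconds, and whether the caller abandoned the queue
    {"wait_sec": 30,  "abandoned": False},
    {"wait_sec": 45,  "abandoned": False},
    {"wait_sec": 300, "abandoned": True},
    {"wait_sec": 20,  "abandoned": False},
]

answered = [c for c in waits if not c["abandoned"]]

avg_delay_all = sum(c["wait_sec"] for c in waits) / len(waits)
avg_delay_answered = sum(c["wait_sec"] for c in answered) / len(answered)

print(f"Average delay, all callers:   {avg_delay_all:.0f} sec")       # ~99 sec
print(f"Average delay, answered only: {avg_delay_answered:.0f} sec")  # ~32 sec
```

Two managers reporting "average delay" from the same week could honestly present 99 seconds and 32 seconds; unless the definition is agreed on up front, the meeting ends up debating arithmetic instead of performance.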
Performance cannot be certified.
Once data is removed from a system, there is no way to track whether any changes have been made to it, so it is subject to misrepresentation. This might be as benign as an unintentional omission or as malicious as outright gaming of the results. Consequently, meeting time is often spent arguing about the accuracy of the presentation rather than making decisions, and any decisions that are made may rest on faulty assumptions. As a result, senior management often doesn't believe the numbers reported from the call center.
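By way of illustration only (no particular mechanism is prescribed here), a minimal sketch of what certifying exported data could mean: a fingerprint taken when a report is exported will no longer match if anyone later edits the numbers in the briefing book. All names and figures below are hypothetical.

```python
# Minimal sketch (hypothetical): a fingerprint taken at export time can
# reveal whether exported numbers still match the system of record.

import hashlib
import json

def fingerprint(rows):
    """Return a stable hash of a report's rows."""
    canonical = json.dumps(rows, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

exported = [{"week": "2006-11-20", "calls": 8450, "avg_delay_sec": 42}]
stamp_at_export = fingerprint(exported)

# ... later, before the review meeting, someone "cleans up" a number ...
exported[0]["avg_delay_sec"] = 35

print(stamp_at_export == fingerprint(exported))  # False: the data changed
```

A spreadsheet or slide deck carries no such stamp, which is exactly why its numbers can be questioned in the meeting.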
Information becomes stale very quickly.
Operational review meetings are a live discussion of performance. Unfortunately, briefing books are static, reflecting performance at some specific point in the past. If a question comes up during the meeting, or if someone wants to "pearl dive" into more detail on a specific topic, there is no way to immediately check the operational systems to see what has changed in the interim. While it's interesting to know that staff occupancy was below target last week, it's critical to know whether it has improved since then in order to decide what adjustments to make for the coming week.
Decisions are often not communicated to those impacted.
Obviously, the purpose of operational review meetings is more than just to review; rather, it is to make decisions that incrementally improve performance. Unfortunately, the classic PowerPoint briefing book offers no way to disseminate the decisions made, assign and track action items, or monitor progress in the period before the next meeting. Without this shared record of decisions and their rationale, other stakeholders who are critical to the call center's success – marketing, finance, and operations staff – can be suspicious of the conclusions and may delay implementation of critical change.

While these issues with operational reviews may seem overwhelming, technology exists that streamlines the process of capturing, consolidating, and presenting authoritative, verifiable information on performance. These solutions can greatly simplify and add structure to the review process, allowing the call center manager to get back to his or her real job – creating a world-class unified contact center.

By Jonathan D. Becher, president and CEO of Pilot Software. Jonathan can be reached for feedback at ceo@pilotsoftware.com.
