US20040138944A1 - Program performance management system

Program performance management system

Info

Publication number
US20040138944A1
US20040138944A1 (Application US10/624,283)
Authority
US
United States
Prior art keywords
performance
agent
employee
qualitative
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/624,283
Inventor
Cindy Whitacre
Myra Royall
Tom Olsen
Tina Schulze
Robert White
Nancy Newman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Convergys Corp
Original Assignee
Convergys Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Convergys Corp filed Critical Convergys Corp
Priority to US10/624,283
Assigned to CONVERGYS CORPORATION reassignment CONVERGYS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OLSEN, TOM D., SCHULZE, TINA, NEWMAN, NANCY, ROYALL, MYRA, WHITACRE, CINDY, WHITE, ROBERT
Publication of US20040138944A1
Assigned to CONVERGYS CORPORATION reassignment CONVERGYS CORPORATION RE-RECORD TO DELETE CHRISTOPHER D. HORN PREVIOUSLY RECORDED AT REEL/FRAME 014938/0504 Assignors: ROYALL, MYRA, HORN, CHRISTOPHER D., WHITACRE, CINDY, WHITE, ROBERT, NEWMAN, NANCY, SCHULZE, TINA, OLSEN, TOM D.
Assigned to CONVERGYS CORPORATION reassignment CONVERGYS CORPORATION CORRECTIVE ASSIGNMENT TO CORRECT THE EXECUTION DATES & ADD OMITTED INVENTOR'S NAME CHRISTOPHER HORN, PREVIOUSLY RECORDED AT 015961 FRAME 0237. Assignors: ROYALL, MYRA, HORN, CHRISTOPHER D., WHITACRE, CINDY, NEWMAN, NANCY, WHITE, ROBERT, OLSEN, TOM D., SCHULZE, TINA
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/10: Office automation; Time management
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00: Administration; Management
    • G06Q10/06: Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063: Operations research, analysis or management
    • G06Q10/0639: Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398: Performance of employee with respect to a job function

Definitions

  • the present invention relates, in general, to devices and methods that correlate and display employee performance evaluation factors, both objective and subjective, and track their updates, dissemination, and review, and more particularly to computer-based devices and methods particularly suited to evaluating customer service agents.
  • CMS Customer Management Service
  • the invention overcomes the above-noted and other deficiencies of the prior art by providing a performance management system and method that comprehensively addresses qualitative and quantitative measurands of performance for each agent and group of agents, intuitively displays this information in a meaningful fashion to various levels of supervision, including each agent, and tracks the updates, dissemination, and review of performance feedback through each tier of supervision.
  • Information is sourced and tracked in such a way that accuracy and objectivity are enhanced, increasing confidence. Thereby, agent performance is enhanced through timely and appropriate feedback.
  • Efficacy of overall performance management is made transparent to each level of an organization, including a customer for these services.
  • a plurality of quantitative and qualitative measures are selected as being aligned with appropriate business goals. These measures are collected, merged and analyzed in an objective manner to represent the various performance attributes of an agent. Results are then displayed in an intuitive graphical user interface that readily conveys these attributes, both individually and as compared to an overall group.
  • each agent has a current snapshot as to their standing in the eyes of their employer, with its implications for retention and possibly pay for performance, to thus motivate improved performance. Frequent reporting ensures that the business will always know how the CMS provider and its individual agents are performing. Regular feedback to each agent helps ensure continuous agent development.
  • a plurality of quantitative and qualitative measures are monitored and collected for each agent, wherein these qualitative measures include supervisory evaluations. Timeliness of supervisory evaluations is tracked, as well as agent review of feedback based on the quantitative and qualitative measures.
  • FIG. 1 is a block diagram of a Program Performance Management (PPM) System incorporated into a Customer Management System (CMS) network.
  • PPM Program Performance Management
  • CMS Customer Management System
  • FIG. 2 is a sequence of operations performed by the PPM System of FIG. 1.
  • FIG. 3 is a depiction of an employee scorecard graphical user interface (GUI) of the PPM system of FIG. 1 useful for a team leader in performing manual update operations and root cause analysis.
  • GUI graphical user interface
  • FIG. 4 is a depiction of an agent dashboard GUI generated by the PPM system of FIG. 1 indicating a comparison of an agent's performance to standards and to peers.
  • FIG. 5 is a depiction of a queued acknowledgement form GUI generated by the PPM system of FIG. 1.
  • FIG. 6 is a depiction of a recent acknowledgements report generated by the PPM system of FIG. 1.
  • FIG. 7 is a depiction of an acknowledgement detail report generated by the PPM system of FIG. 1.
  • FIG. 8 is a depiction of an employee performance feedback sheet generated by the PPM system of FIG. 1.
  • FIG. 9 is a depiction of a team leader acknowledgement queue form generated by the PPM system of FIG. 1.
  • FIG. 10 is a depiction of a scorecard acknowledgement event detail report generated by the PPM system of FIG. 1.
  • FIG. 11 is a depiction of an employee review rankings report generated by the PPM system of FIG. 1.
  • FIG. 12 is a depiction of a measure daily exclusion screen generated by the PPM system of FIG. 1.
  • FIG. 13 is a depiction of a performance trending report generated by the PPM system of FIG. 1.
  • FIG. 14 is a depiction of an account report generated by the PPM system of FIG. 1.
  • FIG. 15 is a depiction of an acknowledgement detail report generated by the PPM system of FIG. 1.
  • FIG. 16 is a depiction of an acknowledgement summary report generated by the PPM system of FIG. 1.
  • FIG. 17 is a depiction of a summary review form generated by the PPM system of FIG. 1.
  • Performance Management is the effective deployment of the right people, processes and technology to develop our employees for optimal results. Employees who achieve outstanding business results will earn more money. The performance management process ensures a consistent, standardized method by which we measure our Agents' performance and provide specific improvement opportunity feedback.
  • the benefits of utilizing the performance management process are consistency in feedback and coaching to employees across the organization; employees who can review their status and consequently feel they have more control over their ratings; empowered employees, resulting in improved morale and job satisfaction; improved performance; and reduced attrition.
  • a program performance management (PPM) system 10 (aka “Metrex”) is functionally depicted as advantageously leveraging a broad range of quantitative data sources available to a Consolidated Reporting Database (CRDB) 12 as part of a customer management system (CMS) network 14 .
  • PPM program performance management
  • CRDB Consolidated Reporting Database
  • CMS customer management system
  • the CRDB system 12 is a reporting tool utilized to access multiple project reports and to maintain accurate team and employee listings. The accurate listings are important when accessing Agent-level PPM performance data.
  • the existing CRDB system 12 provides benefits including the creation of reports by pulling from other sources, thereby eliminating the need for manual input of data; a reduction in the time needed to pull and produce reports by bringing data from existing systems together in one place; maintenance of accurate team and agent identification (IDs); and allowance for custom reporting.
  • IDs team and agent identification
  • the CRDB system 12 interfaces with a number of components, processes or systems from which information may be received that has bearing on agent (i.e., employee), team leader (i.e., supervisor), project, and management performance.
  • agent i.e., employee
  • team leader i.e., supervisor
  • a Time Keeping System (TKS) 16 is used for payroll functions.
  • TKS Time Keeping System
  • the TKS system 16 may detail time spent coaching, in meetings, in training, or on administrative tasks.
  • Absentee/Tardiness Tracking components 18 that augment what is available from a payroll-focused capability. For example, “clocking in” may be performed at a time and place removed from the actual worksite, with more detailed information being available based on an agent's interaction with a log-in function at their station.
  • a team leader maintains a staffing/scheduling process 20 , such as DIGITAL SOLUTIONS by ______, to manage the schedule adherence of team members and to document any feedback to Agents, thereby enhancing the team statistics and managing the team efficiently.
  • an agent calls an Interactive Voice Recognition (IVR) interface to report that he will be absent.
  • the Agent's file in the staffing/scheduling process 20 is maintained to adjust the number of occurrences, including adjustments for agent earnbacks and exceptions for approved leave of absence, such as under the Family and Medical Leave Act (FMLA).
  • FMLA Family and Medical Leave Act
  • Other types of absence data maintained include No Call, No Show (NCNS) for an entire shift as well as showing up late (i.e., tardy).
  • the CRDB system 12 may advantageously interface to a Human Resources (HR) system 22 that provides guidelines associated with leaves of absence, appropriate feedback procedures, and other attendance policies.
  • HR Human Resources
  • the HR system 22 also provides updates on attrition, hiring, transfers, etc.
  • ACD Automated Call Distribution
  • The amount of time spent by each agent handling outbound calls is logged by Dialers 26 . Sales made in response to an ACD call are tracked by a Sales system 28 . Similarly, a wider range of agent contacts may be managed, such as customer contacts initiated by email or a website form, on a Contact Management System 30 . Agents are to disposition all customer contacts in an Information Technology (IT) outlet so that a comparison of all calls handled by the ACD shows that all were dispositioned.
  • IT Information Technology
  • the data and reporting capabilities of the CRDB system 12 and PPM system 10 are interactively available to great advantage by administrators who may customize the PPM system via a PPM system manual input system 40 with manual inputs 42 , such as selecting what measures are to be assessed, weightings to be applied to the measures, target ranges for grading the weighted measures, and enabling inputs of qualitative assessments, such as comments and enhanced data capture.
  • agents may access via an agent on-line review system 44 various non-PPM utilities 46 , such as time keeping system information, schedule, paid time off (PTO), unpaid time off, attendance, and a Human Resources portal to assignment and policy guidance.
  • the agent may access or be automatically provided acknowledgement feedback forms 48 as follow-up to supervisory feedback sessions (See FIGS. 5, 6, 7.) as well as a performance feedback sheet that shows trends in performance. (See FIG. 8.)
  • the agent may make frequent reference to an agent dashboard 50 that comprehensively and intuitively depicts the agent's performance as compared to targets and as compared to his peers on the team.
  • a team leader interacts with the PPM system 10 through a supervision/management computer 52 to update and monitor agent performance on an agent scorecard 54 .
  • the team leader performs root cause analysis, initiates a corrective action plan with the agent, and inputs feedback acknowledgment tracking forms 56 into the PPM system 10 .
  • the team leader or his management may also access PPM reports 58 , such as program performance month to date, project scorecard status, scorecard measures applied/not applied, feedback status report, semi-annual performance appraisal, and semi-annual review ranking. (See FIG. 11.)
  • a sequence of operations, or PPM process 100 is implemented by the PPM system 10 of FIG. 1 to effectively supervise and manage employees. It should be appreciated that the process 100 is depicted as a sequential series of steps between a team leader and an agent for clarity; however, the PPM process 100 is iteratively performed across an enterprise with certain steps prompted for frequent updates.
  • maintenance of a consolidated reporting database is performed so that organizational and performance related information are available, for example maintaining employee or agent identifiers (ID's), a supervision hierarchy, and project assignments, which may be more than one per employee.
  • ID's employee or agent identifiers
  • a team leader periodically reviews a listing of his direct reports maintained in a time keeping system to make sure that these are accurate, taking appropriate steps to initiate a change if warranted.
  • an administrator of the PPM system may customize what measures are used, the weightings given for these measures for a combined score, target ranges for evaluating the degree of success for each measure, implementations that designate how, when and by whom observations/comments are incorporated into the combined score, and other enhanced data capture (EDC) features.
  • EDC enhanced data capture
  • an 80% share is divided among three categories: Quality (based on overall quality score), Effectiveness (based on Average Handle Time (AHT) and After Call Work (ACW)), and Efficiency (based on schedule adherence). Ten percent is Attendance (based on tardiness and absences). The final ten percent is Professionalism (based on teamwork and integrity).
  • Quality based on overall quality score
  • Effectiveness based on Average Handle Time (AHT) and After Call Work (ACW)
  • Efficiency based on schedule adherence
  • Ten percent is Attendance (based on tardiness and absences).
  • the final ten percent is Professionalism (based on teamwork and integrity).
  • Managers have the ability to apply or not apply measures. This provides management the flexibility to compensate for elements outside an employee's control and correct input errors for manual measures.
  • a “Scorecard Measures Apply/Not Apply report” is available to ensure that this function is used properly.
  • scorecard measures may need to be excluded from the scorecard. Some examples are shown below that illustrate when a measure may need to be “not applied”. (See FIG. 12.) For example, when an employee works in a temporary assignment that will not extend past 30 days, it may be appropriate, depending on the circumstances, to not apply the second scorecard's Quality and Efficiency measures. (Note: the system automatically generates another scorecard when an employee works on another team or project that has an existing scorecard.) If a manager inputs a manual measure twice for the same month, one of the duplicate measures may be marked as “not applied”. If something outside of employees' control has impacted a specific measure across the majority of the project, the measure may need to be not applied for the entire project.
  • a measure that is “Not Applied” will not populate on the scorecard.
  • When a measure is not applied, the weightings of the scorecard are automatically adjusted, and only applied measures will be totaled. Not applied measures will exclude the data for that measure on higher level scorecards (i.e., Team Leader, Operations Manager, etc.) and all types of project or team level reporting. Managers will use the Metrex system to not apply or apply measures.
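  • As a minimal sketch of this behavior, assuming hypothetical category weights and grades in Python that are not taken from any actual scorecard, the composite score may be computed over applied measures only, with the remaining weights renormalized:

      from typing import List, NamedTuple

      class Measure(NamedTuple):
          name: str
          weight: float   # nominal scorecard weight (e.g., 0.30 for a Quality measure)
          grade: int      # 1 (CNII) through 5 (KC)
          applied: bool   # managers may mark a measure "not applied"

      def composite_score(measures: List[Measure]) -> float:
          """Weighted average grade over applied measures only; the weights of the
          applied measures are renormalized so they still sum to one."""
          applied = [m for m in measures if m.applied]
          total_weight = sum(m.weight for m in applied)
          if total_weight == 0:
              return 0.0
          return sum(m.grade * m.weight for m in applied) / total_weight

      # Hypothetical scorecard: Quality marked "not applied" for a short temporary assignment.
      scorecard = [
          Measure("Quality", 0.30, 4, applied=False),
          Measure("Effectiveness", 0.25, 3, applied=True),
          Measure("Efficiency", 0.25, 5, applied=True),
          Measure("Attendance", 0.10, 4, applied=True),
          Measure("Professionalism", 0.10, 5, applied=True),
      ]
      print(round(composite_score(scorecard), 2))  # 4.14: only applied measures are totaled

    Renormalizing in this way keeps composite scores comparable across employees even when different measures are applied or not applied.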
  • the Employee Performance and Attendance folder may be selected, choosing the “Employee Scorecard” for Agents and the “Management Scorecard” for Team Leaders and above.
  • agent measures are calculated to determine how the agent compares against the standards and against their peers for the current and historical rating periods.
  • a quality score is derived by pulling the overall quality score from either e-Talk (Advisor), Metrex Observations or EDC (Enhanced Data Capture). The final score is the average of all quality evaluations for an Agent within the month.
  • An exemplary formula is:
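  • One formula consistent with the preceding description is: Quality Score = (sum of all quality evaluation scores for the Agent within the month) / (number of quality evaluations for the Agent within the month).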
  • each measure has a set of five ranges that are possible to achieve, corresponding to a grade of 5, 4, 3, 2, 1, and having the following names respectively: Key Contributor (“KC”), Quality Plus Contributor (“QPC”), Quality Contributor (“QC”), Contribution Below Expectations (“CBE”), and Contribution Needs Immediate Improvement (“CNII”). Suggested Targets are KC: 100%-97%; QPC: 96%-95%; QC: 94%-87%; CBE: 86%-82%; CNII: 81%-0%.
  • KC Key Contributor
  • QPC Quality Plus Contributor
  • QC Quality Contributor
  • CBE Contribution Below Expectations
  • CNII Contribution Needs Immediate Improvement
  • Inbound Average Handle Time is the length of time it takes for an Agent to handle a call. There are various factors that affect inbound AHT. The formula below outlines the most inclusive factors for providing the complete calculation for inbound AHT. An exemplary formula is:
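  • One formula consistent with the factors described below is: Inbound AHT = (ACD time + Hold time + After Call Work time) / (total inbound ACD calls handled); the call-count divisor is assumed here, since only the time components are spelled out below.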
  • the Inbound AHT calculation captures all three of ACD time, which includes the time an Agent spends calling out during a call; Hold time, which includes all of the activities an Agent performs while a call is on hold; and After Call Work time.
  • ACD time which includes the time an Agent spends calling out during a call
  • Hold time which includes all of the activities an Agent performs while a call is on hold
  • After Call Work time includes potential IB or OB non-ACD calls made to complete the customer's call, non-ACD calls made or received while in the ACW mode, and time in ACW while the Agent is not actively working an ACD call.
  • AUX time includes all of the AUX time captured no matter what the Agent is doing (i.e., including making or receiving non-ACD calls).
  • the value of capturing all of the AUX time is the accountability that it creates for the Agents. It drives proper and accurate phone usage by Agents.
  • Outbound Average Handle Time is the length of time it takes for an Agent to handle a call. There are various factors that affect outbound AHT. The formula below outlines the most inclusive factors for providing the complete calculation for outbound AHT. An exemplary formula is:
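  • One formula consistent with the description below is: Outbound AHT = (outbound ACW time + AUX Out time) / (total outbound calls placed); the outbound call-count divisor is assumed here.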
  • the Outbound AHT captures the total time an Agent spends on a call while logged into the switch but not handling regular Inbound ACD calls.
  • the ACW Time contains all of the time an Agent is in ACW, while logged into the phone, placing a call, and the actual Talk Time of that call.
  • the AUX Out Time contains all of the time an Agent is in AUX placing calls and talking on calls. ACW and AUX are the only modes that Agents can place themselves in and still be able to place outbound calls.
  • the After Call Work (ACW) percentage is the percent of time an Agent spends in ACW following an ACD call. It measures the percentage of actual online time an Agent spends in ACW without counting AUX time. This provides a clean view of an Agent's use of ACW to handle actual calls and removes the various activities that may be performed, while an Agent is in AUX.
  • An exemplary formula is:
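  • One formula consistent with the calculation described below is: ACW % = Total ACW time / (Staff time - Total AUX time) x 100.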
  • the ACW % measure captures the Agent's total ACW time and calculates the percentage by dividing the total ACW time by the Agent's Staff time less the Total AUX time (to create a pure online time), then multiplying by 100 to create the percentage figure.
  • Suggested Targets are KC: 0-10%; QPC: 11%-15%; QC: 16%-20%; CBE: 21%-25%; CNII: 26%-above.
  • Average After Call Work is an actual average of the time an Agent spends in ACW following an ACD call.
  • the average ACW measure provides the average number of seconds in ACW and is an accurate view of the actual time an Agent spends in ACW. For projects that bill for ACW, this measure provides a quick view of the potential ACW that may be included on the bill.
  • An exemplary formula is:
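  • One formula consistent with the calculation described below is: Average ACW = Total ACW time / Total ACD calls received.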
  • Average ACW captures the Agent's total ACW time and calculates the average by dividing the ACW time by the total ACD calls the Agent receives. This provides the Agent's average, which can be used for projected billing when applicable.
  • AUX time is the time an Agent spends in AUX work logged into the Split.
  • True AUX time, which is the time an Agent spends doing various activities, provides an accurate view of the time Agents spend performing activities other than actual calls.
  • An exemplary formula is:
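  • One formula consistent with the components described below is: True AUX % = (Total AUX time - AUX_In time - AUX_Out time) / Staff time x 100; the Staff-time denominator is assumed here, since the suggested targets are expressed as percentages.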
  • I_AUX time includes I_AUX_In time and I_AUX_Out time.
  • AUX_In time and AUX_Out time are actually time spent by an Agent placing or receiving non-ACD calls, so to capture true AUX these two components must be removed from the total AUX time.
  • AUX time captures all of the AUX reason codes to prevent Agents from selecting codes not reported. Suggested Targets are KC: 0-4%; QPC: 5%-7%; QC: 8%-11%; CBE: 12%-15%; CNII: 16%-above.
  • Average Talk Time measures the actual time spent by Agents talking to customers on ACD calls. This provides a clear view of the time Agents spend talking on calls and can be used to ensure that Agents are controlling the calls.
  • An exemplary formula is:
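  • One formula consistent with the calculation described below is: Average Talk Time = Total Talk time (from CMS) / Total ACD calls received.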
  • ATT captures the Agent's Total Talk time as measured in CMS (Call Management System) and divides the result by the total number of ACD calls the Agent receives. It pulls the data directly from CMS without any components being added or removed. This makes it a pure measure of the Agent's actual time with the customer.
  • Information Technology (IT) Sales Conversion is the percentage of sales in IT to ACD calls received by the Agent. This measure may contain Interlata, Intralata, or combined total sales.
  • the sales type names contained in IT must be determined when a specific sales type conversion is desired such as Intralata conversion only.
  • the data label for the various sales types may be referred to as APIC rather than Intralata, etc.
  • An exemplary formula is:
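  • One formula consistent with the calculation described below is: IT Sales Conversion % = Total IT sales / (Total ACD Calls In, or IT Calls where applicable) x 100.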
  • IT Sales Conversion captures all sales types in IT for the project and then divides that by the total ACD Calls In or IT Calls, whichever is applicable, then calculates the percentage.
  • a specific sales conversion can be calculated using the same calculation by selecting the appropriate sales type when setting up the measure in the Agent's scorecard.
  • Agent Productivity is often referred to in many projects as “Adjusted Agent Yield”. This measure is intended to measure the actual online productivity of an Agent when handling calls. It is not an overall Billing Yield of an Agent. Therefore, productive time in TKS is the only time used in this calculation.
  • An exemplary formula is:
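  • One formula consistent with the calculation described below is: Agent Productivity % = (CMS Staff time + TKS customer-handling productive time) / total TKS clock-in time x 100.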
  • Agent Productivity captures an Agent's total Staff time from CMS and adds that to the Agent's actual customer handling productive time in TKS, which includes mail+e-mail+data entry and divides that total by the “clock_in seconds” or total TKS, then multiplies by 100 to provide a percentage format.
  • Suggested Targets are KC: 100%-93%; QPC: 92%-90%; QC: 89%-85%; CBE: 84%-80%; CNII: 79%-below.
  • Billing Yield is used to determine the actual billable work of an Agent by capturing all billable time for an Agent including team meetings, training, offline non-customer handling time, etc. This measure is not intended to provide an Agent Yield, which is captured in the Agent Productivity measure.
  • An exemplary formula is:
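  • One formula consistent with the calculation described below is: Billing Yield % = (CMS Total Staff time + total billable TKS time - TKS online time) / total TKS time x 100; expressing the result as a percentage is assumed from the suggested targets.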
  • Billing Yield is calculated by taking an Agent's Total Staff time from CMS, adding this to the Agent's total billable TKS time, then removing the online time from TKS to avoid double counting of online time. This total is then divided by the Agent's total TKS. Suggested Targets are KC: 100%-96%; QPC: 95%-93%; QC: 92%-88%; CBE: 87%-83%; CNII: 82%-below.
  • Schedule Adherence reflects an Agent's actual adherence to their schedules utilized by Work Force Management. It is important to maintain accurate schedules in WFM and to notify the Command Center immediately of changes, as this measure will be negatively impacted by any change.
  • An exemplary formula is:
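  • One formula consistent with the calculation described below is: Schedule Adherence % = total minutes in adherence / total minutes scheduled x 100.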
  • Schedule Adherence is calculated using the following data from IEX: total minutes in adherence (i.e., the total number of minutes the scheduled activity matches the actual activity), compared to the total minutes scheduled, then multiplied by 100.
  • Suggested Targets are KC: 100%-95%; QPC: 94%-93%; QC: 92%-90%; CBE: 89%-87%; CNII: 86%-below.
  • Staffed to Hours Paid provides an overall view of the online Agent's daily time spent logged into CMS compared to the Agent's total day in TKS to determine whether or not the Agent is logging into the phones for the appropriate portion of the day. It is not intended to replace Schedule Adherence, but it provides a payroll view of an Agent's activities similar to Agent Productivity.
  • An exemplary formula is:
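  • One formula consistent with the calculation described below is: Staffed to Hours Paid % = Total CMS Staff time / total TKS time for the day x 100.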
  • Staffed to HP captures the Agent's Total Staff time in CMS divided by the Agent's total TKS for the day multiplied by 100.
  • Suggested Targets are KC: 100%-90%; QPC: 89%-87%; QC: 86%-82%; CBE: 81%-77%; and CNII: 76%-below.
  • Attendance is a direct feed from the Digital Solutions system (i.e., Attendance IVR).
  • the feed captures occurrences, which are applied to the Agent's scorecard. The occurrences will only be correct when Team Leaders maintain the Digital Solutions web site.
  • Attendance is a mandatory measure and is composed of Absences and Tardies. The formula for Attendance is based on the total number of tardies and absences in a calendar month. Tardies and Absences are applied directly to the automated scorecard from Digital Solutions. If Team Leaders do not maintain Digital Solutions on a daily basis for their Agents, the Agents' scorecard occurrence count will be inaccurate.
  • the professionalism category assists Team Leaders in measuring Agents' performance relative to core values.
  • There are 5 skills, i.e., Unparalleled Client Satisfaction, Teamwork, Respect for the Individual, Diversity, and Integrity.
  • An example of a formula for professionalism is: Unparalleled Client Satisfaction (2 Pts)+Teamwork (2 Pts)+Respect For The Individual (2 Pts)+Diversity (2 Pts)+Integrity (2 Pts) = 10 Total Points Possible.
  • These measures compose 10% of an Agent's scorecard.
  • IEX Scheduled time is the amount of time an Agent is scheduled to work.
  • Team Leaders make adjustments to Digital Solutions. The adjustments are picked up by the Command Center and applied to their IEX Schedule. The actual TKS worked hours are subtracted out of the scheduled time to create the numerator. If a TL has maintained an Agent's schedule properly in Digital Solutions, the Lost Hours % should be a low number.
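  • A formula consistent with this description is: Lost Hours % = (IEX Scheduled time - actual TKS worked hours) / IEX Scheduled time x 100; the Scheduled-time denominator is an assumption, as only the numerator is spelled out above.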
  • An Agent scorecard 126 allows a team leader or manager to review the performance summaries and status of a number of employees.
  • Agent trending reports 128 provide indications of whether a substandard performance is improving or becoming more and more of a problem. (See FIG. 13.) Different demographic cross sections may be selected, such as an account report 130 so that managers can see how particular clients of an outsourced service are being served by assigned employees, for instance. (See FIG. 14.)
  • An agent dashboard GUI 134 gives an agent and team leader a frequently accessed and up-to-date snapshot of their current standing relative to the standards and to their peers. Also associated with these performance results are agent performance feedback items 136 that are created by the team leader and acknowledged by the agent to memorialize coaching for improved performance.
  • the team leader may reference these indications from the PPM system in order to perform root cause analysis. Determining the root cause of any problem helps ensure that it does not recur in the future. Root Cause Analysis is useful in helping employees achieve the performance goals set by the project. A root cause analysis may be completed whenever an employee's performance is not meeting the guidelines set by the project. One technique in determining the root cause of a problem is to ask why three to five times, thereby eliminating the surface reasons for missing a target and thus identifying the root cause. The following is a list of tools that can help determine the root cause: Brainstorming, Cause and effect analysis (fishbone diagram), Histogram, Graphs, Pareto diagrams, and Checklists.
  • Several steps are useful in conducting root cause analysis: (a) Enlist individuals to help in the root cause analysis. Include individuals that are directly affected by the outcome of the actions to be taken (e.g., a Subject Matter Expert, another Team Leader, and/or an Operations Manager). (b) Conduct cause and effect analysis or use any of the helpful tools mentioned in this section. (c) Select the potential causes most likely to have the greatest impact on the problem. Note: It is not enough to identify that the root cause is present when the problem occurs. It must also not be present when the problem does not occur. (d) Create and implement an action plan to address the root causes. The action plan may be reviewed to ensure that the corrective actions do not cause more problems.
  • An example of a root cause analysis checklist may be the following inquiries:
  • Agent Productivity / Online Hours: Verify the Agent was scheduled to work enough hours to be able to meet the goal (i.e., take into consideration training and vacation that may have been scheduled); determine if off-line activities are affecting Agent Productivity; review the Agent's Log In and Log Out reports to determine if Agents are staying online for the appropriate amount of time.
  • Agent Productivity / Other Measures to Review: Review the following measures to determine their impact on Agent Productivity: After Call Work, AUX Time, Schedule Adherence, and TKS Conformance.
  • Schedule Adherence / Agent's Changes: Determine if the Agent's ESC and IEX schedule accurately reflect the Agent's scheduled hours.
  • Schedule Adherence / Environment: Determine if Agents are following the attendance and tardy policy.
  • corrective action plans are used to identify areas for improvement and a timeline in which expectations are to be reached. These plans may answer who, what, when and where and consider the conditions and approvals necessary for success.
  • Action planning is used when negative trends are identified in an Agent's performance. Creating a plan will establish a roadmap to achieve excellent call quality. It also ensures an organized objective implementation.
  • a typical procedure for creating action plans would include: (1) Analyze the proposed improvement or solution; (2) Break it down into steps; (3) Consider the materials and numbers of people involved at each step; (4) Brainstorm, if necessary, for other items of possible significance; (5) Add to the list until the team thinks it is complete; and (6) Follow-up frequently to ensure the action plan is completed on time and accurately.
  • the team leader provides feedback on the agent's performance, including any corrective action plans that are to be implemented. Thereafter, the team leader captures the individual feedback items from the feedback session into the PPM system (block 144 ). Thereafter, the agent is prompted to acknowledge these items set into the PPM system by his team leader, perhaps with comments of explanation or disagreement (block 146 ). The PPM system tracks the setting and acknowledgement of feedback (block 148 ), which supports various reports and interactive screens to facilitate the process, such as acknowledgement pending/completed queues/details 150 . (See FIGS. 15, 16.)
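  • A minimal sketch of this acknowledgement tracking, using hypothetical Python field and function names not drawn from the PPM system itself, records when each feedback item is set (block 144) and acknowledged (block 146) and rolls items up into pending/completed counts:

      from dataclasses import dataclass
      from datetime import datetime
      from typing import List, Optional

      @dataclass
      class FeedbackItem:
          agent: str
          note: str
          set_at: datetime                            # captured by the team leader (block 144)
          acknowledged_at: Optional[datetime] = None  # filled in when the agent acknowledges (block 146)
          agent_comment: str = ""                     # optional explanation or disagreement

          def acknowledge(self, comment: str = "") -> None:
              self.acknowledged_at = datetime.now()
              self.agent_comment = comment

      def acknowledgement_summary(items: List[FeedbackItem]) -> dict:
          """Pending/completed roll-up in the spirit of an acknowledgement summary report."""
          total = len(items)
          completed = sum(1 for item in items if item.acknowledged_at is not None)
          return {
              "total": total,
              "pending": total - completed,
              "percent_completed": round(100.0 * completed / total, 1) if total else 0.0,
          }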
  • the weekly or monthly or other cycle of evaluation and feedback is used for a review (e.g., quarterly, semi-annually, annually), which may coincide with compensation bonuses or raises.
  • the PPM system tracks these periodic agent or team leader reviews (block 152 ), and therefrom produces various reports or interactive screens to facilitate their use, such as tracking summaries 154 and an agent review ranking report 156 .
  • an employee scorecard 200 allows a team leader to select one or more factors, such as project 202 , type of employee (e.g., team leader, agent) 204 , assigned supervisor 206 (e.g., either the team leader interacting with the screen or another supervisor assigned to the team leader/manager), and a period of review, such as start date 208 and finish date 210 .
  • type of employee e.g., team leader, agent
  • supervisor 206 e.g., either the team leader interacting with the screen or another supervisor assigned to the team leader/manager
  • a period of review such as start date 208 and finish date 210 .
  • Upon selecting a search button 212 , a listing of employees is provided (not depicted), typically a listing of agents assigned to the team leader whose log-in identifier enables him to view this subset of employees.
  • One particular agent is selected with a detail employee pull-down 214 , with filtering as desired via applied-measures yes/no radio buttons 216 and/or an unreviewed (“pending”) radio button 218 , and the listing is displayed upon selecting a refresh button 220 .
  • a record 222 is displayed comprised of a time period 224 , scorecard description (e.g., project and facility) 226 , scorecard measure 228 , score 230 , grade 232 , apply/not apply toggle 234 , and manually-entered comment button 236 , the latter launching a text entry window for written comments.
  • a top detail record 222 is shown highlighted as a currently selected record that may be interacted with by buttons 238 - 248 .
  • a “show daily detail” button 238 will show daily statistics associated with the selected measure.
  • a “show weekly detail” button 240 will show weekly statistics associated with the selected measure.
  • a “show grade scale” button 242 will pop-up a legend explaining the grading scale standards to assist in interpreting the grades presented.
  • a “remove scorecard for this employee” button 244 is used early in a feedback period to remove a pending scorecard until completed and restored with a “restore scorecard for this employee” button 246 when ready to be applied or not applied.
  • An “add alternate project measures” button 248 is not grayed out when the employee is assigned to more than one project. Selecting button 248 allows the designating of the other projects and populating the scorecard with these alternate project measures.
  • an agent dashboard GUI 300 gives an intuitive and comprehensive presentation of the agent's performance as compared to standards and to his peers and can be frequently referenced to instruct on areas needing attention.
  • An individual pie chart 302 summarizes the 5 grade ranges by proportioning their relative weighting and stacking them radially from poor, fair, good, excellent and finally to outstanding.
  • An arrow 304 shows the composite score for the agent, which in this instance falls within the good grade.
  • a similar pie chart 306 is presented that is a summary for all of the team. Category measure summaries of attendance, quality, professionalism, efficiency and effectiveness are summarized by respective percentage values 308 - 316 for both the agent and the project as well as a grade color-coded bar chart 318 .
  • an Employee Performance Feedback (CRDB) report displays employees' (i.e., managers' and Agents') month-to-date scorecard results and documented feedback, thereby assisting in providing coaching and feedback to employees.
  • an Employee Reviews (CRDB) report provides a summary of an employee's monthly scorecard results by category, overall points achieved and documented feedback, which assists in providing coaching and feedback to employees on their overall results. (See FIG. 17.)
  • a Program Performance Month-to-Date report provides a roll-up of program-level scorecard data, assisting in managing results across teams and centers.
  • an Agent Productivity Management (APM) Cross Reference Detail Report displays the start date of a project and includes the following: switch number, splits, IEX code, IEX team IDs, TKS and PSF project code, thereby determining where the APM Measures report is pulling information.
  • An APM Measures Report with Targets and Charts sorts by Business Unit, center, portfolio, billing unit and PSFN project code and compares the following information to targets determined by the Project: agent productivity, phone time variance, percent occupancy, call service efficiency, percent of calls forecasted accurately, percent of average handle time forecasted accurately, on-line conformance and on-line adherence. It assists in tracking a project's efficiency on key measures and in reviewing the accuracy of client forecasts for business planning (i.e., staffing, etc.).
  • an APM Trend Report sorts by business unit, center, portfolio, billing unit and PSFN project code. It provides three months of project trends for the following information: agent productivity, phone time variance, percent occupancy, call service efficiency, percent of calls forecasted accurately, percent of average handle time forecasted accurately, on-line conformance and on-line adherence, on-line diagnostic measures, breakdown of TKS categories and percent billable and non billable time. It identifies trends and improvement opportunities.
  • a TKS Activity Analysis Report Detail provides project level data on where payroll time is being spent (i.e., total coaching hours, meeting hours, training hours, etc.). It assists in conducting project level analysis to ensure the team is following standard processes and to identify improvement areas.
  • a TKS Activity Analysis Report provides a summary of daily, weekly and monthly data for TKS data analysis. It also provides interval data and displays statistics on a project's billable time.
  • a TKS Agent Productivity Report Detail provides by TKS Project code and employee the following information: manned time, other productive, productivity and phone time variance, thereby assisting managers in identifying how a project can be more efficient.
  • a TKS Agent Productivity Report provides by Business Unit, location, and TKS project(s) the following information: manned time, other productive, productivity and phone time variance. It assists managers in identifying how a project can be more efficient.
  • a Yes/No Line Item Trends by Agent, Team Leader, and Project report provides call monitoring line item results of a project's evaluations completed by QA, Team Leader, OJT, client and a summary of all evaluations. It assists in conducting analysis on project level quality results and identifying areas for improvement.
  • various Agent, Skill and VDN Reports provide a summary of monthly, weekly, daily and interval data for ACD and Agent data. The displayed project statistics assist in managing employees' performance, including displaying billing data for client bills.
  • various ACD DN, Agent, CDN and DNIS Reports supply a summary of monthly, weekly, daily and interval data for ACD, Agent, CDN and DNIS data, and display project statistics.
  • various Multi-project, Project, Team Leader and Agent Level Reports identify detailed and summary data daily, weekly, and monthly for key metrics at the following levels: multi-project, a single project, Team Leader and Agent. They assist in managing the key metrics and analyzing the raw data for the key metrics.
  • BU Business Unit
  • Center, Project and Job Category Headcount Reports supply headcount information from CORT at the following levels: business unit, project and job category. These reports should be pulled using the same start and finish date. They assist in verifying the accuracy of the information in CORT and developing business plans.
  • a Statistical by Interval and by Summary report supplies data on all calls that went through the IVR. It displays IVR usage, Conversant routing, etc.
  • a PTO Report displays paid time off (PTO) information by project, team and employee, assisting in managing employee PTO days accrued and taken.
  • PTO paid time off
  • Attrition Reports (e.g., 12 Month Rolling, Calendar, Turnover Analysis and Employee Term Listings) supply attrition information from CORT at the following levels: business unit, vertical, center, project and job category to assist in verifying the accuracy of the information in CORT and to develop action plans and strategic business plans.
  • Program, Finance, and Activity Summary Reports provide TKS reporting through CRDB for tracking and managing Agent activities, payroll, etc.
  • PPM Process Conformance is a key objective of several reports that can be used to verify whether managers and projects are complying with the process.
  • a Project Scorecard Status report identifies the measures that have populated on the scorecard. It retrieves both applied and pending measures and identifies automated measures that have not populated on an individual's scorecard or need to be added manually.
  • a Scorecard Measures Exception Report identifies the following types of measures by employee name: Not Applied, Removed and Pending. It assists in identifying frequency of unapplied and pending measures and the manager responsible.
  • a Scorecards with Zero Grade displays employees who have received a zero due to their scores falling outside the grading criteria. It assists in identifying issues that need to be investigated and resolved prior to final scorecard processing.
  • a Feedback Status Report identifies, by Team Leader and Agent, the percent of feedback that has been acknowledged in the system. It assists in coaching Team Leaders on providing timely feedback to Agents.
  • an Acknowledgement Detail Report identifies acknowledgement type, Event number, status and by whom it was acknowledged by project, supervisor, and Agent. It assists in evaluating the status of acknowledgement types and by whom they were acknowledged.
  • an Acknowledgement Summary Report displays by business unit, center, project, supervisor, and Agent the following: Total number of acknowledgements, Number of pending acknowledgments, Percentage of completed acknowledgements, and Number and percentage of acknowledgements completed by a Scorecard Project Coordinator, Team Leader, and Agent. It assists in evaluating the completion percentage of acknowledgements and by whom they were acknowledged.
  • a Report Usage by Project & User Type & User identifies which employees are pulling reports and the reports being reviewed. It assists in providing coaching and feedback to managers and other employees (i.e., Reports Specialists, etc.).
  • a Report Usage by Folder BU, Report, Project Level identifies by business unit and project level what folders have been reviewed, thereby assessing the level of CRDB and PPM process usage by a project.
  • Administrative—Core CRDB Agent Profile Reports identify the structure necessary for scorecards to accurately roll-up at each level.
  • a Supervisor Hierarchy Report identifies the structure of a specific project, from the Agent level and to the President level, providing a quick and easy way to find an employee's manager and determine if the appropriate employees are on the list.
  • a Supervisor Hierarchy Audit Detail report shows by project the following employee information: name, Employee Number, active or inactive status, level of authority in CRDB, and Supervisor's Employee Number. It provides a quick view of employee linkages so that projects can verify the accuracy of the Hierarchy report.
  • a CRDB CMS Dictionary provides split, VDN, and skill information at the project level and is utilized as a quick reference tool for managers when discussing changes with Workforce Management, etc.
  • a Project and PPM Rollup List by SME shows CRDB SMEs by project, program, sponsor and Workforce Management group. It displays the CRDB SME to contact when a project needs assistance, displaying Agents, Team Leaders & Operations Managers only.
  • an Average Quality by Guideline and Evaluator Report identifies when the first and last call monitoring evaluation was completed, average overall quality score and the total number of evaluations, thereby assisting in providing coaching and feedback to direct reports on monitoring goals and overall results.
  • a Quality Summary by Agent/Team Leader Report displays by project the Team Leaders, their Agents, number of evaluations completed per Agent, average overall quality score from QA, Team Leader (TL), QA & TL, OJT, client and all evaluations, thereby assisting in managing and providing feedback on project level, Team Leader level and Agent level results.
  • an Employee Review Rankings report ranks employees against their peers according to the points received on the scorecards on a monthly basis over the six-month period, determining Agents' appraisal ratings within a project.
  • a Semi-Annual Performance Appraisal report shows employees' performance over the six-month period, assisting in providing coaching and feedback to employees.
  • an Agent Profile by Project report provides the Agent's name, Employee number, system IDs, active status, and Team Leader's name, assisting Managers in troubleshooting why a measure is not displaying on a scorecard.
  • a Team Change Request Maintenance report provides a list of Agents and their Team Leaders by project. It is used to transfer Agents to other Team Leaders within the same project, as well as to transfer Agents to other projects.
  • a Manager Approval report provides Operations Managers with a list of pending Agent transfers for approving or denying Agent transfer requests.
  • a Delegation of Authority report enables Operations Managers to delegate the authority to approve Agent transfer requests, delegating transfer approval authority when an Operations Manager is not in the office.
  • a Team Change Request Status Report provides a list of Team Change requests and their status, and tracks Team Change requests.
  • a program performance management (PPM) system 10 is set up as part of a customer management system (CMS) network 14 , leveraging already existing quantitative information regarding employee work activities (e.g., attendance, time engaged in performing specific tasks, scheduling, sales results, etc.).
  • CMS customer management system
  • a team leader interacts with an employee scorecard 54 to input manual measures of quality and professionalism. These measures are compiled into a score that may be compared to targets and to the peers of the employee, with the results intuitively presented to the employee on an agent dashboard 134 .
  • Feedback acknowledgement is facilitated by the PPM system 10 , as well as tracking accomplishment of periodic reviews, with an array of reports available for upper management to evaluate agent, team leader, and project performance.
  • CMS customer management services

Abstract

A Program Performance Management (PPM) system enforces consistency in feedback and coaching to employees across the organization and lowers attrition through improved morale and job satisfaction. Employees are empowered because they can review their status and thus feel that they have more control over their ratings. Consistency in performance data is maintained across an enterprise. Management insights are gained by comparisons made across projects, programs, and Business Units on standardized measures, thereby enabling accountability at all levels. Integration of quantitative information and qualitative assessments of Customer Management System (CMS) agents' performance is summarized and plotted in an intuitive fashion, with feedback acknowledgements and reviews tracked for management. Team leaders have a scorecard interface to efficiently supervise their team members. Agents have access to a dashboard that provides up-to-date and intuitive indications of their performance and that of their fellow team members.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application hereby claims the benefit of the provisional patent application entitled “PROGRAM PERFORMANCE MANAGEMENT SYSTEM” to Shawn R. Anderson, Serial No. 60/397,651, filed on 22 Jul. 2002.[0001]
  • FIELD OF THE INVENTION
  • The present invention relates, in general, to devices and methods that correlate and display employee performance evaluation factors, both objective and subjective, and track their updates, dissemination, and review, and more particularly to computer-based devices and methods particularly suited to evaluating customer service agents. [0002]
  • BACKGROUND OF THE INVENTION
  • Accurate and timely employee evaluations are important for motivating good employees and taking corrective action with not-so-good employees. While this is generally true for all industries and services, customer service providers have a particular need for a comprehensive approach to agent evaluation. Each contact with an agent may positively or adversely impact a customer's perception of a business. [0003]
  • While customer care management is a challenging service in and of itself, recent trends are for outsourcing this function in order to leverage customer care management technology, expertise, and economies of scale. However, such a decision is not made without reservations. For instance, a business may be concerned that a Customer Management Service (CMS) provider would tend to have outsourced agents that are not as motivated to perform their duties well as the business's own employees. These businesses in particular may not deem the CMS provider to have comprehensive and transparent program performance management capabilities to provide this confidence. [0004]
  • Even if the CMS provider may demonstrate an agent evaluation process, a business may yet be concerned about whether these processes effectively manage performance to achieve the specific business goals of the business, rather than applying a generic, non-tailored process. Furthermore, even if performance factors of value to the business are tracked, does the CMS provider ensure that performance feedback and coaching are truly delivered to agents in a timely manner to ensure their efficacy? Finally, even if the evaluation process is appropriate and timely for the business, another concern is that the performance data is unduly subjective and haphazardly reported. [0005]
  • Consequently, a significant need exists for an approach to performance management that is suitable for motivating agents who provide customer care, that is disseminated and reviewed in a timely fashion, and that is rigorously tracked and subject to audit to enhance confidence in its efficacy and accuracy. [0006]
  • BRIEF SUMMARY OF THE INVENTION
  • The invention overcomes the above-noted and other deficiencies of the prior art by providing a performance management system and method that comprehensively addresses qualitative and quantitative measurands of performance for each agent and group of agents, intuitively displays this information in a meaningful fashion to various levels of supervision, including each agent, and tracks the updates, dissemination, and review of performance feedback through each tier of supervision. Information is sourced and tracked in such a way that accuracy and objectivity are enhanced, increasing confidence. Thereby, agent performance is enhanced through timely and appropriate feedback. Efficacy of overall performance management is made transparent to each level of an organization, including a customer for these services. [0007]
  • In one aspect of the invention, a plurality of quantitative and qualitative measures are selected as being aligned with appropriate business goals. These measures are collected, merged and analyzed in an objective manner to represent the various performance attributes of an agent. Results are then displayed in an intuitive graphical user interface that readily conveys these attributes, both individually and as compared to an overall group. Thereby, each agent has a current snapshot as to their standing in the eyes of their employer, with its implications for retention and possibly pay for performance, to thus motivate improved performance. Frequent reporting ensures that the business will always know how the CMS provider and its individual agents are performing. Regular feedback to each agent helps ensure continuous agent development. [0008]
  • In another aspect of the invention, a plurality of quantitative and qualitative measures are monitored and collected for each agent, wherein these qualitative measures include supervisory evaluations. Timeliness of supervisory evaluations is tracked, as well as agent review of feedback based on the quantitative and qualitative measures. [0009]
  • These and other objects and advantages of the present invention shall be made apparent from the accompanying drawings and the description thereof.[0010]
  • BRIEF DESCRIPTION OF THE FIGURES
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention, and, together with the general description of the invention given above, and the detailed description of the embodiments given below, serve to explain the principles of the present invention. [0011]
  • FIG. 1 is a block diagram of a Program Performance Management (PPM) System incorporated into a Customer Management System (CMS) network. [0012]
  • FIG. 2 is a sequence of operations performed by the PPM System of FIG. 1. [0013]
  • FIG. 3 is a depiction of an employee scorecard graphical user interface (GUI) of the PPM system of FIG. 1 useful for a team leader in performing manual update operations and root cause analysis. [0014]
  • FIG. 4 is a depiction of an agent dashboard GUI generated by the PPM system of FIG. 1 indicating a comparison of an agent's performance to standards and to peers. [0015]
  • FIG. 5 is a depiction of a queued acknowledgement form GUI generated by the PPM system of FIG. 1. [0016]
  • FIG. 6 is a depiction of a recent acknowledgements report generated by the PPM system of FIG. 1. [0017]
  • FIG. 7 is a depiction of an acknowledgement detail report generated by the PPM system of FIG. 1. [0018]
  • FIG. 8 is a depiction of an employee performance feedback sheet generated by the PPM system of FIG. 1. [0019]
  • FIG. 9 is a depiction of a team leader acknowledgement queue form generated by the PPM system of FIG. 1. [0020]
  • FIG. 10 is a depiction of a scorecard acknowledgement event detail report generated by the PPM system of FIG. 1. [0021]
  • FIG. 11 is a depiction of an employee review rankings report generated by the PPM system of FIG. 1. [0022]
  • FIG. 12 is a depiction of a measure daily exclusion screen generated by the PPM system of FIG. 1. [0023]
  • FIG. 13 is a depiction of a performance trending report generated by the PPM system of FIG. 1. [0024]
  • FIG. 14 is a depiction of an account report generated by the PPM system of FIG. 1. [0025]
  • FIG. 15 is a depiction of an acknowledgement detail report generated by the PPM system of FIG. 1. [0026]
  • FIG. 16 is a depiction of an acknowledgement summary report generated by the PPM system of FIG. 1. [0027]
  • FIG. 17 is a depiction of a summary review form generated by the PPM system of FIG. 1.[0028]
  • DETAILED DESCRIPTION OF THE INVENTION
  • Performance Management is the effective deployment of the right people, processes, and technology to develop employees for optimal results. Employees who achieve outstanding business results will earn more money. The performance management process ensures a consistent, standardized method by which Agents' performance is measured and specific improvement opportunity feedback is provided. The benefits of utilizing the performance management process include consistency in feedback and coaching to employees across the organization; employees being able to review their status and consequently feeling they have more control over their ratings; empowered employees, resulting in improved morale and job satisfaction; improved performance; and reduced attrition. [0029]
  • Turning to the Drawings, wherein like numerals denote similar components throughout the several views, in FIG. 1, a program performance management (PPM) system [0030] 10 (aka “Metrex”) is functionally depicted as advantageously leveraging a broad range of quantitative data sources available to a Consolidated Reporting Database (CRDB) 12 as part of a customer management system (CMS) network 14. In particular, the CRDB system 12 is a reporting tool utilized to access multiple project reports and to maintain accurate team and employee listings. The accurate listings are important when accessing Agent-level PPM performance data. The existing CRDB system 12 provides benefits that include creation of reports by pulling from other sources, thereby eliminating the need for manual input of data; reduction in the time needed to pull and produce reports by pulling together data from existing systems into one place; maintenance of accurate team and agent identifiers (IDs); and allowance for custom reporting.
  • The CRDB [0031] system 12 interfaces with a number of components, processes, or systems from which information may be received that has bearing on agent (i.e., employee), team leader (i.e., supervisor), project, and management performance. First, in an exemplary group of inputs, a Time Keeping System (TKS) 16 is used for payroll functions. In addition to being a source of absence and tardy data on each agent, the TKS system 16 may detail time spent coaching, in meetings, in training, or on administrative tasks. There may be other Absentee/Tardiness Tracking components 18 that augment what is available from a payroll-focused capability. For example, “clocking in” may be performed at a time and place removed from the actual worksite, with more detailed information being available based on an agent's interaction with a log-in function at their station.
  • A team leader maintains a staffing/[0032] scheduling process 20, such as DIGITAL SOLUTIONS by ______, to manage the schedule adherence of team members and to document any feedback to Agents, thereby enhancing the team statistics and managing the team efficiently. For absences, an agent calls an Interactive Voice Recognition (IVR) interface to report that he will be absent. If the Agent is absent for consecutive days, the Agent's file in the staffing/scheduling process 20 is maintained to adjust the number of occurrences, including adjustments for agent earnbacks and exceptions for approved leaves of absence, such as under the Family and Medical Leave Act (FMLA). Other types of absence data maintained include No Call, No Show (NCNS) for an entire shift as well as showing up late (i.e., tardy).
  • The [0033] CRDB system 12 may advantageously interface to a Human Resources (HR) system 22 that provides guidelines associated with leaves of absence, appropriate feedback procedures, and other attendance policies. The HR system 22 also provides updates on attrition, hiring, transfers, etc.
  • The amount of time each agent spends handling inbound calls is logged by an Automated Call Distribution (ACD) [0034] system 24. Similarly, the amount of time each agent spends handling outbound calls is logged by Dialers 26. Sales made in response to an ACD call are tracked by a Sales system 28. Similarly, a wider range of agent contacts may be managed, such as customer contacts initiated by email or a website form, on a Contact Management System 30. Agents are to disposition all customer contacts in an Information Technology (IT) outlet so that a comparison against all calls handled by the ACD shows that all were dispositioned.
  • In addition to the range of quantitative information that represents agent performance, qualitative information is gathered about the agent, depicted as a [0035] quality system 32. One source of assessments of agent performance may be observations input by a team leader. Another may be by a quality assurance (QA) entity.
  • These sources of information allow the [0036] CRDB system 12 to maintain a range of reports: headcount reporting, attrition reporting, agent profile reporting, supervisory hierarchy reporting, advisor reporting, CMS ACD reporting, TKS reporting, IVR reporting, and Performance Management Reporting. The latter is produced by the PPM system 10 in conjunction with unique PPM observations 34, PPM tabulations 36, and PPM review tracking 38.
  • The data and reporting capabilities of the [0037] CRDB system 12 and PPM system 10 are interactively available to great advantage to administrators who may customize the PPM system via a PPM system manual input system 40 with manual inputs 42, such as selecting which measures are to be assessed, the weightings to be applied to the measures, target ranges for grading the weighted measures, and enabling inputs of qualitative assessments, such as comments and enhanced data capture.
  • In addition, agents may access via an agent on-[0038] line review system 44 various non-PPM utilities 46, such as time keeping system information, schedule, paid time off (PTO), unpaid time off, attendance, and a Human Resources portal to assignment and policy guidance. On a frequent basis, the agent may access or be automatically provided acknowledgement feedback forms 48 as follow-up to supervisory feedback sessions (see FIGS. 5, 6, 7), as well as a performance feedback sheet that shows trends in performance. (See FIG. 8.) In addition, the agent may make frequent reference to an agent dashboard 50 that comprehensively and intuitively depicts the agent's performance as compared to targets and as compared to his peers on the team.
  • A team leader interacts with the [0039] PPM system 10 through a supervision/management computer 52 to update and monitor agent performance on an agent scorecard 54. When performance indications from the scorecard warrant corrective action, the team leader performs root cause analysis, initiates a corrective action plan with the agent, and inputs feedback acknowledgment tracking forms 56 into the PPM system 10. (See FIGS. 9, 10.) The team leader or his management may also access PPM reports 58, such as program performance month to date, project scorecard status, scorecard measures applied/not applied, feedback status report, semi-annual performance appraisal, and semi-annual review ranking. (See FIG. 11.)
  • In FIG. 2, a sequence of operations, or [0040] PPM process 100, is implemented by the PPM system 10 of FIG. 1 to effectively supervise and manage employees. It should be appreciated that the process 100 is depicted as a sequential series of steps between a team leader and an agent for clarity; however, the PPM process 100 is iteratively performed across an enterprise with certain steps prompted for frequent updates.
  • In [0041] block 102, maintenance of a consolidated reporting database is performed so that organizational and performance-related information is available, for example maintaining employee or agent identifiers (IDs), a supervision hierarchy, and project assignments, of which there may be more than one per employee. Typically, a team leader periodically reviews a listing of his direct reports maintained in a time keeping system to make sure that it is accurate, taking appropriate steps to initiate a change if warranted.
  • In [0042] block 104, an administrator of the PPM system may customize which measures are used, the weightings given to these measures for a combined score, target ranges for evaluating the degree of success for each measure, implementations that designate how, when, and by whom observations/comments are incorporated into the combined score, and other enhanced data capture (EDC) features.
  • With the PPM system prepared, automatic performance data is compiled (block [0043] 106) based on Effectiveness data 108, Efficiency data 110, and Attendance data 112. These measures are rolled up as well into a similar performance tracking record for the team leader's performance data (block 114). In addition to quantitative measures, manual (qualitative) performance data is compiled (block 116) from Quality data 118 and Professionalism data 120, both typically input by the team leader and/or other evaluators such as QA. In the illustrative version, a scorecard has five categories, which total 100 points. At a high level, the Quality, Effectiveness, and Efficiency categories may be broken out in any proportion, but the categories must add up to 100%. In the illustrative version, an 80% share is divided among three categories: Quality (based on overall quality score), Effectiveness (based on Average Handle Time (AHT) and After Call Work (ACW)), and Efficiency (based on schedule adherence). Ten percent is Attendance (based on tardiness and absences). The final ten percent is Professionalism (based on teamwork and integrity). However, it should be appreciated that these specific characteristics and percentages are exemplary only and that other combinations may be selected for a specific application consistent with aspects of the invention.
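  • For illustration only, the exemplary five-category, 100-point scorecard described above may be sketched in Python as follows; the per-category split of the 80% share, the function name, and the variable names are assumptions made for this sketch and are not mandated by the PPM system.

```python
# Illustrative sketch of the exemplary five-category, 100-point scorecard.
# The exact division of the 80% share among Quality, Effectiveness, and
# Efficiency is assumed here; any split totaling 100% may be used.
CATEGORY_WEIGHTS = {
    "quality": 0.40,          # assumed share of the 80% block
    "effectiveness": 0.20,    # assumed share (AHT, ACW)
    "efficiency": 0.20,       # assumed share (schedule adherence)
    "attendance": 0.10,       # per the illustrative version
    "professionalism": 0.10,  # per the illustrative version
}

def composite_score(category_scores):
    """Combine per-category scores (each 0-100) into a 100-point total."""
    assert abs(sum(CATEGORY_WEIGHTS.values()) - 1.0) < 1e-9  # categories total 100%
    return sum(CATEGORY_WEIGHTS[c] * category_scores[c] for c in CATEGORY_WEIGHTS)

print(composite_score({"quality": 95, "effectiveness": 88, "efficiency": 92,
                       "attendance": 100, "professionalism": 90}))  # 93.0
```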
  • In [0044] block 122, managers have the ability to apply or not apply measures. This provides management the flexibility to compensate for elements outside an employee's control and to correct input errors for manual measures. A “Scorecard Measures Apply/Not Apply” report is available to ensure that this function is used properly. There are a few instances when scorecard measures may need to be excluded from the scorecard; some examples that illustrate when a measure may need to be “not applied” follow. (See FIG. 12.) When an employee works in a temporary assignment that will not extend past 30 days, it may be appropriate, depending on the circumstances, to not apply the second scorecard's Quality and Efficiency measures. (Note: the system automatically generates another scorecard when an employee works on another team or project that has an existing scorecard.) If a manager inputs a manual measure twice for the same month, one of the duplicate measures may be marked as “not applied”. If something outside of employees' control has impacted a specific measure across the majority of the project, the measure may need to be not applied for the entire project.
  • There are several impacts that occur when a measure is not applied. A measure that is “Not Applied” will not populate on the scorecard. The scorecard automatically changes its weightings, and only applied measures will be totaled. Not-applied measures will exclude the data for that measure on higher-level scorecards (i.e., Team Leader, Operations Manager, etc.) and on all types of project- or team-level reporting. Managers use the Metrex system to apply or not apply measures: the Employee Performance and Attendance folder may be selected, and then the “Employee Scorecard” chosen for Agents and the “Management Scorecard” for Team Leaders and above. [0045]
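  • A minimal sketch of the re-weighting behavior described above, assuming that weights are simply rescaled over the remaining applied measures; the function and measure names are hypothetical and the actual Metrex rules are not reproduced here.

```python
def renormalize_weights(weights, not_applied):
    """Drop 'not applied' measures and rescale the remaining weights to total 100%."""
    applied = {m: w for m, w in weights.items() if m not in not_applied}
    total = sum(applied.values())
    return {m: w / total for m, w in applied.items()}

# Example: a second, short-lived scorecard with Quality and Efficiency not applied.
weights = {"quality": 0.40, "effectiveness": 0.20, "efficiency": 0.20,
           "attendance": 0.10, "professionalism": 0.10}
print(renormalize_weights(weights, {"quality", "efficiency"}))
# {'effectiveness': 0.5, 'attendance': 0.25, 'professionalism': 0.25}
```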
  • In [0046] block 124, agent measures are calculated to determine how the agent compares against the standards and against their peers for the current and historical rating periods.
  • Quality Score. [0047]
  • A quality score is derived by pulling the overall quality score from either e-Talk (Advisor), Metrex Observations or EDC (Enhanced Data Capture). The final score is the average of all quality evaluations for an Agent within the month. An exemplary formula is: [0048]
  • (QA OVERALL QUALITY SCORE+TEAM LEADER OVERALL QUALITY SCORE)/(QA OVERALL # OF MONITORINGS+TL OVERALL # OF MONITORINGS)
  • The above-described formula pulls automatically from either Advisor or Metrex Observation. If a system other than those mentioned above is utilized, manual entry may be necessary. In the illustrative embodiment, each measure has a set of five ranges that are possible to achieve, corresponding to grades of 5, 4, 3, 2, and 1, and having the following names, respectively: Key Contributor (“KC”), Quality Plus Contributor (“QPC”), Quality Contributor (“QC”), Contribution Below Expectations (“CBE”), and Contribution Needs Immediate Improvement (“CNII”). Suggested Targets are for KC: 100%-97%; QPC: 96%-95%; QC: 94%-87%; CBE: 86%-82%; CNII: 81%-0%. [0049]
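  • As a sketch of the exemplary quality formula and the five named grade ranges, the following Python functions may be used; the function names are illustrative, and the inputs are assumed to be the summed evaluation scores and monitoring counts from QA and the Team Leader.

```python
def quality_score(qa_score_total, tl_score_total, qa_monitorings, tl_monitorings):
    """Exemplary quality formula: summed QA and TL scores over total monitorings."""
    return (qa_score_total + tl_score_total) / (qa_monitorings + tl_monitorings)

def quality_grade(score_pct):
    """Map a percentage to the suggested target ranges named above."""
    if score_pct >= 97: return "KC"    # 100%-97%
    if score_pct >= 95: return "QPC"   # 96%-95%
    if score_pct >= 87: return "QC"    # 94%-87%
    if score_pct >= 82: return "CBE"   # 86%-82%
    return "CNII"                      # 81%-0%

print(quality_grade(quality_score(475, 380, 5, 4)))  # 95.0 -> "QPC"
```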
  • Efficiency Category [0050]
  • Inbound Average Handle Time (AHT) is the length of time it takes for an Agent to handle a call. There are various factors that affect inbound AHT. The formula below outlines the most inclusive factors for providing the complete calculation for inbound AHT. An exemplary formula is: [0051]
  • (I_ACDTIME+DA_ACDTIME+I_ACDAUX_OUTTIME+I_ACDOTHERTIME+I_ACWTIME+I_DA_ACWTIME+TI_AUXTIME)/(ACDCALLS+DA_ACDCALLS).
  • With regard to the above-described formula, the Inbound AHT calculation captures three components: ACD time, which includes the time an Agent spends calling out during a call; Hold time, which includes all of the activities an Agent performs while a call is on hold; and After Call Work (ACW) time. The latter includes potential inbound (IB) or outbound (OB) non-ACD calls made to complete the customer's call, non-ACD calls made or received while in the ACW mode, and time in ACW while the Agent is not actively working an ACD call. [0052]
  • AUX time includes all of the AUX time captured no matter what the Agent is doing (i.e., including making or receiving non-ACD calls). The value of capturing all of the AUX time is the accountability that it creates for the Agents. It drives proper and accurate phone usage by Agents. [0053]
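  • The exemplary inbound AHT formula may be expressed as the following sketch, where the CMS field names from the formula are assumed to arrive as keys of a dictionary of per-Agent totals (times in seconds):

```python
def inbound_aht(r):
    """Inbound Average Handle Time per the exemplary formula above."""
    handled_time = (r["I_ACDTIME"] + r["DA_ACDTIME"] + r["I_ACDAUX_OUTTIME"]
                    + r["I_ACDOTHERTIME"] + r["I_ACWTIME"] + r["I_DA_ACWTIME"]
                    + r["TI_AUXTIME"])
    return handled_time / (r["ACDCALLS"] + r["DA_ACDCALLS"])
```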
  • Outbound Average Handle Time (AHT) is the length of time it takes for an Agent to handle a call. There are various factors that affect outbound AHT. The formula below outlines the most inclusive factors for providing the complete calculation for outbound AHT. An exemplary formula is: [0054]
  • (ACW TIME+AUX OUT TIME)/(AUX OUT CALLS+ACW OUT CALLS)
  • With regard to the above-described formula, the Outbound AHT captures the total time an Agent spends on a call while logged into the switch but not handling regular Inbound ACD calls. The ACW Time contains all of the time an Agent is in ACW, while logged into the phone, placing a call, and the actual Talk Time of that call. The AUX Out Time contains all of the time an Agent is in AUX placing calls and talking on calls. ACW and AUX are the only modes that Agents can place themselves in and still be able to place outbound calls. [0055]
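  • A corresponding sketch of the exemplary outbound AHT formula, again assuming a dictionary of per-Agent totals keyed by the field names in the formula:

```python
def outbound_aht(r):
    """Outbound Average Handle Time per the exemplary formula above."""
    return (r["ACW_TIME"] + r["AUX_OUT_TIME"]) / (r["AUX_OUT_CALLS"] + r["ACW_OUT_CALLS"])
```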
  • The After Call Work (ACW) percentage is the percent of time an Agent spends in ACW following an ACD call. It measures the percentage of actual online time an Agent spends in ACW without counting AUX time. This provides a clean view of an Agent's use of ACW to handle actual calls and removes the various activities that may be performed, while an Agent is in AUX. An exemplary formula is: [0056]
  • (I_ACW_TIME+DA_ACW_TIME)*100/(TI_STAFF_TIME−TI_AUX_TIME−AUX_IN_TIME−AUX_OUT_TIME)
  • With regard to the above-described formula, the ACW % measure captures the Agent's total ACW time and calculates the percentage by dividing the total ACW time by the Agent's Staff time removing the Total AUX time to create a pure online time then multiplying by 100 to create the percentage figure. Suggested Targets are KC: 0-10%; QPC: 11%-15%; QC: 16%-20%; CBE: 21%-25%; CNII: 26%-above. [0057]
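  • The ACW percentage formula may be sketched as follows; the denominator is the “pure” online time described above, computed exactly as written in the exemplary formula:

```python
def acw_percent(r):
    """ACW %: ACW time over pure online time, expressed as a percentage."""
    pure_online = (r["TI_STAFF_TIME"] - r["TI_AUX_TIME"]
                   - r["AUX_IN_TIME"] - r["AUX_OUT_TIME"])
    return (r["I_ACW_TIME"] + r["DA_ACW_TIME"]) * 100 / pure_online
```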
  • Average After Call Work (ACW) is an actual average of the time an Agent spends in ACW following an ACD call. The average ACW measure provides the average number of seconds in ACW and is an accurate view of the actual time an Agent spends in ACW. For projects that bill for ACW, this measure provides a quick view of the potential ACW that may be included on the bill. An exemplary formula is: [0058]
  • (I_ACW_TIME+DA_ACW_TIME)/(ACD_CALLS+DA_ACD_CALLS)
  • With regard to the above-described formula, Average ACW captures the Agent's total ACW time and calculates the average by dividing the ACW time by the total ACD calls the Agent receives. This provides the Agent's average, which can be used for projected billing when applicable. AUX time is the time an Agent spends in AUX work logged into the Split. True AUX time, which is the time an Agent spends doing various activities, provides an accurate view of the time Agents spend performing activities other than actual calls. An exemplary formula is: [0059]
  • (TI_AUX_TIME−AUX_IN_TIME−AUX_OUT_TIME)*100/TI_STAFF_TIME
  • With regard to the above-described formula, I_AUX time includes I_AUX_In time and I_AUX_Out time. AUX_In time and AUX_Out time are actually time spent by an Agent placing or receiving non-ACD calls, so to capture true AUX these two components must be removed from the total AUX time. AUX time captures all of the AUX reason codes to prevent Agents from selecting codes not reported. Suggested Targets are KC: 0-4%; QPC: 5%-7%; QC: 8%-11%; CBE: 12%-15%; CNII: 16%-above. [0060]
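  • The average ACW and “true AUX” calculations described above may be sketched together; as before, the CMS field names are taken from the exemplary formulas and the dictionary representation is an assumption:

```python
def average_acw(r):
    """Average seconds of ACW per ACD call, per the exemplary formula."""
    return (r["I_ACW_TIME"] + r["DA_ACW_TIME"]) / (r["ACD_CALLS"] + r["DA_ACD_CALLS"])

def true_aux_percent(r):
    """True AUX %: AUX time less non-ACD call time, over staff time, as a percentage."""
    true_aux = r["TI_AUX_TIME"] - r["AUX_IN_TIME"] - r["AUX_OUT_TIME"]
    return true_aux * 100 / r["TI_STAFF_TIME"]
```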
  • Average Talk Time (ATT) measures the actual time spent by Agents talking to customers on ACD calls. This provides a clear view of the time Agents spend talking on calls and can be used to ensure that Agents are controlling the calls. An exemplary formula is: [0061]
  • (ACD_TIME+DA_ACD_TIME)/(ACD_CALLS+DA_ACD_CALLS)
  • With regard to the above-described formula, ATT captures the Agent's Total Talk time as measured in CMS (Call Management System) and divides the result by the total number of ACD calls the Agent receives. It pulls the data directly from CMS without any components being added or removed. This makes it a pure measure of the Agent's actual time with the customer. [0062]
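  • A one-line sketch of the Average Talk Time formula, using the same assumed dictionary of CMS totals:

```python
def average_talk_time(r):
    """Average Talk Time: total ACD talk time divided by total ACD calls."""
    return (r["ACD_TIME"] + r["DA_ACD_TIME"]) / (r["ACD_CALLS"] + r["DA_ACD_CALLS"])
```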
  • Information Technology (IT) Sales Conversion is the percentage of sales in IT to ACD calls received by the Agent. This measure may contain Interlata, Intralata, or combined total sales. The sales type names contained in IT must be determined when a specific sales type conversion is desired such as Intralata conversion only. For example, the data label for the various sales types may be referred to as APIC rather than Intralata, etc. An exemplary formula is: [0063]
  • (Number of Sales)*100/(ACD Calls) or (Number of Sales)*100/(IT Calls)
  • With regard to the above-described formula, IT Sales Conversion captures all sales types in IT for the project and then divides that by the total ACD Calls In or IT Calls, whichever is applicable, then calculates the percentage. A specific sales conversion can be calculated using the same calculation by selecting the appropriate sales type when setting up the measure in the Agent's scorecard. [0064]
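  • The IT Sales Conversion calculation may be sketched as follows; whether ACD calls or IT calls form the denominator depends on how the measure is set up, so both are accepted here as an assumption:

```python
def it_sales_conversion(number_of_sales, acd_calls=None, it_calls=None):
    """Sales conversion %: sales over ACD calls or IT calls, whichever is applicable."""
    denominator = it_calls if it_calls is not None else acd_calls
    return number_of_sales * 100 / denominator

print(it_sales_conversion(42, acd_calls=600))  # 7.0 (percent)
```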
  • The total calls dispositioned in IT vs. CMS (Call Management System) provides a measure to confirm whether an Agent is or is not adhering to the call dispositioning step in the Agent's call handling procedures. The goal should be around 100% to ensure that all CMS calls are being properly dispositioned in IT. An exemplary formula is: [0065]
  • IT CALLS*100/(ACD CALLS)
  • With regard to the above-described formula, the measure is the total number of calls dispositioned in IT divided by the total number of CMS calls received by an Agent, then multiplied by 100. [0066]
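  • A minimal sketch of the dispositioning conformance measure; a result near 100% indicates that all ACD calls are being dispositioned in IT:

```python
def disposition_rate(it_calls, acd_calls):
    """Percent of ACD (CMS) calls dispositioned in IT."""
    return it_calls * 100 / acd_calls
```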
  • Effectiveness Category [0067]
  • Agent Productivity is often referred to in many projects as “Adjusted Agent Yield”. This measure is intended to measure the actual online productivity of an Agent when handling calls. It is not an overall Billing Yield of an Agent. Therefore, productive time in TKS is the only time used in this calculation. An exemplary formula is: [0068]
  • (CMS STAFF TIME+TKS PRODUCTIVE TIME)*100/(TOTAL TKS TIME)
  • With regard to the above-described formula, Agent Productivity captures an Agent's total Staff time from CMS, adds that to the Agent's actual customer-handling productive time in TKS (which includes mail, e-mail, and data entry), divides that total by the “clock_in seconds” or total TKS time, and then multiplies by 100 to provide a percentage format. Suggested Targets are KC: 100%-93%; QPC: 92%-90%; QC: 89%-85%; CBE: 84%-80%; CNII: 79%-below. [0069]
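  • A sketch of the Agent Productivity formula, with all times assumed to be in seconds and “total TKS time” corresponding to the clock-in seconds described above:

```python
def agent_productivity(cms_staff_time, tks_productive_time, total_tks_time):
    """Agent Productivity %: online plus productive offline time over total TKS time."""
    return (cms_staff_time + tks_productive_time) * 100 / total_tks_time
```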
  • Billing Yield is used to determine the actual billable work of an Agent by capturing all billable time for an Agent including team meetings, training, offline non-customer handling time, etc. This measure is not intended to provide an Agent Yield, which is captured in the Agent Productivity measure. An exemplary formula is: [0070]
  • (TI_STAFF_TIME+(TKS_BILLABLE−TKS_ONLINE))/(TKS_PAID)
  • With regard to the above-described formula, Billing Yield is calculated by taking an Agent's Total Staff time from CMS, adding this to the Agent's total billable TKS time, and then removing the online time from TKS to avoid double counting of online time. This total is then divided by the Agent's total TKS. Suggested Targets are KC: 100%-96%; QPC: 95%-93%; QC: 92%-88%; CBE: 87%-83%; CNII: 82%-below. [0071]
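  • The Billing Yield formula may be sketched as follows; the result is a ratio, which may be multiplied by 100 when comparing against the percentage targets above (an assumption for presentation only):

```python
def billing_yield(ti_staff_time, tks_billable, tks_online, tks_paid):
    """Billing Yield: CMS staff time plus non-online billable TKS time, over paid TKS time."""
    return (ti_staff_time + (tks_billable - tks_online)) / tks_paid
```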
  • Schedule Adherence reflects an Agent's actual adherence to their schedules utilized by Work Force Management. It is important to maintain accurate schedules in WFM and to notify the Command Center immediately of changes, as this measure will be negatively impacted by any change. An exemplary formula is: [0072]
  • (Open In+Other In)*100/(Open In+Open Out+Other In+Other Out)
  • Note: in other words, all of the time in adherence is divided by total scheduled time. With regard to the above-described formula, Schedule Adherence is calculated using the following data from IEX: the total minutes in adherence (i.e., the total number of minutes the scheduled activity matches the actual activity) are compared to the total minutes scheduled, and the result is multiplied by 100. Suggested Targets are KC: 100%-95%; QPC: 94%-93%; QC: 92%-90%; CBE: 89%-87%; CNII: 86%-below. [0073]
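  • A sketch of the Schedule Adherence formula, using the IEX adherence buckets named above (minutes in adherence over total scheduled minutes):

```python
def schedule_adherence(open_in, open_out, other_in, other_out):
    """Schedule Adherence %: minutes in adherence over total scheduled minutes."""
    return (open_in + other_in) * 100 / (open_in + open_out + other_in + other_out)
```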
  • Staffed to Hours Paid (HP) provides an overall view of the online Agent's daily time spent logged into CMS compared to the Agent's total day in TKS to determine whether or not the Agent is logging into the phones for the appropriate portion of the day. It is not intended to replace Schedule Adherence, but it provides a payroll view of an Agent's activities similar to Agent Productivity. An exemplary formula is: [0074]
  • (TOTAL STAFFED TIME)*100/(TOTAL_TK_DAY_SECONDS)
  • With regard to the above-described formula, Staffed to HP captures the Agent's Total Staff time in CMS, divided by the Agent's total TKS for the day, multiplied by 100. Suggested Targets are KC: 100%-90%; QPC: 89%-87%; QC: 86%-82%; CBE: 81%-77%; and CNII: 76%-below. [0075]
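  • The Staffed to Hours Paid calculation, sketched with the field names from the exemplary formula (times assumed to be in seconds):

```python
def staffed_to_hours_paid(total_staffed_time, total_tk_day_seconds):
    """Staffed to HP %: CMS staffed time over the Agent's total TKS day."""
    return total_staffed_time * 100 / total_tk_day_seconds
```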
  • Attendance is a direct feed from the Digital Solutions system (i.e., Attendance IVR). The feed captures occurrences, which are applied to the Agent's scorecard. The occurrences will only be correct when Team Leaders maintain the Digital Solutions web site. Attendance is a mandatory measure and is composed of Absences and Tardies. The formula for Attendance is based on the total number of tardies and absences in a calendar month. Tardies and Absences are applied directly to the automated scorecard from Digital Solutions. If Team Leaders do not maintain Digital Solutions on a daily basis for their Agents, the Agent's scorecard occurrence count will be inaccurate. [0076]
  • The professionalism category assists Team Leaders in measuring Agents' performance relative to core values. There are 5 skills (i.e., Unparalleled Client Satisfaction, Teamwork, Respect for the Individual, Diversity, and Integrity), which Team Leaders manually enter into the system periodically (e.g., monthly). An example of a formula for professionalism is: Unparalleled Client Satisfaction (2 Pts)+Teamwork (2 Pts)+Respect For The Individual (2 Pts)+Diversity (2 Pts)+Integrity (2 Pts)=10 Total Points Possible. These measures compose 10% of an Agent's scorecard. [0077]
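  • The professionalism total may be sketched as a simple sum of the five skills, each capped at 2 points; the dictionary keys mirror the skill names above, and the cap is an assumption drawn from the 2-point-per-skill example:

```python
PROFESSIONALISM_SKILLS = ("Unparalleled Client Satisfaction", "Teamwork",
                          "Respect for the Individual", "Diversity", "Integrity")

def professionalism_points(scores):
    """Sum the five skills, each worth up to 2 points (10 points possible)."""
    return sum(min(scores.get(skill, 0), 2) for skill in PROFESSIONALISM_SKILLS)

print(professionalism_points({"Unparalleled Client Satisfaction": 2, "Teamwork": 2,
                              "Respect for the Individual": 2, "Diversity": 1,
                              "Integrity": 2}))  # 9
```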
  • Team Leader Measures [0078]
  • All Agent measures in the Quality, Effectiveness, and Efficiency categories roll up to the Team Leader's scorecard. In addition, the Team Leader is evaluated for Attendance and Professionalism. For Attendance, lost hours are tracked, with the target being a low percentage if Team Leaders are using their scheduling system (e.g., DIGITAL SOLUTIONS) effectively. An exemplary formula is: [0079]
  • (IEX SCHEDULED TIME−TKS TOTAL TIME WORKED)/(IEX TOTAL TIME SCHEDULED)
  • With regard to the above-described formula, IEX Scheduled time is the amount of time an Agent is scheduled to work. To alter the scheduled time, Team Leaders (TL) make adjustments to Digital Solutions. The adjustments are picked up by the Command Center and applied to their IEX Schedule. The actual TKS worked hours are subtracted out of the scheduled time to create the numerator. If a TL has maintained an Agent's schedule properly in Digital Solutions, the Lost Hours % should be a low number. [0080]
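  • A sketch of the lost-hours calculation for the Team Leader's Attendance measure; multiplying by 100 to express the ratio as a percentage is an assumption for presentation:

```python
def lost_hours_percent(iex_scheduled_time, tks_total_time_worked):
    """Lost Hours %: scheduled time not actually worked, over total scheduled time."""
    return (iex_scheduled_time - tks_total_time_worked) * 100 / iex_scheduled_time
```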
  • Professionalism Category [0081]
  • The professionalism category has been developed to assist Operations Managers in measuring Team Leaders' performance relative to Convergys' core values. There are 5 skills (i.e., Unparalleled Client Satisfaction, Teamwork, Respect for the Individual, Diversity, and Integrity), which Operations Managers enter into the system manually. An exemplary formula is: [0082]
  • Unparalleled Client Satisfaction (2 pts)+Teamwork (2 pts)+Respect for the Individual (2 pts)+Diversity (2 pts)+Integrity (2 pts)=10 Total Points Possible
  • With regard to the above-described formula, Operations Managers input the manual professionalism measures monthly. These measures compose 10% of a Team Leader's scorecard. [0083]
  • With the cross referencing associated with the events tracked, a number of performance analysis tools are made available, for instance an [0084] agent scorecard 126 that allows a team leader or manager to review the performance summaries and status of a number of employees. Agent trending reports 128 provide indications of whether a substandard performance is improving or becoming more of a problem. (See FIG. 13.) Different demographic cross sections may be selected, such as an account report 130, so that managers can see, for instance, how particular clients of an outsourced service are being served by assigned employees. (See FIG. 14.)
  • These calculations and comparisons are intuitively plotted in [0085] block 132 and displayed as an agent dashboard GUI 134 that gives an agent and team leader a frequently accessed and up-to-date snapshot of their current standing relative to the standards and to their peers. Also associated with these performance results are agent performance feedback items 136 that are created by the team leader and acknowledged by the agent to memorialize coaching for improved performance.
  • In [0086] block 138, the team leader may reference these indications from the PPM system in order to perform root cause analysis. Determining the root cause of any problem helps ensure that it does not recur in the future. Root Cause Analysis is useful in helping employees achieve the performance goals set by the project. A root cause analysis may be completed whenever an employee's performance is not meeting the guidelines set by the project. One technique for determining the root cause of a problem is to ask why three to five times, thereby eliminating the surface reasons for missing a target and identifying the root cause. The following is a list of tools that can help determine the root cause: brainstorming, cause and effect analysis (fishbone diagram), histograms, graphs, Pareto diagrams, and checklists. Several steps are useful in conducting root cause analysis: (a) Enlist individuals to help in the root cause analysis. Include individuals that are directly affected by the outcome of the actions to be taken (e.g., a Subject Matter Expert, another Team Leader, and/or an Operations Manager). (b) Conduct cause and effect analysis or use any of the helpful tools mentioned in this section. (c) Select the potential causes most likely to have the greatest impact on the problem. Note: It is not enough to identify that the root cause is present when the problem occurs; it must also be absent when the problem does not occur. (d) Create and implement an action plan to address the root causes. The action plan may be reviewed to ensure that the corrective actions do not cause more problems.
  • An example of a root cause analysis checklist may be the following inquiries: [0087]
  • (a) Is there a performance gap (i.e., basis, difference from target)? If so, what is the performance gap? If not, no further analysis is required. (b) Is it worth the time and effort to improve (i.e., importance, cost, consequence if ignored, effect if corrected)? If yes, consider further its importance. If no, do not waste time and effort. (c) Does the Team Member know that the performance is less than satisfactory (e.g., feedback given to the team member, team member aware of unsatisfactory performance)? If yes, consider the basis for how you know the team member is aware that his performance is less than satisfactory. If not, provide appropriate feedback to the team member. (d) Does the Team Member know what is supposed to be done and when (i.e., have objectives and standards been defined, mutually agreed upon, and clearly stated)? If yes, how do you know the Team Member knows what is supposed to be done and when? If not, set clear goals, objectives, and standards with the Team Member to clarify expectations. (e) Are there obstacles beyond the Team Member's control (e.g., conflicting demands, team member lacks necessary authority, time and/or tools, environmental interference such as noise or poor lighting, outdated or unduly restrictive policies in place)? If not, what have you done to verify that there are no obstacles? If so, take appropriate action to remove the obstacles. (f) Are there negative consequences or a lack of positive consequences following positive performance, and in particular, how does the team member feel about the rewards for performance? If so, change the consequences, such as rewarding positive performance, and work with the Team Member to provide appropriate support and create a developmental plan. If not, eliminate this reason as a possibility for poor performance from the Team Member. (g) Are there positive consequences following non-performance? Specifically, is this team member receiving rewards or avoiding negative consequences even though performance is poor, or do they perceive other team members as doing so? What reward is the Team Member or other team members receiving for non-performance? How will you change the consequences? If yes, for instance if someone else does the work when the Team Member does not do it, then change the consequences, communicate expectations to the Team Member, and create a developmental plan. If not, eliminate this as a possibility for poor performance from the Team Member. (h) Does the team member understand the consequences of poor performance? How will the Team Member change the performance? What will you do to provide coaching? If not, work with the Team Member to define consequences and create a developmental plan. If so, stop here and consider lack of motivation as the problem for poor performance. (i) Is the Team Member willing to undertake appropriate change? If yes, work with the team member to create a developmental plan. If not, terminate or transfer the team member, or live with the performance as it is. [0088]
    TABLE 1
    Quality
    Measurement: Call Quality
      Review: Agent Knowledge
      Action: Determine if changes to procedures have been reviewed with Agents. Determine if Agents understand each element of the call flow and the system. Determine if Agents are rushing through the calls.
      Review: Expectations
      Action: Determine if the types of improvement opportunities are clearly defined and understood by Agents.
      Review: Other Measures
      Action: Review the following measures to determine their impact on Call Quality (e.g., After Call Work, Average Handle Time, Attrition).
      Review: Quality Results
      Action: Meet with Agents to discuss trends and identify the root cause.
      Review: Staffing
      Action: Review the schedule to determine if appropriately staffed so that the Agent is not tempted to rush through the calls (i.e., look at staffing for peak calling periods).
    Measurement: IT vs. CMS Call Dispositioning
      Review: Call Quality
      Action: Monitor calls and follow up to determine if the calls were dispositioned correctly.
      Review: Environment
      Action: Observe the Agents and determine why Agents are not dispositioning the calls (e.g., talking to neighbors, etc.). Meet with Agents and discuss any obstacles in dispositioning calls correctly (e.g., coding issues). Determine if Agents understand the dispositioning procedures.
      Review: Systems
      Action: Determine if the codes in the system accurately reflect the call types.
  • [0089]
    TABLE 3
    Effectiveness
    Measurement: Agent Productivity
      Review: Online Hours
      Action: Verify the Agent was scheduled to work enough hours to be able to meet the goal (i.e., take into consideration training and vacation that may have been scheduled). Determine if off-line activities are affecting Agent Productivity. Review Agents' Log In and Log Out reports to determine if Agents are staying online for the appropriate amount of time.
      Review: Other Measures to Review
      Action: Review the following measures to determine their impact on Agent Productivity: After Call Work, AUX Time, Schedule Adherence, TKS Conformance.
    Measurement: Schedule Adherence
      Review: Agent Changes
      Action: Determine if the Agent's ESC and IEX schedule accurately reflect the Agent's scheduled hours.
      Review: Environment
      Action: Determine if Agents are following the attendance and tardy policy. Observe Agents in their work area to determine if Agents are talking with neighbors instead of logging on to the phones when appropriate. Review Agents' Log In and Log Out reports to determine if Agents are staying online for the appropriate amount of time (i.e., leaving and returning from breaks on time).
      Review: Staffing
      Action: Determine if appropriately staffed to meet the volume.
      Review: Systems
      Action: Determine if everything is entered correctly into Digital Solutions.
      Review: TKS Conformance
      Action: Determine whether Agents are coding time appropriately in TKS. Meet with the Agent to determine why the Agent is not following TKS procedures.
  • [0090]
    TABLE 4
    Attendance
    Measurement: Absenteeism/Tardies
      Review: Workplace Environment
      Action: Meet with Agents to identify root cause and to discuss Agents' concerns.
      Review: Schedule
      Action: Review schedule with Agent to determine if a change to the schedule would eliminate further attendance problems.
  • In [0091] block 140, corrective action plans are used to identify areas for improvement and a timeline in which expectations are to be reached. These plans may answer who, what, when, and where, and consider the conditions and approvals necessary for success. Action planning is used when negative trends are identified in an Agent's performance. Creating a plan establishes a roadmap to achieve excellent call quality. It also ensures an organized, objective implementation. A typical procedure for creating action plans would include: (1) Analyze the proposed improvement or solution; (2) Break it down into steps; (3) Consider the materials and numbers of people involved at each step; (4) Brainstorm, if necessary, for other items of possible significance; (5) Add to the list until the team thinks it is complete; and (6) Follow up frequently to ensure the action plan is completed on time and accurately.
  • In [0092] block 142, the team leader provides feedback on the agent's performance, including any corrective action plans that are to be implemented. Thereafter, the team leader captures the individual feedback items from the feedback session into the PPM system (block 144). The agent is then prompted to acknowledge these items set into the PPM system by his team leader, perhaps with comments of explanation or disagreement (block 146). The PPM system tracks the setting and acknowledgement of feedback (block 148), which supports various reports and interactive screens to facilitate the process, such as acknowledgement pending/completed queues/details 150. (See FIGS. 15, 16.)
  • Periodically, the weekly, monthly, or other cycle of evaluation and feedback is used for a review (e.g., quarterly, semi-annually, annually), which may coincide with compensation bonuses or raises. The PPM system tracks these periodic agent or team leader reviews (block [0093] 152), and therefrom produces various reports or interactive screens to facilitate their use, such as tracking summaries 154 and an agent review ranking report 156.
  • In FIG. 3, an [0094] employee scorecard 200 allows a team leader to select one or more factors, such as project 202, type of employee (e.g., team leader, agent) 204, assigned supervisor 206 (e.g., either the team leader interacting with the screen or another supervisor assigned to the team leader/manager), and a period of review, such as start date 208 and finish date 210. Upon selecting search button 212, a listing of employees is provided (not depicted), typically a listing of agents assigned to the team leader whose log-in identifier enables him to view this subset of employees. One particular agent is selected with a detail employee pull-down 214, with filtering as desired via applied-measures-only yes/no radio buttons 216 and/or an unreviewed (“pending”) radio button 218, and with the listing displayed upon selecting a refresh button 220. For each performance measure, a record 222 is displayed comprised of a time period 224, scorecard description (e.g., project and facility) 226, scorecard measure 228, score 230, grade 232, apply/not apply toggle 234, and manually-entered comment button 236, the latter launching a text entry window for written comments.
  • A [0095] top detail record 222 is shown highlighted as a currently selected record that may be interacted with by buttons 238-248. In particular, a “show daily detail” button 238 will show daily statistics associated with the selected measure. A “show weekly detail” button 240 will show weekly statistics associated with the selected measure. A “show grade scale” button 242 will pop up a legend explaining the grading scale standards to assist in interpreting the grades presented. A “remove scorecard for this employee” button 244 is used early in a feedback period to remove a pending scorecard until completed, and the scorecard is restored with a “restore scorecard for this employee” button 246 when ready to be applied or not applied. An “add alternate project measures” button 248 is not grayed out when the employee is assigned to more than one project. Selecting button 248 allows the designating of the other projects and populating the scorecard with these alternate project measures.
  • In FIG. 4, an [0096] agent dashboard GUI 300 gives an intuitive and comprehensive presentation of the agent's performance as compared to standards and to his peers and can be frequently referenced to instruct on areas needing attention. An individual pie chart 302 summarizes the 5 grade ranges by proportioning their relative weighting and stacking them radially from poor, fair, good, excellent and finally to outstanding. An arrow 304 shows the composite score for the agent, which in this instance falls within the good grade. A similar pie chart 306 is presented that is a summary for all of the team. Category measure summaries of attendance, quality, professionalism, efficiency and effectiveness are summarized by respective percentage values 308-316 for both the agent and the project as well as a grade color-coded bar chart 318.
  • Performance Reports for Management use. [0097]
  • Senior Management Reports and Screens leverage the comprehensive performance data and analysis of agents and team leaders to detect trends and problem areas. First, an Employee Performance Feedback (CRDB) report displays employees' (i.e., managers' and Agents') month-to-date scorecard results and documented feedback, thereby assisting in providing coaching and feedback to employees. Second, an Employee Reviews (CRDB) report provides a summary of an employee's monthly scorecard results by category, overall points achieved, and documented feedback, which assists in providing coaching and feedback to employees on their overall results. (See FIG. 17.) Third, a Program Performance Month-to-Date report provides a roll-up of program-level scorecard data, assisting in managing results across teams and centers. Fourth, an Agent Productivity Management (APM) Cross Reference Detail Report displays the start date of a project and includes the following: switch number, splits, IEX code, IEX team IDs, and TKS and PSF project code, thereby determining where the APM Measures report is pulling information. Fifth, an APM Measures Report with Targets and Charts sorts by Business Unit, center, portfolio, billing unit, and PSFN project code and compares the following information to targets determined by the Project: agent productivity, phone time variance, percent occupancy, call service efficiency, percent of calls forecasted accurately, percent of average handle time forecasted accurately, on-line conformance, and on-line adherence. It assists in tracking a project's efficiency on key measures and in reviewing the accuracy of client forecasts for business planning (i.e., staffing, etc.). Sixth, an APM Trend Report sorts by business unit, center, portfolio, billing unit, and PSFN project code. It provides three months of project trends for the following information: agent productivity, phone time variance, percent occupancy, call service efficiency, percent of calls forecasted accurately, percent of average handle time forecasted accurately, on-line conformance and on-line adherence, on-line diagnostic measures, breakdown of TKS categories, and percent billable and non-billable time. It identifies trends and improvement opportunities. Seventh, a TKS Activity Analysis Report—Detail provides project-level data on where payroll time is being spent (i.e., total coaching hours, meeting hours, training hours, etc.). It assists in conducting project-level analysis to ensure the team is following standard processes and to identify improvement areas. Eighth, a TKS Activity Analysis Report—Summary provides a summary of daily, weekly, and monthly data for TKS data analysis. It also provides interval data and displays statistics on a project's billable time. Ninth, a TKS Agent Productivity Report—Detail provides, by TKS Project code and employee, the following information: manned time, other productive time, productivity, and phone time variance, thereby assisting managers in identifying how a project can be more efficient. Tenth, a TKS Agent Productivity Report—Summary provides, by Business Unit, location, and TKS project(s), the following information: manned time, other productive time, productivity, and phone time variance. It assists managers in identifying how a project can be more efficient. Eleventh, a Yes/No Line Item Trends by Agent, Team Leader, and Project report provides call monitoring line item results of a project's evaluations completed by QA, Team Leader, OJT, and client, and a summary of all evaluations. It assists in conducting analysis on project-level quality results and in identifying areas for improvement. Twelfth, various Agent, Skill, and VDN Reports provide a summary of monthly, weekly, daily, and interval data for ACD and Agent. Thirteenth, a Displayed Project Statistics report assists in managing employees' performance, including displaying billing data for client bills. Fourteenth, various ACD DN, Agent, CDN, and DNIS Reports supply a summary of monthly, weekly, daily, and interval data for ACD, Agent, CDN, and DNIS data, display project statistics, assist in managing employees' performance, and display billing data for client bills. Fifteenth, various Multi-project, Project, Team Leader, and Agent Level Reports identify detailed and summary data daily, weekly, and monthly for key metrics at the following levels: multi-project, a single project, Team Leader, and Agent. They assist in managing the key metrics and analyzing the raw data for the key metrics. Sixteenth, various Business Unit (BU), Center, Project, and Job Category Headcount Reports supply headcount information from CORT at the following levels: business unit, project, and job category. This report should be pulled using the same start and finish date. It assists in verifying the accuracy of the information in CORT and in developing business plans. Seventeenth, a Statistical by Interval and by Summary report supplies data on all calls that went through the IVR and displays IVR usage, conversant routing, etc. Eighteenth, a PTO Report displays paid time off (PTO) information by project, team, and employee, assisting in managing employee PTO days accrued and taken. Nineteenth, various Attrition Reports (e.g., 12 Month Rolling, Calendar, Turnover Analysis, and Employee Term Listings) supply attrition information from CORT at the following levels: business unit, vertical, center, project, and job category, to assist in verifying the accuracy of the information in CORT and to develop action plans and strategic business plans. Twentieth, various Program, Finance, and Activity Summary Reports provide TKS reporting through CRDB for tracking and managing Agent activities, payroll, etc. [0098]
  • PPM Process Conformance is a key objective of several reports that can be used to verify whether managers and projects are complying with the process. First, a Project Scorecard Status report identifies the measures that have populated on the scorecard, retrieves both applied and pending measures, and identifies automated measures that have not populated on an individual's scorecard or that need to be added manually. Second, a Scorecard Measures Exception Report identifies the following types of measures by employee name: Not Applied, Removed, and Pending. It assists in identifying the frequency of unapplied and pending measures and the manager responsible. Third, a Scorecards with Zero Grade report displays employees who have received a zero due to their scores falling outside the grading criteria. It assists in identifying issues that need to be investigated and resolved prior to final scorecard processing. Fourth, a Feedback Status Report identifies, by Team Leader and Agent, the percent of feedback that has been acknowledged in the system, which assists in coaching Team Leaders on providing timely feedback to Agents. Fifth, an Acknowledgement Detail Report identifies acknowledgement type, Event number, status, and by whom it was acknowledged, by project, supervisor, and Agent. It assists in evaluating the status of acknowledgement types and by whom they were acknowledged. Sixth, an Acknowledgement Summary Report displays, by business unit, center, project, supervisor, and Agent, the following: total number of acknowledgements, number of pending acknowledgements, percentage of completed acknowledgements, and number and percentage of acknowledgements completed by a Scorecard Project Coordinator, Team Leader, and Agent. It assists in evaluating the completion percentage of acknowledgements and by whom they were acknowledged. Seventh, a Report Usage by Project & User Type & User report identifies which employees are pulling reports and the reports being reviewed. It assists in providing coaching and feedback to managers and other employees (i.e., Reports Specialists, etc.). Eighth, a Report Usage by Folder BU, Report, Project Level report identifies, by business unit and project level, what folders have been reviewed, thereby assessing the level of CRDB and PPM process usage by a project. [0099]
  • Administrative—Core CRDB Agent Profile Reports identify the structure necessary for scorecards to accurately roll up at each level. First, a Supervisor Hierarchy Report identifies the structure of a specific project, from the Agent level to the President level, providing a quick and easy way to find an employee's manager and determine if the appropriate employees are on the list. Second, a Supervisor Hierarchy Audit Detail report shows, by project, the following employee information: name, Employee Number, active or inactive status, level of authority in CRDB, and Supervisor's Employee Number. It provides a quick view of employee linkages so that projects can verify the accuracy of the Hierarchy report. Third, a CRDB CMS Dictionary provides split, VDN, and skill information at the project level and is utilized as a quick reference tool for managers when discussing changes with Workforce Management, etc. Fourth, a Project and PPM Rollup List by SME shows CRDB SMEs by project, program, sponsor, and Workforce Management group. It displays the CRDB SME to contact when a project needs assistance, displaying Agents, Team Leaders, and Operations Managers only. [0100]
  • Some reports are used strictly by Operations Managers and Team Leaders to manage their employees. First, an Average Quality by Guideline and Evaluator Report identifies when the first and last call monitoring evaluation was completed, the average overall quality score, and the total number of evaluations, thereby assisting in providing coaching and feedback to direct reports on monitoring goals and overall results. Second, a Quality Summary by Agent/Team Leader Report displays, by project, the Team Leaders, their Agents, the number of evaluations completed per Agent, and the average overall quality score from QA, Team Leader (TL), QA & TL, OJT, client, and all evaluations, thereby assisting in managing and providing feedback on project-level, Team Leader-level, and Agent-level results. Third, an Employee Review Rankings report ranks employees against their peers according to the points received on the scorecards on a monthly basis over the six-month period, determining an Agent's appraisal rating within a project. Fourth, a Semi-Annual Performance Appraisal report shows employees' performance over the six-month period, assisting in providing coaching and feedback to employees. Fifth, an Agent Profile by Project report provides the Agent's name, Employee number, system IDs, active status, and Team Leader's name, assisting Managers in troubleshooting why a measure is not displaying on a scorecard. Sixth, a Team Change Request Maintenance report provides a list of Agents and their Team Leaders by project and is used to transfer Agents to other Team Leaders within the same project, as well as to other projects. Seventh, a Manager Approval report provides Operations Managers with a list of pending Agent transfers for approving or denying Agent transfer requests. Eighth, a Delegation of Authority report enables Operations Managers to delegate the authority to approve Agent transfer requests when an Operations Manager is not in the office. Ninth, a Team Change Request Status Report provides a list of Team Change requests and their status, and tracks Team Change requests. [0101]
  • In use, a program performance management (PPM) [0102] system 10 is set up as part of a customer management system (CMS) network 14, leveraging already existing quantitative information regarding employee work activities (e.g., attendance, time engaged in performing specific tasks, scheduling, sales results, etc.). In addition to automatic measures such as efficiency, effectiveness, and attendance, a team leader interacts with an employee scorecard 54 to input manual measures of quality and professionalism. These measures are compiled into a score that may be compared to targets and to the peers of the employee, with the results intuitively presented to the employee on an agent dashboard 134. Feedback acknowledgement is facilitated by the PPM system 10, as well as tracking accomplishment of periodic reviews, with an array of reports available for upper management to evaluate agent, team leader, and project performance.
  • While the present invention has been illustrated by description of several embodiments and while the illustrative embodiments have been described in considerable detail, it is not the intention of the applicant to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications may readily appear to those skilled in the art. [0103]
  • For example, although performance evaluation of agents who perform customer management services (CMS) is illustrated herein, it should be appreciated that aspects of the invention have application to other industries and services.[0104]

Claims (14)

What is claimed is:
1. A method of managing performance of an employee, comprising:
collecting a set of quantitative data generated as a result of employee activities;
collecting a set of qualitative data input characterizing employee performance;
generating a performance grade based on the sets of quantitative and qualitative data; and
displaying an intuitive representation of the performance grade.
2. The method of claim 1, wherein collecting the set of quantitative data comprises collecting customer management service (CMS) information characterizing actions by a customer service agent from a plurality of CMS systems.
3. The method of claim 2, wherein collecting the set of quantitative data further comprises:
receiving time keeping information;
receiving an assigned schedule;
referencing an attendance target; and
generating an attendance score based on a comparison of the time keeping information with the assigned schedule and the attendance target.
4. The method of claim 2, wherein collecting the set of quantitative data further comprises:
receiving call duration information;
receiving time keeping information;
referencing an efficiency target; and
generating an efficiency score based on a comparison of the call duration information with the time keeping information and efficiency target.
5. The method of claim 1, wherein collecting the set of qualitative data input further comprises:
prompting a supervisor to input qualitative performance scores;
accessing qualitative comment entries in response to a supervisor input;
receiving a qualitative entry from the supervisor;
referencing a qualitative target; and
generating a qualitative score based on a comparison of the qualitative entry with the qualitative target.
6. The method of claim 1, wherein collecting the set of quantitative data further comprises:
receiving time keeping information;
receiving on-line time information;
referencing an effectiveness target; and
generating an effectiveness score based on a comparison of the on-line time information with the time keeping information and effectiveness target.
7. The method of claim 1, further comprising excluding a measure in response to a supervisor do-not-apply selection.
8. The method of claim 1, further comprising:
plotting a grading scale based upon a compiled plurality of weighted quantitative and qualitative performance measures; and
displaying an indicator upon the grading scale corresponding to a compiled performance score.
9. The method of claim 8, further comprising:
referencing compiled performance scores for a plurality of individuals assigned to a group;
computing a combined score for the group; and
plotting a grading scale based upon a compiled plurality of weighted quantitative and qualitative performance measures; and
displaying an indicator upon the grading scale corresponding to the computed combined score for the group.
10. The method of claim 1, further comprising assigning the quantitative data to a supervisor of the employee for managing performance of the supervisor.
11. A method of managing performance of an employee, comprising:
displaying performance scores of an employee to a supervisor;
receiving a feedback acknowledgement entry from the supervisor;
prompting the employee to interact with the feedback acknowledgement entry; and
tracking accomplishment of the interaction.
12. The method of claim 1, further comprising:
prompting a supervisor to make a periodic review;
ranking employees in response to the periodic review;
tracking accomplishment of the review; and
reporting the employee rankings for performance incentive decisions.
13. The method of claim 11, further comprising generating performance scores representative of customer management service measures.
14. The method of claim 13, further comprising:
generating a performance score based on attendance;
generating a performance score based on efficiency;
generating a performance score based on effectiveness;
generating a performance score based on quality; and
generating a performance score based on professionalism.
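The per-measure computations recited in claims 3, 4, 6, and 9 can be pictured with a short sketch. The specific formulas, argument names, and normalization below are assumptions introduced for illustration; the claims themselves recite only comparing the collected inputs against the corresponding target.

```python
# Hedged sketch of the score generation recited in claims 3, 4, 6, and 9.
# Formulas and field names are illustrative assumptions, not claim language.

def attendance_score(hours_worked, hours_scheduled, attendance_target):
    """Claim 3: compare time keeping with the assigned schedule and a target."""
    if hours_scheduled == 0:
        return 0.0
    adherence = hours_worked / hours_scheduled            # fraction of schedule kept
    return min(adherence / attendance_target, 1.0) * 100

def efficiency_score(total_call_minutes, minutes_on_clock, efficiency_target):
    """Claim 4: compare call duration with time keeping and an efficiency target."""
    if minutes_on_clock == 0:
        return 0.0
    utilization = total_call_minutes / minutes_on_clock   # share of paid time on calls
    return min(utilization / efficiency_target, 1.0) * 100

def effectiveness_score(online_minutes, minutes_on_clock, effectiveness_target):
    """Claim 6: compare on-line time with time keeping and an effectiveness target."""
    if minutes_on_clock == 0:
        return 0.0
    availability = online_minutes / minutes_on_clock      # share of paid time on-line
    return min(availability / effectiveness_target, 1.0) * 100

def group_score(individual_scores):
    """Claim 9: combine compiled scores for the individuals assigned to a group."""
    return sum(individual_scores) / len(individual_scores) if individual_scores else 0.0

if __name__ == "__main__":
    print(attendance_score(38.5, 40.0, attendance_target=0.95))
    print(efficiency_score(310, 420, efficiency_target=0.75))
    print(effectiveness_score(390, 420, effectiveness_target=0.90))
    print(group_score([86.4, 91.2, 78.9]))
```

A simple averaging rule is used for the group score here; a weighted or target-normalized combination would fit the claim language equally well.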
US10/624,283 2002-07-22 2003-07-22 Program performance management system Abandoned US20040138944A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/624,283 US20040138944A1 (en) 2002-07-22 2003-07-22 Program performance management system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US39765102P 2002-07-22 2002-07-22
US10/624,283 US20040138944A1 (en) 2002-07-22 2003-07-22 Program performance management system

Publications (1)

Publication Number Publication Date
US20040138944A1 true US20040138944A1 (en) 2004-07-15

Family

ID=32717042

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/624,283 Abandoned US20040138944A1 (en) 2002-07-22 2003-07-22 Program performance management system

Country Status (1)

Country Link
US (1) US20040138944A1 (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6856986B1 (en) * 1993-05-21 2005-02-15 Michael T. Rossides Answer collection and retrieval system governed by a pay-off meter
US6078894A (en) * 1997-03-28 2000-06-20 Clawson; Jeffrey J. Method and system for evaluating the performance of emergency medical dispatchers
US6119097A (en) * 1997-11-26 2000-09-12 Executing The Numbers, Inc. System and method for quantification of human performance factors
US6049779A (en) * 1998-04-06 2000-04-11 Berkson; Stephen P. Call center incentive system and method
US20020035506A1 (en) * 1998-10-30 2002-03-21 Rami Loya System for design and implementation of employee incentive and compensation programs for businesses
US6460848B1 (en) * 1999-04-21 2002-10-08 Mindplay Llc Method and apparatus for monitoring casinos and gaming
US6735570B1 (en) * 1999-08-02 2004-05-11 Unisys Corporation System and method for evaluating a selectable group of people against a selectable set of skills
US6898235B1 (en) * 1999-12-10 2005-05-24 Argon St Incorporated Wideband communication intercept and direction finding device using hyperchannelization
US6324282B1 (en) * 2000-03-02 2001-11-27 Knowlagent, Inc. Method and system for delivery of individualized training to call center agents
US20010032120A1 (en) * 2000-03-21 2001-10-18 Stuart Robert Oden Individual call agent productivity method and system
US20020091562A1 (en) * 2000-06-02 2002-07-11 Sony Corporation And Sony Electrics Inc. Facilitating offline and online sales
US7080057B2 (en) * 2000-08-03 2006-07-18 Unicru, Inc. Electronic employee selection systems and methods
US20020065751A1 (en) * 2000-08-08 2002-05-30 Bellows Paul Felton Automated, interactive management systems and processes
US20020024531A1 (en) * 2000-08-30 2002-02-28 Herrell William R. Method for evaluating employees and aggregating their respective skills and experience in a searchable database for sharing knowledge resources
US20020133464A1 (en) * 2001-03-16 2002-09-19 Erica Ress System and method for providing on-line ancillary content for printed materials
US6754874B1 (en) * 2002-05-31 2004-06-22 Deloitte Development Llc Computer-aided system and method for evaluating employees

Cited By (189)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7848947B1 (en) 1999-08-03 2010-12-07 Iex Corporation Performance management system
US9451086B2 (en) 2002-01-28 2016-09-20 Verint Americas Inc. Complex recording trigger
US20070201675A1 (en) * 2002-01-28 2007-08-30 Nourbakhsh Illah R Complex recording trigger
US9008300B2 (en) * 2002-01-28 2015-04-14 Verint Americas Inc Complex recording trigger
US7953625B2 (en) * 2002-10-29 2011-05-31 Sap Aktiengesellschaft Available resource presentation
US20040098290A1 (en) * 2002-10-29 2004-05-20 Stefan Hirschenberger Available resource presentation
US8311880B1 (en) * 2002-10-30 2012-11-13 Verizon Corporate Services Group Inc. Supplier performance and accountability system
US20040128188A1 (en) * 2002-12-30 2004-07-01 Brian Leither System and method for managing employee accountability and performance
US20040133578A1 (en) * 2003-01-07 2004-07-08 Stephanie Dickerson Performance management system and method
US7610288B2 (en) * 2003-01-07 2009-10-27 At&T Intellectual Property I, L.P. Performance management system and method
US20040143489A1 (en) * 2003-01-20 2004-07-22 Rush-Presbyterian - St. Luke's Medical Center System and method for facilitating a performance review process
US20040158487A1 (en) * 2003-02-07 2004-08-12 Streamlined Management Group Inc. Strategic activity communication and assessment system
US20040172323A1 (en) * 2003-02-28 2004-09-02 Bellsouth Intellectual Property Corporation Customer feedback method and system
US11763380B2 (en) 2003-06-09 2023-09-19 Thomson Reuters Enterprise Centre Gmbh Ensuring the accurateness and currentness of information provided by the submitter of an electronic invoice throughout the life of a matter
US10672068B1 (en) 2003-06-09 2020-06-02 Thomson Reuters Enterprise Centre Gmbh Ensuring the accurateness and currentness of information provided by the submitter of an electronic invoice throughout the life of a matter
WO2004114177A3 (en) * 2003-06-20 2005-08-18 Show Business Software Ltd System for facilitating management and organisational development processes
US10115077B2 (en) 2003-06-20 2018-10-30 Gaiasoft Ip Limited System for facilitating management and organisational development processes
WO2004114177A2 (en) * 2003-06-20 2004-12-29 Gaiasoft Limited System for facilitating management and organisational development processes
US20070055564A1 (en) * 2003-06-20 2007-03-08 Fourman Clive M System for facilitating management and organisational development processes
US20070150972A1 (en) * 2003-09-22 2007-06-28 Institut Pasteur Method for detecting Nipah virus and method for providing immunoprotection against Henipa viruses
US8891747B2 (en) 2003-09-26 2014-11-18 Avaya Inc. Method and apparatus for assessing the status of work waiting for service
US8094804B2 (en) 2003-09-26 2012-01-10 Avaya Inc. Method and apparatus for assessing the status of work waiting for service
US8751274B2 (en) 2003-09-26 2014-06-10 Avaya Inc. Method and apparatus for assessing the status of work waiting for service
US9025761B2 (en) 2003-09-26 2015-05-05 Avaya Inc. Method and apparatus for assessing the status of work waiting for service
US20050091071A1 (en) * 2003-10-22 2005-04-28 Lee Howard M. Business performance and customer care quality measurement
US7783513B2 (en) * 2003-10-22 2010-08-24 Intellisist, Inc. Business performance and customer care quality measurement
US20050137893A1 (en) * 2003-12-19 2005-06-23 Whitman Raymond Jr. Efficiency report generator
US7920552B2 (en) 2003-12-19 2011-04-05 At&T Intellectual Property I, L.P. Resource assignment in a distributed environment
US20050165930A1 (en) * 2003-12-19 2005-07-28 Whitman Raymond Jr. Resource assignment in a distributed environment
US20080097819A1 (en) * 2003-12-19 2008-04-24 At&T Delaware Intellectual Property, Inc. Dynamic Force Management System
US7499844B2 (en) 2003-12-19 2009-03-03 At&T Intellectual Property I, L.P. Method and system for predicting network usage in a network having re-occurring usage variations
US20050135601A1 (en) * 2003-12-19 2005-06-23 Whitman Raymond Jr. Force management automatic call distribution and resource allocation control system
US7539297B2 (en) 2003-12-19 2009-05-26 At&T Intellectual Property I, L.P. Generation of automated recommended parameter changes based on force management system (FMS) data analysis
US7406171B2 (en) * 2003-12-19 2008-07-29 At&T Delaware Intellectual Property, Inc. Agent scheduler incorporating agent profiles
US8781099B2 (en) 2003-12-19 2014-07-15 At&T Intellectual Property I, L.P. Dynamic force management system
US7551602B2 (en) 2003-12-19 2009-06-23 At&T Intellectual Property I, L.P. Resource assignment in a distributed environment
US20050138153A1 (en) * 2003-12-19 2005-06-23 Whitman Raymond Jr. Method and system for predicting network usage in a network having re-occurring usage variations
US20050135600A1 (en) * 2003-12-19 2005-06-23 Whitman Raymond Jr. Generation of automated recommended parameter changes based on force management system (FMS) data analysis
US20090210535A1 (en) * 2003-12-19 2009-08-20 At&T Intellectual Property I, L.P. Resource assignment in a distributed environment
US7616755B2 (en) 2003-12-19 2009-11-10 At&T Intellectual Property I, L.P. Efficiency report generator
US20050138167A1 (en) * 2003-12-19 2005-06-23 Raymond Whitman, Jr. Agent scheduler incorporating agent profiles
US20050144022A1 (en) * 2003-12-29 2005-06-30 Evans Lori M. Web-based system, method, apparatus and software to manage performance securely across an extended enterprise and between entities
US8073731B1 (en) * 2003-12-30 2011-12-06 ProcessProxy Corporation Method and system for improving efficiency in an organization using process mining
US8407081B1 (en) 2003-12-30 2013-03-26 ProcessProxy Corporation Method and system for improving efficiency in an organization using process mining
US7680682B2 (en) * 2004-03-11 2010-03-16 International Business Machines Corporation Method, system and program product for assessing a product development project employing a computer-implemented evaluation tool
US20050203786A1 (en) * 2004-03-11 2005-09-15 International Business Machines Corporation Method, system and program product for assessing a product development project employing a computer-implemented evaluation tool
US7953859B1 (en) 2004-03-31 2011-05-31 Avaya Inc. Data model of participation in multi-channel and multi-party contacts
US7711104B1 (en) 2004-03-31 2010-05-04 Avaya Inc. Multi-tasking tracking agent
US7734032B1 (en) 2004-03-31 2010-06-08 Avaya Inc. Contact center and method for tracking and acting on one and done customer contacts
US8731177B1 (en) 2004-03-31 2014-05-20 Avaya Inc. Data model of participation in multi-channel and multi-party contacts
US8000989B1 (en) 2004-03-31 2011-08-16 Avaya Inc. Using true value in routing work items to resources
US20050251438A1 (en) * 2004-05-04 2005-11-10 Yi-Ming Tseng Methods and system for evaluation with notification means
US8738412B2 (en) 2004-07-13 2014-05-27 Avaya Inc. Method and apparatus for supporting individualized selection rules for resource allocation
US8805717B2 (en) 2004-08-31 2014-08-12 Hartford Fire Insurance Company Method and system for improving performance of customer service representatives
US20060047566A1 (en) * 2004-08-31 2006-03-02 Jay Fleming Method and system for improving performance of customer service representatives
US7949121B1 (en) 2004-09-27 2011-05-24 Avaya Inc. Method and apparatus for the simultaneous delivery of multiple contacts to an agent
US8234141B1 (en) 2004-09-27 2012-07-31 Avaya Inc. Dynamic work assignment strategies based on multiple aspects of agent proficiency
US10747713B2 (en) * 2004-11-30 2020-08-18 Thomson Reuters Enterprise Centre Gmbh Vendor/client information system architecture
US8112433B2 (en) * 2004-12-16 2012-02-07 International Business Machines Corporation Method, system and program for enabling resonance in communications
US20060136486A1 (en) * 2004-12-16 2006-06-22 International Business Machines Corporation Method, system and program for enabling resonance in communications
US20060136248A1 (en) * 2004-12-21 2006-06-22 Mary Kay Inc. Computer techniques for distributing information
US8626570B2 (en) * 2004-12-22 2014-01-07 Bank Of America Corporation Method and system for data quality management
US20060136461A1 (en) * 2004-12-22 2006-06-22 Alvin Lee Method and system for data quality management
US20060224442A1 (en) * 2005-03-31 2006-10-05 Round Matthew J Closed loop voting feedback
US8566144B2 (en) * 2005-03-31 2013-10-22 Amazon Technologies, Inc. Closed loop voting feedback
US7809127B2 (en) 2005-05-26 2010-10-05 Avaya Inc. Method for discovering problem agent behaviors
US8578396B2 (en) 2005-08-08 2013-11-05 Avaya Inc. Deferred control of surrogate key generation in a distributed processing architecture
US7779042B1 (en) 2005-08-08 2010-08-17 Avaya Inc. Deferred control of surrogate key generation in a distributed processing architecture
US20070061189A1 (en) * 2005-09-12 2007-03-15 Sbc Knowledge Ventures Lp Method for motivating competitors in an enterprise setting
US7822587B1 (en) * 2005-10-03 2010-10-26 Avaya Inc. Hybrid database architecture for both maintaining and relaxing type 2 data entity behavior
US7787609B1 (en) 2005-10-06 2010-08-31 Avaya Inc. Prioritized service delivery based on presence and availability of interruptible enterprise resources with skills
US7752230B2 (en) 2005-10-06 2010-07-06 Avaya Inc. Data extensibility using external database tables
US20070101334A1 (en) * 2005-10-27 2007-05-03 Atyam Balaji V Dynamic policy manager method, system, and computer program product for optimizing fractional resource allocation
US8327370B2 (en) 2005-10-27 2012-12-04 International Business Machines Corporation Dynamic policy manager method, system, and computer program product for optimizing fractional resource allocation
US8086482B2 (en) * 2006-01-27 2011-12-27 Teletech Holdings, Inc. Performance optimization
US20080040206A1 (en) * 2006-01-27 2008-02-14 Teletech Holdings,Inc. Performance Optimization
US20070276722A1 (en) * 2006-01-27 2007-11-29 Teletech Holdings, Inc. Performance Optimization
US8095414B2 (en) * 2006-01-27 2012-01-10 Teletech Holdings, Inc. Performance optimization
US8737173B2 (en) 2006-02-24 2014-05-27 Avaya Inc. Date and time dimensions for contact center reporting in arbitrary international time zones
US20070266054A1 (en) * 2006-03-24 2007-11-15 Stephens Frank M Method and system for salary planning and performance management
US20070239660A1 (en) * 2006-03-30 2007-10-11 Microsoft Corporation Definition and instantiation of metric based business logic reports
US20070239573A1 (en) * 2006-03-30 2007-10-11 Microsoft Corporation Automated generation of dashboards for scorecard metrics and subordinate reporting
US8261181B2 (en) 2006-03-30 2012-09-04 Microsoft Corporation Multidimensional metrics-based annotation
US7716592B2 (en) 2006-03-30 2010-05-11 Microsoft Corporation Automated generation of dashboards for scorecard metrics and subordinate reporting
US7840896B2 (en) 2006-03-30 2010-11-23 Microsoft Corporation Definition and instantiation of metric based business logic reports
US20070233600A1 (en) * 2006-04-03 2007-10-04 Computer Associates Think, Inc. Identity management maturity system and method
US20070239871A1 (en) * 2006-04-11 2007-10-11 Mike Kaskie System and method for transitioning to new data services
US8190992B2 (en) 2006-04-21 2012-05-29 Microsoft Corporation Grouping and display of logically defined reports
US7716571B2 (en) 2006-04-27 2010-05-11 Microsoft Corporation Multidimensional scorecard header definition
US20070265863A1 (en) * 2006-04-27 2007-11-15 Microsoft Corporation Multidimensional scorecard header definition
US9549065B1 (en) 2006-05-22 2017-01-17 Convergys Customer Management Delaware Llc System and method for automated customer service with contingent live interaction
US8379830B1 (en) 2006-05-22 2013-02-19 Convergys Customer Management Delaware Llc System and method for automated customer service with contingent live interaction
US20070299709A1 (en) * 2006-06-26 2007-12-27 Bellsouth Intellectual Property Corporation Automated performance quality tracking utility
US20070299718A1 (en) * 2006-06-26 2007-12-27 Bellsouth Intellectual Property Corporation Management activity tracking utility
WO2008005334A2 (en) * 2006-06-30 2008-01-10 American Express Travel Related Services Company, Inc. Availability tracker
WO2008005334A3 (en) * 2006-06-30 2008-07-03 American Express Travel Relate Availability tracker
US20080040196A1 (en) * 2006-07-06 2008-02-14 International Business Machines Corporation Method, system and program product for hosting an on-demand customer interaction center utility infrastructure
US20080033791A1 (en) * 2006-07-18 2008-02-07 Chacha Search, Inc Method and system tracking work done by human workers
US7873532B2 (en) * 2006-07-19 2011-01-18 Chacha Search, Inc. Method, system, and computer readable medium useful in managing a computer-based system for servicing user initiated tasks
US20080021755A1 (en) * 2006-07-19 2008-01-24 Chacha Search, Inc. Method, system, and computer readable medium useful in managing a computer-based system for servicing user initiated tasks
US20080027791A1 (en) * 2006-07-31 2008-01-31 Cooper Robert K System and method for processing performance data
US20080040130A1 (en) * 2006-08-08 2008-02-14 Potential Point, Llc Method of distributing recognition and reinforcing organization focus
US10546251B1 (en) 2006-08-11 2020-01-28 Infor (US) Inc. Performance optimization
US10210530B1 (en) * 2006-08-11 2019-02-19 Infor (Us), Inc. Selecting a report
US7936867B1 (en) 2006-08-15 2011-05-03 Avaya Inc. Multi-service request within a contact center
US20080059292A1 (en) * 2006-08-29 2008-03-06 Myers Lloyd N Systems and methods related to continuous performance improvement
US8391463B1 (en) 2006-09-01 2013-03-05 Avaya Inc. Method and apparatus for identifying related contacts
US8811597B1 (en) 2006-09-07 2014-08-19 Avaya Inc. Contact center performance prediction
US8938063B1 (en) 2006-09-07 2015-01-20 Avaya Inc. Contact center service monitoring and correcting
US20080154711A1 (en) * 2006-12-22 2008-06-26 American Express Travel Related Services Company, Inc. Availability Tracker
US20080172629A1 (en) * 2007-01-17 2008-07-17 Microsoft Corporation Geometric Performance Metric Data Rendering
US9058307B2 (en) 2007-01-26 2015-06-16 Microsoft Technology Licensing, Llc Presentation generation using scorecard elements
US8321805B2 (en) 2007-01-30 2012-11-27 Microsoft Corporation Service architecture based metric views
US8495663B2 (en) 2007-02-02 2013-07-23 Microsoft Corporation Real time collaboration using embedded data visualizations
US9392026B2 (en) 2007-02-02 2016-07-12 Microsoft Technology Licensing, Llc Real time collaboration using embedded data visualizations
US20080195464A1 (en) * 2007-02-09 2008-08-14 Kevin Robert Brooks System and Method to Collect, Calculate, and Report Quantifiable Peer Feedback on Relative Contributions of Team Members
US7996257B2 (en) * 2007-02-09 2011-08-09 International Business Machines Corporation Collecting, calculating, and reporting quantifiable peer feedback on relative contributions of team members
US8200527B1 (en) * 2007-04-25 2012-06-12 Convergys Cmg Utah, Inc. Method for prioritizing and presenting recommendations regarding organizaion's customer care capabilities
US8503924B2 (en) 2007-06-22 2013-08-06 Kenneth W. Dion Method and system for education compliance and competency management
US20080318197A1 (en) * 2007-06-22 2008-12-25 Dion Kenneth W Method and system for education compliance and competency management
US20090037235A1 (en) * 2007-07-30 2009-02-05 Anthony Au System that automatically identifies a Candidate for hiring by using a composite score comprised of a Spec Score generated by a Candidates answers to questions and an Industry Score based on a database of key words & key texts compiled from source documents, such as job descriptions
US20090043621A1 (en) * 2007-08-09 2009-02-12 David Kershaw System and Method of Team Performance Management Software
US9779367B2 (en) * 2007-08-30 2017-10-03 Software Ag Usa, Inc. System, method and computer program product for generating key performance indicators in a business process monitor
US20090063221A1 (en) * 2007-08-30 2009-03-05 Software Ag, Inc. System, method and computer program product for generating key performance indicators in a business process monitor
US8504534B1 (en) 2007-09-26 2013-08-06 Avaya Inc. Database structures and administration techniques for generalized localization of database items
US20090171770A1 (en) * 2007-12-31 2009-07-02 Carmen Blaum Integrated purchasing system
US8856182B2 (en) 2008-01-25 2014-10-07 Avaya Inc. Report database dependency tracing through business intelligence metadata
US20090204461A1 (en) * 2008-02-13 2009-08-13 International Business Machines Corporation Method and system for workforce optimization
US20090204460A1 (en) * 2008-02-13 2009-08-13 International Business Machines Corporation Method and System For Workforce Optimization
US20090234719A1 (en) * 2008-03-13 2009-09-17 Ana Dvoredsky Method to determine individual work effectiveness
US20110131082A1 (en) * 2008-07-21 2011-06-02 Michael Manser System and method for tracking employee performance
WO2010011652A1 (en) * 2008-07-21 2010-01-28 Talent Tree, Inc. System and method for tracking employee performance
US20130085795A1 (en) * 2008-09-29 2013-04-04 Fisher-Rosemount Systems, Inc. Event synchronized reporting in process control systems
US8874461B2 (en) * 2008-09-29 2014-10-28 Fisher-Rosemount Systems, Inc. Event synchronized reporting in process control systems
US20100100771A1 (en) * 2008-10-20 2010-04-22 Oracle International Corporation Setup verification for an employee compensation system
US20100100408A1 (en) * 2008-10-21 2010-04-22 Dion Kenneth W Professional continuing competency optimizer
US20100121686A1 (en) * 2008-11-07 2010-05-13 Oracle International Corporation Method and System for Implementing a Scoring Mechanism
US20100122218A1 (en) * 2008-11-07 2010-05-13 Oracle International Corporation Method and System for Implementing a Compensation System
US20100121776A1 (en) * 2008-11-07 2010-05-13 Peter Stenger Performance monitoring system
US20100121685A1 (en) * 2008-11-07 2010-05-13 Oracle International Corporation Method and System for Implementing a Ranking Mechanism
US9147177B2 (en) * 2008-11-07 2015-09-29 Oracle International Corporation Method and system for implementing a scoring mechanism
US9032311B2 (en) 2008-11-07 2015-05-12 Oracle International Corporation Method and system for implementing a compensation system
US20100120000A1 (en) * 2008-11-11 2010-05-13 Valorie Bellamy Method and Business Form for Employee Management and Improvement
US8326714B1 (en) * 2008-12-29 2012-12-04 Intuit Inc. Employee pre-payroll paycheck preview
US20100198647A1 (en) * 2009-02-02 2010-08-05 Ford Motor Company Technical hotline resource management method and system
US20100223212A1 (en) * 2009-02-27 2010-09-02 Microsoft Corporation Task-related electronic coaching
US20100299650A1 (en) * 2009-05-20 2010-11-25 International Business Machines Corporation Team and individual performance in the development and maintenance of software
US8565386B2 (en) 2009-09-29 2013-10-22 Avaya Inc. Automatic configuration of soft phones that are usable in conjunction with special-purpose endpoints
US11699112B2 (en) 2009-10-30 2023-07-11 Verint Americas Inc. Systems and methods for automatic scheduling of a workforce
US11367026B2 (en) 2009-10-30 2022-06-21 Verint Americas Inc. Systems and methods for automatic scheduling of a workforce
US9516069B2 (en) 2009-11-17 2016-12-06 Avaya Inc. Packet headers as a trigger for automatic activation of special-purpose softphone applications
US11122012B2 (en) 2010-10-13 2021-09-14 The Boeing Company License utilization management system service suite
US9563751B1 (en) * 2010-10-13 2017-02-07 The Boeing Company License utilization management system service suite
US20120130768A1 (en) * 2010-11-19 2012-05-24 Accenture Global Services Limited Work force planning analytics system
US20120203597A1 (en) * 2011-02-09 2012-08-09 Jagdev Suman Method and apparatus to assess operational excellence
US20120253886A1 (en) * 2011-03-28 2012-10-04 Lexisnexis, A Division Of Reed Elsevier Inc. Systems and Methods for Client Development
US8548843B2 (en) * 2011-10-27 2013-10-01 Bank Of America Corporation Individual performance metrics scoring and ranking
US11100065B2 (en) 2011-11-02 2021-08-24 Salesforce.Com, Inc. Tools and techniques for extracting knowledge from unstructured data retrieved from personal data sources
US11093467B2 (en) 2011-11-02 2021-08-17 Salesforce.Com, Inc. Tools and techniques for extracting knowledge from unstructured data retrieved from personal data sources
US9792356B2 (en) 2011-11-02 2017-10-17 Salesforce.Com, Inc. System and method for supporting natural language queries and requests against a user's personal data cloud
US10140322B2 (en) 2011-11-02 2018-11-27 Salesforce.Com, Inc. Tools and techniques for extracting knowledge from unstructured data retrieved from personal data sources
US9886676B1 (en) * 2012-03-30 2018-02-06 Liberty Mutual Insurance Company Behavior-based business recommendations
US9953022B2 (en) 2012-10-05 2018-04-24 Successfactors, Inc. Natural language metric condition alerts
US9323736B2 (en) 2012-10-05 2016-04-26 Successfactors, Inc. Natural language metric condition alerts generation
US20140100923A1 (en) * 2012-10-05 2014-04-10 Successfactors, Inc. Natural language metric condition alerts orchestration
US20150269512A1 (en) * 2012-10-10 2015-09-24 Daniel DANIEL WARTEL Productivity Assessment and Rewards Systems and Processes Therefor
US20140122144A1 (en) * 2012-11-01 2014-05-01 Vytas Cirpus Initiative and Project Management
US20140172514A1 (en) * 2012-12-14 2014-06-19 Level 3 Communications, Inc. Method and apparatus for calculating performance indicators
US11336770B2 (en) * 2013-06-07 2022-05-17 Mattersight Corporation Systems and methods for analyzing coaching comments
US10367649B2 (en) * 2013-11-13 2019-07-30 Salesforce.Com, Inc. Smart scheduling and reporting for teams
US20150135095A1 (en) * 2013-11-13 2015-05-14 Tempo Al, Inc. Smart scheduling and reporting for teams
US9893905B2 (en) 2013-11-13 2018-02-13 Salesforce.Com, Inc. Collaborative platform for teams with messaging and learning across groups
US9338296B2 (en) * 2013-12-20 2016-05-10 Avaya Inc. System and method for driving a virtual view of agents in a contact center
US11023864B2 (en) * 2014-05-16 2021-06-01 New York Life Insurance Company System and method for integrating privacy into contact management systems
US20160171416A1 (en) * 2014-12-10 2016-06-16 Ziprealty Llc Real estate agent rating
US20160335580A1 (en) * 2015-05-13 2016-11-17 Wal-Mart Stores, Inc. Systems, devices, and methods for configuring a graphical user interface
US20160335581A1 (en) * 2015-05-15 2016-11-17 Wal-Mart Stores, Inc. Systems, devices, and methods for controlling an electronic display device to render a graphical user interface with selectively obfuscated portions to protect confidential or private information
US10713673B2 (en) 2017-03-31 2020-07-14 ASK Chemicals LLC Interactive map displaying potential sales targets within a geographical distance to visiting sales representatives
US11164198B2 (en) 2017-03-31 2021-11-02 ASK Chemicals LLC Graphical user interface for visualizing market share analysis
US10540668B2 (en) * 2017-03-31 2020-01-21 ASK Chemicals LLC Map based graphical user interface for identifying sales targets and determining sales potential
US20180285907A1 (en) * 2017-03-31 2018-10-04 Ask Chemicals L.P. System and method for facilitating the identification of potential sales targets
US20180315001A1 (en) * 2017-04-26 2018-11-01 Hrb Innovations, Inc. Agent performance feedback
US10943193B1 (en) * 2018-05-03 2021-03-09 Saverio Dalia Food and beverage venue management system
CN109858812A (en) * 2019-01-31 2019-06-07 泰康保险集团股份有限公司 Human Resources Management Method, device, medium and electronic equipment based on block chain
US11238408B2 (en) 2019-02-19 2022-02-01 Next Jump, Inc. Interactive electronic employee feedback systems and methods
US20220300886A1 (en) * 2020-12-17 2022-09-22 Nice Ltd System and method for determining and utilizing after-call-work factor in contact center quality processes
US11676094B2 (en) * 2020-12-17 2023-06-13 Nice Ltd. System and method for determining and utilizing after-call-work factor in contact center quality processes
US20220300881A1 (en) * 2021-03-17 2022-09-22 Accenture Global Solutions Limited Value realization analytics systems and related methods of use
US11507908B2 (en) * 2021-03-17 2022-11-22 Accenture Global Solutions Limited System and method for dynamic performance optimization

Similar Documents

Publication Publication Date Title
US20040138944A1 (en) Program performance management system
US8244565B2 (en) Individual productivity and utilization tracking tool
US8180662B2 (en) Rapid deployment of training for company representatives in contact handling systems
US20150269512A1 (en) Productivity Assessment and Rewards Systems and Processes Therefor
US20110054968A1 (en) Continuous performance improvement system
EP1998278A1 (en) System and method for long term forecasting
Batt The economics of teams among technicians
US20020147632A1 (en) Contact center management
US8595051B2 (en) Metrics capability self assessment
US20080262888A1 (en) Method and system for analytical recruitment
US20180349829A1 (en) Method for Optimizing Employee Work Assignments
US10657479B2 (en) System and method for integrating employee feedback with an electronic time clock or computer login
Ahlert Enterprise customer management: Integrating corporate and customer information
US20050288995A1 (en) Method for evaluating interactive corporate systems
Crawford Measuring performance
Phillips et al. Converting Data to Monetary Value
Goldenson et al. Use and organizational effects of measurement and analysis in high maturity organizations: results from the 2008 SEI state of measurement and analysis practice surveys
AU2010201888B2 (en) Individual productivity and utilization tracking tool
HADGU DEPARTMENT OF BUSINESS ADMINISTRATION AND INFORMATION SYSTEM
Păunescu et al. Managing quality in organizations through performance measurement
Tinkham et al. New approaches to managing performance appraisals
Drew et al. Linking customer intelligence to service operations: Exploiting the connection at GTE
Thomas et al. Evaluating a performance support environment for knowledge workers
Guarino et al. Evaluating the success of Telecommuting at the Census Bureau
Carbonell The impact of help center workforce competency on service effectiveness: The National Systems for Geospatial-Intelligence (NSG) case study

Legal Events

Date Code Title Description
AS Assignment

Owner name: CONVERGYS CORPORATION, OHIO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WHITACRE, CINDY;ROYALL, MYRA;OLSEN, TOM D.;AND OTHERS;REEL/FRAME:014938/0504;SIGNING DATES FROM 20040114 TO 20040123

AS Assignment

Owner name: CONVERGYS CORPORATION, OHIO

Free format text: RE-RECORD TO DELETE CHRISTOPHER D. HORN PREVIOUSLY RECORDED AT REEL/FRAME 014938/0504;ASSIGNORS:WHITACRE, CINDY;ROYALL, MYRA;OLSEN, TOM D.;AND OTHERS;REEL/FRAME:015961/0237;SIGNING DATES FROM 20040304 TO 20040318

AS Assignment

Owner name: CONVERGYS CORPORATION, OHIO

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE EXECUTION DATES & ADD OMITTED INVENTOR'S NAME CHRISTOPHER HORN, PREVIOUSLY RECORDED AT 015961 FRAME 0237;ASSIGNORS:WHITACRE, CINDY;ROYALL, MYRA;OLSEN, TOM D.;AND OTHERS;REEL/FRAME:016250/0106;SIGNING DATES FROM 20040303 TO 20040318

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION