Fatal Error Accuracy Formula
Excerpt from "CS Quality Assurance Scoring System Manual (V1.6)" (course notes, Alfaisal University):

Worked example, non-fatal error accuracy (50 non-fatal errors across 100 monitored transactions with 10 error opportunities per transaction):

Non-fatal error accuracy = 100 * (1 - (50 / (100 * 10)))
                         = 100 * (1 - (50 / 1000))
                         = 100 * (1 - 0.05)
                         = 100 * 0.95
                         = 95%

6. Measurements and Benchmarks: Quality Assurance Monitoring
- Fatal error accuracy: measure and track the fatal error percentage. Benchmark: 98%.
- Non-fatal error accuracy: measure and track the non-fatal error percentage. Benchmark: 95%.

7. Doer's skills:
1. Trained, assessed, and calibrated monitoring personnel.
2. Excellent business and product knowledge awareness.

E. Scoring Sheet Items Explanation:
1- Opening (Non-Fatal)
    1-a- Uses appropriate welcome
2- Customer Data and Verification (Fatal)
    2-a- Asking the customer …
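The arithmetic above can be sketched in Python. This is a minimal sketch: the function names are mine, not the manual's, and treating fatal-error accuracy as a per-transaction pass rate is an assumption based on how the manual scores whole transactions as passed or failed.

```python
def non_fatal_error_accuracy(non_fatal_errors, transactions_monitored,
                             opportunities_per_transaction):
    """[1 - (errors / (transactions * opportunities))] * 100, per the manual."""
    total_opportunities = transactions_monitored * opportunities_per_transaction
    return (1 - non_fatal_errors / total_opportunities) * 100

def transaction_accuracy(passed_transactions, transactions_monitored):
    """(passed transactions / transactions monitored) * 100.
    Assumption: one fatal error fails the whole transaction, so
    fatal-error accuracy reduces to this per-transaction ratio."""
    return passed_transactions / transactions_monitored * 100

# The manual's worked example: 50 non-fatal errors, 100 transactions,
# 10 opportunities per transaction.
print(non_fatal_error_accuracy(50, 100, 10))  # 95.0
```

Note that the denominator for non-fatal accuracy is error *opportunities* (transactions times opportunities per transaction), not transactions, which is why 50 errors still yields 95% here.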
1. … For a 4-point scale rating, sum of % in top 3 boxes for the 5 questions on Service Offered.
2. End User Satisfaction (COPC): %age in top 2 boxes. On a 5-point scale, sum of % in top 2 boxes for the question on Overall Satisfaction; for a 4-point scale rating, sum of % in top 2 boxes for the question on Overall Satisfaction.
3. End User Dissatisfaction (COPC): %age in bottom 2 boxes. On a 5-point scale, sum of % in bottom 2 boxes for the question on Overall Satisfaction; for a 4-point scale rating, % in the bottom box for the question on Overall Satisfaction.
4. Attrition: %age annualized, External: (Staff Attrited during the Month / Staff count at the End of the Month) * 100 * 12.
   • Staff attrited during the month includes resignations, terminations, and absconding staff.
   • Staff count at the end of the month = (Opening balance for the month + New recruits - Attrited staff).
5. Attrition: %age annualized, Internal: (Staff Transferred during the Month / Staff count at the End of the Month) * 100 * 12.
   • Staff transferred during the month includes staff who have been transferred from one department to another within the same process, or staff who have been promoted within the same process.
   • Staff count at the end of the month = (Opening balance for the month + New recruits - Transferred staff).
6. Absenteeism: % staff absent (all unscheduled leaves): (Total unscheduled absenteeism for the process for the month / Total scheduled working days) * 100.
   • Total unscheduled absenteeism = (SLs + UCLs + ULWPs).
   • Total scheduled working days = (Total present + Total unscheduled absenteeism).
7. Schedule Adherence: % adherence: {1 - [(Total Exceptions + Non-Adherent hrs) / (8.5 * Total staff present during the month)]} * 100.
8. Processing inbound end-user calls On Time, Service Level: (Number of calls answered within the service level / Total calls offered) * 100. Calls Offered includes terminated calls.
9. Abandonment Rate: (Calls Abandoned + Calls Terminated) / Calls Offered. Calls Offered includes terminated calls.
10. Calls Answered % (client definition): Calls Answered / Calls Offered. Calls Offered includes terminated calls.
11. On Time, Grade of Service (client definition): (Number of calls answered within the service level / Total calls answered) * 100.
12. Accuracy: (Number of passed transactions / Transactions Monitored) * 100.
13. Non-Fatal Error Accuracy: [1 - (Non-Fatal Errors / (Transactions Monitored * Opportunities for Non-Fatal Errors per Transaction))] * 100.
14. Critical Error Accuracy: (1 - (Critical Err…
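A few of the formulas in the list above (items 4, 8, and 9) can be sketched directly. A minimal Python sketch: function names are hypothetical, since the manual only gives the formulas, and the example numbers are illustrative, not from the source.

```python
def annualized_external_attrition_pct(staff_attrited, staff_end_of_month):
    # Item 4: (staff attrited during the month / month-end headcount) * 100 * 12
    return staff_attrited / staff_end_of_month * 100 * 12

def service_level_pct(answered_within_sl, calls_offered):
    # Item 8: calls offered includes terminated calls.
    return answered_within_sl / calls_offered * 100

def abandonment_rate(calls_abandoned, calls_terminated, calls_offered):
    # Item 9: abandoned plus terminated, as a fraction of calls offered.
    return (calls_abandoned + calls_terminated) / calls_offered

# Illustrative: 3 leavers from a month-end headcount of 120
# annualizes to 30%.
print(annualized_external_attrition_pct(3, 120))  # 30.0
```

The * 12 in the attrition formula is what annualizes a single month's churn, which is why a seemingly small monthly figure (2.5%) produces a large annual rate.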
Quality Monitoring Forms
Call Centre Helper Forum » Call Centre Management (44 posts)
Tags: coaching | feedback | Quality Monitoring

thornden4: In our contact centre we have been using Quality Monitoring forms with a 1-5 scale for each item being marked. The problem is that what one person sees as a 3, someone else might mark as a 5, so scoring was not consistent. I made up a new form and it seems to be working a bit better, although it is not perfect by any means. I am interested to hear how other people quality monitor their calls, and if possible it would be great to see some examples. I am putting together a new form that will be scored as yes/no; it is a work in progress. Looking forward to hearing how everyone else does it.
Posted 5 years ago

Bunnycatz: Hello, do you have calibration sessions? Basically, this is when everyone involved in monitoring scores the same call, then discusses any points of variation and agrees a consensus standpoint. You can do this in a live environment (more time consuming), i.e. play the call in the meeting room while each monitor scores it, then discuss how each point was scored at the end of the call.
Alternatively, have everyone monitor the call beforehand and bring their scored monitoring forms: discuss point by point, and make sure the call is available to play back to support the consensus view. (I find this works best.) The above assumes you