How To Calculate Average Forecast Error
Interpretation of these statistics can be tricky, particularly when working with low-volume data or when trying to assess accuracy across multiple items (e.g., SKUs, locations, customers, etc.). This installment of Forecasting 101 surveys common error measurement statistics, examines the pros and cons of each, and discusses their suitability under a variety of circumstances.

The MAPE

The MAPE (Mean Absolute Percent Error) measures the size of the error in percentage terms. It is calculated as the average of the unsigned percentage errors over n periods:

MAPE = (100 / n) x Sum( |Actual - Forecast| / Actual )

Many organizations focus primarily on the MAPE when assessing forecast accuracy. Most people are comfortable thinking in percentage terms, making the MAPE easy to interpret. It can also convey information when you don't know the item's demand volume. For example, telling your manager "we were off by less than 4%" is more meaningful than saying "we were off by 3,000 cases" if your manager doesn't know an item's typical demand volume.

The MAPE is scale sensitive and should not be used when working with low-volume data. Notice that because Actual is in the denominator of the equation, the MAPE is undefined when actual demand is zero. Furthermore, when the actual value is not zero but quite small, the MAPE will often take on extreme values. This scale sensitivity renders the MAPE close to worthless as an error measure for low-volume data.

The MAD

The MAD (Mean Absolute Deviation) measures the size of the error in units. It is calculated as the average of the unsigned errors over n periods:

MAD = (1 / n) x Sum( |Actual - Forecast| )

The MAD is a good statistic to use when analyzing the error for a single item. However, if you aggregate MADs over multiple items you need to be careful about high-volume products dominating the results -- more on this later.

Less Common Error Measurement Statistics

The MAPE and the MAD are by far the most commonly used error measurement statistics. There are a slew of alternative statistics in the forecasting literature.
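To make the two definitions concrete, here is a minimal sketch of both calculations in plain Python. The weekly numbers are illustrative only (they reuse the retail traffic figures that appear later in this piece), and the function names are my own, not from the original article:

```python
def mape(actuals, forecasts):
    """Mean Absolute Percent Error: average unsigned error, in percent.
    Note: division by each actual means this fails when an actual is zero,
    and blows up when an actual is very small -- the scale-sensitivity
    problem described above."""
    errors = [abs(a - f) / abs(a) for a, f in zip(actuals, forecasts)]
    return 100 * sum(errors) / len(errors)

def mad(actuals, forecasts):
    """Mean Absolute Deviation: average unsigned error, in units."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / len(actuals)

# One week of illustrative daily demand (Sun..Sat).
actuals   = [78, 62, 64, 72, 84, 124, 98]
forecasts = [81, 54, 61, 68, 92, 105, 121]

print(round(mape(actuals, forecasts), 1))  # 10.8 (percent)
print(round(mad(actuals, forecasts), 1))   # 9.7 (units)
```

Notice that the MAD answer is in the item's own units (customers, cases), while the MAPE is unitless, which is what makes the MAPE comparable across items of different volume.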
Of course, a good forecast is an accurate forecast. Today, I’m going to talk about the absolute best metric to use to measure forecast accuracy. Let’s start with a sample forecast. The following table represents the forecast and actuals for
customer traffic at a small-box, specialty retail store. (You could also imagine this representing the foot traffic in a department inside of a larger store, too.) Is this a good or a bad forecast?

              Sun  Mon  Tue  Wed  Thu  Fri  Sat  Total
    Forecast   81   54   61   68   92  105  121    582
    Actual     78   62   64   72   84  124   98    582

Certainly, the weekly forecast is good. After all, the forecast says that 582 customers would visit the store, and by the end of the week, 582 customers did visit the store. The problems are the daily forecasts. There are some big swings, particularly towards the end of the week, that cause labor to be misaligned with demand. Since we're trying to align labor to demand, understanding these swings – these forecast errors – is important.

It's easy to look at this forecast and spot the problems. However, it's hard to do this for more than a few stores for more than a few weeks. To overcome that challenge, you'll want to use a metric to summarize the accuracy of a forecast. This not only allows you to look at many data points. It also allows you to compare forecasts. This is useful when you want to determine if one forecasting method is better than another, if the forecast the workforce management system produced is better than the one provided by finance, or if forecasts are getting more or less accurate over time.

I frequently see retailers use a simple calculation to measure forecast accuracy. It's formally referred to as "Mean Percentage Error", or MPE. It is calculated as follows:

MPE = ((Actual – Forecast) / Actual) x 100

Applying this calculation to Sunday in our table above, we can quickly find the error for that day is –3.8 percent:

MPE = ((78 – 81) / 78) x 100 = –3.8

This means that the actual results were 3.8 percent less than what was forecasted. The benefit of MPE is that it is easy to calculate and the results are easily understood.
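Applying the same calculation to every day of the table and averaging shows the catch with MPE: positive and negative daily errors cancel. A short sketch (function and variable names are mine, not from the original post):

```python
# Daily forecasts and actuals from the table above (Sun..Sat).
forecast = [81, 54, 61, 68, 92, 105, 121]
actual   = [78, 62, 64, 72, 84, 124, 98]

def mpe(actual, forecast):
    """Mean Percentage Error: signed percent error, averaged over days."""
    pe = [(a - f) / a * 100 for a, f in zip(actual, forecast)]
    return sum(pe) / len(pe)

daily_pe = [round((a - f) / a * 100, 1) for a, f in zip(actual, forecast)]
print(daily_pe)  # Sunday's entry is -3.8, Saturday's is -23.5
print(round(mpe(actual, forecast), 1))  # 0.2
```

Even though individual days are off by as much as 23 percent, the signed errors nearly cancel and the weekly MPE comes out to about 0.2 percent, which makes this badly misaligned forecast look almost perfect. That cancellation is why unsigned measures such as the MAPE and the MAD are usually preferred for summarizing accuracy.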
Statisticians and math-heads like to throw around complex ways of calculating forecast accuracy which are intimidating by