
How does automated outlier detection and correction work in DemandCaster?

The detection and correction algorithm works as follows:

  1. The specified forecasting model is fit to the time series, the residuals (fitted errors) are generated and their standard deviation is calculated.
  2. If the size of the largest error exceeds the outlier threshold, the point is flagged as an outlier and the historic value for the period is replaced with the fitted value.
  3. The procedure is then repeated using the corrected history until either no outliers are detected or the specified maximum number of iterations is reached.
  4. In a multiple-level problem the detection is only performed on the end items (i.e., the non-group level). If the correction option has been selected, after all end items are corrected, the group level totals are re-aggregated to reflect the corrected values.
  5. You can adjust the Sensitivity setting to make the outlier threshold more or less sensitive.
  6. Sensitivity (std deviations) sets the sensitivity of the outlier detection algorithm. If a given fitted error exceeds this threshold, and it is the largest error detected during the current iteration, it is flagged as an outlier. The default setting is 1 standard deviation.
  7. Maximum iterations sets the maximum number of iterations permitted during outlier detection for a given item. Because each iteration flags at most one outlier (the largest error), this setting also defines the maximum number of outliers that can be detected for a given item. This value is user defined.
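Steps 1–3 above can be sketched in Python as follows. This is an illustrative sketch, not DemandCaster's actual implementation: `fit_model` is a hypothetical callable standing in for the specified forecasting model, and the parameter names mirror the Sensitivity and Maximum iterations settings described above.

```python
import numpy as np

def detect_and_correct_outliers(history, fit_model, sensitivity=1.0, max_iterations=5):
    """Iteratively detect and correct outliers in a time series.

    `fit_model` is a hypothetical callable that fits the chosen
    forecasting model to the series and returns the fitted values.
    """
    corrected = np.asarray(history, dtype=float).copy()
    outlier_indices = []
    for _ in range(max_iterations):
        fitted = fit_model(corrected)              # step 1: fit the model
        residuals = corrected - fitted             # fitted errors
        threshold = sensitivity * residuals.std()  # outlier threshold
        worst = int(np.argmax(np.abs(residuals)))  # largest error this pass
        if abs(residuals[worst]) <= threshold:
            break                                  # step 3: no outlier detected
        corrected[worst] = fitted[worst]           # step 2: replace with fitted value
        outlier_indices.append(worst)
    return corrected, outlier_indices
```

For example, with a simple mean model and the series `[10, 10, 10, 100, 10]`, the spike at index 3 is flagged and pulled toward the fitted value on each pass until the loop hits the iteration limit or no error exceeds the threshold.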