JDK-8227434

G1 predictions may over/underflow with high variance input

    Details

    • Subcomponent:
      gc
    • Resolved In Build:
      b26

      Description

      Currently G1 uses a predictor based on a decaying average, adding a safety margin derived from the sequence's variance/standard deviation.

      In workloads/situations with recent high variance, the predictors in the G1Analytics class may return unexpectedly negative or overly large values because they are not clamped to a useful range.

      These errors can significantly affect prediction accuracy (e.g. I have seen predictions for the number of surviving bytes in the ~2^63 range due to overflow, which then propagate further and result in completely impossible overall time predictions).

      This is a day-one bug as far as I understand; only in very few cases do consumers of the predictions already clamp values "manually", e.g. the G1Policy::predict_yg_surv_rate() method.

      It would be better if the G1Analytics predict_* methods did the value clamping themselves.
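      The issue can be illustrated with a minimal sketch of such a predictor: a decaying average plus a confidence multiple of the decaying standard deviation, with the clamping applied at the predict call. This is not the HotSpot code; the class name, the update formula, and the predict_clamped() signature are invented here for illustration only.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical sketch of a decaying-average predictor with a variance-based
// safety margin. Names and formulas are simplified illustrations, not the
// actual G1Analytics/G1Predictions implementation.
class DecayingAverage {
  double _avg = 0.0;   // decaying (exponentially weighted) average
  double _var = 0.0;   // decaying variance
  double _alpha;       // weight given to the newest sample, in (0, 1]
  bool _first = true;
public:
  explicit DecayingAverage(double alpha) : _alpha(alpha) {}

  void add(double sample) {
    if (_first) {
      _avg = sample;
      _first = false;
    } else {
      // Welford-style exponentially weighted mean/variance update.
      double diff = sample - _avg;
      _avg += _alpha * diff;
      _var = (1.0 - _alpha) * (_var + _alpha * diff * diff);
    }
  }

  // Prediction = average + confidence * stddev. With recent high-variance
  // input the raw value can leave any sensible range, so it is clamped to
  // [lo, hi] here, at the predictor, rather than at every consumer.
  double predict_clamped(double confidence, double lo, double hi) const {
    double raw = _avg + confidence * std::sqrt(_var);
    return std::min(std::max(raw, lo), hi);
  }
};
```

      With a high-variance sequence (say samples 100, 1e9, 5) the unclamped prediction is dominated by the inflated standard deviation; clamping at the predictor keeps the result within the caller's useful range without each consumer repeating the check.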

              People

              • Assignee:
                tschatzl Thomas Schatzl
              • Reporter:
                tschatzl Thomas Schatzl
