RecalibrationFunction
Details
- For classifiers, recalibration is also known as probability calibration and is typically used to correct overconfident or underconfident classifiers.
- RecalibrationFunction can be specified at training time, at inference time, or to update the calibrator of an existing model.
- When specified at inference time or to update the calibrator of an existing model, typical settings for RecalibrationFunction include the following (see the usage sketch after this list):
  None	remove existing recalibration
  f	append the function f to the existing calibrator
- For classifiers, the function f is applied to the association of class probabilities, as f[<|class1→p1, class2→p2, …|>], and should return new class probabilities. The new class probabilities are automatically normalized.
- For predictors, the function f is applied to the output values, thereby transforming the predictive distribution.
- When specified at training time, typical settings for RecalibrationFunction include the following:
  None	prevent any recalibration
  Automatic	recalibrate the model when needed
  All	always recalibrate the model
- In addition to the final model, recalibration is also applied to the candidate models generated by Classify and Predict during their training procedure.
- When RecalibrationFunction→All, recalibration is applied to every candidate model.
- When RecalibrationFunction→Automatic, recalibration is only used for candidate models that need it.
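A minimal usage sketch of these settings. The toy dataset and the squaring function are illustrative, and passing the option directly in the evaluation call is an assumed way to specify it at inference time:
    trainingset = {1.2 -> "A", 1.5 -> "A", 1.1 -> "A", 2.9 -> "B", 3.1 -> "B", 3.4 -> "B"};

    (* training time: control whether candidate models are recalibrated *)
    c = Classify[trainingset, RecalibrationFunction -> Automatic];   (* other settings: None, All *)

    (* inference time (assumed syntax): append a function to the existing calibrator;
       the transformed class probabilities are automatically renormalized *)
    c[2.0, "Probabilities", RecalibrationFunction -> (#^2 &)]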
Examples
Basic Examples (4)
Train a random forest classifier without any recalibration:
Visualize the calibration curve on a test set:
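A sketch of these two steps, assuming the Titanic split from ExampleData; the "CalibrationCurve" measurement property is also an assumption:
    trainingset = ExampleData[{"MachineLearning", "Titanic"}, "TrainingData"];
    testset = ExampleData[{"MachineLearning", "Titanic"}, "TestData"];

    (* random forest trained with recalibration switched off *)
    c = Classify[trainingset, Method -> "RandomForest", RecalibrationFunction -> None];

    (* calibration curve: predicted probability against observed frequency on the test set *)
    cm = ClassifierMeasurements[c, testset];
    cm["CalibrationCurve"]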
Train a random forest classifier with recalibration:
Visualize the calibration curve on a test set:
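Continuing the sketch above, the same model with recalibration left on (Automatic is one plausible setting here):
    (* random forest trained with automatic recalibration *)
    c2 = Classify[trainingset, Method -> "RandomForest", RecalibrationFunction -> Automatic];

    (* the recalibrated model should lie closer to the diagonal of the calibration curve *)
    cm2 = ClassifierMeasurements[c2, testset];
    cm2["CalibrationCurve"]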
Compute the class probabilities of a new example:
Check if the model has been calibrated:
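Continuing the sketch; the feature values are made up, and "Calibrated" is a guessed Information property name (Information[c2, "Properties"] lists what is actually available):
    (* class probabilities for a new example *)
    c2[{"2nd", 30, "male"}, "Probabilities"]

    (* check whether a calibrator was fitted; the property name is a guess *)
    Information[c2, "Calibrated"]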
Temporarily set a recalibration function to apply to the probabilities:
Set a permanent recalibration function to apply to the probabilities:
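Continuing the sketch. Giving the option in the evaluation call (temporary) and re-calling Classify on the trained model (permanent) are assumed syntaxes; the squaring function is just an example that sharpens the probabilities:
    (* temporary: the function is applied only for this evaluation *)
    c2[{"2nd", 30, "male"}, "Probabilities", RecalibrationFunction -> (#^2 &)]

    (* permanent (assumed update syntax): append the function to the model's calibrator *)
    c3 = Classify[c2, RecalibrationFunction -> (#^2 &)];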
Compute the class probabilities of a new example:
Remove the recalibration function from the classifier:
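Continuing the sketch, with the same assumed update syntax as above:
    (* probabilities now reflect the permanently set recalibration function *)
    c3[{"2nd", 30, "male"}, "Probabilities"]

    (* RecalibrationFunction -> None removes the existing recalibration *)
    c4 = Classify[c3, RecalibrationFunction -> None];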
Load the Boston Homes dataset:
Train a predictor with model calibration:
Visualize the comparison plot on a test set:
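A sketch of these steps, assuming the BostonHomes split from ExampleData and the "ComparisonPlot" measurement property:
    train = ExampleData[{"MachineLearning", "BostonHomes"}, "TrainingData"];
    test = ExampleData[{"MachineLearning", "BostonHomes"}, "TestData"];

    (* predictor trained with recalibration of the predicted values *)
    p = Predict[train, RecalibrationFunction -> Automatic];

    (* comparison plot: predicted values against actual values on the test set *)
    pm = PredictorMeasurements[p, test];
    pm["ComparisonPlot"]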
Remove the recalibration function from the predictor:
Visualize the new comparison plot:
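Continuing the sketch; re-calling Predict on the trained model with RecalibrationFunction -> None is an assumed way to strip the calibrator:
    (* remove the recalibration and compare again on the test set *)
    p2 = Predict[p, RecalibrationFunction -> None];
    PredictorMeasurements[p2, test]["ComparisonPlot"]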
Compute the predictive distribution:
Temporarily set a recalibration function to apply to the prediction:
Compute the predictive distribution with this new recalibration:
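Continuing the sketch; the query reuses a test input, and giving the option in the evaluation call is again an assumed syntax. For predictors the function acts on the output values, so a simple rescaling is used as the example:
    example = First[Keys[test]];   (* one test input, for illustration *)

    (* predictive distribution without the temporary recalibration *)
    p2[example, "Distribution"]

    (* temporary recalibration applied to the output values for this query only *)
    p2[example, "Distribution", RecalibrationFunction -> (1.1 # &)]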
Applications (1)
Cite this as:
Text
Wolfram Research (2021), RecalibrationFunction, Wolfram Language function, https://reference.wolfram.com/language/ref/RecalibrationFunction.html.
CMS
Wolfram Language. 2021. "RecalibrationFunction." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/RecalibrationFunction.html.
APA
Wolfram Language. (2021). RecalibrationFunction. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/RecalibrationFunction.html