Performance measurements #781
-
Hi, I wanted to know if it's possible to get performance measurements like accuracy, precision, recall, etc., other than AUROC and F1Score? And since the latter two require precision, recall, etc., it should be possible to get those as output as well? I checked the code in …
-
Anomalib uses the Torchmetrics package to evaluate model performance. Anomalib supports many metric classes from the torchmetrics package out of the box, but also provides several custom metric implementations such as AUROC, PRO and AUPRO.

The metrics used to evaluate your models can be configured in the `metrics` section of the `config.yaml`. Image- and pixel-level metrics can be configured separately, under `metrics.image` and `metrics.pixel` respectively.

When adding a metric to the evaluation pipeline, Anomalib first searches for a metric with the specified name in the `anomalib.utils.metrics` module. If the metric is not found there, it falls back to looking the name up in the torchmetrics package.
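This is not Anomalib's actual implementation, just a minimal sketch of the two-stage lookup described above. It assumes an environment where the `anomalib.utils.metrics` module (the 0.x layout mentioned in this thread) and `torchmetrics` are importable; `resolve_metric_class` is a hypothetical helper name:

```python
import importlib

import torchmetrics


def resolve_metric_class(name: str) -> type:
    """Illustrative two-stage lookup: Anomalib's custom metrics first,
    then the stock torchmetrics classes."""
    # Custom metrics such as AUROC, PRO and AUPRO live in anomalib.utils.metrics.
    anomalib_metrics = importlib.import_module("anomalib.utils.metrics")
    if hasattr(anomalib_metrics, name):
        return getattr(anomalib_metrics, name)
    # Fall back to the standard torchmetrics package.
    if hasattr(torchmetrics, name):
        return getattr(torchmetrics, name)
    raise ValueError(f"No metric named {name!r} in anomalib or torchmetrics")


# Precision, Recall and Accuracy all resolve via the torchmetrics fallback.
for name in ("Precision", "Recall", "Accuracy"):
    print(name, "->", resolve_metric_class(name).__module__)
```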
In your case, Precision, Recall and Accuracy are all available from the Torchmetrics package. So, to evaluate your Anomalib models using these metrics, simply list them under `metrics.image` and/or `metrics.pixel`.

To summarize, your desired use-case could be achieved by entering the following parameter values in the `config.yaml`:
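A sketch, assuming the 0.x-style `config.yaml` layout; the surrounding keys depend on your model's config, and the names must match the metric class names exactly (case-sensitive), since they are resolved by name as shown above:

```yaml
metrics:
  image:
    - F1Score
    - AUROC
    - Precision
    - Recall
    - Accuracy
  pixel:
    - F1Score
    - AUROC
    - Precision
    - Recall
    - Accuracy
```

Note that pixel-level metrics require ground-truth anomaly masks, so they only apply to datasets with segmentation annotations.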