Trainer Class Predict Func == Decreasing Prediction Time #1698
-
Hi everyone, I have a dataloader with 100 images.
-
Hi @semaegrii, this might be related to warm-up. As shown by @alexriedel1 here, it might be a good idea to add warm-up predictions but exclude them from the throughput computation.
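A minimal sketch of what that could look like, assuming a PyTorch model and a dataloader whose batches expose an `"image"` key (the names `model`, `dataloader`, and the warm-up count are placeholders):

```python
import time
import torch

# `model` and `dataloader` are placeholders for your own objects.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.eval().to(device)
warmup_batches = 5

with torch.no_grad():
    # Warm-up: triggers CUDA context creation, cuDNN autotuning, etc.
    # These batches are excluded from the throughput numbers.
    for i, batch in enumerate(dataloader):
        if i >= warmup_batches:
            break
        model(batch["image"].to(device))

    if device.type == "cuda":
        torch.cuda.synchronize()  # finish warm-up kernels before timing
    start = time.perf_counter()
    n_images = 0
    for batch in dataloader:
        model(batch["image"].to(device))
        n_images += batch["image"].shape[0]
    if device.type == "cuda":
        torch.cuda.synchronize()  # wait for pending kernels before stopping
    elapsed = time.perf_counter() - start

print(f"{n_images / elapsed:.1f} img/s ({1000 * elapsed / n_images:.2f} ms/img)")
```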
-
Batch size does in fact matter at inference time! Due to parallel computation, inference is faster per sample with a larger batch size (which is why, for a fair comparison, algorithm speed in papers should be measured at batch size 1).
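To illustrate the effect, here is a small timing sketch; the ResNet-18 stand-in, input shape, and iteration counts are arbitrary choices, and a CUDA device is assumed:

```python
import time
import torch
import torchvision

# A stand-in model; replace with your own. Assumes a CUDA device.
model = torchvision.models.resnet18().eval().cuda()

for batch_size in (1, 8, 32):
    x = torch.randn(batch_size, 3, 224, 224, device="cuda")
    with torch.no_grad():
        for _ in range(5):  # warm-up runs, excluded from timing
            model(x)
        torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(20):
            model(x)
        torch.cuda.synchronize()
        elapsed = time.perf_counter() - start
    # Per-sample latency typically drops as the batch size grows.
    print(f"batch_size={batch_size}: {1000 * elapsed / (20 * batch_size):.3f} ms/sample")
```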
-
Ah, that's a great point!
-
@samet-akcay @alexriedel1 I understand. When I calculate it the way you describe here, the model's prediction time is approximately 4.7 ms, i.e. about 209 FPS. That looks good. However, what I also need is access to the ('pred_scores', 'pred_labels') information for my test images, which is why I use trainer.predict.
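For reference, a hedged sketch of accessing those keys from the per-batch outputs that Lightning's `Trainer.predict` returns (`model` and `test_dataloader` are placeholders; the key names follow the ones mentioned above):

```python
from pytorch_lightning import Trainer

# `model` and `test_dataloader` are placeholders for your own objects.
trainer = Trainer(accelerator="auto", devices=1)

# predict() returns one output per batch; with anomalib models each output
# carries the prediction keys mentioned above.
predictions = trainer.predict(model=model, dataloaders=test_dataloader)

for batch in predictions:
    scores = batch["pred_scores"]  # per-image anomaly scores
    labels = batch["pred_labels"]  # per-image predicted labels
```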
-
Hello, this time I would like to ask another question. @samet-akcay @alexriedel1
-
Sure, here is my code: @alexriedel1 I adapted the torch inferencer to my own workflow. With this function, I obtain the ground-truth labels (good or anomalous) of my images and the prediction scores as lists.
Afterwards, I calculate AUC and accuracy values from these lists.
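As a hedged sketch of that last step, assuming scikit-learn and that `y_true`/`y_score` are the lists described above (the 0.5 threshold is a placeholder, not part of the original post):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, accuracy_score

# y_true: 0 = good, 1 = anomaly; y_score: predicted anomaly scores.
y_true = np.asarray(y_true)
y_score = np.asarray(y_score)

auc = roc_auc_score(y_true, y_score)
# Accuracy needs hard labels; 0.5 is a placeholder threshold.
acc = accuracy_score(y_true, (y_score >= 0.5).astype(int))
print(f"AUROC={auc:.4f}, accuracy={acc:.4f}")
```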
You have to save the return value of your model call in a variable; then you can access its contents. For PaDiM, the anomaly score is the max of the anomaly map.
In this case, the pred_score is calculated as follows:
anomalib/src/anomalib/deploy/inferencers/torch_inferencer.py
Lines 212 to 215 in 4459c2b
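(The embedded snippet does not render here; as a sketch consistent with the description above, not a verbatim copy of the referenced lines, the idea is simply:)

```python
# Image-level prediction score as the maximum of the anomaly map.
pred_score = anomaly_map.reshape(-1).max()
```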
You also might want to post-process the map and the score. Please see the torch inferencer for details:
anomalib/src/anomalib/deploy/inferencers/torch_inferencer.py
Lines 246 to 259 in 44…
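(Again, the embed does not render here. A hedged sketch of the kind of post-processing that section performs, assuming min-max normalization around a learned threshold; `image_threshold`, `map_min`, `map_max`, `score_min`, and `score_max` are placeholder names, not the library's actual variables:)

```python
import numpy as np

def normalize_min_max(targets, threshold, min_val, max_val):
    """Min-max normalize values around a threshold, clipped to [0, 1].
    A sketch of the idea, not the library's exact implementation."""
    normalized = (targets - threshold) / (max_val - min_val) + 0.5
    return np.clip(normalized, 0, 1)

# Placeholder statistics; in practice these come from the trained model.
anomaly_map = normalize_min_max(anomaly_map, image_threshold, map_min, map_max)
pred_score = normalize_min_max(pred_score, image_threshold, score_min, score_max)
```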