Anomalib with Intel ARC GPU #2243
-
As far as I remember, @blaz-r tried it and got it working; he can comment better. Meanwhile, you could install the Anomalib dependencies first and install the Intel extension afterwards.
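A minimal sketch of that install order. The `anomalib` and `intel-extension-for-pytorch` package names are the public ones, but the exact IPEX wheel and index depend on your torch version, GPU driver, and oneAPI setup, so treat the second command as a placeholder and check Intel's installation matrix for the matching command:

```shell
# Install Anomalib (and the torch version it pins) first...
pip install anomalib

# ...then install Intel Extension for PyTorch afterwards, so the Anomalib
# install does not overwrite the IPEX-matched torch build. The exact wheel /
# extra index URL depends on your torch, driver, and oneAPI versions --
# consult Intel's installation matrix for the command matching your setup.
pip install intel-extension-for-pytorch
```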
-
I didn't try it with Anomalib itself, but I use it to train my own model, which does use some Anomalib components. You'll need to install Intel Extension for PyTorch and verify that it works. As Samet said, first install the Anomalib dependencies and then install the Intel extension with the matching torch version. I believe that once you have that sorted, all you have to do is import the extension (only in the main file) right after torch, like so:
and then specify the accelerator in the Engine args accordingly.
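As a hedged sketch of that pattern: the package name `intel_extension_for_pytorch` and the `"xpu"` device string come from Intel's extension, and the `Engine(accelerator=...)` usage in the comment follows Anomalib's API. The helper below only probes for the extension, so it safely returns `False` on machines without it:

```python
import importlib.util


def xpu_available() -> bool:
    """Check whether Intel Extension for PyTorch is installed and an XPU is usable."""
    # Probe for the extension first, so this is safe on CPU-only machines.
    if importlib.util.find_spec("intel_extension_for_pytorch") is None:
        return False
    import torch
    # The extension must be imported after torch; importing it registers
    # the "xpu" device type on torch.
    import intel_extension_for_pytorch  # noqa: F401
    return hasattr(torch, "xpu") and torch.xpu.is_available()


# With the extension working, training would then (hedged) look like:
#   from anomalib.engine import Engine
#   engine = Engine(accelerator="xpu")
```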
-
@Nakagawatokuji thanks for opening this discussion. For running inference on Intel GPUs, please refer to the OpenVINO Inferencer in Anomalib and follow the installation process for the Intel GPU Plugin for OpenVINO: https://github.com/openvinotoolkit/openvino/blob/master/src/plugins/intel_gpu/README.md. Once you have the GPU working with OpenVINO, there should be no blockers to running Anomalib inference on your ARC. Best regards
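A small, hedged sketch for verifying that the OpenVINO runtime actually sees the ARC GPU before wiring it into Anomalib. `openvino.Core` is OpenVINO's runtime API (recent releases; older ones used `openvino.runtime.Core`), the `OpenVINOInferencer` usage in the comment follows Anomalib's deploy module, and the model path is a placeholder:

```python
import importlib.util


def openvino_gpu_devices() -> list:
    """List the GPU devices the OpenVINO runtime can see, or [] if unavailable."""
    if importlib.util.find_spec("openvino") is None:
        return []  # OpenVINO is not installed in this environment
    from openvino import Core
    # Device names look like "GPU", "GPU.0", "GPU.1" once the Intel GPU
    # plugin and drivers are set up correctly.
    return [d for d in Core().available_devices if d.startswith("GPU")]


# Once a GPU device shows up, Anomalib inference can target it (hedged):
#   from anomalib.deploy import OpenVINOInferencer
#   inferencer = OpenVINOInferencer(path="path/to/model.xml", device="GPU")
```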
-
I am using an Intel ARC A310 graphics card. I installed Anomalib and tried some models, but it doesn't recognize my GPU, so I can't utilize it; everything runs on the CPU.
I tried to install `Intel Extension for PyTorch`, but the Anomalib installer seems to overwrite it.
Does anyone have solutions? Thanks in advance.