Clean and clear CI #975
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
.github/workflows/test_openvino.yml (Outdated)

    test_pattern: [
        "*modeling.py",
        "*modeling_diffusion.py",
        "*modeling_sentence_transformers.py",
        "*quantization*",
        "*training*",
        "*export*",
      ]
This one gives a maximum test time of 20 minutes. The alternative:

    test_pattern: [
        "test_modeling*",
        "test_quantization*",
        "test_training*",
        "test_export*",
      ]

takes 40 minutes, and the original tests/openvino/* takes 1 hour 10 minutes.
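As a rough illustration of how such glob patterns partition a test suite (file names below are hypothetical, not the actual contents of tests/openvino/), the matching can be sketched with Python's fnmatch:

```python
from fnmatch import fnmatch

# Hypothetical test files in a suite like tests/openvino/
files = [
    "test_modeling.py",
    "test_modeling_diffusion.py",
    "test_quantization.py",
    "test_export.py",
    "test_utils.py",
]

# A subset of the patterns from the matrix above
patterns = ["*modeling.py", "*quantization*", "*export*"]

# Each CI job would pick up only the files matching its pattern;
# note "*modeling.py" does NOT match test_modeling_diffusion.py
selected = sorted(f for f in files if any(fnmatch(f, p) for p in patterns))
print(selected)  # ['test_export.py', 'test_modeling.py', 'test_quantization.py']
```

This shows why the suffix-anchored patterns (`*modeling.py`) select fewer files, and hence run faster, than the prefix patterns (`test_modeling*`).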
Looks great thanks @IlyasMoutawwakil !
    if hasattr(pipelines, module_name):
        module = getattr(pipelines, module_name)
    else:
        module = importlib.import_module(module_name)
In diffusers, entire modules are sometimes imported from pipelines:
https://github.com/huggingface/diffusers/blob/13e8fdecda91e27e40b15fa8a8f456ade773e6eb/src/diffusers/loaders/single_file.py#L65C7-L71C21
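A minimal, self-contained sketch of that fallback logic (the helper name `resolve_module` is mine, not from the PR; stdlib modules stand in for diffusers):

```python
import importlib


def resolve_module(package, module_name):
    # Prefer an attribute already exposed on the package (diffusers
    # sometimes imports entire modules into `pipelines`); otherwise
    # fall back to a regular import by name.
    if hasattr(package, module_name):
        return getattr(package, module_name)
    return importlib.import_module(module_name)


# Example with stdlib packages: os.path is exposed as an attribute of os,
# so the first branch is taken
import os
print(resolve_module(os, "path") is os.path)  # True

# "json" is not an attribute of os, so it is imported by name instead
print(resolve_module(os, "json").__name__)  # json
```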
    @@ -55,18 +55,15 @@
        "tiktoken",
        "sentence-transformers",
        "open_clip_torch>=2.26.1",
        "peft",
Added at test time to see if it breaks anything diffusion-related.
Probably good to align the lower bound with diffusers:
https://github.com/huggingface/diffusers/blob/main/setup.py#L118
    "nncf": ["nncf>=2.11.0"],
    "openvino": ["nncf>=2.11.0", "openvino==2024.4.1.dev20240926", "openvino-tokenizers==2024.4.1.0.dev20240926"],
Is there a reason why these versions are pinned?
Yes. There are issues with openvino 2024.4.0, and 2024.4.1 contains the fix, but that package is not released yet. Pre-release packages cannot be installed by pip without --pre, and that approach does not work through setup.py, so the only solution is to pin the exact version. We have an action item to unpin it once the release package is out.
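That constraint shows up directly in the extras definition: a range like `openvino>=2024.4.1` would never resolve to `2024.4.1.dev20240926` without `pip install --pre`, so the pre-release has to be pinned exactly. A sketch of the relevant setup.py fragment (structure assumed, versions copied from the diff above):

```python
# setup.py (sketch): extras_require mapping as pip consumes it.
# The exact == pin is the only way to make pip select a pre-release
# from a setup.py-driven install (no --pre flag available there).
EXTRAS_REQUIRE = {
    "nncf": ["nncf>=2.11.0"],
    "openvino": [
        "nncf>=2.11.0",
        "openvino==2024.4.1.dev20240926",
        "openvino-tokenizers==2024.4.1.0.dev20240926",
    ],
}
```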
    - if: ${{ matrix.test-pattern == '*modeling*' }}
      name: Uninstall NNCF
      run: |
        pip uninstall -y nncf
Any reason we uninstall NNCF here? We can help to resolve if there are any issues caused by the NNCF presence.
No issue, just making sure NNCF is not a dependency for OV modeling. I believe this was the case at some point, and then nncf was added to the ov extras. Also #975 (comment)
      run: |
        pip install transformers==${{ matrix.transformers-version }} accelerate==0.*

    - if: ${{ matrix.test-pattern == '*modeling*' }}
@IlyasMoutawwakil, could you please tell us a bit more about the idea behind the precommit changes? Do I understand correctly that you split the precommit into several independent workflows?
I'm also wondering whether it is really necessary to remove nncf during the modeling tests here, and whether there are cases where modeling and quantization tests run at the same time.
> Could you please tell us a bit more about the idea behind the precommit changes? Do I understand correctly that you split the precommit into several independent workflows?

Yes, the number of tests results in a ~1h20min wait time for the openvino per-commit tests, so the idea is to distribute them.

> I'm also wondering whether it is really necessary to remove nncf during the modeling tests here

My understanding from https://github.com/huggingface/optimum-intel/pull/975/files#r1822437311 is that the modeling tests should pass without nncf (they do); by uninstalling it, we make sure that stays the case.
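The "modeling works without nncf" property typically rests on guarded imports. A minimal sketch of the common availability-check pattern (the helper name is mine, not necessarily what optimum-intel uses):

```python
import importlib.util


def is_package_available(name: str) -> bool:
    # find_spec returns None when the package is not installed,
    # without actually importing it (so nothing breaks if it's absent)
    return importlib.util.find_spec(name) is not None


# Modeling code would only touch nncf behind such a guard:
if is_package_available("nncf"):
    pass  # quantization-aware code paths
else:
    pass  # plain OpenVINO modeling still works

print(is_package_available("json"))  # True (stdlib, always present)
```

Uninstalling nncf in the CI job then acts as a regression test that no modeling code path imports it unconditionally.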
These are useful changes, thank you!
    os: ["ubuntu-22.04", "windows-2019"]
    openvino-version: ["stable", "nightly"]
    transformers-version: ["4.36.0", "latest"]
    nncf: ["nncf", "git+https://github.com/openvinotoolkit/nncf.git"]
To be honest, I don't think all 16 of these combinations are strictly necessary.
It also so happens that I've just started working on a separate OV workflow to run not only the slow OV tests but all OV tests: #985. This is because during slow tests almost no quantization-related tests are actually run, which is especially critical for ensuring nncf develop compatibility.
So after this PR is merged, if there are no objections, I would remove ov-nightly and nncf-develop from the slow tests in my PR and introduce a separate workflow that tests the ov-nightly and nncf-develop versions on the full OV scope, on a single OS and transformers version.
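For reference, the 16-job figure is just the cross product of the matrix above; a quick way to see it expand (values copied from the workflow snippet):

```python
from itertools import product

# Matrix axes from the workflow snippet above
matrix = {
    "os": ["ubuntu-22.04", "windows-2019"],
    "openvino-version": ["stable", "nightly"],
    "transformers-version": ["4.36.0", "latest"],
    "nncf": ["nncf", "git+https://github.com/openvinotoolkit/nncf.git"],
}

# GitHub Actions runs one job per combination: 2 * 2 * 2 * 2
jobs = list(product(*matrix.values()))
print(len(jobs))  # 16 jobs per trigger
```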
What does this PR do?
- Updates from 2.2.0 to 2.4.*.
- Uses the latest transformers version, to avoid pinning a specific version in the CI that needs to be updated with each release.
- Uses the testing utility whenever possible (nice reports).

Before submitting