fix: wait to allow PodDefaults to be synced in Job's namespace before running tests #120

Merged: 10 commits, Sep 17, 2024

Conversation

@NohaIhab (Contributor) commented on Sep 13, 2024:

closes canonical/bundle-kubeflow#1066

This overcomes a race condition that sometimes occurs between:

  1. the PodDefaults being created in the test namespace
  2. the Job being created in the test namespace

The extra wait ensures that step 1 happens before step 2.

Summary of changes

  • Adds a sleep after creating the namespace, to allow PodDefaults to be synced into the namespace before the Job runs
  • Asserts that the list of PodDefaults in the namespace is not empty
  • Logs the names of the PodDefaults found in the namespace
  • Asserts that the access-ml-pipeline PodDefault exists in the namespace
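The steps above can be sketched as a small helper. The PR itself uses a fixed sleep plus assertions; the sketch below instead polls until the expected PodDefault appears, a common hardening of the same idea. `wait_for_poddefaults` and the injected `list_poddefaults` callable are hypothetical names for illustration, not the actual test code:

```python
import time

def wait_for_poddefaults(list_poddefaults, expected="access-ml-pipeline",
                         timeout=60, interval=5):
    """Poll until `expected` appears in the namespace's PodDefaults.

    `list_poddefaults` is any zero-argument callable returning the current
    list of PodDefault names (e.g. wrapping a lightkube list call).
    Returns the full list of names once found; raises TimeoutError otherwise.
    """
    deadline = time.monotonic() + timeout
    while True:
        names = list(list_poddefaults())
        if names and expected in names:
            return names
        if time.monotonic() >= deadline:
            raise TimeoutError(
                f"PodDefault {expected!r} not found in time; saw {names}"
            )
        time.sleep(interval)
```

Polling with a deadline keeps the happy path fast while still bounding the worst case, unlike a fixed sleep which always pays its full cost.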

Results

UAT run logs from this PR:
tox -e uats-remote -- --filter "not e2e"
uats-remote: commands[0]> pytest -vv --tb native /home/ubuntu/uat/driver/ -s --model kubeflow --filter 'not e2e'
=========================================================================================== test session starts ============================================================================================
platform linux -- Python 3.8.20, pytest-7.4.3, pluggy-1.3.0 -- /home/ubuntu/uat/.tox/uats-remote/bin/python
cachedir: .tox/uats-remote/.pytest_cache
rootdir: /home/ubuntu/uat
configfile: pyproject.toml
plugins: anyio-4.0.0, asyncio-0.21.1, operator-0.31.0
asyncio: mode=strict
collected 2 items                                                                                                                                                                                          

driver/test_kubeflow_workloads.py::test_create_profile 
---------------------------------------------------------------------------------------------- live log setup ----------------------------------------------------------------------------------------------
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/apiextensions.k8s.io/v1/customresourcedefinitions "HTTP/1.1 200 OK"
INFO     test_kubeflow_workloads:test_kubeflow_workloads.py:83 Creating Profile test-kubeflow...
INFO     httpx:_client.py:1013 HTTP Request: POST https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/kubeflow.org/v1/profiles "HTTP/1.1 201 Created"
---------------------------------------------------------------------------------------------- live log call -----------------------------------------------------------------------------------------------
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/kubeflow.org/v1/profiles/test-kubeflow "HTTP/1.1 200 OK"
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/api/v1/namespaces/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:32 Waiting for namespace test-kubeflow to become 'Active': phase == Terminating
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/api/v1/namespaces/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:32 Waiting for namespace test-kubeflow to become 'Active': phase == Active
INFO     test_kubeflow_workloads:test_kubeflow_workloads.py:151 Sleeping for 40s to allow the creation of PodDefaults in test-kubeflow namespace..
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/kubeflow.org/v1alpha1/namespaces/test-kubeflow/poddefaults "HTTP/1.1 200 OK"
INFO     test_kubeflow_workloads:test_kubeflow_workloads.py:160 PodDefaults in test-kubeflow namespace are ['access-ml-pipeline', 'mlflow-server-access-minio', 'mlflow-server-minio'].
PASSED
driver/test_kubeflow_workloads.py::test_kubeflow_workloads 
---------------------------------------------------------------------------------------------- live log call -----------------------------------------------------------------------------------------------
INFO     test_kubeflow_workloads:test_kubeflow_workloads.py:169 Starting Kubernetes Job test-kubeflow/test-kubeflow to run notebook tests...
INFO     httpx:_client.py:1013 HTTP Request: POST https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs "HTTP/1.1 201 Created"
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:76 Waiting for Job test-kubeflow/test-kubeflow to complete (status == not ready)
INFO     utils:utils.py:40 Retrying in 2 seconds (attempts: 1)
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:76 Waiting for Job test-kubeflow/test-kubeflow to complete (status == active)
INFO     utils:utils.py:40 Retrying in 4 seconds (attempts: 2)
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:76 Waiting for Job test-kubeflow/test-kubeflow to complete (status == active)
INFO     utils:utils.py:40 Retrying in 8 seconds (attempts: 3)
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:76 Waiting for Job test-kubeflow/test-kubeflow to complete (status == active)
INFO     utils:utils.py:40 Retrying in 16 seconds (attempts: 4)
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:76 Waiting for Job test-kubeflow/test-kubeflow to complete (status == active)
INFO     utils:utils.py:40 Retrying in 32 seconds (attempts: 5)
[... identical polling lines (HTTP GET / waiting for Job / retrying in 32 seconds) repeated for attempts 6 through 28, elided for brevity ...]
INFO     httpx:_client.py:1013 HTTP Request: GET https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs/test-kubeflow "HTTP/1.1 200 OK"
INFO     utils:utils.py:69 Job test-kubeflow/test-kubeflow completed successfully!
INFO     test_kubeflow_workloads:test_kubeflow_workloads.py:196 Fetching Job logs...
##### git-sync initContainer logs #####
INFO: detected pid 1, running init handler
{"logger":"","ts":"2024-09-13 13:01:46.153796","caller":{"file":"main.go","line":722},"level":0,"msg":"starting up","pid":12,"uid":65533,"gid":65533,"home":"/tmp","flags":["--add-user=false","--change-permissions=0","--cookie-file=false","--depth=1","--exechook-backoff=3s","--exechook-timeout=30s","--git=git","--git-gc=always","--group-write=true","--help=false","--http-metrics=false","--http-pprof=false","--link=charmed-kubeflow-uats","--man=false","--max-failures=0","--max-sync-failures=0","--one-time=true","--password=REDACTED","--period=10s","--ref=204694d0a192202611ec171da54f858fea9b4c21","--repo=https://github.com/canonical/charmed-kubeflow-uats","--root=/tests","--ssh=false","--ssh-key-file=/etc/git-secret/ssh","--ssh-known-hosts=true","--ssh-known-hosts-file=/etc/git-secret/known_hosts","--stale-worktree-timeout=0s","--submodules=recursive","--sync-timeout=2m0s","--timeout=0","--v=-1","--verbose=0","--version=false","--wait=0","--webhook-backoff=3s","--webhook-method=POST","--webhook-success-status=200","--webhook-timeout=1s"]}
{"logger":"","ts":"2024-09-13 13:01:46.167813","caller":{"file":"main.go","line":1248},"level":0,"msg":"repo directory failed checks or was empty","path":"/tests"}
{"logger":"","ts":"2024-09-13 13:01:46.167938","caller":{"file":"main.go","line":1258},"level":0,"msg":"initializing repo directory","path":"/tests"}
{"logger":"","ts":"2024-09-13 13:01:46.514997","caller":{"file":"main.go","line":1797},"level":0,"msg":"update required","ref":"204694d0a192202611ec171da54f858fea9b4c21","local":"","remote":"204694d0a192202611ec171da54f858fea9b4c21","syncCount":0}
{"logger":"","ts":"2024-09-13 13:01:47.024882","caller":{"file":"main.go","line":1848},"level":0,"msg":"updated successfully","ref":"204694d0a192202611ec171da54f858fea9b4c21","remote":"204694d0a192202611ec171da54f858fea9b4c21","syncCount":1}
{"logger":"","ts":"2024-09-13 13:01:47.025076","caller":{"file":"main.go","line":1022},"level":0,"msg":"exiting after one sync","status":0}
##### test-kubeflow container logs #####
WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x7fed2a5bbc90>: Failed to establish a new connection: [Errno 101] Network is unreachable')': /simple/pytest/
WARNING: Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.HTTPSConnection object at 0x7fed2a5daa10>: Failed to establish a new connection: [Errno 101] Network is unreachable')': /simple/pytest/
============================= test session starts ==============================
platform linux -- Python 3.11.9, pytest-8.3.3, pluggy-1.5.0 -- /opt/conda/bin/python3.11
cachedir: .pytest_cache
rootdir: /tests/.worktrees/204694d0a192202611ec171da54f858fea9b4c21/tests
configfile: pytest.ini
plugins: anyio-4.4.0
collecting ... collected 9 items / 1 deselected / 8 selected

test_notebooks.py::test_notebook[katib-integration] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running katib-integration.ipynb...
PASSED                                                                   [ 12%]
test_notebooks.py::test_notebook[kfp-v1-integration] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running kfp-v1-integration.ipynb...
PASSED                                                                   [ 25%]
test_notebooks.py::test_notebook[kfp-v2-integration] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running kfp-v2-integration.ipynb...
PASSED                                                                   [ 37%]
test_notebooks.py::test_notebook[kserve-integration] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running kserve-integration.ipynb...
PASSED                                                                   [ 50%]
test_notebooks.py::test_notebook[mlflow-integration] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running mlflow-integration.ipynb...
PASSED                                                                   [ 62%]
test_notebooks.py::test_notebook[mlflow-kserve] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running mlflow-kserve.ipynb...
PASSED                                                                   [ 75%]
test_notebooks.py::test_notebook[mlflow-minio-integration] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running mlflow-minio-integration.ipynb...
PASSED                                                                   [ 87%]
test_notebooks.py::test_notebook[training-integration] 
-------------------------------- live log call ---------------------------------
INFO     test_notebooks:test_notebooks.py:44 Running training-integration.ipynb...
PASSED                                                                   [100%]

================= 8 passed, 1 deselected in 775.42s (0:12:55) ==================
PASSED
-------------------------------------------------------------------------------------------- live log teardown ---------------------------------------------------------------------------------------------
INFO     test_kubeflow_workloads:test_kubeflow_workloads.py:96 Deleting Profile test-kubeflow...
INFO     httpx:_client.py:1013 HTTP Request: DELETE https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/kubeflow.org/v1/profiles/test-kubeflow "HTTP/1.1 200 OK"
INFO     test_kubeflow_workloads:test_kubeflow_workloads.py:202 Deleting Job test-kubeflow/test-kubeflow...
INFO     httpx:_client.py:1013 HTTP Request: DELETE https://kubeflow-4b5f4so8.hcp.westeurope.azmk8s.io/apis/batch/v1/namespaces/test-kubeflow/jobs/test-kubeflow "HTTP/1.1 200 OK"


====================================================================================== 2 passed in 843.26s (0:14:03) =======================================================================================
  uats-remote: OK (844.42=setup[0.06]+cmd[844.36] seconds)
  congratulations :) (844.52 seconds)
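The Job-completion wait visible in the logs above polls with exponential backoff (2, 4, 8, 16 seconds, then capped at 32 seconds). A minimal generic sketch of that polling pattern follows; `wait_until` and its injectable `sleep` parameter are hypothetical names, not the repo's actual `utils.py`:

```python
import time

def wait_until(check, max_backoff=32, max_attempts=30, sleep=time.sleep):
    """Poll check() until it returns a truthy value.

    The delay doubles after each failed attempt (2, 4, 8, ...) and is
    capped at max_backoff seconds, mirroring the retry cadence in the
    logs above. `sleep` is injectable so tests can avoid real waiting.
    """
    backoff = 2
    for attempt in range(1, max_attempts + 1):
        result = check()
        if result:
            return result
        print(f"Retrying in {backoff} seconds (attempts: {attempt})")
        sleep(backoff)
        backoff = min(backoff * 2, max_backoff)
    raise TimeoutError(f"condition not met after {max_attempts} attempts")
```

Capping the backoff keeps the poll responsive once the Job is close to finishing, while the early doubling avoids hammering the API server right after submission.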

Note: the e2e test is skipped; it is failing due to canonical/bundle-kubeflow#1067.

See the AKS run using this branch.

@NohaIhab changed the title from "fix: check PodDefaults in Job's namespace before running tests" to "fix: wait to allow PodDefaults to be synced in Job's namespace before running tests" on Sep 13, 2024
@orfeas-k (Contributor) left a comment:
Left 1 comment, good job @NohaIhab on tackling this!

(review comment on driver/test_kubeflow_workloads.py, now outdated and resolved)
@orfeas-k (Contributor) left a comment:
LGTM from my side. Leaving it to @misohu for approval

@misohu (Member) left a comment:
Really nice job Noha

@NohaIhab merged commit 14f4199 into main on Sep 17, 2024
1 check passed
@NohaIhab deleted the KF-5915-check-poddefaults-in-namespace branch on September 17, 2024 22:33
Successfully merging this pull request may close these issues.

kfp and mlflow UATs fail intermittently when run using the driver with poddefaults not applied to Job Pod