
datasciencepipelines component refactor #1340

Conversation

jackdelahunt
Contributor

Description

How Has This Been Tested?

Screenshot or short clip

Merge criteria

  • You have read the contributors guide.
  • Commit messages are meaningful - have a clear and concise summary and detailed explanation of what was changed and why.
  • Pull Request contains a description of the solution, a link to the JIRA issue, and to any dependent or related Pull Request.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that the changes work.


codecov bot commented Nov 4, 2024

Codecov Report

Attention: Patch coverage is 0% with 9 lines in your changes missing coverage. Please review.

Please upload report for BASE (feature-operator-refactor@77ba8e7). Learn more about missing BASE report.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| ...atasciencecluster/datasciencecluster_controller.go | 0.00% | 9 Missing ⚠️ |
Additional details and impacted files
```
@@                     Coverage Diff                      @@
##             feature-operator-refactor    #1340   +/-   ##
============================================================
  Coverage                             ?   26.63%
============================================================
  Files                                ?       55
  Lines                                ?     4464
  Branches                             ?        0
============================================================
  Hits                                 ?     1189
  Misses                               ?     3134
  Partials                             ?      141
```

☔ View full report in Codecov by Sentry.



@lburgazzoli
Contributor

you may also need to re-generate some additional metadata:

```
make generate manifest api-docs bundle
```

Comment on lines 40 to 106
```go
func devFlags(ctx context.Context, rr *odhtypes.ReconciliationRequest) error {
	dsp, ok := rr.Instance.(*componentsv1.DataSciencePipelines)
	if !ok {
		return fmt.Errorf("resource instance %v is not a componentsv1.DataSciencePipelines", rr.Instance)
	}

	if dsp.Spec.DevFlags == nil {
		return nil
	}

	// Implement devflags support logic
	// If dev flags are set, update default manifests path
	if len(dsp.Spec.DevFlags.Manifests) != 0 {
		manifestConfig := dsp.Spec.DevFlags.Manifests[0]
		if err := odhdeploy.DownloadManifests(ctx, componentsv1.DataSciencePipelinesComponentName, manifestConfig); err != nil {
			return err
		}
		if manifestConfig.SourcePath != "" {
			rr.Manifests[0].Path = odhdeploy.DefaultManifestPath
			rr.Manifests[0].ContextDir = componentsv1.DataSciencePipelinesComponentName
			rr.Manifests[0].SourcePath = manifestConfig.SourcePath
		}
	}

	return nil
}
```
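For readers unfamiliar with the dev-flags mechanism, the override behaviour above can be sketched in isolation. This is a minimal standalone approximation: `ManifestInfo`, `ManifestConfig`, and the constants are simplified stand-ins for the real odhtypes/odhdeploy API, not the operator's actual types.

```go
package main

import "fmt"

// Simplified stand-ins for the operator's manifest types (assumptions,
// not the real odhtypes/odhdeploy API).
type ManifestInfo struct {
	Path       string
	ContextDir string
	SourcePath string
}

type ManifestConfig struct {
	URI        string
	SourcePath string
}

const defaultManifestPath = "/opt/manifests"
const componentName = "data-science-pipelines-operator"

// applyDevFlags mirrors the override logic: only the first dev-flags
// manifest entry is honoured, and only when it sets a SourcePath.
func applyDevFlags(manifests []ManifestInfo, devManifests []ManifestConfig) []ManifestInfo {
	if len(devManifests) == 0 || len(manifests) == 0 {
		return manifests
	}
	cfg := devManifests[0]
	if cfg.SourcePath != "" {
		manifests[0].Path = defaultManifestPath
		manifests[0].ContextDir = componentName
		manifests[0].SourcePath = cfg.SourcePath
	}
	return manifests
}

func main() {
	manifests := []ManifestInfo{{Path: defaultManifestPath, ContextDir: componentName, SourcePath: "base"}}
	dev := []ManifestConfig{{URI: "https://example.com/manifests.tar.gz", SourcePath: "overlays/dev"}}
	out := applyDevFlags(manifests, dev)
	fmt.Println(out[0].SourcePath) // overlays/dev
}
```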
Contributor Author

Do we know if there is anything additional needed here? Looking at the old component implementation, I don't think so, but I am not sure.

@jackdelahunt jackdelahunt changed the title WIP refactor: datasciencepipelines component refactor datasciencepipelines component refactor Nov 6, 2024
@lburgazzoli
Contributor

Comment on lines 80 to 101
```go
func UnmanagedArgoWorkFlowExists(ctx context.Context, cli client.Client) error {
	workflowCRD := &apiextensionsv1.CustomResourceDefinition{}
	if err := cli.Get(ctx, client.ObjectKey{Name: ArgoWorkflowCRD}, workflowCRD); err != nil {
		if k8serr.IsNotFound(err) {
			return nil
		}
		return fmt.Errorf("failed to get existing Workflow CRD: %w", err)
	}

	// Verify if the existing workflow is deployed by ODH with the label
	odhLabelValue, odhLabelExists := workflowCRD.Labels[labels.ODH.Component(componentsv1.DataSciencePipelinesComponentName)]
	if odhLabelExists && odhLabelValue == "true" {
		return nil
	}

	return fmt.Errorf("%s CRD already exists but was not deployed by this operator. "+
		"Remove existing Argo workflows or set `spec.components.datasciencepipelines.managementState` to Removed to proceed", ArgoWorkflowCRD)
}

func SetExistingArgoCondition(conditions *[]conditionsv1.Condition, reason, message string) {
	status.SetCondition(conditions, string(status.CapabilityDSPv2Argo), reason, message, corev1.ConditionFalse)
	status.SetComponentCondition(conditions, componentsv1.DataSciencePipelinesComponentName, status.ReconcileFailed, message, corev1.ConditionFalse)
}
```
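The ownership check above reduces to a single label comparison. A minimal standalone sketch of that decision (the label key here is an assumption standing in for `labels.ODH.Component(componentsv1.DataSciencePipelinesComponentName)`, not the operator's actual key):

```go
package main

import "fmt"

// odhComponentLabel is an assumed key standing in for
// labels.ODH.Component(componentsv1.DataSciencePipelinesComponentName).
const odhComponentLabel = "app.opendatahub.io/data-science-pipelines-operator"

// ownedByODH mirrors the check above: the CRD counts as operator-managed
// only when the ODH component label exists and is exactly "true".
func ownedByODH(crdLabels map[string]string) bool {
	v, ok := crdLabels[odhComponentLabel]
	return ok && v == "true"
}

func main() {
	fmt.Println(ownedByODH(map[string]string{odhComponentLabel: "true"})) // deployed by ODH: true
	fmt.Println(ownedByODH(map[string]string{}))                          // pre-existing, unmanaged: false
}
```

Any other outcome (missing label, or a value other than "true") is the error path that asks the user to remove the pre-existing Argo install or set the component to Removed.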
Contributor Author

@lburgazzoli I added that here

Contributor

but isn't that used in the DSC controller? I'm a little confused about whether this is a DSP concern or a general DSC concern. In the first case, the DSP controller should set its own condition, and the DSC controller can eventually copy it.

@VaishnaviHire do you remember what the logic is here?


Contributor Author

@jackdelahunt jackdelahunt Nov 7, 2024

I am kind of confused, but here is what I think you were saying:

  • UnmanagedArgoWorkFlowExists can be a new action in the DSP controller
  • We can also set the first condition in the DSP controller
  • And if that condition is there we can set the component condition in the DSC controller
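The division of responsibility in the steps above (the DSP controller sets its own condition, and the DSC controller copies it) could be sketched roughly like this. `Condition` and both helpers are simplified stand-ins for the real conditionsv1/status API, not the operator's actual implementation:

```go
package main

import "fmt"

// Condition is a simplified stand-in for conditionsv1.Condition.
type Condition struct {
	Type   string
	Status string
	Reason string
}

// setCondition upserts a condition by type, as the DSP controller would
// when UnmanagedArgoWorkFlowExists fails.
func setCondition(conds []Condition, c Condition) []Condition {
	for i := range conds {
		if conds[i].Type == c.Type {
			conds[i] = c
			return conds
		}
	}
	return append(conds, c)
}

// copyToDSC models the DSC controller mirroring a component's condition
// into its own status, rather than the component writing to the DSC.
func copyToDSC(dscConds, dspConds []Condition, condType string) []Condition {
	for _, c := range dspConds {
		if c.Type == condType {
			return setCondition(dscConds, c)
		}
	}
	return dscConds
}

func main() {
	var dsp, dsc []Condition
	dsp = setCondition(dsp, Condition{Type: "CapabilityDSPv2Argo", Status: "False", Reason: "ArgoWorkflowExists"})
	dsc = copyToDSC(dsc, dsp, "CapabilityDSPv2Argo")
	fmt.Println(len(dsc), dsc[0].Type) // 1 CapabilityDSPv2Argo
}
```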

Contributor

correct

Contributor Author

@jackdelahunt jackdelahunt Nov 7, 2024

New commit has those changes, with this logic translated into an action in the DSP controller, but I have some questions:

  1. I have this at the top of the new action. In terms of upgrading, does this work the same way, or because this is now an action do things act a bit differently?

```go
// Check preconditions if this is an upgrade
if rr.Instance.GetStatus().Phase != status.PhaseReady {
	return nil
}
```

  2. I set the CapabilityDSPv2Argo condition for the DSC in this action, which is what I think you were looking for. Is updating the DSC from a component okay? Just wondering what the rules are around component/DSC interactions.

  3. The second part of SetExistingArgoCondition:

```go
status.SetComponentCondition(conditions, componentsv1.DataSciencePipelinesComponentName, status.ReconcileFailed, message, corev1.ConditionFalse)
```

  I am not sure where to put this in the DSC. I think failing the new UnmanagedArgoWorkFlowExists action already causes the component's reconcile to end up in a failed state because it is erroring. So is this already done for us when we process the componentErrors? Or maybe things act differently; I am not sure.

This is my first PR on the operator so my understanding is limited but hopefully growing 😃

@jackdelahunt
Contributor Author

/retest

@jackdelahunt
Contributor Author

/retest

@lburgazzoli
Contributor

LGTM. We would probably need to add a test for the preconditions step, but that can be done as a follow-up.

I'll let @zdtsw do the final review.

```go
DataSciencePipelinesComponentName = "data-science-pipelines-operator"
// value should match what's set in the XValidation below
DataSciencePipelinesInstanceName = "default-datasciencepipelines"
DataSciencePipelinesKind         = "DataSciencePipelines"
```
Member

do we want to use plural?

Contributor Author

I thought for DSP that it is always plural, but maybe I am wrong.

Ray operator == DataSciencePipelines operator ?

Member

no, Ray is Ray, DSP is DSP, two different components

Contributor Author

@jackdelahunt jackdelahunt Nov 18, 2024

No, sorry, I didn't mean they are the same. When you refer to the Ray operator you say Ray, and when you refer to the DSP operator you say DataSciencePipelines; you use the plural, not the singular. But I could be wrong about that; it is just that I have seen that before and thought I would copy it.

Member

i was thinking here:

```go
TypeMeta: metav1.TypeMeta{
	Kind:       componentsv1.DataSciencePipelinesKind, // ============> DataSciencePipelines or DataSciencePipeline
	APIVersion: componentsv1.GroupVersion.String(),
},
```

but now I see https://github.com/jackdelahunt/opendatahub-operator/blob/datasciencepipelines-refactor/bundle/manifests/components.opendatahub.io_datasciencepipelines.yaml#L13-L14
both singular and plural are set to the same value :D
idk, let @lburgazzoli tell us whether it should be `kind: DataSciencePipelines` or `kind: DataSciencePipeline`

@@ -71,3 +76,11 @@ type DataSciencePipelinesList struct {

```go
func init() {
	SchemeBuilder.Register(&DataSciencePipelines{}, &DataSciencePipelinesList{})
}

// DSCDataSciencePipelines contains all the configuration exposed in DSC instance for DataSciencePipelines component
type DSCDataSciencePipelines struct {
```
Member

same for here

Member

@grdryn grdryn left a comment

Generally looks good to me 🚀

@jackdelahunt jackdelahunt force-pushed the datasciencepipelines-refactor branch 3 times, most recently from 739b5bf to 1ed662a on November 19, 2024 13:19

openshift-ci bot commented Nov 19, 2024

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: lburgazzoli

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-merge-bot openshift-merge-bot bot merged commit e980df6 into opendatahub-io:feature-operator-refactor Nov 19, 2024
10 checks passed
Projects
Status: Done
5 participants