
Will SpreadConstraints take effect when replicaSchedulingType is Divided? #5986

Open
vie-serendipity opened this issue Dec 27, 2024 · 8 comments
Labels
kind/question Indicates an issue that is a support question.

Comments

@vie-serendipity
Contributor

Please provide an in-depth description of the question you have:
When I tested SpreadConstraints, I found they do not take effect when replicaSchedulingType is Divided. I have two clusters, where one cluster has significantly more resources than the other, and all replicas are scheduled to the larger cluster. I expected adding SpreadConstraints to ensure the smaller cluster gets at least one replica. However, it does not seem to work for either dynamic or static weighting.

What do you think about this question?:
I'm not sure my use case is correct. If it is, I think SpreadConstraints should be supported even when replicaSchedulingType is Divided.

Environment:

  • Karmada version:
  • Kubernetes version:
  • Others:
vie-serendipity added the kind/question label on Dec 27, 2024
@XiShanYongYe-Chang
Member

In theory, it should take effect. Can you share how your PropagationPolicy is set?

@vie-serendipity
Contributor Author

The PropagationPolicy is as follows:

  resourceSelectors:
    - apiVersion: apps/v1
      kind: Deployment
      name: whoami
      namespace: test
  placement:
    clusterAffinity:
      clusterNames:
        - member1
        - member2
    spreadConstraints:
      - maxGroups: 2
        minGroups: 2
    replicaScheduling:
      replicaDivisionPreference: Weighted
      replicaSchedulingType: Divided
      weightPreference:
        dynamicWeight: AvailableReplicas
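
For reference, the snippet above is the policy's spec; a complete manifest would look roughly like the following (the metadata name and the explicit spreadByField are assumptions for illustration):

    apiVersion: policy.karmada.io/v1alpha1
    kind: PropagationPolicy
    metadata:
      name: whoami-pp        # hypothetical name
      namespace: test
    spec:
      resourceSelectors:
        - apiVersion: apps/v1
          kind: Deployment
          name: whoami
          namespace: test
      placement:
        clusterAffinity:
          clusterNames:
            - member1
            - member2
        spreadConstraints:
          - spreadByField: cluster   # assumed; groups are counted per cluster
            maxGroups: 2
            minGroups: 2
        replicaScheduling:
          replicaDivisionPreference: Weighted
          replicaSchedulingType: Divided
          weightPreference:
            dynamicWeight: AvailableReplicas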

@vie-serendipity
Contributor Author

BTW, if the weightPreference is static, for example, three replicas with a static weight ratio of 3:1, then according to the current code logic I believe the scheduling result would be 3:0. Would the spreadConstraint then not take effect?
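
To make the 3:0 claim concrete, here is a minimal Go sketch of weighted floor division where the remainder is handed out in priority order (higher weight first); it is a simplification for illustration, not Karmada's actual divider:

    package main

    import "fmt"

    // divideStatic splits total replicas by static weights: each cluster
    // first gets floor(total*weight/weightSum), then the remainder is
    // handed out one replica at a time following the given order.
    func divideStatic(total int64, weights map[string]int64, order []string) map[string]int64 {
        var sum int64
        for _, w := range weights {
            sum += w
        }
        res := make(map[string]int64, len(weights))
        var assigned int64
        for name, w := range weights {
            res[name] = total * w / sum // floor division
            assigned += res[name]
        }
        for i := 0; assigned < total; i++ {
            res[order[i%len(order)]]++ // remainder favors member1 here
            assigned++
        }
        return res
    }

    func main() {
        // 3 replicas split between member1 and member2 with weights 3:1.
        got := divideStatic(3, map[string]int64{"member1": 3, "member2": 1}, []string{"member1", "member2"})
        fmt.Println(got) // map[member1:3 member2:0]; member2 is starved
    }

With floor division, member2's share is 3*1/4 = 0, and the single remaining replica also goes to member1, so minGroups: 2 can be silently violated.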

I only see spreadConstraints functioning in selectClusters, not in assignReplicas. The same applies to dynamic weighting.

@XiShanYongYe-Chang
Member

According to the PropagationPolicy configuration you shared, there is no problem. Check whether the available replicas in the other cluster are insufficient. You can view the scheduler's logs.
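
For example, assuming a default installation where the scheduler runs as the karmada-scheduler Deployment in the karmada-system namespace:

    kubectl logs -n karmada-system deploy/karmada-scheduler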

BTW, if the weightPreference is static, for example, three replicas with a static weight ratio of 3:1, then according to the current code logic I believe the scheduling result would be 3:0. Would the spreadConstraint then not take effect?

If it's static, the allocation should ignore the amount of available resources.

@vie-serendipity
Contributor Author

If it's static, the allocation should ignore the amount of available resources.

I mean the static weight ratio is 3:1.

Anyway, thanks very much for answering on the weekend. I will read the code further and test more.

@vie-serendipity
Contributor Author

After reading the code, I still find that SpreadConstraints doesn't work when replicaSchedulingType is Divided.

Actually, SpreadConstraints only takes effect during cluster selection, as shown below:

	clusters, err := g.selectClusters(clustersScore, spec.Placement, spec)

So there is no guarantee that the final result of assignReplicas still includes every cluster chosen by selectClusters, which means the final scheduling result might violate SpreadConstraints.

It's only useful when replicaSchedulingType is Duplicated.
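
If this gap were to be closed, one option would be a post-assignment check. Below is a minimal sketch against Karmada's workv1alpha2.TargetCluster type; validateSpread is a hypothetical helper, not existing code:

    package scheduler

    import (
        "fmt"

        workv1alpha2 "github.com/karmada-io/karmada/pkg/apis/work/v1alpha2"
    )

    // validateSpread is a hypothetical post-check: after assignReplicas,
    // count how many clusters actually received replicas and verify that
    // the result still satisfies the SpreadConstraint's minGroups.
    func validateSpread(targets []workv1alpha2.TargetCluster, minGroups int) error {
        groups := 0
        for _, t := range targets {
            if t.Replicas > 0 {
                groups++
            }
        }
        if groups < minGroups {
            return fmt.Errorf("spread constraint violated: %d cluster(s) received replicas, minGroups=%d", groups, minGroups)
        }
        return nil
    }

Without such a check (or a divider that is itself aware of minGroups), a cluster chosen by selectClusters can legally end up with zero replicas.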

@XiShanYongYe-Chang
Member

Thanks for your response @vie-serendipity.
Asking @whitewindmills to help take a look and confirm.
/cc @whitewindmills

@whitewindmills
Member

@vie-serendipity thanks for your use case. I think it's not SpreadConstraint's fault but StaticWeight's. I mentioned a similar problem in a previous issue #4805. Maybe this is a good time to help us push this feature further. @RainbowMango what do you think?
