
Fix setuptools config #39

Closed · wants to merge 4 commits

Conversation

@francois-rozet (Contributor) commented Nov 6, 2024

Closes #38.

The issue is that the current configuration does not let setuptools pick up sub-packages (notably utils). Switching to package discovery fixes this.
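
For reference, package discovery in pyproject.toml looks roughly like this (a minimal sketch; the PR's actual hunk appears in the review thread below):

# Sketch of setuptools automatic package discovery. The pattern
# "distributed_shampoo*" matches the top-level package and all of its
# sub-packages (e.g. distributed_shampooutils becomes importable as
# distributed_shampoo.utils), so sub-modules are no longer dropped
# from the install.
[tool.setuptools.packages.find]
include = ["distributed_shampoo*"]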

@facebook-github-bot (Contributor)

Hi @francois-rozet!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

@facebook-github-bot (Contributor)

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

@facebook-github-bot added the CLA Signed label Nov 6, 2024
@runame added the bug label Nov 6, 2024

@runame (Contributor) commented Nov 6, 2024

Thanks for the fix, was just about to tackle this issue!

Can you modify the tests to not use -e, so that they catch it if something in the setup breaks again? The lines to change are here and here.

@runame (Contributor) commented Nov 6, 2024

Thanks for adjusting the tests! I didn't include the mypy type checking because it detects a duplicate module in build/lib/. So I think we should just keep using -e for that one; otherwise we would have to use the --exclude flag or something like that.
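
For reference, the same exclusion could also be expressed in pyproject.toml; a minimal sketch, assuming the duplicate module sits under build/lib/ (both the path and the regex are assumptions):

[tool.mypy]
# Hedged sketch: skip the build artifacts that trigger mypy's
# "duplicate module" error; the exact pattern is an assumption.
exclude = ["^build/"]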

Not sure what happened to the GPU tests though.

@francois-rozet (Contributor, Author) commented Nov 6, 2024

> I didn't include the mypy type checking

My bad, I'll revert.

> Not sure what happened to the GPU tests though.

Not sure either :/ I did not change anything else in the codebase, so it seems weird that it would fail. Can you launch the tests on the master branch?

@francois-rozet (Contributor, Author)

Note that the "profiler function will be ignored" error is also present on the main branch, but not the segmentation fault.

@runame (Contributor) commented Nov 6, 2024

Yes, the issue seems unrelated to that. I just reran the test on main and it still works (see here).

@tsunghsienlee (Contributor)

> Not sure either :/ I did not change anything else in the codebase, so it seems weird that it would fail. Can you launch the tests on the master branch?

I am rerunning the GPU tests now in CI; let's see how it goes. This change should not affect the GPU tests.

@tsunghsienlee (Contributor)

It seems dropping -e caused this GPU test failure. Maybe we could bring back -e in .github/workflows/gpu-tests.yaml for now so we can merge this?

@runame (Contributor) commented Nov 7, 2024

Agreed. But we should check whether this issue also occurs outside of the GitHub runner, i.e. whether installing without -e and running the tests on a GPU causes the same failure on a different machine.

@francois-rozet (Contributor, Author)

Let's go for that (but keep in mind that this bug should be investigated). I am away from my laptop, but if you make a code suggestion (in a review comment) I can merge it from my phone.

@tsunghsienlee (Contributor)

BTW, I don't think -e should affect how the tests work, especially after reading https://stackoverflow.com/questions/35064426/when-would-the-e-editable-option-be-useful-with-pip-install. However, the fact that our CPU tests still pass while the GPU tests fail tells us something we may want to investigate later.

Review thread on the following pyproject.toml hunk:

py-modules = [
"matrix_functions",
"matrix_functions_types",
"optimizer_modules",
]

[tool.setuptools.packages.find]
include = ["distributed_shampoo*"]
Contributor:

I think we want to do this instead, to avoid packaging the tests and examples:

Suggested change:
- include = ["distributed_shampoo*"]
+ exclude = ["*tests", "*examples"]

But I'm not sure how this could be related to the failing GPU tests.

Contributor (Author):

Unfortunately, this does not work. First, without include, setuptools installs too many things (e.g. build/lib). Second, exclude seems to have no effect.
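
For concreteness, the combination under discussion would look roughly like this (a sketch only; as noted above, the exclude patterns had no effect in practice):

[tool.setuptools.packages.find]
# Keep include so that stray directories such as build/lib are not
# picked up ...
include = ["distributed_shampoo*"]
# ... and try to drop test/example packages from the wheel (this is
# the part that did not work here).
exclude = ["*tests", "*examples"]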

Contributor:

I see. Not sure why exclude has no effect though? Do you know how we can exclude *tests and *examples?

Contributor (Author):

There seem to be long-standing issues and confusion regarding exclude (e.g. pypa/setuptools#3346, pypa/setuptools#3260). I would say the easiest fix is probably to move these test directories outside of the module (they should not be there anyway).

Contributor:

Agreed, that makes sense to me. Thanks a lot for your fix and help with debugging!

Contributor:

I wonder whether this might be caused by how wildcard matching works in setuptools. Personally I am not super familiar with it, but if we really want to verify this, I could set up something minimal to test it.

@facebook-github-bot (Contributor)

@tsunghsienlee has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@tsunghsienlee (Contributor)

Thanks @francois-rozet and @runame; let's merge this for now to unblock others.

@facebook-github-bot (Contributor)

@tsunghsienlee merged this pull request in 8a3feef.

@francois-rozet deleted the patch branch November 7, 2024 13:51
@runame mentioned this pull request Nov 11, 2024
facebook-github-bot pushed a commit that referenced this pull request Nov 11, 2024
Summary:
Follow-up to #39.

This removes all `examples` and `tests` folders from the wheel, but keeps them in the sdist (by adding a `MANIFEST.in` file).

Pull Request resolved: #42

Reviewed By: anana10c

Differential Revision: D65766841

Pulled By: tsunghsienlee

fbshipit-source-id: 1bdcfad566df5ca7170e948d3193126f69fba619
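
For reference, a MANIFEST.in that keeps these folders in the sdist could look roughly like this (a sketch; the directory names are assumptions and the actual file in #42 may differ). Excluding them from the wheel is handled separately, by the package-discovery configuration:

# Hedged sketch of a MANIFEST.in; "graft" pulls a whole directory
# tree into the source distribution.
graft distributed_shampoo/examples
graft distributed_shampoo/tests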
Labels: bug (Something isn't working), CLA Signed, Merged

Merging this pull request closed: ImportError in latest version (#38)