Housekeeping for ultraplot-build.yml #390
Conversation
python -c "import ultraplot as plt; plt.config.Configurator()._save_yaml('ultraplot.yml')"
pytest -W ignore --mpl-generate-path=baseline --mpl-default-style="./ultraplot.yml"
pytest -W ignore \
  --mpl-generate-path=baseline \
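For context, the --mpl-generate-path=baseline step above writes the reference images that pytest-mpl comparison tests are checked against. The sketch below shows roughly what such a test looks like; the test name and plot contents are invented for illustration, and only the decorator arguments mirror the flags used in the workflow.

```python
# Illustrative pytest-mpl comparison test (hypothetical, not from the repo).
import pytest
import ultraplot as plt


@pytest.mark.mpl_image_compare(baseline_dir="baseline", style="./ultraplot.yml")
def test_example_line_plot():
    fig, ax = plt.subplots()
    ax.plot([0, 1, 2], [0, 1, 4])
    # pytest-mpl renders the returned figure and compares it to the baseline PNG
    return fig
```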
Not sure if we still want to bundle the baseline images.
As I said before, I think the problem is memory use, not compute time.
Wouldn't this solve both?
It depends on what is causing the memory issue and what is making things slow.
However, I'd explicitly test by making a change to the actual code and seeing what happens.
Codecov Report: ✅ All modified and coverable lines are covered by tests.
Yeah, it appears this is a memory issue and the hash comparisons do not help. Maybe we should try forcing calls to fig.close somewhere?
I think you are right. We do have an auto-close after each test already: https://github.com/Ultraplot/UltraPlot/blob/c5a017a8806b3dfb826202699ec97572869d3aa0/ultraplot/tests/conftest.py#L17C1-L20C22. Not really sure how to proceed now.
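For reference, a minimal sketch of what such an auto-close hook can look like is below; this is the general pattern, not a copy of the linked conftest.py.

```python
# Minimal sketch of an autouse fixture that force-closes figures after every
# test so they cannot accumulate across the session (assumes matplotlib).
import matplotlib.pyplot as plt
import pytest


@pytest.fixture(autouse=True)
def close_all_figures():
    yield
    plt.close("all")  # release every figure the test left open
```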
Can confirm that locally the tests (at least when run in parallel) use about 7 GB of memory, which hits the limit of the runner.
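One way to check this locally (assuming psutil is installed; the helper name is made up) is to log the resident memory of the test process:

```python
# Rough helper for spotting memory growth while running the suite locally.
import os

import psutil


def rss_gb() -> float:
    """Resident set size of the current process, in GB."""
    return psutil.Process(os.getpid()).memory_info().rss / 1e9


# e.g. print(f"memory in use: {rss_gb():.2f} GB") between tests or in a hook
```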
Cool. So things to think about are
OK, I think the tests were hanging on the newly added […]. Switching to hashing will make a difference in terms of speed, but then we don't have any visuals, which is maybe not as intended, so I will leave that out for now.
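To illustrate what switching to hashing buys: instead of storing and diffing baseline PNGs, only a short digest of the rendered bytes is kept and compared. The sketch below is conceptual and does not use pytest-mpl's own hash machinery.

```python
# Conceptual hash comparison: cheap to store and compare, but it gives no
# visual diff when a test fails. Function names here are illustrative only.
import hashlib
import io


def figure_hash(fig) -> str:
    """Render a figure to PNG in memory and return its SHA-256 digest."""
    buf = io.BytesIO()
    fig.savefig(buf, format="png")
    return hashlib.sha256(buf.getvalue()).hexdigest()


def matches_baseline(fig, expected_hash: str) -> bool:
    """Compare the rendered figure against a stored digest instead of an image."""
    return figure_hash(fig) == expected_hash
```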
ultraplot-build.yml
This PR adds better, human-readable descriptions for the tests. We now skip test_demos.py on GitHub Actions, as those tests draw too much RAM and cause the entire pipeline to stall. Furthermore, we aggressively clean up after each test to ensure that none of the generated figures stray and cause memory leaks; we are not sure whether that happens, but it is hard to see what the environment is doing on GHA.

I think our GitHub Actions workflows are starting to stall. As our test suite has grown, so has the overall runtime. I initially considered running the tests in parallel, and using the development branch of pytest-xdist works well locally. However, on GitHub Actions we're limited to only two cores per worker. While this could still offer some speedup, I believe the better solution is to switch to hash-based testing for pull requests and use local validation to inspect any failed cases. Hash comparisons are significantly faster than visual ones.
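As a rough sketch of how the skip could be wired up (the exact mechanism used in this PR may differ), a module-level marker keyed on the GITHUB_ACTIONS environment variable is enough to skip test_demos.py on CI:

```python
# Hypothetical module-level skip for a memory-heavy test file on GitHub Actions;
# GitHub Actions sets GITHUB_ACTIONS=true on its runners.
import os

import pytest

pytestmark = pytest.mark.skipif(
    os.environ.get("GITHUB_ACTIONS") == "true",
    reason="test_demos draws too much RAM on GitHub Actions runners",
)
```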