
Conversation

@cvanelteren
Collaborator

@cvanelteren cvanelteren commented Oct 28, 2025

This PR adds better, human-readable descriptions for the tests. We now skip test_demos.py on GitHub Actions, as it draws too much RAM and causes the entire pipeline to stall. Furthermore, we aggressively clean up after each test to ensure that no generated figures linger and cause memory leaks -- we are not sure whether that actually happens, but it is hard to see what the environment is doing on GHA.
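The CI skip described above can be expressed with a standard pytest skipif guard on the `GITHUB_ACTIONS` environment variable, which GitHub Actions sets to `"true"` for every job. A minimal sketch (the helper name is ours, not UltraPlot's):

```python
import os

def running_on_github_actions() -> bool:
    """Return True when executing inside a GitHub Actions runner."""
    # GitHub Actions sets GITHUB_ACTIONS=true in the job environment.
    return os.environ.get("GITHUB_ACTIONS") == "true"

# Hypothetical usage at the top of a test module such as test_demos.py:
# import pytest
# pytestmark = pytest.mark.skipif(
#     running_on_github_actions(),
#     reason="demo tests draw too much RAM on GHA runners",
# )
```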

I think our GitHub Actions workflows are starting to stall. As our test suite has grown, so has the overall runtime. I initially considered running the tests in parallel, and using the development branch of pytest-xdist works well locally. However, on GitHub Actions we’re limited to only two cores per worker. While this could still offer some speedup, I believe the better solution is to switch to hash-based testing for pull requests and use local validation to inspect any failed cases. Hash comparisons are significantly faster than visual ones.
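For reference, pytest-mpl supports hash-based comparison alongside full image baselines. A rough sketch of the two-step workflow (flag names as documented by pytest-mpl; the paths are placeholders, not this repo's layout):

```shell
# Generate a library of figure hashes once, on a reference machine
pytest -W ignore --mpl --mpl-generate-hash-library=baseline/hashes.json

# On CI, compare hashes instead of diffing full images
pytest -W ignore --mpl --mpl-hash-library=baseline/hashes.json
```

Failed hashes would then be inspected locally against the image baselines, as proposed above.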

@cvanelteren cvanelteren requested a review from beckermr October 28, 2025 14:36
```shell
python -c "import ultraplot as plt; plt.config.Configurator()._save_yaml('ultraplot.yml')"
pytest -W ignore --mpl-generate-path=baseline --mpl-default-style="./ultraplot.yml"
pytest -W ignore \
    --mpl-generate-path=baseline \
```
Collaborator Author

not sure if we still want to bundle the baseline images

@beckermr
Collaborator

As I said before, I think the problem is memory use, not compute time.

@cvanelteren
Collaborator Author

wouldn't this solve both?

@beckermr
Collaborator

It depends on what is causing the memory issue and what is making things slow.

  • I suspect the slowness is actually in making the plots, not in differencing the images. So I don't think moving to hashing will make the test suite faster, though I could be wrong.
  • I don't know where the memory issues are coming from. I've seen memory issues if too many figures are open at once in MPL, but I can't tell if that is the issue here. If my suspicion is true, the hashing won't help here either.

However, I'd explicitly test by making a change to the actual code and seeing what happens.

@codecov

codecov bot commented Oct 28, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.


@beckermr
Collaborator

Yeah, it appears this is a memory issue and the hash comparisons do not help. Maybe we should try forcing calls to fig.close somewhere?

@cvanelteren
Collaborator Author

I think you are right. We do have an auto close after a test already: https://github.com/Ultraplot/UltraPlot/blob/c5a017a8806b3dfb826202699ec97572869d3aa0/ultraplot/tests/conftest.py#L17C1-L20C22. Not really sure how to proceed now.

@cvanelteren
Collaborator Author

Can confirm that locally the tests (at least in parallel) use about 7 GB -- which hits the limit of the runner.

@beckermr
Collaborator

Cool. So things to think about are

  • when is the autoclose actually executed?
  • do we need to call things like gc.collect?
  • are the figures actually getting closed?
  • is there one test in particular that is using more memory than the rest?
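To answer the last bullet, peak memory can be measured per test with the standard library's tracemalloc. A minimal, framework-agnostic sketch (the function name is ours, not UltraPlot's):

```python
import tracemalloc

def peak_memory_bytes(fn, *args, **kwargs):
    """Run fn and return the peak size (in bytes) of Python-level allocations."""
    tracemalloc.start()
    try:
        fn(*args, **kwargs)
        _, peak = tracemalloc.get_traced_memory()
    finally:
        tracemalloc.stop()
    return peak

# Example: compare two workloads to spot the memory-hungry one.
small = peak_memory_bytes(lambda: [0] * 1_000)
large = peak_memory_bytes(lambda: [0] * 1_000_000)
```

One caveat: tracemalloc only sees allocations made through Python's allocator, so C-level buffers inside Matplotlib backends may not show up; an external tool like the runner's memory monitor is still needed for the full picture.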

@cvanelteren
Collaborator Author

  • when is the autoclose actually executed?
    Immediately after each test.
  • do we need to call things like gc.collect?
    I am trying this now.
  • are the figures actually getting closed?
    Yes, I checked, and I have an assert on it now.
  • is there one test in particular that is using more memory than the rest?
    Potentially the geo ones, but the recently added demo tests are also rather big.
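Putting those answers together, an aggressive cleanup could close every figure, force a garbage collection, and assert that nothing is left open. A sketch assuming Matplotlib's pyplot state machine (this is not UltraPlot's actual conftest.py):

```python
import gc

import matplotlib
matplotlib.use("Agg")  # headless backend, as on CI
import matplotlib.pyplot as plt

def close_all_figures():
    """Close every open figure, collect garbage, and verify nothing remains."""
    plt.close("all")
    gc.collect()
    assert not plt.get_fignums(), "figures survived cleanup"

# As a hypothetical pytest autouse fixture it would run after every test:
# import pytest
# @pytest.fixture(autouse=True)
# def _cleanup():
#     yield
#     close_all_figures()
```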

@cvanelteren
Collaborator Author

Ok, I think the tests were hanging on the newly added test_demos.py. I will rework this PR into a housekeeping PR, adding a few nicer statements for the build and tests without changing what the tests were originally intended to do. I will leave the aggressive cleaning in, as it won't hurt.

Switching to hashing would make a difference in terms of speed, but then we would lose the visual comparisons, which is maybe not as intended, so I will leave that out for now.

@cvanelteren cvanelteren changed the title Switch to mpl hash for tests Housekeeping for ultraplot-build.yml Oct 29, 2025
@cvanelteren cvanelteren enabled auto-merge (squash) October 29, 2025 10:20
@cvanelteren cvanelteren merged commit 33b9dac into Ultraplot:main Oct 29, 2025
25 checks passed