improve Linux wrapper scripts #474
Things can be improved even further, but I don't have the energy for a full rewrite. This change at least gets the scripts to a unified level where they're much more helpful out of the box. :)
- Unify handling of environment variables.
- Make all important variables configurable via the environment rather than having to edit the script files.
- Add support for a custom Python command in all scripts.
- Add support for venv directories with spaces in the path name (especially useful when the venv is on another disk).
- Restructure a few things into more correct Bash code.
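A minimal sketch of what that unified handling can look like. The variable names (`OT_PYTHON_CMD`, `OT_VENV_DIR`) are illustrative, not necessarily the ones the PR's scripts actually use:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of environment-configurable script variables.
set -euo pipefail

# Let users override via the environment; fall back to sane defaults.
OT_PYTHON_CMD="${OT_PYTHON_CMD:-python3}"
OT_VENV_DIR="${OT_VENV_DIR:-venv}"

# Quote every expansion so venv paths containing spaces keep working
# (e.g. a venv that lives on another disk).
echo "Python command: ${OT_PYTHON_CMD}"
echo "Venv directory: ${OT_VENV_DIR}"
```

Because every expansion is quoted, a user can point `OT_VENV_DIR` at something like `/mnt/other disk/venv` without breaking the script.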
Force-pushed from d2dc230 to f3eb9e2.
There is already another open PR that tries to do the same thing: #466
@Nerogar Ah. I had a 20-second look at the other PR now over breakfast, and its code looks really bad so far. It passes data through temp files on disk rather than "export", and it uses a bad way of fetching variables. That was just a first glance where I read maybe 20% of the PR; I'll have a closer look when I have free time. Maybe I'll end up rewriting the whole scripts instead, with much bigger improvements and clean, robust code. I just didn't have time at first. There is so much that could be improved with a blank slate.
It stores one variable because it is used for the updates, in case a user chooses to use a different environment than the check detects (NV, AMD, default). I'm not opposed if you implement it in any other way, just saying why things were done the way they were. Maybe reading things through and actually understanding why things are how they are would help avoid duplication.
@Lolzen Hey, thanks for adding some details! I can see the use-case for specifying a different requirements.txt file than what the check can auto-detect, so I'll be sure to add that feature too. :) But as an environment variable rather than an on-disk file. I'm going to tag you in the updated version of this PR so you can provide feedback and see if there's any other feature you think we need. I'm starting the rewrite without looking at either the original scripts or any other changes, just to start from a clean slate with best practices. Then I'll bring in features from the original scripts, similar changes to this PR, and your idea for a configurable requirements file.

Edit: After looking at OneTrainer's current scripts, it's pretty impressive... how there's practically not a single thing correct about how the old scripts were written.
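A hedged sketch of such an env-var override. The variable name (`OT_REQUIREMENTS`) and the `detect_backend` helper are hypothetical stand-ins, not the PR's actual code:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: let the user pick a requirements file via the
# environment, falling back to auto-detection (NV, AMD, default).
set -euo pipefail

detect_backend() {
    # Stand-in for the real GPU-detection logic.
    echo "default"
}

# Environment override wins; otherwise derive from the detected backend.
OT_REQUIREMENTS="${OT_REQUIREMENTS:-requirements-$(detect_backend).txt}"

echo "Would install from: ${OT_REQUIREMENTS}"
```

A user who wants a different environment than the detection picks simply runs the script with `OT_REQUIREMENTS=requirements-rocm.txt` (or similar) set, with no temp file on disk involved.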
@Arcitec awesome! As my first and main language is Lua, I'm way more versed therein, so I am pleased if you can improve the scripts by whatever means. Thanks.
Couldn't get it to enter a conda env shell otherwise.
Was addressed in the closed PR; simple oversight.
Not generally, but if a git pull is done beforehand and the requirements are updated, it should force a reinstall of the newer versions. Oriented by how sd-ui-forge handled it.
And on Windows you have to double-click the .bat files. Now that's nitpicking.
Python 2 was still the default in e.g. Ubuntu
You should never enter a conda shell from a shell script. That's meant for interactive sessions where the user is typing.
It doesn't. Doing it that way just reinstalls the cached requirements from disk, as long as they still satisfy the constraints, meaning it does not actually update anything. Looking at other projects' mistakes can often lead to accidents like this.
On Mac and Linux, we don't double-click shell scripts most of the time. Most desktop environments aren't even configured to handle double-clicking of shell scripts at all. We have a terminal open and run the script, often from a path that's not directly inside the project directory. So one of the core things to do in a shell script is to first ensure that the working directory is the project directory, so that all resources can be found.
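A common Bash idiom for that (a sketch, not necessarily the PR's exact code):

```shell
#!/usr/bin/env bash
# Make the script usable from any working directory by first cd-ing
# to the directory containing the script itself.
set -euo pipefail

# Resolve the script's own directory, handling relative invocations.
SCRIPT_DIR="$(cd -- "$(dirname -- "${BASH_SOURCE[0]:-$0}")" && pwd)"
cd -- "$SCRIPT_DIR"

echo "Working directory is now: $PWD"
```

After this, relative paths like `requirements.txt` or `venv/` resolve against the project directory regardless of where the user invoked the script from.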
That's unrelated to how the check was being done. Anyway, closing this now. Will submit the complete overhaul pull request in a moment. :)
I can't comment on most of these things. I'm using Windows and don't have much experience with Linux.
There are two issues with that. The first is that it only downloads newer dependencies if the constraints have changed in a way that forces an upgrade (i.e. the version constraints in requirements.txt were changed).
Yeah, by default, pip's upgrade is very conservative and will not upgrade anything that's already installed (if requirements.txt hasn't changed its constraints). The correct command is to tell it to eagerly upgrade to the latest compatible version of each dependency. The command for that is:
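A probable shape of that command, using pip's standard `--upgrade-strategy` flag (the PR's exact invocation may differ, and the sketch only prints the command rather than running it):

```shell
#!/usr/bin/env bash
# Probable shape of the eager-upgrade command described above.
set -euo pipefail

# REQS path is illustrative; real scripts may select a different file.
REQS="${REQS:-requirements.txt}"

# --upgrade alone leaves already-satisfying packages untouched; adding
# --upgrade-strategy eager upgrades every dependency to the newest
# version that still satisfies the constraints.
PIP_CMD=(python -m pip install --upgrade --upgrade-strategy eager -r "$REQS")

echo "Would run: ${PIP_CMD[*]}"
```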
This achieves two things:
Furthermore, that exact command can be used for the initial fresh install too. There's no need to separate the upgrade and install commands; both can use this exact pip command. If the user doesn't have any packages yet, it installs them as usual. If they had a few outdated packages already installed, it ensures they get upgraded during the "fresh install" too. So it's a win-win. I also recommend always running

The Windows batch scripts could be improved with these changes as well. :)

Edit: Here's an exact example of the commands that are being executed by the new Mac/Linux scripts. In Venv:
In Conda (it's the exact same command as Venv, but executed within
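A hedged sketch of dispatching that identical pip command in both environment types. The names (`VENV_DIR`, `CONDA_ENV`) are illustrative, and the sketch only prints what it would run:

```shell
#!/usr/bin/env bash
# Illustrative sketch: the same pip command in Venv and Conda.
set -euo pipefail

VENV_DIR="${VENV_DIR:-venv}"
CONDA_ENV="${CONDA_ENV:-onetrainer}"
PIP_ARGS="install --upgrade --upgrade-strategy eager -r requirements.txt"

if [ -x "${VENV_DIR}/bin/python" ]; then
    # Venv: call the venv's own interpreter directly; no "activate" needed.
    CMD="\"${VENV_DIR}/bin/python\" -m pip ${PIP_ARGS}"
else
    # Conda: run the exact same command inside the named environment.
    CMD="conda run -n \"${CONDA_ENV}\" python -m pip ${PIP_ARGS}"
fi

echo "Would run: ${CMD}"
```

Calling the venv's interpreter directly (instead of sourcing `activate`) keeps the script non-interactive, and `conda run` serves the same purpose for conda environments.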
Does this mean reinstalling Torch will be a thing of the past?
lol, yes. :)
Edit: Superseded by #477