Increase max ssh connections on podman machine #23920

Open
wbrefvem opened this issue Sep 10, 2024 · 6 comments
Labels: kind/feature, machine, stale-issue

wbrefvem (Contributor) commented Sep 10, 2024

Feature request description

Currently, podman machine uses sshd's default of 10 max SSH sessions, which causes errors in situations like the one described in kubernetes-sigs/kind#3742.

Suggest potential solution

Manually increasing the limits through MaxSessions and MaxStartups in the sshd config works, so automatically raising them when creating the podman machine should be a straightforward approach.
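
For reference, a minimal sketch of the manual workaround on an existing machine might look like the following. It assumes a Fedora CoreOS guest with an /etc/ssh/sshd_config.d drop-in directory; the drop-in file name and the exact limit values are illustrative, not a recommendation.

```sh
# Sketch of a manual workaround (assumes a Fedora CoreOS guest image with an
# /etc/ssh/sshd_config.d drop-in directory; file name and values are illustrative).
podman machine ssh "echo 'MaxSessions 100' | sudo tee /etc/ssh/sshd_config.d/99-max-ssh.conf"
podman machine ssh "echo 'MaxStartups 100:30:200' | sudo tee -a /etc/ssh/sshd_config.d/99-max-ssh.conf"
# Restart sshd in the guest so the new limits take effect.
podman machine ssh "sudo systemctl restart sshd"
```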

wbrefvem added the kind/feature label on Sep 10, 2024
baude (Member) commented Sep 13, 2024

@jakecorrenti you were the most recent to work on the SSH code; what do you think here?

jakecorrenti (Member) commented

If I'm understanding this correctly, it's a matter of changing the /etc/ssh/sshd_config file in the guest?

If this is something we want to support, I could see it looking like podman machine init --max-ssh-sessions or podman machine set --max-ssh-sessions. We could even just have some documentation about how to do this manually.
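
Purely as an illustration of that proposal (neither flag exists in podman today), usage might look like:

```sh
# Hypothetical usage of the proposed option; neither flag exists in podman today.
podman machine init --max-ssh-sessions 100
podman machine set --max-ssh-sessions 100 podman-machine-default
```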

I don't know enough about this to determine if there are any security or performance implications that we need to be concerned about.

Luap99 (Member) commented Sep 17, 2024

I don't think it needs to be configurable; given that we only allow access from localhost anyway, I see no real security risk or performance problem here.

The question is more what limit do we want to set? 1000, more, less...?

I am not sure what the exact limit is. I was able to run more than 10 podman-remote processes concurrently, but it started to fail as I approached 100, so I can reproduce the problem; the question is what limit is reasonable.
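
For anyone trying to reproduce this, a rough sketch along those lines (it assumes the podman machine is already the active remote connection, and the point at which requests start failing varies from run to run):

```sh
# Rough reproduction sketch: run many remote commands concurrently and report
# which ones fail. Assumes the podman machine is the active remote connection.
for i in $(seq 1 100); do
  ( podman --remote ps >/dev/null 2>&1 || echo "request $i failed" ) &
done
wait
```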

jakecorrenti (Member) commented

I think answering "what is reasonable" is difficult. There's always going to be a user who isn't satisfied with the default we choose.

This is the first time I've seen an issue come up with this, so I'm leaning towards leaving the sshd config as is. We can always direct users on how to change the sshd config, or we can add some documentation.

@Luap99 @baude wdyt

Luap99 (Member) commented Sep 23, 2024

> This is the first time I've seen an issue come up with this, so I'm leaning towards leaving the sshd config as is. We can always direct users on how to change the sshd config, or we can add some documentation.

Sure, but there is always a first time for any problem or bug, and I don't see any negatives to increasing the limits. From limited testing, after a certain point connections are "randomly dropped" (i.e. there doesn't seem to be a fixed number after which it starts failing), which is very hard for users to debug or understand. That just leads to a lot of wasted debugging time on all sides (even for us when they report bugs).

As such I think having a higher limit is a positive thing. Another alternative could be to switch the remote client, when used with machine, over to the exposed API unix socket instead of an SSH connection per CLI command.
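
For context, that forwarded socket is already reachable manually on the host today. A hedged sketch of talking to it directly instead of going through SSH (the inspect format field below reflects recent podman versions and may differ):

```sh
# Sketch: talk to the forwarded API unix socket on the host instead of opening
# an SSH connection per CLI command. The inspect field may differ across
# podman versions.
SOCK=$(podman machine inspect --format '{{ .ConnectionInfo.PodmanSocket.Path }}')
podman --remote --url "unix://$SOCK" ps
```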


A friendly reminder that this issue had no activity for 30 days.
