Control of indexing DRASL pages by search engines #81

Open
hectorskypl opened this issue Jun 8, 2024 · 2 comments
Labels: enhancement (New feature or request), medium priority

Comments

@hectorskypl (Contributor) commented Jun 8, 2024

I've proposed this on the Matrix channel, so I thought I'd make the suggestion here as well.
It would be nice to have a setting to discourage search engines from indexing Drasl pages, for example via robots.txt.

I did it on my own by placing a robots.txt in public/ and adding a rewrite for it in main.go, though it could be a simple "if" instead (see the sketch at the end of this comment).

	e.Pre(middleware.Rewrite(map[string]string{
		"/authlib-injector/authserver/*":        "/auth/$1",
		"/authlib-injector/api/*":               "/account/$1",
		"/authlib-injector/sessionserver/*":     "/session/$1",
		"/authlib-injector/minecraftservices/*": "/services/$1",
		// Serve robots.txt from Drasl's public directory.
		"/robots.txt":                           "/drasl/public/robots.txt",
	}))

Content of robots.txt:

User-agent: *
Disallow: /
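
For reference, the "simple if" could look roughly like this. This is only a sketch: the EnableSearchEngineIndexing option and the app.Config access are hypothetical (no such setting exists in Drasl), and it assumes net/http and Echo are already imported in main.go.

	// Sketch: return a blanket-deny robots.txt only when a hypothetical
	// config option turns off search engine indexing.
	if !app.Config.EnableSearchEngineIndexing { // hypothetical setting name
		e.GET("/robots.txt", func(c echo.Context) error {
			return c.String(http.StatusOK, "User-agent: *\nDisallow: /\n")
		})
	}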
@Midou36O

Or simpler: make Caddy return that robots.txt directly.
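
For anyone fronting Drasl with Caddy, a sketch of that idea (the hostname, the backend address, and the directory holding robots.txt are placeholders, not Drasl defaults):

	drasl.example.com {
		# Answer /robots.txt from a static directory before anything reaches Drasl.
		handle /robots.txt {
			root * /srv/drasl-static
			file_server
		}

		# Everything else goes to the Drasl backend.
		handle {
			reverse_proxy localhost:25585
		}
	}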

@hectorskypl (Contributor, Author)

I use LSWS (LiteSpeed Web Server) as the proxy and not Docker for this.

@evan-goode added the enhancement (New feature or request) and medium priority labels on Oct 7, 2024