Have option to exclude web crawlers from session tracking #4588
Comments
Assigning to @getsentry/support for routing ⏲️
I think the issue is really only filtering, not sampling, at least for this issue.
Routing to @getsentry/product-owners-ingestion-and-filtering for triage ⏲️
Applying existing filter rules to sessions shouldn't be too hard. @michaelchai-sentry, is this a commonly encountered feature request?
Hey team, adding a comment as this was brought up again by a customer. I also think extending the filtering functionality to apply to sessions would probably solve #69105 as well.
I'll add it to our triage.
Problem Statement
Sentry's session tracking and crash free rates don't take filtering or sampling into account. A customer pointed out that, since there is already an option to filter out web crawler errors, they would similarly like to exclude sessions coming from web crawlers from session tracking. This would make their crash free rates much more meaningful and reflective of actual user sessions.
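As a stopgap on the client side, a customer could avoid starting sessions for crawler traffic at SDK init time. The sketch below is illustrative only: the crawler regex is a made-up example, and the `autoSessionTracking` option is assumed to be available in the browser SDK version in use.

```typescript
import * as Sentry from "@sentry/browser";

// Hypothetical list of crawler user-agent fragments; not an official Sentry list.
const CRAWLER_UA = /bot|crawler|spider|googlebot|bingbot|slurp/i;

// Only start session tracking for what looks like a real browser session.
const isCrawler = CRAWLER_UA.test(navigator.userAgent);

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0",
  // autoSessionTracking is assumed here; availability depends on SDK version.
  autoSessionTracking: !isCrawler,
});
```

This only helps projects that can ship an SDK change; the feature request is for the server-side filter to handle it so crash free rates are corrected for everyone.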
Solution Brainstorm
Leverage the existing "Filter out known web crawlers" functionality and apply it to sessions as well.
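A rough sketch of the idea: when the crawler inbound filter is enabled for a project, apply the same user-agent check to incoming session payloads before they are counted toward crash free rates. The type, field names, and crawler list below are assumptions for illustration, not Sentry's actual filter implementation.

```typescript
// Illustrative only: mirrors the "Filter out known web crawlers" inbound filter,
// applied to session payloads instead of error events.
interface SessionEnvelopeHeaders {
  userAgent?: string;
}

// Hypothetical crawler pattern; the real filter maintains its own list.
const KNOWN_CRAWLERS = /bot|crawler|spider|slurp|mediapartners/i;

function shouldDropSession(
  headers: SessionEnvelopeHeaders,
  filterCrawlersEnabled: boolean
): boolean {
  if (!filterCrawlersEnabled || !headers.userAgent) {
    return false;
  }
  // Drop the session so it never counts toward crash free rate calculations.
  return KNOWN_CRAWLERS.test(headers.userAgent);
}
```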
Product Area
Ingestion and Filtering