MSC4276: Soft unfailure for self redactions #4276

Open · wants to merge 2 commits into base: main
48 changes: 48 additions & 0 deletions proposals/4276-soft-unfail-redactions.md
Member Author

Implementation requirements:

  • Server

@@ -0,0 +1,48 @@
# MSC4276: Soft unfailure for self redactions

When a user is removed from a room, the server may issue several redactions to their messages to clean
up rooms. Users may also use similar functionality, if supported by their server, to remove their own
messages after being removed.
Comment on lines +3 to +5
Contributor

What is the context for a remote server or remote user redacting their own messages?

Member Author

Some servers apply redactions as an erasure technique.

Contributor

But why are the servers erasing messages? Can this please be added to the context in the MSC?

Comment on lines +3 to +5
Contributor

What does "remove" mean explicitly? Does it mean kicked, banned, or both?

Member Author

No longer joined.

Contributor

Can this please be made explicit in the text?

Comment on lines +3 to +5
Contributor

If it means kicked or banned, then is good faith being assumed in remote users and servers?

Contributor

Ok, I guess the context is something like this: a user has registered on a server for use as a spam/brigade throwaway. This user has been banned by a bunch of remote servers. The server admins discover the throwaway and send redactions from the same user (hijacking the account) to clean up and try to undo some of the damage. In the rooms where the user has been banned, the redactions will have to use prior state to authorize, and will soft fail.

So I guess "good faith" here means both discovery of the user throwaway by server admins, and then also their cooperation. But that's not the right way to frame the situation. This is really a proposal to assist server admins in cleaning up accounts that have violated their ToS.

Member Author

I'm not following the concern here, sorry.


These self-redactions can end up being [soft failed](https://spec.matrix.org/v1.13/server-server-api/#soft-failure)
because the authorization rules reject the redaction event against the room's current state, even though it
sits at a legal point in the DAG.

This proposal suggests that servers be less strict about soft failing self-redactions in particular.

Comment

I think servers should be more strict about self-redactions due to legal requirements, room policy (it would suck if someone removed useful information or framed someone by removing context), or moderation purposes (reporting messages after a ban, machine learning, etc.).


## Proposal

When evaluating for soft failure, servers SHOULD permit `m.room.redaction` events which target events
sent by the same `sender` to go through normally, avoiding soft failure. Servers MAY impose limits on how
many events can bypass soft failure. Limits similar to the following may be of interest to developers:

* Only the first 300 redactions from the same `sender` bypass soft failure.
* Only the first redaction targeting a given event bypasses soft failure. "Duplicate" redactions are
subject to soft failure.
* Only the redactions received within 1 hour of the most recent membership event change can bypass
soft failure.
Comment on lines +22 to +23
Contributor

What tooling is expected to be deployed by remote users or servers such that they send redactions within 1 hour?

Member Author

This is to deal with possible federation delays, not tooling delays. The redactions may take a little while to send.

Contributor
@Gnuxie Gnuxie Mar 20, 2025

I guess the problem I'm trying to communicate to you here is that if the user parts from a room an hour before the server admin responsible for them discovers that the account was used for abuse, then the redactions will be soft failed.

Additionally, not all server admins respond within 1 hour; the user may already have been banned hours and hours before being addressed by a server admin, potentially leaving soft failed media (hi, race conditions) in place for other servers in the room.


Applying some or all of these conditions may help servers prevent redaction spam from reaching local
users over `/sync`.
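
The example limits above lend themselves to a simple gating function. The following is a minimal,
hypothetical sketch of such a check, not normative text and not any real homeserver's API: the function
name, event representation, counters, and timestamp arguments are assumptions made purely for illustration.

```python
# Hypothetical sketch of the soft-failure exemption and example limits described
# above. The event dicts, counters, and timestamp arguments are illustrative only.

from typing import Optional

MAX_EXEMPT_REDACTIONS = 300              # example limit: first 300 redactions per sender
MEMBERSHIP_WINDOW_MS = 60 * 60 * 1000    # example limit: 1 hour after last membership change


def may_bypass_soft_failure(
    event: dict,
    target_event: Optional[dict],
    exempted_so_far_for_sender: int,
    target_already_redacted: bool,
    last_membership_change_ts: Optional[int],
) -> bool:
    """Decide whether a redaction that would otherwise soft fail may be let through."""
    if event.get("type") != "m.room.redaction":
        return False

    # Only self-redactions qualify: the redaction and its target must share a sender.
    if target_event is None or event["sender"] != target_event["sender"]:
        return False

    # Example limit 1: only the first N redactions from this sender bypass soft failure.
    if exempted_so_far_for_sender >= MAX_EXEMPT_REDACTIONS:
        return False

    # Example limit 2: "duplicate" redactions of an already-redacted event stay soft failed.
    if target_already_redacted:
        return False

    # Example limit 3: only redactions received within 1 hour of the sender's most
    # recent membership change bypass soft failure.
    if last_membership_change_ts is not None:
        if event["origin_server_ts"] - last_membership_change_ts > MEMBERSHIP_WINDOW_MS:
            return False

    return True
```

A check like this would only run after an event has already failed authorization against the current
room state, so a `True` result means "deliver despite soft failure", not "skip authorization".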

## Potential issues

This may allow banned users to spam rooms with redaction events - the limits proposed above are potential
ways to mitigate the issue.

## Alternatives

Another approach could be to modify auth rules to exempt same-sender `m.room.redaction` events from the requirement
to pass authorization at the current resolved state. This approach may not work well with [how redactions work](https://spec.matrix.org/v1.13/rooms/v11/#handling-redactions).
Comment on lines +33 to +36
Contributor
@Gnuxie Gnuxie Mar 20, 2025

MSC4194 is an alternative that does not require good faith in remote users and servers. For use by room moderators.

Member Author

MSC4194 requires this MSC in order to work - it's not an alternative.

Contributor
@Gnuxie Gnuxie Mar 20, 2025

MSC4194 is for use by room moderators: the moderator sends the redactions, not the target user. So it does not require this MSC, because the redaction events all use valid auth state. This is true even when redactions are sent for events that have been soft failed locally.
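
For contrast with the soft-failure exemption sketched earlier, the auth-rule alternative from the
Alternatives section above would move the exemption into the check itself rather than applying it
afterwards. A minimal, hypothetical sketch, with the current-state authorization check abstracted
behind a callable placeholder (none of these names are real homeserver APIs):

```python
# Hypothetical sketch of the auth-rule alternative from the Alternatives section.
# The callable standing in for "passes auth against the current resolved state"
# is a placeholder, not a real homeserver API.

from typing import Callable, Optional


def passes_soft_failure_check(
    event: dict,
    target_event: Optional[dict],
    passes_current_state_auth: Callable[[dict], bool],
) -> bool:
    """Return True if the event should NOT be soft failed."""
    # Alternative approach: same-sender redactions are exempted from the
    # current-state authorization requirement entirely.
    if (
        event.get("type") == "m.room.redaction"
        and target_event is not None
        and target_event["sender"] == event["sender"]
    ):
        return True

    # Every other event must still pass authorization against the room's
    # current resolved state to avoid soft failure.
    return passes_current_state_auth(event)
```

As the Alternatives section notes, this may not sit well with how redactions are currently handled,
which is part of why the main proposal relaxes soft failure instead.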


## Security considerations

See Potential Issues.

## Unstable prefix

Not applicable.

## Dependencies

This proposal has no direct dependencies.