Client-side of static invoice server #3618

Open

valentinewallace wants to merge 14 commits into main from 2025-02-static-inv-server-client

Conversation

@valentinewallace (Contributor) commented Feb 24, 2025

As part of being an async recipient, we need to interactively build an offer and static invoice with an always-online node that will serve static invoices on our behalf in response to invoice requests from payers.

While users could build this invoice manually, the plan is for LDK to automatically build it for them using onion messages. See this doc for more details on the protocol. Here we implement the client side of the linked protocol.

See lightning/bolts#1149 for more information on async payments.

Partially addresses #2298
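To make the flow easier to review, here is a heavily simplified sketch of the client-side state machine. Types are reduced to `String` placeholders; the real code uses `BlindedMessagePath`, `Offer`, and `StaticInvoice`, and is driven by the handler methods added in this PR.

```rust
// Illustrative only: the recipient's ("client's") side of the interactive offer build.
enum ServerMessage {
    // Server -> us: blinded paths the server will accept invoice requests on.
    OfferPaths { paths: Vec<String> },
    // Server -> us: it has persisted the invoice we sent in ServeStaticInvoice.
    StaticInvoicePersisted,
}

enum OfferBuildState {
    // We sent OfferPathsRequest to the server over the configured paths.
    PathsRequested,
    // We built an offer plus static invoice from the server's paths and sent
    // ServeStaticInvoice back over the reply path.
    InvoiceSent { offer: String },
    // The server confirmed persistence; the offer is now safe to hand out.
    OfferCached { offer: String },
}

fn on_server_message(state: OfferBuildState, msg: ServerMessage) -> OfferBuildState {
    match (state, msg) {
        (OfferBuildState::PathsRequested, ServerMessage::OfferPaths { paths }) => {
            let offer = format!("offer containing {} server-provided paths", paths.len());
            OfferBuildState::InvoiceSent { offer }
        },
        (OfferBuildState::InvoiceSent { offer }, ServerMessage::StaticInvoicePersisted) => {
            OfferBuildState::OfferCached { offer }
        },
        // Unexpected messages leave the state unchanged.
        (state, _) => state,
    }
}
```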

@valentinewallace (Contributor Author)

Will go through the commits in a bit more detail before taking this out of draft, but conceptual feedback is welcome, whether on the protocol itself or on the way the code is organized overall. It does add a significant amount of code to ChannelManager currently.

@TheBlueMatt (Collaborator) left a comment

Hmmm, I wonder if we shouldn't allow the client to cache N offers rather than only 1. I worry a bit about the privacy implications of having One Offer that gets reused across different contexts.

@valentinewallace (Contributor Author)

> Hmmm, I wonder if we shouldn't allow the client to cache N offers rather than only 1. I worry a bit about the privacy implications of having One Offer that gets reused across different contexts.

I think that makes sense, so they would interactively build and cache a few and then randomly(?) return one of them on get_cached_async_receive_offer?

It seems reasonable to save for follow-up although I could adapt the AsyncReceiveOffer cache struct serialization for this now.

Comment on lines 12709 to 12714
// Expire the offer at the same time as the static invoice so we automatically refresh both
// at the same time.
let offer_and_invoice_absolute_expiry = Duration::from_secs(core::cmp::min(
offer_paths_absolute_expiry.as_secs(),
duration_since_epoch.saturating_add(STATIC_INVOICE_DEFAULT_RELATIVE_EXPIRY).as_secs()
));
@valentinewallace (Contributor Author)

One thing I want to address eventually (but maybe not in this PR) is that right now we cap the expiry of our offer/static invoice at 2 weeks, which doesn't work well for the "offer in Twitter bio" use case. Probably we can add something to UserConfig for this, and expose a method for users to proactively rotate their offer if it never expires?
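Something roughly like the following, maybe, purely hypothetical and just to make the idea concrete:

```rust
// Hypothetical UserConfig knob (not in this PR): `None` could mean "use
// STATIC_INVOICE_DEFAULT_RELATIVE_EXPIRY", while a very large value would support the
// offer-in-Twitter-bio case, paired with a method to proactively rotate the offer.
pub async_receive_offer_relative_expiry: Option<core::time::Duration>,
```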

@TheBlueMatt (Collaborator)

I wonder if for that we shouldn't try to come up with a scheme to allow the offer to last longer than the static invoice? I mean, ideally an offer lasts at least a few years, and it kinda could, because you just care about the storage server being reliable; you don't care much about the static invoice.

@valentinewallace (Contributor Author)

That makes sense. We could include another set of long-lived paths in the OfferPaths message that allows the recipient to refresh their invoice later while keeping the same offer [paths].

@TheBlueMatt (Collaborator)

I mean maybe the OfferPaths paths should just be super long-lived? I don't see a strong reason to have a separate concept of long-lived paths?

@valentinewallace (Contributor Author)

Even if the OfferPaths paths are super long-lived, we still need a way for the recipient to update their static invoice later. So the additional paths would be for that purpose, is my thinking.

@TheBlueMatt (Collaborator)

Oh vs just having the original paths be long-lived? I guess we could, but it seems like we could just make all the paths long-lived?

@valentinewallace (Contributor Author)

Gotcha. In the current PR, the recipient sends PersistStaticInvoice over the reply path to the OfferPaths message, and that reply path is short-lived.

So we could make that reply path long-lived instead and have the recipient cache that reply path to update their invoice later. Just to confirm, that's what you're suggesting?

@TheBlueMatt (Collaborator)

Yea, that's what I was thinking. Basically just make it a "multi-reply path"

@valentinewallace (Contributor Author)

That makes sense. It means we won't get extra message attempts over alternate blinded paths, but that might be a premature optimization anyway, hard to tell.

@TheBlueMatt (Collaborator) commented Feb 25, 2025

> I think that makes sense, so they would interactively build and cache a few and then randomly(?) return one of them on get_cached_async_receive_offer?

Yea, I dunno what to do for the fetch'er, maybe we just expose the whole list?

> It seems reasonable to save for follow-up although I could adapt the AsyncReceiveOffer cache struct serialization for this now.

Makes sense, tho I imagine it would be a rather trivial diff, no?

@jkczyz self-requested a review on February 27, 2025 18:10
@valentinewallace added the "weekly goal" label (Someone wants to land this this week) on Feb 27, 2025
Comment on lines +568 to +592
const IV_BYTES: &[u8; IV_LEN] = b"LDK Offer Paths~";
let mut hmac = expanded_key.hmac_for_offer();
hmac.input(IV_BYTES);
hmac.input(&nonce.0);
hmac.input(ASYNC_PAYMENTS_OFFER_PATHS_INPUT);
Contributor

Do we need to include path_absolute_expiry?

@valentinewallace (Contributor Author)

I thought the nonce/IV was sufficient but I'm not certain. @TheBlueMatt would it be an improvement to commit to the expiry in the hmac? IIUC the path still can't be re-purposed...
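If we did want to commit to it, I'd expect it to just be an extra input on both the creation and verification sides, something like the sketch below (not in this PR):

```rust
// Sketch only: commit to the path's absolute expiry so verification also checks the
// path was created with this exact expiry. The same bytes would have to be fed into
// the hmac when it is rebuilt for verification.
hmac.input(&path_absolute_expiry.as_secs().to_be_bytes());
```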

@valentinewallace marked this pull request as ready for review on March 4, 2025 21:14
@valentinewallace (Contributor Author)

Going to base this on #3640. Will finish updating the ser macros based on those changes and push updates here after finishing some review.

@valentinewallace removed the "weekly goal" label on Mar 4, 2025
@vincenzopalazzo (Contributor) left a comment

Added a couple of comments that I found while working on the CI failure in #3593.

@valentinewallace (Contributor Author)

Pushed some updates after moving the async receive offer cache into the new OffersMessageFlow struct added in #3639.

@elnosh left a comment

New to the codebase but interested in following async payments. From reading the explanation in the commit messages, the protocol/flow between the async recipient and the always-online node to build the static invoice and offer made sense. Overall the code changes look good to me.

Comment on lines +41 to +44
fn handle_offer_paths_request(
&self, message: OfferPathsRequest, context: AsyncPaymentsContext,
responder: Option<Responder>,
) -> Option<(OfferPaths, ResponseInstruction)>;
@elnosh

I see it is similar to other message handler traits in the OnionMessenger, but I was wondering why these handle_ methods return Option instead of Result?

@valentinewallace (Contributor Author)

Good question, I wrote that code forever ago but I think it was just consistency with the other onion message handler traits at the time. Fine to switch if reviewers prefer, although I might punt since the handle_held_htlc_available instance within the async payments trait is pre-existing...
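For concreteness, a Result-flavored version of the snippet above would presumably look something like this (purely illustrative; the error type is a placeholder):

```rust
// Hypothetical alternative, letting the messenger distinguish "nothing to respond
// with" from "handling failed":
fn handle_offer_paths_request(
    &self, message: OfferPathsRequest, context: AsyncPaymentsContext,
    responder: Option<Responder>,
) -> Result<Option<(OfferPaths, ResponseInstruction)>, ()>;
```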

Comment on lines +257 to +260
Self::OfferPathsRequest(_) => OFFER_PATHS_REQ_TLV_TYPE,
Self::OfferPaths(msg) => msg.tlv_type(),
Self::ServeStaticInvoice(msg) => msg.tlv_type(),
Self::StaticInvoicePersisted(_) => INVOICE_PERSISTED_TLV_TYPE,
@elnosh

Why do some use the const directly here and others get the const set through the tlv_type on the msg?

@valentinewallace (Contributor Author) commented May 19, 2025

The variants that return consts correspond to messages that don't implement the OnionMessageContents trait, so they don't have the tlv_type method available. Looks like docs are a bit lacking here but the OnionMessageContents trait implementation seems to only be needed for onion messages that are sent in direct response to other onion messages.

@valentinewallace force-pushed the 2025-02-static-inv-server-client branch from 5455d55 to f8023ca on May 21, 2025 00:11
@valentinewallace (Contributor Author)

Rebased on the latest version of #3639

codecov bot commented May 21, 2025

Codecov Report

Attention: Patch coverage is 78.60200% with 150 lines in your changes missing coverage. Please review.

Project coverage is 89.68%. Comparing base (101aa6f) to head (918ed61).
Report is 7 commits behind head on main.

Files with missing lines                            Patch %   Lines
lightning/src/ln/channelmanager.rs                   55.00%   52 Missing and 2 partials ⚠️
lightning/src/onion_message/async_payments.rs         0.00%   29 Missing ⚠️
lightning/src/onion_message/functional_tests.rs       0.00%   20 Missing ⚠️
lightning/src/offers/flow.rs                         96.52%   11 Missing and 6 partials ⚠️
lightning/src/ln/peer_handler.rs                      0.00%   15 Missing ⚠️
lightning/src/offers/async_receive_offer_cache.rs    31.81%   15 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3618      +/-   ##
==========================================
- Coverage   89.76%   89.68%   -0.09%     
==========================================
  Files         159      161       +2     
  Lines      128828   129266     +438     
  Branches   128828   129266     +438     
==========================================
+ Hits       115644   115931     +287     
- Misses      10503    10628     +125     
- Partials     2681     2707      +26     

☔ View full report in Codecov by Sentry.

@valentinewallace force-pushed the 2025-02-static-inv-server-client branch 3 times, most recently from d1cc154 to 68fd751 on May 22, 2025 22:52
@valentinewallace (Contributor Author)

Pushed some minor fixes for CI.

@valentinewallace force-pushed the 2025-02-static-inv-server-client branch from 68fd751 to a5e4718 on May 22, 2025 23:21
shaavan and others added 14 commits May 27, 2025 16:34
Document that MessageForwardNode must represent a node that supports
the onion messages feature in order to be used in blinded reply paths.

Encapsulates logic for fetching peers used in blinded path creation.
Reduces duplication and improves reusability across functions.

`OffersMessageFlow` is a mid-level abstraction for handling
BOLT12 messages and flow control. It provides utilities to
help implement Offer Message Handlers in a cleaner, more modular
way.

The core motivation is to decouple Onion Messaging logic from
`ChannelManager`, reducing its responsibilities and code overhead.
This separation improves clarity, maintainability, and lays the
groundwork for giving users greater flexibility in customizing
their BOLT12 message flows.

These functions will be used in the following commit to replace closure usage
in Flow trait functions.

As part of being an async recipient, we need to support interactively building
an offer and static invoice with an always-online node that will serve static
invoices on our behalf.

Add a config field containing blinded message paths that async recipients can
use to request blinded paths that will be included in their offer. Payers will
forward invoice requests over the paths returned by the server, and receive a
static invoice in response if the recipient is offline.
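As a sketch, wiring this up on the recipient might look roughly like the following (import paths approximate; how the recipient obtains the paths from the server's operator is out of scope and stubbed here):

```rust
use lightning::blinded_path::message::BlindedMessagePath;
use lightning::util::config::UserConfig;

// Stub: in practice the static invoice server's operator hands these paths to the
// recipient out of band, e.g. when the recipient signs up for the service.
fn paths_from_server_operator() -> Vec<BlindedMessagePath> {
    unimplemented!()
}

fn recipient_config() -> UserConfig {
    let mut config = UserConfig::default();
    // New in this PR: blinded message paths the recipient uses to reach the static
    // invoice server when requesting paths for its offer.
    config.paths_to_static_invoice_server = paths_from_server_operator();
    config
}
```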
Because async recipients are not online to respond to invoice requests,
the plan is for another node on the network that is always-online to serve
static invoices on their behalf.

The protocol is as follows:
- Recipient is configured with blinded message paths to reach the static invoice
  server
- On startup, recipient requests blinded message paths for inclusion in their
  offer from the static invoice server over the configured paths
- Server replies with offer paths for the recipient
- Recipient builds their offer using these paths and the corresponding static
  invoice and replies with the invoice
- Server persists the invoice and confirms that they've persisted it, causing
  the recipient to cache the interactively built offer for use

At pay-time, the payer sends an invoice request to the static invoice server,
who replies with the static invoice after forwarding the invreq to the
recipient (to give them a chance to provide a fresh invoice in case they're
online).

Here we add the requisite trait methods and onion messages to support this
protocol.
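A simplified view of the resulting message set (illustrative; the real enum in lightning/src/onion_message/async_payments.rs carries full message structs and blinded path contexts):

```rust
// Directions are from the async recipient's point of view.
enum AsyncPaymentsMessage {
    // Pre-existing async payments messages.
    HeldHtlcAvailable,
    ReleaseHeldHtlc,
    // New in this PR: recipient <-> static invoice server messages.
    OfferPathsRequest,      // we -> server: request paths to include in our offer
    OfferPaths,             // server -> us: blinded paths to embed in the offer
    ServeStaticInvoice,     // we -> server: persist and serve this static invoice
    StaticInvoicePersisted, // server -> us: confirmation; the offer can be cached
}
```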
In future commits, as part of being an async recipient, we will interactively
build offers and static invoices with an always-online node that will serve
static invoices on our behalf.

Once an offer is built and the static invoice is confirmed as persisted by the
server, we will use the new offer cache added here to save the invoice metadata
and the offer in ChannelManager, though the OffersMessageFlow is responsible
for keeping the cache updated.
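Roughly, the cache has the following shape (field names hypothetical, types reduced to placeholders; see lightning/src/offers/async_receive_offer_cache.rs for the real struct):

```rust
// Sketch of the cache that is serialized as part of ChannelManager.
struct AsyncReceiveOfferCache {
    // Offers that completed the interactive build and are ready to hand out.
    ready_offers: Vec<CachedOffer>,
}

struct CachedOffer {
    // The interactively built offer containing the server-provided paths.
    offer: String,
    // Metadata needed to refresh the corresponding static invoice later, e.g. the
    // nonce used to build it and the expiry of the currently persisted invoice.
    invoice_metadata: String,
}
```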
As an async recipient, we need to interactively build static invoices that an
always-online node will serve to payers on our behalf.

At the start of this process, we send a request for paths to include in our
offers to the always-online node on startup, and refresh the cached offers when
they expire.

As an async recipient, we need to interactively build a static invoice that an
always-online node will serve to payers on our behalf.

As part of this process, the static invoice server sends us blinded message
paths to include in our offer so they'll receive invoice requests from senders
trying to pay us while we're offline. On receipt of these paths, create an
offer and static invoice and send the invoice back to the server so they can
provide the invoice to payers.

As an async recipient, we need to interactively build a static invoice that an
always-online node will serve on our behalf.

Once this invoice is built and persisted by the static invoice server, they
will send us a confirmation onion message. At this time, cache the
corresponding offer and mark it as ready to receive async payments.

As an async recipient, we need to interactively build offers and corresponding
static invoices, the latter of which an always-online node will serve to payers
on our behalf.

Offers may be very long-lived and have a longer expiration than their
corresponding static invoice. Therefore, persist a fresh invoice with the
static invoice server when the current invoice gets close to expiration.
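The refresh condition boils down to a check like this (threshold value assumed here purely for illustration):

```rust
use core::time::Duration;

// Assumed window for the sketch; the value used by the PR may differ.
const INVOICE_REFRESH_WINDOW: Duration = Duration::from_secs(60 * 60 * 24);

fn invoice_needs_refresh(
    invoice_absolute_expiry: Duration, duration_since_epoch: Duration,
) -> bool {
    // Re-persist once we're within the window of the current invoice expiring, so the
    // server never ends up holding only an expired invoice for our offer.
    invoice_absolute_expiry.saturating_sub(duration_since_epoch) < INVOICE_REFRESH_WINDOW
}
```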
Over the past several commits we've implemented interactively building an async
receive offer with a static invoice server that will service invoice requests
on our behalf as an async recipient.

Here we add an API to retrieve the resulting offers so we can receive payments
when we're offline.
@valentinewallace force-pushed the 2025-02-static-inv-server-client branch from a5e4718 to 918ed61 on May 29, 2025 16:58
@joostjager (Contributor) commented May 30, 2025

Would be helpful to link an async payments umbrella issue in the PR description, to get an overview of where things are at in the grander scheme.

@joostjager (Contributor) left a comment

Initial pass

/// our behalf.
///
/// [`StaticInvoice`]: crate::offers::static_invoice::StaticInvoice
pub paths_to_static_invoice_server: Vec<BlindedMessagePath>,
@joostjager (Contributor)

Maybe clarify that this is the path that this node, the recipient, is going to use? That it isn't a path for the sender to obtain the static invoice.

@joostjager (Contributor)

More generally, client-side could have two meanings? Because both sender and recipient act as a client to the static invoice server?

@@ -15,7 +15,8 @@ use lightning::ln::peer_handler::IgnoringMessageHandler;
use lightning::ln::script::ShutdownScript;
use lightning::offers::invoice::UnsignedBolt12Invoice;
use lightning::onion_message::async_payments::{
AsyncPaymentsMessageHandler, HeldHtlcAvailable, ReleaseHeldHtlc,
AsyncPaymentsMessageHandler, HeldHtlcAvailable, OfferPaths, OfferPathsRequest, ReleaseHeldHtlc,
@joostjager (Contributor)

Good explanation in the commit message. Maybe something can be added to highlight the downside of not having a static invoice server. Suppose the user would just hand over their keysend invoice to the sender or publish it on a website, what goes wrong? I suppose something goes wrong when the paths would need to change?

const HELD_HTLC_AVAILABLE_TLV_TYPE: u64 = 72;
const RELEASE_HELD_HTLC_TLV_TYPE: u64 = 74;

/// A handler for an [`OnionMessage`] containing an async payments message as its payload.
///
/// [`OnionMessage`]: crate::ln::msgs::OnionMessage
pub trait AsyncPaymentsMessageHandler {
/// Handle an [`OfferPathsRequest`] message. If the message was sent over paths that we previously
@joostjager (Contributor)

Because all the roles are in one library, it might be good to be clear about what 'we' is in this case. The static invoice server role right?

/// Handle an [`OfferPaths`] message. If this is in response to an [`OfferPathsRequest`] that
/// we previously sent as an async recipient, we should build an [`Offer`] containing the
/// included [`OfferPaths::paths`] and a corresponding [`StaticInvoice`], and reply with
/// [`ServeStaticInvoice`].
@joostjager (Contributor)

This is a different role I think, recipient?


/// Confirms that a [`StaticInvoice`] was persisted by a static invoice server and the
/// corresponding [`Offer`] is ready to be used to receive async payments. Sent in response to a
/// [`ServeStaticInvoice`] message.
@joostjager (Contributor)

For all of these, probably good to mention the roles of the two involved parties.

///
/// Errors if we failed to create blinded reply paths when sending an [`OfferPathsRequest`] message.
#[cfg(async_payments)]
pub(crate) fn check_refresh_async_receive_offers<ES: Deref>(
&self, peers: Vec<MessageForwardNode>, entropy: ES,
pub(crate) fn check_refresh_async_receive_offers<ES: Deref, R: Deref>(
@joostjager (Contributor)

The idea is to call this on a timer?

// If a static invoice server has persisted an offer for us but the corresponding invoice is
// expiring soon, we need to refresh that invoice. Here we create the onion messages that will
// be used to request invoice refresh, based on the offers provided by the cache.
let mut serve_static_invoice_messages = Vec::new();
@joostjager (Contributor)

Extract into method?

/// Retrieve our cached [`Offer`]s for receiving async payments as an often-offline recipient.
/// Will only be set if [`UserConfig::paths_to_static_invoice_server`] is set and we succeeded in
/// interactively building a [`StaticInvoice`] with the static invoice server.
#[cfg(async_payments)]
@joostjager (Contributor)

Definitely looks like too many async payments cfg directives to me.

@@ -10357,9 +10357,21 @@ where
#[cfg(c_bindings)]
create_refund_builder!(self, RefundMaybeWithDerivedMetadataBuilder);

/// Retrieve our cached [`Offer`]s for receiving async payments as an often-offline recipient.
@joostjager (Contributor)

Maybe add some explanation on who is going to call the API? Somehow it made me think that if we're offline, we don't need an API?

/// Will only be set if [`UserConfig::paths_to_static_invoice_server`] is set and we succeeded in
/// interactively building a [`StaticInvoice`] with the static invoice server.
#[cfg(async_payments)]
pub fn get_cached_async_receive_offers(&self) -> Vec<Offer> {
@joostjager (Contributor)

Do the pub fns added here need a high-level test?

Also curious to see how this is going to be used in LDK Node.
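For the LDK Node side I'd picture something roughly like this (sketch only; generics elided and assuming the offer's existing bech32 `Display` encoding):

```rust
use lightning::offers::offer::Offer;

// `offers` would come from ChannelManager::get_cached_async_receive_offers().
fn announce_async_receive_offer(offers: Vec<Offer>) -> Option<String> {
    // Empty until paths_to_static_invoice_server is configured and the interactive
    // build with the static invoice server has completed at least once.
    offers.first().map(|offer| offer.to_string())
}
```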

@valentinewallace mentioned this pull request on Feb 27, 2025