
refactor(p2p): minor pre-IPC refactors [part 5/11] #1165

Open

wants to merge 2 commits into refactor/remove-vertex-storage-protocol from refactor/p2p/pre-ipc-refactors
Conversation

@glevco (Contributor) commented Oct 23, 2024

Depends on #1164

Motivation

In preparation for multiprocess P2P, some method calls between HathorProtocol (which represents a single connection) and the rest of the full node (specifically, the ConnectionsManager, TransactionStorage, and some other indirect calls) will be transformed to IPC, that is, remote calls.

To make this possible, this PR converts some hardcoded property accesses and similar direct calls into method calls, making them easier to turn into IPC calls in future PRs.
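
As a rough sketch of the pattern (class shape simplified for illustration; only get_randbytes appears in the actual diff further down), a direct attribute access becomes a single method call that an IPC proxy can later implement:

import random

class ConnectionsManager:
    # Simplified stand-in for the real class; only the rng part is shown.

    def __init__(self, rng: random.Random) -> None:
        self.rng = rng

    def get_randbytes(self, n: int) -> bytes:
        # Shortcut method: callers no longer reach through `connections.rng`
        # directly, so this single call can later become a remote call.
        return self.rng.randbytes(n)

A call site such as self.protocol.connections.rng.randbytes(n) then becomes self.protocol.connections.get_randbytes(n), as the first diff in this conversation shows.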

Acceptance Criteria

  • Move get_default_capabilities() from HathorManager to HathorSettings and make HathorManager.capabilities non-optional.
  • Implement shortcut methods on ConnectionsManager, HathorProtocol, TransactionStorage, and update usages accordingly.
  • No changes in behavior.

Checklist

  • If you are requesting a merge into master, confirm this code is production-ready and can be included in future releases as soon as it gets merged

@glevco glevco self-assigned this Oct 23, 2024
@glevco glevco changed the title refactor(p2p): minor pre-IPC refactors refactor(p2p): minor pre-IPC refactors [part 6/8] Oct 23, 2024
@glevco glevco force-pushed the refactor/p2p/pre-ipc-refactors branch from 5d0d2fb to dfa95ea on October 23, 2024 16:09
@glevco glevco marked this pull request as ready for review October 23, 2024 16:21

codecov bot commented Oct 23, 2024

Codecov Report

Attention: Patch coverage is 95.52239% with 3 lines in your changes missing coverage. Please review.

Project coverage is 84.95%. Comparing base (2bcd22c) to head (b9b84ea).

Files with missing lines        Patch %   Lines
hathor/p2p/manager.py           88.23%    2 Missing ⚠️
hathor/p2p/sync_v2/agent.py     80.00%    0 Missing and 1 partial ⚠️
Additional details and impacted files
@@                             Coverage Diff                             @@
##           refactor/remove-vertex-storage-protocol    #1165      +/-   ##
===========================================================================
- Coverage                                    85.04%   84.95%   -0.10%     
===========================================================================
  Files                                          326      326              
  Lines                                        25123    25150      +27     
  Branches                                      3836     3835       -1     
===========================================================================
  Hits                                         21365    21365              
- Misses                                        3034     3054      +20     
- Partials                                       724      731       +7     

☔ View full report in Codecov by Sentry.

@glevco glevco changed the title refactor(p2p): minor pre-IPC refactors [part 6/8] refactor(p2p): minor pre-IPC refactors [part 4/8] Oct 23, 2024
@glevco glevco force-pushed the refactor/remove-vertex-storage-protocol branch from 5234061 to d421ece on October 27, 2024 21:25
@glevco glevco force-pushed the refactor/p2p/pre-ipc-refactors branch from dfa95ea to bcab3c6 on October 27, 2024 21:25
@glevco glevco force-pushed the refactor/remove-vertex-storage-protocol branch from d421ece to 4860168 on November 5, 2024 18:09
@glevco glevco force-pushed the refactor/p2p/pre-ipc-refactors branch from bcab3c6 to b8a851b on November 5, 2024 18:10
@glevco glevco changed the title refactor(p2p): minor pre-IPC refactors [part 4/8] refactor(p2p): minor pre-IPC refactors [part 5/10] Nov 6, 2024
@glevco glevco force-pushed the refactor/remove-vertex-storage-protocol branch from 4860168 to c32f145 on November 6, 2024 15:21
@glevco glevco force-pushed the refactor/p2p/pre-ipc-refactors branch from b8a851b to 901e292 on November 6, 2024 15:26
@glevco glevco force-pushed the refactor/remove-vertex-storage-protocol branch from c32f145 to 07ecfc0 on November 7, 2024 01:58
@glevco glevco force-pushed the refactor/p2p/pre-ipc-refactors branch from 901e292 to 3b27e43 on November 7, 2024 01:58
@glevco glevco changed the title refactor(p2p): minor pre-IPC refactors [part 5/10] refactor(p2p): minor pre-IPC refactors [part 5/11] Nov 7, 2024
@glevco glevco force-pushed the refactor/remove-vertex-storage-protocol branch from 07ecfc0 to b2ebfec on November 8, 2024 15:44
@glevco glevco force-pushed the refactor/p2p/pre-ipc-refactors branch from 3b27e43 to 97652fe on November 8, 2024 15:45
@glevco glevco force-pushed the refactor/remove-vertex-storage-protocol branch 2 times, most recently from 10cad8f to 2bcd22c on November 25, 2024 16:15
@glevco glevco force-pushed the refactor/p2p/pre-ipc-refactors branch from 97652fe to 5f45ef6 on November 25, 2024 16:17
@@ -206,8 +206,7 @@ def send_ping(self) -> None:
         """
         # Add a salt number to prevent peers from faking rtt.
         self.ping_start_time = self.reactor.seconds()
-        rng = self.protocol.connections.rng
-        self.ping_salt = rng.randbytes(self.ping_salt_size).hex()
+        self.ping_salt = self.protocol.connections.get_randbytes(self.ping_salt_size).hex()
Member

It seems better to simply pass an rng to the protocol, doesn't it?

Contributor Author

The RNG should be the same in the ConnectionsManager and in HathorProtocol, right? At least for simulation purposes.

If that's the case, we can't simply pass the RNG between them, because the protocol will now live in a different process. Otherwise, if we can have different RNGs, then yes, it would be simpler for the protocol to just have its own RNG here.

Member

We could use a "derived" rng instance, where rng in the main process just generates a seed that is used to instantiate the rng in the subprocess.

There's really only one reason for using an rng instance instead of a global rng, and that is using seeds to reproduce test cases. It's important for reproducing rare cases that happen randomly in tests.

Considering there's parallelism, some events can happen in a different order regardless of the initial seed, and this will affect how we generate random bytes regardless of whether we query the main thread for random bytes or generate them locally. In that sense, I think generating them locally is better because it avoids an IPC call. And if we really need to, we can focus on avoiding that parallelism in tests or simulating it in a reproducible way.
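
A minimal sketch of the derived-rng idea, assuming plain random.Random instances (the real code may wrap its RNG differently; the salt size is arbitrary here):

import random

# Main process: a seeded rng keeps test runs reproducible.
main_rng = random.Random(12345)

# Derive a seed once and send it to the subprocess, instead of
# proxying every random-bytes request over IPC.
derived_seed = main_rng.getrandbits(64)

# Subprocess: its own rng, deterministic given the derived seed.
subprocess_rng = random.Random(derived_seed)
ping_salt = subprocess_rng.randbytes(32).hex()  # 32 is an arbitrary size for illustration

Both processes stay reproducible from the original seed, and generating bytes locally avoids an IPC round-trip per call.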

@@ -102,7 +102,6 @@ def handle_blocks(self, blk: Block) -> None:
         if self.tx_storage.partial_vertex_exists(blk.hash):
             # We reached a block we already have. Skip it.
             self._blk_repeated += 1
-            is_duplicated = True
Member

Why did you remove this? It seems unrelated to the PR and it would change the behavior of sync-v2.

Contributor Author

This variable is unused in this context. It's just a local variable that we set here, and the method returns just below.

In fact, in the latest commit (b9b84ea) I removed it completely, as it's only used to toggle a debug message that would never change.

@jansegre jansegre mentioned this pull request Dec 6, 2024