Run requests in parallel #4
This should definitely be done in core, so that there are effects describing this abstract "feature", which are then implemented by each backend. We can do it in two ways imo:
Good idea, bad idea? I'd love to implement option 2, I just have no clue how.
Regarding (1):

So, we have (will have) two backends. As far as I can tell, the same applies to both. If that's the case, then I think it makes sense to do this at a higher level, i.e. in the implementation of the copy command.

Regarding (2): I'm not sure I understood what you had in mind. Can you please explain it another way?
So what is the final task? To rewrite the commands in …?
@sancho20021 yeah, let's try solving this at the …
Problem: some functions in `Commands.hs` use the `StateT` transformer over the `Sem` monad.
Solution: use Polysemy's `State` effect instead of `StateT`.
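To make the switch concrete, here is a minimal, self-contained sketch of the Polysemy style, assuming the `polysemy` package; `bumpTwice` is a hypothetical stand-in for the real functions in `Commands.hs`:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE FlexibleContexts #-}
import Polysemy (Member, Sem, run)
import Polysemy.State (State, get, modify, runState)

-- A stateful computation written against Polysemy's State effect,
-- rather than stacking StateT on top of Sem.
bumpTwice :: Member (State Int) r => Sem r Int
bumpTwice = do
  modify (+ 1)
  modify (+ 1)
  get

-- runState interprets the effect, returning (final state, result).
example :: (Int, Int)
example = run (runState 0 bumpTwice)  -- (2, 2)
```

Because `State` is an ordinary effect in the `Sem` row rather than an outer transformer, it composes with other effects (like `Async` below) without the interaction problems `StateT` causes.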
Problem: we want to run commands concurrently.
Solution: use the `Async` effect plus `sequenceConcurrently` to run read operations concurrently.
Added it for:
- `copyCmd`
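Polysemy's `Polysemy.Async` module provides `sequenceConcurrently`, which runs a traversable of `Sem` actions concurrently and yields `Nothing` for any action whose thread was cancelled. A minimal sketch, assuming the `polysemy` package; `Entry` and `readEntry` are hypothetical stand-ins for the real backend calls:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE FlexibleContexts #-}
import Polysemy (Embed, Members, Sem)
import Polysemy.Async (Async, sequenceConcurrently)

-- Hypothetical entry type and reader.
data Entry = Entry FilePath

readEntry :: Members '[Async, Embed IO] r => FilePath -> Sem r Entry
readEntry path = pure (Entry path)  -- real code would hit the Vault API here

-- Fire off all reads at once instead of one after another.
readAll :: Members '[Async, Embed IO] r => [FilePath] -> Sem r [Maybe Entry]
readAll paths = sequenceConcurrently (map readEntry paths)
```

Interpreting the `Async` effect (e.g. with `asyncToIOFinal`) is left to the application's top-level interpreter stack.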
Problem: we want to run commands concurrently, but `StateT` isn't suitable for this task.
Solution: use `MVar` for thread-safe operations, and rewrite the existing recursive `go` function accordingly. Every directory is now sorted in alphabetical order, so concurrency doesn't make the output order unstable.
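The `MVar` idea can be illustrated with plain `base`, outside Polysemy: workers push results into a shared `MVar`, and the collected list is sorted so the output doesn't depend on thread scheduling. A standalone sketch (the directory names are made up):

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar
import Control.Monad (forM_, replicateM_)
import Data.List (sort)

main :: IO ()
main = do
  results <- newMVar []    -- shared, thread-safe accumulator
  done    <- newEmptyMVar  -- completion signal, one per worker
  let dirs = ["b", "c", "a"]
  forM_ dirs $ \d -> forkIO $ do
    modifyMVar_ results (pure . (d :))  -- atomic prepend
    putMVar done ()
  replicateM_ (length dirs) (takeMVar done)  -- wait for all workers
  out <- sort <$> readMVar results
  print out  -- always ["a","b","c"], whatever the interleaving
```

`modifyMVar_` takes the value, applies the update, and puts it back atomically, which is what makes the shared accumulator safe across threads.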
Most commands need to retrieve all entries descending from either the root directory or some other directory.
When we have hundreds of entries stored in Vault, this means doing hundreds of requests (one per directory, plus one per entry).
Unfortunately, Vault's API does not support batch requests.
At the moment, these requests are done sequentially. We'd see a massive speedup if they were run concurrently.
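To illustrate the difference, here is a sketch using the widely used `async` package; `EntryPath`, `Entry`, and `fetchEntry` are hypothetical stand-ins for the real Vault calls:

```haskell
import Control.Concurrent.Async (mapConcurrently)

type EntryPath = String
data Entry = Entry EntryPath

-- Hypothetical: one HTTP request to Vault per entry.
fetchEntry :: EntryPath -> IO Entry
fetchEntry = pure . Entry

-- Sequential: total time is roughly the sum of request latencies.
fetchAllSeq :: [EntryPath] -> IO [Entry]
fetchAllSeq = mapM fetchEntry

-- Concurrent: total time is roughly the slowest single request.
fetchAllPar :: [EntryPath] -> IO [Entry]
fetchAllPar = mapConcurrently fetchEntry
```

With hundreds of independent requests, the concurrent version's wall-clock time is bounded by the slowest request rather than the sum of all of them.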
Commands where this could be applied:
- `view`/`find`: read directories/entries concurrently
- `rename`/`copy`: read/write entries
- `rename`/`delete --recursive`: delete entries