As discussed in #1848, @jtwaleson and I have been working on a way to speed up git blame by introducing a caching mechanism. This allows us to start a blame operation from a checkpoint instead of computing it from scratch, significantly reducing computation time.
Proposed Changes
The function `function::file` now accepts a `BlameCacheObject`, which stores:

- the commit ID at which the blame was previously computed, and
- the blame entries corresponding to that commit.
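To make the cache shape concrete, here is a hypothetical sketch of such an object. The field names and types are illustrative assumptions (e.g. plain `String` ids standing in for gitoxide's `ObjectId`), not the actual API of this PR:

```rust
/// One cached blame entry: lines [start, start + len) of the file
/// were introduced by `commit_id`. (Hypothetical type.)
#[derive(Debug, Clone, PartialEq)]
pub struct BlameEntry {
    /// First line (0-based) of the range this entry covers.
    pub start: u32,
    /// Number of lines covered.
    pub len: u32,
    /// Commit that introduced these lines (hex id as a stand-in for ObjectId).
    pub commit_id: String,
}

/// The cache passed to the blame function. (Hypothetical shape.)
#[derive(Debug, Clone, PartialEq)]
pub struct BlameCacheObject {
    /// The commit at which the blame below was previously computed.
    pub cache_commit_id: String,
    /// The blame entries that were valid at that commit.
    pub entries: Vec<BlameEntry>,
}

fn main() {
    let cache = BlameCacheObject {
        cache_commit_id: "bf4401f3ec700e1a7376a4cbf05ef40c7ffce064".into(),
        entries: vec![BlameEntry { start: 0, len: 42, commit_id: "0000".into() }],
    };
    assert_eq!(cache.entries[0].len, 42);
}
```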
Using the cached data, we compute the differences between the cached blob and the new target blob at the suspect commit.
If the file has been rewritten, this will probably error, so the `BlameCacheObject` might need to store the file path as well.
Cached blame entries are updated based on detected changes.
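As an illustration of this update step, here is a simplified, self-contained sketch (not the actual implementation in this PR): it handles only pure insertions, shifts cached entries to their new positions, splits any entry an insertion lands inside so its surviving lines keep their cached commit, and reports the inserted ranges as the only ones that must be re-blamed from scratch. All names (`Entry`, `Insertion`, `update`) are hypothetical:

```rust
/// A cached blame entry: lines [start, start + len) of the cached file
/// were introduced by `commit`. (Hypothetical type, not the real one.)
#[derive(Debug, Clone, PartialEq)]
struct Entry {
    start: u32,
    len: u32,
    commit: &'static str,
}

/// An insertion of `added` new lines just before cached-file line `at`.
#[derive(Debug, Clone, Copy)]
struct Insertion {
    at: u32,
    added: u32,
}

/// Update cached entries against insertions (sorted by `at`). Returns the
/// entries that keep their cached blame (shifted to new-file coordinates)
/// and the new-file ranges that must be blamed from scratch.
fn update(entries: &[Entry], insertions: &[Insertion]) -> (Vec<Entry>, Vec<(u32, u32)>) {
    // The inserted lines themselves have no cached blame.
    let mut unblamed = Vec::new();
    let mut offset = 0;
    for ins in insertions {
        unblamed.push((ins.at + offset, ins.added));
        offset += ins.added;
    }
    // How far an old line moves: total lines inserted at or before it.
    let shift = |line: u32| -> u32 {
        insertions.iter().take_while(|i| i.at <= line).map(|i| i.added).sum()
    };
    let mut kept = Vec::new();
    for e in entries {
        // Split the entry at insertion points that fall strictly inside it,
        // so the surviving line ranges keep their cached commit.
        let mut cuts: Vec<u32> = insertions
            .iter()
            .map(|i| i.at)
            .filter(|&at| at > e.start && at < e.start + e.len)
            .collect();
        cuts.push(e.start + e.len);
        let mut prev = e.start;
        for cut in cuts {
            kept.push(Entry { start: prev + shift(prev), len: cut - prev, commit: e.commit });
            prev = cut;
        }
    }
    (kept, unblamed)
}

fn main() {
    // Cached blame: lines 0..10 from commit A, lines 10..15 from commit B.
    let entries = [
        Entry { start: 0, len: 10, commit: "A" },
        Entry { start: 10, len: 5, commit: "B" },
    ];
    // Two lines inserted before old line 3, one before old line 10.
    let insertions = [Insertion { at: 3, added: 2 }, Insertion { at: 10, added: 1 }];
    let (kept, unblamed) = update(&entries, &insertions);
    // Only the 3 inserted lines need the full blame algorithm.
    assert_eq!(unblamed, vec![(3, 2), (12, 1)]);
    // The surviving entries tile the rest of the new 18-line file.
    assert_eq!(
        kept,
        vec![
            Entry { start: 0, len: 3, commit: "A" },
            Entry { start: 5, len: 7, commit: "A" },
            Entry { start: 13, len: 5, commit: "B" },
        ]
    );
}
```

Deletions and replacements would work the same way, except that entries overlapping a removed range must shrink or be re-blamed; the insertion-only case keeps the shifting logic easy to follow.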
Only `UnblamedHunk`s (caused by `AddedOrReplace` changes) are recomputed using the standard blame algorithm. Previously, the entire file or a range was marked as an `UnblamedHunk`; now this only happens where necessary.

So far, the results show significant speed-ups. These are the results for the README file in the linux repo, starting with a blame at commit `bf4401f3ec700e1a7376a4cbf05ef40c7ffce064`.
Next Steps

- Extend the `BlameCacheObject` as needed, for example to also store the file path.
Curious to hear what you think!