Fix bug around Lazy caching and update Lazy model #1857
a-morales wants to merge 6 commits into series/0.18 from
Conversation
```scala
sealed trait Lazy[A] {
  def map[B](f: A => B): Lazy[B] = new Lazy(() => f(make()))
}
```
This change is not necessarily required for this PR.
Does making it a sealed abstract class solve the MiMa issue?
Unfortunately it does not.
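As a sketch, the updated Lazy model could be encoded as a small ADT with `Root` and `Mapped` cases (those constructor names appear later in this thread; everything else here is a hypothetical reconstruction, not the actual smithy4s code):

```scala
// Hypothetical Root/Mapped encoding of Lazy. `map` builds structure instead
// of re-wrapping a thunk, so a chain of transformations stays visible as
// Mapped(Mapped(Root(_), _), _), and `lazy val` memoizes each evaluation.
sealed trait Lazy[A] {
  def value: A
  def map[B](f: A => B): Lazy[B] = Lazy.Mapped(this, f)
}

object Lazy {
  final case class Root[A](make: () => A) extends Lazy[A] {
    lazy val value: A = make() // memoized: the thunk runs at most once
  }
  final case class Mapped[A, B](base: Lazy[A], f: A => B) extends Lazy[B] {
    lazy val value: B = f(base.value)
  }
  def apply[A](a: => A): Lazy[A] = Root(() => a)
}

var evals = 0
val doubled = Lazy { evals += 1; 21 }.map(_ * 2)
assert(doubled.value == 42)
assert(doubled.value == 42) // second read hits the memoized value
assert(evals == 1)          // the root thunk ran exactly once
```

One side effect of this shape is that double application becomes observable in the structure itself, which is what surfaced the bug discussed below.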
```scala
case BijectionSchema(s, bijection) =>
  underlying(BijectionSchema(this(s), bijection))
case LazySchema(suspend) =>
  underlying(LazySchema(suspend.map(this.apply)))
```
This was the problematic line, and it actually contains two bugs.
First, `underlying` and `this.apply` are the same function, so the transformation was essentially applied twice. (This is evident with the updated Lazy changes, where applying a transform went from `LazySchema(Lazy.Root(_))` to `LazySchema(Lazy.Mapped(Lazy.Mapped(Lazy.Root(_), _), _))`.) The other bug is that `this.apply` does not have a stable reference and it modifies the underlying schema, which means the store in the `CompilationCache` was handed a different transformed `Schema` each time; that schema was never found in the store, so it kept being added and returned. Adding a `map` that returns the same result for `Lazy` fixes the bug.
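The double-application half of the bug can be sketched in isolation (all names here are simplified stand-ins, not the smithy4s API):

```scala
// Minimal model: a Schema that just counts how many times it was transformed,
// and a LazySchema that suspends one.
final case class Schema(transformCount: Int)
final case class LazySchema(suspend: () => Schema)

def transform(s: Schema): Schema = s.copy(transformCount = s.transformCount + 1)

// Buggy shape, mirroring underlying(LazySchema(suspend.map(this.apply))):
// the transformation runs via both the outer call and the inner map,
// since `underlying` and `this.apply` are the same function.
def buggy(l: LazySchema): LazySchema =
  LazySchema(() => transform(transform(l.suspend())))

// Fixed shape: the suspended schema is transformed exactly once.
def fixed(l: LazySchema): LazySchema =
  LazySchema(() => transform(l.suspend()))

val base = LazySchema(() => Schema(0))
assert(buggy(base).suspend().transformCount == 2) // applied twice
assert(fixed(base).suspend().transformCount == 1) // applied once
```

With the Root/Mapped encoding of Lazy, the buggy shape is exactly what shows up as the nested `Lazy.Mapped(Lazy.Mapped(...))` above.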
From my PR, this fix can be simplified to LazySchema(Lazy(underlying(suspend.value)))
Trying your simplification causes some tests to fail. Need to investigate why.
While working on updating the Lazy model for #1815, I was writing a test to come up with a case where Lazy was problematic. I found that schema visitors implementing `CachedVisitor` (which uses the `CompilationCache`), combined with `transformTransitivelyK` on the schema, cause the store of the `CompilationCache` to grow in proportion to the size of a recursive structure. Before my fix, the size of the store in the test was the same order of magnitude as the size of the recursive structure (so a cons list of length 1000 produced a store of size 1000). With my fix, the size of the store stays constant with the number of underlying schemas for a given schema.
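The cache-growth half of the bug can also be sketched with a toy identity-keyed store (hypothetical names, not the real `CompilationCache`):

```scala
import scala.collection.mutable

// Plain class, so keys compare by reference identity, like cached schema keys.
final class Schema(val name: String)

val store = mutable.Map.empty[Schema, String]
def getOrCompile(s: Schema): String =
  store.getOrElseUpdate(s, s"compiled(${s.name})")

val base = new Schema("recursive")

// Unstable reference: each traversal allocates a fresh transformed Schema,
// so every lookup misses and the store grows once per traversal.
def unstableTransform(s: Schema): Schema = new Schema(s.name)
(1 to 1000).foreach(_ => getOrCompile(unstableTransform(base)))
assert(store.size == 1000) // grows with the size of the structure

store.clear()

// Stable reference: the transformed schema is computed once and reused,
// so after the first miss every lookup is a hit.
val stable = unstableTransform(base)
(1 to 1000).foreach(_ => getOrCompile(stable))
assert(store.size == 1) // stays constant
```

This mirrors the observation above: with a stable mapped result for `Lazy`, the store size no longer tracks the length of the recursive structure.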