wasm serialize & hook interface #4479
base: master
Conversation
@Tyriar The build passed in CI 😃 (seems my windows patch also works on azure containers...) But the runtime penalty is quite big for mac and windows (>3 min); under linux it is only +20s 🤔 To avoid pulling the compiler SDK at all, I could change .gitignore to include the built wasm file, so the CI would not have to spin up a compiler. Downside - the wasm file would have to reside in the repo. Edit: CI runtime is back to normal 😅 with the two commits below, but now we might need an explicit force-compile step before packing, so an old wasm binary cannot slip through by accident. Well, the CI setup still needs fine-tuning...
Running xterm-benchmark gives these numbers:

   Context "out-benchmark/benchmark/SerializeAddon.benchmark.js"
      Context "Terminal: sh -c "dd if=/dev/urandom count=40 bs=1k | hexdump | lolcat -f""
         Context "serialize"
            Case "#1" : 5 runs - average throughput: 5.91 MB/s
   Context "out-benchmark/benchmark/Serialize2Addon.benchmark.js"
      Context "Terminal: sh -c "dd if=/dev/urandom count=40 bs=1k | hexdump | lolcat -f""
         Context "serialize"
            Case "#1" : 5 runs - average throughput: 64.30 MB/s
@Tyriar Found a way to avoid costly SDK pulling during normal CI runs:
Time to get back to the actual serialization code 😸
@jerch Committing it and forcing a compile during packaging sounds like a fine compromise to me.
@@ -0,0 +1,42 @@
## xterm-addon-serialize

An addon for [xterm.js](https://github.com/xtermjs/xterm.js) that enables xterm.js to serialize a terminal framebuffer into string or html. This addon requires xterm.js v4+.
v4 -> v5.2+? Should we just remove that line as there's the experimental line + we call it out in the release notes?
Sure thing. I just copy-cloned from the old addon and did not bother to fix everything yet.
Can we add a "serialize2 (wasm)" button to the demo for convenient testing?
Yepp, already did that in my local branch (not pushed yet...)
});

describe('text', () => {
  // TODO: write a lot more test cases here...
Is it possible to just copy over the old SerializeAddon.test.ts?
I did that, and removed all HTML test code (since the addon does not implement any HTML serialization yet). Then I realized that there are no real test cases for the text-based serializer besides a dummy one... 😱
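For reference, a first text case could look roughly like the sketch below. This is not from the PR; terminal, serializeAddon and the writeP helper are assumptions standing in for the usual test setup, and the exact expected string depends on how the serializer trims trailing cells.

```ts
// Hedged sketch of a plain-text test case (not part of this PR).
import { assert } from 'chai';
import { Terminal } from 'xterm';

// assumed to be created in the suite's beforeEach, as in the existing addon tests
declare const terminal: Terminal;
declare const serializeAddon: { serialize(): string };

// small promisified write helper (an assumption, not the repo's helper)
function writeP(term: Terminal, data: string): Promise<void> {
  return new Promise(resolve => term.write(data, resolve));
}

it('serializes plain text', async () => {
  await writeP(terminal, 'hello\r\nworld');
  const out = serializeAddon.serialize();
  // keep the assertion loose - the exact output depends on trailing-cell trimming
  assert.ok(out.includes('hello') && out.includes('world'));
});
```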
  throw new Error('Cannot use addon until it has been loaded');
}

return serialize((this._terminal as any)._core);
Haven't checked what you need here, but casting it to an internal interface like this will catch typing problems at compile time:

const core = (terminal as any)._core as ITerminal;
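Fleshed out a bit, the suggestion would look something like this inside the addon. The ITerminal import path, the surrounding class and the WasmSerialize module path are assumptions, not the PR's actual code:

```ts
import { Terminal } from 'xterm';
// xterm.js' internal terminal interface; the exact import path is an assumption here
import { ITerminal } from 'browser/Types';
// the PR's wasm-backed serialize function (path assumed)
import { serialize } from './WasmSerialize';

class Serialize2Addon {
  private _terminal: Terminal | undefined;

  public activate(terminal: Terminal): void {
    this._terminal = terminal;
  }

  public serialize(): string {
    if (!this._terminal) {
      throw new Error('Cannot use addon until it has been loaded');
    }
    // one `as any` hop, then everything downstream is checked against the internal interface
    const core = (this._terminal as any)._core as ITerminal;
    return serialize(core);
  }
}
```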
Well, that code is still the early hacking version (weirdly hacked-in type conversions, no proper memory control, no nice interface yet). I hope you did not read it too closely, or your eyes will bleed 😺
I am still mulling over the right interface idea here - technically the wasm serialize part can act as a static singleton (which would save some resources/runtime later on during wasm bootstrapping), since JS's single-threaded model does not allow multiple concurrent serializations anyway, and we also cannot asyncify things here, as we have to ensure that the terminal buffer does not change while a serialization is running.
So my current idea is to put everything behind a lazy static initializer, which will serve all serialization requests.
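A rough sketch of that lazy static idea (all names hypothetical, nothing here is the PR's actual code):

```ts
// Hypothetical sketch of a lazy static initializer serving all serialization requests.
// The embedded wasm binary; name and provisioning are assumptions.
declare const SERIALIZE_WASM_BYTES: BufferSource;

let instance: WebAssembly.Instance | undefined;

function getSerializerInstance(): WebAssembly.Instance {
  if (!instance) {
    // compile + instantiate exactly once, then reuse for every serialize() call;
    // safe because JS is single-threaded and serialization runs synchronously
    instance = new WebAssembly.Instance(new WebAssembly.Module(SERIALIZE_WASM_BYTES));
  }
  return instance;
}

export function serialize(core: unknown): string {
  const wasm = getSerializerInstance();
  // ...walk the terminal buffer and call into wasm.exports here...
  return '';
}
```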
I didn't look too hard, just skimmed it 🙂
... did not read ..., tss tss tss 🤣
// itoa LUT
// note: performant decimal conversion of numbers is actually a hard problem
// we use a LUT approach with 0-99 precomputed in utf16 for better performance
const LUT100 = new Uint32Array(100);
for (let i1 = 0; i1 < 10; ++i1) {
  for (let i2 = 0; i2 < 10; ++i2) {
    LUT100[i1 * 10 + i2] = (48 + i2) << 16 | (48 + i1);
  }
}
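For anyone puzzling over the packing: each entry holds the two ASCII digit chars of its index as UTF-16 code units, low half first, so a little-endian Uint16Array view over the same buffer sees them in print order. Quick illustration (not part of the diff):

```ts
// illustration only, reusing LUT100 from the snippet above
const v = LUT100[42];
// low 16 bits hold the tens digit '4', high 16 bits the ones digit '2'
console.log(String.fromCharCode(v & 0xffff, v >>> 16)); // "42"
```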
This isn't used other than d32.set(LUT100, 64)? We could either set it directly on d32, or put LUT100 in a closure so that we don't hold on to the Uint32Array.
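One way the closure variant could look (d32 is assumed to be the Uint32Array view the serializer already holds; this is only a sketch of the suggestion):

```ts
// build the 0-99 LUT inside an IIFE and copy it straight into d32 at offset 64,
// so the temporary Uint32Array becomes collectable right after setup
(() => {
  const lut = new Uint32Array(100);
  for (let i1 = 0; i1 < 10; ++i1) {
    for (let i2 = 0; i2 < 10; ++i2) {
      lut[i1 * 10 + i2] = (48 + i2) << 16 | (48 + i1);
    }
  }
  d32.set(lut, 64);
})();
```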
Yes, that's part of the bleeding-eye code - see my remarks above.
// single call is pretty wasteful? preload combines once instead?
writeCombinedOverload = (dst, x) => {
  let dp = dst >> 1;
  const comb: string = (line as any)._combined[x] || '';
Any as any should use internal types where possible; if we need to expose a combined on the interface, that's fine.
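If combined did get exposed, a narrow internal interface could keep it to one typed cast. The interface name and shape below are illustrative only, not xterm.js' actual types:

```ts
// illustrative internal shape, not the real IBufferLine definition
interface IBufferLineWithCombined {
  // sparse record of combined (multi code unit) cell contents, keyed by column
  _combined: { [column: number]: string };
}

// `line` and `x` are assumed to be in scope, as in the quoted diff
const combined = (line as unknown as IBufferLineWithCombined)._combined[x] || '';
```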
Same as above 😺
let writeCombinedOverload: (dst: number, x: number) => number = (dst, x) => dst;
const writeCombined: (dst: number, x: number) => number = (dst, x) => writeCombinedOverload(dst, x);
let writeLinkOverload: (dst: number, x: number) => number = (dst, x) => dst;
Instead of setting these functions via let, should they just be functions that take line and whatever else is needed?
Same as above 😺 - I already reworked that one locally; it is now a static function pointer on a context object and no longer at top level.
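Presumably something along these lines (a guess at the reworked shape, with made-up names):

```ts
// hypothetical sketch of a context object carrying the hook "function pointers"
interface ISerializeContext {
  line: unknown;  // current buffer line, typed loosely on purpose here
  writeCombined: (dst: number, x: number) => number;
  writeLink: (dst: number, x: number) => number;
}

// no-op defaults; real hooks get assigned per line when combined chars / links are present
function createSerializeContext(): ISerializeContext {
  return {
    line: undefined,
    writeCombined: (dst, _x) => dst,
    writeLink: (dst, _x) => dst
  };
}
```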
const buffer = t.buffer;
const len = t.buffer.lines.length;
let wPos = 150*65536;
Surprised this isn't a lint rule:

- let wPos = 150*65536;
+ let wPos = 150 * 65536;
Hmm, interesting. Yeah, I fixed all the linter nagging (there was quite a bunch in my hacky code), but this one did not show up.
The hardcoded values will be replaced by proper memory management (prolly static again, to avoid wasm compilation over and over), instead calling earlier into TextDecoder.decode.
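For context, the TextDecoder route would look roughly like this (the helper name and memory layout are assumptions):

```ts
// hedged sketch: decode a UTF-16 result region of the (static) wasm memory into a JS string
const utf16Decoder = new TextDecoder('utf-16le');

function readResult(memory: WebAssembly.Memory, byteOffset: number, byteLength: number): string {
  // a subarray view avoids copying; decode() materializes the JS string in one call
  return utf16Decoder.decode(new Uint8Array(memory.buffer, byteOffset, byteLength));
}
```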
Reminder - also add overline attribute support.
Early WIP/PoC...