Something like that is definitely possible. Server-side, the live view WebSocket operates on a subscription to frames (moonfire-nvr/server/src/web/live.rs, line 90 at 5944f2b). It wouldn't be hard to add a parameter saying the client only wants key frames (or non-disposable frames, which combined with the "SVC" setting on many cameras might reduce the frame rate by a factor of 2). It could do this at construction time based on the URL of the HTTP UPGRADE request, or toggle it mid-stream based on a client->server WebSocket message. (Going from key-frame-only mode back to all-frame mode would presumably be deferred until the next key frame to avoid causing artifacts.)

There's a catch, though. Right now the live view works by sending fragmented .mp4 media segments to a Media Source Extensions (MSE) SourceBuffer, which buffers frames and cares about their durations. I'd like to switch to using the WebCodecs API, which is much more straightforward: you give it a frame, it decodes a frame. It doesn't care about durations. It doesn't even really use the timestamps you give it (it just copies them from the encoded frame to the decoded frame). It doesn't do buffering. If you want to prioritize liveness over steady playback, you can actually do that in a way that isn't possible with MSE. On one of my cameras, it's only 160 ms glass-to-glass latency, and I assume most of that is the camera's encoder.

If you'd like to play around with WebCodecs, Moonfire doesn't support it yet, but there's an example in the retina repository.

The catch with WebCodecs (there's always a catch, isn't there?) is that Firefox on Android doesn't support it yet. Firefox on macOS does, although you need to flip a config flag for H.265 to work. Chrome and Safari are both fine.
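The toggle-plus-deferral behavior described above can be sketched as a small state machine. Everything here is hypothetical: `FrameFilter`, `Mode`, and the `is_key_frame` flag are invented names for illustration, not Moonfire's actual live.rs code.

```rust
/// Filtering modes a live-view client could request. (Hypothetical sketch,
/// not Moonfire's actual implementation.)
#[derive(Clone, Copy, PartialEq, Debug)]
enum Mode {
    AllFrames,
    KeyFramesOnly,
}

/// Decides which frames to forward to a live-view WebSocket client.
struct FrameFilter {
    mode: Mode,
    /// Set when the client asked to return to AllFrames; the actual switch
    /// is deferred until the next key frame to avoid decoding artifacts.
    pending_all_frames: bool,
}

impl FrameFilter {
    fn new(mode: Mode) -> Self {
        Self { mode, pending_all_frames: false }
    }

    /// Called when a client->server WebSocket message toggles the mode.
    fn request_mode(&mut self, mode: Mode) {
        match (self.mode, mode) {
            // Returning to all frames is deferred until the next key frame.
            (Mode::KeyFramesOnly, Mode::AllFrames) => self.pending_all_frames = true,
            // Dropping to key-frames-only can take effect immediately.
            _ => {
                self.mode = mode;
                self.pending_all_frames = false;
            }
        }
    }

    /// Returns true if this frame should be sent to the client.
    fn should_send(&mut self, is_key_frame: bool) -> bool {
        if self.pending_all_frames && is_key_frame {
            // Safe to resume all frames starting from a key frame.
            self.mode = Mode::AllFrames;
            self.pending_all_frames = false;
        }
        match self.mode {
            Mode::AllFrames => true,
            Mode::KeyFramesOnly => is_key_frame,
        }
    }
}

fn main() {
    let mut f = FrameFilter::new(Mode::KeyFramesOnly);
    assert!(f.should_send(true));   // key frame: sent
    assert!(!f.should_send(false)); // delta frame: dropped
    f.request_mode(Mode::AllFrames);
    assert!(!f.should_send(false)); // still dropped until the next key frame
    assert!(f.should_send(true));   // key frame arrives: mode switches here
    assert!(f.should_send(false));  // delta frames flow again
}
```

The point of the deferral is that a decoder handed a delta frame without its preceding key frame would produce garbage, so the filter keeps dropping delta frames until a key frame gives it a clean point to resume.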
Hi,

In Bluecherry I had the following feature: I could lower the bandwidth of a camera's live view.
What exactly did this option do?
Basically, instead of running the camera stream at full speed, it was throttled; in practice the image refreshed roughly every ~2 seconds. I'm not sure how this was implemented on the client side: whether it actually reduced bandwidth or only displayed keyframes. I honestly don't know.
In general, this kind of option is great for some of my older devices that simply can't handle decoding full live streams from multiple cameras at once (and don't have a hardware decoder in the GPU). With that option enabled, it was good enough: the stream resolution/quality stayed native, but it was refreshed less frequently. In practice that was sufficient, and the CPU usage stayed almost idle.
Do you think it's possible to do a similar thing in Moonfire's live view somehow?