Do you think it is possible to support MP4 with the same method you presented here? I know MP4 can be transported over RTMP, HLS, or DASH, but I'm not sure whether a plain HTTP GET/POST will work with MP4.
I know that WebM and OGV/OGG support streaming and seeking with the Range header (you can icecast/oggfwd or FFSERVER them), but with MP4 set to `-movflags frag_keyframe+empty_moov`, is a transport like RTMP, HLS, or DASH still required?
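For what it's worth, fragmented MP4 written with `frag_keyframe+empty_moov` is decodable progressively as it arrives, so in principle it can be relayed over a plain HTTP chunked response or a WebSocket without RTMP/HLS/DASH. A minimal sketch of starting such a pipeline from Python follows; the input URL and codec settings are illustrative assumptions, not your exact setup:

```python
# Sketch: fragmented MP4 can be written to a pipe, so it can be relayed
# over plain HTTP (chunked) or a WebSocket -- no RTMP/HLS/DASH needed.
# The source URL and codec settings below are illustrative assumptions.
import subprocess

FFMPEG_CMD = [
    "ffmpeg",
    "-i", "rtsp://camera.example/stream",   # hypothetical live source
    "-c:v", "libx264", "-profile:v", "baseline",
    "-c:a", "aac", "-ar", "44100", "-ac", "2",
    # fragment at every keyframe and put an empty moov atom up front,
    # so the output is decodable as it streams in
    "-movflags", "frag_keyframe+empty_moov+default_base_moof",
    "-f", "mp4",
    "pipe:1",                               # write the muxed stream to stdout
]

def open_stream():
    """Start ffmpeg and return its stdout pipe for relaying to clients."""
    proc = subprocess.Popen(FFMPEG_CMD, stdout=subprocess.PIPE)
    return proc.stdout
```

A server would read chunks from `open_stream()` and forward them to each connected client as they arrive.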
I'm asking because I've been working for a few days on a method to stream live video/audio over a WebSocket using FFMPEG/GSTREAMER.
I managed to mux two channels on the same WebSocket, one for video and one for audio.
The raw data is H.264 baseline NAL frames, and the audio is AAC, stereo, 44100 Hz.
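In case it helps others, one way to carry two elementary streams over a single WebSocket is a small self-describing frame: a channel tag plus a length prefix. This is only a sketch of that idea; the tag values and layout are my own assumptions, not a standard:

```python
# Minimal framing sketch for muxing two elementary streams over one
# WebSocket: a 1-byte channel tag plus a 4-byte big-endian length prefix.
# The tag values are arbitrary assumptions, not any standard container.
import struct

CH_VIDEO = 0x01   # H.264 NAL units
CH_AUDIO = 0x02   # AAC frames

HEADER = struct.Struct(">BI")  # tag, payload length

def pack_frame(channel: int, payload: bytes) -> bytes:
    """Prefix a payload with its channel tag and byte length."""
    return HEADER.pack(channel, len(payload)) + payload

def unpack_frame(data: bytes):
    """Return (channel, payload, remaining_bytes) from a buffer."""
    channel, length = HEADER.unpack_from(data)
    end = HEADER.size + length
    return channel, data[HEADER.size:end], data[end:]
```

The client demuxes by reading the tag and dispatching the payload to the matching decoder.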
On the client there is a JavaScript H.264 decoder that works pretty well!
I get less than 2 s of delay, which is much better than HLS or DASH.
However, audio seems to be a problem.
First, I can't play it on iPhone! I'm using an MP3/AAC decoder compiled from native C++ code with Emscripten, which doesn't seem to work well there.
On other platforms it works, but even then the audio sometimes goes out of sync.
I tried to sync it with an external clock and failed.
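The external-clock approach I attempted looked roughly like the sketch below: both streams carry a presentation timestamp relative to a shared epoch, and frames that arrive too far behind their slot are dropped rather than played late. The threshold is an assumed value, not something tuned:

```python
# Sketch of PTS-based sync against one shared clock: each frame carries a
# presentation timestamp (ms) relative to a common epoch, and playback
# time is epoch + pts. Frames arriving too late are dropped, not delayed.
# MAX_LATE_MS is an illustrative tolerance, not a tuned value.
MAX_LATE_MS = 50

def due_at(epoch_ms: int, pts_ms: int) -> int:
    """Wall-clock time at which a frame with this PTS should play."""
    return epoch_ms + pts_ms

def should_play(epoch_ms: int, pts_ms: int, now_ms: int) -> bool:
    """Play only frames no more than MAX_LATE_MS behind their slot."""
    return now_ms - due_at(epoch_ms, pts_ms) <= MAX_LATE_MS
```

Dropping late audio avoids the drift that accumulates when late frames are queued and played anyway, but it still needs both sides to agree closely on the epoch.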
I can go with a WebRTC solution (which I did), and managed to get a live WebM stream on every device except, again, the iPhone.
Maybe you have an idea?