Capturing multiple synchronized video streams and automating video editing with ffmpeg


I need to capture the streams of several cameras (from 3 to 15) simultaneously, so that I end up with a separate video file for each camera on which I can then perform automatic, predefined video editing.

The videos must have the same duration, framerate, start time and end time, unless there is a way with FFmpeg to save a synchronization timestamp in each file.
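One thing I was considering (though I am not sure it is the right approach) is to keep the recorders' clocks in sync with NTP and let ffmpeg stamp each recording with wallclock time, so the files can be aligned afterwards. The camera URL below is just a placeholder for one of the fixed IP cameras:

    # record one camera, stamping packets with the local wallclock time
    # and copying the stream without re-encoding
    ffmpeg -use_wallclock_as_timestamps 1 -i rtsp://192.168.1.10/stream1 -c copy cam01.mkv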

All connections are on the local network, since this involves filming events in a single location.

Some cameras have wireless/radio connections because they are moving cameras (I was thinking of a GoPro or similar with a radio video transmitter); the other cameras will be in fixed positions and can also be wired (in this case I was thinking of IP cameras).

The videos must be high resolution, varying from HD (1280x720) to 4K (4096x2160), because the final edited output will be in HD, and some of the 4K videos will automatically be given a predefined crop with a pan and/or zoom effect.
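For the crop with pan on the 4K sources, I was thinking of ffmpeg's crop and scale filters, roughly like this (file names, the 30-second pan duration and the crop offsets are just placeholders):

    # take a 1920x1080 window out of the 4K frame, pan it horizontally
    # over 30 seconds, then scale it down to the HD output size
    ffmpeg -i cam05_4k.mp4 \
      -vf "crop=1920:1080:x='(in_w-1920)*t/30':y='(in_h-1080)/2',scale=1280:720" \
      -c:a copy cam05_hd.mp4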

QUESTION:

What is the best and fastest solution for managing this many cameras and performing the predefined video editing as quickly as possible?

I thought of streaming by connecting the cameras to Raspberry Pis running ffserver, and saving the videos on a server that then performs the editing with ffmpeg.

Do you have suggestions or alternatives?

Thank you

Paperauz

Posted 2019-04-09T10:39:15.740

Reputation: 21

ffserver is deprecated; you can stream with ffmpeg itself. You can also receive and save these streams to files using ffmpeg, of course. In principle this should all be doable. The question is how you want the synchronization to be done. I don't see an obvious solution to that (cf. https://superuser.com/questions/671380/ and many others). You might get different encoding and streaming delays from different cameras and servers, respectively, so you'd have to synchronize based on the timestamps of the RTP packets…

– slhck – 2019-04-09T18:11:20.553
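A rough sketch of what an ffmpeg-only pipeline could look like, assuming MPEG-TS over UDP and a camera exposed as /dev/video0 on the Pi (addresses, ports, codec settings and the capture device are all assumptions, not a recommendation):

    # on each Raspberry Pi: encode the local camera and push it to the server
    ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -preset ultrafast -tune zerolatency \
      -f mpegts udp://192.168.1.100:5001

    # on the server: receive the stream on that port and save it without re-encoding
    ffmpeg -i udp://0.0.0.0:5001 -c copy cam01.mkv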

See https://tools.ietf.org/html/rfc3550: "Instead, for each medium the RTP timestamp is related to the sampling instant by pairing it with a timestamp from a reference clock (wallclock) that represents the time when the data corresponding to the RTP timestamp was sampled. The reference clock is shared by all media to be synchronized."

– slhck – 2019-04-09T18:11:35.533

Oh, and maybe check if your server has enough processing capabilities for handling 15 streams including 4K, if the whole thing has to be somewhat realtime. – slhck – 2019-04-09T18:12:17.893
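On the load question: one way to keep it manageable (assuming the cameras already deliver a usable H.264 stream) is to record with stream copy only, so the server does no real-time transcoding, and do the HD re-encode and the editing offline afterwards. The camera URL is again a placeholder:

    # record without transcoding; CPU load on the server stays minimal
    ffmpeg -i rtsp://192.168.1.20/stream -c copy cam02.mkv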

No answers