Another way to simulate a real-world live camera is to use udp://. For example:
# (make sure you use the correct screen number; in my case it was :1, not :0)
ffmpeg -re -f x11grab -r 15 -s 1280x720 -i :0.0+0,0 -map 0:v -c:v libx264 -f mpegts udp://localhost:50000
The video is then received with:
ffmpeg -i udp://localhost:50000 -f mpegts video.ts
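If the receiving end is your own program rather than another ffmpeg process, the same URL can be opened directly. Here is a minimal Python sketch, assuming an OpenCV build with FFmpeg support; the URL matches the commands above and the display window is just a stand-in for real processing:

import cv2

# Open the MPEG-TS stream over UDP (assumes OpenCV was built with FFmpeg support)
cap = cv2.VideoCapture("udp://localhost:50000", cv2.CAP_FFMPEG)
if not cap.isOpened():
    raise RuntimeError("could not open udp://localhost:50000 - is the sender running?")

while True:
    ok, frame = cap.read()           # blocks until the next frame arrives
    if not ok:
        break                        # stream ended or packets stopped arriving
    cv2.imshow("udp stream", frame)  # stand-in for a real processing step
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()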
Furthermore, if your purpose is simply to simulate a live camera (for example, to test a computer vision pipeline), you can combine the native frame rate option -re, the looping option -stream_loop -1, and a static file:
VIDEO=./static-video.mp4
ffmpeg -re -stream_loop -1 -i $VIDEO -map 0:v -f mpegts udp://localhost:50000
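To check that the looped file is actually being paced like a live source (frames arriving at the file's native rate rather than as fast as they can be decoded), a rough frame-rate measurement can be added on the receiving side. Again a minimal sketch assuming OpenCV with FFmpeg support; the 100-frame reporting window is an arbitrary choice:

import time
import cv2

cap = cv2.VideoCapture("udp://localhost:50000", cv2.CAP_FFMPEG)
frames, start = 0, time.time()

while True:
    ok, _ = cap.read()
    if not ok:
        break
    frames += 1
    if frames % 100 == 0:
        elapsed = time.time() - start
        # With -re this should hover near the source file's frame rate;
        # without it, frames arrive in bursts as fast as ffmpeg can push them.
        print(f"~{frames / elapsed:.1f} fps received")

cap.release()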
Official ffmpeg info on the -re option:
-re (input)
Read input at native frame rate. Mainly used to simulate a grab
device, or live input stream (e.g. when reading from a file).
Should not be used with actual grab devices or live input
streams (where it can cause packet loss). By default ffmpeg
attempts to read the input(s) as fast as possible. This option
will slow down the reading of the input(s) to the native frame
rate of the input(s). It is useful for real-time output (e.g.
live streaming).