83

I want to make a live stream of (a window on) my Linux desktop to a free streaming site, using the captured video as a fake webcam. There are many tools for this on Windows. ffmpeg allows me to capture input from a specific window, but I can't find a way to output the video to a fake webcam-style device usable by Flash.

Can anyone recommend a method (or software) for doing this?

3
  • 2
    It was all hunky-dory right up until you said "fake webcam-style device usable by flash". Commented Apr 13, 2012 at 3:05
  • I'm not sure what you mean? By "webcam-style" device I meant something like a fake /dev/videoN device file, or something similar. I assume this is how the captured video would be usable by flash.
    – bkconrad
    Commented Apr 13, 2012 at 3:07
  • Exactly. That's hard. Commented Apr 13, 2012 at 3:08

6 Answers

93

You can install v4l2loopback. It is a kernel module that simulates a webcam. Load it with:

modprobe v4l2loopback

Then you need to send the video stream to the device /dev/videoN (where N is the number that corresponds to the freshly created device - probably the highest number) using a program like ffmpeg. In order to capture the desktop and forward it to /dev/videoN with ffmpeg, you can use the following command line:

ffmpeg -probesize 100M -framerate 15 -f x11grab -video_size 1280x720 -i :0.0+0,0 -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/videoN

Change the value of -framerate from 15 to something else if you want a different frame rate.

The resolution is chosen in the -video_size parameter. If you want to specify an offset from the upper-left corner of the screen, pass it in the -i parameter in the form -i :0.0+x,y, where x and y are the horizontal and vertical offset respectively.
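
If you want to feed only a single window (as in the original question) rather than the whole screen, one common trick is to read the window's geometry with xwininfo and plug it into the x11grab size and offset. A minimal sketch, assuming xwininfo is installed and the window is not moved or resized while streaming:

# click the target window when prompted, then pull out its position and size
eval $(xwininfo | awk '/Absolute upper-left X/ {print "X="$NF}
                       /Absolute upper-left Y/ {print "Y="$NF}
                       /Width/  {print "W="$NF}
                       /Height/ {print "H="$NF}')

# feed just that region into the loopback device (yuv420p usually needs even width/height)
ffmpeg -probesize 100M -framerate 15 -f x11grab -video_size ${W}x${H} -i :0.0+${X},${Y} -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/videoN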

9
  • 10
    Your image may be mirrored (horizontally flipped) depending on your ffmpeg build. Use the video filter -vf hflip. If you already use a -vf, put the filters inside quotes separated by a comma, such as -vf 'hflip,scale=640:360'.
    – Rutcha
    Commented Jul 29, 2015 at 18:40
  • 6
    I'm getting a few errors with this: [x11grab @ 0x24013c0] Stream #0: not enough frames to estimate rate; consider increasing probesize, [v4l2 @ 0x2409520] ioctl(VIDIOC_G_FMT): Invalid argument, and Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument
    – Lotus
    Commented Aug 2, 2016 at 3:14
  • 1
    I am getting the exact same output as @Lotus with ffmpeg 3.1.4. Are there any special codecs or packages required besides v4l2loopback and ffmpeg?
    – cguenther
    Commented Oct 6, 2016 at 9:37
  • 3
    @Lotus I managed to get around the problem by using another /dev/video device. Make sure you use the v4l2loopback device (in my case /dev/video1). The offset (the loopback device not being /dev/video0) can be caused by an already existing real webcam occupying /dev/video0.
    – cguenther
    Commented Oct 6, 2016 at 9:49
  • 2
    @Lotus Make sure that the modprobe command (as root or with sudo) has been run before you start your Chrome browser. Check that the module is loaded correctly with lsmod | grep v4l2loopback.
    – cguenther
    Commented Oct 20, 2016 at 8:08
13

Use v4l2loopback with mplayer.

  1. Download it,

  2. compile it (make and su -c 'make install'),

  3. load the module with su -c 'modprobe v4l2loopback',

  4. then change one line in the file examples/yuv4mpeg_to_v4l2.c of the v4l2loopback source folder from

    v.fmt.pix.pixelformat = V4L2_PIX_FMT_YUV420;
    

to

    v.fmt.pix.pixelformat = V4L2_PIX_FMT_YVU420;
  5. and run make in this folder.

  6. Then run it from the examples directory like this:

    mkfifo /tmp/pipe  # only needed once, as long as you do not delete the file /tmp/pipe
    ./yuv4mpeg_to_v4l2 < /tmp/pipe &
    mplayer movie.mp4 -vf scale=480:360 -vo yuv4mpeg:file=/tmp/pipe
    

where you replace movie.mp4 with the name of your video file, and /dev/video0 with your loopback device.

MPlayer is able to play any web stream and all kinds of video files, even from stdin! I just tested it with a file from http://www.tagesschau.de, which is a German news site.

TS=$(wget "http://www.tagesschau.de/multimedia/video/" -q -O - | grep --regexp='http.*\.webm"' | sed -e 's%.*href="%%' -e 's%\.webm".*%\.webm%')
./yuv4mpeg_to_v4l2 < /tmp/pipe &
mplayer $TS -vf scale=480:360 -vo yuv4mpeg:file=/tmp/pipe

Instead of the $TS you could put a - (which stands for stdin), and in front of mplayer put your ffmpeg command with its output redirected to stdout (when the output file is -, ffmpeg also needs an explicit container format via -f). So something like:

./yuv4mpeg_to_v4l2 < /tmp/pipe &
ffmpeg -someOptions ... -f nut - | mplayer - -vf scale=480:360 -vo yuv4mpeg:file=/tmp/pipe

I did not test the last one, because you did not say what your ffmpeg command looks like.
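
For example, a screen capture piped straight into mplayer might look like this (an untested sketch; the x11grab geometry, the nut container and the -demuxer lavf hint are illustrative assumptions):

./yuv4mpeg_to_v4l2 < /tmp/pipe &
ffmpeg -f x11grab -framerate 15 -video_size 1280x720 -i :0.0+0,0 -f nut - | mplayer - -demuxer lavf -vf scale=480:360 -vo yuv4mpeg:file=/tmp/pipe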

1
  • Please help: ./yuv4mpeg_to_v4l2 < /dev/video0 & leads to ./yuv4mpeg_to_v4l2: : missing YUV4MPEG2 header. How do I replace /tmp/pipe with /dev/video0?
    – user123456
    Commented Oct 17, 2016 at 1:41
4

What distro are you using? I've had success with WebCamStudio under Arch combined with the Livestream web-based "studio." It's been a little while since I've used it, though.

http://www.ws4gl.org/

What are you trying to do exactly? ffmpeg compiled with x11grab can record the desktop. I've had limited success pushing that to Ustream, but again it's been a while and I think what I was doing won't work anymore.
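
For reference, pushing an x11grab capture to a streaming service usually boils down to an RTMP output, roughly like the sketch below, where the rtmp:// URL and STREAM_KEY are placeholders for whatever ingest endpoint your service documents:

# replace the rtmp:// URL and STREAM_KEY with your service's ingest endpoint
ffmpeg -f x11grab -framerate 30 -video_size 1280x720 -i :0.0+0,0 -c:v libx264 -preset veryfast -pix_fmt yuv420p -g 60 -f flv rtmp://live.example.com/app/STREAM_KEY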

If you just want to stream a file rather than your desktop (I'm thinking when you say, "A window," you mean, "VLC"), I can point you in the right direction to get that working with Livestream (maybe Ustream). I'm clumsily figuring out how to do this through experimentation. It's not fantastic but it works with Livestream.

Justin.tv has scripts that can stream from VLC to their service, as well.

http://apiwiki.justin.tv/mediawiki/index.php/Linux_Broadcasting_API

3
  • Oh wow, this is very interesting. I'm trying to stream live gameplay of some games. I already know how to capture to a video file with ffmpeg; I wonder if I can open that same file in VLC and somehow stream it as it's being written. Thanks for the info.
    – bkconrad
    Commented Apr 13, 2012 at 15:28
  • ws4gl.org website seems very outdated, and all links point to the Wayback Machine. I suppose the latest version is available at sourceforge.net/projects/webcamstudio Commented Feb 12, 2016 at 15:18
  • "WEBCAMSTUDIO IS NO MORE MAINTAINED" :/
    – Raphael
    Commented Sep 5, 2016 at 13:00
4

Without using ffmpeg, this is what worked for me (Ubuntu 20.04):

  1. Install OBS : https://obsproject.com/download
  2. Install the v4l2loopback module: https://github.com/umlaeute/v4l2loopback#run
  3. Load the module: sudo modprobe v4l2loopback devices=1 video_nr=10 card_label="OBS Cam" exclusive_caps=1 (video_nr sets the device number, so it becomes /dev/video10 in this example)
  4. Install obs-v4l2sink: (deb package) https://github.com/CatxFish/obs-v4l2sink/releases
  5. Install libobs-dev (not sure if needed)
  6. Link the library to the correct directory: ln /usr/lib/obs-plugins/v4l2sink.so /usr/lib/x86_64-linux-gnu/obs-plugins/
  7. Then follow: https://github.com/CatxFish/obs-v4l2sink/

NOTE: remember to use the device you specified, like: /dev/video10
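
To double-check which device the loopback module created before pointing OBS at it, v4l2-ctl from the v4l-utils package (install it if missing) lists all video devices:

v4l2-ctl --list-devices   # the "OBS Cam" entry should show /dev/video10
ls -l /dev/video10        # the device node should exist once the module is loaded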

3

First, appear.in probably does what you want without any hassle (I'm not affiliated): http://appear.in/

Second, you can stream to Twitch or other services using OBS, which recently added Linux support(!): https://obsproject.com/

OBS also solves the much harder problem of muxing system sound and audio input while screen capturing on Ubuntu (not solved by anything in the universe repo that I've found so far).

I don't have any awesome unix-y solutions. But those worked for me in the real world.

1

Another way to simulate a real-world live camera is to use udp://. For example:

# (make sure you use the correct screen number, in my case it was :1, not :0)
ffmpeg -re -f x11grab -r 15 -s 1280x720 -i :0.0+0,0 -map 0:v -c:v libx264 -f mpegts udp://localhost:50000

The video is received by:

ffmpeg -i udp://localhost:50000 -f mpegts video.ts
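
If you only want to preview the stream instead of writing it to a file, ffplay (shipped with ffmpeg) can open the same UDP URL, assuming it is installed:

ffplay udp://localhost:50000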

Furthermore, if your purpose is simply to simulate a live camera (e.g. for testing a computer vision pipeline), you could use the native-frame-rate -re option, the -stream_loop -1 looping option, and a static file:

VIDEO=./static-video.mp4
ffmpeg -re -stream_loop -1 -i $VIDEO -map 0:v -f mpegts udp://localhost:50000

Official ffmpeg info on the -re option:

-re (input) Read input at native frame rate. Mainly used to simulate a grab device, or live input stream (e.g. when reading from a file). Should not be used with actual grab devices or live input streams (where it can cause packet loss). By default ffmpeg attempts to read the input(s) as fast as possible. This option will slow down the reading of the input(s) to the native frame rate of the input(s). It is useful for real-time output (e.g. live streaming).
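
And if the goal is still a fake webcam, the received stream can in turn be pushed into a v4l2loopback device as in the accepted answer; a sketch, assuming the module is loaded and /dev/videoN is the loopback node:

ffmpeg -i udp://localhost:50000 -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/videoN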
