15

I piped a stream from one instance of FFmpeg to another, but because compression was used on the intermediate stream the final result was ugly. I need a lossless pipe to prevent this, and I want it to contain both audio and video.

I suspect that there is more than one answer to this problem, so bonus points go to anyone who provides an exhaustive list of solutions (compatible containers and codecs that can be piped). Bonus points also go to anyone who accounts for other data, like subtitles.

EDIT: I'm looking for suitable codec/container combinations. I don't know why people were having difficulty figuring that out, since I said I already used a pipe and now I need it to be lossless.

I don't know how to explain this without sounding conceited, but this is an FAQ site. Asking questions that require extremely specific answers will not help the millions of users who reach this site by entering their own problems into search engines. My question was designed to help anyone else who needs to losslessly pipe data between FFmpeg instances without distracting everyone with a wall of narrative and code explaining what I was doing, why it didn't work, and why this is the only option.

4
  • 3
    What have you tried? Please show your ffmpeg command and the complete ffmpeg console output. Why are you piping from ffmpeg to ffmpeg? Please provide more details about what you're doing and what you want to achieve.
    – llogan
    Commented Dec 6, 2013 at 21:22
  • 1
    If the pipe is lossless and contains audio and video, it doesn't matter if it's going to and from /dev/null. Context would only complicate this.
    – Wutaz
    Commented Dec 6, 2013 at 21:25
  • 1
    I disagree. If I at least know your ffmpeg version and how it was configured, plus information about your inputs (the console output will suffice) and your output requirements, I can provide a more accurate answer.
    – llogan
    Commented Dec 6, 2013 at 21:30
  • FFmpeg configuration I can understand, but as much as FFmpeg changes, I doubt its support for uncompressed streams will ever be a problem. If you know of factors that can change the ideal format to use, then please list them all and say when each is appropriate. Giving only one narrow answer will not answer the question properly.
    – Wutaz
    Commented Dec 6, 2013 at 21:39

3 Answers

18

How to losslessly pipe video and audio from ffmpeg to ffmpeg

Requirements by the question asker:

  • losslessly pipe from one instance of ffmpeg to another
  • it doesn't matter if it's going to and from /dev/null

Example:

ffmpeg -s 1280x720 -f rawvideo -i /dev/zero -ar 48000 -ac 2 -f s16le -i \
/dev/zero -c copy -f nut pipe:1 | ffmpeg -y -i pipe:0 -c copy -f nut /dev/null

I see no reason why anyone would do this. Also, there are very few reasons to pipe from ffmpeg to ffmpeg when you can most likely just use one ffmpeg process to do whatever you want.

What the options do:

  • -s 1280x720 -f rawvideo – Options to describe the input since /dev/zero is not a typical input format and therefore these additional options are required.

  • -i /dev/zero – The video input. It is being used in this example to generate a video stream out of "nothing". This was used in the example because the question asker refused to provide any information about the inputs being used.

  • -ar 48000 -ac 2 -f s16le – Options to describe the input since /dev/zero is not a typical audio format.

  • -i /dev/zero – The audio input. It is being used in this example to generate an audio stream out of "nothing".

  • -c copy – Stream copy, or re-mux, the inputs to the output. No re-encoding is being performed so the process is lossless. It is unknown if stream copying is acceptable to the question asker or not. Maybe re-encoding is desired instead?

  • -f nut – You need to tell ffmpeg what format to use for the pipe. Nut is a container format. See ffmpeg -formats for a complete list. Another flexible format is -f matroska (a Matroska variant of the example is sketched after this list), but it is impossible to suggest an appropriate or specific output container format to use without more info from the question asker.

  • pipe:1 – Use the pipe protocol to output to stdout. Alternatively, the number can be omitted (just pipe:) and by default the stdout file descriptor will be used for writing and stdin will be used for reading.
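
As a variation on the -f nut choice above, the same null-to-null pipe should also work with Matroska as the intermediate container. This is only a sketch using the same synthetic /dev/zero inputs as the example; it has not been verified against the asker's real streams:

# identical to the example above, but with Matroska as the pipe container
ffmpeg -s 1280x720 -f rawvideo -i /dev/zero -ar 48000 -ac 2 -f s16le -i \
/dev/zero -c copy -f matroska pipe:1 | ffmpeg -y -i pipe:0 -c copy -f matroska /dev/null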

2
  • 6
    There definitely are reasons why you would want to do this. One use case is piping a video stream over an SSH tunnel for secure remote viewing: ssh user@remotehost "ffmpeg -i $URL -c copy -f nut pipe:1" | ffplay -i pipe:0
    – Yuval A
    Commented Dec 25, 2016 at 12:49
  • 1
    @YuvalA I was referring to the OP's firm requirement of using /dev/null as the final output.
    – llogan
    Commented Dec 25, 2016 at 18:55
11

The way I learned to do this (from parts of previous answers) is to use the rawvideo codec for the video, the pcm_s16le audio codec, and FFmpeg's nut wrapper to encode the stream. nut is not supported by major programs outside of FFmpeg, but it's the only container I currently know of that can support the uncompressed formats needed to efficiently pipe data between processes.

The arguments for this encoding might look like this:

... -c:v rawvideo -c:a pcm_s16le -f nut - ...

Some audio is stored with 24-bit or larger samples, and for these you should instead use pcm_s24le or another suitable format. The full list of uncompressed audio codecs will be listed by running ffmpeg -codecs (you will have to search the list for them). If you don't know what the sample size of your audio is, using pcm_s16le should cause no noticeable loss in quality.
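
If you are not sure of the sample size, ffprobe (bundled with FFmpeg) can report it. A minimal sketch, with input.mkv standing in for your file:

# print the sample format and bit depth of the first audio stream
ffprobe -v error -select_streams a:0 -show_entries stream=sample_fmt,bits_per_raw_sample -of default=noprint_wrappers=1 input.mkv

A reported sample_fmt of s32 (or bits_per_raw_sample of 24) suggests using pcm_s24le or pcm_s32le instead.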

On the receiving end of the pipe, set the input to standard input and ffmpeg will detect the format and decode the stream.

... ffmpeg -i - ...

The ellipses (...) in this answer are not part of the code; they mark where the rest of your command goes. The lone hyphens (-) tell FFmpeg to use either standard input or standard output, depending on where they appear.
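
Putting the fragments together, a full pipe might look like the sketch below. The file names and the x264/AAC settings on the receiving side are placeholders of my own, not part of the original answer:

# lossless intermediate stream in nut, re-encoded to H.264/AAC on the other end
ffmpeg -i input.mp4 -c:v rawvideo -c:a pcm_s16le -f nut - | \
ffmpeg -y -i - -c:v libx264 -crf 18 -c:a aac output.mp4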

UPDATE:

I tried a simple experiment to improve this, and it seems that a better container is AVI, because other programs will understand it (at least VLC will).

... -c:v rawvideo -c:a pcm_s16le -f avi - ...

This will work exactly like the old version, with the added bonus of compatibility.
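
To check that compatibility without writing a file, you can point the pipe at a player. ffplay ships with FFmpeg; input.mp4 is again a placeholder, and some builds of VLC will also read the same stream from standard input with vlc -:

# play the AVI-wrapped lossless stream straight from the pipe
ffmpeg -i input.mp4 -c:v rawvideo -c:a pcm_s16le -f avi - | ffplay -i -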

In hindsight, I regret posting a question that, despite my claim that questions should be helpful to everyone, wasn't actually helpful in many situations. This update makes the answer more useful.

2
  • 2
    have you tried mkv? It supports subtitles, and I think every codec that ffmpeg supports can be muxed into mkv. mkv, unlike mp4, plays without needing to see the end of the file, so it might work in a pipe. Commented Jan 15, 2015 at 8:06
  • Also, there are named codecs for various raw video formats, so you don't run the risk of mismatching your pixel formats on output/input. e.g. -c:v yuv4 is a codec that reads/writes raw video with the yuv420p pix_fmt. With rawvideo, if you add/remove a filter, it might leave the output in a different pixel format, if the filter didn't support the pixel format of the video. Commented Jan 15, 2015 at 8:41
1

The one problem with the other answer is that it is pcm_s16le, not s16le. Also, it includes a lot of redundant parameters.

I would use pcm instead of flac in the pipe, because it takes far less time to process (PCM is raw audio, FLAC takes lots of time to encode).
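
That said, FLAC is also lossless, so if encode time is not a concern it can be swapped in for PCM. A sketch of my own, with placeholder file names and a Matroska pipe (which accepts FLAC):

# rawvideo plus FLAC audio through the pipe, stream-copied on the other end
ffmpeg -i input.mkv -c:v rawvideo -c:a flac -f matroska - | ffmpeg -y -i - -c copy output.mkv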

Anyway, here's how I would do it.

ffmpeg -i <input video file/stream> -vcodec rawvideo -acodec pcm_s16le pipe:1 | ffmpeg -f rawvideo -i - -vcodec <video output codec> -acodec <audio output codec> -vb <video bitrate if applicable> -ab <audio bitrate if applicable> <final-output-filename>

This worked for me when I last tried, but my goal was to pipe ffmpeg into ffplay, which is a slightly different process.

example:

This pipes a video from ffmpeg to another instance as raw video output and 16-bit little-endian PCM (both lossless unless you have 24-bit PCM, in which case substitute pcm_s24le). It then converts them to H.264 in the second instance, with the Fraunhofer AAC library from the Android project (libfaac is more commonly included in ffmpeg builds; you can substitute it if needed).

ffmpeg -i montypythonsflyingcircus-s1e1.avi -vcodec rawvideo -acodec pcm_s16le pipe:1 | ffmpeg -i - -vcodec libx264 -acodec libfdk_aac -vb 1200k -ab 96k mpfc-s1e01-output.mkv

If this doesn't pipe the subtitles, you can always rip them to SRTs and then mux them back in later, or add them to the pipes above easily.
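
A sketch of that round trip, with hypothetical file names; it assumes the source carries a text-based subtitle stream:

# extract the first subtitle stream to an SRT file
ffmpeg -i input-with-subs.mkv -map 0:s:0 subs.srt
# mux the subtitles back into the converted output
ffmpeg -i mpfc-s1e01-output.mkv -i subs.srt -map 0 -map 1 -c copy final-with-subs.mkv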

14
  • Thank you. I was hoping to find an alternative to FLAC, because I know it requires more time.
    – Wutaz
    Commented Dec 6, 2013 at 23:28
  • 1
    There is nothing wrong with s16le for my example. You're confusing the s16le format with the pcm_s16le codec. See ffmpeg -formats and ffmpeg -codecs (or ffmpeg -encoders and ffmpeg -decoders) for all available formats and codecs. ffmpeg must be told which format to use for /dev/zero. What parameters do you consider redundant?
    – llogan
    Commented Dec 7, 2013 at 0:20
  • 1
    Also, you must explicitly define the format (with -f option) ffmpeg should use for the pipe output.
    – llogan
    Commented Dec 7, 2013 at 1:04
  • This does not work. If you don't specify an output format, you will only get Unable to find a suitable output format for 'pipe:1'. Have you tried to run the command? @Wutaz, did this really work for you? I would be very surprised if it did.
    – slhck
    Commented Dec 7, 2013 at 9:01
  • 2
    @Wutaz Let me just add my two cents here, based on my experience as a member for over three years, and as a community moderator: Stack Exchange sites are not FAQ sites. They are Q&A sites about specific and actual problems you are facing. People should be asking about their real problem and not about what they think the solution is. Otherwise this just results in confusion and, generally, questions and answers that aren't useful to anybody, really. I honestly don't know what to make of this question.
    – slhck
    Commented Dec 8, 2013 at 18:48
