
I'm trying to deinterlace a frame using ffmpeg (latest release). Related to this question, I can get the filter I want with this statement:

AVFilter *filter = avfilter_get_by_name("yadif");

After that, I open the filter context as:

AVFilterContext *filter_ctx;
avfilter_open(&filter_ctx, filter, NULL);

My first question is about this function. Visual Studio warns me that avfilter_open is deprecated. What is the alternative?
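For what it's worth, a minimal sketch of the non-deprecated route (an assumption based on the libavfilter API; the instance name "deint" is arbitrary) is to allocate the context inside a graph with avfilter_graph_alloc_filter():

```c
#include <libavfilter/avfilter.h>

/* Sketch: instead of the deprecated avfilter_open(), allocate the
 * filter context inside a graph. The graph then owns the context
 * and frees it when avfilter_graph_free() is called. */
static AVFilterContext *alloc_yadif_ctx(AVFilterGraph *graph)
{
    const AVFilter *yadif = avfilter_get_by_name("yadif");
    if (!yadif)
        return NULL;
    /* "deint" is just an instance name chosen for this example */
    return avfilter_graph_alloc_filter(graph, yadif, "deint");
}
```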

After that, I do:

avfilter_init_str(filter_ctx, "yadif=1:-1");

This always fails. I've tried "1:-1" instead of "yadif=1:-1", but that fails too. What parameter should I use?

EDIT: With a value of "1" or "2", for example, it works. Debugging it, I found that with one of these values the function sets mode=1 or mode=2. The explanation of those values is in this link.
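If it helps, yadif's positional options correspond to named ones (mode, parity, deint in the filter's documentation), and avfilter_init_str() expects only the option string, not the filter name, which would explain why "yadif=1:-1" is rejected. A hedged sketch, assuming filter_ctx was already allocated for the yadif filter:

```c
#include <libavfilter/avfilter.h>

/* Sketch: "1:-1" in positional form should be equivalent to the named
 * options below (mode=send_field is 1, parity=auto is -1). Prefixing
 * the string with "yadif=" makes the parser look for an option called
 * "yadif", which does not exist. */
static int init_yadif(AVFilterContext *filter_ctx)
{
    return avfilter_init_str(filter_ctx, "mode=send_field:parity=auto");
}
```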

Then, I have an AVFrame *frame that is the frame I want to deinterlace. Once the last statement works, I'll have the filter and its context initialized. How do I apply this filter to my frame?

Thanks for your help.

  • As for your first question, apparently the function you should use is avfilter_graph_alloc_filter() (avfilter_open() should be just a wrapper for it). Commented Jul 1, 2015 at 13:54
  • Can anybody help me? I'm stuck on this issue. I tried some code examples, such as filtering_video.c, but they didn't help.
    – f3r83
    Commented Oct 15, 2015 at 11:29

1 Answer


I understand your question is over a year old now, but recently I had to work with interlaced DVB-TS streams, so I might be able to help anybody else coming across this subject.

These snippets are from a finished player I've written.

Initialise the filter graph:

void VideoManager::init_filter_graph(AVFrame *frame) {
    if (filter_initialised) return;

    int result;

    AVFilter *buffer_src   = avfilter_get_by_name("buffer");
    AVFilter *buffer_sink  = avfilter_get_by_name("buffersink");
    AVFilterInOut *inputs  = avfilter_inout_alloc();
    AVFilterInOut *outputs = avfilter_inout_alloc();

    AVCodecContext *ctx = ffmpeg.vid_stream.context;
    char args[512];

    int frame_fix = 0; // fix bad width on some streams
    if (frame->width < 704) frame_fix = 2;
    else if (frame->width > 704) frame_fix = -16;

    snprintf(args, sizeof(args),
         "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
         frame->width + frame_fix,
         frame->height,
         frame->format,// ctx->pix_fmt,
         ctx->time_base.num,
         ctx->time_base.den,
         ctx->sample_aspect_ratio.num,
         ctx->sample_aspect_ratio.den);

    const char *description = "yadif=1:-1:0";

    LOGD("Filter: %s - Settings: %s", description, args);

    filter_graph = avfilter_graph_alloc();
    result = avfilter_graph_create_filter(&filter_src_ctx, buffer_src, "in", args, NULL, filter_graph);
    if (result < 0) {
        LOGI("Filter graph - Unable to create buffer source");
        return;
    }

    AVBufferSinkParams *params = av_buffersink_params_alloc();
    enum AVPixelFormat pix_fmts[] = { AV_PIX_FMT_GRAY8, AV_PIX_FMT_NONE };

    params->pixel_fmts = pix_fmts;
    result = avfilter_graph_create_filter(&filter_sink_ctx, buffer_sink, "out", NULL, params, filter_graph);
    av_free(params);
    if (result < 0) {
        LOGI("Filter graph - Unable to create buffer sink");
        return;
    }

    inputs->name        = av_strdup("out");
    inputs->filter_ctx  = filter_sink_ctx;
    inputs->pad_idx     = 0;
    inputs->next        = NULL;

    outputs->name       = av_strdup("in");
    outputs->filter_ctx = filter_src_ctx;
    outputs->pad_idx    = 0;
    outputs->next       = NULL;

    result = avfilter_graph_parse_ptr(filter_graph, description, &inputs, &outputs, NULL);
    if (result < 0) LOGI("avfilter_graph_parse_ptr ERROR");

    result = avfilter_graph_config(filter_graph, NULL);
    if (result < 0) LOGI("avfilter_graph_config ERROR");

    filter_initialised = true;
}

When processing the video packets from the stream, check whether the decoded frame is interlaced and, if so, send it off to the filter. The filter will then return the de-interlaced frames back to you.

void FFMPEG::process_video_packet(AVPacket *pkt) {
    int got;
    AVFrame *frame = vid_stream.frame;
    avcodec_decode_video2(vid_stream.context, frame, &got, pkt);

    if (got) {
        if (!frame->interlaced_frame) {     // not interlaced
            Video.add_av_frame(frame, 0);
        } else {
            if (!Video.filter_initialised) {
                Video.init_filter_graph(frame);
            }

            av_buffersrc_add_frame_flags(Video.filter_src_ctx, frame, AV_BUFFERSRC_FLAG_KEEP_REF);
            int c = 0;

            while (true) {
                AVFrame *filter_frame = ffmpeg.vid_stream.filter_frame;

                int result = av_buffersink_get_frame(Video.filter_sink_ctx, filter_frame);

                if (result == AVERROR(EAGAIN) || result == AVERROR_EOF) break; // AVERROR_EOF is already an error code; don't wrap it in AVERROR()
                if (result < 0) return;

                Video.add_av_frame(filter_frame, c++);
                av_frame_unref(filter_frame);
            }
        }
    }
}
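As an aside, avcodec_decode_video2() used above has since been deprecated. Under FFmpeg 3.1+ the decode step would look roughly like this (a sketch, not part of the original player; process_decoded_frame is a placeholder for the interlace check and buffersrc/buffersink calls above):

```c
#include <libavcodec/avcodec.h>

/* Placeholder for the interlace check + filtering logic shown above. */
static void process_decoded_frame(AVFrame *frame);

/* Sketch for FFmpeg 3.1+: the send/receive API replaces
 * avcodec_decode_video2(). One packet can yield several frames,
 * so receive is looped until the decoder asks for more input. */
static int decode_packet(AVCodecContext *ctx, AVPacket *pkt, AVFrame *frame)
{
    int ret = avcodec_send_packet(ctx, pkt);
    if (ret < 0)
        return ret;

    while (ret >= 0) {
        ret = avcodec_receive_frame(ctx, frame);
        if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
            return 0;           /* need more input, or fully drained */
        if (ret < 0)
            return ret;         /* real decoding error */
        process_decoded_frame(frame);
    }
    return 0;
}
```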

Hope this helps anyone because finding information about ffmpeg is tough going.

  • Thanks, your post helped me a lot
    – f3r83
    Commented Feb 15, 2017 at 9:14
  • Thanks for your code, it's been a great help. My code successfully runs up to the avfilter_graph_parse_ptr function. At this point it returns -22 and the log shows "No such filter: 'yadif'". I have registered the filters with avfilter_register_all(). I looked at the build script for FFMPEG and there is nothing showing --disable-filters. Did you have to alter anything in the build to get the filters working for yourself? Thanks
    – Ajaxharg
    Commented Jun 9, 2017 at 20:26
  • As it turned out I had to explicitly add --enable-filter=yadif to my build script.
    – Ajaxharg
    Commented Jun 10, 2017 at 7:46
