Improved GStreamer support for Blackmagic Decklink cards

Over the last few weeks I started working on improving the GStreamer support for the Blackmagic Decklink cards. Decklink is Blackmagic’s product line of HDMI, SDI and other capture and playback cards, with drivers available for Linux, Windows and Mac OS X. On all platforms the same API is provided to access the devices.

GStreamer has had support for Decklink since 2011; the initial plugin was written by David Schleef and seemed to work well enough for quite some time. But over the years people used GStreamer in more dynamic and complex pipelines, and we recently got a lot of reports about the plugin not working properly in such situations. Mostly there were problems with time and synchronization handling, but also other minor issues caused by the elements not using the source/sink base classes (GstBaseSrc and GstBaseSink). The latter was not easily possible because a single device provides audio and video input/output at the same time, so the elements would intuitively need two source or sink pads. As a side effect this also made it very difficult to use the decklink elements in a standard playback pipeline with playbin.

The rewritten plugin now has separate elements for the audio and video parts of a device, giving 4 different elements in the end (decklinkaudiosrc, decklinkvideosrc, decklinkaudiosink and decklinkvideosink). These are now based on the corresponding base classes, work inside playbin (including pausing and seeking) and also handle synchronization properly. The clock of the hardware is now exposed to the pipeline as a GstClock, and even if the pipeline chooses a different clock, the elements will slave their internal clock to the master clock of the pipeline and make sure that synchronization is kept even if the pipeline clock and the Decklink hardware clock drift apart. Thanks to GStreamer’s extensive synchronization model and all the functionality that already exists in the GstClock class, this was much easier than it sounds.

The only downside of this approach is that it is now necessary to always have the video and audio element of one device inside the same pipeline, and also to keep their states managed by the pipeline, or at least make sure that they both go to PLAYING state together. Also, the audio elements won’t work without a corresponding video element, which is a limitation of the hardware. Video-only use works without problems though.
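As a sketch, a capture pipeline respecting this constraint could look like the following; `device-number=0` and the display sinks are assumptions, adjust them for your setup:

```shell
# Hypothetical sketch: the audio and the video source of device 0 live in
# the same pipeline, so the pipeline manages both states together, as the
# rewritten elements require. device-number=0 is an assumption.
PIPELINE='decklinkvideosrc device-number=0 ! queue ! autovideosink
          decklinkaudiosrc device-number=0 ! queue ! autoaudiosink'
# On a machine with a Decklink card installed, run:
#   gst-launch-1.0 $PIPELINE
echo "$PIPELINE"
```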

All the code is available from gst-plugins-bad GIT master.

And now?

So what’s next? Testing, a lot of testing! Especially with different hardware, on different platforms and in all kinds of situations. If you have one of these cards, please test the latest code that is available in gst-plugins-bad GIT master. Please report any bugs you notice in Bugzilla, ideally with information about your hardware and platform and including a GStreamer debug log with GST_DEBUG=decklink*:6.
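For reference, a debug log at the suggested level can be collected like this; the pipeline itself is a placeholder you would replace with the one that misbehaves for you:

```shell
# Sketch: collect a Decklink debug log for a Bugzilla report.
# GST_DEBUG restricts logging to the decklink elements at level 6 (LOG),
# and GST_DEBUG_FILE redirects the output into a file you can attach.
export GST_DEBUG='decklink*:6'
export GST_DEBUG_FILE=decklink.log
# Replace <your pipeline> with the pipeline that shows the problem:
#   gst-launch-1.0 <your pipeline>
echo "GST_DEBUG=$GST_DEBUG GST_DEBUG_FILE=$GST_DEBUG_FILE"
```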

Apart from that, many features are still missing: for example, all the different configurations that are available via the Decklink API should be exposed on the elements, as should support for more audio channels and more video modes. If you need any of these features and want to help, feel free to write patches and provide them in Bugzilla.

Any help would be highly appreciated!

20 thoughts on “Improved GStreamer support for Blackmagic Decklink cards”

    1. No, all the changes I did are for GStreamer 1.x. You could probably backport them, but why? 0.10 has not been maintained for almost 3 years now and there’s just no point in using it anymore.

  1. This is great news. I had a great deal of problems with the Decklink cards back in August last year (bug 725871). Even with your help, I never managed to get a pipeline running properly – perhaps now I will. That project has gone, but I am still eager to use the Decklink cards with GStreamer and have both a Decklink SDI and a Decklink Optical Fibre card to test with.
    I will set up a system (Ubuntu 14.04 LTS 64-bit) tomorrow and report my findings. In my applications I need SD keying, so I have to use the 9.8 drivers; is there any impact on the plugin with the old driver?

  2. Found a strange problem with decklinkvideosrc: when I encode and stream decklinkvideosrc video to a remote PC over the internet, the received video contains a lot of black/empty frames. If I use a very good internet connection to stream to the remote PC, these blank frames do not appear. The problem shows up when I stream over a poor 3G connection. The previous driver did not show this problem. Can you help fix it?

  3. Thanks for all your work on the plugin! It simplified my video mixer’s (Stir) pipeline a lot, and seems more stable with my two Mini Recorders.

  4. I have a question. If I have two video sources, either two independent files or live streams, of the same event (one a wide shot, the other a tight shot), and both are synced at encoding (started at the same time), can GStreamer sync them for playback on the same computer, with two Blackmagic display cards’ HDMI or SDI ports? Can you send each independent signal to a separate Blackmagic card to be displayed in sync?

    I can’t find any examples of code that would show how to tell GStreamer that file 1 should go to Blackmagic card 0 and file 2 should go to Blackmagic card 1.

    Thank you
    Raymond
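A sketch of how the question above could be expressed, assuming the sinks’ `device-number` property selects the card; the file names and decode chains are placeholders:

```shell
# Hypothetical sketch: two files decoded in one pipeline (so both branches
# share a single pipeline clock), each routed to a different card via
# device-number. wide.mov / tight.mov are placeholder file names, and an
# explicit mode=... on each sink may also be needed depending on the card.
PIPELINE='filesrc location=wide.mov  ! decodebin ! videoconvert ! decklinkvideosink device-number=0
          filesrc location=tight.mov ! decodebin ! videoconvert ! decklinkvideosink device-number=1'
# On a machine with two Decklink cards, run:
#   gst-launch-1.0 $PIPELINE
echo "$PIPELINE"
```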

  5. I am trying to play an RTP stream out on a Decklink.
    I compiled GStreamer from git master.

    gst-launch-1.0 -vvv udpsrc port=50110 caps="application/x-rtp, media=video, payload=100, clock-rate=90000, encoding-name=VP8-DRAFT-IETF-01" ! rtpjitterbuffer do-lost=true latency=300 mode=0 ! \
    rtpvp8depay ! \
    queue ! \
    vp8dec ! \
    videoparse format=5 width=720 height=486 interlaced=true top-field-first=true pixel-aspect-ratio=10/11 framerate=30/1 ! \
    videoconvert ! \
    decklinkvideosink mode=1

    It works, but the image is very distorted.
    Can you tell me if something is wrong in the command line?

  6. It works now:
    gst-launch-1.0 -vvv udpsrc port=50110 caps="application/x-rtp, media=video, payload=100, clock-rate=90000, encoding-name=VP8-DRAFT-IETF-01" ! \
    rtpjitterbuffer do-lost=true latency=300 mode=0 ! \
    rtpvp8depay ! \
    queue ! \
    vp8dec ! \
    videoconvert ! video/x-raw,format=UYVY ! \
    videoscale ! video/x-raw,width=720,height=576 ! \
    videorate ! video/x-raw,framerate=25/1 ! \
    decklinkvideosink mode=5

    But only progressive modes work. How do I force interlaced output?

  7. Would it be possible to have the videosink support other pixel formats, specifically bmdFormat8BitARGB? I have fiddled around with some of the code in gstdecklinkvideosink and gstdecklink so caps can accept video/x-raw,format=ARGB and create an ARGB frame but there must be more to do. Basically I want to send ARGB data and have the cards (which support it) output the separate alpha channel on the ‘B’ channel.

  8. I am using the decklinkvideosink with interlaced video (DV in a QT container). The only way I can link uridecodebin with the Decklink is by de-interlacing: deinterlace, autovideoconvert, videorate, interlace, sink. This works fine but it throws the audio out. The audio sounds like it is missing half its samples (correct speed and pitch). This seems sort of correct, as I’ve doubled the frame rate when de-interlacing, but how do I keep the audio in step with the video?
