Over the last few weeks I have been working on improving the GStreamer support for the Blackmagic Decklink cards. Decklink is Blackmagic's product line of HDMI, SDI, etc. capture and playback cards, with drivers available for Linux, Windows and Mac OS X. The same API is provided on all platforms to access the devices.
GStreamer has had Decklink support since 2011; the initial plugin was written by David Schleef and seemed to work well enough for quite some time. But over the years people used GStreamer in more dynamic and complex pipelines, and recently we got a lot of reports about the plugin not working properly in such situations. Mostly there were problems with time and synchronization handling, but also other minor issues caused by the elements not using the source/sink base classes (GstBaseSrc and GstBaseSink). Using those was not easily possible because a single device provides audio and video input/output at the same time, so the elements would intuitively need two source or sink pads. As a side effect this also made it very difficult to use the Decklink elements in a standard playback pipeline with playbin.
The rewritten plugin now has separate elements for the audio and video parts of a device, giving 4 different elements in the end (decklinkaudiosrc, decklinkvideosrc, decklinkaudiosink and decklinkvideosink). These are now based on the corresponding base classes, work inside playbin (including pausing and seeking) and also handle synchronization properly. The clock of the hardware is now exposed to the pipeline as a GstClock, and even if the pipeline chooses a different clock, the elements will slave their internal clock to the master clock of the pipeline and make sure that synchronization is kept even if the pipeline clock and the Decklink hardware clock drift apart. Thanks to GStreamer's extensive synchronization model and all the functionality that already exists in the GstClock class, this was much easier than it sounds.
The only downside of this approach is that it is now necessary to always have the video and audio elements of one device inside the same pipeline, and also to keep their states managed by the pipeline, or at least to make sure that they both go to PLAYING state together. Also, the audio elements won't work without a corresponding video element, which is a limitation of the hardware. Video alone works without problems though.
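As a minimal sketch of what this looks like in practice, here is a hypothetical capture pipeline that uses both the audio and video element of the same device inside one pipeline, so both go to PLAYING together (the preview sinks and device number are illustrative assumptions, not from the post):

```shell
# Capture audio and video from the same Decklink device (device 0)
# inside one pipeline, as the plugin requires.
# autovideosink/autoaudiosink are placeholder preview sinks.
gst-launch-1.0 \
    decklinkvideosrc device-number=0 ! queue ! videoconvert ! autovideosink \
    decklinkaudiosrc device-number=0 ! queue ! audioconvert ! autoaudiosink
```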
All the code is available from gst-plugins-bad GIT master.
And now?
So what’s next? Testing, a lot of testing! Especially with different hardware, on different platforms and in all kinds of situations. If you have one of these cards, please test the latest code that is available in gst-plugins-bad GIT master. Please report any bugs you notice in Bugzilla, ideally with information about your hardware and platform and including a GStreamer debug log with GST_DEBUG=decklink*:6.
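For attaching the requested debug log to a bug report, one way to capture it (the pipeline itself is just a placeholder, and GST_DEBUG_FILE is the standard GStreamer way to redirect the log to a file) is:

```shell
# Run your pipeline with Decklink debug output at level 6 and
# write the log to a file you can attach to the bug report.
GST_DEBUG='decklink*:6' GST_DEBUG_FILE=decklink.log \
    gst-launch-1.0 decklinkvideosrc ! videoconvert ! autovideosink
```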
Apart from that, many features are still missing: for example, all the different configurations that are available via the API should be exposed on the elements, and support for more audio channels and more video modes is needed. If you need any of these features and want to help, feel free to write patches and provide them in Bugzilla.
Any help would be highly appreciated!
I just got two Intensity Pros and expect a decklink extreme soon. I have lots of time to test. I am unsure how to compile from git.
The first items here might help, or from the PiTiVi wiki.
Great work! Compiling now. Would this plugin work with GStreamer 0.10?
No, all the changes I did are for GStreamer 1.x. You could probably backport them, but why? 0.10 has been unmaintained for almost 3 years now and there's just no point in using it anymore.
This is great news. I had a great deal of problems with the Decklink cards back in August last year (bug 725871). Even with your help, I never managed to get a pipeline running properly – perhaps now I will. That project has gone but I am still eager to use the Decklink cards with Gstreamer and have both a Decklink SDI and Decklink Optical Fibre card to test with.
I will set up a system (Ubuntu 14.04 LTS 64) tomorrow and report my findings. In my applications I need SD keying, so I have to use the 9.8 drivers; is there any impact on the plugin with the old driver?
I don’t know, I only tested with the 10.3.x drivers. Please let me know of any problems with the older drivers 🙂
I found a strange problem with decklinkvideosrc: when I encode and stream decklinkvideosrc video to a remote PC over the internet, the received video contains a lot of black/empty frames. If I use a very good internet connection to stream to the remote PC, these blank frames do not appear; the problem only shows up when streaming over a poor 3G connection. The previous driver did not show this problem. Can you help fix it?
Do you have the same problem with other live sources, like v4l2src or “videotestsrc is-live=true”? But in any case, please file a bug with debug logs at http://bugzilla.gnome.org against GStreamer.
Thanks for all your work on the plugin! It simplified my video mixer's (Stir) pipeline a lot, and seems more stable with my two Mini Recorders.
I have a question. I have two video sources, either two independent files or live streams, of the same event: one is a wide shot and the other is a tight shot. They were synced at encoding (started at the same time). Can GStreamer sync them for playback on the same computer, sending each signal out through a separate Blackmagic card's HDMI or SDI port so both are displayed in sync?
I can't find any example code that shows how to tell GStreamer that file 1 should go to Blackmagic card 0 and file 2 should go to Blackmagic card 1.
Thank you
Raymond
Yes, that's possible. And you can select the device to use with the device-number property on the sink.
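As a rough sketch of that answer (file names are placeholders; device-number selects the card, and putting both branches in one pipeline makes them share a single clock for synchronized output):

```shell
# Play two files through two Decklink cards from one pipeline,
# so both branches run against the same pipeline clock.
gst-launch-1.0 \
    filesrc location=wide.mp4  ! decodebin ! videoconvert ! decklinkvideosink device-number=0 \
    filesrc location=tight.mp4 ! decodebin ! videoconvert ! decklinkvideosink device-number=1
```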
I am trying to play an RTP stream out through a Decklink card. I compiled GStreamer from git master.
gst-launch-1.0 -vvv udpsrc port=50110 caps="application/x-rtp, media=video, payload=100, clock-rate=90000, encoding-name=VP8-DRAFT-IETF-01" ! rtpjitterbuffer do-lost=true latency=300 mode=0 ! \
rtpvp8depay ! \
queue ! \
vp8dec ! \
videoparse format=5 width=720 height=486 interlaced=true top-field-first=true pixel-aspect-ratio=10/11 framerate=30/1 ! \
videoconvert ! \
decklinkvideosink mode=1
It works, but the image is very distorted.
Can you tell me if something is wrong in the command line?
Ask on the mailing list: http://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
One problem here is that you use videoparse. Don’t.
It works now:
gst-launch-1.0 -vvv udpsrc port=50110 caps="application/x-rtp, media=video, payload=100, clock-rate=90000, encoding-name=VP8-DRAFT-IETF-01" ! \
rtpjitterbuffer do-lost=true latency=300 mode=0 ! \
rtpvp8depay ! \
queue ! \
vp8dec ! \
videoconvert ! video/x-raw,format=UYVY ! \
videoscale ! video/x-raw,width=720,height=576 ! \
videorate ! video/x-raw,framerate=25/1 ! \
decklinkvideosink mode=5
But only in progressive modes. How do I force interlaced output?
Use the interlace element
Adding
interlace field-pattern=2:2
works fine.
Thank you!
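Putting the exchange above together, a self-contained sketch of the interlaced output path might look like the following (videotestsrc stands in for the commenter's RTP source; mode=5 is the PAL 576i mode used in the thread):

```shell
# Convert a progressive source to interlaced PAL for the Decklink sink.
# With field-pattern=2:2 the interlace element builds one interlaced
# frame (two fields) from each progressive frame, keeping 25 fps.
gst-launch-1.0 videotestsrc ! \
    videoconvert ! video/x-raw,format=UYVY ! \
    videoscale ! video/x-raw,width=720,height=576 ! \
    videorate ! video/x-raw,framerate=25/1 ! \
    interlace field-pattern=2:2 ! \
    decklinkvideosink mode=5
```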
Would it be possible to have the videosink support other pixel formats, specifically bmdFormat8BitARGB? I have fiddled around with some of the code in gstdecklinkvideosink and gstdecklink so caps can accept video/x-raw,format=ARGB and create an ARGB frame but there must be more to do. Basically I want to send ARGB data and have the cards (which support it) output the separate alpha channel on the ‘B’ channel.
Yes, that would be possible, but it is currently not implemented. For the source, RGB support is implemented already; maybe this helps you find the problem in your changes? https://cgit.freedesktop.org/gstreamer/gst-plugins-bad/commit/sys/decklink?id=3ea431c5b5425806b7139025b6f233999563594d
Once you have a working patch, please provide it for integration 🙂
I am using the decklinkvideosink with interlaced video (DV in a QT container). The only way I can link uridecodebin with the Decklink is by de-interlacing, autovideoconvert, videorate, interlace, sink. This works fine, but it throws the audio off. The audio sounds like it is missing half its samples (correct speed and pitch). This seems sort of expected, as I've doubled the frame rate when de-interlacing, but how do I keep the audio in step with the video?
Please ask such questions on the mailing list: https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
Can you also provide the full pipeline that you’re using there, or otherwise a testcase to reproduce it?
I'm really a newbie with GStreamer, but with autovideosink this works:
gst-launch-1.0 playbin uri=file:///home/v1p3r/Videos/editoria_galileo.mkv video-sink=autovideosink audio-sink=autoaudiosink
This pipeline also works, so the card seems OK (video at least):
gst-launch-1.0 videotestsrc pattern=snow ! decklinkvideosink
Now, using the Decklink audio and video sinks, I get this error:
gst-launch-1.0 playbin uri=file:///home/v1p3r/Videos/editoria_galileo.mkv video-sink=decklinkvideosink audio-sink=decklinkaudiosink
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Redistribute latency…
Redistribute latency…
Redistribute latency…
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMatroskaDemux:matroskademux0: Internal data stream error.
Additional debug info:
matroska-demux.c(4768): gst_matroska_demux_loop (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMatroskaDemux:matroskademux0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn’t want to preroll.
Setting pipeline to NULL …
Freeing pipeline …
Try adding converters before both sinks: videoconvert before the video sink, and audioconvert ! audioresample before the audio sink.
Thanks for the kind reply.
gst-launch-1.0 playbin uri=file:///home/v1p3r/Videos/editoria_galileo.mkv video-sink="videoconvert ! decklinkvideosink" audio-sink="audioconvert ! audioresample ! decklinkaudiosink"
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Redistribute latency…
Redistribute latency…
Redistribute latency…
ERROR: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMatroskaDemux:matroskademux0: Internal data stream error.
Additional debug info:
matroska-demux.c(4768): gst_matroska_demux_loop (): /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMatroskaDemux:matroskademux0:
streaming stopped, reason not-negotiated (-4)
ERROR: pipeline doesn’t want to preroll.
Setting pipeline to NULL …
Freeing pipeline …
I also tested with camera-recorded MTS (1080 25p, AC-3 audio) and a plain MPEG-2 file; same error.
Please ask on the mailing list and also include a debug log (with GST_DEBUG=6) in your mail: https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel
I recently implemented a streaming server that uses decklinkvideosrc piped to H264 via gst-rtsp-server. Works great. I have another machine that I have two Decklink cards set up to record to .MXF files via ffmpeg. I’d like to try GStreamer for this machine, but the problem is I don’t see a way to tell the pipeline which card I want to record from. device-number seems to be an output-only property, and I’m trying to select which card to input from. Thanks in advance for any help you could provide!
device-number is the property to use on the source as well. Set it to 0 for the first card, 1 for the second, etc.
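As an illustrative sketch of that answer (the sink is a placeholder preview; only the device-number usage on the source is the point here):

```shell
# Capture from the second Decklink card by setting device-number=1
# on the source element.
gst-launch-1.0 decklinkvideosrc device-number=1 ! queue ! videoconvert ! autovideosink
```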
Is there any example pipeline to generate perfect CBR MPEG-TS for DVB using a Decklink source?
Fahad