Improved GStreamer support for Blackmagic Decklink cards

Over the last few weeks I started working on improving the GStreamer support for Blackmagic Decklink cards. Decklink is Blackmagic’s product line of HDMI, SDI and similar capture and playback cards, with drivers available for Linux, Windows and Mac OS X. On all platforms the same API is provided to access the devices.

GStreamer has had Decklink support since some time in 2011. The initial plugin was written by David Schleef and worked well enough for quite a while. But over the years people started using GStreamer in more dynamic and complex pipelines, and recently we got a lot of reports about the plugin not working properly in such situations. Mostly these were problems with time and synchronization handling, but there were also other, minor issues caused by the elements not using the source/sink base classes (GstBaseSrc and GstBaseSink). Using the base classes was not easily possible because a single device provides audio and video input/output at the same time, so each element would intuitively need two source or sink pads. As a side effect, this also made it very difficult to use the decklink elements in a standard playback pipeline with playbin.

The rewritten plugin now has separate elements for the audio and video parts of a device, giving four elements in total (decklinkaudiosrc, decklinkvideosrc, decklinkaudiosink and decklinkvideosink). These are now based on the corresponding base classes, work inside playbin (including pausing and seeking) and also handle synchronization properly. The clock of the hardware is now exposed to the pipeline as a GstClock, and even if the pipeline chooses a different clock, the elements will slave their internal clock to the master clock of the pipeline, making sure that synchronization is kept even if the pipeline clock and the Decklink hardware clock drift apart. Thanks to GStreamer’s extensive synchronization model and all the functionality that already exists in the GstClock class, this was much easier than it sounds.
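To illustrate the playbin integration, here is a minimal sketch of my own (not code from the plugin; the file URI is a placeholder, and it assumes the decklink elements and a suitable device are available):

```c
#include <gst/gst.h>

/* Sketch: play a file out through a Decklink device by using the new
 * sink elements with playbin. The URI below is just a placeholder. */
int
main (int argc, char **argv)
{
  GstElement *playbin;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* playbin manages both sinks inside one pipeline, which also takes
   * care of changing the state of the audio and video elements of the
   * device together (see the constraint described below). */
  playbin = gst_parse_launch (
      "playbin uri=file:///path/to/file.webm "
      "video-sink=decklinkvideosink audio-sink=decklinkaudiosink", NULL);

  gst_element_set_state (playbin, GST_STATE_PLAYING);

  /* Block until an error or end-of-stream is posted on the bus. */
  bus = gst_element_get_bus (playbin);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);

  gst_element_set_state (playbin, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (playbin);
  return 0;
}
```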

The only downside of this approach is that the audio and video elements of one device now always have to be inside the same pipeline, with their states managed by the pipeline, or at least both set to PLAYING together. Also, the audio elements won’t work without a corresponding video element, which is a limitation of the hardware; video alone works without problems though.
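On the capture side, a similar sketch (again my own, untested example, assuming a device whose input matches the default mode settings; the converter and auto sink elements are just generic stand-ins) puts the audio and video sources of one device into a single pipeline, so one state change drives them to PLAYING together:

```c
#include <gst/gst.h>

/* Sketch: capture audio and video from the same Decklink device in ONE
 * pipeline, as required. The pipeline will typically select the
 * Decklink hardware clock exposed by the elements; if another clock is
 * selected, the elements slave their internal clock to it. */
int
main (int argc, char **argv)
{
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* Two unlinked branches inside the same top-level pipeline: video
   * capture and audio capture from the device. */
  pipeline = gst_parse_launch (
      "decklinkvideosrc ! videoconvert ! autovideosink "
      "decklinkaudiosrc ! audioconvert ! audioresample ! autoaudiosink",
      NULL);

  /* A single state change brings both sources to PLAYING together. */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait for an error or end-of-stream before shutting down. */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}
```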

All the code is available from gst-plugins-bad Git master.

And now?

So what’s next? Testing, a lot of testing! Especially with different hardware, on different platforms and in all kinds of situations. If you have one of these cards, please test the latest code from gst-plugins-bad Git master. Please report any bugs you notice in Bugzilla, ideally with information about your hardware and platform, and including a GStreamer debug log captured with GST_DEBUG=decklink*:6.

Apart from that, many features are still missing: for example, the various configuration options that the Decklink API provides should be exposed as properties on the elements, and more audio channels and more video modes should be supported. If you need any of these features and want to help, feel free to write patches and submit them via Bugzilla.

Any help would be highly appreciated!

Web Engines Hackfest 2014

During the last few days I attended the Web Engines Hackfest 2014 in A Coruña, which was kindly hosted by Igalia in their office. This gave me some time to work on WebKit again, in particular to improve and clean up its GStreamer media backend, and, even more importantly of course, an opportunity to meet all the great people working on it again.

Apart from various smaller and bigger cleanups, and getting 12 patches merged (thanks to Philippe Normand and Gustavo Noronha for reviewing everything immediately), I worked on improving the WebAudio AudioDestination implementation, which allows websites to programmatically generate audio via JavaScript and output it. This should be in a much better state now.

But the biggest chunk of work was my attempt at a cleaner, and actually working, reimplementation of the Media Source Extensions. The Media Source Extensions basically allow a website to provide a container or elementary stream via JavaScript for e.g. the video tag, and can for example be used to implement custom streaming protocols. This is still far from finished, but it already works on YouTube with their JavaScript DASH player and should be a good starting point for completing the implementation. I hope to have some more time to continue this work, but any help would be welcome.

Besides the Media Source Extensions, I also spent some time reviewing a few GStreamer-related patches, e.g. Philippe’s WebAudio AudioSourceProvider implementation, and his patch to use GStreamer’s OpenGL library directly in WebKit instead of reimplementing many pieces of it.

I also took a closer look at Servo, Mozilla’s research browser engine written in Rust. It looks like a very promising and well-designed project (both Servo and Rust, actually!). I’m sure I’ll play around with Rust in the near future, and I’ll also try to make some time available to work a bit on Servo. Thanks also to Martin Robinson for answering all my questions about Servo and Rust.

And then there were of course lots of discussions about everything!

Good things are going to happen in the future, and WebKit still seems to be a very active project with enthusiastic people 🙂
I hope I’ll be able to attend next year’s Web Engines Hackfest again; it was a lot of fun.