GStreamer Dynamic Pipelines

A recurring topic with GStreamer for a long time has been how to build applications with dynamic pipelines, that is, pipelines in which elements are relinked while the pipeline is playing, without stopping it.

So, let’s write a bit about it and explain how it all works.

Note however that I’m not covering the most common and simplest case here: a demuxer or decodebin adding pads when set to PLAYING, and then connecting to these pads. My example code does this too, but there’s enough documentation about it already.

Also, these two examples unfortunately need GStreamer 1.2.3 or newer because they rely on some bugfixes.

The Theory

What’s difficult about dynamic pipelines? Why can’t you just relink elements and their pads at any time, like you do when the pipeline is not running? Consider the plumbing in your house as an example: if you want to change something in the pipes, you’d better make sure nothing is flowing through them at that moment, or there will be a big mess 🙂

Pad Probes

In GStreamer this is handled with the pad probe mechanism. Pad probes allow you to register a callback that is called whenever a specific condition is met. These conditions are expressed with a flags type, e.g. GST_PAD_PROBE_TYPE_BUFFER for a buffer arriving at the pad, or GST_PAD_PROBE_TYPE_QUERY_UPSTREAM for an upstream query. Additionally these flags specify the scheduling type (not so important here), and can specify a blocking type: GST_PAD_PROBE_TYPE_IDLE and GST_PAD_PROBE_TYPE_BLOCK.

gst_pad_add_probe() adds a probe and returns an identifier, which can later be used to remove the probe again from the pad with gst_pad_remove_probe().

The Callback

The probe callback is called whenever the condition is met. The callback is passed an info structure, which contains the exact condition that caused the callback to be called and the data associated with it: for example the current buffer, event or query.

From the callback this data can be inspected but it’s also possible to replace the data stored in the info structure.

Once everything we want to do inside the callback is done, the callback has to return a value. This specifies whether the data should be passed on (GST_PAD_PROBE_PASS), dropped (GST_PAD_PROBE_DROP), whether the probe should be removed and the data passed on (GST_PAD_PROBE_REMOVE), or whether the default action for this probe type should happen (GST_PAD_PROBE_OK, more on that later).

Note that the callback can be called from an arbitrary thread, and in particular is not guaranteed to be called from your main application thread. For all serialized events, buffers and queries it will be called from the corresponding streaming thread.

It is also important to keep in mind that the callback can be called multiple times (even concurrently), and that it can still be called after it has returned GST_PAD_PROBE_REMOVE (another thread might have just called into it). It is the job of the callback to protect against that.

Blocking Types

The blocking types of the conditions are of further interest here. Without a blocking type, the probe callback can be used to get notified whenever the condition is met, to intercept data flow, or even to modify events or buffers. That can be very useful too, but it is not our topic here.

Whenever one of the blocking types is specified in the condition, triggering the probe will cause the pad to be blocked. That means that the pad will not pass on any data related to the condition until the probe is removed (with gst_pad_remove_probe() or by returning GST_PAD_PROBE_REMOVE), unless GST_PAD_PROBE_PASS is returned from the callback. This guarantees that nothing else that matches the condition can pass and the callback can safely do its work. In particular, if GST_PAD_PROBE_TYPE_DATA_BOTH is specified, no data flow can happen at all, and everything downstream of the pad up to the next queue can be safely relinked. To be able to relink parts after the next queue, you additionally need to make sure that all data flow has finished up to that point too, which can be done with further pad probes (see also the advanced variant of the first example).

Probes with the GST_PAD_PROBE_TYPE_IDLE blocking type will be called the next time the pad is idle, i.e. when there is no data flow happening. This can happen immediately when gst_pad_add_probe() is called, directly from the calling thread, or otherwise after the next buffer, event or query is handled.

Probes with the GST_PAD_PROBE_TYPE_BLOCK blocking type will be called the next time the conditions match, and will block the pad before passing on the data. This allows inspecting the buffer, event or query that is currently pending for the pad, while still blocking the pad from doing anything else.

The main advantage of GST_PAD_PROBE_TYPE_BLOCK probes is that they provide the data that is currently pending, while the main advantage of GST_PAD_PROBE_TYPE_IDLE is that it is guaranteed to be called as soon as possible, independent of whether any data is coming (there might not be any further data at all). Its disadvantage is that it might be called directly from the thread that calls gst_pad_add_probe(). Depending on the use case, one or both of them should be chosen.

Now to the examples.

Example 1: Inserting & removing a filter

In this example we will have a decodebin, connected to a video sink with the navseek element. This allows us to watch any supported video file and seek with the cursor keys. Every 5 seconds a video effect filter will be inserted in front of the sink, or removed if it was inserted last time. All this without ever stopping playback or breaking because of seeking. The code is available here.

Setting up everything

In main() we set up the pipeline and link all parts we can already link, connect to the GstElement::pad-added signal of decodebin and then start a mainloop.

From the pad-added callback we then connect the first video pad that is added on decodebin to the converter in front of the video sink. We also add our periodic 5 second timeout, which will insert/remove the filter here. After this point the pipeline will be PLAYING and the video will be shown.

The insertion/removal of the filter

The timeout callback is quite boring: nothing happens there other than calling gst_pad_add_probe() to add an IDLE probe. Here we also initialize a variable that protects our probe callback from multiple concurrent calls. We use an IDLE probe because we’re not interested in the data that causes the callback to be called, and we want the callback to be called as soon as possible, even from the current thread.

The actual insertion or removal of the filter happens in the probe callback; this is the interesting part. Here we first check with an atomic operation whether the callback was already called, and afterwards either insert or remove the filter. In both cases we need to make sure that all elements are properly linked on their pads afterwards and have the appropriate states. We also have to insert a videoconvert element in front of the filter to make sure that the output of the decoder can be handled by our filter.

A slightly more advanced variant

And that’s already everything there is to know about this case. A slightly more complex variant of this is also in gst-plugins-base. The main difference is that BLOCK probes are used there, and the filter is drained with an EOS event before it is replaced. This is done by first adding a BLOCK probe in front of the filter, then from that callback adding another one after the filter, and then sending an EOS event to the filter. From the probe after the filter we pass through all data until the EOS event is received, and only then remove the filter. This handles the case where the filter has multiple buffers queued internally. BLOCK probes are used instead of IDLE probes because otherwise we would potentially send the EOS event from the application’s main thread, which would then block until the EOS event arrived on the other side of the filter and the filter was removed.

Example 2: Adding & removing sinks

The second example also plays a video with decodebin, but randomly adds or removes another video sink every 3 seconds. This uses the tee element for duplicating the video stream. The code can be found here.

Setting up everything

In main() we set up the pipeline and link all parts we can already link, connect to the GstElement::pad-added signal of decodebin and then start a mainloop. Same as in the previous example. We don’t add a sink here yet.

From the pad-added callback we now link decodebin to the tee element, request a first srcpad from tee and link a first sink. This first sink is a fakesink (with sync=TRUE to play in realtime), and it is always present. This makes sure that the video is always playing in realtime, even if we currently have no visible sinks. At the end of the callback we add our periodic 3 second timer.

Addition of sinks

In the timeout callback we first get a random number to decide whether to add or remove a sink. If we add a new sink, all of this is done directly from the timeout callback (i.e. the application’s main thread). We can do this from the main thread and without pad probes because there is no data flow to disrupt yet: the new tee srcpad has just been created, and if tee pushes any buffer through it now, the buffer will just be dropped. For adding a sink we simply request a new srcpad from the tee and link it to a queue, video converter and sink, sync all the states and remember that we added this sink. A queue is necessary after every tee srcpad because otherwise the tee will lock up (all tee srcpads are served from a single thread).

Removal of sinks

Removal of sinks is a bit more complicated, as now we have to block the relevant pad because data flow might be happening right now. For this we add an IDLE probe, and from its callback we unlink and destroy the sink. Again we protect against multiple calls to the callback, and we pass our sink information structure to the callback so it knows which sink should actually be removed. Note that we pass g_free() to gst_pad_add_probe() as the destroy notify for the sink information structure, and do not free the memory from the callback. This is necessary because the callback can still be called after we released the sink, and we would then access already freed memory.

I hope this helps to understand how dynamic pipelines can be implemented with GStreamer. It should be easy to extend these examples to real, more complicated use cases. The concepts are the same in all cases.

80 thoughts on “GStreamer Dynamic Pipelines”

  1. I’m following the posts on the list and I’ll give this approach a try as soon as I have some spare time. I’ve been managing dynamic pipelines for years (this works on 0.10 too) with an alternative approach/hack: I use two different pipelines with appsink/appsrc, stop/start the appsrc one when I need to do something dynamic (for example changing the filesink location), and keep the buffers produced by appsink in an application queue while appsrc is stopped. I’m glad to see that this seems finally supported/fixed upstream 🙂

    this approach does not retimestamp frames when a dynamic change occurs, right? In some of my applications I’ll need to keep the original timestamps on the split files

    1. No, it does not change timestamp or segment information when dynamically changing the pipeline. In 1.x the segments are sticky and automatically propagated to other elements if linked later, so the original timestamps would be kept on the split files. My examples don’t split files though, they will just display the video.

      But of course there are many use cases where you want something different, e.g. to have each split file start at timestamp 0 again. That problem is unrelated to dynamic pipelines though, and you can solve it e.g. by setting pad offsets, or by intercepting buffers or segment events with pad probes and changing them.

    2. Hello Nicola,

      I’m using 0.10 as well, would it be possible to share some examples of your dynamic pipeline approach please?

      Thanks,
      Ludo

      1. You probably really want to upgrade to 1.x at this point. 0.10 has not been maintained for more than 4 years now and is full of known bugs that have long been fixed, including security related ones.

  2. Another useful tool for dynamic pipelines where the elements you want to add are filters is the insertbin element, it will do the pad blocking for you.

    If you have a queue in the part that you want to remove and you want to drain it, a useful trick is to add a pad probe of type GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM after the queue, and then do a DRAIN query. In the second probe you can stop the query from propagating further with GST_PAD_PROBE_DROP. This should all be synchronous.

    1. That’s true, for the filter case you can probably use insertbin, which currently is in a library in gst-plugins-bad. Once that is moved to gst-plugins-base it should be a better solution for the simple case 🙂

      For the DRAIN query, while this in theory is true the problem with that is that most elements that actually need to be drained don’t implement handling of the query but instead do the same on EOS. We should fix that 🙂

  3. In your second example you add elements with a timer. What changes should one make to add an element e.g. on an EOS event caught in a probe?

    Right now (gstreamer 1.2.3), the gst_element_sync_state_with_parent call generates the following warning:

    (gst-player:83702): GStreamer-WARNING **:
    Trying to join task 0x10505a050 from its thread would deadlock.
    You cannot change the state of an element from its streaming
    thread. Use g_idle_add() or post a GstMessage on the bus to
    schedule the state change from the main thread.

    1. You’ll need to dispatch that to a different thread then. You can’t change the state of elements from inside their streaming thread, and inside the EOS event probe you’re in the streaming thread of that element.

      Like the warning says, you could use e.g. a g_idle_add() for example to run it from your GMainLoop. You can only do it asynchronously.

      1. Thank you, that’s what I did. Just trying to point out that a real-world example will be unlikely to use timers…

        May I ask you to cover the “queue” and especially “multiqueue” elements in the next post? I’m interested in the conditions under which they free memory for their internal queues, and in how to force them to do that.

  4. It would be nice to complete the article with an example where you add sources. One of the challenges is that you might want to sync the playback pos of the new source with existing sources.

  5. Great article describing some features which are in need of more documentation.
    I have some questions:
    – when do you need to make the callback function atomic? I noticed the test-effect-switch code doesn’t do this. How could multiple callbacks to the same function occur concurrently?
    – both your code and the test-effect-switch code do pipeline manipulations in the probe callbacks; if I understood your reply to Kentzo correctly, should all this code be scheduled on the main loop instead?

    Thanks!

    1. You need to make them atomic if they could be called multiple times. Blocking probes will be called as often as the condition happens, which usually means that if you block on something that happens only in the streaming thread, the callback will be called exactly once. But consider the case where you have a condition on buffers and upstream events: your callback will be called once for the buffer and once for every thread that sends an upstream event.

      You can do all the pipeline manipulations from the probe callbacks as long as your manipulations don’t include shutting down the thread which called your callback. For example, you can’t shut down upstream elements from a blocking probe callback that is called from the streaming thread of those upstream elements. In such cases you must do that from a separate thread, and then unblock the pad.

      1. Thanks for the explanation. Your post has been great for making a gstreamer application with dynamic pipelines.

        Some more questions:
        When you specify GST_PAD_PROBE_TYPE_IDLE or GST_PAD_PROBE_TYPE_BLOCK, what conditions (e.g. GST_PAD_PROBE_TYPE_BUFFER ) trigger the probe?

        Events and queries can be on other threads than the streaming thread, right? While buffers will only be on the streaming thread?

        When specifying a probe GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_BUFFER, this probe will be called exactly once?
        Is this also the case when the probe is of type GST_PAD_PROBE_TYPE_IDLE | GST_PAD_PROBE_TYPE_BUFFER? Could it be called twice, as the probe might be called synchronously when adding the probe and perhaps once more from the streaming thread?

      2. Interesting questions 🙂

        Only GST_PAD_PROBE_TYPE_IDLE: The callback will be called as soon as the pad is idle, no matter under what condition. It will be called immediately from gst_pad_add_probe() if the pad is already idle, and otherwise after the current buffer/event/query is handled. The pad is blocked from when the callback is called until the probe is removed.
        Only GST_PAD_PROBE_TYPE_BLOCK: The callback will be called right before the next buffer/event/query, and the pad is blocked then. The callback will be called once for every thread that triggers the condition.
        Also relevant here is this part of the code: http://cgit.freedesktop.org/gstreamer/gstreamer/tree/gst/gstpad.c#n1296

        Buffers are always sent from the streaming thread of the pad, and the same goes for serialized events and queries (ALLOCATION and DRAIN queries). Non-serialized events and non-serialized queries (most of the queries) can happen from any thread at any time.

        GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_BUFFER: The callback will be called right before the next buffer passes through the pad, and then the pad is blocked. Also see the code linked above, providing no “data condition” means “all conditions” (and no scheduling condition is all scheduling modes).
        GST_PAD_PROBE_TYPE_IDLE | GST_PAD_PROBE_TYPE_BUFFER: This one is a bit weird and you probably don’t want that. The callback is called as soon as the pad is idle *or* when the next buffer passes through the pad. Not sure if this is what is expected here, but I can’t think of any situation that would make sense here. You?

        So, GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_BUFFER will be called exactly once. The same with IDLE instead of BLOCK will be called once or twice.

        Hope this helps, and I hope I didn’t introduce a mistake anywhere.

  6. I assumed GST_PAD_PROBE_TYPE_IDLE | GST_PAD_PROBE_TYPE_BUFFER would work like an *and* and mean something like: no buffers passing. Indeed, it makes no sense if it works like an *or*.

    Thanks for answering my questions.

  7. Hi.

    Thanks for the blog!

    I have a small query. I have two live appsrcs, one for encoded video and the other for audio, which are then muxed into an mp4 file. The PTS/DTS/duration of the buffers is set and synchronized while pushing them into appsrc. I need to drop a few buffers from the video and audio sources before muxing, and I do this by dropping buffers in a pad probe. But in the resulting mp4 the video doesn’t start at 0, it starts at a 0.01666 offset. How do I fix it? Any quick pointers would help.

  8. (in a live pipeline) When removing an encode-mux-filesink sink bin, how can you probe for the EOS?
    A sink element doesn’t have a src pad, so if I probe for the EOS event on its sink pad, is the last buffer lost?

    Thank you for your blog.

    1. If you want to replace a sink you would get the EOS as a message from the sink. That message is posted once the sink is done with everything.

      1. Nice, but I don’t know how I can intercept this message. GstBus is only for the pipeline … I’m going to search the docs.
        My project is GiGE Vision client with gstreamer backend (viewing, recording, streaming to RTP).
        https://gitorious.org/jiguiviou

      2. Unfortunately the only way to intercept a message inside some bin inside the pipeline currently is to create a new subclass of GstBin, and then override the handle_message() virtual method in there.

    1. Hi, thank you for your help. Sorry for posting here. (and I don’t speak English well)

      I have subclassed GstBin and reimplemented handle_message (I’m not familiar with classes in C):

      static void gst_sink_bin_class_init (GstSinkBinClass * klass)
      {
        GstBinClass *gstbin_class;

        gstbin_class = (GstBinClass *) klass;
        gstbin_class->handle_message = GST_DEBUG_FUNCPTR (gst_sink_bin_handle_message);
      }

      In the handler, I transform the EOS message into an application message:

      static void gst_sink_bin_handle_message (GstBin * bin, GstMessage * message)
      {
        if (GST_MESSAGE_TYPE (message) == GST_MESSAGE_EOS) {
          g_print ("Got EOS in the sink bin %s\n", GST_ELEMENT_NAME (bin));
          gst_message_unref (message);
          message = gst_message_new_application (
              GST_OBJECT_CAST (bin),
              gst_structure_new_empty ("RemoveSender"));
        }
        GST_BIN_CLASS (gst_sink_bin_parent_class)->handle_message (bin, message);
      }

      Before posting EOS on the SinkBin sink pad, I probe the SourceBin src pad and disconnect the SinkBin from the source. In the pipeline’s async message handler, I remove and dispose of the sink.
      Do you think this model is GOOD?

  9. I want to write a GStreamer app with two file sources. One file should continue playing; the other should play for 4 to 5 seconds without interrupting the first file. After playing the 2nd file for 5 seconds, the 1st file should continue… how can I achieve this?

  10. Hi,
    I am trying to write a gstreamer-1.0 plugin for a parser where I initially need to check for tags. So I have to change the pad from push to pull mode to access the whole file, and then, based on the tags and metadata obtained, do things accordingly. Changing the pad activate mode from push to pull works, but changing it back to push afterwards gives a problem.

    The code snippet is something like the following:
    if (gst_pad_activate_mode (GST_PAD_PEER (sinkpad), GST_PAD_MODE_PULL, TRUE))
    {
      // tag and metadata parsing
      result = gst_pad_activate_mode (GST_PAD_PEER (sinkpad), GST_PAD_MODE_PUSH, TRUE);
      // The above statement gives the below mentioned error. The result returned is FALSE.
    }

    The result of running my parser gives the below error:
    (gst-launch-1.0:14101): GStreamer-WARNING **:
    Trying to join task 0x9fe6068 from its thread would deadlock.
    You cannot change the state of an element from its streaming
    thread. Use g_idle_add() or post a GstMessage on the bus to
    schedule the state change from the main thread.

    After the above warning the task hangs.
    This problem is faced when I run my plugin with playbin in gstreamer 1.0.
    It is working fine in gstreamer-0.10.

    1. You are not allowed to deactivate pads (or in general change state of your element) from the streaming thread. That’s what probably happens here.

      But please ask this on the GStreamer developers’ mailing list, together with a backtrace of all threads when that happens and/or example code to reproduce it 🙂

  11. I have created a bin that is videoconvert->encoder->muxer->filesink. When I add this bin dynamically to the pipeline, it starts recording with no problem at all. But when I check the file, the first seconds are frozen. If the video is only a few seconds long, it freezes for about the first half; if the video is long, it freezes for several minutes. Any ideas what this could be?

  12. Sir,
    I am trying to understand dynamic pipelines using the above mentioned details and sample code.
    After compilation I am getting the below mentioned error:
    make dynamic-tee-vsink
    cc -o dynamic-tee-vsink dynamic-tee-vsink.c -I/opt/gstreamer-sdk/include -I/opt/gstreamer-sdk/include -I/opt/gstreamer-sdk/include -pthread -I/opt/gstreamer-sdk/include/glib-2.0 -I/opt/gstreamer-sdk/lib/glib-2.0/include -I/usr/include/gstreamer-1.0 -L/opt/gstreamer-sdk/lib -L/opt/gstreamer-sdk/lib -L/opt/gstreamer-sdk/lib -L/opt/gstreamer-sdk/lib -lgstreamer-1.0 -lgobject-2.0 -lglib-2.0
    /usr/lib/gcc/i686-linux-gnu/4.8/../../../i386-linux-gnu/libgstreamer-1.0.so: undefined reference to `g_type_class_adjust_private_offset’
    collect2: error: ld returned 1 exit status
    make: *** [dynamic-tee-vsink] Error 1

    I am using Gstreamer 0.10
    gst-inspect --gst-version
    GStreamer Core Library version 0.10.36

    Could you please let me know how I can get the sample code shared in your link working.
    Thanks ,
    Ashish

    1. This is a problem in your build environment. /usr/lib/i386-linux-gnu/libgstreamer-1.0.so is linked against a newer GLib version (in that very directory), but you try to link against an older GLib version in /opt/gstreamer-sdk/lib. That can’t work.

      Also please get rid of all this GStreamer SDK stuff, it’s based on a very old (4+ years) and nowadays unsupported GStreamer version. Too old to use any code I provide here. Your system apparently has a newer version in /usr/lib/i386-linux-gnu already.

    2. OK… looking for the probable cause, I found that the sample code cannot be used with GStreamer 0.10.

      I downloaded the latest GStreamer (1.6) and the code compiles properly.

      1. Could you please provide some pointers as to where I should be looking, as I am getting an error when I try to execute the created binary.

        ashish@ashish-System-Product-Name:~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e$ export GST_DEBUG="*:3"
        ashish@ashish-System-Product-Name:~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e$ ./dynamic-tee-vsink test.mp4

        ** (dynamic-tee-vsink:13271): ERROR **: Failed to create elements
        Trace/breakpoint trap (core dumped)
        ashish@ashish-System-Product-Name:~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e$

        When I tried to find more details with export GST_DEBUG="*:4":
        0:00:00.016857094 13229 0x2233800 INFO GST_PLUGIN_LOADING gstplugin.c:842:_priv_gst_plugin_load_file_for_registry: plugin "/usr/local/lib/gstreamer-1.0/libgstcoreelements.so" loaded
        0:00:00.016918125 13229 0x2233800 INFO GST_ELEMENT_FACTORY gstelementfactory.c:364:gst_element_factory_create: creating element "filesrc"
        0:00:00.017232895 13229 0x2233800 INFO GST_ELEMENT_PADS gstelement.c:646:gst_element_add_pad: adding pad 'src'
        0:00:00.017364756 13229 0x2233800 INFO GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "decodebin"!
        0:00:00.017381530 13229 0x2233800 INFO GST_ELEMENT_FACTORY gstelementfactory.c:456:gst_element_factory_make: no such element factory "videoconvert"!
        0:00:00.017396569 13229 0x2233800 INFO GST_ELEMENT_FACTORY gstelementfactory.c:364:gst_element_factory_create: creating element "tee"
        0:00:00.017517728 13229 0x2233800 INFO GST_ELEMENT_PADS gstelement.c:646:gst_element_add_pad: adding pad 'sink'

        2) I could find the libgstdecodebin.so by
        gst-inspect-0.10 decodebin at
        /usr/lib/x86_64-linux-gnu/gstreamer-0.10/libgstdecodebin.so

        3) But with gst-inspect-1.0 decodebin
        [Nothing]
        Could you please provide a pointer on how I can resolve this error.

        Thanks

      2. Your GStreamer 1.x installation is incomplete. Where did you get it from, did you only install GStreamer core but not the plugin modules? You need the plugin modules.

  13. 1) Thanks a lot for replying back sir.
    2) I have installed "gstreamer-1.6.1.tar.xz" from
    http://gstreamer.freedesktop.org/src/gstreamer/
    The steps were :-
    ./configure
    sudo make
    sudo make install

    3) Then I tried compiling the sample code from
    gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e/

    Could you please let me know which packages I need to install to complete the GStreamer 1.6 installation.

    Thanks
    Ashish

      1. Dear Slomo Sir ,
        1) I have installed the below mentioned packages in this order:
        gst-plugins-base-1.6.1
        gst-plugins-good-1.6.1
        gst-plugins-bad-1.6.1
        gst-plugins-ugly-1.6.1
        gst-libav-1.6.1

        2) I am still getting an error stating that the plugins [decodebin, videoconvert, videoscale] could not be created:
        ** (dynamic-filter:19955): ERROR **: Failed to create elements
        Trace/breakpoint trap (core dumped)

        3) But except for decodebin, gst-inspect is able to detect the other plugins properly.

        s~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e: gst-inspect-1.0 videoscale
        0:00:00.010290211 19977 0xa9e260 WARN GST_PLUGIN_LOADING gstplugin.c:748:_priv_gst_plugin_load_file_for_registry: module_open failed: /usr/local/lib/gstreamer-1.0/libgstvideoscale.so: undefined symbol: gst_video_converter_new

        (gst-inspect-1.0:19977): GStreamer-WARNING **: Failed to load plugin '/usr/local/lib/gstreamer-1.0/libgstvideoscale.so': /usr/local/lib/gstreamer-1.0/libgstvideoscale.so: undefined symbol: gst_video_converter_new
        0:00:00.010421998 19977 0xa9e260 WARN GST_PLUGIN_LOADING gstplugin.c:1269:gst_plugin_load_by_name: load_plugin error: Opening module failed: /usr/local/lib/gstreamer-1.0/libgstvideoscale.so: undefined symbol: gst_video_converter_new
        0:00:00.010439066 19977 0xa9e260 WARN GST_PLUGIN_LOADING gstpluginfeature.c:132:gst_plugin_feature_load: Failed to load plugin containing feature 'videoscale'.
        element plugin couldn’t be loaded
        Plugin Details:
        Name videoscale
        Description Resizes video
        Filename /usr/local/lib/gstreamer-1.0/libgstvideoscale.so
        Version 1.6.1
        License LGPL
        Source module gst-plugins-base
        Source release date 2015-10-30
        Binary package GStreamer Base Plug-ins source release
        Origin URL Unknown package origin

        videoscale: Video scaler

        1 features:
        +– 1 elements

        ~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e:

        4) Should I re-install GStreamer core after installing these packages separately, or is there some other problem in the steps I am executing?

        Thanks ,
        Ashish

      2. Please ask on the mailing list; this is not a support forum, and your problems are not even GStreamer related but just about how to compile and install software.

  14. Sir ,

    It seems all the plugins are properly installed, but the application still fails to create instances of these elements.

    ~: gst-inspect-1.0 | grep "videoconvert"
    autoconvert: autovideoconvert: Select color space convertor based on caps
    videoconvert: videoconvert: Colorspace converter
    ~:
    ~:
    ~: gst-inspect-1.0 | grep "decodebin"
    playback: uridecodebin: URI Decoder
    playback: decodebin: Decoder Bin
    ~:
    ~:
    ~: gst-inspect-1.0 | grep "videoscale"
    videoscale: videoscale: Video scaler

    Could you please advise how the "*.c" code can create the elements, which are actually installed properly.
    (The gst-inspect-1.0 log is shared above.)

  15. Hi Sir ,
    Now I am able to get the element instances created.
    But when I run the sample code I am getting the error mentioned below:

    ~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e:
    ~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e: ./dynamic-filter test.mp4

    ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: Your GStreamer installation is missing a plug-in.
    Additional debug info:
    gstdecodebin2.c(4530): gst_decode_bin_expose (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
    no suitable plugins found:
    Missing decoder: MPEG-4 AAC (audio/mpeg, mpegversion=(int)4, framed=(boolean)true, stream-format=(string)raw, level=(string)2, base-profile=(string)lc, profile=(string)lc, codec_data=(buffer)12100000000000000000000000000000, rate=(int)44100, channels=(int)2)
    Missing decoder: H.264 (High Profile) (video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high, codec_data=(buffer)0164001fffe100176764001facb4028022d0800000030080000018078c195001000468ee3cb0, width=(int)1280, height=(int)544, framerate=(fraction)24/1, pixel-aspect-ratio=(fraction)1/1, parsed=(boolean)true)

    ~/gst-snippets-217ae015aaddfe3f7aa66ffc936ce93401fca04e:

    I will try to figure out the cause, but it would be helpful if you could provide any feedback (I might be missing a simple point).

    Thanks ,
    Ashish

  16. Dear Sir / All,
    I was able to resolve the problem; below are a few more points which helped to address the issue.
    I am posting the solution because this thread has helped me a lot during this process, and it can be helpful for others too.

    I am listing the other points which helped to address the bug:
    "Ubuntu restricted extras" needs to be installed.

    There is a chance that the installation might fail, stating a mismatch in a .deb file:

    a) Try changing Software & Updates -> Download from -> XXX.
    Mine was "Server for India", which I changed to "Main server". The error was still there.

    b) "configure: *** Orc acceleration disabled. Requires Orc >= 0.4.16, which ..." during installation of gst-libav-1.6.1.
    I addressed this by downloading orc-0.4.24.tar.xz from
    http://gstreamer.freedesktop.org/src/orc/
    [ Installation is the standard ./configure + make + sudo make install ]

    c) cat /var/cache/apt/archives/partial helped me figure out that the packages were failing due to a SIZE limit. The log said "...The file is larger than the configured file size limit...".
    Our admin team removed this limit from my PC and I got the expected result.

    Thanks for your inputs !!!!!

    Thanks ,
    Ashish

  17. Sir,
    I tried adding MUX -> Q -> FILESINK at the end of a simple pipeline, i.e.
    Test pipeline: videotestsrc -> q1 -> avimux -> q2 -> filesink.

    1) When I change the properties of the videotestsrc element in the dynamic pipeline, I get a proper file,
    i.e. I can verify that the test pattern of the created file changes by playing it back.

    2) When I change the properties of the filesink element (i.e. location) in the dynamic pipeline
    [ dynamic_filesink = 1 ], I get NEW files with some data in them.
    But only the first file is playable; the other files fail to play and give the
    message "...COULD NOT DETERMINE TYPE OF STREAM...".

    Could you please provide input as to what is causing this problem and how it can be addressed?

    The sample code is at
    http://gstreamer-devel.966125.n4.nabble.com/Dynamic-Pipeline-Vidoetestsrc-element-works-but-Filesink-element-fails-td4674638.html

    Thanks ,
    Ashish Kumar Mishra.

    1. You have to finalize the file, look it up in the documentation. Send an EOS event to the pipeline, wait for the EOS message to arrive on the pipeline bus, only then shut down the pipeline.

      1. What if you don't want to shut down the whole pipeline, but the elements after the tee that you want to dynamically add/remove include a file muxer? How do you use the EOS to finalize the file yet keep the overall pipeline running?

      2. Set the `message-forward` property on the bin, that way you get the `EOS` message as a wrapped message to the application (see docs of that property). Also make sure that you always have a non-EOS sink running.

  18. Hi, and thank you for sharing your knowledge with us. I have a question. I compiled and ran your first example and I noticed that memory increases every time the video filter is changed. Why does this happen? How can I fix it? My environment is Ubuntu 14.04 (GStreamer 1.2).

  19. Dear Sir,
    As per your input, I tried sending an EOS event to the muxer, and in the EOS event handler I am updating the filesink. But the problem still exists:

    static GstPadProbeReturn
    eos_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
    {
      gst_pad_remove_probe (pad, GST_PAD_PROBE_INFO_ID (info));
      gst_element_set_state (filesink, GST_STATE_NULL);

      attempt = attempt + 1;
      sprintf (buffer, "test_%d.avi", attempt);
      g_object_set (G_OBJECT (filesink), "location", buffer, NULL);
      gst_element_set_state (filesink, GST_STATE_PLAYING);
      return GST_PAD_PROBE_PASS;
    }

    static GstPadProbeReturn
    pad_probe_cb (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
    {
      gst_pad_remove_probe (pad, GST_PAD_PROBE_INFO_ID (info));
      gst_pad_add_probe (muxsrcpad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM, eos_cb, user_data, NULL);
      gst_pad_send_event (muxsinkpad, gst_event_new_eos ());

      return GST_PAD_PROBE_OK;
    }

    static gboolean
    timeout_cb (gpointer user_data)
    {
      gst_pad_add_probe (q1sinkpad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM, pad_probe_cb, user_data, NULL);
      return TRUE;
    }

    Could you please let me know, logically, where I should tap to address the problem?
    Could you also share a sample that dynamically updates the filesink in a simple pipeline of videotestsrc -> q1 -> avimux -> filesink?

    1. You have to restart the filesink and muxer. Otherwise the muxer won't create a new file with headers for the second (and following) files and just produces useless data.

  20. Thanks Sir,

    Now I am able to create multiple file chunks with videotestsrc, and the files are playable.

    Ashish Kumar Mishra.

  21. Dear Sir,

    I have built a dynamic pipeline using "dynamic-filter.c" as a starting point. It works and generates multiple AVI files which can be played back.

    But using filesrc + alsasrc leads to samples being dropped from the pipeline.
    If possible, could you please provide any pointers as to how this could be resolved?
    The same is being discussed at
    http://gstreamer-devel.966125.n4.nabble.com/How-to-determine-find-quot-period-size-quot-of-ALSASRC-LATENCY-of-alsasrc-td4675332.html#a4675392

    Thanks ,
    Ashish Kumar Mishra.

  22. I'm facing some problems while playing with GStreamer dynamic pipelines, and need your help / suggestions based on the scenario below:
    I have a GStreamer demuxer plugin (not a native plugin) with 1 video and 1 audio pad.
    This plugin has 2 source pads: a video pad and an audio pad.
    For each pad there is a dedicated gst_buffer_pool allocated, which
    contains a pool of buffers to hold the video and audio ES after demuxing.
    The pipeline is based on playbin. A/V playback is OK.

    Based on a requirement, the demuxer has to switch to another audio stream present in the container (MP4) at run time, while the GStreamer pipeline is in the PLAYING state.
    This other audio stream has a different codec type. So, following the dynamic pipeline principle, the following steps are done in sequence:
    While injection into the demuxer's video pad is ongoing (I understand it should be OK to let the video injection continue, because we are only considering deletion and creation of the audio chain),
    the older audio pad is removed with the following steps:
    1. gst_pad_push_event (audio_stream->pad, gst_event_new_flush_start ());
    2. gst_pad_push_event (audio_stream->pad, gst_event_new_flush_stop (TRUE));
    3. gst_pad_push_event (audio_stream->pad, gst_event_new_eos ());
    4. Flush the buffer pool associated with the audio stream
    5. gst_element_remove_pad (xx, audio_stream->pad);
    6. gst_element_no_more_pads (xx);

    And then a new audio pad is created with the following operations:
    1. gst_pad_set_active pad
    2. gst_pad_use_fixed_caps
    3. gst_pad_push_event gst_event_new_stream_start
    4. gst_pad_set_caps pad
    5. gst_element_add_pad
    6. new segment event
    7. caps event
    8. push buffers

    gstreamer version: 1.4.5

    The expectation was that, after the above steps, we could inject audio buffers on the new audio pad, and overall there should just be a change in the audio stream with no problem on the video.
    But the observations are different:
    Video stalls for some time when the older audio pad is deleted and the new audio pad is created.
    Thereafter video playback is fine, but there is no audio, and from the demuxer's perspective audio packets are pushed without any error on the new audio pad.
    The pipeline dump gives an idea that the older chain is still not deleted and the new chain is not built yet.

    Taking a dump of the GST pipeline, it seems that the older chain is not deleted and the new chain from the new audio pad is only partially created, up to proxypad16 inside decodebin.
    This pad is connected to the GstAacParse src pad, but the chain hits a dead end at this stage.

    gst_pad_push() is always successful on the new audio pad. The buffers which are pushed on the audio pad are allocated from the buffer pool.
    Kindly share your opinion on handling this dynamic pipeline scenario. Any suggestions for debugging would be helpful. I also posted the query on the mailing list, but probably don't have sufficient rights as of now. Any suggestion would be very helpful.

    Thanks

      1. May I request you to share your expert opinion on the query raised on the gstreamer-devel list? The query is related to dynamic pipeline issues when switching audio tracks with decodebin2 using a single audio decoder (with random A/V freezes and sync issues), and whether this is a genuine workflow w.r.t. playbin2/decodebin2.

        http://gstreamer-devel.966125.n4.nabble.com/query-about-audio-track-switch-with-decodebin2-single-audio-decode-not-multiple-decode-as-standard-g-tc4683271.html

        Thanks
        –Nitin

      2. Please wait for a reply there, this is not really a support forum 🙂 Sooner or later someone will hopefully get to it.

  23. Hello.
    I didn't catch how to change parameters in a running pipeline, in the videocrop element for example.
    I have tried gst_pad_add_probe() with the GST_PAD_PROBE_TYPE_BLOCK flag, but I don't understand how it works. Does it block the flow?

  24. Hi. Thank you for your example, it helps a lot!
    But is it normal to receive the following warning when running the example?
    WARN bufferpool gstbufferpool.c:639:gst_buffer_pool_set_config: can't change config, have outstanding buffers

  25. Hi,
    I have the following pipeline:

    v4l2src -> h264parse -> avdec_h264 -> identity -> glimagesink

    The property 'drop-probability' is set dynamically on the identity element.
    I want to insert the element 'glimagefreeze' when 'drop-probability' is set to 1, so the new pipeline is:
    v4l2src -> h264parse -> avdec_h264 -> identity -> glimagefreeze -> glimagesink

    I also want to remove 'glimagefreeze' when 'drop-probability' is set back to 0.
    Which kind of probe type should I be using on the pad?

    1. An IDLE probe on the source pad of identity sounds like the right choice here.

      Instead of identity and switching between drop-probability 0 and 1, you might want to use the valve element instead.

      1. Thanks. I have tried using the valve element and it successfully pauses the video. But for some reason, the valve is not able to resume the video when drop is set back to false. I think I will file a bug once I do some more testing with the valve element.

  26. Hi Slomo,

    Does the probe call work on pipelines that are PAUSED?

    For example, I have decoupled pipelines as follows:

    Pipeline_src = udpsrc timeout=100ms ! rtph264depay ! h264parse ! avdec_h264 ! intervideosink
    Pipeline_sink = intervideosrc ! gltransformation ! glimagesink

    Whenever the timeout at udpsrc occurs due to a link break, I PAUSE 'pipeline_src' and let gltransformation be applied on 'pipeline_sink'. The data consists of video frames. Once the link is back, I want the data from udpsrc to be available at the glimagesink. So I installed probes on udpsrc that can notify me in case of `GST_PAD_PROBE_TYPE_BUFFER` or `GST_PAD_PROBE_TYPE_PUSH`. In the probe callback, I set pipeline_src back to the PLAYING state. But since pipeline_src is in the paused state, the probes fail to notice that the link is working again and there is data.

    In such a case, how can I detect that the flow of data at udpsrc is back, and resume displaying the video at the sink?

    Regards

    1. It works in PAUSED state, but a) IDLE probes are not called while the pad is busy (and if it’s blocked in gst_pad_push() because of PAUSED then it’s busy until that returns), and b) non-IDLE probes are only triggering when the event happens (which is generally not happening in PAUSED for the same reason).

      You need to have udpsrc running all the time to notice if data is received, it won’t receive any data in PAUSED.

  27. I used output-selector for my video pipeline and it works properly, choosing the correct camera to display, since both were sharing one display source.
    Now I want to do the same for audio, but I get an issue where it says "busy".

    Here's my pipeline:
    #define AUDIO_PIPE "alsasrc device=hw:0 ! audio/x-raw,rate=48000,channels=2,width=32 ! volume volume=.2 ! deinterleave name=d d.src_0 ! tee name=t1 d.src_1 ! tee name=t2"
    GstElement *pipeline, *tee1, *tee2, *outputsel1, *outputsel2;
    GstElement *alsasink1, *alsasink2;
    GstElement *fakesink1, *fakesink2;

    pipeline = gst_parse_launch_full(AUDIO_PIPE, NULL, GST_PARSE_FLAG_NONE, NULL);

    // Request source pads from tee and sink pads from bin
    tee1 = gst_bin_get_by_name(GST_BIN(pipeline), "t1");
    tee2 = gst_bin_get_by_name(GST_BIN(pipeline), "t2");

    From this point on, I was able to create outputsel1 and outputsel2, where outputsel1 has sink_0 = alsasink1 and sink_1 = fakesink1, and outputsel2 has sink_0 = alsasink2 and sink_1 = fakesink2.

    The goal is to be able to select the left or right channel to output to alsasink dynamically, like in my video case.

    The problem is that it says 'busy'.
    So, to make it work, I have to connect fakesinks to both sink_0 and sink_1 of outputsel2 so that the left channel is output, and vice versa if we want the right. But we can't do this dynamically.

    So I looked at your dynamic-tee-vsink.c and tried it out with a video source. Do you think it would work with audio and alsasink, which is a shared resource?

  28. Thank you very much for the detailed information. Right now I'm facing an issue with pad probes. Basically, I am running the following code:

    full_pipeline_description = g_strdup_printf("playbin3 uri=%s", uri);
    gub_log_pipeline(pipeline, "Using pipeline: %s", full_pipeline_description);
    pipeline->pipeline = gst_parse_launch(full_pipeline_description, &err);
    g_free(full_pipeline_description);
    if (err) {
      gub_log_pipeline(pipeline, "Failed to create pipeline: %s", err->message);
      return;
    }

    vsink = gst_parse_bin_from_description(gub_get_video_branch_description(), TRUE, NULL);
    gub_log_pipeline(pipeline, "Using video sink: %s", gub_get_video_branch_description());
    g_object_set(pipeline->pipeline, "video-sink", vsink, NULL);
    g_object_set(pipeline->pipeline, "flags", 0x0003, NULL);

    bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline->pipeline));
    gst_bus_add_signal_watch(bus);
    g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);
    gst_object_unref(bus);

    if (vsink) {
      // Plant a pad probe to answer context queries
      GstElement *sink;
      sink = gst_bin_get_by_name(GST_BIN(vsink), "sink");
      if (sink) {
        GstPad *pad = gst_element_get_static_pad(sink, "sink");
        if (pad) {
          gulong id = gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM, pad_probe, pipeline, NULL);
          gst_object_unref(pad);
        }
        gst_object_unref(sink);
      }
    }

    Everything works correctly and I can play the video on a texture without issues (on Android). However, when I change the pipeline to use udpsrc (instead of playbin3):

    udpsrc port=53512 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 ! rtph264depay ! decodebin3 ! glupload ! glcolorconvert ! video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! fakesink sync=0 qos=1 name=sink

    The video doesn't play, and I can see by inserting logs that pad_probe is not called in this case. I'm not sure what I'm doing wrong. The code I'm using in this case is similar to the first snippet:

    full_pipeline_description = g_strdup_printf("%s", pipeline_cmd);
    gub_log_pipeline(pipeline, "Using pipeline: %s", full_pipeline_description);
    pipeline->pipeline = gst_parse_launch(full_pipeline_description, &err);
    g_free(full_pipeline_description);
    if (err) {
      gub_log_pipeline(pipeline, "Failed to create pipeline: %s", err->message);
      return;
    }

    vsink = gst_parse_bin_from_description(gub_get_video_branch_description(), TRUE, NULL);
    gub_log_pipeline(pipeline, "Using video sink: %s", gub_get_video_branch_description());
    g_object_set(pipeline->pipeline, "sink", vsink, NULL);
    g_object_set(pipeline->pipeline, "flags", 0x0003, NULL);

    bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline->pipeline));
    gst_bus_add_signal_watch(bus);
    g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);
    gst_object_unref(bus);

    // Plant a pad probe to answer context queries
    GstElement *sink;
    sink = gst_bin_get_by_name(GST_BIN(vsink), "sink");
    if (sink) {
      GstPad *pad = gst_element_get_static_pad(sink, "sink");
      if (pad) {
        gulong id = gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM, pad_probe, pipeline, NULL);
        gst_object_unref(pad);
      }
      gst_object_unref(sink);
    }

    I understand that "video-sink" is a property of playbin3, and I'm trying to set "sink" when using udpsrc. But it doesn't work as expected. Any hints are more than welcome.

  29. Hi,
    I don't know if this is still open, but I have been trying for a few days now to change the filesrc location at runtime.

    Eventually I reduced the pipeline to the following one, but even this fails:

    filesrc ! decodebin ! autovideosink

    I use a blocking probe, in which I post a message to change
    the file location.

    When the message is received, I change the file location and unblock the probe.
    I get the warning "gst_base_src_start_complete: pad not activated yet", and eventually "qtdemux: streaming stopped, reason not-linked".

    I tried this in different ways:
    – changing filesrc while the pipeline is set to GST_STATE_READY
    – removing filesrc, adding a new one and syncing it to the parent
    – removing the decodebin
    – others
    Help would be appreciated.
    Cheers

    1. For changing the filesrc, all you need to do here is shut down the filesrc/decodebin to NULL state, set a new location, set them to PLAYING state again and link the newly added `decodebin` pads to the video sink again. No pad probes should be needed here.

      Restarting the `decodebin` is needed because it doesn’t really support changing caps while it’s running.

  30. Thank you for the great article!
    Is it possible to use a probe to survive an error on one of the tee branches?
    For example, I have something like
    "src ! video_processing ! tee name=split ! appsink
    split ! rtmp2sink location=$URI"

    When the live sink fails, I want to reconnect it after some timeout without breaking the appsink branch.

    1. You could use the `errorignore` element before the sink and catch error messages coming from that sink on the bus to restart the sink.

  31. Hi.
    Thank you for this great article.
    I have a dynamic pipeline containing multiple bins. Some of the bins are added and removed during runtime.
    When I add or remove a bin, I use idle probes before linking/unlinking. I carefully unref all pads and elements.
    When I look into the GStreamer log, I can see that the bins (and the elements inside them) are never disposed/finalized before the entire pipeline has been torn down. The documentation says that when the reference count reaches 0, the element will be finalized. Can the pipeline have any references
    to the elements inside a 'local' bin?
    When removing a bin from the pipeline, shouldn't the bin (and all the elements inside it) be disposed and finalized? It seems like the pipeline still has a reference (or something) to the removed bins.

    Thanks!
    /Frederik
